ReLU activation kernels for BF16 tensors.
Functions:
  void relu_backward_bf16(const uint16_t *input, const uint16_t *d_output, uint16_t *d_input, size_t n)
  void relu_forward_bf16(const uint16_t *input, uint16_t *output, size_t n)
  void relu_forward_inplace_bf16(uint16_t *data, size_t n)
After changes, run: make test && make llamacpp-parity-full
ReLU: y = max(0, x)
Definition in file relu_kernels_bf16.c.
void relu_backward_bf16(const uint16_t *input, const uint16_t *d_output, uint16_t *d_input, size_t n)
void relu_forward_bf16(const uint16_t *input, uint16_t *output, size_t n)
void relu_forward_inplace_bf16(uint16_t *data, size_t n)