SwiGLU activation kernels for BF16 tensors.
Functions

void swiglu_backward_bf16(const uint16_t *input, const uint16_t *d_output, uint16_t *d_input, int tokens, int dim)

void swiglu_forward_bf16(const uint16_t *input, uint16_t *output, int tokens, int dim)
Detailed Description

SwiGLU activation kernels for BF16 tensors.

After making changes, run: make test && make llamacpp-parity-full

SwiGLU: y = silu(gate) * up = (gate * sigmoid(gate)) * up

Definition in file swiglu_kernels_bf16.c.
void swiglu_backward_bf16(const uint16_t *input, const uint16_t *d_output, uint16_t *d_input, int tokens, int dim)
Definition at line 108 of file swiglu_kernels_bf16.c.
References bf16_to_float(), float_to_bf16(), sigmoid_scalar(), and silu().
void swiglu_forward_bf16(const uint16_t *input, uint16_t *output, int tokens, int dim)
Definition at line 66 of file swiglu_kernels_bf16.c.
References bf16_to_float(), float_to_bf16(), sigmoid_scalar(), and silu().