Trait leaf::layer::ComputeParametersGradient
pub trait ComputeParametersGradient<T, B: IBackend> {
    fn compute_parameters_gradient(
        &self,
        backend: &B,
        output_data: &[&SharedTensor<T>],
        output_gradients: &[&SharedTensor<T>],
        input_data: &[&SharedTensor<T>],
        parameters_gradients: &mut [&mut SharedTensor<T>]
    ) { ... }
}
A Layer that can compute gradients with respect to its parameters (i.e. weights, biases, etc.).
Provided Methods
fn compute_parameters_gradient(
    &self,
    backend: &B,
    output_data: &[&SharedTensor<T>],
    output_gradients: &[&SharedTensor<T>],
    input_data: &[&SharedTensor<T>],
    parameters_gradients: &mut [&mut SharedTensor<T>]
)
Compute gradients with respect to the parameters and write them into parameters_gradients.
Implementors
impl<B: IBackend + Relu<f32>> ComputeParametersGradient<f32, B> for ReLU
impl<B: IBackend + Sigmoid<f32>> ComputeParametersGradient<f32, B> for Sigmoid
impl<B: IBackend + Tanh<f32>> ComputeParametersGradient<f32, B> for TanH
impl<B: IBackend + LayerOps<f32>> ComputeParametersGradient<f32, B> for Linear
impl<B: IBackend + LogSoftmax<f32>> ComputeParametersGradient<f32, B> for LogSoftmax
impl<B: IBackend + Softmax<f32>> ComputeParametersGradient<f32, B> for Softmax
impl<B: IBackend> ComputeParametersGradient<f32, B> for NegativeLogLikelihood
impl<B: IBackend> ComputeParametersGradient<f32, B> for Reshape
impl<B: IBackend + LayerOps<f32> + 'static> ComputeParametersGradient<f32, B> for Sequential<B>
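Because compute_parameters_gradient is a provided method, layers without learnable parameters (activations such as ReLU, Sigmoid, and TanH above) can implement the trait with an empty impl block and inherit a do-nothing default. The following is a simplified, self-contained sketch of that pattern; the Tensor type and the Relu struct here are stand-ins for illustration, not the real leaf SharedTensor or backend API:

```rust
// Placeholder tensor type standing in for leaf's SharedTensor<T>.
struct Tensor(Vec<f32>);

trait ComputeParametersGradient {
    // Provided method: the default body is a no-op, so layers without
    // learnable parameters get a correct implementation for free.
    fn compute_parameters_gradient(
        &self,
        _output_data: &[&Tensor],
        _output_gradients: &[&Tensor],
        _input_data: &[&Tensor],
        _parameters_gradients: &mut [&mut Tensor],
    ) {
        // Nothing to write: this layer has no parameter gradients.
    }
}

// An activation layer has no parameters, so an empty impl suffices,
// mirroring the ReLU/Sigmoid/TanH implementors listed above.
struct Relu;
impl ComputeParametersGradient for Relu {}

fn main() {
    let relu = Relu;
    let input = Tensor(vec![1.0, -2.0]);
    let out_grad = Tensor(vec![0.5, 0.5]);
    let mut params_grads: Vec<&mut Tensor> = Vec::new();
    relu.compute_parameters_gradient(&[], &[&out_grad], &[&input], &mut params_grads);
    println!("parameter gradients written: {}", params_grads.len());
}
```

A layer with parameters, such as Linear, would instead override the method and fill parameters_gradients from the input data and output gradients.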