Module gradient


Gradient operators for QSM

Forward-difference gradient and backward-divergence operators used in TV/TGV regularization and related algorithms.
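To illustrate the operator pair, here is a minimal 1-D sketch of a forward-difference gradient and its negative adjoint, the backward divergence, using the zero boundary conditions noted for `bdiv_inplace_f32`. The names `fgrad_1d` and `bdiv_1d` are illustrative only; the module's actual functions operate in-place on 3-D volumes per axis. The defining relation is the adjoint identity ⟨∇x, g⟩ = −⟨x, div g⟩.

```rust
/// Forward difference: g[i] = x[i+1] - x[i], with g[n-1] = 0
/// (zero boundary condition at the last sample).
fn fgrad_1d(x: &[f32]) -> Vec<f32> {
    let n = x.len();
    let mut g = vec![0.0; n];
    for i in 0..n.saturating_sub(1) {
        g[i] = x[i + 1] - x[i];
    }
    g
}

/// Backward divergence: the negative adjoint of `fgrad_1d`.
/// Interior: d[i] = g[i] - g[i-1]; d[0] = g[0]; d[n-1] = -g[n-2].
fn bdiv_1d(g: &[f32]) -> Vec<f32> {
    let n = g.len();
    let mut d = vec![0.0; n];
    for i in 0..n {
        let prev = if i > 0 { g[i - 1] } else { 0.0 };
        let cur = if i < n - 1 { g[i] } else { 0.0 };
        d[i] = cur - prev;
    }
    d
}

fn main() {
    let x = [1.0_f32, 3.0, 2.0, 5.0];
    let g = [0.5_f32, -1.0, 2.0, 0.0];
    let gx = fgrad_1d(&x);
    let dg = bdiv_1d(&g);
    // Check the adjoint identity <grad x, g> == -<x, div g>.
    let lhs: f32 = gx.iter().zip(&g).map(|(a, b)| a * b).sum();
    let rhs: f32 = -x.iter().zip(&dg).map(|(a, b)| a * b).sum::<f32>();
    assert!((lhs - rhs).abs() < 1e-5);
    println!("lhs = {lhs}, rhs = {rhs}");
}
```

The boundary handling is what makes the pair an exact adjoint: the forward gradient discards the last sample, so the divergence must compensate with `d[0] = g[0]` and `d[n-1] = -g[n-2]` for the identity to hold to machine precision.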

Functions

bdiv
Backward divergence operator (negative adjoint of forward gradient)
bdiv_inplace
Backward divergence operator (in-place)
bdiv_inplace_f32
Backward divergence operator (in-place, f32); uses zero boundary conditions (matching the Julia reference implementation)
bdiv_masked_inplace_f32
Backward divergence operator (in-place, f32) - masked version
fgrad
Forward difference gradient operator
fgrad_f32
Forward difference gradient operator (allocating, f32)
fgrad_inplace
Forward difference gradient operator (in-place)
fgrad_inplace_f32
Forward difference gradient operator (in-place, f32)
fgrad_masked_inplace_f32
Forward difference gradient operator (in-place, f32), masked version; only computes the gradient where the mask is non-zero
grad_magnitude_squared
Compute gradient magnitude squared: |∇x|² = gx² + gy² + gz²
symdiv_inplace_f32
Divergence of a symmetric tensor field (adjoint of symgrad)
symgrad_inplace_f32
Symmetric gradient operator for TGV regularization (in-place, f32)
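The simplest of the listed functions to sketch is `grad_magnitude_squared`, which combines the three gradient component volumes elementwise into |∇x|² = gx² + gy² + gz². A minimal version over flattened slices, assuming equal-length component arrays (the name `grad_magnitude_squared_1d` is illustrative, not the module's signature):

```rust
/// Elementwise |∇x|² = gx² + gy² + gz² over the three gradient
/// component arrays, assumed flattened to equal-length slices.
fn grad_magnitude_squared_1d(gx: &[f32], gy: &[f32], gz: &[f32]) -> Vec<f32> {
    gx.iter()
        .zip(gy)
        .zip(gz)
        .map(|((&a, &b), &c)| a * a + b * b + c * c)
        .collect()
}

fn main() {
    let m = grad_magnitude_squared_1d(&[1.0, 2.0], &[0.0, 2.0], &[2.0, 1.0]);
    // 1 + 0 + 4 = 5 and 4 + 4 + 1 = 9
    assert_eq!(m, vec![5.0, 9.0]);
    println!("{m:?}");
}
```

This quantity is what isotropic TV penalizes (via its square root), which is why it appears alongside the gradient and divergence operators.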