Gradient operators for QSM
Forward-difference gradient and backward-divergence operators used in total variation (TV) regularization and other algorithms.
Functions
- `bdiv`: Backward divergence operator (negative adjoint of the forward gradient)
- `bdiv_inplace`: Backward divergence operator (in-place)
- `bdiv_inplace_f32`: Backward divergence operator (in-place, f32); uses zero boundary conditions (matching Julia)
- `bdiv_masked_inplace_f32`: Backward divergence operator (in-place, f32), masked version
- `fgrad`: Forward difference gradient operator
- `fgrad_f32`: Forward difference gradient operator (allocating, f32)
- `fgrad_inplace`: Forward difference gradient operator (in-place)
- `fgrad_inplace_f32`: Forward difference gradient operator (in-place, f32)
- `fgrad_masked_inplace_f32`: Forward difference gradient operator (in-place, f32), masked version; only computes the gradient where the mask is non-zero
- `grad_magnitude_squared`: Compute the squared gradient magnitude: |∇x|² = gx² + gy² + gz²
- `symdiv_inplace_f32`: Divergence of a symmetric tensor field (adjoint of `symgrad`)
- `symgrad_inplace_f32`: Symmetric gradient operator for TGV regularization (in-place, f32)
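The key invariant tying these operators together is that backward divergence is the negative adjoint of the forward gradient: ⟨fgrad(x), g⟩ = −⟨x, bdiv(g)⟩. A minimal 1D sketch of that pairing with zero (Dirichlet) boundary conditions; the free functions `fgrad_1d`, `bdiv_1d`, and `dot` are illustrative stand-ins, not this crate's API, which operates on 3D volumes:

```rust
/// Forward-difference gradient with zero boundary: assumes x[n] = 0,
/// so g[i] = x[i+1] - x[i] for i < n-1, and g[n-1] = -x[n-1].
/// (Hypothetical 1D sketch, not this crate's `fgrad`.)
fn fgrad_1d(x: &[f32]) -> Vec<f32> {
    let n = x.len();
    (0..n)
        .map(|i| if i + 1 < n { x[i + 1] - x[i] } else { -x[i] })
        .collect()
}

/// Backward divergence, the negative adjoint of `fgrad_1d`:
/// d[i] = g[i] - g[i-1], with g[-1] = 0 at the boundary.
fn bdiv_1d(g: &[f32]) -> Vec<f32> {
    (0..g.len())
        .map(|i| if i == 0 { g[0] } else { g[i] - g[i - 1] })
        .collect()
}

/// Inner product of two equal-length slices.
fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(u, v)| u * v).sum()
}

fn main() {
    let x = [1.0f32, 2.0, 4.0, 7.0];
    let g = [0.5f32, -1.0, 2.0, 3.0];
    // Negative-adjoint identity: <fgrad(x), g> == -<x, bdiv(g)>
    let lhs = dot(&fgrad_1d(&x), &g);
    let rhs = -dot(&x, &bdiv_1d(&g));
    assert!((lhs - rhs).abs() < 1e-5);
    println!("lhs = {lhs}, rhs = {rhs}");
}
```

Extending this to 3D amounts to applying the same stencil independently along each axis (gx, gy, gz), which is also why `grad_magnitude_squared` reduces to gx² + gy² + gz² per voxel.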