
Filters API

Pictologics provides IBSI 2-compliant convolutional filters for image response map generation.

Overview

All filters can be used via the RadiomicsPipeline filter step or called directly:

# Pipeline usage
{"step": "filter", "params": {"type": "log", "sigma_mm": 1.5}}

# Direct usage
from pictologics.filters import laplacian_of_gaussian, BoundaryCondition
response = laplacian_of_gaussian(image.array, sigma_mm=1.5, spacing_mm=image.spacing)

Available Filters

Filter      Function                                      Use Case
Mean        mean_filter                                   Local averaging
LoG         laplacian_of_gaussian                         Edge/blob detection
Laws        laws_filter                                   Texture energy
Gabor       gabor_filter                                  Directional patterns
Wavelet     wavelet_transform                             Multi-resolution analysis
Simoncelli  simoncelli_wavelet                            Non-separable wavelet
Riesz       riesz_transform, riesz_log, riesz_simoncelli  Rotation-equivariant transforms

Boundary Conditions

pictologics.filters.BoundaryCondition

Bases: Enum

IBSI 2 boundary conditions for image padding (GBYQ).

Maps to scipy.ndimage mode parameter values.

Source code in pictologics/filters/base.py
class BoundaryCondition(Enum):
    """
    IBSI 2 boundary conditions for image padding (GBYQ).

    Maps to scipy.ndimage mode parameter values.
    """

    ZERO = "constant"  # Zero padding (Z3VE)
    NEAREST = "nearest"  # Nearest value padding (SIJG)
    PERIODIC = "wrap"  # Periodic/wrap padding (Z7YO)
    MIRROR = "reflect"  # Mirror/symmetric padding (ZDTV)
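The enum values above are passed straight through as scipy.ndimage's `mode` argument, and string inputs to the filter functions are resolved by enum name. A minimal sketch (the enum body is copied from the listing above; the lookup line mirrors how the filter functions handle string boundaries):

```python
from enum import Enum

# Mirror of the BoundaryCondition enum above; .value is exactly the string
# handed to scipy.ndimage's `mode` parameter.
class BoundaryCondition(Enum):
    ZERO = "constant"      # Zero padding (Z3VE)
    NEAREST = "nearest"    # Nearest value padding (SIJG)
    PERIODIC = "wrap"      # Periodic/wrap padding (Z7YO)
    MIRROR = "reflect"     # Mirror/symmetric padding (ZDTV)

# String inputs such as "zero" are resolved by name, as the filter functions do:
boundary = BoundaryCondition["zero".upper()]
print(boundary.value)  # -> "constant"
```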

pictologics.filters.FilterResult dataclass

Container for filter response maps and metadata.

Source code in pictologics/filters/base.py
@dataclass
class FilterResult:
    """Container for filter response maps and metadata."""

    response_map: npt.NDArray[np.floating[Any]]
    filter_name: str
    filter_params: Dict[str, Any]

    @property
    def shape(self) -> tuple[int, ...]:
        """Shape of the response map."""
        return self.response_map.shape  # type: ignore[no-any-return]

    @property
    def dtype(self) -> np.dtype[Any]:
        """Data type of the response map."""
        return self.response_map.dtype  # type: ignore[no-any-return]

dtype property

Data type of the response map.

shape property

Shape of the response map.
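A short sketch of how a response map and its provenance travel together in a FilterResult. The dataclass body is mirrored from the listing above so the snippet is self-contained; it is not a substitute for importing the real class from pictologics.filters:

```python
from dataclasses import dataclass
from typing import Any, Dict

import numpy as np

# Minimal mirror of the FilterResult dataclass above.
@dataclass
class FilterResult:
    response_map: np.ndarray
    filter_name: str
    filter_params: Dict[str, Any]

    @property
    def shape(self) -> tuple:
        return self.response_map.shape

result = FilterResult(
    response_map=np.zeros((4, 4, 4), dtype=np.float32),
    filter_name="log",
    filter_params={"sigma_mm": 1.5},
)
print(result.shape)  # -> (4, 4, 4)
```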

pictologics.filters.LAWS_KERNELS = _LAWS_KERNELS module-attribute

Dictionary of normalized Laws kernels (IBSI 2 Table 6).
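The classic Laws 1D taps are well known (L5 = level, E5 = edge, S5 = spot); IBSI 2 Table 6 stores normalized versions. The sketch below normalizes to unit Euclidean norm, which is one plausible convention — the library's exact normalization should be checked against LAWS_KERNELS itself:

```python
import numpy as np

# Classic (unnormalized) Laws 1D taps.
RAW_TAPS = {
    "L5": np.array([1.0, 4.0, 6.0, 4.0, 1.0]),   # level
    "E5": np.array([-1.0, -2.0, 0.0, 2.0, 1.0]),  # edge
    "S5": np.array([-1.0, 0.0, 2.0, 0.0, -1.0]),  # spot
}

# Unit L2 norm, shown as an illustrative normalization only.
kernels = {name: taps / np.linalg.norm(taps) for name, taps in RAW_TAPS.items()}
print(np.linalg.norm(kernels["L5"]))  # -> 1.0
```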

Filter Functions

pictologics.filters.mean_filter(image, support=15, boundary=BoundaryCondition.ZERO, source_mask=None)

mean_filter(
    image: npt.NDArray[np.floating[Any]],
    support: int = ...,
    boundary: Union[BoundaryCondition, str] = ...,
    source_mask: None = ...,
) -> npt.NDArray[np.floating[Any]]
mean_filter(
    image: npt.NDArray[np.floating[Any]],
    support: int = ...,
    boundary: Union[BoundaryCondition, str] = ...,
    source_mask: npt.NDArray[np.bool_] = ...,
) -> tuple[
    npt.NDArray[np.floating[Any]], npt.NDArray[np.bool_]
]

Apply 3D mean filter (IBSI code: S60F).

The mean filter computes the average intensity over an M×M×M spatial support, per IBSI 2 Eq. 2.

Parameters:

image (NDArray[floating[Any]], required): 3D input image array
support (int, default 15): Filter support M in voxels (must be odd, YNOF)
boundary (Union[BoundaryCondition, str], default ZERO): Boundary condition for padding (GBYQ)
source_mask (Optional[NDArray[bool_]], default None): Boolean mask where True = valid voxel. When provided, uses normalized convolution to exclude invalid (sentinel) voxels from the mean computation.

Returns:

Union[NDArray[floating[Any]], tuple[NDArray[floating[Any]], NDArray[bool_]]]: If source_mask is None, a response map with the same dimensions as the input; if source_mask is provided, a tuple of (response_map, output_valid_mask).

Raises:

ValueError: If support is not an odd positive integer

Example

Apply Mean filter with 15-voxel support:

import numpy as np
from pictologics.filters import mean_filter

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply filter (original API)
response = mean_filter(image, support=15, boundary="zero")

# With source_mask for sentinel exclusion
mask = image > -1000  # Valid voxels
response, valid_mask = mean_filter(image, support=15, source_mask=mask)
Note

Support M is defined in voxel units as per IBSI specification.
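The normalized-convolution behavior behind source_mask can be sketched with plain scipy calls (the library's _normalized_uniform_filter is private; this mirrors the concept, not its exact implementation): filter image×mask and mask separately, then divide, so invalid voxels never dilute the local mean.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Constant image so the expected output is obvious; some voxels marked invalid.
image = np.ones((16, 16, 16), dtype=np.float32)
mask = np.ones_like(image, dtype=bool)
mask[:4] = False  # pretend these are sentinel voxels

# Normalized convolution: masked sum divided by local count of valid voxels.
num = uniform_filter(np.where(mask, image, 0.0), size=5, mode="constant")
den = uniform_filter(mask.astype(np.float32), size=5, mode="constant")
response = np.divide(num, den, out=np.zeros_like(num), where=den > 0)

# Valid voxels recover the constant value exactly, even near the mask edge.
print(np.allclose(response[mask], 1.0))  # -> True
```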

Source code in pictologics/filters/mean.py
def mean_filter(
    image: npt.NDArray[np.floating[Any]],
    support: int = 15,
    boundary: Union[BoundaryCondition, str] = BoundaryCondition.ZERO,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> Union[
    npt.NDArray[np.floating[Any]],
    tuple[npt.NDArray[np.floating[Any]], npt.NDArray[np.bool_]],
]:
    """
    Apply 3D mean filter (IBSI code: S60F).

    The mean filter computes the average intensity over an M×M×M
    spatial support. Per IBSI 2 Eq. 2.

    Args:
        image: 3D input image array
        support: Filter support M in voxels (must be odd, YNOF)
        boundary: Boundary condition for padding (GBYQ)
        source_mask: Optional boolean mask where True = valid voxel.
            When provided, uses normalized convolution to exclude invalid
            (sentinel) voxels from mean computation.

    Returns:
        If source_mask is None: Response map with same dimensions as input
        If source_mask provided: Tuple of (response_map, output_valid_mask)

    Raises:
        ValueError: If support is not an odd positive integer

    Example:
        Apply Mean filter with 15-voxel support:

        ```python
        import numpy as np
        from pictologics.filters import mean_filter

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply filter (original API)
        response = mean_filter(image, support=15, boundary="zero")

        # With source_mask for sentinel exclusion
        mask = image > -1000  # Valid voxels
        response, valid_mask = mean_filter(image, support=15, source_mask=mask)
        ```

    Note:
        Support M is defined in voxel units as per IBSI specification.
    """
    # Validate support
    if support < 1 or support % 2 == 0:
        raise ValueError(f"Support must be an odd positive integer, got {support}")

    # Convert to float32 as required by IBSI
    image = ensure_float32(image)

    # Handle string boundary condition
    if isinstance(boundary, str):
        boundary = BoundaryCondition[boundary.upper()]

    mode = get_scipy_mode(boundary)

    if source_mask is not None:
        # Use normalized convolution for source masking
        return _normalized_uniform_filter(image, source_mask, size=support, mode=mode)
    else:
        # Original behavior - return just the result (backward compatible)
        return uniform_filter(image, size=support, mode=mode)  # type: ignore[no-any-return]

pictologics.filters.laplacian_of_gaussian(image, sigma_mm, spacing_mm=1.0, truncate=4.0, boundary=BoundaryCondition.ZERO, source_mask=None)

laplacian_of_gaussian(
    image: npt.NDArray[np.floating[Any]],
    sigma_mm: float,
    spacing_mm: Union[
        float, Tuple[float, float, float]
    ] = ...,
    truncate: float = ...,
    boundary: Union[BoundaryCondition, str] = ...,
    source_mask: None = ...,
) -> npt.NDArray[np.floating[Any]]
laplacian_of_gaussian(
    image: npt.NDArray[np.floating[Any]],
    sigma_mm: float,
    spacing_mm: Union[
        float, Tuple[float, float, float]
    ] = ...,
    truncate: float = ...,
    boundary: Union[BoundaryCondition, str] = ...,
    source_mask: npt.NDArray[np.bool_] = ...,
) -> tuple[
    npt.NDArray[np.floating[Any]], npt.NDArray[np.bool_]
]

Apply 3D Laplacian of Gaussian filter (IBSI code: L6PA).

The LoG is a band-pass, spherically symmetric operator, per IBSI 2 Eq. 3.

Parameters:

image (NDArray[floating[Any]], required): 3D input image array
sigma_mm (float, required): Standard deviation in mm (σ*, 41LN)
spacing_mm (Union[float, Tuple[float, float, float]], default 1.0): Voxel spacing in mm (scalar for isotropic, or tuple)
truncate (float, default 4.0): Filter size cutoff in σ units (WGPM)
boundary (Union[BoundaryCondition, str], default ZERO): Boundary condition for padding (GBYQ)
source_mask (Optional[NDArray[bool_]], default None): Boolean mask where True = valid voxel. When provided, uses normalized convolution to exclude invalid (sentinel) voxels from the computation.

Returns:

Union[NDArray[floating[Any]], tuple[NDArray[floating[Any]], NDArray[bool_]]]: If source_mask is None, a response map with the same dimensions as the input; if source_mask is provided, a tuple of (response_map, output_valid_mask).

Example

Apply LoG filter with 5.0mm sigma on an image with 2.0mm spacing:

import numpy as np
from pictologics.filters import laplacian_of_gaussian

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply filter (original API)
response = laplacian_of_gaussian(
    image,
    sigma_mm=5.0,
    spacing_mm=(2.0, 2.0, 2.0),
    truncate=4.0
)

# With source_mask for sentinel exclusion
mask = image > -1000
response, valid_mask = laplacian_of_gaussian(
    image, sigma_mm=5.0, spacing_mm=2.0, source_mask=mask
)
Note
  • σ is converted from mm to voxels: σ_voxels = σ_mm / spacing_mm
  • Filter size: M = 1 + 2⌊d×σ + 0.5⌋ where d=truncate
  • The kernel should sum to approximately 0 (zero-mean)
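The two conversions in the note above, worked through for the example parameters used in this section (5.0 mm sigma on 2.0 mm isotropic voxels, default truncate d = 4.0):

```python
import math

sigma_mm, spacing_mm, truncate = 5.0, 2.0, 4.0

# sigma in voxel units: sigma_voxels = sigma_mm / spacing_mm
sigma_voxels = sigma_mm / spacing_mm            # 2.5 voxels

# Filter size: M = 1 + 2 * floor(d * sigma + 0.5)
M = 1 + 2 * math.floor(truncate * sigma_voxels + 0.5)
print(sigma_voxels, M)  # -> 2.5 21
```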
Source code in pictologics/filters/log.py
def laplacian_of_gaussian(
    image: npt.NDArray[np.floating[Any]],
    sigma_mm: float,
    spacing_mm: Union[float, Tuple[float, float, float]] = 1.0,
    truncate: float = 4.0,
    boundary: Union[BoundaryCondition, str] = BoundaryCondition.ZERO,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> Union[
    npt.NDArray[np.floating[Any]],
    tuple[npt.NDArray[np.floating[Any]], npt.NDArray[np.bool_]],
]:
    """
    Apply 3D Laplacian of Gaussian filter (IBSI code: L6PA).

    The LoG is a band-pass, spherically symmetric operator. Per IBSI 2 Eq. 3.

    Args:
        image: 3D input image array
        sigma_mm: Standard deviation in mm (σ*, 41LN)
        spacing_mm: Voxel spacing in mm (scalar for isotropic, or tuple)
        truncate: Filter size cutoff in σ units (default 4.0, WGPM)
        boundary: Boundary condition for padding (GBYQ)
        source_mask: Optional boolean mask where True = valid voxel.
            When provided, uses normalized convolution to exclude invalid
            (sentinel) voxels from computation.

    Returns:
        If source_mask is None: Response map with same dimensions as input
        If source_mask provided: Tuple of (response_map, output_valid_mask)

    Example:
        Apply LoG filter with 5.0mm sigma on an image with 2.0mm spacing:

        ```python
        import numpy as np
        from pictologics.filters import laplacian_of_gaussian

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply filter (original API)
        response = laplacian_of_gaussian(
            image,
            sigma_mm=5.0,
            spacing_mm=(2.0, 2.0, 2.0),
            truncate=4.0
        )

        # With source_mask for sentinel exclusion
        mask = image > -1000
        response, valid_mask = laplacian_of_gaussian(
            image, sigma_mm=5.0, spacing_mm=2.0, source_mask=mask
        )
        ```

    Note:
        - σ is converted from mm to voxels: σ_voxels = σ_mm / spacing_mm
        - Filter size: M = 1 + 2⌊d×σ + 0.5⌋ where d=truncate
        - The kernel should sum to approximately 0 (zero-mean)
    """
    # Convert to float32 as required by IBSI
    image = ensure_float32(image)

    # Handle scalar spacing
    if isinstance(spacing_mm, (int, float)):
        spacing_mm = (float(spacing_mm),) * 3

    # Convert sigma from mm to voxels for each axis
    sigma_voxels = tuple(sigma_mm / s for s in spacing_mm)

    # Handle string boundary condition
    if isinstance(boundary, str):
        boundary = BoundaryCondition[boundary.upper()]

    mode = get_scipy_mode(boundary)

    if source_mask is not None:
        # Use normalized convolution for source masking
        return _normalized_gaussian_laplace(
            image, source_mask, sigma=sigma_voxels, mode=mode, truncate=truncate
        )
    else:
        # Original behavior - return just the result (backward compatible)
        return gaussian_laplace(image, sigma=sigma_voxels, mode=mode, truncate=truncate)  # type: ignore[no-any-return]

pictologics.filters.laws_filter(image, kernels, boundary=BoundaryCondition.ZERO, rotation_invariant=False, pooling='max', compute_energy=False, energy_distance=7, use_parallel=None, source_mask=None)

laws_filter(
    image: npt.NDArray[np.floating[Any]],
    kernels: str,
    boundary: Union[BoundaryCondition, str] = ...,
    rotation_invariant: bool = ...,
    pooling: str = ...,
    compute_energy: bool = ...,
    energy_distance: int = ...,
    use_parallel: Union[bool, None] = ...,
    source_mask: None = ...,
) -> npt.NDArray[np.floating[Any]]
laws_filter(
    image: npt.NDArray[np.floating[Any]],
    kernels: str,
    boundary: Union[BoundaryCondition, str] = ...,
    rotation_invariant: bool = ...,
    pooling: str = ...,
    compute_energy: bool = ...,
    energy_distance: int = ...,
    use_parallel: Union[bool, None] = ...,
    source_mask: npt.NDArray[np.bool_] = ...,
) -> Tuple[
    npt.NDArray[np.floating[Any]], npt.NDArray[np.bool_]
]

Apply 3D Laws kernel filter (IBSI code: JTXT).

Laws kernels detect texture patterns via separable 1D filters combined into 2D/3D filters via outer products.

Parameters:

image (NDArray[floating[Any]], required): 3D input image array
kernels (str, required): Kernel specification as string, e.g., "E5L5S5" for 3D
boundary (Union[BoundaryCondition, str], default ZERO): Boundary condition for padding (GBYQ)
rotation_invariant (bool, default False): If True, apply pseudo-rotational invariance (O1AQ) using max pooling over 24 right-angle rotations
pooling (str, default "max"): Pooling method for rotation invariance ("max", "average", "min")
compute_energy (bool, default False): If True, compute texture energy image (PQSD)
energy_distance (int, default 7): Chebyshev distance δ for energy computation (I176)
use_parallel (Union[bool, None], default None): If True, use parallel processing for rotation_invariant mode. If None, auto-enables for images > ~128³ voxels. Only affects rotation_invariant mode.
source_mask (Optional[NDArray[bool_]], default None): Boolean mask where True = valid voxel. When provided, uses normalized separable convolution to exclude invalid (sentinel) voxels from the computation. Only supported in non-rotation-invariant mode.

Returns:

Union[NDArray[floating[Any]], Tuple[NDArray[floating[Any]], NDArray[bool_]]]: If source_mask is None, the response map (or energy image if compute_energy=True); if source_mask is provided, a tuple of (response_map, output_valid_mask).

Example

Apply Laws E5L5S5 kernel with rotation invariance and texture energy:

import numpy as np
from pictologics.filters import laws_filter

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply filter
response = laws_filter(
    image,
    "E5L5S5",
    rotation_invariant=True,
    pooling="max",
    compute_energy=True,
    energy_distance=7
)
Note
  • Kernels are normalized (deviating from Laws' original unnormalized kernels)
  • Energy is computed as: mean(|h|) over δ neighborhood
  • For rotation invariance, energy is computed after pooling
  • Uses separable 1D convolutions for ~8x speedup over full 3D
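The separability claim above can be checked numerically: convolving with the full 3D outer-product kernel equals three 1D passes, one per axis. The sketch uses the classic E5/L5 Laws taps (unnormalized here, for illustration):

```python
import numpy as np
from scipy.ndimage import convolve, convolve1d

l5 = np.array([1.0, 4.0, 6.0, 4.0, 1.0])    # level taps
e5 = np.array([-1.0, -2.0, 0.0, 2.0, 1.0])  # edge taps

rng = np.random.default_rng(0)
image = rng.random((12, 12, 12))

# Full 3D convolution with the outer-product kernel E5 x L5 x L5 ...
kernel_3d = np.einsum("i,j,k->ijk", e5, l5, l5)
full = convolve(image, kernel_3d, mode="constant")

# ... equals three separable 1D convolutions (the ~8x-speedup path).
sep = convolve1d(image, e5, axis=0, mode="constant")
sep = convolve1d(sep, l5, axis=1, mode="constant")
sep = convolve1d(sep, l5, axis=2, mode="constant")

print(np.allclose(full, sep))  # -> True
```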
Source code in pictologics/filters/laws.py
def laws_filter(
    image: npt.NDArray[np.floating[Any]],
    kernels: str,
    boundary: Union[BoundaryCondition, str] = BoundaryCondition.ZERO,
    rotation_invariant: bool = False,
    pooling: str = "max",
    compute_energy: bool = False,
    energy_distance: int = 7,
    use_parallel: Union[bool, None] = None,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> Union[
    npt.NDArray[np.floating[Any]],
    Tuple[npt.NDArray[np.floating[Any]], npt.NDArray[np.bool_]],
]:
    """
    Apply 3D Laws kernel filter (IBSI code: JTXT).

    Laws kernels detect texture patterns via separable 1D filters combined
    into 2D/3D filters via outer products.

    Args:
        image: 3D input image array
        kernels: Kernel specification as string, e.g., "E5L5S5" for 3D
        boundary: Boundary condition for padding (GBYQ)
        rotation_invariant: If True, apply pseudo-rotational invariance (O1AQ)
                            using max pooling over 24 right-angle rotations
        pooling: Pooling method for rotation invariance ("max", "average", "min")
        compute_energy: If True, compute texture energy image (PQSD)
        energy_distance: Chebyshev distance δ for energy computation (I176)
        use_parallel: If True, use parallel processing for rotation_invariant mode.
            If None (default), auto-enables for images > ~128³ voxels.
            Only affects rotation_invariant mode.
        source_mask: Optional boolean mask where True = valid voxel.
            When provided, uses normalized separable convolution to exclude
            invalid (sentinel) voxels from computation. Only supported in
            non-rotation-invariant mode.

    Returns:
        If source_mask is None: Response map (or energy image if compute_energy=True)
        If source_mask provided: Tuple of (response_map, output_valid_mask)

    Example:
        Apply Laws E5L5S5 kernel with rotation invariance and texture energy:

        ```python
        import numpy as np
        from pictologics.filters import laws_filter

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply filter
        response = laws_filter(
            image,
            "E5L5S5",
            rotation_invariant=True,
            pooling="max",
            compute_energy=True,
            energy_distance=7
        )
        ```

    Note:
        - Kernels are normalized (deviate from Laws' original unnormalized)
        - Energy is computed as: mean(|h|) over δ neighborhood
        - For rotation invariance, energy is computed after pooling
        - Uses separable 1D convolutions for ~8x speedup over full 3D
    """

    # Convert to float32
    image = ensure_float32(image)

    # Parse kernel names (e.g., "E5L5S5" -> ["E5", "L5", "S5"])
    kernel_names = _parse_kernel_string(kernels)
    if len(kernel_names) != 3:
        raise ValueError(
            f"Expected 3 kernel names for 3D, got {len(kernel_names)}: {kernel_names}"
        )

    # Handle boundary condition
    if isinstance(boundary, str):
        boundary = BoundaryCondition[boundary.upper()]
    mode = get_scipy_mode(boundary)

    # Validate pooling method if used
    if rotation_invariant and pooling not in ("max", "average", "min"):
        raise ValueError(f"Unknown pooling method: {pooling}")

    # Get 1D kernels for separable convolution
    g1 = LAWS_KERNELS[kernel_names[0]].astype(np.float32)
    g2 = LAWS_KERNELS[kernel_names[1]].astype(np.float32)
    g3 = LAWS_KERNELS[kernel_names[2]].astype(np.float32)

    # Auto-detect parallel mode based on image size
    if use_parallel is None:
        use_parallel = image.size > _PARALLEL_THRESHOLD

    if rotation_invariant:
        rotations = _get_rotation_permutations_3d()

        def apply_rotated_convolution(
            rotation: Tuple[Tuple[int, int, int], Tuple[bool, bool, bool]],
        ) -> npt.NDArray[np.floating[Any]]:
            """Apply separable convolution with rotated kernels."""
            perm, flips = rotation
            # Permute kernel order to match rotation
            rotated_kernels = [g1, g2, g3]
            rotated_kernels = [rotated_kernels[p] for p in perm]
            # Flip kernels as needed
            for i, do_flip in enumerate(flips):
                if do_flip:
                    rotated_kernels[i] = rotated_kernels[i][::-1].copy()
            # Apply separable convolution
            return _separable_convolve_3d(
                image, rotated_kernels[0], rotated_kernels[1], rotated_kernels[2], mode
            )

        if use_parallel:
            # Parallel processing for large images
            # Use as_completed to process results as they finish, avoiding
            # holding all 24 response maps in memory at once.
            from concurrent.futures import ThreadPoolExecutor, as_completed

            with ThreadPoolExecutor() as executor:
                # Submit all rotation tasks
                future_to_rot = {
                    executor.submit(apply_rotated_convolution, rot): rot
                    for rot in rotations
                }

                # Pool responses incrementally
                result: npt.NDArray[np.floating[Any]] | None = None
                for _, future in enumerate(as_completed(future_to_rot)):
                    response = future.result()

                    if result is None:
                        # Initialize accumulator with first result
                        result = (
                            response.astype(np.float64)
                            if pooling == "average"
                            else response.copy()
                        )
                    else:
                        if result is None:  # pragma: no cover
                            raise RuntimeError("Result should not be None")

                        res = result

                        if pooling == "max":
                            np.maximum(res, response, out=res)
                        elif pooling == "average":
                            res += response
                        elif pooling == "min":
                            np.minimum(res, response, out=res)
                        else:
                            raise ValueError(
                                f"Unknown pooling method: {pooling}"
                            )  # pragma: no cover

                    # Explicitly delete response to free memory
                    del response
        else:
            # Sequential processing for small images (avoid thread overhead)
            result = None
            for i, rotation in enumerate(rotations):
                response = apply_rotated_convolution(rotation)

                if i == 0:
                    result = (
                        response.astype(np.float64)
                        if pooling == "average"
                        else response.copy()
                    )
                else:
                    if result is None:  # pragma: no cover
                        raise RuntimeError("Result should not be None")

                    res = result

                    if pooling == "max":
                        np.maximum(res, response, out=res)
                    elif pooling == "average":
                        res += response
                    elif pooling == "min":
                        np.minimum(res, response, out=res)
                    else:
                        raise ValueError(
                            f"Unknown pooling method: {pooling}"
                        )  # pragma: no cover

        # Finalize average pooling
        if pooling == "average" and result is not None:
            result /= len(rotations)
        valid_mask = None
    else:
        # Non-rotation-invariant: single separable convolution
        if source_mask is not None:
            result, valid_mask = _normalized_separable_convolve_3d(
                image, source_mask, g1, g2, g3, mode
            )
        else:
            result = _separable_convolve_3d(image, g1, g2, g3, mode)
            valid_mask = None

    # Compute energy image if requested
    if compute_energy:
        if result is None:  # pragma: no cover
            raise RuntimeError("Result should not be None")

        # Energy = mean of absolute values over δ neighborhood
        # This is equivalent to uniform_filter on |result|
        abs_result = np.abs(result)
        energy_support = 2 * energy_distance + 1
        result = uniform_filter(abs_result, size=energy_support, mode=mode)

    if result is None:  # pragma: no cover
        raise RuntimeError("Result should not be None")

    if source_mask is not None and valid_mask is not None:
        return result, valid_mask
    return result  # type: ignore[no-any-return]

pictologics.filters.gabor_filter(image, sigma_mm, lambda_mm, gamma=1.0, theta=0.0, spacing_mm=1.0, boundary=BoundaryCondition.ZERO, rotation_invariant=False, delta_theta=None, pooling='average', average_over_planes=False, use_parallel=None, source_mask=None)

Apply 2D Gabor filter to 3D image (IBSI code: Q88H).

The Gabor filter is applied in the axial plane (k1, k2) and optionally averaged over orthogonal planes, per IBSI 2 Eq. 9.

Parameters:

image (NDArray[floating[Any]], required): 3D input image array
sigma_mm (float, required): Standard deviation of the Gaussian envelope in mm (41LN)
lambda_mm (float, required): Wavelength in mm (S4N6)
gamma (float, default 1.0): Spatial aspect ratio (GDR5), typically 0.5 to 2.0
theta (float, default 0.0): Orientation angle in radians (FQER), clockwise in (k1, k2)
spacing_mm (Union[float, Tuple[float, float, float]], default 1.0): Voxel spacing in mm (scalar or tuple)
boundary (Union[BoundaryCondition, str], default ZERO): Boundary condition for padding (GBYQ)
rotation_invariant (bool, default False): If True, average over orientations
delta_theta (Optional[float], default None): Orientation step for rotation invariance (XTGK)
pooling (str, default "average"): Pooling method ("average", "max", "min")
average_over_planes (bool, default False): If True, average 2D responses over 3 orthogonal planes
use_parallel (Union[bool, None], default None): If True, process slices in parallel. If None, auto-enables for images > ~80³ voxels.
source_mask (Optional[NDArray[bool_]], default None): Boolean mask where True = valid voxel. When provided, zeros out invalid (sentinel) voxels before FFT-based convolution to prevent contamination.

Returns:

NDArray[floating[Any]]: Response map (modulus of the complex response)

Example

Apply Gabor filter with rotation invariance over orthogonal planes:

import numpy as np
from pictologics.filters import gabor_filter

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply filter
response = gabor_filter(
    image,
    sigma_mm=10.0,
    lambda_mm=4.0,
    gamma=0.5,
    rotation_invariant=True,
    delta_theta=np.pi/4,
    average_over_planes=True
)
Note
  • Returns modulus |h| = |g ⊗ f| for feature extraction
  • 2D filter applied slice-by-slice, then optionally over planes
  • Uses single complex FFT convolution for ~2x speedup
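A minimal complex Gabor kernel in the (k1, k2) plane, shown up to normalization (IBSI 2 Eq. 9 also carries a Gaussian prefactor, omitted here); the response map is the modulus of the convolution with such a kernel. The `gabor_kernel` helper and its `radius` parameter are illustrative, not part of the pictologics API:

```python
import numpy as np

def gabor_kernel(sigma, lam, gamma=0.5, theta=0.0, radius=8):
    """Complex Gabor kernel: Gaussian envelope times complex sinusoid."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1].astype(float)
    # Rotate coordinates clockwise by theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr) ** 2) / (2.0 * sigma**2))
    carrier = np.exp(1j * 2.0 * np.pi * xr / lam)
    return envelope * carrier

k = gabor_kernel(sigma=5.0, lam=2.0)
# At the center the envelope is 1 and the carrier is e^{i0}, so |k| = 1.
print(abs(k[8, 8]))  # -> 1.0
```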
Source code in pictologics/filters/gabor.py
def gabor_filter(
    image: npt.NDArray[np.floating[Any]],
    sigma_mm: float,
    lambda_mm: float,
    gamma: float = 1.0,
    theta: float = 0.0,
    spacing_mm: Union[float, Tuple[float, float, float]] = 1.0,
    boundary: Union[BoundaryCondition, str] = BoundaryCondition.ZERO,
    rotation_invariant: bool = False,
    delta_theta: Optional[float] = None,
    pooling: str = "average",
    average_over_planes: bool = False,
    use_parallel: Union[bool, None] = None,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> npt.NDArray[np.floating[Any]]:
    """
    Apply 2D Gabor filter to 3D image (IBSI code: Q88H).

    The Gabor filter is applied in the axial plane (k1, k2) and optionally
    averaged over orthogonal planes. Per IBSI 2 Eq. 9.

    Args:
        image: 3D input image array
        sigma_mm: Standard deviation of Gaussian envelope in mm (41LN)
        lambda_mm: Wavelength in mm (S4N6)
        gamma: Spatial aspect ratio (GDR5), typically 0.5 to 2.0
        theta: Orientation angle in radians (FQER), clockwise in (k1,k2)
        spacing_mm: Voxel spacing in mm (scalar or tuple)
        boundary: Boundary condition for padding (GBYQ)
        rotation_invariant: If True, average over orientations
        delta_theta: Orientation step for rotation invariance (XTGK)
        pooling: Pooling method ("average", "max", "min")
        average_over_planes: If True, average 2D responses over 3 orthogonal planes
        use_parallel: If True, process slices in parallel. If None (default),
            auto-enables for images > ~80³ voxels.
        source_mask: Optional boolean mask where True = valid voxel.
            When provided, zeros out invalid (sentinel) voxels before
            FFT-based convolution to prevent contamination.

    Returns:
        Response map (modulus of complex response)

    Example:
        Apply Gabor filter with rotation invariance over orthogonal planes:

        ```python
        import numpy as np
        from pictologics.filters import gabor_filter

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply filter
        response = gabor_filter(
            image,
            sigma_mm=10.0,
            lambda_mm=4.0,
            gamma=0.5,
            rotation_invariant=True,
            delta_theta=np.pi/4,
            average_over_planes=True
        )
        ```

    Note:
        - Returns modulus |h| = |g ⊗ f| for feature extraction
        - 2D filter applied slice-by-slice, then optionally over planes
        - Uses single complex FFT convolution for ~2x speedup
    """
    # Convert to float32
    image = ensure_float32(image)

    # Apply source_mask preprocessing (zero out invalid voxels for FFT-based filter)
    if source_mask is not None:
        image = _prepare_masked_image(image, source_mask)

    # Handle spacing
    if isinstance(spacing_mm, (int, float)):
        spacing_mm = (float(spacing_mm),) * 3

    # Convert mm to voxels (use in-plane spacing for 2D filter)
    sigma_voxels = sigma_mm / spacing_mm[0]  # Assume isotropic in-plane
    lambda_voxels = lambda_mm / spacing_mm[0]

    # Handle boundary
    if isinstance(boundary, str):
        boundary = BoundaryCondition[boundary.upper()]
    mode = get_scipy_mode(boundary)

    # Validate pooling parameter early
    valid_poolings = ("max", "average", "min")
    if pooling not in valid_poolings:
        raise ValueError(f"Unknown pooling: {pooling}. Must be one of {valid_poolings}")

    # Auto-detect parallel mode based on image size
    if use_parallel is None:
        use_parallel = image.size > _PARALLEL_THRESHOLD

    if rotation_invariant and delta_theta is not None:
        # Generate orientations from 0 to 2π
        n_orientations = int(np.ceil(2 * np.pi / delta_theta))
        thetas = [i * delta_theta for i in range(n_orientations)]
    else:
        thetas = [theta]

    if average_over_planes:
        # Apply to all 3 orthogonal planes and average with in-place aggregation
        result: npt.NDArray[np.floating[Any]] | None = None
        for plane_axis in range(3):
            plane_response = _apply_gabor_to_plane(
                image,
                sigma_voxels,
                lambda_voxels,
                gamma,
                thetas,
                plane_axis,
                mode,
                pooling,
                use_parallel,
            )
            if result is None:
                result = plane_response.astype(np.float64)
            else:
                result += plane_response

        if result is None:  # pragma: no cover
            raise RuntimeError("Result should not be None after plane loop")

        return (result / 3.0).astype(np.float32)  # type: ignore[union-attr]
    else:
        # Apply only to axial plane (axis 2 = k3 slices)
        return _apply_gabor_to_plane(
            image,
            sigma_voxels,
            lambda_voxels,
            gamma,
            thetas,
            plane_axis=2,
            mode=mode,
            pooling=pooling,
            use_parallel=use_parallel,
        )

pictologics.filters.wavelet_transform(image, wavelet='db2', level=1, decomposition='LHL', boundary=BoundaryCondition.ZERO, rotation_invariant=False, pooling='average', use_parallel=None, source_mask=None)

Apply 3D separable wavelet transform (undecimated/stationary).

Uses the à trous algorithm for undecimated wavelet decomposition. The transform is translation-invariant (unlike the decimated transform).

Supported wavelets
  • "haar" (UOUE): Haar wavelet
  • "db2", "db3": Daubechies wavelets
  • "coif1": Coiflet wavelet

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `image` | `NDArray[floating[Any]]` | 3D input image array | *required* |
| `wavelet` | `str` | Wavelet name (e.g., "db2", "coif1", "haar") | `'db2'` |
| `level` | `int` | Decomposition level (GCEK) | `1` |
| `decomposition` | `str` | Which response map to return, e.g., "LHL", "HHH" | `'LHL'` |
| `boundary` | `Union[BoundaryCondition, str]` | Boundary condition for padding | `ZERO` |
| `rotation_invariant` | `bool` | If True, average over 24 rotations | `False` |
| `pooling` | `str` | Pooling method for rotation invariance | `'average'` |
| `use_parallel` | `Union[bool, None]` | If True, use parallel processing for rotation_invariant mode. If None (default), auto-enables for images > ~128³ voxels. | `None` |
| `source_mask` | `Optional[NDArray[bool_]]` | Optional boolean mask where True = valid voxel. When provided, zeros out invalid (sentinel) voxels before wavelet decomposition to prevent contamination. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `NDArray[floating[Any]]` | Response map for the specified decomposition |

Example

Apply Daubechies 2 wavelet transform at level 1, returning LHL coefficients:

```python
import numpy as np
from pictologics.filters import wavelet_transform

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply transform
response = wavelet_transform(
    image,
    wavelet="db2",
    level=1,
    decomposition="LHL"
)
```
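The `rotation_invariant=True` path pools responses over the 24 right-handed rotations of a 3D grid, each expressible as an axis permutation plus axis flips. A self-contained sketch of that enumeration (an illustration of the technique, not the library's internal `_get_rotation_perms`):

```python
import itertools
import numpy as np

def right_handed_rotations():
    """Enumerate the 24 proper rotations of a 3D grid as (permutation, flips)."""
    rotations = []
    for perm in itertools.permutations(range(3)):
        for flips in itertools.product([False, True], repeat=3):
            # Build the signed permutation matrix and keep det = +1 only
            # (det = -1 would be a reflection, not a rotation)
            mat = np.zeros((3, 3))
            for axis, src in enumerate(perm):
                mat[axis, src] = -1.0 if flips[axis] else 1.0
            if np.isclose(np.linalg.det(mat), 1.0):
                rotations.append((perm, flips))
    return rotations

rotations = right_handed_rotations()  # 24 of the 48 signed permutations
```

Each `(perm, flips)` pair maps directly onto `np.transpose` and `np.flip` calls, which is how the source below rotates the image before filtering and undoes the rotation afterwards.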
Source code in pictologics/filters/wavelets.py
def wavelet_transform(
    image: npt.NDArray[np.floating[Any]],
    wavelet: str = "db2",
    level: int = 1,
    decomposition: str = "LHL",
    boundary: Union[BoundaryCondition, str] = BoundaryCondition.ZERO,
    rotation_invariant: bool = False,
    pooling: str = "average",
    use_parallel: Union[bool, None] = None,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> npt.NDArray[np.floating[Any]]:
    """
    Apply 3D separable wavelet transform (undecimated/stationary).

    Uses the à trous algorithm for undecimated wavelet decomposition.
    The transform is translation-invariant (unlike the decimated transform).

    Supported wavelets:
        - "haar" (UOUE): Haar wavelet
        - "db2", "db3": Daubechies wavelets
        - "coif1": Coiflet wavelet

    Args:
        image: 3D input image array
        wavelet: Wavelet name (e.g., "db2", "coif1", "haar")
        level: Decomposition level (GCEK)
        decomposition: Which response map to return, e.g., "LHL", "HHH"
        boundary: Boundary condition for padding
        rotation_invariant: If True, average over 24 rotations
        pooling: Pooling method for rotation invariance
        use_parallel: If True, use parallel processing for rotation_invariant mode.
            If None (default), auto-enables for images > ~128³ voxels.
        source_mask: Optional boolean mask where True = valid voxel.
            When provided, zeros out invalid (sentinel) voxels before
            wavelet decomposition to prevent contamination.

    Returns:
        Response map for the specified decomposition

    Example:
        Apply Daubechies 2 wavelet transform at level 1, returning LHL coefficients:

        ```python
        import numpy as np
        from pictologics.filters import wavelet_transform

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply transform
        response = wavelet_transform(
            image,
            wavelet="db2",
            level=1,
            decomposition="LHL"
        )
        ```
    """
    from concurrent.futures import ThreadPoolExecutor

    # Convert to float32
    image = ensure_float32(image)

    # Apply source_mask preprocessing (zero out invalid voxels)
    if source_mask is not None:
        image = _prepare_masked_image(image, source_mask)

    # Handle boundary
    if isinstance(boundary, str):
        boundary = BoundaryCondition[boundary.upper()]
    mode = get_scipy_mode(boundary)

    # Get wavelet filters
    w = pywt.Wavelet(wavelet)
    lo = np.array(w.dec_lo, dtype=np.float32)  # Low-pass decomposition filter
    hi = np.array(w.dec_hi, dtype=np.float32)  # High-pass decomposition filter

    # Auto-detect parallel mode based on image size
    if use_parallel is None:
        use_parallel = image.size > _PARALLEL_THRESHOLD

    if rotation_invariant:
        rotations = _get_rotation_perms()

        def apply_rotated_wavelet(
            rotation: Tuple[Tuple[int, int, int], Tuple[bool, bool, bool]],
        ) -> npt.NDArray[np.floating[Any]]:
            """Apply wavelet transform with rotated image."""
            perm, flips = rotation
            # Permute and flip image
            rotated = np.transpose(image, perm)
            for axis, flip in enumerate(flips):
                if flip:
                    rotated = np.flip(rotated, axis=axis)

            # Apply wavelet
            response = _apply_undecimated_wavelet_3d(
                rotated, lo, hi, level, decomposition, mode
            )

            # Undo rotation for response
            for axis, flip in enumerate(flips):
                if flip:
                    response = np.flip(response, axis=axis)
            inv_perm = tuple(np.argsort(perm))
            return np.transpose(response, inv_perm)

        if use_parallel:
            from concurrent.futures import ThreadPoolExecutor, as_completed

            with ThreadPoolExecutor() as executor:
                # Submit all rotation tasks
                future_to_rot = {
                    executor.submit(apply_rotated_wavelet, rot): rot
                    for rot in rotations
                }

                # Pool responses incrementally
                result: npt.NDArray[np.floating[Any]] | None = None
                for _, future in enumerate(as_completed(future_to_rot)):
                    response = future.result()

                    if result is None:
                        # Initialize accumulator with first result
                        result = (
                            response.astype(np.float64)
                            if pooling == "average"
                            else response.copy()
                        )
                    else:
                        # Local alias; the `result is None` branch above already
                        # narrows the Optional type for mypy
                        res = result

                        if pooling == "max":
                            np.maximum(res, response, out=res)
                        elif pooling == "average":
                            res += response
                        elif pooling == "min":
                            np.minimum(res, response, out=res)
                        else:
                            raise ValueError(
                                f"Unknown pooling: {pooling}"
                            )  # pragma: no cover

                    # Explicitly delete response
                    del response
        else:
            # Sequential processing for small images
            result = None
            for i, rotation in enumerate(rotations):
                response = apply_rotated_wavelet(rotation)

                if i == 0:
                    result = (
                        response.astype(np.float64)
                        if pooling == "average"
                        else response.copy()
                    )
                else:
                    if result is None:  # pragma: no cover
                        raise RuntimeError("Result should not be None")

                    res_seq = result

                    if pooling == "max":
                        np.maximum(res_seq, response, out=res_seq)
                    elif pooling == "average":
                        res_seq += response
                    elif pooling == "min":
                        np.minimum(res_seq, response, out=res_seq)
                    else:
                        raise ValueError(
                            f"Unknown pooling: {pooling}"
                        )  # pragma: no cover

        # Finalize average pooling
        if pooling == "average" and result is not None:
            result /= len(rotations)
        return result.astype(np.float32)  # type: ignore[union-attr]
    else:
        return _apply_undecimated_wavelet_3d(image, lo, hi, level, decomposition, mode)

pictologics.filters.simoncelli_wavelet(image, level=1, boundary=BoundaryCondition.PERIODIC, source_mask=None)

Apply Simoncelli non-separable wavelet (IBSI code: PRT7).

The Simoncelli wavelet is isotropic (spherically symmetric) and implemented in the Fourier domain. Per IBSI 2 Eq. 27.

For decomposition level N, the frequency band is scaled by j = N-1:
  • Level 1 (j=0): band [π/4, π] (highest frequencies)
  • Level 2 (j=1): band [π/8, π/2]
  • Level 3 (j=2): band [π/16, π/4]

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `image` | `NDArray[floating[Any]]` | 3D input image array | *required* |
| `level` | `int` | Decomposition level (1 = highest frequency band) | `1` |
| `boundary` | `Union[BoundaryCondition, str]` | Boundary condition (FFT is inherently periodic) | `PERIODIC` |
| `source_mask` | `Optional[NDArray[bool_]]` | Optional boolean mask where True = valid voxel | `None` |

Returns:

| Type | Description |
| --- | --- |
| `NDArray[floating[Any]]` | Band-pass response map (B map) for the specified level |

Example

Apply first-level Simoncelli wavelet (highest frequency band):

```python
import numpy as np
from pictologics.filters import simoncelli_wavelet

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply wavelet
response = simoncelli_wavelet(image, level=1)
```
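The level-to-band mapping documented above can be checked numerically. This sketch reproduces only the band-edge arithmetic (max_freq = 1/2^(N-1) of Nyquist, band [max_freq/4, max_freq]); the helper name is illustrative:

```python
import numpy as np

def simoncelli_band(level: int) -> tuple[float, float]:
    """Band edges for IBSI level N, in radians (Nyquist = pi)."""
    max_freq = 1.0 / (2 ** (level - 1))  # normalized so Nyquist = 1.0
    return (np.pi * max_freq / 4.0, np.pi * max_freq)

# Level 1 -> [pi/4, pi], level 2 -> [pi/8, pi/2], level 3 -> [pi/16, pi/4]
bands = {n: simoncelli_band(n) for n in (1, 2, 3)}
```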
Source code in pictologics/filters/wavelets.py
def simoncelli_wavelet(
    image: npt.NDArray[np.floating[Any]],
    level: int = 1,
    boundary: Union[BoundaryCondition, str] = BoundaryCondition.PERIODIC,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> npt.NDArray[np.floating[Any]]:
    """
    Apply Simoncelli non-separable wavelet (IBSI code: PRT7).

    The Simoncelli wavelet is isotropic (spherically symmetric) and
    implemented in the Fourier domain. Per IBSI 2 Eq. 27.

    For decomposition level N, the frequency band is scaled by j = N-1:
        - Level 1 (j=0): band [π/4, π] (highest frequencies)
        - Level 2 (j=1): band [π/8, π/2]
        - Level 3 (j=2): band [π/16, π/4]

    Args:
        image: 3D input image array
        level: Decomposition level (1 = highest frequency band)
        boundary: Boundary condition (FFT is inherently periodic)
        source_mask: Optional boolean mask where True = valid voxel

    Returns:
        Band-pass response map (B map) for the specified level

    Example:
        Apply first-level Simoncelli wavelet (highest frequency band):

        ```python
        import numpy as np
        from pictologics.filters import simoncelli_wavelet

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply wavelet
        response = simoncelli_wavelet(image, level=1)
        ```
    """
    # Convert to float32
    image = ensure_float32(image)

    # Apply source_mask preprocessing (zero out invalid voxels for FFT-based filter)
    if source_mask is not None:
        image = _prepare_masked_image(image, source_mask)

    shape = image.shape
    ndim = len(shape)

    # IBSI level N corresponds to j = N-1
    # Level 1 = j=0 → max_freq = 1.0 (normalized Nyquist)
    j = level - 1
    # Normalized max frequency for this level (relative to Nyquist=1.0)
    max_freq = 1.0 / (2**j)

    # Build frequency grid using centered [-1, 1] coordinates (IBSI 2 convention).
    # NOTE: This grid differs from np.fft.fftfreq by a factor of (N-1)/N.
    # The IBSI 2 reference values were validated with this specific grid, so
    # it must be preserved exactly. The non-symmetric grid for even N also
    # means rfftn/irfftn cannot be used (they assume conjugate symmetry).
    center = (np.array(shape) - 1.0) / 2.0

    grids = []
    for i, s in enumerate(shape):
        dim_grid = np.arange(s)
        # Normalize to [-1, 1] relative to center
        grid_norm = (dim_grid - center[i]) / center[i]
        # Shift to move DC to array start (index 0), matching fftn layout
        grid_shifted = np.fft.ifftshift(grid_norm)
        grids.append(grid_shifted)

    # Use broadcasting for full 3D grid
    mesh_vectors = np.meshgrid(*grids, indexing="ij", sparse=True)
    dist_sq = np.asarray(sum(g**2 for g in mesh_vectors), dtype=np.float64)
    dist = np.sqrt(dist_sq)

    # Calculate transfer function (Simoncelli band-pass, IBSI 2 Eq. 27)
    val = 2.0 * dist / max_freq
    log_arg = np.where(val > 0, val, 1.0)

    with np.errstate(all="ignore"):
        g_sim = np.cos(np.pi / 2.0 * np.log2(log_arg))

    # Apply band-pass mask
    mask = (dist >= max_freq / 4.0) & (dist <= max_freq)
    g_sim = np.where(mask, g_sim, 0.0)

    # Apply filter in frequency domain using full FFT
    # (full FFT required because the centered grid is non-symmetric for even N)
    F = np.fft.fftn(image)

    # Explicitly specify axes to avoid NumPy 2.0 DeprecationWarning
    axes = tuple(range(ndim))
    response = np.fft.ifftn(F * g_sim, s=shape, axes=axes)

    return np.real(response).astype(np.float32)

pictologics.filters.riesz_transform(image, order, source_mask=None)

Apply Riesz transform (IBSI code: AYRS).

The Riesz transform computes higher-order all-pass image derivatives in the Fourier domain. Per IBSI 2 Eq. 34.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `image` | `NDArray[floating[Any]]` | 3D input image array | *required* |
| `order` | `Tuple[int, ...]` | Tuple (l1, l2, l3) specifying derivative order per axis; e.g., (1,0,0) = first-order along k1 (gradient-like); (2,0,0), (1,1,0), (0,2,0) = second-order (Hessian-like) | *required* |
| `source_mask` | `Optional[NDArray[bool_]]` | Optional boolean mask where True = valid voxel. When provided, zeros out invalid (sentinel) voxels before FFT-based transform to prevent contamination. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `NDArray[floating[Any]]` | Riesz-transformed image (real part) |

Example

Compute first-order Riesz transform along the k1 axis:

```python
import numpy as np
from pictologics.filters import riesz_transform

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply transform (gradient-like along axis 0)
response = riesz_transform(image, order=(1, 0, 0))
```
Note
  • First-order Riesz components form the image gradient
  • Second-order Riesz components form the image Hessian
  • All-pass: unlike regular derivatives, does not amplify high frequencies
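The multinomial normalization sqrt(L! / (l1!·l2!·l3!)) used in the transfer function, and the bounded (all-pass) gain it implies, can be sketched in isolation; the helper name and the sample frequency vector are illustrative, not library code:

```python
from math import factorial, sqrt
import numpy as np

def riesz_norm_factor(order: tuple[int, ...]) -> float:
    """Multinomial normalization sqrt(L! / prod(l_i!)) for Riesz order (l1, l2, l3)."""
    L = sum(order)
    denom = 1
    for o in order:
        denom *= factorial(o)
    return sqrt(factorial(L) / denom)

# |nu1^l1 * nu2^l2 * nu3^l3| / ||nu||^L <= 1, so the gain at any frequency
# is bounded by the normalization factor (no high-frequency amplification)
nu = np.array([0.3, -0.5, 0.8])
order = (1, 1, 0)
gain = (
    riesz_norm_factor(order)
    * abs(np.prod(nu ** np.array(order)))
    / np.linalg.norm(nu) ** sum(order)
)
```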
Source code in pictologics/filters/riesz.py
def riesz_transform(
    image: npt.NDArray[np.floating[Any]],
    order: Tuple[int, ...],
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> npt.NDArray[np.floating[Any]]:
    """
    Apply Riesz transform (IBSI code: AYRS).

    The Riesz transform computes higher-order all-pass image derivatives
    in the Fourier domain. Per IBSI 2 Eq. 34.

    Args:
        image: 3D input image array
        order: Tuple (l1, l2, l3) specifying derivative order per axis
               e.g., (1,0,0) = first-order along k1 (gradient-like)
                     (2,0,0), (1,1,0), (0,2,0) = second-order (Hessian-like)
        source_mask: Optional boolean mask where True = valid voxel.
            When provided, zeros out invalid (sentinel) voxels before
            FFT-based transform to prevent contamination.

    Returns:
        Riesz-transformed image (real part)

    Example:
        Compute first-order Riesz transform along the k1 axis:

        ```python
        import numpy as np
        from pictologics.filters import riesz_transform

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply transform (gradient-like along axis 0)
        response = riesz_transform(image, order=(1, 0, 0))
        ```

    Note:
        - First-order Riesz components form the image gradient
        - Second-order Riesz components form the image Hessian
        - All-pass: unlike regular derivatives, does not amplify high frequencies
    """
    # Convert to float32
    image = ensure_float32(image)

    # Apply source_mask preprocessing (zero out invalid voxels for FFT-based filter)
    if source_mask is not None:
        image = _prepare_masked_image(image, source_mask)

    L = sum(order)  # Total order

    if L == 0:
        raise ValueError("At least one order component must be > 0")

    shape = image.shape
    ndim = len(shape)

    # Generate frequency coordinates appropriately for rfftn
    # Last dimension uses rfftfreq, others use fftfreq
    freqs = []
    for i, s in enumerate(shape):
        if i == ndim - 1:
            # Last dimension for rfftn is non-negative frequencies only
            freqs.append(np.fft.rfftfreq(s) * 2 * np.pi)
        else:
            freqs.append(np.fft.fftfreq(s) * 2 * np.pi)

    # Create grid using broadcasting (lazy evaluation) to avoid huge meshgrid matching input size
    # meshgrid with sparse=True returns coordinate vectors that broadcast
    nu_vectors = np.meshgrid(*freqs, indexing="ij", sparse=True)

    # Compute ||ν||^2 via broadcasting
    nu_sq_norm = np.asarray(sum(n**2 for n in nu_vectors), dtype=np.float64)
    nu_norm = np.sqrt(nu_sq_norm)

    # Avoid division by zero at DC
    nu_norm_safe = np.where(nu_norm > 0, nu_norm, 1.0)

    # Compute normalization factor
    norm_factor = sqrt(factorial(L) / np.prod([factorial(o) for o in order]))

    # Compute numerator via broadcasting
    numerator = np.ones(nu_norm.shape, dtype=np.float64)
    for i, ord_val in enumerate(order):
        if ord_val > 0:
            numerator *= nu_vectors[i] ** ord_val

    # Riesz transfer function
    phase = np.exp(-1j * np.pi * L / 2)

    transfer = phase * norm_factor * numerator / (nu_norm_safe**L)
    transfer = np.where(nu_norm > 0, transfer, 0)  # Set DC to 0

    # Apply in frequency domain using Real FFT
    F = np.fft.rfftn(image)

    # Verify shapes match (should match due to rfftfreq logic)
    # F has shape (N1, N2, N3//2 + 1)
    # transfer should have same shape or broadcastable

    # Explicitly specify axes to avoid NumPy 2.0 DeprecationWarning
    axes = tuple(range(ndim))
    response = np.fft.irfftn(F * transfer, s=shape, axes=axes)

    return response.astype(np.float32)

pictologics.filters.riesz_log(image, sigma_mm, spacing_mm=1.0, order=(1, 0, 0), truncate=4.0, source_mask=None)

Apply Riesz transform to LoG-filtered image.

Combines multi-scale analysis (LoG) with directional analysis (Riesz). First applies LoG filtering, then applies Riesz transform.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `image` | `NDArray[floating[Any]]` | 3D input image array | *required* |
| `sigma_mm` | `float` | LoG scale in mm | *required* |
| `spacing_mm` | `Union[float, Tuple[float, float, float]]` | Voxel spacing in mm | `1.0` |
| `order` | `Tuple[int, ...]` | Riesz order tuple (l1, l2, l3) | `(1, 0, 0)` |
| `truncate` | `float` | LoG truncation parameter | `4.0` |
| `source_mask` | `Optional[NDArray[bool_]]` | Optional boolean mask where True = valid voxel | `None` |

Returns:

| Type | Description |
| --- | --- |
| `NDArray[floating[Any]]` | Riesz-transformed LoG response |

Example

Compute first-order Riesz transform of LoG-filtered image at 5mm scale:

```python
import numpy as np
from pictologics.filters import riesz_log

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply filter
response = riesz_log(
    image,
    sigma_mm=5.0,
    spacing_mm=(2.0, 2.0, 2.0),
    order=(1, 0, 0)
)
```
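In the unmasked case, both LoG and the Riesz transform are linear shift-invariant operators, so composing them amounts to multiplying their transfer functions in the Fourier domain and the order of application does not matter. A minimal 1D numpy sketch of that property, using made-up transfer functions as stand-ins (not the library's actual filters):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(64)
X = np.fft.fft(x)

freqs = np.fft.fftfreq(64) * 2 * np.pi
# Gaussian-like low-pass, a stand-in for the LoG's smoothing component
H_smooth = np.exp(-0.5 * freqs**2)
# Hilbert-like all-pass (the 1D analogue of a Riesz component)
H_phase = np.where(freqs != 0, -1j * np.sign(freqs), 0)

# Apply smooth-then-phase vs. phase-then-smooth: identical results
a = np.fft.ifft(np.fft.fft(np.fft.ifft(X * H_smooth)) * H_phase)
b = np.fft.ifft(np.fft.fft(np.fft.ifft(X * H_phase)) * H_smooth)
```

With `source_mask`, this equivalence no longer holds exactly, since the mask-aware LoG is not shift-invariant; hence the fixed LoG-first ordering in the source below.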
Source code in pictologics/filters/riesz.py
def riesz_log(
    image: npt.NDArray[np.floating[Any]],
    sigma_mm: float,
    spacing_mm: Union[float, Tuple[float, float, float]] = 1.0,
    order: Tuple[int, ...] = (1, 0, 0),
    truncate: float = 4.0,
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> npt.NDArray[np.floating[Any]]:
    """
    Apply Riesz transform to LoG-filtered image.

    Combines multi-scale analysis (LoG) with directional analysis (Riesz).
    First applies LoG filtering, then applies Riesz transform.

    Args:
        image: 3D input image array
        sigma_mm: LoG scale in mm
        spacing_mm: Voxel spacing in mm
        order: Riesz order tuple (l1, l2, l3)
        truncate: LoG truncation parameter
        source_mask: Optional boolean mask where True = valid voxel

    Returns:
        Riesz-transformed LoG response

    Example:
        Compute first-order Riesz transform of LoG-filtered image at 5mm scale:

        ```python
        import numpy as np
        from pictologics.filters import riesz_log

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply filter
        response = riesz_log(
            image,
            sigma_mm=5.0,
            spacing_mm=(2.0, 2.0, 2.0),
            order=(1, 0, 0)
        )
        ```
    """
    from .log import laplacian_of_gaussian

    # First apply LoG
    log_response = laplacian_of_gaussian(
        image,
        sigma_mm=sigma_mm,
        spacing_mm=spacing_mm,
        truncate=truncate,
        source_mask=source_mask,
    )

    # Handle tuple return from LoG if source_mask was used
    if isinstance(log_response, tuple):
        log_response = log_response[0]

    # Then apply Riesz transform
    # We pass source_mask again to enforce zeroing of invalid regions
    # (though LoG normalized convolution might have filled them, Riesz is global)
    return riesz_transform(log_response, order=order, source_mask=source_mask)

pictologics.filters.riesz_simoncelli(image, level=1, order=(1, 0, 0), source_mask=None)

Apply Riesz transform to Simoncelli wavelet-filtered image.

Combines isotropic multi-scale analysis (Simoncelli) with directional analysis (Riesz) for rotation-invariant directional features.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `image` | `NDArray[floating[Any]]` | 3D input image array | *required* |
| `level` | `int` | Simoncelli decomposition level | `1` |
| `order` | `Tuple[int, ...]` | Riesz order tuple (l1, l2, l3) | `(1, 0, 0)` |
| `source_mask` | `Optional[NDArray[bool_]]` | Optional boolean mask where True = valid voxel | `None` |

Returns:

| Type | Description |
| --- | --- |
| `NDArray[floating[Any]]` | Riesz-transformed Simoncelli response |

Example

Compute second-order Riesz transform (Hessian-like) of Simoncelli level 2:

```python
import numpy as np
from pictologics.filters import riesz_simoncelli

# Create dummy 3D image
image = np.random.rand(50, 50, 50)

# Apply filter
response = riesz_simoncelli(
    image,
    level=2,
    order=(2, 0, 0)
)
```
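Each call returns a single component selected by the order tuple; for second-order (Hessian-like) analysis there are six distinct components in 3D. A short sketch enumerating them (illustrative only):

```python
import itertools

# All (l1, l2, l3) with l1 + l2 + l3 == 2: the six Hessian-like components
# (three pure second derivatives plus three mixed derivatives)
second_order = [
    o for o in itertools.product(range(3), repeat=3) if sum(o) == 2
]
```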
Source code in pictologics/filters/riesz.py
def riesz_simoncelli(
    image: npt.NDArray[np.floating[Any]],
    level: int = 1,
    order: Tuple[int, ...] = (1, 0, 0),
    source_mask: Optional[npt.NDArray[np.bool_]] = None,
) -> npt.NDArray[np.floating[Any]]:
    """
    Apply Riesz transform to Simoncelli wavelet-filtered image.

    Combines isotropic multi-scale analysis (Simoncelli) with
    directional analysis (Riesz) for rotation-invariant directional features.

    Args:
        image: 3D input image array
        level: Simoncelli decomposition level
        order: Riesz order tuple (l1, l2, l3)
        source_mask: Optional boolean mask where True = valid voxel

    Returns:
        Riesz-transformed Simoncelli response

    Example:
        Compute second-order Riesz transform (Hessian-like) of Simoncelli level 2:

        ```python
        import numpy as np
        from pictologics.filters import riesz_simoncelli

        # Create dummy 3D image
        image = np.random.rand(50, 50, 50)

        # Apply filter
        response = riesz_simoncelli(
            image,
            level=2,
            order=(2, 0, 0)
        )
        ```
    """
    from .wavelets import simoncelli_wavelet

    # Preprocess once: float32 conversion + source mask zeroing
    image = ensure_float32(image)
    if source_mask is not None:
        image = _prepare_masked_image(image, source_mask)

    # Apply Simoncelli wavelet (already preprocessed, skip redundant work)
    sim_response = simoncelli_wavelet(image, level=level)

    # Apply Riesz transform to the Simoncelli response
    return riesz_transform(sim_response, order=order)