Conversation

@Poncharm commented May 26, 2025

Summary by Sourcery

Centralize detection parameters as module-level constants and refine the FPCM spike detection logic in fpcm_detector.py by improving convolution alignment, boundary handling, and background estimation while adding progress reporting.

New Features:

  • Add tqdm progress bar during channel processing in detect_spikes_fpcm.

Bug Fixes:

  • Fix convolution output length alignment by switching to mode='same'.
  • Handle boundary conditions in error calculations to prevent indexing errors.
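The effect of the `mode='same'` fix can be illustrated with a minimal sketch (the signal and filter below are hypothetical placeholders, not the actual FPCM filters):

```python
import numpy as np

x = np.arange(10, dtype=float)   # hypothetical signal, 10 samples
h = np.ones(4) / 4.0             # hypothetical 4-tap filter (even length)

full = np.convolve(x, h, mode="full")  # length len(x) + len(h) - 1 = 13
same = np.convolve(x, h, mode="same")  # length len(x) = 10, centred output

# With mode="same" the output index lines up with the input index up to a
# T//2 shift for even-length filters, which is why the detector corrects
# peak positions by T//2 downstream.
print(len(full), len(same))  # 13 10
```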

Enhancements:

  • Extract hardcoded numeric parameters into named module-level constants.
  • Refactor inst_power background estimation using np.roll instead of manual concatenation.
  • Refine candidate waveform window slicing and adjust peak index correction with a separation interval constant.
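The `np.roll` refactor for background estimation can be sketched as follows (the helper name and values here are hypothetical; the real code works on the instantaneous power inside `detect_spikes_fpcm`):

```python
import numpy as np

def shifted_background(inst_power: np.ndarray, shift: int) -> np.ndarray:
    """Shift the instantaneous power by `shift` samples and zero the
    wrapped-around edge, so samples from the opposite end of the
    recording do not leak into the background estimate."""
    bkg = np.roll(inst_power, shift)
    if shift > 0:
        bkg[:shift] = 0.0        # samples rolled in from the right edge
    elif shift < 0:
        bkg[shift:] = 0.0        # samples rolled in from the left edge
    return bkg

p = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
bkg_left = shifted_background(p, 2)    # [0, 0, 1, 2, 3]
bkg_right = shifted_background(p, -2)  # [3, 4, 5, 0, 0]
```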

sourcery-ai bot commented May 26, 2025

Reviewer's Guide

This PR refactors fpcm_detector.py by centralizing hyperparameters into module-level constants, standardizing convolution behavior, improving background and window boundary handling, adding a progress bar, and updating spike separation logic to enhance clarity, configurability, and robustness.

Sequence Diagram for Spike Detection Logic in detect_spikes_fpcm

```mermaid
sequenceDiagram
    participant D as detect_spikes_fpcm
    participant CF as _convolve_filters
    participant AP as _apply_predicates

    D->>D: Initialize parameters (defaults to new constants)
    D->>D: data = raw.get_data()
    D->>D: Bp, piBp = _build_model(...)
    loop For each channel ch (with tqdm progress bar)
        D->>D: x = data[ch,:]
        D->>CF: Call _convolve_filters(x, piBp) with mode="same"
        CF-->>D: C (coefficients)
        D->>D: Calculate flat (using SLOPES_HEIGHTS_RATIO const)
        D->>D: Calculate bkg_left, bkg_right (updated logic with np.roll, mode="same" for inst_power)
        D->>AP: Call _apply_predicates(..., bkg_coeff=BCK_COEFF, slope_tol=SLOPE_TOLERANCE, apex_wave_ratio1=APEX_WAVE_RATIO1, apex_wave_ratio2=APEX_WAVE_RATIO2)
        AP-->>D: morph_ok (morphology mask)
        D->>D: Process candidate spikes (t in cand_idx)
        D->>D: Handle boundary for slice rng (st, en using T//2 due to "same" convolution)
        D->>D: Calculate err_peak, err_wave (with boundary checks)
        D->>D: Update hit_mask[ch]
    end
    D->>D: Aggregate hits (acc_hits)
    D->>D: Identify spike_centres
    D->>D: Suppress spikes (min_separation_samp from SEPARATION_INTERVAL const)
    D->>D: Adjust unique_peaks (using T//2 shift)
    D->>D: Format and return events & info
```

Class Diagram for fpcm_detector.py Module Changes

```mermaid
classDiagram
    class fpcm_detector {
        <<Module>>
        +PEAK_HW_MS: float
        +WAVE_HW_MS: float
        +WAVE_POWER: float
        +BCK_COEFF: float
        +ERR_PEAK_TH: float
        +ERR_WAVE_TH: float
        +HIT_THRESHOLD: int
        +SLOPES_HEIGHTS_RATIO: float
        +SLOPE_TOLERANCE: float
        +APEX_WAVE_RATIO1: float
        +APEX_WAVE_RATIO2: float
        +SEPARATION_INTERVAL: int

        +_build_model(sfreq, peak_hw_ms, wave_hw_ms, wave_power)
        +_convolve_filters(X, piBp)  # Internally uses mode="same"
        +_apply_predicates(C, HC, bkg_left, bkg_right, flat_peak, bkg_coeff: float = BCK_COEFF, slope_tol: float = SLOPE_TOLERANCE, apex_wave_ratio1: float = APEX_WAVE_RATIO1, apex_wave_ratio2: float = APEX_WAVE_RATIO2) # Updated defaults and new parameters
        +detect_spikes_fpcm(raw, peak_hw_ms: float = PEAK_HW_MS, wave_hw_ms: float = WAVE_HW_MS, wave_power: int = WAVE_POWER, bkg_coeff: float = BCK_COEFF, err_peak_th: float = ERR_PEAK_TH, err_wave_th: float = ERR_WAVE_TH, hit_threshold: int = HIT_THRESHOLD, slopes_heights_ratio: float = SLOPES_HEIGHTS_RATIO) # Updated defaults and new parameter
    }
```

File-Level Changes

Parameterize hyperparameters via module-level constants
  • Add a constants section for peak, wave, background, error thresholds, and tolerances
  • Replace hard-coded defaults in detect_spikes_fpcm and _apply_predicates with these constants and new parameters
src/fpcm_detector.py
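The constants pattern looks roughly like the sketch below (names follow the class diagram above; the numeric values are placeholders, not the real defaults):

```python
# Hypothetical excerpt of the module-level constants section; the actual
# values in src/fpcm_detector.py may differ.
PEAK_HW_MS = 20.0        # peak half-width, ms
WAVE_HW_MS = 80.0        # wave half-width, ms
BCK_COEFF = 2.0          # background rejection coefficient
HIT_THRESHOLD = 2        # minimum channel hits to accept a spike

def detect_spikes_fpcm(raw,
                       peak_hw_ms: float = PEAK_HW_MS,
                       wave_hw_ms: float = WAVE_HW_MS,
                       bkg_coeff: float = BCK_COEFF,
                       hit_threshold: int = HIT_THRESHOLD):
    # detection body omitted; callers can still override any parameter
    ...
```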
Standardize convolution modes
  • Switch filter convolutions in _convolve_filters to mode='same'
  • Change instantaneous power convolution to mode='same'
src/fpcm_detector.py
Refactor background power estimation
  • Replace manual concatenation with np.roll for bkg_left and bkg_right
  • Zero out edge segments after rolling to maintain correct boundary behavior
src/fpcm_detector.py
Enhance candidate window boundary handling
  • Compute start/end indices based on even/odd window length
  • Add bounds check before error calculation and default errors to 1 when out of range
src/fpcm_detector.py
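Under the assumption that `mode="same"` centres the filter output on the input index, the boundary handling can be sketched like this (hypothetical helper, not the PR's exact code):

```python
def candidate_window(t: int, T: int, n: int):
    """With mode="same", the output at index t corresponds to a window
    centred on t, so the slice extends T//2 samples each way (one sample
    fewer on the right when T is even). Returns None when the window
    falls outside the signal, in which case the caller defaults
    err_peak/err_wave to 1."""
    st = t - T // 2
    en = t + T // 2 + (T % 2)  # odd T: symmetric; even T: one short on the right
    if st < 0 or en > n:
        return None  # out of range -> caller uses the default error of 1
    return st, en

assert candidate_window(5, 4, 10) == (3, 7)   # in range, length-4 window
assert candidate_window(0, 4, 10) is None     # clipped at the left edge
```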
Add progress feedback
  • Import and wrap channel loop with tqdm to display progress
src/fpcm_detector.py
Update spike suppression logic
  • Use SEPARATION_INTERVAL constant for min separation sample calculation
  • Adjust unique_peaks offset calculation to T//2+peak_hw_s
src/fpcm_detector.py
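The suppression step can be sketched as a greedy pass over the sorted peaks (hypothetical helper; in the PR, `min_separation_samp` is derived from the SEPARATION_INTERVAL constant and the sampling rate):

```python
import numpy as np

def suppress_close_peaks(peaks: np.ndarray, min_separation_samp: int) -> np.ndarray:
    """Keep a detected peak only if it lies at least min_separation_samp
    samples after the previously kept peak."""
    kept = []
    for p in np.sort(peaks):
        if not kept or p - kept[-1] >= min_separation_samp:
            kept.append(p)
    return np.asarray(kept)

print(suppress_close_peaks(np.array([100, 105, 300, 302, 600]), 50))
# [100 300 600]
```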


sourcery-ai bot left a comment

Hey @Poncharm - I've reviewed your changes - here's some feedback:

  • Consider wrapping the tqdm progress bar behind an optional verbose flag instead of unconditionally using it, to avoid spamming output in non-interactive contexts.
  • The even/odd window boundary logic (calculating st, en, and rng slices) is repeated—extracting it into a small helper function would reduce duplication and improve readability.
  • Rolling plus manual zero-padding for background estimation can lead to edge artifacts; you might use numpy.pad or a convolution with appropriate boundary handling to make the edge behavior more explicit and robust.
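The reviewer's `numpy.pad` suggestion could look like this (hypothetical helper sketching one way to make the edge behaviour explicit):

```python
import numpy as np

def shifted_background_pad(inst_power: np.ndarray, shift: int) -> np.ndarray:
    """Pad with zeros on one side and trim the other, so the zeroed edge
    is explicit rather than a roll followed by manual zeroing."""
    n = inst_power.size
    if shift >= 0:
        return np.pad(inst_power, (shift, 0))[:n]
    return np.pad(inst_power, (0, -shift))[-n:]

p = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
assert np.array_equal(shifted_background_pad(p, 2), [0, 0, 1, 2, 3])
assert np.array_equal(shifted_background_pad(p, -2), [3, 4, 5, 0, 0])
```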
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Documentation: all looks good


```python
# --------------------------------------------------------------------
# 3. Detection
# --------------------------------------------------------------------
def detect_spikes_fpcm(
```

issue (code-quality): Low code quality found in detect_spikes_fpcm - 21% (low-code-quality)


Explanation: The quality score for this function is below the quality threshold of 25%.
This score is a combination of the method length, cognitive complexity and working memory.

How can you solve this?

It might be worth refactoring this function to make it shorter and more readable.

  • Reduce the function length by extracting pieces of functionality out into
    their own functions. This is the most important thing you can do - ideally a
    function should be less than 10 lines.
  • Reduce nesting, perhaps by introducing guard clauses to return early.
  • Ensure that variables are tightly scoped, so that code using related concepts
    sits together within the function rather than being scattered.
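As a concrete illustration of the guard-clause advice (hypothetical code, not taken from the PR):

```python
def classify(err_peak: float, err_wave: float, threshold: float = 0.5) -> bool:
    """Return early on each rejecting condition instead of nesting the
    accepting path inside multiple if-blocks."""
    if err_peak >= threshold:
        return False
    if err_wave >= threshold:
        return False
    return True

assert classify(0.1, 0.2) is True
assert classify(0.9, 0.2) is False
```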

