
Karl deck/spike drop redesign #45

Open
max-rosenblattl wants to merge 5 commits into main from KarlDeck/spike-drop-redesign

Conversation


@max-rosenblattl max-rosenblattl commented Apr 8, 2026

Redesign Spike Detector and Remove Drop Detector

♻️ Current situation & Problem

This PR simplifies the structural event detection logic by redesigning the spike detector and removing the separate drop detector. The previous setup carried drop-related logic and explorer plumbing that are no longer needed.

⚙️ Release Notes

  • Redesigned the spike detector logic.
  • Removed the drop detector and related drop-specific handling.
  • Updated the explorer and structural caption plumbing to work with spike-only structural detection.

📚 Documentation

This change simplifies the detector setup by keeping structural point-event detection focused on spikes only. Related drop-specific configuration, explorer handling, and templates were removed.

✅ Testing

  • Verified the updated code compiles successfully.
  • Checked that the explorer and detector wiring no longer reference the removed drop detector.

Code of Conduct & Contributing Guidelines

By creating and submitting this pull request, you agree to follow our Code of Conduct and Contributing Guidelines.


coderabbitai bot commented Apr 8, 2026

📝 Walkthrough

Removed drop event detection from the codebase by eliminating drop support from DetectionResult, refactoring SpikeDetector with new candidate clustering and scoring logic, updating UI to remove drop/nonwear displays, adjusting detector configurations, and removing per-signal event limits.

Changes

Cohort / File(s) / Summary

  • Core Detection Logic (detectors/__init__.py, detectors/spike.py): Removed the "drop" event type from DetectionResult; refactored SpikeDetector to eliminate smoothing, prominence-scaling, and drop detection; introduced candidate clustering, localized jump-based scoring, and new ranking helper methods (_collect_candidates, _select_results, _cluster_ids).
  • Configuration & Templates (mhc/constants.py, templates/templates.json): Removed drop event templates; adjusted spike detector parameters (decreased min_distance and min_width, standardized top_k, removed prominence_scale and smooth_window).
  • Data Processing (extractors/structural.py): Removed the per-signal event cap (MAX_EVENTS_PER_SIGNAL = 6); all sorted results are now included in output.
  • UI/Explorer (explorer.py): Removed drop/nonwear event support and overlay toggles; added spike ranking by spike_minute per detector; introduced _hit_target_label() and _matches_hit_target() for hit-target matching with the detector:event_type format; refactored event formatting and summary wear reporting.
  • Version Control (.gitignore): Added the data/ directory to gitignore patterns.
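The walkthrough above mentions candidate clustering and a _cluster_ids helper without showing how they work. Below is a minimal sketch of that kind of gap-based clustering; the function name, signature, and radius semantics are assumptions for illustration, not the code from this PR:

```python
def cluster_ids(indices, radius):
    """Group sorted candidate indices into clusters: a new cluster starts
    whenever the gap to the previous index exceeds `radius`.
    Hypothetical logic, not the actual _cluster_ids implementation."""
    clusters = []
    for idx in sorted(indices):
        if clusters and idx - clusters[-1][-1] <= radius:
            clusters[-1].append(idx)  # close enough: extend current cluster
        else:
            clusters.append([idx])    # gap too large: start a new cluster
    return clusters

# Candidates at minutes 3, 5, 30, 31, and 90 with radius 10:
print(cluster_ids([3, 5, 30, 31, 90], radius=10))  # → [[3, 5], [30, 31], [90]]
```

Each cluster could then be reduced to its strongest candidate when selecting final spike events, which is one plausible reading of the _select_results ranking step.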

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes


Suggested reviewers

  • KarlDeck
🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
Check name Status Explanation
Title check ✅ Passed The title references the main change (spike detector redesign and drop detector removal) but uses informal phrasing ('Karl deck') that may not be immediately clear to teammates reviewing history.
Description check ✅ Passed The description clearly explains the redesign of the spike detector and removal of the drop detector, matching the changeset which removes drop detection across multiple files and restructures spike detection logic.
Docstring Coverage ✅ Passed No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.



Copilot AI left a comment


Pull request overview

This PR redesigns spike/drop handling by splitting downward-event detection into a new DropDetector (including optional “return to baseline” classification), updating configuration/templates accordingly, and enhancing the interactive explorer to display and jump to the new event types.

Changes:

  • Added DropDetector with optional return_to_baseline classification and wired it into MHC_CHANNEL_CONFIG.
  • Refactored SpikeDetector to only emit “spike” events and updated explorer rendering/jump targets for drop vs return_to_baseline.
  • Added new structural caption templates for return_to_baseline and removed per-signal event truncation in structural extraction.
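The optional return_to_baseline classification described above can be pictured with a small sketch. The function name, window, and tolerance below are assumptions for illustration, not the PR's actual DropDetector code:

```python
def classify_drop(signal, drop_idx, baseline, window=30, tolerance=0.1):
    """Hypothetical classifier: label a detected drop 'return_to_baseline'
    if the signal comes back within `tolerance` (relative) of `baseline`
    inside `window` samples after the drop; otherwise a plain 'drop'."""
    segment = signal[drop_idx:drop_idx + window]
    if any(abs(v - baseline) <= tolerance * abs(baseline) for v in segment):
        return "return_to_baseline"
    return "drop"

# A dip from 100 down to 40 that recovers to 98 within the window:
sig = [100] * 5 + [40, 45, 70, 98, 100]
print(classify_drop(sig, drop_idx=5, baseline=100))  # → return_to_baseline

# A dip that never recovers stays a plain drop:
sig2 = [100] * 5 + [40] * 10
print(classify_drop(sig2, drop_idx=5, baseline=100))  # → drop
```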

Reviewed changes

Copilot reviewed 7 out of 8 changed files in this pull request and generated 4 comments.

Summary per file:

  • templates/templates.json: Adds return_to_baseline structural templates for caption generation.
  • mhc/constants.py: Updates detector wiring: separate spike vs drop detectors per channel with new parameters.
  • extractors/structural.py: Removes the per-signal max event cap so all detector events can be surfaced.
  • explorer.py: Updates UI formatting, hit-target selection, and visualization for new event types and spike ranking.
  • detectors/spike.py: Refactors spike detection logic and removes drop detection responsibilities.
  • detectors/drop.py: Introduces DropDetector with drop + return-to-baseline classification.
  • detectors/__init__.py: Extends DetectionResult.event_type to include return_to_baseline and nonwear.
  • .gitignore: Ignores the data/ directory.


Comment thread detectors/spike.py

-            prominence=prominence,
-            distance=self.min_distance,
+            filtered_signal,
+            prominence=self.min_prominence,

Copilot AI Apr 8, 2026


SpikeDetector.min_distance is no longer applied in the find_peaks call (no distance=... argument). This makes the min_distance parameter ineffective and can substantially change detection behavior given the configs set min_distance (e.g., 15/20/30). Pass distance=self.min_distance to find_peaks (or remove the parameter if intentional) so spacing constraints are enforced consistently.

Suggested change

-            prominence=self.min_prominence,
+            prominence=self.min_prominence,
+            distance=self.min_distance,

Collaborator


maybe solve by rename

Comment thread detectors/drop.py Outdated
Comment thread detectors/drop.py Outdated
Comment thread detectors/__init__.py
Comment thread detectors/__init__.py Outdated
@max-rosenblattl max-rosenblattl marked this pull request as ready for review April 8, 2026 22:10

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@detectors/spike.py`:
- Around line 48-54: The peak-finding call does not apply the stored
min_distance constraint, so update the find_peaks invocation that processes
filtered_signal (where peaks, properties = find_peaks(...)) to include
distance=self.min_distance so spacing is enforced during candidate collection;
keep the existing duplicate_radius logic in _select_results for clustering but
ensure initial suppression uses the min_distance attribute.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: e53c9c4f-021e-4c48-9322-5dfca6faf58b

📥 Commits

Reviewing files that changed from the base of the PR and between 4a10d06 and afa4deb.

📒 Files selected for processing (7)
  • .gitignore
  • detectors/__init__.py
  • detectors/spike.py
  • explorer.py
  • extractors/structural.py
  • mhc/constants.py
  • templates/templates.json
💤 Files with no reviewable changes (2)
  • extractors/structural.py
  • templates/templates.json

Comment thread detectors/spike.py
Comment on lines 48 to 54
 peaks, properties = find_peaks(
-    smoothed_signal,
-    prominence=prominence,
-    distance=self.min_distance,
+    filtered_signal,
+    prominence=self.min_prominence,
     width=self.min_width,
 )
 if len(peaks) == 0:
     return []


⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

scipy find_peaks distance parameter behavior

💡 Result:

In scipy.signal.find_peaks, the distance parameter enforces a minimum horizontal spacing (in samples) between returned peak indices:

  • distance is the required minimal horizontal distance (>= 1) in samples between neighbouring peaks. If candidate peaks are closer than distance, the smaller peaks are removed first until all remaining peaks satisfy the spacing constraint. [1]
  • Conceptually, SciPy’s implementation keeps the taller peaks preferentially and suppresses nearby smaller ones (i.e., it effectively processes peaks from high to low and removes neighbors within the forbidden window). [2]
  • Guarantee is on peak indices being at least distance samples apart; however, for flat/plateau peaks, the edges of plateaus may end up closer than distance even though the returned peak indices respect it. [1]

Sources: SciPy docs for find_peaks [1] and an explanation/derivation based on the SciPy implementation [2].
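The distance behavior described above can be demonstrated directly. This is a standalone illustration of scipy.signal.find_peaks, not code from this PR:

```python
import numpy as np
from scipy.signal import find_peaks

sig = np.array([0, 5, 0, 4, 0, 6, 0])

# Without `distance`, all three local maxima are returned.
peaks_all, _ = find_peaks(sig)
print(peaks_all)  # [1 3 5]

# With distance=3, the smaller peak at index 3 is suppressed because it
# sits within 3 samples of the taller peaks at indices 1 and 5, which are
# kept preferentially (taller peaks win, smaller neighbors are removed).
peaks_spaced, _ = find_peaks(sig, distance=3)
print(peaks_spaced)  # [1 5]
```

This is exactly why dropping distance=self.min_distance from the call changes detection behavior: nearby weaker peaks that were previously suppressed now survive into the candidate set.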


🏁 Script executed:

find . -name "spike.py" -path "*/detectors/*" | head -5

Repository: StanfordBDHG/SensorTSLM

Length of output: 87


🏁 Script executed:

cat -n ./detectors/spike.py | sed -n '1,60p'


Length of output: 2749


🏁 Script executed:

cat -n ./detectors/spike.py | sed -n '43,100p'


Length of output: 3171


🏁 Script executed:

cat -n ./detectors/spike.py | sed -n '100,180p'


Length of output: 4015


min_distance parameter is not applied during peak finding.

The min_distance attribute is stored in the constructor (line 30) but is not passed to find_peaks() (line 48). This means the minimum spacing constraint is not enforced during candidate collection. The duplicate_radius clustering in _select_results handles deduplication of collected candidates but doesn't replace the distance parameter's role in actively suppressing weaker peaks that fall within the minimum distance during the initial peak detection phase.

Pass distance=self.min_distance to enforce spacing during peak finding:

🔧 Proposed fix
        peaks, properties = find_peaks(
            filtered_signal,
            prominence=self.min_prominence,
            width=self.min_width,
+            distance=self.min_distance,
        )



4 participants