Commit 496da51

docs: update changeset and examples for in-batch deduplication

Update documentation to reflect the simplified deduplication feature that only works within a single batch/queue cycle. Changes:

- Rewrote changeset to describe in-batch deduplication only
- Removed cross-batch specific documentation
- Restored original useBatcher example (basic batching demo)
- Created new useBatcherInBatchDedup example with:
  - Interactive UI for testing in-batch deduplication
  - Activity log showing adds vs. duplicates
  - Visual feedback for deduplication behavior
- Deleted old useBatcherDedup example

The feature is now simpler and more focused: it prevents duplicates within the same batch but does not track across batch cycles.
1 parent 72223cc commit 496da51

File tree: 12 files changed, +291 −484 lines

Lines changed: 26 additions & 39 deletions

````diff
@@ -1,57 +1,44 @@
 ---
-'@tanstack/pacer': minor
+"@tanstack/pacer": minor
 ---
 
-Add cross-batch/cross-execution deduplication support to Batcher and Queuer
+Add in-batch/in-queue deduplication support to Batcher and Queuer
 
-This feature extends the existing `deduplicateItems` option to track processed items across batch/execution cycles. When enabled, items that have already been processed will be automatically skipped.
+This feature adds `deduplicateItems` option to prevent duplicate items within the same batch or queue.
 
-### Enhanced Options
+### New Options
 
-- `deduplicateItems: boolean` - Now prevents duplicates **both within and across batches** (default: false)
-- `deduplicateStrategy: 'keep-first' | 'keep-last'` - Only affects in-batch duplicates (default: 'keep-first')
-- `getItemKey: (item) => string | number` - Extract unique key from item
-- `maxTrackedKeys: number` - Maximum keys to track with FIFO eviction (default: 1000)
-- `onDuplicate: (newItem, existingItem?, instance) => void` - Called for both in-batch and cross-batch duplicates
-
-### New Methods
-
-- `hasProcessedKey(key)` - Check if a key has been processed
-- `peekProcessedKeys()` - Get a copy of all processed keys
-- `clearProcessedKeys()` - Clear the processed keys history
-
-### New State Properties
-
-- `processedKeys: Array<string | number>` - Keys that have been processed (similar to RateLimiter's executionTimes)
+- `deduplicateItems: boolean` - Enable automatic deduplication within the current batch/queue (default: false)
+- `deduplicateStrategy: 'keep-first' | 'keep-last'` - Strategy for handling duplicates (default: 'keep-first')
+- `getItemKey: (item) => string | number` - Extract unique key from item (defaults to JSON.stringify for objects)
 
 ### Behavior
 
 When `deduplicateItems` is enabled:
+- **'keep-first'**: Ignores new items if an item with the same key already exists in the batch/queue
+- **'keep-last'**: Replaces existing items with new items that have the same key
 
-1. **In-batch duplicates**: Merged based on `deduplicateStrategy` ('keep-first' or 'keep-last')
-2. **Cross-batch duplicates**: Skipped entirely (already processed)
-3. `onDuplicate` called with `existingItem` for in-batch, `undefined` for cross-batch
-
-### Use Case
-
-Prevents redundant processing when the same data is requested multiple times:
-
-- API calls: Don't fetch user-123 if it was already fetched
-- No-code tools: Multiple components requesting the same resource
-- Event processing: Skip events that have already been handled
-
-Similar to request deduplication in TanStack Query, but at the batching/queuing level.
+### Use Cases
 
-### Persistence Support
+Prevents redundant items within a single batch or queue cycle:
+- API batching: Avoid duplicate IDs in the same batch request
+- Event processing: Deduplicate events before processing
 
-The `processedKeys` can be persisted via `initialState`, following the existing Pacer pattern (similar to RateLimiter):
+### Example
 
 ```typescript
-const savedState = localStorage.getItem('batcher-state')
-const batcher = new Batcher(fn, {
-  deduplicateItems: true,
-  initialState: savedState ? JSON.parse(savedState) : {},
-})
+const batcher = new Batcher<{ userId: string }>(
+  (items) => fetchUsers(items.map(i => i.userId)),
+  {
+    deduplicateItems: true,
+    getItemKey: (item) => item.userId,
+  }
+);
+
+batcher.addItem({ userId: 'user-1' }); // Added to batch
+batcher.addItem({ userId: 'user-2' }); // Added to batch
+batcher.addItem({ userId: 'user-1' }); // Ignored! Already in current batch
+batcher.flush(); // Processes [user-1, user-2]
 ```
 
 Fully opt-in with no breaking changes to existing behavior.
````
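The in-batch semantics added by this commit can be illustrated with a small self-contained sketch. Note this is **not** the actual `@tanstack/pacer` source; `MiniBatcher` and its internals are hypothetical, written only to mirror the behavior the changeset describes: duplicates are merged within the current batch (`'keep-first'` ignores, `'keep-last'` replaces), and all dedup state is discarded when the batch flushes, so nothing is tracked across cycles.

```typescript
// Minimal sketch of in-batch deduplication (hypothetical, not the Pacer implementation).
type Key = string | number

interface DedupOptions<T> {
  deduplicateStrategy?: 'keep-first' | 'keep-last'
  getItemKey?: (item: T) => Key
}

class MiniBatcher<T> {
  private items: T[] = []

  constructor(
    private fn: (items: T[]) => void,
    private options: DedupOptions<T> = {},
  ) {}

  private keyOf(item: T): Key {
    // Assumed default: stringify the item, mirroring the changeset's
    // "defaults to JSON.stringify for objects" note.
    return this.options.getItemKey?.(item) ?? JSON.stringify(item)
  }

  addItem(item: T): void {
    const key = this.keyOf(item)
    const existing = this.items.findIndex((i) => this.keyOf(i) === key)
    if (existing === -1) {
      this.items.push(item) // no duplicate in the current batch
    } else if ((this.options.deduplicateStrategy ?? 'keep-first') === 'keep-last') {
      this.items[existing] = item // 'keep-last': replace the earlier item
    } // 'keep-first': silently ignore the new item
  }

  flush(): void {
    const batch = this.items
    this.items = [] // dedup state resets with the batch: no cross-batch tracking
    this.fn(batch)
  }
}

// Usage: duplicates merge within one batch, but not across batches.
const seen: string[][] = []
const batcher = new MiniBatcher<{ userId: string }>(
  (items) => seen.push(items.map((i) => i.userId)),
  { getItemKey: (item) => item.userId },
)
batcher.addItem({ userId: 'user-1' })
batcher.addItem({ userId: 'user-2' })
batcher.addItem({ userId: 'user-1' }) // ignored ('keep-first' default)
batcher.flush()
batcher.addItem({ userId: 'user-1' }) // added again: new batch, no history
batcher.flush()
console.log(seen) // [['user-1', 'user-2'], ['user-1']]
```

The second flush processing `user-1` again is exactly the "does not track across batch cycles" behavior this commit documents; the removed `processedKeys`/`maxTrackedKeys` machinery would have prevented it.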

examples/react/useBatcherDedup/README.md

Lines changed: 0 additions & 80 deletions
This file was deleted.
