| | @Pipeline | @Filter |
|---|---|---|
| Where it runs | MongoDB server | Your JVM |
| When it runs | Once at stream startup | On every received event |
| What it filters | Events crossing the wire | Events dispatched to handlers |
| Best for | Reducing network traffic | Complex logic, Spring beans, service calls |
Use @Pipeline as your first line of defense: events that don't match never leave MongoDB. Use @Filter when the decision requires Java code that can't be expressed as an aggregation stage.
## @Pipeline — Server-side filtering
The @Pipeline method returns an aggregation pipeline that MongoDB applies directly in the oplog. Only matching events are sent to your application over the wire.
The method is called once at stream startup — the pipeline is static for the lifetime of the stream.
### Signatures
Three return types are supported:

- `List<Bson>`
- `Aggregation` (Spring Data)
- `List<AggregationOperation>`
Native MongoDB driver — most explicit, no Spring dependency:
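A minimal sketch of the `List<Bson>` style. The listener class, the `orders` collection, and the `fullDocument.amount` condition are invented for illustration; only the @ChangeStream, @Pipeline annotations come from this page:

```java
import java.util.List;
import org.bson.conversions.Bson;
import com.mongodb.client.model.Aggregates;
import com.mongodb.client.model.Filters;

@ChangeStream(collection = "orders")   // hypothetical stream declaration
public class OrderListener {

    @Pipeline
    public List<Bson> pipeline() {
        // Runs inside MongoDB: only matching change events cross the wire.
        return List.of(
            Aggregates.match(Filters.and(
                Filters.in("operationType", "insert", "update"),
                Filters.gte("fullDocument.amount", 100)
            ))
        );
    }
}
```

The driver's `Aggregates` and `Filters` builders keep the pipeline explicit without pulling in Spring Data's aggregation API.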
## @Filter — Application-side filtering
The @Filter method is a Java predicate evaluated on every event received by FlowWarden, before the event is dispatched to your handler. Events that don't pass the filter are silently dropped — their resume token is still tracked.
Because @Filter runs in Java, it can call Spring services, access request context, or apply any logic that cannot be expressed as a MongoDB aggregation stage.
### Signatures
Two styles are supported:

- boolean method
- Predicate factory
Takes a ChangeStreamContext and returns a boolean — the most readable style:

## Combining both — the funnel approach
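As a sketch of the boolean style: the `getFullDocument()` accessor, the `Order` type, and the injected `FraudService` are assumptions, not part of the documented API:

```java
import org.springframework.beans.factory.annotation.Autowired;

@ChangeStream(collection = "orders")   // hypothetical stream declaration
public class OrderListener {

    @Autowired
    private FraudService fraudService; // hypothetical Spring bean

    @Filter
    public boolean filter(ChangeStreamContext<Order> ctx) {
        // Runs in the JVM on every received event; returning false drops it.
        Order order = ctx.getFullDocument();   // assumed accessor
        return order != null && !fraudService.isSuspicious(order);
    }
}
```

Because the predicate is an ordinary bean method, it can use any injected service — exactly the kind of logic that can't be pushed into an aggregation stage.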
@Pipeline and @Filter can coexist on the same @ChangeStream. MongoDB pre-filters server-side, then Java refines application-side. This is the most efficient pattern for high-throughput streams with complex filtering needs.
Example — pre-filter by operation type server-side, then check business logic in Java:
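A sketch of the funnel, assuming a `payments` collection, a `Payment` type, a `getFullDocument()` accessor, and an injected `RiskEngine` bean (all invented for illustration):

```java
import java.util.List;
import org.bson.conversions.Bson;
import com.mongodb.client.model.Aggregates;
import com.mongodb.client.model.Filters;

@ChangeStream(collection = "payments") // hypothetical stream declaration
public class PaymentListener {

    private final RiskEngine riskEngine;   // hypothetical Spring bean

    public PaymentListener(RiskEngine riskEngine) {
        this.riskEngine = riskEngine;
    }

    // Server-side: only inserts ever leave MongoDB.
    @Pipeline
    public List<Bson> pipeline() {
        return List.of(Aggregates.match(
            Filters.eq("operationType", "insert")));
    }

    // Application-side: refine with logic MongoDB can't express.
    @Filter
    public boolean filter(ChangeStreamContext<Payment> ctx) {
        Payment payment = ctx.getFullDocument();   // assumed accessor
        return payment != null && riskEngine.approve(payment);
    }
}
```

The pipeline cuts the event volume at the source; the filter only has to evaluate the small subset that survives.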
## Constraints and pitfalls
### What happens to filtered events?
Events rejected by @Filter are silently dropped — they never reach any handler method. Their resume token is still tracked internally, so checkpointing is not affected. Filtered events are not sent to the Dead Letter Queue.

If you need to log or count filtered events, use the boolean filter(ChangeStreamContext<?> ctx) style and add your own logging before returning false.
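A sketch of that pattern; the `isRelevant` predicate, the `log` field, and the Micrometer-style counter are placeholders for your own instrumentation:

```java
@Filter
public boolean filter(ChangeStreamContext<?> ctx) {
    boolean accepted = isRelevant(ctx);        // your own predicate
    if (!accepted) {
        // Side effects here are the only visibility you get into
        // dropped events — they will not appear in any handler or DLQ.
        log.debug("Dropping change event: {}", ctx);
        filteredCounter.increment();           // e.g. a metrics counter
    }
    return accepted;
}
```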
### @Pipeline stages and operationType
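A typical first stage matches on operationType; a sketch using the native driver builders (driver imports abbreviated):

```java
import java.util.List;
import org.bson.conversions.Bson;
import com.mongodb.client.model.Aggregates;
import com.mongodb.client.model.Filters;

public class StageSketch {
    @Pipeline
    public List<Bson> pipeline() {
        // Same effect as declaring insert/update via the annotation,
        // but open to extra field conditions in the same $match.
        return List.of(Aggregates.match(
            Filters.in("operationType", "insert", "update")));
    }
}
```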
The most common first stage in a @Pipeline is a $match on operationType. This is equivalent to the operationTypes attribute on @ChangeStream but expressed as an aggregation stage. You can combine both, though declaring it in @Pipeline gives you more flexibility (e.g. pairing with additional field conditions in the same $match).

## See Also
- @Pipeline reference: all supported return types and validation rules
- @Filter reference: signature styles and startup validation
- How it Works: where filtering fits in the event processing pipeline
- Checkpoint & Resume: how filtered events interact with resume tokens

