# The Adapter Pattern

## The Problem
Different ML components may expect different data formats. Consider a scenario where:
- Dataset returns dictionaries of tensors
- Model expects tensors
- Loss function needs specific argument order
- Metrics need different format than loss
Traditionally, you'd write glue code specific to this exact combination of components. This tightly couples them, making reuse and experimentation difficult.
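For instance, the glue code might look like the following sketch (component names, shapes, and keys are illustrative, not taken from Lighter):

```python
import torch

# Hypothetical dataset output: a dictionary of tensors.
batch = {"image": torch.randn(4, 3, 8, 8), "label": torch.randint(0, 2, (4,))}

model = torch.nn.Conv2d(3, 1, kernel_size=3)  # expects a raw tensor
criterion = torch.nn.BCEWithLogitsLoss()      # expects (pred, target) as floats

# Every format mismatch is patched inline, tying this code to this dataset:
pred = model(batch["image"])                  # unpack the dict by hand
loss = criterion(pred.mean(dim=(1, 2, 3)),    # reshape to match the loss
                 batch["label"].float())      # cast to match the loss
```

Swapping in a dataset that yields tuples, or a loss with a different argument order, would require editing this pipeline by hand.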
## The Solution: Adapters

Data flows through Lighter's System via adapters, which bridge components with incompatible interfaces.
In software engineering, the adapter pattern allows incompatible interfaces to work together. Lighter uses adapters to handle variability in data formats.
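A minimal, framework-free sketch of the classic pattern (the sensor and adapter names here are invented for illustration):

```python
class FahrenheitSensor:
    """An existing component with an interface the caller can't use directly."""
    def read_f(self):
        return 212.0

class CelsiusAdapter:
    """Wraps the incompatible interface behind the one the caller expects."""
    def __init__(self, sensor):
        self.sensor = sensor

    def read_c(self):
        # Translate between the two interfaces in one place.
        return (self.sensor.read_f() - 32.0) * 5.0 / 9.0

reading = CelsiusAdapter(FahrenheitSensor()).read_c()  # 100.0
```

Neither component changes; the adapter owns the translation, which is exactly the role Lighter's adapters play between datasets, models, losses, and metrics.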
## Lighter's Adapter Types
| Adapter | Purpose | When to Use |
|---|---|---|
| BatchAdapter | Extract data from batches | Different dataset formats |
| CriterionAdapter | Format loss inputs | Custom loss functions |
| MetricsAdapter | Format metric inputs | Third-party metrics |
| LoggingAdapter | Transform before logging | Visualization needs |
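As a sketch of the first row, a BatchAdapter-style callable might map a dictionary batch onto the `(input, target)` pair the rest of the pipeline expects (hypothetical class for illustration, not Lighter's actual `BatchAdapter` API):

```python
import torch

class DictBatchAdapter:
    """Extracts (input, target) from a dict batch by configurable keys."""
    def __init__(self, input_key, target_key):
        self.input_key = input_key
        self.target_key = target_key

    def __call__(self, batch):
        return batch[self.input_key], batch[self.target_key]

# The same pipeline works with any dict-style dataset; only the keys change.
adapter = DictBatchAdapter("image", "label")
x, y = adapter({"image": torch.zeros(2, 3), "label": torch.ones(2)})
```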
## Example: Task-Agnostic Configuration

```yaml
adapters:
  train:
    criterion:
      _target_: lighter.adapters.CriterionAdapter
      pred_transforms:        # apply sigmoid before the loss
        _target_: torch.sigmoid
      pred_argument: 0        # map pred to the first argument
      target_argument: 1      # map target to the second argument
```
This enables any task—classification, segmentation, self-supervised learning—without framework modifications.
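The behavior that config describes can be sketched in plain Python (a hypothetical re-implementation for illustration, not Lighter's actual `CriterionAdapter`):

```python
import torch

def adapted_criterion(criterion, pred, target, pred_transforms=()):
    """Apply transforms to pred, then pass (pred, target) positionally."""
    for transform in pred_transforms:  # e.g. torch.sigmoid
        pred = transform(pred)
    args = [None, None]
    args[0] = pred                     # pred_argument: 0
    args[1] = target                   # target_argument: 1
    return criterion(*args)

# BCELoss needs probabilities in [0, 1]; the adapter applies sigmoid first,
# so the model itself can stay task-agnostic and output raw logits.
loss = adapted_criterion(
    torch.nn.BCELoss(),
    pred=torch.randn(4),                           # raw logits
    target=torch.randint(0, 2, (4,)).float(),
    pred_transforms=[torch.sigmoid],
)
```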
## Under the Hood

During a training step, data passes through the components in this order:

1. BatchAdapter - extracts input/target from the batch
2. Forward pass - the model processes the input
3. CriterionAdapter - formats inputs for loss computation
4. MetricsAdapter - formats inputs for metric computation
5. LoggingAdapter - transforms outputs for visualization
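The five steps above can be composed into one schematic training step (the adapter callables here are illustrative stand-ins, not Lighter's real classes):

```python
import torch

def training_step(batch, model, criterion, metric,
                  batch_adapter, criterion_adapter,
                  metrics_adapter, logging_adapter):
    x, y = batch_adapter(batch)                    # 1. BatchAdapter
    pred = model(x)                                # 2. forward pass
    loss = criterion(*criterion_adapter(pred, y))  # 3. CriterionAdapter
    score = metric(*metrics_adapter(pred, y))      # 4. MetricsAdapter
    logged = logging_adapter(pred)                 # 5. LoggingAdapter
    return loss, score, logged

loss, score, logged = training_step(
    batch={"x": torch.randn(4, 2), "y": torch.randint(0, 2, (4,)).float()},
    model=torch.nn.Linear(2, 1),
    criterion=torch.nn.BCEWithLogitsLoss(),
    metric=lambda p, t: ((torch.sigmoid(p.squeeze(-1)) > 0.5).float()
                         == t).float().mean(),      # accuracy on logits
    batch_adapter=lambda b: (b["x"], b["y"]),       # dict -> (input, target)
    criterion_adapter=lambda p, t: (p.squeeze(-1), t),  # match loss shapes
    metrics_adapter=lambda p, t: (p, t),            # metric takes raw logits
    logging_adapter=lambda p: torch.sigmoid(p),     # log probabilities
)
```

Each component stays unaware of the others' formats; all translation lives in the adapters, which is what makes them individually swappable.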
## Practical Usage
For detailed adapter configuration and examples, see:
- Adapters How-To Guide - Complete usage guide
- Metrics Guide - Using MetricsAdapter
- Writers Guide - Using LoggingAdapter