
# Lighter

YAML configuration for PyTorch Lightning experiments.
- **Fast Iteration**: Change hyperparameters from the CLI without editing code.
- **Reproducible**: One YAML file = one experiment. Version-control configs like code.
- **Pure Lightning**: Use any LightningModule. Full PyTorch Lightning power. Zero lock-in.
## What is Lighter?
Lighter runs PyTorch Lightning experiments from YAML configs instead of hardcoded Python values.
You write Lightning code. Lighter handles configuration.
```python
import pytorch_lightning as pl

class MyModule(pl.LightningModule):
    def __init__(self, learning_rate=0.001):
        super().__init__()
        self.lr = learning_rate
        # ... your model code ...

    def training_step(self, batch, batch_idx):
        # ... your training logic ...
        return loss
```
```yaml
model:
  _target_: project.model.MyModule  # Auto-discovered with __lighter__.py
  learning_rate: 0.001

trainer:
  max_epochs: 10
```
```bash
# Run it
lighter fit config.yaml

# Override from CLI
lighter fit config.yaml model::learning_rate=0.01
```
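Each `_target_` entry names a class to instantiate, with its sibling keys passed as constructor arguments. A rough, illustrative Python equivalent of the config and command above (not Lighter's actual internals):

```python
# Illustrative only: roughly what the YAML above describes, not Lighter's internals.
import pytorch_lightning as pl
from project.model import MyModule  # importable thanks to the __lighter__.py marker

model = MyModule(learning_rate=0.001)  # model::learning_rate, overridable from the CLI
trainer = pl.Trainer(max_epochs=10)    # trainer::max_epochs
trainer.fit(model)                     # roughly what `lighter fit config.yaml` drives
```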
## Two Approaches, Same Power
Choose the approach that fits your workflow:
### LightningModule

**Best for:**

- Existing Lightning projects
- Custom training logic
- Full control over everything

**You write:**

- All step methods
- `configure_optimizers()`
- Your own logging

**Lighter adds:**

- YAML configuration
- CLI overrides
- Experiment tracking
### LighterModule

**Best for:**

- New projects
- Standard workflows
- Less boilerplate

**You write:**

- Step implementations only
- Your model's forward logic

**Lighter adds:**

- Automatic `configure_optimizers()`
- Dual logging (step + epoch)
- Config-driven everything
**You can switch anytime.** Both approaches use the same config system. Start with one, switch to the other by changing `_target_`. No code rewrite needed.
## Quick Comparison

### LightningModule
```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl

class MyModule(pl.LightningModule):
    def __init__(self, network, learning_rate=0.001):
        super().__init__()
        self.network = network
        self.lr = learning_rate

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.network(x), y)
        self.log("train/loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)
```
```yaml
trainer:
  _target_: pytorch_lightning.Trainer
  max_epochs: 10

model:
  _target_: project.model.MyModule  # project.file.Class
  network:
    _target_: torchvision.models.resnet18
    num_classes: 10
  learning_rate: 0.001

data:
  _target_: lighter.LighterDataModule
  train_dataloader:
    _target_: torch.utils.data.DataLoader
    batch_size: 32
    dataset:
      _target_: torchvision.datasets.CIFAR10
      root: ./data
      train: true
      download: true
```
### LighterModule

```python
from lighter import LighterModule

class MyModel(LighterModule):
    def training_step(self, batch, batch_idx):
        x, y = batch
        pred = self(x)
        loss = self.criterion(pred, y)
        if self.train_metrics:
            self.train_metrics(pred, y)
        return {"loss": loss}

    def validation_step(self, batch, batch_idx):
        x, y = batch
        pred = self(x)
        loss = self.criterion(pred, y)
        if self.val_metrics:
            self.val_metrics(pred, y)
        return {"loss": loss}
```

Here `self.criterion`, `self.train_metrics`, and `self.val_metrics` are injected from the config below.
```yaml
trainer:
  _target_: pytorch_lightning.Trainer
  max_epochs: 10

model:
  _target_: project.model.MyModel  # project.file.Class
  network:
    _target_: torchvision.models.resnet18
    num_classes: 10
  criterion:
    _target_: torch.nn.CrossEntropyLoss
  optimizer:
    _target_: torch.optim.Adam
    params: "$@model::network.parameters()"
    lr: 0.001
  train_metrics:
    - _target_: torchmetrics.Accuracy
      task: multiclass
      num_classes: 10
  val_metrics: "%model::train_metrics"

data:
  _target_: lighter.LighterDataModule
  train_dataloader:
    _target_: torch.utils.data.DataLoader
    batch_size: 32
    dataset:
      _target_: torchvision.datasets.CIFAR10
      root: ./data
      train: true
      download: true
```
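Two pieces of reference syntax appear above: `params: "$@model::network.parameters()"` evaluates a Python expression against the instantiated network so the optimizer receives its parameters, and `val_metrics: "%model::train_metrics"` copies the train metrics definition so validation uses identical metrics. As an illustrative sketch of what the optimizer entry resolves to (not Lighter's internals):

```python
# Illustrative only: what "$@model::network.parameters()" hands to the optimizer.
import torch
import torchvision

network = torchvision.models.resnet18(num_classes=10)         # model::network
optimizer = torch.optim.Adam(network.parameters(), lr=0.001)  # model::optimizer
```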
## Why Lighter?
### Reproducibility

One YAML = one experiment. Version control, share, compare. See exactly what changed between experiments, as in the sketch below.
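One way to see that diff, assuming configs are committed to git (sketch):

```bash
# Show what changed in an experiment config since the previous commit
git diff HEAD~1 -- config.yaml
```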
### Fast Iteration

Override any config value from the CLI:
```bash
# Change learning rate
lighter fit config.yaml model::learning_rate=0.01

# Use more GPUs
lighter fit config.yaml trainer::devices=4

# Combine multiple changes
lighter fit config.yaml model::learning_rate=0.01 trainer::max_epochs=100
```
### No Lock-In

Lighter is a thin layer over PyTorch Lightning:

- Use any LightningModule
- Use any Lightning callback
- Use any Lightning logger
- Switch back to pure Lightning anytime (see the sketch below)
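For instance, a module written for Lighter configs is still an ordinary LightningModule. A minimal sketch reusing `MyModule` from the Quick Comparison (`project.model` is the hypothetical path from the examples above):

```python
# No Lighter imports needed: the same module trains under vanilla Lightning.
import pytorch_lightning as pl
import torchvision
from project.model import MyModule  # hypothetical path from the examples above

model = MyModule(
    network=torchvision.models.resnet18(num_classes=10),
    learning_rate=0.001,
)
trainer = pl.Trainer(max_epochs=10)
# trainer.fit(model, train_dataloaders=...)  # supply your own DataLoader
```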
## Installation
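Lighter is distributed on PyPI; a minimal install, assuming the package name `project-lighter` (the name used by the project's repository):

```bash
# Assumption: the PyPI package is named project-lighter; verify against the project's docs.
pip install project-lighter
```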
## Get Started
Ready to try it? Pick your path:
- **Quick Start**: Get a model training in 10 minutes.
- **Complete Examples**: Full, working code you can copy-paste.
- **Guides**: Task-focused how-to guides.
## Example Projects
Ready-to-run projects demonstrating Lighter across domains:
| Project | Domain | Features |
|---|---|---|
| cifar10 | Image Classification | Basic setup, MetricCollection, FileWriter |
| eeg | EEG Analysis | Braindecode integration, regression |
| huggingface_llm | Sentiment Classification | Transformers, datasets, model-computed loss |
| lora | Fine-Tuning | PEFT/LoRA, parameter filtering |
| medical_segmentation | Medical Imaging | MONAI, 3D volumes, sliding window |
| self_supervised | SSL Computer Vision | SimCLR, Lightly library |
| video_recognition | Video | 3D CNNs, PyTorchVideo |
| vision_language | Vision-Language | CLIP-style dual encoders |
Each project includes a README with setup instructions and demonstrates different Lighter features.