# AnyPINN
> **Work in Progress** — This project is under active development and APIs may change. If you run into any issues, please open an issue on GitHub.
A modular Python library for solving differential equations with Physics-Informed Neural Networks.
AnyPINN lets you go from zero to a running PINN experiment in seconds, or gives you full control to define custom physics, constraints, and training loops. You decide how deep to go.
## 🚀 Quick Start
The fastest way to start is the bootstrap CLI, which interactively scaffolds a complete, runnable project. Run it with `uvx` (ships with uv) or with `pipx`:
```text
? Choose a starting point:
> SIR Epidemic Model
  ...
  Custom ODE
  Blank project

? Select training data source:
> Generate synthetic data
  Load from CSV

? Include Lightning training wrapper? (Y/n)

Creating my-project/
  ✓ pyproject.toml   project metadata & dependencies
  ✓ ode.py           your ODE definition
  ✓ config.py        hyperparameters with sensible defaults
  ✓ train.py         ready-to-run training script
  ✓ data/            data directory

Done! Run: cd my-project && uv sync && uv run train.py
```
All prompts are also available as flags to skip the interactive flow:
| Flag | Values | Description |
|---|---|---|
| `--help`, `-h` | — | Show help and exit |
| `--list-templates`, `-l` | — | Print all templates with descriptions and exit |
| `--template`, `-t` | built-in template name, `custom`, or `blank` | Starting template |
| `--data`, `-d` | `synthetic`, `csv` | Training data source |
| `--lightning`, `-L` | — | Include PyTorch Lightning wrapper |
| `--no-lightning`, `-NL` | — | Exclude PyTorch Lightning wrapper |
## 👥 Who Is This For?
AnyPINN is built around progressive complexity. Start simple, go deeper only when you need to.
| User | Goal | How |
|---|---|---|
| Experimenter | Run a known problem, tweak parameters, see results | Pick a built-in template, change config, press start |
| Researcher | Define new physics or custom constraints | Subclass `Constraint` and `Problem`, use the provided training engine |
| Framework builder | Custom training loops, novel architectures | Use `anypinn.core` directly — zero Lightning required |
## 💡 Examples
The `examples/` directory has ready-made, self-contained scripts covering epidemic models, oscillators, predator-prey dynamics, and more — from a minimal ~80-line core-only script to full Lightning stacks. They're a great source of inspiration when defining your own problem.
## 🔬 Defining Your Own Problem
If you want to go beyond the built-in templates, here is the full workflow for defining a custom ODE inverse problem.
### 1. Define the ODE

Implement a function matching the `ODECallable` protocol:

```python
from torch import Tensor

from anypinn.core import ArgsRegistry


def my_ode(x: Tensor, y: Tensor, args: ArgsRegistry) -> Tensor:
    """Return dy/dx given current state y and position x."""
    k = args["k"](x)  # learnable or fixed parameter
    return -k * y  # simple exponential decay
```
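Before wiring an ODE like this into a problem, it can help to sanity-check the math itself. The decay equation above has the closed-form solution y(x) = y₀·e^(−kx); here is a stdlib-only check (plain floats, hypothetical values, no AnyPINN or PyTorch) that a central finite difference of that solution matches the right-hand side:

```python
import math


def decay_rhs(x: float, y: float, k: float) -> float:
    """Plain-float analogue of my_ode: dy/dx = -k * y."""
    return -k * y


def exact_solution(x: float, y0: float, k: float) -> float:
    """Closed-form solution of the decay ODE: y(x) = y0 * exp(-k * x)."""
    return y0 * math.exp(-k * x)


# A central finite difference of the exact solution should match the RHS.
k, y0, x, h = 0.7, 2.0, 1.3, 1e-6
numeric_dydx = (exact_solution(x + h, y0, k) - exact_solution(x - h, y0, k)) / (2 * h)
print(abs(numeric_dydx - decay_rhs(x, exact_solution(x, y0, k), k)) < 1e-6)  # True
```

Catching a sign error or a mistyped parameter here is much cheaper than discovering it after a long PINN training run.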
### 2. Configure hyperparameters

```python
from dataclasses import dataclass

from anypinn.problems import ODEHyperparameters


@dataclass(frozen=True, kw_only=True)
class MyHyperparameters(ODEHyperparameters):
    pde_weight: float = 1.0
    ic_weight: float = 10.0
    data_weight: float = 5.0
```
### 3. Build the problem

```python
from anypinn.problems import ODEInverseProblem, ODEProperties

# `field`, `param`, `y0`, and `hp` (a Field, a Parameter, the initial
# condition, and a MyHyperparameters instance) are assumed to be defined.
props = ODEProperties(ode=my_ode, args={"k": param}, y0=y0)

problem = ODEInverseProblem(
    ode_props=props,
    fields={"u": field},
    params={"k": param},
    hp=hp,
)
```
### 4. Train

```python
import pytorch_lightning as pl
import torch

from anypinn.lightning import PINNModule

# Option A: with Lightning (batteries included).
# `dm` is your PINNDataModule instance.
module = PINNModule(problem, hp)
trainer = pl.Trainer(max_epochs=50_000)
trainer.fit(module, datamodule=dm)

# Option B: your own training loop (core only, no Lightning).
# `dataloader` and `my_log_fn` are whatever you use for batching and logging.
optimizer = torch.optim.Adam(problem.parameters(), lr=1e-3)
for batch in dataloader:
    optimizer.zero_grad()
    loss = problem.training_loss(batch, log=my_log_fn)
    loss.backward()
    optimizer.step()
```
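For intuition about what such a loop does, here is a toy, library-free version of the same inverse-problem idea: recover an unknown decay rate from observations by gradient descent. The closed-form model `exp(-k * x)` stands in for a neural `Field`; everything below is plain PyTorch with hypothetical values, not AnyPINN code:

```python
import torch

# Noise-free observations from a decay process with true rate k = 1.5.
x = torch.linspace(0.0, 2.0, 64)
y_obs = torch.exp(-1.5 * x)

k = torch.tensor(0.0, requires_grad=True)  # learnable parameter, bad initial guess
optimizer = torch.optim.Adam([k], lr=0.05)

for _ in range(2000):
    optimizer.zero_grad()
    y_pred = torch.exp(-k * x)             # closed-form model in place of a network
    loss = ((y_pred - y_obs) ** 2).mean()  # the "data constraint" term
    loss.backward()
    optimizer.step()

print(abs(k.item() - 1.5) < 0.05)  # True: the optimizer recovers k ≈ 1.5
```

A real PINN run adds the residual and initial-condition terms to the same loss and optimizes network weights alongside `k`, but the mechanics are identical.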
## 🏗️ Architecture
AnyPINN is split into four layers with a strict dependency direction — outer layers depend on inner ones, never the reverse.
```mermaid
graph TD
    EXP["Your Experiment / Generated Project"]
    EXP --> CAT
    EXP --> LIT

    subgraph CAT["anypinn.catalog"]
        direction LR
        CA1["SIR / SEIR"]
        CA2["DampedOscillator"]
        CA3["LotkaVolterra"]
    end

    subgraph LIT["anypinn.lightning (optional)"]
        direction LR
        L1["PINNModule"]
        L2["Callbacks"]
        L3["PINNDataModule"]
    end

    subgraph PROB["anypinn.problems"]
        direction LR
        P1["ResidualsConstraint"]
        P2["ICConstraint"]
        P3["DataConstraint"]
        P4["ODEInverseProblem"]
    end

    subgraph CORE["anypinn.core (standalone · pure PyTorch)"]
        direction LR
        C1["Problem · Constraint"]
        C2["Field · Parameter"]
        C3["Config · Context"]
    end

    CAT -->|depends on| PROB
    CAT -->|depends on| CORE
    LIT -->|depends on| CORE
    PROB -->|depends on| CORE
```
### `anypinn.core` — The Math Layer

Pure PyTorch. Defines what a PINN problem is, with no opinions about training.

- `Problem` — Aggregates constraints, fields, and parameters. Provides `training_loss()` and `predict()`.
- `Constraint` (ABC) — A single loss term. Subclass it to express any physics equation, boundary condition, or data-matching objective.
- `Field` — MLP mapping input coordinates to state variables (e.g., `t → [S, I, R]`).
- `Parameter` — Learnable scalar or function-valued parameter (e.g., `β` in SIR).
- `InferredContext` — Runtime domain bounds and validation references, extracted from data and injected into constraints automatically.
### `anypinn.lightning` — The Training Engine (optional)

A thin wrapper plugging a `Problem` into PyTorch Lightning:

- `PINNModule` — A `LightningModule` wrapping any `Problem`. Handles optimizer setup, context injection, and prediction.
- `PINNDataModule` — Abstract data module managing loading, config-driven collocation sampling, and context creation. The collocation strategy is selected via `TrainingDataConfig.collocation_sampler` (`"random"`, `"uniform"`, `"latin_hypercube"`, `"log_uniform_1d"`, or `"adaptive"`).
- Callbacks — SMMA-based early stopping, formatted progress bars, data scaling, prediction writers.
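As a rough illustration of what a sampler name like `"log_uniform_1d"` implies: drawing collocation points uniformly in log space concentrates them near the lower domain bound, which helps when the dynamics are fastest at small coordinates. A stdlib-only sketch of that idea (my illustration, not the library's implementation):

```python
import math
import random


def log_uniform_1d(n: int, low: float, high: float, seed: int = 0) -> list[float]:
    """Sample n points log-uniformly in [low, high]; low must be > 0."""
    rng = random.Random(seed)
    lo, hi = math.log(low), math.log(high)
    return [math.exp(rng.uniform(lo, hi)) for _ in range(n)]


pts = log_uniform_1d(1000, 1e-3, 1.0)
print(min(pts) >= 1e-3 and max(pts) <= 1.0)  # True: samples stay in the domain

# About half the samples land below the geometric mean sqrt(low * high) ≈ 0.032,
# whereas a plain uniform sampler would put ~97% of points above it.
frac_small = sum(p < math.sqrt(1e-3) for p in pts) / 1000
print(0.4 < frac_small < 0.6)  # True
```

The `"adaptive"` option presumably resamples based on training feedback instead of a fixed distribution; see the library source for the actual strategies.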
### `anypinn.problems` — ODE Building Blocks

Ready-made constraints for ODE inverse problems:

- `ResidualsConstraint` — ‖dy/dt − f(t, y)‖² via autograd
- `ICConstraint` — ‖y(t₀) − y₀‖²
- `DataConstraint` — ‖prediction − observed data‖²
- `ODEInverseProblem` — Composes all three with configurable weights
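The residual term hinges on obtaining dy/dt from the network output by automatic differentiation. Stripped down to plain PyTorch (assuming nothing about AnyPINN's internals), the mechanism looks like this:

```python
import torch

t = torch.linspace(0.0, 1.0, 50, requires_grad=True)
y = torch.exp(-2.0 * t)  # stand-in for the field's output u(t)

# dy/dt via autograd; create_graph=True keeps the graph so the residual
# itself can be differentiated again during training.
(dy_dt,) = torch.autograd.grad(y, t, grad_outputs=torch.ones_like(y), create_graph=True)

residual = dy_dt - (-2.0 * y)  # dy/dt - f(t, y) for f(t, y) = -2y
loss = residual.pow(2).mean()  # the ‖·‖² reduction
print(loss.item() < 1e-10)     # True: exp(-2t) solves dy/dt = -2y exactly
```

With a neural network in place of `torch.exp`, the same two calls yield a residual loss whose minimization pushes the network toward a solution of the ODE.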
### `anypinn.catalog` — Problem-Specific Building Blocks

Drop-in ODE functions and DataModules for specific systems. See `anypinn/catalog/` for the full list.
## 🛠️ Tooling
| Tool | Purpose |
|---|---|
| uv | Dependency management |
| just | Task automation |
| Ruff | Linting and formatting |
| pytest | Testing |
| ty | Type checking |
All common tasks (test, lint, format, type-check, docs) are available via `just`.
> **devenv users:** `uv sync` installs to `.devenv/state/venv` instead of the standard `.venv`, so ty cannot auto-discover it. Create a gitignored `ty.toml` at the project root pointing ty at that environment. `ty.toml` takes full precedence over `pyproject.toml`, so all three settings are required.
## 🤝 Contributing
See CONTRIBUTING.md for setup instructions, code style guidelines, and the pull request workflow.