# Pyperch

Pyperch is a lightweight, modular, research- and teaching-oriented library for neural network weight optimization, built directly on top of PyTorch. It trains networks using randomized optimization methods — randomized hill climbing (RHC), simulated annealing (SA), and genetic algorithms (GA) — as well as gradient-based methods and hybrid combinations of the two.
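To give a flavor of what "randomized search for weight optimization" means, here is a dependency-free sketch of randomized hill climbing on a toy weight vector. This is a conceptual illustration in plain Python, not pyperch's API or implementation:

```python
# Conceptual sketch of randomized hill climbing (RHC) — not the pyperch API.
# A single weight is perturbed at random; the change is kept only if it
# improves the loss.
import random

def loss(w):
    # Toy objective: squared distance from a fixed target weight vector.
    target = [1.0, -2.0, 0.5]
    return sum((wi - ti) ** 2 for wi, ti in zip(w, target))

def rhc(w, steps=2000, step_size=0.1, seed=0):
    rng = random.Random(seed)
    best, best_loss = list(w), loss(w)
    for _ in range(steps):
        candidate = list(best)
        i = rng.randrange(len(candidate))          # pick one weight
        candidate[i] += rng.uniform(-step_size, step_size)  # perturb it
        cand_loss = loss(candidate)
        if cand_loss < best_loss:                  # keep improvements only
            best, best_loss = candidate, cand_loss
    return best, best_loss

weights, final_loss = rhc([0.0, 0.0, 0.0])
```

In pyperch the same idea is applied to a network's flattened parameter tensor, with the loss computed by a forward pass over the training data.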
## Installation

```shell
pip install pyperch
```

If developing locally:

```shell
poetry install
```

## Documentation

The public API and usage examples are documented in the repository. Key entry points include:
- Perch Builder API - experiment construction, training, and hybrid optimization
- Optuna Search API - hyperparameter search using an adapter-based Optuna integration
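The Optuna integration is adapter-based: conceptually, an adapter maps a trial's suggested hyperparameters onto a training run and returns a score. A dependency-free sketch of that shape — the `suggest_float` method mirrors Optuna's trial API, but the objective and search loop here are illustrative stand-ins, not pyperch's implementation:

```python
# Dependency-free sketch of adapter-style hyperparameter search.
# `Trial.suggest_float` mimics Optuna's interface; everything else is
# an illustrative stand-in, not pyperch code.
import random

class Trial:
    def __init__(self, rng):
        self.rng = rng
        self.params = {}

    def suggest_float(self, name, low, high):
        value = self.rng.uniform(low, high)
        self.params[name] = value
        return value

def objective(trial):
    # A real adapter would build and train a model from the suggested
    # hyperparameters; here we score a toy function instead.
    step_size = trial.suggest_float("step_size", 0.01, 1.0)
    return (step_size - 0.3) ** 2  # pretend 0.3 is the best setting

def search(n_trials=50, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        trial = Trial(rng)
        score = objective(trial)
        if best is None or score < best[0]:
            best = (score, trial.params)
    return best

best_score, best_params = search()
```

With real Optuna, the search loop is replaced by `study.optimize(objective, n_trials=...)` and the trial object is supplied by Optuna's samplers.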
## Examples

Notebook/Colab examples showing common workflows are included in the repository. The examples cover:
- Classification and regression
- Randomized optimization (RHC, SA, GA)
- Gradient and hybrid optimization
- Layer freezing and meta-optimization
- Optuna-based hyperparameter search
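As a companion to the randomized-optimization examples above, here is a minimal simulated annealing sketch in plain Python. Unlike hill climbing, SA sometimes accepts worse candidates, with a probability that shrinks as the temperature decays; the function names and hyperparameters below are illustrative, not pyperch's:

```python
# Conceptual sketch of simulated annealing (SA) — not the pyperch API.
import math
import random

def loss(w):
    return sum(wi ** 2 for wi in w)  # minimum at the zero vector

def simulated_annealing(w, steps=5000, t0=1.0, decay=0.999, seed=0):
    rng = random.Random(seed)
    current, current_loss = list(w), loss(w)
    t = t0
    for _ in range(steps):
        candidate = [wi + rng.gauss(0, 0.1) for wi in current]
        cand_loss = loss(candidate)
        delta = cand_loss - current_loss
        # Always accept improvements; accept worse moves with probability
        # exp(-delta / t), which shrinks as the temperature decays.
        if delta < 0 or rng.random() < math.exp(-delta / max(t, 1e-12)):
            current, current_loss = candidate, cand_loss
        t *= decay
    return current, current_loss

annealed, annealed_loss = simulated_annealing([2.0, -3.0])
```

The early high-temperature phase lets the search escape poor regions; the late low-temperature phase behaves like hill climbing.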
## Legacy optimizers

If you are upgrading from Pyperch ≤ 0.1.6, the original standalone (functional) optimizers have been preserved for backward compatibility. The previous implementations are available at:

- Git tag: `v1-legacy`
- Directory: `pyperch/optim/`

The refactored optimizers can be found under `pyperch.optim.*`.
## Contributing

Contributions are welcome. To submit a change:

- Fork the repository
- Create a feature branch:

  ```shell
  git checkout -b feature/my-change
  ```

- Commit your work:

  ```shell
  git commit -m "feat: describe your change"
  ```

- Push your branch:

  ```shell
  git push origin feature/my-change
  ```

- Open a pull request on GitHub
Before opening a PR, run:

```shell
poetry run black pyperch
poetry run ruff check pyperch --fix
```

This ensures consistent formatting and linting across the project.
## License

MIT License