# neptune.ai examples
Neptune is a lightweight experiment tracker for ML teams that struggle with debugging and reproducing experiments, sharing results, and messy model handover. It offers a single place to track, compare, store, and collaborate on experiments and models.
With Neptune, data scientists can develop production-ready models faster, and ML engineers can access model artifacts instantly to deploy them to production.
In this repo, you'll find examples of using Neptune to log and retrieve your ML metadata.
You can run every example with zero setup (no registration needed).
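To give a sense of what the examples cover before you open one, here is a minimal logging-and-retrieval sketch using the `neptune` Python client (1.x API). The project name is only a placeholder, and the anonymous API token is what lets you try Neptune without registering.

```python
import neptune

# Start a run. The project name is illustrative; replace it with
# your own "workspace/project". The anonymous token allows logging
# without creating an account.
run = neptune.init_run(
    project="common/quickstarts",
    api_token=neptune.ANONYMOUS_API_TOKEN,
)

# Log single values (hyperparameters) and metric series.
run["parameters"] = {"lr": 1e-3, "batch_size": 32, "optimizer": "Adam"}
for epoch in range(10):
    run["train/loss"].append(0.99 ** epoch)

# Wait until queued operations are synced, then fetch the series back
# into Python as a pandas DataFrame.
run.wait()
loss_df = run["train/loss"].fetch_values()
print(loss_df.tail())

run.stop()
```

Each run then appears in the Neptune web app, where you can compare it with the other runs in the project.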
| Example | Docs | Neptune | GitHub | Colab |
| --- | --- | --- | --- | --- |
| Quickstart | | | | |
| Track and organize runs | | | | |
| Monitor runs live | | | | |
| Example | Docs | Neptune | GitHub | Colab |
| --- | --- | --- | --- | --- |
| Re-run failed training | | | | |
| Log from sequential pipelines | | | | |
| DDP training experiments | | | | |
| Example | Neptune | GitHub | Colab |
| --- | --- | --- | --- |
| Text classification using fastText | | | |
| Text classification using Keras | | | |
| Text summarization | | | |
| Time series forecasting | | | |
| Example | GitHub | Colab |
| --- | --- | --- |
| Import runs from Weights & Biases | | |
| Copy runs from one Neptune project to another | | |
| Example | GitHub | Colab |
| --- | --- | --- |
| Get Neptune storage per project and user | | |
Check out our docs → https://docs.neptune.ai/