Infernet ML
What is Infernet ML?
Ritual provides easy-to-use abstractions for creating AI/ML workflows that can be deployed on Infernet nodes.
The infernet-ml library is a Python SDK that provides a set of tools and extendable classes for creating and deploying machine learning workflows. It is designed to be easy to use and offers a consistent interface for data pre-processing, inference, and post-processing.
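Every workflow, regardless of the underlying model, follows that same pre-process, run, post-process lifecycle. The self-contained sketch below illustrates the pattern; the class and method names here are illustrative only, not infernet-ml's actual API, so consult the API reference for the real base classes.

```python
from abc import ABC, abstractmethod
from typing import Any


class InferenceWorkflow(ABC):
    """Illustrative base class (hypothetical, not infernet-ml's own):
    every workflow exposes the same three-stage pipeline."""

    @abstractmethod
    def do_preprocessing(self, raw_input: Any) -> Any:
        """Turn raw user input into the tensor/prompt the model expects."""

    @abstractmethod
    def do_run_model(self, model_input: Any) -> Any:
        """Execute the underlying model (ONNX, Torch, hosted API, ...)."""

    @abstractmethod
    def do_postprocessing(self, model_output: Any) -> Any:
        """Shape the raw model output into the caller-facing result."""

    def inference(self, raw_input: Any) -> Any:
        # The consistent entry point callers use, regardless of model type.
        model_input = self.do_preprocessing(raw_input)
        model_output = self.do_run_model(model_input)
        return self.do_postprocessing(model_output)
```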
Batteries Included
We provide a set of pre-built workflows for common use cases: running ONNX models, Torch models, any Huggingface model via the Huggingface inference client, and even closed-source models such as OpenAI's GPT-4.
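To make concrete what one of these pre-built workflows wraps, here is a plain onnxruntime call. This is deliberately not infernet-ml's workflow API, and "model.onnx" is a hypothetical path, but it shows the pre-process, run, and post-process steps the ONNX workflow handles for you.

```python
# Plain onnxruntime, shown only to illustrate what an ONNX workflow wraps.
# "model.onnx" is a hypothetical path to an exported model.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")
input_name = session.get_inputs()[0].name

# Pre-process: shape raw input into the tensor the model expects.
x = np.array([[1.0, 2.0, 3.0, 4.0]], dtype=np.float32)

# Inference: run the ONNX graph.
outputs = session.run(None, {input_name: x})

# Post-process: extract the result in whatever form the caller needs.
print(outputs[0])
```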
Getting Started
Head over to the next section for installation and a quick walkthrough of the ML workflows.
🎉 What's new in infernet-ml 2.0?
The following new features have been added in infernet-ml 2.0:
- Addition of a ModelManager class for uploading/downloading models to/from various storage layers. Currently supported: huggingface, arweave. For a tutorial on this, head to the Managing Models page.
- RitualArtifactManager is a base class for managing various kinds of artifacts, used both for ML models and for EZKL artifacts. To see it in action, check out the Artifact Management tutorial.
- RitualVector is an easy-to-use class for representing vectors on-chain. It supports both fixed-point and floating-point representations (a library-independent sketch of the fixed-point encoding follows this list). Check out the Vectors section for more information.
- EZKL is an engine for proving ML model inference in zero knowledge. We provide utility functions to generate zk artifacts, generate proofs, and verify those proofs. Check out the EZKL documentation for more information.
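The fixed-point representation mentioned above is the usual way to put fractional values on-chain: scale each float by a power of ten and store the result as an integer. The sketch below is library-independent (the helper functions are hypothetical, not RitualVector's API) and simply illustrates that encoding.

```python
from typing import List

# Hypothetical helpers illustrating fixed-point encoding; not RitualVector's API.

def to_fixed_point(values: List[float], decimals: int = 18) -> List[int]:
    """Encode floats as integers with `decimals` implied decimal places."""
    scale = 10 ** decimals
    return [round(v * scale) for v in values]


def from_fixed_point(values: List[int], decimals: int = 18) -> List[float]:
    """Decode fixed-point integers back into floats."""
    scale = 10 ** decimals
    return [v / scale for v in values]


# Round-trip a small vector through the fixed-point encoding.
vec = [0.25, -1.5, 3.141592]
encoded = to_fixed_point(vec)        # e.g. 0.25 -> 250000000000000000
decoded = from_fixed_point(encoded)
assert all(abs(a - b) < 1e-9 for a, b in zip(vec, decoded))
```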