Ritual ML Workflows
🔦 Overview

What are Ritual machine learning workflows?

Ritual machine learning workflows provide an easy-to-use abstraction for bringing your AI workflows to deployment on Infernet nodes.

Ritual workflows are sets of dependency scripts and Docker images used to integrate modular AI functions onto Ritual nodes. Examples of workflows include training, inference, fine-tuning, and proof generation for both classical models and large language models.
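
As a rough illustration of the shape such a workflow script takes, here is a minimal sketch. The class and method names (ExampleInferenceWorkflow, setup, do_inference) are hypothetical placeholders, not Ritual's actual workflow API.

```python
# Illustrative sketch only: the class and method names below are hypothetical
# placeholders for the general shape of a workflow script, not Ritual's API.
from dataclasses import dataclass
from typing import Any


@dataclass
class InferenceRequest:
    """Input payload handed to the workflow by a node."""
    features: list[float]


class ExampleInferenceWorkflow:
    """A modular AI function: load a model once, then serve inference calls."""

    def setup(self) -> None:
        # A real workflow would load model weights here (packaged in the
        # workflow's Docker image or pulled from a registry).
        self.weights = [0.5, -0.25, 1.0]
        self.bias = 0.1

    def do_inference(self, request: InferenceRequest) -> dict[str, Any]:
        # A trivial linear score stands in for a real model forward pass.
        score = sum(w * x for w, x in zip(self.weights, request.features)) + self.bias
        return {"score": score}


if __name__ == "__main__":
    workflow = ExampleInferenceWorkflow()
    workflow.setup()
    print(workflow.do_inference(InferenceRequest(features=[1.0, 2.0, 3.0])))
```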

Developers can customize and extend Ritual's ML workflows as needed. Ritual currently offers optimized support for training and inference on both classical models and open-source large language models.

See the supported AI hooks, models, and tasks (for instance, training, inference, and proof generation).

Getting Started

To get started, run the installation and customize the templates provided there. Adding AI models as complex as LLMs to your workflow can be as simple as changing a single variable!
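
For example, a workflow template might expose the served model as a single variable. The sketch below is illustrative only; the MODEL_ID variable and model identifier are assumptions, not fields from Ritual's actual templates.

```python
# Hypothetical template excerpt: the MODEL_ID variable and the model name are
# assumptions for illustration, not fields from Ritual's actual templates.
MODEL_ID = "gpt2"  # change this one variable to serve a different model


def load_model(model_id: str = MODEL_ID):
    """Load the configured Hugging Face model; edit MODEL_ID to swap models."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model
```

Swapping in a larger open-source LLM would then amount to pointing that one variable at a different model identifier.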

Tutorials

Here are some tutorials you can follow to bring both classical and large language models to your on-chain use case with Ritual. We currently have a simple logistic classifier tutorial and a large language model tutorial.
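
As background for the classical-model side, here is a standalone logistic classifier in scikit-learn; it is illustrative only and not the tutorial's actual workflow code.

```python
# Standalone scikit-learn example of a simple logistic classifier; this is
# illustrative background, not the tutorial's actual workflow code.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```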

Here are a few other example workflow scripts.

Classic training workflow with ZKPs (coming soon)

LLM Inference workflow (coming soon)

Base Torch Inference workflow (coming soon)
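
Until those scripts land, a minimal PyTorch forward pass illustrates the basic inference pattern such workflows wrap (load a model, switch to eval mode, run it without gradients). This is a generic sketch, not the upcoming Base Torch Inference workflow.

```python
# Generic PyTorch inference pattern: load a model, switch to eval mode, and
# run a forward pass without gradients. Not the Base Torch Inference workflow.
import torch

model = torch.nn.Linear(3, 1)  # stand-in for a trained model loaded from disk
model.eval()

with torch.no_grad():
    output = model(torch.tensor([[1.0, 2.0, 3.0]]))

print(output)
```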

Not sure which model to use? Here are some additional resources:

ML resources 101

FAQs

Check out the FAQs here.