Infernet Node
The Infernet Node is a lightweight off-chain client for Infernet responsible for fulfilling compute workloads:
- Nodes listen for on-chain subscriptions (via the Coordinator contract) or off-chain requests (via the REST API)
- Nodes orchestrate Dockerized containers (aka "services") that consume on-chain and off-chain inputs
- Nodes deliver outputs and optional proofs via on-chain transactions or the off-chain API
Example use cases
James has set up a Governor contract for his DAO, inheriting the Infernet SDK. Every time a proposal is created, the contract kicks off a new on-chain Subscription request, alerting an Infernet Node of the new proposal. Once picked up by the node, James' custom governor-quantitative-workflow is run, and the node responds on-chain with an output and associated computation proof.
Emily is developing a new NFT collection that lets minters automatically add new traits to their NFTs by posting what they'd like in plaintext (think, "an Infernet-green hoodie"). Emily sets up a minting website that posts signed Delegate Subscriptions to an Infernet node running her custom stable-diffusion-workflow. This workflow parses plaintext user input and generates a new Base64 image, with the Infernet node posting the final image to her smart contract via an on-chain transaction.
Travis is building a new web app that allows his users to chat with AI avatars. He posts new messages via Delegate Subscriptions to his Infernet node running his custom llm-inference-workflow via the HTTP API and receives a response instantly over the same API. He surfaces these responses to users in his web app, offering a snappy user experience, while his node asynchronously publishes a proof of computation on-chain, letting his users verify the integrity of the responses in the future.
Granular configuration
Infernet Nodes offer granular runtime configuration and permissioning. Operators have full flexibility in:
- Running any arbitrary compute workload (by creating an Infernet-compatible container)
- Using both public container images and private images via Docker Hub
- Choosing to serve on-chain subscriptions, off-chain requests, or both
- Configuring on-chain parameters, including `max_gas_limit`, how many blocks to trail the chain head, and more
- Restricting workload access by IP address, on-chain address, or delegated contract address
- Specifying workload configuration parameters (including environment variables, execution ordering, etc.)
- Optionally forwarding system statistics to Ritual, which powers the Infernet Router
All of these parameters can be configured via a single runtime `config.json` file. Read more about sane defaults and modifying this configuration for your own use cases in Configuration.
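As a rough sketch, such a configuration file could look like the fragment below. The exact field names here (`rpc_url`, `trail_head_blocks`, `allowed_ips`, and so on) are illustrative assumptions based on the capabilities listed above, not the documented schema; consult Configuration for the real keys and defaults.

```json
{
  "chain": {
    "enabled": true,
    "rpc_url": "http://localhost:8545",
    "trail_head_blocks": 1,
    "wallet": { "max_gas_limit": 4000000 }
  },
  "containers": [
    {
      "id": "llm-inference-workflow",
      "image": "your-org/llm-inference-workflow:latest",
      "allowed_ips": [],
      "env": { "MODEL_NAME": "example-model" }
    }
  ]
}
```

This mirrors the knobs above: on-chain parameters (gas limit, blocks to trail the chain head), per-container Docker images, access restrictions, and environment variables.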
System specifications
Infernet Node requirements depend greatly on the type of compute workflows you plan to run. Because all workflows run in Docker containers, we recommend at minimum a machine that supports virtualization. Memory-enhanced machines are preferred.
Minimum Requirements
| | Minimum | Recommended | GPU-heavy workloads |
|---|---|---|---|
| CPU | Single-core vCPU | 4 modern vCPU cores | 4 modern vCPU cores |
| RAM | 128MB | 16GB | 64GB |
| Disk | 512MB HDD | 500GB IOPS-optimized SSD | 500GB NVMe |
| GPU | - | - | CUDA-enabled GPU |
Off-chain events
If you choose to service off-chain Web2 requests via the REST API, you will have to expose port `4000` to the Internet.
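A minimal sketch of what posting an off-chain request to a node's exposed REST port might look like is shown below. The endpoint path (`/api/jobs`) and payload fields (`containers`, `data`) here are hypothetical placeholders, not the documented API; see the REST API reference for the actual request schema.

```python
import json
import urllib.request

# NOTE: the endpoint path and payload shape below are illustrative
# assumptions, not the documented Infernet REST API schema.
NODE_URL = "http://localhost:4000"  # node's exposed REST port

def build_job_request(container: str, data: dict) -> bytes:
    """Serialize a hypothetical job payload targeting one container."""
    payload = {"containers": [container], "data": data}
    return json.dumps(payload).encode("utf-8")

body = build_job_request("llm-inference-workflow", {"prompt": "Hello"})
request = urllib.request.Request(
    f"{NODE_URL}/api/jobs",  # hypothetical path
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(request)  # run only against a live node
```

Responses in this flow arrive over the same HTTP connection, which is what enables the instant round-trip described in the web-app use case above.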
On-chain events
If you plan to use your Infernet Node to listen for and respond to on-chain events via the Infernet SDK, you will also need access to a blockchain node RPC.
Next steps
You may choose to:
- Follow a Quickstart tutorial for setting up and using an Infernet Node end-to-end
- Head to Ritual Learn for step-by-step tutorials and videos using our starter examples
- Understand the Node's runtime configurations
- Explore your options to deploy an Infernet Node
- Check out the REST API and REST client reference
- Read in-depth about the node architecture and compatible containers
- Find out more about the Infernet SDK