Decentralized AI overview
Decentralized AI, or DeAI for short, refers to the intersection of AI and blockchain technology. The term covers a wide range of applications, from peripheral elements such as tokenization and decentralized marketplaces all the way to running AI models fully on-chain as smart contracts.
Why use DeAI?
DeAI in its truest form can potentially solve AI's trust problem. Today, users have to blindly trust AI running on centralized servers. They have no visibility into how their data is used, how AI models produce responses, or whether the models work correctly, reliably, and consistently.
Since AI models behave like black boxes to users, building trustworthy AI is a difficult challenge. This can be addressed if users are able to verify how a model has been trained and that the inference process uses that very model to generate an output.
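The verification idea above can be sketched in a few lines of self-contained Rust. This is a toy illustration, not ICP code: a published fingerprint of the audited model is compared against the fingerprint of the model actually used for inference. A real system would commit a cryptographic hash (e.g. SHA-256) of the model bytes on-chain; the `Registry` type and `DefaultHasher` stand-in here are hypothetical.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Stand-in fingerprint. A real deployment would use a cryptographic
/// hash such as SHA-256 over the model bytes, not std's DefaultHasher.
fn fingerprint(model_bytes: &[u8]) -> u64 {
    let mut h = DefaultHasher::new();
    model_bytes.hash(&mut h);
    h.finish()
}

/// Hypothetical registry holding the published fingerprint of the
/// model whose training process was audited.
struct Registry {
    published: u64,
}

impl Registry {
    /// Check that the model about to serve inference is exactly the
    /// model that was audited, byte for byte.
    fn verify(&self, model_bytes: &[u8]) -> bool {
        fingerprint(model_bytes) == self.published
    }
}

fn main() {
    let audited_model = b"model weights v1".to_vec();
    let registry = Registry { published: fingerprint(&audited_model) };

    assert!(registry.verify(&audited_model));       // same model: passes
    assert!(!registry.verify(b"tampered weights")); // different model: fails
    println!("verification ok");
}
```

The point of the sketch is only the shape of the guarantee: if the fingerprint is committed in a place users can inspect, anyone can check that inference ran against the audited model rather than a silently swapped replacement.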
Trustworthy DeAI is possible on ICP using canister smart contracts.
DeAI on ICP
Running AI models on-chain is too compute- and memory-intensive for traditional blockchains. ICP was designed to make smart contracts powerful by leveraging the following features:
- The WebAssembly virtual machine that provides near-native performance.
- Deterministic time slicing that automatically splits long-running computation over multiple blocks.
- Powerful node hardware with a standardized specification. Nodes have 32-core CPUs, 512 GiB of RAM, and 30 TB of NVMe storage.
Currently, ICP supports on-chain inference of small models using AI libraries such as Sonos Tract that compile to WebAssembly. Check out the image classification example to learn how it works. The long-term vision of DeAI on ICP is to support on-chain GPU compute to enable both training and inference of larger models.
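As a toy illustration of the on-chain inference pattern (this is not real tract or ic-cdk code), the following self-contained Rust sketch mimics a query method that holds model weights in canister memory and runs deterministic inference in WebAssembly. A hypothetical linear classifier stands in for a tract-compiled ONNX model; in a real canister, `classify` would be annotated as a query method and would invoke the inference engine.

```rust
// Toy "model": weights for a 2-class linear classifier, stored in
// static memory the way a real canister keeps model bytes in its state.
const WEIGHTS: [[f32; 3]; 2] = [
    [0.9, -0.2, 0.1],  // class 0
    [-0.4, 0.8, 0.3],  // class 1
];

/// Toy inference: compute logits = W * x and return the index of the
/// top-scoring class. In a real canister this would call into an
/// engine such as tract instead of this hand-rolled dot product.
fn classify(input: &[f32; 3]) -> usize {
    let logits: Vec<f32> = WEIGHTS
        .iter()
        .map(|row| row.iter().zip(input).map(|(w, x)| w * x).sum())
        .collect();
    logits
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .map(|(i, _)| i)
        .unwrap()
}

fn main() {
    println!("{}", classify(&[1.0, 0.0, 0.0])); // prints 0
    println!("{}", classify(&[0.0, 1.0, 0.0])); // prints 1
}
```

Because the computation is deterministic and runs inside the WebAssembly virtual machine, every replica produces the same classification, which is what makes the result verifiable by consensus.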
Technical working group: DeAI
A technical working group dedicated to discussing decentralized AI and related projects meets bi-weekly on Thursdays at 5pm UTC. You can join via the community Discord server.
You can learn more about the group, review the notes from previous meetings, and ask questions on the DFINITY forum.
ICP AI projects
Several community projects that showcase how to use AI on ICP are available:
Rust
- Sonos Tract: An open-source AI inference engine that supports ONNX, TensorFlow, and PyTorch models and compiles to WebAssembly. The image classification example shows how to integrate it into a canister running on ICP.
- Tract-IC-AI: A fork of Sonos Tract that shows an alternative approach to running the inference engine on ICP.
- Rust-Connect-Py-AI-to-IC: An open-source tool for deploying Python AI models.
- ic-mnist: A machine learning 'Hello, world!' running on ICP. Try it here.
- ELNA AI: A fully on-chain AI agent platform and marketplace that supports both on-chain and off-chain LLMs. Try it here.
- Juno + OpenAI: An example that uses Juno and OpenAI to generate images from prompts.
- ArcMind AI: An autonomous agent that uses chain-of-thought reasoning to plan and act. Try the app in-browser.
- ArcMind Vector DB: A vector database that supports text, image, and audio embeddings.
- Vectune: A lightweight vector database with incremental indexing, based on FreshVamana.
- ICPanda AI: An AI chatbot canister built on Hugging Face's Candle framework, running the Qwen 0.5B model.
Motoko
- MotokoLearn: A Motoko package that enables on-chain machine learning.
- DeVinci: An AI chatbot that uses an open-source LLM. Check out the canister yourself.
C++
TypeScript / JavaScript
- Tensorflow on ICP: An Azle example that uses a pre-trained model to make predictions.
- ICGPT: A React frontend that uses a C/C++ backend running an LLM fully on-chain. Check it out yourself.
AI resources
Machine learning programming:
Distributed AI compute:
Lightweight AI models: