Witness the power of trusted AI on chain

Broadcast trustworthy proofs on the Layer-1 blockchain for AI.

Layer-1 for AI

Nesa is the global blockchain network bringing AI on-chain. The Nesa platform lets applications and protocols seamlessly integrate with AI.

Explore
Testnet Online
01. Inference requests: 100/sec
02. AI models on-chain: 1,000+
03. Hosted AI parameters: trillions
04. Nodes on network: 100K+
Secure execution

The first fully private AI network

Your data is never revealed to other participants on the Nesa network.

Working with Nesa
01. Protocols
02. Networks

Breakthrough distributed AI

1. Secured by Trusted Execution Environment
2. Verified by Zero-Knowledge Proof
3. Executing Sharded Model Blocks
4. Hosting Model Hyper-Parameters
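The stack above can be pictured as a pipeline of sharded model blocks whose outputs are independently verifiable. The sketch below is a conceptual illustration only, assuming a toy model split into matrix-multiply blocks and using plain hash commitments as a stand-in for the TEE attestations and zero-knowledge proofs named above; it is not Nesa's actual protocol.

```python
# Conceptual sketch only: sharded execution with output commitments.
# The real network relies on TEE attestation and zero-knowledge proofs;
# the SHA-256 hash here is just a stand-in for a verifiable commitment.
import hashlib
import numpy as np

def run_shard(weights: np.ndarray, activations: np.ndarray) -> np.ndarray:
    """One 'model block': a single matrix multiply plus ReLU."""
    return np.maximum(weights @ activations, 0.0)

def commit(x: np.ndarray) -> str:
    """Commitment to a shard's output (stand-in for a proof/attestation)."""
    return hashlib.sha256(x.tobytes()).hexdigest()

rng = np.random.default_rng(0)
shards = [rng.standard_normal((8, 8)) for _ in range(4)]  # 4 sharded blocks
x = rng.standard_normal(8)

commitments = []
for w in shards:                 # each block would run on a different node
    x = run_shard(w, x)
    commitments.append(commit(x))

# A verifier that re-executes (or checks a proof for) any single block can
# compare its own commitment against the published one for that block.
print(commitments)
```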
More than 100K AI inference nodes worldwide
Run a node

Your home laptop is all you need

The lowest ever minimum hardware requirements to run an AI node

01. Every other AI platform: A100 GPU, 48 GB VRAM, ~$10,000 minimum cost
02. Nesa: home laptop, 2 GB RAM, $0 minimum cost
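To see what the 2 GB figure means in practice, a minimal check like the one below can report whether a given machine clears it. This is purely illustrative and assumes the third-party psutil package; it is not an official Nesa requirements checker, and the threshold is simply the figure quoted in the comparison above.

```python
# Illustrative only: checks whether this machine clears the 2 GB RAM figure
# quoted above. Not an official Nesa requirements checker.
import psutil  # third-party package: pip install psutil

MIN_RAM_GB = 2  # figure quoted in the comparison above

total_gb = psutil.virtual_memory().total / 1024**3
status = "meets" if total_gb >= MIN_RAM_GB else "falls below"
print(f"Total RAM: {total_gb:.1f} GB, which {status} the {MIN_RAM_GB} GB figure")
```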

Trending AI models right now on Nesa

3,000+ AI models in Nesa's global AI model repository.

Text Classification

distilbert/distilbert-base-uncased-finetuned-sst-2-english

DistilBERT model fine-tuned on the SST-2 dataset for sentiment analysis in English.

< 5sec avg
490
0.001 NES
Text Classification

cardiffnlp/twitter-roberta-base-sentiment-latest

cardiffnlp/twitter-roberta-base-sentiment-latest is a RoBERTa-based model fine-tuned for sentiment analysis on Twitter data, providing accurate predictions for sentiment classification tasks.

< 5sec avg
5100
0.001 NES
Content Summarization

facebook/bart-large-cnn

facebook/bart-large-cnn is a large-scale model for text summarization tasks, based on the BART architecture. It is pre-trained on a massive amount of data and fine-tuned for summarization tasks, achieving state-of-the-art results.

< 5sec avg
490
0.001 NES
Image Generation

stabilityai/sdxl-turbo

SDXL-Turbo is a fast text-to-image model from Stability AI, distilled from SDXL 1.0 so it can generate images in as few as one diffusion step.

< 5sec avg
490
0.001 NES
Language Translation

Helsinki-NLP/opus-mt-zh-en

The Helsinki-NLP/opus-mt-zh-en model is a machine translation model that translates text from Chinese to English. It is part of the OPUS-MT project which focuses on multilingual translation.

< 5sec avg
490
0.001 NES
Text Classification

ProsusAI/finbert

FinBERT is a pre-trained BERT model fine-tuned for financial sentiment analysis, capable of understanding the nuances of financial language and sentiment in text data.

< 5sec avg
490
0.001 NES
Image Generation

yodayo-ai/kivotos-xl-2.0

Kivotos XL 2.0, also built on Animagine XL V3, is designed for creating high-quality anime-style art, focusing on vibrant and detailed visuals.

< 5sec avg
490
0.001 NES
Text Generation

GPT-4o

GPT-4o is optimized for efficiency, balancing performance with resource usage, and is suitable for deployment in environments where computational efficiency is crucial without sacrificing text generation quality.

< 5sec avg
490
0.001 NES
Text Generation

meta-llama/Meta-Llama-3-70B

Meta developed and released the Meta Llama 3 family of large language models (LLMs), a collection of pretrained and instruction-tuned generative text models in 8B and 70B sizes.

< 5sec avg
490
0.001 NES
Text Classification

SamLowe/roberta-base-go_emotions

SamLowe/roberta-base-go_emotions is a RoBERTa-based model fine-tuned for emotion classification, specifically designed to predict emotions in text data. It has been trained on the GoEmotions dataset, which contains a wide range of emotions for more accurate predictions.

< 5sec avg
490
0.001 NES
Text Generation

claude-3-opus-20240229

Claude 3 Opus is Anthropic's most capable Claude 3 model, built for complex reasoning and high-quality text generation over long contexts.

< 5sec avg
490
0.001 NES
Text Generation

mistralai/Mixtral-8x7B-Instruct-v0.1

The Mistralai/Mixtral-8x7B-Instruct-V0.1 model is a state-of-the-art AI language model by Mistral AI, combining eight expert models for superior accuracy and efficiency in instruction following and problem-solving.

< 5sec avg
490
0.001 NES
Token Classification

dslim/bert-base-NER

The dslim/bert-base-NER model is a BERT-based model fine-tuned for Named Entity Recognition (NER) tasks, capable of identifying and classifying entities in text data. It utilizes the powerful BERT architecture to achieve high accuracy and performance in NER applications.

< 5sec avg
490
0.001 NES
Object Detection

facebook/detr-resnet-50

The facebook/detr-resnet-50 model is a state-of-the-art object detection model that utilizes a transformer architecture to directly predict bounding boxes and class labels in a single end-to-end network.

< 5sec avg
460
0.001 NES
Question Answering

deepset/roberta-base-squad2

The deepset/roberta-base-squad2 model is a RoBERTa-based model fine-tuned on the SQuAD 2.0 dataset for question answering tasks. It excels in providing accurate answers to questions based on given contexts and has been optimized for improved performance on the SQuAD 2.0 benchmark.

< 5sec avg
460
0.001 NES
Image Segmentation

mattmdjaga/segformer_b2_clothes

The mattmdjaga/segformer_b2_clothes model is a Segformer model fine-tuned for clothes segmentation tasks, providing accurate and detailed segmentation of various types of clothing items in images.

< 5sec avg
460
0.001 NES
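Most of the entries above are standard Hugging Face model ids. To get a feel for what a couple of the trending models do, the sketch below runs two of them directly with the transformers library. This is plain off-chain inference for illustration only; it does not go through Nesa's network or spend NES.

```python
# Runs two of the models listed above directly with Hugging Face transformers.
# Off-chain illustration only; not Nesa on-chain inference.
from transformers import pipeline

# Sentiment analysis with the DistilBERT SST-2 model from the first card.
sentiment = pipeline(
    "sentiment-analysis",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)
print(sentiment("Nesa's testnet is online."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Summarization with facebook/bart-large-cnn from the third card.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
article = (
    "Nesa is a Layer-1 blockchain network that brings AI on-chain, letting "
    "applications and protocols integrate with thousands of hosted models "
    "for tasks such as classification, summarization, and translation."
)
print(summarizer(article, max_length=40, min_length=10, do_sample=False))
```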

The leadership team at Nesa has over 200 combined research publications in AI and deep learning.

100+

Supported by decades of research

Our team has worked in AI for the past decade in areas like generative AI, large language models, computer vision, neural network design, and data science.

Our Research
Alan Turing
NeurIPS
NeurIPS
CVPR
Google
Meta
Google
ICML
IEEE
Area Chair
ICLR
Norton
Github
WiSE Merit
ACM CCS
Embassy
Baidu
Nature Science
MIT EECS
AAAI
MIT-Takeda
World AI WAICO

Introducing AI's Native Asset


NES is an essential part of how developers build on Nesa.

NES is used to secure the network for participation in consensus and aggregation, to compensate model developers and miners operating nodes for AI orchestration, and to pay for queryspace when running inference on the Nesa blockchain.
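As a rough illustration of the last point, the sketch below budgets query costs from the 0.001 NES per-inference price shown on the model cards above. The helper is hypothetical and not part of any Nesa SDK; actual pricing may differ by model and query size.

```python
# Back-of-the-envelope budgeting using the 0.001 NES per-inference price shown
# on the model cards above. Purely illustrative; not part of any Nesa SDK.
PRICE_PER_INFERENCE_NES = 0.001  # from the model cards above

def query_budget(num_inferences: int, price: float = PRICE_PER_INFERENCE_NES) -> float:
    """Total NES needed to pay for a batch of inference queries."""
    return num_inferences * price

# e.g. an application issuing 10,000 sentiment queries per day:
print(f"{query_budget(10_000)} NES per day")  # 10.0 NES per day
```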

Token Node