
Data Anchoring Token (DAT) and ERC-8028: An AI-Native Asset Standard for Ethereum’s dAI Era
In this article, we trace how “AI tokens” evolved from narrative to infrastructure and where DAT fits into Ethereum’s emerging dAI stack. We:
1. Expose the façade of “decentralized AI”
2. Discuss Ethereum’s dAI mission
3. Introduce DAT and ERC-8028 as an AI-native asset standard on Ethereum
4. Explain what makes DAT different from existing AI tokens
5. Detail what the DAT EIP (ERC-8028) unlocks for builders
6. Show DAT in practice through Lazbubu, CreateAI and LazAI’s Alpha Mainnet
1. The façade of “decentralized AI”
Since late 2022, AI tokens have been one of crypto’s strongest narratives: they have posted outsized returns, and almost every cycle brings a new wave of “decentralized AI” projects.
But most so-called AI tokens are not AI-native at all. The first generation of projects can be roughly grouped into three categories:
- Compute networks like Render or Akash Network, where the token pays for GPU cycles.
- Agent or intelligence networks such as Bittensor or Fetch.ai, where tokens reward models or autonomous agents.
- Data marketplaces like Ocean Protocol that tokenize access to datasets rather than the AI outputs that consume them.
These systems are important experiments: they show that tokens can coordinate compute, data and agents at scale, and on paper they promise “decentralized AI.” In practice, though, they share several structural problems:
- AI workloads remain almost entirely off-chain, with blockchains acting as payment and registry layers.
- The token does not represent the AI asset itself.
- Revenue routing, usage accounting, and provenance are often added ad hoc, not encoded as a reusable standard.
These limitations explain why recent academic reviews describe much of the current AI-token landscape as an “illusion of decentralised AI”: the architecture and economics often remain close to centralised AI services, with forced token layers added on top.
2. Ethereum’s dAI turn: agents need assets
Against this backdrop, Ethereum has started to articulate a more concrete role in the AI economy. In recent posts and talks, Vitalik Buterin has outlined a cautious but clear direction for “crypto + AI”:
“Blockchains are not there to let AI govern protocols. Instead, they should provide verification, provenance and credible neutrality for AI-driven systems: an open layer where agents settle, prove and share value under transparent rules.”
The Ethereum Foundation’s new dAI initiatives, led by @DavideCrapis, build on the same intuition: Ethereum should act as a settlement and coordination layer for AI agents and machine economies, explicitly positioning itself as infrastructure for the AI economy. As stated in this X post, the mission is:
“Make Ethereum the preferred settlement and coordination layer for AIs.”
If you take that view seriously, two requirements fall out immediately:
- Standards for agents and payments (e.g. ERC-8004 / x402-style schemes for agent-to-agent settlement).
- Standards for AI assets themselves, the data, models, inferences and histories those agents consume and generate.
If Ethereum is to become the settlement and coordination layer for AI agents, it needs a way to represent AI-native assets that is as universal as ERC-20, but expressive enough for AI-specific economics.
DAT and ERC-8028 address this gap: they propose a canonical way for AI assets to live on Ethereum as programmable, verifiable claims, not just as metadata behind a ticker symbol.
3. DAT and ERC-8028: positioning AI as an on-chain asset
DAT (Data Anchoring Token) is a semi-fungible token (SFT) standard “designed for AI-native assets” developed by LazAI Network. Each DAT is defined as a dynamic bundle of three elements: an ownership certificate, usage rights, and a share in revenue associated with a particular AI asset.
LazAI has proposed DAT as an official Ethereum standard via EIP-8028. The goal is to move DAT from a protocol-specific mechanism to a reusable, audited and ecosystem-wide standard for representing AI assets on Ethereum.
The full EIP-8028 proposal is posted for discussion on the Fellowship of Ethereum Magicians forum.
The proposal extends familiar ERC patterns with a small set of additional semantics specific to AI (a rough interface sketch follows the list):
- a notion of Class, representing an abstract AI asset such as a dataset, model, agent profile or inference pool, together with metadata and integrity references;
- a notion of quota attached to each token, quantifying how much of the underlying asset can be consumed;
- a notion of share, specifying how revenue arising from that asset should be apportioned among token holders;
- standard events and methods for recording usage and settling revenue for a Class in ETH or ERC-20.
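As an illustration of that surface area, here is what the interface could look like when expressed as an ethers.js human-readable ABI fragment. Every name below (createClass, quotaOf, recordUsage, settleRevenue and so on) is an assumption made for readability, not the final ERC-8028 signature set:

```typescript
// Illustrative sketch only: signatures are inferred from the Class / quota / share
// semantics described above and are NOT the final ERC-8028 ABI.
const datAbi = [
  // Class lifecycle: an abstract AI asset (dataset, model, agent profile, inference pool)
  "function createClass(string metadataURI, bytes32 integrityHash) returns (uint256 classId)",
  "function classMetadata(uint256 classId) view returns (string metadataURI, bytes32 integrityHash)",

  // Token positions: each position carries a usage quota and a revenue share
  "function mint(address to, uint256 classId, uint256 quota, uint256 shareRatio) returns (uint256 tokenId)",
  "function quotaOf(uint256 tokenId) view returns (uint256 remaining)",
  "function shareOf(uint256 tokenId) view returns (uint256 shareRatio)",

  // Usage accounting and revenue settlement per Class, in ETH or an ERC-20
  "function recordUsage(uint256 tokenId, uint256 amount)",
  "function settleRevenue(uint256 classId, address currency, uint256 amount) payable",
  "function claim(uint256 tokenId) returns (uint256 paid)",

  // Standard events so wallets and indexers can follow usage and value flows
  "event UsageRecorded(uint256 indexed tokenId, uint256 amount)",
  "event RevenueSettled(uint256 indexed classId, address currency, uint256 amount)",
];

export default datAbi;
```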
That is exactly the kind of concreteness Ethereum’s dAI roadmap calls for: a verifiable, composable standard that treats AI artifacts as first-class on-chain assets.
4. Why DAT is different from existing AI tokens
4.1. Asset-centric design.
Most existing AI tokens are organised around the protocol rather than the asset: the token is layered on top of the network and rarely carries a direct claim on any specific dataset, model or agent.
In DAT (ERC-8028), the organising unit is the AI asset itself. Classes make datasets, models and agents individually referenceable. Tokens express a concrete stake in one of those ‘Classes’. This aligns more closely with how AI systems are actually built.
4.2. Usage as a first-class economic variable.
By design, a DAT position is associated with a quantity of permitted usage. The exact unit is left to the Class policy (for example, tokens, calls, steps or a composite metric), but the existence of a quota at the standard level allows usage to be accounted and priced consistently.
This is a precondition for building sustainable economics where contributors are paid in proportion to actual workload rather than aggregate protocol metrics.
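To make the quota idea concrete, here is a minimal serving-side sketch: check the remaining quota attached to a DAT position before doing work, then record what was actually consumed. It assumes the hypothetical quotaOf / recordUsage functions from the sketch above and standard ethers.js plumbing:

```typescript
import { Contract, JsonRpcProvider, Wallet } from "ethers";

// quotaOf / recordUsage are assumed names mirroring the standard's quota semantics;
// the real ERC-8028 interface may expose them differently.
const abi = [
  "function quotaOf(uint256 tokenId) view returns (uint256)",
  "function recordUsage(uint256 tokenId, uint256 amount)",
];

const provider = new JsonRpcProvider(process.env.RPC_URL);
const operator = new Wallet(process.env.OPERATOR_KEY!, provider);
const dat = new Contract(process.env.DAT_ADDRESS!, abi, operator);

// Gate an inference request on remaining quota, then meter the actual usage on-chain.
async function serveInference(tokenId: bigint, estimatedUnits: bigint): Promise<void> {
  const remaining: bigint = await dat.quotaOf(tokenId);
  if (remaining < estimatedUnits) throw new Error("quota exhausted for this DAT position");

  const usedUnits = await runModel(); // off-chain inference; returns units consumed
  await (await dat.recordUsage(tokenId, usedUnits)).wait();
}

// Placeholder for the real off-chain workload (tokens, calls, steps, or a composite metric).
async function runModel(): Promise<bigint> {
  return 42n;
}
```

The unit of the recorded amount is whatever the Class policy defines, which is exactly why quota at the standard level matters: the metering code stays the same across datasets, models and agents.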
4.3. Standardised revenue routing at the asset layer.
The revenue-sharing model in ERC-8028 is part of the standard interface. This makes it possible to express, in a common format, how value from an AI asset should flow back to data contributors, model builders, fine-tuners, evaluators and infra providers. For on-chain analytics and risk assessment, this is crucial: revenue flows become inspectable and composable instead of buried in custom contracts.
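For intuition, here is a toy calculation of how revenue settled for one Class might be split pro rata by share. The basis-point convention and the three roles are illustrative; the standard only requires that the apportioning rule be expressed in a common, inspectable format:

```typescript
// Toy pro-rata split; shares expressed in basis points (10_000 = 100%).
type Position = { holder: string; shareBps: bigint };

function splitRevenue(totalWei: bigint, positions: Position[]): Map<string, bigint> {
  const payouts = new Map<string, bigint>();
  for (const { holder, shareBps } of positions) {
    payouts.set(holder, (totalWei * shareBps) / 10_000n);
  }
  return payouts;
}

// 1 ETH of Class revenue split between a data contributor, a fine-tuner and an infra provider.
const payouts = splitRevenue(10n ** 18n, [
  { holder: "dataContributor", shareBps: 5_000n },
  { holder: "fineTuner", shareBps: 3_000n },
  { holder: "infraProvider", shareBps: 2_000n },
]);
console.log(payouts); // 0.5 ETH, 0.3 ETH and 0.2 ETH respectively
```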
Taken together, these properties mean DAT is not a governance or payment token in the conventional sense. It behaves more like a tokenized claim on verifiable AI activity tied to a specific asset, with standardized semantics for how that activity consumes quota and distributes value.
5. Implications of the DAT EIP (ERC-8028) for builders
The decision to advance DAT as an Ethereum standard, rather than as a protocol-specific mechanism, has concrete consequences for builders.
As a standard, ERC-8028 does three things:
- It specifies the interfaces for Class creation, token minting, quota accounting, usage recording, revenue settlement and claims.
- It defines how AI-specific semantics like quota and shareRatio are represented on-chain.
- It sets expectations for metadata, integrity and policy references so wallets, explorers, indexers and analytics tools can understand and visualize AI assets without custom integrations.
5.1 Data and model providers
For data and model providers, ERC-8028 offers a canonical way to publish AI artifacts as onchain assets with:
- verifiable metadata and integrity references,
- explicit policies for usage and licensing,
- and a standard interface for sharing revenue among multiple contributors.
Instead of re-implementing licensing or royalty logic per project, providers can rely on a shared interface that downstream protocols understand.
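A minimal sketch of what publishing could look like for a dataset provider, assuming a hypothetical createClass(metadataURI, integrityHash) entry point (the real method name and argument layout may differ in the final spec):

```typescript
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";
import { Contract, JsonRpcProvider, Wallet } from "ethers";

// Hypothetical Class-creation call; not the confirmed ERC-8028 signature.
const abi = ["function createClass(string metadataURI, bytes32 integrityHash) returns (uint256)"];

async function publishDataset(path: string, metadataURI: string): Promise<void> {
  // Integrity reference: a content hash of the artifact, committed on-chain with the Class.
  const integrityHash = "0x" + createHash("sha256").update(readFileSync(path)).digest("hex");

  const provider = new JsonRpcProvider(process.env.RPC_URL);
  const signer = new Wallet(process.env.PROVIDER_KEY!, provider);
  const dat = new Contract(process.env.DAT_ADDRESS!, abi, signer);

  const tx = await dat.createClass(metadataURI, integrityHash);
  await tx.wait(); // the new Class id would typically be read back from the emitted event
}

// metadataURI would point at licensing terms, usage policy and revenue-split configuration.
publishDataset("./training-set.parquet", "ipfs://<metadata-cid>").catch(console.error);
```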
5.2. Agent and application developers
DAT provides a uniform abstraction for the external assets agents depend on. An agent that consumes multiple datasets and models from different ecosystems can hold DAT positions in each relevant Class and have usage and economics handled through a single, coherent interface rather than a patchwork of integrations.
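A sketch of what that uniform abstraction could look like from the agent’s side: one read-only contract handle, many Classes, again using the assumed quotaOf helper rather than the final interface:

```typescript
import { Contract, JsonRpcProvider } from "ethers";

// quotaOf is an assumed name mirroring the standard's quota semantics.
const abi = ["function quotaOf(uint256 tokenId) view returns (uint256)"];
const provider = new JsonRpcProvider(process.env.RPC_URL);
const dat = new Contract(process.env.DAT_ADDRESS!, abi, provider);

// The agent's DAT positions across different Classes: datasets, models, inference pools.
const positions: Record<string, bigint> = {
  newsDataset: 11n,
  summarizerModel: 42n,
  embeddingsPool: 77n,
};

// One loop replaces per-provider integrations: how much of each asset can the agent still consume?
async function remainingBudget(): Promise<Record<string, bigint>> {
  const entries = await Promise.all(
    Object.entries(positions).map(
      async ([name, tokenId]) => [name, await dat.quotaOf(tokenId)] as [string, bigint],
    ),
  );
  return Object.fromEntries(entries);
}
```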
5.3. Infrastructure, DeFi and analytics projects
The EIP formalises a new class of on-chain object to index, collateralise or hedge. Because DATs expose usage and value flows, they can underpin new instruments: revenue-backed notes on specific models, baskets of dataset exposure, or structured products reflecting future AI workload on particular Classes.
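A sketch of the analytics side: aggregate settled revenue per Class from standardized events, for example to price a revenue-backed note. The event names follow the earlier sketch and are assumptions, not the final ERC-8028 event set:

```typescript
import { Contract, JsonRpcProvider } from "ethers";

// Assumed event names, mirroring the usage / settlement semantics described above.
const abi = [
  "event UsageRecorded(uint256 indexed tokenId, uint256 amount)",
  "event RevenueSettled(uint256 indexed classId, address currency, uint256 amount)",
];

const provider = new JsonRpcProvider(process.env.RPC_URL);
const dat = new Contract(process.env.DAT_ADDRESS!, abi, provider);

// Sum settled revenue per Class over a block range.
async function revenueByClass(fromBlock: number, toBlock: number): Promise<Map<string, bigint>> {
  const logs = await dat.queryFilter(dat.filters.RevenueSettled(), fromBlock, toBlock);
  const totals = new Map<string, bigint>();
  for (const log of logs) {
    const { classId, amount } = (log as any).args; // decoded event arguments
    const key = classId.toString();
    totals.set(key, (totals.get(key) ?? 0n) + amount);
  }
  return totals;
}
```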
5.4. Broader Ethereum ecosystem
At the ecosystem scale, ERC-8028 helps make “AI on-chain” precise.
It does not attempt to move heavy training or inference onto the base layer. Instead, it standardizes how the economic and provenance layer of AI (which asset is being used, under which rules, and who benefits) should be represented in an EVM-native way, composable across:
- rollups,
- sidechains, and
- specialized off-chain compute networks.
That is strongly aligned with the Ethereum Foundation’s vision of decentralized AI systems with verifiable security guarantees, and with the dAI Team’s goal of building a decentralized AI stack on top of Ethereum’s consensus and cryptography.
6. DAT in practice: Lazbubu, CreateAI and Alpha Mainnet
Standards matter only if they are exercised in real systems.
DAT is being trialled today on LazAI, a Web3-native AI infrastructure protocol that focuses on verifiable data, agent economies, and programmable AI revenue flows.
A few concrete touchpoints:
- Lazbubu, a data-anchored companion agent whose behavior is shaped by a user’s ongoing interaction history. LazAI uses DAT to anchor those interactions so that a user’s chats, quests and choices become part of a structured asset. To date, more than 14,307 Lazbubu DATs have been minted.
- CreateAI, an AI agent market that leverages the GMPayer cross-chain payment hub (powered by the x402 protocol). Each agent is treated as an economic object with its own DAT Class, making downstream monetization transparent and programmable.
- SoulTarot, a tarot-reading AI agent, extends this to a more narrative, emotion-driven surface. Each reading is settled on-chain, with GMPayer wiring cross-chain payments.
- Alpha Mainnet: turning AI interaction into on-chain value.
On LazAI Alpha Mainnet (coming soon), interactions with AI agents like Lazbubu and CreateAI will be anchored as DATs, with METIS-settled gas and PoS + QBFT consensus beneath.
These early deployments are less about perfect economics and more about validating the mental model: AI can live as assets whose rights, usage and rewards are machine-readable.
Closing
The first wave of AI tokens proved that markets are hungry for “AI + crypto.” They also showed that wrapping a token around off-chain infrastructure is not enough. The real leverage comes when AI itself - data, models, agents, histories - becomes an on-chain asset with clear rights, usage and value flows.
DAT, formalized as ERC-8028, is an attempt to make that asset layer explicit.
It does not compete with compute networks, AI L1s or model marketplaces. It gives them a shared grammar for representing what they sell, how it is used, and how everyone in the supply chain is paid.
If decentralized AI is going to mature beyond hype and price charts, it needs standards at this level of concreteness. DAT is one of the first serious attempts to define such a standard for AI-native assets, and that is why it matters.
