Technology
June 18, 2025
Step-by-Step Guide to Anchoring and Monetizing Your AI Data with LazAI

Ever wondered why AI models make billions while the people who own the real data behind them get… nothing?

You have data. Maybe it’s research, financial signals, a co-created AI model, or insights no one else has. But handing it over to big platforms means losing control, credit, and any hope of profit.

LazAI gives developers and data contributors a way to share privacy-sensitive data with AI systems while keeping full control over how that data is used. In return, contributors receive Data Anchoring Tokens, known as DATs.

A DAT is an on-chain asset you own, control, and monetize. By anchoring your data, you get paid every time your data, model, or compute resource is used.

This guide will walk you through encrypting, uploading, registering, and verifying your data. But before diving into the how, let’s look at why it matters.

Why Turn Your Data Into an Asset in the New AI Economy

Imagine you are a medical researcher. You hold a rare disease dataset. Until now, there was no way to share it securely, prove its use, or earn recognition. With DATs, you anchor your data, set terms, track usage, and earn income without giving up privacy.

Why contribute your data:

  • Earn based on utility, not volume
  • Retain full ownership with licensing and audit trails
  • Keep data private and secure
  • Support open, decentralized AI

Unlike other platforms that reward clicks or participation, LazAI values the data itself. By contributing, you help build AI that is owned by users, not platforms. LazAI is the only system where rewards are tied to verifiable inference, not speculation or popularity, making DATs a true measure of data utility.

What You Need

  • A Web3 wallet such as MetaMask.
  • An IPFS account with a valid Pinata JWT token.
  • The Alith SDK for Rust, or the corresponding Python or Node.js SDK.
  • Your own privacy-sensitive data to contribute.
  • Basic command line and programming knowledge.

You will need a development environment set up for Rust, Python, or Node.js. You can find full implementation details for each here: https://alith.lazai.network/docs/lazai/data-provider

Follow these steps to go from “I have data” to “I have revenue, control, and recognition.”

LazAI Data Contribution Guide [Node.js Implementation]


Step 1: Project Setup

Create a new project directory and initialize it.

Shell
mkdir LazAI-contribution
cd LazAI-contribution
npm init -y

Step 2: Install Alith and Dependencies

Install the required packages for Node.js.

Shell
npm install alith
npm install --save-dev @types/node-rsa

Step 3: Set Your Environment Variables

Before proceeding, make sure your environment includes your wallet and IPFS credentials.

Shell
export PRIVATE_KEY=your_wallet_private_key
export IPFS_JWT=your_pinata_jwt
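Since the later steps all depend on these credentials, it can help to check for them up front. The sketch below is a generic helper, not part of the Alith SDK; the variable names match the exports above.

```javascript
// Return the names of any required environment variables that are unset.
// Generic helper; not part of the Alith SDK.
function missingEnv(names, env = process.env) {
  return names.filter((name) => !env[name]);
}

// Check before running the later steps, so failures surface early.
const missing = missingEnv(["PRIVATE_KEY", "IPFS_JWT"]);
if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
}
```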

Step 4: Encrypt Your Privacy Data

Use your Web3 wallet to sign a message. The result will serve as your encryption password.

Javascript
const encryptionSeed = "Sign to retrieve your encryption key";
const password = client.getWallet().sign(encryptionSeed).signature;
// Encode the string as UTF-8 bytes before encrypting.
const encryptedData = await encrypt(new TextEncoder().encode(privacyData), password);

Step 5: Upload the Encrypted Data to IPFS

Upload the encrypted data to IPFS and retrieve the shareable link.

Javascript
const fileMeta = await ipfs.upload({
  name: dataFileName,
  data: Buffer.from(encryptedData),
  token: token,
});
const url = await ipfs.getShareLink({ token: token, id: fileMeta.id });
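IPFS pinning can fail transiently, so a small retry wrapper around the upload is a common pattern. The helper below is a generic sketch, not part of the Alith SDK:

```javascript
// Retry an async operation with exponential backoff.
// Generic helper; not part of the Alith SDK.
async function withRetry(fn, attempts = 3, baseDelayMs = 500) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Wait 500ms, 1000ms, 2000ms, ... between attempts.
      if (i < attempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}

// Usage with the upload from this step:
// const fileMeta = await withRetry(() =>
//   ipfs.upload({ name: dataFileName, data: Buffer.from(encryptedData), token })
// );
```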

Step 6: Register the Data on Chain

Submit the IPFS URL of your encrypted file to the LazAI smart contract.

Javascript
let fileId = await client.getFileIdByUrl(url);
if (fileId == BigInt(0)) {
  fileId = await client.addFile(url);
}

Step 7: Request Proof from a Verified Node

Request that a verified compute node processes your data securely using a trusted execution environment.

Javascript
await client.requestProof(fileId, BigInt(100));
const jobIds = await client.fileJobIds(fileId);
const jobId = jobIds[jobIds.length - 1];
const job = await client.getJob(jobId);
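Depending on chain latency, the job list may not be populated the moment the proof request confirms. A small polling helper is one way to wait for it; this is a generic sketch, not part of the Alith SDK:

```javascript
// Poll an async source until it yields a ready result or times out.
// Generic helper; not part of the Alith SDK.
async function pollUntil(fetch, isReady, { tries = 10, intervalMs = 1000 } = {}) {
  for (let i = 0; i < tries; i++) {
    const value = await fetch();
    if (isReady(value)) return value;
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error("Timed out waiting for result");
}

// Usage: wait until the contract lists at least one job for the file.
// const jobIds = await pollUntil(
//   () => client.fileJobIds(fileId),
//   (ids) => ids.length > 0
// );
```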

Encrypt the password using the compute node's public key.

Javascript
const nodeInfo = await client.getNode(job.nodeAddress);
const rsa = new NodeRSA(nodeInfo.publicKey, "pkcs1-public-pem");
const encryptedKey = rsa.encrypt(password, "hex");

Send the proof request.

Javascript
const response = await axios.post(
  `${nodeInfo.url}/proof`,
  {
    job_id: Number(jobId),
    file_id: Number(fileId),
    file_url: url,
    encryption_key: encryptedKey,
    encryption_seed: encryptionSeed,
    nonce: null,
    proof_url: null,
  },
  {
    headers: { "Content-Type": "application/json" },
  },
);

Step 8: Claim Your DAT Reward

Once the node validates your contribution, you can request your DAT reward.

Javascript
await client.requestReward(fileId);
console.log("Reward requested for file id", fileId);

Complete Implementation
Create index.ts

Javascript
import { Client } from "alith/lazai";
import { PinataIPFS } from "alith/data/storage";
import { encrypt } from "alith/data/crypto";
import NodeRSA from "node-rsa";
import axios, { AxiosResponse } from "axios";
async function main() {
  const client = new Client();
  const ipfs = new PinataIPFS();

  // Step 4: Encrypt your privacy data
  const dataFileName = "your_encrypted_data.txt";
  const privacyData = "Your Privacy Data";
  const encryptionSeed = "Sign to retrieve your encryption key";
  const password = client.getWallet().sign(encryptionSeed).signature;
  // Encode the string as UTF-8 bytes before encrypting.
  const encryptedData = await encrypt(new TextEncoder().encode(privacyData), password);

  // Step 5: Upload the encrypted data to IPFS
  const token = process.env.IPFS_JWT || "";
  const fileMeta = await ipfs.upload({
    name: dataFileName,
    data: Buffer.from(encryptedData),
    token: token,
  });
  const url = await ipfs.getShareLink({ token: token, id: fileMeta.id });

  // Step 6: Register the data on chain
  let fileId = await client.getFileIdByUrl(url);
  if (fileId == BigInt(0)) {
    fileId = await client.addFile(url);
  }

  // Step 7: Request proof from a verified node
  await client.requestProof(fileId, BigInt(100));
  const jobIds = await client.fileJobIds(fileId);
  const jobId = jobIds[jobIds.length - 1];
  const job = await client.getJob(jobId);
  const nodeInfo = await client.getNode(job.nodeAddress);

  // Encrypt the password using the compute node's public key
  const rsa = new NodeRSA(nodeInfo.publicKey, "pkcs1-public-pem");
  const encryptedKey = rsa.encrypt(password, "hex");

  // Send the proof request
  const response: AxiosResponse = await axios.post(
    `${nodeInfo.url}/proof`,
    {
      job_id: Number(jobId),
      file_id: Number(fileId),
      file_url: url,
      encryption_key: encryptedKey,
      encryption_seed: encryptionSeed,
      nonce: null,
      proof_url: null,
    },
    {
      headers: { "Content-Type": "application/json" },
    },
  );
  if (response.status === 200) {
    console.log("Proof request sent successfully");
  } else {
    console.log("Failed to send proof request:", response.data);
  }

  // Step 8: Claim your DAT reward
  await client.requestReward(fileId);
  console.log("Reward requested for file id", fileId);
}

await main();

Configuration Files
Add to package.json:

JSON
"scripts": {
"start": "node --loader ts-node/esm index.ts"
}

Create tsconfig.json:

JSON
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ES2022",
    "moduleResolution": "node",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "outDir": "./dist",
    "allowJs": true,
    "resolveJsonModule": true
  },
  "ts-node": {
    "esm": true,
    "experimentalSpecifierResolution": "node"
  },
  "include": ["*.ts"],
  "exclude": ["node_modules"]
}

Run the Script

Shell
npm run start

You should see output similar to:

Proof request sent successfully
Reward requested for file id <file_id>

Summary

LazAI turns your privacy-sensitive data into an asset that supports open AI systems. By encrypting your contribution, anchoring it to a decentralized archive, and validating it through trusted nodes, you retain ownership while earning DATs for your role in the process. DATs are on-chain records of value, participation, and access. They form a foundation for aligning incentives in AI infrastructure that is collaborative, private, and decentralized by design.

To explore more or switch to Python or Rust, refer to the full documentation: https://alith.lazai.network/docs/lazai/data-provider

Follow LazAI on social channels to stay updated on new tools, contributor rewards, and governance opportunities. DATs can become a foundational component of decentralized AI.

By anchoring your data with DATs, you are not just participating in a new system. You are actively building a future where intelligence is powered by individuals, not corporations. 
