In brief

Decentralized data layer Walrus is aiming to provide a "verifiable data foundation for AI workflows" in conjunction with the Sui stack.

The Sui stack includes data availability and provenance layer Walrus, offchain environment Nautilus and access control layer Seal.

Several AI teams have already chosen Walrus as their verifiable data platform, with Walrus functioning as "the data layer in a much larger AI stack."

AI models are getting faster, larger, and more capable. But as their outputs begin to shape decisions in finance, healthcare, enterprise software, and beyond, an important question needs to be answered: can we actually verify the data and processes behind those outputs?

"Most AI systems rely on data pipelines that nobody outside the organization can independently verify," states Rebecca Simmonds, Managing Executive of the Walrus Foundation, the organization that supports the development of decentralized data layer Walrus.

As she explains, there is no standard way to confirm where data came from, whether it was tampered with, or what was authorized for use in the pipeline. That gap doesn't just create compliance risk; it erodes trust in the outputs AI produces.

"It's about moving from 'trust us' to 'verify this,'" Simmonds said, "and that shift matters most in financial, legal, and regulated environments where auditability isn't optional."

Why centralized logs aren't enough

Many AI deployments today rely on centralized infrastructure and internal audit logs. While these can provide some visibility, they still require trust in the entity running the system. External stakeholders have no choice but to trust that the records haven't been altered. With a decentralized data layer, integrity is anchored cryptographically, so independent parties can verify the records without relying on a single operator.

This is where Walrus positions itself: as the data foundation within a broader architecture referred to as the Sui Stack. Sui itself is a layer-1 blockchain network that records policy events and receipts onchain, coordinating access and logging verifiable activity across the stack.

The Sui Stack. Image: Walrus

"Walrus is the data availability and provenance layer, where each dataset gets a unique ID derived from its contents," Simmonds explained. "If the data changes by even a single byte, the ID changes. That makes it possible to verify that the data in a pipeline is exactly what it claims to be, hasn't been altered, and remains available."

Other components of the Sui Stack build on that foundation. Nautilus lets developers run AI workloads in a secure offchain environment and generate proofs that can be checked onchain, while Seal handles access control, letting teams define and enforce who can see or decrypt data, and under what conditions.

"Sui then ties everything together by recording the rules and proofs onchain," Simmonds said. "That gives developers, auditors, and users a shared record they can independently check."

"No single layer solves the full AI trust problem," she added. "But together, they form something important: a verifiable data foundation for AI workflows, with provable data provenance, access you can enforce, computation you can attest to, and an immutable record of how everything was used."

Several AI teams have already chosen Walrus as their verifiable data platform, Simmonds said, including open-source AI agent platform elizaOS and blockchain-native AI intelligence platform Zark Lab.
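The content-derived ID Simmonds describes is a form of content addressing, and the core idea can be sketched in a few lines. This is a minimal illustration using a SHA-256 hash; Walrus's actual encoding and ID scheme are more involved, and the function name here is purely illustrative:

```python
import hashlib

def content_id(data: bytes) -> str:
    # Derive an identifier from the bytes themselves: the same data
    # always yields the same ID, and any change to the data, even a
    # single byte, yields a completely different ID.
    return hashlib.sha256(data).hexdigest()

original = b"training-dataset, version 1"
tampered = b"training-dataset, version 2"

# Deterministic: re-hashing identical data reproduces the ID.
assert content_id(original) == content_id(original)
# Tamper-evident: altered data no longer matches the recorded ID.
assert content_id(original) != content_id(tampered)
```

Because the ID is a function of the content, an auditor who holds the ID recorded onchain can fetch the data and recompute the hash independently, with no need to trust the party that served it.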