
The dead internet theory used to be a conspiracy theory. Now it’s a forecast.
AI-generated content is flooding every channel: social media, search results, product reviews, news aggregation. The economics are simple. Synthetic content is cheap. Human attention is finite. The ratio shifts daily.
The response has been largely fatalistic. We’re told to develop better detection, to label AI content, to educate users. These are rear-guard actions. Detection models lag behind generation models. Labels get stripped in syndication. Users are overwhelmed.
But the framing is wrong. The dead internet assumes we have no choice but to receive content passively and guess at its origins. That assumption doesn’t hold.
We have the technology to build an internet where authenticity is verifiable by default. The primitives exist. The standards are mature. The question is adoption.
The Stack Already Exists
Content authenticity has working infrastructure:
C2PA (Coalition for Content Provenance and Authenticity) embeds cryptographic provenance directly into binary media files — images, video, audio. It’s shipping in cameras from Sony and Leica, in Adobe’s creative tools, in social platforms. When you see a C2PA-signed image, you can verify who created it, what device captured it, and whether it’s been modified. C2PA handles binary artifacts. It doesn’t address text or structured data.
SLSA and in-toto provide the same guarantees for software. Provenance attestations prove how a binary was built, from what source, by which build system. This is now a federal procurement requirement in the United States.
Decentralized Identifiers (DIDs) give you cryptographic identity that doesn’t depend on any single provider. Your identity is a key pair, resolvable to a document you control. W3C standard, multiple implementations, production-ready.
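The “identity is a key pair” idea is concrete enough to sketch. In the did:key method, for instance, the identifier is derived directly from the public key: prefix the raw Ed25519 key bytes with the multicodec code 0xed01 and base58btc-encode the result. A minimal illustration in Python, using random bytes as a stand-in for a real public key:

```python
import os

# Bitcoin-style base58 alphabet, used by the "z" (base58btc) multibase prefix
B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc(data: bytes) -> str:
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = B58[r] + out
    # each leading zero byte is encoded as a leading "1"
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def did_key_ed25519(public_key: bytes) -> str:
    # 0xed 0x01 is the multicodec prefix identifying an Ed25519 public key
    return "did:key:z" + base58btc(b"\xed\x01" + public_key)

# Stand-in for a real 32-byte Ed25519 public key; in practice you would
# generate an actual key pair with a signing library such as PyNaCl.
pub = os.urandom(32)
did = did_key_ed25519(pub)
print(did)  # Ed25519 did:key identifiers begin with "did:key:z6Mk"
```

Resolving that identifier yields a DID document listing the key, so anyone can verify signatures against it without asking a central registry.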
Verifiable Credentials (VCs) let institutions make attestations about you — degrees, certifications, memberships — that you hold and selectively disclose. Another W3C standard, already deployed in education and government contexts.
JSON-LD provides semantic structure for data on the web. Over 50% of websites already use it for search engine optimization. The same structure that helps Google understand your content can carry signatures and provenance.
None of this is speculative. These are deployed technologies with real adoption curves.
Cloudflare and the Routing Layer
Cloudflare took a small but significant step in 2024: preserving C2PA manifests through its CDN. This matters because CDNs routinely strip metadata from images for performance. By preserving provenance data, Cloudflare ensures that signed images remain verifiable after delivery.
It’s a baby step. But consider where Cloudflare sits: between origin servers and users, handling a substantial percentage of web traffic.
A CDN that preserves authenticity metadata today could verify it tomorrow. The infrastructure is in position. A routing layer with content authenticity support could:
- Surface authenticity signals to users before content loads
- Provide verification as a service for origins that don’t implement it themselves
- Aggregate trust signals across the web
- Offer filtering or ranking based on provenance
Cloudflare isn’t there yet. But the architecture supports it. Any infrastructure provider at sufficient scale has the same opportunity.
Characteristics of the Internet of Authenticity
What does an internet built on verifiable authenticity look like?
Signed by default. Content carries cryptographic proof of origin. Not every piece of content — that’s unrealistic — but enough that unsigned content becomes conspicuous. The default shifts from “assume authentic until proven fake” to “verify if you need to trust.”
Provenance chains. You can trace how content was created, modified, and distributed. An image has a history: captured on this device, edited in this tool, published by this account. A document has a chain: drafted by this author, reviewed by this editor, attested by this institution.
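At its simplest, a provenance chain is a hash-linked list of events: each entry commits to its predecessor, so rewriting history breaks every later link. A toy sketch in Python (hash links only; a production system like C2PA also cryptographically signs each entry, and the actor names here are invented):

```python
import hashlib
import json

def entry(action: str, actor: str, prev_hash: str) -> dict:
    """One provenance event, linked to its predecessor by hash."""
    e = {"action": action, "actor": actor, "prev": prev_hash}
    e["hash"] = hashlib.sha256(
        json.dumps(e, sort_keys=True).encode()
    ).hexdigest()
    return e

def verify_chain(chain: list) -> bool:
    """Recompute every link; any tampering invalidates the chain."""
    prev = ""
    for e in chain:
        body = {k: v for k, v in e.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

chain = []
chain.append(entry("captured", "camera:example-model", ""))
chain.append(entry("edited", "tool:example-editor", chain[-1]["hash"]))
chain.append(entry("published", "account:newsroom", chain[-1]["hash"]))
print(verify_chain(chain))  # True
```

Altering any field in any entry, say the actor on the edit step, changes that entry’s hash and fails verification for everything downstream.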
User-controlled trust. You decide whose signatures matter to you. Maybe you trust Reuters and the Associated Press for news images. Maybe you trust a specific research institution for scientific claims. Maybe you trust your own contacts for personal communication. These preferences are yours to configure, not dictated by a platform.
Decentralized verification. No single authority decides what’s authentic. Verification is a function anyone can perform given the signature and the public key. Trust anchors are plural and user-selected.
Portable identity. Your identity moves with you. A DID isn’t tied to a platform. Your credentials aren’t locked in a silo. When you leave a service, your attestations come with you.
The PKI Problem
The current web has a trust infrastructure: the certificate authorities that underpin HTTPS. This system has known weaknesses.
Over a hundred root CAs are trusted by default in most browsers. Any of these CAs can issue a certificate for any domain. The system’s security depends on the least trustworthy CA in the set. We’ve seen what happens when one fails — DigiNotar’s compromise in 2011, Symantec’s misissuance leading to its distrust in 2018.
More fundamentally, PKI answers the wrong question for content authenticity. It tells you that you’re connected to the domain you requested. It doesn’t tell you anything about the content that domain serves, who created it, or whether it’s been modified since creation.
The internet of authenticity requires a different model. Trust anchors should be selected by users, not bundled by browser vendors. Verification should apply to content, not just connections. Identity should be decentralized, not dependent on a hierarchy of certificate authorities.
DIDs and VCs provide this model. Your trust graph is yours. You add the signers you trust, revoke the ones you don’t, and evaluate content against your own criteria.
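In code, a user-controlled trust graph can be as simple as a local mapping from signer identifiers to the kinds of claims you trust them for, consulted at verification time. A sketch with invented example DIDs and scope names:

```python
# A local, user-owned trust graph: signer DID -> claim scopes trusted.
# The DIDs and scope names here are illustrative, not real identifiers.
trust_graph = {
    "did:example:reuters": {"news-image"},
    "did:example:university": {"degree", "research"},
}

def trusted(signer_did: str, scope: str) -> bool:
    """Trust is evaluated per signer and per kind of claim."""
    return scope in trust_graph.get(signer_did, set())

print(trusted("did:example:reuters", "news-image"))  # True
print(trusted("did:example:reuters", "research"))    # False

# Revocation is local: remove the entry and re-evaluate.
trust_graph.pop("did:example:reuters")
print(trusted("did:example:reuters", "news-image"))  # False
```

No browser vendor or platform sits in the loop: adding, scoping, and revoking trust are operations on data you hold.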
From Content to Context
C2PA handles binary artifacts. But as AI agents become primary consumers of information, binary content is only part of the picture.
Agents don’t just consume images and videos. They reason over structured data: knowledge graphs, entity relationships, claims with attributed sources. When an agent decides how to act, it traverses context — a web of facts about customers, contracts, policies, and permissions. This context is structured, semantic, and — currently — unsigned.
JSON-LD and schema.org vocabularies provide universal structure for data on the web. Schema.org types are understood by search engines, LLMs, and agent frameworks. Over 50% of websites already publish JSON-LD. The structure is there. The authenticity layer is missing.
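For concreteness, this is the kind of JSON-LD an agent already consumes today: schema.org-typed, machine-readable, and carrying no signature (the values are a minimal invented example):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2025-01-15"
}
```

Nothing in this document tells a consumer who actually asserted these facts or whether they have been altered since publication.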
This is what we’re building at Noosphere: extending content authenticity to structured data. JSON-LD signing with multi-party signatures, integrated with decentralized trust infrastructure.
Because structured data has internal organization, you can do things with it that binary content doesn’t support:
Multi-party signatures. Different parties attest to different parts of the same data structure. The author signs the content. The publisher signs the distribution metadata. The institution signs the author’s credentials. Each signature is separable and independently verifiable.
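A rough sketch of the separability idea, with HMAC standing in for real public-key signatures and the document fields invented for illustration: each party signs only its own subtree, and each signature verifies on its own.

```python
import hashlib
import hmac
import json

def canonical(obj) -> bytes:
    # Deterministic serialization; real JSON-LD signing uses RDF
    # canonicalization (URDNA2015), not plain key sorting.
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

def sign(key: bytes, obj) -> str:
    # HMAC is a stand-in here; a real system would use Ed25519 or similar.
    return hmac.new(key, canonical(obj), hashlib.sha256).hexdigest()

doc = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",  # the author's content
    "publisher": {"@type": "Organization", "name": "Example News"},
}

author_key, publisher_key = b"author-secret", b"publisher-secret"

# Each party signs only the subtree it is responsible for.
signatures = {
    "author": sign(author_key, {"headline": doc["headline"]}),
    "publisher": sign(publisher_key, {"publisher": doc["publisher"]}),
}

# Verify the author's attestation without touching the publisher's.
ok = hmac.compare_digest(
    signatures["author"], sign(author_key, {"headline": doc["headline"]})
)
print(ok)  # True
```

Because the signatures cover disjoint subtrees, a verifier who only trusts the publisher can check the distribution metadata while ignoring, or separately evaluating, the author’s claim.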
Layered trust models. You can trust different signers for different claims. A university attests that someone holds a degree. An employer attests to their role. A professional body attests to their certification. When you evaluate a claim, you evaluate the specific attestation chain for that claim.
Selective disclosure. Reveal only the parts of a credential relevant to a given context. Prove you’re over 18 without revealing your birthdate. Prove you hold a certification without revealing your full employment history.
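The mechanics of selective disclosure can be sketched with salted hashes, in the spirit of SD-JWT: the issuer signs digests of each claim rather than the claims themselves, and the holder reveals a claim by disclosing its value plus salt so the verifier can recompute the digest. A simplified sketch with the issuer’s actual signature omitted and the claim names invented:

```python
import hashlib
import json
import os

def digest(salt: bytes, name: str, value) -> str:
    payload = json.dumps([salt.hex(), name, value]).encode()
    return hashlib.sha256(payload).hexdigest()

# Issuer: hash every claim with a fresh salt and sign only the digests
# (the signature over signed_digests is omitted in this sketch).
claims = {"name": "Jane Doe", "over_18": True, "employer": "Acme"}
salts = {k: os.urandom(16) for k in claims}
signed_digests = {k: digest(salts[k], k, v) for k, v in claims.items()}

# Holder: disclose only the over-18 claim, nothing else.
disclosure = ("over_18", claims["over_18"], salts["over_18"])

# Verifier: recompute the digest and match it against the signed set.
name, value, salt = disclosure
print(digest(salt, name, value) in signed_digests.values())  # True
```

The verifier learns that the issuer attested “over_18: true” and learns nothing about the undisclosed name or employer, whose digests reveal nothing without their salts.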
These techniques require structure. You can’t selectively sign part of a flat file. You can’t layer trust over an undifferentiated blob. The semantic structure of JSON-LD makes this possible.
And because JSON-LD is built on open standards — RDF, schema.org vocabularies, W3C specifications — there’s no vendor lock-in. Your signed context is portable, interoperable, and yours.
The Work Ahead
The infrastructure for an internet of authenticity exists in pieces. C2PA is shipping for binary media. DIDs and VCs are standardized. JSON-LD is ubiquitous. The cryptographic primitives are mature.
The missing piece is the integration layer — especially for structured data. C2PA doesn’t handle JSON-LD. Existing signing standards don’t support multi-party attestation over semantic graphs. The decentralized trust infrastructure to replace PKI is still emerging.
This is what we’re building. The goal is an authenticity layer for the semantic web: signed context that agents can verify, trust models that users control, provenance that persists across time and platforms.
Camera manufacturers are embedding signatures at capture. Creative tools are adding provenance by default. Regulatory pressure is pushing software supply chains toward attestation. Structured data is next — the knowledge graphs and ontologies that agents reason over.
The dead internet is one possible future. The internet of authenticity is another. The tools exist. The standards are ready. The integration work is underway.
Next in this series: Context Integrity — why agents need more than signed content, and how ontologies built on open standards become the foundation for trustworthy AI.