The Agora Document

A Constitution for the Digital Age

Draft v0.1 — March 2026


Preamble

Precedents of Constitutional Documents

Human history is in large part the story of the social contract. Our story therefore begins in the ‘agora’, the public space of the ancient Greeks. The agora is commonly called the birthplace of democracy: the place where self-government took its first steps, and where liberty, equality, and security became the supporting pillars of the relationship between the Athenian people and their government.

Our story then moves quickly to June 15th, 1215, and a meadow near Windsor in England, where King John affixed his royal seal to a world-altering document: the Magna Carta. At this moment, more or less, the medieval social contract changed, placing the king under the law rather than above it. Its 63 clauses offered protection against abuses such as punishment without due process and taxation without consent.

At the next stop of our story, a document that took effect in 1789 put forth a framework in which power would be set against power in a nuanced system of checks and balances. The framers of the U.S. Constitution were steeped in knowledge of the Greeks and of the history of English law.

Between the values proclaimed in the Declaration of Independence in 1776 and the specific articles of the Constitution, their political project created the conditions for a new kind of social contract: a state in which life, liberty, property, and the pursuit of happiness were held as unalienable rights, to be secured through consent, legitimacy, and participation — a government by the people, for the people.

It is also essential to note that our story is not merely a collection of documents, agreements, or social contracts. It is fundamentally about the fact that an abstract agreement can be made concrete, that power mediated through a concrete agreement is forced to become accountable, and that the consent of the governed provides a bulwark against tyranny and abuse.

The Digital Age

Now our story moves to the technological progress of the digital age. In 1973, an advisory committee at the Department of Health, Education, and Welfare, chaired by the forward-thinking Willis Ware, developed the five principles of Fair Information Practice, which ultimately informed the Privacy Act of 1974.

The five original core principles are worth summarizing here:

  1. No secret systems
  2. Right to know what’s collected
  3. Right to prevent misuse
  4. Right to correct records
  5. Organization must ensure reliability

In 1998 the FTC reworked these into five principles of its own:

  1. Notice — Tell consumers what you collect and how you use it
  2. Choice — Give consumers options about how their data is used
  3. Access — Let consumers see and contest their data
  4. Security — Keep data accurate and secure
  5. Enforcement — Have mechanisms to punish noncompliance

The General Data Protection Regulation (GDPR) took effect in the European Union in 2018, enshrining many of these principles in a complex legal framework that requires businesses to take specific steps to protect personal data.

Our story really begins now, and its constant is, simply put, the human experience of power.

It begins with a “user” trying to navigate to a web page, clicking buttons on a pop-up window with massive legal implications for the collection and use of personal data. It begins with a mind-numbing scroll through “terms and conditions”, or an attempt to make sense of a privacy policy that is often too long to read and more or less impossible to understand.

Many of the biggest tech companies, notably Apple, have made real strides in simplifying and clarifying the terms of this relationship, and privacy tools have grown more capable of providing meaningful protection.

But in the year 2026 the power of technology in our world is undeniably great, and the relationship between the “governed” and this powerful governing technology remains in disrepair. With advances in artificial intelligence, this relationship matters now more than ever.

The Center for Humane Technology has done tremendous work in identifying, beyond the misuse of personal data, the damage that technology has done and continues to do to children specifically and to people in general. They have pinpointed the dangers of algorithms and of flawed, often intentionally manipulative design choices made at the expense of our well-being.

The Techno-Social Contract

Reining in technological power is more often than not a long uphill battle in the face of the incredibly powerful forces of profit incentives and amoral markets. Wishing for benevolence is often futile and awareness alone is insufficient. What is needed is something more fundamental — a structural blueprint for a rightful relationship between humans and powerful technology.

To be sure, many principles, terms, and protections already exist. Most of them, however, focus narrowly on what an organization does or must do with personal data. This is, fundamentally, a limited conception of the scope of the interaction between humans and technology. Why not expand on it?

Our actual social contracts, our governmental powers, and our constitutions are always in need of examination, and they too are being tested in novel ways by the emergence of powerful technology. But our relationship with technology requires more explicit, delineated consent. What is desperately needed is a new techno-social contract.

What shape might a more thorough techno-social contract take?


Constitution

We the people have a fundamental right to consent to power, whether that power resides within a government, a device, an algorithm or any other technological system.

We the people, in order to maintain and perfect our natural social contracts, within and among our governments and entities, hereby set down this document as an agreement by which this ongoing human endeavor will be free from abuses of power by our increasingly capable technological systems.

This document is not meant to be an agreement between the people and a government or other organization. Rather, this document is meant to serve as a compact between a powerful technological system and the people that use it and rely on it.

Technological systems have power over people in many fundamental ways:

  1. They identify you — deciding who you are in the system, or whether you exist at all
  2. They track you — recording what you do, where you go, and when, and the resulting data can be lost, locked, sold, or erased without your consent
  3. They decide for you — algorithms determine what you see, what you qualify for, what you’re denied
  4. They distract you — shaping your attention, your urgency, your priorities
  5. They answer to no one — when the system fails, there is no one to call and few paths for remedy

These powers become clearer when defined through their relationship with the people, with a real human person rather than a “user”.

There is no reason to assume that a technological system should be given these powers by divine, absolute, or essential right. In fact, it is far more natural that the person, or the people, should hold the power in this relationship.

Where there is power, there must be an agreement on behalf of those subjected to that power. These articles are the delineation of the rights of the people in this regard, and the obligations of the relevant technological systems.

Those that control these systems, be they individuals, entities, organizations, companies, or governments, inherit these obligations. It is the duty of the technologist to build and safeguard these systems accordingly.

It is likewise the duty of the owners of these systems to adopt these principles and ensure they translate into action. It is therefore also essential that compliance be backed by meaningful enforcement.

The “techno-social contract” eventually meets the social contract.


Articles

I — Identity — Identity is arguably the most important distinction made in the world of technology. When a system identifies an individual, their identity becomes part of the system.

Often this is done through the collection of personal data. Personally identifying data must be treated with special care. There are rightly many laws that govern this transaction (Privacy Act, GDPR) but these are insufficient.

We declare that a system that identifies a human individual must acknowledge, affirm, and act in accordance with their rights as set out in this document.

Typically a system refers to individuals as “users”; this has long been the default language of software. Instead, the system must recognize the human as an able actor, capable of consent, and possessed of unalienable rights.

II — Record — When a system tracks the actions, movements, or transactions of a person, it creates a record. That record is not the property of the system. It is a shared artifact of the relationship, and the person it describes has rights over it: the right to see it, the right to challenge it, and the right to understand how it is used.

III — Restraint — A system must acknowledge the boundaries of its own knowledge. Not everything about a person should be captured, categorized, or made legible.

A system has an obligation to not know certain things. The right to opacity exists in many contexts.

A system must operate within a bounded and clearly defined framework, one that extends only to its core purposes and to the alignment of those purposes with the rights of the people it serves and interacts with.

IV — Transparency — A system must present itself honestly to the people it serves. Its rules, its logic, and its actions must be legible to those it affects.

When a system makes a decision that impacts a person — approving, denying, ranking, recommending, restricting — that person has the right to understand what happened and why. When a system changes its rules, those changes must be communicated clearly and in advance.

A system that cannot explain itself to the people it serves has no legitimate authority over them.

V — Decision — When a system makes a determination that affects a person’s outcomes, opportunities, or standing, it exercises judgment. That judgment carries weight regardless of whether it was made by a human or an algorithm.

This article does not concern the general design choices inherent in any system. It concerns the specific determinations a system makes about an individual person that affect their rights, opportunities, or outcomes.

A system must not delegate consequential decisions to automated processes without clear justification and meaningful human review. A person has the right to know when a decision about them has been made by an automated system, and the right to have that decision reviewed by a human being.

The power to decide is not neutralized by the complexity of the system that exercises it. An algorithm is not an excuse. It is a choice made by the people who built and deployed it, and it must be answerable as such.

VI — Attention — A system that notifies, alerts, or otherwise demands the attention of a person exercises a form of power over that person’s time, focus, and mental energy.

This power must be exercised with restraint and respect. A person has the right to control when and how a system may demand their attention.

Notifications must serve the interest of the person, not the engagement metrics of the system. A system must never exploit human psychology to manufacture urgency, compel interaction, or prevent departure.

The attention of a human being is a faculty to be respected.

VII — Accountability — Those who build, own, and operate systems that fall under this document bear an obligation that cannot be delegated to the technology itself.

A system must identify the entity responsible for its operation. A person affected by the actions of a system must have a means of reaching a responsible human being — not solely an automated process. When a system causes harm through error, negligence, or design, there must be a clear path to remedy.

No system may claim that its complexity, scale, or automated nature absolves it of responsibility to the people it serves. The obligation follows the power. Where the system exercises power, the operator answers for it.


Examples

The Constitutional Layer

In most technological systems today, every person is a “user.” A mayor paying a water bill is a user. A resident contesting a parking ticket is a user. The official who issued that ticket is a user. The system treats them identically — they authenticate, they have permissions, they interact with the interface.

This document requires something more. Above the system layer, there must be a constitutional layer that recognizes the relationship between the person and the system.

A system may maintain an internal concept of “user” for technical purposes — authentication, sessions, credentials. That is plumbing. But the moment a person interacts with the system as a human with rights and obligations, the system must know: what is the relationship?

A Resident has privacy rights. An Official has public accountability. A Consumer has commercial protections. A Merchant has obligations of fair dealing. These are not permission levels. They are constitutional identities — and the architecture must reflect them.
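In code, this layering might be sketched as follows. This is only an illustration: the type names follow the examples above (User, Resident, Official), and none of the field names are prescribed by this document.

```python
from dataclasses import dataclass

# System layer: the technical "user". This is plumbing only:
# authentication, sessions, credentials.
@dataclass
class User:
    user_id: str

# Constitutional layer: distinct identities, not permission levels.
@dataclass
class Resident:
    user: User
    # A Resident holds privacy rights over records describing them.
    has_privacy_rights: bool = True

@dataclass
class Official:
    user: User
    office: str
    # An Official's actions belong to the public record.
    is_publicly_accountable: bool = True

# The same human can hold different constitutional identities:
# a mayor paying a water bill acts as a Resident, not as an Official.
mayor_account = User(user_id="u-1001")
as_resident = Resident(user=mayor_account)
as_official = Official(user=mayor_account, office="Mayor")
```

The point of the sketch is that Resident and Official are separate types wrapping the same technical account, so the architecture cannot confuse a relationship with a permission level.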

The Parking Ticket

Consider the act of issuing a parking ticket. It is a small civic event, but it contains every principle in this document.

Identity: A Resident receives the ticket. An Official issues it. The system must know the difference — not because they have different login credentials, but because one has the right to privacy and the other has the obligation of public accountability.

Record: The ticket creates a record — who, what, where, when, why. That record belongs to the relationship, not solely to the system. The Resident has the right to see it. The Official’s action is part of the public record.

Restraint: The system does not collect any information beyond what is necessary for the parking ticket. It does not collect the driving history, the income level of the offender, or their recent purchases.

Transparency: The ordinance violated must be clear. The reason for the fine must be stated. The Resident should never receive a penalty they cannot understand.

Decision: If the municipality uses automated systems to detect violations — camera-based enforcement, sensor-triggered citations — the Resident has the right to know that the determination was made by a machine, and the right to have it reviewed by a human being.

Attention: The notification of the ticket must serve the Resident’s interest — informing them of an obligation — not the system’s interest in generating revenue.

Accountability: The Official who issued the ticket is identified. Their action is transparent. If the ticket was issued in error, there is a path to remedy. The system does not shield the operator from responsibility.

This is not a hypothetical. This is how civic life already works, in principle. The Agora Document simply requires that technological systems encode these principles rather than ignore them.
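As one sketch of such an encoding, the ticket itself could be modeled so that the articles are visible in the record. All field names, the ordinance citation, and the contact address here are illustrative assumptions, not requirements of this document:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ParkingTicket:
    # Record (Art. II): who, what, where, when, why.
    issued_to: str     # the Resident (Identity, Art. I)
    issued_by: str     # the Official, named (Accountability, Art. VII)
    location: str
    issued_at: datetime
    ordinance: str     # the rule violated, stated plainly (Transparency, Art. IV)
    fine_usd: float
    # Decision (Art. V): automated determinations are disclosed, not hidden.
    automated: bool = False
    human_reviewed: bool = False
    # Accountability (Art. VII): a path to remedy.
    contest_contact: str = "parking-appeals@example.gov"

    def is_final(self) -> bool:
        # An automated citation is not final until a human has reviewed it.
        return (not self.automated) or self.human_reviewed

# Restraint (Art. III) appears as what the record omits: no driving
# history, no income level, no purchase data.
ticket = ParkingTicket(
    issued_to="resident-42",
    issued_by="officer-7",
    location="Main St & 3rd Ave",
    issued_at=datetime(2026, 3, 1, 9, 30),
    ordinance="Municipal Code 12.64.070 (expired meter)",  # hypothetical citation
    fine_usd=45.0,
    automated=True,  # e.g. camera-based enforcement
)
```

Notice that Restraint is expressed not by a field but by the absence of fields: the record simply has no place to put information beyond the violation itself.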

The Commercial Equivalent

The same principles apply in commerce. When a person becomes a Customer of a system — an online marketplace, a subscription service, a banking application — they enter a relationship with rights.

They are recognized not as a data profile but as a person entering a commercial relationship (Identity). They have the right to know what the system records about them (Record). They have the right not to have things made legible that have nothing to do with the transaction (Restraint). They have the right to understand why they were approved or denied (Transparency). They have the right to know when a decision about them — a credit determination, a price offered, a service denied — was made by an algorithm, and the right to have that decision reviewed by a person (Decision). They have the right not to be manipulated into purchases through dark patterns (Attention). And there must be a human being accountable when the system fails them (Accountability).

These are not novel ideas. They are the ancient principles of fair dealing, applied to the systems that now mediate nearly all commerce.


Adoption

This document is not legislation. It is not a regulation. It is a constitution — a framework that precedes and informs the specific laws, policies, and technical implementations that follow.

Adoption of this document means different things to different actors:

For technologists and developers: Adoption means building systems that recognize constitutional identities, not just users. It means architecting for transparency, contestability, and accountability from the beginning — not retrofitting them as compliance requirements. Open source implementations of these principles should be developed, shared, and improved collectively.

For organizations and companies: Adoption means committing to the principles of this document as a standard for how your systems relate to the people they serve. It means auditing existing systems against these articles and publicly acknowledging where they fall short.

For municipalities and governments: Adoption means requiring that civic technology — the systems through which residents interact with their government — comply with these articles. Public systems, funded by public money, serving public interests, should be the first to embody these principles.

For lawmakers and regulators: Adoption means using this document as a foundation for legislation that moves beyond data protection toward a fuller framework of rights in the relationship between people and technology. The existing regulatory landscape (Privacy Act, GDPR, FTC guidelines) addresses a fraction of the relationship. This document addresses the whole.

For individuals: Adoption means demanding that the systems you rely on recognize you as a person with rights — not a user to be managed, a data point to be harvested, or an engagement metric to be optimized.
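For the technologist's obligation in particular, “architecting for transparency, contestability, and accountability from the beginning” might mean making contestation part of the decision type itself rather than a bolted-on process. A minimal sketch, with all names and values hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Decision:
    subject: str     # the person affected
    outcome: str     # e.g. "approved", "denied"
    reason: str      # legible to the person (Transparency, Art. IV)
    automated: bool  # disclosed, never hidden (Decision, Art. V)

    def contest(self, note: str) -> "Decision":
        # Contesting yields a new decision pending human review;
        # the original record is preserved, not overwritten.
        return Decision(
            subject=self.subject,
            outcome="under_review",
            reason=f"contested by subject: {note}",
            automated=False,
        )

denial = Decision(
    subject="applicant-9",
    outcome="denied",
    reason="stated threshold not met",  # placeholder reason
    automated=True,
)
appeal = denial.contest("threshold miscalculated")
```

Because the type is frozen, no decision can be silently edited; review produces a new record, which is one way contestability becomes structural rather than retrofitted.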

This document will be maintained as a living text, revised through open and transparent process, and made freely available to all who wish to adopt, implement, or build upon it.


References

Historical Documents

Magna Carta (1215). U.S. National Archives. https://www.archives.gov/milestone-documents/magna-carta

United States Declaration of Independence (1776). National Archives. https://www.archives.gov/founding-docs/declaration

United States Constitution (1789). National Archives. https://www.archives.gov/founding-docs/constitution

Law & Policy

U.S. Department of Health, Education and Welfare, Secretary’s Advisory Committee on Automated Personal Data Systems. Records, Computers and the Rights of Citizens (July 1973). https://aspe.hhs.gov/reports/records-computers-rights-citizens

Privacy Act of 1974, 5 U.S.C. § 552a. https://www.justice.gov/opcl/privacy-act-1974

Federal Trade Commission. Privacy Online: A Report to Congress (June 1998). https://www.ftc.gov/reports/privacy-online-fair-information-practices-electronic-marketplace-federal-trade-commission-report

Regulation (EU) 2016/679 of the European Parliament and of the Council (General Data Protection Regulation). Effective May 25, 2018. https://gdpr-info.eu/

Books

Barlow, John Perry. “A Declaration of the Independence of Cyberspace” (February 8, 1996). Electronic Frontier Foundation. https://www.eff.org/cyberspace-independence

Lessig, Lawrence. Code and Other Laws of Cyberspace. Basic Books, 1999.

Locke, John. Two Treatises of Government. 1689.

Williams, James. Stand Out of Our Light: Freedom and Resistance in the Attention Economy. Cambridge University Press, 2018.

Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2019.

Podcasts & Film

Harris, Tristan and Aza Raskin. Your Undivided Attention (podcast). Center for Humane Technology. https://www.humanetech.com/podcast

Orlowski, Jeff (director). The Social Dilemma (documentary). Netflix, 2020.

Organizations

Center for Humane Technology. https://www.humanetech.com/

Electronic Frontier Foundation. https://www.eff.org/

Electronic Privacy Information Center (EPIC). https://epic.org/

Renew Democracy Initiative. https://rdi.org/