WTF is the DePub (and Why You Should Care)

Mike Natanzon
Jan 2, 2023

There are currently over 97 zettabytes of data online — that is roughly three gigabytes of data for every dollar of US national debt. Within this tremendous mountain of data — somewhere between old high school photos, junk email and the occasional tweet — lies nearly all of human knowledge. Now how much of this knowledge is actually correct, how much of it is faulty knowledge that relies on questionable sources or fictitious assumptions, and how much of it is outright misinformation and lies? With so much information out there, how do we even begin to sort the facts from the… alternative facts?

DePub — the Decentralized Public Sphere

Now what if there were a technology that could accurately tell you the credibility of online information? ‘Tell the credibility’ here means that anyone would be able to trust the technology’s output. It doesn’t mean that some people would trust it while others would think it’s manipulated or come up with conspiracy theories about it. It would be universally trusted because anyone could access and verify the tech’s source code and processes.

Imagine then how such a technology could make our lives better. What if the tech allowed liberals and conservatives on social media to agree on the facts of a news story or a political controversy? Wouldn’t we have more civilized discussions and less polarization in society? What if the tech allowed an old grandma to know whether that email she received is a scam or a virus? Wouldn’t that make our digital lives a lot safer, and save people and companies countless billions of dollars every year?

What if a student could use the tech to explore topics she’s unfamiliar with? Instead of spending countless hours going down various rabbit holes, reading pseudoscientific blog posts, or simply giving up, she could use the tech to sort out what is credible information and what isn’t. Wouldn’t people then be more informed, and more confident in their ability to learn new things?

What if biochemists could use the tech to clearly see the frontier of their field — if they could quickly tell the credibility of all the latest research and publications, instead of having to manually check each article and its citations for errors or faulty methodology? Wouldn’t researchers then have a much clearer understanding of their field? Wouldn’t they be able to use their energy more effectively to move science forward?

The potential of such a technology to improve the world is certainly enormous. So why hasn’t it been built yet? The main challenge is that this technology cannot be built within the structure of the market economy nor with the help of government. Building such technology would require a new kind of framework — a decentralized public sphere, or DePub for short. Before discussing what that is, let’s try to understand why it’s not possible to create this tech within our existing economic structures.

So why can’t this tech be built within the market economy structure? Let’s consider two important properties of the tech: it has to be universally trusted, and it has to be accurate. Let’s start with a simple question then: who out there — what person, group or corporation — can build a tech that would be trusted universally? The problem, of course, is not just the particular set of individuals and corporations we have today — the problem is structural. In a system based on scarcity, such as the market economy, everyone’s economic interests conflict with everyone else’s. Relations within such a system are necessarily adversarial, and therefore no individual or company can be trusted by everyone else to represent the public interest. And when the system being built is likely to have a lot of economic value, that only amplifies these conflicts, making it even less trustworthy.

Moreover, since such a technology would likely be based on artificial intelligence, and powered by machine learning, it would have to process value judgment calls from countless individuals. What is there to incentivize these individuals to favor the public interest above their own economic interests? Unfortunately there is no way in the market to do that, and no way for anyone to credibly arbitrate statements of fact.

In the market, nearly everything that benefits someone also harms someone else economically. Even the most ‘noble’ pursuits are still likely to put some people at a financial disadvantage. If you advocate for people to eat healthier, that harms fast food chains and candy manufacturers (as well as their employees). If you find a cure for a chronic illness, that financially harms the pharmaceutical companies that manufacture drugs to treat the condition. If you come up with a technological improvement in an industry, that harms your competitors. When the stock of a company goes up, it financially harms those who are selling the stock short. When employees make more money, that comes at the expense of the employer. When job-replacing automation makes productivity much higher, it also harms workers, and so on and so forth. That is the nature of participating in a system based on scarcity.

Since people in the market are driven by profit maximization, and not by the public interest, they would do what it takes to promote their narrow economic interests. That is why if you advocate for people to eat healthier, fast food chains may claim that their food is actually healthy, may sponsor research that underplays the harm in their products, or try to discredit their critics and the claims they make. The same is true for any industry, for any product, and for any claim of fact.

Because of these market dynamics, there is no one in the market who could credibly arbitrate statements of fact. Since everyone is a participant in the market economy to some degree, they may have an economic interest in promoting views that may conflict with the public interest.

If it is not possible to build a ‘credibility tech’ within the market — because no one in the market economy can credibly speak in the public interest, and because no one with such economic power can be trusted not to abuse it — perhaps it can be done with the help of government? Maybe if public funds were used instead of private capital to build the tech, there would be no credibility issue and no economic conflicts of interest, since the system would ultimately be in the hands of democratically elected representatives?

Even if we set aside for a moment the issue of which government should fund the project — since other countries are unlikely to trust a tech funded by their adversary — we’re still left with other problems. While governments are supposed to represent the public interest, in practice we have political parties or coalitions of parties that also have their own interest in maintaining their power. This dynamic in itself makes it impossible for the party or parties in power to be universally trusted with funding a ‘credibility tech.’ The incentive for the ruling group to manipulate such tech is great, while the incentive for those not in power to spread doubt and fear about the integrity of the system — even if such fears are unfounded — is equally great. It is therefore not possible for any group in power, under any constellation, to be trusted with such technology.

Even if there were unanimous agreement within government about building such tech, it would still not be sufficient to gain universal trust from the people. The free flow of information, and people’s ability to make sense of that information, are fundamental to people’s ability to act as a check on their government. If government is given the power to determine what information is credible through tech, it can very easily abuse such power and become tyrannical. For that reason alone, though it is supposed to act in the public interest, it is not possible for government to be universally trusted to fund a ‘credibility tech.’

So a ‘credibility tech’ cannot be built within the structure of the market economy because of economic conflicts of interest, and it cannot be built with the help of government because of potential abuse of power. How can it be built at all then?

Building such tech would require an environment where everyone’s economic interests align, and where no centralized authority can control the tech. This is precisely the problem that a decentralized public sphere (DePub) resolves.

What is a decentralized public sphere? It is an environment created by a blockchain-based protocol where the economic incentives of all participants are aligned and where public projects can be funded without centralized control. These properties are achieved through the design of the protocol and the incentive structure it creates.

How does the protocol guarantee that everyone’s economic incentives are aligned in the decentralized public sphere? Projects in the DePub are meant to be in the public interest. The protocol therefore issues funding to projects based on their realized economic impact. Since funding issuance dilutes the value of funds held by all participants, it is in the interest of all participants to preserve the value of their currency. At the same time, everyone in the ecosystem benefits from maximizing the economic growth that contributors’ public projects create. If contributors’ projects are not funded fairly, they will take their work to whichever ecosystem rewards them better. All participants are therefore aligned in their economic incentive to fund public projects based on their economic impact, since that preserves the value of the currency while maximizing economic growth from public projects. This answers the question of aligned economic interests, but what about decentralized funding of the tech?
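To make the dilution argument concrete, here is a minimal sketch in Python. It assumes a simplified model (the numbers and function name are hypothetical, not the protocol’s actual issuance formula) in which new tokens equal to the validated impact are minted to fund a project, while the ecosystem’s underlying value grows by the project’s realized impact. Per-token value is preserved only when the validated impact matches the realized one.

```python
# Minimal sketch of impact-based issuance, assuming (hypothetically) that
# funding a project mints new tokens equal to its validated impact, while
# the ecosystem's underlying value grows by the project's realized impact.

def token_value_after_funding(supply, value, realized_impact, validated_impact):
    """Per-token value after a project is funded and paid for via issuance."""
    new_supply = supply + validated_impact   # issuance dilutes every holder
    new_value = value + realized_impact      # real growth offsets the dilution
    return new_value / new_supply

# Start from a per-token value of 1.0 (1,000,000 value / 1,000,000 tokens).
# Accurate validation: issuance matches real impact, so value is preserved.
print(token_value_after_funding(1_000_000, 1_000_000, 50_000, 50_000))  # 1.0

# Over-funding a low-impact project dilutes every participant.
print(token_value_after_funding(1_000_000, 1_000_000, 10_000, 50_000))  # ~0.96

# Under-funding preserves the token in the short run, but (per the paragraph
# above) drives contributors to ecosystems that reward them better.
print(token_value_after_funding(1_000_000, 1_000_000, 50_000, 10_000))  # ~1.04
```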

Funding any project through the protocol is based on a Proof-of-Impact consensus mechanism. This essentially means that unless the impact validation process is followed to a T, the project will not be funded. The formal process is designed both to make the review of every project as accurate as possible and to minimize the possibility of bad actors manipulating the process. Validators are selected at random (to prevent collusion) to review the impact of the project. Each validator’s review is weighted by their level of domain-specific expertise in the project’s field. The number of validators required depends on the expected impact of the project. Another group of validators is then selected from across the ecosystem to decide the economic impact of the project based on the review of the ‘experts.’ The review of these ecosystem-wide validators is similarly weighted, based on general expertise. The review process is thus meritocratic, and cannot be ‘bought’ by holding more funds in the protocol. Finally, once the validation process is complete, anyone in the ecosystem can challenge the review (which can result in validators losing funds). After the challenge period, funds are issued to the project. This process ensures that every review is done accurately (as that is in the interest of every participant in the ecosystem), while no centralized authority has control over the process and no individual or group can manipulate it, regardless of their wealth or political power.
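The flow might look roughly like the sketch below. Everything in it is illustrative: the function names, the way validator counts scale with expected impact, and the expertise weights are assumptions made for the example, not the mechanism specified in the Abundance Protocol white paper.

```python
# Illustrative Proof-of-Impact flow. All names, weights, and scaling rules
# below are assumptions made for this sketch, not the protocol's actual spec.
import random
from dataclasses import dataclass

@dataclass
class Review:
    expertise: float   # validator's domain-specific or general expertise weight
    assessment: float  # impact estimate (experts) or scaling factor (ecosystem jury)

def select_validators(pool, expected_impact, base=3, per_unit=100_000):
    """Randomly sample reviewers; higher expected impact means more of them."""
    n = min(base + int(expected_impact // per_unit), len(pool))
    return random.sample(pool, n)

def weighted_estimate(reviews):
    """Expertise-weighted average of the submitted assessments."""
    total = sum(r.expertise for r in reviews)
    return sum(r.expertise * r.assessment for r in reviews) / total

def proof_of_impact(domain_pool, ecosystem_pool, expected_impact, challenges):
    # 1. Randomly selected domain experts assess the project's impact;
    #    each review is weighted by domain-specific expertise.
    experts = select_validators(domain_pool, expected_impact)
    expert_estimate = weighted_estimate(experts)

    # 2. Ecosystem-wide validators, weighted by general expertise, decide the
    #    impact to fund based on the experts' review (here, each submits a
    #    factor that scales the experts' estimate up or down).
    jury = select_validators(ecosystem_pool, expected_impact)
    funded_impact = expert_estimate * weighted_estimate(jury)

    # 3. Challenge period: anyone may dispute the review; a successful
    #    challenge blocks issuance and can cost validators their stake.
    if challenges:
        return None
    return funded_impact  # funds are issued to the project only at this point

# Toy run: experts submit impact estimates, the jury submits scaling factors.
domain_pool = [Review(0.9, 120_000), Review(0.6, 90_000),
               Review(0.8, 110_000), Review(0.4, 140_000)]
ecosystem_pool = [Review(0.7, 1.0), Review(0.5, 0.9),
                  Review(0.6, 1.1), Review(0.3, 1.2)]
print(proof_of_impact(domain_pool, ecosystem_pool,
                      expected_impact=100_000, challenges=[]))
```

The point the sketch tries to capture is that influence comes from randomly assigned, expertise-weighted review plus an open challenge period, not from how many tokens a validator happens to hold.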

The result is an environment where the public interest is aligned with everyone’s economic interest, and where no centralized authority can control the system — a decentralized public sphere.

Let us now revisit the challenge of the ‘credibility tech’ and see how the framework of the DePub solves it. The ‘credibility tech’ has to be universally trusted, and it has to be accurate. Since the tech is publicly funded and isn’t controlled by any government or corporation, there is no reason for it not to be trusted. The project can be entirely open-source, and anyone can contribute to it (as well as get paid based on their contribution — all within the DePub). Similarly, anyone can (permissionlessly) contribute to ‘training’ the AI system and be compensated proportionately to their contribution.

Every contributor also has an economic incentive to do their best to provide the most accurate value judgment to train the tech. The reason for that is as follows: contributors are compensated based on the realized economic impact of the tech. The better the tech is at accurately telling the credibility of online content, the more economic impact the tech would have. Each contributor therefore benefits the most from providing the most accurate and unbiased value judgment, and flagging any inaccurate reviews made by other contributors so that the AI would be trained on the most accurate data.
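As a toy illustration of this incentive (using a hypothetical pro-rata reward formula, not the protocol’s actual compensation rule), contributors split the tech’s validated impact in proportion to their contributions, so everyone’s payout grows when more accurate training data leads the ecosystem to validate a larger impact:

```python
# Hypothetical pro-rata split of the tech's validated impact among its
# contributors; the real reward rule may differ, but the incentive is the same.

def contributor_payouts(contribution_shares, funded_impact):
    """Split the validated impact among contributors in proportion to their work."""
    total = sum(contribution_shares.values())
    return {name: funded_impact * share / total
            for name, share in contribution_shares.items()}

# More accurate training data -> more impact validated -> everyone earns more.
print(contributor_payouts({"alice": 3, "bob": 1}, funded_impact=80_000))
print(contributor_payouts({"alice": 3, "bob": 1}, funded_impact=120_000))
```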

Similarly, every participant’s economic interest in the ecosystem is aligned with the public interest, which is to accurately review the impact of each project. This serves as a check on contributors, who know that the only way to earn more compensation is to do better work.

Thus, the decentralized public sphere creates a framework where the incentives of all contributors are aligned to do their best work (building an accurate ‘credibility tech’ in this case), and the incentives of all participants in the ecosystem are aligned to accurately judge the impact of the project. No corporation or government has the ability to manipulate this system, which means that the integrity of projects in the ecosystem can be universally trusted.

While a ‘credibility tech’ can greatly improve our lives, it is only one element in a much broader paradigm shift made possible thanks to the framework of the DePub — a framework that can transition us from a scarcity-based economy to an abundance economy. This will be the focus of the posts to follow.

Abundance Protocol: Transforming our economy and solving the problem of public goods through crypto.

Read the Abundance Protocol White Paper.
Follow us on Twitter @BuildingWeb4
Abundance Protocol website.
