Defeating Tech Giants With Open Protocols, Interoperability, And Shared Stewardship

from the alternative-approaches dept

Across the ideological spectrum, there seems to be a consensus that something must be done about the biggest tech companies — that the legal mechanisms we currently have to address monopolization in the United States are inadequate to deal with the realities of the digital market. Even as our institutions appear powerless in the face of Big Tech’s massive lobbying power, one idea is gaining traction as a viable approach to curbing the societal and economic impacts of tech monopolies: restoring the core of a healthy internet ecosystem through interoperability and the revival of open protocols.

Lawmakers, policy experts, and even Twitter are advocating for tech companies to open up their platforms so that other services and start-ups can enter the field. There are different approaches to doing so, and varying layers of interoperability are possible. The overarching goal, however, would be to strip away the worst aspects of tech monopolization and bring about a new era of competition and innovation.

As a writer and nonprofit tech entrepreneur who has focused on projects promoting digital justice and community networks, I became interested in these ideas. Why do we need to extend interoperability into the application layer? How do we create new Internet standards that open up our networks and platforms in ways that invite new features and applications that better respect our individual and collective rights online? As I examined the three most likely scenarios being discussed, I realized that we had much to learn from the past. We need to revive the power of standards bodies, and ensure they stay relevant and effective by applying what we already know about governing a commons successfully.

A Brief Overview of Interoperability and Competition

What made the early Internet so exciting was how quickly it changed. Different services like bulletin board systems (BBS), email, and Internet Relay Chat (IRC) came about and allowed people to communicate in ways that were impossible before. That rich ecosystem of tools and services was enabled by downstream innovation. New applications and features could be built with existing technologies, with or without permission from the prevailing tech companies. Yes, there were plenty of lawsuits against these start-ups back then. But people were still willing to take the risk, and there were investors who wanted to back them. There were also fewer onerous laws hindering experimental technologies.

Perhaps most importantly, much of the Internet ran on open protocols and standards. The academics and others who initially designed the protocols wanted to build a relatively free ecosystem, so they made it possible for services to interoperate with each other. Standards bodies like the World Wide Web Consortium (W3C) established shared protocols in the name of the collective interest. These institutions have helped companies and organizations come together and set rules based on agreed-upon needs, making those rules transparent and representative of the interests of more than one stakeholder. At standards bodies, companies sit alongside non-profit organizations, educational institutions, policy experts, and academics.

But standards bodies have grown increasingly inefficient and exploitable. Not only have they always been slow and under-resourced, but tech companies have grown powerful enough to bend them to their will or ignore them altogether, building walled gardens with no interoperability built into their platforms (beyond some public APIs of varying consistency). In the era of Move Fast and Break Things, there has been little patience for the kind of multi-stakeholder dialogue and decision-making required to build and conform to shared technical standards.

There has been little incentive for tech companies to play well with others. Worse, it has become the norm for tech monopolies to destroy any competition. Laws that regulate the internet, such as the Digital Millennium Copyright Act (DMCA) and the Computer Fraud and Abuse Act (CFAA), have had a chilling effect on the kind of innovation that was characteristic of the early internet. These regulations can be weaponized by big players to crush new start-ups over even the most trivial violations. And when they don’t sue, tech giants can simply buy out their smaller competitors or throw their resources into imitating those competitors’ services until they crush them.

Now most people communicate, get news, and publish their work through closed platforms run as web services. When people think of the Internet, they think about the platforms, not the protocols that run beneath them and make them work. To many, email is Gmail, chat is Slack, and discussion forums are Facebook.

Of course the underlying protocols are still core to the Internet’s functionality. But these closed platforms are built with little to no interoperability. As we’ve become dependent on them, our digital lives have been left at the mercy of companies whose primary goal is to enclose as much of the Internet’s infrastructure as they can get away with. Especially when it comes to social networks, their ability to mediate every aspect of our relationships and interactions online has come at an immense cost to our rights to free expression, privacy, and access to knowledge.

Possible Paths Towards an Interoperable Internet

There are those who are calling for a revival of antitrust enforcement to break up the tech monopolies. But federal agencies in the U.S., such as the Federal Trade Commission (FTC), move too slowly and are under-resourced. And then there are others who say that breaking up the tech companies is entirely the wrong approach — that we should instead build protocols that make the Internet interoperable again, as it was in the early days.

The European Commission, the Electronic Frontier Foundation, the University of Chicago Booth School of Business, Mozilla, Twitter CEO Jack Dorsey, and others are calling for a revival of interoperability as a means to address Big Tech’s dominance over the Internet. Between them, they present three possible ways this could come about, with or without state intervention.

1) State Antitrust Enforcement

Through litigation or legislative action, the state could require companies to make their platforms more open and interoperable. Mozilla’s Chris Riley asserts that the agency best suited to take this on would be the FTC, which has an explicit mandate to protect consumers and enforce U.S. antitrust laws. Harold Feld of Public Knowledge calls for an entirely new agency empowered to oversee the implementation of any proposed law enforcing digital platform competition, given the technical complexities specific to enforcing such a law.

There is precedent for this in Europe. The European Commission brought a case against Microsoft in the early 2000s that resulted in the company being required to release information enabling competing software to interoperate with Windows desktops. The U.S. and Europe have their own approaches to antitrust, of course. Interoperability enforcement would look very different depending on which state(s) had the mandate to move forward with this type of action.

2) Established Platform Companies Seek Standardization

One of the big players could willingly embark on a path to build open protocols. In December, Twitter CEO Jack Dorsey announced Blue Sky, an initiative to help develop an open and decentralized standard for social media. In his Twitter thread about the project, Dorsey said that Twitter would either fund further development of an existing decentralized standard or, in his words, “create one from scratch”.

Many responded by asking about ActivityPub — the protocol behind Mastodon, the federated alternative to Twitter. Why wouldn’t Twitter invest its resources into that? Dorsey responded that it might be possible, but that it’s up to the Blue Sky team to decide whether that protocol would be best. It’s worth pointing out that ActivityPub has already gone through the W3C process and is an official W3C Recommendation.
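
To give a sense of what adopting ActivityPub would mean in practice: servers speaking the protocol exchange JSON “activity” objects drawn from the W3C ActivityStreams vocabulary and deliver them to each other’s inboxes. Below is a minimal, illustrative sketch in Python of a public “Create Note” activity, roughly the kind of message a Mastodon server sends when someone posts; the account name, domain, and URLs are made up for illustration.

    # A minimal sketch of the kind of JSON activity ActivityPub servers exchange.
    # The @context and the "Create"/"Note" types come from the W3C ActivityStreams
    # vocabulary; the domain, actor, and IDs below are hypothetical.
    import json

    create_note = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",
        "id": "https://example.social/users/alice/statuses/1",   # hypothetical URL
        "actor": "https://example.social/users/alice",           # who is posting
        "to": ["https://www.w3.org/ns/activitystreams#Public"],  # public addressing
        "object": {
            "type": "Note",
            "content": "Hello, fediverse!",
            "published": "2020-03-01T12:00:00Z",
        },
    }

    # A federated server would deliver this JSON (typically with an HTTP signature)
    # to the inboxes of the actor's followers on other servers.
    print(json.dumps(create_note, indent=2))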

It makes sense that a major platform would want to decentralize, the most obvious reason being to relieve itself of responsibility for content moderation. The second reason is to fortify itself against even bigger competitors, like Facebook, that threaten to enclose even more of the Internet.

3) Building Open Protocols from Scratch

Within the last seven years there’s been an explosion of decentralized protocols, dealing with everything from currency and commerce to social media and decision-making. We are well beyond the proof-of-concept stage. There are all kinds of ways to build decentralized protocols — based on gossip, distributed files, blockchains, or federated databases. The issue isn’t whether decentralization is technically feasible. The issue is that there are many ways to do it, and each approach suits different use cases.
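
As a deliberately toy illustration of one of those approaches, gossip-based designs spread posts peer to peer with no central server: any two peers that happen to connect simply merge what they have. The sketch below is my own simplification and doesn’t correspond to any specific protocol.

    # Toy illustration of gossip-style replication: each peer keeps a set of
    # posts keyed by ID, and syncing two peers is just merging their sets.
    # This is a simplification for illustration, not any real protocol.

    def gossip_sync(peer_a: dict, peer_b: dict) -> None:
        # Exchange whatever the other side is missing; order doesn't matter,
        # and any pair of peers can sync without a central server.
        for post_id, post in peer_a.items():
            peer_b.setdefault(post_id, post)
        for post_id, post in peer_b.items():
            peer_a.setdefault(post_id, post)

    alice = {"a1": "hello from alice"}
    bob = {"b1": "hello from bob"}
    carol = {}

    gossip_sync(alice, bob)   # alice and bob now have both posts
    gossip_sync(bob, carol)   # carol catches up via bob, never talking to alice
    print(carol)              # carol now holds both alice's and bob's posts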

Developer and writer Jay Graber compared a few of the most well-known decentralized social network protocols, explaining the pros and cons of each and how they operate. Protocols that put users in full control over their data and identity can be too technically challenging for the average user. Protocols that rely on append-only logs, such as Secure Scuttlebutt, make it impossible to edit or delete posts. Federated networks can carry many of the same user-friendly features as centralized networks, but still leave the server administrators hosting the network with familiar challenges, such as content moderation and platform security. So while protocols can be more neutral than platforms, they still contain biases.
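
To see why an append-only log rules out editing, here is a deliberately simplified sketch (my own illustration, not Secure Scuttlebutt’s actual message format): each entry commits to a hash of the entry before it, so rewriting an old post invalidates every entry that follows. Real Scuttlebutt feeds also sign each entry, which is omitted here.

    # Simplified hash-chained, append-only log. Editing an early entry breaks
    # verification of the whole chain, which is why such logs have no "edit post".
    import hashlib
    import json

    def entry_hash(entry: dict) -> str:
        # Hash a canonical JSON encoding of the entry.
        return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

    def append(log: list, text: str) -> None:
        # Each new entry records the hash of the previous one.
        prev = entry_hash(log[-1]) if log else None
        log.append({"prev": prev, "text": text})

    def verify(log: list) -> bool:
        # The chain is valid only if every entry points at its predecessor's hash.
        return all(log[i]["prev"] == entry_hash(log[i - 1]) for i in range(1, len(log)))

    feed = []
    append(feed, "first post")
    append(feed, "second post")
    print(verify(feed))          # True: the chain is intact

    feed[0]["text"] = "edited"   # attempt to "edit" an old post
    print(verify(feed))          # False: later entries now point at a stale hash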

This Is a Human Problem, Not a Technical One

If we’re talking about interoperability, we’re talking about public Internet infrastructure. Open protocols and standards are part of a digital commons, and a commons thrives when people use and maintain it together.

As economist Elinor Ostrom established in her Nobel Prize-winning work, commoning is a social practice. What Ostrom asserted was that the success or failure of a commons, or what she called a “common-pool resource”, rested on “how a group of principals who are in an interdependent situation can organize and govern themselves to obtain continuing joint benefits when all face temptations to free-ride, shirk or otherwise act opportunistically.” For any type of commons, it’s the relationships between those governing and relying on shared resources together that are central to its success.

Standards bodies were in many ways an implementation of Ostrom’s eight principles for a commons — what she found were the basic elements needed for a commons to be governed sustainably and equitably. Thus Dorsey’s call to build an “open community” around a new social media protocol is encouraging. It suggests the need to build organizations that keep various stakeholders engaged in an open dialogue about how we make social networks open and interoperable. This is the promise of a functional standards body. When they’re robust and effective, standards bodies can play a critical role in ensuring that the Internet remains free, open, and equitable.

Overcoming the Challenges

It’s exciting to see this resurgence of energy for greater interoperability. But I’m not going to let myself get too hopeful. Anyone involved in this project needs to prove that it will be done right. Whether it comes about through state-driven antitrust enforcement, tech companies’ own initiative, or from the bottom up via a new decentralized protocol, the manner in which protocols are created will be critical.

There are foreseeable issues with all three paths towards interoperability. Even if the government were to regulate companies, the state is feeble in the face of overpowering influence from corporate lobbyists. The revolving door between private industry and public oversight bodies nearly guarantees a compromised process. Companies can’t be trusted either. Independent developers have been burned again and again by companies that change or withdraw their public APIs without warning — not to mention the hundreds of other ways they’ve violated public trust. Finally, nearly all of the decentralized web protocols were built by lone geniuses who collectively represent one demographic. If new protocols are to address the needs of diverse online communities, more types of people will need to be involved in their development.

Internet interoperability cannot be a project embarked on for the sake of profit, power, or someone’s ego. Who is or is not in the room when critical decisions are made about a protocol will make or break its chances of bringing about a more interoperable internet. We ought to learn from experts who know what it takes to govern shared resources together. If we’re serious about re-building the Internet as public infrastructure, we need to be prudent enough to assemble the kinds of organizations that can steward the Internet protocols of the future.

Companies: facebook, google


Comments on “Defeating Tech Giants With Open Protocols, Interoperability, And Shared Stewardship”

christenson says:

Bias...a requirement of intelligence

I’d like to point out that bias, of some sort, is a necessity for intelligence. I mean this in the technical sense of the word.

All intelligent systems have to choose what is important and what is not, what to emphasize and what to fudge, and fundamentally, it is bias, for better or for worse.

Techdirt, for example, is undoubtedly biased against our least-loved posters. And it just doesn’t cover some subjects well or at all. Local news here?? forget it, boring as far as this site is concerned. That’s bias.

Anonymous Coward says:

None of those things will protect the identity of people. Numerous stories lately on td related how people were getting bullied unfairly, by, for example, the extremist democrat regime of NYC, or other parties like that.

If innately there were anonymity or at least pseudonymity, and a defense of identity through systems (there are many devised, none used….), then there might be some use to the other things mentioned in the article. Without them, the pall of speech-destroying censor-democrat-commie mobs will hang over the ‘Net, like COVID-19 haunts the very thought of the city of Wuhan.
