Summary: Formed as a more decentralized alternative to Twitter that allowed users to more directly moderate the content they wanted to see, Mastodon has experienced slow, but steady, growth since its inception in 2016.
Unlike other social media networks, Mastodon is built on open-source software and each "instance" (server node) of the network is operated by users. These separate "instances" can be connected with others via Mastodon's interlinked "fediverse." Or they can remain independent, creating a completely siloed version of Mastodon that has no connection with the service's larger "fediverse."
This puts a lot of power in the hands of the individuals who operate each instance: they can set their own rules, moderate content directly, and keep anything the "instance" and its users find undesirable off their servers. But the larger "fediverse" -- with its combined user base -- poses moderation problems that can't be handled as easily as those on independent "instances." Because the connected "fediverse" lets instances interact with each other, unwanted content can appear on servers that are trying to steer clear of it.
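The instance-level control described above can be sketched in a few lines of code. This is purely an illustration of the concept -- the class, method names, and domains below are hypothetical and do not reflect Mastodon's actual implementation or API:

```python
# Hypothetical sketch of instance-level "moderation-by-exclusion."
# Names, structure, and domains are illustrative, not Mastodon's real code.
from urllib.parse import urlparse

class Instance:
    def __init__(self, domain):
        self.domain = domain
        self.blocked_domains = set()  # instances this server refuses to federate with
        self.timeline = []

    def block(self, domain):
        """Defederate: drop all future content from the given instance."""
        self.blocked_domains.add(domain)

    def receive(self, post_url, content):
        """Accept a federated post unless its origin instance is blocked."""
        origin = urlparse(post_url).netloc
        if origin in self.blocked_domains:
            return False  # silently drop content from defederated servers
        self.timeline.append((origin, content))
        return True

server = Instance("example.social")
server.block("unwanted.example")  # hypothetical domain to defederate from

server.receive("https://friendly.example/posts/1", "hello fediverse")
server.receive("https://unwanted.example/posts/2", "unwanted content")
print([origin for origin, _ in server.timeline])
```

Each server applies its own blocklist, so the same post can be visible on one instance and invisible on its neighbor -- which is exactly the tension between instance autonomy and network-wide moderation described above.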
That's where Gab -- another Twitter alternative -- enters the picture. Gab has purposely courted users banned from other social media services. Consequently, the platform has developed a reputation for being a haven for hate speech, racists, and bigots of all varieties. This toxic collection of content/users led to both Apple and Google banning Gab's app from their app stores.
Faced with this app ban, Gab began looking for options and decided to create its own Mastodon instance. With its server now technically available to everyone in the Mastodon "fediverse," any instance not explicitly blocking Gab's "instance" could see Gab content surface for its users -- and Gab's users could direct content at those users in turn. Running on Mastodon also allowed Gab to use the many existing Mastodon apps, sidestepping the app bans handed down by Google and Apple.
Decisions to be made by Mastodon:
Should Gab (and its users) be banned from setting up "instances," given that they likely violate the Mastodon Server Covenant?
Is it possible to moderate content across a large number of independent nodes?
Is this even an issue for Mastodon itself to deal with, given that the individuals running different servers can decide for themselves whether or not to allow federation with the Gab instance?
Given the open source and federated nature of Mastodon, would there reasonably be any way to stop Gab from using Mastodon?
Questions and policy implications to consider:
Will moderation efforts targeting the "fediverse" undercut the independence granted to "instance" owners?
Do attempts to attract more users create moderation friction when those newly arriving users post exactly the kind of content Mastodon was created to avoid?
If Mastodon continues to scale, will it always face challenges as certain instances are created to appeal to audiences that the rest of the “fediverse” is trying to avoid?
Can a federated system, in which unique instances choose not to federate with another instance, such as Gab, work as a form of “moderation-by-exclusion”?
Resolution: Mastodon's founder, Eugen Rochko, refused to create a blanket ban on Gab, leaving it up to individual "instances" to decide whether or not to interact with the interlopers. As he explained to The Verge, a blanket ban would be almost impossible, given the decentralized nature of the service.
On the other hand, most "fediverse" members would be unlikely to have to deal with Gab or its users, considering the content contained in Gab's "instance" routinely violates the Mastodon "covenant." Violating these rules prevents instances from being listed by Mastodon itself, lowering the chances of other "instance" owners inadvertently adding toxic content and users to their server nodes. And Rochko himself encouraged users to preemptively block Gab's "instance," resulting in even fewer users being affected by Gab's attempted invasion of the Mastodon fediverse.
But running a decentralized system creates an entirely new set of moderation issues, which has turned Mastodon itself into a moderation target. Roughly a year after the Gab "invasion," Google threatened to pull Mastodon-based apps from its store for promoting hate speech, after users tried to get around the Play Store ban by creating apps that pointed to Mastodon "instances" filled with hateful content. Google ultimately decided to leave Mastodon-based apps up, but appears ready to pull the trigger on a ban in the future.
I've been talking a lot lately about the unfortunate shift of the web from being more decentralized to being about a few giant silos and I expect to have plenty more to say on the topic in the near future. But I'm thinking about this again after Andy Baio reminded me that this past weekend was five years since Google turned off Google Reader. Though, as he notes, Google's own awful decision making created the diminished use that allowed Google to justify shutting it down. Here's Andy's tweeted thread, and then I'll tie it back to my thinking on the silo'd state of the web today:
Google Reader shut down five years ago today, and I’m still kind of pissed about it.
Google ostensibly killed Reader because of declining usage, but it was a self-inflicted wound. A 2011 redesign removed all its social features, replaced with Google+ integration, destroying an amazing community in the process.
The audience for Google Reader would never be as large or as active as modern social networks, but it was a critical and useful tool for independent writers and journalists, and for the dedicated readers who subscribed to their work.
There are great feedreaders out there — I use Feedly myself, but people love Newsblur, Feedbin, Inoreader, The Old Reader, etc. But Google Reader was a *community* and not easily replaced. Google fragmented an entire ecosystem, for no good reason, and it never recovered.
Many people have pointed to the death of Google Reader as the point at which news reading online shifted from open tools like RSS feeds to proprietary platforms like Facebook and Twitter. It might seem odd (or ironic) to bemoan a product-killing move by a company now considered one of the major silos, but it does seem to mark a fundamental shift in the way Google viewed the open web. A quick Google search (yeah, yeah, I know...) is not helping me find the quote, but I pretty clearly remember, in the early days of Google, either Larry Page or Sergey Brin saying something to the effect that the most important thing for Google was to get you off its site as quickly as possible. The whole point of Google was to take you somewhere else on the amazing web. Update: It has been pointed out to me that the quote in question is most likely part of Larry Page's interview with Playboy, in which he responded to the fact that, in the early days, all of Google's competitors were "portals" that tried to keep you in:
We built a business on the opposite message. We want you to come to Google and quickly find what you want. Then we’re happy to send you to the other sites. In fact, that’s the point. The portal strategy tries to own all of the information.
Somewhere along the way, that changed. Much of the change seems to have been an overreaction by Google leadership to the "threat" of Facebook. So many of Google's efforts from the late 2000s until now seem to have been designed to ward off Facebook. This includes not just Google's multiple (often weird) attempts at building a social network, but also its infatuation with getting users to sign in just to use its core search engine. Over the past decade or so, Google went very strongly from a company trying to get you off its site quickly to one that tries to keep you in. And it feels like the death of Reader was a clear indication of that shift. Reader started in the good old days, when the whole point of an RSS reader was to help you keep track of new stuff all over the web, on individual sites.
But, as Andy noted above, part of what killed Reader was Google attempting desperately to use it as a tool to boost Google+, the exact opposite of what Google Reader stood for in helping people go elsewhere. I don't think Google Reader alone would have kept RSS or the open web more thriving than it is today, but it certainly does feel like a landmark shift in the way Google itself viewed its mission: away from helping you get somewhere else, and much more towards keeping you connected to Google's big data machine.
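One reason Reader's death stung is that RSS itself is an open, simple format that outlived it: any program can pull a site's updates without going through a platform. A minimal sketch of what a reader does under the hood, using only Python's standard library (the feed below is a made-up example; a real reader would fetch the XML over HTTP):

```python
# Minimal RSS 2.0 parsing sketch using only the standard library.
# The feed string is an invented example; a real reader fetches XML over HTTP.
import xml.etree.ElementTree as ET

FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>https://example.com/1</link></item>
    <item><title>Second post</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""

def parse_items(feed_xml):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in parse_items(FEED):
    print(f"{title}: {link}")
```

The protocol's openness is the point: no central service decides which feeds you may subscribe to, which is precisely the decentralized property the rest of this piece mourns losing.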
Last week, I came across two separate speeches that were given recently about the future of the internet -- both with very different takes and points, but both that really struck a chord with me. And the two seem to fit together nicely, so I'm combining both of them into one post. The first speech is Jennifer Granick's recent keynote at the Black Hat conference in Las Vegas. You can see the video here or read a modified version of the speech entitled, "The End of the Internet Dream."
It goes through a lot of important history -- some of which is probably already familiar to many of you. But it's also important to remember how we got to where we are today in order to understand the risks and threats to the future of the internet. The key point Granick makes is that, for too long, we've prioritized other values over openness, leaving us with a less open, more centralized internet. And that's a real risk:
For better or for worse, we’ve prioritized things like security, online civility, user interface, and intellectual property interests above freedom and openness. The Internet is less open and more centralized. It’s more regulated. And increasingly it’s less global, and more divided. These trends: centralization, regulation, and globalization are accelerating. And they will define the future of our communications network, unless something dramatic changes.
Twenty years from now,
You won’t necessarily know anything about the decisions that affect your rights, like whether you get a loan, a job, or if a car runs over you. Things will get decided by data-crunching computer algorithms and no human will really be able to understand why.
The Internet will become a lot more like TV and a lot less like the global conversation we envisioned 20 years ago.
Rather than being overturned, existing power structures will be reinforced and replicated, and this will be particularly true for security.
Internet technology design increasingly facilitates rather than defeats censorship and control.
Later in the speech, she digs deeper into those key trends of centralization, regulation and globalization:
Centralization means a cheap and easy point for control and surveillance.
Regulation means exercise of government power in favor of domestic, national interests and private entities with economic influence over lawmakers.
Globalization means more governments are getting into the Internet regulation mix. They want to both protect and to regulate their citizens. And remember, the next billion Internet users are going to come from countries without a First Amendment, without a Bill of Rights, maybe even without due process or the rule of law. So these limitations won’t necessarily be informed by what we in the U.S. consider basic civil liberties.
This centralization is often done in the name of convenience -- because centralized systems currently offer up plenty of cool things:
Remember blogs? Who here still keeps a blog regularly? I had a blog, but now I post updates on Facebook. A lot of people here at Black Hat host their own email servers, but almost everyone else I know uses gmail. We like the spam filtering and the malware detection. When I had an iPhone, I didn’t jailbreak it. I trusted the security of the vetted apps in the Apple store. When I download apps, I click yes on the permissions. I love it when my phone knows I’m at the store and reminds me to buy milk.
This is happening in no small part because we want lots of cool products “in the cloud.” But the cloud isn’t an amorphous collection of billions of water droplets. The cloud is actually a finite and knowable number of large companies with access to or control over large pieces of the Internet. It’s Level 3 for fiber optic cables, Amazon for servers, Akamai for CDN, Facebook for their ad network, Google for Android and the search engine. It’s more of an oligopoly than a cloud. And, intentionally or otherwise, these products are now choke points for control, surveillance and regulation.
So as things keep going in this direction, what does it mean for privacy, security and freedom of expression? What will be left of the Dream of Internet Freedom?
She goes on to note how this centralization comes with a very real cost: mainly, that it creates one-stop shopping for government surveillance.
Globalization gives the U.S. a way to spy on Americans…by spying on foreigners we talk to. Our government uses the fact that the network is global against us. The NSA conducts massive spying overseas, and Americans’ data gets caught in the net. And, by insisting that foreigners have no Fourth Amendment privacy rights, it’s easy to reach the conclusion that you don’t have such rights either, at least when you’re talking to or even about foreigners.
Surveillance couldn’t get much worse, but in the next 20 years, it actually will. Now we have networked devices, the so-called Internet of Things, that will keep track of our home heating, and how much food we take out of our refrigerator, and our exercise, sleep, heartbeat, and more. These things are taking our off-line physical lives and making them digital and networked, in other words, surveillable.
At the end of her speech, Granick talks about the need to "build in decentralization where possible," to increase strong end-to-end encryption, to push back on government attempts to censor and spy.
And that's where the second speech comes in. It's by the Internet Archive's Brewster Kahle. And while he actually gave versions (one longer one and one shorter one) earlier this year, he just recently wrote a blog post about why we need to "lock the internet open" by building a much more distributed web -- which would counteract many of Granick's quite accurate fears about our growing reliance on centralized systems.
Kahle also notes how wonderful new services are online and how much fun the web is -- but worries about the survivability of a centralized system and its privacy implications. He notes how the original vision of the internet was of a truly distributed system, and that the web (which is a subsegment of the internet, for those of you who think they're the same) seems to be moving away from that vision.
Contrast the current Web to the Internet—the network of pipes on top of which the World Wide Web sits. The Internet was designed so that if any one piece goes out, it will still function. If some of the routers that sort and transmit packets are knocked out, then the system is designed to automatically reroute the packets through the working parts of the system. While it is possible to knock out so much that you create a chokepoint in the Internet fabric, for most circumstances it is designed to survive hardware faults and slowdowns. Therefore, the Internet can be described as a “distributed system” because it routes around problems and automatically rebalances loads.
The Web is not distributed in this way. While different websites are located all over the world, in most cases, any particular website has only one physical location. Therefore, if the hardware in that particular location is down then no one can see that website. In this way, the Web is centralized: if someone controls the hardware of a website or the communication line to a website, then they control all the uses of that website.
In this way, the Internet is a truly distributed system, while the Web is not.
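The rerouting property Kahle describes can be shown with a toy model: treat the network as a graph of routers and search for any surviving path when some of them fail. This is an illustrative sketch, not how real internet routing protocols (BGP, OSPF) work:

```python
# Toy illustration of why a distributed network "routes around problems":
# if one router fails, traffic can still reach its destination via another path.
from collections import deque

def find_route(links, src, dst, down=frozenset()):
    """Breadth-first search for any path from src to dst, skipping failed nodes."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# A small mesh with two independent paths from A to D.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

print(find_route(links, "A", "D"))                   # a working path exists
print(find_route(links, "A", "D", down={"B"}))       # B fails: traffic reroutes via C
print(find_route(links, "A", "D", down={"B", "C"}))  # both fail: a chokepoint, None
```

The last call shows Kahle's caveat in miniature: redundancy only protects you until enough nodes fail that a chokepoint emerges -- which is exactly the risk of letting the web consolidate onto a handful of hosts.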
And, thus, he wants to build a more distributed web, built on peer-to-peer technology that has better privacy, distributed authentication systems (without centralized usernames and passwords), a built-in versioning/memory system and easy payment mechanisms. As he notes, many of the pieces for this are already in existence, including tools like BitTorrent and the blockchain/Bitcoin. There's a lot more in there as well, and you should read the whole thing.
Our new Web would be reliable because it would be hosted in many places, and multiple versions. Also, people could even make money, so there could be extra incentive to publish in the Distributed Web.
It would be more private because it would be more difficult to monitor who is reading a particular website. Using cryptography for the identity system makes it less related to personal identity, so there is an ability to walk away without being personally targeted.
And it could be as fun as it is malleable and extendable. With no central entities to regulate the evolution of the Distributed Web, the possibilities are much broader.
Fortunately, the needed technologies are now available in JavaScript, Bitcoin, IPFS/Bittorrent, Namecoin, and others. We do not need to wait for Apple, Microsoft or Google to allow us to build this.
What we need to do now is bring together technologists, visionaries, and philanthropists to build such a system that has no central points of control. Building this as a truly open project could in itself be done in a distributed way, allowing many people and many projects to participate toward a shared goal of a Distributed Web.
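Two of the ingredients on Kahle's list -- content addressing and built-in versioning -- are simple enough to sketch. In a content-addressed system, a document's address is the hash of its bytes, so any host holding a copy can serve it and the bytes verify themselves. The sketch below is an illustration of the idea only, not how IPFS, BitTorrent, or any real system is implemented:

```python
# Sketch of content addressing plus built-in versioning for a "Distributed Web."
# Illustrative only: real systems (IPFS, etc.) are far more involved.
import hashlib

store = {}     # content hash -> bytes (could be replicated across many hosts)
versions = {}  # human-readable name -> list of content hashes, oldest first

def publish(name, content: bytes):
    """Store content under its own hash and append it to the name's history."""
    addr = hashlib.sha256(content).hexdigest()
    store[addr] = content
    versions.setdefault(name, []).append(addr)
    return addr

def fetch(addr):
    """Any host holding this hash can serve it; the bytes verify themselves."""
    content = store[addr]
    assert hashlib.sha256(content).hexdigest() == addr  # tamper-evident
    return content

v1 = publish("my-site", b"hello, distributed web")
v2 = publish("my-site", b"hello again, distributed web")

print(fetch(versions["my-site"][-1]))  # latest version
print(fetch(versions["my-site"][0]))   # the older version is still addressable
```

Because old versions keep their addresses forever, the "memory" Kahle wants is a side effect of the addressing scheme rather than a separate archival service.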
Of course, Kahle is hardly the first to suggest this. Nearly five years ago we were writing about some attempts at a more distributed web, and how we were starting to see elements of it showing up in places the old guard wouldn't realize. Post-Snowden, the idea of a more distributed web got a big boost, with a bunch of other people jumping in as well.
It's not there yet (by any stretch of the imagination), but a lot of people have been working on different pieces of it, and some of them are going to start to catch on. It may take some time, but the power of a more decentralized system is only going to become more and more apparent over time.
If you've been following the whole net neutrality fight for a while, the following graphic may be familiar to you -- showing what a potential "cable-ized" world the internet would become without strong protections for net neutrality:
At some point, someone created a similar version that was specific to AT&T:
A little while ago, however, someone took the joke even further, and set up a website for a fake broadband provider, asking people to Join the Fastlane!, and it was pretty dead on in terms of what such a site might look like:
I particularly like this bit:
It's now come out that this campaign (along with some associated billboards) has been put together by BitTorrent Inc., not all that different than the company's billboard campaign against the NSA. Along with this, BitTorrent has put out a blog post explaining, in part, how we got here, but more importantly how we need to start thinking about a better way to handle internet traffic to avoid the kind of future described above.
The key issue: building a more decentralized internet:
Many smart researchers are already thinking about this problem. Broadly speaking, this re-imagined Internet is often called Content Centric Networking. The closest working example we have to a Content Centric Network today is BitTorrent. What if heavy bandwidth users, say, Netflix, for example, worked more like BitTorrent?
If they did, each stream — each piece of content — would have a unique address, and would be streamed peer-to-peer. That means that Netflix traffic would no longer be coming from one or two places that are easy to block. Instead, it would be coming from everywhere, all at once; from addresses that were not easily identified as Netflix addresses — from addresses all across the Internet.
To the ISP, they are simply zeroes and ones.
All equal.
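The core idea in the quoted passage can be sketched concretely: identify a stream by the hashes of its chunks rather than by its source, so any peer can serve any chunk and the network sees only opaque, self-verifying data. This is a toy illustration of the concept, not the actual BitTorrent or CCN protocol:

```python
# Toy sketch of content-centric delivery: a stream is identified by the hashes
# of its chunks, not by where it comes from, so any peer can serve any chunk.
# Purely illustrative; real peer-to-peer protocols are far more involved.
import hashlib
import random

def chunk_addresses(data: bytes, size=8):
    """Split data into chunks and address each chunk by its SHA-256 hash."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return [(hashlib.sha256(c).hexdigest(), c) for c in chunks]

movie = b"pretend this is a large video stream"
addressed = chunk_addresses(movie)

# Three unrelated peers each happen to hold copies of the chunks.
peers = [dict(addressed) for _ in range(3)]

def fetch(addr):
    """Ask a random peer; to the ISP it's just a hash, not identifiable traffic."""
    chunk = random.choice(peers)[addr]
    assert hashlib.sha256(chunk).hexdigest() == addr  # verify integrity
    return chunk

rebuilt = b"".join(fetch(addr) for addr, _ in addressed)
assert rebuilt == movie
print("stream reassembled from", len(addressed), "chunks across multiple peers")
```

Since every chunk arrives from a different, interchangeable address and verifies against its own hash, there is no single origin for an ISP to identify and throttle -- which is the point made just below.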
There's obviously a lot more to this, but it's good to see more and more people realizing that one of the fundamental problems that got us here is the fact that so much of the internet has become centralized -- and, as such, can be easily targeted for discrimination. Making the internet much more decentralized is a big step in making it so that discrimination and breaking net neutrality aren't even on the table.
This is hardly surprising, but it appears that in the wake of the feds taking down the "dark marketplace" Silk Road and arresting its alleged creator Ross Ulbricht, replacement marketplaces quickly sprang up to take its place, exactly as we predicted. A few of those markets have come and gone (usually amid scandals), but the one that has stuck around is "Silk Road 2.0" -- and it's actually now larger than Silk Road ever was, in terms of the number of products on offer. The article linked above, from Coindesk, notes that, somewhat ironically, the reason Silk Road 2.0 stands out above the others is that it has worked hard to establish trust.
This effect was likely boosted by sensible policies at Silk Road. Most significantly, soon after February’s hack, the site’s operators announced that they would pay back bitcoins lost by customers.
Silk Road’s moderator Defcon said at the time: “We are committed to getting everyone repaid even if it takes a year.”
In anonymous drugs marketplaces, as in any market, confidence is key, it seems.
That's not to say Silk Road 2.0 is going to stick around -- there are plenty of reasons to think it won't. But, in some ways, you wonder if this is a kind of Napster moment all over again. After the original got shut down, a series of replacements all came about vying to take its place, leading to some interesting innovations -- even if those who wanted to shut down the original decried how awful and illegal each new version was.
Last week, over on our Step 2 discussion platform, we kicked off a discussion on what an "innovation agenda" might look like for a US politician in 2012. What kinds of regulatory changes should they be focused on? This effort, done in partnership with Engine Advocacy, has already kicked off a nice discussion over there, with some interesting ideas being tossed around. If you haven't yet, please join in the discussion. I'm not surprised that copyright issues and open internet issues top the list of things most interesting to folks -- the SOPA/PIPA debate has pretty much guaranteed that. I am a little surprised that issues around helping skilled entrepreneurs -- the folks who create jobs -- were seen as less important than some of the others on the list. Either way, the discussion is still going on there, and we'll be taking it further over the coming weeks and months, so feel free to join in.
It's not like this wasn't easily predictable, but as the entertainment industry has "succeeded" in taking down Megaupload and continues to move against The Pirate Bay and others, anyone who's followed this space had to have known that file sharing would just move one step further underground. We've seen the same thing after every single "victory" against file sharing since Napster was shut down. Each time, it moves to a system slightly more underground and more distributed. The early ones were still easy to take down but as they get further underground, it just becomes worse for the industry (and makes it that much harder to win back those users). The latest news is that there's been massive uptake of a growing number of anonymous, decentralized file-sharing tools. As is pretty typical in these "shift" periods, it's still not clear which systems will "win" out over the others, but the leaders are starting to emerge. The Torrentfreak article above mentions players like Tribler and RetroShare. People in our comments have been discussing both, as well as Ares Galaxy. Who knows if any of these apps are actually any good, but it seems pretty clear that people are continuing to file share -- they're just finding ways to do so that are even harder to track down and stop. How long until the legacy entertainment industry starts publishing articles about these evil anonymous, decentralized file sharing systems and demanding new laws against them?
In the last few months it's become clear that it's no longer acceptable for politicians to "not get" the internet. The internet has become such a key part of our lives that anyone who is trying to regulate it without understanding it doesn't deserve to be in office. Of course, there are some politicians who really do want to do the right thing, and it's time to help them out. In association with Engine Advocacy, we're looking to do a little "crowdsourcing" around what an internet "Innovation Agenda" should look like for any politician in 2012. We're starting with this basic principle:
New businesses are the key to job creation and economic growth, and the Internet is one of the most fertile platforms for new businesses ever established.
We believe deeply in the value of decentralized, emergent, bottom-up innovation, and we want to shape public policies that will allow it to flourish.
From there, we have a list of twelve topics that we think are important -- but we want your input. So we've posted this same thing both here and over at our Step 2 discussion platform. Over at Step 2, we've also posted those initial twelve topics, with each one as a separate comment on the original post, so you can vote them up and down. If you want to really participate, please head on over to Step 2, where you can do three separate things (and, yes, your Techdirt login works there too):
Suggest your own topics that should be part of an innovation agenda by responding to the main post.
Vote on existing topics to show which ones are more important... and which ones are less important.
Comment on the existing topics to provide feedback or suggest ways to improve them.
Please help us shape a comprehensive Innovation Agenda for 2012. Engine Advocacy is working closely with the internet community and helping give them a voice in DC, and this is one way to take part, as your suggestions may help shape what politicians are hearing.
We've discussed in the past how the whole Wikileaks response from governments has only helped to expose areas of internet infrastructure that should be decentralized and distributed, but are not. Of course, much of that is now being cleared up. For example, there was plenty of talk -- what with the US government seizing domains and all -- about setting up a distributed web system that bypasses a centralized server (and potential censorship choke point), such that it can't easily be filtered. It appears that this may already be happening and, as was just announced, it's being undertaken by the W3C. That ought to add plenty of legitimacy to the concept, which many anti-Wikileaks folks have insisted was merely a geek pipe dream.
We already posted Glyn Moody's response to Jaron Lanier's critique of Wikileaks, but I also wanted to point to and discuss an excellent rebuttal/debunking of Lanier's piece by professor Zeynep Tufekci, who notes that, contrary to Lanier's claims, Wikileaks hasn't exposed "the hazards of nerd supremacy," but rather the "dissent tax." The dissent tax is a great way to summarize the point I've been trying to make about how Wikileaks has really exposed corporate intermediaries who are too centralized. In Tufekci's explanation, the "cost" of avoiding those intermediaries is the dissent tax:
What the Wikileaks furor shows us is that a dissent tax is emerging on the Internet. As a dissident content provider, you might have to fight your DNS provider. You might need to fund large-scale hosting resources while others can use similar capacity on commercial servers for a few hundred dollars a year. Fund-raising infrastructure that is open to pretty much everyone else, including the KKK, may not be available. This does not mean that Wikileaks cannot get hosted, as it is already well-known and big, but what about smaller, less-famous, less established, less well-off efforts? Will they even get off the ground?
These developments should alarm every concerned citizen, even those who are thoroughly disgusted by Wikileaks. This is the issue that the Wikileaks furor has exposed, not nerd ideology. This is the story and likely will be more important than the release of diplomatic cables (which were already available to millions of people) through major newspapers after scrutiny by journalists. This question will stay with us even if Wikileaks dissolves, and Julian Assange is never heard from again.
This does such a nice job of summarizing the point I'd been trying (and probably failing) to make over the past few weeks that it's worth reading again. Of course, the real question is what happens next. And what we're seeing is that the response is for a lot of smart people to start looking at all these chokepoints that have created that dissent tax, and look for ways to route around them, and build more distributed, more censor-proof infrastructure pieces, such that any such dissent taxes in the future will be minimized.