Mastodon, the rise of the Fediverse

Mastodon, the Fediverse: do you already feel lost just reading those two words, which aren’t quite as new as you’d think? ... Flashback

The origins

We need to go back to 2008 and the creation of the social network identi.ca to understand where the Fediverse comes from. The idea of its founder, Evan Prodromou, was to federate (i.e. interconnect) servers used for web publishing purposes such as social networking, microblogging, blogging, or websites. While independently hosted (i.e. not centralised by one entity), each application could communicate with the others through a standardised protocol, a common language: OStatus, in this case.

Thereafter, a W3C Community Group was opened to maintain and further develop the OStatus standard, which was itself later eclipsed by the W3C Social Web Working Group and its ActivityPub standard. ActivityPub provides developers with a client-to-server API (“application programming interface”) for creating, updating and deleting content, as well as a federated (i.e. interconnected) server-to-server API for delivering notifications and content.
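
To make this a little more concrete, here is a minimal sketch (in Python, with made-up domain names and IDs) of what the server-to-server side looks like: one server wraps a post in a “Create” activity and delivers it by POSTing it to the recipient’s inbox. Real implementations also sign these requests; that part is omitted here for brevity.

```python
import json
import requests  # third-party HTTP library

# A minimal "Create" activity wrapping a short post (a "Note").
# Domains and IDs are hypothetical; real servers also attach an
# HTTP Signature header, which is omitted in this sketch.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://example.social/activities/1",
    "type": "Create",
    "actor": "https://example.social/users/alice",
    "to": ["https://other.example/users/bob"],
    "object": {
        "id": "https://example.social/notes/1",
        "type": "Note",
        "attributedTo": "https://example.social/users/alice",
        "content": "Hello, Fediverse!",
    },
}

# Server-to-server delivery: POST the activity to the recipient's inbox.
requests.post(
    "https://other.example/users/bob/inbox",
    data=json.dumps(activity),
    headers={"Content-Type": "application/activity+json"},
)
```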

In short, it gives developers the ability to share live content between different servers over a federated network, even when those servers are not owned by the same people and do not share the same database. The best way to picture it is the way email works. If you have a Gmail address and your friend has a Microsoft one, you can still email each other because the email system is decentralised. But if you have a Twitter account and your friend has a Facebook account, you can’t message them, or comment on, follow, like, or share their posts from within Twitter. You must create a Facebook account to do so, because both platforms are centralised.

The Fediverse is like email, but for everything.

Mastodon, the flying pachyderm

Let’s get back to Mastodon itself. Born in 2016 and developed by Eugen Rochko, Mastodon is free and open source, meaning that anyone can read the code and participate in its evolution. The easiest way to describe it would be to call it Twitter without a central company owning and running the show.
Today, Mastodon is the largest federated social network, with approximately 5 million accounts and 1.5 million daily active users spread across well over 4,000 “instances” (in the Fediverse, individual websites are called instances), according to Fediverse.party. Examples of such instances are mastodon.social (the original one), mastodon.brussels, mastodon-belgium.be, and so on. These instances are managed by individuals called administrators, helped by moderators. Each instance runs on its own physical server(s).

If you can run and administer a web server, you can create your own Mastodon instance.

As stated previously, a good analogy to Mastodon is the email standard and the way it works.

There is no single entity that runs “email”. The email standard is a system by which different email servers communicate with each other according to a shared protocol. Some providers, like Gmail or Yahoo, are very large and have millions of users, and many of us choose them because they’re easy to use. But anyone with technical knowledge can set up and run their own email server and participate in the email network. And while any given email server can block and refuse to communicate with any other email server, there is no central authority that can force anyone off the email system.

It is important to understand that Mastodon’s decentralisation stems from its architectural design rather than from policy enforcement. Just as an email server can decide not to accept email from a given domain, a Mastodon instance can choose not to communicate with certain other instances, as happened when Gab tried to join the network.
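
Conceptually, this kind of defederation is just a per-instance blocklist applied to incoming traffic. The sketch below (plain Python, with hypothetical domain names, not Mastodon’s actual code) illustrates the idea:

```python
from urllib.parse import urlparse

# Domains this instance has chosen not to federate with (hypothetical examples).
BLOCKED_DOMAINS = {"gab.example", "spam.example"}

def accept_activity(activity: dict) -> bool:
    """Decide whether to process an incoming activity, based on the sender's domain."""
    actor = activity.get("actor", "")
    domain = urlparse(actor).hostname or ""
    return domain not in BLOCKED_DOMAINS

# An activity from a blocked domain is simply dropped by this instance,
# while every other instance remains free to keep federating with it.
incoming = {"type": "Create", "actor": "https://gab.example/users/someone"}
print(accept_activity(incoming))  # False
```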

Eugen Rochko, the original Mastodon developer, explained clearly that he was unable to prevent Gab from joining the federation, but that it was up to each and every instance to decide what relationship it wanted to have with Gab. Within days, the major instances had cut ties with Gab, and the outcome was that each community settled into an equilibrium of whom it wanted to be associated with.

When a threat appears on a centralised platform, it is the role of that platform to mitigate it by banning it or limiting its reach. Facebook, Twitter, Reddit and many others have internal policies allowing them to moderate in this way. On Mastodon, there is no such authority. This is the difference between a platform that is decentralised as a matter of changeable policy and one, like Mastodon, that is decentralised as a matter of unchangeable architecture.

With centralised platforms, content moderation speaks with a single voice. The rules are defined by the central authority, and if you don’t like them, you have no option but to complain to the platform and try to get the rules changed, or to leave the platform altogether. Mastodon and the Fediverse in general are different: they offer a multitude of moderation policies coexisting within the same universe.

When users sign up on Mastodon, they are not only taking a username on the Mastodon network; they are also asked to pick an instance, which will have its own content moderation policy. If users are not satisfied with their instance’s moderation, the design of the system allows them to move to another instance with a different content moderation policy while keeping their social graph (followers, the accounts they follow, and other account data). This design allows users to sort themselves into instances that better reflect their values and preferences.
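
Under the hood, Mastodon announces such migrations with a “Move” activity, so that the servers of the user’s followers can switch to the new account. A rough, simplified sketch of that reaction, with hypothetical addresses (not Mastodon’s actual implementation, which also verifies that the new account lists the old one as an alias):

```python
# A simplified "Move" activity announcing an account migration.
move = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Move",
    "actor": "https://old.example/users/alice",
    "object": "https://old.example/users/alice",
    "target": "https://new.example/users/alice",
}

def handle_move(activity: dict, following: set) -> set:
    """If this server followed the old account, follow the new one instead."""
    old, new = activity["object"], activity["target"]
    if old in following:
        following = (following - {old}) | {new}
    return following

print(handle_move(move, {"https://old.example/users/alice"}))
# {'https://new.example/users/alice'}
```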

A network without algorithms

Today, most social networks use algorithms designed to keep the user within their network for as long as possible. This is usually measured as “retention time” (i.e. the time a user spends using the platform’s services) or daily active users (“DAU”). Examples include the YouTube “playing next” algorithm, which chains together video after video, or the Twitter and Facebook feeds, where posts are ordered not chronologically but by “interest”. These algorithms are monitored by CheckFirst for the CrossOver project.

By knowing you better and keeping you around longer, platforms can display more targeted ads to you, increasing potential clicks and, thus, their revenue.

Mastodon doesn’t use any recommendation algorithm; instead, it relies on raw statistics and human moderation to promote content on each instance. Because Mastodon’s business model is not based on users’ attention time or on data about them, the system focuses on empowering users. Instead of algorithms selecting, ordering and shadow-banning content on their behalf, users are given a set of tools that let them control the content they are exposed to.
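
As a toy illustration (plain Python, with made-up posts rather than anything from Mastodon’s codebase), a Mastodon-style home timeline is simply ordered by time and trimmed by the user’s own filters, rather than re-ranked by an engagement model:

```python
from datetime import datetime

# Hypothetical posts; in Mastodon the home timeline is chronological.
posts = [
    {"author": "alice", "text": "Good morning!", "time": datetime(2022, 11, 20, 8, 0)},
    {"author": "bob", "text": "Crypto giveaway!!!", "time": datetime(2022, 11, 20, 9, 0)},
    {"author": "carol", "text": "New blog post", "time": datetime(2022, 11, 20, 7, 30)},
]

# User-defined filters, not a platform-defined ranking model.
muted_keywords = {"crypto"}

def home_timeline(posts, muted_keywords):
    """Newest first, minus anything the user chose to filter out."""
    visible = [
        p for p in posts
        if not any(k in p["text"].lower() for k in muted_keywords)
    ]
    return sorted(visible, key=lambda p: p["time"], reverse=True)

for post in home_timeline(posts, muted_keywords):
    print(post["time"], post["author"], post["text"])
```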

We cannot solve our problems with the same thinking we used when we created them.

Albert Einstein

While nearly all platforms rely on their algorithms to protect users from the downsides of social networking, Mastodon decided to rely on the instances themselves. The clear benefits of such a decentralised system are diffused content moderation responsibilities, user empowerment, and disincentives for user conflict. However, this still leaves the question of illegal content, such as child sexual abuse material and terrorist content.

Instances have incentives to moderate and remove such content; however, it is also important to remember that decentralised networks are not above government legislation, nor are they the ultimate answer to content moderation. In the same way that governments can order the takedown of a website, they can also order the takedown of Mastodon instances.

Mastodon and the Digital Services Act (DSA)

As the DSA entered into force in November 2022, questions arise as to how this legislative effort may affect the network as a whole and its instances. As far as we understand it, the DSA does not provide clarity on the question of decentralised social media.

As described previously, Mastodon is divided into multiple instances, each with its own set of rules and guidelines regarding content. Based on the DSA’s categorisations, each instance would most probably be seen as an independent “online platform” on which a user hosts and publishes content that can reach a potentially unlimited number of users.

This interpretation would mean that each instance administrator would have to comply with the minimum obligations for intermediary and hosting services, including having a single point of contact and a legal representative, providing clear terms and conditions, publishing periodic transparency reports, having a notice-and-action mechanism, and communicating information about removals or restrictions to both notice providers and content providers.

Additionally, if an instance were to reach the threshold for the DSA’s “VLOP” (Very Large Online Platform) status of 45 million monthly users, a significant number of further obligations would apply. That instance would have to comply with risk assessments, independent audits, … An expensive business!

However, those obligations are unlikely to apply to today’s instances, as most of them run on a non-profit model with volunteer administration. It is therefore important for the European Commission to provide further clarification about these instances, and to do so quickly.

It is difficult today to envision what Mastodon’s future will look like, but decentralisation will probably continue to grow in the coming years. The DSA is designed to hold a single entity accountable through obligations, which becomes nearly impossible for decentralised networks where content moderation relies on the community.

Conclusion

The Fediverse and Mastodon raise a plethora of questions, yet regulators currently provide few answers for decentralised networks, even though their future may depend heavily on them.

Mastodon’s new answer to content moderation, without any central authority, will be a challenge, and it shows that Internet regulation needs to be designed with innovation and creativity in mind.

About us

Check First is a leading Finnish software and methodologies company, spearheading adversarial research techniques. We believe that everyone should be able to understand how and why content is presented to them. We advocate for online clarity and accountability, building solutions to attain this goal. Partnering with leading institutions, regulators, NGOs and educators, we aim at curbing the spread of disinformation and foreign influence manipulations.
