The Free Internet is an Illusion

Capitalism has pocketed much of the foundational infrastructure of the internet. Here's how we can start uprooting it.

The 21st century is unthinkable without the internet. We use it every day, sometimes without even knowing it. The “Smart Home” is restructuring the way we interact with our most intimate belongings, so that even fridges come with an internet connection today. And the internet, long ignored by governments, has attracted far more interest from political actors in recent years. The case of Facebook is instructive: first came the Cambridge Analytica data breach at the world’s biggest social network, and now governments all around the globe are questioning Mark Zuckerberg about his network’s wrongdoings, while the company runs large-scale advertising campaigns[1] trying to convince users of its good intentions.

Understandably, more and more people want to know what’s going on, and are beginning to question the fundamentals of how we communicate with each other. But the engagement of politicians with the internet has yielded some strange results, ranging from the truly ludicrous to the more ambiguous. In Germany, for instance, the government’s internet appointee Dorothee Bär promised flying taxis and claimed that fast fibre-optic internet for remote regions of Germany was not as important, making her the laughing stock of German politics for a week. Less ridiculous, but still worthy of criticism, was Jeremy Corbyn’s Alternative MacTaggart Lecture a few weeks ago, which proposed strengthening the British Broadcasting Corporation and founding a new digital agency to aid the BBC in the digital realm. In the aftermath of the speech, Irina Bolychevsky and James Moulding wrote in an article for New Socialist that we don’t need to nationalise the internet, but to “re-decentralise” it.

Both have a point — Corbyn is right in stating that we need more public control over the internet, while Bolychevsky and Moulding pinpoint the central problem of Corbyn’s claim with their title: No, we don’t need a “statebook”. But both accounts lack some precision when it comes to how we should deal with the internet.

A Dual Provenance of the Internet

Many modern misconceptions about the internet stem from its very beginnings. The internet has a dual heritage: one origin lies in academia, the other in a logic peculiar to capitalism. Both are at play in the current structure of how the internet works.

This is what makes it difficult to properly evaluate what is at stake with regard to the internet: it is at the same time both centralised and decentralised. While core services of the internet are certainly monopolised by corporations such as Facebook or Alphabet, a huge part of the internet is in fact still decentralised.

There is a story that the internet was invented by the U.S. Air Force in response to the growing possibility of a nuclear first strike by the Soviet Union, which turns out to be just that: a story.[2] In fact, the internet began with the idea of connecting the U.S. universities involved in research for the Department of Defense’s Advanced Research Projects Agency (ARPA), so that the different projects could communicate. In 1969 the plans for what was then called the “ARPANET” were realised with military funding. The first stable connection was subsequently established between UCLA and the Stanford Research Institute. Shortly thereafter, UC Santa Barbara and the University of Utah were also connected to the ARPANET.[3]

The core novelty of the ARPANET was its structure: a communication mesh without a centre, because every component could take over the role of routing the traffic sent over the network. It was also scalable and quick to deploy: a new node needed only two things, a connection to the rest of the network and a gateway.

But the ARPANET was still a long way from what the internet is today. There were no websites and no browsers to view them. It took another two decades before Tim Berners-Lee invented the Hypertext Markup Language (HTML) in 1990. With HTML, the creation of websites became possible, and the first website ever made is still accessible on the pages of the CERN research facility, where Berners-Lee worked at the time. In 1993, the World Wide Web was finally released to the public, and one year later Berners-Lee founded the World Wide Web Consortium (W3C) to ensure all websites would follow the same standards. In fact, pretty much the whole engine that modern corporate platforms run upon was provided by researchers.

Corporate Control Over the Internet

But the ARPANET was not only a means to communicate. It was also a business opportunity. After researchers effectively implemented the internet at their research institutes and universities, corporate giants took matters into their own hands: while all protocols were free and public, one thing could never be free: the infrastructure. The internet still needed physical connections — plain old cables — to work. And providing cable access to communication is something capitalism has successfully done for more than a century. During the high phase of colonialism, and after Carl Friedrich Gauß and Wilhelm Weber invented telegraphy in 1833, corporations were founded with the sole aim of installing huge undersea cables that could transfer messages from the colonised countries back to the centre of the British Empire in a fraction of the time previously required.

No horse and no ship could beat the near-light-speed of electrical signals travelling through cables. And here begins the rather centralised story of the global internet.[4] Installing an undersea cable from India to the UK took an incredible amount of money. Take the first cable between the UK and Newfoundland, whose tremendous setup costs required the founding of the Atlantic Telegraph Company for the sole purpose of financing the installation. Such companies, modelled on the then-popular republican form of state (an organisational form still used by corporations today[5]), were necessary because even the Crown was unable or unwilling to finance such endeavours. This particular cable had a projected cost of £800,000 which, in today’s terms, would be about £81 million.[6] To recoup the expenditure and return a profit to shareholders, these companies sold capacity on their cables for sending messages. In this way, they not only laid the foundation for global communication, they also became the cornerstone for press agencies such as Reuters, DPA, and Agence France-Presse.[7]

Nowadays, when a new cable is built, the same strategy is employed. Modern cable ventures are funded by corporations such as Facebook or Alphabet, which have a vital interest in a stable communication network across the world. The FASTER cable, for instance — price tag: $300 million, operational since 2016 — was built as a joint venture of Google and five telecommunications corporations.

While the method of funding has stayed the same, the technology has advanced: instead of simple copper cables, there are now fibre optics that carry signals as pulses of light. Yet the spirit of the past is still visible today. The routes used by modern internet cables are the same as those taken by European conquerors centuries earlier. It is unsurprising that the cable crossing the Atlantic Ocean from Spain to Florida is called “Columbus-III”,[8] and that the whole of South America is connected to other continents only via the United States.[9]

The ownership of the physical connections of the internet is therefore clearly centralised. In most countries, only one or two corporations compete for the entire end-user market,[10] while across continents only a few corporations have the means to install undersea cables. What is in fact decentralised is the ability of individuals and companies alike to create websites, offer services, and use the internet. But there is also a third category that is important to remember: standardisation. The protocols used to communicate over the physical grid are public, and they are maintained in a transparent process.[11] A recent piece for New Socialist by Jason Prado does a great job explaining why standardisation is so important when it comes to communication and the internet.

Layer by Layer

Capitalism inserted itself into the internet right from the beginning and was thereby able to gain control over it. Moreover, as all of us visit a handful of websites disproportionately often, we tend to forget that the internet consists of billions of websites, and that most of the world’s internet traffic consists of multimedia content that we don’t normally think of as “websites” at all — but the monopoly of Facebook and Alphabet over what we perceive to be the internet prevents us from seeing what the internet actually is.

The internet currently resembles a phenotype of what Guy Debord called the Spectacle:[12] what exists is represented to us by means of the spectacular. And as Facebook is “spectacular” (read: visible), all smaller networks remain in the shadows and are therefore absent from the public mind. Our understanding of what the internet is suffers from the same distortions that shape how we perceive other parts of the world. We describe human communities in coarse terms such as “nation” or “voters”; we see the world along state borders, not regional communities;[13] and swathes of the International Relations community are still convinced that states should be treated as black boxes. It has become a habit to treat the internet as a single “thing,” as a whole. Yet this couldn’t be farther from the truth.

Engineers and programmers, for instance, describe the internet not as one big entity, but along the lines of so-called “layers”, which divide the internet by functionality.[14] The model used is the Open Systems Interconnection (OSI) model, which describes the internet in terms of seven, sometimes four, layers.[15] The model is hierarchical: each layer requires the one below it to work. Without the first layer, none of the six layers above it could work. But without the seventh layer, all the layers below it would still work.

It is important to understand these layers because almost all modern-day communication uses these basic building blocks of the internet, and to truly understand how the internet works, you have to understand how its creators think. The first layer encompasses the material infrastructure of the internet — the undersea cables, the city infrastructure, the routers and switches through which the traffic is routed. The second, the “data link layer”, facilitates communication between directly connected devices and makes sure that the information actually arrives intact. The third layer is called the “network layer”, and it connects all internet-capable devices across networks. In this layer each device is assigned an IP address; the Domain Name System (DNS) then maps human-readable names onto these addresses. DNS is another important concept: if you type https://www.google.com/ into your browser’s address bar, you will be directed to the search engine, but so will you if you use Google’s IP address directly. Give it a try: https://172.217.0.0/. Domain names such as google.com are simply a convenience so that we don’t have to remember IP addresses. IP addresses are the postal addresses of the internet.
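This name-to-address conversion is easy to observe in practice. Below is a minimal sketch in Python, using only the standard library; the hostname is the one from the example above, while the exact addresses returned will vary by network and region.

```python
import socket

# Ask the operating system's resolver (which in turn queries DNS
# servers) to translate a domain name into the IP addresses behind it.
hostname = "www.google.com"

for family, _, _, _, sockaddr in socket.getaddrinfo(
        hostname, 443, proto=socket.IPPROTO_TCP):
    # sockaddr[0] holds the IP address; the family tells us whether
    # it is an IPv4 (AF_INET) or IPv6 (AF_INET6) address.
    print(family.name, sockaddr[0])

# Example output (yours will differ):
#   AF_INET 142.250.74.36
#   AF_INET6 2a00:1450:400f:801::2004
```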

The fourth layer is called the “transport layer”. In simple terms, transport protocols such as TCP (the Transmission Control Protocol of the TCP/IP family) are sets of rules on how to send and receive information. Protocols provide a reliable rule book that lets each device know exactly how it will receive data. Without these rules, each device might send information just as it pleases, leaving the receiving end completely puzzled as to what it has just received. The fifth layer, the session layer, manages the dialogue between two devices; encryption schemes such as TLS (the “S” in HTTPS) are commonly placed around this level.
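Here is a minimal sketch of that “rule book” in action: a tiny TCP server and client on the local machine, written in Python, agree through the protocol alone on how bytes are handed over. The port number and the messages are arbitrary illustrations, not anything from the text.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # loopback address, arbitrary free port
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                    # signal that we accept connections
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)     # TCP guarantees these bytes arrive
            conn.sendall(data.upper()) # complete, intact, and in order

threading.Thread(target=server, daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))          # the TCP handshake happens here
    cli.sendall(b"hello, transport layer")
    print(cli.recv(1024))              # b'HELLO, TRANSPORT LAYER'
```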

While the first five layers only build the roads on which information can travel, the last two layers, 6 and 7, deal with the actual information. Layer 6, the presentation layer, structures and encodes the data in such a way that both sides of the communication process can read and interpret it, while layer 7, the application layer, carries the actual content being exchanged: the web page, the email, the video stream.
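To make the application layer concrete, here is a minimal sketch of an HTTP/1.1 request written out by hand and sent over a plain TCP socket. The structure of the request (a request line, a few headers, a blank line) is exactly the kind of agreed-upon form these upper layers provide; example.com is used as a neutral test host and is not taken from the text.

```python
import socket

host = "example.com"

# An HTTP/1.1 request is just structured text: a request line,
# some headers, and a blank line that ends the header block.
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((host, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

# The reply follows the same agreed-upon structure; its first line
# is the status line, e.g. b'HTTP/1.1 200 OK'.
print(response.split(b"\r\n", 1)[0])
```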

The OSI model of the internet is similar to ancient postal systems: layer 1 describes all the streets, bridges and ships used to transport things. The second layer consists of mechanisms to ensure that the letters reach their destination — sending two or more messengers with the same letter, for instance. Layer 3 is simply some sort of navigation: the messengers know the target address and choose a path that leads them there. Layer 4 is the way the messengers transport this information — in a bag, for instance. The analogy for layer 5 is the encryption mechanisms that make sure only the sender and receiver are able to read the letter. Layer 6 describes the way such letters are written — using an introduction, some formal phrases, and a greeting — while layer 7 is the content of the letters, everything that is written on the paper.

Social Networks: The Missing Layer

What stands out here is that none of the layers in the OSI model corresponds to the information we actually send across the internet: the model only describes how information is sent. That leaves us with the question of where to place social networks, forums, and platforms. This is where an important piece by Benjamin Bratton comes into play: The Black Stack.[16] While the standard OSI model describes the technical side of the internet, The Black Stack applies the same concept to the actual information.

Indeed, technically, social networks are “invisible”, because they all use the same publicly available protocols to send information to our browsers, and it is impossible to tell what has been sent by looking at the traffic alone — websites, photos, and videos all look the same. But due to the presence of social networks in our everyday life, they are able to present the internet as something magical that simply connects our computers. Studies across the developing world have uncovered something alarming: “Millions of Facebook users have no idea they’re using the internet”, reads one headline. It is telling that a few media corporations have managed to trick us into thinking they were synonymous with “the internet.” If people are convinced that Facebook is the only way to communicate, they tend to bow to its rules, which can be very constraining. Code is inflexible: if a social network introduces new functionality, it has to define beforehand what will be permitted. While this is not inherently bad, there is a lack of democratic control in such processes, because users cannot influence the decisions made. Developers mostly follow orders from their bosses, who in turn decide based on the ability of such features to generate shareholder value.

Struggling for a truly free internet is therefore a huge and urgent task for policymakers all around the world. Its core mission can be summarised in three simple keywords: understand, educate, regulate. Policymakers must first understand how the internet works and what it is, in order to become independent of counsel from telecommunications companies. Then they must educate the population, because this ensures support from within society for regulating the internet in a way that enables free communication across democratically managed connections.

Political Struggles for a Public Internet

The recent attacks on net neutrality exemplify what this overt control of telecommunications corporations over the internet and public discourse can lead to if it remains unchecked by political actors.[17] The principle of net neutrality ensures that all traffic is treated equally: whether you are visiting Google or your aunt’s travel blog, you can reach both sites in about the same amount of time. But if net neutrality is forsaken, it becomes possible for internet service providers (ISPs) to simply slow down the connection to your aunt’s blog, because, unlike Google, she doesn’t pay for quick access to her blog. As long as private corporations control the physical infrastructure of the internet, they also control what they let through and what they block. This means that ISPs (and all the other companies that handle the business-to-business connections in between) effectively control most of our everyday communication.

But politicians must also empower the people. Germany is a good example of this: after the Second World War, a federal agency for civic education (the “Bundeszentrale für Politische Bildung”) was founded to educate people on political topics, including, more recently, digital education.[18] Another feasible way to educate people is to organise self-empowerment groups in which people learn about technology among like-minded peers. Across the globe, people are gathering in shops or public locations after work to teach themselves technology. In Detroit, the Equitable Internet Initiative (EII) is training volunteers to set up internet connections on their own and teaching them the technical specifications of internet protocols. In Germany, activist engineers use the public spaces of numerous “Netzläden” (network shops) to provide technical knowledge to interested people.

While such self-empowerment initiatives are an important first step towards digital enlightenment, they face two major problems. First, they tend to attract mainly people who wouldn’t need them anyway, because they are interested in the topic to begin with. Second, they are only makeshift solutions: they paper over cracks in an education system whose schools are not even remotely prepared for this kind of education, and neither are most teachers.

Regulating the internet is rather easy after these hard first steps. As most parts of the internet (layers 2 through 7 of the OSI model) are already publicly and democratically regulated, the only thing left is to constrain the power of corporate giants such as Facebook or Alphabet. Simply replacing them with another central, monolithic company, even a democratically controlled one, is not the right way, as Bolychevsky and Moulding explained. A British Digital Agency (BDA) could only solve this problem if it remains lucid about the way human communities work: along the lines of (global) functions and (local) spaces. Many cities are already implementing digital services that let citizens apply for a new passport, initiate petitions, and inform themselves about local issues. The public needs to account for the functional differentiation of society into ever smaller subsystems, and the internet is perfectly suited for this task.

In practice, this means that instead of www.some-app.some-developer.com (e.g., docs.google.com), there need to be more URLs of the form www.a-function.city-name.country (e.g. passport.london.uk, or london.foodsharing.org); a sketch of this naming convention follows below. Cities need domains for all of their services, and initiatives (such as food sharing, or even volunteer digital education) need domains for all cities in which they already have members. For a BDA to be successful in addressing these two paradigms of functionality and locality, it would need to be designed in such a way that it depends on user input. Cities would first need to offer the services that are needed, evaluate different methods (such as: how should a form be presented so that it is easy for residents to use?), then file a formal request for standardisation upon which other cities comment, before a vote makes a set of protocols mandatory for any city that wants to offer that service. The same procedure must also apply to initiatives such as food sharing or digital education.
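Purely as an illustration, here is a hypothetical sketch of that naming convention in Python. None of these helper names exist anywhere; they only show how a function-plus-locality scheme could be composed and checked mechanically.

```python
# Hypothetical helper: compose domains of the form <function>.<city>.<country>.
# The scheme is the article's proposal; the code is only an illustration.

def service_domain(function: str, city: str, country: str) -> str:
    """Build a domain such as passport.london.uk from its three labels."""
    labels = (function, city, country)
    for label in labels:
        # DNS labels may contain letters, digits, and hyphens;
        # the check here is kept deliberately simple.
        if not label.replace("-", "").isalnum():
            raise ValueError(f"invalid label: {label!r}")
    return ".".join(label.lower() for label in labels)

print(service_domain("passport", "london", "uk"))  # passport.london.uk
print(service_domain("petitions", "bonn", "de"))   # petitions.bonn.de (hypothetical)
```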

Generalising from all of these examples, the roadmap for a public internet should cover three areas.

First: all physical infrastructures must be nationalised.

Building the infrastructure of the internet requires an enormous amount of resources that small groups of activists could never obtain. The state, unwilling to engage with the emerging internet, declined to invest in it and left the market completely unregulated. This has resulted in exactly what the preachers of neoliberalism want to accomplish everywhere: a monopolised internet architecture, completely dependent on the goodwill of a small group of people who may or may not decide to let us communicate with each other.

Therefore, the first action the public should take is to nationalise all of the physical infrastructure the internet consists of. This would include multinational agreements to keep all undersea cables up and running as a joint effort. Unfortunately, Corbyn’s speech did not tackle this at all, even though it is one of the greatest obstacles to a free internet. There are small initiatives trying to take matters into their own hands by creating city-wide networks, such as the German “Freifunk” movement or the Detroit-based Equitable Internet Initiative (EII). While their efforts are certainly needed and address real problems created by public negligence, they still depend on the larger infrastructure that has to be provided by big actors such as states or corporations. And if every individual were responsible for building their own internet access, we would most likely face a communication breakdown. Instead, regulation can help standardise how people connect to the internet.

Second: all protocols and rule sets of how communication is facilitated must be democratically controlled.

Without standardised protocols that follow generally accepted norms of transmission, communication would be gibberish. The good news here is that democratic control is actually very possible. All of the protocols and means of sending and receiving packets have been standardised by institutions such as the IEEE, the W3C, and ICANN, which have done a wonderful job of ensuring that devices stick to the same conventions when communicating. As far as the form of internet communication is concerned, standardising everything is already the norm.[19]

What could be changed, though, is the way these institutions are organised. While the IEEE and the W3C are nonprofit organisations with deep roots in the engineering and web development communities, ICANN has drawn criticism for its hierarchical and pro-capitalist mode of operation. ICANN’s main responsibility is handing out top-level domains (TLDs) such as .com and .org, country codes like .us and .uk, and newer specialised domains such as .app or .rocks. But actors seeking to register these domains have to pay a fee to ICANN, currently set at $185,000.[20] Instead of high fees, there should be clear rules so that anyone could apply without having to pay tremendous amounts of money. This would ensure that successful applicants are chosen not by wealth, but by compliance with the common good.

Third: everything atop the OSI model must be decentralised and adapted to the local, regional, national, or global context where the service or platform is needed.

Atop this OSI model reside the actual services and platforms. These must be strictly decentralised and adapted to the organisational or functional unit they serve. Cities can divide their own websites by the services they provide to residents, while global initiatives may use their website and create subdomains for each city in which they have an office. All of this must be decentralised, because each city, each village, and even each quarter will have specific needs that others don’t have. It is therefore better to let each organisational group decide what kind of services it wants to offer, and how it wants to offer them. Some initiatives may need an internal division by city, some by country, some by street, some by individual, and some by social group.

Conclusion

We still have a lot to learn about the internet. Corporations are several steps ahead of the public, and they are well able to use this head start to their advantage. If policymakers around the world do not engage properly in internet regulation, we are likely to see a continuation of corporate control over the internet that worsens every day. It is already next to impossible to completely circumvent the monopoly of Alphabet.

In the case of the UK, a proper digital agenda would include strengthening councils and cities and aiding them in creating platforms on which citizens can engage in politics, make suggestions, and connect with each other — knowing that their data is safe, because the servers hosting these platforms belong to the public. Additionally, a sound digital agenda must include international agreements to maintain public undersea cables and ensure connectivity across continents. On the way to democratising the internet, a digital agency would make sense after all — not to ensure some hazy goal of “free speech”, as Corbyn implied in his speech, but to take over control of the physical infrastructure across the UK and ensure that every connection in British households is guaranteed to work by the only institution currently capable of maintaining infrastructure: the state.

In the 21st century, it is impossible to engage with the internet on a strictly national level. Corbyn is therefore well advised to stand up for transnational cooperation in gaining public control over one of the most vital assets of modern life. In turn, this also means that the recommendations derived from how the internet works apply to every other government around the world.

The free internet is an illusion. But we have the means to turn it into a reality.


  1. Around the world, campaigns have been launched since June 2018 to address the mistrust in Facebook. These ads have been tailored to the specific target audience in each country, but all revolve around the notion of “better together”, read: you are still better off with Facebook than without. 

  2. A more detailed “History of the internet” can be found at the Internet Society

  3. Ibid. For further information, see this page of the World Wide Web Consortium (W3C)

  4. For an extremely detailed overview of how corporations took control of global communication, see Potter, Simon J. “Webs, Networks, and Systems: Globalization and the Mass Media in the Nineteenth- and Twentieth-Century British Empire.” Journal of British Studies 46, no. 3 (July 2007): 621–646. https://doi.org/10.1086/515446. 

  5. A great explanation of the “corporate” type of institution, residing between the “public” and “private” realms, is given by David Ciepley: “Beyond Public and Private: Toward a Political Theory of the Corporation.” American Political Science Review 107, no. 1 (February 2013): 139–58. https://doi.org/10.1017/S0003055412000536

  6. See Linge, Nigel, and Bill Burns. “The Cable That Wired the World.” Journal of the Institute of Telecommunications Professionals 10, no. 2 (2016): 41–45. 

  7. “Since individual papers could seldom on their own afford large budgets for expensive cable news, the solution was to share the costs of a common supply of news among many papers. Entrepreneurs were quick to recognize the significance of this new need, and international news agencies, springing up to organize the collection of news and to sell reports to multiple subscribers, rapidly developed to become what were among the first transnational corporations.” (Potter 2007, p. 633) 

  8. An impressive compilation of all modern undersea cables can be found in this interactive map

  9. It is telling that the researchers at TeleGeography answer their own FAQ question, “Why are there many cables between some continents but no cables between Australia and South America, for instance?”, with “Undersea cables are built between locations that have something ‘important to communicate.’” 

  10. A comprehensive overview of the UK telecoms market in 2016 can be found in this paper by the regulating agency, Ofcom. 

  11. This process involves creating a first concept of how a new standard should look and subsequently “requesting comments”. This is why these proposals are called RFC — Request for Comments. Existing RFCs can be browsed at https://www.rfc-editor.org/. 

  12. Debord, Guy. The Society of the Spectacle. Reprint. Detroit, Mich: Black & Red, 2010. 

  13. I’m referring to communities along state borders, such as the Dutch-German border communities or the French-German border, where people mostly speak both languages and cross the border regularly, rendering it socially invisible. 

  14. This concept of “layering” infrastructure has led to one remarkable convergence of technology and urban design at Toronto’s waterfront, where an Alphabet subsidiary is trying to build the first “smart” city completely driven by digital technology. 

  15. The seven layers are: (1) the physical layer, (2) the data link layer, (3) the network layer, (4) the transport layer, (5) the session layer, (6) the presentation layer, and (7) the application layer. The four-layer model merges some of these layers to make the model simpler. 

  16. See Bratton, Benjamin. “The Black Stack.” E-Flux 53 (March 2014). http://www.e-flux.com/journal/53/59883/the-black-stack/. Bratton identifies six different layers: Earth, Cloud, City, Address, Interface, and User. His explanations are more concerned with sociological and political terms, yet fundamentally he follows the same principle as the OSI model. 

  17. In 2018, net neutrality was pronounced dead following the abolition of net neutrality regulations by the foremost regulatory agency in the United States, the FCC. 

  18. See http://www.bpb.de/lernen/digitale-bildung/. Unfortunately, this section is only available in German. 

  19. See again the New Socialist piece by Jason Prado, who addresses standardisation and problems that can arise from the processes of developing open platforms. 

  20. An additional problem of ICANN’s current organisation is that it seems somewhat biased against the Muslim community, as exemplified by its withholding the registration of the .halal and .islam top-level domains (TLDs). 

Author:

Hendrik Erz (@sahiralsaid)

Hendrik Erz is a research assistant at the University of Bonn. His main research focusses on the nexus between Marxist economic theory, digitalisation and violence in capitalist societies (riots and terrorism).