Protocols for a Planet - ICT and environmental sustainability

On World Meteorological Day 2022, the secretary general of the United Nations (UN), António Guterres, announced that “[w]e must boost the power of prediction” and that “the [UN] will spearhead new action to ensure every person on earth is protected by Early Warning Systems within five years” [1].

Early Warning Systems (EWS) for natural hazards – encompassing, e.g., atmospheric, hydrologic, and geologic phenomena – are at the heart of the UN’s climate adaptation strategy. EWS generally consist of sensors that monitor the natural environment, telecommunication systems that transmit the recordings to computing facilities, and the facilities themselves, which store and process the signals and, if a natural hazard has been detected or predicted, react by sending a warning to human observers or the affected population. Therefore, a call to “boost the power of prediction” is a call to widen and deepen the Information and Communication Technology (ICT) infrastructure, on which EWS strongly depend, over and into the natural environment.
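The sensing–transmission–detection–warning chain described above can be sketched as a minimal pipeline. Everything here is illustrative: the station names, the flood threshold, and the `Reading` structure are invented for the example, not taken from any real EWS.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    station: str    # illustrative sensor identifier
    level_m: float  # e.g. a river gauge level in metres

# Hypothetical hazard threshold for this sketch:
FLOOD_THRESHOLD_M = 4.0

def detect(readings: list[Reading]) -> list[str]:
    """Return a warning for each reading that exceeds the threshold."""
    return [
        f"WARNING: flood risk at {r.station} (level {r.level_m} m)"
        for r in readings
        if r.level_m >= FLOOD_THRESHOLD_M
    ]

# Simulated transmission from two monitoring stations:
incoming = [Reading("gauge-upstream", 3.2), Reading("gauge-downstream", 4.7)]
for warning in detect(incoming):
    print(warning)
```

A real system replaces each stage with heavy infrastructure – sensor networks, long-haul telecommunication links, forecasting models instead of a fixed threshold – but the store–process–react shape stays the same.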

From the very beginning, when the first electrical telegraphs and computers were set up, ICT was used to monitor natural phenomena and potentially react to their readings [2]. Since then, various initiatives, mostly in the U.S., were launched in pursuit of a planetary-scale ICT infrastructure, such as the Geoweb and GEOINT introduced by the United States Department of Defense ([3], [4]), the Digital Earth introduced by former U.S. vice-president Al Gore, and the Planetary Skin Institute introduced by Cisco and NASA\footnote{Benjamin Bratton would even go so far as to say that the origins of a planetary-scale computation infrastructure “can be traced in broad strokes from perhaps Roman and Chinese military accounting” [5]. }. While the Sisyphean goal of such initiatives continues to be re-invented and re-imagined, they have created what Benjamin Bratton [5] calls The Stack, an “accidental megastructure” that is “informed by the multilayered structure of software protocol stacks in which network technologies operate within a modular and interdependent vertical order.”

Examining briefly the role of technical protocols in the Stack, we find that the open and voluntary development of standards and protocols is what guarantees the interoperability and reliability of communication between the products of network operators, equipment vendors, content and service providers, and software developers. The US introduced four different national emergency alerting and warning systems between 1951 and 1994 alone. This gives an idea of how inherently heterogeneous emergency communication networks are: they are made up of laptops, cell phones, PDAs, etc., which use different protocols such as WiFi, Bluetooth, and GSM. Thus a myriad of protocols is required to ensure that they can all communicate with each other. Therefore, to enable authorities to exchange all-hazard emergency alerts and public warnings over all kinds of ICT networks and technologies, the Common Alerting Protocol (CAP) – an XML-based message format developed by OASIS and adopted by the International Telecommunication Union (ITU) – is promoted by the International Federation of Red Cross and Red Crescent Societies (IFRC) and the World Meteorological Organization (WMO), together with numerous other organisations, to ensure that by 2025 all countries can integrate CAP into their EWS.
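Because CAP is an XML format, a minimal alert can be assembled with nothing but a standard library. The sketch below builds an illustrative CAP 1.2 message in Python; the identifier, sender, and all field values are invented for the example, and the output is not validated against the CAP schema.

```python
import xml.etree.ElementTree as ET

# Namespace of the CAP 1.2 standard:
CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def build_alert() -> str:
    """Build a minimal, illustrative CAP alert (all values are invented)."""
    alert = ET.Element("alert", xmlns=CAP_NS)
    # Top-level elements of a CAP <alert> block:
    for tag, text in [
        ("identifier", "example-2024-0001"),
        ("sender", "alerts@example.org"),
        ("sent", "2024-06-01T12:00:00+00:00"),
        ("status", "Test"),
        ("msgType", "Alert"),
        ("scope", "Public"),
    ]:
        ET.SubElement(alert, tag).text = text
    # One <info> block describing the hazard:
    info = ET.SubElement(alert, "info")
    for tag, text in [
        ("category", "Met"),
        ("event", "Flood"),
        ("urgency", "Expected"),
        ("severity", "Severe"),
        ("certainty", "Likely"),
        ("headline", "River levels expected to exceed flood stage"),
    ]:
        ET.SubElement(info, tag).text = text
    return ET.tostring(alert, encoding="unicode")

print(build_alert())
```

The point of the format is exactly the interoperability discussed above: the same small XML document can be rendered as a cell-broadcast message, a siren trigger, or a news-feed item without each channel needing its own alert format.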

However, one has to realise that the network infrastructure of today builds on top of, and is shaped by, the history of empires, colonialism, and capital. The distribution of submarine fibre-optic cables and datacenters shadows the sea routes pioneered in previous centuries, routes that sped the circulation of cotton, silver, spices, settlers, and slaves [6]. A problem that can thus arise from a call to “boost the power of prediction” is that inequalities created in the past are carried over, scaled up, into the future.

But even within the Global North, large disparities in connectivity exist between centres and peripheries. In 2018, Microsoft researchers found that approximately 50% of US Americans – almost half the country – do not use the internet at broadband speeds [7]. The disconnected are disproportionately rural and low-income. However, just as during the current energy crisis, policy-makers try to solve the connectivity crisis by giving immense sums of money to the firms at the heart of the problem. And just as Shell and Centrica posted profits totalling £11bn while households struggled with their bills, so do Internet Service Providers profit when they receive funding to close the digital divide. Take the following case from the US as an example. In 2015, a major ISP called CenturyLink began receiving $505.7 million in annual support from the Universal Service Fund to pay for broadband deployment in underserved areas. Five years later, the company told the Federal Communications Commission that it had met the mandated milestones in only 10 states out of 33. During these years, CenturyLink’s CEO was one of the highest-paid executives in the industry, earning $35.7 million in 2018.


[1] World Meteorological Organization, 24/03/2022
World Meteorological Day ceremony: boost the power of prediction

[2] Paul N. Edwards, 2010
A Vast Machine: Computer models, climate data, and the politics of global warming
MIT Press
ISBN: 978-0-262-01392-5

[3] Charles Herring, 21/05/1994
An Architecture of Cyberspace: Spatialization of the Internet

[4] National Research Council, 10/05/2006
Priorities for GEOINT Research at the National Geospatial-Intelligence Agency
National Academies Press
ISBN: 978-0-309-10149-3

[5] Benjamin H. Bratton, 2015
The stack: On Software and Sovereignty
MIT Press
ISBN: 978-0-262-02957-5

[6] Nicole Starosielski, 2015
The Undersea Network
Duke University Press
ISBN: 978-0-8223-5755-1

[7] Ben Tarnoff, 2022
Internet for the People: The Fight for Our Digital Future
Verso
ISBN: 978-1-8397-6202-4