According to Edwards, climatology in the 1960s and 1970s shifted from a “data-driven, regionally oriented, descriptive field” to a “theory-driven, globally oriented discipline increasingly focused on forecasting the future” (2010:139). This shift was driven by theoretical meteorologists and computer scientists at a small number of elite research institutions, who used powerful computers to construct increasingly complex models that better mimicked the real world. The goal was to simulate the climate. Because we are limited to one planet, controlled experiments and natural experiments on the earth itself were impossible; computer simulations, however, allowed for the systematic testing of variables by re-creating the earth in virtual reality. In fact, these complex models (general circulation models, or GCMs) reunified meteorology’s different strands (forecasting, theoretical, and empirical) within a single research program. GCMs “would transform climatology from a statistical science oriented toward the particularity of regional climates into a theoretical science more focused on the global scale. Ultimately, they would also guide vast changes in the global data infrastructure, and they would transform the sources and the very meaning of the word ‘data’” (Edwards 2010:142).
Computer models can only be used as climate simulations after achieving equilibrium; model runs then step through simulated time, generating data that can be compared with “real” data. Because weather consists of a complex set of variables, early models reproduced one variable well while doing poorly on another. Almost as important as the use of models to generate numerical data was the creation of visualizations of a model’s dynamic behavior, made possible by the computers used to run it. Modeling techniques and code also circulated among research groups:
“This circuitous exchange of concepts, mathematical techniques, and computer code became entirely typical of computational meteorology and climatology after the mid 1960s. Rather than start from scratch, virtually all the new modeling groups began with some version of another group’s model. Veterans and graduate students from the three original GCM groups left to form new groups of their own, taking computer code with them” (Edwards 2010:170).
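The idea of “spin-up” mentioned above — running a model forward until it reaches equilibrium, after which its output counts as simulated climate — can be sketched with a toy zero-dimensional energy-balance model. This is an illustrative construction using standard textbook constants, not an example from Edwards:

```python
# Toy zero-dimensional energy-balance model: iterate until the planet's
# temperature stops changing (equilibrium), i.e. "spin-up".

SOLAR = 1361.0      # incoming solar irradiance, W/m^2
ALBEDO = 0.30       # planetary reflectivity
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
HEAT_CAP = 2.0e8    # assumed effective heat capacity, J/(m^2 K)
DT = 86400.0        # one simulated day, in seconds

def step(temp_k):
    """Advance temperature by one day from the energy imbalance."""
    absorbed = SOLAR * (1 - ALBEDO) / 4   # solar input averaged over the sphere
    emitted = SIGMA * temp_k ** 4         # blackbody outgoing radiation
    return temp_k + DT * (absorbed - emitted) / HEAT_CAP

temp = 200.0  # deliberately cold start, far from equilibrium
for day in range(20000):
    new_temp = step(temp)
    if abs(new_temp - temp) < 1e-6:       # change per day is negligible
        break
    temp = new_temp

print(f"equilibrium after ~{day} simulated days: {temp:.1f} K")
```

The run settles near the earth’s effective radiating temperature (roughly 255 K); only after this point would the model’s simulated output be compared with observations.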
Reality is complex – so for models to better reflect reality, “supercomputers” were necessary to increase the “resolution” of the model. Supercomputers were needed in several highly prized domains: nuclear weapons research, high-energy physics, and, of course, climate modeling.
Computational friction constrains what modelers can do. In their accounts, “a kind of Heisenberg principle of modeling results from computational friction. Modelers can increase model resolution by sacrificing complexity, or they can increase model complexity by decreasing resolution, but they cannot do both at once” (2010:175).
In models, a long series of calculations compounds tiny errors into large ones. In fact, weather research generated a new area of mathematics – chaos theory. Born of atmospheric modeling, chaos theory accounts for the fact, in models and in the physical world alike, that small differences in initial conditions produce widely different outcomes: this is known as the “butterfly effect.”
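The butterfly effect can be demonstrated in a few lines using the Lorenz (1963) equations, the idealized convection system from which chaos theory emerged. This is a minimal sketch with the standard parameter values, not a climate model: two runs differing by one part in a billion end up far apart.

```python
# Two Lorenz-system trajectories with nearly identical initial conditions.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)    # perturbed by one billionth

for _ in range(5000):          # 50 units of simulated time
    a = lorenz_step(*a)
    b = lorenz_step(*b)

separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(f"final separation between trajectories: {separation:.4f}")
```

The initial difference of 10⁻⁹ grows by many orders of magnitude, which is exactly why long-range weather forecasts degrade: measurement error in the initial conditions, however small, eventually dominates.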
How are models judged? A simulation should look like reality, so models could only be evaluated against “real data.” All the models point toward global warming from increased CO2; the scientific issue debated is not whether there is global warming, but how much there is and what is causing it (see the table on Edwards 2010:183). As climate change became a political issue in the 1970s, everyone wanted more global climate data with which to evaluate these complex models.
Making Data Global
This chapter goes back in time to focus on data, specifically the problem of making data global. In the 1950s, the global data network that we have today was not yet infrastructure. Instead, it was in what Edwards calls the “system-building phase,” a time when data adhered to national standards but not to global ones. The shift toward infrastructure was shaped by organizations like the World Meteorological Organization and the United Nations, as well as by sociopolitical realities like the Cold War. Even so, “infrastructural globalism” emerged through the cooperation of the USA and the USSR, which, despite their political competition, continued to share meteorological data with each other.
To make data global, computers converted heterogeneous and limited data into gridded global data sets through the use of intermediate computer models. The resulting grids supplied values for areas of the world that previously had no direct observations. By combining direct observation and measurement with computer-generated data, these global data sets allowed climate studies to approach weather more scientifically. Here we have one example of what many researchers of globalization have emphasized as the loss of power by individual nation-states:
“Since the middle of the seventeenth century, under the so-called Westphalian model of sovereignty, states had retained absolute control over affairs within their territories, but expected no control whatsoever over the affairs of other states. No state recognized any authority higher than its own. Virtually all international associations were voluntaristic, existing only to promote mutual interests (and only so long as those interests remained mutual). Military alliances were paradigmatic” (2010:197).
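The gridding step described above — turning scattered station observations into a regular global grid — can be sketched with a simple interpolation scheme. This is only an illustration: the station locations and values below are invented, and real data assimilation uses full physical models as intermediaries, not mere interpolation.

```python
# Inverse-distance weighting: estimate values at grid points, including
# points with no direct observation, from a handful of stations.

import math

# (latitude, longitude, temperature in C) -- hypothetical stations
stations = [
    (40.0, -75.0, 12.0),
    (42.0, -71.0, 9.5),
    (38.0, -77.0, 14.0),
]

def idw(lat, lon, obs, power=2.0):
    """Estimate a value at (lat, lon) as a distance-weighted average."""
    num = den = 0.0
    for s_lat, s_lon, value in obs:
        d = math.hypot(lat - s_lat, lon - s_lon)
        if d == 0.0:
            return value          # grid point coincides with a station
        w = 1.0 / d ** power      # nearer stations count for more
        num += w * value
        den += w
    return num / den

# Fill a small regular grid covering the stations and the gaps between them.
grid = [[idw(lat, lon, stations) for lon in range(-78, -70)]
        for lat in range(37, 44)]
print(grid[0][0])  # estimate at (37 N, 78 W), where no station exists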
The United Nations both maintained and eroded voluntarist internationalism. On the one hand, voluntarist internationalism, aided by technological change and institutional effort, helped global operational infrastructure emerge. On the other hand, the mutual orientation of science and the military within the nation-state diminished voluntarist internationalism. Think of this in relation to what Edwards calls the ‘technopolitics of altitude’: the increased deployment of spy planes and satellites and the conduct of the ‘space race.’
Another important aspect of making data global was the emergence of the ‘politics of expertise.’ The power exercised through the politics of expertise required the command of knowledge combined with development funding from governments and foundations. The politics of expertise also changed politics itself, as in the shift (in the case of the United States) away from the State Department monopolizing all international issues toward the management of technically grounded international relations by other federal departments.