Abstracting the Earth
Notes on oilfields, exploration geophysics, and abstraction in the mid-20th century

Broadly speaking, my dissertation project was a study of 1) some techniques that oil prospectors have historically used to model the subsurface (in particular, exploration seismography), 2) how these techniques have changed over time, and 3) the reasons motivating those changes. With respect to the third aspect, I follow Georg Lukács and István Mészáros in characterizing (some of) these reasons in terms of a general tendency to formalism, which I take to be broadly characteristic of modern science. Importantly, this is not just to say that subsurface models have become more abstract, numerical, or decontextualized. More specifically, I was interested in the “ideological function,” in Mészáros’ terms, inherent to this tendency: the separation of certain questions of natural science—e.g., what are the properties of the earth and how can we study them?—from the material history of socio-ecological conflict.
That said, even if it can be described as a general tendency, it is critical to stress that scientific formalism includes very different kinds of abstractions, which themselves have been developed for very different reasons. The question thus cannot simply be reduced to whether knowledge of the earth is ‘more abstract’ today than before; the question is rather why different kinds of abstraction emerged at different times. As an example: a geophysical abstraction such as a seismogram, despite being more quantitative, is not necessarily more abstract than a qualitative geological abstraction, such as a cross-section model. Something like the anticlinal theory of petroleum accumulation, for instance—which suggests oil originates in source rock deep in the earth and migrates upward, eventually pooling underneath impermeable layers of caprock—is indeed quite abstract with respect to the intensely heterogeneous, uneven, and porous geological structure of the earth. Notably, into the late 19th century, the theory was in fact too abstract to predict the location of oil in Appalachia, and for this reason it was abandoned by oil men. It regained its (now commonly accepted) standing only once it proved more useful in prospecting out West, near the turn of the century. Conversely, early exploration seismography frustrated academic geophysicists because it was not abstract enough: its practitioners had taken to inferring time-depth conversions from local observations rather than relying on textbook formulae. What exactly differentiates different abstractions? And, from a historical perspective, what was insufficient about prior abstractions such that new kinds had to be proposed?
The history of oil prospecting felt to me like a particularly useful context for these inquiries—the ‘ideological function’ of scientific formalism and its changing manner of abstraction—given how oil exploration has been both riven with conflict and also productive of knowledge about the subsurface in ways that have proven immensely influential in the development of 20th-century earth science. Indeed, to speak of the ideological function at work here: it is fascinating how few histories of earth science even bring up the revolving door (and financial ties) between universities and oil companies; the borrowing by academic geologists and geophysicists of survey techniques pioneered by oil prospectors; or the myriad shared intellectual endeavors, such as the premier scientific journal Geophysics (founded by oil prospectors). In brief, the ties between 20th-century earth science and the oil industry are as thick as those between information science and the Cold War, and yet far fewer historical accounts of these relations exist (the most important exception being Aitor Anduaga’s fantastic Geophysics, Realism, and Industry).
And, at the same time, changes in techniques for producing scientific knowledge of the earth were often intimately related to rather mundane, practical problems of oil-finding. It was with this in mind that I became more interested in the history of oil prospecting: I thought it might be a way of working backwards from the concepts of ‘complexity’ and ‘system’ that took such a strong hold on scientific consciousness by the mid-20th century, and which have since become totally hegemonic, presented across the social and natural sciences as if they were necessary forms of being—it is practically beyond reproach across countless disciplines that existence ‘in itself’ is composed of ‘complex’ ‘systems’. What seemed useful, then, was to find a way to denaturalize and politicize these concepts, namely, by linking their emergence to specific concrete technologies and practices associated, first and most obviously, with modern computing and, second and much less appreciated, with oil exploration.
I thus set about investigating the long history of techniques for abstracting the earth, particularly within the upstream oil industry, trying again to see it in relation to this broader tendency toward formalism and the separation of questions of natural science from the material history of socio-ecological conflict. This is not to say that everything had been decided in advance, or that there are no countertendencies; rather, I suggest that there is indeed a defining formalistic logic to the technological evolution of the oil and gas industry, across otherwise quite disparate modes of abstraction, and that this logic has had two important, enduring consequences. First, it has afforded the industry a remarkable flexibility in finding and extracting oil across otherwise highly variable geological, political, and economic milieus. To this point, the importance of science’s social mediation is a lesson I learned from Geoffrey Bowker’s Science on the Run, which among other things shows how the development of prospecting techniques was enabled and constrained by different property regimes: unlike electrical loggers pacing the surface, for instance, seismic waves do not need landowners’ permission to enter the otherwise privatized subsurface domains of the US. Conversely, in the USSR, access to land posed less of an issue after collectivization, and thus electrical logging flourished there as a prospecting technique.
But the other consequence of scientific formalism is that it tends to reproduce the same type of problem within the industry (and within modern science) again and again—namely, as Lukács discusses in History and Class Consciousness, problems that emerge from the non-identity of the intelligible form of the subsurface with its content. Colloquially, these problems are often recognized as ‘bad data’, which can arise from all sorts of contingencies and circumstances, whether geological obstructions like fault lines or technical problems of sensing and data transmission, but which speak to the irremediable gap between the sensible subsurface and the technologies used to sense it. There is thus no ‘final abstraction’ (and, pace the Lukács of H&CC, no culminating subject-object identity), inasmuch as the earth remains resistant to its conceptualization, recalcitrant to its being known and systematized; it is this negative or planetary materiality that continuously regenerates and re-emerges in the discrepancy of subject and object, form and content, model and fact.
Along these lines, one thing I try to argue is that a notion of ‘complexity’ subsequently emerges within the oil industry as one possible way to address this discrepancy, given certain social and technological constraints. In particular, I suggest that it emerges through the advancement of 1950s digital seismography, in relation to a novel statistical approach pursued to move beyond the limits of older prospecting techniques, which tended to be either too generic and principles-driven (formal or scientific geology) or too vernacular and context-dependent (‘practical’ or rule-of-thumb geology). There are antecedents, to be sure—in particular, experimental technologies developed during the 30-year history of exploration geophysics leading up to the 1950s—but I suggest that the statistical techniques developed with the advent of early modern computing not only allowed oil prospectors to mediate the subsurface differently than before but also fundamentally changed subsurface epistemology, giving way to notions of ‘complexity’ and ‘system’ in oil prospecting.
One of the major focal points for me was the Geophysical Analysis Group (GAG) at MIT, and in particular the work of its most central member, Enders Robinson, while he was a geophysics PhD student. Equipped with the techniques of time series analysis he had learned from the great cybernetician Norbert Wiener (one of his mentors; Robinson also found inspiration in Schumpeter’s theory of innovation, an important connection I can’t get into here), Robinson is credited as the ‘father’ of digital seismography for his development of a deconvolution algorithm, which he used to ameliorate ‘bad’ seismic data generated by oil companies off the Gulf Coast. To put it briefly, his algorithm—‘predictive decomposition’, as he called it—was able to identify interfaces between rock beds that were otherwise obscured by the expanding seismic wave as it bounced around and interfered with itself underground.
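Since I will not reconstruct the mathematics here, a minimal sketch may still help fix ideas. To be clear, this is not Robinson’s implementation but a standard predictive deconvolution in its spirit; the function name, parameter choices, and synthetic data below are all mine, purely for illustration.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def predictive_deconvolution(trace, filter_len=20, lag=1):
    """Estimate and apply a prediction-error filter to a seismic trace.

    Assumes the textbook convolutional model: trace = wavelet * reflectivity,
    with a minimum-phase wavelet and statistically 'white' reflectivity.
    """
    n = len(trace)
    # Under the whiteness assumption, the trace's autocorrelation
    # stands in for that of the (unknown) wavelet.
    r = np.correlate(trace, trace, mode="full")[n - 1:]

    # Least-squares normal equations for predicting the trace
    # `lag` samples ahead: R a = r[lag : lag + filter_len].
    a = solve_toeplitz(r[:filter_len], r[lag:lag + filter_len])

    # Prediction-error filter: keep each sample, subtract what the
    # trace's own past predicts (the reverberatory part).
    pef = np.concatenate(([1.0], np.zeros(lag - 1), -a))

    # What cannot be predicted approximates the reflectivity, i.e.,
    # the interfaces between rock beds.
    return np.convolve(trace, pef)[:n]

# Toy usage: a ringing wavelet masks a sparse reflectivity series.
rng = np.random.default_rng(0)
reflectivity = rng.standard_normal(500) * (rng.random(500) < 0.05)
wavelet = np.exp(-0.2 * np.arange(40)) * np.cos(0.5 * np.arange(40))
trace = np.convolve(reflectivity, wavelet)[:500]
estimate = predictive_deconvolution(trace, filter_len=30)
```

The assumption doing the work is worth noticing: treat the reflectivity as statistically unpredictable, and whatever part of the trace can be predicted from its own past is, by definition, reverberation to be subtracted.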
The technical process is complex, and I will not get into it any further here. But it is worth pausing to appreciate not only the degree but also the kind of abstraction necessary for digital seismic analysis, again in view of the broader tendency toward scientific formalism I am interested in. Robinson gives a particularly revealing account, worth quoting at length, of how the idea of applying predictive filtering to seismic data first came about:
One day, Wadsworth and Hurley fell into a discussion about the use of mathematics in geology. As the story goes, Wadsworth was needling Hurley a bit, chiding him because he thought not enough mathematics was being used in geology. Wadsworth was used to analyzing weather time-series data, which he described as data that ‘go up and down.’ Hurley said that seismic traces also ‘go up and down,’ and that geophysicists had to ‘eyeball’ them in order to pick reflections. […] An inquiry was sent to the Magnolia Petroleum Company, which provided eight seismic records. When I showed up, Bryan, with a sigh of relief, gladly handed me the eight records. (p. 11JA)
I have hung on this passage for some time. How could a ‘system’ as dynamic and variable as the weather be in any way comparable to a ‘system’ as fixed and deterministic as the subsurface? Ultimately, it is because of the equivalence of the presupposed form of the data being worked on, which, by virtue of its means of production, has not just been removed from its concrete origins—the local engagements of prospectors with the earth—but in the hands of the digital seismologist takes on the abstract appearance of a time series that ‘goes up and down’. It is this form of appearance that translates the older scientific problem of finding subsurface geometries likely to harbor oil—articulated since the dawn of (analog) exploration seismography in the 1920s—into a problem of finding patterns in discrete, aspatial data, and thus a problem legible in precisely the same way that filtering noise from electrical signals or predicting the weather was legible. It is this distinctly digital, and not merely quantitative, form of abstraction that I think is so significant for understanding the ideological function of scientific formalism, which above all provides for a different way of mediating contingency. But more on this another time.
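To make the formal equivalence tangible: once digitized, a weather series and a seismic trace present the same interface to a prediction algorithm. Here is a toy illustration, with all data synthetic and all names mine, assuming nothing about either domain beyond its digital form:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two series of entirely different physical origin: a seasonal
# temperature record and a synthetic seismic trace.
temperature = (20 + 5 * np.sin(2 * np.pi * np.arange(720) / 360)
               + rng.standard_normal(720))
trace = np.convolve(
    rng.standard_normal(720) * (rng.random(720) < 0.05),
    np.exp(-0.2 * np.arange(30)) * np.cos(0.5 * np.arange(30)))[:720]

def fit_predictor(series, order=10):
    """Least-squares one-step predictor: indifferent to whether the
    samples index days of weather or milliseconds of ground motion."""
    X = np.column_stack(
        [series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# The same routine consumes both: formally, each is just a 1-D
# array that 'goes up and down'.
weather_model = fit_predictor(temperature)
seismic_model = fit_predictor(trace)
```

Nothing in the routine knows or cares which series is which; that indifference is precisely the form of appearance at issue.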
Now, the algorithm Robinson developed with this new principle of abstraction in mind would become central to the digitization of oil prospecting in the following decade—not only for ‘digital seismography’ but for industrial geophysics more broadly. Unsurprisingly, this convergence of computing and geophysics heavily shaped and was shaped by the Cold War. Here, I just want to note briefly the important yet understudied role played by Texas Instruments (TI). Today not well known beyond its high school calculators, TI was an important US semiconductor and electronics manufacturer through much of the mid-20th century. It was heavily involved in a number of high-profile federal government and military contracts, including seismic applications for the US government’s nuclear test detection program under Project Vela, computational and electronic equipment developed for rockets during the Space Race, and military seismic and radio equipment contracted with the Bureau of Aeronautics and the US Air Force from the Korean War to the Vietnam War.
Most important for my purposes is that TI began as the manufacturing department of Geophysical Service Incorporated (GSI), an oil prospecting company founded in the early 1930s. While corporate restructuring in the 1950s turned GSI into a subsidiary of TI, its prospecting wing maintained global leadership through the mid-20th century precisely by leveraging TI-manufactured computing and electronic equipment for oil exploration—at the center of which were the kinds of digital algorithms first pioneered at MIT.
TI’s integrated-circuit-equipped Digital Field System, for instance, originally developed as part of Project Vela in the early 1960s, was subsequently adapted by GSI engineers to be mounted on seismic recording trucks and capture digital seismic data in the field. The data would then be processed back at GSI’s offices on all-transistorized Texas Instruments Automatic Computers, or TIACs, where GSI-developed software packages applied the latest statistical techniques of data analysis, including deconvolution, to the problem of seismic interpretation. The corporate history of TI thus illustrates precisely this midcentury convergence of personnel, expertise, and hardware between oil prospecting and the emerging information and communication technology industry, all mediated by geophysical science—and again, what matters for me is this technical substrate for abstracting subsurface information into time series, such that, like weather data, it could be made legible to programming techniques developed on early modern computers.
Now, the co-development of oilfield technology and information technology of course does not end here: after the birth of digital seismography in the 1960s came the heavy IT investments of the 1970s among the oil majors, including acquisitions by the ‘venture wings’ of big companies like BP and Exxon, as investigated by Cyrus Mody (2023). At the same time, with respect to their own computing resources, as Lawyer, Bates, and Rice’s Geophysics in the Affairs of Mankind attests, through the 1980s the geophysical industry used more magnetic tape and digital computer power than any other industry (2001, p. 179); as Robinson himself confirms, the “lion’s share of worldwide scientific computer time” was dedicated to seismic data analysis (1985, p. 43).
To conclude with something I hope to elaborate in a subsequent post: despite their alienated appearance, abstractions of the earth and the techniques used to produce them have never ceased to imply particular socio-ecological arrangements and thus the specific conflicts—e.g., over oil, land, environment, development—that have emerged within them. All concepts, including scientific concepts, are ‘merely historical’ forms of being, ‘vanishing necessities’ in Marx’s terms. In turn, challenging the hegemony of ‘complexity’ and ‘system’ concepts means not only historicizing them but also rendering them more natural—that is to say, treating them as artifacts of our planetary being, our natural history. Put another way, scientific concepts do not originate idealistically, distilled in the head of some hero scientist, but in the earth’s irreparable failure to explain itself, the discovery of which, we should not forget, is so often accompanied by immense scales of suffering, planetary suffering, through the breakdowns of social, ecological, and technological mediations of thought and thing, form and content, subject and object.

