Lorraine Daston - When Science Went Modern

The history of science is punctuated by not one, not two, but three modernities: the first, in the seventeenth century, known as “the Scientific Revolution”; the second, circa 1800, often referred to as “the second Scientific Revolution”; and the third, in the first quarter of the twentieth century, when relativity theory and quantum mechanics not only overturned the achievements of Galileo and Newton but also challenged our deepest intuitions about space, time, and causation. 

Each of these moments transformed science, both as a body of knowledge and as a social and political force. The first modernity of the seventeenth century displaced the Earth from the center of the cosmos, showered Europeans with new discoveries, from new continents to new planets, created new forms of inquiry such as field observation and the laboratory experiment, added prediction to explanation as an ideal toward which science should strive, and unified the physics of heaven and earth in Newton's magisterial synthesis, which served as the inspiration for the political reformers and revolutionaries of the Enlightenment.

The second modernity of the early nineteenth century unified light, heat, electricity, magnetism, and gravitation into the single, fungible currency of energy, put that energy to work by creating the first science-based technologies to become gigantic industries (e.g., the manufacture of dyestuffs from coal tar derivatives), and turned science into a salaried profession, allying it with state power in every realm, from combating epidemics to waging wars. The third modernity, of the early twentieth century, toppled the certainties of Newton and Kant, inspired the avant-garde in the arts, and paved the way for what were probably the two most politically consequential inventions of the last hundred years: the mass media and the atomic bomb.

The aftershocks of all three of these earthquakes of modernity are still reverberating today: in heated debates, from Saudi Arabia to Sri Lanka to Senegal, about the significance of the Enlightenment for human rights and intellectual freedom; in the assessment of how science-driven technology and industrialization may have altered the climate of the entire planet; in anxious negotiations about nuclear disarmament and utopian visions of a global polity linked by the worldwide Net. No one denies the world-shaking and world-making significance of any of these three moments of scientific modernity.

Yet from the perspective of the scientists themselves, the experience of modernity coincides with none of these seismic episodes. The most unsettling shift in scientific self-understanding—about what science was and where it was going—began in the middle decades of the nineteenth century, reaching its climax circa 1900. It was around that time that scientists began to wonder uneasily whether scientific progress was compatible with scientific truth. If advances in knowledge were never-ending, could any scientific theory or empirical result count as real knowledge—true forever and always? Or was science, like the monarchies of Europe's anciens régimes and the boundaries of its states and principalities, doomed to perpetual revision and revolution?

By 1900, when the International Congress of Physics scheduled its inaugural meeting to coincide with the Exposition Universelle in Paris, these anxieties had become acute: The most spectacular recent scientific discoveries, such as x-rays and radioactivity, and theoretical advances, such as the challenges to Newtonian absolute space and the electromagnetic ether, were also experienced by the scientists themselves as dizzying symptoms of malaise—or even of violence. The American historian Henry Adams, writing about the state of science in 1903, reached for metaphors of anarchist terrorism: “The man of science must have been sleepy indeed who did not jump from his chair like a scared dog when, in 1898, Mme. Curie threw on his desk the metaphysical bomb she called radium.”[1] Scientific advances were hurtling forward with the speed and force of a locomotive—but no one knew its final destination, or even whether there was a destination. All one could do was hang on for dear life.[2]

The Great Acceleration
This was the moment when science went modern, when science became not only an active motor of what historian C. A. Bayly has called “the Great Acceleration of 1890–1914,”[3] but also its breathless subject, swept up like everyone and everything else in gale-force winds of change. For the scientists, the realization that progress might have its dark side had been germinating since the mid-nineteenth century, when they noticed with consternation that their publications were no longer read after a decade or so and that it had become necessary to revise university curricula and textbooks several times a generation. Last year’s scientific truths, they noted with alarm, were becoming obsolete almost as rapidly as last year’s fashion in millinery. By the 1890s, the pell-mell accumulation of novelties on both the theoretical and empirical fronts threatened to bury the scientists like an avalanche and to undermine the foundations of even the most stable sciences, astronomy and physics.

This was also the moment when, in response to this experience of modernity as acceleration en route to who-knew-where, scientists and, later, historians of science rethought the relationship of science to history in the broadest sense: not just the past, but also the present and future...


