9 March 2020

The Guardian: History as a giant data set: how analysing the past could help save the future

Turchin’s approach to history, which uses software to find patterns in massive amounts of historical data, has only become possible recently, thanks to the growth in cheap computing power and the development of large historical datasets. This “big data” approach is now becoming increasingly popular in historical disciplines. Tim Kohler, an archaeologist at Washington State University, believes we are living through “the glory days” of his field, because scholars can pool their research findings with unprecedented ease and extract real knowledge from them. In the future, Turchin believes, historical theories will be tested against large databases, and the ones that do not fit – many of them long-cherished – will be discarded. Our understanding of the past will converge on something approaching an objective truth. [...]

Complexity science had its origins in physics, in the study of the behaviour of elementary particles, but over the course of the past century it slowly spread to other disciplines. As late as the 1950s, few cell biologists would have conceded that cell division could be described mathematically; they assumed it was random. Now they take that fact for granted, and their mathematical models of cell division have led to better cancer treatments. In ecology, too, it is accepted that patterns exist in nature that can be described mathematically. Lemmings do not commit mass suicide, as Walt Disney would have had us believe, but they do go through predictable four-year boom-and-bust cycles driven by their interactions with predators, and possibly also with their own food supply. In 2008, the Nobel prize-winning physicist Murray Gell-Mann declared it was only a matter of time before laws of history would be found, too. It would not happen, however, until all those who study the past – historians, demographers, economists and others – realised that working in their specialist siloes, while necessary, was not sufficient. “We have neglected the crucial supplementary discipline of taking a crude look at the whole,” said Gell-Mann. [...]
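
To make concrete what “described mathematically” means for a boom-and-bust cycle, the short sketch below integrates the classic Lotka-Volterra predator-prey equations (dN/dt = aN − bNP, dP/dt = cNP − dP) with simple Euler steps. The starting populations and parameter values are arbitrary assumptions chosen to make the oscillations visible; this is an illustrative toy, not the specific model ecologists fit to lemming data.

# A minimal, illustrative predator-prey cycle: the classic Lotka-Volterra
# model, integrated with forward Euler steps. All numbers are invented for
# the illustration and are not fitted to any real lemming population.

def simulate(prey=10.0, predators=5.0, steps=5000, dt=0.01,
             prey_growth=1.0, predation=0.1,
             predator_efficiency=0.075, predator_death=0.5):
    """Integrate dN/dt = a*N - b*N*P and dP/dt = c*N*P - d*P."""
    history = []
    for step in range(steps):
        d_prey = prey_growth * prey - predation * prey * predators
        d_pred = predator_efficiency * prey * predators - predator_death * predators
        prey += d_prey * dt
        predators += d_pred * dt
        history.append((step * dt, prey, predators))
    return history

if __name__ == "__main__":
    # Print every 500th step: both populations rise and crash in repeating cycles.
    for t, n, p in simulate()[::500]:
        print(f"t={t:5.1f}  prey={n:7.2f}  predators={p:7.2f}")

Running it shows the familiar pattern: prey climb, predators follow, prey crash, predators decline, and the cycle repeats.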

“This argument gets it exactly wrong,” argues Turchin, who since the early 1990s has been a professor in the department of ecology and evolutionary biology at the University of Connecticut, and is now also affiliated with the Complexity Science Hub in Vienna. “It is because social systems are so complex that we need mathematical models.” Importantly, the resulting laws are probabilistic, not deterministic, meaning that they accommodate the element of chance. But this does not mean they are hollow: if a weather forecast tells you there is an 80% chance of rain, you pack your umbrella. Peter J Richerson, a leading scholar of cultural evolution at the University of California, Davis, says that historical patterns such as secular cycles do exist, and that Turchin has “the only sensible causal account” of them. (It is also, Richerson points out, the only such account for now; the field is young, and different theories may follow.) [...]
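
The distinction between probabilistic and deterministic laws can also be made concrete with a toy simulation. In the sketch below, an abstract “instability pressure” drifts upward with random shocks; a single run cannot say when the pressure will cross a crisis threshold, but many runs yield a statement of the form “roughly N% chance of a crisis within 50 years”, the same kind of claim as an 80% chance of rain. Every number here (drift, noise, threshold, horizon) is an invented assumption, and the model has no connection to Turchin’s actual work.

# A toy probabilistic "law": cumulative pressure with random shocks, and a
# Monte Carlo estimate of the chance that it crosses a crisis threshold
# within a fixed horizon. All parameters are arbitrary illustrations.

import random

def crosses_threshold(years=50, drift=0.02, noise=0.05, threshold=1.0, rng=None):
    """One run: does cumulative pressure exceed the threshold within the horizon?"""
    rng = rng or random.Random()
    pressure = 0.0
    for _ in range(years):
        pressure += drift + rng.gauss(0.0, noise)
        if pressure >= threshold:
            return True
    return False

if __name__ == "__main__":
    rng = random.Random(42)
    runs = 10_000
    hits = sum(crosses_threshold(rng=rng) for _ in range(runs))
    print(f"Estimated chance of a crisis within 50 years: {hits / runs:.0%}")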

Of course, our experience with the climate crisis suggests that even if we can predict the future the way we can forecast the weather, and come up with a set of preventative measures to stave off social collapse, that does not mean we will be able to muster the political will to act on such recommendations. But while it is true that human societies have in general been far better at rebuilding after disasters than at preventing them in the first place, there are exceptions. Turchin points to the US New Deal of the 1930s, which he sees as a time when American elites consented to share their growing wealth more equitably, in return for the implicit agreement that “the fundamentals of the political-economic system would not be challenged”. This pact, Turchin argues, enabled American society to exit a potentially revolutionary situation.
