An essay exploring technology’s evolution alongside humanity, first published on BBC Future
If you consider our place in the history of the Universe, it is easy to see humans as an insignificant temporal speck, flickering in an unspeakably vast cosmos.
One common analogy illustrates this by telling the story of our planet’s 4.7-billion-year history as if it were the 24 hours of a single day. If you assume that the Earth coalesced an instant after midnight, it took around four hours for the first life to appear: microscopic organisms clustered around hydrothermal vents beneath young oceans. It took five more hours for photosynthesis to begin – and until midday for the atmosphere to become rich in oxygen. By 18:00 we had sexual reproduction; at 22:00 the first ever footprints appeared on land, left by lobster-sized sort-of-centipedes; and by 23:00 the dinosaurs had arrived, only to exit 40 minutes later alongside three-quarters of Earth’s species in the planet’s fifth mass extinction.
Since then, the day’s remaining 20 minutes have seen the rise of the mammals, with something semi-human existing for about the last minute (three million years in real terms). Recorded history has lasted for the last tenth of a second, and the industrial revolution the last five thousandths of a second – by which point our analogy is fast becoming too microscopic to be useful.
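For readers who want to check the arithmetic, the analogy can be sketched in a few lines of Python. The 4.7-billion-year span and the event ages below are the essay's round figures, not precise datings:

```python
# A minimal sketch of the 'Earth's history as 24 hours' analogy.
# Figures are the essay's illustrative round numbers, not precise datings.
EARTH_AGE_YEARS = 4.7e9
DAY_SECONDS = 24 * 60 * 60

def clock_time(years_ago):
    """Map 'years before the present' onto the 24-hour clock analogy."""
    elapsed = (EARTH_AGE_YEARS - years_ago) / EARTH_AGE_YEARS * DAY_SECONDS
    h, rem = divmod(int(elapsed), 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

print(clock_time(3e6))   # something semi-human, ~3 million years ago
print(clock_time(5000))  # recorded history, ~5,000 years ago
```

Three million years maps to roughly the final minute of the day, and five thousand years of recorded history to its final second, as described above.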
So far so humbling. Looked at another way, however, this exercise emphasises something else. Life on Earth took a long time to get going, and even longer to build civilisations – yet once it did, the results have been remarkable.
Even in the context of several billion years of history, the last few human centuries have been astonishing. Our species has not only reshaped its planet’s biosphere, but is in the middle of engendering changes to its terrain, oceans and climate on a scale only asteroid impacts or centuries of apocalyptic volcanic eruptions previously equalled. The consequences of these changes will be measured in aeons. We have introduced something exponential into the equations of planetary time – and that something is technology.
We often think about technology as the latest innovation: the smartphone, the 3D printer, the VR headset. It’s only by taking a longer view, however, that we can understand its entwining with our species’ existence. For technology is more than computers, cars or gadgets. It is the entirety of the human-made artefacts that extend and amplify our grasp of the world. As the philosopher Hannah Arendt put it in 1958, we have in recent centuries developed a science ‘that considers the nature of the earth from the viewpoint of the universe’. Yet in doing so we have paradoxically trained ourselves to ignore the most important lesson of all: our co-evolution with technology.
What was the first human tool? We can’t be sure – but we do know that, from around two-and-a-half million years ago, our distant ancestors began to use found objects in a deliberate manner: hard or sharp stones, for breaking open shells or protection; sticks for reaching distant food; plants or animal parts for shelter or camouflage.
In this, and in their initial crafting and improvement of these objects, our ancestors weren’t so different from several other groups of animals. Plenty of creatures can communicate richly, comprehend one another’s intentions and put tools to intelligent and creative use: cetaceans, cephalopods, corvids. Some can even develop and pass on particular local practices: New Caledonian crows, for example, exhibit a “culture” of tool usage, creating distinct varieties of simple hooked tools from plants in order to help them feed.
Only humans, however, have turned this craft into something unprecedented: a cumulative process of experiment and recombination that, over mere hundreds of thousands of years, harnessed phenomena such as fire to cook food and ultimately smelt metal; turned gravity into systems of levers, ramps, pulleys, wheels and counterweights; and turned mental processes into symbolic art, numeracy and literacy.
It is this, above all, that marks humanity’s departure from the rest of life on Earth. Alone among species (at least until the crows have put in a million years more effort) humans can consciously improve and combine their creations over time – and in turn extend the boundaries of consciousness. It is through this process of recursive iteration that tools became technologies; and technology a world-altering force.
The economist W Brian Arthur is one of the most significant thinkers to have advanced this combinatorial account of technology, especially in his 2009 book The Nature of Technology. Central to Arthur’s argument is the insight that it’s not only pointless but also actively misleading to do what most history books cannot resist, and treat the history of technology as a greatest-hits list of influential inventions: to tell stirring tales of the impact of the compass, the clock, the printing press, the lightbulb, the iPhone.
This is not because such inventions weren’t hugely important, but because it obscures the fact that all new technologies are at root a combination of older technologies – and that this in turn traces an evolutionary process resembling life itself.
Consider the printing press, the invariable poster-child for anyone wanting to offer a historical-ish perspective on the dissemination of information. The German inventor Johannes Gutenberg was, famously, the first European to develop a system for printing with movable type, in around 1440. Yet he was far from the first person to realise that using individual, movable components for each character in a sentence was a good way to speed up printing (as opposed to laboriously carving every page of text onto wood or metal).
Printing using individual porcelain characters had been developed in China in the 11th Century, and using metal characters in Korea in the 13th Century. Gutenberg benefited, however, from the far smaller number of letters needed to print German; from his knowledge of metal-smelting as a blacksmith and goldsmith, which helped him create a malleable-yet-durable alloy of lead, tin and antimony; and from his insight that the kind of wooden presses used for centuries in Germany to make wine could be repurposed for pressing type against paper (itself a technology developed in China 1,500 years previously).
Wooden wine-presses, metal alloys, the Roman alphabet, oil-based ink, paper – every piece of the puzzle assembled by Gutenberg and his collaborators was based in a pre-existing technology whose origin could itself be traced back through previous technologies, in unbroken sequence, to the very first tools.
In a sense this is self-evident. It is, after all, only possible to build something out of components that exist – and these components must, in turn, have been assembled from other pre-existing components, and those from others that came before, and so on. Equally self-evidently, this accumulative combination is not by itself sufficient to explain technology’s evolution. Another force is required to drive it, and it’s similar to the one driving biological evolution itself: fitness as manifested through successful reproduction.
In the case of biological evolution, this process is based upon the transmission of genetic code from parents to offspring. The genetic code of successful organisms is passed on, while less successful organisms fall by the wayside. Genetic mutations produce incremental variations in species, some of which may prove favourable, while mechanisms such as sexual reproduction combine the genes of different individuals and potentially produce further advantages. Genes can be recombined by other routes, too: some micro-organisms, such as bacteria, have adapted to exist entirely inside larger cells, symbiotically conferring benefits upon their hosts.
In the case of technology, the business of survival and reproduction is symbiotic in a more fundamental way. This is because technology’s transmission has two distinct requirements: the ongoing existence of a species capable of manufacturing it, and networks of supply and maintenance capable of serving technology’s own evolving needs.
Humans’ fundamental needs are obvious enough – survival and reproduction, based upon adequate food, water, shelter and security – but in what sense can technology be said to have needs of its own? The answer lies all around us, in the immense interlinked ecology of the human-made world. Our creations require power, fuel, raw materials; globe-spanning networks of information, trade and transportation; the creation and maintenance of accrued layers of components that, precisely because they cannot reproduce or repair themselves, bring with them a list of needs far outstripping anything natural.
Consider the printing press one more time. Wine presses, smelted metal, paper, ink: the moment was ripe for a new technology to combine these and other elements. And it was ripe partly because sufficient interconnections of manufacture and supply existed to make combination feasible – and scalable. The paper Gutenberg used to print his bible was imported from the paper-making centre of Caselle in Piedmont, now a part of Northern Italy; its delivery entailed transfer across the Alps by ox cart, then by barge along the waterways of the Rhine. Caselle’s expertise had in turn been learned from southern Italy, which had acquired it from Spain and North Africa, whose Muslim rulers had first brought knowledge of paper-making along the silk road from China.
In its separateness from and yet reliance upon biological life, technology is uniquely powerful and uniquely needy. It embodies an ever-expanding network of dependencies, and in this sense it invents many more needs than it serves – with both its requirements and its capacities growing at an exponential rate compared to our own.
Let’s go back to the history of our planet. Rather than being guided by clocks, however, let’s this time focus on events – and in particular the incremental accruals of technological evolution.
Time in the human sense doesn’t mean much when it comes to technology, because – unlike something living – a tool doesn’t struggle to survive or to pass on its pattern. Rather, it’s the increments of design, manufacture, refinement and combination that mark development. In this sense, no time whatsoever passes for a technology if there is no development. If a human population uses thousands upon thousands of identical farming tools in an identical way for thousands of years, that technology is frozen in stasis. To use an ancient tool is to enact a kind of time travel.
From this perspective, most of our planet’s history saw no technological time passing whatsoever. Four billion years were less than the blink of an eye – while the last few centuries loom larger than all the rest of history.
There’s a simple mathematical way of thinking about this. When it comes to combining things, increasing the number of components you’re working with vastly increases the number of potential combinations. Three modules can be arranged in six different ways, assuming each module is used exactly once; four modules can be arranged in 24 different ways; and by the time you reach 10 modules, there are over three-and-a-half million possible arrangements. What this means is that, thanks to the fertile recombination of ever-more technological possibilities, time and evolution are steadily speeding up from the perspective of our creations. And the rate at which they’re speeding up is itself increasing.
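As a quick sketch, assuming each module is used exactly once and that the order of combination matters, the counts above are simply factorials:

```python
from math import factorial

# Distinct orderings of n modules, each used exactly once: n!
for n in (3, 4, 10):
    print(n, "modules:", factorial(n), "arrangements")
```

Ten modules yield 3,628,800 arrangements – the "over three-and-a-half million" figure above – and each additional module multiplies the total again, which is why the possibilities compound so quickly.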
This has perhaps been most familiarly stated in the form of Moore’s law, which originated in a 1965 paper by Gordon Moore observing that the number of transistors on a chip was doubling every year – a rate he revised a decade later to a doubling every two years. This pattern of increasing complexity – and thus performance – has lasted for over half a century, and proved applicable to much more than microchips. Despite recent suggestions that transistor size and density are approaching their limits, the cost of computing performance itself continues to follow Moore’s exponential curve – as does the sheer volume of interlinked computers in the world.
This is the point where what Arendt termed ‘the onslaught of speed’ starts to do strange things to time. Among the implications of Moore’s law, some thinkers have reasoned, is that the next two years are likely to see as much progress in computing terms as the entire history of technology from the beginning of time to the present – something that’s also likely to be true for the next two years, and the next.
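That startling claim is simply the arithmetic of doubling. A minimal sketch, assuming capability starts at one unit and doubles each period:

```python
def capability(n):
    """Capability after n doubling periods, starting from 1 unit."""
    return 2 ** n

for n in range(1, 30):
    latest_gain = capability(n) - capability(n - 1)       # progress this period
    all_prior_gain = capability(n - 1) - capability(0)    # all progress before it
    # Under pure doubling, each new period's progress equals everything
    # achieved in the entire history before it, plus the starting unit.
    assert latest_gain == all_prior_gain + 1
```

The identity holds for every period: since 2ⁿ − 2ⁿ⁻¹ = 2ⁿ⁻¹, the latest doubling always contributes as much as all previous doublings combined.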
And if this kind of analysis feels overfamiliar – or overstated – we can recapture its shock by putting things slightly differently. From the perspective of technology, humans have been getting exponentially slower every year for the last half-century. In the realm of software, there is more and more time available for adaptation and improvement – while, outside it, every human second takes longer and longer to creep past. We – evolved creatures of flesh and blood – are out of joint with our times in the most fundamental of senses.
All of which takes us towards one of the defining myths of our digital age, that of the Singularity: a hypothesised point of no return beyond which, it’s argued, technology’s evolution tips into self-design and self-improvement, cutting humanity permanently out of the loop.
Is any of this likely? My belief is that, like most myths, the least interesting thing we can do with this story is take it literally. Instead, its force lies in the expression of a truth we are already living: the fact that clock and calendar time have less and less relevance to the events that matter in our world. The present influence of our technology upon the planet is almost obscenely consequential – and what’s potentially tragic is the scale of the mismatch between the impact of our creations and our capacity to control them.
This brings us to the biggest question of all. Can we deflect the path of technology’s needs towards something like our own long-term interest, not to mention that of most other life on this planet? Not, I would argue, if we surrender to the seduction of thinking ourselves impotent or inconsequential – or technology’s future as a single, predetermined course.
Like our creations, we are minute in individual terms – yet of vast consequence collectively. It took the Earth 4.7 billion years to produce a human population of one billion; another 120 years to produce two billion; then less than a century to reach the seven-and-a-half billion humans currently alive, contemplating their future with all the tools of reason, wishfulness, knowledge and delusion that evolution and innovation have bequeathed.
This is what existence looks like at the sharp end of 4.7 billion years. We have less time than ever before – and more that we can accomplish.