Technology is often seen as the engine of social change. But this ignores the cultural forces and changes that enable technological shifts, as well as the fact that technology is often used to preserve the status quo, rather than usher in change, argues Lelia Green.
We have just experienced one of the most significant social upheavals in living memory. The COVID-19 pandemic made us review everything we took for granted: personal freedoms, community engagement, work, leisure, shopping, travel. It struck at the heart of consumer society, and it demanded to be taken seriously. And we changed: quickly, and dramatically.
In the rich countries of the global north and its wealthy, educated, privileged outcrops, we have the luxury of looking back on the worst of the pandemic. There’s a sense that normal, pre-COVID life is resuming, but the truth is we all experienced a huge cultural shift. Our societies changed, culturally, with unexpected speed. Some would argue that it was technology that enabled the cultural shift that COVID required, as well as our slow return to “normal life”. From the means of detection and diagnosis of the disease, through treatment options and on to vaccines, the north’s technological advantages and its capacity for production at scale were crucial. Digital connectivity created, at lightning speed, new ways for people to connect in virtual groups; contactless shopping became a thing and didn’t blunt the desire to consume. But is technology really the driver that makes it all happen? With over thirty years as a student of the interplay of technology and society to draw on, I say: ‘no’. Technology is only ever a second-order factor; culture is always the key to change.
Making a technologically determinist statement, such as ‘Computers have changed the world’ misses the point that it was cultural forces at work in the world that resulted in computers.
Back in the 1980s, like other researchers in those days, I had access to what we thought of as a state-of-the-art personal computer. The story of the PC can be seen as the paradigmatic case of a piece of technology changing culture forever. Alan Michael Sugar’s electronics trading company, Amstrad, which he set up in 1968, is usually pointed to as a key part of the story of how the computer became a consumer good. This narrative ties in with the myth of the brilliant visionary who changed the world by imagining an industrial machine as having domestic uses. But beginning the story at the domestication of computing sidesteps the entire cultural zeitgeist that underpinned the information revolution. It ignores the shift in social and economic priorities which accompanied the huge investments in technology during and after the Second World War.
The political concerns of the late 1940s and 50s directly funded a Cold War arsenal and a space race between the USA and the Soviet Union. Together, these powers had defeated Nazi Germany and Imperial Japan, but in doing so they had become suspicious of each other. Their cultural anxieties gave rise to an unprecedented scale of technological investment. With growing experience of computing, and with knowledge of game-changing wartime interventions such as Alan Turing’s leadership in cracking the Enigma code, the US Department of Defense supported experiments to get huge, stand-alone computers ‘talking’ to each other.
Within its first decade, the United States’ Advanced Research Projects Agency (ARPA) had nurtured the social networks and the research priorities of the scientists who developed the prototype Internet. That was about the time that Alan Sugar, at 21, was setting up his consumer electronics company which, less than two decades later, produced the machine I took with me when I emigrated to Australia. Making a technologically determinist statement, such as ‘Computers have changed the world’, misses the point that it was cultural forces at work in the world that resulted in computers.
It is culture that drives technological change and innovation. As societies, we get the technologies that our culture determines.
The same is true of the rise of England’s cities which powered the start of the industrial revolution, following the dislocation of people from the land. It’s possible to construe the ‘Enclosure Acts’ as the greed of landowners intent on profiteering, but even the concept of ‘profit’ was comparatively novel at that time, with stock markets in their infancy and newspapers barely distinguishable from pamphleteering. Arguably, the growth of the city started with the failure of poor laws that kept people anchored to the parish of their birth. This social contract, where loyalty to a rural community meant cradle-to-grave security, broke down at the start of the sixteenth century, buckling under an estimated 40% explosion in population between 1485 and 1545. The fracture of the bond between villagers and their village sounded the death knell of feudalism, fanning the embers which led to the Enlightenment. These cultural shifts underpinned the social and political changes which were to follow, including the Agrarian and Industrial revolutions, and the establishment of Empire. There are technological dimensions to these cataclysms, but the technologies that resulted are the visible expression of shifts in culture.
It is culture that drives technological change and innovation. As societies, we get the technologies that our culture – specifically the elite groups we support in our culture – determines. When I first began my work in this area, prior to the integration of the world wide web into every facet of the global north, I identified the three key drivers of technological change in terms of A, B and C.
A is for the Armed Forces: huge drivers of innovation, supported by taxpayers in democracies and by despots in totalitarian systems. (Or vice versa: often it is the army that supports the despot.) B is for Bureaucracy: the naming, numbering, assessing and sorting that starts soon after conception and ends long after death, impacting every facet of activity in between. I count the higher education and research sector as part of bureaucracy, since it is funded through the public purse and grades and assesses those who pass through it. C is for Corporate Power, those same engines of profit and advantage that developed the vaccines, masks and lateral flow tests we continue to consume post-peak pandemic.
As the internet became pervasive at the cusp of the millennium, I was able to identify additional cultural change-makers. Change was not, initially, in gender relations: the 1990s web was a famously misogynistic space, as was the Geek culture it spawned. But that culture was to power the D in my alphabet: D is for the Distributed Collective. I’m not talking about the Titans of Silicon Valley here: they’re occupied with delivering the tech agendas of the As, Bs and Cs. I’m referring instead to the hive. Its members leave their day jobs — teacher, bus driver, shelf stacker — and return home for an invisible second shift: updating wikis, moderating chat rooms, refining open-source software. The Distributed Collective keeps the non-commercial internet functioning. It also allows for the crowdfunding of technology, uncoupling some technical advances from the military-industrial complex.
And once Myspace and Facebook got going, I could include an E: E is for Everyday Users as drivers of technological adoption and dissemination. These are the friends, relatives and thought leaders who encourage us to sign up for ‘just one more’ platform. The Ds and Es in my alphabet are often motivated to prove themselves against a technological challenge, winning respect and the ‘ego boo’ of others’ regard in the process. These are socio-emotional and cultural forces.
Transhumanist technology isn’t a recipe for changing society, it’s the recipe for the status quo.
But maybe there’s a different kind of hope on the horizon, one that also addresses our cultural (and very personal) fear of death. In the same way that I have lived through the dawn of modern computing, I have experienced, at a distance, the world’s first heart transplant, the first ‘test tube baby’, the cloning of Dolly the sheep and the sequencing of the human genome. If Yuval Noah Harari’s vision of transhumanist tech translating itself into Homo Deus becomes anything approaching reality, it will not be a game-changer: it will be more of the same. The same rich, educated, privileged elites will power and then use transhumanist technologies to keep themselves living and functioning for as well and for as long as possible. Transhumanist technology isn’t a recipe for changing society, it’s the recipe for the status quo.
My decades of exploring the interactions between society and technology lead me to conclude that technology is an outcome of social and cultural forces; power plays and imbalances that we all ignore or collude with to one extent or another. And while it is true that a technology such as the internet allows the emergence of social formations that wouldn’t be possible without it, those changes are the products of human decisions. The implications of this are both challenging and liberating since, to a large extent, our technologies are a visible expression of the out-workings of cultural dynamics.
We cannot rely on technology, or on governments, to change our future. We need to change our culture, and we are the engines of culture change.
The challenging first. COVID-19 was but the merest blip on the horizon compared to the cataclysm which is the global climate emergency, gathering force with every extreme-weather news event. We cannot rely on green hydrogen, or any other technological innovation, to save us. A technological fix is an excuse to continue business as usual: the game doesn’t change. That’s not to say we shouldn’t invest in climate-mitigating technologies, locally, nationally and internationally: we should. It’s simply to acknowledge that no technological fix is going to change the future we have constructed for ourselves. Unless there is a cultural shift, far greater than the one brought into being with COVID, we will continue our self-defeating love affair with technology and consumption.
The liberating implication of the social construction of technology is the reverse of this. We cannot rely on technology, or on governments, to change our future. We need to change our culture, and we are the engines of culture change. The ways in which we act and interact socially constitute culture, and our recent experience with COVID shows this is fundamentally malleable. We can do things differently. We need to do things differently. And if we don’t, technology can’t save us.
David Simpson 17 August 2022
Yes, but . . . it’s not either/or, it’s both/and – a dance between the two, as it has been since the stone axe and fire. The Black Death (1347) broke feudalism, and arguably made way for the modern era. It wasn’t “cultural”. The printing press (c. 1450) enabled literacy, the Reformation and, arguably, mass culture, as well as the dissemination of scientific ideas (e.g. Galileo’s discoveries were published in the Low Countries while he was under house arrest by the Catholic Church in Florence).
If we are (cultural) sheep we will let Meta et al control us. If we are angry and frustrated and convinced there’s a better option, we’ll use the technology to re-make society.