South African Journal of Science
On-line version ISSN 1996-7489
Print version ISSN 0038-2353
S. Afr. j. sci. vol.119 n.1-2 Pretoria Jan./Feb. 2023
http://dx.doi.org/10.17159/sajs.2023/12916
RESEARCH ARTICLE
Why there is no technological revolution, let alone a 'Fourth Industrial Revolution'
Ian Moll
Wits School of Education, University of the Witwatersrand, Johannesburg, South Africa
ABSTRACT
We are told by the powerful that we live in, or are about to live in, a Fourth Industrial Revolution (4IR). Seemingly, this revolution is about deep-seated, rapid, digitally powered techno-scientific change. It is the age of smart machines; it is a new information technology (IT) revolution. However, in this article I suggest that examination of the history of technologies that are often held up to be proof of the 4IR in fact shows that there is no contemporary technological revolution. The research methodology that I employ here is conceptual analysis and a focused review of literature on the history of particular technologies. An industrial revolution, as its three historical instances have demonstrated, is the fundamental transformation of every aspect of industrial society, including its geopolitical, cultural, macro-social, micro-social, economic and technological strata. It certainly entails a technological revolution, but it is more than just that. In this article, I am not concerned with the broader ensemble of socio-economic changes - it seems increasingly clear that the 'brave new world' of the 4IR is not really happening - but simply ask the question: is there currently a technological revolution? The answer seems to be that there is not.
SIGNIFICANCE: The significance of this study is that it challenges the mainstream notion of technological innovation and change associated with the 'Fourth Industrial Revolution'. It has implications for the way we think about technological and scientific revolutions.
Keywords: digital, industrial revolution, technological revolution, information technology
Science, technology and the alleged 4IR
Talk of a 'Fourth Industrial Revolution' (4IR) is around us all the time. It seems to be about the way things just are. It is supposedly a full-scale human and social revolution, in which radical, fast-paced convergences of scientific and technological innovation in networked information technology (IT) dominate and transform every aspect of our lives. For a scientist, what sits underneath this is often an assumption that scientific progress is bound up with the computing power of ITs in scientific research methods and outputs. Indeed, there is hardly a science that does not employ the information processing capabilities of computers to do its work. And all sciences have in the last 30 years or so (at least) made dramatic advances in their scientific and technological knowledge bases on the basis of that computing power. So scientists quite easily accept that we live in an age of technological revolution. When someone like Klaus Schwab of the World Economic Forum tells us that "we are at the beginning of a revolution that is fundamentally changing the way we live, work, and relate to one another. In its scale, scope and complexity, what I consider to be the fourth industrial revolution is unlike anything humankind has experienced before" (my emphasis)1(p.7), then scientists might unthinkingly go along with that too.
However, scientists are not unthinking beings. So, let us pose the question: is contemporary IT innovation and convergence really revolutionising our technology, and our research problematics, in the way that we are told? In what follows, I suggest that this is not the case. The evidence that I will adduce in support of this claim comes from examination of the history and nature of technologies that are often put forward as proof of the dramatic arrival of a 4IR. I do not suggest that digital computing is not necessary, important, or even amazing in scientific research. I take that for granted. What I do want to question is the way we use the word 'revolution' to describe and understand our activities. On that score, I argue that we have not witnessed a 'grand', overall technological revolution in recent times. It is important that scientists and technologists understand this.
Some background
The annual World Economic Forum meeting in Davos is often described as the gathering of the world's economic elites. Corporate heavyweights, heads of state, cutting-edge scientists, global intellectuals and their entourages gather to discuss 'the next big thing' in the exercise of global power. In 2016, Schwab famously introduced the world to the notion of the 4IR:
We have yet to grasp fully the speed and breadth of this new revolution. Consider the unlimited possibilities of having billions of people connected by mobile devices, giving rise to unprecedented processing power, storage capabilities and knowledge access. Or think about the staggering confluence of emerging technology breakthroughs, covering wide-ranging fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3D printing, nanotechnology, biotechnology, materials science, energy storage and quantum computing, to name a few. Many of these innovations are in their infancy, but they are already reaching an inflection point in their development as they build on and amplify each other in a fusion of technologies across the physical, digital and biological worlds.1(p.89)
He emphasised what he proclaimed to be the unprecedented speed, size and scope of the '4IR', in relation to previous industrial revolutions. The velocity of change, he suggested, is exponential rather than linear; the combining of multiple technologies is broader and deeper than ever before; and the systems impact is now total, across the whole of society and the world economy.1
However, Schwab's claims are contentious, in relation to technology per se, and in relation to paradigmatic scientific revolutions that might be said to undergird technological change. Notably, Jeremy Rifkin, advisor to the European Union, the Chinese government, and the then German Chancellor, Angela Merkel, challenged Schwab immediately. Rifkin had been a prominent writer for more than 25 years on the digital technology revolution that commenced in the 1960s (the Third Industrial Revolution, or 3IR) and on possible future industrial revolutions.2 For reasons that will become obvious, he was not on the list of invitees to Davos in 2016. Rifkin pushed back against the claim that the fusion of technologies between the physical, digital and biological worlds is somehow qualitatively a new phenomenon. He argued that the very nature of digital technology is that it reduces communication "to pure information" organised in networks that work like complex, interactive ecosystems:
[It] is the interconnected nature of digitalization technology that allows us to penetrate borders and 'blur the lines between the physical, digital, and biological spheres'. Digitalization's modus operandi is 'interconnectivity and network building.' That's what digitalization has been doing, with increasing sophistication, for several decades. This is what defines the very architecture of the Third Industrial Revolution.3
Rifkin went further, rejecting Schwab's argument that an overall rapid increase in the velocity, scope and systems impact of new technologies implies a 4IR. He showed that it is the intrinsic interconnectedness of networked information technologies themselves, and the continuous, exponential decrease in digital technology costs, that produce changes in "velocity, scope, and systems impact", and that this had been going on for more than 30 years. For Schwab to cast this as a "new revolution" was, in Rifkin's view, a misconception. Rifkin does not think that contemporary technology innovation in the networked information society constitutes a 4IR.
The ensuing debate has been prominent across society. The pivotal texts of Schwab (the leading global 4IR advocate)1 and Tshilidzi Marwala (the leading South African 4IR advocate)4 are replete with what they claim to be evidence of acute, rapid, systemic technological development - that is, a contemporary technological revolution. In their wake, numerous publications propagate the notion of a 4IR in technological terms. Books by Skilton and Hovsepian5, and by Marr6, are significant in this regard. There are, unfortunately, also a number of fanciful, rhetorical, science-fiction-like evocations of a contemporary digital revolution that undermine their own cases, such as Kurzweil's Singularity7, and that by Diamandis and Kotler, which takes us on a "wild ride" (their own words) through "turbo-boosted" technological change in "swarms" of "tsunami-sized behemoths" that confront us with a "blitzkrieg" of technologies8(p.xi,8,10,117). Significant works that question the plausibility of a 4IR are those by Daub9, Edgerton10 and Morgan11, and my own occasional paper seeks to make a contribution in this regard12.
Now it is not my intention here to tackle this debate in its broader context. I have argued elsewhere that this context is not so much about science and technology per se, but about the political and ideological intervention that Schwab sought to achieve at Davos 2016.13,14 If I may be permitted an indulgence, I would say that it is less about machines than about political machinations. However, both sets of protagonists accept that we live in an era of rapidly evolving technological innovation and change. My argument here is that, purely at the level of technology, there does not seem to be a case that there is such a phenomenon as a 4IR.
3IR or 4IR technologies?
Over the past few years, I have been researching the general beliefs that people have about the '4IR' and technology.12(p.30-32) Amongst other procedures, I used the search term "fourth industrial revolution" on both the Google and Google Scholar search engines, to discover what people in general, and academics in particular, believed about technology in the '4IR'. I noted down every single 'technology of the 4IR' declared by some or other commentator, academic and non-academic alike, until the data were clearly saturated. In the process, I surveyed over 320 websites and scanned some 150 digitised journal articles. I should note here that a basic coding procedure for the analysis of qualitative data revealed no differences between the standard utterances of the online public at large, academic writers in general, and science and technology researchers. The technologies described by each of these groups of people were pretty much the same.
In July 2020, when I conducted a first survey, the terms internet of things, machine learning, robotics, artificial intelligence, big data, and automation were amongst the most frequently mentioned 'technologies'. By August 2021, when I repeated the survey, the terms blockchain and cyber-physical systems had joined the list of the most 'popular' terms. Automation had waned somewhat. The reason for the increasing mentions of blockchain appears to be the growing trade in the cryptocurrency Bitcoin. The prominence of the notion of cyber-physical systems does not seem to be so easily explained by the contexts of its use on the Web. Automation is (perhaps a euphemism for) the displacement of humans by robots in the workplace, and so as people come to understand it more, it is subsumed under 'robotics'.
It turns out that the way people use and understand all of these concepts as 'technologies of the 4IR' can be misleading. None of them is a groundbreaking invention of contemporary times. All of them were, and are, gradual evolutions of technology rooted in the defining technological transformations of the 3IR. I shall now make my case by examining each of them in depth.
Artificial intelligence
Artificial intelligence, or AI, is a field of knowledge and research that originated in the 1950s and that seeks to conceive, and sometimes to build, artificial animals, including humans. It is somewhat disingenuous to describe AI as 'a technology'. AI brings together such disciplines as cognitive science, philosophy, cognitive psychology, neuroscience, computer science, and information engineering. Among its central questions are, "Can a machine think?" and "Can a machine act like a human being?" In seeking to answer them, AI hypothesises a functional equivalency between human cognition and a computer program. It tries to understand the nature and limits of this putative identity between a machine on the one hand and a primate's intelligence and action on the other. For example, AI researchers investigate whether information processing in a person and a machine are governed by the same kinds of rules in accessing, storing and retrieving information in memory processes. Or they ask if the 'cognitive' schemas that produce action in machines and humans can be understood to be equivalent. Often, AI researchers either build actual machines (such as robots) or write symbol-processing algorithms - there is a debate in the field about the extent to which either, or both, is necessary - to help them find answers to their research questions.
However, it is not the technology as such that interests AI researchers, but rather the 'virtual machine' that runs inside it. A 'piece' of AI is the mental model of an information-processing system that a programmer has in mind when writing a program that could run inside a machine.15(p.4) AI is not technology per se, but some of the knowledge it produces is continually applied in various technology fields, including software engineering, computer design and - most notably - machine intelligence. It is very much of the 3IR, having commenced with the advent of modern high-speed digital computers in the 1950s.16,17 To recognise that AI is a scientific field that has progressed rapidly in the past three decades does not warrant the claim of an AI 'revolution' or scientific paradigm shift in contemporary times.
Robotics
Robotics is the development of computerised machines that replicate human action. It has scientific and technological roots far back in the 3IR. As regards automation, the first digitally programmed industrial robot started work in a Connecticut foundry in 1961. In 1969, the Stanford Arm, a six-axis articulated robot, was invented; it was able to follow arbitrary paths in space, widening the potential use of robots in industry. 1974 saw the world's first electric, microcomputer-controlled industrial robot installed in a Swedish factory. IRB6, as it was known, carried out welding, grinding and polishing functions in steel pipe production.
It must be emphasised that the vast majority of industrial robots are 'unintelligent', fixed machines that carry out rudimentary manufacturing functions, such as welding or screwing on certain parts of a car or household appliance, on assembly lines. By the new millennium, some 750 000 were deployed globally, mainly in motor car and electronics factories. By 2022, there were over 3 million manufacturing robots, with just over 1 million units in China and some 412 000 in Japan.18
At the other end of the spectrum, there are relatively few 'humanoid' robots, mostly found in research contexts rather than the workplace. WABOT-1, the first anthropomorphic robot, appeared in Japan in 1973. Its technological focus was mainly on a bipedal limb control system enabling it to walk. It was also fitted with sensors and actuators to measure distances to objects and grip and move them, and was able to recognise basic spoken commands.19 Freddy I (1969-1971) and Freddy II (1973-1976) were Scottish experimental robots using an object-level robot programming language, allowing them to handle variations in object position, shape, and sensor noise. They both used video cameras and bump sensors to recognise objects, and Freddy II was augmented with a large vertical 'hand' that could grip objects once recognised. By 2020, ASIMO, the robot widely regarded as the world's most advanced humanoid, could walk, hop, run, jump, serve food and drinks, recognise moving objects, and respond to human gestures. However, it also uses sensor, actuator, bipedal and language processing technologies with a lineage straight back to WABOT-1, along with machine learning capabilities that have a similar technological 'ancestry'.20
Up to the late 1970s, robots were tediously hand-programmed for every task they performed. By then, the burning research questions of robotics required machine learning technology to inaugurate robots that could learn. The merging of these two technological spheres in the 1980s was inherent in the long-evolving technologies of the 3IR.
Machine learning
Computers process information, perhaps in the same way that cognitive scientists think that a human being does. So machine learning refers to the ability of computers to process digital information and act automatically on the basis of it, without explicit programming. The idea is that a computer can learn 'from experience', and improve its information-processing ability over time in autonomous fashion, by running algorithms to access and process data. The most 'intelligent' computers can be fed data, access it themselves, and 'experience' it via sophisticated sensors. Deep learning, an evolution of machine learning, creates an 'artificial neural network' that can learn and make basic decisions on its own.
These developments have a history deeply rooted in the 3IR. The term 'machine learning' was coined by Samuel, who invented a computer program to play draughts in the 1950s. In 1957, Rosenblatt combined Hebb's psychological model of brain cell interaction with Samuel's program to create Perceptron, which was the first artificial neural network able to learn patterns and shapes. In 1959, Widrow and Hoff created such a program to detect binary patterns. Let us also not forget that, in 1997, the IBM computer 'Deep Blue' beat the world chess champion.
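To make the idea of 'learning from experience' concrete, the following is a minimal sketch of a single threshold unit that adjusts its weights from labelled examples rather than being explicitly programmed with a decision rule. It is written in the spirit of early perceptron-style learners; the function name, parameters and toy task are illustrative assumptions, not a reconstruction of Rosenblatt's Perceptron.

```python
# A minimal sketch of 'learning from experience': a single threshold unit
# adjusts its weights from labelled examples instead of being explicitly
# programmed with a decision rule. All names here are illustrative.

def train_threshold_unit(examples, epochs=20, lr=0.1):
    """examples: list of (inputs, target) pairs, where target is 0 or 1."""
    n = len(examples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            prediction = 1 if activation > 0 else 0
            error = target - prediction          # 'experience' drives the update
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Learn the logical AND pattern from data alone.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_threshold_unit(data)
print(w, b)  # weights that separate the pattern, discovered rather than coded
```

The point of the sketch is simply that the decision rule emerges from repeated exposure to data, which is the sense of 'learning' invoked above.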
An explosion in machine learning research and development took place in the 1980s, on the basis of research programmes that had started in the previous decade, like that of Marvin Minsky at MIT. The interest in neural network research at that time was not accidental. Advances in 'very-large-scale' computing enabled scientists to build machines with thousands of processors that could distribute computation over a large number of processing units running in parallel. Artificial neural networks provided the theory that underpinned these developments.
At this time, the confluence of robotics and machine learning started to take shape. Not all machine learning is about robots. However, there was increasing demand by the 1980s for robots capable of doing things like identifying parts from a random selection, or maintaining 'positional accuracy' when objects shift about on assembly lines.
Benjamin Kuipers recollects that the serious questions of machine intelligence became: "How can a robot learn a cognitive map from its own experience of the environment?" and "How can an agent learn, not just new knowledge within an existing ontology, but a new ontology it does not already possess?"21(p.243,261). Intellectually, this period was a high point in 3IR machine learning. Academics were consumed in debate about 'machine vision' in robots, in which sensors (cameras, lasers, lidar, radar, etc.) detect and categorise aspects of their environment. In industry, machine learning algorithms in robots enabled 2D and 3D 'object learning'. These robots made and acted on predictions using probabilistic reasoning algorithms coded into them. In business, robotic process automation - office automation technology in which robotic software replicates human actions to carry out business processes - evolved rapidly. In another applied research context, 'assistive robots' were built to process sensory information, and then act to help disabled and elderly people with everyday functions. By 2000, the evolution of natural language processing dating to the 1960s was being realised in early chatbots (the forebears of the robots with human voices 'inside' our cell phones today).
Machine learning has a deep and significant history in the 3IR, as does its mutual engagement with robotics.
Internet of Things
An Internet of Things (IoT) is a system of networked mechanical and digital devices with the ability to transfer data amongst themselves without human intervention. A proverbial case in one's own home would be the digitised linking of an alarm clock, a coffee machine, a sound system booting up one's favourite tracks, onscreen reminders of one's appointments for the morning, and weather and traffic reports for the day, all connected via the Internet - a convergence in use of networks and devices that sounds revolutionary. However, it would appear that the technology is not new.
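To make the idea concrete, here is a hypothetical, in-memory sketch of such device-to-device communication: devices publish events to a shared bus and react to one another without human intervention. The bus, topics and device behaviours are illustrative assumptions, not a description of any real IoT protocol or product.

```python
# A hypothetical, minimal sketch of the IoT idea described above: devices
# publish events to a shared bus and react to one another automatically.
# Device names, topics and the bus are illustrative, not a real protocol.

class MessageBus:
    def __init__(self):
        self.subscribers = {}          # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers.get(topic, []):
            callback(payload)

bus = MessageBus()

# Other 'devices' react automatically when the alarm clock publishes.
bus.subscribe("alarm/ring", lambda t: print(f"Coffee machine: brewing at {t}"))
bus.subscribe("alarm/ring", lambda t: print("Sound system: playing the morning playlist"))

# The alarm clock 'device' fires; no human intervenes between the devices.
bus.publish("alarm/ring", "06:30")
```

In a real deployment the bus would be a network protocol rather than an in-process object, but the publish-and-react pattern is the same.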
Obviously, the core technology of the IoT is the Internet. The iconic technological events of the 3IR have been the invention of the Internet and the World Wide Web (WWW, or simply 'the Web'). The Internet was a 1969 project supported by the US Department of Defense that linked computers at a number of universities via standard telephone connections. Subsequently, Tim Berners-Lee built a document-linking structure on it, and most importantly, defined open standards for the exchange of information via the Internet. This structure consists of the all too familiar HTML, URL and HTTP computer codes. In 1991, Berners-Lee 'went live' with the first browser that used these standards to exchange hyperlinked data via the Internet, and inaugurated the WWW. It seems fair to say that, 30 years ago, the Internet consolidated the fundamental technological revolution of the 3IR.
The other major technology of the IoT is the combination of analogue-to-digital and digital-to-analogue converters (ADCs and DACs) that link mechanical devices, via sensors and actuators, to the Internet. These first appeared in the 1960s. The first IoT was reputedly built in the early 1980s when techies at Carnegie-Mellon University installed micro-switches in a vending machine to check cooldrink availability from their desks. Perhaps the most significant piece of technology in the evolving IoT was Trumpet Winsock in 1994, which made it possible to attach PCs to Internet networks.
In the 2020s, it is clear that an IoT can radically beef up businesses and governments, by networking things like transportation, shipping, security, energy conservation and urban waste management, but the technology is definitively that of the 3IR.
Cyber-physical system
At first glance, cyber-physical systems (CPSs) appear to be well described as 21st-century technology. The term was coined in 2006 by scientists at the US National Science Foundation. In the contemporary world, CPS technology works in manufacturing, electricity supply, health care and transport, to name but some of its terrains. It is also prominent in implementing global change agendas, such as decarbonisation. Edward Lee describes it thus:
CPS connects strongly to the currently popular terms Internet of Things (IoT), Industry 4.0, the Industrial Internet, [etc.] ... All of these reflect a vision of a technology that deeply connects our physical world with our information world. ... [But] it does not directly reference either implementation approaches (e.g. the "Internet" in IoT) nor particular applications (e.g. "Industry" in Industry 4.0). It focuses instead on the fundamental intellectual problem of conjoining the engineering traditions of the cyber and the physical worlds.22(p.4838)
So it looks very much like CPS might be one of Schwab's revolutionary disruptions.
However, this sense of what CPS is, is beguiling, as becomes clear when we start to unravel its technological roots. The key point is that a CPS is all about computational models. In this, it goes all the way back to Norbert Wiener's work during World War II, designing technology to aim and fire anti-aircraft guns automatically. Although Wiener employed analogue control circuits and mechanical parts, and not digital computers, his mathematical principles were precursors to the digital feedback control loops found today in CPS. Wiener consolidated this control logic in his 1961 book, Cybernetics. From the 1960s, the development of the mathematical principles of cybernetics is evident in the history of what are known as embedded and hybrid systems in computer programming. In the 1960s, researchers at MIT developed the guidance system for the Apollo spacecraft, which employed the first example of a modern, concurrent, embedded computing program. The notion of hybrid systems, the interaction of digital controllers, sensors and actuators in dynamic physical systems, was widely researched in the 1990s.
These are the significant predecessors of CPS in the expanding 3IR of the 20th century.
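The closed-loop control logic that this lineage describes can be sketched very simply. The following toy thermostat-style loop is an illustrative assumption, not any particular CPS: a digital controller repeatedly senses a physical variable, compares it to a setpoint, and actuates a correction, in the feedback pattern traced back to Wiener above.

```python
# A minimal sketch of a digital feedback control loop: sense, compare to a
# setpoint, actuate a correction. The gain and the simplified 'plant'
# response below are illustrative assumptions, not a real system model.

def run_control_loop(setpoint=20.0, temperature=15.0, gain=0.4, steps=10):
    history = []
    for _ in range(steps):
        error = setpoint - temperature        # sense and compare
        heater_power = gain * error           # decide (proportional control)
        temperature += 0.5 * heater_power     # actuate: simplified plant response
        temperature -= 0.1                    # disturbance: heat loss to environment
        history.append(round(temperature, 2))
    return history

print(run_control_loop())  # the temperature climbs toward the 20.0 setpoint
```

Embedded and hybrid systems of the kind mentioned above elaborate this basic loop with concurrency, timing guarantees and richer models of the physical plant.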
Big data
Big data storage, and its associated analytics, is technology that enables a massive coming together of information in extensive global networks, based on 3IR technology that has evolved over the past 60 years. Increasingly, vast databanks are processed by large organisations, like companies and governments, to plan and make strategic decisions. However, while the amount of digitised data today is unprecedented, the technology of data storage and analysis has in fact evolved in waves over many years:
It would be nice to think that each new innovation in data management is a fresh start and disconnected from the past. However ... most new stages or waves of data management build on their predecessors. ... Data management has to include technology advances in hardware, storage, networking, and computing models such as virtualization and cloud computing. ... The data management waves over the past five decades have culminated in where we are today: the initiation of the big data era.23(p.10,11)
In the 1950s, the first computers stored data on magnetic disks in flat files with no structure. To understand information, 'brute-force methods' had to be applied. Then, in 1961, the silicon chip (or 'integrated circuit', still the basic building block of 'big data') provided for much larger, more efficient data storage and retrieval, and much smaller computers to do the job. Later, in the 1970s, relational databases imposed structure on data, in 'ecosystems' that helped classify and compare complex transactions. In 1976, the graphical entity-relationship database model defined data elements for any software system, thus adding deeper analytics to increase data usability. By the 1990s, as the sheer volume of data grew out of control, the data warehouse was developed. In the new millennium, cloud computing evolved as data warehousing was taken 'off site'. Cloud computing is innovative, contemporary, on-demand data storage and computing power; one of its most important attributes is bringing together diverse data sets, such as climate records and social media messages, for purposes of analysis and decision-making.
The history of the emergence of 'data' as storage and analytics makes it quite clear that the ongoing emergence of what we now term 'big data' is a technology of the 3IR.
Blockchain
Blockchain is a database distributed across the nodes of a computer network. It stores information digitally, but differs from past databases in that it structures information in discrete 'blocks' rather than in tables. These blocks are closed when filled, and linked in a chain that constitutes a secure, shared, distributed ledger. The sequence of each block is irreversible - it is given an exact timestamp and a hash (a digital fingerprint or unique identifier). No block can be altered, and no new block can be inserted between two existing blocks in the chain. Each subsequent block strengthens the verification of the previous block and hence the entire blockchain. Data security is vastly increased.
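As a minimal sketch of the chaining just described - illustrative only, not any particular blockchain implementation - each block below records a timestamp, its data and the previous block's hash, so that altering one block breaks every later link in the chain.

```python
# A minimal sketch of hash-chained blocks: each block stores a timestamp,
# its data and the previous block's hash, so tampering with any block
# invalidates every later link. Field names are illustrative.
import hashlib
import json
import time

def make_block(data, previous_hash):
    block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()   # the block's 'fingerprint'
    return block

def chain_is_valid(chain):
    for previous, current in zip(chain, chain[1:]):
        if current["previous_hash"] != previous["hash"]:
            return False                                     # a broken or altered link
    return True

genesis = make_block("first entry", previous_hash="0" * 64)
second = make_block("second entry", previous_hash=genesis["hash"])
chain = [genesis, second]
print(chain_is_valid(chain))     # True
genesis["data"] = "tampered"     # editing a block...
genesis["hash"] = "forged"       # ...even with a replaced hash...
print(chain_is_valid(chain))     # False: the next block no longer points to it
```

Real blockchains add distributed consensus and cryptographic signatures on top of this basic linked structure, but the ledger's integrity rests on the same chaining of hashes.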
The World Economic Forum and other 4IR buffs tell us that blockchain is one of the biggest advances of our time. They trot out a series of innovations in blockchain as evidence of a 21st-century 'quiet revolution': Bitcoin, 'the first blockchain'; Ethereum, 'little computer programs' providing financial instruments within the blockchain system; 'scaled blockchain', which deploys and regulates required computing power from within the blocks themselves; and storage within blockchain of non-fungible tokens (things like artworks or intellectual property).
However, like CPS, this supposed revolution is beguiling. Blockchain technology did not begin with Bitcoin. At their most honest, 4IR adherents might admit that it dates to Haber and Stornetta's specification of conditions for a cryptographically secured chain of blocks in 1991.24 If they bothered to read the work of these authors, they would realise that blockchain technology incorporates the Merkle-Damgard (M-D) hash function formulated in 1967, and formally validated in the 1980s. In particular, its iterative structure, in which a previous block's hash is the input for the next block, is replicated in blockchain.25(p.129) They might also take note of the fact that the BBVA Foundation bestowed its Frontiers of Knowledge Award in the 'ICTs' category on Shafi Goldwasser, Silvio Micali, and their fellow computer scientists for their "fundamental contributions to modern cryptology". The citation praises the Goldwasser-Micali (GM) crypto-protocols, defined in 1982, for providing "the underpinning for digital signatures, blockchains and crypto-currencies"26.
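The parallel can be seen in a toy sketch of that iterative structure, offered purely for illustration: each fixed-size block of a message is fed through a compression function together with the previous output, much as each blockchain block takes the previous block's hash as input. The compression function below is a stand-in, not a secure construction.

```python
# A toy sketch of an iterative, Merkle-Damgard-style structure: a message is
# split into fixed-size blocks and a compression function is applied
# repeatedly, each time taking the previous output (chaining value) as input.
import hashlib

def compress(chaining_value: bytes, block: bytes) -> bytes:
    # Stand-in compression function: a real design uses a dedicated primitive;
    # SHA-256 is reused here purely to illustrate the iteration.
    return hashlib.sha256(chaining_value + block).digest()

def md_style_hash(message: bytes, block_size: int = 16) -> str:
    padded = message + b"\x00" * (-len(message) % block_size)   # naive padding
    chaining_value = b"\x00" * 32                                # fixed initial value
    for i in range(0, len(padded), block_size):
        chaining_value = compress(chaining_value, padded[i:i + block_size])
    return chaining_value.hex()

print(md_style_hash(b"an arbitrarily long message, processed block by block"))
```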
So, once again, we encounter an alleged '4IR' technology that is actually an evolving 3IR technology rooted deeply in the previous century.
Revolutionary technologies
The conclusion from these preceding discussions of proclaimed '4IR' technologies is clear. None of them is a radical, groundbreaking invention of contemporary times. All of them were, and are, gradual evolutions of technology rooted in the defining technological transformations of the 3IR. This and similar evidence about most latter-day IT innovations calls into question claims that we are today in a period of dramatic technological revolution.12(p.32-39) However, it would be absurd to suggest that there are no technological innovations in our time that are revolutionary in their own context. One example is the first real-life 'shadow hand' in the terrain of bionics.27 This prosthetic hand translates electrical impulses from the brain into digital information that allows a person deliberately to use their robotic hand. Research programmes seeking to replicate the functionalities of bionic hands have expanded rapidly over the past two decades.28,29 However, as one might expect, they are focused on what Thomas Kuhn terms the normal science of a scientific paradigm, rather than scientific revolutions (paradigm shifts) that transcend specific research contexts.30
A similar situation prevails with respect to other prominent new, revolutionary 21st-century technologies, such as nanotechnology and autonomous vehicles.
Medical nanotechnology involves implanting microscopically small devices in humans to detect, monitor and treat various illnesses and impairments. Generally, scientists in this field do not construe it as technological revolution, but rather as "a new and promising route to extract reliable information" within a relatively stable, enduring research programme.31(p.1) So graphene-based brain implants that record low-frequency electrical activity to enable drug delivery and tissue engineering are described cautiously by scientists as needing "accurate theoretical modelling of the interface between graphene and biological material" in order for them to advance.32 Elon Musk's Neuralink, an envisaged brain-machine interface device, although hyped by many 4IR adherents as the epitome of a current revolution, is described extremely modestly by the scientists working on it: "further research studies are needed to move forward beyond speculation"33.
Autonomous vehicles are prominent in the rhetoric of the '4IR technological revolution'. Yet the vehicle technology is not ready for deployment on public roads: "self-driving cars are already on the road, [but] operating only at lower speeds within small geofenced areas"34. The views of researchers are modest: "there still is no comprehensive answer on how to proactively implement safe driving"35. Despite the intelligent sensors, digital maps and Wi-Fi communications that can, in principle, put autonomous vehicles on public streets, the seemingly intractable requirement is that environmental modifications would need to be made to facilitate their deployment.36
It turns out that it is difficult to find an incipient technology of the immediate 21st century that is revolutionary and to construe it as a broader 'technological revolution', simply because such technologies are generally found in their own contexts of discovery and emergence, that is to say in the research contexts in which they appear. That a particular technical invention is revolutionary in its own context of use does not mean that it constitutes, or is part of, a broader technological revolution.
The convergence of technologies
Jamie Morgan points out that, in considering claims about a 4IR, "it is the confluence of technologies that is considered socially significant"11. Technologies in combination with each other create the potential for change, because they "represent an anticipated fundamental transformation". This potential is real "in so far as individually all of the technology is either available in initial form or is something particular groups are working on somewhere in the world"14(p.374). Obviously, then, if a required range of innovations is not available, even revolutionary technologies do not constitute a technological revolution. The key issue on which the existence or otherwise of a technological revolution associated with a '4IR' turns is not so much separate technologies in their own right, but rather the required converging of technologies.
We often hear claims that the 4IR "is based not on a single technology, but on the confluence of multiple developments and technologies"37. Similarly, that it is a "fusion of advances in artificial intelligence, robotics, the Internet of Things, 3D printing, genetic engineering, quantum computing, and other technologies"38 (all my emphases). In the first quotation from Schwab, he proclaimed "the staggering confluence of emerging technology breakthroughs". There is very little evidence, however, of such grand, contemporary technological convergences in the current era.
A smaller-scale fusion of technologies is not necessarily the harbinger of a socially pervasive technological revolution. It is a truism to say that technologies converge, at many points in time and in any era. This occurs in multiple forms, in multiple ways, at multiple levels of complexity. For the most part, interacting machines and tools are commonplace in any production process. The robotic hand is one such example; another example of such convergence in its own context is that between robotics and machine learning in the 1980s. However, neither of these constituted an overall, fundamental technological revolution beyond the 3IR. The historian Hobsbawm's words explaining why, through the multiple technological innovations of the two World Wars, there was no technological or industrial revolution, seem pertinent to the current context: "What they achieved was, by and large, an acceleration of change rather than a transformation"39(p.48).
Having made the case that there is no current, substantive technological revolution, it is important to recognise that there is nonetheless something significant happening at this moment in relation to the history of technological evolution. The ideology of the 4IR, construed by its mainstream ideologues as a technological revolution, has become hegemonic in the prevailing language of academia, business, politics and education.12,15 Joseph Stiglitz40 and other economists have identified the close coupling of neoliberalism, globalisation and the networked digital economy. However, while Stiglitz has suggested that "neoliberalism must be pronounced dead and buried"41 in the face of crises such as the global meltdown of 2008-2009, the waves of anti-globalisation and hostile populism sweeping through countries of the North, and the COVID-19 pandemic, the mainstream economic response of supranational institutions such as the IMF and the World Bank has sought to resuscitate neoliberal ideology42. It seems clear that the World Economic Forum intervention in 2016 is one way in which "neoliberal practice is able to resurface and show up in new and unexpected ways"43(p.1083). It has evidently been very successful.
Conclusion
There is no doubt that the pervasive digital convergences of the 3IR have constituted, and continue to constitute, an overall technological revolution, when considered in relation to the 'industrial age' brought about in the 2IR. There is also no doubt, on the evidence adduced in this article, that there is not a contemporary technological revolution. One thing to learn from this is that there is slippage in the way we use the term 'revolution' - linguists would call it a 'floating signifier'. To say that a scientific or technological discovery is revolutionary, is not necessarily to say that we are living in a period of technological revolution, let alone a 4IR.
Competing interests
I have no competing interests to declare.
References
1. Schwab K. The Fourth Industrial Revolution. Geneva: World Economic Forum; 2016. [ Links ]
2. Rifkin J. The Third Industrial Revolution: How lateral power is transforming energy, the economy, and the world. New York: Palgrave Macmillan; 2011. [ Links ]
3. Rifkin J. The 2016 World Economic Forum misfires with its Fourth Industrial Revolution theme. Huffington Post. 2016 January 14. Available from: https://www.huffpost.com/entry/the-2016-world-economic-f_b_8975326 [ Links ]
4. Marwala T. Closing the gap: The Fourth Industrial Revolution in Africa. Johannesburg: Macmillan; 2020. [ Links ]
5. Skilton M, Hovsepian F. The 4th Industrial Revolution: Responding to the impact of artificial intelligence on business. Cham: Palgrave Macmillan; 2018. https://doi.org/10.1007/978-3-319-62479-2 [ Links ]
6. Marr B. Artificial intelligence in practice. Chichester: Wiley; 2019. [ Links ]
7. Kurzweil R. The singularity is near: When humans transcend biology. New York: Viking; 2005. [ Links ]
8. Diamandis P, Kotler S. The future is faster than you think: How converging technologies are transforming business, industries, and our lives. New York: Simon & Schuster; 2020. [ Links ]
9. Daub A. What tech calls thinking: An inquiry into the intellectual bedrock of Silicon Valley. New York: Macmillan; 2020. [ Links ]
10. Edgerton D. The shock of the old: Technology and global history since 1900. London: Profile; 2008. [ Links ]
11. Morgan J. Will we work in twenty-first century capitalism? A critique of the fourth industrial revolution literature. Econ Soc. 2019;48(3):371-398. https://doi.org/10.1080/03085147.2019.1620027 [ Links ]
12. Moll I. Debunking the myth of the Fourth Industrial Revolution. Occasional paper. Johannesburg: Centre for Researching Education and Labour (REAL), University of the Witwatersrand; 2022 March 31. https://doi.org/10.54223/uniwitwatersrand-10539-32846 [ Links ]
13. Moll I. The myth of the fourth industrial revolution. Theoria. 2021;68(167):1-38. https://doi.org/10.3167/th.2021.6816701 [ Links ]
14. Moll I. The fourth industrial revolution: A new ideology. tripleC. 2022;20(1):45-61. https://doi.org/10.31269/triplec.v20i1.1297 [ Links ]
15. Boden M. AI: Its nature and future. Oxford: Oxford University Press; 2016. [ Links ]
16. Turing A. Computing machinery and intelligence. Mind. 1950;59(236):433-460. https://doi.org/10.1093/mind/LIX.236.433 [ Links ]
17. McCarthy J, Minsky M, Rochester N, Shannon CE. A proposal for the Dartmouth Summer Research Project on Artificial Intelligence [document on the Internet]. c1955 [cited 2021 Dec 07]. Available from: http://raysolomonoff.com/dartmouth/boxa/dart564props.pdf [ Links ]
18. International Federation of Robotics. IFR Presents World Robotics Report 2021 [media release]. 2021 Oct 28. Available from: https://ifr.org/ifr-press-releases/news/robot-sales-rise-again [ Links ]
19. Takanishi A. Historical perspective of humanoid robot research in Asia. In: Goswami A, Vadakkepat P, editors. Humanoid robotics: A reference. Dordrecht: Springer; 2019. [ Links ]
20. Sakagami Y, Watanabe R, Aoyama C, Matsunaga S, Higaki N, Fujimura K. The intelligent ASIMO: System overview and integration. In: Proceedings of the IEEE International Conference on Intelligent Robots and Systems, 2002 September 30 - October 04; Lausanne, Switzerland. New York: Institute of Electrical and Electronic Engineers; 2002. p. 2478-2483. https://doi.org/10.1109/IRDS.2002.1041641 [ Links ]
21. Kuipers B. An intellectual history of the spatial semantic hierarchy. In: Jefferies M, Yeap W-K, editors. Robotics and cognitive approaches to spatial mapping. Berlin: Springer-Verlag; 2008. [ Links ]
22. Lee EA. The past, present and future of cyber-physical systems: A focus on models. Sensors. 2015;15:4837-4869. https://doi.org/10.3390/s150304837 [ Links ]
23. Hurwitz J, Nugent A, Halper F, Kaufman M. Big data for dummies. Hoboken, NJ: Wiley; 2013. [ Links ]
24. Haber S, Stornetta WS. How to time-stamp a digital document. In: Menezes AJ, Vanstone SA, editors. Advances in cryptology-CRYPTO' 90. Lecture Notes Computer Science vol. 537. Berlin: Springer; 1991. https://doi.org/10.1007/3-540-38424-3_32 [ Links ]
25. Halunen K, Vallivaara V, Karinsalo A. On the similarities between blockchains and Merkle-Damgard hash functions. In: Proceedings of the IEEE International Conference on Software Quality, Reliability and Security; 2018 July 16-20; Lisbon, Portugal. New York: Institute of Electrical and Electronic Engineers; 2018. p. 129-134. https://doi.org/10.1109/QRS-C.2018.00035 [ Links ]
26. BBVA Foundation recognizes Goldwasser, Micali, Rivest and Shamir for enabling a secure digital society thanks to modern cryptography [media release]. 2018 January 16. Available from: https://www.frontiersofknowledgeawards-fbbva.es/noticias/bbva-foundation-frontiers-knowledge-award-goldwasser-micali-rivest-shamir-cryptography/ [ Links ]
27. Wits researchers interpret brainwaves to give amputees a hand. Daily Maverick. 26 November 2019. Available from: https://www.dailymaverick.co.za/article/2019-11-26-wits-researchers-interpret-brainwaves-to-give-amputees-a-hand/ [ Links ]
28. Basumatary H, Hazarika SM. State of the art in bionic hands. IEEE Trans Hum-Mach Syst. 2020;50(2):116-130. https://doi.org/10.1109/THMS.2020.2970740 [ Links ]
29. Lundborg G. The hand and the brain. Cham: Springer Nature; 2014. https://doi.org/10.1007/978-1-4471-5334-4 [ Links ]
30. Kuhn T. The structure of scientific revolutions. Chicago, IL: University of Chicago Press; 1962. [ Links ]
31. López-Lorente AI, Valcárcel M. The third way in analytical nanoscience and nanotechnology: Involvement of nanotools and nanoanalytes in the same analytical process. Trends Anal Chem. 2016;75:1-9. https://doi.org/10.1016/j.trac.2015.06.011 [ Links ]
32. Bramini M, Alberini G, Colombo E, Chiacchiaretta M, DiFrancesco M, Maya-Vetencourt JF, et al. Interfacing graphene-based materials with neural cells. Front Syst Neurosci. 2018;12. https://doi.org/10.3389/fnsys.2018.00012 [ Links ]
33. Fiani B, Reardon T, Ayres B, Cline D, Sitto SR. An examination of prospective uses and future directions of Neuralink: The brain-machine interface. Cureus. 2021;13(3), Art. #14192. https://doi.org/10.7759/cureus.14192 [ Links ]
34. Wardlaw C. When will self-driving cars be available? Car News. 2022 January 26. Available from: https://www.edmunds.com/car-news/when-will-self-drivingcars-be-available.html [ Links ]
35. Zhao C, Li L, Pei X, Li Z, Wang F-Y, Wu X. A comparative study of state-of-the-art driving strategies for autonomous vehicles. Accid Anal Prev. 2021;150, Art. #105937. https://doi.org/10.1016/j.aap.2020.105937 [ Links ]
36. Ibanez-Guzmán J, Laugier C, Yoder JD, Thrun S. Autonomous driving: Context and state-of-the-art. In: Eskandarian A, editor. Handbook of intelligent vehicles. London: Springer; 2012. https://doi.org/10.1007/978-0-85729-085-4_50 [ Links ]
37. Marwala T. Covid-19 has forced us into the fast lane of the 4IR super-highway. Daily Maverick. 2020 May 28. Available from: https://www.dailymaverick.co.za/opinionista/2020-05-28-covid-19-has-forced-us-into-the-fast-lane-of-the-4ir-super-highway/#gsc.tab=0 [ Links ]
38. McGinnis D. What is the fourth industrial revolution? [webpage on the Internet]. c2020 [cited 2021 Dec 07]. Available from: https://www.salesforce.com/blog/2018/12/what-is-the-fourth-industrial-revolution-4IR.html [ Links ]
39. Hobsbawm EJ. The age of extremes: The short twentieth century 1914-1991. London: Abacus; 1995. [ Links ]
40. Stiglitz J. The great divide: Unequal societies and what we can do about them. New York: Norton; 2015. [ Links ]
41. Stiglitz J. After neoliberalism. Project Syndicate. 2019 May 30. Available from: https://www.project-syndicate.org/commentary/after-neoliberalism-progressive-capitalism-by-joseph-e-stiglitz-2019-05 [ Links ]
42. Ostry J, Loungani P, Furceri D. Neoliberalism: Oversold? Finance & Development. 2016;Jun:38-41. Available from: https://www.imf.org/external/pubs/ft/fandd/2016/06/pdf/ostry.pdf [ Links ]
43. Aalbers M. Neoliberalism is dead... Long live neoliberalism! Int J Urban Reg Res. 2013;37(3):1083-1090. https://doi.org/10.1111/1468-2427.12065 [ Links ]
Correspondence:
Ian Moll
Email: ian.moll@wits.ac.za
Received: 07 Dec. 2021
Revised: 27 Apr. 2022
Accepted: 24 July 2022
Published: 24 Jan. 2023
Editor: Michael Inggs
Funding: None