The Challenges to Our Innate Cognitive Abilities and Mental Well-being

The Social, Cultural and Environmental Costs of Hyper-Connectivity: Sleeping Through the Revolution

ISBN: 978-1-83909-979-3, eISBN: 978-1-83909-976-2

Publication date: 17 August 2021

Citation

Hynes, M. (2021), "The Challenges to Our Innate Cognitive Abilities and Mental Well-being", The Social, Cultural and Environmental Costs of Hyper-Connectivity: Sleeping Through the Revolution, Emerald Publishing Limited, Leeds, pp. 55-70. https://doi.org/10.1108/978-1-83909-976-220211004

Publisher: Emerald Publishing Limited

Copyright © 2021 Mike Hynes

License

This work is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this work (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


The content of a medium is just the juicy piece of meat carried by the burglar to distract the watchdog of the mind.

Marshall McLuhan

We are not going to stop making progress, or reverse it, so we must recognise the dangers and control them.

Stephen Hawking

The Human Costs of Digital Technologies?

Modern digital information and communications technology (ICT) has changed how we live in many diverse ways, and we have come to rely on our digital electronic devices to perform the most simple and routine daily tasks. Our alarm clocks as we rise, checking our smartphones for the latest snippets of news and updates from family and friends, our fridges, coffee makers and many other kitchen devices connected to the Internet of Things (IoT), our workstations and laptops, iPads and tablet devices, smart watches, smart TVs, fitness devices and other self-monitoring tools: the spread of all these connected devices has been rapid and often unthinking. These digital technologies have become so integrated into our lives that going without them can cause significant anxiety for many individuals. While we know that they can add value to our lives in so many ways, now that we are surrounded by and have succumbed to digital technology in every facet of our daily routines, we need to ask: what are such devices doing to our cognitive capacities, if anything at all? Many of these everyday items have been adopted without due care and diligence as to the possible long-lasting consequences of their use for our innate human cognitive capacities. Many of these devices are specifically designed to aid our decision-making – think how automatic it is for us to check our phones for the time of day, check our emails, use the calculator to do arithmetic, the built-in camera to capture specific moments in time, or our apps to catch up on the weather or the latest news headlines – but at what point do they, in fact, take over actual decision-making, thereby bypassing and reducing our own capacity for logic and reasoning over time? We no longer need to remember important details or basic facts about the world; we simply use our digital devices to recall these elementary pieces of day-to-day information. This, it has been argued, is turning us into organisms living symbiotically with technology: part human, part machine.1 It may be some years before we truly understand the impacts of digital technology use, and it may well be that they have been only positive in their application to our daily lives. But if the opposite is true and they have negatively affected our cognitive abilities or overall sense of well-being, it may be too late to turn back from a trajectory of use that impairs our reasoning and diminishes our sense of self-worth. Fewer than 30 years have passed since the internet became widely available to the public, so these long-term effects have yet to be understood or established, never mind acted upon.

Changing Minds?

In a seminal 2008 essay in The Atlantic titled Is Google Making Us Stupid? What the Internet Is Doing to Our Brains, Nicholas Carr outlined a number of reasons why he believes the internet, as it is currently organised, may be having significant effects on our cognitive abilities.2 Carr’s main argument is that the way we use the internet, in particular, might be having detrimental effects on cognition, diminishing our capacity to concentrate and truly consider issues and arguments. Despite the title, the essay was not specifically targeted at Google but rather at the way we browse and surf the internet. Carr points to how the use of hyperlinks as we browse may be having unexpected and startling effects on our levels of concentration, and he notes this in the context of reading:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going – so far as I can tell – but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.3

He suggests that our brain circuitry may be changing as a result of the amount of time we spend online. The internet makes previously challenging activities, such as research, easier, thus reducing the time we spend thinking deeply and contemplating issues. He argues that the more we use digital ICT, the more we come to emulate and exhibit qualities similar to the workings of the technology itself. While acknowledging that we may well be reading more today, thanks to the ubiquity of text on the internet as well as text and instant messaging, it is a different type of reading, one that has altered and trivialised our ways of thinking about the content. He quotes Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain, who worries that the internet has promoted a new way of reading based on efficiency and immediacy rather than deep reading4 – that is, our inherent ability to interpret text and to make the rich mental connections that form when we read deeply and without distraction.

Carr goes further, first suggesting that the internet, as a digital communications system, now plays many roles in our lives and exerts broad influence over our thoughts. Yet, for all that has been written about the internet, very little is known about how it may be reprogramming us. The notion that our minds should operate at high speed – just like a data-processing machine – is the governing business model of the internet. But humans are bad at processing data and good at making abstract decisions, while artificial intelligence (AI) is good at processing data and bad at abstract thinking. We, as humans, make abstract decisions based on instinct, common sense and scarce information. We can feel, imagine, dream and invent things – such as digital technologies – and reinvent aspects of the past. Human memories consist not simply of experiences but also of the links between such experiences, which form new connections promiscuously and thereby create opportunities for self-transformation and new collective phenomena. But this malleability means that human memories can also be capricious and unreliable, slipping away at unexpected moments. The computer metaphor is simply the most recent in a long line of tropes that pick up on the most advanced and complex technology of the day5: we understand how computer memory works, so we end up thinking that we understand how human memory works. It has been suggested that simulating an entire, biologically realistic human brain remains an elusive goal with today’s hardware and technologies; the processing power alone that would be needed to pull off such a feat is enormous.6 AI offers possibilities and can complement human intelligence, but we should not equate how our minds work with digital processing. So why is such an approach so often adopted by big tech?

The quicker we surf online, the more hyperlinks we click and the more pages we view, the more opportunities the likes of Google, Facebook and other platforms have to collect information on our likes, habits, dislikes and fears, all to feed their insatiable advertising algorithms. The last thing big tech wants is to encourage leisurely reading or slow, concentrated thoughtfulness and contemplation. It is in their economic interest to have us frivolously skim through webpages and internet content and to drive us to distraction. A recent study by an international team of researchers found that the unique features of the online world may also be influencing our attentional capacities, memory processing and social cognition.7 They reported that the available evidence indicates that the internet can produce both acute and sustained alterations in each of these areas of cognition, which may be reflected in changes in brain circuitry. The multifaceted stream of incoming information to which we are constantly subjected online encourages attentional switching and multitasking rather than sustained focus on one thing. This ubiquitous and rapidly expanding access to online information and trivia overtakes previous transactive systems and potentially even internal memory processes themselves. A 2014 study in the journal PLoS One found that people who spend a lot of time media multitasking – shifting between different websites, apps, programs or other digital stimuli – tend to have less grey matter in a part of the brain involved in thought and emotion control.8 The same structural changes are associated with obsessive–compulsive disorder, depression and anxiety disorders. The online social world also mimics real-world cognitive processes and becomes meshed with our offline sociality, introducing the possibility that the special properties of social media will impact our real lives in unforeseen ways. While such research may be in its infancy, as digital ICTs become increasingly enmeshed in our everyday lives, they are also becoming highly proficient at capturing and disrupting our attention and unsettling our regular cognitive processes.

In The Shallows: What the Internet Is Doing to Our Brains, Carr expands on the themes first raised in his 2008 essay.9 Examining the range of technologies introduced throughout history, he provides a well-developed and balanced introduction to the sociocultural good and bad of technological development. The value of this text to current debates lies in the ample scientific evidence presented regarding the neurological changes the human brain undergoes in response to our digital technology use. While an improvement in visual-spatial skills can be traced to the increased time spent at our screens and monitors, this improvement comes at a price, as our abilities for deep analytical and critical thinking and reflection become diminished. Carr is essentially pointing to the key differences between us, as humans, and these digital processing machines. Our processing and storing of information builds upon the interplay of short- and long-term memory. Paying close attention to something causes the frontal lobe to communicate with the midbrain, prompting those neurons to release dopamine. When the hippocampus receives this dopamine, it consolidates explicit memory. When we intentionally try to remember something specific, like a list of significant family birthdays or anniversary dates, this information is stored in explicit memory. We use this memory store every day, from remembering the time and place we are due to meet a friend for coffee to recalling the date and time of a doctor’s appointment.

Explicit memory is also known as declarative memory, since we can consciously recall and explain the information. However, the volume of competing information the human brain receives as we interact online begins to exceed actual working memory capacity. The establishment of memory cannot occur because our frontal lobe cannot focus on one thing long enough to allow that processing to take place. As our internet screen time increases, it becomes more and more difficult to store information in memory; hence our reliance on the computer’s ubiquitous supply of artificial memory. A good practical example of this is the phone numbers of our family and friends. How many of us can readily recall the actual phone number of our nearest and dearest? A decreasing number, I suggest; we have left this memory work to our smartphones and other portable digital devices. Many within the tech sector would like to equate our understanding of human memory with that of computer memory. But as argued previously, the two are not the same. Human memories are ever-changing, moulded by the context in which they are made and retrieved. Computer memory, on the other hand, is static, stored in bits and bytes.

It is too early, and very difficult, to truly understand what the lasting social and cultural outcomes of widespread digital technology adoption and use will be, or their effects on the workings of the human brain and our cognitive capabilities. But Carr reports on numerous scientific studies that reveal a molecular basis for behavioural changes resulting from the increased use of digital technology. Think about it yourself for a moment. Can you concentrate for extended periods on reading a book or long text, or do you get distracted easily? Do you find yourself reading a passage and then seeking to ‘go elsewhere’ to find additional information on an event, word or phrase, or just daydreaming? Was this always the case? It must be understood that each technology introduced throughout human history has had some effects – some small, some larger – on individuals and societies. Be it the printing press, radio or television, changes in how we design, develop and adopt technologies have all left an indelible mark on an ever-changing societal landscape.

Will this superficial skimming and scanning technique of thinking, promoted and reinforced by the internet, serve us for better or for worse in the long run? Interestingly, Carr draws his curiosity on the subject from Marshall McLuhan’s (1964) celebrated work Understanding Media: The Extensions of Man, in which McLuhan declared that the electric media of the twentieth century – television, radio, movies and the telephone – were breaking the domination of text over our thoughts and senses.10 McLuhan famously coined the phrase ‘the medium is the message’, and what is often forgotten is that he was not just acknowledging, and celebrating, the transformative nature and power of new communication technology; he was also sounding a warning about the threats it poses and the risk of being oblivious to such dangers. He understood that when a new medium emerges, people are naturally carried away by the information coming over and through it: the actual content. The news in the newspaper, the music from the CD player, the drama from the radio, the sitcom from the television: behind such engaging content, the medium itself tends to get lost. The problem is that when people begin to debate the rights and wrongs of a medium’s effects on individuals and society, it is only the content that we all end up discussing. But such code or content, and the way it is developed to grab and retain our attention, matters significantly for how it affects our cognition, markedly when we look at the developing brain in children and their use of digital devices.

In a later chapter, the contemporary phenomenon of the smartphone and its meteoric rise in popularity and use will be discussed in detail, but on the specific issue of digital ICT use and its effects on cognition, it is important to consider the matter here, particularly with regard to our younger populations. The developing human brain is constantly building neural connections while pruning away less-used ones, and digital media use is playing an active role in that process. Paediatrician Michael Rich – director of the Center on Media and Child Health at Boston Children’s Hospital11 – argues that much of what happens on the screens of digital devices provides impoverished stimulation of the developing brain compared to reality, and that children in particular need a diverse menu of online and offline experiences, including the chance to let their minds wander and spend time away from such technology.12 The use of digital devices can interfere with everything from sleep to creativity, and many children and teens who stay up late texting on their smartphones lack the deep REM sleep13 essential for processing and storing the day’s information in memory. While such research is in its infancy, researchers from the National Institute of Mental Health recently offered a glimpse of some early results, based on preliminary data from the Adolescent Brain Cognitive Development (ABCD) study.14 They found significant differences in the brains of some children who reported using smartphones, tablets and video games more than seven hours a day, and children who reported more than two hours a day of screen time scored lower on thinking and language tests. A separate study suggested that more screen time is linked to poorer progress over time on key developmental measures such as communication skills, problem-solving and social interactions among young children.15 This is a natural anxiety for parents, who often find themselves asking: what is happening to my child’s cognitive, social and emotional development when they are staring at their smartphone?

Humans: The Weakest Link?

This debate on how digital technology is altering our cognitive competences continues in The Glass Cage: Who Needs Humans Anyway?16 This time Carr sets his sights on AI, self-driving automobiles, digitised medicine and workplace robots as he explores the often-hidden costs of allowing digital technology dominance over our work and our leisure time. Drawing on various studies that highlight how closely our sense of happiness and personal fulfilment is linked to performing skilled work in the real world, he points to something we may all already suspect: shifting our attention to computer screens simply to monitor rather than participate can frequently leave us disengaged, bored and prone to mistakes. Even as many of these new technologies bring a new sense of relief to our collective lives by replacing the manual, mundane, cumbersome labour of the past, the code behind them is also quietly stealing something essential from us. Using the example of fly-by-wire aviation, in which the pilot is in the cockpit ‘just in case’, he discusses automation and the drive by technologists to build ‘immaculately self-contained systems that perform flawlessly without any human oversight or intervention’.17 But technology and machines will always share the fallibility of their designers and creators:

[A]s automation technologies become more complicated and more interconnected, with a welter of links and dependencies among software instructions, databases, network protocols, sensors, and mechanical parts, the potential sources of failure multiply. Systems become susceptible to what scientists call ‘cascading failures’, in which a small malfunction in one component sets off a far-flung and catastrophic chain of breakdowns.18

This matters because, in designing automated machines – be they aeroplanes or robotic processes on the factory assembly line – the essence of such design is effectively to eliminate the human from the system. Digital technology designers frequently view humans as unreliable and inefficient – at least compared to modern digital processing computers – and thus strive to give them as small a role in an automated system as possible. Individuals end up functioning merely as monitors of the system, passive and unreceptive watchers of screens. Thus, automation technology has, in fact, created predictable yet unprecedented opportunities for human error, opening doors to new forms of system breakdown.19 Workers in these conditions easily become bored and given to daydreaming, and their concentration drifts. If their main task is simply to monitor and observe, watching rather than acting, their instincts and reflexes grow rusty from disuse over time. Placed in such a position, workers have trouble recognising and diagnosing problems; intuition dulls, reaction times slow and responses become sluggish and deliberate rather than quick and automatic.20 The human placed at the very end of a technology-centred automation system – rather than at the core of a human-centred system design – may well find they are losing something very important and personal over time.

The idea behind the ‘use it or lose it’ hypothesis derives from the growing evidence that a lifetime of learning, mental and physical activity and rewarding work is good for people, and these findings also appear to hold when it comes to warding off Alzheimer’s disease and various other forms of dementia. In many ways, our brains are like muscles: if we do not use, strengthen and stretch them, they will not deliver the high performance they are capable of. Essentially, our brains need regular mental stimulation to work well; the more stimulation, the better cognitive functions such as thinking and memory perform. Just as our body needs regular exercise to remain in a healthy condition, so too does our brain need regular mental workouts to remain functionally resilient. Studies comparing people working in specific occupations with individuals whose mental activity levels are determined by self-reporting have shown a positive relationship between levels of activity and levels of cognitive functioning. A 2012 study, for example, suggests that the presence of new neurons in the adult hippocampus indicates that this structure incorporates new neurons into its circuitry and uses them for some functions related to learning and related thought processes.21 The generation of these new neurons depends on a number of factors, ranging from age to aerobic exercise to sexual behaviour to alcohol consumption. However, most of the cells will die unless we engage in meaningful mental stimulation or learning experiences when the cells are about one week old. If learning does occur, the new cells become incorporated into brain circuits used for learning, and in turn, some processes of learning and mental activity appear to depend on the presence of these new cells. The study points to extensive literature showing that new neurons are kept alive by effortful learning, a process that involves concentration in the present moment of experience over extended periods of time. Higher levels of job demand and job control are thus conducive to employees’ overall cognitive health and well-being. Exposure to both high job demand and high job control constitutes an active job, according to Karasek’s demand–control model, fostering learning and strengthening brain capacities through neurocognitive stimulation.22

Mental disuse is as striking an example of use it or lose it as physical disuse. A guitarist, for example, has a larger region of the motor cortex – the part of the brain that controls hand and finger movement – than a non-musician, just as a right-footed footballer will have stronger and larger muscles in her right leg, at least compared to someone who does not play the sport or who kicks with their left foot. It is simple adaptation to use in both cases, and the opposite follows from disuse: smaller, weaker, declining function. Changes brought on by mental disuse occur in neurons (our brain cells) and neural networks: neurons shrink, networks become less dense and connections (synapses) deteriorate.23 The neurotransmitters that carry impulses between neurons diminish, and receptors become less sensitive to them. The result is that cognitive functioning slows down and mental capacity is reduced. We tend to underestimate this mental disuse because we are not aware of its creeping effects in our everyday lives. We cannot see and feel the atrophy of our mind in the same way we can see and feel the atrophy of the muscles in our body. The effects on the mind are subtle and difficult to measure, and they happen gradually over time as we get older. This makes them perfect culprits for the ageing process, and when symptoms such as forgetfulness, confusion and speaking difficulties become severe enough, they get labelled dementia. The changing nature of some work may be altering our ability to react and make decisions in a timely manner, as well as adding to the long-term depreciation of our cognitive abilities. The humdrum of simply watching a monitor on an automated assembly line not only leads to boredom; it can also lead to personal cognitive deterioration.

Surrendering to the Machine

Of course, not all jobs involve mundane monitoring tasks, and some forms of work present their own challenges for individuals in new digital work environments. A recent survey from Korn Ferry – Workplace Stress Continues to Mount – found that workplace stress for professional workers has grown by nearly 20 per cent over the last three decades.24 Among the top reasons for the increased stress are the threat of losing a job to technology and the pressure to learn new skills just to stay employed. How is this digital work environment affecting overall mental well-being and our need for contentment, and how is the widespread introduction of new digital technology into the workplace affecting our prospects for improved quality of life and work/life balance in the future? In the broader sense, the relentless drive towards automation may well remove the human from the workplace altogether, particularly from the factory floor, adding to competition and pressure on those who remain employed. Most large industrial organisations and companies began investing in digital automation and robotics over recent decades, and many such systems view the human presence as the weakest link in the assembly-line chain. Employers and workers now anticipate widespread job automation over the coming decades. About eight in 10 American adults – 82 per cent – fear that by 2050, robots and computers will definitely or probably do much of the work currently done by humans, according to a December 2018 Pew Research Center survey.25 A smaller share of employed adults – 37 per cent – say robots or computers will do the exact type of work they currently do themselves by 2050.

In their 2011 book Race against the Machine, MIT researchers Erik Brynjolfsson and Andrew McAfee investigated the connections between digital technology, employment and organisation in the twenty-first century.26 The authors’ central thesis is that we are in the midst of a technological revolution that is radically redefining what work is, how value is created and how the economy, as currently organised, distributes that value. They argue that massive advances in digital computer technology – from improved industrial robotics to automated translation services – are largely behind the sluggish employment growth of the last two decades. Even more worrying for workers, they foresee gloomy prospects for many types of work as these powerful new digital technologies are increasingly adopted not only in heavy industry, manufacturing and retail work but also in professions such as financial services, education, medicine and even law. Such technological acceleration is creating enormous value for many organisations, companies and some individuals at the very top, and there is no question it increases productivity in many ways. The problem is that digital technologies change rapidly while organisations and employee skills do not keep pace. As a result, many workers are being left behind, their income and prospects destroyed, leaving them worse off in economic terms and with less purchasing power than before this digital disruption.

While the foundations of our economic system presume a strong link between value creation and job creation, the Great Recession of 2008–2012 revealed a weakening, or rupture, of that link. This, it is suggested, is a deep structural change in the nature of production. As digital technology accelerates, so too will the economic mismatches, undermining our social contract and ultimately hurting both rich and poor, not just the first waves of the unemployed.27 But we must recognise that ever since the followers of Ned Ludd28 began smashing machines in the early 1800s, workers have worried about automation destroying their jobs and, thus, their livelihoods. Economists have always attempted to reassure people that new forms of work would be created as the old forms were abolished, and for several centuries they were correct. However, there is no economic law that states that everyone, or even most people, automatically benefit from such technological progress, and as the digital economy grows, it can leave some people, or even a lot of people, worse off:

And computers (hardware, software, and networks) are only going to get more powerful and capable in the future, and have an ever-bigger impact on jobs, skills, and the economy. The root of our problems is not that we’re in a Great Recession, or a Great Stagnation, but rather that we are in the early throes of a Great Restructuring. Our technologies are racing ahead but many of our skills and organizations are lagging behind. So it’s urgent that we understand these phenomena, discuss their implications, and come up with strategies that allow human workers to race ahead with machines instead of racing against them.29

A study compiled by the McKinsey Global Institute suggests that advances in AI and robotics will have a severe effect on everyday working lives – comparable to the shift away from agrarian societies during the Industrial Revolution – predicting that by 2030 as many as 800 million jobs could be lost worldwide to digital acceleration.30 In the United States alone, between 39 and 73 million jobs stand to be automated, around a third of the total workforce. But the report’s authors maintain that such technology will not be only a destructive force: new jobs will be created, existing roles will be redefined and workers will have the opportunity to switch careers. They further suggest, however, that income inequality is likely to grow, possibly leading to political instability, and that the individuals who need to retrain for new careers will not be the young but, in fact, middle-aged professionals. We may be moving far too quickly to automate white-collar jobs, sophisticated tasks and mental but rewarding work, while becoming increasingly reliant on automated decision-making and predictive analytics.

This represents a large-scale de-skilling of the workforce, which will have particular ramifications for society at large, some of which we are only now starting to experience. Entire professions, careers and businesses are being eliminated in the rush towards automation and digitisation, and moves away from traditional forms of work are creating sizeable deindustrialised regions of unemployment, even in the developed world. The high-quality manufacturing jobs disappearing in many advanced economies are not being replaced with similarly high-level roles in the service sector. The option of reskilling is all well and good, but if the work for these skills resides elsewhere, the outlook for workers who cannot relocate due to family or personal commitments is bleak. As factories close, many middle-skilled workers are forced to accept low-paying jobs in their region’s service sector, pushing down the income distribution and contributing to a rise in regional inequality. Automation impacts work and workers in numerous ways and should not be underestimated as a significant challenge to our sense of personal worth and well-being; at the same time, it does not, of itself, spell the end of work, just a change in the nature and value of work. Automation and digitalisation are hastening the growth of ‘gig working’ as more and more complex jobs are broken down into discrete tasks ripe for outsourcing to the waiting crowd, with all the associated precariousness and volatility for workers and their families.

Precarious Work

Work has traditionally occupied a substantial proportion of most people’s lives and has often been taken as a symbol of personal value and self-worth. Work provides status, economic reward, a demonstration of religious faith and a means to realise self-potential.31 But the meaning of work in contemporary society is now a challenging debate within sociology, with some espousing that the post-industrial workforce should be expected to possess a relatively high degree of career and occupational identification and is likely to anticipate intrinsically meaningful work built around self-actualising opportunities. This is a highly contested claim, however, and some prominent theorists suggest that work identities are increasingly fragile, unstable and discontinuous. Indeed, Bauman argued that in postmodern societies, consumption has supplanted work as the key source of self-identity and social status.32

Digitalisation has enabled many new forms and organisations of work more generally and has introduced new terms and phrases into our everyday vocabulary. For many people, the term gig economy still sounds a little ambiguous. It refers to a way of doing business in which freelancers and independent contractors are engaged in place of full-time paid workers. Workers rely on finding short-term segments of work or tasks to be performed – better known as gigs – primarily through digital online intermediaries or platforms designed specifically for this purpose. Some of the most popular gig economy intermediaries are Airbnb, Uber, Fiverr, Deliveroo and other food delivery services; these digital platforms act as mediators that help employers get tasks completed and freelancers find temporary work, all for a specific fee. Gig working is not new; contract work has been a feature of modern work and the knowledge economy for several decades. What is new is the addition of these digital platforms as arbiters in the contract work process.

In Humans as a Service, Jeremias Prassl sets out his arguments on the challenges posed by such on-demand work and, in particular, how these digital work intermediaries deliver tightly curated products and services by means of close control over their workforce: from setting terms and conditions and checking relevant qualifications to ensuring proper performance and payment.33 He argues that these digital platforms often operate in legal grey areas, using narratives of entrepreneurship, opportunity, autonomy, self-determination and freedom for these workers. Digital work intermediaries do not wish to be seen as employers and continually reinforce this by distancing themselves from the responsibilities and obligations that traditional businesses and organisations must adhere to in order to protect their workers or consumers.34 One of these distancing strategies is the use of language to rebrand work and shape the regulatory response. In this gig environment, there is no longer talk of ‘work’; rather, ‘gigs’, ‘lifts’, ‘tasks’, ‘hits’ and ‘favours’ replace the traditional vocabulary of the labour market. The ultimate goal of this approach is to question whether the law in general – and employment law in particular – remains relevant in regulating the contractual relationships formed between digital platforms, their users and their workforce.35 So, what of the workers and their welfare in this new work environment? The business models of most of these digital platforms are clearly based on tight control over their workforce, subject to constantly changing and increasingly arduous terms and conditions, the very opposite of what their entrepreneurial claims suggest:

[F]or a large number of workers, the reality as a Tasker, Driver-Partner, or Turker is more reminiscent of Victorian labourers’ daily grind than the glamour of Silicon Valley: long hours for low wages, constant insecurity, and little legal protection – with no chance of a future upside.36

A 2018 survey of millennial workers and twenty-first-century work outlines the impact that the rise of precarious gig work is having on an entire generation. The Generation Effect: Millennials, Employment Precarity and the 21st Century Workplace looked at precarious work’s impact on millennials’ community participation, health, quality of life, work and the workplace. The study’s findings on the cost to mental health and well-being, and the pervasiveness of this issue among millennials, were the most disturbing.37 The study found a close correlation between mental health and the quality of employment now on offer, which suggests that these changes to the form and organisation of work are having a negative impact on this generation. Underemployment – defined by the International Labour Organization (ILO) as working fewer hours than a nationally specific threshold while being willing and available to work additional hours – is closely linked to the gig economy, since full-time work is never guaranteed, and findings from two large UK samples highlight the possibility that underemployment among part-time workers may also have detrimental psychological consequences.38 The experience of gig workers across the world must be understood in the context of neoliberalism, which has amplified both the globalisation and the precariousness of work, and while gig workers share some vulnerabilities with other workers, their platform-specific vulnerabilities require particular attention. New forms and organisations of work – designed, developed and promoted by digital work intermediaries – should not be allowed to do us harm in the long run, nor to roll back years of hard-fought labour regulation and measures designed to help and support workers of all ages, genders and ethnicities to attain a proper and rewarding quality of life.

A Just Transition

The development, emergence and wide prevalence of digitalisation in almost every facet of our daily lives and routines has brought about some great benefits, which are widely acknowledged and highly praised across society. But digital technology’s pervasiveness in every aspect of our everyday life is relatively recent, and its long-term impacts and consequences for our cognition and personal well-being have yet to be fully understood and realised. In particular, this chapter has discussed the seemingly unstoppable march of digitisation and the digital economy into all forms and organisations of work, which has happened without debate or discussion as to its lasting impacts and consequences for individuals, communities, societies and entire regions. In the way our current economic system is organised – in the form widely adopted across the West – there will always be opportunities for individuals and organisations to capitalise on the introduction of digital technology in a drive for productivity while displacing workers, but inevitably society in general will end up ‘picking up the tab’ for such disruption and dealing with its inescapable outcomes. Unemployment and underemployment have very damaging effects on individuals, their families and their communities, yet we are heading at breakneck speed into a digital future with a limited role, or none at all, for humans in the workplace. This significant shift in value will elevate just a few to mega-wealth status but leave the vast majority behind to fight over the scraps and leftovers. Such a future scenario has the potential for significant consequences for family and community cohesion and for social unrest, yet there is little political appetite or will evident at present to discuss such developments or to anticipate what will be required in a post-work era.

Like many of the developments around digitisation, we are thoughtlessly following a technologically determinist approach and hoping for only positive outcomes from digital technology futures. Research has long suggested that social media can be harmful to users’ well-being, for example, and a comprehensive new study examining the impact of Facebook usage on well-being over time found that using that social media platform was consistently detrimental to mental health.39 At the same time, it is also important to remember that much more evidence of creeping changes to human cognition and personal well-being is needed before we make decisive judgements on digitisation’s potential role, direction and true consequences. Digitalisation that weakens our cognitive capacities and innate human aptitude for reasoning will diminish us all over time. We must be more aware of possible negative outcomes in order to spot them when they begin to emerge, before it is too late to turn around the digitisation juggernaut eliminating humans altogether from various work environments and devaluing our sense of self-worth. Debates and discussions on a just transition with regard to the EU Green Deal are loud and clear,40 but much more attention and effort are needed to understand parallel, funded just transition strategies that address disruptive digital technology’s impacts on work, individuals and communities alike.

1

Hern, A. (2020). Part human, part machine: Is Apple turning us all into cyborgs? The Guardian, November 25. Retrieved from https://www.theguardian.com/technology/2020/nov/25/part-human-part-machine-is-apple-turning-us-all-into-cyborgs

6

Lafrance, A. (2016). The human remembering machine: A new mathematical model of memory could accelerate the quest to build super-powered, brain-inspired hardware systems. The Atlantic, October 3. Retrieved from https://www.theatlantic.com/technology/archive/2016/10/the-human-remembering-machine/502583/

11

Michael Rich is also Associate Professor of Pediatrics at HMS and Associate Professor of Social and Behavioral Sciences at the Harvard T.H. Chan School of Public Health.

12

Ruder, D. B. (2019). Screen time and the brain. Harvard Medical School, June 19. Retrieved from https://hms.harvard.edu/news/screen-time-brain

13

Rapid eye movement (REM) sleep is characterised by low muscle tone, rapid eye movements and dreaming. During such episodes, neural activity appears to originate in the brainstem, making the brain more active, and REM sleep plays an important role in helping the brain consolidate and process new information.

14

See ‘Adolescent Brain Cognitive Development Study (ABCD Study)’ last updated May 2020 at https://www.nimh.nih.gov/research/research-funded-by-nimh/research-initiatives/adolescent-brain-cognitive-development-study-abcd-study.shtml.

17

Carr (2015, p. 153).

18

Carr (2015, p. 155).

20

Carr (2015, p. 157).

24

Worried workers: Korn Ferry survey finds professionals are more stressed out at work today than 5 years ago. (2018). Korn Ferry, November 8. Retrieved from https://www.kornferry.com/about-us//press/worried-workers-korn-ferry-survey-finds-professionals-are-more-stressed-out-at-work-today-than-5-years-ago

28

While the name Ned Ludd and, indeed, Luddite is of uncertain origin, the Luddites are widely described as a secret, oath-based organisation of English textile workers in the nineteenth century. This radical faction destroyed textile machinery as a form of protest against manufacturers who used machines in what they called a fraudulent and deceitful manner to get around standard labour practices.

34

Prassl (2018, p. 51).

35

Prassl (2018, p. 50).

36

Prassl (2018, p. 52).

39

Shakya, H. B., & Christakis, N. A. (2017). A new, more rigorous study confirms: the more you use Facebook, the worse you feel. Harvard Business Review, April 10. Retrieved from https://hbr.org/2017/04/a-new-more-rigorous-study-confirms-the-more-you-use-facebook-the-worse-you-feel

40

The Just Transition Mechanism (JTM) is part of the European Green Deal Investment Plan and a key tool to ensure that the transition towards a climate-neutral economy happens in a fair way, leaving no one behind: see https://ec.europa.eu/regional_policy/en/newsroom/news/2020/01/14-01-2020-financing-the-green-transition-the-european-green-deal-investment-plan-and-just-transition-mechanism.

References

Barrett, L. (2011). Beyond the brain: How body and environment shape animal and human minds. Princeton, NJ: Princeton University Press.

Bauman, Z. (2005). Work, consumerism and the new poor. Maidenhead: Open University Press.

Brynjolfsson, E., & McAfee, A. (2011). Race against the machine: How the digital revolution is accelerating innovation, driving productivity, and irreversibly transforming employment and the economy. Lexington, MA: Digital Frontier Press.

Carr, N. (2008). Is Google making us stupid? What the internet is doing to our brains. The Atlantic, July/August.

Carr, N. (2010). The shallows: What the internet is doing to our brains. New York, NY: W.W. Norton & Company.

Carr, N. (2015). The glass cage: Who needs humans anyway? London: Vintage Books.

Dekker, S., & Woods, D. (1999). Automation and its impact on human cognition. In S. Dekker & D. Woods (Eds.), Coping with computers in the cockpit (pp. 7–28). London: Routledge.

Firth, J., Torous, J., Stubbs, B., Firth, J. A., Steiner, G. Z., Smith, L., … Armitage, C. J. (2019). The “online brain”: How the internet may be changing our cognition. World Psychiatry, 18(2), 119–129.

Grint, K., & Nixon, D. (2015). The sociology of work (4th ed.). Cambridge: Polity Press.

Karasek, R. A., Jr. (1979). Job demands, job decision latitude, and mental strain: Implications for job redesign. Administrative Science Quarterly, 24(2), 285–308.

Loh, K. K., & Kanai, R. (2014). Higher media multi-tasking activity is associated with smaller gray-matter density in the anterior cingulate cortex. PLoS One, 9(9), e106698.

Madigan, S., Browne, D., Racine, N., Mori, C., & Tough, S. (2019). Association between screen time and children’s performance on a developmental screening test. JAMA Pediatrics, 173(3), 244–250.

Manyika, J., Lund, S., Chui, M., Bughin, J., Woetzel, J., Batra, P., … Sanghvi, S. (2019). Jobs lost, jobs gained: What the future of work will mean for jobs, skills, and wages. Retrieved from https://www.mckinsey.com/featured-insights/future-of-work/jobs-lost-jobs-gained-what-the-future-of-work-will-mean-for-jobs-skills-and-wages#

Martin, J. C., & Lewchuk, W. (2018). The generation effect: Millennials, employment precarity and the 21st century workplace. Retrieved from https://apo.org.au/node/229416

McLuhan, M. (1964). Understanding media: The extensions of man. London: Sphere.

Mousteri, V., Daly, M., & Delaney, L. (2020). Underemployment and psychological distress: Propensity score and fixed effects estimates from two large UK samples. Social Science & Medicine, 244, 112641.

Parker, K., Morin, R., & Horowitz, J. M. (2019). The future of work in the automated workplace. Washington, DC: Pew Research Center.

Prassl, J. (2018). Humans as a service: The promise and perils of work in the gig economy. Oxford: Oxford University Press.

Shors, T. J., Anderson, M. L., Curlik II, D., & Nokia, M. (2012). Use it or lose it: How neurogenesis keeps the brain fit for learning. Behavioural Brain Research, 227(2), 450–458.

Vickery, D., Matson, L., & Vickery, C. (2012). Live young, think young, be young: … At any age. Boulder, CO: Bull Publishing Company.

Wolf, M. (2008). Proust and the squid: The story and science of the reading brain. New York, NY: Harper.