Will we all be cyborgs?

Martin Hall asks: will we be cyborgs, part flesh and part machine? What advantages will this bring? And what should we worry about?

The convergence of information technologies and bioscience is changing our lives. As connection speeds of a gigabit per second become available and affordable for some, how will this affect who we are, as we become able to reconstruct our own bodies and shape the bodies of our children?

The Pew Research Center recently canvassed over a thousand practitioners and experts in information technology. They were asked what they believed were the implications of a gigabit world. Will there be distinctive killer apps, disruptive innovations that will result in significant changes in the ways we live? The Pew survey identified health as one of the areas that will be most affected. Here is Hal Varian, chief economist for Google: “the big story here is continuous health monitoring… It will be much cheaper and more convenient to have that monitoring take place outside the hospital. You will be able to purchase health-monitoring systems just like you purchase home-security systems. Indeed, the home-security system will include health monitoring as a matter of course. Robotic and remote surgery will become commonplace”.

A gigabit connection provides one thousand megabits per second (Mbps). At the beginning of this year, the average connection speed across the world was just under 4 Mbps, across the United States 10.5 Mbps and in South Korea – the country with the highest average connection speed – 23.6 Mbps. A gigabit world, then, would see a roughly forty-fold increase in Internet speed in the best-performing country. This may seem unattainable in the near future. But this technology is already with us. Some scientific communities have already had access to very fast networks for several years. Four years ago, Google ran a competition for the first community network running at 1 gigabit per second, a hundred times faster than the average speed for the US as a whole. Kansas City won and residents are now signing up for the service.
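The arithmetic behind these comparisons is simple enough to check; a minimal sketch, using only the average speeds quoted above:

```python
# Connection speeds quoted in the article, in megabits per second (Mbps).
GIGABIT_MBPS = 1000  # 1 gigabit per second = 1000 Mbps

average_speeds_mbps = {
    "World": 4.0,          # "just under 4 Mbps"
    "United States": 10.5,
    "South Korea": 23.6,   # highest national average at the time
}

# How many times faster a gigabit link is than each regional average.
for region, speed in average_speeds_mbps.items():
    factor = GIGABIT_MBPS / speed
    print(f"{region}: a gigabit connection is about {factor:.0f}x faster")
```

Dividing 1000 Mbps by South Korea's 23.6 Mbps gives a factor of about 42, the "forty-fold" increase mentioned above, while dividing by the US average of 10.5 Mbps gives roughly the hundred-fold figure cited for the Kansas City network.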

The convergence of bioscience and information technology is best represented in the history and triumphs of the Human Genome Project. Launched in 1990 and completed in 2003 with the sequencing of the chemical base pairs that make up our DNA, the results of this extensive collaboration are now transforming medicine and health care. Genome sequencing would not have been possible without high-speed computing, and future developments will depend on almost instant access to massive data sets.

But in addition to new drug development and predictive diagnosis of genetic conditions there are more complicated innovations. There is widening enthusiasm for personal DNA profiles that establish genealogies; the confirmation that the skeleton found under a car park in Leicester was once Richard III is a famous example. But for others, there is a deep suspicion of what this could bring. For example, indigenous communities with hard-won rights to land and resources fear that the misuse of DNA sequencing may strip away these rights. And the extensive and continuing revelations about the misuse of information technologies by state agencies have encouraged a significant backlash against the pooling and use of personal health records. The union of the biological and digital sciences is a complicated marriage that will bring unanticipated consequences.

One area to watch for such surprises is digital implants: the surgical insertion of microprocessors that make us part flesh, part machine.

The specter of the cyborg has been with us since Mary Shelley’s 1818 novel Frankenstein. The early Internet and the popularity of personal computers brought flesh, organs, digital processors and information technology together in fiction and theory. Milestones, and still great reading today, were William Gibson’s 1984 classic Neuromancer and Donna Haraway’s prescient essay, A Cyborg Manifesto, published the following year.

But cyborgs were already on the street thirty years ago. The first surgical implantation of a pacemaker was in 1958 (the recipient lived until 2001); today’s microprocessor-controlled pacemakers sense the physical activity of their host and respond by increasing or decreasing their pacing rate. And in the years since Gibson foresaw a future in which the body could be rebuilt at will – although for nefarious purposes in the dark world of the matrix – remarkable new medical technologies have emerged that transform the quality of life. Surgical cochlear implants pick up signals from a speech processor and send them to the auditory nerve. Robotic prosthetic limbs receive signals from the nervous and muscular systems and transmit these to an artificial arm or leg. In the near future, implantable artificial kidneys with microelectromechanical membranes will filter blood and excrete toxins while reabsorbing water and salt.

Widely available gigabit connections will enable intelligent, implanted devices such as these to become part of the “internet of things”, much as Gibson imagined in Neuromancer. The Pew Center study predicts personalized digital health within the next ten years. Here is Judith Donath, at Harvard’s Berkman Center for Internet and Society: “telemedicine will be an enormous change in how we think of healthcare. Some will be from home—chronically ill or elderly patients will be released from hospitals with a kit of sensors that a home nurse can use. For others, drugstores (or private clinic chains—fast meds, analogous to fast foods) will have booths that function as remote examining, treatment, and simple surgery rooms. The next big food fad, after hipster locavores, will be individualized scientific diets, based on the theory that each person’s unique genetics, locations, and activities mean that she requires a specific diet, specially formulated each day”.

But medical applications are likely to develop more rapidly than this. Chip implants that yield personal information to a scanner have been around – controversially – for a decade, promoted for monitoring prisoners and hospital patients. And a person offered a surgical implant that could save their life by responding to real-time information is unlikely to decline out of deference to Edward Snowden.

All these developments, whether for lifestyle choices, medical care or lifesaving technologies, will require a significant trade off between privacy and the sharing of personal digital information. Will this happen? Revelations about the extensive misuse of surveillance by state agencies across the West has resulted in a backlash against sharing. We are becoming aware that our digital traces are everywhere we go, and we don’t like it very much. But despite this, we surrender our personal data every day in return for the conveniences this brings.

Anyone who uses any free Google service pays with the surrender of some personal data, and usually a lot. Google knows where its users are, and what they are interested in, by collecting information on Internet searches, the contents of e-mails sent and received, and geospatial information transmitted from smart phones and tablets. The payback for the loss of privacy is easier shopping, finding places anywhere and a fast, free and capacious e-mail service. We all want safe cities, with protection from everything from mugging to terrorism. Today’s cities are impossible to police effectively without constant, digital surveillance. Every person in Britain is now photographed on average 300 times each day, often without knowing it. In London, more than 16,000 sensors automatically record the location of anyone carrying an Oyster card. Digital number plate recognition systems record the movement of every car across motorway systems, linking back to the identity of the registered owner. We are, to go back to William Gibson’s prescient novel, already in the matrix, and this is a messy and complicated place to be.

And, finally, the engine of most contemporary change – consumerism. From the earliest Apple Mac to the latest iPhone, markets have directed and accelerated the advance towards a gigabit world. This Christmas’s best seller will be the wearable band, which offers a range of digital functions from paying for coffee to monitoring health patterns; market analysts predict that 43 million of these devices will be sold across the world. 28 million of these will be smartbands that connect to tablets, iPhones and other digital devices. And once this market is saturated, as it surely will be, what next? With 1000 megabits of information available every second, what could be more natural than tucking the microchip away beneath a fold of skin, perhaps along with a tattoo or body piercing?

But not for everyone. Respondents to the Pew Center survey also saw in this future an entrenched digital divide. Rex Troumbley, from the University of Hawaii, commented that “we should not expect these bandwidth increases to be evenly distributed, and many who cannot afford access to increased bandwidth will be left with low-bandwidth options. We may see a new class divergence between those able to access immersive media, online telepathy, human consciousness uploads, and remote computing while the poor will be left with the low-bandwidth experiences we typically use today.”

And so again we are in William Gibson’s dystopian world, or right back to the horror of Mary Shelley’s “miserable monster” and his reproach to his creator: “I ought to be thy Adam; but I am rather the fallen angel.”

Martin is currently Vice Chancellor of the University of Salford in Manchester and chair of the board of Jisc, the United Kingdom’s information technology service for higher and further education (www.jisc.ac.uk). 

Before joining Salford in 2009, Martin was Deputy Vice-Chancellor at the University of Cape Town (2002–2008) and the inaugural Dean of Higher Education Development at UCT (1999–2002). He is Emeritus Professor and Life Fellow at the University of Cape Town, a Fellow of the Royal Society of South Africa, a Principal Fellow of the Higher Education Academy and a Fellow of the Royal Society for Arts.

Read another post from Martin here: https://universitybusiness.co.uk/News/its_not_the_app_its_the_application

**

Pew Research Center, September 2014, “Killer Apps in the Gigabit Age”

Available at: https://www.pewInternet.org/2014/10/09/killer-apps-in-the-gigabit-age/