Chemotherapy, the invaluable treatment that has saved countless cancer patients' lives by blocking different functions in cell growth and replication, stemmed from a World War II discovery: exposure to mustard gas in combat significantly reduced soldiers' white blood cell counts, sometimes fatally. Researchers went on to study the related compound nitrogen mustard and found that it halted the growth of rapidly dividing cells, such as cancer cells.
These discoveries informed the development of a series of similar but more effective agents for killing cancer cells. Incidentally, one such nitrogen mustard derivative, bendamustine, is still used to treat cancer patients today.
This is just one example of humankind turning a negative discovery into a positive one. With our combined creativity and an ever-growing capacity to collaborate, share and learn at scale, the human race continues to push the boundaries of invention. Now, on the brink of a new digital epoch with the Internet of Things (IoT) leading the way, how we choose to utilise technology to empower or disempower ourselves will determine our fate.
As Czarina Walker, founder and chief executive officer at InfiniEDGE Software, says: "The Internet of Things encourages laziness and drives innovation (at the same time)."
The IoT is proving to optimise health, extend longevity and improve overall quality of life. On the flip side, there is evidence that it will make us lonelier, more detached and more relationally dysfunctional than ever before. If we want to harness its extraordinary potential for human good, we have to acknowledge both ends of that spectrum of consequences, and then choose an acceptable compromise between control and convenience.
Mind the gap
Inarguably, the rise of the IoT is proving to be the most profound game changer of the 21st century. To date, an estimated 9 billion 'things' are 'talking' to one another across silos and network systems, gathering data that, through analysis and interpretation, enables optimised action within a resource-constrained global society.
With rich, accessible data to draw from, people are better informed than ever before, and empowered to understand and improve the systems that support their daily lives. On the other hand, that same hardware and computational power allows the analytics and apps on our devices to do some or all of the thinking for us. While convenient, this diminishes our autonomy as well as our capacity to think for ourselves.
According to a study led by Columbia University professor Betsy Sparrow on the 'Google Effect', "Our brains rely on the internet for memory in much the same way they rely on the memory of a friend, family member or co-worker. We remember less through knowing information itself than by knowing where the information can be found."
Take GPS as an example. Before automated navigation systems, we were forced to rely on our own reasoning and mapping skills: studying the road, getting lost, recalling landmarks.
We needed to memorise maps and directions. We had to be aware of waypoints, signage and the behaviour of other traffic. We argued with our spouses about stopping to ask for directions. We not only engaged our full cognitive faculties in the process; we spanned the spectrum of emotions in a single car ride.
But now, automation spares us all the drama and brain energy. From smart toasters to home apps to fridges that remind us to 'buy the milk', these inventions are ushering in more efficiency. This habit of retaining less information ourselves, because it can be retrieved from someone or something else, is known as transactive memory.
However, such ultra-dependence on smart technology is not only forging a kind of 'digital amnesia', as the Kaspersky researchers call it; it is subtly stripping away our freedom of choice and the attendant journey towards self-realisation.
Dr. Viktor Frankl, Holocaust survivor and psychiatrist, said: "Between stimulus and response there is a space. In that space is our power to choose our response. In our response lies our growth and our freedom." But if we have outsourced our most basic thinking to machines, so that we are no longer conscious of why we do the things we do or of how to change them for the better, are we really free? Can we really grow and own the consequences of our choices if there was never a conscious moment in which we chose them?
If our lives are increasingly entangled in a web of technological stimuli that shoulder the cognitive load for us, we may come to resent the dependence, not the freedom, we have created.
A vote for small thinking
Everything from supermarket checkouts to toll booths to factory work is being handed over to robots at a furious pace. But automation cannot be justified solely as an inevitable shift in global market trends; it is born of the fundamental postulate that humans are better versions of themselves when spared the 'little things'.
Oscar Wilde put it this way: "Unless there are slaves to do the ugly, horrible, uninteresting work, culture and contemplation become almost impossible. On mechanical slavery, on the slavery of the machine, the future of the world depends." The underlying assumption here is that, with machines doing all the braindead stuff, we are freed to design and innovate and focus on all things 'cultural and contemplative'.
However, according to David Krakauer, president of the Santa Fe Institute (SFI), one of the most concerning issues with the coming wave of technological development is our tendency to trade control for convenience. And it's true: how many times have we agreed to 'Terms and Conditions' with only a quick scan (if that) of the content?
Given the relentlessly invasive way the IoT and big data access and use our personal information, we need to remain mindful of the threats to our personal privacy and long-term security.
But what worries him, beyond our surrender to convenience, is human laziness. "What I worry about almost more than anything else is a certain kind of mental laziness, and an unwillingness to engage with the difficult issues… It's somehow more pressing in a time where there are systems out there willing to make the decisions for you," he says. "We don't have the right immune system to deal with who has access to our knowledge."
Creating gatekeepers
So, how do we deal with this? How do we ensure that the IoT, with its 5 quintillion bytes of decision-empowering data generated daily and its infinite raw material for building a brave new world, is actually getting us to do our best thinking?
The key is not to retreat into our technophobic caves, nor to blindly open every door that technology presents. We have to create ongoing interfaces: human touchpoints within the digital world that keep the individual cognisant of, and connected to, their choices.
New inventions like 'transformational products' are doing just that, engaging consumers in "conversations without words" through an "aesthetic of friction". Whereas most smart infrastructure is embedded as a quiet problem solver, these products are designed to disturb, disrupt and remind the user of good energy choices. One example is a worm-shaped extension cord that "writhes in frustration" when too many standby devices are plugged in.
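To make the idea concrete, here is a minimal sketch (in Python) of what that friction logic might amount to: poll each outlet's power draw, count the devices that appear to be idling in standby, and trigger physical feedback once a threshold is crossed. Every name, threshold and reading below is a hypothetical stand-in for illustration, not the product's actual implementation.

```python
import random
import time

# Illustrative assumptions only; not taken from the actual product.
STANDBY_MIN_W = 0.5    # below this, the outlet looks empty or switched off
STANDBY_MAX_W = 10.0   # below this (but above MIN), the device looks idle in standby
MAX_IDLE_DEVICES = 2   # the 'friction' threshold that triggers the writhe

def read_outlet_watts(outlet_id: int) -> float:
    """Simulated per-outlet power reading; a real cord would query a sensor."""
    return random.choice([0.0, 2.0, 3.5, 60.0])

def actuate_writhe(intensity: float) -> None:
    """Simulated actuator; a real cord would drive motors here."""
    print(f"Cord writhes in frustration (intensity {intensity:.2f})")

def check_standby(outlet_ids: list[int]) -> None:
    # Count outlets whose draw falls in the 'standby' band.
    idle = [o for o in outlet_ids
            if STANDBY_MIN_W < read_outlet_watts(o) < STANDBY_MAX_W]
    if len(idle) > MAX_IDLE_DEVICES:
        # Scale the feedback with how far past the threshold we are.
        excess = len(idle) - MAX_IDLE_DEVICES
        actuate_writhe(min(1.0, excess / len(outlet_ids)))

if __name__ == "__main__":
    for _ in range(3):               # a few polling cycles for demonstration
        check_standby([0, 1, 2, 3])
        time.sleep(1)
```

The point of the sketch is the design choice, not the engineering: the device deliberately makes waste felt rather than silently optimising it away, keeping the user inside the decision loop.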
Researchers from Cornell are also proposing a kind of digital infrastructure design that would inform users of their browsing history and expose the very information that was once deliberately hidden in code. A recipe for Big Brother paranoia? Possibly. But it is equally an uncomfortable reminder to be discerning, and to know that your actions have rippling, measurable effects.
The digital age comes packaged with both opportunities and perils of unseen proportions. One thing is for sure: we will need to take responsibility for our role in the creation and management of our digital future if we still want a say in what it looks like.
We need to do more thinking, not less!
This blog was authored by: Geoff du Toit