Think Different

Wednesday, 12 August, 2015

Apple is planning to change the technological terrain of this decade with the next level of wearable technology. The launch of Apple’s iWatch has been one of the most anticipated events in technology news this year, with a mammoth one million devices pre-ordered. The buzz surrounding this product is almost hysterical. But while smartwatches might be a novel idea, earlier releases such as the Pebble or the Android smartwatches were met with a somewhat lackluster reception. So why all the hype?

Let’s look at a few of the most impressive innovations of this device. First, it has haptic touch technology that enables the iWatch to respond to the user’s personalised range of context-specific controls. Second, it has a built-in speaker and microphone that allow the user to make phone calls and play music. And third, it has a fitness feature that uses Wi-Fi, GPS and an accelerometer to measure distance travelled, steps taken and speed. There is also a heart rate sensor, a somewhat gimmicky addition that will nonetheless appeal to fitness enthusiasts.

Then there are the rumours about clever software for the device such as a sensor that can detect whether the house lights have been left on or the device’s ability to show step-by-step recipes on the watch face. The potential for the iWatch to serve as a handy sidekick to our daily lives is undeniable. There is no doubt that this device is designed for utility and convenience, helping us to lead better, quicker and more efficient lives.

Even though these features don’t appear to be particularly groundbreaking, it is the permanent wearability of the iWatch that has the greatest implications for the user. Apple CEO Tim Cook claims it will “change the way you live your life”. So, for a generation of users already tethered to its technology, will the iWatch break down the barriers between body and machine more than ever before?

The iWatch can be worn 24/7, making it constantly accessible and giving it greater functionality within the user’s everyday life. Imagine this: even in the shower you are not insulated from the digital world. A vibration on your wrist can alert you to five missed calls from your mother or a cringe-worthy photo that has been uploaded to Facebook.

Let’s suppose that the Apple watch does embed itself in our lives in this new and even more invasive way. As technology becomes more intertwined with our experience of the world, we really must consider whether this evolution is such a good thing.

We all know that any new form of technology is usually met with warnings from the skeptics and the doomsayers. Do you remember hearing how mobile phones caused cancer or that microwaves were radioactive death boxes and that computers were going to blow up the world in Y2K?

But as exaggerated as many pessimistic claims about new innovations are, there is some very real evidence that our device-reliant lifestyles are changing the way we think, and making us ‘dumber’ in the conventional sense. The potential impacts of technology on learning and intelligence have been documented by ongoing research.

In the US, a study measured the effects of mobile phone usage on the test scores of high school students. Not surprisingly, the study showed that the proportion of teenagers who owned smartphones increased from 23% in 2008 to 37% in 2013 (and 78% had older-model mobile phones). This increase was linked to a decline in mean SAT scores over the same period, across all three skill sets of critical reading, mathematics and writing.

Psychologists have also described the ways that our thinking has changed as we have become more digitally literate. They speak about this in terms of crystallised intelligence and fluid intelligence, terms coined after Cattell’s 1940s model of intelligence. These refer to two very different aspects of intelligence, each uniquely influenced by the technology we have come to rely on.

Fluid intelligence can be likened to the RAM of a computer, but in the human brain. It determines our functional ability to gather large and complex bodies of information and to multi-task with speed and efficiency. This intelligence is also related to visual IQ, involving spatial visualisation, orientation skills and divided attention. The good news is that these facets of our intelligence have been bolstered and strengthened as we have adapted to the digital realm we now live in.

But while our fluid intelligence has been enhanced as we adapt our functioning to changing technological demands, our crystallised intelligence is predicted to take a nosedive. This type of intelligence is essentially our core store of knowledge: the facts and events we have learned and embedded in our memory, ready to draw upon at any time. It is the kind of information we should be able to retain and then retrieve for standard academic testing.

However, in a world where we rely on our gadgets to outsource, crowd-source and cloud-source information for us, this kind of storage capacity and intelligence is declining. Dr Tomas Chamorro-Premuzic terms this phenomenon the “hyper-link economy”. We are becoming just like our tablets and smartphones: our ability to solve problems now depends on our ability to connect to where the information is, rather than on our own retention of it.

Furthermore, Professor David Nicholas, director of Computer Research at University College London, compared our digital habits to a fast-food style of information consumption. Faced with information overload, we have become ‘scattered thinkers’ who no longer digest, analyse or critically evaluate content. Instead, we only skim the surface.

Our creativity and productivity have also been compromised. This occurs when we multi-task and multi-screen between calls, emails and digital content, rather than concentrating on a single task.

Because conventional intelligence and academic tests rely on our crystallised intelligence, it would be rash to ignore these impacts. How will we cope without our technological crutches when faced with multiple-choice or written-answer tests in our academic years?

And the arrival of Apple’s iWatch has the potential to reinforce our reliance on technology. Sitting inconspicuously on our wrists all day and night, counting our steps, measuring our activity and monitoring our sleep – are we really so far away from morphing into human cyborgs?

Guardian columnist Julian Baggini has voiced a similar concern, arguing that smartwatches will encourage the “auto-instrumentalisation” of ourselves, whereby we treat ourselves as machines to be maintained and serviced so that we operate at maximum efficiency. The health-analysis tools on these devices, for example, measure success through metrics and spreadsheets rather than through genuine improvements in quality of life or happiness.

Despite this new wave of technophobia, we cannot avoid new technologies. They are the future of our society and integral to the way we interact and operate in the modern era. However, perhaps we need to approach them with a little more skepticism and a little less hype. If we can reflect upon and absorb the information that is hurtled at us, maybe we have a chance of cementing our perspectives and identities outside of a technological device.

When the newest gadget promises to be “life-changing” and looks set to be as ubiquitous as the Apple iWatch, we would be wise to remain objective and critical thinkers, and not allow our existence to be dictated by the newest form of technology.