Cognitive Psychology and the Smartphone

The iPhone was released 10 years ago, and that got me thinking about the relationships I’ve had with smartphones and mobile devices. Of course, I remember almost all of them… almost as if they were real relationships. The first one, the Qualcomm QPC 860, was solid but simple. That was followed by a few forgettable flip phones and a Motorola “ROKR” phone that never really lived up to its promise.

But then came the iPhone, and everything changed. I started really loving my phone. I had an iPhone 3GS (sleek and black) and a white iPhone 4S, which I regard as the pinnacle of iPhone design and still keep as a backup phone. A move to Android saw a brief run with an HTC, and I’ve been in a steady commitment with my dependable and conservative Moto X Play for 2 years now. It’s with me every single day, and almost all the time. Is that too much? Probably.

Smartphones are used for many things

There is a very good chance that you are reading this on a smartphone. Most of us have one, and we probably use it for many different tasks.

  • Communication (text, email, chat)
  • Social Media (Facebook, Twitter)
  • Taking and sharing photos
  • Music
  • Navigation
  • News and weather
  • Alarm clock

One thing that all of these tasks have in common is that the smartphone has replaced other means of accomplishing them. That was the original idea for the iPhone: one device to do many things. Not unlike “the one ring”, the smartphone has become the one device to rule them all. Does it rule us also?

The Psychological Cost of Having a Phone

For many people, the device is always with them. Just look around a public area: it’s full of people on their phones. As such, the smartphone starts to become part of who we are. This ubiquity could have psychological consequences. And there have been several studies looking at the costs. Here are two that piqued my interest.

A few years ago, Cary Stothart did a cool study in which research participants were asked to engage in an attention monitoring task (the SART). They did the task twice, and on the second session, 1/3 of the participants received random text notifications while they did the task, 1/3 received a random call to their phone, and 1/3 proceeded as they did in the first session, with no additional interference. Participants in the control condition performed at the same level on the second session, but participants who received random notifications (text or call) made significantly more errors on the task during the second session. In other words, there was a real cost to getting a notification. Each buzz distracted the person just a bit, but enough to reduce performance.

So put your phone on “silent”? Maybe not…

A paper just published by Adrian Ward and colleagues (Ward, Duke, Gneezy, & Bos, 2017) suggests that merely having your phone near you can interfere with some cognitive processing. In their study, they asked 448 undergraduate volunteers to come into the lab and participate in a series of psychological tests. Participants were randomly assigned to one of three conditions: desk, pocket/bag, or other room. People in the other room condition left all of their belongings in the lobby before entering the testing room. People in the desk condition left most of their belongings in the lobby but took their phones into the testing room and were instructed to place their phones face down on the desk. Participants in the pocket/bag condition carried all of their belongings into the testing room with them and kept their phones wherever they naturally would (usually a pocket or bag). Phones were kept on silent.

The participants in all three groups then engaged in a test of working memory and executive function called the “operation span” task, in which participants had to solve basic math problems while keeping track of letters (you can run the task yourself here), as well as Raven’s progressive matrices, a test of fluid intelligence. The results were striking: in both cases, having the phone nearby significantly reduced performance on these tasks.

A second study found that people who were more dependent on their phones were affected more by the phone’s presence. This is not good news for someone like me, who seems to always have his phone nearby. They write:

Those who depend most on their devices suffer the most from their salience, and benefit the most from their absence.

Are Smartphones a Smart Idea?

Despite the many uses for these devices, I wonder how helpful they really are… for me, at least. When I am writing or working, I often turn the wifi off (or use Freedom) to reduce digital distractions. But I still have my phone sitting right on the desk and I catch myself looking at it. There is a cost to that. I tell students to put their phones on silent and in their bag during an exam. There is a cost to that. I tell students to put them on the desk on silent mode during lecture. There is a cost to that. When driving, I might have the phone in view because I use it to play music and navigate with Google Maps. There is a cost to that.

It’s a love/hate relationship. One of the reasons I still have my iPhone 4S is that it’s slow and has no email/social media apps. I’ll bring it with me on a camping trip or hike so that I have weather, maps, phone, and text, but nothing else: it’s less distracting. Though it seems weird to have to own a second phone to keep me from being distracted by my real one.

Many of us spend hundreds of dollars on a smartphone and several dollars a day for a data usage plan, and at the same time have to develop strategies to avoid using the device. It’s a strange paradox of modern life that we pay to use something that we have to work hard to avoid using.

What do you think? Do you find yourself looking at your phone and being distracted? Do you have the same love/hate relationship? Let me know in the comments.

References

Ward, A. F., Duke, K., Gneezy, A., & Bos, M. W. (2017). Brain drain: The mere presence of one’s own smartphone reduces available cognitive capacity. Journal of the Association for Consumer Research. https://doi.org/10.1086/691462

Stothart, C., Mitchum, A., & Yehnert, C. (2015). The attentional cost of receiving a cell phone notification. Journal of Experimental Psychology: Human Perception and Performance, 41(4), 893–897. http://doi.org/10.1037/xhp0000100

 

A Computer Science Approach to Linguistic Archeology and Forensic Science

Last week (Sept 2014), I heard a story on NPR’s Morning Edition that really got me thinking… (side note: I’m in Ontario, so there is no NPR, but my favourite station is WKSU, via TuneIn Radio on my smartphone). It was a short story, but I thought it was one of the most interesting I’ve heard in the last few months, and it got me thinking about how computer science has been used to understand natural language cognition.

Linguistic Archeology

Here is a link to the actual story (with transcript). MIT computer scientist Boris Katz realized that when people learn English as a second language, they make certain errors that are a function of their native language (e.g., native Russian speakers leave out articles in English). This is not a novel finding; people have known this. Katz, by the way, is one of many scientists who worked with Watson, the IBM computer that competed on Jeopardy!

Katz trained a computer model to learn from samples of English text such that it could detect the writer’s native language based on the errors in their written English. But the model also learned to determine similarities among the writers’ native languages. The model discovered, based on errors in English alone, that Polish and Russian have historical overlap. In short, the model was able to determine the well-known linguistic family tree among many natural languages.
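
To make that concrete, here is a minimal sketch of how this kind of native-language detection can work. It is emphatically not Katz’s model: it just trains an off-the-shelf scikit-learn classifier on character n-grams from a handful of invented example sentences, so that characteristic slips (like dropped articles) become learnable features.

```python
# Toy sketch of native-language identification from English text.
# NOT Katz's model; it only illustrates the general approach:
# featurize English writing samples and learn which native language the
# writer's characteristic errors point to. The tiny corpus is invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: English sentences by non-native speakers,
# labelled by the writer's native language.
samples = [
    "I am going to store to buy bread.",           # dropped articles
    "She give me good advice yesterday.",          # agreement/tense slips
    "I am agree with this opinion.",
    "He explained me the problem very detailed.",
]
labels = ["russian", "russian", "french", "german"]

# Character n-grams pick up small, systematic errors (missing articles,
# odd word endings) without any hand-built error taxonomy.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(samples, labels)

print(model.predict(["Yesterday I go to cinema with friend."]))
```

With real data you would need thousands of writing samples per language, but the same feature space is what lets a model measure how similar two native languages look, which is roughly how the family-tree structure falls out.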

The next step is to use the model to uncover new things about dying or disappearing languages. As Katz says:

But if those dying languages have left traces in the brains of some of those speakers and those traces show up in the mistakes those speakers make when they’re speaking and writing in English, we can use the errors to learn something about those disappearing languages.

Computational Linguistic Forensics

This is only one example. Another one that fascinated me was the work of Ian Lancashire, an English professor at the University of Toronto, and Graeme Hirst, a professor in the computer science department. They noticed that the output of Agatha Christie (she wrote around 80 novels and many short stories) declined in quality in her later years. That itself is not surprising, but they thought there was a pattern. After digitizing her work, they analyzed its technical quality and found that the richness of her vocabulary fell by one-fifth between her earliest two works and her final two works. That, and other patterns, are more consistent with Alzheimer’s disease than with normal aging. In short, they tentatively diagnosed Christie with Alzheimer’s disease based on her written work. You can read a summary HERE and you can read the actual paper HERE. It’s really cool work.
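
For a sense of what “richness of vocabulary” means computationally, here is a toy measure: the type-token ratio over a fixed window of words. The actual Lancashire and Hirst analysis is more sophisticated, and the file names below are hypothetical, but the flavour is the same.

```python
# Illustrative only: one simple vocabulary-richness measure, the type-token
# ratio (distinct words / total words). Lancashire and Hirst used more
# careful metrics over digitized novels; this just shows the flavour.

import re

def type_token_ratio(text: str, window: int = 10000) -> float:
    """Distinct words divided by total words over the first `window` tokens.

    A fixed window matters because the ratio shrinks as texts get longer,
    so raw ratios from novels of different lengths aren't comparable.
    """
    tokens = re.findall(r"[a-z']+", text.lower())[:window]
    return len(set(tokens)) / len(tokens) if tokens else 0.0

# Hypothetical file names for an early and a late novel.
early = open("early_novel.txt", encoding="utf-8").read()
late = open("late_novel.txt", encoding="utf-8").read()

print(f"early: {type_token_ratio(early):.3f}  late: {type_token_ratio(late):.3f}")
```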

Text Analysis at Large

I think this work is really fascinating and exciting. It highlights just how much can be understood via text analysis. Some of this is already commonplace: we educators rely on software to detect plagiarism, and Facebook and Google are using these tools as well. One assumes that the NSA might be able to rely on many of these same ideas to infer and predict information and characteristics about the author of some set of written statements. And if a computer can detect a person’s linguistic origin from English textual errors, I’d imagine it can be trained to mimic the same effects and produce English that looks like it was written by a native speaker of another language… but was not. That’s slightly unnerving…
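
Even the mundane case, plagiarism detection, rests on the same basic machinery: turn documents into vectors and compare them. Here is a bare-bones sketch, not how any particular product works, with made-up texts and an arbitrary threshold.

```python
# Bare-bones document-similarity check, the kind of comparison underneath
# many text-analysis tools (plagiarism detectors included). Not how any
# particular product works; the texts and the threshold are made up.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

submission = "The smartphone has become the one device to rule them all."
reference_docs = [
    "Smartphones replaced cameras, maps, and alarm clocks in one device.",
    "The smart phone has become the one device to rule them all.",
]

vectors = TfidfVectorizer().fit_transform([submission] + reference_docs)
scores = cosine_similarity(vectors[0], vectors[1:]).ravel()

for doc, score in zip(reference_docs, scores):
    flag = "possible overlap" if score > 0.8 else "ok"  # arbitrary threshold
    print(f"{score:.2f}  {flag}  {doc}")
```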