James Gips died last summer. You’ve probably never heard of him. Me neither. But I happened across his obituary in the Wall Street Journal, and the title intrigued me: “Professor Helped Disabled Use Cursors.” I read on.
Gips taught computer science at Boston College, and his research helped spur an innovation in eye-controlled cursor technology. At first, he and his colleagues thought their work would have a gaming application, and they dubbed their product EagleEyes. “It involves attaching electrodes to skin around the eyes so they can pick up electrical signals generated by eye movements and relay them to a computer.” Pretty cool stuff. Just the kind of flashy electronic accessory to give the edge to a new gaming platform.
Yet EagleEyes also came to the attention of Kathy Nash, a Massachusetts mom who had a different idea. “It would be perfect, Ms. Nash thought, for her teenage son Michael, whose severe disabilities left him virtually paralyzed.”
Nash contacted Gips with her idea, and she wouldn’t take no for an answer. Eventually, with Gips’s help, Michael became quite adept with EagleEyes, using it for all kinds of interactive, creative communication; he even went on to attend classes and graduate from high school.
And Gips? “The professor,” continues the obituary, “spent years refining the technology to help disabled people escape locked-in lives.” He’d been thinking video game innovation, but Michael and Michael’s mom helped him gain a more expansive perspective.
Same technology; different perspective. The upshot, especially for Michael and those like him? Sublime.
That’s the kind of expanded perspective I try to instill in my beginning nursing students. Right now, they’re in a nursing home clinical rotation, and they’re each assigned a resident to care for each week. It’s no surprise that the students are very motivated to engage in what we might call nursey activities – passing meds, changing dressings, listening to lung sounds, stuff like that – and those are the very things that fill their clinical journals every week. “I completed a head-to-toe assessment on my assigned patient,” somebody will write, “and I detected an area of skin breakdown which I reported to the staff nurse.” That’s great, of course, and certainly representative of the skills they’ve come to nursing school to learn about and practice in the first place.
Yet, often enough, I also get comments like this: “My resident was fairly independent and didn’t require a lot of care. After passing meds and taking a set of vital signs, I spent the rest of the morning just listening to her stories. She was very sweet.”
Just listening. That “just” is the giveaway. It signifies that, relative to more hands-on interventions, listening to stories somehow doesn’t count. It’s an extra – an add-on. It’s nice to do, but not nearly as important as the nursey stuff.
That’s simply not the case. In fact, as I emphasize repeatedly, it just might be that listening to stories – just sitting there and listening and nodding and listening and smiling and listening some more – could well be the most important thing they do all day. Particularly in the nursing home, where some folks won’t get family visitors for weeks or months at a stretch, and where the staff simply may not have the time to regularly pause with individual residents who yearn to share their lives and connect.
The nursing students are all about helping people, but they’re understandably focused on physiological help – giving shots and monitoring vital signs, for instance, the new skills they’ve been acquiring. Part of my job is to enlarge the students’ perspective so that they’ll value the ordinary, old skills they’ve been practicing their whole lives – like attentive listening and compassionate courtesy – and intentionally incorporate them into their care. Like Gips and his technology, my students may approach their clinicals with one purpose in mind (nursey stuff), but their actual contact with patients leads to a broader vision.
My favorite image of what I’m getting at occurs in the movie “Wit” (2001). Vivian (Emma Thompson), an aloof academic, has advanced ovarian cancer and is receiving treatment at a hospital. At one point, she kinks her I.V. tubing, and the computerized pump, sensing an obstruction, beeps out an alarm. It’s the middle of the night, and Vivian knows that the beeping will elicit a response from the staff. She just wants somebody to talk to. She’s scared. Alone. She wants company. The beeping I.V. pump is only a pretense.
The nurse, Susie (Audra McDonald), enters the room with one thought in mind: attending to the beeping pump and restoring the flow of Vivian’s I.V. fluids – a medical task. Once that’s accomplished, she turns to her patient and reaches out. “What’s the trouble, sweetheart?” Susie asks. That simple question, that ordinary, simple turning from object to subject, coaxes forth a cathartic response from Vivian. She confesses her fears and self-doubt. She weeps. She drops her emotional guard and receives Susie’s tender care. “It’s okay. It’s alright,” the nurse repeats over and over as she strokes Vivian’s arm. “It’s okay. It’s alright.” It’s a moment of profound healing for Vivian, and it had very little – nothing, really – to do with the I.V. infusion.
Although this enlarged, multilayered vision of our interactions with others lends itself most easily to the healthcare realm, it has application everywhere and for all of us. Case in point: A story I caught on NPR recently about Amazon’s use of artificial intelligence to expedite their same-day delivery service. “AI is key to Amazon's retail forecasting on steroids,” reports NPR’s Alina Selyukh, “and its push to shave off minutes and seconds in the rush to prepare, pack and deliver.” The computerized crunching of unimaginable amounts of data allows some Amazon customers to expect their purchases to arrive within hours of clicking on the website – even as quickly as a single hour!
Accomplishing that feat, especially in the crush of dense urban traffic, means that hundreds, maybe thousands of factors have to be accounted for – from anticipating what products people will buy, to where they’re stored, to how they’re transported. AI even has to take into account outlier scenarios, according to Amazon’s Cem Sibay. “The driver forgets his key at reception and has to walk a little bit longer,” Sibay says. “The driver is delivering a package, and it's an elderly lady. And they, you know, talk a little bit.”
Exactly, and here’s the connection with Vivian’s I.V. tubing and Susie’s soothing presence: It just might be that the elderly lady ordered the bauble or book, whether consciously or not, because she knew somebody would be bringing it to her door. It just might be that having someone to talk to was her objective in the first place. “It's hard for AI to predict all these scenarios,” Selyukh comments. “But next time, maybe the address with the Chatty Cathy will get a few more minutes baked into the algorithm.”
I guess that’s one way to look at it. I prefer to see it as another object lesson in expanded vision. The big difference, of course, is that Amazon’s pause to listen is driven by profit. Ours will be – ought to be – driven by love.