01/20/2015 10:46 am ET Updated Dec 06, 2017

Silence in the Time of the Apple Watch


I hate eating by myself. Fortunately, given the moral imperative that living in Manhattan creates to have a rich and full social calendar, I rarely have to. But on the occasions when, out of choice or poor planning, I do have to break bread alone, I have noticed that I gravitate toward restaurants that are lone-diner friendly. By which I mean the kind of place where the hostess's face does not contort with concern or pity when you walk in and, with feigned nonchalance, announce "one for dinner."

In places like that, I often see other lone diners. What I also notice is that most of them, often including myself, are not really dining alone. They are dining with their smartphone or tablet screens, the soft bluish glow of the screens comforting, perhaps through allusion -- alluding to the connected, and hence rich life of the smartphone bearer, as evidenced by the tweets, Facebook messages or emails that await her attention.

Adults in the U.S. average over five hours per day spent on digital media. Smartphone separation anxiety is a thing, Internet and tech addiction are well-researched mental disorders, and a city in China has a "texting lane" for pedestrians. The "internet of things" -- sensors in objects from microwaves to pacemakers -- isn't just a technology fad anymore. And wearable tech is already last month's news.

For simplicity and brevity here, I will use "technology" to mean "digital technology" -- the worlds we reach and inhabit through anything that has a screen.

The current conversation about technology and its discontents seems to be dominated by two opposing and often highly polarizing viewpoints. One side, led by the technocrats, Silicon Valley entrepreneurs and the cults of Apple and Google, sees the debate as ridiculous at best, and at worst, refuses to acknowledge that "more technology is always good for all" may not be axiomatically true. The self-referential language of this group, secure in its own importance, does not help the conversation either.

A small but telling example of this exists in the advertising rhetoric sometimes used at the launch of Apple products: "This Changes Everything." If this were mere marketing hyperbole, it might be easier to pay less attention to. The fact is that the iPhone (and other smartphones) have changed everything, at least for that part of the world that can afford to have their "everything" changed by the iPhone.

On the other side of this debate are people whom the first side considers complete Luddites. And the voices from this other side, given their volume and reach, often end up preaching only to the choir. Or worse still, they get lost in their pandering to nostalgia -- to times when people could have an entire meal without feeling the need to check their phones even once, or when being someone's "friend" meant that you had actually met him or her in the flesh. Those times, clearly, seem to be over for a lot of us. The ubiquity (and the usefulness) of smartphones, tablets, and now wearable technology is a reality that is upon us.

What gets lost in this polar debate, though, is a nuanced examination of the effects of technology, both positive and not so positive, on our lives, bodies, minds, and relationships. And the choices we are and aren't making because of the often-crippling binary state of the conversation on technology. If you're not with us, you're against us.

In his recently published book, The Science Delusion, Curtis White presents a series of arguments against the "mechanistic materialism of neuroscience." He introduces a third option in the competition between the faith-based and reality-based conceptions of reality: the metaphor-based. He argues that through symbolic systems such as language, we have created a reality that may not have existed otherwise. He goes on to say, "We are not Homo sapiens but Homo analogos, the first creature to live not only in a physical environment of jiggling atoms but in an environment of its own devising. In fact, if we didn't first live in the analogue we would never have 'known' that we were made of jiggling atoms..."

In other words, our reality is not just mediated by these symbolic systems but also shaped by them. The next generation of this symbolic world that we have created and now inhabit is a world that comes to us heavily mediated by technology.

How does this constant mediation of reality by technology affect us? Does it change the nature of reality itself and not just our experience of it? (If I take a picture with my smartphone at a dinner with friends to post it on Facebook, does it change my experience of the dinner?)

Or, as a friend likes to say: "If it's not on Facebook, did it really happen?"

The need to archive and broadcast our experiences, no matter how excruciatingly banal they may be, has never been stronger in all of human history. ("Look! My first #pumpkinspice latte of the #fall! #latergram")

Have we approached the next stage in our "evolution" from Homo sapiens to Homo analogos -- and become the Homo digitalis?

And if so, what have we left behind?

Has technology made our lives easier in innumerable ways? Of course it has. The fact that I can make restaurant reservations and buy movie tickets while sitting on a bench in Central Park is very convenient. Carrying five books on my Kindle while on vacation, without their weight in my suitcase, makes life a tad easier. And never having to lose my way looking for that flamenco bar in the back lanes of Seville makes me a bit more secure. Add to this list scores of other apps, widgets and services that I am an active user of, and also mostly grateful for.

But, have I also lost something? Have I lost my ability to create a space of solitude while sitting on that Central Park bench, making restaurant reservations and ordering movie tickets? And have I lost something in my inability to get lost in the back lanes of Seville looking for that flamenco bar?

I came across a wine app recently that can instantly tell me everything I need to know about a wine while I browse in a wine store -- down to recent ratings, scores, tasting notes and what others are saying about the wine. All I need to do is take a picture of the label and let the app work its magic. I feel uneasy with this level of knowledge available to me in the act of picking a wine, an act that is enjoyable primarily because it is laden with some mystery. I can take in the subjective opinions of the (sometimes) knowledgeable staff at the wine shop. But to have the level of objective certainty and crowd-sourced wisdom offered by said app takes away from a process of wine buying, tasting and appreciating that involves a certain amount of not-knowing, slowness and self-discovery.

The always available instant-coffee version of knowing may not always be better than getting to know through a process of making essential mistakes. In many instances, crowd-sourced wisdom helps us make better choices and fewer mistakes. But, exactly because it is crowd-sourced, it also steers us towards the path well trodden and keeps us away from serendipity.

Last September, Apple unveiled a product that is all set to make wearable technology as ubiquitous as smartphones are today -- the Apple Watch. This device can make us instantly aware of every text message we receive, as well as our heart rate, the number of steps we walked today and the hours of REM sleep we got last night. In a nutshell, because it is worn on the body and is thus a part of it, it can tell us more about our bodies than we could ever know outside of a doctor's office.

And as Apple says prophetically in the blurb about this much-talked-about product on its website, "Apple Watch represents a new chapter in the relationship people have with technology." Apple, as usual, is right in saying this. In more ways than we can begin to realize.

It goes on to say, "It even gets your attention the way another person would -- by tapping you." And, "With Apple Watch, every exchange is less about reading words on a screen. And more about making a genuine connection."

Given the connection I already have with my smartphone today, I, Homo digitalis, am increasingly uneasy about this claim.