{{ story.headline }}

{{ story.timestamp }}

I'm excited about wearables. I'm a fitness and self-improvement nut, I'm a data fanatic, and I see a huge future in the Quantified Self and the Internet of Things. Unfortunately, after jumping in with both feet and trying several devices for anywhere from a few days to a few months, here's my verdict on the tech.

I'm no longer wearing any wearable.

Thus, all this marvelous technology is still relegated to niche status. To be honest, my six-year-old got more use out of most of the devices than I did -- as the most expensive spy toys he's ever been allowed to play with.

I'm not alone in this judgment call. I've read dozens of articles and reviews over the last two months claiming that wearables are still just a trend. And as angry and disappointed as I get with each new failure, each time I read something new I begrudgingly start to nod my head.

But then I remember that smart watches and wearables aren't the same thing. A smart watch is a wearable, but it's just one kind of wearable.

When I consider smart watches as a sector on their own, the technology available today reminds me of Windows Mobile smartphones circa 2006. There was so much you could do with them, but it was such a pain in the ass to do it that you never did.

"Hold on a sec, let me find my stylus."

A Phone Is Not a Laptop

What Microsoft did wrong with smartphones, and kept doing wrong right up until Windows Phone 7 for that matter, was try to recreate the personal computing experience in your hand. This didn't work, and it was actually Apple's limitation-first philosophy on iOS (along with, I'll say it, great design) that created a new set of use cases.

Apple realized you weren't going to do spreadsheets on your phone, and they quickly rolled out a plethora of apps designed for touch, swipe, location, and communication, the holy quaternity of mobile usage.

Android capitalized on and mass-marketed those use cases, creating a half-open-source ubiquity that, in my opinion, brought about the dominance of mobile-first development. Remember the days when you could develop for iOS and wait to release on Android, if at all?

Playing the mobile metaphor all the way to the end, there's obviously an iPhone-of-watches out there on the horizon. It doesn't even necessarily have to be the iWatch. It could be. But it doesn't have to be.

It definitely isn't anything we've seen so far. There have been steps in the right direction, but from everything I've tested, the use cases are the same -- make calls, read texts, count steps, maybe take photos.

All of which can be done on a phone.

A Watch Is Not a Phone

The problem with smart watches is that they're trying to be tiny smart phones, and they're not. The technology isn't there to support a form factor or battery life that makes the phone replaceable.

I don't want to talk into my wrist. The screen size is ill-suited to the vast majority of smart phone functions, including typing. The photos and videos look like dumb phone photos and videos circa 2006.

And the circle is complete.

So except for a slim cross-section of usage -- sending calls to voicemail, LOLs, and happenstance photos to document something like, I don't know, shipping something expensive -- I'm just gonna go ahead and take the extra three seconds to reach into my pocket for my phone.

The beauty of what I'll call the dumb activity tracker -- like a Jawbone or Fitbit -- is that, much like the iOS example above, it has many limitations. It can count my steps, roughly monitor my sleep, and wake me up. End of story.

The tradeoff is that most of the time I forgot it was there. I don't carry my phone all day -- maybe 90% of it -- and for that other 10%, the dumb trackers were counting the steps the phone couldn't. Same thing with sleep, although frankly I rarely used the dumb trackers for that, mostly because I don't like sleeping with the equipment on.

Eventually, however, I tired of the dumb trackers as well. Why? Because they ran out of new and interesting stuff to tell me.

The reason we all took off our watches in the 2000s wasn't that we had smart phones in our pockets telling us the time -- we had that years earlier with dumb phones. We took our watches off because, more often than not, something within a neck-crane was telling us the time whenever we needed to know it.

Yeah, people still wear watches for the look. But remember calculator watches? Those were pretty.

Keeping It Simple

There is a happy medium somewhere in there. The simplicity of the dumb trackers calls for smarts. Make a single button smart enough to be aware of other devices in range -- your car, the television, a friend's wearable -- and you're onto a whole new set of use cases. Add one- and two-finger touch, maybe swipe or tap, against the device itself rather than some ridiculously small screen, and now you're communicating.

But that last one, for all its potential for human error and battery drain, is only necessary if communication is a viable part of wearable usage.

I mean, it's a safe bet right now that the camera -- whether it's a Dick Tracy wrist mount or a glasshole "is-anybody-in-this-stall?" legal nightmare -- probably isn't. If I want to communicate, I'll pull out my phone. I can't imagine a use case where I wouldn't.
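The single-button, proximity-aware idea above can be sketched as a simple context dispatch. This is a minimal illustration, not any real device's API -- the device names, gestures, and actions are all invented for the example.

```python
# Hypothetical sketch: a one-button wearable whose single gesture is
# interpreted in the context of whichever known device is in range.
# Device names, gestures, and actions are illustrative, not a real API.

ACTIONS = {
    ("car", "tap"): "unlock doors",
    ("car", "double_tap"): "start engine",
    ("tv", "tap"): "pause playback",
    ("tv", "swipe"): "next episode",
    ("friend_wearable", "tap"): "send fist bump",
}

def handle_gesture(nearby_device: str, gesture: str) -> str:
    """Map a simple gesture to an action based on what's nearby."""
    return ACTIONS.get((nearby_device, gesture), "ignore")
```

The point is that the same tap means one thing next to the car and another next to the television -- the context, not a tiny screen, carries the complexity.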

I probably shouldn't be communicating with my wearable. But my wearable should be communicating with me and everything around me that I want it to talk to. This is the set of use cases that will lift the wearable from niche to ubiquitous device, but it's not going to happen without a whole lot of work on the data side.
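One way to picture that outbound-only communication is a publish/subscribe sketch, where the wearable broadcasts readings and nearby devices opt in as listeners. The class and field names here are hypothetical, purely to illustrate the shape of the idea.

```python
# Hypothetical sketch: the wearable as a broadcaster rather than a chat
# device. Nearby devices subscribe; the wearable just publishes readings.

class Wearable:
    def __init__(self):
        self.listeners = []

    def subscribe(self, callback):
        """Register a nearby device's callback, e.g. a thermostat."""
        self.listeners.append(callback)

    def publish(self, reading):
        """Push one reading to every subscribed listener."""
        for notify in self.listeners:
            notify(reading)

received = []
band = Wearable()
band.subscribe(received.append)  # a stand-in for a listening device
band.publish({"state": "asleep", "heart_rate": 52})
```

The hard part isn't the plumbing; it's the data side -- deciding which readings matter and who should get them.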

Wearables are still awesome. They are still the future, despite the naysayers. But until we settle on a common set of use-case functionality and the right lexicon of input and feedback, we'll see rev after rev of smart watches doing all sorts of tricky things but still, disappointingly, never taking off.
