Smartglasses Face a Blurry Future
At the 2012 Google I/O conference, the big “reveal” was Google Glass. A team of Glass-wearing skydivers live-streamed their descent onto the roof of San Francisco’s Moscone Center, where the event was underway. It was an awe-inspiring stunt, but Google Glass flopped due to a buggy and ridiculous user experience, and the project was shuttered in 2015. Or was it?
The website for Glass proclaimed “Thanks for exploring with us,” but it also offered hope for the future: “The journey doesn’t end here.” Of course, Google can waste money on pie-in-the-sky projects forever because they print so much pie-in-the-sky money with their AdWords platform. But what about the rest of us? When should we expect some breakthrough capability with smartglasses? And what would that look like anyway?
I actually think other technologies related to the first Glass experiments are going to dominate our attention, and that is probably a good thing. Smartglasses were initially a symbol for three separate and distinct technology advances:
- A heads-up display that removes the need to position a separate display screen in your field of view.
- A hands-free user interface that lets you engage with an application and move the experience along without tapping a screen, pecking a keyboard, zapping a barcode, or using whatever other input you might choose.
- A camera application to capture and share the imagery in your field of view.
Let’s start with number 3. I decided to write this blog post when I saw that Snap (the company behind Snapchat) just disclosed in an investor update that it is writing off about $40 million in Spectacles inventory it cannot sell. In case you have not heard, Spectacles are smartglasses integrated with Snapchat that give the user a hands-free camera for sharing the imagery in their field of view through the Snapchat application. It flopped. But that is not the interesting bit. The interesting bit is that the glasses were $129 including the charging case. While not free, that is not bad for a first-generation, new-form-factor camera with LED lighting, a power source, and the electronics for connecting to other devices. I think experiments like Spectacles are going to lead to a simpler form factor: a lightweight, high-functioning camera that attaches to your glasses or the bill of your cap. It will simply attach to whatever application you are running via Bluetooth or WiFi, and now you have a hands-free camera to snap images or stream video to applications running on your smartphone or tablet.
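To make that concrete, here is a minimal sketch of how a companion app might discover and listen to such a clip-on camera over Bluetooth Low Energy. It uses the real Python `bleak` library, but the device name and the shutter-event characteristic UUID are assumptions invented for illustration; a real accessory would publish its own identifiers.

```python
# A minimal sketch of pairing with a hypothetical clip-on camera over
# Bluetooth Low Energy, using the real `bleak` library (pip install bleak).
# CAMERA_NAME and SHUTTER_CHAR_UUID are made up for illustration.
import asyncio
from bleak import BleakScanner, BleakClient

CAMERA_NAME = "ClipCam"  # hypothetical accessory name
SHUTTER_CHAR_UUID = "0000abcd-0000-1000-8000-00805f9b34fb"  # hypothetical

def on_shutter_event(sender, data: bytearray):
    # In a real app this would hand the event off to the companion
    # application (e.g., to start a snap or a video stream).
    print(f"Shutter event from {sender}: {data.hex()}")

async def main():
    # Scan for nearby BLE devices and look for the camera by name.
    devices = await BleakScanner.discover(timeout=5.0)
    camera = next((d for d in devices if d.name == CAMERA_NAME), None)
    if camera is None:
        print("No camera accessory found")
        return
    async with BleakClient(camera.address) as client:
        # Subscribe to shutter events, then listen for a while.
        await client.start_notify(SHUTTER_CHAR_UUID, on_shutter_event)
        await asyncio.sleep(30)
        await client.stop_notify(SHUTTER_CHAR_UUID)

asyncio.run(main())
```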
Item number 2, the hands-free user interface, is actually here today. It comes in two parts that everyone will quickly recognize. The first is the earpiece/microphone that we have all used or seen others use (Jawbone is a popular brand that has done well in the market). It allows you to give audible input to an application (likely running on your smartphone or tablet) and receive audio back from the application. The second part is Alexa (or Siri; pick your assistant). I think Alexa is actually going to be the game changer because Amazon is so good at productizing computing infrastructure for folks like ServiceTrade to incorporate into our applications. We also have experience with Google and Microsoft, and there are good reasons why Amazon is the market leader by a pretty wide margin. I believe Alexa will be another example of their market-leading competence in this area. The applications you use will have an Alexa interface that enables the technician to move the workflow along by saying “Alexa, move the workflow along” (a stand-in for whatever application option makes sense).
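For a sense of what that interface looks like in practice, here is a minimal sketch of an Alexa skill handler built with Amazon’s real ASK SDK for Python (`ask-sdk-core`). The intent name and the workflow function are hypothetical placeholders for whatever your application actually exposes; the SDK calls themselves are real.

```python
# A minimal sketch of an Alexa skill that advances a job workflow.
# Uses the real ASK SDK for Python (pip install ask-sdk-core).
# "AdvanceWorkflowIntent" and advance_job_workflow() are hypothetical.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name


def advance_job_workflow() -> str:
    # Placeholder: a real skill would call your application's API here
    # and return the name of the next step in the job.
    return "equipment inspection"


class AdvanceWorkflowHandler(AbstractRequestHandler):
    """Handles utterances like 'Alexa, move the workflow along.'"""

    def can_handle(self, handler_input):
        return is_intent_name("AdvanceWorkflowIntent")(handler_input)

    def handle(self, handler_input):
        next_step = advance_job_workflow()
        speech = f"Okay, moving to the next step: {next_step}."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(AdvanceWorkflowHandler())

# Entry point for AWS Lambda, where Alexa skills typically run.
lambda_handler = sb.lambda_handler()
```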
Item 1, the heads-up display, is the hard bit. Not because the idea is new or novel; pilots, for example, have been using heads-up displays in aircraft since the mid-90s. It is difficult because shrinking it to work in a miniature, mobile environment like a pair of glasses is a difficult piece of physics. The display only works correctly if the user can see the application interface in the same plane of focus as the other items of interest. If I understand my research correctly, the approach used by Google Glass is a near-retina display: the image is projected directly onto the retina, so there is no issue with depth of focus. The information is just “there” for the retina to absorb without refocusing on a “closer” screen display.
What Google Glass got wrong (in my humble opinion) was trying to introduce all three elements in a single device while simultaneously assuming that the applications where we might use the technology were readily available. None of the technologies had evolved enough for an “all in one” device to be successful. I am not a fan of “all in one” applications anyway, as I find they typically suck at most of the things they attempt for the sake of claiming a longer checklist of “features.”
Instead of the “all in one” that flopped for Google (although the physics breakthroughs they achieved with the display are impressive), I believe you will begin to see small changes sneak up on you. It is easy to imagine someone with a Bluetooth Jawbone and a visor-mounted camera collaborating via FaceTime with a remote colleague. There is nothing extraordinary there because all of the technology is already well developed. I can also imagine a technician setting up their tablet beside a piece of equipment and asking Alexa to play, pause, and rewind a recorded video of how to repair it, hands-free, with an interactive application we already use every day.
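Stitching that scenario together is mostly plumbing. Here is a minimal sketch, in plain Python, of routing recognized voice commands to video-player actions; the player class and command names are hypothetical stand-ins for whatever video SDK your application already uses.

```python
# A minimal sketch of mapping transcribed voice commands to player actions.
# VideoPlayer is a hypothetical stand-in for a real video SDK.
from dataclasses import dataclass

@dataclass
class VideoPlayer:
    position: float = 0.0   # playback position in seconds
    playing: bool = False

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def rewind(self, seconds: float = 10.0):
        # Jump back, but never before the start of the video.
        self.position = max(0.0, self.position - seconds)

def handle_command(player: VideoPlayer, command: str) -> None:
    # Route a transcribed utterance to the matching player action.
    actions = {
        "play": player.play,
        "pause": player.pause,
        "rewind": player.rewind,
    }
    action = actions.get(command.strip().lower())
    if action:
        action()

player = VideoPlayer()
for utterance in ["play", "rewind", "pause"]:
    handle_command(player, utterance)
```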
There is a phrase in my industry: the “consumerization of IT.” Basically, it means that end-user consumer applications of new technology will generally lead the market before commercial applications become available. That seems counterintuitive until you realize that consumer spending makes up about 70% of the US economy. It just makes sense that the titans of technology such as Amazon, Apple, and Google would focus their research and development dollars on the biggest available market. If you want to experiment with things that are likely to improve your commercial application, don’t look for some big breakthrough from a wildly new and different application. Instead, focus on the commercials you see during the holidays that demonstrate how you can display an eggnog recipe and play holiday music by commanding Alexa to do so. Pay attention to the display of best-selling gadgets at Best Buy from companies like Jawbone that connect to applications on your phone. Then go play around in the context of your work for customers and find innovative ways to put these consumer breakthroughs to work for the benefit of your customers and your company.