Augmented Reality: A New Frontier for Digital Commerce?

The other day I had my blood drawn for some tests.

I went to my local clinic and the nurse did my usual blood work. What was not usual was her missing my vein three times with the needle. Not ok!

Just as I was about to tell her I’m a trained battlefield medic and I could do it myself if she wanted to take a break, she finally managed to hit the damn vein.

Fast forward to 2020. I again arrive at my local clinic, but this time the nurse enters wearing a peculiar pair of glasses. The rest of the procedure is the same, except that she hits the vein on the first try. Yay!

After it's done, I ask the nurse what’s up with the glasses. She explains that they allow her to see where my veins are and hit them perfectly every single time.

Sounds like some future sci-fi story, right?

Wrong. This device already exists and offers just one example of how augmenting our perceived reality can have real-life benefits.

 

What is Augmented Reality (AR)? 

Augmented reality takes your real-world experience and adds new digital layers to it. It can be as simple as adding new elements like video, graphics or other data to your field of view, or as deep as altering what you see, hear and feel all at the same time.

For that to work, we need specialized hardware and software. Let’s say we want to alter what we see. In that case, the specialized device can be as simple as the smartphone we already carry around with us all the time.
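To make that concrete, here’s a minimal sketch of the simplest form of augmentation: drawing a live data layer over a camera feed. It’s Python with OpenCV, and the overlaid timestamp is just a stand-in for whatever data an app might care about:

```python
# Minimal "augmented view" sketch: overlay live data on a camera feed.
# Requires: pip install opencv-python
import datetime
import cv2

cap = cv2.VideoCapture(0)  # the default webcam stands in for a phone camera

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # The "digital layer": any data you like, rendered on top of reality.
    label = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    cv2.putText(frame, label, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

    cv2.imshow("Augmented view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```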

So it’s no surprise then that as mobile phones with cameras got more and more popular, so did apps that altered our field of view with new elements.

They were fun and made us laugh:

 

While the tech may have found mainstream success with simple apps, the technology of altering what you’re seeing has many other use cases – from fighter pilot helmets and virtual dressing rooms to furniture sales, aircraft maintenance, healthcare and more.

Let’s look at all of the above and more, concentrating on two verticals – augmenting what we see and augmenting what we feel through touch (i.e. haptic feedback) – and on how those two technologies combined can have world-changing consequences.

Augmenting Your Field of View

When you hear someone talk about AR, augmenting one’s field of view is most likely what they mean. The technology enhances your physical surroundings with digital information. Unlike Virtual Reality (VR), you don’t escape the physical world; you’re only adding new elements to enhance it.

While games like the above Starbucks example are fun to play, they don’t accomplish much besides killing some time. 

There is so much more that the technology can achieve. The vein viewer example from the beginning of this article adds valuable information to your field of view. But there are many more examples and opportunities. 

For example, IKEA has offered augmentation technology for a couple of years now that enables customers to visualize virtual furniture in their own homes:

 

Taking the same basic idea of adding virtual things to a real-world environment, Swivel allows users to virtually try on different articles of clothing in real time. That takes “window shopping” to a whole new level!

 

And Disney has developed an app that makes characters from coloring books come alive as 3D images.

Augmented reality isn’t just being used for promotional or sales support. It is also the product itself.

For example, PlayStation’s The Eye of Judgment fuses AR and card-based role-playing to create a unique experience. More recently, Pokémon GO was announced, which will use a combination of AR, geocaching and multiplayer gaming to simulate being a real-world Pokémon trainer (whether you like the franchise or not, you have to admit this is cool).

 

A different take on augmenting our field of view involves the augmentation happening in or on an object itself – helmets, car windshields, mirrors.

Fighter pilots have been using head-up displays (HUDs) that show relevant extra information for decades, and even in cars, basic HUDs that display your speed have been around for ages.

Picture of a smart mirror

Image via Adafruit

The DIY community has been playing around with smart mirrors for a while now. They display basic information like the date, time, calendar events and more on a mirror surface. You’re looking at the mirror every morning anyway, so you might as well get some relevant and useful information from it, without needing to check your smartphone.
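If you want to tinker with the idea yourself, the software half of a smart mirror can be as simple as light text on a black background (which is what shows through a two-way mirror). Here’s a minimal sketch using Python’s built-in tkinter; the font and layout are just placeholder choices:

```python
# Minimal smart-mirror display: white text on black shows through
# a two-way mirror. Uses only Python's built-in tkinter.
import time
import tkinter as tk

root = tk.Tk()
root.configure(bg="black")
root.attributes("-fullscreen", True)

clock = tk.Label(root, fg="white", bg="black", font=("Helvetica", 64))
clock.pack(expand=True)

def tick():
    # Show time and date; a real build would add calendar, weather, etc.
    clock.config(text=time.strftime("%H:%M:%S\n%A, %B %d"))
    root.after(1000, tick)  # refresh once per second

tick()
root.mainloop()
```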

Taking the same basic idea of adding relevant information and displaying it on a screen, but applying it to car windshields, we get smarter windshields like the one below:

 

The technology has been around for years (the demo is from 2010), but we’re only just starting to see it built into cars. Traffic info, warnings about people and other objects on the road, and navigation cues all help drivers stay focused on the road instead of looking down at a navigation device.

Now imagine how many accidents we could avoid if rearview mirrors, navigation, speed and more could be incorporated into a motorcycle helmet. How cool would that be?

Well, you don’t have to imagine it as this smart helmet actually exists and is made by a company called Skully:

 

Now, take the same basic concept of augmented helmets and apply it to large-scale manufacturing and industry. What you end up with is the DAQRI Smart Helmet.

The Smart Helmet is an industrial-grade human-machine interface that overlays real-time information, work instructions, safety information, mapping and more to maximize safety, productivity and well-being for workers in a variety of industrial settings.

It’s an all-in-one system, meaning that all the technology you need is inside the helmet, with no wires or anything of that sort to obstruct the working environment.

 

One of the use cases for the helmet is overlaying relevant information on top of what the wearer sees in their physical environment. For aircraft maintenance workers, it recognizes all the different mechanical parts and gives instructions on what to look for and how to complete tasks effectively.

Now, imagine using the same technology to help you assemble the million and one pieces of furniture you just got from IKEA. You take all the parts out of their boxes and lay them in front of you, then fire up the IKEA app on your smartphone and voilà – you have exact step-by-step instructions on which parts go together and how you’re supposed to assemble them.

Image of augmented aircraft maintenance

Screencap via The Verge

Microsoft HoloLens and Magic Leap Are Just the Beginning

So far, augmenting what we see has mostly meant adding a static data layer, like text notifications and numbers, on top of the view.

Even with tools like the IKEA app that place new objects into our field of view, we can’t move closer to see a table from a different angle, because the table isn’t fixed to anything; it just floats on the screen.

Now, what would happen if we could somehow make virtual objects stay in place (or come alive and move around), while at the same time making the viewing device wearable and able to recognize voice or hand-gesture commands? We would get devices like Microsoft HoloLens and Magic Leap.

 

What makes this technology different is that it tries to combine the best aspects of virtual reality, augmented reality and real life. Its promise is to let you treat virtual objects as “real” and see them from different sides and vantage points. As you walk closer, the object gets bigger and vice versa – just like a real-world object would.
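Under the hood, this “anchoring” boils down to keeping the object at a fixed position in world coordinates and re-projecting it into the display every frame based on where the viewer is standing. Here’s a toy sketch of that idea with a simple pinhole camera model; the focal length and positions are made-up illustration values, and real headsets add full head tracking, rotation and much more:

```python
# Toy "anchored object" demo: a point fixed in world space is
# re-projected every frame as the viewer walks toward it.
import numpy as np

FOCAL = 800.0  # focal length in pixels (made-up illustration value)

def project(world_point, camera_pos):
    """Pinhole projection of a 3D world point into 2D image coordinates.

    The camera looks straight down +Z with no rotation, to keep the
    sketch tiny; real systems use the full 6-DoF head pose.
    """
    x, y, z = world_point - camera_pos
    return FOCAL * x / z, FOCAL * y / z

anchor = np.array([0.5, 0.2, 5.0])  # virtual sofa, ~5 m in front of origin

for step in range(4):
    camera = np.array([0.0, 0.0, float(step)])  # walk 1 m closer each frame
    u, v = project(anchor, camera)
    depth = anchor[2] - camera[2]
    # Apparent size scales as FOCAL / depth: closer => bigger, like a real object.
    print(f"depth {depth:.0f} m -> image pos ({u:.0f}, {v:.0f}) px, "
          f"scale {FOCAL / depth:.0f} px per metre")
```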

One of the first companies to get a chance to work with this technology (neither HoloLens nor Magic Leap is selling any products at this stage) is Volvo, which used the Microsoft HoloLens platform to produce a virtual showroom of sorts.

The rooms are, of course, real, but everything else is virtual. It gives consumers the ability to get up close and personal with the car, see it from every possible angle and understand the technology and everything that goes into manufacturing it.

Volvo car augmented into a showroom

Screencap via YouTube

In a couple of years, when this technology is consumer-ready, I could easily see IKEA taking advantage of this ability to “anchor” virtual objects.

So, we could take a bookshelf and virtually attach it to our wall, then walk around the room and see it from multiple angles. Plus, once the device is attached to your head, your hands would be free, and assembly instructions would suddenly get a whole lot more useful.

Further in the future, this technology will be available as a contact lens. I know this sounds like science-fiction, but Google is already working on smart contact lenses.

Image of a smart contact lens by Google

And yes, I know that in its current incarnation the technology is only capable of measuring blood sugar, BUT it also serves as a proof of concept for electronic chips small enough to fit inside a contact lens. With that in mind, the future’s brilliant (and clear) indeed.

The short film "Sight" gives us a glimpse into what day-to-day life would look like when the screen is directly on your eye.

Sight from Robot Genius on Vimeo.

How Haptic Feedback Will Allow Us to "Feel" the Digital World

Apple Haptic Engine - Picture

Image via Apple

In March 2015, Apple came out with the new 12-inch MacBook, which made the news in many different ways. It had a new, smaller form factor, a single USB-C port and something that Apple called “Force Touch.”

Force Touch, known on iPhones as “3D Touch,” is Apple’s take on haptic feedback technology. It creates a sense of touch by applying forces, vibrations or motions to your fingers, tricking you into feeling something that is not there.

Although the technology seems like something from a sci-fi movie, it was first demoed in the early 1990s.

In its most rudimentary form, haptic feedback is the rumble of the controller when you play games on a console (force feedback), or the vibration of your smartphone when the sound is turned off. It’s even being used in athletics to “teach” your muscles to behave like an athlete’s.

Image of Disney's Haptic Feedback screen

Image via Disney Research

All of the above use cases are interesting and without a doubt useful, but things start to get really interesting when haptic feedback produced by electrostatic forces is used to “feel” different textures on a glass surface. This is known as “electrostatic vibration,” and it uses no moving parts.

The technology provides a wide range of tactile sensations to fingers sliding across surfaces of any shape and size, from small mobile displays to curved or wall-sized screens. It can be easily combined with a wide range of touch sensing technologies, including capacitive, optical and resistive touch screens.

 

You can “feel” a texture simply by sliding your fingers across a screen. Feeling texture on a display – how awesome is that?
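The core trick, as in research systems like Disney’s TeslaTouch, is to modulate the friction between finger and glass in sync with finger motion: a texture with a given spatial period becomes a drive signal whose frequency is proportional to finger velocity. Here’s a hedged sketch of that mapping; the 1 mm period and voltage range are invented illustration values, not specs from any real device:

```python
# Sketch of electrostatic texture rendering: turn finger motion plus a
# texture's spatial period into a friction-modulation drive signal.
# All hardware constants here are illustrative, not from a real device.
import math

TEXTURE_PERIOD_M = 0.001    # virtual "ridges" every 1 mm (assumed texture)
V_MIN, V_MAX = 0.0, 120.0   # drive voltage range (made-up numbers)

def drive_signal(finger_velocity, finger_position):
    """Return (drive frequency in Hz, instantaneous voltage) for a finger
    sliding at finger_velocity (m/s), located at finger_position (m)."""
    # Crossing ridges of period p at velocity v feels like frequency v / p.
    freq = abs(finger_velocity) / TEXTURE_PERIOD_M
    # Tie the modulation to position so the texture stays put on the glass.
    phase = 2 * math.pi * finger_position / TEXTURE_PERIOD_M
    volts = V_MIN + (V_MAX - V_MIN) * (0.5 + 0.5 * math.sin(phase))
    return freq, volts

# A finger sliding at 5 cm/s should feel a ~50 Hz friction ripple.
print(drive_signal(0.05, 0.0123))
```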

“Pretty awesome indeed, but where can this kind of technology be used in a meaningful way?” is a perfectly reasonable question. For the answers, let’s go back to IKEA.

Keep in mind, the video above was produced in a lab over five years ago. By the time these technologies reach the mainstream, the IKEA app on our HoloLens-like wearable will be smart enough to project an accurate, high-quality 3D model of my future sofa into my apartment. I’ll walk around the room, change the color of the sofa, and look at it from different angles and distances. And thanks to haptic technology, I’ll also be able to get an accurate feel for the material: I’ll simply run my fingers across my smartphone’s haptics-equipped screen.

Tell me: why would I ever want to go to a brick-and-mortar store again, when I can get this level of detail while shopping from home and have my order delivered in hours by services like UberRUSH and Amazon drones?

But that’s not all; we can still add more layers to the experience by adding a “virtual” sense of weight to our hand via nerve manipulation.

Manipulating Nerves and Giving Weight to Digital

Image of a bionic hand

Image via Bebionic

Years ago, if you were in a terrible accident that resulted in, God forbid, the amputation of your arm, your only option was a prosthetic hand or arm that looked like the real thing but couldn’t function in any meaningful way.

Thanks to advances in technology, today’s high-tech prosthetic hands use individual motors and microprocessors driven by the electrical impulses generated by the biceps and triceps.
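Conceptually, the control loop in such a myoelectric hand is simple: read the muscle’s electrical activity, smooth it into an envelope, and map that to a grip command. Here’s a stripped-down sketch; the synthetic signal, threshold and smoothing factor are all placeholder values standing in for real electrodes and per-user calibration:

```python
# Conceptual myoelectric control loop: EMG envelope -> grip command.
# Uses a synthetic signal; real systems read surface electrodes.
import random

THRESHOLD = 0.4     # activation threshold (arbitrary illustration value)
ALPHA = 0.1         # smoothing factor for the signal envelope

def read_emg_sample():
    """Stand-in for an electrode read; returns rectified activity in 0..1."""
    return random.random()

envelope = 0.0
for _ in range(20):
    sample = read_emg_sample()
    # An exponential moving average approximates the EMG envelope.
    envelope = (1 - ALPHA) * envelope + ALPHA * sample
    command = "CLOSE_GRIP" if envelope > THRESHOLD else "OPEN_GRIP"
    print(f"envelope={envelope:.2f} -> {command}")
```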

And soon there’ll be bionics on the market that connect directly to the brain and enable a bionic hand to also feel the sense of touch. In short, amputees can regain total control of their new bionic arms and continue to live a fulfilling life. That, in and of itself, is an amazing technological achievement. But that’s not all.

The same technology, and the same electrical impulses, can be used to trick the hand into sensing weight that is not there. A company called UnlimitedHand uses electrical muscle stimulation (EMS) to simulate sensations similar to electrostatic vibration (e.g. feeling textures).

Additionally, because our sense of weight is muscle-based, it’s possible to simulate pressure by stimulating the muscles with a precise amount of low-level electrical energy. Our brain then concludes that this amount of pressure corresponds to an object of roughly this weight.
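In code, that idea reduces to little more than a calibration curve: virtual mass in, stimulation intensity out, clamped to a safe ceiling. Everything numeric in this sketch is invented for illustration; a real device would calibrate the gain per user and per muscle:

```python
# Sketch: map a virtual object's mass to an EMS intensity so the arm
# "feels" weight. All constants are invented for illustration only.

MAX_INTENSITY = 1.0   # device's safe ceiling (normalized, assumed)
GAIN = 0.15           # intensity per kilogram (per-user calibration value)

def ems_intensity(virtual_mass_kg: float) -> float:
    """Return a normalized stimulation intensity for a virtual weight."""
    level = GAIN * virtual_mass_kg
    return min(max(level, 0.0), MAX_INTENSITY)  # clamp to the safe range

for mass in (0.5, 2.0, 10.0):
    print(f"{mass:.1f} kg -> stimulation {ems_intensity(mass):.2f}")
```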

How Will We Shop in the Future?

I’ve been using IKEA as an example throughout this article to drive home how all this new technology will impact and add to our shopping habits. However, the truth is that the future of shopping might only involve us using augmentation and haptic technology to get a sense of how products will look and feel. 

Naturally, as smart watches and other wrist wearables evolve, they may work in tandem with smart contact lenses and provide haptic feedback via muscle stimulation that will allow us to "feel" the digital layer on top of the physical world. 

Looking for new clothes? Simply go and stand in front of your smart mirror equipped with Kinect-like sensors and the technology will take care of the rest. It’ll augment your mirror image to correspond to whatever you try on.

Wondering how the material feels on your skin? Touch the mirror and find out.

Shops, as we know them right now, may not exist anymore. There’s little they can offer that would make the buying experience better than the one that I can have in the comfort of my home.

The future looks bright, very bright indeed. And the crazy part is that most of the technology covered here is already available. 

Yes, it might be too cumbersome or too heavy or too big for everyday use for now, but it’s out there. It just needs a little more time and effort. Soon, we’ll start to see and use unique technologies in our everyday lives. It’s a good time to be alive.


 

About The Author

Ott Niggulis is a chef/paramedic/freelance writer who focuses on marketing and CRO. Marketing is a numbers game and he loves numbers. Follow him on Twitter.