Picture this: you time travel back to the year 1800 and grab the first person you see.
You bring them back to the present and sit them in front of a computer.
Without any instruction, do you think they'd be able to navigate the internet?
Would they understand how to use the mouse and keyboard?
Would they even be able to turn it on? Probably not.
Time travel may not be possible yet, but AR is.
Except with AR, all of us are the people from the 1800s.
We aren't entirely sure how to interact with this groundbreaking technology.
We have a sense of what we want to do with it,
but we're still collectively figuring out a shared language for making it happen.
There's a term in technology called User Interface Metaphors, or UI Metaphors.
These are familiar interface patterns that carry across many different technologies.
For example, your phone, your computer,
maybe even your refrigerator all come equipped with qwerty keyboards these days.
As soon as we see one,
we know what it is and how to use it.
That's a UI Metaphor.
The hard part about AR is that it is such a new technology that it
doesn't really benefit from any UI metaphors. There's no precedent.
No context for how humans can and should manage
the data and unique digital opportunities that AR can bring us.
Should we use our hands or our eyes to navigate menus?
Should we select items on those menus with a tap,
a snap, or our voices?
Should we even have menus at all?
When it comes to AR, UI designers need to start from the ground up to research,
design, and test what works best for this unique platform.
Don't forget, computers didn't start with the optical mouse;
they started with punch cards.
In AR, we're still using punch cards.
But the hunt for something like the AR mouse is now underway.