Gestures are a way of interacting that has become
popular with touchscreen phones and tablets.
They're normally movements of your hands,
arms or fingers that have a definite meaning or result.
Classic examples on a touchscreen are swipe gestures to scroll through a list,
or pinch to zoom where moving your fingers apart or together zooms in or out of an image.
In VR, if you have a six-degree-of-freedom controller,
you can interact with your whole body.
You can swipe with your arms or zoom by moving your whole arms in and out.
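To make that arm-zoom idea concrete, here is a minimal Unity C# sketch of a two-handed zoom. It scales a target object by the ratio of the current to the initial distance between two tracked hand transforms. The names (TwoHandZoom, leftHand, rightHand, target) are illustrative, and the input wiring that would call BeginZoom and EndZoom is left out.

```csharp
using UnityEngine;

// A minimal sketch of a two-handed zoom gesture in Unity.
// Assumes leftHand and rightHand are Transforms driven by tracked
// 6-DOF controllers (hypothetical names, assigned in the Inspector).
public class TwoHandZoom : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;
    public Transform target;          // the object being zoomed

    bool zooming;
    float startHandDistance;
    Vector3 startScale;

    // BeginZoom/EndZoom would be called when both grip buttons are
    // pressed and released; that input wiring is omitted here.
    public void BeginZoom()
    {
        zooming = true;
        startHandDistance = Mathf.Max(
            Vector3.Distance(leftHand.position, rightHand.position), 0.001f);
        startScale = target.localScale;
    }

    public void EndZoom() => zooming = false;

    void Update()
    {
        if (!zooming) return;
        // Moving the hands apart scales the target up, together scales it
        // down, mirroring the familiar pinch-to-zoom mapping.
        float distance = Vector3.Distance(leftHand.position, rightHand.position);
        target.localScale = startScale * (distance / startHandDistance);
    }
}
```

The key design choice is the ratio mapping: the object tracks the hands proportionally, which is part of what makes the gesture feel like stretching or squashing something real.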
You can also have gestures with other parts of your body,
like nodding your head or shaking it.
Because gestures can use your whole body and involve movement,
they can feel good,
and a lot of them can also feel very natural.
Gestures are likely to feel most natural if they're close to real-world actions.
Swiping feels natural because it's like flicking through clothes on hangers,
and zooming with your hands feels natural
because it's like stretching or squashing something.
That real-world connection makes them easy to learn.
But both of these examples will also feel natural
because we've learnt similar gestures on touchscreens.
So we can also take existing interfaces as a cue.
You should design gestures to be like real-world actions.
For example, in Western cultures,
nodding your head means yes,
so it's a natural gesture for confirming an action. But we have to be careful.
If we use gestures that are not linked to real-world actions,
they can be unnatural.
Making a circle is an easy gesture to do,
but it's not closely linked to anything you would do in the real world.
The same goes for gestures that are loosely based on the real world
but don't closely resemble the equivalent real-world interaction.
You might also have noticed that just now I said in Western cultures,
nodding your head means yes.
It doesn't mean yes everywhere in the world.
Just because a gesture is well-known where you are in the world doesn't
mean it will work if you are creating an experience for people globally.
Gestures can be quite hard to implement,
normally with custom scripts,
but there's been a lot of work in the last few years on
generic gesture recognition software
based on machine learning algorithms like neural networks,
and some of these tools are starting to become available for Unity.
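To show what one of those custom scripts might look like, here is a minimal sketch of nod detection in Unity C#, picking up the head-nod confirmation gesture mentioned earlier. It watches the headset's pitch for a quick down-and-back-up movement. NodDetector and its threshold values are hypothetical and would need tuning in practice.

```csharp
using UnityEngine;

// A minimal sketch of a hand-written gesture recognizer: detecting a
// head nod ("yes") from the headset's pitch. The thresholds are
// illustrative guesses, not tuned values.
public class NodDetector : MonoBehaviour
{
    public Transform head;                 // usually the main camera
    public float pitchThreshold = 12f;     // degrees of downward tilt
    public float maxNodDuration = 0.8f;    // seconds to complete the nod

    float restPitch;
    float nodStartTime = -1f;

    void Start() => restPitch = CurrentPitch();

    void Update()
    {
        float pitch = CurrentPitch();

        // Phase 1: the head tilts down past the threshold.
        if (nodStartTime < 0f && pitch > restPitch + pitchThreshold)
            nodStartTime = Time.time;

        // Phase 2: the head returns near rest within the time limit,
        // which we count as a nod.
        if (nodStartTime >= 0f)
        {
            if (Time.time - nodStartTime > maxNodDuration)
                nodStartTime = -1f;        // too slow, reset
            else if (pitch < restPitch + pitchThreshold * 0.3f)
            {
                nodStartTime = -1f;
                Debug.Log("Nod detected: confirm the action");
            }
        }
    }

    // Signed pitch in degrees, positive when looking down.
    float CurrentPitch()
    {
        float x = head.eulerAngles.x;
        return x > 180f ? x - 360f : x;
    }
}
```

Hand-tuned thresholds and timings like these are exactly why gestures are fiddly to implement, and why learned recognizers that replace this kind of hand-written logic are so attractive.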
It's a big area of development for us at Goldsmiths,
so watch this space.