With immersive technologies on the brink of a breakthrough, what would it take for machines to read and understand our gestures?

Human beings communicate by making sounds and moving body parts — a combination that helps us connect the inside of our minds with the outside world. We share our spatial conceptualisation with each other through a simple wave of the hand. However, we cannot gesture when we interact with machines. Yet.

Louder than words

To be able to fully interpret us, speech robots need to learn more than the words we use. Body language is fundamental — both for…

– “Call me an ambulance, now!”
– “From now on, I’ll call you ‘An ambulance’. OK?”

Picture a world where humans can communicate with machines — one where a soothing voice reads you the weather report, orders food and makes shopping lists, without you needing to lift a finger. With the emergence of voice assistants, we’re nearly there. Just one minor detail is holding us back: Siri, Alexa, Google Assistant and other speech robots don’t really understand anything.

And how could we expect them to?

Technically speaking

Today our interaction with everyday helpers, such as mobile phones, cars, speakers, fridges, or even…

Vilde Reichelt

Linguist and writer for Bakken & Bæck – it’s all semantics to me.
