Whether exchanging dialogue with our smartphones or scribbling characters on touchscreens, the Human-Machine Interfaces (HMIs) we interact with today are intuitive and easy to use.
Driven by speech, handwriting and touch, our technologies are steadily progressing towards intuitive communication between humans and machines.
Several advancements in artificial intelligence, such as machine learning and deep learning, have paved the way for the humanisation of our machines and devices. And one particular development in the AI space has pioneered seamless human-to-machine interaction: cognitive ergonomics.
Through cognitive ergonomics, the design of systems that adapt and operate with the user's mental workload and other human factors in mind, we are able to communicate with our devices as easily as writing a note on paper.
> See also: How machine learning will transform hospitality
Although our machines and systems are technically complex internally, on the outside they interface at the user's level, potentially improving human reliability and reducing errors.
Here are a few examples of how cognitive ergonomic-driven technologies have laid the foundation for increasingly humanised HMIs.
Natural language processing
There’s no denying that digital and virtual assistants have become part of our everyday lives. By simply talking to our devices, cars and wearables, we are able to get directions, access phone numbers and search the web with ease.
Yet while the brains behind the system, natural language processing (NLP), has arguably become a popular AI input method in our daily lives, the extent of its capabilities often goes unnoticed.
From the ability to hold almost full conversations with our machines, to the processing of many different spoken languages, these capabilities would not be possible without advances in NLP technologies.
Surprisingly, given how widely voice recognition is used today, the space has been led by one main player: Nuance. Powering Apple's Siri and many other applications, including numerous automotive systems, Nuance supports just under 40 languages.
However, Google recently announced its new 'Cloud Speech API' software, which can recognise and understand 80 languages. Even so, accuracy issues remain for NLP in noisy environments, and recent research indicates that hands-free technologies are often more distracting than desired in mission-critical applications such as automotive.
As natural language processing technologies have progressed, not only are our digital assistants able to understand the way we naturally communicate, but they’re also beginning to anticipate our needs. Through NLP and machine learning, the AI technology behind our devices completes our day-to-day tasks and sometimes even makes our decisions for us.
Specifically, big players in the space like Amazon and Apple have begun utilising machine learning and big data in their assistants to act on our future needs. And brands like Spotify, Fitbit and Domino's are tapping into the virtual assistant space to bridge the gap between their products and the user.
For example, with the release of the new Amazon Echo, Alexa (Amazon's voice-operated digital assistant) will congratulate you on completing a run, order your favourite pizza or pick your next playlist, all by picking up on consumer purchasing trends and tracking user data.
Handwriting recognition
A growing trend in human-machine interaction is the use of handwriting recognition (HWR) as a natural input method for interacting with our devices. Beyond simple touch, written input opens up richer input mechanisms.
Through handwriting recognition capabilities, neural-network-based input methods now allow consumers to interact with their devices simply by writing, including on their wearables, smartphones and tablets.
For example, MyScript's handwriting input technology for wearable devices, which have very small screens, allows users to superimpose letters on top of one another to form complete words and sentences. This enables users to write texts and emails or search for directions using handwriting recognition on screens that are too small to type on.
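To make the underlying idea concrete, here is a deliberately tiny, illustrative sketch of character recognition by template matching. Real recognisers, including the neural-network-based systems described here, are far more sophisticated; the stroke templates and the nearest-neighbour approach below are assumptions for illustration only, showing the basic shape of the problem: stroke points in, character out.

```python
# Toy sketch only: classify a handwritten stroke by comparing it to
# stored templates after size/position normalisation. Not a real
# recogniser, just the skeleton of the "strokes in, character out" task.
import math

def normalise(stroke):
    """Scale and translate a stroke (list of (x, y) points) into a unit box."""
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in stroke]

def distance(a, b):
    """Mean point-to-point distance between two equal-length strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(stroke, templates):
    """Return the label of the template closest to the input stroke."""
    stroke = normalise(stroke)
    return min(templates,
               key=lambda label: distance(stroke, normalise(templates[label])))

# Hypothetical templates: a diagonal slash vs. a horizontal bar.
templates = {
    "/": [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)],
    "-": [(0.0, 0.5), (0.5, 0.5), (1.0, 0.5)],
}
print(classify([(0.1, 0.9), (0.5, 0.55), (0.9, 0.1)], templates))  # "/"
```

In practice the hard problems, such as segmenting superimposed letters and handling writer variation, are exactly what the neural networks are for; this sketch only conveys the input/output contract.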
Other emerging applications, such as textual interfaces in smart cars, are now entering the AI space. Handwriting recognition not only maximises convenience for drivers, but also gives drivers additional, and sometimes safer, options to communicate with their vehicle.
Car manufacturers, like Audi, are integrating these intuitive and minimal distraction input methods for drivers. Whether on armrests or vehicle consoles, drivers use handwriting recognition to find their final destination, send a message and search for music through a simple handwritten character – keeping their eyes on the road and making for a safer commute.
> See also: The droids you're looking for: the AI tech that will make up the intelligent enterprise
High accuracy recognition is obtained even when the input is written at a large angle from horizontal, as is often the case when the driver is not watching the input process.
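One common way to cope with angled input, offered here purely as an illustrative simplification and not as a description of any vendor's actual pipeline, is to estimate the writing angle from the stroke points and rotate them back to the horizontal before recognition:

```python
# Illustrative pre-processing sketch: estimate the slant of a stroke with
# a least-squares line fit, then rotate the points about their centroid
# so the fitted line is horizontal ("deskewing").
import math

def writing_angle(points):
    """Estimate the stroke's slope angle via a least-squares line fit."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return math.atan2(num, den)

def deskew(points):
    """Rotate points about their centroid so the fitted line is horizontal."""
    theta = -writing_angle(points)
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    c, s = math.cos(theta), math.sin(theta)
    return [(mx + (x - mx) * c - (y - my) * s,
             my + (x - mx) * s + (y - my) * c) for x, y in points]

# A stroke written at 45 degrees comes back level after deskewing.
slanted = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
level = deskew(slanted)
```

After this step, every point in `level` shares (to floating-point precision) the same y-coordinate, so the downstream recogniser sees horizontal writing regardless of how the driver's hand was oriented.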
Going beyond device input mechanisms, many of today's most popular tablets and 2-in-1 systems offer active styluses that provide a near pen-on-paper writing experience.
This makes it practical to create digital documents and emails using only the stylus's digital ink and handwriting recognition. The technology can interpret maths equations, providing calculated results, and even diagrams with text can be written, all seamlessly converted into digital document form.
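Once the ink has been recognised into a text expression, producing the calculated result is ordinary expression evaluation. The minimal evaluator below is a generic sketch (not MyScript's engine) that handles basic arithmetic safely via Python's `ast` module instead of `eval()`:

```python
# Illustrative only: evaluate a recognised arithmetic expression string.
# Supports +, -, *, / and unary minus; anything else is rejected.
import ast
import operator

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv,
       ast.USub: operator.neg}

def evaluate(expr):
    """Evaluate a recognised expression such as '3*4+2' to a number."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval").body)

print(evaluate("3*4+2"))  # 14
```

The genuinely hard part, turning handwritten strokes into that expression string (and into structured diagrams), is where the recognition technology does its work; the final calculation step is comparatively routine.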
The technology also allows for intuitive editing: interactive ink management enables editing in either the ink or typeset domain using only the stylus. Interactive ink enhances the user experience and is enabled by today's modern neural network techniques.
From simplifying our daily lives to improving safety, cognitive ergonomics has played an important role in the improvement of seamless and natural human-machine interaction. And users now live with unlimited access to information and device capabilities that they might not have had otherwise.
The humanisation of our tech through cognitive ergonomics has undoubtedly improved efficiency, increased knowledge and opened many doors to future capabilities of AI-driven technology.
Sourced from Gary Baum, VP, MyScript