The intelligent enterprise: how businesses will use cognitive computing in 2015

Speaking to students at MIT in October, Elon Musk, engineer and CEO of Tesla Motors and SpaceX, called artificial intelligence 'our biggest existential threat.' He may be the man behind the first commercial flights to the International Space Station, but it's hard to avoid the feeling that he has his head in the clouds when it comes to separating science from science fiction. At the same time, we have films like 2014's 'Her' that depict a not-so-distant future where smart operating systems have their own emotions and identities, and eventually become so intelligent that they supersede us. While autonomous AI has been a trope in our culture for many years, the hype and speculation certainly hasn't abated in 2014.

But far from excluding humans, AI systems based on cognitive computing technology have the potential to augment our reasoning capabilities and empower us to make better-informed real-time decisions – and are already doing so.

'People will always remain key in the decision-making process – cognitive computing will just require them to impact decisions in a different way and at a different stage,' explains Hugh Cox, chief data officer of Rosslyn Analytics. 'Human expertise, knowledge and experience will continue to be collected, meaning that as time progresses computers will become more adept at making decisions, removing the need for human interaction. But it’s important to note: the most advanced cognitive computing tool will never replace humans, because we have contextual insight that computers simply don’t possess.'

> See also: How artificial intelligence and augmented reality will change the way you work

According to IBM's Senior Vice President John E. Kelly, we're on the cusp of the 'third era' of computing – one of cognitive computing. In the age of tabulating machines, vacuum tube systems and the first calculators, we fed data directly into computers on punch cards. Later, in the programmable era, we learnt how to take processes and put them into the machine, controlled by the programs we load onto the system. But in the forthcoming era of cognitive computing, computers will work directly with humans 'in a synergetic association' where the relationship between human and computer blurs.

The main benefit of this kind of synergy will be access to the best of both worlds: productivity and speed from machines, emotional intelligence and the ability to handle the unknown from humans.

'They will interact in such a way that the computer helps the human unravel vast stores of information through its advanced processing speeds,' says Kelly, 'but the creativity of the human creates the environment for such an unlocking to occur.'

Reigning champion

The best-known representative of cognitive computing right now is IBM's Watson system. In 2011, the computer famously appeared on – and won – the US gameshow 'Jeopardy!' by providing questions in response to clues posed in natural human language, complete with nuances such as puns, slang and jargon. It was able to quickly execute hundreds of algorithms simultaneously to find the right answer, ranking its confidence in their accuracy and responding within three seconds.
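That pipeline is easier to picture in miniature. The sketch below runs a few toy evidence scorers over candidate answers in parallel and ranks them by combined confidence. The scorers, data and weighting are invented stand-ins purely for illustration; the real DeepQA system uses hundreds of far more sophisticated algorithms.

```python
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

# Toy evidence scorers standing in for the hundreds of algorithms Watson
# runs in parallel. Each returns a confidence between 0 and 1.
def keyword_overlap(question, candidate):
    q = set(question.lower().split())
    e = set(candidate["evidence"].lower().split())
    return len(q & e) / max(len(q), 1)

def source_reliability(question, candidate):
    return candidate["source_weight"]  # e.g. curated corpus vs. open web

def answer_type_match(question, candidate):
    # Crude heuristic: 'who' questions favour person-typed answers.
    is_who = question.lower().startswith("who")
    return 1.0 if is_who == (candidate["type"] == "person") else 0.3

SCORERS = [keyword_overlap, source_reliability, answer_type_match]

def rank_candidates(question, candidates):
    """Score every candidate with every scorer concurrently, then rank
    by mean confidence -- a miniature of evidence merging."""
    with ThreadPoolExecutor() as pool:
        scored = [
            (mean(pool.map(lambda s, c=c: s(question, c), SCORERS)), c)
            for c in candidates
        ]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

candidates = [
    {"answer": "Isaac Newton", "type": "person", "source_weight": 0.9,
     "evidence": "Isaac Newton formulated the laws of motion"},
    {"answer": "1687", "type": "date", "source_weight": 0.7,
     "evidence": "the Principia was published in 1687"},
]
for confidence, c in rank_candidates("Who formulated the laws of motion?", candidates):
    print(f"{c['answer']}: confidence {confidence:.2f}")
```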

In the three years since its Jeopardy! victory, IBM engineers say, Watson has become 24 times faster, with a 2,400% improvement in performance, and 90% smaller, having shrunk from the size of a master bedroom to 'three stacked pizza boxes.'

This year, the company opened up the system for the first time as a business unit called the IBM Watson Group, into which it is investing over $1 billion over the next few years. The group is dedicated to commercialising Watson as cloud-hosted software, services and apps, as well as supporting businesses and startups that are building their own innovations around the technology.

As Rob High, IBM Fellow and VP and CTO for Watson, explains, the Watson system rests on four basic principles, the four 'E's:

'The first E stands for education, which means machine learning,' says High. 'Watson can go out and read enormous quantities of literature in human language, and reason about that language for the purposes of answering a question in a way that is similar enough to how a human would reason about that problem. Watson can find connections between ideas that never would've popped up before, or were hard to find because of the amount of data.'

'The second E, 'expertise', isn't just a function of whether you or I can answer each other's questions, it means can I answer it well and establish trust with you, so it's based on the level of confidence in the response.'

Watson comes back with the answer it found (knowing there may be ambiguity in the question), its confidence in that answer, and the supporting evidence for it.

The third E is 'expression':

'Cognitive computing means systems that have learnt human forms of expression, most notably language,' says High. 'It doesn't stop at written language, but can interpret vocal and visual language as well.'

'Over the last 70 years we assumed we had to adapt to the constraints of the computer. Even modern computer applications require that we fill in boxes on a form or press buttons. Those interfaces represent constraints within the computing system, and they're certainly not the way you and I would communicate with each other – I believe this has to happen; it's a necessity.'

The natural next step, says High, is for the system to recognise and interpret body language, gestures, and other non-verbal cues that resolve ambiguity in language.

The fourth E, 'evolution', means that the system and its engineers are constantly learning which reasoning strategies do and do not work, and evolving new strategies, says High.

But as Cox argues, the evolution of systems like Watson will still require humans to make them valuable.

'Yes, we will see advances as processing power improves,' he says. 'But it will be a breakthrough in the method of computing that really moves us forward. What we will start to see more of in the short term is improved analysis and speed, which will make it appear more like the computer is thinking – but it's a process that relies on us.'

Workable intelligence

Since the launch of the Watson Ecosystem, IBM has been working to capitalise on the technology through pilots in a handful of industries, from finance to medicine and scientific research. It has also collaborated with academic institutions such as Imperial College London and the University of Southampton.

Several US hospitals are working with Watson in the care of cancer patients: the system draws on medical literature and patient information, and answers questions in natural, conversational language to communicate the best course of treatment for each patient. IBM is working with the Cleveland Clinic and New York Genome Center to build a Watson app that identifies patterns in genome sequencing, which it's hoped will help doctors and patients make better informed decisions about their health.

On the customer service front, the Watson Engagement Advisor product has been set up as a one-stop shop for military personnel transitioning out of service, fielding questions around topics including banking, employment and benefits.

Far from being a god-like generalised intelligence, Watson has its roots in IBM's Deep Blue, which was invented solely to play chess – it is based on 'deep but narrow' knowledge of a particular domain.

> See also: How artificial intelligence will make humans smarter

'Through Watson, people have the ability to help train the system, unlike conventional computing where the requirement is to sit down and programme the logic,' says High. 'Now the task is to teach the computer in-depth knowledge of your particular domain – its conventions, idioms and idiosyncrasies.'

Because of this approach, it does take some effort to train the system, akin to the training someone might undergo to become an expert in their field. But rather than sending a medical student to school for eight years, trainers must teach the computer in the months or weeks it takes to build an enterprise app.
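At its simplest, that teaching amounts to curating domain question-and-answer pairs and letting the system match new questions against them. The sketch below is a minimal, hypothetical stand-in using off-the-shelf TF-IDF retrieval; the Q&A pairs and confidence threshold are invented, and Watson's actual pipeline is far richer.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative Q&A pairs a subject-matter expert might curate; every
# policy and name here is invented.
domain_qa = [
    ("What does the fraud team review first?", "Flagged transactions over $10,000."),
    ("How long are claims records retained?", "Seven years, per the retention policy."),
    ("Who approves policy exceptions?", "The regional compliance officer."),
]

vectoriser = TfidfVectorizer()
question_matrix = vectoriser.fit_transform([q for q, _ in domain_qa])

def answer(query, min_confidence=0.2):
    """Return the closest curated answer plus a similarity score, or defer
    when unsure -- echoing Watson's habit of reporting its confidence."""
    scores = cosine_similarity(vectoriser.transform([query]), question_matrix)[0]
    best = scores.argmax()
    if scores[best] < min_confidence:
        return None, float(scores[best])  # too uncertain: hand off to a human
    return domain_qa[best][1], float(scores[best])

print(answer("Who signs off on policy exceptions?"))
```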

Gartner VP Tom Austin describes this process as a 'non-trivial event.'

'Building an app on top of Watson really requires you to understand the body of information,' he says. 'For this reason I wouldn't expect, any time in the foreseeable future, that we'll have a 'doc in the box' Star Trek-type technology able to immediately diagnose any illness.'

Though Watson may well represent an important step forward for cognitive computing, the limitations of developing apps around it will be obvious to any enterprise IT team that has faced the 'app delivery chasm' of delivering custom apps in line with business demand.

As Cox argues, though the money and resources IBM is putting into cognitive computing will benefit everyone in years to come, 'Watson isn't for business – at least not for the 99% of companies in the world. It's simply too young in its artificial thinking.'

Just as the technology NASA developed to take astronauts to the moon was soon applied elsewhere, creating new industries, startups are meanwhile developing small-scale examples of human-driven machine learning for use in businesses – a blend of artificial intelligence and human insight.

Analytics companies building software on machine learning technology, such as Saffron Technology, lack the benefit of IBM's billions, compute power and expertise, but are making waves with the kinds of bespoke solutions businesses are looking for.

Others, such as the virtual assistant Amy from artificial intelligence startup x.ai, are built on the Watson platform but provide a ready-made solution for businesses wishing to take advantage of these technologies. 'Amy' can be copied into an email conversation and uses natural language processing to identify the content, so users can have her 'do all the tedious email ping pong that comes along with arranging a meeting.'
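The core task is easy to caricature: spot that an email proposes a meeting and pull out the when. The toy sketch below uses simple pattern matching; x.ai's real models are proprietary, and the email text and patterns here are invented purely for illustration.

```python
import re

# An invented example email; the patterns below are deliberately naive.
EMAIL = "Hi both - could we meet Tuesday at 3pm to go over the Q1 numbers?"

def extract_meeting_request(text):
    """Return (day, time) if the email looks like a meeting proposal."""
    day = re.search(r"\b(Mon|Tues|Wednes|Thurs|Fri|Satur|Sun)day\b", text, re.I)
    time = re.search(r"\b\d{1,2}(:\d{2})?\s*(am|pm)\b", text, re.I)
    intent = re.search(r"\b(meet|call|chat|catch up)\b", text, re.I)
    if intent and day and time:
        return day.group(0), time.group(0)
    return None  # ambiguous: a real assistant would reply to clarify

print(extract_meeting_request(EMAIL))  # ('Tuesday', '3pm')
```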

Chatbots like Amelia, created by IPsoft, can make inferences based on the information they have, sense users' emotions and react accordingly. If Amelia does not understand a request, she can 'go ask a human for help,' making her complementary to existing human expertise.

These features allow 'Amelia' to be deployed in multiple scenarios, from call centre environments to cosmetic artistry to financial advisory assistance. As this technology rolls out to various industries, IPsoft claims it will have a major impact on the global workforce: just as the industrial revolution forced farmers to take on higher-level jobs, the company argues, this cognitive revolution will force knowledge workers to become more creative and will create new opportunities.
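That handoff follows a simple pattern: answer automatically above a confidence threshold, otherwise route to a human and learn from the resolution. The sketch below is a hypothetical miniature of that pattern; the threshold, knowledge base and learning step are invented, not IPsoft's implementation.

```python
# Invented threshold and knowledge base; not IPsoft's implementation.
CONFIDENCE_THRESHOLD = 0.75
knowledge_base = {"reset my password": ("Use the self-service portal.", 0.95)}

def handle(query, ask_human):
    """Answer automatically when confident; otherwise escalate and learn."""
    answer, confidence = knowledge_base.get(query.lower(), (None, 0.0))
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer                      # confident enough to respond unaided
    resolution = ask_human(query)          # 'go ask a human for help'
    knowledge_base[query.lower()] = (resolution, 0.8)  # remember for next time
    return resolution

print(handle("reset my password", ask_human=lambda q: "escalated"))
print(handle("close my account", ask_human=lambda q: "A human agent will call you."))
```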

But as Cox argues, there is one undeniable truth: 'the value businesses derive from cognitive computing will only be as good as the data.'

Businesses urgently need to focus on harnessing the knowledge and expertise of their people to start asking the right questions of the data they have available, and they need to invest in enriching the quality of that data, says Cox.

'Only then can you capture the drivers and reasoning that are behind the association between the input variables and the decision. Without laying these foundations, cognitive computing will remain out of reach.'
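In practice, laying those foundations starts with unglamorous auditing and enrichment. The sketch below shows one minimal, hypothetical example in pandas; the spend file, column names and normalisation rule are all invented.

```python
import pandas as pd

# 'supplier_spend.csv' and its columns are hypothetical stand-ins.
df = pd.read_csv("supplier_spend.csv")

# Audit: how much of the data can a cognitive layer actually trust?
report = {
    "rows": len(df),
    "missing_values": df.isna().sum().to_dict(),
    "duplicate_rows": int(df.duplicated().sum()),
}
print(report)

# Enrichment: normalise supplier names so 'ACME Ltd' and 'Acme Limited'
# aggregate as one entity before any analysis sees them.
df["supplier"] = (df["supplier"].str.strip().str.lower()
                    .str.replace(r"\b(ltd|limited)\b", "limited", regex=True))
```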


Ben Rossi

