Microsoft serves up DIY bot development

Simon Bisson | March 31, 2016
Bots are simple apps that enable useful conversations with users, and Microsoft's new Bot Framework and Cognitive Services let you create them easily

A bot bestiary

The simplest bots are scripted and will be very familiar to anyone who's built a bot for IRC or for a chat service like AIM. They operate much like interactive voice response (IVR) services over the phone, and existing voice- and touch-tone-driven systems could quickly be repurposed as bots.
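As an illustration only (none of this comes from Microsoft's Bot Framework), a scripted bot can be as simple as a hard-coded menu tree, much like a touch-tone IVR flow; the menu text, options, and function names in this Python sketch are invented for the example.

# A minimal scripted bot: a hard-coded decision tree, much like an IVR menu.
# Everything here is illustrative, not part of any Microsoft SDK.
MENU = {
    "start": {
        "prompt": "Welcome! Say 'hours', 'location', or 'agent'.",
        "options": {"hours": "hours", "location": "location", "agent": "agent"},
    },
    "hours": {"prompt": "We are open 9am-5pm, Monday to Friday.", "options": {}},
    "location": {"prompt": "We are at 1 Example Street.", "options": {}},
    "agent": {"prompt": "Connecting you to a human agent...", "options": {}},
}

def scripted_bot():
    state = "start"
    while True:
        node = MENU[state]
        print("BOT:", node["prompt"])
        if not node["options"]:
            break
        reply = input("YOU: ").strip().lower()
        # Fall back to the start node when the reply is not a known option.
        state = node["options"].get(reply, "start")

if __name__ == "__main__":
    scripted_bot()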

More complex bots will take advantage of the explosion of machine learning-powered AI systems, with Microsoft providing tools to help refine their understanding of user input. An onstage demo showed a food-ordering app being trained to recognize the phrase "my crib" as referring to a user's home address.
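To make the "my crib" idea concrete, here is a hypothetical Python sketch of how a trained model's output might map a personalised phrase onto a saved address; the phrase table, user profile, and function name are all invented and are not part of any Microsoft API.

# Illustrative only: resolving a learned phrase to a canonical entity.
USER_PROFILE = {"home_address": "221B Baker Street"}

# Phrases the model has learned to associate with the delivery-address entity.
LEARNED_PHRASES = {
    "my place": "home_address",
    "my house": "home_address",
    "my crib": "home_address",  # added after the user corrected the bot once
}

def resolve_delivery_address(utterance):
    """Return the saved address if the utterance contains a learned phrase."""
    for phrase, profile_key in LEARNED_PHRASES.items():
        if phrase in utterance.lower():
            return USER_PROFILE[profile_key]
    return None

print(resolve_delivery_address("Send the pizza to my crib"))  # -> 221B Baker Street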

These richer bots can be built on top of a range of cognitive services Microsoft has been working on over the last year or so. Cortana Analytics builds on the Azure Machine Learning platform to give you quick API access to common machine learning algorithms. It runs alongside Microsoft's Cognitive Services microservices, a set of focused APIs for specific machine learning scenarios. Originally named Project Oxford, the Cognitive Services tools now include 22 different REST APIs for speech, natural language, vision, and search.
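As a rough sketch of the REST pattern these services share, the Python below posts an image URL to an image-analysis endpoint and authenticates with the usual Ocp-Apim-Subscription-Key header; the exact endpoint URL, parameters, and response fields are assumptions and will vary by service and version.

# Hedged sketch of calling one Cognitive Services REST API (image analysis).
# The endpoint URL below is a placeholder; check the service documentation.
import requests

SUBSCRIPTION_KEY = "YOUR_KEY_HERE"
ENDPOINT = "https://api.projectoxford.ai/vision/v1.0/analyze"  # placeholder URL

def describe_image(image_url):
    response = requests.post(
        ENDPOINT,
        params={"visualFeatures": "Description"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()

# print(describe_image("https://example.com/photo.jpg"))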

The Language Understanding Intelligent Service (LUIS) is a key tool for bot development, as it lets you build waterfall recognizers that can quickly help pinpoint user intent. It's an important feature when you're building bots that need to work with a wide swath of the general public.
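A hedged sketch of what querying LUIS might look like over REST: the endpoint format and JSON field names below are assumptions based on the early LUIS API and may differ in later versions.

# Query a LUIS app and pick the top-scoring intent (assumed response shape).
import requests

LUIS_ENDPOINT = "https://api.projectoxford.ai/luis/v1/application"  # placeholder
APP_ID = "YOUR_LUIS_APP_ID"
SUBSCRIPTION_KEY = "YOUR_KEY_HERE"

def top_intent(utterance):
    response = requests.get(
        LUIS_ENDPOINT,
        params={"id": APP_ID, "subscription-key": SUBSCRIPTION_KEY, "q": utterance},
    )
    response.raise_for_status()
    result = response.json()
    # Assumed shape: a list of intents, each with an "intent" name and a "score".
    best = max(result.get("intents", []), key=lambda i: i.get("score", 0), default=None)
    return best["intent"] if best else None

# print(top_intent("Order a pizza to my crib"))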

The power of conversation

APIs like these underpin conversational computing, as they enable developers to infer context from a range of different inputs. Knowing, for example, that someone is at a desk can help tailor responses and decisions, narrowing what could have been a wide decision tree and simplifying the code needed to deliver a conversational agent. Understanding the context a user is working in can make a bot feel like a natural, more intelligent part of that user's day-to-day life.
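As a purely illustrative Python sketch of that narrowing, the table below keys canned responses off inferred context (location and time of day), so the bot only considers the branch that matches the user's situation; all of the context values and responses are hypothetical.

# Inferred context narrows the response set instead of walking one wide tree.
RESPONSES = {
    ("at_desk", "morning"): "Here is today's calendar and the overnight build status.",
    ("at_desk", "afternoon"): "Here are the pull requests waiting for your review.",
    ("commuting", "morning"): "Traffic is light; your first meeting is at 9:30.",
    ("commuting", "evening"): "Shall I queue up your podcast for the ride home?",
}

def respond(location_context, time_of_day):
    # Only the branch matching the inferred context is considered.
    return RESPONSES.get((location_context, time_of_day), "How can I help?")

print(respond("at_desk", "morning"))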

Perhaps the real power of this cognitive approach to computing was encapsulated by the final demo of Satya Nadella's section of the Build keynote, in which a blind engineer used Cognitive Services vision and speech APIs to build an application that works with smart glasses to narrate the world around him. It gives him a set of tools that makes it easier to understand what is happening nearby, from colleagues' emotional reactions to a presentation to a description that explains a random sound in the park.

Bots and "conversation as a service" are the foundation of a world of cognitive computing, and they bring to the table a set of ready-to-use tools that can expand our applications to a much wider audience. Satya Nadella's first public speeches talked about a world of "ubiquitous computing and ambient intelligence." Now, two years on, Microsoft is delivering on that vision.

Source: Infoworld 

 
