This week at CES, a MacBook analyzed my speech and told me I was feeling happy, confident, and convincing. I'd like to think that was accurate. Minutes later, when one of my TechHive colleagues and a developer got into an opinionated discussion, it picked up feelings of anger and assertiveness, which was probably also accurate.
We were trying out Beyond Verbal, a cloud-based platform that reads emotions in real time by analyzing voice intonations as someone speaks. Its web engine, called Moodies, can extract more than 400 mood variants from voice alone. If you allow Moodies to access your computer's microphone, it listens to you talk in 20-second spurts, then presents a breakdown of what's really going through your head.
The system breaks readings down into two categories: your primary mood, the most strongly expressed emotion, and your secondary mood, which is less expressed but still present, kind of like the hidden intention behind what you're saying. What's super interesting is that the analysis is language-agnostic: it focuses on tone, inflection, and intonation instead of on the words themselves.
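That primary/secondary split is easy to picture in code. Here's a minimal sketch, assuming the analyzer hands back a dictionary of mood scores; the real Moodies output format isn't described in this article, so the data shape here is purely illustrative.

```python
# Hypothetical sketch of consuming an emotion-analysis result like the
# one Moodies presents. The input format (a dict of mood -> confidence
# score) is an assumption, not Beyond Verbal's actual SDK.

def mood_breakdown(mood_scores):
    """Split scored moods into a primary/secondary reading.

    mood_scores: dict mapping mood name -> confidence score (0.0-1.0).
    Returns (primary, secondary): the most expressed mood, and the
    next strongest (or None if only one mood was detected).
    """
    if not mood_scores:
        return None, None
    # Rank moods from most to least expressed.
    ranked = sorted(mood_scores, key=mood_scores.get, reverse=True)
    primary = ranked[0]
    secondary = ranked[1] if len(ranked) > 1 else None
    return primary, secondary

# Example: scores for one 20-second clip from a hypothetical analyzer.
scores = {"happy": 0.71, "confident": 0.64, "angry": 0.08}
print(mood_breakdown(scores))  # ('happy', 'confident')
```

Everything above the threshold of "most expressed" becomes the primary reading; the runner-up is the secondary, which matches how Moodies frames its two-tier result.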
So, what does this mean for us? Although Beyond Verbal has been working on this technology for the past 18 years, the company only recently opened the SDK up to mobile app developers. Expect to see diary-type apps that track emotions, apps that analyze phone calls to give insight into how the person on the other end is feeling, and much more in 2014.
But, for now, you can try the web engine for free. Maybe launch it quietly in the background when your friends are over to see how they really feel about your newfound interest in interior design.