Putting them to the test
Even just using the two products is a noticeably different experience. Dictation software doesn't understand speech the same way humans do. We continually and instantaneously parse the words we hear based on context; that's how we know the difference between "ice cream" and "I scream." Computers do much the same thing, but they aren't as good at it.
What this means is that, in Mavericks's Dictation system, words appear on the screen as I speak them, but in a disjointed way, as the system tries to figure out what I'm saying. The words themselves, and their order, keep changing as I get deeper into a sentence; sometimes the screen gets so jumpy that it's distracting. Dragon Dictate doesn't put words on the screen as quickly as Mavericks's Dictation does, but the words it does put up are usually closer to the final transcription.
The real test, however, is accuracy. To assess that, I used both the Mavericks Dictation tool and Dragon Dictate to transcribe a four-paragraph, 268-word passage of text. I ran through the passage three times in Mavericks, to iron out some kinks, and just once in Dragon Dictate. To level the playing field, I didn't use my existing user profile in Dragon Dictate.
The results? Both programs made mistakes. Mavericks Dictation's errors were more frequent and more ridiculous, however. For instance, when I said "detail," it transcribed "D tell." When I said "expository," it heard "Expo is a Tory." The program had particular problems with the sentence "Students must be jarred out of this approach." I spent several minutes trying to get Dictation to transcribe "jarred" and "jar" correctly; each time it transcribed them both as "John." I also found it odd that Dictation refused to insert a space before opening quotation marks; it failed to do so in every instance of my test.
In the end, Mavericks's built-in Dictation tool made 28 mistakes.
Dragon Dictate had fewer problems but still made some mistakes of its own. It too tripped on "expository," but less hilariously than Dictation, writing "expositors" instead. It insisted on transcribing "class scored" as "classic lord." Overall, it made nine mistakes.
So the final accuracy scores were 96.6 percent for Dragon Dictate and 89.6 percent for Mavericks's Dictation. Although that difference might seem insubstantial, and although Mavericks still earned a very high B, consider the scale: if you were to dictate a passage of 10,000 words, the text would contain more than 1,000 errors with Mavericks's Dictation tool, versus about a third as many with Dragon Dictate.
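The arithmetic behind those scores is easy to verify. A quick sketch, using the error counts reported above (28 for Mavericks Dictation, 9 for Dragon Dictate, against the 268-word test passage):

```python
# Sanity-check the accuracy figures, assuming the reported counts:
# a 268-word passage, 28 errors (Mavericks Dictation), 9 (Dragon Dictate).
PASSAGE_WORDS = 268
error_counts = {"Mavericks Dictation": 28, "Dragon Dictate": 9}

for product, errors in error_counts.items():
    # Accuracy = fraction of words transcribed correctly.
    accuracy = 100 * (1 - errors / PASSAGE_WORDS)
    # Project the same error rate onto a 10,000-word document.
    projected = round(10_000 * errors / PASSAGE_WORDS)
    print(f"{product}: {accuracy:.1f}% accurate, "
          f"~{projected} errors per 10,000 words")
# Mavericks Dictation: 89.6% accurate, ~1045 errors per 10,000 words
# Dragon Dictate: 96.6% accurate, ~336 errors per 10,000 words
```

The projection simply assumes the error rate from the 268-word sample holds at scale, which matches the roughly three-to-one gap described above.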
The bottom line
This result isn't so surprising. Dragon Dictate is a paid application with several years' worth of development effort behind it. Also, Dragon Dictate requires you to spend time training it before it will even work, so it has a much better idea of your voice and the way in which you speak.