Siri, the seemingly forever-in-beta voice-based virtual assistant built into iOS, received just a pinch of attention during Apple's Worldwide Developers Conference keynote. As you admire the highlights of Apple's Siri announcements—higher quality voice options, answers to new kinds of questions, Wikipedia integration, and a new interface—two notable elements stand out, and they both involve Google. First, Siri's core interface continues to lag behind Google Now's in iOS 7; second, Apple's relationship with the search giant has deteriorated to the point where it's finally willing to strike a partnership with another old rival: Microsoft.
Overall, it seems that Apple is trying a new take on an old adage: If you can't join 'em, beat 'em.
Many of the improvements to Siri in iOS 7 look quite welcome. You'll be able to ask questions like "How old is Elvis Costello?" and Siri will show the relevant Wikipedia text and read the exact answer aloud. (He's 58, by the way.) The redesigned Siri results screens Apple showed off look lovely, and the high-quality voices it demonstrated sounded great.
But one core feature from Google Now, the search giant's take on voice-based interactivity, appeared to be missing: live audio transcription as you speak.
Behind the scenes, both Apple and Google send your audio to their remote servers for rapid transcription, results of which are sent back to your device as quickly as possible. And the two companies employ similar basic approaches for doing so: The words you speak are digitized and sent to the remote server as you go; neither service waits until you're finished talking to start that process, since that would be wasted time.
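That overlap between speaking and processing can be pictured as a producer-consumer pipeline. Here's a minimal, purely illustrative Python sketch—the chunk strings, queue, and thread stand in for microphone frames and a remote server, which neither Apple nor Google has publicly detailed:

```python
import queue
import threading

# Illustrative sketch: audio chunks are handed off for transcription as
# they are captured, rather than after the user finishes speaking. The
# "server" is just a thread draining a queue, to show the overlap.

def capture(chunks, outbox):
    """Producer: push each audio chunk as soon as it is recorded."""
    for chunk in chunks:
        outbox.put(chunk)   # sent immediately, not at the end
    outbox.put(None)        # end-of-utterance marker

def transcribe(outbox, results):
    """Consumer: the 'server' starts working before speech ends."""
    while (chunk := outbox.get()) is not None:
        results.append(f"processed:{chunk}")

outbox, results = queue.Queue(), []
server = threading.Thread(target=transcribe, args=(outbox, results))
server.start()
capture(["hello", "world"], outbox)
server.join()
print(results)  # ['processed:hello', 'processed:world']
```

The point of the structure is simply that capture and transcription run concurrently, so no time is wasted waiting for the utterance to end.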
Google Now shows a live transcription as you speak
It's what they do during that process that differs. As Apple demonstrated onstage at WWDC, Siri in iOS 7 will show a squiggly waveform as you speak, with your transcription appearing only after Siri's stopped listening. Google Now, however, shows its best interpretation of what you've said as you're speaking. Sometimes, that gets a smidgen wonky, in that later words you speak will help Google better understand what you're saying, and already-transcribed words thus get re-interpreted.
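That re-interpretation behavior can be sketched in a few lines of Python. This is a toy model, not Google's actual decoder: the revision table and the example sentence are invented purely to show how a later word can rewrite one already on screen:

```python
# Toy sketch of incremental transcription with revision: each new word
# can change how an earlier, already-displayed word is interpreted.
# The revision table below is invented for illustration.

REVISIONS = {
    # (previous word, next word) -> better reading of the previous word
    ("their", "going"): "they're",
    ("to", "fast"): "too",
}

def update(hypothesis, word):
    """Append `word`, possibly re-interpreting the word before it."""
    if hypothesis:
        fixed = REVISIONS.get((hypothesis[-1], word))
        if fixed:
            hypothesis[-1] = fixed   # rewrite an already-shown word
    hypothesis.append(word)
    return hypothesis

shown = []
for w in ["their", "going", "to", "fast"]:
    shown = update(shown, w)
    print(" ".join(shown))   # what the user would see at each step
# final line printed: they're going too fast
```

At each step the user sees the current best guess, and occasionally watches an earlier word flip once more context arrives—exactly the "smidgen wonky" effect described above.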
Despite that minor limitation, Google Now's approach is unequivocally miles better than Apple's. Google's take makes it easy to see if the transcription process has gone entirely off the rails, as users of either voice-based system know can happen from time to time. Siri's brief black hole means you have no clue whether Apple really understands what you're asking until several seconds, sometimes very long ones, have gone by.