Documenting the Coming Singularity

Saturday, November 22, 2008

How Google's Ear Hears

Technology Review - November 20, 2008, by Kate Greene

The new voice-search application for the iPhone marks a milestone for spoken interfaces.

If you own an iPhone, you can now be part of one of the most ambitious speech-recognition experiments ever launched. On Monday, Google announced that it had added voice search to its iPhone mobile application, allowing people to speak search terms into their phones and view the results on the screen.

In designing the system, Google took on an enormous challenge. Where an automated airline reservation system, say, has to handle a relatively limited number of terms, a Web search engine must contend with literally any topic that anyone might ever want to research.

Fortunately, Google also has a huge amount of data on how people use search, and it was able to use that to train its algorithms. If the system has trouble interpreting one word in a query, for instance, it can fall back on data about which terms are frequently grouped together.
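The fallback the article describes can be illustrated with a toy sketch. This is not Google's actual model; the co-occurrence counts, the `pick_candidate` function, and the example query are all hypothetical, standing in for statistics that would really be mined from search logs at enormous scale:

```python
from collections import defaultdict

# Hypothetical co-occurrence counts, as if mined from past query logs.
# (Illustrative numbers only -- not real data.)
cooccurrence = defaultdict(int)
cooccurrence.update({
    ("new", "york"): 9500,
    ("new", "yolk"): 3,
    ("pizza", "york"): 420,
    ("pizza", "yolk"): 15,
})

def pick_candidate(context_words, candidates):
    """Choose the candidate word that co-occurs most often
    with the rest of the query in the (toy) log data."""
    def score(cand):
        return sum(cooccurrence[(w, cand)] for w in context_words)
    return max(candidates, key=score)

# The recognizer heard "new ___ pizza" and is unsure
# whether the middle word was "york" or "yolk".
best = pick_candidate(["new", "pizza"], ["york", "yolk"])
print(best)  # -> york
```

Because "york" appears alongside "new" and "pizza" far more often in the toy counts, the ambiguous word resolves to the likelier query, which is the essence of using query-log statistics to backstop the acoustic model.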

Google also had a useful set of data correlating speech samples with written words, culled from its free directory service, Goog411. People call the service and say the name of a city and state, and then say the name of a business or category. According to Mike Cohen, a Google research scientist, voice samples from this service were the main source of acoustic data for training the system.


Technological Singularity and Futurism is updated often; the easiest way to get your regular dose is by subscribing to our news feed. Stay on top of all our updates by subscribing now via RSS or Email.