Apple says the audio user interface will include one or more audible control nodes that the user perceives as spatially located at different points around them. A sensor in the portable media device detects when the user moves the device toward one or more of the audible control nodes, and the device's operation is then modified in accordance with the sensed movement.
The filing describes numerous scenarios in which this might be used. For example, Apple says that the user might rely on audio cues to select between musical genres.
"A set of four genres can be presented as an audio menu to a user with each menu item having a distinct location. The rock genre menu item can appear to the upper left, the jazz genre menu item to the upper right, the blues/r&b menu item to the lower left and the country menu item to the lower right. These menu items can be presented in a variety of forms such as synthesized text-to-speech, recorded speech or samples of recorded music. In some embodiments, the menu items can be presented sequentially, such as when articulating the menu items as speech, or can be presented simultaneously, such as when communicating the menu items as samples of recorded music. (Speech can also be presented simultaneously and recorded music samples sequentially as well.)
"The user can move the portable media device in the direction of a menu item, and the sensor can detect the movement. The processor in the portable media device can determine, based on the sensed movement, the menu item indicated by the user and present a confirmation of the indicated menu item to the user, such as repeating the menu item or sample of recorded music."
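The quoted quadrant layout could be matched to a sensed movement in a number of ways; the patent doesn't specify one. A minimal sketch, assuming the sensor yields a simple 2D movement vector and that selection picks the menu item whose direction best matches it (all names and the jitter threshold here are illustrative, not from Apple's filing):

```python
import math
from typing import Optional

# The four genre menu items and their perceived spatial positions
# (x = right, y = up), per the quadrant layout quoted in the filing.
MENU = {
    "rock": (-1.0, 1.0),        # upper left
    "jazz": (1.0, 1.0),         # upper right
    "blues/r&b": (-1.0, -1.0),  # lower left
    "country": (1.0, -1.0),     # lower right
}

def select_menu_item(dx: float, dy: float) -> Optional[str]:
    """Return the menu item whose direction best matches the sensed
    movement (dx, dy), or None if the movement is too small to count."""
    magnitude = math.hypot(dx, dy)
    if magnitude < 0.2:  # ignore jitter below an arbitrary threshold
        return None
    # Pick the item with the largest cosine similarity to the movement.
    def similarity(pos):
        px, py = pos
        return (dx * px + dy * py) / (magnitude * math.hypot(px, py))
    return max(MENU, key=lambda item: similarity(MENU[item]))

# A tilt up and to the right selects the jazz item.
print(select_menu_item(0.5, 0.7))   # jazz
print(select_menu_item(0.05, 0.0))  # None (movement too small)
```

In a real device the movement vector would come from an accelerometer or gyroscope rather than being passed in directly, and the confirmation step the patent describes (repeating the item or its music sample) would follow the selection.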
Read More [via AppleInsider]