Posted November 26, 2008 at 2:46pm by iClarified
The Google Mobile iPhone application may break some rules that Apple placed on iPhone developers, according to CNET.
Google's application both activates the proximity sensor and delivers an audible prompt to speak your search terms, and according to Gruber, the only way it can do this is by using an API that isn't on the public list Apple provides to developers. Think of an API as code the operating system shares with an application to make it easier for that application to get things done.
Apple lets developers create applications that access some parts of the iPhone--such as the accelerometer for spatial controls and GPS for navigation--but it considers other parts of the phone's technology off-limits to anyone but Apple. Nonetheless, Sadun observes that there are plenty of applications in the App Store that do what Google has done with its mobile application: take advantage of technology that is accessible, such as the proximity sensor, but go beyond the basic things you're allowed to do with it by using "unpublished" APIs that exist but are not documented by Apple.
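For a sense of what this looks like in practice, here is a minimal Swift sketch of an app reacting to the proximity sensor the way Google's voice search does (raise the phone to your ear, get prompted to speak). It uses the public UIDevice proximity API that Apple later documented; at the time of this article, equivalent behavior required unpublished calls. The view controller name and the prompt/listening helpers are hypothetical placeholders, not part of any Apple API.

```swift
import UIKit

class VoicePromptViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Ask the system to start driving the proximity sensor.
        UIDevice.current.isProximityMonitoringEnabled = true

        // proximityState flips to true when the phone is held
        // close to the user's face.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(proximityChanged),
            name: UIDevice.proximityStateDidChangeNotification,
            object: nil
        )
    }

    @objc private func proximityChanged() {
        if UIDevice.current.proximityState {
            // Phone is at the user's ear: play an audible prompt and
            // start listening for the spoken query.
            playPromptSound()
            startListening()
        }
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        UIDevice.current.isProximityMonitoringEnabled = false
        NotificationCenter.default.removeObserver(self)
    }

    // Placeholder hooks -- not part of any Apple API.
    private func playPromptSound() { /* e.g. play a system sound */ }
    private func startListening() { /* start audio capture for the query */ }
}
```

The point of contention in 2008 was that nothing like the code above was in the published SDK, so an app that wanted this behavior had to call into frameworks Apple had not sanctioned.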
Sadun compares this to jaywalking: sure, you might get hit by a bus, but you probably won't if you're careful. And the cops aren't exactly going to launch a three-state manhunt for you if you make it across the street.
While other developers may have to sneak around App Store rules, I think it's fair to say Google will continue to receive exceptions, considering it supplies the iPhone with its Maps and YouTube functionality.
I don't think the Google app is the only one using the sensors. Fring turns off the display when you talk, the same way the iPhone does when you make or receive calls.
So? Sure, Apple does work more closely with Google than with other companies releasing apps. In fact, in one keynote I recall Jobs saying "We love working with these guys [Google]" (or something along those lines).