Google's voice search function in the latest version of its Google Mobile search app is slickly implemented: it detects when you raise the phone to your ear, prompts you to speak, and then runs the search when you move it away from your ear.
No other application has leveraged that same functionality yet, and - as it turns out - for good reason: the part of the iPhone's code that controls the proximity sensor is undocumented by Apple.
Using undocumented Application Programming Interfaces (APIs) is not just frowned upon - it's against the terms that iPhone developers agree to when they download the SDK.
The reason for this isn't particularly nefarious: undocumented APIs are often in active development and subject to change, which could cause third-party applications that rely on them to break or crash. Daring Fireball's John Gruber has a more in-depth piece about what exactly this means.
But the crux of the issue is that Google's application somehow made its way into the App Store anyway, despite violating the terms - and we know Apple's not exactly shy about rejecting applications for that reason.
The voice search feature itself was highly publicized before its release, and Google has now even admitted that it used undocumented APIs, although it denied using private or dynamic frameworks, which could have been a more serious issue from a technological standpoint.
So did Apple's vetting process - which we'll politely call inconsistent at best - just miss this one? (Perhaps the reviewers were too enamored with how cool it was.) Or did Apple know that the app used the APIs and give it a pass anyway because, hey, it's Google - don't be evil, remember? Not to go all Watergate on this, but the important question becomes: what did Apple know and, yes, when did they know it?