This is a prediction I’ve been tracking for a couple years: passive listening, where your phone knows what media you’re consuming by listening.
Which means eventually apps will know what a consumer is watching on TV (a Twins baseball game, The Simpsons, American Idol) or listening to (KISS-FM, the new Michael Jackson single).
This information could be used on its own during media events, obviating the need to use a hashtag to contribute to or track conversations about that show on a social network like Twitter or Facebook.
OR it could be used as an additional data layer, combined with user information, geographic coordinates, and big data, to make the social networking experience and content delivery more valuable based on the user's context.
Facebook is the first big social networking player to roll out a form of this technology. Per Ars Technica, as of Wednesday Facebook has added a new feature to its mobile app that uses a phone's microphone to identify ambient TV shows, music, or movies and include them in status updates… The feature works much like the music-identification service Shazam: the phone listens for 30 seconds, and then the app pops up its best guess for the music or video it's hearing.
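For the technically curious, here's a minimal sketch of that Shazam-style flow in Python. Everything in it is a stand-in for illustration: the fingerprinting, the lookup table, and the function names are hypothetical, not Facebook's or Shazam's actual code.

```python
# Hypothetical sketch of the listen -> fingerprint -> best-guess flow described above.
import hashlib

CLIP_SECONDS = 30  # the app reportedly listens for about 30 seconds


def fingerprint(samples: bytes) -> str:
    """Stand-in for a real acoustic fingerprint (real systems hash spectrogram peaks)."""
    return hashlib.sha256(samples).hexdigest()[:16]


# Hypothetical lookup table mapping known fingerprints to media.
KNOWN_MEDIA = {
    "placeholder-fp-1": "American Idol (TV)",
    "placeholder-fp-2": "KISS-FM (radio)",
}


def identify(samples: bytes) -> str | None:
    """Return the app's 'best guess' for the ambient audio, or None if no match."""
    return KNOWN_MEDIA.get(fingerprint(samples))


def suggest_status_tag(samples: bytes) -> str:
    """Mimic the UX: offer a tag for the status update, or nothing."""
    guess = identify(samples)
    return f"Watching/listening to: {guess}" if guess else "(no match found)"


if __name__ == "__main__":
    print(suggest_status_tag(b"ambient-audio-bytes"))
```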
Yes, yes. There’s a privacy concern here.
But experience shows a familiar pattern: this kind of tech innovation seems invasive at first, there's an inevitable backlash, and then consumers tend to adopt pieces of the new way of interacting as they weigh the benefits against their privacy.
Regardless, this trend is on the horizon, and we should watch it closely!
Different but similar: L'Oreal is using tech innovatively to match an ad model's hair color with your own: http://adage.com/article/cmo-strategy/l-oreal-targets-ads-based-hair-color/293390/