Hi everybody. I know it's been a while since I rapped at ya, but you know how it goes. Anyway, enough about me - let's talk about my favorite topic, music recommendation engines!
I recently attended the SXSW 2009 Interactive conference, and some of my favorite panels (not surprisingly) focused on the ups, downs, and future of music recommendation engines. My favorite one, titled "Help! My iPod Thinks I'm Emo" (click on that to see the slides from the panel), focused on the current state of recommendation engines and talked about why they don't work - "work" in this case meaning "introduce people to music they would otherwise not have heard of but would probably like." Studies have shown that the majority of music that gets recommended to people by automated recommendation engines actually represents a very small percentage of the music available to these engines. In other words, music recommendation engines are not helping people dig into the "long tail".
So, if these engines aren't perfect now, are they at least getting better? Not really. In 2009, we basically have the same two options as we did back in early 2006, when CNET's Steve Krause wrote this great piece outlining the differences between the two prevalent forms these engines take: collaborative filtering, and content-based recommendations.
If you don't feel like reading Steve's article (you should, it's enlightening) here's the short version:
- Collaborative filtering (think last.fm) is where a computer tells you that you will like Coldplay because you like Radiohead. Why? Because other people who like Radiohead like Coldplay. And by "like" the computer means "listen to frequently." (Not included in this calculation is the fact that Coldplay blows and you will probably hate them.)
- Content-based recommendation engines (think Pandora) will also tell you that you will like Coldplay because you like Radiohead, but in this case, it will be because the computer perceives shared characteristics between the two bands' music -- such as high-pitched vocal melodies, anthemic, sweeping guitar arrangements, and a general gloomy outlook on life. (This type of recommendation also fails to take into account the fact that Coldplay blows and you will probably hate them.)
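If you're curious what those two approaches look like under the hood, here's a toy sketch in Python. To be clear: the listening histories and the feature scores are completely made up by me, and real systems (Last.fm, Pandora) are vastly more sophisticated -- this just shows the basic mechanics of "people who like X also like Y" versus "X sounds like Y."

```python
from collections import Counter

# Invented listening histories -- each set is one listener's artists.
histories = [
    {"Radiohead", "Coldplay", "Muse"},
    {"Radiohead", "Coldplay"},
    {"Radiohead", "Sigur Ros"},
    {"Muse", "Coldplay"},
]

def collaborative_recommend(seed, histories):
    """Collaborative filtering, crudest form: rank artists by how often
    they co-occur with `seed` in other listeners' histories."""
    counts = Counter()
    for h in histories:
        if seed in h:
            counts.update(h - {seed})
    return [artist for artist, _ in counts.most_common()]

# Content-based: hand-labeled feature vectors (numbers invented for
# illustration) -- [falsetto vocals, sweeping guitars, general gloom].
features = {
    "Radiohead": [0.9, 0.8, 0.9],
    "Coldplay":  [0.8, 0.9, 0.7],
    "Slayer":    [0.0, 0.1, 0.3],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def content_recommend(seed, features):
    """Content-based: rank other artists by similarity of their features."""
    scored = [(cosine(features[seed], vec), name)
              for name, vec in features.items() if name != seed]
    return [name for _, name in sorted(scored, reverse=True)]
```

With this toy data, both functions dutifully recommend Coldplay to a Radiohead fan -- which, as noted above, is exactly the problem. Neither approach has any way of knowing you'll hate them.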
So where does that leave us? If, like me, you were hoping for a beautiful future where our robot overlords would tell us what to like and how to think, you're probably out of luck. It turns out that the most promising systems will likely combine all of the methods discussed above into a hybrid approach, where human intuition and cold, hard algorithms share the spotlight. For example, imagine combining the Hype Machine (which automatically crawls a curated selection of music blogs and generates a playlist based on what music is being blogged about) with Slacker.com (where professionals have decided which bands sound like each other) and with user-generated tags from Last.fm.
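In the crudest possible sketch, a hybrid like that might just be a weighted blend of the different signals. Everything below is hypothetical -- the weights, the "human votes" signal, all of it -- but it shows the basic idea of letting editorial or social input nudge the algorithmic scores:

```python
def hybrid_score(collab_score, content_score, human_votes, max_votes,
                 weights=(0.4, 0.4, 0.2)):
    """Blend three signals into one recommendation score (all in [0, 1]):
    a collaborative-filtering score, a content-similarity score, and a
    human signal (e.g. blog mentions, editorial picks, or tag overlap),
    normalized against the most-endorsed candidate."""
    human_score = human_votes / max_votes if max_votes else 0.0
    w_collab, w_content, w_human = weights
    return (w_collab * collab_score
            + w_content * content_score
            + w_human * human_score)

# Hypothetical candidate: strong algorithmic match, modest human buzz.
score = hybrid_score(collab_score=0.8, content_score=0.9,
                     human_votes=3, max_votes=10)
```

The interesting design question is the weights: tilt them toward the human signal and you get something closer to a mixtape from a friend; tilt them toward the algorithms and you're back to the long-tail problem.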
Of course, even that sort of hybrid system won't be perfect. Nothing will, obviously. But at least there's hope for the future. In another session I attended, "Music 2.0 = Music Discovery Chaos?", it became apparent that even those people I would consider to be "high-level users" of both music and technology primarily relied on their friends and human tastemakers to point them to new music. There's a nice writeup of the panel over at Music Machinery.
So, how are you finding new music these days? Have you ever actually heard a band on one of these sites that you've A) never heard of before, and B) went on to become a fan of? Let me know in the comments.
EDIT: I just realized that the author of the awesome "Music Machinery" blog I link to above is none other than Paul Lamere, one of the co-hosts of the first SXSW panel I reference above. Cool! Expect to hear more about him and some of the work his new company is doing in my next post.