Facebook is rolling back one of the features that recently drew the ire of many a user. The move has me applauding, and it has me rethinking the idea of algorithms curating too much of my life.
The Facebook news feed supposedly simplified your experience on the site by targeting specific actions you'd previously taken in order to determine what was most relevant to you from your friends' posts. The theory was to eliminate some of the clutter that made the chronological news feed difficult to navigate, especially when post-happy friends dominated it. However, the end result was a very narrow view of what your actual social network really looked like. Since it was very difficult to customize the display of the featured content directly, the only way to really tweak the algorithm was to spend enough time clicking on other things to have it adapt the display to you. That only works if you know exactly what you want the algorithm to do for you, not if you use Facebook simply to see, in general, what's going on.
Facebook isn't the only one incorporating algorithmic curation of content, but because it did so in the context of telling us which parts of our friends' and families' lives we should see, the impact hit much closer to home.
Google does something similar with the way it parses search results for logged-in and logged-out users, even when they type in the same search term. Yahoo customizes its front page the same way, displaying news and entertainment content differently for different users even when they visit the page at the exact same time. Apple's and Amazon's storefronts will make different product recommendations even when two users are viewing or buying the exact same thing.
Pundits typically applaud this targeting as smartly intuitive, designed to home in on the targeted information or advertising, making it more relevant and therefore more worthy of one's attention. The personalization appears to give users the sense of an experience customized just for them.
However, there's an inherent downfall to purely algorithmic targeting of this nature. Netflix stumbled on part of it early on, in what became known as the Napoleon Dynamite paradox. Typically, if someone likes or is looking for A, they probably would also like or look for B. With the movie Napoleon Dynamite, however, this logic doesn't hold: reactions to the movie are so starkly contrasting that the web of previous data tying the movie as the "B" to any given "A" is impossible to build. Netflix posted the problem to the development community at large, offering a mighty reward to anyone who could figure out how to consistently recommend it appropriately as part of reconfiguring the recommendation algorithm. To the best of my knowledge, no one's ever done it.
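To see why a polarizing title breaks the "if you liked A, you'll like B" logic, consider a minimal sketch of item-to-item similarity computed from user ratings. The data, names, and ratings below are invented purely for illustration; real recommendation systems are far more elaborate, but the failure mode is the same.

```python
# Hypothetical sketch: item-to-item similarity from user ratings.
# All users and ratings here are made up for illustration.
from math import sqrt

ratings = {
    # user: {movie: rating on a 1-5 scale}
    "ana": {"A": 5, "B": 5, "polarizing": 1},
    "ben": {"A": 4, "B": 4, "polarizing": 5},
    "cho": {"A": 2, "B": 3, "polarizing": 5},
    "dee": {"A": 1, "B": 2, "polarizing": 1},
}

def similarity(m1, m2):
    """Pearson correlation between two movies over users who rated both."""
    users = [u for u in ratings if m1 in ratings[u] and m2 in ratings[u]]
    n = len(users)
    xs = [ratings[u][m1] for u in users]
    ys = [ratings[u][m2] for u in users]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0
    return cov / (sx * sy)

# Movies that audiences broadly agree on correlate strongly...
print(round(similarity("A", "B"), 2))            # prints 0.99
# ...while a love-it-or-hate-it title shows no stable relationship,
# so the algorithm has no reliable "A" to hang it on.
print(round(similarity("A", "polarizing"), 2))   # prints 0.0
```

When everyone who loves A also loves B, the correlation is near 1 and the recommendation is easy; when half the audience adores a movie and half despises it regardless of their other tastes, the correlation hovers near zero, and no amount of extra data settles it.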
Probably no one ever will, because algorithms assume that people are more predictable than they really are, and they are locked into if-then clauses that cannot take into account the intangible aspects of human decision making. Anyone who's studied human behavior, whether from a purely scientific perspective (sociology, psychology) or a business one (marketing), will tell you that you can only predict an expected behavior correctly so often, and that rate rarely ever reaches 100%. The same inputs rarely produce the same consistent output when it comes to people.
The solution for presenting the right content at the right time to the right person, then, expands beyond simply building an equation that seeks to replicate the expectation of human need.
First, since end users may not always know what they actually want, the option for a more open and generic experience should exist. The traditional shotgun approach to advertising and content provisioning (such as a printed paper or magazine, or a broadcast radio or television station), where the entire populace saw the same ad no matter how seemingly insignificant it was to them, might be a little too broad for some, but one cannot underestimate the value of unexpected exposure. One might never realize the need for something without stumbling across it by chance.
That's a paradox of social web sites: there's an inverse relationship between the quality of their recommendations and your privacy. The more information you provide, the better the recommendations get, and the less privacy you keep.