Google’s motto ‘Don’t be evil’ came under scrutiny this week after CEO Larry Page defended recent changes to search results that focus on personalisation.
The company is trying to merge search results with its social networking application Google+, a competitor to Facebook. With 845 million users, Facebook sits on the largest trove of personal information any company has ever held. And that means finely tuned targeted advertising.
Page says it will create a ‘more intuitive experience’. But what if a user doesn’t want that? What about random? Like randomly walking into a bookshop or library and discovering something new? Changing search outcomes based on what a computer algorithm thinks you will like may not be evil, but it is manipulation.
Online ‘filter bubbles’ are the subject of Eli Pariser’s TED Talk, available at www.ted.com. It gives insight into the shift from the original intent of the internet to what it has become: an advertising tool. He first became aware of this when he noticed that Facebook had removed his conservative friends’ posts from his feed.
He says, ‘It turned out Facebook was looking at what links I was clicking on and it noticed I was clicking more on my liberal friend’s links than my conservative friend’s links. Without consulting me, they had edited them out.’
And of course Google does it too. ‘Two people searching the same thing will get very different results. One engineer told me there are 57 signals Google looks at. These include your computer, browser and location so they can personally tailor your query results.’ News websites, such as Huffington Post and the BBC, can also be personalised, says Pariser. ‘The internet is giving us what it thinks we want to see, but not necessarily what we need to see.’
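The mechanism Pariser describes can be illustrated with a toy sketch. The single signal used here (how often you clicked each source before), the names, and the weights are all illustrative assumptions, not any platform’s real algorithm:

```python
# A toy 'relevance-only' feed filter, in the spirit of Pariser's account.
# All names and signals below are hypothetical.

def relevance(post, click_history):
    """Score a post purely by how often the user clicked this source before."""
    clicks = click_history.get(post["source"], 0)
    total = sum(click_history.values()) or 1
    return clicks / total

def personalise(feed, click_history, top_n=2):
    """Keep only the top_n most 'relevant' posts.
    Everything else is silently edited out, without consulting the user."""
    ranked = sorted(feed, key=lambda p: relevance(p, click_history), reverse=True)
    return ranked[:top_n]

feed = [
    {"source": "liberal_friend", "title": "Post A"},
    {"source": "liberal_friend", "title": "Post B"},
    {"source": "conservative_friend", "title": "Post C"},
]
# The user historically clicked liberal links far more often.
history = {"liberal_friend": 9, "conservative_friend": 1}

bubble = personalise(feed, history)
print([p["source"] for p in bubble])  # the conservative post disappears
```

Nothing here is malicious in intent; the filter simply optimises one number, relevance, and dissenting content falls out as a side effect.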
He makes the point that in the early 1900s newspapers became critical to a functioning society. From that grew journalistic ethics, and the internet has now reached the same point in its evolution.
Information gatekeepers – once human – understood civic responsibility and ethics, but invisible algorithmic filters are programmed only for relevance. Content that is important, challenging, uncomfortable or that presents other points of view should also reach us, according to Pariser.
Should we be surrendering our innate human curiosity to robots so easily?