IdleRich
Interesting talk here where Eli Pariser speaks about how personalisation is the new buzzword for internet companies and what that can mean for search engines.
http://www.dshed.net/eli-pariser
The main thrust of what he's saying is that Google, partly in response to the way Facebook etc. only show you data that you're likely to like (because it comes to you via your friends, who presumably like the things you like), now attempts to tailor your search results to you based on the kind of things you've clicked on before - for instance, if I were to search for football it would be more likely to show me proper football than American Football (or Aussie Rules or whatever), because that's what I've looked at before. Fair enough. The problem is that this extends to politics and news: if a rabid right-winger and a loony lefty both put the same search into Google, they will presumably get very different pages, each reinforcing their respective beliefs, and yet both will believe they've just picked the first results from an objective (or at least rigidly algorithmic) search of the interweb. This doesn't seem right to me; the search is telling you what you want to hear and, worst of all, most people aren't aware of it.
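To make the football example concrete, here's a toy sketch of the general idea - this is not Google's actual algorithm, just a minimal illustration of how re-ranking the same result list against each user's click history produces different orderings for the same query (the result list, topics, and histories are all made up):

```python
# Toy illustration of personalised re-ranking: NOT Google's real algorithm,
# just the general principle of boosting topics a user has clicked before.
from collections import Counter

def rerank(results, click_history):
    """Sort results so that topics the user has clicked most float to the top.

    Uses a stable sort, so results on topics the user has never clicked
    keep their original relative order.
    """
    clicks = Counter(click_history)
    return sorted(results, key=lambda r: -clicks[r["topic"]])

# The same "objective" result list for the query "football"...
results = [
    {"title": "Premier League round-up", "topic": "football"},
    {"title": "NFL season preview", "topic": "american football"},
]

# ...re-ranked for two users with different click histories.
uk_user = ["football", "football", "music"]
us_user = ["american football", "american football"]

print([r["title"] for r in rerank(results, uk_user)])
print([r["title"] for r in rerank(results, us_user)])
```

Same query, same underlying results, yet each user sees a different top hit - which is harmless for football, and rather less so for politics.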
This personalisation thing is mentioned in this article too, although, since the quote is from a Google employee, it unsurprisingly gets a much more positive spin.
http://www.guardian.co.uk/media/2011/jul/30/google-plus-facebook-social-networking
"Diversity of results is something deeply baked into the algorithm tools we use, so that we hopefully give a broad perspective," said Gomes. "But if you are interested in a topic you'd tend to do a very specific query anyway, and our first goal is to give you the information you want."

Seems to me that the first claim in that quote is somewhat undermined by the second. Aren't diversity and specificity mutually exclusive?
Anyway, I don't know to what extent other search engines do this, but it makes me reluctant to use Google - certainly for important stuff where I'm trying to discover what I think, not just be given what someone thinks I want to think (though maybe it's fine when you're searching for football, music, etc.).