
Sunday, May 20, 2012

Are your search results Google’s opinion?

New information technology is a challenge for all of us, but particularly for lawyers, who have to sort out how it integrates into existing law. A question that has been on my mind for a while, and one that I think we will hear more about in the future, is: what is the legal status of search engines, and of their search results?

There has now been an interesting development in which Google asked a prominent law professor, Eugene Volokh, for an assessment of the legal classification of their service.

In a long report titled “First Amendment Protection for Search Engine Results,” which was summarized on Wired, Volokh argues that Google’s search engine is a media enterprise and its search results are its opinion. They are thus covered by the US American First Amendment, which protects Google’s “opinion” on what it was you might have been looking for as a matter of “free speech.” (It should be noted that this report was funded by Google.)

It is hard for me to tell whether that is a good development or not. Let me explain why I am torn about this.

Search engine results, and Google’s in particular, have become an important, if not the most important, source of information for our societies. This information is the input to our decision making. If it is systematically skewed, democratic decision making can be distorted, leading to results that are not beneficial to our well-being in the long run.

Of course this was the case with media before the internet too, and this tension has always existed. However, non-commercial public broadcasting, often governmentally supported, exists in pretty much all developed nations (though more prominently in some countries than in others). Such non-commercial alternatives are offered in cases where it is doubtful that the free market alone will lead to an optimal result. When it comes to information in particular, the free market tends to optimize for popularity because popularity correlates with profit, a goal which can differ from accuracy and usefulness.

There is also what I have called the “key in the trunk” effect, the unfortunate situation in which the solution to a problem only becomes accessible once the problem is solved: You need information to understand that you lack information. Information plays a crucial role in democracy.

Research in sociology and psychology has shown over and over again that people, when left to their own devices, will not seek out and think about the information they would need to make good decisions. We are, simply put, not always good at knowing what is good for us. Many problems caused by wrong decisions are irreversible – by the time the problem becomes obvious, it might be too late to change anything about it. This is often the case when it comes to information, and also in many other areas whose regulation is therefore not left to the free market alone. That’s why we have restrictions on the use of chemicals in food, and that’s why no developed nation leaves education entirely to the free market.

(Sometimes when I read articles in the US American press, I have the impression that especially liberals like to think of the government as “they” who are regulating “us.” However, in any democratic nation, we impose rules and regulations on ourselves. The government is not a distinct entity regulating us; it is an organizational body we have put into power to help improve how we live together.)

That it is sometimes difficult for the individual to accurately foresee consequences is also why we have laws protecting us from misinformation: Extreme misinformation makes democratic decision making prone to error. Laws preventing misinformation sometimes conflict with free speech.

Where the balance lies differs somewhat from one country to the next. In the USA, the First Amendment plays a very prominent role. In Germany, the first “Basic Right” is not the protection of free speech but the protection of human dignity. An insult can bring you a prison sentence of up to a year. Either way, freedom of the press is a sacred right in all developed nations, and a very powerful argument in legal matters. (It is notoriously difficult to sue somebody for insult in any of its various manifestations. Aphrodite’s middle finger, see the image to the right, which was on the cover of a German print magazine in February 2010, was covered by press freedom.)

So how should we smartly deal with an information source as important as Google has become?

On the one hand, I think that governmental intervention should be kept to a minimum, because it is most often economically inefficient and bears the risk of skewing the optimization a free market can bring. If you don’t like Google’s search results, just use a different search engine and trust in the dance of supply and demand. George Orwell’s dystopian vision told us aptly what can happen if a government abuses power and skews information access in its favor, putting the keys into the trunk and slamming it shut.

On the other hand, Google is already a tremendously influential player in the market of search engines, and many other search engines are very similar anyway. Add to this that it is not clear that the preferences Google caters to are actually the ones that are beneficial in the long run.

This point has been made, for example, by Eli Pariser in his TED talk on “Filter Bubbles” that we discussed here. Confirmation bias is one of the best documented cognitive biases, and our ability to filter information to create a comfort zone can lead to a polarization of opinions. Sunstein elaborated on the problem of polarization and its detrimental effect on intelligent decision making at quite some length in his book “Infotopia.”

It is sometimes said that the internet is “democratic” in that everybody can put their content online for everybody else to see. However, this sunny idea is clearly wrong. It doesn’t matter what information you put online if nobody ever looks at it because it has no links pointing to it and is poorly indexed by search engines. It might be that, in principle, your greatly informative website appears at rank 313 on Google. In practice that means nobody will ever look at it. Information doesn’t only need to exist, it also has to be cheap, in the sense that accessing it doesn’t require a great cost of time or energy; otherwise it will remain unused.

Then you can go and say: but Google is a nice company, they do no evil, and you can’t buy a good Google ranking (though spending money on optimizing and advertising your site definitely helps). But that really isn’t the point. The point is that it could be done. And if the possibility exists that some company’s “opinion” can virtually make relevant information disappear from sight, we should think about a suitable procedure to avoid that.

Or you could go and say: if some search engine starts censoring information, the dynamics of the free market will ensure that some other takes its place, or people will take to the streets and throw stones. Maybe. But actually it’s far from clear that this will happen, because you need to know that information is missing to begin with. And, seeing that Google’s “opinion” is entirely automated, censorship might occur simply by mistake rather than by evil intent.

So, are your search results Google's opinion? I'd say they are a selection of other people's opinions, arranged according to some software engineers' opinion on what you might find useful.

That having been said, I don’t think it is very helpful to extrapolate from old-fashioned print magazines to search engines and argue that they play a similar role. Volokh does so by referring to a 1980 case in which an author unsuccessfully tried to sue the NYT over the accuracy of its best-seller list. The diversity among search engines is dramatically lower than the diversity of opinions that could be found in print in the 80s. At the same time, the impact of online search on our information availability is much larger. Would you even know how to find a best-seller list without Google?

Now, I'm not a lawyer, and my opinion on this matter is as uninformed as it is irrelevant. However, I think that any such consideration should take into account the following questions:

First, what is the impact that search engine rankings can have on democratic decision making?

Second, what market share should make us worried? Can we find some measure to decide whether we are moving towards a worrisome dominance?

Third, is there any way to prevent this, should it be the case? For example, by offering non-commercial alternatives or by monitoring through independent groups (as is the case, e.g., with press freedom)?

25 comments:

  1. In a free society, the availability of particular information depends on the energy spent in propagating it by the proponents of its relevance.

    A fully automated Google will find and show sufficiently disseminated and cross-linked information.

    I don't see any way around it. If someone solves some major problem but publishes the result on some obscure website that receives five visitors a year, I don't see any humanly possible way except serendipity of finding that information.

    A dictatorship can squelch certain information, no matter how relevant it actually is, and no matter how much energy the proponents of its relevance have; and the wealthy can substitute money for energy and make some information seem more relevant than it actually is. But as long as the search engine is neutral in that its results are based not on trying to evaluate the content, but on the public's vote on it, it is the least of the problems in today's world.

  2. I mean to say above that censorship by governments and distortions by the wealthy (e.g., Rupert Murdoch) are more important to the degradation of our information than the search engine algorithms ** in today's world **.

  3. Well, Google was not the first search engine, and people had an easy time switching every time a better one appeared. There is no reason to think it will be different with Google. The Internet is also democratic towards search engines: anybody can make one and put it there.

    About people not knowing (and not wanting to search for) information that is important for them... I think that is a much deeper problem, but I bet a search engine that actively tells you "you are trying to do X; this topic, which you don't even know you should look for, is relevant to you" would put Google out of business in no time.

  4. This comment has been removed by the author.

  5. Hi Bee,

    I’ve always thought of Google’s results much the same as I do those of science: at best, each stands to represent provisional truth. As for the concern that what we are given might not represent the truth, I think this is something that can ultimately only be dealt with by individuals themselves. That is, I would argue that blind acceptance of misinformation is more often indicative of a lack of due diligence on the part of the searcher rather than of any intent of those who offer such utilities. Moreover, what I find more pertinent is what people are predisposed to believe, as opposed to their taking the time to wonder why anything they are presented with should be believed. In short, I think the best discriminator of truth still is, and I suspect will remain for some time, the human mind, which should never surrender its proxy to an algorithm; whether that algorithm represents synthetic execution or that of societal groups.

    “Instead of tending towards a vast Alexandrian library the world has become a computer, an electronic brain, exactly as an infantile piece of science fiction. And as our senses have gone outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence. [...] Terror is the normal state of any oral society, for in it everything affects everything all the time. [...] In our long striving to recover for the Western world a unity of sensibility and of thought and feeling we have no more been prepared to accept the tribal consequences of such unity than we were ready for the fragmentation of the human psyche by print culture.”

    -Marshall McLuhan, “Gutenberg Galaxy,” p. 32, University of Toronto Press


    Best,

    Phil

  6. Google is deeply run by its founders Sergey Brin and Larry Page. They and their efforts' credibilities are not the "truths" of a corporate whore CEO, a government insertion, or traditional Media in bed with its owners and regulators. Pick your flavor of tyranny when common law (good manners) is replaced by statutory law (jackbooted State compassion).

    One in three East Germans was a stukach. How much Homeland Severity is enough? Like church collection plates, the need can never be filled. Google, Wikipedia, etc. are working approximations to objective truth. Official Truth is not about real-world content. al-Assad of Syria knows how to rule.

  7. Hi Arun,

    "a fully automated Google will find and show sufficiently disseminated and cross-linked information."

    But that's a chicken and egg problem. How does something get disseminated and cross-linked that isn't suitably indexed to begin with?

    Either way, I didn't so much mean that there's something wrong with search engines per se, but that relying on a ranking exclusively based on popularity isn't helpful to all ends. I suggested earlier (ah, some years ago, somewhere on this blog) that it would be good if there was a way to directly "tag" a link with the reason it's being used. For example, I might link to a website because it's a great resource, or because it's a terrible site but hilariously so. For the purpose of popularity, both count the same - a link is a link. But if you're looking for information, you might be less interested in the entertainment value than in the truth value.

    Thus, I basically think we need more alternatives to search algorithms, even if these alternatives are not profitable. Best,

    B.

  8. Hi Arun,

    "I mean to say above that censorship by governments and distortions by the wealthy (e.g., Rupert Murdoch) are more important to the degradation of our information than the search engine algorithms ** in today's world **."

    I would agree. But I think it's only a matter of time till these problems merge, and we had better be prepared for that situation. Best,

    B.

  9. Hi Marcos,

    "The Internet is also demcratic to search engines, anybody can make one, and put it there."

    Anybody who can afford the investment that is. The internet is not democratic. It's capitalistic. Best,

    B.

  10. Hi Phil,

    I'll have to disagree with you. It's not a problem we can deal with individually, because it's the difficulty of understanding social dynamics and collective effects that is the root of the problem. What is needed is a better understanding of the way we assess and use information that is relevant for our political opinion making. If we at least knew the facts, then we could talk about what can be done about it. But at present, I think very little is known, and everybody just kind of hopes that market forces will be sufficient for optimization. Which I doubt, however. Best,

    B.

  11. Hi Bee,

    Essentially what you are suggesting is that what we need to offset the effects of a corporate-funded Big Brother is the creation of a government-sponsored Big Brother; the problem is that in the end they are both Big Brother. However, where I think we do agree is that more people must first become aware of the effects and consequences of modern media before any remedy can be found. What I find interesting is that when McLuhan was writing his books he was referring mainly to the historical development of media in general, tracing its evolution and its effects on society, and yet I find his analysis of these matters to be as relevant today as ever. The thing is, as McLuhan made clear, media isn’t going to go away, and neither will its dangers and benefits, as each is systemic. So my point of view remains that each of us must first learn about the dragon and then learn to tame it, rather than depend on the arrival of a dragon slayer; those were only ever a myth.

    “Computers can do better than ever what needn’t be done at all. Making sense is still a human monopoly.”

    -Marshall McLuhan, “Take Today: The Executive as Dropout,” p. 109 (1972)

    Best,

    Phil

  12. Hi Phil,

    I would prefer many smaller brothers. And they don't necessarily need to be governmentally funded, though that's the obvious solution. They could, in principle, be non-profit based on other sources, but I'm not sure that would work.

    Yes, McLuhan seems to have been very prescient. I can only hope that in 40 years from now, his vision applies to a phase that came - and passed.

    Best,

    B.

  13. "If it is systematically skewed ... " Everything that people do is systematically skewed. Most of the biosphere on planet Earth consists of bacteria and viruses, but most people do not understand bacteria and viruses very well. Search engines and the international corporatocracy are driven by money, which history shows is not necessarily a benign driver.

  14. Hi Bee,

    Well, at least our hopes are the same, if not our methods. The thing with McLuhan is that he looked inside the animal as much as he did at its tools, and I don’t think there is any way to avoid such a relational examination if things are to be better understood. In respect to your smaller big brothers, we do have things such as Wikipedia, which comes up on the first page of just about every search I make. This has me wondering: is it the same for everyone, or just for a self-selected demographic?

    Best,

    Phil

  15. Hi Bee,

    One thing I am upset about with Google is that Google Scholar has been demoted to being a specialty search. I thus would support petitioning Google to give it a prominent place on their header line; at least then, even if it were still mainly ignored, no one could complain they weren’t given the option.

    Best,

    Phil

  16. Hi Phil,

    Yes, I'm with you on that. Or at least, they could let me customize what I want in the head menu. Mine features several options (Play, News) that I never use, so why can't I just pull the Scholar button there? Best,

    B.

  17. Hi Bee,

    I guess we could appeal to Sergey and Larry that doing no evil is not enough, as doing some good is also at times necessary. Then again, although things may not present themselves on my header line, sites such as Backreaction always seem to appear in one of the boxes below; then of course there isn’t anything to wonder about there ;-)

    Best,

    Phil


  18. Dear Bee,

    "But if you're looking for information, you might be less interested in the entertainment value than in the truth value. "

    The problem is, deciding the truth value is precisely the job that governments, corporations, churches, etc. arrogate to themselves. I don't want that. Nor do I want the search engine to be trying to determine truth for me.

    That is why I phrased it as "the availability of particular information depends on the energy spent in propagating it by the proponents of its relevance."

    Maybe the search engines could add a dimension to their indices. That is, users of the search results could rate the value of the results; and the search engine could display rankings by communities of users.

    So maybe I should be able to phrase a request to Google to search for articles on quantum gravity that Phil and Bee found useful. Of course, we'd want that anonymized, so Google would build user profiles, and classify them, and I would say, find me articles on xyz rated high by users with user profiles close to that exemplified by Bee or Phil.

    So, e.g., if I wanted to see the best climate change denialist stuff, I would ask for articles rated high by a profile like Motl's :)



    -Arun

  19. Hi, Bee.

    "Anybody who can afford the investment that is. The internet is not democratic. It's capitalistic."

    Well, read my original phrase with an ironic tone. As always, you are right, it is capitalistic, and capitalism is an insane mesh of democracy and ostracism that makes things very comfortable for the people who are inside, but is despair for anybody who ends up outside.

    Anyway, the point is that currently lots and lots of people can afford the investment, and those people are eager to lend the needed money to other people who have the technical capabilities. If that stops being the case, it could become important to bring competition back to this market. But it is not important now (and regulating the market won't help you get competition back).

  20. Hi Marcos,

    I agree with you except on the point that it's not important now. It is good to be prepared for problems, and it's a problem that I think we will likely have to face at some point. Maybe not where you live, and maybe not where I live, but sooner or later that question will become relevant for a significant fraction of people on the planet. Best,

    B.

  21. Hi Arun,

    Yes, adding dimensions to their indices is what I mean.

    "deciding the truth value is precisely the job that governments, corporations, churches, etc. arrogate to themselves. I don't want that. Nor do I want the search engine to be trying to determine truth for me."

    But search engines already do that, which is precisely my concern. Maybe not for you personally, but if you go and ask an average group of people whether a statement X is true, the first thing they'll do is Google it. If it's not easily found, or appears only late in the ranking, I suspect it's far more likely to be rated not true. Now, it is probably the case that being found on highly ranked sites is correlated with being true, but once you start substituting one for the other you're headed for trouble.

    Either way, "truth" is a loaded word, and it was not a good choice. It would only work anyway for values that propagate like popularity. I.e., the Google ranking is implicitly based on the assumption that popular sites know best what's popular. You could do a similar thing for some other values. Say, scientific, because it seems plausible to me that people with scientific knowledge are more likely to be able to tell what's scientific knowledge. Or maybe entertaining. Or witty. Best,

    B.

  22. The NYT has an interesting op-ed by a law professor on this very issue today:

    "as a general rule, nonhuman or automated choices should not be granted the full protection of the First Amendment, and often should not be considered “speech” at all."

  23. Hi Bee,

    A nice piece you point to here, yet I have a feeling its author has a greater fear of technology than he has an understanding of its creators. The thing is, free speech isn’t a right which trumps all others, even in the US; for instance, it doesn’t give anyone the right to shout fire in a crowded theater. So as far as I’m concerned, it has been well established that mere opinion is opinion regardless of how it is transmitted, and thus allowed as long as it doesn’t have dire consequences for others; as there are times when having it right does make all the difference. So in this case I do agree we don’t need more laws, more regulation, or diminished rights, just a better, more respectful understanding of, and adherence to, the ones we already have.

    “The most stringent protection of free speech would not protect a man in falsely shouting fire in a theatre and causing a panic. [...] The question in every case is whether the words used are used in such circumstances and are of such a nature as to create a clear and present danger that they will bring about the substantive evils that Congress has a right to prevent.”

    - Justice Oliver Wendell Holmes, Jr., Schenck v. United States, 249 U.S. 47 (1919)

    Best,

    Phil

  24. Hi Phil,

    Well, all rights have to be re-assessed at the point where they start conflicting with other people's rights, and not causing physical harm is always high on the priority list. I believe we talked about this earlier: the assessment of priorities differs from one country to the next, due to culture and history. Germany, for example, gives pretty high priority to the protection of dignity, which is why, in principle, you can sue somebody for giving you the finger. (In practice such a case is likely to be dropped due to its limited relevance.)

    In any case, to be able to assess these conflicts and solve them in a way that is of the largest benefit to everybody involved, one first of all needs to know where they originate and what their consequences are. Information that originates in a computer code is a different thing than information that originates in a vocal cord, and to assess the consequences one would have to think about the points I listed in my post. Best,

    B.

  25. Hi Bee,

    Yes, Canada has similar protections, in that all rights are weighed in the balance rather than any one taking precedence. My only point is that it’s a poor argument to point to the potential evils of the machine while ignoring those who created it, and in turn their intentions. As for increasing the tools for making such assessments, I’m all in favour of that, as long as they remain tools to help decide on intent and do not become instruments of control; that is, regardless of the source.

    Best,

    Phil

