Fake News and the Power of Algorithms: Dr. Ricardo Baeza-Yates Weighs In With Futurezone at the Vienna Gödel Lecture

NTENT Chief Technology Officer Dr. Ricardo Baeza-Yates made news in Austria on June 8 when he presented his lecture, “Bias on the Web,” at the Vienna Gödel Lecture 2017. Drawing on his expertise in data analysis and algorithms, he addressed how basic assumptions, prejudices and preferences introduce bias into data, making it difficult to identify credible information, and how these biases shape the way information is distributed and presented. The following is taken from an interview with Futurezone at the event, where Dr. Baeza-Yates was asked what solutions he had to offer.

Futurezone: You speak in your lecture about fake news and the power of algorithms. What are your central theses?

Baeza-Yates: In the past two to three years, there has been growing concern, especially in the USA, that bias in data can harm minorities, for example, by making it harder for African Americans to get loans. Another example of bias is the influx of fake news. It is often difficult to distinguish what is true and what is false, as was the case in the Austrian election campaign, when the current president, Van der Bellen, had to show his medical records because of a fake post claiming that he had cancer. This development worries people, and while we are right to focus on the bias in the data available to us for decisions, we must also address the less visible part of the problem and deal with the biases themselves.

Futurezone: For example?

Baeza-Yates: One aspect is called “presentation bias.” If you go to the supermarket to buy rice, you can only choose from the range of rice options that you see. Your selection is limited by factors such as which brand can pay the most for shelf space or how the shelf is laid out. The same thing happens on the Web. When I visit a video streaming page, I get to see a certain selection. This can be ten titles or even a hundred, but there will always be millions of videos I cannot choose. This is a form of bias. The providers whose products I see get richer while the rest lose.
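
To make the mechanism concrete, here is a minimal sketch in Python with entirely invented numbers: a small slate of presented items absorbs every click, while items that are never shown get nothing, regardless of their quality.

import random

random.seed(0)

# Hypothetical catalog: 1,000 items of identical underlying appeal (all values invented).
catalog = {f"item_{i}": 0.5 for i in range(1000)}
shown = list(catalog)[:10]  # only ten items are ever presented to users

clicks = {item: 0 for item in catalog}
for _ in range(10_000):  # simulated visits
    choice = random.choice(shown)  # users can only pick from what they see
    if random.random() < catalog[choice]:
        clicks[choice] += 1

presented = sum(clicks[i] for i in shown)
hidden = sum(v for i, v in clicks.items() if i not in shown)
print(f"clicks on the 10 presented items: {presented}; clicks on the other 990: {hidden}")
# Every click goes to the presented items even though all 1,000 are equally appealing.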

Futurezone: Does this also affect the way in which information is distributed?

Baeza-Yates: Yes, we also see this effect in social networks and search portals. It happens as a consequence of the filter bubble. A potential solution for this would be for users to actively consume as diverse a range of information as possible so they can broaden their horizons through chance discoveries found at the border of their filter bubble.

Futurezone: What could the providers do?

Baeza-Yates: One solution would be to encourage the operators of the platforms to offer a fairer choice. Another option might be to confront users with information diametrically opposed to their own opinion. We have yet to see this possibility in the current systems since users generally seek information aligned with their own views, despite the fact that understanding contrary opinions is an essential part of effective decision making.

Futurezone: Are there already approaches to this?

Baeza-Yates: We are currently trying to develop systems that attempt to connect people with diametrically opposed opinions using intermediate topics where there is agreement. For example, bridging the gap between conservatives and progressives by finding common ground in a shared love for a favorite football club. This approach, however, has limitations. I do not believe that it would work for deeply held religious beliefs.
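
The following is a minimal, hypothetical sketch of that idea in Python (the user profiles and scores are invented): given two users with opposed stances on political topics, look for shared topics on which they still agree and use those as the bridge.

# Hypothetical interest profiles (topic -> stance score in [-1, 1]); all data invented.
user_a = {"immigration": -0.9, "taxes": -0.7, "football_club_x": 0.8, "local_food": 0.6}
user_b = {"immigration": 0.9, "taxes": 0.8, "football_club_x": 0.7, "local_food": 0.5}

def bridge_topics(a, b, tolerance=0.5):
    """Shared topics where both users lean the same way and their scores are close."""
    shared = set(a) & set(b)
    return sorted(t for t in shared if a[t] * b[t] > 0 and abs(a[t] - b[t]) < tolerance)

print(bridge_topics(user_a, user_b))
# ['football_club_x', 'local_food'] -- neutral common ground that could be used
# to connect two people who disagree strongly on political topics.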

Futurezone: What is the role of classical media?

Baeza-Yates: The media can cause bias in many ways, for example by emphasizing certain messages to give weight to one side over another. Political factors often play an influential role in the media as well.

Futurezone: How do the algorithms of Google and Co. distort the information on offer?

Baeza-Yates: Search engines amplify such biases. When a journalist writes an article, he or she will likely refer to the top results on Google as a starting point for research. When the article is then published on the Web, it in turn affects the ranking of the results. Thus, the concentration on a few sources is strengthened, new content is influenced in that direction, and the search engines regard it as correct.
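
As a rough illustration of that feedback loop, here is a toy Python model with invented numbers, assuming a ranking that simply orders sources by citation count: each new article cites the current top result, which keeps it on top.

# Toy model of the self-reinforcing ranking loop (all figures invented).
citations = {"source_a": 11, "source_b": 10, "source_c": 9}

for week in range(5):
    ranking = sorted(citations, key=citations.get, reverse=True)
    top = ranking[0]
    citations[top] += 3        # new articles reference the current top result...
    for other in ranking[1:]:
        citations[other] += 1  # ...while the other sources grow more slowly
    print(f"week {week}: top source is {top}, counts = {citations}")

# source_a never loses the top spot: the ranking rewards it for being ranked first,
# not for being better, which is exactly the amplification described above.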

Futurezone: Why don’t we seek out new information sources more frequently?

Baeza-Yates: People want to see their opinions confirmed. We call this “self-selection bias.” Even when they are presented with the same selection of messages, different people will click on different things depending on what they already believe.

Futurezone: What are some possible consequences of distorted information?

Baeza-Yates: Take the cases of Brexit and the US election: these were manipulated through social media. I do not mean by hackers, which cannot be ruled out, but by social biases. The politicians and the media are in the game together. For example, a non-Muslim attack may be less likely to make the front page or earn high viewing ratings. How can we minimize the amount of biased information that appears? It is a problem that affects us all.

One might try to make sure people get a more balanced presentation of information. Currently, it is often the media and politicians that cry out loudest for truth. But can there be truth in this context at all? Truth should be the basis, but there is usually more than one definition of truth. If 80 percent of people see yellow as blue, should we change the term? When it comes to media and politics, the majority can create facts. In that sense, humans sometimes behave like lemmings. Universal values could be a possible common basis, but they are increasingly under pressure from politics, as Theresa May recently demonstrated with her attempt to change the Magna Carta in the name of security. As history already tells us, politicians can be dangerous.

Futurezone: Why do we choose them then?

Baeza-Yates: In the end, we are still animals. Our instincts tell us to defend our family and possessions. Politicians exploit this by painting threat scenarios. Even if people know that these are not true, they usually do not want to take the risk. Society is our software and our brain is our hardware, but the hardware lags 100,000 years behind. That is why basic values are, in any case, only a normative concept, but we should still try to uphold universal values. Otherwise it will become very difficult.

Futurezone: How could people be politically informed without polarizing?

Baeza-Yates: In the field of politics, the solution would most likely be to recognize that there is a spectrum of views on most subjects. I can determine whether a user leans left or right by analyzing his or her reading behavior and then present results that match. If the reader sees a scale that shows his or her political position based on media consumption, it may cause a shift in behavior. But this can only ever be optional, and many people would not accept the offer. We could provide fact-checks and an evaluation system for the credibility of reports, but we would need an unquestioned baseline against which any coverage can be measured, such as trust in the capitalist social order or something else that society has accepted. That makes it the truth for most people. Bias is not about how far I am from the truth, but about how far I am from what is accepted. Bias always depends on an assumed baseline, which is often difficult or even impossible to define.
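
A minimal sketch of such a scale in Python, assuming hypothetical outlet positions between -1 (left) and +1 (right) and an invented reading history: the reader’s score is simply the position of the outlets they read, weighted by how often they read them.

# Hypothetical left/right positions for a few outlets (-1 = left, +1 = right); values invented.
outlet_position = {"outlet_left": -0.8, "outlet_center": 0.0, "outlet_right": 0.7}

# A reader's consumption: how many articles they read from each outlet (invented numbers).
reading_history = {"outlet_left": 12, "outlet_center": 5, "outlet_right": 1}

def leaning_score(history, positions):
    """Weighted average of outlet positions, weighted by the number of articles read."""
    total = sum(history.values())
    return sum(positions[o] * n for o, n in history.items()) / total

print(f"estimated position on the left/right scale: {leaning_score(reading_history, outlet_position):+.2f}")
# Showing readers a score like this is the kind of feedback the interview suggests
# might nudge them toward a more balanced media diet.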

Futurezone: What happens when many people are subject to different biases?

Baeza-Yates: Polarization is a problem. It results from the fact that people must choose the lesser of two evils. I do not believe that Austria would have elected a Green president if the situation had not been so acute. Those decisions can swing in either direction, as seen in the case of Trump.

Futurezone: What could help us here?

Baeza-Yates: We need neutral media. The US media is so polarized that there are practically two separate publics in the country. To a lesser extent, this also happens in Europe. Our system is becoming increasingly polarized, and the media is pushing it forward. It is difficult to find a newspaper that is not politically positioned. This often leads to conflicts between what the editors think and what they should actually report.

Futurezone: What should politics do?

Baeza-Yates: Politicians should help society achieve its goals. Unfortunately, this rarely happens. Far too often it is about personal power games. Most of the time, it is other actors, such as some journalists, who keep an eye on social welfare.

Futurezone: Quality journalism as a solution?

Baeza-Yates: The media do not belong only to journalists. Through social media, today everyone is a bit of a journalist. Editors cannot always say what they want. It could be a good idea to contrast different views on a topic within the comments section. Then all sides can argue their position and readers can decide who is more credible. Views are often not just black and white; most things are gray. This is a well-known approach, but in the past the mere juxtaposition of different experts has also been criticized and described as anti-journalism.

I do not understand the criticism; I think it comes from people who are afraid of losing their interpretive authority.

Futurezone: Why is the debate about fake news boiling up now, even though the phenomenon is old?

Baeza-Yates: Fake news has always existed, but the Web is an amplifier. So everything is as usual, except that the degree of amplification sometimes gets out of control. Fake news can reach the whole world in a short time, whereas in the past it might not have spread beyond the town notice board.

Futurezone: Is the Internet an uncontrollable jungle?

Baeza-Yates: The network also allows us to trace messages back, and that can help us. If I had to choose, I would prefer the current situation; traceability is invaluable. Many people currently seem to feel that the world is getting worse. That is partly because the Web acts as a mirror in which we observe ourselves and the events around us. The Internet reinforces not only the bad but also the good. When people feel that the Web presents a gloomy picture of the world, that is due to their self-selection bias. The claim that the network is evil is simply not true; it is just an intensified reflection of society.

About Dr. Ricardo Baeza-Yates: As CTO, Dr. Ricardo Baeza-Yates oversees the technical vision of the company. Prior to NTENT, he spent 10 years at Yahoo Labs, ultimately rising to Vice President and Chief Research Scientist. He has also been a professor at Universidad de Chile since 1985, where he founded and directed the Center for Web Research and twice served as Computer Science Department Chair, and a professor at Universitat Pompeu Fabra in Barcelona since 2005, where he founded and directed the Web Research Group. Ricardo is an ACM and IEEE Fellow with over 500 publications, tens of thousands of citations, multiple awards and several patents. He has co-authored several books, including “Modern Information Retrieval”, the most widely used textbook on search.

He earned Bachelor’s and Master’s Degrees in Computer Science and Electrical Engineering from the University of Chile and a Ph.D. in Computer Science from the University of Waterloo in Canada.