Challenging Times
A Symposium Organized by the Oxford Berlin Research Partnership Looks at the Role of Scientists in Times of Conflict, Fake News, and Polarization
Aug 03, 2022
"I love the word ‘Wissenschaft’ (‘science’ in German). It unites us all: researchers, artists, students, and committed public figures – we all love to create (‘schaffen’) and get creative," Çiğdem İşsever said at the symposium: “Academics in the Public Sphere: Navigating the Political.” A Berliner by choice, she has spent 15 years researching and teaching at Oxford and co-leads the Oxford Berlin Research Partnership with Andrew Hurrell.
The particle physicist was thus able to offer a very personal insight into both cities and their connection in her opening speech. The event, held on July 8 and 9, 2022, was jointly organized by the Berlin University Alliance (BUA) and the University of Oxford.
Scientists and experts from Oxford and Berlin met at the Berlin-Brandenburg Academy of Sciences and Humanities in Berlin-Mitte. For two days, everything revolved around the challenges that scientists face in the public sphere. How can facts be communicated in times of political polarization? What role do fake news and hate speech play in the many-voiced debates on social media? Panel discussions and short talks were on the agenda, as well as time to get to know one another and exchange ideas.
Between Fake News and the One Truth
The role of truth in an age of conflict and war was discussed at length. Can truth be understood as a point of reference rather than as a binary distinction between fake news and truth? Timothy Garton Ash, Professor of European Studies at the University of Oxford, analyzed the development of the public sphere over the last decade in this context. He noted that, on the one hand, we have gained unlimited access to knowledge and facts, but that, paradoxically, freedom of speech has suffered as a result. In public debate, separate spaces with their own views and their own facts tend to emerge. However, this is neither an inevitable consequence of technological development nor exclusively the fault of social media. Rather, it is the result of a complex interplay of political, cultural, economic, and social factors, and of algorithms that reward attention rather than quality.
To counteract this, Garton Ash stressed the role of public service broadcasting in preventing political polarization and emphasized that the sector must therefore be given an adequate budget. "Universities also play an essential role in finding and disseminating truthful information in an age of fake news, extreme polarization, populism, and misinformation." They could play this role even better, he argued, by integrating digital and media literacy more firmly into teaching and by making scientists even more visible in the public sphere.
Prof. Dr. Rasmus Nielsen followed up on this from his perspective as head of the Reuters Institute for the Study of Journalism at the University of Oxford. He underlined that these changes are driven by us as citizens, but also as media users and consumers. When almost everyone uses digital and mobile media platforms, our billions of individual decisions have profound institutional impacts: traditional publishers are pushed back, while a handful of US corporations reap the profits, set the standards, and decide on connections, access, and how information is structured. This shows that even non-political influences on the media have political consequences, especially for democracy. That is why we need to think about new ways for media to report and to reach people, he continued. In the era before digital media, he said, people had more restricted but easier access to reliable information. On the other hand, according to Nielsen, nobody today would want to give up their smartphone and digital devices in return.
Artificial Intelligence: Utopia or Dystopia?
Another topic of discussion at the Oxford/Berlin Symposium was “Living and Working with Artificial Intelligence”: How can artificial intelligence (AI) be used meaningfully to address humanity's major challenges, such as health care, nutrition, or the bioeconomy? How can a technical understanding of AI be communicated to a larger audience? Prof. Nigel Shadbolt, a computer scientist at the University of Oxford, raised the question of how we want to adapt our values to the possibilities of AI. “The challenges we face cannot be met without artificial intelligence,” he said. At the same time, he added, this will require regulation and a practical ethical approach as AI systems become our confidants and as people begin to demand rights and freedoms for them.
Ioannis Havoutis of the Oxford Robotics Institute noted that robots can already be found in the service sector, in the healthcare system, in companies, and in households. As such, he said, they have the potential to free people from undesirable, dangerous, or monotonous work. As a consequence, there will be fewer jobs in the future, a sociopolitical problem that should not be underestimated and that calls for radical solutions.
Abigail Williams, Professor of English Literature at the University of Oxford, talked about how she worked with an English gaming company to develop a chatbot for the play "Romeo and Juliet". Young people can chat with the play's characters and thus feel more drawn into the plot. Williams made it clear that, in her eyes, technology is a tool, not an acting subject. At the same time, artificial intelligence complicates our understanding of creativity, since authenticity, humor, and intention must also be considered. Rather than seeing artificial intelligence merely as a machine that produces imitations, Williams emphasized that AI can show us things we ourselves are not capable of, such as the program AlphaGo, which defeated a professional player of the Chinese board game "Go" in 2015. If we regard AI as a tool, however, we should ask ourselves what human choices are built into the data sets and the programming.
Who can be found in the data sets? What assumptions do we make? Michelle Christensen, a professor at Technische Universität Berlin, is also concerned with these questions. She pointed out that facial recognition software continues to have difficulty with darker skin tones. Patterns of discrimination, for example around gender, are carried over from society into algorithms and programming. Christensen drew attention to collectives that are developing feminist and queer visions for such technologies and setting up alternative servers and networks that are freely accessible in the spirit of the open-source movement.
The discussions between the experts from Oxford and Berlin made clear that many questions and concerns about the ethical design of AI remain open within the scientific community as well. Çiğdem İşsever summed it up as follows: "I am developing useful tools, but I am concerned that they are being used by politicians to oppress people. What is my responsibility as a scientist? Can we develop algorithms that cannot be abused?"
Returning to the question of truth, Prof. Dr. Michael Zürn of the WZB Berlin Social Science Center explained how populists harm democracy by undermining truth through various strategies, for example by refusing to recognize real events as such and thus rejecting the very idea of truth, or by questioning the competence of scientists. He argued that it is important to discuss and politicize the mutual relationship between truth and democracy. Belief in science rests on education, but also on trust, and trust presupposes that people accept statements they cannot verify themselves.
What role can universities and their partners play in strengthening citizens' digital and media literacy and thereby supporting democracy? Prof. Dr. Christoph Markschies, president of the Berlin-Brandenburg Academy of Sciences and Humanities, and Prof. Dr. Anita Traninger, a scholar of Romance studies at Freie Universität Berlin, led the panel discussions on this topic. In this context, they drew attention to the Oxford Berlin Research Partnership. Garton Ash pointed to the need to look jointly, within the research partnership, at which questions are really relevant to the fragmentation and polarization of the public sphere. We should not only talk about fake news, but rather question the way algorithms work and jointly develop proposals for how they should be optimized, not with regard to users' attention but with regard to democracy.
The different perspectives from Oxford and Berlin complement and inspire each other; the two days of discussions made this clear, not only on truth and artificial intelligence, but also on science communication, on dealing with our colonial past, and on the situation of universities during the pandemic. Many new impulses for future workshops and collaborations emerged from the symposium, demonstrating once again that the partnership between Oxford and Berlin connects an extremely active and vibrant community with great potential for future projects.