Policy recommendations

1. Properly identify problems

Recognise the complexity of issues

A holistic approach is required to combat mis- and disinformation; regulating platforms alone is insufficient. It is also crucial to use terminology appropriate to the complexity of the issue: to solve a problem, we must first define it correctly. In politics and the media, the blanket term 'fake news' – which can refer to both mis- and disinformation – is used far too frequently. Making matters worse, 'fake news' is also used to discredit inconvenient statements by political opponents. Naming problems and phenomena correctly requires a critical analysis of the various aspects of mis- and disinformation. For instance, the way information spreads needs closer scrutiny, as do content and content formats. All too often, a textual format is implicitly assumed; it is essential to name and emphasise the format of mis- and disinformation. Images and videos are far more memorable and widely shared, and voice messages are growing in popularity. We need to foster an awareness in politics, journalism and society at large that while mis- and disinformation are not new phenomena, social media networks and messaging services increase the spread of information exponentially. Changes in media consumption contribute to the problem, as do a loss of trust in the state and media outlets, global information wars, the reckless use of digital media and people's susceptibility to conspiracy theories. The complexity of the issue demands an approach that goes beyond solely regulating digital platforms.

2. Political and diplomatic responsibility

A diplomatic and foreign policy focus on the human rights aspects of disinformation

The case studies of India and Brazil show that disinformation spreads so rapidly and effectively in part because government officials tacitly approve of or even welcome it. Disinformation is used to legitimise the political harassment of specific groups, which may be defined by caste, religion, gender or sexuality. Diplomacy and foreign policy must explicitly call out the abuses that lead to riots and violations of human rights. The extensive internet shutdowns in India are part of the problem and need to be countered by political condemnation.

Make global platform operators more accountable, protect the human right to private communication

Developing a solution that holds global corporations and platform operators to account without curtailing freedom of speech requires a global discussion. For messenger services, this applies to Facebook as the owner of WhatsApp, Facebook Messenger and Instagram, as well as to Telegram and Discord. Any solution must protect the human right to private – that is, encrypted – communication.

3. Responsibility of platforms and technology design

The design of messenger services must take disinformation into account

Social media platforms bear a major responsibility for combating disinformation, and technology design offers broad possibilities for curbing the rapid spread of information. In response to the Brazilian election, WhatsApp introduced a feature that marks forwarded messages as such, clarifying that the information did not originate with the person who sent it. While this can be circumvented by saving and re-sending an image, it does create a little more transparency. WhatsApp currently allows forwarding to only five chats at a time, a limit implemented in response to events in India and Brazil. However, those chats can be groups, so if each group contains the maximum of 256 members, almost 1,300 people can be reached in just a few steps [49]. Transparency and traceability might be further improved by a feature WhatsApp is currently testing: a disclaimer showing how often content has been forwarded [50]. All of this is to say that restricting forwarding options can significantly improve transparency and traceability.
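The forwarding arithmetic above can be sketched briefly. This is an illustrative calculation only, assuming the 2019-era limits of five forwards per action and 256 members per group, and groups with no overlapping members:

```python
# Illustrative upper bound on WhatsApp forwarding reach.
# Assumed parameters (not from WhatsApp documentation): five chats per
# forwarding action, 256 members per group, no overlap between groups.
FORWARD_LIMIT = 5
GROUP_SIZE = 256

def max_reach(rounds: int) -> int:
    """Upper bound on people reached after `rounds` of forwarding,
    if every recipient forwards the message to five full groups."""
    per_action = FORWARD_LIMIT * GROUP_SIZE  # 1,280 people per action
    total = 0
    audience = 1  # the original sender
    for _ in range(rounds):
        audience *= per_action
        total += audience
    return total

print(max_reach(1))  # 1280 -- the "almost 1,300 people" noted above
print(max_reach(2))  # after a second round of forwarding
```

In practice, overlapping group memberships and recipients who do not forward reduce the real reach, but the sketch shows how quickly even a hard per-action limit compounds.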

Due to encryption, self-regulation and deleting content are less relevant tools for most messenger services

Nonetheless, Telegram and others must take more effective action in open groups and channels to prevent the spread of disinformation, possibly through the use of 'fingerprints'. Labelling mis- and disinformation by means of fact-checks can have an unintended side-effect: if only content that has been checked can be trusted, everything else becomes untrustworthy [51], potentially reducing trust in the media. Messenger services and their parent companies must work closely with governments to combat illegal content. They have a responsibility to educate their users and to offer simple options for reporting illegal content.

Displaying information on disinformation within messenger services

Messenger services are an ideal platform for educating users and cautioning against mis- and disinformation, including cyberbullying and hate speech. Apart from asking how NGOs and governments can reach and educate people, we need to discuss what responsibilities messenger services have and how they can help. Telegram, for instance, already informs users about updates and new features through its own channel; educational campaigns could be run through such channels as well. To its credit, WhatsApp has started a campaign in India – implemented outside of the messenger – to combat mis- and disinformation, as well as the hate speech that leads to lynchings and mob violence [52].

4. Regulation

‚Know Thy Foe‘ – promote and support research into disinformation and its spread

Mis- and disinformation are rarely illegal – especially in liberal democracies. As a consequence, we need to ask what good regulation looks like, rather than whether it is required. This includes debate over whether shared information can be regulated directly or whether less intrusive measures may be more effective. Generally speaking, disinformation – and its spread on messenger services in particular – is currently under-researched [53]. This is in part because social media platforms do not make their data available to independent research institutions. Laws stipulating that anonymised data be made available for research could be a solution. In addition, ethical standards need to be developed – or existing ones adapted – that enable research on closed groups within messenger services. Carlotta Dotto has submitted a proposal for just this to First Draft [54]. Her paper provides journalists with guidelines on researching dark social and raises awareness of the ethical questions that need to be considered.

5. Journalism

Promote local journalism to restore trust in the media

Studies have highlighted the erosion of trust in news and the media, alongside a decrease in news consumption due to a perceived negative slant in coverage [55]. Journalism and society at large need to ask how to rebuild trust and improve the media's negative image. The Danish journalist Ulrik Haagerup, in collaboration with others, is attempting to produce more constructive news via the 'Constructive Institute' [56] approach, which puts negative news in the right context. For instance, if several burglaries are reported in an area, it makes sense to note whether burglaries in general are on the decline. This can partly counteract the effect of mis- and disinformation that holds politicians responsible for a perceived deterioration of the area. The predominant focus should be the promotion of local journalism and media, which raises the question of sustainable business models.

Careful selection of talking points to counter the spread of misinformation and disinformation

Editors and journalists must consider what they report – and what they don't [57]. Reporting on mis- and disinformation often increases its reach, regardless of the channel. A wider discussion among journalists and editors is therefore needed about which journalistic standards require adaptation for the 21st century. After all, the assumption that disinformation will be covered by traditional media is often part of the dissemination strategy and can sow doubt – even when it is reported on in the right context [58]. Journalists also need social media training in order to recognise and classify the effects of information spread via social media platforms.

6. Resilient societies

A federal agency for all-ages digital media education

The proliferation of social media and the internet means that the question of how to act online is now a facet of everyday life for almost everyone. Misinformation and disinformation are often spread unwittingly by the masses, especially when they validate or confirm someone's values. Questioning all information is a thankless and almost impossible task. Nevertheless, education can and must be one of the tools used to counter the spread of mis- and disinformation. Studies have shown that people over 65 are the primary spreaders of mis- and disinformation, sharing it seven times more often than people under the age of 29 [59]. Further education for all ages is needed that deals with the rapid changes of the digital age and counters disinformation. A federal agency for digital education, similar to the Federal Agency for Civic Education in Germany, could provide lifelong education for everyone [60]. For years, Finland has run a lifelong training programme on disinformation – in part due to its close proximity to Russia – and has not only provided a model for how this could work but also shown the impact such a programme can have [61]. We also need greater awareness and education concerning the forwarding of illegal content, which is punishable and by no means harmless or humorous.

The discussion must include society as a whole and cannot be confined to schools. It has become evident that strategies focussed solely on technological approaches are not promising. While studies show that messenger services enable and increase the spread of disinformation, they also show that a lack of trust in both the state and the media is an important factor in its dissemination. Increases in nationalism and its epiphenomena, such as racism, sexism and anti-Semitism, also play a significant role. It is not sufficient to view disinformation as an external, political problem restricted to campaigns. Neither can we hold only social media platforms to account and outsource the solutions to their parent companies. A broad discourse in which everyone is included is required to combat disinformation.

[Image: WhatsApp campaign to curb the spread of misinformation]

[49] Cf. Kastrenakes 2019.
[50] Cf. Sagar 2019.
[51] Cf. Pennycook, Bear, and Collins 2019.
[52] Cf. Banaji and Bhat 2019.
[53] Cf. Jaursch 2019.
[54] Cf. Dotto, Smith, and Wardle 2019.
[55] Cf. Newman et al. 2019.
[56] Cf. Haagerup 2019.
[57] Cf. Phillips 2018.
[58] Cf. Illing 2020.
[59] Cf. Guess, Nagler, and Tucker 2019.
[60] Cf. Riedel 2019.
[61] Cf. Mackintosh 2019.