Media Manipulation, Cheap Fake Videos, and Politics in Colombia

The dynamics of political communication in democratic societies have changed as digital technologies have spread around the world and, in particular, as social media platforms have transformed the speed and scale of information flows. Changes in the distribution, production, and consumption of media content have generated a greater abundance of information of all kinds (dubious, false, or reliable) that circulates rapidly among different publics and audiences without editorial or quality filters. In Colombia, for example, the widespread use of social media and instant messaging applications has shaped political campaigns over the last decade and increased the polarization of public discourse. The overwhelming dissemination of misleading content on social networks during the 2016 plebiscite, and the abundant circulation of false information to discredit politicians during the 2018 and 2022 presidential elections and the 2019 local elections, are examples of how these new dynamics of political communication affect democratic processes of deliberation and consensus building.

Information disorder and media manipulation

In a report published by the Council of Europe, Claire Wardle and Hossein Derakhshan (2017) coined the term information disorder to describe the complexity and variety of problematic information circulating in contemporary media ecosystems. According to these researchers, misleading and false content, ranging from news articles to videos to memes, can be classified into three types according to its intentionality: disinformation, misinformation, and malinformation.

Recognizing the difference between these three types of information is useful. It allows us to understand some of the political, financial, and psychological motivations that drive the creation and propagation of problematic information, and to evaluate the content and messages that circulate through different platforms and social networks, including memes, news, videos, and audio. Following Wardle and Derakhshan (2017), we can define disinformation as false information deliberately created to attack a person, a social group, an organization, a political party, or a country. Misinformation is information that, although false, has not been created with the intention to cause harm. Malinformation, on the other hand, is information that is true but is intentionally used to cause harm.
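Put another way, the typology rests on two questions: is the content false, and is it intended to cause harm? As a purely illustrative sketch (not part of Wardle and Derakhshan’s framework; the function and label names are hypothetical), those two axes can be encoded as follows:

    from enum import Enum

    class InformationDisorder(Enum):
        """The three types of information disorder (Wardle & Derakhshan, 2017)."""
        DISINFORMATION = "false content deliberately created to cause harm"
        MISINFORMATION = "false content shared without intent to harm"
        MALINFORMATION = "true content intentionally used to cause harm"
        NONE = "true content shared without intent to harm (outside the typology)"

    def classify(is_false: bool, intends_harm: bool) -> InformationDisorder:
        # Map the framework's two axes -- falseness and intent to harm --
        # onto the three categories defined above.
        if is_false and intends_harm:
            return InformationDisorder.DISINFORMATION
        if is_false:
            return InformationDisorder.MISINFORMATION
        if intends_harm:
            return InformationDisorder.MALINFORMATION
        return InformationDisorder.NONE

    # Example: a doctored video spread to discredit a candidate is disinformation.
    print(classify(is_false=True, intends_harm=True))  # InformationDisorder.DISINFORMATION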

Disinformation, in particular, has become one of the most effective and unethical strategies used in contemporary political communication to attack and damage the reputation of political parties, candidates, and governments. In recent years, disinformation campaigns have been deployed during electoral processes in several countries, altering deliberative processes by polarizing public discourse and affecting citizens’ democratic decision-making. These campaigns can be understood as operations executed in a coordinated manner by different groups and actors with the aim of attacking adversaries, taking advantage of the affordances of digital technologies to both produce and distribute misleading content. Researchers such as Joan Donovan (2019) have called this use of technologies “media manipulation” to emphasize the intentional modification of texts, images, sounds, and videos, as well as the alteration of the algorithmic recommendation systems of digital platforms and social networks. Media manipulation is a socio-technical process carried out with the intention of capturing the attention of audiences and influencing public discourse.

A typology of manipulated videos in Colombian political communication

Nowadays, media manipulation is executed in multiple formats, including photographs, audio, texts, videos, and memes. People with a wide variety of skills can produce media content using accessible software and devices such as cell phones, computers, and tablets. Moreover, this content is easily and quickly redistributed by many users through social networks, digital platforms, and instant messaging applications.

Manipulated videos are one of the most widespread and viral formats, not only because of their persuasive ability to present altered evidence, but also because of how easily and cheaply they can be distributed on digital platforms. From videos of politicians giving speeches or going about everyday life to videos of police violence during street protests, manipulated audiovisual content is one of the most popular means of deceiving and confusing a wide public.

Journalists and academics have studied audiovisual manipulation and identified the different modalities it can take according to the techniques and materials used to alter videos. The Fact Checker’s Guide to Manipulated Video, produced by journalists at the Washington Post (2019), for instance, identifies three main forms of alteration:

  • taking images out of context,
  • deceptively editing the audiovisual material, and
  • deliberately and maliciously altering the content.

In Deepfakes and Cheap Fakes: The Manipulation of Audio and Visual Evidence, Britt Paris and Joan Donovan (2019) describe a wide spectrum of manipulated audiovisual content, ranging from deepfake videos, manufactured with advanced artificial intelligence (machine learning) technologies for image and sound processing, to cheap fake videos built with basic audiovisual editing tools that are easy to access and use.

The deepfakes / cheap fakes spectrum (Paris & Donovan, 2019)

In Colombia, manipulated videos have been used to attack political candidates, discredit the government in power, and generate confusion during socio-political protests. In order to understand the types of falsified audiovisual content used in Colombian political communication, together with the media literacies team of Universidad Javeriana’s Open Platform for Digital Citizenships (Plataforma Abierta de Ciudadanías Digitales), we conducted a review of some of the manipulated videos that have circulated over the last five years on social networks and instant messaging applications. Some of these videos have been evaluated by fact-checking initiatives such as ColombiaCheck and Factual, and have been taken down from mainstream social media platforms (although copies still circulate on some niche platforms). Others have been analyzed as part of disinformation operations (see Lombana-Bermúdez et al., 2022).

According to our initial analysis, and following the typology developed by Paris and Donovan (2019), the manipulated videos used in Colombian political communication can be classified into six categories based on the alteration technique used:

  • Lip-sync dubbing: videos of politicians are dubbed with audio that is not the original, changing the meaning of the message. For example, the video with dubbed audio of Iván Duque’s self-interview.
  • Juxtaposition of politicians’ faces in TV series: politicians’ faces are superimposed on characters from popular TV series. For example, the video of an episode of The Simpsons in which the face of Álvaro Uribe is superimposed on Homer’s face while he gives his innocence speech.
  • Face swapping and alteration (masking): faces in a video are replaced with or masked by the faces of politicians. For example, the video in which characters from the movie La Vendedora de Rosas appear with altered faces, masked by those of politicians from the Centro Democrático party; the face of the character El Zarco is masked with that of Federico Gutiérrez.
  • Alteration of speed: accelerating or slowing down the moving images and the audio of a video. For example, the video in which Gustavo Petro appears giving a speech in New York, with slowed-down segments that make him appear to be speaking slowly and slurring his words, as if he were drunk.
  • Recontextualization: videos are edited to alter their original meaning, typically by cutting, rearranging, or adding images from other videos. For example, the video known as “the hooded one” (“el encapuchado”), in which a public performance in support of peace at the National University of Colombia is edited to present Daniel Quintero Calle as a sympathizer of violence and vandalism.
  • Misrepresentation: an unaltered video is presented inaccurately, for instance with incorrect dates or locations that misrepresent its original context and potentially mislead the viewing public. For example, a 2017 video showing police officers forcibly entering a building in Maracaibo, Venezuela, was presented on social networks as if it were from the 2021 protests in Bogotá.

The six types correspond to videos made with easily accessible, low-cost technologies that do not require advanced computational processing or the use of artificial intelligence; hence, these videos fall on the “cheap fakes” end of the spectrum. They can be made with cell phone applications (VideoShow, YouCut, Video.Guru), the interfaces of digital platforms (TikTok, Instagram, YouTube), or video and image editing software (Photoshop, After Effects, Premiere, GIMP). Below is an infographic with still frames from the videos in each category.
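To give a sense of how low the technical barrier is, here is a minimal sketch of the “alteration of speed” technique using Python and the open-source ffmpeg tool. It assumes ffmpeg is installed and available on the system PATH; the file names and the 0.75 speed factor are hypothetical.

    import subprocess

    def slow_down(input_path: str, output_path: str, factor: float = 0.75) -> None:
        """Slow a clip to `factor` times its original speed by stretching the
        video timestamps (setpts) and lowering the audio tempo (atempo)."""
        subprocess.run(
            [
                "ffmpeg", "-i", input_path,
                "-filter:v", f"setpts={1 / factor}*PTS",  # video frames play slower
                "-filter:a", f"atempo={factor}",          # audio slows down (pitch preserved)
                output_path,
            ],
            check=True,
        )

    # Hypothetical usage: slow a speech clip to 75% of its original speed, the
    # kind of edit that makes a speaker sound sluggish or slurred.
    # slow_down("speech.mp4", "speech_slowed.mp4", factor=0.75)

A single command of this kind, or the equivalent slider in a phone editing app, is all that separates an original clip from a misleading one, which is precisely why Paris and Donovan (2019) place such videos at the “cheap fakes” end of the spectrum.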

References

Paris, B. & Donovan, J. (2019) Deepfakes and Cheap Fakes: The Manipulation of Audio and Visual Evidence. Data & Society, September 18, 2019. https://datasociety.net/library/deepfakes-and-cheap-fakes/

Donovan, J. (2019) The Lifecycle of Media Manipulation. Verification Handbook 3. https://datajournalism.com/read/handbook/verification-3/investigating-disinformation-and-media-manipulation/the-lifecycle-of-media-manipulation

Lombana-Bermúdez, A., Vallejo Mejía, M., Gómez Céspedes, L. & Pino Uribe, J. (2022) Cámaras de eco, desinformación y campañas de desprestigio en Colombia. Un estudio de Twitter y las elecciones locales de Medellín en 2019. Política y Gobierno, Vol. 29, Núm. 1.

Washington Post (2019) The Fact Checker’s guide to manipulated video. https://www.washingtonpost.com/graphics/2019/politics/fact-checker/manipulated-video-guide/

Wardle, C. & Derakhshan, H. (2017) Information Disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe report, DGI (2017) 9. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c
