Use of Deepfakes in Political Surveillance and Disinformation

  • Mansi Chauhan, Student at Amity University Lucknow, India
  • Dr. Sukriti Yadav, Assistant Professor at Amity University Lucknow, India


Abstract

The rise of deepfake technology (synthetic media generated using artificial intelligence) poses unprecedented challenges to democratic institutions, legal frameworks, and human rights. This paper critically examines the dual role of deepfakes in political surveillance and disinformation campaigns, highlighting their potential to undermine electoral integrity, infringe privacy, and erode public trust in factual communication. Through a comparative legal analysis of responses in the European Union, the United States, and India, the paper identifies substantial regulatory gaps and ethical dilemmas posed by AI-generated deception. It explores how courts are beginning to confront issues of synthetic evidence, privacy rights, and platform accountability, and proposes a forward-looking framework grounded in transparency, consent, and cross-border enforcement. Ultimately, the study advocates a rights-based, interdisciplinary approach to mitigating the legal and democratic harms of deepfake misuse in political contexts.

Keywords

  • Deepfakes
  • Political Surveillance
  • Disinformation
  • Electoral Manipulation
  • AI Regulation
  • Privacy Law
  • Judicial Perspectives
  • Synthetic Media
  • Freedom of Expression
  • Comparative Law

Type

Research Paper

Information

International Journal of Law Management and Humanities, Volume 8, Issue 2, Pages 4044-4054

DOI: https://doij.org/10.10000/IJLMH.119511

Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license (https://creativecommons.org/licenses/by-nc/4.0/), which permits remixing, adapting, and building upon the work for non-commercial use, provided the original work is properly cited.

Copyright

Copyright © IJLMH 2021