Student at University of Allahabad, Uttar Pradesh, India
The expansion of Generative Adversarial Networks, or GANs, has created space for a new sort of synthetic media known as deepfakes. Our long-held conviction that anything we witness with our own eyes must be true is challenged by these audio and video recordings, which appear so real that it becomes difficult to distinguish reality from fiction. This study looks closely at the legal and ethical repercussions of deepfake technology. Even though it has legitimate creative applications, particularly in fields like entertainment and cinema, it is also turning into a potent instrument for harming people by violating their privacy and undermining public confidence. The study also draws attention to a rising threat to India's democracy: the use of phony audio or video to disseminate misleading information and sway public opinion during elections. By examining recent court rulings, such as those in the Anil Kapoor and Arijit Singh cases, this work focuses on how Indian courts are employing personality rights to combat personal data theft and digital impersonation. It also takes a serious look at whether our current legislation, namely the Information Technology Act, 2000, the Bharatiya Nyaya Sanhita, 2023, and the Copyright Act, 1957, is strong enough to deal with the challenges deepfakes pose. This paper argues that although our present laws offer some scattered answers for matters like cybercrime and defamation, they fall short when it comes to regulating AI-generated identity theft and impersonation as a distinct subject matter.
Research Paper
International Journal of Law Management and Humanities, Volume 9, Issue 1, Page 1550 - 1567
DOI: https://doij.org/10.10000/IJLMH.1111413
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) (https://creativecommons.org/licenses/by-nc/4.0/), which permits remixing, adapting, and building upon the work for non-commercial use, provided the original work is properly cited.
Copyright © IJLMH 2021