Deepfakes in the Courtroom: Evidentiary Challenges under the Bharatiya Sakshya Adhiniyam, 2023 — A Cyber Law Perspective


Tightly linked to authenticity is the problem of chain of custody. In traditional evidence law, the chain of custody must remain intact to guarantee that the evidence has not been tampered with. With deepfake technology, however, manipulation is possible at several stages: creation, transmission, storage, and presentation. Even a small discrepancy in documentation can be exploited by the defence to claim that the evidence has been altered or substituted. The legal community has come to recognise that the concept of chain of custody must extend beyond physical control to cover digital provenance, metadata trails, and cryptographic verification.
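In practice, the cryptographic verification mentioned above often amounts to recording a file's hash fingerprint at every handoff: if the hash is unchanged, the file is bitwise identical. A minimal sketch in Python (the field names and handler labels are illustrative, not drawn from any statute or standard):

```python
import hashlib
import json
from datetime import datetime, timezone

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def custody_record(path: str, handler: str) -> dict:
    """Build one chain-of-custody entry: who held the file, when, and its hash."""
    return {
        "file": path,
        "handler": handler,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sha256": sha256_of_file(path),
    }

# Each transfer of the evidence appends a record; identical "sha256"
# values across records show the file was not modified between handoffs.
```

Because any single-bit change to the file produces a completely different digest, comparing the hash at seizure with the hash at presentation is a simple, court-explainable integrity check.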
Another key issue is the so-called liar's dividend, which has already become a significant phenomenon in cyber law. It refers to the ability of offenders to dismiss genuine evidence by claiming that it is a deepfake. This places an inverted burden of proof on even authentic recordings, which come to be viewed with suspicion. As courts adopt such a cautious stance, litigation becomes more complex and protracted, delaying justice. This directly affects the presumption of innocence and the beyond-reasonable-doubt standard in criminal trials.
The difficulties are not confined to evidentiary rules. Existing Indian legislation, including the Information Technology Act, 2000 and the Bharatiya Nyaya Sanhita, 2023, already covers such areas of deepfake abuse as impersonation, fraud, and privacy breach, while the Bharatiya Nagarik Suraksha Sanhita, 2023 provides procedural protections. These laws, however, were drafted before the emergence of generative AI and therefore do not address the evidentiary challenges posed by AI-generated material. They penalise misuse but offer no clear standards against which such evidence can be judged in court.
A comparative study of other jurisdictions offers valuable lessons. In the United States, courts apply the Daubert standard, which directs judges to serve as gatekeepers and determine whether scientific evidence is reliable before it can be admitted. This approach is especially helpful for deepfakes because it prioritises expert testimony and forensic validation over visual credibility. The American system acknowledges that technological evidence should not be accepted at face value but evaluated against scientific criteria.
The European Union has taken a broader regulatory approach through the EU AI Act and the General Data Protection Regulation (GDPR). These frameworks require transparency for AI-generated content, obliging creators to disclose and label synthetic media. By ensuring that users are informed when content has been manipulated, the EU indirectly enforces evidentiary reliability. This model underscores the importance of combining technology regulation with evidentiary law.
China offers another strategy: strict regulation and enforcement. Its legislation requires deepfake content to be labelled and holds platforms liable for failing to detect and remove such content. Although this model is effective in combating misuse, it raises concerns about state overreach and free speech. Its emphasis on technological protections such as watermarking and traceability has nevertheless produced workable solutions that India can adapt.
From a cyber law perspective, it is evident that India needs more than its existing reactive framework. The first step should be the introduction of a dedicated deepfake law that explicitly defines AI-manipulated content and imposes liability for its abuse. Secondly, the Bharatiya Sakshya Adhiniyam, 2023 should be amended to incorporate AI-specific provisions, especially on authentication and admissibility. In cases involving suspected deepfakes, courts should assume a gatekeeping role similar to that under the Daubert standard, making expert verification mandatory.
Moreover, forensic infrastructure is in dire need of reinforcement. Standardised detection methods, specialised AI forensic labs, and judicial training programmes are necessary to ensure that digital evidence can be properly evaluated in the courts. The law should embrace technological safeguards such as blockchain-based evidence tracking, digital watermarking, and metadata preservation. As practitioners have observed, basic measures such as locking down original devices, calculating cryptographic hashes, and preserving platform logs can go a long way towards increasing evidentiary reliability.
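The tamper-evidence that blockchain-style tracking provides can be illustrated with a simple hash chain, in which each log entry commits to the hash of the entry before it, so altering any past entry invalidates every later one. A minimal sketch (the entry fields are illustrative assumptions, not a prescribed format):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first link in the chain

def chain_hash(prev_hash: str, entry: dict) -> str:
    """Hash an entry together with the previous hash, linking the log."""
    payload = prev_hash + json.dumps(entry, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_entry(log: list, entry: dict) -> None:
    """Append an entry whose hash commits to the entire prior history."""
    prev = log[-1]["hash"] if log else GENESIS
    log.append({"entry": entry, "hash": chain_hash(prev, entry)})

def verify_log(log: list) -> bool:
    """Recompute every link; any altered entry breaks all later hashes."""
    prev = GENESIS
    for item in log:
        if chain_hash(prev, item["entry"]) != item["hash"]:
            return False
        prev = item["hash"]
    return True
```

A court or examiner can rerun `verify_log` at any time: a passing check shows the custody history is exactly as originally recorded, while any retroactive edit is immediately detectable.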
Safeguarding constitutional rights is equally critical. The use of AI-altered evidence raises serious questions about the right to a fair trial and due process. Courts must strike a balance between preventing the abuse of deepfakes and ensuring that valid evidence is not rejected out of excessive scepticism. This requires the formulation of clear judicial principles on how to handle disputed AI evidence.
To sum up, deepfakes are among the most pressing challenges to evidentiary law in the digital age. They undermine the credibility of electronic records, weaken procedural protections, and jeopardise the fairness of trials. Although the Bharatiya Sakshya Adhiniyam, 2023 marks progress, it is not equipped to handle the intricacies of AI-generated evidence. By drawing on the experience of foreign systems and adopting a proactive, technology-informed strategy, India can revamp its evidentiary legislation to meet future requirements.
The legislation must move beyond merely admitting electronic evidence to actually verifying its authenticity. In an era where reality itself can be fabricated, the role of the judiciary must shift from passive acceptance to active investigation. Only under these conditions will the legal system be able to preserve its most important value: the pursuit of truth.
