By - King Stubb & Kasiva on September 21, 2023
In the age of artificial intelligence (AI), our creative horizons have expanded exponentially. AI-generated works have not only transformed industries but also posed intriguing questions for copyright law. This article discusses the complexities surrounding the use of generative AI systems in copyright law, drawing insights from attempts to secure copyright protection for AI-generated works and recent cases where AI tools faced legal challenges over copyright violations.
Copyright law hinges on the notion of originality, where works must originate from human creativity. But what happens when AI, driven by vast datasets and intricate algorithms, creates content that challenges traditional definitions of creativity? This question became strikingly relevant in 2021 when an application for copyright registration of a comic book was submitted to the US Copyright Office. The comic book was a unique blend of text and images, partly created by a human artist and partly generated by an AI tool known as "Midjourney."
The US Copyright Office's response was clear—it refused to grant copyright protection to the AI-generated portion of the comic book. The rationale behind this decision was that works produced entirely by machines, devoid of any human creative input, do not qualify for copyright protection. This decision poses a fundamental query: Does creativity inherently require the involvement of a human mind?
Copyright law recognizes two distinct categories of works: primary works and derivative works. Primary works are entirely original creations, while derivative works are based on pre-existing subject matter but must exhibit substantial variation from their source material. Generative AI tools, such as language models and image generators, often scrape data from existing sources as part of their training process.
Consider the case of DALL-E, an AI model capable of generating images in the style of famous artists. To do so, it must have been trained on samples of copyrighted works created by various artists. This scenario raises a pertinent question: Are the outputs produced by AI tools best characterized as derivative works or merely copies of primary works?
AI tools, despite their data-driven origins, do not produce verbatim copies of their training data. Instead, they generate content based on their unique learning from this data. Consequently, one could argue that the outputs of AI tools are indeed derivative works, distinct from their source material.
Authorship lies at the heart of copyright law. Under India's Copyright Act, 1957, the author of a computer-generated work is defined as the "person who causes the work to be created." This definition, which is similar to that in the UK's Copyright, Designs and Patents Act 1988, contemplates works generated without human authors. The Indian courts, however, have taken a traditional approach to authorship.
In the case of Rupendra Kashyap v. Jiwan Publishing House Pvt. Ltd., the High Court of Delhi ruled that only natural persons, not artificial entities like the Central Board of Secondary Education (CBSE), could claim copyright authorship. This perspective, reaffirmed in Tech Plus Media Private Ltd. v. Jyoti Janda, has been upheld by subsequent judgments, such as Navigators Logistics Ltd. v. Kashif Qureshi & Ors.
In 2019, the Delhi High Court further solidified this stance by rejecting a copyright claim over a list compiled solely by a computer, emphasizing the absence of human intervention.
However, in 2020, the Copyright Office in India recognized an AI tool named Raghav as the author of an artwork it produced, alongside the developer of the AI tool. This recognition marked a significant departure from the traditional interpretation of authorship. Nonetheless, the Copyright Office later issued a withdrawal notice, indicating that the responsibility to inform the office about the AI tool's legal status rested with the applicant.
This legal ambiguity surrounding AI authorship poses substantial challenges for businesses striving to protect their intellectual property. The question remains: How can AI-generated content be appropriately attributed and protected under copyright law?
The use of AI tools in creative processes inevitably involves the use of training data, which may include copyrighted material. This raises the specter of copyright infringement. For instance, Getty, a prominent image licensing service, has filed a lawsuit against the creators of "Stable Diffusion," an AI tool alleged to have unlawfully copied and processed millions of copyrighted images.
In response, Stability AI's CEO has argued that generative AI "transforms" the work product, invoking the shield of fair use. The dispute between Getty and Stability AI, alongside a class action lawsuit filed by three artists against Stability AI, Midjourney, and DeviantArt for copyright infringement, highlights the pressing need to address the legal implications of AI-generated content.
Similarly, software developers brought a class action lawsuit against GitHub, Microsoft, and OpenAI, accusing GitHub's AI Copilot tool of being trained on unlicensed data. GitHub countered these allegations, asserting that the tool had been trained on publicly available code and that the developers failed to demonstrate any harm. This case underscores the importance of ensuring fair use and compliance with copyright laws when developing AI tools.
Under the Copyright Act of 1957 in India, certain uses of copyrighted works for "criticism or review" may not necessitate obtaining consent from the copyright owner. However, the success of a fair use claim hinges on whether AI-generated outputs can be deemed "transformative."
Courts determine transformation by evaluating whether the new work differs in character, serves a distinct purpose from the original, and is not merely a substitute for it. Developers of AI tools contend that the outputs, driven by user prompts and customized responses, are inherently transformative. This perspective finds support in cases like Google's legal battle over thumbnail images displayed in search results, where the court ruled that Google's use was "significantly transformative."
While AI brings immense potential to the legal landscape, it also raises legitimate concerns. AI systems can unwittingly perpetuate biases, leading to inequities and discrimination within the legal system. These systems can also be difficult to interpret, making it hard for a layperson to challenge or appeal their decisions. Furthermore, their reliance on vast amounts of data raises significant privacy concerns, particularly when that data is personal or sensitive in nature.
Accountability is another pressing issue. AI systems can make errors or produce unintended consequences, and holding them accountable for such mistakes can be a daunting task.
To navigate the intricate web of AI in copyright law, a balanced approach is imperative. Developers and businesses should ensure compliance with copyright laws by obtaining the necessary licenses for training data and giving due credit to content creators. Regular audits of AI systems can aid in defending against infringement claims.
Notably, Adobe has taken proactive steps by offering financial indemnity in case of copyright claims related to its new generative AI tool, "Firefly."
Furthermore, the Copyright Office's perspective on works created jointly by humans and machines remains uncertain. The legal analysis becomes more straightforward when AI-generated output substantially differs from the training data. In such cases, the individual who initiated the AI's output typically assumes ownership. However, complexities arise when the output closely resembles elements of the training data. Here, the copyright situation becomes less clear, necessitating further legal scrutiny.
In line with India's Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules of 2021, which mandate the appointment of compliance officers and regular reporting, the AI domain could adopt a comparable approach. AI firms could be required to designate compliance officers responsible for overseeing and enforcing copyright protection, conducting audits, and obtaining explicit permission from content owners before using their works for AI training programs.
This approach strikes a delicate balance between safeguarding the rights of copyright owners and fostering the growth and advancement of AI research.
The intersection of generative AI systems and copyright law presents a profound challenge to the legal community. As AI-generated works become increasingly prevalent, it is crucial to navigate this complex terrain with diligence and foresight. Striking a balance between protecting intellectual property and nurturing innovation in the era of AI remains an ongoing endeavor. The legal framework will need to evolve to address the intricacies of AI-generated content while preserving the principles of creativity and ownership central to copyright law.
Key takeaways:
- AI-generated works may not qualify for copyright protection if they lack human creative input, as determined by copyright authorities.
- Primary works are entirely original creations, while derivative works are based on pre-existing material but must exhibit substantial variation.
- Developers can ensure compliance by obtaining licenses for copyrighted training data, crediting content creators, and conducting regular audits of AI systems for adherence to copyright regulations.
 U.S. Copyright Office, Refusal Decision on Application for Copyright Registration of "Comic Book Title," June 1, 2021, available at https://www.copyright.gov/docs/zarya-of-the-dawn.pdf.
 1993 SCC OnLine Del 660.
 Tech Plus Media Private Ltd. v. Jyoti Janda 2014 SCC OnLine Del 1819.
 2018 SCC OnLine Del 11321.
 Getty v. Stability AI Ltd.
 J. Doe 1, et al v. GitHub Inc., et al, Case No. 22-cv-06823-JST.
 Perfect 10, Inc. v. Amazon.com, Inc., 508 F.3d 1146 (9th Cir. 2007).
U.S. Copyright Office, available at https://www.copyright.gov/comp3/chap300/ch300-copyrightable-authorship.pdf.
Gil Appel, Juliana Neelbauer & David A. Schweidel, "Generative AI Has an Intellectual Property Problem," Harvard Business Review.