Copyright in the Age of Artificial Intelligence: Safeguarding Authorship

By Alistair Lam ’27

The past few years have seen a dramatic increase in copyright lawsuits in the US related to artificial intelligence (AI). (1) Many parties have initiated legal challenges against generative AI (GenAI) startups like OpenAI and Perplexity AI; these range from record giants like Sony, UMG, and Warner, which have sued AI music generation services for infringement, to prominent media outlets like the New York Times and notable figures like writer Ta-Nehisi Coates, comedian Sarah Silverman, and politician Mike Huckabee, who allege that the unauthorized use of their written works to train ChatGPT exceeds the bounds of fair use. (2) (3) (4) Many of these plaintiffs argue that AI companies used their original creations, without permission, to train models that generate similar or even identical outputs. Garnering relatively less attention, but still very much in the spotlight, are cases in which artists try to enforce copyright in works that incorporate AI-generated material, such as those of Stephen Thaler and Jason M. Allen, both of whom created artworks with the help of GenAI. (5) (6) Importantly, these two categories of lawsuits are distinct: one involves the use (or misuse) of human-created works, while the other involves copyright claims in works containing AI-generated content. Reflecting on the original purpose of the intellectual property (IP) framework as a tool for balancing authors’ rights with social good, it becomes evident that authorship claims for AI should be limited in order to preserve the strength of IP rights for human creators.

AI poses a unique challenge to IP law because of the philosophical roots of IP itself. A few main frameworks can justify the existence of IP. (7) Two of them are human-centric: the theory of natural rights, pioneered by philosopher John Locke in the late seventeenth century, and the more contemporary theory of personhood, proposed by Margaret Jane Radin in the twentieth century. The theory of natural rights asserts that humans have an intrinsic right to reap the fruits of their intellectual labor, while the theory of personhood holds that control over one’s own intellectual product is an integral part of human existence and fulfillment. Since neither framework includes AI in its definition of the moral community, AI does not possess such natural or personhood rights. In March 2023, the US Copyright Office issued a statement of policy reaffirming its position, based on decades of legal precedent and judicial interpretation, that an “author” eligible for copyright must be a human being, excluding non-human AI agents. (8) This definition signifies that copyright is designed to protect novel ways of expressing ideas, which, at least given current GenAI technologies, are possible only in human works.

Hence, one turns to the other significant philosophical theory of IP’s value: utilitarianism. The US Constitution outlines a utilitarian view of IP in Article I, Section 8, Clause 8 (the “IP Clause”), which bestows upon Congress the power “To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.” From an economic perspective, the intangible ideas behind artistic and literary works, as well as scientific inventions, are public goods: they are non-rival (one person’s use of an idea does not diminish another’s ability to use it) and non-excludable (the creator cannot easily prevent others from accessing it). Without government intervention, such public goods tend to be underproduced in the market; that intervention may take the form of granting and enforcing IP rights. Thus, in an ideal world, IP rights, through patents and copyrights in particular, act as incentives for generating more creative works, processes, and technologies, benefitting society as a whole.

It might appear, if IP rights are so advantageous, that the definition of authorship in copyright should be expanded to include works ‘co-authored’ by humans and GenAI, or even works wholly ‘authored’ by GenAI. On the surface, AI-assisted or AI-generated works seem to fit most of the criteria for copyrightable material: just like human-made works, they can be “fixed in a tangible medium” and can meet standards of “originality” or “novelty,” especially when the content is substantially authored by humans. (7) However, a compelling argument exists against redefining authorship to be AI-inclusive: opening the floodgates to copyrighting AI works would upend the balance between protecting authors’ rights and promoting societal benefit. GenAI has already led to a “clogging” of the internet and social media with an overwhelming load of low-quality “junk” content. (9) (10) Giving GenAI and its users ‘authorship’ over this type of material would only diminish the value of copyright for the human artists and inventors who offer more useful and meaningful contributions, thus lowering society’s incentive for creation and defeating the purpose of copyright in the first place. Of course, there exist boundary cases in which human authors employ AI assistance to create works that still exhibit high levels of human creativity and value. A categorical rejection of copyright claims for any work containing GenAI material would therefore also be unreasonable. The law must strike a balance between guarding the fine line of authorship and avoiding the over-assignment of copyrights.

The legality of training AI models on human-made works should be considered in a similar light. Many plaintiffs claim that AI models generate look-alikes that infringe their copyrights because these imitations harm the value of the original works. One could argue that AI’s modified versions should be viewed as a kind of derivative work that equally deserves IP rights, since the AI-generated output introduces novelty despite drawing on existing works. However, this view is erroneous because it contradicts the definition of human authorship. Current GenAI models, at least those built on deep machine learning, generate output by computing probability distributions over possible next words or pixels, with weights learned from statistical patterns in the material on which they were trained. In other words, GenAI is, by the very nature of its training, recombining and copying preexisting human works. Hence, these models generate supposedly ‘original’ output without actually creating anything original; indeed, even cognitive scientists remain puzzled by the “black box” inner process by which large language models operate. By contrast, human artists and inventors mostly offer genuinely original contributions to their fields. Even in special cases such as parody, a type of derivative work, the human artist must understand the value of the original work and be clear about their aims in recasting it through a new lens. Outputs of GenAI should not receive copyright because they are mere combinations that borrow entirely from human sources without any original creation.
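To make the mechanism described above concrete, consider a deliberately simplified sketch. The toy ‘bigram’ model below, with its invented corpus, is an illustration only and bears no relation to any commercial system; it learns next-word frequencies from a small training text and then ‘generates’ by sampling from those frequencies. Every word it emits is drawn directly from its training material, the same dynamic that modern GenAI models perform at a vastly larger scale with neural networks.

```python
# Toy illustration (not any production model): a bigram "language model" that
# learns next-word frequencies from a tiny, invented training corpus and then
# generates text by sampling from those learned frequencies. Real LLMs use
# neural networks over vast datasets, but generation is likewise probabilistic.
import random
from collections import defaultdict

corpus = "the quick brown fox jumps over the lazy dog and the quick dog sleeps"
words = corpus.split()

# Count which words follow each word in the training text.
transitions = defaultdict(list)
for current, following in zip(words, words[1:]):
    transitions[current].append(following)

def generate(start: str, length: int = 8) -> str:
    """Generate text by repeatedly sampling a next word from observed frequencies."""
    output = [start]
    for _ in range(length):
        candidates = transitions.get(output[-1])
        if not candidates:  # no observed continuation in the training text; stop
            break
        output.append(random.choice(candidates))
    return " ".join(output)

print(generate("the"))  # e.g. "the quick dog sleeps" -- recombined training material
```

Every output of this sketch is, by construction, a recombination of the training corpus, which is the sense in which the paragraph above argues that GenAI output borrows entirely from human sources.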

Although the recent craze surrounding AI has opened exciting ground for a wide range of litigation strategies, whether artists are fighting to protect their copyrights against GenAI models or asserting rights in works made with those models’ help, the fundamental purpose underlying IP law remains highly relevant and applicable. The IP framework exists to reward the products of human intellect and creativity, and copyright specifically to motivate creators and artists to improve our lives with valuable reflections and daring imaginations. AI, albeit a groundbreaking technological advancement, does not in any way undermine the original aim of IP or its utility as a shield for societal progress. A successful implementation of the IP system in the age of AI thus requires a delicate balancing act that avoids stretching copyright beyond its intended scope. One solution might lie in creating a sui generis IP regime for AI-assisted and AI-generated works, similar to the existing regimes for databases and cultural heritage, which fall outside the category of traditionally copyrightable IP.

Endnotes

(1) Zirpoli, Christopher T. Generative Artificial Intelligence and Copyright Law. LSB10922, Congressional Research Service, 29 Sept. 2023, https://crsreports.congress.gov/product/pdf/LSB/LSB10922.

(2) “Record Companies Bring Landmark Cases for Responsible AI Against Suno and Udio in Boston and New York Federal Courts, Respectively.” Recording Industry Association of America, 24 June 2024, https://www.riaa.com/record-companies-bring-landmark-cases-for-responsible-ai-against-suno-and-udio-in-boston-and-new-york-federal-courts-respectively/.

(3) Reed, Rachel. “ChatNYT.” Harvard Law Today, 22 Mar. 2024, https://hls.harvard.edu/today/does-chatgpt-violate-new-york-times-copyrights/.

(4) Brittain, Blake. “Meta Hit with New Author Copyright Lawsuit over AI Training.” Reuters, 2 Oct. 2024, https://www.reuters.com/legal/litigation/meta-hit-with-new-author-copyright-lawsuit-over-ai-training-2024-10-02/.

(5) Small, Zachary. “As Fight Over A.I. Artwork Unfolds, Judge Rejects Copyright Claim.” The New York Times, 21 Aug. 2023, https://www.nytimes.com/2023/08/21/arts/design/copyright-ai-artwork.html.

(6) Prada, Luis. “AI Artist Is Unironically Mad That People Are Stealing His AI Artwork.” Vice, 4 Oct. 2024, https://www.vice.com/en/article/ai-artist-copyright-lawsuit/.

(7) Wagner, Polk. “Introduction to Intellectual Property.” University of Pennsylvania Carey Law School.

(8) Copyright Registration Guidance: Works Containing Material Generated by Artificial Intelligence. 37 CFR Part 202, Rules and Regulations, Federal Register, Vol. 88, No. 51, p. 16190. United States Copyright Office, 16 Mar. 2023, https://www.copyright.gov/ai/ai_policy_guidance.pdf.

(9) McMillan, Robert. “AI Junk Is Starting to Pollute the Internet.” The Wall Street Journal, 12 July 2023, https://www.wsj.com/articles/chatgpt-already-floods-some-corners-of-the-internet-with-spam-its-just-the-beginning-9c86ea25.

(10) Vincent, James. “AI Is Killing the Old Web, and the New Web Struggles to Be Born.” The Verge, 26 June 2023, https://www.theverge.com/2023/6/26/23773914/ai-larg.
