The Global Creative Community Stands Unified Against Unchecked AI Use

Generative AI (GenAI) has ignited a revolution in content creation, enabling anyone to generate images, music, text, and videos with just a few prompts. While this innovation offers exciting possibilities, it also raises significant concerns among creators about ethics, legality, transparency, and compensation.

Artists, writers, musicians, designers, photographers, and other creative professionals are grappling with the impact of AI on their respective industries, with perspectives ranging from concern to outright opposition. The overarching issue is that AI companies are profiting, without consent, from works that were never intended for their use. In response, creators around the world are demanding transparency, fair compensation, and the ability to exclude their works from AI training datasets. Here’s a deep dive into what members of the creative community and their advocates have to say about generative AI, and why so many are sounding off.

Key takeaways from the global creative community:

  • AI companies are misusing copyrighted works by using them to train their models without obtaining permission and falsely claiming fair use.
  • Creators around the globe have been impacted, from the uproar in the UK to lawsuits in Germany, and beyond.
  • AI executives are resigning over ethical and legal concerns about unregulated AI development.
  • The creative community is fighting back, demanding proper licensing agreements and full transparency.

AI companies are misusing copyrighted works by using them to train their models without obtaining permission and falsely claiming fair use.

“I am no longer certain of my future as an artist—a technology has emerged that represents an existential threat to our careers: generative artificial intelligence (Generative AI). Generative AI is unlike any tool that has come before, as it is a technology that uniquely consumes and exploits the innovation of others.”

artist Karla Ortiz, in her written testimony before the U.S. Senate Judiciary Subcommittee on Intellectual Property, July 2023.

“As Congress well knows, copyright is a fundamentally important right authorized explicitly by the U.S. Constitution, not a minor inconvenience that can be disregarded by downstream inventors or investors. Copyright is the means by which authors and publishers are incentivized to write, publish, inspire, and inform—crucial roles that are more essential than ever in the face of numerous, serious threats to democracy. Big tech wants a pass on the indiscriminate appropriations they have already undertaken [through AI scraping] and continue to press for their own gains. There isn’t a single, rational reason to accommodate them, but there are ample, critical reasons to protect the vitality of authors and publishers…”

Maria Pallante, Association of American Publishers (AAP) CEO; Mary Rasenberger, Authors Guild CEO; and Danielle Coffey, News/Media Alliance CEO, as co-authors of an April 2024 op-ed in The Hill.

“…generative AI is creating products that are competing directly with works created by The New York Times and… [using works] of other human writers and artists who are suing AI companies. This consideration is not only directly relevant to the fair use doctrine, since it impacts the assessment of the fourth factor, but it is also among the most serious concerns raised by generative AI. It should profoundly disturb not only authors, artists, and publishers, but also the general public.”

 — Mira T. Sundara Rajan, Kluwer Copyright Blog, February 2024.

“The most valuable data is yet to be made. Calling all AI training fair use removes incentives for copyright holders to continue creating and publicly sharing their work (writers, artists, musicians, coders). [This] results in worse [AI] models long term than countries or societies that embrace consent and compensation. Extremely naive to assume all the data we need is already out there to build super intelligence, the world evolves too fast. This is how we lose the AI race.”

Rohan Paul, Co-founder and CEO of Controlla, on LinkedIn, January 2025.


“There is no fair use precedent that legitimizes mass copying and exploitation of the expressive content of creative works by for-profit entities…an overly broad application of fair use to exempt unconstrained copying by AI companies could effect a potentially enormous transfer of value from the creators and owners of copyrighted works to the commercial entities that seek to exploit them.”

Jacqueline Charlesworth, Partner, Frankfurt Kurnit Litigation Group. Generative AI’s Illusory Case for Fair Use. SSRN, October 2024.


Creators around the globe have been impacted, from the uproar in the UK to lawsuits in Germany, and beyond.

“In late 2024, the UK government proposed changing copyright law to allow artificial intelligence companies to build their products using other people’s copyrighted work—music, artworks, text, and more—without a license. The musicians on this album came together to protest this. The album consists of recordings of empty studios and performance spaces, representing the impact we expect the government’s proposals would have on musicians’ livelihoods. All profits from the album are being donated to the charity Help Musicians.”

More than 1,000 UK musicians who joined the Is This What We Want? campaign, an album protesting the UK government’s proposed changes to copyright law, March 2025.

“I gave evidence on AI and the creative industries to the UK Parliament’s Culture, Media and Sport Committee… One thing we discussed was how the UK can lead in AI without upending copyright law and destroying the creative industries—which I’m very optimistic we can do.”

Ed Newton-Rex, CEO, Fairly Trained, in his December 2024 testimony before the UK Parliament’s Culture, Media and Sport Committee regarding the impact of AI on the creative industries.

“The German music sector has also joined the global protest against unauthorized mass AI ingestion of copyrighted works. At the end of 2024 and the beginning of 2025, German music royalties collecting society, GEMA, launched its legal campaigns against OpenAI and against Suno on behalf of its musician members. In November 2024, GEMA sued OpenAI in the Regional Court of Munich over the unlicensed reproduction of song lyrics in GEMA’s repertoire by OpenAI’s large language model, ChatGPT.”

Rachel Kim, Copyright Alliance VP of Legal Policy & Copyright Counsel, in a March 2025 blog post titled A Global Phenomenon: The Creative Community’s Viral Outrage Against AI Theft.

Read more about The Creative Community’s Viral Outrage Against AI Theft.


AI executives are resigning over ethical and legal concerns about unregulated AI development.

“I’ve resigned from my role leading the Audio team at Stability AI, because I don’t agree with the company’s opinion that training generative AI models on copyrighted works is ‘fair use’…I disagree because one of the factors affecting whether the act of copying is fair use, according to Congress, is “the effect of the use upon the potential market for or value of the copyrighted work”. Today’s generative AI models can clearly be used to create works that compete with the copyrighted works they are trained on. So, I don’t see how using copyrighted works to train generative AI models of this nature can be considered fair use.”

Ed Newton-Rex, CEO, Fairly Trained, in a November 2023 Music Business Worldwide article.

“I recently left the brilliant team at Liquid AI where I was CFO & VP BD to start a new venture. I left because there are critical problems that need to be solved at the intersection of AI, IP/data, copyright, and law…For many reasons my colleagues and I felt it was important to start sharing some of what we’ve been working on… to start: the notion that models do not memorize and regurgitate copyrighted information that they’ve trained on is demonstrably false. And yet, this is still a point being contested in courts across the country today. If you’re interested in how it’s possible to detect and reproduce the copyrighted content trained on by a model, start by reading Suchir [Balaji’s] post and the portions where he discusses entropy. And credit to him for the courage to surface this information & make it available for the public.”

Louis Hunt, former CFO and VP of Business Development at Liquid AI.

Note: Tragically, Suchir Balaji, who left OpenAI over ethical concerns regarding copyright and AI, was found dead in his apartment in late 2024 at 26 years old, not long after leaving the company and speaking out in a candid interview with The New York Times in October 2024. In that interview, Balaji discussed the enormous amounts of internet data he had helped gather and organize to train OpenAI’s ChatGPT.
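
Hunt points readers toward the technical argument that models can memorize and regurgitate their training data. As a purely illustrative aside, the sketch below shows one widely discussed signal of memorization: a language model assigns unusually low per-token cross-entropy (it is barely "surprised") to long, distinctive passages it has seen verbatim during training. This is a generic sketch, not a reconstruction of Balaji's or Hunt's actual analyses; the use of the open-source transformers library and the small public GPT-2 model is an assumption made purely for demonstration.

```python
# Illustrative sketch only (assumes the Hugging Face `transformers` library and
# the public GPT-2 model): score how "surprised" a language model is by a passage.
# Unusually low average cross-entropy on a long, distinctive passage is one common
# statistical hint -- not proof -- that the passage may have been memorized.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def mean_token_cross_entropy(text: str, model, tokenizer) -> float:
    """Average negative log-likelihood (nats per token) the model assigns to `text`."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)  # labels are shifted internally for next-token loss
    return out.loss.item()

if __name__ == "__main__":
    tok = AutoTokenizer.from_pretrained("gpt2")
    lm = AutoModelForCausalLM.from_pretrained("gpt2")
    lm.eval()

    candidate = "A long, distinctive passage you suspect appeared in the training data..."
    control = "A freshly written paragraph of comparable length and style..."

    # A much lower score on the candidate than on comparable control text
    # suggests (but does not prove) that the candidate was seen during training.
    print("candidate:", mean_token_cross_entropy(candidate, lm, tok))
    print("control:  ", mean_token_cross_entropy(control, lm, tok))
```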

Read more about the crucial role these AI Whistleblowers played in sparking public discussion.


The creative community is fighting back, demanding proper licensing agreements and full transparency.

“Publishers must have control of how and where their copyrighted material is used. As AI tools proliferate, we have to ensure that they are used responsibly. AI can be a powerful tool to empower journalists and to better inform communities across America, but if used recklessly it can undermine the health of the organizations that dedicate substantial resources to creating news and other media content for the American people. News media and AI-based entities can both continue to exist and thrive, but for that to happen sustainably, they must do so in cooperation, with publisher rights fully respected.”

Danielle Coffey, President and CEO, News/Media Alliance, March 2025.

“[I]f we are going to live in a world with responsible, respectful, and ethical AI then it is essential that strong and effective transparency rules are in place in the United States and abroad that protect the creative community from infringement and misuse of their works.”

 Keith Kupferschmid, Copyright Alliance CEO, June 2024.

“It is possible to be pro-AI and pro-copyright, and to couple AI with respect for creators. Responsible AI starts with licensing, and in developing this license, CCC enables users to efficiently gain access to a consistent set of rights across many rightsholders and returns royalties to rightsholders as compensation for use of their works.”

Tracey Armstrong, President and CEO of the Copyright Clearance Center (CCC), in a Publishing Perspectives article, July 2024.

“You get young guys, girls, coming up, and they write a beautiful song, and they don’t own it, and they don’t have anything to do with it. And anyone who wants can just rip it off. The truth is, the money’s going somewhere … Somebody’s getting paid, so why shouldn’t it be the guy who sat down and wrote Yesterday?”

singer/songwriter Sir Paul McCartney, The Guardian, January 2025.

“[The] wheels are in motion to allow AI companies to ride roughshod over the traditional copyright laws that protect artists’ livelihoods. This will allow global big tech companies to gain free and easy access to artists’ work in order to train their artificial intelligence and create competing music. This will dilute and threaten young artists’ earnings even further. The musician community rejects it wholeheartedly.”

 — singer/songwriter Sir Elton John, The Guardian, January 2025.

“Society cannot afford to allow multi-billion-dollar corporations to exploit the creative works of others without consequence. Such a permissive environment would stifle innovation, erode the value of copyright, and ultimately harm our society’s cultural and economic fabric. Gen AI must be subject to the same copyright laws as humans. To do otherwise would prioritize corporate profits over individuals’ rights and the creative ecosystem’s integrity.”

David Atkinson, Unfair Learning: Gen AI Exceptionalism and Copyright Law, SSRN, page 37, October 2024.

“From struggling to provide coherent answers to questions about where their training data comes from, to reluctantly admitting that training AI would be impossible without using copyrighted materials, to openly urging the government to make it easier to use content for AI training—by now, OpenAI is not even attempting to conceal the fact that its wonder-machines are built on questionably-obtained content, reaffirming that once again in its recent proposal submitted to the White House Office of Science and Technology Policy.”

Theodore McKenzie, Head of Content, 80 Level Blog, March 2025.

“In the end, perhaps generative AI will take the place of humans and produce more works at a faster rate than we ever could, but that’s not a future most humans want to see. What most humans want AI to do is to make their dinner or clean their homes so that they have more time to create—not the other way around.”

Kevin Madigan, Copyright Alliance SVP, Policy & Government Affairs, in Generative AI Licensing Isn’t Just Possible, It’s Essential, November 2024.

Read more about how Requiring AI Transparency Won’t Destroy the Trade Secrets of AI Companies.


Where do we go next?

Governments worldwide are beginning to respond. Lawsuits against AI companies are mounting, and tech firms are under increasing pressure to develop fair and legal AI policies. However, the legal landscape is still catching up to the speed of AI development, leaving many creators uncertain about their future.

Generative AI is not going away. As the technology evolves, creators will continue to face new challenges and opportunities. But creators are not asking for AI to disappear—they are asking for a system that respects their rights, compensates them fairly, and ensures human creativity remains at the heart of art, music, literature, and beyond.

The key to a fair AI future lies in balancing innovation with ethical responsibility. What happens next will determine the future of creativity in the digital age.


If you aren’t already a member of the Copyright Alliance, you can join today by completing our Individual Creator Members membership form! Members gain access to monthly newsletters, educational webinars, and so much more — all for free!
