Artificial intelligence continues to draw interest from nearly every industry, with applications ranging from personalized shopping suggestions on social media platforms to office productivity programs with AI assistants. Its widespread adoption shows the technology has only begun to impact our everyday lives. However, its evolution will need to address some concerns, and quickly.
At CES 2025, the panel "AI and the Crisis of Creative Rights: Deepfakes, Ethics, and the Law" tackled the intersection of AI technology, intellectual property and ethics, a subject that the PR industry and communicators, particularly those working with content and content creators, should pay close attention to.
AI: Tool vs. Threat
Panel experts with backgrounds in law, government affairs, public policy and creative representation emphasized AI’s dual role: a powerful tool for enhancing creativity, but also a potential threat to human authorship and consent, particularly in generative AI’s replication of voices and images.
“AI has always been a tool that artists can use, but it can't be allowed to replace artists and their work,” said Dr. Moiya McTier, Senior Advisor for the Human Artistry Campaign, which advocates for ethical AI practices and supports anyone who uses creativity in their work, including teachers' unions, athletes, artists, photographers and more.
The group has also been working at the federal level on the “No Fakes Act,” which would protect individuals’ name, image and likeness from computer-generated recreations produced by AI and other technologies, including deepfakes.
“If someone makes a deep fake of you, or someone creates a digital replica of your likeness and puts it out there, there's now recourse [with the law],” McTier noted.
Lisa Oratz, Senior Counsel at Perkins Coie, noted that even though there have been negative stories surrounding digital replicas, there are also beneficial uses for AI.
“A lot of my creative clients, who are really embracing the use of AI, are constantly telling me how much it has really [helped] their creativity issues as a tool, even simple things like just making iterations easier,” she said. “You can [also] make content faster, easier and more affordable. You can do things to reduce barriers to entry and democratize content and smaller companies can more effectively compete with larger studios.”
Litigation, AI and Copyright
Every panelist stressed the importance of watching upcoming trials and legislation regarding generative AI and fair use. According to Chad Hummel, Principal, McKool Smith, there are currently over 40 cases in litigation, many focusing on fair use and copyright law. These cases will be exceptionally important for creators and social media platforms, many of which share stock photography, music, animations and GIFs created by others in their posts.
Given that copyright plays a large role in the future of AI, a common legal definition and consistent legislation are necessary. With large language models (LLMs) such as OpenAI’s ChatGPT being trained on existing internet content, two issues emerge for creators. The first, Oratz explained, is training: the law must decide whether using copyrighted works to train models is fair use or a licensing issue. The other issue, she noted, is output: what the genAI provides the user with. Does the output also infringe on copyright? And who is responsible, the platform or the user?
“Currently, the copyright office takes a narrow view of human authorship,” Oratz said. “And the copyright office will not grant protection if something is purely AI-generated.”
Hummel said anyone in media or creative industries should keep a close eye on the Reuters and New York Times cases, in particular.
“I’m sure eyes roll when you hear litigation,” Hummel said. “But these are all business issues that will define the economics for the industry for the next several decades. Anyone in media, entertainment, tech: this matters to you—not just lawyers.”
Paul Lekas, SVP, Global Public Policy, Software & Information Industry Association (SIIA), believes that while we won’t see a federal consensus on any act regarding copyright, fair use and AI, we will see lower courts, including state courts, drawing the lines.
“I work with companies that create content that is expensive to put together—academics, licensed paywall content, etc.,” Lekas said. “There are concerns about valuable data being used [to train] AI tools. It’s hard to find middle ground. Licensing is one area that makes sense to balance the two sides, but it doesn’t get to the real issues of NIL in the creative community.”
Duncan Crabtree-Ireland, National Executive Director of SAG-AFTRA, offered a harrowing personal experience: seeing a deepfake of himself as he helped negotiate the end of the writers’ and actors’ strikes in 2023, where one of the negotiation points involved the proliferation of AI and NIL.
“I don't think any of us should want to see a culture that's based on algorithmic output,” Crabtree-Ireland said. “There is something unique and special about… human culture that needs to come from the creative spark of humanity.”
Nicole Schuman is Managing Editor at PRNEWS.