
Generative AI Is Getting Sued. Here's Why You Should Pay Attention



Major generative AI companies are now facing legal challenges that could have big implications for anyone using AI tools that generate text, images, or code.

One of the most significant legal actions comes from stock photo company Getty Images. Getty is threatening to sue Stability AI, the company behind the Stable Diffusion image generation model. Getty claims millions of its copyrighted images were used to train the model.

Stability AI is also being sued by three artists who claim the company engaged in similar copyright infringement with their work. (Midjourney, another major image generation tool, is also named in the suit.)

GitHub, the code hosting platform, and its owner Microsoft are also being sued over GitHub’s Copilot tool. Copilot uses AI to generate code, and it was trained on code hosted in GitHub repositories.

The suit against the companies alleges that code in some of these repositories is protected by open source licenses, and that Copilot was trained on it in violation of those licenses. OpenAI is also named in the suit because its models power Copilot.

While these legal actions are just getting started, they’ve kicked off an aggressive public debate about the fair use of the content used to train AI models.

Many generative AI startups and companies are built on these foundational models. This means the outcomes of these legal actions could have a huge impact on the generative AI space—and any company using generative AI tools.

Here’s what to be thinking about as a marketing or business leader as these legal actions progress.

Always ask: Where is the training data coming from?

It’s a known issue that generative AI models are trained on data that the companies behind them likely never got explicit permission to use.

“Anywhere this occurs, there’s going to be this question of: ‘Where did they get the training data and did they have permission?’” says Marketing AI Institute founder and CEO Paul Roetzer.

But it’s not clear exactly what permission might be needed, which is what the lawsuits and legal actions seek to clarify.

OpenAI and others would argue there’s little difference between what they do and what a human does when creating original content.

“Their argument would be: What’s the difference if the AI goes and reads 10 articles and synthesizes its findings and it writes the original content based on those 10 things? What does it need to cite them for? It’s synthesizing learning, which is what we do as humans,” says Roetzer.

Understand that the legal issues here are far from settled

No matter which side you're on, this debate won't be resolved anytime soon.

Roetzer gives a hypothetical example:

If you wanted to paint like Picasso, you could just go study Picasso’s entire body of work and create an original image in his style.

“Am I plagiarizing? Am I stealing? Am I infringing on anyone’s copyright? I’m not saying I’m in that camp. I’m not saying I agree with it. I’m just saying this is the argument you’re going to hear,” he says.

The legal challenges may hinge on the fair use of content as it relates to monetization. Are AI models limiting someone’s ability to make money?

Today, Google largely gets around this because its search results link out to the websites whose content it surfaces, and those sites can still benefit financially from that traffic.

Generative AI outputs, as of today, provide no such links.

If ChatGPT recommends the best local restaurant for you by synthesizing information from 10 restaurant review sites, is it limiting those sites’ ability to make money in a way that violates fair use of their content?

Exercise caution moving forward

“As a brand, I would be very cautious about how you’re using these tools,” says Roetzer.

If you rely on APIs from a company that runs into serious legal trouble, you may be building your business on top of something you could lose access to down the road.

Investors should also be careful as they evaluate investments in the space since many generative AI startups are built on top of a handful of models and APIs.

The legal issues around generative AI are so new there aren’t easy answers. At every juncture, talk to your attorneys and get them involved: You’ll need their help to navigate this new normal.

How to figure this out faster

One way to quickly figure out how to take advantage of AI is by taking our Piloting AI for Marketers series: 17 on-demand courses designed as a step-by-step learning journey for marketers and business leaders to increase productivity and performance with artificial intelligence.

The course series contains 7+ hours of learning, dozens of AI use cases and vendors, a collection of templates, course quizzes, a final exam, and a Professional Certificate upon completion. 

After taking Piloting AI for Marketers, you’ll:

  1. Understand how to advance your career and transform your business with AI.
  2. Have 100+ use cases for AI in marketing—and learn how to identify and prioritize your own use cases.
  3. Discover 70+ AI vendors across different marketing categories that you can begin piloting today.

Learn More About Piloting AI for Marketers

 
