

[The Marketing AI Show Episode 68]: Zoom’s AI Companion, DALL-E 3, Adobe’s New AI Features, and OpenAI Hits $1 Billion in Annual Revenue 



It was another interesting and busy week in the world of AI. Paul and Mike talk about tech updates with Zoom, Adobe, and DALL-E 3, and how OpenAI is on pace to reach $1 billion in revenue in the next 12 months. Wow! Tune in to learn more.

Listen or watch below, and keep scrolling for the show notes and transcript.

This episode is brought to you by our sponsors:

Use BrandOps data to drive unique AI content based on what works in your industry. Many marketers use ChatGPT to create marketing content, but that's just the beginning. BrandOps offers complete views of brand marketing performance across channels. Now you can bring BrandOps data into ChatGPT to answer your toughest marketing questions.

The AI for Agencies Summit is a virtual half-day summit happening on November 2. The AI for Agencies Summit is designed for marketing agency practitioners and leaders who are ready to reinvent what’s possible in their business and embrace smarter technologies to accelerate transformation and value creation. To register, go to AIforAgencies.com and use the code AIPOD50 to get $50 off your ticket.

Listen Now

Watch the Video


00:06:12 — Zoom’s AI Companion

00:15:31 — DALL-E 3

00:25:21 — Adobe Creative Cloud’s AI updates

00:37:56 — Replit AI for all

00:41:27 — AI that identifies strangers online

00:46:03 — OpenAI is on track to generate $1B in 12 months

00:48:32 — AI deepfakes in the music industry

00:51:02 — Meta’s new AI characters


Zoom releases an impressive AI companion for meetings

Zoom just dropped an impressive set of AI features that are likely to impact meeting productivity. These features are collectively known as AI Companion. The two main features are summaries, which are compiled by AI and emailed to you after a meeting, and a live AI Companion window that you can interact with to ask questions or get insights about the meeting in real time. Our team took it for a test drive, and we’re sharing our initial insights.

DALL-E 3 is about to change image generation—and prompting

DALL-E 3, OpenAI’s latest image generation tool, is rolling out to ChatGPT Plus users—and it could have a huge impact on how we prompt AI systems. Our team dove right in and thought the system’s initial outputs were impressive. It can generate high-quality images from scratch within seconds, right within ChatGPT. But, as Paul put it in a LinkedIn post, “it’s the system's ability to rewrite and improve the prompts that is potentially transformational.”
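This prompt rewriting also surfaces in OpenAI’s API: when you generate an image with DALL-E 3, the response includes the expanded prompt the model actually used. Here’s a minimal sketch, assuming the official `openai` Python SDK and an `OPENAI_API_KEY` environment variable (the helper function names are ours, for illustration):

```python
def build_image_request(prompt: str) -> dict:
    """Collect the parameters for a DALL-E 3 generation call."""
    return {
        "model": "dall-e-3",
        "prompt": prompt,  # a terse prompt; the model rewrites and expands it
        "n": 1,            # DALL-E 3 generates one image per request
        "size": "1024x1024",
    }

def generate_with_rewrite(prompt: str):
    """Call DALL-E 3 and return (image_url, revised_prompt).

    Requires the `openai` package and an OPENAI_API_KEY in the environment.
    """
    from openai import OpenAI

    client = OpenAI()
    resp = client.images.generate(**build_image_request(prompt))
    img = resp.data[0]
    # `revised_prompt` is the detailed prompt DALL-E 3 wrote for itself.
    return img.url, img.revised_prompt
```

Calling `generate_with_rewrite("image of the construction of the Great Pyramid")` would return both the image URL and a far more detailed prompt than the one you typed, which is the behavior Paul describes below.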

Adobe announces big AI updates across Creative Cloud

Adobe just announced magical new generative AI capabilities across its Creative Cloud suite of apps. So, if you’re an Adobe customer you need to update your apps and start checking out what’s possible. One of the biggest updates is the rollout of Adobe’s Firefly Image 2 Model, which now empowers users to generate even higher-quality images within Adobe products. Notably, the new model, like its predecessor, is designed to generate content that’s safe to use in commercial work. The model has some new text-to-image capabilities as well.

Generative Match allows you to generate new images in the same style as existing ones. You can also apply Photo Settings to generated images to adjust the depth of field, motion blur, field of view, and more just like you can do with a physical camera lens. And finally, Adobe also announced a “Content Credentials” icon that will be embedded into the metadata of content created using Adobe’s AI. The icon will show you which AI software was used to create it, as well as other information about the image.

There are more exciting technology and AI updates in the Rapid Fire section of the podcast, including OpenAI’s revenue forecasts, AI deepfakes in the music industry, and more. Listen, subscribe, and please leave a review!

Links Referenced in the Show

Read the Transcription

Disclaimer: This transcription was written by AI, thanks to Descript, and has not been edited for content.

[00:00:00] Paul Roetzer: I don't think the government agencies are structured to actually deal with the flood of this stuff. How do you keep up with it?

[00:00:07] Paul Roetzer: Welcome to the Marketing AI Show, the podcast that helps your business grow smarter by making artificial intelligence approachable and actionable. You'll hear from top authors, entrepreneurs, researchers, and executives as they share case studies, strategies, and technologies that have the power to transform your business and your career.

[00:00:27] Paul Roetzer: My name is Paul Roetzer. I'm the founder of Marketing AI Institute, and I'm your host. Welcome to episode 68 of the Marketing AI Show. I'm your host, Paul Roetzer, along with my co-host, Mike Kaput. Good morning, Mike. Morning, Paul. We are back on our Monday schedule. It is Monday, October 16th, 8 a.m. We're doing it a little early because I have to fly to D.C. today, after Mike and I had a whirlwind last week, I think.

[00:00:59] Paul Roetzer: I feel like every week we're starting off with, where were Paul and Mike last week? Where were you last week?

[00:01:06] Mike Kaput: I was in Florida.

[00:01:08] Paul Roetzer: Florida. Okay. Yes. Yeah. I was in Napa Valley, to Vegas, to Columbus. Wow. So I had this Sunday-to-Thursday stretch. It was really good though. I tell you, I don't know about you, but the thing that's so great to me...

[00:01:25] Paul Roetzer: I hate being away from my family. Like, I don't like the travel, quite honestly. I don't like being away at all. But it's invaluable to be in person with all these different audiences that we're meeting with. Like, last week for me was senior living facility owners on Monday. Wednesday was golf course owners.

[00:01:50] Paul Roetzer: And then Thursday was workforce and education in the state of Ohio, through Ohio Excels and the governor's office. And they're such diverse audiences, and you just start to really grasp where people are at with their understanding of this. And so rather than you and I just having these conversations on Mondays and hearing every once in a while feedback online, to actually be with hundreds or thousands of people in a, you know, a few-week stretch...

[00:02:17] Paul Roetzer: And hear the questions they're asking and see how they react to different technologies. You know, for me, it's just so helpful to understand how to position this stuff, how to explain it. And we've talked before about this, like, bubble we sort of live in where we're just, like, consuming this stuff. And honestly, if you're a listener to the podcast every week, you're staying up on a lot of these advancements. The average marketer, the average business professional, the average leader is not. Like, we get lost sometimes and assume everyone's figured this stuff out.

[00:02:49] Paul Roetzer: And I can tell you very confidently that the average person does not know this stuff yet, and it's just the basics and the intro-level stuff. I don't know. Are you seeing the same thing, Mike, when you're out doing these talks?

[00:03:03] Mike Kaput: Yeah, I couldn't have said it better. It's exactly that. No matter how diverse or different these audiences are, they come back to these commonalities of: they just don't fully grasp yet what's possible with this technology.

[00:03:15] Mike Kaput: And a lot of them just have not really done a deep dive on a lot of it yet.

[00:03:19] Paul Roetzer: Yeah. And I'll often poll the audience. You do the obvious, okay, who's tried ChatGPT routine. You get all the hands. Who uses it every day? You get a couple hands. Who's tried an image generation tool? You get, you know, half the hands. Who uses image generation daily? You get no hands. Like, who's tried video generation? They just stare at you.

[00:03:39] Paul Roetzer: Who's, you know... you can kind of work down the line of, like, the progression of this technology, and you start realizing they just don't know. And then the part that I love is sometimes you get just amazing questions. Like, once you connect the dots for people, they just start asking such fantastic questions about the bigger impact of AI.

[00:03:56] Paul Roetzer: So, I don't know. I mean, thanks to everyone who's having, like, an eye on these tours. Like, we've done a ton of talks. I think collectively we've probably done over a hundred this year. And it's just amazing to meet with people and hear where they're at with this stuff. So yeah. So that's... I don't know, D.C., I think I'm doing a government marketers event tomorrow.

[00:04:15] Paul Roetzer: So, just a quick trip to D.C., and then I'll be back, and then I just have a few left this year, hopefully, for the road. So, all right. So episode 68 is brought to us by BrandOps. Many marketers use ChatGPT to create marketing content, but that's just the beginning. When we talked with the BrandOps team, we were impressed by their complete views of brand marketing performance across channels.

[00:04:37] Paul Roetzer: Now you can bring BrandOps data into ChatGPT to answer your toughest marketing questions. Use BrandOps data to drive unique AI content based on what works in your industry. Visit brandops.io slash marketing AI show to learn more and see BrandOps in action. And then it's also brought to us by the AI for Agencies Summit, that is coming up fast.

[00:05:01] Paul Roetzer: We have about two weeks or so left. It's November 2nd, a half-day virtual summit. It's the inaugural AI for Agencies Summit. We've got an amazing agenda. It goes from noon to five Eastern time. So it's going to be everything you need to know. Like, I'm going to start with the AI-emergent agency.

[00:05:19] Paul Roetzer: Mike's going to talk about partner programs. I'm going to talk about intellectual property and copyright law. We're going to have five case studies of agencies, like rapid-fire 15-minute case studies from agencies. It's just going to be great. So we really hope everybody joins us. If you're interested, if you work in an agency, especially if you're a leader of an agency.

[00:05:37] Paul Roetzer: Check that out, November 2nd. That is AIforAgencies.com. That's A-I-F-O-R-Agencies.com. And you can use AIPOD50 for $50 off your virtual pass. All right, Mike, this should be a fun one, because this was, like, a really hands-on episode of getting to experiment with some of this technology. Some of these things we're going to cover today are things we've talked a little bit about before, hinted at in past episodes, but we actually got a chance to experience some of this technology ourselves in the last week. And so we're going to go into some of that, as well as some of the other big announcements from the last week.

[00:06:11] Paul Roetzer: So take it away, Mike.

[00:06:12] Mike Kaput: Awesome, Paul. So first up, Zoom just dropped an impressive set of AI features that are likely to impact meeting productivity. And these features are collectively known as AI Companion. The two main AI features here are summaries, which are compiled by AI and emailed to you after a meeting, and a live AI Companion window that you can interact with to ask questions or get insights about the meeting.

[00:06:40] Mike Kaput: In real time. So, Paul, you actually brought this to our team's attention, because you've been testing this live across some of your meetings in the past week. What are some of your initial thoughts on these features?

[00:06:55] Paul Roetzer: Yeah, so this is when I was traveling last week. I think I had a buffer day on Tuesday, maybe, in Vegas.

[00:07:00] Paul Roetzer: I think I flew from Napa to Vegas Monday night. And so I turned on Zoom on Tuesday for a meeting, and I saw it pop up at the bottom. If you haven't seen it yet, you have a summary option and then an AI Companion option. Now, the admin has to turn on these capabilities in your account. So if you don't have them, have the admin go in and turn them on.

[00:07:21] Paul Roetzer: And then there's a couple... there's very few options. It's like, for the summary, who gets it is one of the options. It's the host, or the host and all participants. I forget what the other one is, but it's like three radio button options. And then AI Companion is similar. There's like three radio button options.

[00:07:39] Paul Roetzer: So we turned it on for a meeting with Cathy and Noah and I on our team, and I wasn't really, honestly, sure what it was going to do. We were just kind of checking it out. I had yet to actually go in and research the announcement and things like that. So the AI Companion part pops up as a module, just like chat or participants would within a Zoom meeting, and it just sits on the right-hand side.

[00:08:02] Paul Roetzer: And it prompts you with, like, three sample prompts you can give it. "Catch me up" is one. So if you join late, it'll, like, write you a summary. I think there's one about your name being mentioned, and a couple other quick ones. So the thing that I found is, you can use the prompts they give you, and they work really well, but you can pretty much ask it anything. So it's not restricted to just the prompt examples they give you. So, for example, in a meeting, you can do "catch me up," which does a quick summary; "was my name mentioned," so I can jump in and it'll immediately tell me whether or not, you know, my name has been mentioned; "summarize the meeting so far"; "what are the action items so far"; "what's been discussed around a specific topic?"

[00:08:43] Paul Roetzer: "What are the action items for Mike so far?" So you can pretty much have a conversation with it. I don't know for a fact what tool they're using behind the scenes, what technology. I assume it's GPT-3.5 or 4, based on how it works. I don't think they built their own model to do this. But the companion, in my experience, was really good.

[00:09:05] Paul Roetzer: So it does a really quick, concise synopsis of what's going on. It would seem to be very accurate, very factual. So the outputs were fast, they were accurate, they were concise during the meeting, and then the post-meeting summary was incredible. So, you know, again, we all sit in meetings all the time, and someone takes notes, and someone tries to summarize it, and usually the notes in our case are just bullet points of what's going on. And normally what happens in our environment, and this is when I was running my agency for 16 years, and we run the Institute: multiple people are usually taking notes, and then everybody kind of, like, drops them into the same spot, or takes them in the same live Google Doc, or puts them into, like, HubSpot in our case, like call notes.

[00:09:46] Paul Roetzer: And so you have, like, redundancy of all these same notes, and no one actually summarizes the meeting. You just have a bunch of bullet points of what was talked about. So this does, like, a thorough synopsis of the meeting, automatically puts it into headings. So you get an email. In my case, it was an hour meeting; five minutes after the meeting ended, I got a summary email, and it was broken into headings with boldface.

[00:10:08] Paul Roetzer: You can click an edit button and go in and make edits. There were a few, like, misspellings of names and things like that that you could go in and fix. For example, we were talking about our MAICON conference. Well, when I was saying MAICON, it doesn't know how to spell it. So I go in and fix that. Little things like that.

[00:10:24] Paul Roetzer: But honestly, it's probably the best summary I've ever gotten of a meeting from a human. Like, it was thorough. It was good. So my initial thing here, and we can kind of dive a little bit more into it, is, I know Microsoft Teams is doing the same thing, or is going to be doing the same thing. I know Google Meet is going to do the same thing.

[00:10:45] Paul Roetzer: And so it was one of those where you have this experience of, oh, I can't go back now. Like, that is such a better meeting experience that any meeting that doesn't have that, that doesn't have the summary and doesn't have a companion, is going to feel inefficient and archaic. So we're using summary right now on this podcast.

[00:11:03] Paul Roetzer: We're going to test. I'm taking this summary and using it to develop a blog post that summarizes the podcast. So, once you have that feeling... the first time I used ChatGPT, I was like, oh, I can't imagine having to go find templates again. I just want to prompt it. Or Google search, which sort of feels obsolete to me now, having to click through 10 links.

[00:11:23] Paul Roetzer: Once you experience a tech and your feeling about that activity just changes, that's when you know the tech is getting quite good. And so for me, the big moment with this one was, we keep waiting to find out: is Microsoft 365 Copilot going to be what it appears it has the potential to be? Is, you know, Google Workspace AI going to be a true transformational agent within productivity?

[00:11:51] Paul Roetzer: And this was one of the first experiences outside of ChatGPT where I used this tech and thought, oh, that's good. Like, that's good in its current form. It doesn't even need rapid improvement and another version of a language model. So I was honestly really impressed with the simplicity, the ease of use, and the quality of the outputs.

[00:12:13] Paul Roetzer: Yeah, I like that

[00:12:13] Mike Kaput: point you made about, you know, meetings that don't use this tech moving forward will be inefficient and archaic. So, in your mind, what does the future of meetings look like now that we have this, now that we might be able to assume we'll have something even better in coming years? Can you unpack that for us?

[00:12:33] Paul Roetzer: Yeah, I mean, I think it's just going to be really the couple key things we just talked about. I'm sure there can be other features added. Again, it depends how creepy you want the stuff to get. I'm sure you could layer over, like, a computer vision model that's looking at faces and expressions and looking for, you know, are people paying attention?

[00:12:49] Paul Roetzer: You could imagine other forms of AI that you could add on top of this, these other layers. But just this, knowing how prevalent Zoom meetings and Google and Microsoft meetings are, how many virtual meetings are happening every day. Just this could drive so much efficiency in those meetings, and the note-taking and the summarization afterwards and the action items.

[00:13:13] Paul Roetzer: At the end of the summary, you get an email; it has action items at the end. Now, mind you, as we were going through the meeting, I didn't say, okay, Zoom AI Companion, action item one. Like, I wasn't telling it that. It interpreted action items based on the words we were using, and who the action item was for, you know, who was responsible for the action item.

[00:13:32] Paul Roetzer: So, I mean, that's my initial feeling right away. This goes back to: are companies ready for this? So you turn this technology on. It's free, by the way, like it's included in your Zoom license. So it's part of your paid package; you don't have to pay more for it. So does it affect anyone's job? Is someone sitting in meetings to do this exact task?

[00:13:54] Paul Roetzer: If you're in a big enterprise, there's a chance you have an administrative assistant whose job is partially to sit in these meetings and develop action items. And okay, well, that person's job just got affected. And so I think that's the key for me: when you look at this tech, and it's quite good, what does that mean for us?

[00:14:11] Paul Roetzer: It means, oh, cool, we might be able to use those summaries of our podcast each week as the summary blog post. Like, we may have just saved ourselves an extra 30 minutes each week on the podcast, if this summary functions in that way. Other than that, it's just guiding the team: okay, here's the process for when you turn this on. Make sure the people you're in a meeting with understand the technology you're using, because they may not be aware what AI Companion is.

[00:14:35] Paul Roetzer: And when they see it's turned on, they're going to ask, what is the AI Companion? So you have to, like, have a process to educate people and put some usage policies in place for your team of how this stuff is used. But other than that, I think it just becomes kind of a seamless component. Again, I go back to, like, pre-pandemic.

[00:14:52] Paul Roetzer: When you got on a Zoom, some of us may forget this, but there wasn't really a clear protocol of, are we turning our videos on or aren't we? And it was almost impolite to assume the other person was going to be okay having their video on, and to automatically turn video on when people would get on.

[00:15:08] Paul Roetzer: And so then during the pandemic, it just seemed, oh, okay, these are actually video calls. Like, it's weird to not have your video on. And I think we'll probably go through that really quick, awkward phase of, like, oh, you're turning your AI Companion on? And then, like, within a month or two, everyone's just going to have their AI Companion on.

[00:15:24] Paul Roetzer: And it's just going to be a normal thing. So in our

[00:15:29] Mike Kaput: second big topic for today, DALL-E three, which is open AI's latest image generation tool is rolling out to chat GPT plus users. And this could have a huge impact on how we actually prompt AI systems. So DALL-E three is. Outputs are really impressive. It can generate high quality images from scratch within seconds, and it can do it right within chat GPT.

[00:15:57] Mike Kaput: Now, Paul, however, as you put it in your LinkedIn post, there's another, bigger thing going on here. You said, quote, it's the system's ability to rewrite and improve the prompts that is potentially transformational. By which you mean it is actually adding to and rewriting the initial prompt you put in, to get better output.

[00:16:18] Mike Kaput: So can you walk us through that?

[00:16:20] Paul Roetzer: Yeah, so I've got it up on the screen, if you're watching the YouTube video. So again, we publish our podcast each week as a YouTube video. Also, we cut it up into each segment. So if you have never checked out the YouTube channel, go check out the Marketing AI Institute YouTube channel.

[00:16:35] Paul Roetzer: And we've got these up. But the thing that was incredible to me, as you're saying, is all I gave DALL-E 3 in ChatGPT was "image of the construction of the Great Pyramid." That was the entire prompt. And it took, I don't know, 15, 20 seconds or so. And then it output the four images, and they were amazing.

[00:16:56] Paul Roetzer: And below it just gave like a quick overview. It says here are the images depicting the construction of the Great Pyramid of Giza. And then it has like a little caption. So the first one is a photo representation showing workers pulling limestone blocks on sledges with a partial pyramid structure in the background.

[00:17:11] Paul Roetzer: That's the caption, but when you click into it, the prompt that the machine actually gave is quite expansive. So, here's the prompt that it wrote for itself to generate this image: "Photo depicting the construction of the Great Pyramid of Giza. Workers dressed in ancient Egyptian attire are seen pulling massive limestone blocks on sledges. Wet sand is visible in front of the sledges, suggesting the method used to ease the transportation..." And then it keeps going on.

[00:17:39] Paul Roetzer: And so I was like, oh my gosh, that's crazy. Like, I knew it was supposed to head in this direction. The second one: "Illustration showcasing a bird's-eye view of the Giza plateau during the construction of the Great Pyramid. Numerous workers are actively involved in different stages of construction."

[00:17:54] Paul Roetzer: "Some are pouring limestone while others are transporting blocks using..." and it just keeps going. And so that was the thing that to me became quite impressive and alluded to where this all goes. And the reason I say that is, when we first started having ChatGPT and DALL-E 2 and Midjourney, it was apparent that the main friction point to get quality outputs was your ability to do a prompt.

[00:18:25] Paul Roetzer: And so what we said all along was, if you're a writer, you're going to be better at prompting writing tools, because you can explain things. You can give guidance as though you were talking to an intern or a junior writer. If you're a graphic designer, you can do better with image generation tools, because you can explain, and you can have a better vision for what the output should be.

[00:18:47] Paul Roetzer: So my wife as an artist could give a way better description to an image generation tool than I could as a writer. Video production, the same thing, audio production. So your expertise and your domain knowledge actually allowed you to get more value out of the tools because you could create a better prompt.

[00:19:05] Paul Roetzer: Now, you could go take some prompting classes and learn some techniques around it, but at the end of the day, someone who is highly visual, who is a creative person in that space, can use better descriptions. So if I wanted to give a prompt to, say, Runway's Gen-2, I'm not a video producer. I can't tell it what kind of shot to take.

[00:19:27] Paul Roetzer: I can't put myself in the chair of the director and explain all this stuff. So once you see this, it's like, okay, we knew that the friction point was prompting, and that these language model companies and SaaS companies had every motivation in the world to remove that friction point. If they could use AI to enhance prompts, then more people would use the tools and get greater value.

[00:19:51] Paul Roetzer: So it was inevitable that this was going to happen. And we saw it starting in phases. Like, when Gen-2 from Runway came out, that was one of the features: it would enhance the prompt for you, but not at this level. And so when you look at these enhanced prompts, it's incredible. And it leads to way better outputs from the system.

[00:20:10] Paul Roetzer: So now, as a user, I'm far more likely to use DALL-E 3, because I don't have to be good at prompting. I just have to give it an overall sense of what it is I'm looking for. And then I can pick an image and kind of iterate from there. So it's kind of that democratization of creativity, to where anybody could get value from these tools, regardless of their background.

[00:20:32] Paul Roetzer: And that changes things.

[00:20:35] Mike Kaput: So, yeah, talk to me a little bit more about those changes. What does that mean for a marketer or business leader kind of trying to both use and adopt this technology for themselves, but more broadly in their organization? Does prompt engineering still matter as a skill you want to train for and invest in?

[00:20:53] Mike Kaput: How does that work?

[00:20:55] Paul Roetzer: So, I've never seen prompt engineering as a career path. I always saw it as a skill set, and I think that that is more true today than it was yesterday, basically. It still matters. There's still, you know, when you're working with the language models, giving it guidance of who it is, to say, you're a researcher, you're an entrepreneur, you're a dentist. Like, to give it those guardrails, to tell it to think step by step.

[00:21:20] Paul Roetzer: We've actually seen recently some research saying that one of the best things you can do for these language models is tell it to take a breath and think about it step by step. For some reason, the same psychology that works on humans works on these language models, and it creates better outputs. So, again, because these models don't come with instruction guides, you don't even really know the best way to prompt them when they first emerge.

[00:21:46] Paul Roetzer: There is always going to be a bit of an art and a science to prompting. I don't see that going away. But what this does is it probably rapidly accelerates the adoption curve of this technology, because, one, it's built right into ChatGPT, so the distribution is there. It's easy to use. And the quality of the output is going to be way higher.

[00:22:06] Paul Roetzer: So you're not going to have people get frustrated, like, oh, I can't do it. Well, I mean, this can handle words now, which the previous models couldn't. So all the barriers that would slow down adoption seem to be falling. So I do think that prompting still will matter, but I just feel like the AI is going to get better and better at enhancing whatever you give it.

[00:22:27] Paul Roetzer: And the same is going to be... so this model we're seeing here, where it gets better at image generation prompts, the same thing is going to happen in language and in vision and in audio. And so someone like you and I, who've never done audio production, we're probably going to be able to go in and use this deepfake technology, like being able to create deepfakes.

[00:22:47] Paul Roetzer: And we'll maybe talk about that on next week's episode. That area is exploding right now. I just saw one with, like, lip dubbing from a company; it's just everywhere. It's going to have translation and deepfake capabilities, but you're going to still prompt it. And that's going to, you know, be solved for.

[00:23:03] Paul Roetzer: So I think prompting still matters. I still think it's important to learn how to do it correctly, but it's going to become less and less of a burden on the human to come up with a great prompt.

[00:23:17] Mike Kaput: So that was an interesting point you made about, you know, the built-in distribution here as part of ChatGPT.

[00:23:24] Mike Kaput: Obviously, there are other image generation tools; Midjourney is another popular one. But do you see yourself or others embracing image generation more, thanks to how easy this is to use right within ChatGPT?

[00:23:37] Paul Roetzer: Yeah, for sure. I mean, I think... so there's a couple of things at play here. So Midjourney, I've mentioned before on the podcast, I don't use it because I don't use Discord. And I'm sure Discord is not probably that hard to learn, but the couple of times I went in to do it, I was like, I don't want to do this.

[00:23:52] Paul Roetzer: Like, it was just, like, another place to log into, and it just seemed more complicated than it needed to be. So I'm sure people love Midjourney. I know the image quality is amazing. I've seen a bunch of comparisons of DALL-E 2 or DALL-E 3 versus Midjourney, and sometimes Midjourney still wins. Like, fine. But the fact that ChatGPT has, I don't know how many hundreds of millions of users at this point, that's the distribution where they can just win, because more people are going to use it.

[00:24:20] Paul Roetzer: Adobe has obviously made some big... we're going to talk about Adobe today. They've made big advancements in this area. And so Adobe users, which I'm not a power user of Adobe, Adobe users could probably just use what Adobe has. So I think it's just going to be, again, the tech you already use is likely going to be the place where you're still going to focus most of your energy, because all of these core tech companies have to build this stuff right into the technology.

[00:24:45] Paul Roetzer: So, in our case, I use ChatGPT all the time, so it's extremely convenient for me to have DALL-E 3 built in. I did see, and I think we'll put this in the newsletter, you can now build images right in Google Search. Like, that was new in the last week, where if you do an image search, one of the options now is just to, like, create the image.

[00:25:05] Paul Roetzer: Iassume they're using ImageGen, image generation technology that they introduced last year but never released. So you're going to have options of where to do it. I just, I wouldn't bet against open AI at this point.

[00:25:19] Mike Kaput: So you mentioned Adobe. Adobe actually had a very big week. They just announced some, quote, "magical new generative AI capabilities" across the Creative Cloud suite of apps.

[00:25:31] Mike Kaput: There are dozens of new updates right now across many of Adobe's major products. So if you're an Adobe customer, go ahead and update your apps and check out what's possible when it comes to AI now within your apps. But one of the biggest updates is the rollout of Adobe's Firefly Image 2 Model, which now empowers users to generate even higher quality images within Adobe products.

[00:25:55] Mike Kaput: Now, interestingly, the new model, like its predecessor, is designed to generate content that's safe to use in commercial work. It also has some new text-to-image capabilities. It has something called Generative Match, which allows you to generate new images in the same style as existing ones.

[00:26:17] Mike Kaput: You can also apply photo settings to generated images, so you can do things like adjust the depth of field, motion blur, and field of view, just like you can with a physical camera lens. And according to Adobe, this, quote, "offers massive time savings for everyone charged with scaling up content production, while also making sure it aligns with a brand's look and feel."

[00:26:40] Mike Kaput: On top of all this, Adobe also announced Content Credentials, which is an icon embedded into the metadata of content created using Adobe AI. The icon will show you which AI software was used to create a certain image, as well as other information about the image. Paul, there's a lot going on here to talk about, but it seems like the overall trend is that we're getting to a point where brands now have pro-level creative capabilities.

[00:27:15] Mike Kaput: Some of which appear to be legally compliant from a commercial perspective, like right within the tools they already use. Is that how you kind of viewed

[00:27:23] Paul Roetzer: these updates? Yeah. So, I mean, the Firefly one, when it first came out, which wasn't too long ago, I think that was just a couple of months ago, you know, it was, the quality was questionable.

[00:27:33] Paul Roetzer: You could obviously get better outputs from Midjourney or other places. So we knew Adobe was going to continuously improve that. The reality is, because they're doing this and adhering to, in theory, more laws and regulations around the training data and what they're using, it's probably not going to be on par with Midjourney, and maybe not even DALL-E 3.

[00:27:53] Paul Roetzer: I don't know. But for most enterprise users, it's probably going to be the safer bet and good enough. So I think that's one of the things here. Again, if you're an Adobe user, you're likely going to lean toward using these tools, especially if you're in a bigger enterprise, because now you don't have to go get approval for a new tool outside that maybe doesn't have the guardrails in place.

[00:28:14] Paul Roetzer: So from a privacy, security, and legal standpoint, it's going to be a safer bet to go this route. And Adobe's not going to slack here. They're going to keep coming, keep innovating, keep improving. I think one of the knocks on Adobe over time would be that you really had to learn Photoshop to use Photoshop, or Illustrator. It was meant for people whose job was to do that. For them to remain competitive in the future, I think they're going to have to go the route we're seeing with DALL-E 3, where really almost anyone can do it, because they've got to compete with Canva and Beautiful.ai and these other players.

[00:28:52] Paul Roetzer: They're going to make design so easy. And so I think they're going to have to open up the possibilities for the average non-designer to go in and do a lot of the things that used to take a professional designer. That's one of the shifts I see happening: really opening up access.

[00:29:10] Paul Roetzer: The other thing that comes to mind: in my talks, when I'm out doing public speaking, I'll say at least 80 percent of what knowledge workers and creative professionals do will be AI-assisted within the next one to two years. This is a really good example of why we believe that. You're not going to be able to use marketing and creative tools that don't have AI in them.

[00:29:31] Paul Roetzer: It's literally just going to be infused into everything, to the point where you're not even going to know you're necessarily using AI technology. That's obviously the route Adobe is going. It's just AI in everything; every feature is going to be enhanced with AI capabilities.

[00:29:46] Paul Roetzer: The Content Credentials piece is interesting. We talked a couple of weeks ago about DeepMind and, it was SynthID, I think that's what their tech is called, where they were trying to embed watermarks. I like the fact that we're seeing more progress, more companies working on the ability to quickly tell whether something is fake or not and what it was created with. That's going to be essential moving forward.

[00:30:12] Mike Kaput: So it also seems like the feature here we talked about, where you can apply these photo settings to get different camera effects, is just one other validation of what we talked about with DALL-E. It's another way of essentially giving you prompt assistance. Is that what that kind of seemed like to you?

[00:30:30] Mike Kaput: It seems like there are a lot of assistive features in here to get more out of the tool. Again, I wouldn't know, for instance, anything about how to instruct this tool in a prompt to act like a physical camera lens.

[00:30:42] Paul Roetzer: Yeah, I've been playing around with Playground.io, which I believe just rebranded to Playground.com. I think Dharmesh Shah actually gave them the URL Playground.com; I saw on Twitter the other day that Dharmesh has apparently been sitting on that URL for a long time. But anyway, Playground gives you access to Stable Diffusion and a couple of other image generation models. And when you go in and create your prompt, you can pick from something like 12 to 15 templates that are more like themes, and those themes have built-in prompts to drive a certain sort of output, a certain look and feel.

[00:31:21] Paul Roetzer: And so, yes, again, I think everything is going to continuously remove the burden on the user to develop the prompt. The user is going to very quickly just say, this is what I'm going for, and then it'll kind of take it from there. So it makes total sense that Adobe would go in that direction, which again, I think opens up Adobe to way more users than it has traditionally had, because you won't necessarily have to have expertise using the tools to do it.

[00:31:49] Paul Roetzer: All you need is natural language.

[00:31:52] Mike Kaput: So you had mentioned this kind of assumption we work under, that 80 percent of what we do as knowledge workers or creative professionals is going to be assisted by AI in some way. I thought it was interesting that as part of these announcements, Adobe just directly called that out and said, look, this offers massive time savings for everyone charged with scaling up content production.

[00:32:13] Mike Kaput: So could you maybe just talk for a second about, if I'm a marketer or creative professional and all these tools I use now have AI, it sounds like that unlocks some possibilities that previously were unimaginable?

[00:32:29] Paul Roetzer: Yeah. I mean, that's going to happen in every aspect. So again, when we think about generative AI, we think text, images, video, audio, code if you're into code. But really it's text to anything, anything you need to produce. It could be product design.

[00:32:42] Paul Roetzer: It could be storyboards for ads. It could be anything, and it's going to drive massive efficiency across all of that within the next 12 months. So I think in a lot of cases, that 80 percent is really going to be more about whether or not companies choose to adopt it. I think the technology will be there to where almost everything we do can be AI-assisted, if we choose to use the tech that is available to do it.

[00:33:08] Paul Roetzer: There are just going to be some industries where they're not allowed to use this stuff, or where the learning curve, or the approval process required to get the technology through procurement and then to build the processes to guide the users, like in big enterprises, means it may take a while. Or in some industries that are slower to adopt, it may take a while.

[00:33:27] Paul Roetzer: But I'm fairly confident that within the next 12 months, you could choose to have AI assisting in pretty much everything you do. For us at the Institute, we don't have those barriers; we're a small team, we can move quickly. I really struggle to imagine anything we're going to do within 12 months that AI won't be assisting us with in some capacity.

[00:33:49] Paul Roetzer: And that's without even being extremely proactive in finding it all. We're not sitting down every week saying, okay, here are the 25 things we're still doing that have no AI. We're not doing that. And yet we've taught the team: anything you do that is repetitive, data-driven, predictive, or generative, those four variables, go find a smarter way to do it.

[00:34:10] Paul Roetzer: And so I think we've just made it so that people have the autonomy to go and seek out smarter ways to do things. I know you were testing, what was it, HeyGen or ElevenLabs, just this week. It's, hey, let's go test that. We just hear of the tech and it's, let's go try it.

[00:34:27] Paul Roetzer: And if we find a use case for it, let's infuse it in. So I think this is for organizations that have that mentality. I wouldn't say it's an AI-first mentality, but more: we're going to be an AI-emergent organization, we're going to find ways to be more efficient, more productive, more creative, more innovative in everything we do. That permeates the organization.

[00:34:49] Paul Roetzer: In everything we do, we literally think: is there a better way to do this? Anytime someone sees an inefficiency, you question the way you're currently doing it and say, okay, maybe I should go find a tool to do this. And I think for the organizations that have that mindset, by fall of 2024 the tech's going to be there to pretty much assist in everything you do.

[00:35:13] Paul Roetzer: So again, I'm fairly confident in that 80 percent within one to two years, because I just don't know what software exists two years from now that doesn't have AI in it. I don't know how you continue to function as a SaaS company without a major AI roadmap for your product.

[00:35:30] Mike Kaput: Yeah, to that point about how our team approaches this, I'm always under the assumption that anything I'm doing in a given day can be done better or more efficiently with AI.

[00:35:38] Mike Kaput: And while it's not always feasible to, you know, fire up a new AI tool and learn how to use it, I keep at least a couple of AI assistants or image generation tools open while I'm doing anything. And you'd be surprised how even small efficiencies stack up. To your point, a year from now I don't see how some of these companies will be able to compete with people who are using even some of these features across a wide range of use cases.

[00:36:08] Paul Roetzer: Yeah, and it's going to be true in every industry. A lot of times people ask me about this. Take senior living, for example; that's the talk I was doing on Monday. So you're talking to senior living facility owners, and the application isn't immediately obvious in the services they provide.

[00:36:25] Paul Roetzer: So let's say you're thinking about AI and asking, okay, how can this help our services? It's no, no, no. First, talk about your core business. Every business has marketing, sales, service, operations, HR, finance, legal. AI can help in all of those. So before you even think about the products and services you offer, just think about the functions of your own company, and you'll find ways to do it.

[00:36:46] Paul Roetzer: So many of the ways we're using AI aren't necessarily in the products and services we offer as an institute. It's the back-office stuff, all these other things we're using it for. And that's how every company in every industry can be thinking about their own business. The last step is sometimes the actual products and services you offer.

[00:37:06] Paul Roetzer: So, for example, in senior living facilities, there's a chance that at some point in the next decade, Optimus from Tesla becomes a functioning assistant, where it's actually in these facilities helping in some way, and now it's part of the offering. One example somebody told me about was using computer vision technology for fall detection in rooms.

[00:37:28] Paul Roetzer: Things like that, where it's okay, this is actually now part of the service offering. That would be amazing, but do the obvious stuff right now. Just get it into your meetings, get it into your doc development, proposals, grants, whatever it is you're doing. Go find ways to infuse AI into the business.

[00:37:46] Paul Roetzer: Then focus on the service and product roadmap.

[00:37:51] Mike Kaput: All right, let's jump into some rapid-fire topics. First up, a tech company just made an AI announcement that promises to shake up the world of computer programming. Replit, which is an online integrated development environment used by programmers to create software,

[00:38:09] Mike Kaput: announced that they are making their AI coding copilot available for free to everyone, to all of its 23 million users, and free moving forward. Now, Replit AI, as they call it, is a powerful set of AI features that do things like complete code for programmers and assist them in completing programming tasks.

[00:38:32] Mike Kaput: What's interesting here is Replit said in this announcement that the company is on a mission to empower the next billion software creators, and that over time they became convinced they can only accomplish this mission by putting AI into the hands of every developer using their platform.

[00:38:51] Mike Kaput: Now, AI copilots have already been shown to dramatically increase the productivity of developers. So what's going to be the impact of a move like this by Replit, to just give it to every single developer who even touches their platform?

[00:39:07] Paul Roetzer: Yeah. So I've always been impressed with Replit. I met their founder, Amjad Masad, at the February gen AI event that Jasper did.

[00:39:17] Paul Roetzer: He was one of the speakers, and I had a chance to talk with him briefly. He's been grinding for like a decade on this. This isn't an overnight success story. This is a guy who's worked really hard for a long time with a vision. And every time I've heard him say this mission of the billion software creators, the first thing that comes to mind is that's not traditional software creators.

[00:39:37] Paul Roetzer: When they're talking about a billion, they're talking about democratizing the ability to create code and to build applications, so that anyone can do it. A billion is a lot of people who aren't trained as coders. So what they're trying to do is make coding happen through natural language. Again, text to anything: text to apps, text to web pages, text to tools.

[00:39:59] Paul Roetzer: They want you and me, Mike, to be able to code things and build things. So yeah, we've seen the origins of this through GitHub Copilot. We've seen aspects of it in some other tools, like OpenAI's Code Interpreter, which writes Python code and has been renamed Advanced Data Analysis, I think is what it's currently called. They have to rebrand that, by the way.

[00:40:20] Paul Roetzer: And in theory, you can use it to develop code that you can then build things with. But what Replit's trying to do here is make everyone a coder. Anyone can build anything. That's why they say it's AI for everyone. And so I think Replit's a major player. Again, it's not a household name in the business world, because historically it has been primarily used by coders and developers.

[00:40:44] Paul Roetzer: But I think this is the kind of tool that you're going to see more widely available. I don't know that he wants to sell this company, but I would think they're a prime acquisition target for someone in the next 12 months. It probably just comes down to what his vision and roadmap are for the company. But it's a very disruptive organization with a very grand vision and mission.

[00:41:07] Paul Roetzer: And so far, they've shown every sign of being the kind of organization, with the kind of leader, that can achieve that kind of thing. So I would just pay close attention, play around with it. If you've never heard of it, go in and check it out. I think they're going to be a really important company moving forward.

[00:41:27] Mike Kaput: So we talked last week about this controversial AI product called Pendant from Rewind, which records everything people do and say around you. I think this is becoming a recurring bit, at least in October for Halloween, but get ready for another creepy AI tool that is drawing attention from a bunch of high-profile outlets this week.

[00:41:49] Mike Kaput: So NPR just highlighted how this new facial recognition tool called PimEyes is gaining popularity on TikTok, because it allows you to identify strangers from photos and videos. In one high-profile example, PimEyes was able to identify a random cameraman at a Taylor Swift concert after someone uploaded a couple of quick photos of him to the tool, says NPR.

[00:42:17] Mike Kaput: This is an AI tool that's like reverse image search on steroids. It scans a face in a photo and crawls dark corners of the internet to surface photos many people didn't know existed of themselves, in the background of restaurants or attending a concert. So Paul, there's a lot to unpack here, but NPR mentioned that companies like Google have

[00:42:41] Mike Kaput: been hesitant to release super-powerful facial recognition technology like this. But it seems like, without laws regulating the use of this technology, Pandora's box is kind of already open here. Do you see it that way? What do we do about stuff like this that's just completely out in the open, completely terrifying?

[00:43:02] Paul Roetzer: Yeah. So Google not being willing to release it sounds awfully familiar; they had language models like ChatGPT before OpenAI released ChatGPT. So a couple of thoughts, I guess. One: if you think the government doesn't have and use this technology, you're crazy. They've been using this technology for probably a decade.

[00:43:22] Paul Roetzer: So this isn't new tech; getting it into the hands of the average person is what's new here. And that's what's going to happen with things like this: once it's out, they're just going to flood the market, just like the deepfake stuff. It's just going to be everywhere. So it's not going to go away.

[00:43:42] Paul Roetzer: Unfortunately, we can hate it, but it's here. You're going to hear justification for it. Media monitoring, for example: Mike and I both have a bit of a PR background. Some of the work we did at the agency previously was PR work, and you do some media monitoring. Previously you would be looking for, you know, brand or product images in photos.

[00:44:04] Paul Roetzer: Well, now you could monitor for executives appearing online. You could monitor for employees online. I'm not saying we would do this, but they're going to justify this technology with the viable use cases that exist, that enterprises may be willing to pay for. That does not mean it should exist.

[00:44:23] Paul Roetzer: I'm just saying that's how they're going to justify this stuff: there are going to be some valid use cases, like monitoring your own image online. Because if you think about what would be a justifiable use case, well, if deepfake technology spreads and there can be deepfakes of anyone anywhere online, and you want to know if you've been deepfaked, then you might be willing to pay $9.99 a month for a service that monitors for your face online.

[00:44:45] Paul Roetzer: But that same technology that's monitoring for you being deepfaked is being used to dox people and find people online. So there are just going to be pros and cons to this tech. If you came to me and said, hey, for $9.99 a month we can monitor for your face online, or for your kids' faces, or whatever, would you do that?

[00:45:12] Paul Roetzer: That's one where you step back and say, that's actually maybe kind of helpful. It's almost like your credit score: monitoring for things happening online that maybe you don't know about. Or like the services that monitor for your information being stolen and leaked, your passwords or credentials being leaked online.

[00:45:28] Paul Roetzer: We pay for services to monitor that stuff. And so I could totally see a market emerging to pay to monitor for your face online, to make sure it's not appearing in places you don't want it to. So again, like it or not, this tech is going to be everywhere and may eventually be normalized.

[00:45:48] Paul Roetzer: I know Facebook has the same capabilities. This isn't new; it's just a question of who's willing to release it into the world. And once one company does, then everybody does. Once it becomes normal, it's just going to be everywhere.

[00:46:03] Mike Kaput: So it turns out that OpenAI is on track to generate $1 billion in revenue over the next 12 months, according to reporting from The Information. It is now generating more than $80 million per month in revenue, compared to just $28 million in all of last year.

[00:46:23] Mike Kaput: Interestingly, at the same time this report came out, OpenAI also kind of pulled back the curtain on how its technology works in an article they published on October 11th. In this article, OpenAI basically provided an explainer that introduces OpenAI, how to use its products, how it trains its AI systems, and how it works to make the AI products it releases safer.

[00:46:49] Mike Kaput: So it's definitely a good primer if you are new to OpenAI's products. But Paul, what did you make of OpenAI's revenue numbers here, and why are they publishing an intro to OpenAI now?

[00:47:03] Paul Roetzer: Yeah, the intro thing I think is just a transparency play, trying to get into the enterprise market. They have to be more clear about who they are and what they do with the data, things like that, because that's probably some of the pushback they're getting.

[00:47:13] Paul Roetzer: But they're on this run rate to a billion with almost no enterprise adoption. That's the thing that jumps out to me: how fast they're growing with so little adoption so far. They just introduced ChatGPT Enterprise, what, like 45 days ago? I can't imagine they have more than maybe a few hundred customers at this point that they've signed on and onboarded.

[00:47:37] Paul Roetzer: So, man, they'll be at $10 billion plus in revenue at this time next year, like on a run rate for $10 billion plus. So I don't know. Microsoft's investment is looking pretty smart right now. Anybody who got in early, I think, is looking pretty smart. This is going to be a massive company, and I don't know contextually, but it's got to be one of the fastest growing in history.

[00:48:00] Paul Roetzer: There aren't very many companies that have gone from $28 million in a year to $80 million a month, or whatever that number is. Getting to a billion in a year isn't easy to do as quickly as they've done it. When you think back to before November 30th of last year, when ChatGPT came around, almost a year ago, they didn't really have revenue streams.

[00:48:23] Paul Roetzer: They were very limited. So to all of a sudden be doing this is wild.

[00:48:31] Mike Kaput: So next up, we might soon have some legislation, at least in the U.S., that cracks down on deepfakes in the music industry. In the United States, a bipartisan bill has been introduced that would hold individuals or companies liable for producing unauthorized AI replicas of individuals in a musical performance.

[00:48:52] Mike Kaput: This has been called the NO FAKES Act, which stands for Nurture Originals, Foster Art, and Keep Entertainment Safe. Paul, we talked last week about the dangers of AI celebrity deepfakes when we talked through our story about how Tom Hanks warned his followers of a deepfake scam featuring his likeness.

[00:49:14] Mike Kaput: Do you expect to see actual legislation pass around deepfakes and AI replicas?

[00:49:19] Paul Roetzer: I wouldn't be surprised if that came quickly, but I feel like there are probably existing laws that cover this. Maybe 10 episodes or so ago, we talked about how the FTC and some other government agencies were pushing, hey, we already have laws for all of this.

[00:49:34] Paul Roetzer: You can't impersonate people online when it's not a parody, when you're trying to actually pretend it really is someone, for commercial benefit or for other harms. I assume there are already laws that will cover this. The problem is it's so hard to track these cases and to act on them.

[00:49:53] Paul Roetzer: I don't think the government agencies are structured to actually deal with the flood of this stuff. How do you keep up with it? So it makes sense that you would try to legislate it further, because it seems like it's going to be such a problem, and it's going to affect government and elections and things like that.

[00:50:12] Paul Roetzer: So I don't know. I hope they find ways to limit this. It's not an area I've studied deeply, but you and I wrote about deepfakes in our book, Marketing Artificial Intelligence, and we said then that this is going to be a major problem and that every company should have a deepfake strategy as part of their crisis communications strategy.

[00:50:30] Paul Roetzer: Because it's going to happen. It's like the tech we were talking about earlier: we can't go back to a moment in society where we just don't have this. So I think it's going to become a bigger and bigger issue. And yeah, if the government can do anything to help, great.

[00:50:48] Paul Roetzer: But unfortunately, I think this is going to come largely down to tech companies building ways to monitor and remove this stuff, and it's just going to be a constant race of AI versus AI.

[00:51:01] Mike Kaput: So last but not least this week: we talked last week about Meta's rollout of AI characters, these AI personas that are kind of like chatbots you can converse with on their platforms.

[00:51:16] Mike Kaput: Unfortunately, this rollout appears to have many users scratching their heads, because the personas, the AI characters, are influenced by the personalities of real-life celebrities, but not exactly. For instance, there's an AI character on Instagram who looks and talks like the celebrity Kendall Jenner.

[00:51:38] Mike Kaput: But the character is named something completely different: the character is named Billie, despite having the same voice, tone, and mannerisms as Jenner. This has already caused widespread confusion. First, fans of Kendall Jenner thought the video of this AI character was the real person pretending to be someone named Billie.

[00:52:01] Mike Kaput: Then, when they learned it was AI, it straight up creeped out many, many users. This appears to be the same approach Meta is taking with other celebrities like Tom Brady and Snoop Dogg, who are contributing likenesses to AI characters that aren't named after them at all and appear to be completely distinct personas.

[00:52:21] Mike Kaput: So everyone kind of has their head spinning here. Paul, this is, to me, just both baffling and a little weird. What's going on here?

[00:52:29] Paul Roetzer: It's such a stupid thing. I don't know. It's just tech for tech's sake. It's because you can, and you could pay them each $5 million, or whatever they paid them for their likeness.

[00:52:38] Paul Roetzer: And I don't know. I think it goes to the point, like we've said on previous episodes, that the average person doesn't know AI can do this. They don't know you can create deepfakes of these people. They don't know these can seem like real conversations. They don't understand this stuff.

[00:52:54] Paul Roetzer: And even when you do understand it, it's confusing. Even though we know what they're doing, it's: why is it Billie? Why isn't it just Kendall Jenner AI? Why don't you just straight up say it? I don't know. I don't understand this strategy. I don't understand what they're doing here.

[00:53:11] Paul Roetzer: Yeah, I don't like it. Hopefully it just sort of dies off at some point. I don't know that the market for it will be there. Like, people would pay to talk to MrBeast, not to him named something else. Maybe this is some grand experiment by Meta. I don't know what their strategy here is.

[00:53:29] Paul Roetzer: It doesn't really make a lot of sense, and obviously they seem to be getting a lot of pushback from people who are confused. So they may have to rethink the go-to-market plan for this technology.

[00:53:40] Mike Kaput: All right, Paul, that's another big week in AI. Really appreciate you unpacking it all for us. I know our audience does as well.

[00:53:49] Paul Roetzer: Yeah, thanks, Mike. And thanks again to BrandOps, our sponsor. And a reminder that all of the items covered today are in our weekly newsletter. Mike does a summary of those, and then there's a whole collection of other items we didn't get to that are included in the newsletter. Again, the newsletter comes out Tuesday morning, just like the podcast.

[00:54:10] Paul Roetzer: There are another six or eight topics in this week's edition that we didn't get to, so definitely check out the newsletter if you haven't already. At MarketingAIInstitute.com, you can subscribe to the newsletter and get the summary of the podcast each week, as well as a bunch of other stuff Mike and I are reading that doesn't make the cut to get into the podcast, because we try to keep this to an hour or less each week.

[00:54:29] Paul Roetzer: If we wanted to go two hours, I feel like we have enough content most weeks. So thanks again, everyone. We will be back next week with our regularly scheduled broadcast. Talk to y'all soon.

[00:54:42] Paul Roetzer: Thanks for listening to the Marketing AI Show. If you like what you heard, you can subscribe on your favorite podcast app, and if you're ready to continue your learning, head over to www.marketingaiinstitute.com.

[00:54:55] Paul Roetzer: Be sure to subscribe to our weekly newsletter, check out our free monthly webinars, and explore dozens of online courses and professional certifications.

[00:55:03] Paul Roetzer: Until next time, stay curious and explore AI.
