AI-Powered Seller

Why Revenue Teams Are Seeing 0.00 Value in Generic AI Implementations

Jake Dunlap & Kevin (KD) Dorsey Season 1 Episode 8

Can generative AI truly revolutionize sales, or is customization the secret ingredient to success? Uncover the transformative potential of custom AI models as we explore the pivotal role of data quality in deriving meaningful value from AI implementations. Many teams struggle to see results when applying AI generically; our discussion sheds light on the importance of tailoring AI models to fit your unique business landscape. From understanding sales playbooks and buyer personas to recognizing when to choose custom models over plug-and-play solutions, we promise you'll gain insights into maximizing your AI investments.

Witness firsthand how small, continuous improvements can skyrocket sales rep productivity. Drawing from personal experiences of tripling team production, we highlight the power of incremental progress and the often-overlooked victories it brings. Discover how even AI novices can adopt custom GPT models with ease, thanks to education and non-technical approaches. Our team’s practical steps, including the deployment of AI for scorecards, showcase the incredible time savings and enhanced productivity achievable through tailored solutions. Let us guide you through leveraging AI tools effectively without the technical hurdle.

For leaders and sales teams aiming to boost productivity, creating custom solutions is a game-changer. Learn how even those without technical expertise can harness these tools, focusing on continuous learning and adaptation. We emphasize creating "mini products" as a strategy to tackle organizational challenges, advocating for a gradual approach to tech integration. Through relatable examples, like a customized scorecard system, we demonstrate the capability of AI to streamline processes and identify improvement areas. Finally, embrace the concept of "what good looks like" in business, laying a solid foundation and gradually customizing AI solutions to enhance your strategies without the pressure of perfection.

Join the AI-Powered Seller Newsletter:  https://bit.ly/ai-powered-seller-newsletter

Sign up for Custom GPTs:
https://link.meetjourney.ai/aff_ad?campaign_id=4859&aff_id=2751&source=Jake-KD-Podcast&hostNameId=24435

Connect with Jake:
https://www.linkedin.com/in/jakedunlap/

Connect with KD:
https://www.linkedin.com/in/kddorsey3/

Speaker 1:

Welcome back everyone. This is our fourth episode of the AI-Powered Seller, and last time we got a little spicy — we talked about whether AI will replace sales enablement, and we got all sorts of feedback on that one. But today we're going down a little bit of a different path: why so many teams are seeing zero value from rolling out generative AI generically. So we're going to talk about when it should be custom, when it does not need to be custom, and how to actually do it in a way where you get a return. Because at the end of the day, none of this matters if it's not improving performance, if it's not improving the metrics.

Speaker 1:

So we're gonna talk about data, when to customize, how to customize, and how to do it at scale. So this will be a good one today. Every week, as you all know, we take DMs, and we're getting these every single week. If you have questions, send them to us. If you have things you want us to go specifically deep on, send them to us, because this one was a really good one. The DM question of the week was: what role does data quality play in customizing AI models? How can businesses ensure they have the right data to do customization the right way?

Speaker 2:

So there's a few different things I think about with this. You know, it's so funny, man — I talk to a lot of companies, and it's always, "Well, the data, you know, we've got to have this data." I want you to think about it with a sales org, right, which is what we talk about. The data is the playbook. The data is how we show up for meetings. It's how we best defined our ICP, our buyer personas, our competitive battle cards. And so for me, you know, we're going to dive into this kind of custom GPT world.

Speaker 2:

I think when I think of data for sales leaders, you need to be thinking about the quality of that data, because when you want to start to create some of these custom models, the more precise your playbooks, your personas, your ICPs are, the answers it returns for the sales team are exponentially better than, you know, some of the data points. Right, you see these tools — it's like, when you mention the customer, whatever, you get a three percent likelihood. Don't get me wrong, that's an interesting data point. But instead, imagine a world — and this is what we're going to dive into — where it's: hey, I've got a competitive deal with this company. Here's the situation. It's a CFO. We're competing against Zendesk. Tell me the top three things I should avoid, or the top three things I should double down on. If your content, your data, is on point, that's actually what I think creates a better — call it — output.

Speaker 1:

So that's an interesting take, and it's the right take. I think when most people think data, they actually think literal metrics, versus what Jake just walked through, which is actually the context — the content of what's happening. Context — that's a really good way to put it. It's more context than it is data, because you hear people say, oh, we have all this data. It's like, cool, but you have no context to it. That's right. "Here's the win rates of our last thousand deals." That's right.

Speaker 1:

If you don't tell it what to look for — it's the personas, it's what's worked in the past, it's the insights you can provide to the data — that, I think, is what makes this work well. I think that's right, man. Let's jump in here and talk custom AI models, right? Because I think a lot of times when people hear it, they go, oh shit, yeah — what is it? How do I do this? Oh, I need all this data. I need an LLM. What's an LLM? How do I set this up? How do we go through all of this? So what I think it's led to is a lot of people doing just plug and play. That's right.

Speaker 2:

"We bought Copilot," right. Nobody got fired for implementing Copilot.

Speaker 1:

It's the safe way to do it. That's right. And I wrote about this just this week on LinkedIn — OPP or OPD, right: using other people's prompts or other people's data. And I think when people are doing that, it's taking them in a different direction. So when does it make sense to do custom versus more of a plug-and-play AI model for a company?

Speaker 2:

Yeah, that's a great question. Look, I think it's the things that are more universal for your company. Let's take a Copilot: if you're going to install Copilot for your company, and you want really easy access to product specifications, easy access to FAQs — things that are universally known, where there aren't precise tactical applications — totally fine. A more generic model that you can just upload a bunch of your information to, and it's going to pull it out: hey, what are the specs on this product again? What does this thing do versus this? So I think for the more generic pieces, a more general thing that you're going to implement is relatively fine, because you really just need the LLM to look for the information, and you're giving it your products, right? So it doesn't necessarily need to know how to customize it to a situation.

Speaker 2:

Now, you could obviously argue, well, if you could customize it to a situation, that would obviously be better too. But those are the applications. And I'd say ChatGPT, for example, if you're using that — like the Team plan — is more general. It's about the level of precision that you want. Because ChatGPT out of the box is pretty damn good — the paid version — and Perplexity is getting there, and a lot of them are getting there. If it's more general Q&A, general research — yeah, I think all those are fine.

Speaker 1:

I hadn't thought about it this way till you said it. It's actually a good guide: if the answer can be generic, it's okay to use a generic model. But if it needs to be specific to your use case — a certain persona, a certain industry — that's where, you know, the word I think we're just going to keep drilling with people is context. It will just be missing.

Speaker 2:

That's exactly right.

Speaker 1:

Right — because also, we have to think about resources. If you're early as a company, you may not have the resources. You may not even have the context to give. So — actually, I want your take on this. I'm an early company. I don't have all these resources, I don't have a lot of data, I don't have a lot of context. Is it the right play to put a generic one in, or could that actually take me down the wrong path without actually doing

Speaker 2:

it the right way from the beginning? Yeah — I mean, we're going to get into this, but I think the interesting thing is, we talked about this in the last episode about sales enablement and the content. Before, to build out all this documentation, it would have taken you 40, 50, 60 people-hours. I mean, ChatGPT can write your personas for you, it can write your competitive battle cards, and it can get you 80% of the way — and then you spend one person-hour on the customization.

Speaker 2:

So look, I think maybe it's about the level of precision. It's like: do I need this to be directionally right 70 to 90 percent of the time, and that's good enough? Or is the extra mile, to get toward the long tail, worth a 10 percent increase in productivity, a 20 percent increase in productivity? I think those are the things I would balance: what's the risk-reward here? So look, I would test — that's what I'd say. Test the generic model: "Hey, look, I work at this company [insert link]. I sell this product [insert link]. I'm in a competitive deal with [insert link], and the persona is this persona in this industry." Then you try to run it, and you say, well, yeah, that's pretty dang good, and that gets me there.
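The "test the generic model" prompt Jake describes can be sketched as a small template. This is a minimal illustration, not a canonical prompt — the function name and fields are hypothetical, and the bracketed links from the episode become parameters:

```python
def build_deal_prompt(company_url, product_url, competitor, persona, industry):
    """Assemble the context-rich test prompt described above, for trying a
    generic model before deciding whether a custom one is worth building."""
    return (
        f"I work at this company: {company_url}. "
        f"I sell this product: {product_url}. "
        f"I'm in a competitive deal against {competitor}, "
        f"and the persona is a {persona} in the {industry} industry. "
        "Tell me the top three things I should avoid and the "
        "top three things I should double down on."
    )

# Example using the scenario from earlier in the episode (CFO, Zendesk).
prompt = build_deal_prompt(
    "https://example.com", "https://example.com/product",
    "Zendesk", "CFO", "software",
)
print(prompt)
```

If the generic model's answer to a prompt like this is "pretty dang good," that's the signal that plug-and-play may be enough; if it stays vague, that's the case for a custom model.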

Speaker 2:

So I think I would say it's a test thing. Yeah — I think it'll be interesting to watch how companies do it, because you do have people rolling out completely generic programs, and then they're like, nope, nothing happened. That's what we see with sales orgs right now, man. I've got to tell you, because again, we've got a lot of companies where — we lost a deal maybe three or four weeks ago, and we had buy-in from the CEO. We never lose deals at contract. If we're sending a contract, this is a done deal, right? And then somehow the COO was like, well, we need to pause because we're thinking about our generative AI strategy across our org. And we've talked about this, man. And I emailed the CEO — look, I don't get bitter when I lose deals. I'm okay, I lose deals.

Speaker 1:

So you realize that every opener disqualifies what comes after it.

Speaker 2:

I don't get bitter, I get better — it lets me know where I'm getting better. It's more like, I don't feel the need to respond. No problem, I'll follow up in a month, let's see how things are going.

Speaker 2:

I felt obliged to email the CEO. I'm like, I'm just going to tell you: you're thinking about this completely the wrong way. There is no solution that works for finance and also works for your BDRs and also works for your biz ops team and your customer success team. Each of these is a personalized use case. So if you're going to pick a generative AI strategy or whatever, you're still developing custom applications of how departments are going to use this, to get that level of specificity.

Speaker 2:

So that's the biggest problem that I see: people think that implementing a generic AI — like Copilot, the most popular because it's Microsoft, especially for bigger companies that are risk-averse — that just a generic application is going to do it. Like I said, it works in those cases — product specs or more random things — but it just doesn't get specific enough for a lot of these. So I felt, you know, KD, I felt like it was my duty to make sure the CEO knew. He didn't respond. No, he didn't. He'll respond.

Speaker 2:

Nobody wants to hear it. Yeah, exactly — he'll respond in like two months.

Speaker 1:

I was going to say two, three months — you're going to get a note on that. It's like, hey, you were right. I love those ones. I frame those ones.

Speaker 2:

Hey, katie, you, this comes up in like the next month or so, then maybe we should pick back up the conversation and everybody says yes, because I know what's going to happen in the next month, and then they come back and they're like, oh yeah, maybe we should talk about that. So that's it, man. I think there's a when there's a precision of specificity that is a 10, 20 percent better, because we as humans actually suck at perceiving 10 to 20 percent better. You know like. You know think about like baseball is such a classic, you know metrics, you know game, you think about it the guy who hits 320 versus the guy who hits, you know, 280. You know, if you watch that both of them hit, you'd be like I can't, I don't know how much is better. I mean, you're literally, you're talking about a half of what 50, 0.5% difference, right, 5% difference. We can't feel that.

Speaker 2:

We as humans don't feel small percentages. We're talking more about sales best practices here, but hey, that's what we're here for. That's why reps don't understand the difference between a great script and a good script: you can't really feel the difference until you look at the data and say, actually, yeah, this did increase my conversion rates from stage three to stage four by eight percent. Well, guess what — eight percent is meaningful. But I think that level of specificity is the answer for a lot of this. And I know we're going to get into why it's simple to start — I think that's a lot of it too. It's like, well, yeah, this is good enough. It's like, well — actually, it could be 10% or 20% better.

Speaker 1:

Yeah, because we have really bad memories as humans too. I'm experiencing this with my own team right now. In one of my orgs, we've almost tripled production over the last nine months — production, not just revenue, on a per-rep basis. We're almost at a 3x right now, which also means I have reps making two to three times more now, and they don't remember what it was like six, seven months ago. They only know what they're experiencing now. And now I'm still striving to get that next 10%, and the next 10%, and the next 10%, and there's actually this gap a little bit right now — like, well, KD, we're already here, we're three times better.

Speaker 1:

It's like, yeah, but we could be four times better. And it's those little steps people always forget — reps and business leaders as well — like, oh, that's what worked then. They don't remember the differences as they move forward. That's right — the small incremental wins.

Speaker 2:

That stack up, yes, over time. And I think that's the application here: can it get you five or ten percent better? Or are there general applications that are totally fine? So let me ask this to you. I think the cool part is, your team has gone through — we did a six-week accelerator, and we'll have another one kicking off. We launched one in early January, and we'll have another kicking off, I think, late Feb, so we'll link to that. So talk about it: if you want to get involved in this kind of custom-model, custom-GPT world, why is it simple to start, or how would you think about starting?

Speaker 1:

Oh man, I think this is fun, because with my team — so, a quote I come back to all the time, that I read at least once a month: the strength of the leader can become the weakness of the team. Sure. I come back to this quote all the time to remind myself, like, where am I strong that I might be creating a weakness in my team? And if I look at my previous company, I was — oh, whatever, this would be a big word — I was on the forefront of AI. I was starting to learn a lot, but it never translated down to my team well, and I was like, I think I created that gap. I was strong at it, so people were like, oh, KD will figure this out. So at my current company, finally, I had all my leaders go through the course.

Speaker 2:

So this is the first reason why this is easier: there's education out there. It's no longer just you and I talking over here.

Speaker 1:

There's education — that's why it's simpler. But I took 100% novices — they had no context around AI — going through that program, to now having managers building custom GPTs in my org six weeks later. So one, the education's out there; but two, once you know how to do this, you don't need that much to get going.

Speaker 2:

No, not at all. It's not technical. I think a lot of times people hear "custom AI" and they're like, bro, I don't even know how to use it in the first place, and they get overwhelmed. And it's like — it's an LLM.

Speaker 1:

You just use words. You talk to it — almost literally, you explain what you're looking to do. And this is also why it's simple: generally, a lot of the best use cases for custom GPTs are things you're already doing manually. So if you can walk it through the process of what you're trying to automate, it's so much easier, right?

Speaker 2:

Like we did it with our scorecards.

Speaker 1:

We already had the scorecard built, right — and, by the way, there are also custom GPT scorecard builders that help you build the scorecard.

Speaker 1:

But we took the scorecard, we put in what we were looking to do — actually, I'm going to rephrase this: we did not, Gullar did. I played zero role in this at all. Gullar took the scorecards, Gullar went through the program, Gullar built the custom GPT, Gullar tested it, presented it, tweaked it, and now we have a GPT running for all of our calls — yeah, every single one of them being scored. And it was like 10 hours of work. I asked him, how long did it take you to actually do this? He was like, probably 10 to 12 hours total. I asked how fast it would be now. He's like, probably four to five hours.
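The scorecard GPT described here comes down to two pieces: custom instructions that are just the scorecard written out in plain language, and a knowledge base that grows as managers grade calls. Here's a minimal sketch of that structure — every name and rubric item is hypothetical, and a real setup would paste the instructions into a custom GPT rather than run this code:

```python
# Hypothetical scorecard: criterion name -> what the scorer should check.
SCORECARD = {
    "set_agenda": "Did the rep set a clear agenda up front?",
    "discovery_questions": "Did the rep ask open-ended discovery questions?",
    "next_steps": "Did the rep lock in a concrete next step?",
}

def build_instructions(scorecard):
    """Turn the scorecard into plain-language custom instructions."""
    lines = [
        "You are a sales-call scorer. For each criterion below, score the "
        "call from 1 to 5 and give the reason for your score."
    ]
    for name, question in scorecard.items():
        lines.append(f"- {name}: {question}")
    return "\n".join(lines)

def add_graded_example(knowledge_base, transcript, scores, reasons):
    """The feedback loop from the episode: append a manager-graded call so
    future scoring lines up with what the manager considers good."""
    knowledge_base.append(
        {"transcript": transcript, "scores": scores, "reasons": reasons}
    )
    return knowledge_base

instructions = build_instructions(SCORECARD)
kb = []
add_graded_example(
    kb, "…call transcript…",
    {"set_agenda": 5}, {"set_agenda": "Clear agenda in the first minute"},
)
print(instructions)
```

The point of the sketch is that none of this is programming in any deep sense — the "build" Gullar did is writing the rubric down precisely and then feeding graded examples back in.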

Speaker 2:

Think of that — how much time does that save him a month? And now multiply that across all your leaders. So actually, this is the other thing —

Speaker 1:

I love how you brought that up. We're not good at small increments; we're also not good at big numbers. We don't understand them. Right now, for this team in particular, I have 20 active sellers, and this is my inbound org. They're handling anywhere from, God, 60 to 70 opportunities — like demos — a month per rep. And now almost every single one of those is being scored by this GPT, where, at best case, my managers could have done two to three calls per rep every other week. It changes everything. We're talking about hundreds of hours not only being saved but gained. Exactly.

Speaker 2:

Yeah, it's like a save, it's like overproducing.

Speaker 1:

We're getting almost everything scored now and being able to measure. Actually, right before recording, I just got off a one-on-one with one of my managers, and that's what we were talking about. I was like, we need the reps leveraging this more — not just us to them, but them using it as well. And then it's also easier to update, right, which I think is the other thing we didn't talk about in the first portion: you've got to keep improving it. Even if you start generic, it should never stay generic. You have to continually feed it more information. So — actually, I'll throw that back to you: if you look at these companies, how do they keep them up to date? Because I think for a lot of people, these generic ones are a flash — they launch it and it goes,

Speaker 2:

Yeah, it's like it's decent Woo yeah.

Speaker 1:

And then it dies down. Like, how do you maintain it? How do you keep it going?

Speaker 2:

Yeah. And I go like, okay, you've done it — how many of you then kind of gave up for a little bit? And everyone keeps their hands up, right? Or how many of you never went back? So I think, leaders, you've just got to believe these numbers. As we think about the future of scaling teams, everybody is doing more with less, right? And I was talking to one of our private equity partners on Tuesday, and I said, look, your leadership team has never been asked to actually care —

Speaker 2:

— or look at rep productivity. Rep productivity isn't something that most people even track. The reality is, the way most orgs know how to scale is: you add this amount of people with this amount of quota. That's been the play for a long time. And so the idea of rep productivity — look, me as a leader, I need to learn this stuff. I need to be an expert. Again, most of them don't track rep productivity, and if you're not tracking that, that is wild, KD. I would guess 90% of sales leaders I talk to don't look at rep productivity.

Speaker 2:

That is insane. Yeah, they don't look at it, because of how they look at their day. It's simple and easy to just say: X amount of quota waterfalled up to your number, with X amount of attrition, equals this. I mean, look, I built a forecast model like that in 2012, and that's just how they've learned to do it. So part of this is just understanding that you, as a leader, are responsible for finding ways to increase rep productivity across the org. And the answer is, you can provide these custom solutions. I mean, just think about that.

Speaker 2:

This is the first technology where — usually, when you're able to do more, the quality degrades massively. Look at your outbound strategies, perfect example. This is the first where not only is it more, it's a higher-quality output. And so I think there has to be a belief at the top that we need to do this, that this is something we're going to stick with — not something we're just going to try. Because, again, it's our job now as CEOs, CROs, VPs to teach our people — and that means you have to learn it as well — that this is a new way to problem-solve. And we've talked about this in past episodes: there's a change-management lift around getting the org rallied around, like, we're going to continue to make this investment and stick with it.

Speaker 2:

You know, once you've done that, you've built the rhythm. As you think about this crawl-walk-run, it's like: great, this quarter, here are the biggest bottlenecks and time sucks. Is there a solution we could customize? What's the potential impact of that? Okay, great — so we'll build this one, and we'll maintain it.

Speaker 2:

Again, you don't need to be a programmer to do this. All you need to understand is some very basic things — and I'm sure Brian did a course on this. What we'll maybe do is link to the episode or course where Brian talks about how to write good custom instructions for a custom GPT. So if we've got something, we'll link to it. But the cost to maintain is really just the consistent updating — almost like a product. You're creating these little mini products, and it's a few hours a month, but again, you don't need a technical person to do it. And then the last thing I'll say is: you don't need to try to do it all at once. You can just pick one or two. But I think if you don't believe that you have to do it, and that you have to maintain these things, you're going to be in trouble.

Speaker 2:

And then the other piece is: the technology is just getting so good, so fast. So even when you think you've mastered it — well, then ChatGPT adds citing its sources. Okay, Perplexity — this is a big one — a few months ago now, Perplexity started adding source citations in their API. They're just evolving. So again, if you as a leader aren't taking the time to understand what's possible — in sales, we're not used to moving in these fast cycles of rapid change. But you can make it simple. Getting started is relatively easy: just get a basic education. And to maintain, you can just pick one or two use cases and keep this consistent optimization rhythm. I think that's what a lot of people need.

Speaker 1:

I want to read this — actually, this is from Gullar, my manager, to one of my VPs, Jeff. We built another one for our other sales org, right — we're going org by org by org. I just want to read it, because I'm so proud of my team right now. I just love this. So Jeff got it, they've been communicating on it, and then Gullar messaged Jeff that he can make it even better if Jeff scores five to ten calls with the scorecard and provides the reasons why he scored the way he did — we can feed that back into the knowledge base and make it even stronger. I was like, oh, that's all you have to do? It's such a good example.

Speaker 1:

Okay, so he scores seven calls, gives feedback on why he scored the way he did, we fed it back in — even better. And what's actually interesting — I was talking about this with them this week — even where it's not perfect (you mentioned like 80-90%), what's wild is it is almost never wrong on the misses. If a rep didn't do something well, it's almost never wrong. Just every once in a while it thinks the rep did something well that it didn't. But that's the 10%.

Speaker 1:

But because it gives the reason, we can see, like, oh wait — no, that wasn't good — and we can address it and feed it back, so it's just ongoing. It's wild: if you have a good scorecard in place, it catches it if a rep did not do something. Sometimes it says, oh yeah, that was a great bucket question, and you're like, no, it wasn't, and here's why — and we feed that back. It started generic, and it has now become custom. So it's just —

Speaker 2:

There are so many use cases here. Yeah — and again, even spending a little bit of time on this one. The other thing: think about the positive. If I went and listened to this manager's team versus that manager's team versus another manager's team, how different would what they consider quality be? And that's not good. There is a right and probably a wrong way to run a call at your company. And if you're letting four or five different managers each define it, you're not creating a center of excellence for what it means to be a great rep at your company. You have to be able to capture these best practices where everybody can have insight into them. Look — we know if people execute this, this is the right way to do it. So you're also creating a quality uniformity of proven processes, which I think is also, you know —

Speaker 1:

I always love it when people go down this road: "But Jake, I'm not going to turn my people into robots, right? There's 10 different ways to do this. I need to let go." I was like, oh — have you ever been to a Michelin-star restaurant? A five-star restaurant?

Speaker 2:

Yeah.

Speaker 1:

Do they let their cooks all cook?

Speaker 2:

different. I'm an artiste.

Speaker 1:

I'm an artiste. No — the recipe is the recipe. They might add some flair, but the recipe is the recipe. And there are recipes hiding within orgs that they just don't realize. Like, Jake is our top performer — here's what he does differently than the rest, and that's repeatable, right? The acronym I say a lot right now is "wiggle" — WGLL, what good looks like. And I actually had someone ask me once, well, why isn't it what greatness looks like? Why is it what good looks like versus what great looks like? I actually have a reason for that. I believe everyone can be good. I don't necessarily believe everyone can be great. Agreed — there are some nuances, some natural abilities, some behavioral traits that separate that top 5%,

Speaker 1:

1% — 100% agree, that's greatness. But I believe I can get everybody good.

Speaker 2:

My saying is: I can create an army of B players. It's the extra that makes you an A. But if you just do these things, you're not going to lose — you will succeed, you'll be good, and you can earn your way to greatness.

Speaker 1:

But there's greatness hiding in every org that they just don't capture, out of fear of feeling prescriptive. I just had a message today: "Well, how do I do this without restricting my reps?" I was like, this is not about restricting — this is about uplifting your reps. Here's what needs to be accomplished in discovery; how you accomplish it, cool.

Speaker 2:

We've got to do this part first, this part second — there's a framework, and as long as we do this. But I'd also stop people from skipping around. I was very fortunate: I was like 26, 27 when I went to a company that kind of taught me this, and I didn't believe it was a thing — the science of the sale. I thought I was a natural seller, and I was very interested in psychology, and I was struggling. I struggled my first few months. Maybe I've told this story before: my boss pulls me in as the second-to-last person to have sold anything. I'm like, what the hell? I'm God's gift to sales — why haven't I closed anything?

Speaker 2:

And my director, my boss's boss, had listened to my call and he goes, Jake, why aren't you following the roadmap, the script? I go, the script? I am Jake Dunlap. He goes, Jake, let me ask you this: do you think we're stupid? Do you think we'd train a thousand people on a process that does not work? And I go, hmm, well, probably not. I closed sixty-three thousand dollars in new business in the next month, and I was just like, holy crap, this is a thing. It was actually very freeing, because I had the best-practices roadmap.

Speaker 2:

I didn't have to think about my next question. I didn't have to think; I could just show up and execute. And I've got a similar analogy to yours. I'm like, okay, do you think the best movies in the world are made with no script and just a quality actor? So Anthony Hopkins reads the script, but you're better than Anthony Hopkins? You're better than a two-star Michelin restaurant? You are that damn good? And so, as we're talking about these things, all of them are important. That's why you should be documenting: so you can give the AI context.

Speaker 1:

And we'll part with some ideas here. One: let AI interview you. If you're like, well, I don't have all this stuff, let it interview you. I want you to ask me questions to better understand my deal process. I want you to ask me... Such a good application.

Speaker 1:

Just let it interview you. Get the context out of your head and down on paper, and now you have something to feed it. And if you're really taking this seriously... I'm in spreadsheet hell right now doing 2025 planning, I'm all up in it. The very first additional auxiliary hire I listed, I'm calling it a GTM engineer, or an internal AI role. I just want to hire one person to own this and drive this, because there's so much that could be done. I'm creating a budget for this. I want this. Now, that's no excuse for me not to understand it.

Speaker 1:

I'm still going to try to stay on top of it, but leaders out there, make budget for this. Get someone in. Work with an agency, a consultant, someone who knows this, and bring it into your org. Otherwise you're going to miss. You just are.

Speaker 2:

Well, that's it. I mean, you're doing exactly what we're talking about: you're staying up to speed on it, so you see the use cases that unlock, and I think that's what anybody should do. So for me, man, there are some applications for generic AI, but I think all these things are related. The quality of your context is going to help. If you don't have context, and you really don't want to create the bandwidth to create context, well then, just keep using generic.

Speaker 2:

But if you see the opportunity, do what KD said: hey, you're a CEO, you're scaling initially, or you're a bigger company. Just, again: interview me here. Ask me 10 questions in order to help me create a buyer persona document, and I will answer your questions. And now the cool part is, with the mobile app, you can do it all voice to voice. You can do it in your car, and you can use custom GPTs in the mobile app too, which is a lot of fun. So, all right, man. Closing thoughts?

Speaker 1:

I mean, at the end of the day, create the space, create the budget for this, but get started. Don't let greatness prevent goodness. People will be like, well, I need to do it perfectly. Get started. If all it takes to get started is generic, start generic, but then build it to custom. But don't wait. I can't remember the quote, right? Like, the pursuit of perfection gets in the way of greatness. Where it's like, oh, if I don't do it perfectly, it's not worth it.

Speaker 1:

Get started and then build towards custom, but this is not optional anymore.

Speaker 2:

This is something that has to be done, and again, you can go do it. You heard this; we've got some education. In the show notes we'll put a link to some of the custom GPTs that we've built that you can customize from there. We just closed our biggest deal with those as well. We've got a sales team of 24 now that we're building out, and we're basically just taking these custom GPTs and customizing them to their business, their personas, et cetera. But I would encourage all of you to make your own too. We've got a link to that training; I'll drop it in there.

Speaker 2:

Also, make sure everyone's subscribed. We've got an AI-Powered Seller newsletter, 100% free, just sign up. One of our more technical people pays attention to things like Perplexity adding quotes, adding the citations. I didn't know that; that was Brian. So subscribe to the newsletter, check out some of the custom GPTs. For anybody out there, if you have any questions, always DM me or KD, or drop a comment on YouTube if you're watching this there. And yeah, man, this is a good topic. I'm looking forward to the next one, and we'll see you on the next episode, everybody. Sounds good, man.
