AI-Powered Seller

How OpenAI's Sales Reps Actually Use AI to Close Deals

Jake Dunlap

Connor from OpenAI shares how revenue teams are leveraging generative AI to transform their sales processes and achieve meaningful business outcomes beyond the initial experimentation phase. He outlines a clear pattern among successful organizations: democratizing technology access, investing in AI education, and measuring impact through improved business metrics rather than just time savings.

• Three myths slow AI adoption: needing a massive project, assuming availability equals adoption, and waiting for AI to mature
• Deep research capabilities fundamentally change how sales teams prepare, turning hours of work into minutes
• Sales leaders create personalized "user guides" for team members to help AI tailor coaching approaches
• Projects and custom GPTs serve as "homes" for deal information, enabling better handoffs and customer experiences
• Successful executives sponsor AI initiatives, demonstrate personal usage, and recognize AI champions
• Most powerful implementations progress from individual impact to team impact to organizational transformation
• Future of AI includes "agentic orchestration" where AI becomes the always-on assistant for all work activities
• Frontline sellers will need increased technical aptitude as AI becomes more integrated into sales processes

Start using AI today rather than waiting for perfect understanding—dedicate time to learn this technology as it represents the future of sales. Try one use case at a time and share what you're learning with your team.


Speaker 1:

What are some of the cool ways that you are seeing frontline leaders leverage this to work with the team, not just to help close deals but maybe to develop people, et cetera?

Speaker 2:

Everything that I mentioned can be really valuable for a frontline or second-line leader. You're increasingly able to personalize ChatGPT. Where I have seen leaders take this to the next level is how they also help it understand your team. For example, I have my team do user guides. With every member of my team, whether they're a frontline leader or an IC, we understand how they like to work. We understand their working hours, their preferences, their communication styles. You can put as much of that in as you want, whether this lives in a GPT and so on. Suddenly you sort of have a version of that individual through which you can ask ChatGPT to support communication to them. You can ask it to gut-check the coaching that you're providing, and as long as you're continually feeding it the feedback you're giving, the conversations you're having, the stellar performance you're seeing, or the areas of growth and coaching you want to provide, it stays current on the people you're responsible for.

Speaker 1:

Where do you think we're headed? What do you think people are going to be doing around gen AI over the next six months, and what's going to be possible in the next six to twelve months?

Speaker 2:

What I will say is: we're seeing some suggestions of it already, an AI-powered seller.

Speaker 1:

Connor, I'm pumped for the conversation here, so why don't you tell people a little bit about your background and the role that you play at OpenAI?

Speaker 2:

Yeah, sure, and excited to be here. Thanks for having me. So I lead our mid-market enterprise sales team at OpenAI. It's a pretty big segment by go-to-market standards; it ranges from smaller-sized businesses all the way up to the lower end of true-blue enterprise companies. We have two products: we sell ChatGPT Enterprise, and we have our API. I guess the way I think about our team's role is that we translate the capability of these products into value for our customers. So that's what I do at OpenAI, happy to expand on that role.

Speaker 2:

The last 10 years have mostly been spent at early-stage companies. I was fortunate enough to start out as an SDR at a company that I still think is pretty world class, AppFolio; I cut my teeth there. Between that entry point and where I am today, I worked for about three different startups, some for close to five years, like Productboard, and others not as long; they didn't quite have the legs to go the distance. We've all seen that movie. Where I've really enjoyed spending my time is acting in that role of translator, working really closely with product teams, and, especially as it relates to OpenAI, our folks in research and applied, then understanding how we can make the foundational work they're doing understandable and valuable for our customers. I joined OpenAI just about two years ago. The go-to-market team at that time was probably fewer than 20 people, maybe fewer than 15, whereas today I think we're pushing 500.

Speaker 1:

That's awesome. Yeah, that's good. It's a little different from "maybe it'll work out, maybe it won't." It's more like, I'm pretty sure this is a safe bet, I'm pretty sure this is going to be a thing. Well, that's great, man. And I have to imagine, I mean for you guys in particular.

Speaker 1:

You know, how much of your work and sales team is that translation? Because my first set of questions is really around how revenue teams and go-to-market are actually implementing generative AI today. What I see is still a lot of wild, wild west, right? It's like, oh yeah, these people have some licenses to this, and those people have some licenses to that. So what are you seeing from, call it, best in class? What are the companies around go-to-market in particular doing? Have they identified pain points? It's not all about AI. What are the things you see from the people that get it and are running forward toward some type of solution?

Speaker 2:

Yeah, that's sort of the question. Thankfully, a year ago I would probably have told you we were still identifying exactly what that looks like, but now I think we have a much better sense. We've moved from the era of experimentation and pilots, of "is this hype, is this not," into real-world deployments, and that shift feels very stark.

Speaker 2:

And what I have noticed when it comes to the workforces that, to use your words, really get it, is this concept around the importance of having an organization that is AI-fluent or AI-enabled, and they view it in very clear steps. The first is really just around the technology: they know they need to give access to it. This is table stakes, and it goes to your point. It goes beyond "oh, we know this is available, so our team is using it," or "this was turned on for free by insert-cloud-provider, so our team is using it." Rather, there's a lot of intention behind making sure they are democratizing access to the technology across the org, first and foremost.

Speaker 2:

The second thing I noticed is that they recognize the criticality of being enabled and educated. There is some type of curriculum, whether external or internal; they are leaning in to the level of education required to get to, at minimum, a 101 level of AI fluency across the organization. And then there's an appreciation that goes beyond time savings in terms of the recognition of value, the business justification for it. What are the metrics? It goes beyond access and education: how are we actually measuring the efficacy or the value of this work?

Speaker 1:

Connor, real quick, when you're talking about that last one, because that resonates with me a lot: is it about usage, or are you guys trying to quantify business impact too? Or maybe it's both?

Speaker 2:

It is definitely both. Whether this is in the context of pilots that we run or in a version of QBRs, it's always really important to help stakeholders, the buyers, the exec team, understand what that usage looks like, what the utilization is, and that itself has nuance. It goes beyond "yeah, maybe it's being used daily": in what way, and what models are being used? Which teams are using something like deep research, for example, or some of the more advanced models, so you can back into those use cases? But that only goes so far.

Speaker 2:

I think it's very reasonable for an exec to say: that's great if we're saving time, but where is that time being reinvested? Is that higher-value work being done? It's a very common, very reasonable question, and so we then have to go that extra distance to help map that back to the things these businesses care about. Sometimes that's more straightforward, and other times it's not, right? This is very new and the technology is advancing very quickly, so at minimum there needs to be an understanding of how teams are rethinking the way they do work with AI-first principles in mind.

Speaker 1:

Yeah, and for many leaders out there, I want to ask you the question of what's stopping people.

Speaker 1:

What we've seen, and the saying I use frequently is this: look, I think a lot of people get generative AI wrong.

Speaker 1:

They're treating it like we've got to have the perfect solution across every role, and it's like, guys, no, you don't. Generative AI is a department-by-department, use-case-based deployment, and I think so many leaders just let it get too big. A lot of what I feel like I've been doing lately is saying: stop. Yes, we're going to be able to do all this stuff, but we're going to start with this group. We're going to solve this challenge they're having on pipeline generation, deal evaluation, whatever it might be, and just get them to focus there, because I feel like there are very few people who get it big picture. And the other issue I see hurting deployments at times is that sometimes IT is also behind on the art of the possible. Especially when it comes to ChatGPT, it's like, well, what about our data?

Speaker 1:

It's like, guys, that's been solved for like two years now. So again, I want to hear from you what you feel is stopping people. For me, it's that they're making it too big. It's like trying to say "what's our internet strategy?" in 1996, and the answer is, it depends on this person or that person. So I'd love to hear what you're seeing around what's holding people back.

Speaker 2:

I think you said it pretty well. We see something similar in the conversations we're having. I often characterize it, or we often characterize it, as three myths that slow companies down. The first, you've touched on: that we need to have this big, massive AI project, and that is the only way we're going to succeed. What's our big bet? That's not to say you shouldn't have one, but many view it as the singular way they'll be able to prove that it's valuable.

Speaker 2:

I think the second, and we did mention this for a moment there, is the assumption that general accessibility leads to adoption: that simply by having it there, inviting your employees to raise their hand if they have a very clear, maybe big project to use it for, and giving them access, you've done the thing that's needed. And the third is that there's this waiting game until AI, quote-unquote, matures. Those three things are probably where we've seen companies get really hung up, on one or all of them. In reality, if you think about the first one, the misconception that you need this big AI project, the truth is that AI literacy speeds up time to value and gets you closer to that big bet you want.

Speaker 2:

The more you put this technology in the hands of your employees and just have them start using it, the more likely it is that you'll (a) discover use cases you hadn't thought of previously, because you have subject matter experts using it for the stuff they do best, but also (b) get a much clearer picture of the rough edges of current usage, right? If you're using ChatGPT, maybe you're bumping up against, you mentioned Operator before we started, the limitations of the way you might be using it today, and then what extra step you have to take to see a more sophisticated use case. As for accessibility, adoption depends on the usefulness of the tool itself. And finally, when it comes to waiting for AI to mature: first movers are seeing impact, and I know this is something you're seeing every day, so you really can't afford to wait, frankly.

Speaker 1:

I like that. I wrote it down: the three myths of gen AI deployment, call it, or accessibility. Then let's get tactical and maybe go through the go-to-market roles, right? You have the vantage point of both being a sales leader and helping sales leaders and go-to-market teams implement. So let's start with the front lines. What are maybe the top one or two use cases where, when you're talking to a sales leader, you say: look, I know you want to do all these things, but here are one or two for frontline salespeople or frontline go-to-market folks. What are the top few "start heres" that you typically hear or see?

Speaker 2:

I think I'm deciding between more than a few. Give me a second.

Speaker 1:

Oh yeah, go ahead.

Speaker 2:

My mind is going a lot of different places, but I will say some of the first ones are probably obvious to you, maybe not obvious to everybody: they're productivity-driven use cases, right? These are things around first mapping out the areas within your workflow that you understand are ripe for AI usage. And this comes down to the things LLMs are really good at: writing, summarization, data analysis, and research. I'll be specific about one. I think deep research has fundamentally changed the way that our team, and certainly that of many other go-to-market organizations, does its job.

Speaker 2:

Certainly in the world of sales, what used to take us hours, especially for enterprise reps who are genuinely doing the homework and showing up with a very sharp point of view and a very deep understanding of their customer's business.

Speaker 2:

This was a labor of love for the best of the best, and suddenly you have deep research at your fingertips and you're doing something in minutes that could take hours or days. I'm seeing the account directors, as we call them, on our team leverage this in all sorts of creative ways: not only to prepare for these conversations, but also to do some of the work on behalf of their customers to show them the art of the possible, and that in and of itself is a form of selling. When they deeply understand the problems customers are trying to solve and use our technology to take some early steps toward solving them, you suddenly have asynchronous demos taking place: they can send research reports that are valuable to the customers before you even have a conversation with them.

Speaker 1:

I love that. Yeah, are they sending them? Again, we ran a play with our private equity partners where it was basically creating white papers on trends in the company they just invested in: an eight-page white paper on here are the trends, the headwinds this company is going to face based on their market, and where we think we can help. And it crushed, right? It's such a no-brainer. That's why outbound is so frustrating right now, I feel like, because, to your point, the level of richness and insight and specificity that I can get is higher than it's probably ever been.

Speaker 1:

But I feel like so many companies, when it comes to outbound, have gone the other way, right? It's like, how can we do more and automate everything? And there are certainly things we can automate. So okay, productivity, kind of number one. I think we're seeing the same thing. What are some other use cases you're seeing on the front line? You talked about deep research being one; any other frontline sales or account management use cases that you're seeing across clients that are successful?

Speaker 2:

I think another, and I'll give a simple version and then maybe a more sophisticated version. The simple version is either a GPT or a project through which you funnel the types of internal data and information you might use, think call transcripts, think notes you've taken, and just feed these in so that you can create a much more holistic picture of the work that needs to be done for a strategic account, then break that out into a very specific project plan. But you're also using ChatGPT as sort of the daily assistant: to hold you accountable to that plan, and to do a lot of the work in the daily steps you take to create a very valuable, differentiated experience for the customer. This might be the way you leverage deep research; it could be how you analyze a lot of the different conversations you're having, so it can challenge you on blind spots that may exist, or the areas within a deal where you're exposed.

Speaker 2:

So much of what we do as sales leaders is try to understand, in these engagements, where the exposure areas are. What have we missed? And ChatGPT is actually really good at that; the more context you feed it, the better. Projects have been a really good place where I've seen my team and other go-to-market organizations create a home for all of this. I have folks pinging me all the time asking when we'll be able to share these across teams; that's something our org is working on. But that's another really good example of creating homes for each account, which you can continually bolster with context and information, and then use as a vehicle for ongoing coaching, actions to be taken, project management, and some of the customer-facing content you create for a deck or a proposal.

Speaker 1:

Love that. Yeah, that's a great one. And so you guys are using projects within ChatGPT. Okay, cool, awesome.

Speaker 2:

Yeah, I love that. Projects and GPTs.

Speaker 1:

Yeah, and I think that's the future a lot of people need to think about. Imagine you have this through line; imagine the beautiful handoff processes that should be happening, right? Everything is there, all in one place, and you can query it. If you have a setup with SDRs, salespeople, and account managers, imagine how much this is going to improve the customer experience. I'm not repeating myself 50 times; once I close the deal, the account manager is immediately up to speed on my business. That is a true game changer.

Speaker 1:

And we built a custom GPT called Competitive Deal Winner. You upload your transcript, or whatever it might be, and it'll help to say: hey, typically your deals have these five people involved, and I don't hear any of those people on this call. Compared to the competition: concede this point, focus on this point. Right? The ability to stress-test some of these as a frontline rep is a no-brainer. The alternative used to be, and Connor, you remember this, you would sit up at night and play out the scenarios: okay, this person needs to be involved, and then we do this.

Speaker 1:

But now, by creating a project and using different GPTs, you have your own, to your point. We use the word "assistants" a lot because we feel like it resonates: each role has its own little two or three buddies sitting there helping you through your day, and within ChatGPT I can @-mention the next buddy and bring it into the conversation, then @-mention the next one. So for anybody in sales, whether you're on the front lines or in leadership, I think this concept of creating a home for a deal in a project is kind of a no-brainer. So I love that use case.

Speaker 1:

Let's go to the leaders now. Okay, we'll get to executive leaders next, but let's talk frontline leaders. What are some of the cool ways that you are seeing frontline leaders, or director-level leaders, leverage this to work with the team, not just to help close deals, but maybe to develop people, et cetera?

Speaker 2:

Yeah. The use cases will blend, and in some cases really stand out as being very singular and specific to roles. Everything that I mentioned can be really valuable for a frontline or second-line leader in the way they support their team. But as you start to think about it: you're increasingly able to personalize something like ChatGPT, so it becomes more and more valuable in the way it understands your body of work, what your function is, how you communicate. Where I have seen leaders take this to the next level is how they also help ChatGPT, or whatever it is you're using, understand your team. For example, I have my team do user guides. With every member of my team, whether they're a frontline leader or an IC, we understand how they like to work. We understand their working hours, their preferences, their communication styles. You can put as much of that in as you want, whether this lives in a GPT and so on. And suddenly you sort of have a version of that individual through which you can ask ChatGPT to support communication to them. You can ask it to gut-check the coaching you're providing, as long as you're continually feeding it the feedback you're giving, the conversations you're having, the stellar performance you're seeing, or the areas of growth and coaching you want to provide.

Speaker 2:

You now suddenly have this running list that, A, is valuable in the day-to-day conversations you have; you can consult it before one-on-ones. But then it's mid-year check-ins and reviews, or you're doing annual performance reviews, and you have this rich history and context of your relationship, one that understands not only the work you've done up to this point but also the individual whose career growth and progress you're responsible for. Not only does it save a ton of time, and we all know how much time we spend on performance reviews, but it's substantive, because it's backed by that rich data set and context. So that's one of the areas I think has been really, really cool and super valuable, just from a people-development standpoint.

Speaker 1:

That is dope. Yeah, can you explain to everybody what a user guide is, just in case they're not familiar?

Speaker 2:

Yeah, in our case it's quite simple. We initially did it in Notion and we've since moved it; it might now live in individuals' GPTs or projects. But these user guides are a template you provide to your team that asks them basic, or increasingly complex, questions that help you understand who they are as individuals, as human beings, not just workers, and also how they like to work: what are their preferred working hours, how do they like to receive or give feedback, what are their communication styles. You can get into personality tests and things like that and incorporate that data if you like. But these are really good ways to cut through the noise and understand, at a deeper level, who the human being is on the other side of the desk from you, so that, as a leader, you can tailor your coaching accordingly at the individual level.

Speaker 1:

Yeah, I'm definitely stealing this idea. And are you then creating, if you're a frontline leader out there listening to this, individual custom GPTs for each person on your team, or is it a team-wide one?

Speaker 2:

ChatGPT will get so much better at this at the universal level when you think about it as the orchestration layer. But in order to keep it really clean and consistent, I would recommend GPTs for now, because you can put those user guides into the custom instructions and then continue to add to them. The same goes for feedback loops and so on, and suddenly it's just your partner in the process of being a more effective leader.
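For readers who want to picture the mechanics, here is a minimal sketch of the idea described above: rendering a team member's user guide into custom-instruction text that could be pasted into a per-person GPT. The field names, wording, and structure are illustrative assumptions, not an OpenAI template.

```python
# Hypothetical sketch: turn a "user guide" (working hours, communication
# style, feedback preferences) into custom-instruction text for a GPT.
# All field names here are made up for illustration.

def render_user_guide(guide: dict) -> str:
    """Render a user-guide dict as custom-instruction text."""
    lines = [
        f"You are a coaching assistant for {guide['name']} ({guide['role']}).",
        f"Working hours: {guide['working_hours']}.",
        f"Communication style: {guide['communication_style']}.",
        f"Feedback preference: {guide['feedback_preference']}.",
        "When asked to gut-check coaching or draft a message to this person, "
        "match their preferred style and flag anything that conflicts with it.",
    ]
    return "\n".join(lines)

guide = {
    "name": "Alex",
    "role": "IC, mid-market AE",
    "working_hours": "9am-5pm ET",
    "communication_style": "direct; bullet points over long prose",
    "feedback_preference": "written first, then a live conversation",
}
instructions = render_user_guide(guide)
```

As the guide grows (new feedback, one-on-one transcripts), you would re-render and update the GPT's instructions, which is the "continue to add to it" loop described above.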

Speaker 1:

Oh my gosh, I cannot wait to talk to people about this use case. It's such a no-brainer. We talk a lot about how a rep can use, say, your version of a good-discovery-call GPT, putting their call recordings through it and presenting that to you in your one-on-one. But this, to me, goes further as a leader. We like to think our minds are a steel trap, but let's be honest: as soon as we finish a one-on-one, we're off to the next thing. This ability for me as a leader to have a running dialogue, hey, we just finished our goals one-on-one, put the transcript in there and upload it, that is gold. Absolute gold.

Speaker 1:

In my opinion, anyway. Any other gems around frontline leadership that you're seeing, that you feel are moving the needle? I love this one. I'm also thinking we'll create a user-guide GPT that helps people fill out the user guide, and then the output can be your knowledge doc.

Speaker 2:

That's exactly right. I think before we even started recording, we talked about how hard it is to break habits. Starting somewhere like a doc or Notion seemed obvious at the time, but of course you should start with ChatGPT; you have to remind yourself of that.

Speaker 2:

I think the other would just be, on a number of different levels, taking advantage of data analysis. As a frontline leader, let's talk about that: you might already be using something like conversation intelligence, a Gong or a Chorus or any number of the ones out there, and you can have that.

Speaker 2:

But you can take that many layers deeper.

Speaker 2:

You can also take the information you have around team performance, use that plus deep research to understand the competitive landscape, and compare it against what you're encountering in the field. This can be information around your own history and beyond. And maybe we'll talk about this if we have time: this starts to get more sophisticated as frontline leaders, working with cross-functional partners, start to work with the technology in a rawer form, maybe through the API, to think through replicating a lot of the work that an SDR once did, all the way from enrichment to routing to outreach and beyond. We have certainly moved past what that function traditionally did and rely on our technology to do it. Frontline managers have a role to play here, because they understand very deeply and intuitively what these roles once performed, and so they're now as much architecting the responses, the quality, and the nature of the work happening through the technology as they are actually working with the evolved roles themselves.
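The enrichment-to-routing-to-outreach flow mentioned above can be sketched as a simple pipeline of swappable stages. This is a hypothetical skeleton, not OpenAI's implementation: each stage here is a stub (the enrichment values and routing rule are placeholders, and a real outreach stage would call a model through the API rather than format a string).

```python
# Hypothetical skeleton of an SDR-style pipeline: enrich -> route -> outreach.
# Every stage is a stub so the shape of the pipeline is clear.

from dataclasses import dataclass, field

@dataclass
class Lead:
    company: str
    contact: str
    notes: dict = field(default_factory=dict)

def enrich(lead: Lead) -> Lead:
    # Stand-in for firmographic enrichment (size, industry, tech stack).
    lead.notes["segment"] = "mid-market"  # placeholder value
    return lead

def route(lead: Lead) -> str:
    # Simple rule-based routing; in practice this could be model-assisted.
    if lead.notes.get("segment") == "enterprise":
        return "enterprise_team"
    return "mid_market_team"

def draft_outreach(lead: Lead) -> str:
    # In a real pipeline, this is where an API call to a model would
    # generate the personalized draft; here it is a template stub.
    return (f"Hi {lead.contact}, noticed {lead.company} fits our "
            f"{lead.notes['segment']} segment...")

def run_pipeline(lead: Lead) -> dict:
    lead = enrich(lead)
    return {"owner": route(lead), "draft": draft_outreach(lead)}

result = run_pipeline(Lead(company="Acme", contact="Dana"))
```

The point the speaker makes is that frontline managers are well placed to define what "good" looks like at each stage, because they know what the SDR role actually did.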

Speaker 1:

All right, let's talk about this, because it's a good transition. I also want to talk about senior leaders and execs in the go-to-market, but first, more about data and analysis, right? If you had to put yourself in the shoes of a frontline leader or director, what are one or two use cases around data analysis that ChatGPT is really good at now?

Speaker 1:

Are there a couple that come to mind, like, okay, definitely this? And also the flip side: are there areas where you're like, look, it's not there yet, and here's why? Are there a couple of tactical data-analysis examples? Because the call transcripts and noticing patterns, I was just talking to one of our partners about this earlier today, and it's like, great: with the API I can take all the Gong recordings, like from Jake Dunlap, run them through my call-score GPT, and it turns back, here are the trends for Jake. And it crushes. So good. So that's one. But when I've tried to do more detailed data trend analysis on large amounts of data across reps, are we there yet? Where is that level of being able to use a custom GPT, or even the API, to really spot those meaty trends in individual performance?

Speaker 2:

I think we are there, and I think it will get better. It often, and I know you probably understand this deeply, just comes down to the work you're able to do with the data itself: the quality, the nature, and the categorization of the data. We're at a point now where an FLM, if they don't have these skills developed yet, may not be able to take all of this data in its unstructured form, supply it, and get the type of analysis and insights that would really move the needle. However, if they have a cross-functional partner who can help get that data into a state where the model can interpret it much more easily, suddenly you have something there. So it's a little less about what the model can do from a performance standpoint, and more about what it takes to get the data into a state the model can make sense of.
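To make the "get the data into shape first" point concrete, here is a small sketch: before asking a model to analyze a pile of call recordings, you would first parse raw transcripts into structured per-speaker turns and compute simple features (like talk share) the analysis prompt can cite. The transcript format and function names are assumptions for illustration.

```python
# Hypothetical pre-processing for call-transcript analysis: structure the
# raw text before any model sees it. Format assumed: "Speaker: text" lines.

def parse_transcript(raw: str) -> list[dict]:
    """Split 'Speaker: text' lines into structured turns."""
    turns = []
    for line in raw.strip().splitlines():
        if ":" not in line:
            continue  # skip lines that don't look like a speaker turn
        speaker, text = line.split(":", 1)
        turns.append({"speaker": speaker.strip(), "text": text.strip()})
    return turns

def talk_share(turns: list[dict]) -> dict:
    """Word-count share per speaker, a feature a scoring prompt can cite."""
    counts: dict = {}
    for t in turns:
        counts[t["speaker"]] = counts.get(t["speaker"], 0) + len(t["text"].split())
    total = sum(counts.values()) or 1
    return {s: round(n / total, 2) for s, n in counts.items()}

raw = """Rep: Thanks for making time today.
Buyer: Of course. We are evaluating three vendors right now."""
turns = parse_transcript(raw)
share = talk_share(turns)
```

With the turns and features structured like this, batching them across reps and sending them to a model (through the API or a custom GPT) becomes the easy part, which is the cross-functional-partner work the speaker describes.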

Speaker 1:

Yeah, that's it. I think that last part you said is critical: don't throw the baby out with the bathwater. It might not be giving you the insights because of the way the data is structured. It's pretty good at taking a lot of unstructured points, but we've had to learn that lesson the hard way a few times, asking why can't we get it to look at this data point? And then we figure out, oh, we need to feed it this way, or let's focus on one or two people at a time, or something like that.

Speaker 2:

Totally. And, to be fair, companies have built up this massive amount of internal intelligence: they have data, tools, insights, research. In many cases it's the most valuable asset they have, but most employees can't access it easily. We've had this whole job category of data scientists just to help solve this problem, but that data is unstructured or hard to find.

Speaker 2:

And I mention all this because I think, until recently, and still to a certain extent, you have a lot of knowledge workers trying to work within their specific domain or function who then feel underwhelmed when they go into ChatGPT, through a lack of understanding or training on how to give the right context to the models to get valuable results. They feel like, this is too generic, or this is not relevant to my job, and then they go away. And that's fair. It's hard to go in with expectations, see a result that's lackluster, and then suddenly you're like, this is not for me. That goes back to our initial comments around the steps that are really important to take to support adoption at scale. It certainly isn't just plug and play. In some cases it is, but in most cases there are those extra steps that really make the difference.

Speaker 1:

Yeah, I think that's such a good call-out, and we're seeing it. We did a session in January with one of our clients, who's got about 400 sales reps, and I said, how many of you use ChatGPT or something similar every day, or at least weekly? Probably 90% of hands went up. Next question: how many of you have had any training from your company on how to use it? Every hand goes down. And to your point, if you're not enabling your team, or obviously partnering with a company like ours, shameless plug, to get them the education, then, your team again...

Speaker 1:

We are used to talking to machines in five words of broken English. We're so used to talking to machines where, if I give it too much, Google quits: no search results. So we're reprogramming ourselves, and I think this is important for anybody in leadership. It is your job as a leader. This is like when companies had to lead their employees through the internet revolution and had to train them. They had to train Brenda how to send an email and how to use Google. And I think employers really need to realize you have to invest in this. This is a transformation in the way we solve problems as humans, and it's not an easy one, because we're unlearning.

Speaker 1:

For me, it's 20-plus years of behavior that led to me being successful, and now I've got to reprogram that to solve problems in a foundationally different way. So I think it's such a good call-out around making sure your teams are being enabled to learn how to use the tools, and that it's your job as a company to do that. So, all right, I've got a couple more here. Let's talk exec level. If I'm an exec, a CRO, or even a CEO, but go-to-market focused, what are some of the ahas that, when leaders implement this, make them go, oh my God, my life is so much easier now that I'm doing X? I think there are a few things.

Speaker 2:

I'll begin with the primitives, and we can always expand to how we think about leveraging the technology at the company level to reimagine workflows, and how that might translate to a very new experience or a very important outcome for executives. If you think about it from the context of a VP or the C-level, with an AI-enabled team you suddenly have an organization that's reimagining the way it approaches work it's been doing the same way for a very long time. We've used this example a few times, but for good reason: a company like Moderna, in probably the first couple of months, or maybe it was within the first 90 days, had a team that created 750 custom GPTs. And it wasn't just their researchers or their scientists. They had their legal team creating the contract GPT. They had their marketing team creating the brand GPT. And what they saw was this transition from individual impact to team impact to organizational impact.

Speaker 2:

And so when you're an executive thinking about the importance of deploying this technology, it really is that maturity curve. That's when you start to feel the difference, when it goes from individual to team to organization. Because, as we talked about at the beginning, initially it's like, wow, time savings, amazing. That's great, but what are we then doing with it? Going back to the Moderna example, if you take something they call the Dose ID GPT, it had the potential to boost the amount of work they were doing as a team. They were comprehensively evaluating extremely large amounts of data, and suddenly they're measuring this in the context of how quickly they're either growing their business or reducing costs, because the organization is now reapplying that time saved into much higher-value areas of work.

Speaker 1:

Yeah, I think that's great. Once they start to see that type of behavior... We did something similar. We brought everyone in the company together in March, blocked out three hours, taught everyone how to create a custom GPT, and had everybody create their own. And once that starts to trickle up, now they're like, oh, I'm just going to do that for this problem, and this one, and this one, because of how easy it is. So what are the execs where this type of behavior is happening doing differently? What are they doing differently than the execs who are like, well, my enablement person's figuring this out, or my IT team is, and who aren't seeing that type of individual, team, and company performance?

Speaker 2:

It's a really good question. The first thing that comes to mind is really quite simple, but you called it out earlier as something you hadn't seen in surveying a room: the first thing they do is just sponsor it. Something that we encounter, and I'm sure you do all the time, is that when you're speaking with frontline leaders or individual contributors, there is a certain uncertainty as to whether they can use it at all, or whether they can use it for a specific use case, and there's a little bit of a reticence, frankly, to admit they're using it. There's this pervasive concept that by admitting you're using AI, you are somehow cheating or taking shortcuts in your work, and they don't want to admit that. That's something that needs to be addressed very quickly. Where I see executives facing that head-on is that they sponsor it and lead from the front, not only through the way they communicate to the company (and the communication is important: how they message this out, and the way they think about AI policy, AI governance, and the effective, safe usage of it, is very, very important), but also through how they use it themselves, leading by example.

Speaker 2:

I can't tell you how many exec hackathons we have run, where we go on-site and do workshops with exec teams. That's very often where the magic happens and you start to see a lot of light bulbs turn on, because, to your earlier point around the art of the possible, that's where it starts to become a reality.

Speaker 2:

I'd say the second thing execs are doing is orchestrating, or delegating to the right people, the curriculum and education that need to happen. So they're sponsoring the hackathons, they're bringing in folks to help educate their team on the right ways to use this, and they're also elevating champions and rewarding, highlighting, or recognizing the good work being done with the technology itself. When you're celebrating it and recognizing it, and also sponsoring its safe, responsible use, suddenly you see folks leaning in much more. What really needs to happen is the proliferation of these use cases. You want your employees sharing how they're doing it, because that's where innovation starts to happen and you get the flywheel going. If employees don't feel like they can do that, the impact will be stymied. That's where I see execs really make a difference.

Speaker 1:

Yeah, versus the exec leader who's like, yeah, you guys do it, but I'm still doing it this way, and you're like, well, that's not going to work. The other thing I'd add that we're seeing lead to success is that, for each department, we're setting up Kanban boards of the custom GPTs, really making the departments take ownership of solving their own problems and saying, okay, we're going to do these two first. And we create, okay, a business case validation.

Speaker 1:

What would it be used for? Is this going to increase productivity by more than 20%, yes or no? Great, move into testing.

Speaker 1:

And then you have this... I really feel like, with gen AI, each department in the go-to-market needs to have its own, almost product-style roadmap. Not a full product roadmap, but a production-based roadmap, and we're seeing that lead to a lot of success. Because then we get all these ideas people have, and it's like, great, Jake's running with this one. But we're also making sure we're capturing all of those best practices so we can share them out. Because we do see that too: it's like, oh yeah, Stephen's team built this.

Speaker 1:

You know, an assistant, but not everybody is on a Team version of ChatGPT, and you're like, well, that doesn't make any sense. If he built that, why don't we share it with everybody? If you're on Enterprise or Team, you can do that. So, all right, this has been awesome. I literally have a scribble of notes, which you can't see, of different points that are going to be amazing. But I've got one last question for you: where are we headed? Obviously, I know there's only so much you can say. I know we've got GPT-5 coming down the pipe, and that's going to be pretty dope. I'm pretty excited for that. But where do you think we're headed? And, I mean, God, who knows in two years? So how about over the next 6 to 12 months?

Speaker 2:

Yeah, it's the pace of everything, and I appreciate that you gave six months. I've heard five years many times.

Speaker 1:

That's max. Five years? Come on, five years? Are you kidding me? My robot's going to be over here; my avatar will be interviewing you in five years. My avatar will be interviewing your avatar in five years. Yeah, no kidding. But sometimes even six months can feel like a lifetime away.

Speaker 2:

But what I will say is, we're seeing some suggestions of it already. It's making its rounds and, in many cases, becoming a bit of a buzzword, but this idea of agentic orchestration is real and meaningful, and I think we'll increasingly see it play out in ways that are much more accessible to your average user. In the context of something most people are familiar with at this point, a chat experience like ChatGPT, you will increasingly see this as the orchestration layer for work, wherever work is happening. Some of the stepping stones to that have already been laid, with things like GPTs and being able to connect those to data. But as we release more agentic offerings into it (you mentioned Operator before, and we have Deep Research, some of the first examples of agentic technology within ChatGPT), you can imagine that continues to expand, and this concept of ChatGPT as a genuine assistant for work, wherever work is happening, begins to take very real, very meaningful shape. And it increasingly understands more and more about you; there was memory, which was released not long ago.

Speaker 2:

You now have connectors, through which you can connect it to your Google Drive or your email or your calendar, and you also have this concept of tasks, through which it can proactively prompt you to do things.

Speaker 2:

It's increasingly that super assistant that you don't just go to here and there for one-off use cases; it's an always-on assistant through which you're doing work. I think that's a pretty safe way to think about some of the future potential of the technology. And then, from an org perspective, I do think we'll see the pyramid flatten a little bit. You'll see more roles emerge that work very closely with the technology to architect workflows and drive change management. And I think there will be a higher degree of expectation around the technical aptitude of sellers, frankly. Those in technical success will become the new force multiplier, and already have, frankly. All of us in sales should hold ourselves accountable to raising the bar for what we're capable of from a technical success standpoint.

Speaker 1:

I love that. You mentioned tasks kind of in passing, but for those of you who don't know, I agree: the future seller, man, I'm going to have 600 tasks running. I've got my whole account book, and every day my tasks are looking: did they release a new annual report? Did they release a new blog? And yet we're so hung up on these tools that are like intent signals.

Speaker 1:

Imagine a world where one seller has 600 tentacles going out every day, and they come back and say, Jake, these are the four things that happened yesterday that are going to let you have a better conversation with a current customer, or get your foot in the door with someone. And then it can run those through a GPT and say, okay, this is how you should talk about it. That's where, I think, for so many people, you then turn your brain on, which is what I tell a lot of people.
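
The "600 tentacles" idea, a recurring job that watches account pages and surfaces only what changed, could look something like this minimal sketch. The fingerprint-and-diff approach, and all the names here, are illustrative assumptions, not how ChatGPT tasks are actually implemented:

```python
# Hypothetical sketch: fingerprint tracked account pages daily and flag
# the ones that changed, so a rep (or a GPT summarization step) only
# looks at fresh signals.
import hashlib
import urllib.request

def page_fingerprint(url: str) -> str:
    """Fetch a page and hash its body so changes are cheap to detect."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

def detect_changes(current: dict, previous: dict) -> list:
    """Return URLs whose fingerprint changed (or is new) since last run."""
    return [url for url, fp in current.items() if previous.get(url) != fp]

# Each day: build {url: page_fingerprint(url)} for every tracked page,
# diff against yesterday's saved state with detect_changes, and hand
# only the changed pages to a "how should I talk about this?" prompt.
```

The scheduling and the summarization prompt are left out; the point is that the expensive model call only runs on the handful of accounts that actually moved.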

Speaker 1:

Look, if you're copying and pasting purely AI responses, you're not going to be employed, because the agents will start to do that. So we have to realize the value of the people in the loop: it's expanding my ability to do more, and at higher quality, but you still have to turn yourself on. And, to your point, you have to take ownership of learning the tools, not sit around waiting for your boss to teach you, although it would be good if they did. So, parting thoughts, Connor. Parting thoughts. This has been an awesome conversation.

Speaker 2:

Like I said, I've got literally a page of notes. Parting thoughts for anybody out there, from the front lines up to the exec team? Yeah, my guidance at the broad level, when asked what to do or how to get started with the technology, is usually the same, and it's very simple and maybe unimpressive: just go out there and start using it. Too often we fall back into old habits, and that's reasonable. The pace is crazy, everything is moving very quickly, and it can be very overwhelming. But there is a fast way to start to mitigate what I think may be a growing sense of anxiety.

Speaker 2:

That anxiety is very fair, and a lot of people feel it: can I keep up with this? Am I doing enough? So first, give yourself some grace. It's okay if you don't feel like you're an expert yet. There's a lot of great content out there, and you can take steps to slowly immerse yourself. But the best way is just to get in there and start using it, start innovating, and share what you're doing with your team or your leaders, assuming safe and responsible use is happening. Go out and use it, challenge yourself to learn something new with the technology each week, and see how you can deploy it in ways that are relevant and valuable to your personal and professional life. It just gets easier from there.

Speaker 1:

That's right. Try one thing. I think that's great. I tell everyone, I have this concept in time management: 80, 15, 5. 80% of the time you're doing things that will impact your world now and over the next month, 15% is the next six months, and 5% is that 6-, 12-, 18-month horizon. Dedicating an hour or two is absolutely critical. This is not optional. Learn this; it is the future. So, Connor, I appreciate you, man. That was fun. It was a good conversation. I enjoyed it. Absolutely, I did as well. I appreciate you having me. All right, amazing. Thank you very much for joining, everybody. I hope you got a ton of value out of the episode. Make sure to subscribe to the channel if you're not subscribed already. If you're listening on podcast, make sure you get alerts for when new episodes come out, and download. And, Connor, appreciate you, man. Great conversation. Thank you.
