
Applying AI in Marketo: Practical Strategies and Implementation

Join Lucas Goncalves Machado, AJ Navarro and Darshil Shah for a focused session on leveraging AI in Marketo. In this session, you will:

Understand how to integrate AI at every stage of the lead lifecycle—from acquisition and scoring to nurturing and conversion

Explore the latest AI capabilities now available in Marketo and how they can enhance your campaigns

Follow step-by-step guidance for implementing AI-driven workflows in your own instance

Designed for marketing operations professionals who value clear, practical advice, you’ll leave with concrete strategies to put into practice immediately.

Transcript

Well, it is a couple minutes after, and let’s go ahead and get started, because I know that this is a pretty interesting topic, and I want to have enough time for some questions.

So thank you all for coming. Thank you all for attending an early session.

But today’s deep dive is one of my favorites, one of my personal interests.

Applying AI in Marketo and actually having practical strategies and practical implementation, so you can get started now. And at any time, know that these guys are incredibly good at what they do, so if you have any questions, please feel free to reach out to them. I’m sure they would love to have follow-on conversations and continue this topic if you have any questions.

So that being said, let’s go ahead and get started.

So a little bit of house rules before we dive into the actual meat and potatoes. As this is a user group, no self-promotion or pitching of any kind is permitted during these events. Do not contact members outside the group without consent, and I'm sure these gentlemen will apply that during this conversation. And if user group members share their use cases, please don't share that information without their consent either. So if you have any questions, like I said, please reach out. Please do your due diligence.

All right, so this meeting is recorded. If you do not wish to participate in the live recording, please feel free to stop and watch this at a later time; it will be sent out after the fact as part of an email series. And if you do feel comfortable talking or sending questions in the Q&A, please go ahead.

We’ll see you later.

Thank you for joining.

All right, so Stephanie Dam has started a new research opportunity with us. If you have any interest in sharing how your marketing team has adopted Marketo, or in research on how intermediate and advanced practitioners are learning to use Marketo, please feel free to grab a screenshot and sign yourself up. The feedback will be used specifically for learning content, it's free to join, and it's about 30 minutes of just talking back and forth with her. If you're interested in what's coming in the future, or how ÃÛ¶¹ÊÓÆµ is going to use that information, I think it's an interesting opportunity.

All right, and a personal favorite: I strongly encourage everyone on this call to go ahead and do this. One of the best things I ever did as part of my career was apply to be an ÃÛ¶¹ÊÓÆµ Champion. This is not just for Marketo, so if you work with any other tools from the ÃÛ¶¹ÊÓÆµ toolbox, please feel free to sign up. There are a couple of new tracks this year, such as ÃÛ¶¹ÊÓÆµ Workfront, and they're constantly adding more as we get more and more interest. So be loud, be proud, and please sign up for the ÃÛ¶¹ÊÓÆµ Champion program.

It’s already open and it ends on June 10th. Please sign up here. There’s a webinar that happened already on May 15th that goes through a bit more detail if you have any questions about it. It has a bunch of previous champions or current champions that talk about the process, what to do, how to get started, if you’re not sure. But again, I cannot recommend this enough, so please sign up. Apply.

All right, so what is happening next? On the 13th, we have RT-CDP and AJO content, and the new Skill Exchange opens. So if there's any interest in these specific topics, here are the dates and what is happening for each. As you can see, we have content on RT-CDP and AJO happening on the 13th. On the 14th, we have more learn-and-grow tracks about analytics, so look for that content. On the 20th, we have content about AJO and more Marketo Engage office hours, and then look for the information on Workfront.

All right.

Upcoming user group events.

All right, so you can see that we have some in-person events coming up again. I love these, so if you have the chance, be sure to check them out. If these aren't in your area but you know there's a user group near you, you can check inside the Marketo user group sign-up; it has an entire list of all the user groups available based on location, so if one of these doesn't fit you perfectly, go ahead and check that out. But as you can see, May 16th, the 21st, all the way coming up through here, a lot of international events, which is great. Again, you can sign up and get these as recordings.

All righty.

And as always, please feel free to provide any feedback.

Let us know what you think on G2 or TrustRadius.

All of this is listened to and taken into account.

All righty.

With that being said, I'll hand it over to the amazing Lucas. Before we get started though, let's go ahead and do some introductions. So I am hosting; Courtney unfortunately could not be with us today, so I'm taking her place. My name is Chris Kelly. I have been a Marketo Champion for going on three years.

This will be my second going on third year. I am a director of marketing for Qualified Digital and have been in marketing automation for almost 10 years now. Lucas, would you like to take it away? Sure. Hello everyone. I'm Lucas. I'm the director of AI and automation at Revenue Pulse. We are a marketing operations agency mostly focused on Marketo, and I take on a lot of different AI projects for our clients. I'm actually based in Sao Paulo, Brazil, so unfortunately I'm heading into winter now. I know that everyone else is happy, but you can see that I'm already wearing a hoodie because it is getting cold here. So nice to meet you all, and I'm excited to be a part of this presentation.

And my name is AJ Navarro. I’m the marketing operations manager at Sprout Social. We are a social media management company. Also doing a lot of fun things around AI. It’s a big hot topic for our organization as well. First year champ, so really excited to be here and talk to everybody today about AI.

Hey everyone. This is Darshil Shah. I work as a senior consultant at Deloitte, based in India.

Third-time champion and super excited to be part of the presentation. I always love the deep dives, and we have an amazing set of people today to talk us through some great use cases. I'll be happy to share a couple of use cases from my side as well. Looking forward to the presentation. Thank you.

All right. So we were talking about how we could kick things off today. And one thing that I always like to start with is an introduction to AI, because many times when we talk about AI these days, we are so focused on large language models, and they are not the only thing that exists. They are not the first AI that we saw, nor the only AI we use in our day to day. So I like to give this introduction before we go to large language models and what they can do, so we have more context on what AI really is and what it can do.

So when we talk about artificial intelligence, it is an area of research about how machines or computers can simulate human intelligence and do tasks that usually are only done by humans. For example, where can AI be applied? It can be applied to predictive maintenance: you can imagine a factory that has machines that require maintenance from time to time, and if you can predict when they will require maintenance, you can save a lot of cost. Or, in our day to day, when we're watching a streaming service, we receive content recommendations. This is also AI, analyzing our content preferences and what we have watched, and recommending the next best content.

Or, many times in our day to day in marketing operations, we have predictive lead scoring platforms that analyze intent data to understand what you're doing and to try to predict how interested in buying you are at that specific point in time.

And AI can be as complex as large language models, but not always. I brought a very basic example here, maybe the most basic AI that exists, which is Naive Bayes. How does it work? Think of an example: say directors have a 50% chance of buying just after the MQL, and workers in the financial sector have a 20% chance of buying after the MQL. If a director at a bank MQLs, you just multiply the probabilities and you get a 10% chance of buying. Why is this very naive? Because it assumes the attributes are independent: it doesn't consider that directors at banks might behave differently from directors across financial services in general; it just multiplies everything together. And this is also AI, even though it is very, very simple and anyone can build and understand it. But it is not the only kind; it gets more and more complex as we move to a large language model, for example.
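As a reference for that arithmetic, here is a minimal sketch of the naive multiplication Lucas describes; the attribute names and probabilities are invented for illustration and are not from any real model.

```python
# Minimal sketch of the naive probability multiplication described above.
# The per-attribute probabilities are illustrative, not real benchmarks.
P_BUY_GIVEN_ATTRIBUTE = {
    "title:director": 0.50,        # directors buy after MQL 50% of the time
    "industry:financial": 0.20,    # financial-sector leads buy 20% of the time
}

def naive_buy_probability(attributes):
    """Multiply the individual probabilities, assuming independence (the 'naive' part)."""
    p = 1.0
    for attr in attributes:
        p *= P_BUY_GIVEN_ATTRIBUTE.get(attr, 1.0)
    return p

# A bank director: 0.5 * 0.2 = 0.10, the 10% mentioned in the talk.
print(naive_buy_probability(["title:director", "industry:financial"]))
```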

And an LLM is basically a type of AI that is trying to predict one very specific thing: the next best word, or token, based on all the context you've given it, all the previous responses it gave, and all the knowledge it has. There are a lot of examples of large language models, like ChatGPT, Gemini, and Jasper, plus some very cool Marketo innovations that we're going to talk about. And it usually uses more complex methods than Naive Bayes, of course, like neural networks, which are very, very heavy computationally. That's why you're always hearing conversations about chips and data centers and the electricity needed to power those things: because it's very computationally heavy.

I mentioned tokens, so what are tokens? In large language models they are basically the fundamental unit of text. You can sometimes think of a token as a word, but symbols and punctuation are also tokens, and misspelled words can be understood as one or more tokens. Here on the right you can see an example from the ChatGPT tokenizer. Each of these words is one token, but a period is also a token, and "ChatGPT" is actually three tokens; misspelled words can become more than one token. It also acts differently across languages: this example is in English, and "aqui", which is "here" in Portuguese, my native language, is two tokens. So if you switch to Portuguese, you actually spend more tokens on the same word.
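For reference, here is a small sketch of counting tokens with OpenAI's tiktoken library; it is not the exact tokenizer demo from the slide, and the counts it prints depend on the encoding you pick.

```python
# Sketch: counting tokens with OpenAI's tiktoken library (pip install tiktoken).
# Exact counts depend on the encoding; "cl100k_base" is used by recent GPT chat models.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for text in ["here", "aqui", "ChatGPT", "marketo opertions"]:  # last one intentionally misspelled
    tokens = enc.encode(text)
    print(f"{text!r} -> {len(tokens)} token(s): {tokens}")
```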

Why are tokens an important concept to understand? Because the price you pay for API calls, if you're automating any process via the API, is based on tokens. So optimizing your token spend is very important to get a good price for your automations.

And another parameter that I like to introduce for large language models is temperature. What makes large language models feel more human? If you ask a large language model the same thing twice, it will give you different answers, because it has a kind of randomness parameter, which is temperature. It goes from zero to two, with zero being the most analytical. So when you're using the API, you can set it to zero and force it to be as analytical as possible, which is very useful for a few use cases. If you want a lot of creativity, maybe to explore new ideas, set it to two and you can have a conversation with something closer to an artist. At higher temperatures it is also a little more prone to hallucinations.

So be aware of this specific parameter. When you are talking via the web interface, it is usually set to one by default, so it's halfway through.
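As a quick reference, here is a minimal sketch of setting that parameter through the OpenAI Python SDK; the model name and prompt are placeholders, not anything prescribed in the session.

```python
# Sketch: the same prompt at temperature 0 (analytical) vs 2 (creative),
# using the OpenAI Python SDK (pip install openai; OPENAI_API_KEY in the environment).
from openai import OpenAI

client = OpenAI()
prompt = "Suggest a subject line for a webinar invitation about AI in Marketo."

for temperature in (0, 2):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use whichever model you have access to
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,
    )
    print(f"temperature={temperature}: {response.choices[0].message.content}")
```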

So this is the conclusion.

AI, whether it's an LLM or not, allows us to make useful predictions in marketing and enhance our campaigns, especially if we are using our first-party data and applying our own use cases to it. It allows us to improve efficiency by writing better and faster, implementing scripts that before AI we didn't know how to write, and it also helps a lot with troubleshooting things faster. But it also helps us innovate in our processes: segmenting personas with higher accuracy, improving our lead scoring, providing sales with actionable insights, and so much more. So we are going to talk a lot about how we can innovate in our processes today. Before we go to these specific use cases, AJ, do you want to talk about how AI is being applied in Marketo now? Yes, thank you. So let's talk about the ways you can leverage AI in Marketo today through some of its features.

So the first one is the new email designer, or NED. There's a lot of AI built in. One of the very first things is being able to generate personalized copy instantly with generative AI. We're also able to do different types of testing and nurture copy to improve conversions, again all AI driven. There's access to AI-generated visuals through ÃÛ¶¹ÊÓÆµ Firefly and assets from ÃÛ¶¹ÊÓÆµ Experience Manager, again all built in. We're also able to create, edit, and personalize our emails all within Marketo Engage without having to know any code.

All these things combined make it scalable and promote creativity. It empowers teams to produce more content with less effort, without sacrificing quality.

The next area of AI is in interactive webinars, and this is one of my favorite areas. Generative AI is now built in to turn static webinar replays into engaging, searchable experiences. What happens is that after a webinar, AI will scan it and create a transcript, boosting discoverability and overall viewer attention, making your webinars easier to find and making it easier to get to the content that viewers care about. There's also a new interactive FAQ where key questions are extracted from the webinar. The questions are saved, so there's no manual work left to create a blog post or something of that sort from the questions that were asked during the webinar.

One of my favorite things here is the smart video chapter. So the native AI built in creates chapter navigation through the different topic summaries of your webinar so users can access the content quickly and on demand. All this just increases engagement where viewers will stay longer, find what they need faster, and just convert at higher rates, all while using AI.

Next, please.

And then lastly, we have dynamic chat. Here we have AI-generated chat flows, where we create branded conversational experiences of different types from single prompts; there's not too much scripting required. We can also launch chat experiences much faster using AI-generated questions, responses, and CTAs. With that, we're also able to use AI to route high-intent visitors to the right reps or to the right content, all in real time. We're also able to customize the chat based on different audience segments, regions, or behaviors without any added lift.

And then, using AI, we're also able to tie this to any of our smart campaigns, scoring, or nurture workflows, all within the native dynamic chat. So these are some of the ways ÃÛ¶¹ÊÓÆµ is leveraging AI within Marketo today. And for those of you who attended Summit or the Summit recap, there are a lot more AI features coming out, such as journeys and things like that. This shows the level of investment ÃÛ¶¹ÊÓÆµ is making in Marketo and in bringing AI into the platform, enabling us as marketers to work faster, smarter, and be more creative.

And you might be thinking, these use cases are already pretty cool, but what if I want to integrate Marketo directly with AI to perform other use cases? How can I do that? We're going to go through three use cases, but one quick slide before we get into them.

When we're talking about integrating AI with Marketo, I like to lay out these three options for how you can do it technically, and the pros and cons of each, because each use case will have a different approach. You can integrate directly via webhooks, which is great because a lot of people already know how that works. With webhooks, you can control exactly which data you're sending, and it's easy to anonymize.

The cons are that the data treatment you can apply there is very basic; you cannot do powerful data operations with webhooks. You're limited to smart campaigns, so you have to send one lead at a time to trigger a smart campaign. And if you're not careful and send, say, one million leads and it takes a long time to process, you might jam your instance for a while.

When we're talking about the API, we're talking about pulling the data out of Marketo, doing some operations on your computer or in your cloud, and then getting the results back into Marketo. That allows more powerful data operations, never jams your instance, and also allows bulk operations, so you're not constrained by that one-lead-at-a-time limitation.

But the problem is that importing the data back can be a little dangerous: you might create duplicates or import bad data. The data is going to be stored locally or in the cloud for a while while you're performing those operations. And you need some deeper knowledge of the endpoints to avoid pulling sensitive data. For example, if you're analyzing activity data and you pull the Fill Out Form activity from Marketo, you are going to get all the fields from the form inside that activity, so you'll get the person's name and email address, for example, which is sensitive information that many times you don't want to pass to AI. You can also use third-party iPaaS tools, such as Zapier, n8n, or Workato, to build these things. That gives you a lot of flexibility; it's easier to build and easier to integrate systems other than Marketo.

But unfortunately, it also requires a different skill set; the people building things on an iPaaS usually are not marketing operations managers, and it's also an extra cost. So you have all these options to integrate AI with your Marketo instance, and we're going to talk about a few of them during the use cases. The first one is one of my favorites, which is persona classification.

Many times in Marketo we're using the same strategy to classify personas: the manual logic of "job title contains something", and that cannot model all the nuances of job titles. One joke here: a lot of people try to segment CROs, but if you use "job title contains CRO", anyone whose title happens to contain those letters in the middle of a word gets pulled into that segment, which is very, very annoying.

It also has to be manually updated every time there is a new trend in job titles. Now people are being called something else, and you have to update everything.

For example, a few years ago you probably wouldn't have seen many job titles related directly to AI; "large language model engineer" is something completely new, and now you have to update your persona segmentation to reflect those. It also relies heavily on language, which can be a challenge for global companies, because every language spells things differently. And in the end, you have a lot of leads in the default bucket. Have you seen these challenges before, AJ? Have you seen problems like this? Yes. Yeah. Just how people fall in and out of segments. This is one of the many things we're looking into at Sprout: how do we automate everything nice and clean?

As we know, job titles change, or there are small differences like IT versus IT manager; information technology roles and titles change. Just the ability to train an AI and create our own personas that are auto-categorized makes it really impactful, and I get to spend less time doing that type of cleanup.

Great. Great. Yeah. That’s also a good point. People have to spend a lot of time keeping this updated at the end of the day.

So how does the AI persona classification actually work? You create a GPT that is trained on your best persona examples, usually based on job titles. To train a GPT like that, what you do is fine-tuning. You basically start from job titles: if the person has job title A, they fit persona A; if the person has job title B, they fall into bucket B; if it's C, let's say it's persona A again. And you provide a lot of examples for each persona.

Usually 100 to 200 examples per persona. The model can then learn from those examples and draw a definition from them; it will understand what you mean when you give it a specific job title and why all of those examples map to a specific persona. Then, if you send a new job title to that model, one it wasn't trained on, it will try to fit it into the best bucket possible.
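For reference, here is a minimal sketch of what that training data could look like, assuming OpenAI's chat fine-tuning JSONL format; the job titles, persona codes, and system prompt are invented examples, not the exact ones from the blog post.

```python
# Sketch: building a fine-tuning file in OpenAI's chat fine-tuning JSONL format.
# Job titles and persona codes (P1..P6) are invented; in practice you would export
# 100-200 labeled job titles per persona from your own data.
import json

SYSTEM = "Classify the persona based on the job title. Respond with the persona only, from P1 to P6."

examples = [
    ("IT Manager", "P1"),
    ("Head of Information Technology", "P1"),
    ("Marketing Operations Specialist", "P2"),
    ("Data Analyst", "P3"),
    # ... 100-200 examples per persona
]

with open("persona_training.jsonl", "w", encoding="utf-8") as f:
    for job_title, persona in examples:
        record = {
            "messages": [
                {"role": "system", "content": SYSTEM},
                {"role": "user", "content": job_title},
                {"role": "assistant", "content": persona},
            ]
        }
        f.write(json.dumps(record) + "\n")
```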

Like anyone would: if you've seen a lot of examples, you try to put the new example where it fits best. And since it is a multilingual model, it will be able to classify in any language, so that won't be a problem. So how does this workflow work? Every time a lead is created or the job title is updated in Marketo, you call a webhook sending only the job title to the fine-tuned GPT. That returns the persona classification, and then Marketo processes the response.

It's very important that Marketo processes the response, because, as you'll notice, in this use case we're always talking in acronyms. You are not asking the GPT to spell out the persona name; you're not responding with, I don't know, "IT workers", because "IT workers" is a lot of tokens and we want to optimize token usage. It just responds with "P1", and then in Marketo you process that P1 means IT workers via smart campaigns. So this is basically how the workflow works. And this is what you see in the activity log: you send the GPT that someone is a data analyst, ask it to classify the persona based on the job title, and to respond with the persona only, from P1 to P6. You can see here that the response is P3, and P3 is the data-related analyst persona, so I handle that inside Marketo. But since it is just responding "P3", we use only two tokens for the response, which is much less than "data analyst workers", especially when you scale that to one million leads in a database.

So that’s how you see it back in Marketo.

And when you're creating this type of AI persona strategy, there are a few common mistakes and pitfalls that I'd like to highlight here so you can avoid them.

The first two can seem a little contradictory: the first one is making it too basic, the second is making it too complex. Why is too basic a problem? Because you are now using a much more powerful technology to classify your personas, so you can create more persona buckets, segment them better, and review your definitions to understand how they can impact your scoring or dynamic content. You can think long term about how you could change your persona definitions to better serve the business. The other common mistake is making it too complex: if you go the other way around and create, I don't know, 80 different persona buckets, it's now so complex that the team cannot understand it, and you're basically back to using the job title itself, because it amounts to the same thing.

If you provide too little training data, the GPT, or whichever LLM you're using, will not learn the real definition from the sample, so you have to provide enough examples, with good quality, for it to understand. You also have to set your temperature and max tokens: set the temperature to zero so the GPT, or whichever LLM you're using, is as analytical as possible, and set max tokens to two so it doesn't explain why it picked a specific persona, because it might sometimes. You can see here that I set the temperature to zero and max tokens to two. This makes it more analytical, and it will never respond with more than two tokens, basically, because I'm setting max tokens to two. So it will only respond "P3"; it won't respond "P3 because this, this, and that". I'm not interested in that explanation.
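For reference, a minimal sketch of what that call could look like with those parameters, using the OpenAI Python SDK; the fine-tuned model ID is a placeholder, and in practice the same request would be configured in a Marketo webhook rather than in Python.

```python
# Sketch: calling the fine-tuned model with temperature=0 and max_tokens=2,
# as described above. The model ID below is a placeholder for your own fine-tune.
from openai import OpenAI

client = OpenAI()

SYSTEM = "Classify the persona based on the job title. Respond with the persona only, from P1 to P6."

def classify_persona(job_title: str) -> str:
    response = client.chat.completions.create(
        model="ft:gpt-4o-mini-2024-07-18:your-org:personas:xxxxxxx",  # placeholder fine-tune ID
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": job_title},
        ],
        temperature=0,  # most analytical / deterministic
        max_tokens=2,   # "P3" fits; no room for explanations
    )
    return response.choices[0].message.content.strip()

print(classify_persona("Data Analyst"))  # e.g. "P3"
```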

How can you make this use case even better? Use other parameters besides job title; industry, for example, can be a good one for determining personas.

And with the new AI persona, you can also add it to your scoring to refine the criteria for each persona. Maybe you have a different threshold depending on the persona, and now you can really use that in your strategy. You can also review your segments so you understand how to use this information in your marketing efforts and campaigns. And here is a QR code to a blog post explaining how to do this step by step. I can see that you just sent a message asking about that: the step-by-step for fine-tuning is in that blog post, and I'm very happy to talk about it in more depth if you want. It's just a fairly involved process that I didn't want to spend too much time on here.

And the second use case is AI sales insights. We've been talking about this one for a while, AJ. Why do you find it interesting? Yeah, this is probably my favorite AI use case, so much so that since Lucas first told me about it about a year ago, I've been advocating for it at Sprout.

The gist of it, and I won't steal his thunder here, is being able to take a record in Marketo, understand when somebody MQLs, have GPT analyze that lead's history or activity, put together a short summary, and populate a string field saying why this person MQLed, what they were interested in, and what you should talk about.

That to me is just a huge unlock, and most importantly, it enables our sellers on what to talk about and lets them layer in all the other data that we have, such as industry moments or anything they're using in MSI. The ability to quickly analyze the Marketo activity record and understand why someone MQLed is just a huge unlock. Not to mention, and I think Lucas will talk about it here, it leverages the Marketo ID, so there's no PII involved.

Exactly. So what happens is that instead of sales having to receive a long list of activities from Marketo and then doing the research to understand why this person MQLed, you can just send a paragraph explaining it, and even suggest what the first communication should be. So thank you for that amazing intro, AJ; I think you expressed well the value and why people should do this. Going a bit into the technicals, how does it work? You have to build roughly this workflow: receive an HTTP request when someone MQLs, get the access token from Marketo, get the paging token, query the activities, prompt ChatGPT or whichever LLM you're using, and then post the result to a Marketo field.

So the initial trigger is that you kick off that specific workflow: on your smart campaign, just after you turn someone into an MQL, you trigger a webhook that calls the third party that runs this workflow. And when I say third party, it can be any iPaaS solution, Zapier, n8n, Workato; you can run this in the cloud, you can run this anywhere.

Then, when this workflow receives the webhook call, you have to query the data from Marketo, and there are some essential steps to do that. First you have to get the access token, then you have to get the paging token to get the right page of the activity log, and then pull the activities from the activity log.
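For reference, here is a minimal sketch of those three Marketo REST API steps in Python; the Munchkin ID, client credentials, lookback window, and activity type IDs are placeholders you would replace with your own values.

```python
# Sketch of the essential Marketo REST API calls described above: access token,
# paging token, then the activity log for one lead. Placeholders throughout.
import requests
from datetime import datetime, timedelta, timezone

BASE = "https://MUNCHKIN_ID.mktorest.com"  # replace with your instance's Munchkin ID

def get_access_token(client_id, client_secret):
    r = requests.get(f"{BASE}/identity/oauth/token", params={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    return r.json()["access_token"]

def get_paging_token(token, days_back=30):
    since = (datetime.now(timezone.utc) - timedelta(days=days_back)).strftime("%Y-%m-%dT%H:%M:%SZ")
    r = requests.get(f"{BASE}/rest/v1/activities/pagingtoken.json", params={
        "access_token": token,
        "sinceDatetime": since,
    })
    return r.json()["nextPageToken"]

def get_activities(token, paging_token, lead_id, activity_type_ids="1,2,10,11"):
    # 1=Visit Webpage, 2=Fill Out Form, 10=Open Email, 11=Click Email are the usual
    # standard IDs; verify against /rest/v1/activities/types.json in your instance.
    r = requests.get(f"{BASE}/rest/v1/activities.json", params={
        "access_token": token,
        "nextPageToken": paging_token,
        "leadIds": lead_id,
        "activityTypeIds": activity_type_ids,
    })
    return r.json().get("result", [])
```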

I know this question is coming: how do you do this technically? Again, I did not want to get into those details here, but there is a blog post at the end, and we are also working to release a specific training on this use case with the ÃÛ¶¹ÊÓÆµ Adoption Marketing team, so watch for that in a few weeks. And as with both blog posts, I always provide all the code, just to make it easier for you to apply it.

So when you query all the activities, this is what you're going to see: activities with specific program names and so on, something you cannot just dump on ChatGPT and say, hey, analyze this, and something you certainly cannot send to a salesperson and say, hey, figure out why this person MQL'd. What you actually do is prompt a large language model. This is the prompt I usually use: Analyze the following lead activities and explain why the person was marked as MQL, so they know how they should approach the client, including which product or service this lead is most interested in and any other relevant insights. Include relevant URLs.

Remember, this will only be read by a salesperson, so don't use technical explanations. Just your best summary. Keep your response limited to 100 words. And this is an example of the output for a specific person: "Lead was marked as MQL mainly due to their engagement in...". I had to redact parts of it because they come from a client, but you can see how it explains exactly why this person MQL'd, and even includes the URLs specific to this person.
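For reference, a minimal sketch of the last two steps, prompting the model with the activity list and writing the summary back to a Marketo field; the field name "mqlAISummary" and the model name are hypothetical placeholders, and the prompt simply mirrors the one quoted above.

```python
# Sketch: summarize the pulled activities with an LLM, then write the summary
# back to a Marketo string field via the REST API. Placeholders throughout.
import json
import requests
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "Analyze the following lead activities and explain why the person was marked as MQL, "
    "so sales knows how to approach the client, including which product or service this lead "
    "is most interested in and any other relevant insights. Include relevant URLs. "
    "Remember, this will only be read by a salesperson, so don't use technical explanations. "
    "Just your best summary. Keep your response limited to 100 words."
)

def summarize_activities(activities):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable chat model works
        messages=[
            {"role": "system", "content": PROMPT},
            {"role": "user", "content": json.dumps(activities)},
        ],
        temperature=0,
    )
    return response.choices[0].message.content.strip()

def write_summary_to_marketo(base_url, access_token, lead_id, summary):
    payload = {
        "action": "updateOnly",
        "lookupField": "id",
        "input": [{"id": lead_id, "mqlAISummary": summary}],  # hypothetical custom field
    }
    r = requests.post(
        f"{base_url}/rest/v1/leads.json",
        params={"access_token": access_token},
        json=payload,
    )
    return r.json()
```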

Some common mistakes and pitfalls when applying this in your day-to-day: going too far into the past. You don't want to pull activities from the past six months to understand why someone MQL'd; activities that happened long ago probably won't be relevant business-wise, and they only make the process of pulling and analyzing all the data from Marketo longer.

Remember to always summarize and limit your output. As I showed in the prompt, keep it limited to 100 words; that's a good limit, so GPT, or whichever model you're using, doesn't go and add three paragraphs. You don't want that.

Another common mistake here is using the Bulk API. The Bulk API is great for large data exports, but leads MQL individually, so you run this workflow one person at a time. To make it faster, it's better to use the REST API.

And it can be even better if you add demographic information, so the LLM has even more context. You can also add suggested next steps: you can make the LLM suggest the next email to be sent or which outreach sequence to add the lead to. And again, you have here the blog post that explains step by step how to do this.

And that's all. I'll pass it now to Darshil to go over personalized lead scoring. Yeah, thanks Lucas. Thanks AJ. Loved all the use cases. So I would like to discuss AI-based personalized lead scoring.

Thank you. I thought I'd start by setting up a bit of context. The first point I would like to make here is backed by research data suggesting that only 30% of marketers say their scoring model effectively identifies ready-to-buy leads; this is from research Gartner conducted. And the reason most marketers feel that way is that with manual weights, for example for downloading an ebook, it's a static, built-in score, and it fails to adapt to changing buying behaviors. Buyer behaviors are changing more frequently than they were before.

So yeah, the static, hard-coded scoring model basically fails to adapt to rapidly changing buying behaviors. And of course, there's the possibility of over-emphasis on a few attributes: at the time you built the scoring model, there were certain attributes that you and your strategy team felt were the most important signals of buying readiness, but over time that might not remain the case. All in all, the point I would like to make here is that the static, hard-coded scoring model is flawed, given that it's not very adaptable to changing buying behaviors. Towards the right there's a graph, not to scale, showing that lead scoring accuracy decays over time; as time passes, the accuracy of the scoring model at identifying people who are ready to buy basically decays. That's the reason an AI-based personalized lead scoring model makes sense here.

I have a couple of slides discussing the technicals and the architecture diagram of how this entire model works and how it integrates with Marketo. But before going into that level of detail, I thought I would set up a bit more context on what AI-powered lead scoring actually is. The AI model analyzes historical lead and conversion data and, of course, behavioral signals. When I say behavioral signals, these are things like a person filling out a form or visiting the pricing page; in a static lead scoring model, you would have trigger campaigns that score people when they perform the actions you and your strategy team feel signal buying readiness. Those behavioral signals are taken into account to predict the lead score.

The key inputs to this AI-based lead scoring model would be firmographics and the demographic attributes of a person, behavioral data including past interactions, and engagement velocity. An example would be the time spent on the pricing page, and this can also include the delta, the time difference between two consecutive engagements: for example, if a person fills out a form and then visits the pricing page, what is the time difference between those two? Of course, you can add other attributes that you feel are pivotal to determining a person's score, in the same way you have been building static scoring models with trigger campaigns based on behaviors that signal buying readiness. There are also content interaction patterns, for example repeated visits to case studies or particular blogs, and intent data: if you have intent data available in Marketo, from a third-party data enrichment platform or an intent data platform connected to Marketo, that can also serve as an input to the AI-powered lead scoring model. All of those combined inputs are fed to the AI model, and the output is a predictive score. As an example, I have taken a range from 0 to 100; it could be anything based on your use case, your requirements, and the range your teams follow.
And along with that, the AI model also gives a confidence interval. So an example output for a person, given all those inputs, would be a score of 85 plus or minus 5 percent: the model says this person's score is in the range of 85, plus or minus 5 percent. This confidence interval is again determined by the inputs and how you have trained your model.

Next slide, please.

Here I have showcased the architecture diagram of what this model would look like and how it would function internally. Let's start from the right, and of course we'll also come back towards the left for how Marketo and the AI scoring model are integrated. Before that, you have to build your AI scoring model, and you start with an algorithm. At a very high level, there are two types of machine learning algorithms: supervised and unsupervised learning. This particular use case leverages a supervised learning algorithm, because in supervised learning we feed it labeled data. When I say labeled data, I mean that along with the behavioral and demographic attributes of a person, while training the model we are also feeding in whether or not that person converted. So the AI model gets labeled data: a person's interactions, plus whether that set of interactions, with that engagement velocity and those demographic attributes, ended with the person converting into a customer or not. All of this is fed as training data to the supervised machine learning algorithm. And if you have multiple engagement metrics, you need to normalize them, so that when they're fed as training data the AI model can consume them in a normalized form. That training data is used to train your AI scoring model initially.

Once you have a trained AI scoring model, you integrate it with Marketo. Marketo will send the lead data: once the AI scoring model is built and you want to get the score for a person, typically you use webhooks to pass the lead data to the AI scoring model, which of course is sitting outside of Marketo. The AI scoring model consumes that lead data, with typical inputs along the lines of the items we discussed in one of the previous slides, and in return, as a response, the AI scoring model returns the score along with the confidence level. And in order to ensure that this model remains accurate and agile, we retrain it on a periodic basis, typically monthly, using new conversion data to reduce drift and ensure the model stays adaptable as new data points come in and new conversions happen. That is how we ensure the model is personalized and adaptable over time, unlike the static, hard-coded scoring model that most instances typically have.
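For reference, here is a minimal sketch of that supervised training step in Python with scikit-learn; the feature set, the synthetic data, and the choice of gradient boosting are illustrative assumptions rather than the specific model Darshil describes.

```python
# Sketch of the supervised training step: historical leads with normalized
# behavioral/firmographic features, labeled with whether they converted.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per historical lead (e.g. pricing-page visits, form fills,
# days between consecutive engagements, firmographic flags...).
# y: 1 if the lead eventually converted into a customer, else 0 (the label).
rng = np.random.default_rng(0)
X = rng.random((1000, 6))                    # replace with your exported Marketo/CRM data
y = (X[:, 0] + X[:, 2] > 1.0).astype(int)    # synthetic label, just for the sketch

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), GradientBoostingClassifier())
model.fit(X_train, y_train)

# The predicted conversion probability becomes the 0-100 score returned to Marketo.
probas = model.predict_proba(X_test)[:, 1]
scores = (probas * 100).round().astype(int)
print(scores[:10])
```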

Next slide.

Yeah, so at a very high level I have split the entire implementation into three layers. First is the data layer, which represents all the data being fed to the AI model. You need to ensure that Marketo tracks all the needed events: if you have blog pages sitting outside of your Marketo landing pages, you need to ensure Munchkin is deployed on them and the activities are getting tracked in Marketo, and you need to sync CRM fields. If you're also using certain CRM field data along with the Marketo data, either you have that field data sent to Marketo and have Marketo send it to the AI model, or you can sync directly from the CRM to the AI model. I would not suggest going that route, though, because it unnecessarily adds a lot of complexity: you need to ensure that when the webhook call is made from Marketo to the AI model, the CRM data is also available and can be queried by the AI model at that point in time. So unnecessary complexity gets added there. And at the end, of course, you pass the data to the AI model to predict the score via webhook.

For the AI layer, in order to build this AI model there are two options: you can use something like ÃÛ¶¹ÊÓÆµ Sensei or AWS SageMaker for predictive-based scoring, or you can build a custom AI model, where your AI engineers build the model for you, deploy it to a serverless service like AWS Lambda, and have Marketo call that service.
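For reference, a minimal sketch of what such a serverless scoring endpoint could look like as an AWS Lambda handler behind an API gateway; the payload field names, the model file, and the response key are hypothetical placeholders, not anything prescribed in the session.

```python
# Sketch: a serverless scoring endpoint (AWS Lambda behind API Gateway) that a
# Marketo webhook could call. Payload field names and the model file are placeholders.
import json

import joblib

# Loaded once per container, outside the handler, so warm invocations stay fast.
model = joblib.load("lead_scoring_model.joblib")

FEATURE_ORDER = ["pricingPageVisits", "formFills", "emailClicks",
                 "daysSinceLastActivity", "engagementVelocity", "intentScore"]

def lambda_handler(event, context):
    payload = json.loads(event["body"])  # JSON body sent by the Marketo webhook
    features = [[float(payload.get(name, 0)) for name in FEATURE_ORDER]]

    proba = model.predict_proba(features)[0][1]   # predicted conversion probability
    score = int(round(proba * 100))               # mapped to the 0-100 range

    # Marketo's webhook response mapping can map this key back onto a lead field.
    # A real confidence interval (the plus/minus 5% mentioned above) would need a
    # calibrated or ensemble model; it is omitted here for brevity.
    return {
        "statusCode": 200,
        "body": json.dumps({"aiLeadScore": score}),
    }
```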

Then in Marketo you have a lead scoring program, and its smart campaign is simply triggered by a scoring threshold: if a person's score is higher than the threshold, it notifies sales via a flow step. So that is the entire implementation, right from building the AI model, to the data layer, to how the scoring and notify-sales campaigns should be set up in Marketo.

Next slide, please.

Yeah, I would also like to discuss some of the common pitfalls and best practices, similar to how we discussed them for the previous use cases. One common mistake is overfitting the model. If you use a lot of features, a lot of behavioral interactions as input parameters to your AI model, there can be a case where the model performs very well on the training data but not on real data; the accuracy is not as high as it was in training. To ensure that the model doesn't overfit, limit the features to roughly the top 20 features you have in Marketo.

There can also be a cold-start problem, where there is no historical data for new markets or new products. For that case, you can use a hybrid model: a combination of the predictive AI model and rule-based scoring. You build up the data over time and then progressively switch over to a 100% AI-based scoring model.
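For reference, a minimal sketch of one way that crawl-walk-run blend could work; the weights and the example scores are purely illustrative assumptions.

```python
# Sketch of a hybrid score: blend the existing rule-based score with the AI score,
# shifting the weight toward AI as more conversion data accumulates.
def hybrid_score(rule_based_score: float, ai_score: float, ai_weight: float = 0.3) -> float:
    """ai_weight starts low (e.g. 0.3) and is raised over time toward 1.0."""
    return round((1 - ai_weight) * rule_based_score + ai_weight * ai_score, 1)

# Early on: mostly rule-based        Later: mostly AI-driven
print(hybrid_score(60, 85, ai_weight=0.3))   # 67.5
print(hybrid_score(60, 85, ai_weight=0.8))   # 80.0
```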

And explainability is also part of the best practices: ensure that at any point in time you are able to explain the particular score the AI model has assigned to a person.

It's a good practice to surface the values that show why a lead scored high. For example, pricing page visits might be a strong buying-intent signal, so a person who visits the pricing page would score relatively higher than a person who just opens an email.

And of course, document the model very well, and if you are maintaining multiple versions, version control and so on are also best practices. Finally, switching from a conventional scoring model to an AI-based scoring model is a transformational change, so I would suggest a crawl, walk, run approach. Do not switch over from the conventional model to the AI-based model all of a sudden. You need a lot of marketing and sales alignment, alignment with your engineering team, and alignment with your data team to ensure that the transition is smooth and well received by the entire team.

Yeah, yeah. Thank you so much.

I was talking on mute. Thank you so much, Darshil, for that use case. It's fascinating that it's AI, but not large language models, right? So that's pretty cool.

So I think we have time for one or two questions from the chat.

So there is a question on lead scoring, Darshil: will we need to train the model first, or can it take the data that we already have in the instance? Yeah, that's a really good question.

In most cases, the way we do it is to use the historic data that's already in the Marketo instance to train the model, to ensure the model is trained on how people in your instance engage with your marketing content and, down the funnel, convert into customers. For other models there could be a lot of datasets available online, but that is something you should not do for Marketo. Each Marketo instance is different: you have different marketing strategies running in each instance, and a different set of people and prospects who engage differently with your content. So it's necessary to train the model on your own Marketo instance's historic data to get maximum accuracy. That is what I would recommend here.

Alan also sent a question: can Snowflake flow into ChatGPT or another AI platform to create AI-based list segmentations that then flow into Marketo? I've never done something similar, have you guys? I haven't, but I know it is possible. You can run your audience data through an AI to get those segmentations, and then you can flow them into Marketo by creating new static lists and adding the leads to the lists using the Marketo ID. So you could do that, but I've personally never done it.

The other ones.

Akshat asked if we have detailed documentation on how to implement the lead scoring with ChatGPT, using a webhook or the API.

Do you have documentation for that use case, Darshil? I don't have it documented anywhere yet, but I'm working towards posting it on the Marketing Nation. Yeah, thanks, Akshat, for asking that. I think a lot of other members would be interested in it, so I will certainly post it on LinkedIn and the Marketing Nation.

So keep an eye on the Marketing Nation; you'll see that soon.

And I think that's all for today, everyone. Thanks so much for attending such a productive session.

Great day. Yeah. Thank you so much. Have a great day and a great weekend. Bye.
