
AI Assistant – Beyond the Basics

This session empowered attendees to start using the AI Assistant with confidence, building proficiency that translates to workflow efficiencies and marketing performance. It covers key use cases for AI Assistant, how to apply the tool's capabilities, and how to grow AI Assistant usage.

Key Discussion Points

  • Learn foundational concepts and initial set-up of AI Assistant in AEP.
  • Examine key use cases, the application of capabilities, and tips and tricks for maturity progression.
  • Discuss upcoming capabilities and enhancements of the tool.


Transcript
Hi, everyone. Thank you for joining. We’ll be getting started here in the next couple of minutes. My name is Namita Patel. I am a Solution Customer Success Manager, and I will be today’s host. Today’s session is focused on AI Assistant, Beyond the Basics, led by Ben Joseph. We’re going to wait here just a couple of more minutes and let everyone filter in, and then we’ll go ahead and get started. So while we wait, we can have everyone comment in the chat where they’re located, their company, and what they’re hoping to get out of today’s session. We’d love to hear from everyone. While we’re waiting for everyone to join, we do have some upcoming webinars that may be of interest to everyone. So on May 12th, we have an AI-driven automation webinar specifically for Adobe Workfront. And May 13th, we also have a Mastering Digital Asset Management webinar. And May 15th, we have the Intro to Components of Cross-Solution Architecture Design webinar, specifically focused on Target and AJO. May 15th, we also have an AEM Cloud Optimizing Performance and Preventing Issues webinar. And then May 16th, we have an Elevating Digital Experiences with AI-Driven Insights webinar focused on CJA, Analytics, and Target. And then lastly, May 21st, we have a Digital Trends and Preparing for the Future webinar, which is focused on all Adobe solutions. So if you guys are interested, please register for these webinars. I’m seeing some questions. Thank you for participating. I love this. All right. We’re going to go ahead and get started here. Ben, I will kick it off to you. Perfect. Thank you, Namita. So hi, everyone. Thank you for being here today. I do appreciate the time. So as Namita mentioned, my name is Ben Joseph. I am a strategist on our Field Engineering Ultimate Success Team. I specialize in AEP, really focused on Real-Time CDP at the application level and really a lot around audience and media strategy. And really the intention for today’s webinar is to focus on the AI Assistant within AEP. And as the title states, we want to go a bit beyond the basics. We want to take a deeper look into its application and really how to get the most out of the tool. So we’re going to kick off with just a brief intro in terms of what it is, if you’re not already familiar. So this is the tool that came out about a year ago. So we’re going to do a quick overview just of the capability and then different areas where you can look to see different modes of value. We will spend a majority of the time on that second bullet there, really looking at its key use cases, how to apply its capabilities to real-world examples, how you can go about building maturity over time. And then we’re going to look at a few of the upcoming capabilities as well. It is worth a call out that some of the things that we’re going to be talking about in the practical examples are not yet available. So some of these features, they are still in alpha, they’re still being rolled out. So we’ll make it very clear as to what those are. And then we are going to end with just some quick notes on how you can go about getting started. AI Assistant is relatively simple to set up, but there is an opt-in process. So we will provide some info in that regard. So with that, I am excited to jump into this newer capability. So let us go ahead and dive in. So I want to start here just to level set and to really showcase where and how AI Assistant fits into the larger Adobe ecosystem. 
And I’m doing this with the acknowledgement that I think it can often be challenging to track all the developments, the products, the components of AI in today’s world. And this is of course across the industry as a whole, but also within Adobe as well. You’ve likely heard buzz around a lot of the things that Adobe is doing with AI. I am referring to names like Adobe Sensei, Adobe Firefly, Adobe generative AI, GenStudio for Performance Marketing. And most of these revolve around the creative and the generative side of AI that’s being rolled out. And while AI Assistant is obviously an AI tool, it is a bit different in that it’s designed for productivity, right? So specifically to help users get things done inside Adobe Experience Platform. So we’re going to get into the capabilities, but I’m referring to things like answering questions about the product, right? Checking pieces of data, troubleshooting issues, different things like that. So while the generative tools are really about the content, image or text creation, this tool is really more about that productivity, right? And that’s the lens for how we’re going to use this for the rest of the session. So now that we’ve sort of seen where it fits into Adobe’s broader AI strategy, I want to get a bit clearer just in terms of what it actually is and what it does. So at its core, AI Assistant is a conversational interface that’s built directly into AEP, right? It lets you ask questions in very simple and plain language and immediately get back direct, contextual answers that are not only based on Adobe product knowledge, but are also based on your actual environment too. So everything you see here from troubleshooting, gaining product knowledge, surfacing audience data, analyzing profiles or segments, these are real use cases and tasks that we see customers doing today. And it’s worth a call out. This is a tool that’s constantly evolving, right? Because it’s relatively new, there are some specific features that are being worked on or in testing stages. So again, you’re going to hear us talk about things that are on the way. So if you see something that does spark interest, but you don’t yet have access, we’d recommend getting in touch with your Adobe rep in that regard. And then just another thing I do want to reiterate here, which I think really makes this tool very powerful, is that again, it’s not just trained on Adobe product knowledge. Yes, you can ask it to explain the concepts of the tool. You can get very helpful responses, right? So for example, what is a dynamic segment? How do I create one? This would be based on product knowledge, but importantly, this is merged with data from your actual instance, right? So asking questions like show me my top audiences by profile count or activation status, this is really where you can start to unlock the unique value and I think really get the most out of the tool. So as we start talking about building maturity, we’ll want to think about these different types of queries and responses, right, and how you can go about leveraging them differently. So I don’t think you can talk about a specific tool and its benefits without talking about some of the actual value that it can yield, right? So these numbers reflect usage and results that we’re seeing across real customer environments, and I think these are really compelling because they’re tied to real outcomes that matter to customers, right? 
So that 70 plus hours saved that you see, this is coming from real things like reduction in troubleshooting times, reduced manual QA, more efficient audience setup. The 98% reduction in data hygiene effort, this comes from a user who used to manually search for unused audiences, but now just asks which ones are being used in journeys. The 80% faster time to master new features, this comes from a team that used to hunt down documentation, reference Slack channels, and email team members with product questions just to learn about AEP and new features. Now they’re just asking the AI Assistant. So really the takeaway here is that this is a tool that can save real, tangible hours that really do make a difference when you’re considering your workload efficiency. So again, really think of this as a productivity tool. I think you’re going to find from a practitioner point of view, there are a lot of smaller but really cumbersome tasks that you might be doing within AEP today. And really the idea is that this tool can hopefully help you navigate and speed up some of that work. So before we dive into use cases, I do want to quickly address what I think is a very important area and that’s really around data stewardship. And we know that customers very rightfully do have concerns around things like data sharing, privacy, governance. So I do think it helps to review just some of the actions that Adobe has taken around tool responsibility and governance. So first, and this is a major call out when using the AI Assistant, nothing is shared across organizations or customers. Your data and your customers’ data, that is yours. That’s staying within the tool. It’s never shared. On the topic of LLM restrictions, nothing you type or ask is used to train other models, and any LLM logging is disabled, so we don’t want that being a concern. With regard to content filtering, everything is essentially filtered, meaning any personal or sensitive data is scrubbed and filtered out before answers are processed. And lastly, answers are traceable, right? So when the AI Assistant tells you something, it shows you exactly where that information is coming from. I know personally, I’ve used other LLMs in the past. I’ve gotten back some wild responses that had nothing to do with my original question. AI Assistant does try to avoid that, right? It will give both sources and citations when it does provide responses. So again, I do think it’s important to just highlight these guardrails that are in place that really do hopefully help shine a light on some of the measures that Adobe takes to execute this tool responsibly. So now I just want to get into really the heart of what AI Assistant actually does from a use case perspective. And I’m going to break this down by categories, sample prompts that you might see, what role may commonly leverage this type of query, and really what it could essentially help deliver. So first, I think a pretty simple use case, we see new users leaning on this product to really understand platform concepts. So they’ll ask questions like, what’s the identity graph? What is a schema? And really get quick responses, which allow them to immediately understand the capability in more depth. Where this was once something that may have required digging through documentation or enablement, this is now a quick question that can be answered directly right in the interface. 
We also see it being used for onboarding and enablement. Admins or new team members working on the tool may use it to explore foundational data pieces like schemas or relationships in their particular instance. Or a prompt like, what audiences are using attribute X? This can help them understand existing configurations before maybe creating a new segment for an upcoming campaign. We will see marketers use it to look for dormant or stale segments with prompts like, what audiences haven’t qualified anyone in 30 days? This can obviously help with decluttering and basic things like data hygiene. Data teams can run it from a data management perspective, so prompts like show me schemas with customer loyalty scores can maybe be used to flag unused or outdated segments. Practitioners can use it in real time to debug or troubleshoot. So, why isn’t a user qualifying for this segment, can be a prompt that looks to investigate an issue. This can obviously save back and forth with internal comms, maybe even chasing tickets. And then finally, journey owners, right? They can validate live connections with a prompt like, what journey is this audience used in, which can obviously be really helpful when you’re launching a new campaign or maybe a journey. So I know there’s a lot here, but I do think it’s important to understand because there is a lot of opportunity and application to use the AI Assistant across a number of different areas, across a number of different roles, right? This is not just a chatbot that a practitioner can use to ask product questions. This can be used by resources across a ton of different roles within your organization in a lot of different and unique ways. So one of the questions that we commonly see is, okay, so how does this now evolve? And what we’re showing with this table is essentially how these prompts start as one-to-one interactions, right? You ask a question, you get a response, but ultimately they become more embedded into broader workflows and then just become part of your larger strategy as a whole. So just as an example here, looking at a manage audiences use case, at first someone might ask the simple prompt, what audiences haven’t qualified anyone in 30 days? So at first you just get a simple response back on a one-off basis, but then maybe you start to augment that prompt into your larger workflows, right? Maybe it becomes part of your weekly QA process, especially when creating or maybe activating new audiences. And then maybe it progresses to a larger strategic tool at an org level, right? Maybe it’s now included within your internal documentation, or maybe it’s included in your onboarding checklist. And you can see how this can really be done across all six of these use cases with this similar structure. So it’s really not just about these one-off prompts to solve an in-the-moment problem, right? But the idea is to take these prompts and really try to create structure and habits around them to where they’re fully incorporated into your workflows. And this is really how you can think about starting to build adoption and maturity with the AI Assistant as a whole, right? And we’re going to break this down in the next section where we start to look at prompts, where you can go about applying them, and really just how to move beyond just asking a question and then moving on from there. 
So now I just want to get into really how this tool works at a capability level, because that’s essentially what’s really going to set the stage for how you can go about using it most effectively. So what we’re showing here are really the baseline behaviors that are useful to know upfront, really the ones that help users get comfortable and really build a foundation for more advanced use down the line. So first, there is the Discovery panel, which is essentially a quick launch area for predefined concepts. So it’s organized by topic, so think things like audiences, journeys, data sets, and it lets users effectively explore themes without needing to come up with the right wording from scratch. Then there is prompt and object autocomplete. So once you start typing, AI Assistant will suggest complete queries or even fill in object names like the specific datasets, the specific schemas or segments that actually do exist in your environment. This is obviously great as this removes a lot of the guesswork and the manual searching that you might have done. Review sources and explore references. These are more about where the information in the responses is coming from. The AI Assistant often links out to documentation or lets you even hover over and inspect schemas or other objects without ever needing to even leave the chat. And lastly, analyze response structure. This is a bit more subtle, but it is important. It’s really the idea that AI Assistant’s answers are structured, right? So you’re not just getting a paragraph of text. You might see things like tables or breakdowns, linked objects, or even explanations and sources of how that response was generated. So again, what we’re starting with here, these aren’t necessarily the fanciest ways to use AI Assistant, but we do find they’re a really great natural entry point for new users really before starting to move into some of these more complex workflows. So once users start getting comfortable with the basics, one of the first areas where we start to see AI Assistant playing a greater role in everyday use is really across data hygiene and governance, or really just keeping your AEP instance clean and well maintained, right? So what you’re seeing here is really a simple set of prompts that do things like help teams monitor data quality, maybe system health or permissions. For instance, if you ask which attributes haven’t been used in the last 30 days, this can help you identify fields that may be stale or redundant, right? Maybe something got created for one-off use, but it was never cleaned up. Another prompt may be which data sets haven’t received any records in seven days. So this can be a helpful, quick way to catch maybe a broken source before it actually becomes a bigger issue. We also see prompts around identifying duplicate audiences. This can be really helpful for proper profile counts. It can be helpful for data accuracy. From a permissions and governance standpoint, we see things like what destinations are connected to this instance or which users have access to this sandbox. So it’s important here because it’s not all about data hygiene, right? It can also be used to verify that the right systems and the right people have the right level of access, right? And then on the right side here, you’re going to see a few examples of how AI Assistant actually surfaces those answers, right? So showing attributes that aren’t being used in audiences or walking through the logic of audience duplication. 
So just to take a pulse right now on where we are. So the previous slide, we talked about actually learning the tool. And now we’re sort of segueing into how you can apply it for varying degrees of maintenance support. So really, I think the key here is once you’ve got the fundamentals down, you can start to use it in these more evolved ways, right? And a good starting point for that can be using it to keep your AEP instance cleaner, more stable, and really just easier to operate over time. So now I just want to go into really how AI Assistant supports troubleshooting. And I will note up front, not all of this is fully available in every environment yet. Some of what you’re seeing here is still being rolled out. But with that said, even with what’s live today, we’re already seeing teams using the Assistant to dig into issues that may have historically required a lot of work or maybe even escalation. So a common prompt we may see, why aren’t any profiles getting qualified in my audience? So in this case, you might get answers with details around whether the audience logic is empty, or if attributes are missing, or if certain activation rules are off. And this is obviously something that previously may have required a ton of digging. So getting a quick answer here, it’s clearly very valuable. A prompt like this can get a response exactly where it is that you’re hitting a roadblock, but it can also help from a knowledge perspective too, right? So the explanations, the sources, they can really help you better understand what the root cause of your issue actually is. One of the upcoming features, but prompts like create a support ticket or check the status of case 12345. The idea here is really to cut down on that friction between hitting a roadblock and getting the level of help that you need. Especially if you’re deep in a workflow and just want a straightforward path to resolution, and you’re not wanting to switch systems or even mindsets really to chase down the updates that you’re looking for. And the sample output that we’re showing here, this is demonstrating how AI Assistant can break down a segment troubleshooting workflow. So you can see that it’s really with a step-by-step logic. It’s not just a quick one-off response. So really there’s a lot of value in using the AI Assistant across troubleshooting and support issues. Some of the capabilities here, I will call out again, they are still evolving, but I do think it’s valuable. It’s a valuable shift just in terms of getting clearer, faster answers without jumping through a ton of hoops that you might have gone through previously. So I know a few slides ago we did talk about really how AI Assistant can be used to help clean and maintain your AEP environment. So this next set of prompts might feel similar at first glance, but the focus here shifts. This is more so really about making sure that your data isn’t just clean, but that it’s campaign ready. So you might see prompts here like, show me the schema associated with this data set, or is this schema profile enabled? And this is really about making sure that the data that you’re relying on is actually connected properly and that it’s usable in real time for segmentation, for activation, whatever it might be. And this really becomes especially important when maybe audiences are not qualifying as expected, or maybe the data isn’t flowing how you intended it to. 
Then there are prompts like which audiences use consent attributes or which audiences use loyalty attributes. And really what these do, they’re letting you see how key data elements are being used inside your segmentation. So it’s a really helpful way to test certain segments, maybe before campaigns go out, particularly if those data points within the audiences are tied to certain compliance rules, or maybe they’re tied to personalization logic. Another one worth calling out, show me all the fields of the schema that contain field X, right? This is really a nice shortcut for understanding the structure of important fields. And I think it’d be particularly helpful if you’re not the one who originally created that schema, and this is a net new workflow for you. So really, again, we see this as really another useful application in starting to build maturity with AI Assistant. And really the key here is that you can use the tool for really this set of checks that can hopefully give you more confidence that your data structure within AEP is going to properly support your targeting, your audience qualification, your activation, really whatever it might be. So this next slide now pivots us really into the space of audience insights. And this is really about helping teams get a better understanding of who’s in their audiences, how big those groups are, and really how they overlap. So it can be really helpful just in terms of shaping actual audience targeting and different segmentation strategies. So on this slide, we’ll just look at two capabilities here. So the first is discovering and validating audiences. So prompts like, how many people live in California, are over 21, have an email address, and have consented to email marketing? These can be really helpful as far as checking eligibility criteria, let’s say before actually launching a campaign. This early validation can help avoid things like maybe sending to an audience that’s too small or maybe misaligned with campaign goals. Secondly, we have trend analysis and shifts. So you might ask what’s the overlap between two audiences or how has this audience changed over the last month? And this is really useful across a few areas. So one, it can definitely help reduce redundancy, right? So like audiences that look too similar, but at the same time as well, you can use this as a scale play, right? You can expand out to additional audiences that may be similar to your intended target, but don’t exactly resemble them. So a lot of use cases from that perspective. And now we are essentially progressing from audience insights to audience forecasting. And I think the example here is pretty straightforward. You can ask the Assistant for historical audience sizes. It’ll return a trend line. So in this example, how your social media engaged audience has changed month to month. And what it can do is take these trends and then project them forward, right? So now instead of just reacting to your audience sizes, you can look at forecasts of where they might be headed. So this is obviously really helpful when we’re thinking about things like campaign planning, budget allocations, timing. So if you see that an audience may be shrinking, just as an example, you’re going to obviously action on that differently than if the audience was growing. And again, I will caveat with these last few slides. 
Some of this is still a bit forward-looking and not every one of these features is available yet, but they do reflect the kinds of audience level insights and trends that are upcoming with the Assistant that I think can support further use cases. So this is our last slide here in the capability section, but this is now taking a shift into journey tracking and performance validation. So once your audience is built and maybe you’ve launched it into a journey, how do you make sure it’s working as expected? So in this case, AI Assistant can support tasks and prompts like, which audience is in this journey? So this can be helpful context for confirming the right segments are mapped. How many profiles have entered? This can help check if qualification is working as intended. And even early performance metrics like what is the click-through rate for this journey can help give you an early quick glance into performance. And I’ll acknowledge, on the surface, some of these queries may seem more basic, but in a lot of cases, these questions aren’t always easy to answer quickly, right? In some cases, you’re navigating to other areas of AEP. You may be juggling between different systems. You might be inquiring with other team members who might be closer to the reporting. So really, I think that this is a nice shortcut as far as getting what you need in a much quicker way. You will also notice here in the output example, the Assistant is essentially showing how it arrived at the answer. So we walked through this concept a little bit earlier, but in this case, what it’s doing is showing how the CTR was calculated and what data objects it pulled from. And I think this is useful not only because it hopefully allows you to have a greater trust in the output, but it also makes the Assistant more teachable, right? So you can follow the logic, you can spot gaps, you can even ask better follow-up questions, and you can dig into the topics in more detail if that’s what you choose. So again, I apologize. You’ve heard me say it a few times. Not everything that we talked about in the previous slides is out just yet. So what we want to do here, we want to paint an overall picture of AI Assistant in terms of what you can do with it, how again to go beyond the basics, and really just how to view different levels of maturity. So here, the intention is to help you sort of delineate between what you can do right now today and what you’ll be able to do in the near future. So what you’ll see here, yes, we covered some of these things like forecasting audience conversions, identifying significant drops in engagement, exploring usage trends within your data sets. These are all advancements that will be available soon. We know some of the updates around support are going to be very valuable. A lot of our customers are excited about opening and checking tickets without having to switch tools. And some of these directly build on the things that we talked through earlier, and I think that’s important, right? So for example, moving from simply tracking audience recency to flagging meaningful shifts in audience behavior, or maybe going beyond checking a data set for trends to actually visualizing the data within a built-out table. 
So really, hopefully, as you’re in the AI Assistant and you’re testing the tool for yourself, you’ll be able to get an idea of all the things that you can do today and how you’ll ultimately be able to expand and hopefully mature from those initial capabilities with what’s in the tool now and then what is upcoming. So I want to close with just sort of a recap. So I know we’ve walked through what the AI Assistant can do, how you can think of it in terms of maturity progression, how the tool is evolving. But again, as I mentioned earlier, I do want to shift quickly into what’s needed to actually get started. So it’s relatively straightforward, but it’s not as out of the box as, let’s just say, opening Segment Builder or Customer AI or something simple within AEP, right? So as far as the steps go, the first is just around licensing. So since AI Assistant relies on Adobe’s generative AI infrastructure, as part of that, customers do need to agree to the specific licensing terms tied to the features. And these terms are just designed to ensure responsible, compliant use of AI, particularly because the Assistant is accessing customer data within your instance. We’re not going to go through the full contract language here, but it’s just important to know that this agreement does need to be signed before the Assistant can be enabled. Your admin or your legal team may already be familiar with this requirement, but if it is something you haven’t done yet and want to get going, this is where you would start. And then once licensing is squared away, the next step is just enabling access for different AEP users. So this is where the admin configuration comes in. It is worth a call out, AI Assistant is not enabled by default. So admins will need to go into the proper permission settings and assign the correct access to users or user groups. And really this does go beyond just giving permissions and access to AI Assistant as a whole. You might have users who have restricted access to the platform. It may be read or write access across certain areas. Maybe they have no view access across specific functions. And permissions for AI Assistant can be configured at that level. So for example, you might decide to only offer some users access to product knowledge through AI Assistant, maybe only the operational insights view, maybe workspace level permissions. So this step is important just to make sure the right users do have access to the right level of insight and capabilities within AI Assistant. Depending on what your governance model looks like, you might choose to start small, maybe only have two users enabled, but then you might expand out as teams begin using the Assistant more regularly. Okay, so just to wrap up there. So I know we’ve looked at a lot. We’ve looked at how AI Assistant is being used today, the types of prompts and patterns that can really drive value, how you can go about building maturity over time, what’s on the horizon, and then again, like we just covered, how to get started. So really whether you’re just beginning or you’re planning for more advanced use, really I think the goal with AI Assistant is the same. We want you to start using this tool to get insights faster. We want you to use it to reduce the time you might spend digging through data and documentation and really ultimately work more efficiently within Experience Platform. So I do hope this was helpful. You’ll have access to this deck and this content afterwards. 
It’ll all be sent around, and with that I believe Namita might be launching a poll, and I believe we’re just going to be checking the Q&A pod now just for some questions. Yeah, so I’m going to go ahead and launch the poll right now, and Ben, we do have a couple of questions that I can run through here, or if you want to take a look, Ben, in the Q&A pod. Yes, I will start with the bottom. I’ll just try to go through all these. I might not be able to answer all these on the spot, but I will take a quick glance and we could also send around answers to these afterwards. So one, do you have specific time saving analytics for Adobe Workfront? So the AI Assistant, a version of it is available in Workfront. I’m not as familiar with how that one works across Workfront as I am with AEP, so that will have to be a follow-up that we will send over to you. Is there an estimated date when the new features will be released? That can also be addressed as a follow-up. We do have estimated dates, but they are going to vary based off of the function, based off of the current roadmap. So if that’s something that you reach out to your CSM or your TAM about, that’s definitely something that we can get more information on. Does it have the capability to create, remove, or update audiences, data sets, or schemas based on specific prompts? That’s an excellent question. Right now the more generative components of it, where you’re actually asking it to do that for you, are not built into the platform. Whether that becomes more of a plan within the roadmap when we’re thinking more about agentic AI and really looking at this larger concept of having the tool act for you, that’s something that could be on the horizon, but it is not something that actually exists within the tool today. Today it’s more about receiving guidance, reporting insights, and then actually actioning on that yourselves. There was a question, are AI capabilities available in Adobe Experience Manager? AI Assistant, there is a version of it I believe available in AEM as well. Again, I’m not going to have as much insight into that. This was focused specifically on AEP, but yes, I do believe it’s within other platforms today. How do we know what is supported versus not supported? Is there any documentation link? Yes, when we send around this deck there is a link within that slide that goes into what some of those future capabilities are, so you should get a better idea of exactly what it is that you can do today versus what might be upcoming within the next few months. And I think that’s what we have so far in the Q&A. Ben, there are a couple questions in the chat area, will audience agents be part of the AI Assistant? Yeah, so that’s a great question, that’s going to be sort of a future outlook, something that we’re going to look at. It’s not available today, but it should be a part of, I think, a larger vision again when we’re thinking about agentic AI and how different components and different agents can essentially own different components of CDP and CJA and AJO and work towards sort of larger prompts on a more proactive basis. So I can’t necessarily speak to what that roadmap looks like or what that vision looks like, but I do believe that that’s ultimately where we do want to go with specific workflows of individual agents, but with this specific capability we’re not there yet. 
I see a couple questions again about AEM Sites. Unfortunately I can’t speak to AEM right now, only how AI Assistant works in AEP, but again, just reach out to your Adobe rep and we can definitely get you more information on how AI Assistant, or components of it, live within other solutions as well. I think that covered everything within the chat. Yeah, and then someone just asked if you are going to share the product roadmap for AI Assistant? Yeah, I believe that that’s in one of the links, so we can get that out as well. Yep, yeah, I think that covers all of the questions. Perfect. One more just came in, as an admin, how do I enable the AI Assistant for users in the Admin Console? So that was part of the last slide. The detailed steps are included within that last slide, so once you click through to that you should have detailed instructions in terms of just the step-by-step process to get the right users enabled. Perfect, I think that covers all the questions. Thank you, everyone. Yeah, thanks everyone, appreciate the time, hope this was helpful. Have a great day.