Leveraging GenAI for Creating Personalized Experiences at Scale with Edge Delivery
Today, businesses face constantly rising personalization expectations, demands for fast campaign delivery, and ever-tighter marketing budgets. What if… business users could do more with less, creating and publishing personalized campaign content themselves in seconds?
In this talk, we will show a demo leveraging GenAI tools, including WatsonX and Firefly, to quickly produce performance-optimized, personalized landing pages served with Edge Delivery. We rely on Adobe Workfront for initiating and tracking the landing page generation process. A custom microservice running on Microsoft Azure generates Edge Delivery documents stored in Google Drive. The microservice leverages WatsonX to create personalized text targeting specific audiences, while images are pulled from Adobe Stock or generated with Firefly. A custom Google Docs editorial plugin enables tweaking the generated content with the help of an AI sidekick. Edge Delivery makes content creation remarkably quick and delivers experiences with a Lighthouse score of up to 100.
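The generation flow above can be sketched roughly as follows. This is an illustrative outline only: every function name, field name, and URL is an assumption standing in for the real WatsonX, Adobe Stock/Firefly, and Google Drive integrations, not the actual implementation.

```python
# Hypothetical sketch of the landing page generation flow; the real
# microservice calls WatsonX for copy and Adobe Stock / Firefly for imagery.

def generate_landing_page(segment: str, brief: str) -> dict:
    """Assemble one personalized landing page document for a segment."""
    # 1. Generate segment-specific copy (stands in for a WatsonX call).
    headline = f"{brief} for {segment}"
    body = f"Personalized offer text for the {segment} audience."

    # 2. Pick imagery (stands in for an Adobe Stock search or Firefly render).
    image_url = f"https://example.com/images/{segment}.jpg"

    # 3. Build the document as a list of blocks following a standardized
    #    content structure, ready to be written to Google Drive.
    return {
        "segment": segment,
        "blocks": [
            {"type": "hero", "headline": headline, "image": image_url},
            {"type": "text", "body": body},
        ],
    }

page = generate_landing_page("outdoor-enthusiasts", "Summer hiking gear")
```

Keeping the output as a fixed list of typed blocks is what makes the later review and validation steps tractable.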
With this approach, businesses can generate complete, personalized landing pages in seconds, not days or weeks. Combining the capabilities of GenAI and Edge Delivery for personalization at scale brings a productivity lift to the marketing team, with lightning-fast delivery and excellent page performance.
Luka Miroić
We are using a "template" Google document, which can differ for a specific customer segment storefront. This could be a nice improvement for the next steps.
Vlad
How do you keep track (visually) of what was actually approved for publishing?
Luka Miroić
We are using a standardized content structure, so it's easier to manipulate all the content generated by AI for a specific customer segment storefront. But we had to do a lot of validation to ensure correct content is returned by the AI.
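A minimal sketch of what such validation could look like, assuming the standardized structure is a list of typed blocks with required fields. The block types and field names here are assumptions for illustration, not the real schema.

```python
# Illustrative validator for AI-generated content against an assumed
# standardized block structure; the schema below is hypothetical.

REQUIRED_FIELDS = {
    "hero": {"headline", "image"},
    "text": {"body"},
    "product-carousel": {"skus"},
}

def validate_blocks(blocks: list[dict]) -> list[str]:
    """Return a list of validation errors; an empty list means the content passes."""
    errors = []
    for i, block in enumerate(blocks):
        btype = block.get("type")
        if btype not in REQUIRED_FIELDS:
            errors.append(f"block {i}: unknown type {btype!r}")
            continue
        missing = REQUIRED_FIELDS[btype] - block.keys()
        if missing:
            errors.append(f"block {i}: missing fields {sorted(missing)}")
    return errors

# A hero block missing its image is flagged rather than published.
errs = validate_blocks([{"type": "hero", "headline": "Hi"}])
```

Running structural checks like this before human review catches malformed AI output early, so reviewers only see content that at least fits the template.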
Markus Haack - Adobe
How do you make sure AI generated content & images match to the brand guidelines (tone of voice, imagery etc)?
Tad
How do you QA the generated storefronts, to make sure they don't turn out nonsensical?
Luka Miroić
This is why we have a review process done in two steps: first, a Marketer has to do the initial review, and after that the final approval has to be given by the Publishers. It might be possible to train an AI model to do some initial validation, but in my opinion, for now, the final approval should be done by the Publisher.
Luis Hernandez
How do you ensure the generated images show the right products? How do you ensure GenAI doesn't get too creative?
Luka Miroić
We have a template document with specific blocks, but some of the blocks we custom-implemented or extended from existing components, for example the Hero product component and the Product carousel component.
Luka Miroić
For the CTAs that lead from the landing page to the product page and back, we had to do some manipulation: because we create both pages dynamically, the paths to them have to be built dynamically on creation. But all the CTAs related to shopping, for example the Add to cart button and the Checkout process, are handled OOTB by commerce blocks already developed by Adobe. Only a small configuration in a CSV file is necessary to make this work.
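The dynamic path building described above might look something like this. The path scheme and function name are assumptions for illustration; the real logic depends on the actual site structure.

```python
# Hypothetical sketch of deriving linked paths for a landing/product page
# pair created together, so each page's CTA can point at its counterpart.

def build_page_paths(segment: str, campaign: str) -> dict:
    """Derive the cross-linked paths for one generated page pair."""
    base = f"/campaigns/{campaign}/{segment}"
    landing = f"{base}/landing"
    product = f"{base}/product"
    return {
        "landing": {"path": landing, "cta_target": product},
        "product": {"path": product, "cta_target": landing},
    }

paths = build_page_paths("outdoor-enthusiasts", "summer-sale")
```

Since both paths are derived from the same segment and campaign identifiers at creation time, the cross-links stay consistent no matter how many page pairs are generated.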