Do you have the infrastructure needed to sell on LLMs?

How do you sell to a customer who doesn't use a search bar?
As Large Language Models (LLMs) like Gemini and Perplexity move toward "instant checkout," the conversation in retail boardrooms has shifted from "if" to "how."
But the "how" is proving to be a significant source of friction. We’re seeing major global retailers scramble to organize "tiger teams" to handle direct integration requests from OpenAI and Microsoft, while mid-market organizations and sellers are left wondering how to even gain access to these new channels.
The questions we hear most often aren't about the front-end interface — they’re about the "black box" of agentic discovery: How do I rank inside an LLM? How do I enrich my product catalog so that an AI doesn’t just see my products, but prefers them? What are the protocols, and how do these channels actually differ from TikTok or Instagram?
While there’s a lot of excitement, there’s an uncomfortable truth hiding behind it: Most retailers and brands are still trying to solve today’s problems with yesterday’s infrastructure.
Despite the high-profile partnerships, early data shows a massive gap between referral traffic and actual transactions.
That gap exists because while a partnership marks the beginning of the journey, building the backend to make an AI agent an effective, autonomous shopping assistant is a far more complex undertaking.
The window for readiness is closing, and the winners won't be those with the most headlines, but those who solved the infrastructure gap first.
The front end is moving fast, but is your infrastructure ready to deliver?
The direction is clear. When Perplexity enables in-chat checkout and Microsoft embeds purchasing into Copilot, AI agents aren't just a novelty; they're a meaningful transaction channel.
For brands, retailers and wholesalers, this is a massive opportunity to capture conversational queries traditional search would never touch. But what actually happens behind the scenes when an AI agent tries to execute a transaction?
For most retailers, the answer is a tangle of infrastructure that doesn't exist yet. To compete, you need to address four critical pillars.
Pillar 1: How discoverable is your catalog across the AI ecosystem?
AI agents don't browse your website like humans do. They gather data through crawling, Google Shopping listings, Model Context Protocol (MCP), and structured feeds like the OpenAI Product Feed. This creates an immediate challenge: How do you provide clean, fast updates across all these access points simultaneously?
Consider what "clean data" really means. When a consumer asks for "sustainable running shoes under $150 with arch support for overpronators," can an agent find that level of detail in your catalog?
If your product descriptions are focused on lightweight marketing copy rather than structured specs — using vague phrases like "premium feel" instead of verifiable attributes like "ANSI safety certifications" or "100% post-consumer recycled nylon" — your products are essentially invisible to AI. While humans can infer meaning from lifestyle images, LLMs require contextual depth to verify facts and make a recommendation.
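To make the contrast concrete, here is a minimal sketch of what "agent-readable" product data can look like, using schema.org-style Product/Offer fields. The product name, values and the `archSupport` property are invented for illustration; real feeds would follow the specific schema each channel publishes.

```python
import json

# Marketing copy an agent cannot verify or filter on:
marketing_copy = "Premium feel. Runs beautifully."

# Structured, schema.org-style attributes an agent CAN match against
# (values here are hypothetical, for illustration only):
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Runner X",
    "material": "100% post-consumer recycled nylon",
    "additionalProperty": [
        {"@type": "PropertyValue", "name": "archSupport", "value": "overpronation"},
        {"@type": "PropertyValue", "name": "weightGrams", "value": 240},
    ],
    "offers": {
        "@type": "Offer",
        "price": 129.99,
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# An agent answering "sustainable running shoes under $150 with arch
# support for overpronators" can now match explicit fields instead of
# guessing from prose:
matches = (
    product["offers"]["price"] < 150
    and "recycled" in product["material"]
    and any(p["value"] == "overpronation" for p in product["additionalProperty"])
)
print(json.dumps(product, indent=2))
print("matches query:", matches)
```

The point isn't the exact vocabulary; it's that every claim an agent might filter on exists as a discrete, machine-checkable field rather than buried in copy.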
Fast updates matter even more. A lag of just a few hours can result in an agent quoting an incorrect price or recommending an out-of-stock item. This doesn't just lose a sale; it erodes the agent's "trust" in your brand.
Maintaining custom integrations for every major LLM is an architectural challenge that requires constant iteration. But before you pivot your technical roadmap to build for these new protocols, you need to establish a baseline: Is your catalog even visible to these engines today?
Assess your Generative Engine Optimization (GEO) readiness to see how your products currently rank.
Pillar 2: How do you manage multi-LLM connectivity at scale?
Before a transaction can happen, you have to actually connect. This is where many organizations hit a wall.
Every LLM has its own technical requirements and emerging protocols — such as the Universal Commerce Protocol (UCP) or the Agentic Commerce Protocol (ACP) — that retailers, brands and wholesalers must adhere to.
Navigating these shifting standards requires immense development resources and specialized knowledge that most commerce teams simply don't have. For example, your teams likely haven't architected for agent-specific secure tokenization — ensuring an AI can execute a purchase without exposing sensitive raw data — or built the logic required to distinguish between an authorized agent checkout and a bot-driven fraud attempt.
The complexity isn't just technical; it's strategic. Right now, no one knows which assistant will win the "buy box" for your specific category. Will Gemini be your top performer, or will it be Copilot? To find out, you need the infrastructure to test multiple LLMs simultaneously without rebuilding your integration every time a new player enters the market. Are you building a single-point connection, or a scalable gateway to the entire AI economy?
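One way to picture a "scalable gateway" versus a single-point connection is an adapter pattern: one canonical product record, translated per channel. The output formats below are invented placeholders, not the real UCP, ACP, or OpenAI feed schemas (each of those has its own published requirements); the sketch only shows the architecture.

```python
from typing import Callable

# One canonical record, maintained once:
canonical = {"sku": "TRX-240", "title": "Trail Runner X", "price_usd": 129.99}

def to_channel_a(p: dict) -> dict:
    # Placeholder format A (hypothetical field names)
    return {"id": p["sku"], "name": p["title"], "price": f'{p["price_usd"]} USD'}

def to_channel_b(p: dict) -> dict:
    # Placeholder format B (hypothetical field names, integer cents)
    return {
        "product_sku": p["sku"],
        "display_title": p["title"],
        "amount_cents": round(p["price_usd"] * 100),
    }

ADAPTERS: dict[str, Callable[[dict], dict]] = {
    "channel_a": to_channel_a,
    "channel_b": to_channel_b,
}

# Adding a new LLM channel means registering one adapter,
# not rebuilding the entire integration:
feeds = {name: adapt(canonical) for name, adapt in ADAPTERS.items()}
print(feeds)
```

With this shape, testing a new assistant's performance is a matter of writing one translation function, which is what makes simultaneous multi-LLM experimentation feasible.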
Pillar 3: Can your checkout handle a transaction within an LLM?
Enabling a shopper to add an item to a cart within an LLM is table stakes. But the real challenge is enabling secure, autonomous purchases at scale.
Agentic checkout moves beyond traditional transactions. It requires agent verification, user consent and fraud prevention orchestrated in milliseconds, often while the user isn't even looking at their screen.
This requires a fundamental rethink of risk management. Traditional fraud prevention relies on behavioral signals like mouse movements, typing patterns and session duration. When an AI agent clicks "buy," those signals disappear.
Does your current payment infrastructure have alternative verification methods to validate these transactions without creating friction? Most legacy systems weren't architected for a world where the "shopper" is a piece of code.
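As a minimal sketch of what an "alternative verification method" can look like: instead of behavioral signals, an agent transaction can carry a cryptographic attestation. The example below verifies an HMAC signature over the order payload using a secret assumed to be shared with the agent platform at onboarding; the secret, payload, and handshake are all hypothetical.

```python
import hashlib
import hmac

# Illustrative only; a real deployment would use per-agent keys
# issued through a proper onboarding/rotation process.
SHARED_SECRET = b"demo-secret-issued-at-onboarding"

def sign(payload: bytes, secret: bytes) -> str:
    """Compute an HMAC-SHA256 signature over the order payload."""
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify_agent_checkout(payload: bytes, signature: str, secret: bytes) -> bool:
    """Validate that the checkout request came from an authorized agent.
    Uses a constant-time comparison to guard against timing attacks."""
    return hmac.compare_digest(sign(payload, secret), signature)

order = b'{"sku": "TRX-240", "qty": 1, "total": 129.99}'
sig = sign(order, SHARED_SECRET)

print(verify_agent_checkout(order, sig, SHARED_SECRET))       # authorized agent
print(verify_agent_checkout(order, sig, b"wrong-secret"))     # forged request
```

The design choice this illustrates: trust shifts from "does this session behave like a human?" to "can this request prove it was authorized?" — a check that completes in microseconds, with no screen in the loop.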
Pillar 4: How does an AI-driven order flow into your legacy stack?
The transaction doesn't end when the agent completes the purchase.
That order data, including customer details, tax calculations and shipping addresses, needs to flow seamlessly into your Warehouse Management System (WMS) and Customer Service platform.
But what happens when that order involves multiple marketplace sellers or dropship partners? Your infrastructure needs to route each line item, track fulfillment and sync updates back to the agent.
If a customer asks the AI about a return, does your support team have visibility into that transaction? Most order management systems assume orders originate from your own app or site. Extending that logic to multiple LLMs, each with its own data format, is a massive engineering hurdle.
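The multi-seller routing problem above can be sketched in a few lines: group an AI-originated order's line items by seller so each fulfillment partner receives only its own shipment, while the originating channel is retained so status updates can flow back to the agent. The order shape and field names are invented for illustration.

```python
from collections import defaultdict

# Hypothetical AI-originated order spanning two marketplace sellers:
order = {
    "order_id": "LLM-1042",
    "channel": "copilot",  # which AI assistant placed the order
    "items": [
        {"sku": "TRX-240", "seller": "acme-shoes", "qty": 1},
        {"sku": "SOCK-99", "seller": "acme-shoes", "qty": 2},
        {"sku": "BOTTLE-1", "seller": "hydro-co", "qty": 1},
    ],
}

def route_by_seller(order: dict) -> dict[str, list[dict]]:
    """Split line items into one shipment request per seller."""
    shipments: defaultdict[str, list[dict]] = defaultdict(list)
    for item in order["items"]:
        shipments[item["seller"]].append(item)
    return dict(shipments)

shipments = route_by_seller(order)
# Each seller now sees only its own items; fulfillment updates can be
# keyed back to order["channel"] for the agent to relay to the shopper.
print(shipments)
```

The hard part in production isn't this grouping step; it's doing it per LLM data format, syncing statuses back to each agent, and exposing the whole picture to customer service — which is exactly where legacy order management systems fall short.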
The build-vs-partner dilemma: Do you have the expertise to build for the unknown of agentic commerce?
Building this four-pillar infrastructure in-house requires deep, specialized expertise in AI/ML, payment orchestration, catalog GEO, inventory and price sync with LLMs, order orchestration and rapidly evolving LLM protocols. For most retailers, the question isn't just about budget, but about whether your engineering team should be building specialized commerce infrastructure or focusing on your core customer experience.
While you spend months or years attempting to master these protocols internally, your competitors are already capturing agent-driven traffic through established partners.
In a market moving this fast, the choice is clear: Do you want to spend your resources building a software company from scratch, or do you want to be a leader in agentic commerce?
Mirakl Nexus: Purpose-built for the agentic era
We designed Mirakl Nexus as a no-code agentic commerce solution that addresses all four pillars. With Mirakl Nexus, retailers, brands and wholesalers get:
Real-time connectivity and visibility: Through real-time connectors, LLMs get instant access to live pricing and inventory across your entire catalog, including marketplace and dropship products. This transparency also extends to post-purchase experiences, providing customer service teams with the complete order data to resolve queries instantly.
Catalog GEO: Transform messy seller data into agent-ready, structured content, ensuring your products are discoverable regardless of the protocol the agent uses.
Seamless orchestration: Through our partnership with J.P. Morgan Payments, we provide enterprise-grade checkout and risk management designed specifically for agentic complexity.
Actionable performance insights: Track conversion, engagement and product performance per AI channel. Nexus allows you to identify which LLMs and product categories generate real ROI, helping retailers and brands optimize distribution and catalog selection based on actual performance data.
Even better, retailers, brands and wholesalers don't need to rip and replace their existing systems. Nexus orchestrates across your current commerce stack, extending your infrastructure without a total technology overhaul.
The window is closing — are you ready?
First-mover advantage in agentic commerce isn't about the headlines; it's about being ready when agents start making autonomous decisions.
As these agents develop preference algorithms, they will favor retailers who deliver clean data, accurate availability and seamless fulfillment. If you are absent or poorly represented, you aren't just losing a sale today — you're being programmed out of your customers’ future.
Frequently asked questions: Navigating agentic commerce
If you’re looking for the highlights on how to prepare your business for the era of AI-driven shopping, here are the essential answers.
What is "agentic commerce" and why is it a priority now?
Agentic commerce refers to the shift from traditional search bars to Large Language Models (LLMs) like Perplexity and Gemini acting as "autonomous shopping assistants." As these platforms move toward "instant checkout," they are becoming a meaningful transaction channel. Brands that fail to bridge the "infrastructure gap" risk being "programmed out" of their customers' future as AI agents develop their own preference algorithms.
Why can’t I just use my existing website and product listings for AI agents?
AI agents do not browse websites like humans; they gather data through crawling, structured feeds, and specific protocols like the Model Context Protocol (MCP). If your product data is messy, incomplete, or slow to update, your products become "invisible" to these engines. Agents require highly detailed, clean data — such as specific attributes for "sustainable running shoes" — to prefer your products over a competitor's.
What are the primary technical hurdles to selling via LLMs?
There are four critical pillars you must address:
Catalog discoverability: Providing clean, fast updates so agents see accurate pricing and inventory.
Multi-LLM connectivity: Navigating a "tangle" of shifting technical standards like the Universal Commerce Protocol (UCP) and Agentic Commerce Protocol (ACP) across different AI models.
Agentic checkout: Enabling secure, autonomous purchases that don't rely on traditional human behavioral signals like mouse movements for fraud prevention.
Legacy integration: Ensuring order data flows seamlessly into your existing Warehouse Management System (WMS) and support platforms.
How does checkout change when the "shopper" is an LLM or agent?
Traditional fraud prevention relies on human signals like typing patterns and session duration, which disappear when an AI agent clicks "buy." Agentic checkout requires a fundamental rethink of risk management, utilizing alternative verification methods to validate transactions in milliseconds — often while the user isn't even looking at their screen.
How does Mirakl Nexus simplify this transition?
Mirakl Nexus acts as an agentic commerce operating system, offering a no-code solution to solve the four pillars of infrastructure. It provides:
Real-time connectivity to live pricing and inventory.
Catalog GEO to turn messy data into agent-ready content.
Enterprise-grade checkout and risk management through a partnership with J.P. Morgan Payments.
Orchestration that works with your current commerce stack, meaning you don't have to "rip and replace" existing systems.