AI-referred visitors show stronger intent signals, engaging more deeply within sessions, navigating fewer pages, and exhibiting fewer early abandonment behaviors than users from traditional channels.
Last November, we found that AI‑referred traffic converts at three times the rate of traffic from other channels.
At the same time, the amount of traffic AI platforms are driving has grown steadily. According to Clarity data, AI-referred traffic has increased by 22% over the past six months.
Taken together, these findings raise some important questions. If AI-referred sessions are converting at a higher rate, what does that say about the intent behind those visits? Do they exhibit behavior consistent with stronger intent than users from traditional acquisition channels?
To answer these questions, we examined session-level behavioral data, comparing how AI-referred visitors engage once they arrive with how visitors from traditional acquisition channels do.
What We Measured
We analyzed behavioral data across over 30 billion sessions from the 2,000 Microsoft Clarity projects that received the most traffic from large language models, covering October 2025 through March 2026. The goal was to compare how users behave after arriving from AI-driven sources versus traditional acquisition channels.
For each channel, we evaluated a consistent set of Clarity’s behavioral signals, including quick backs, session duration, pages per session, scroll distance, and rage clicks.
Together, these metrics provide a multi-dimensional view of intent, capturing not just whether users arrive, but how they engage, how far they explore, and where friction or disengagement begins to appear.
A full breakdown of all channel-level metrics is included in the appendix.
These same behavioral signals are available in Microsoft Clarity. Using AI channel grouping, you can isolate AI-referred sessions and compare them directly against other acquisition channels, making it possible to see how these patterns show up on your own site.
Key Findings
The Cleanest Intent Signal: Fewer Quick Backs from AI Traffic
One of the clearest behavioral differences in this dataset appears in quick back activity. Quick backs often signal a mismatch between expectation and content that prompts users to re-evaluate their choice.
We found that AI-referred sessions show lower quick back rates than Organic Search, indicating that AI-originated visitors are less likely to abandon a page immediately after arrival:
- AI Platform quick backs: 1.60 per session
- Organic Search quick backs: 2.04 per session
At first glance, the difference may appear marginal. But at scale, a channel generating 21.6% fewer quick backs than search is a meaningful signal of user intent and pre-qualification: AI-originated sessions are associated with fewer immediate abandonments after arrival.
This pattern is consistent with how AI platforms mediate the discovery process. Unlike traditional search, where users scan and compare multiple results on their own, AI tools typically surface and contextualize information before a click. By the time a user reaches a site from an LLM, they’ve often already seen a synthesized explanation and chosen a source that appears relevant to their specific question.
As a result, the visit is less exploratory and more directed. Organic search traffic, by contrast, includes a wider mix of early-stage and ambiguous queries, which naturally leads to more rapid backtracking as users evaluate whether a result matches their intent.
The implication is that AI traffic arrives with a higher degree of pre-alignment between user expectations and page content. Lower quick back rates don’t just indicate reduced friction; they suggest that AI systems are effectively filtering and framing information before the user ever lands on a site, resulting in sessions that begin with fewer immediate reversals.
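As a sanity check, the 21.6% figure can be reproduced from the per-session averages reported above (values taken from the appendix table; the helper function is purely illustrative):

```python
def relative_reduction(baseline: float, value: float) -> float:
    """Percent reduction of `value` relative to `baseline`."""
    return (baseline - value) / baseline * 100

# Avg quick backs per session, from the appendix table.
organic_search = 2.04
ai_platform = 1.60

reduction = relative_reduction(organic_search, ai_platform)
print(f"AI Platform generates {reduction:.1f}% fewer quick backs")  # → 21.6%
```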
Concentrated Engagement: More Time, Fewer Pages
AI-referred sessions follow a consistent pattern: users spend more time within a session, but they do so across fewer pages. Compared to Organic Search, AI Platform sessions show higher average session duration (approximately 943 seconds versus 849 seconds) while visiting fewer pages per session (about 10.6 versus 16.8). This combination is consistent with more focused interaction within individual sessions, rather than broad exploration across many pages.
Scroll distance helps refine this picture. While AI traffic does not lead across all channels, it ranks near the top, second only to Organic Search, indicating that users are still engaging meaningfully with the content they land on. Visitors from AI platforms tend to spend time within pages, reading and interacting with content, rather than moving across multiple pages to gather information. This pattern is somewhat different from what is typically observed in channels like Organic Search, where time on page, scroll depth, and page depth often move together.
One additional factor to consider is that higher conversion rates in AI traffic may also contribute to longer session durations. Conversions often involve time-intensive actions such as completing forms, reviewing pricing, or moving through checkout flows. As a result, session duration for AI traffic likely reflects a combination of sustained content engagement and progression through conversion steps, rather than reading behavior alone.
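One way to make the "more time, fewer pages" pattern concrete is to derive average time per page from the appendix figures. This is a rough back-of-envelope ratio, not a metric Clarity reports directly:

```python
# Avg session duration (s) and avg pages per session, from the appendix table.
channels = {
    "AI Platform": (943.41, 10.61),
    "Organic Search": (849.06, 16.84),
}

for name, (duration_s, pages) in channels.items():
    print(f"{name}: ~{duration_s / pages:.0f} seconds per page")
```

By this rough measure, AI Platform sessions spend about 89 seconds per page versus roughly 50 seconds for Organic Search, consistent with concentrated rather than exploratory engagement.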
AI Traffic Shows Higher Rage Clicks, But It’s Not the Highest Overall
Rage clicks offer a useful signal of friction within a session. They don’t always indicate failure, but they often reflect moments where a user is trying to make something happen and isn’t getting the expected response.
In this dataset, AI Platform sessions show higher rage click rates than Organic Search (approximately 0.009 vs. 0.006). This could indicate that AI-referred visitors encounter more friction during their sessions. However, this pattern is not unique to AI traffic. Paid Search and Direct traffic also show similarly elevated rage click levels, with Paid Search slightly higher overall.
This places AI within a broader category of channels that tend to attract more intent-driven behavior. These users are often closer to taking action, which can change how they interact with a page when something doesn’t respond as expected. In these sessions, friction is more likely to trigger repeated attempts rather than disengagement.
Teams looking to optimize for AI traffic should audit rage clicks within high-intent paths like forms, pricing pages, and CTAs. Prioritize fixing friction in these areas, where AI-referred users are most likely trying to complete an action.
The Anatomy of an AI Session
The aggregate data shows consistent patterns, but those patterns become more concrete when you look at individual sessions for AI-referred traffic.
A typical AI-referred session begins with a user landing on a specific page, often a product or content page, rather than a homepage. This aligns with the context of AI tools, where users are frequently directed to a relevant destination rather than starting a broad exploratory journey.
From there, engagement tends to be focused and sequential. Users scroll through the page in a steady, continuous manner, spending meaningful time within a single session rather than rapidly bouncing between pages. Navigation, when it occurs, is purposeful: moving from a product page to pricing or supporting details, for example, and then returning before taking action.
These sessions also tend to show fewer signs of friction. Across the dataset, AI-referred sessions exhibit fewer quick backs as users progress through content with intent.
When viewed alongside sessions from other channels, the contrast becomes clearer. Channels like Social tend to drive shorter, more exploratory sessions, where users scan quickly and often exit without deeper engagement.
Taken together, these patterns are consistent with sessions where users arrive with more contextual awareness of what they are looking for. Rather than browsing broadly, these users appear to use the session to validate information and make a decision.
What This Means for Marketers & Website Owners
Across channels, most acquisition strategies are built around a familiar assumption: users arrive to discover, compare, and gradually narrow their intent through navigation. The behavior of AI-referred traffic suggests something different. These visitors tend to exhibit behavior consistent with being closer to a decision point, with clearer expectations and a narrower set of questions to resolve.
For marketers and site owners, this shifts how AI traffic should be interpreted and optimized. It’s less about attracting exploratory clicks and more about meeting users who are already partway through their journey. Here are some actionable steps you can take to optimize for this growing channel.
1. Prioritize Specificity Over Broad Positioning
AI-referred visitors are more likely to engage with content that directly answers their intent.
Key takeaway: Pages that clearly articulate value, differentiation, and next steps are better aligned with this behavior than those that rely on users to piece together meaning through exploration.
2. Treat Each Page as a Self-Contained Decision Environment
Because AI visitors tend to visit fewer pages per session, the landing page often carries more of the burden of informing and converting. Supporting details, proof points, and context should be accessible within the page itself rather than spread across multiple navigational layers.
Key takeaway: Ensure each landing page can stand on its own by including key information, proof points, and next steps without requiring users to navigate elsewhere.
3. Rethink Engagement Signals
Fewer pages per session do not necessarily indicate lower interest. For AI traffic, meaningful engagement is more likely to show up as time spent, scroll behavior, and progression within a single page rather than movement across the site.
Key takeaway: Evaluate engagement based on depth within a page (time, scroll, interaction), not just pages per session.
What Comes Next
We’ve established that AI-referred visitors convert at higher rates, and that their on-site behavior helps explain part of that difference. What remains less clear is how these patterns should translate into intentional design and optimization decisions.
The data indicate that AI-referred sessions are more focused and occur later in the user journey, which has implications for how content is structured and presented. But while these patterns point in a direction, they don't yet define a single "correct" approach.
As AI-driven traffic continues to grow, the next step is understanding how different page experiences perform within this context: what actually improves engagement, reduces friction, and drives conversion for these sessions specifically.
Rather than treating this as a fixed conclusion, it’s an area we’ll continue to explore as more data becomes available.
If you want to understand how AI traffic is behaving and converting on your own site, Clarity’s AI channel group lets you isolate and compare AI-referred visitors against other acquisition sources.
Methodology
Session data was sourced from Microsoft Clarity across websites with measurable LLM referral traffic. All figures represent aggregated, anonymized behavioral data. Individual site data is never shared or identifiable.
The table below provides the complete set of session-level metrics across all channels included in this analysis.
While the main findings focus on the most meaningful behavioral differences, this dataset is included for transparency and for readers who want to explore the full distribution of engagement, depth, and friction signals across channels.
Behavioral Metrics by Channel
| Channel | Percent of Sessions | Avg Quick Backs | Avg Session Duration (s) | Avg Pages per Session | Avg Scroll Distance (px) | Avg Rage Clicks |
| --- | --- | --- | --- | --- | --- | --- |
| Direct | 24.80% | 3.43 | 1,641.06 | 29.93 | 2,501.63 | 0.010 |
| Organic Search | 23.25% | 2.04 | 849.06 | 16.84 | 2,769.24 | 0.006 |
| Referral | 8.99% | 1.74 | 772.05 | 12.41 | 2,040.40 | 0.005 |
| Paid Search | 6.99% | 1.54 | 520.36 | 9.43 | 2,329.26 | 0.011 |
| | 1.14% | 2.18 | 572.52 | 25.06 | 1,595.54 | 0.002 |
| Display | <1% | 1.61 | 879.64 | 7.46 | 1,421.41 | 0.010 |
| Social | <1% | 1.33 | 242.25 | 3.34 | 2,154.46 | 0.004 |
| Affiliates | <1% | 1.41 | 472.03 | 6.88 | 1,240.32 | 0.004 |
| AI Platform | <1% | 1.60 | 943.41 | 10.61 | 2,554.05 | 0.009 |
| Other Advertising | <1% | 1.31 | 289.87 | 3.91 | 1,677.36 | 0.006 |
| Unattributed Traffic | 33.34% | 1.71 | 1,244.58 | 15.20 | 1,714.52 | 0.005 |
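To explore the full table rather than one comparison at a time, the figures can be loaded into code and normalized against Organic Search as a baseline. The values below are transcribed from the table above; the structure and channel selection are illustrative:

```python
# Per-channel averages transcribed from the table above:
# (quick_backs, duration_s, pages, scroll_px, rage_clicks)
metrics = {
    "Organic Search": (2.04, 849.06, 16.84, 2769.24, 0.006),
    "AI Platform":    (1.60, 943.41, 10.61, 2554.05, 0.009),
    "Paid Search":    (1.54, 520.36,  9.43, 2329.26, 0.011),
    "Social":         (1.33, 242.25,  3.34, 2154.46, 0.004),
}

names = ["quick backs", "duration", "pages", "scroll", "rage clicks"]
baseline = metrics["Organic Search"]

for channel, values in metrics.items():
    if channel == "Organic Search":
        continue
    deltas = ", ".join(
        f"{name}: {100 * (v - b) / b:+.0f}%"
        for name, v, b in zip(names, values, baseline)
    )
    print(f"{channel} vs Organic Search -> {deltas}")
```

For AI Platform, this reproduces the headline comparisons in one pass: roughly 22% fewer quick backs, 11% longer sessions, and 37% fewer pages per session relative to Organic Search.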
Channel Definitions
| Channel | Definition |
| --- | --- |
| Direct | Traffic arriving with no clear referral source, typically from typing a URL or using bookmarks. |
| Organic Search | Unpaid traffic from search engines. |
| Referral | Traffic from websites (non-search, non-social) that link to your site. |
| Paid Search | Traffic from search ads (Microsoft Ads, Google Ads). |
| AI Platform | Unpaid referrals from LLM tools (ChatGPT, Perplexity, Copilot, Gemini, etc.). |
| Social | Traffic from social media platforms. |
| Unattributed Traffic | Sessions that could not be confidently attributed to the above categories. |
Metric Definitions
| Metric | Definition |
| --- | --- |
| Quick backs | Instances where users navigate to a page and then quickly return to the previous one. |
| Session duration | The average time spent in a session. |
| Pages per session | Number of pages viewed before exit. |
| Scroll distance | How far users progressed through individual pages, measured in pixels. |
| Rage clicks | Instances where users rapidly clicked or tapped in the same small area. |
