For years, the SEO industry has been asking a question nobody could answer: how do we know if our content appears in AI-generated responses?

Google's AI Overviews remain a black box. ChatGPT offers no publisher analytics. Perplexity provides citations but no backend data. The entire Generative Engine Optimization movement has been built on educated guesses and third-party scraping tools that approximate visibility without confirming it.

That changed on February 10, 2026, when Microsoft released AI Performance in Bing Webmaster Tools as a public preview. For the first time, a major search platform is giving website owners direct access to citation data from AI-generated answers across Microsoft Copilot, Bing's AI summaries, and partner integrations.

The reaction from the SEO community was immediate. Within hours, practitioners were pulling data, sharing screenshots, and comparing notes on what the numbers revealed. The consensus: this is the beginning of something significant, even if the data has limitations.

This is not just a new feature. It's the first official acknowledgment that AI visibility is now a measurable metric, and publishers deserve access to it.


What AI Performance Actually Tracks

The AI Performance dashboard lives inside Bing Webmaster Tools at bing.com/webmasters/aiperformance. Anyone with a verified site can access it immediately. No special permissions or beta applications required.

The report provides five distinct metrics, each offering a different window into how AI systems interact with your content.

  • Total Citations shows the aggregate number of times your site appears as a source in AI-generated answers during a selected timeframe. Every time Copilot, a Bing AI summary, or a partner integration references your content while composing a response, that counts as one citation. This number gives you a straightforward sense of how frequently AI systems rely on your pages as sources.
  • Average Cited Pages reveals the daily average number of unique URLs from your domain that get referenced. A high number here means AI systems are pulling from a broad range of your content, not just one or two popular pages. This breadth can signal strong topical authority across multiple subjects.
  • Grounding Queries might be the most strategically valuable metric in the entire dashboard. These are sample phrases the AI system used when retrieving the content it referenced in its answers. They show what Copilot searched for internally when it needed a source to ground its response.
  • Citations by URL breaks down how many times specific pages on your site were cited. This lets you identify which content resonates most with AI systems and which pages might need optimization to earn more citations.
  • Citation Trends shows how your citation activity changes over time. You can spot patterns, identify when visibility spikes or drops, and correlate changes with content updates or algorithm shifts.
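Before digging into strategy, it can help to picture the shape of this data. Below is a minimal, purely illustrative sketch of how a single day of the report might be modeled in code; the field names and values are assumptions for illustration, not Microsoft's actual export schema.

```python
from dataclasses import dataclass, field

# Illustrative only: field names are assumptions, not Microsoft's export schema.
@dataclass
class DailyAIPerformance:
    date: str                      # e.g. "2026-02-10"
    total_citations: int           # times the site appeared as a source that day
    cited_pages: int               # unique URLs from the domain that were referenced
    grounding_queries: list[str] = field(default_factory=list)      # sample retrieval phrases
    citations_by_url: dict[str, int] = field(default_factory=dict)  # URL -> citation count

# Example usage with made-up numbers:
day = DailyAIPerformance(
    date="2026-02-10",
    total_citations=412,
    cited_pages=37,
    grounding_queries=["email marketing platform small business pricing"],
    citations_by_url={"https://example.com/email-tools-comparison": 58},
)
```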

Together, these metrics provide something unprecedented: first-party data on AI visibility from a major platform. Not estimates. Not approximations. Actual citation counts from Microsoft's own systems.


Why This Matters for Content Strategy

The AI Performance report signals a fundamental shift in how search performance gets measured. For over two decades, the core metrics have been impressions, rankings, and clicks. AI-driven answers change that dynamic entirely.

When users get information directly from AI-generated responses, there may be no click-through event at all. Traditional analytics miss this interaction completely. A site could be informing thousands of AI answers daily while seeing no corresponding traffic increase.

This creates both a measurement problem and a strategic problem. If you cannot measure AI visibility, you cannot optimize for it. And if you cannot optimize for it, you lose influence over how your brand and content appear in the fastest-growing discovery channel.

Microsoft framed the launch explicitly as an early step toward Generative Engine Optimization tooling. The term GEO has gained traction in the industry to describe strategies for improving visibility in AI-generated content. Unlike traditional SEO, which focuses on rankings and clicks, GEO focuses on being cited and mentioned when AI systems generate answers.

The AI Performance report gives GEO practitioners their first official dataset. Now content strategists can see which pages earn citations, identify patterns in grounding queries, and track how visibility changes over time. This moves GEO from theory to practice.


The Grounding Queries Goldmine

Of all the metrics in the AI Performance report, grounding queries may offer the most actionable insights.

Remember, these are not user queries. They are the search queries that Copilot's retrieval system generates internally when it needs to find sources to ground its answers. They reveal how AI systems translate conversational questions into searchable terms.

When someone asks Copilot a complex question like "What is the best email marketing platform for a small e-commerce business with fewer than 10,000 subscribers?" the AI does not paste that entire prompt into a search engine. It breaks the question into smaller sub-queries and searches for each one separately.

The grounding queries show what those sub-queries look like. They tend to be keyword-dense, specific, and often longer than typical human searches. Understanding their structure helps content creators optimize for the actual queries that drive AI citations.
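As a purely illustrative example (these strings are invented, not actual output from Copilot's retrieval system), the decomposition might look something like this:

```python
# Hypothetical illustration of how a conversational prompt can decompose
# into keyword-dense grounding queries. These strings are invented examples.
user_prompt = (
    "What is the best email marketing platform for a small e-commerce "
    "business with fewer than 10,000 subscribers?"
)

grounding_queries = [
    "best email marketing platform small ecommerce business",
    "email marketing pricing under 10000 subscribers",
    "email marketing platform comparison small business features",
]
```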

Publishers analyzing their grounding query data have found patterns worth noting. Vertical-specific queries drove significant citations for specialized content. Feature-focused queries revealed what aspects of products or services users care about most. Comparison queries showed where AI systems look for head-to-head evaluations.

This data can inform content strategy directly. If your grounding queries show AI searching for "[your industry] + compliance + requirements," you know that content addressing compliance in your industry has citation potential. If queries frequently include "[product category] + pricing + comparison," creating detailed comparison content becomes a clear priority.
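A lightweight way to surface these patterns is to export your grounding queries and count recurring terms. The sketch below assumes a CSV export named grounding_queries.csv with a query column; both are assumptions, so adjust them to match the actual file you download from the dashboard.

```python
import csv
from collections import Counter

# Count recurring terms in exported grounding queries to spot content themes.
# "grounding_queries.csv" and the "query" column are assumptions about the export.
STOPWORDS = {"the", "a", "an", "for", "and", "of", "in", "to", "with", "best"}

term_counts = Counter()
with open("grounding_queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for term in row["query"].lower().split():
            if term not in STOPWORDS:
                term_counts[term] += 1

# Terms like "pricing", "comparison", or "compliance" appearing near the top
# point at citation opportunities worth covering in dedicated content.
for term, count in term_counts.most_common(20):
    print(f"{term}: {count}")
```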


What the Data Does Not Include

The AI Performance report has notable limitations that Microsoft has acknowledged.

  • The most significant gap: no click data. The dashboard shows how often your content is cited but not whether users who see those citations click through to your site. Without click-through rates, you cannot calculate return on investment or determine whether AI visibility translates into meaningful traffic.
  • The metrics also do not indicate ranking or prominence within AI-generated answers. A citation could appear as the primary source at the top of a response, or it might be listed as a supplementary reference at the bottom. The report treats all citations equally.
  • There is no API access yet. Fabrice Canel from Microsoft confirmed on social media that API support is on their backlog but provided no timeline. For now, users can export data as CSV files from the dashboard, but automated reporting and integration with other tools will have to wait; a workaround sketch follows this list.
  • The data covers only Microsoft's ecosystem: Copilot, Bing AI summaries, and select partner integrations. It tells you nothing about visibility in ChatGPT, Perplexity, Google AI Overviews, or other generative platforms. For a complete picture of AI visibility, publishers will need to combine this data with third-party monitoring tools.
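Until an API arrives, one practical workaround is to download the CSV export on a regular schedule and stitch the snapshots together for longer-horizon trend tracking. The sketch below assumes files named like ai_performance_2026-02-10.csv collected in an exports folder; the naming scheme and columns are assumptions to adapt to the real export.

```python
from pathlib import Path
import csv

# Combine periodically downloaded dashboard exports into one trend file.
# File naming and columns are assumptions; match them to the actual export.
EXPORT_DIR = Path("exports")
combined = []

for path in sorted(EXPORT_DIR.glob("ai_performance_*.csv")):
    snapshot_date = path.stem.replace("ai_performance_", "")
    with path.open(newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            row["snapshot_date"] = snapshot_date
            combined.append(row)

if combined:
    with open("ai_performance_combined.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=combined[0].keys())
        writer.writeheader()
        writer.writerows(combined)
```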

Microsoft has signaled that improvements are coming. Canel noted that this is "just a preview" and promised "more in 2026." The addition of click-through data would be the single most impactful improvement, finally connecting AI visibility to actual site traffic.


How This Fits Into the Broader GEO Movement

The AI Performance report arrives at a pivotal moment for the search industry.

Traffic data from mid-2025 showed AI referrals to top websites increased 357% year-over-year, reaching over 1.1 billion visits. Research from Gartner predicted that 25% of traditional search volume would shift to AI chatbots by the end of 2026. Semrush data suggests LLM traffic could overtake traditional Google search by 2027.

These projections reflect a fundamental change in how people discover information. Instead of scanning search results and clicking links, users increasingly ask AI assistants questions and receive synthesized answers directly. The blue link is not disappearing, but it is being supplemented and sometimes replaced by conversational discovery.

This shift has created urgent demand for GEO strategies and measurement tools. The AI Performance report is the first official response from a major search platform.

The report also validates what GEO practitioners have been arguing: AI visibility requires different metrics than traditional search. Impressions and rankings matter less when users never see a results page. Citations and source attribution become the new currency.

Some industry observers see this as Microsoft gaining competitive advantage through transparency. While Google's AI Overviews remain opaque to publishers, Microsoft is offering data that helps content creators understand and optimize for AI discovery. This transparency could attract publishers who want insight into how their content performs in the AI era.


Practical Steps for Using the Data

For publishers ready to act on AI Performance data, several approaches have emerged from early adopters.

  1. Verify your site in Bing Webmaster Tools if you have not already. The AI Performance report is only available to verified site owners. Verification takes minutes and unlocks access immediately.
  2. Export your grounding queries and analyze them for patterns. Look for topic clusters, feature interests, and comparison opportunities. These queries reveal what AI systems search for when they need your type of content. Optimizing for those specific queries can increase citation likelihood.
  3. Identify your most-cited pages and understand why they earn citations. What makes them valuable to AI systems? Is it the structure, the data, the freshness, or the specificity? Apply those lessons to underperforming content.
  4. Track citation trends over time and correlate them with content changes. When you update a page, does citation activity increase? When you publish new content, how long until it starts appearing in citations? These patterns help you understand the feedback loop between content actions and AI visibility.
  5. Compare your AI Performance data with traditional search metrics. Pages that rank well in organic search may or may not earn AI citations. Understanding the relationship between these metrics helps you prioritize efforts across both channels. A minimal sketch of this comparison follows the list.
  6. Combine Bing data with third-party AI visibility tools. Services like Otterly.ai, Ahrefs Brand Radar, and others track mentions across ChatGPT, Perplexity, and Google AI Overviews. Together with Bing's first-party data, these tools provide a more complete picture of AI visibility across platforms.
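For step 5, a simple way to start is to line up citations and organic clicks per URL. The sketch below is a rough starting point; the filenames and column names (url, citations, clicks) are assumptions about the two exports rather than their actual schemas.

```python
import csv

# Line up AI citations with traditional search clicks per URL.
# Filenames and column names are assumptions; adjust to the real exports.
def load_metric(path, key_col, value_col):
    data = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            data[row[key_col]] = int(row[value_col])
    return data

citations = load_metric("citations_by_url.csv", "url", "citations")
clicks = load_metric("search_performance.csv", "url", "clicks")

# Pages that earn AI citations but little organic traffic (or the reverse)
# are the interesting cases for prioritizing optimization work.
for url in sorted(set(citations) | set(clicks)):
    print(f"{url}\tcitations={citations.get(url, 0)}\tclicks={clicks.get(url, 0)}")
```
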

What This Means for the Future of Search Analytics

The AI Performance report represents more than a new feature. It signals how search measurement is evolving.

For twenty years, search analytics focused on rankings, impressions, clicks, and conversions. These metrics assumed users would see search results, evaluate options, and click through to websites. The entire measurement framework was built around the click as the unit of value.

AI-generated answers break this model. When information is delivered directly in a conversational interface, the click may never happen. Users get what they need without visiting the source. Traditional analytics register nothing, even though the content influenced the outcome.

Citation metrics offer an alternative. They measure influence rather than traffic. A page that gets cited thousands of times in AI answers has visibility and authority, even if those citations do not translate directly into clicks. This upstream visibility has value, particularly for brand awareness and thought leadership.

The challenge is connecting citation visibility to downstream business outcomes. Currently, the AI Performance report shows citation counts but not their impact. Future iterations may bridge this gap by adding click data, conversion tracking, or integration with other analytics platforms.

Microsoft's willingness to provide this data sets an expectation for the industry. Publishers will increasingly demand similar transparency from Google, OpenAI, and other platforms. The argument is straightforward: if AI systems use our content to generate answers, we deserve to know when and how.

Whether other platforms follow Microsoft's lead remains to be seen. But the precedent is now established: AI visibility data can be provided to publishers. The open question is whether competitors choose to do so.


Wrap Up

Microsoft's AI Performance report is not a complete solution to the AI visibility problem. The data has limitations. Click metrics are missing. API access is not available. Coverage is limited to Microsoft's ecosystem.

But it is a genuine first. For the first time, publishers can see official data on how often their content gets cited in AI-generated answers. They can analyze grounding queries to understand what AI systems search for. They can track citation trends and identify their most valuable content for AI discovery.

This data enables action. Content strategies can be informed by actual citation patterns rather than guesses. Optimization efforts can be measured against real outcomes. The feedback loop between content creation and AI visibility finally closes.

For SEO professionals and content creators, the message is clear: AI visibility is now a measurable metric. The tools to track it are emerging. The publishers who learn to use this data effectively will have advantages as AI-driven discovery continues to grow.


FAQ

What is AI Performance in Bing Webmaster Tools?

AI Performance is a new dashboard in Bing Webmaster Tools that shows how often your website content is cited in AI-generated answers across Microsoft Copilot, Bing AI summaries, and select partner integrations. Released on February 10, 2026, as a public preview, it provides the first official metrics for tracking AI visibility from a major search platform.

How do I access the AI Performance report?

You need a verified site in Bing Webmaster Tools. Once verified, navigate to bing.com/webmasters/aiperformance to access the dashboard. No special permissions or beta applications are required. Verification is free and typically takes only a few minutes to complete.

What are grounding queries in the AI Performance report?

Grounding queries are the search phrases that Copilot's retrieval system generates internally when it needs to find sources to answer user questions. These are not the actual queries users type. They show how AI systems decompose conversational questions into searchable terms when looking for content to cite.

Does the AI Performance report show click-through rates?

No. The current version only shows citation counts, not whether users click through to your site after seeing the citation. Microsoft has acknowledged this limitation and indicated that improvements are planned for future releases, though no timeline has been provided.

How is AI visibility different from traditional SEO metrics?

Traditional SEO metrics focus on rankings, impressions, and clicks. AI visibility metrics focus on citations and mentions in AI-generated answers. When users get information directly from AI responses without clicking to a website, traditional analytics miss the interaction entirely. Citation metrics capture this upstream visibility.

What is Generative Engine Optimization and how does it relate to this report?

Generative Engine Optimization, or GEO, refers to strategies for improving visibility in AI-generated content. Unlike traditional SEO, which focuses on rankings and clicks, GEO focuses on being cited and referenced when AI systems generate answers. The AI Performance report provides the first official data for measuring GEO success on Microsoft's platform.

Can I see AI Performance data for Google AI Overviews or ChatGPT?

No. The Bing AI Performance report only covers Microsoft's ecosystem including Copilot and Bing AI summaries. For visibility across other platforms like Google AI Overviews, ChatGPT, or Perplexity, you would need to use third-party monitoring tools such as Otterly.ai, Ahrefs Brand Radar, or similar services.

How often is AI Performance data updated?

The dashboard provides data within a selectable date range, typically showing information from the past 90 days. The exact update frequency has not been specified by Microsoft, but early users report that data appears to refresh regularly, allowing tracking of recent citation activity.

What content gets cited most often in AI answers?

Early data from publishers shows that citation distribution is often uneven, with a small number of pages earning the majority of citations. Content that tends to perform well includes pages with clear structure, specific data points, expert insights, and comprehensive topical coverage. Analyzing your own grounding queries reveals what specific content types AI systems seek from your site.

Is API access available for AI Performance data?

Not yet. Fabrice Canel from Microsoft confirmed that API support is on their backlog but did not provide a timeline. Currently, users can export data as CSV files from the dashboard. Automated reporting and integration with other tools will require waiting for API availability.

