The Chrome User Experience Report (CrUX) provides real-world measurements of key metrics that determine the user experience of any website in Google's index. These metrics are derived directly from data collected from Chrome users, giving us a picture of the experience real users have in the field.
This data can be accessed via the following methods:
- PageSpeed Insights, which provides URL-level user experience metrics for popular URLs that are known by Google’s web crawlers.
- A public Google BigQuery project, which aggregates user experience metrics by origin for all origins known to Google's web crawlers, split across multiple dimensions outlined below.
- CrUX Dashboard on Data Studio, which can be set up to track an origin’s user experience trends.
- CrUX API, which provides metrics at both the origin and URL level.
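Of these, the CrUX API is the most direct programmatic route. The API accepts a POST to its `queryRecord` endpoint with a JSON body naming an origin or URL. A minimal sketch in Python, assuming you have an API key from the Google Cloud console (the key and origin below are placeholders):

```python
import json
import urllib.request

# Endpoint of the CrUX API's queryRecord method. An API key created in
# the Google Cloud console is required; this value is a placeholder.
API_KEY = "YOUR_API_KEY"
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

def build_query(origin=None, url=None, form_factor=None, metrics=None):
    """Build the JSON body for a queryRecord request.

    Supply exactly one of `origin` or `url`, matching the API's
    origin-level vs URL-level lookups.
    """
    body = {}
    if origin:
        body["origin"] = origin
    if url:
        body["url"] = url
    if form_factor:
        body["formFactor"] = form_factor  # e.g. "PHONE" or "DESKTOP"
    if metrics:
        body["metrics"] = metrics  # e.g. ["largest_contentful_paint"]
    return body

body = build_query(origin="https://web.dev", form_factor="PHONE",
                   metrics=["largest_contentful_paint"])
request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Sending the request requires a valid key and network access:
# with urllib.request.urlopen(request) as response:
#     record = json.load(response)
```

Omitting `formFactor` returns the aggregate record across all device types, mirroring the dashboard's behaviour when no device filter is applied.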
In this article, we will discuss how to generate the Chrome UX report using Data Studio.
- In a new tab, open the URL: g.co/chromeuxdash
- On the resulting page, enter your website URL:
Be careful to use one of the following URL formats:
If your origin (website) is not included in the CrUX dataset, you may get the following message:
If the origin exists, the schema page will open, listing all the fields in the dataset that will be fetched to create the report.
Click on Create Report to generate the CrUX Dashboard. The dashboard is organized into three sections:
- Core Web Vitals overview
- Metric performance
- User demographics
Core Web Vitals Overview
The first tab gives you an overview of all the major Core Web Vitals metrics. You can check it for both Mobile and Desktop versions.
The Metric performance section consists of a series of tabs for the LCP, FID, CLS, FCP, INP and TTFB metrics, each presenting data for one individual metric.
Each tab includes a Device filter.
The primary visualizations on these pages are the monthly distributions of experiences categorized as “Good”, “Needs Improvement”, and “Poor”. The colour-coded legend below the chart indicates the range of experiences included in the category. For example, in the screenshot above, you can see the percentage of “good” Largest Contentful Paint (LCP) experiences fluctuating and getting slightly worse in recent months.
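To illustrate how such a distribution is derived, the sketch below buckets a hypothetical sample of LCP values using the standard Core Web Vitals thresholds for LCP (good is 2.5 s or less, poor is over 4 s); the sample data is invented:

```python
# Hypothetical LCP samples in seconds; CrUX aggregates real user data.
lcp_samples = [1.2, 1.8, 2.1, 2.4, 2.6, 3.0, 3.9, 4.2, 5.1, 6.0]

def bucket(value):
    """Categorize an LCP value using the Core Web Vitals thresholds."""
    if value <= 2.5:
        return "Good"
    elif value <= 4.0:
        return "Needs Improvement"
    return "Poor"

counts = {"Good": 0, "Needs Improvement": 0, "Poor": 0}
for v in lcp_samples:
    counts[bucket(v)] += 1

# Convert counts to the percentages shown in the dashboard chart.
percentages = {k: 100 * c / len(lcp_samples) for k, c in counts.items()}
print(percentages)  # {'Good': 40.0, 'Needs Improvement': 30.0, 'Poor': 30.0}
```

The dashboard performs the same categorization, just over a month of field data per bar rather than a ten-element list.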
The most recent month’s percentages of “good” and “poor” experiences are shown above the chart along with an indicator of the percent difference from the previous month. For this origin, “good” LCP experiences fell by 3.2% to 56.04% month-over-month.
Additionally, for metrics like LCP and other Core Web Vitals that provide explicit percentile recommendations, you’ll find the “P75” metric between the “good” and “poor” percentages. This value corresponds to the origin’s 75th percentile of user experiences. In other words, 75% of experiences are better than this value. One thing to note is that this applies to the overall distribution across all devices on the origin. Toggling specific devices with the Device filter will not recalculate the percentile.
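To make the P75 concrete, here is a sketch computing the 75th percentile of the same kind of invented LCP sample using only the standard library:

```python
import statistics

# Hypothetical origin-wide LCP samples in seconds (all devices combined,
# matching how the dashboard computes its P75).
lcp_samples = [1.2, 1.8, 2.1, 2.4, 2.6, 3.0, 3.9, 4.2, 5.1, 6.0]

# statistics.quantiles with n=4 returns the three quartile cut points;
# the last one is the 75th percentile (P75).
p75 = statistics.quantiles(sorted(lcp_samples), n=4)[-1]
print(f"P75 LCP: {p75}s")
```

For this invented sample, the P75 falls between the 8th and 9th sorted values (4.2 s and 5.1 s), meaning 75% of the sampled experiences loaded faster than that cut point.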
Lastly, the User Demographics section (labelled as the “Device Distribution” tab). The device distribution page shows you the breakdown of phone, desktop, and tablet users over time. Many origins tend to have little to no tablet data, so you’ll often see “0%” hanging off the edge of the chart.
Just like the device distribution page, the connection distribution page shows you the breakdown of 4G, 3G, 2G, slow 2G, and offline experiences over time.