How to Set Up GA4 BigQuery Export (Step-by-Step)
BigQuery export unlocks raw event-level data. This guide covers setup, costs, and the queries every analyst needs on day one.

One of the most powerful features of GA4 is the free BigQuery export — yet over 70% of properties we audit don't have it turned on. If you're still relying solely on the GA4 UI for analysis, you're leaving your most valuable data on the table. BigQuery export gives you access to the raw, unsampled, event-level data that powers everything from advanced attribution modeling to machine learning predictions.
This step-by-step guide covers everything you need to know about setting up, configuring, and using GA4 BigQuery export in 2025, including realistic cost estimates, essential starter queries, and common pitfalls to avoid.
Why BigQuery Export Matters
The GA4 interface is designed for quick, high-level analysis. But it comes with significant limitations that affect data accuracy and analytical depth:
- Sampling: GA4 applies data sampling to explorations and reports that exceed certain thresholds. BigQuery data is never sampled — you always work with 100% of your events.
- Thresholding: As we covered in our thresholding article, Google Signals triggers data hiding in the UI. BigQuery receives the complete, unthresholded dataset.
- Data limits: GA4's standard reports have limitations on the number of dimensions, metrics, and date ranges you can combine. BigQuery has no such restrictions — you can write any SQL query against the full dataset.
- Data retention: GA4 UI retains detailed event data for at most 14 months. BigQuery stores data indefinitely (as long as you're willing to pay the storage costs, which are minimal).
- Join capability: BigQuery allows you to join GA4 data with CRM data, advertising data, product catalogs, weather data, or any other dataset. This enables advanced analyses that are impossible in the GA4 interface.
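As a sketch of what that join capability looks like in practice, the query below ties GA4 purchase events to a hypothetical CRM table. The table `project.crm.customers` and its `ga_client_id` column are assumptions for illustration — in reality you need your own mapping between GA client IDs and CRM records:

```sql
-- Hypothetical example: enrich GA4 purchases with CRM account tiers.
-- `project.crm.customers` and `ga_client_id` are assumed names; use your own mapping.
SELECT
  c.account_tier,
  COUNT(DISTINCT e.user_pseudo_id) AS purchasers,
  SUM(e.ecommerce.purchase_revenue) AS revenue
FROM `project.dataset.events_*` AS e
JOIN `project.crm.customers` AS c
  ON c.ga_client_id = e.user_pseudo_id
WHERE e.event_name = 'purchase'
  AND e._TABLE_SUFFIX BETWEEN '20250201' AND '20250228'
GROUP BY c.account_tier
ORDER BY revenue DESC;
```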
Prerequisites
Before setting up the export, ensure you have the following:
- A Google Cloud Platform (GCP) project: If you don't have one, create a free project at console.cloud.google.com. You'll need to add a billing account, but the BigQuery free tier covers most small-to-medium sites.
- Editor or Admin access to both the GA4 property and the GCP project.
- BigQuery API enabled in your GCP project (it's usually enabled by default).
Setup in 5 Minutes
- Open your GA4 property and go to Admin → BigQuery Links.
- Click Link and select the GCP project where you want the data exported.
- Choose your export frequency: Daily (recommended for most sites) or Streaming (for real-time needs).
- Select the events you want to export. We recommend all events initially — you can always filter in your queries later, but you can't retroactively export events you didn't include.
- Choose a data location (e.g., US or EU) — this should match your GCP project's default region for cost optimization.
- Click Submit. Wait 24-48 hours for the first tables to appear in your BigQuery project.
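Once the first tables appear, a quick sanity check confirms data is actually flowing. The export dataset is named `analytics_<property_id>`; replace `project.dataset` below with your own identifiers:

```sql
-- Sanity check: daily event volume since the export was enabled.
-- Fine to run unfiltered on a fresh export; add a _TABLE_SUFFIX filter once history grows.
SELECT _TABLE_SUFFIX AS day, COUNT(*) AS events
FROM `project.dataset.events_*`
GROUP BY day
ORDER BY day;
```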
Understanding the BigQuery Schema
GA4 exports data into tables with a specific naming convention and structure:
- Daily tables: Named events_YYYYMMDD (e.g., events_20250228). One table per day, created after the day is complete.
- Intraday tables: Named events_intraday_YYYYMMDD. These contain streaming data for the current day and are replaced by the daily table once the day is finalized.
Each row in the table represents a single event. The schema uses nested and repeated fields (RECORD and ARRAY types), which is different from flat tabular data. Key fields include:
- event_name: The name of the event (e.g., page_view, purchase).
- event_params: A repeated RECORD field containing all event parameters as key-value pairs.
- user_properties: A repeated RECORD field containing user-scoped properties.
- user_pseudo_id: The client ID (equivalent to the GA cookie).
- geo, device, traffic_source: Nested records with geographic, device, and acquisition data.
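A useful way to get oriented in the nested schema is to list which parameter keys each event actually carries. This is a minimal sketch against the standard export schema (replace `project.dataset` with your own identifiers):

```sql
-- List the event parameter keys attached to each event name.
-- UNNEST flattens the repeated event_params RECORD into one row per parameter.
SELECT event_name, params.key AS param_key, COUNT(*) AS occurrences
FROM `project.dataset.events_*`,
  UNNEST(event_params) AS params
WHERE _TABLE_SUFFIX = '20250228'
GROUP BY event_name, param_key
ORDER BY event_name, occurrences DESC;
```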
Essential Starter Queries
Here are three queries every analyst should run on their first day with BigQuery export:
1. Daily active users by event:

```sql
SELECT event_name, COUNT(DISTINCT user_pseudo_id) AS users
FROM `project.dataset.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20250201' AND '20250228'
GROUP BY event_name ORDER BY users DESC;
```

2. Unnesting event parameters:

```sql
SELECT event_name,
  (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page,
  COUNT(*) AS events
FROM `project.dataset.events_*`
WHERE _TABLE_SUFFIX = '20250228'
GROUP BY 1, 2 ORDER BY events DESC LIMIT 20;
```

3. Revenue by traffic source:

```sql
SELECT traffic_source.source, traffic_source.medium,
  SUM(ecommerce.purchase_revenue) AS revenue
FROM `project.dataset.events_*`
WHERE event_name = 'purchase'
  AND _TABLE_SUFFIX BETWEEN '20250201' AND '20250228'
GROUP BY 1, 2 ORDER BY revenue DESC;
```
Cost Reality Check
BigQuery pricing has two components: storage and query processing. Here's what to expect:
- Storage: $0.02/GB/month for active storage. A site with 1M events/month generates roughly 1-3 GB of data — costing $0.02-0.06/month. Even at 10M events/month, storage costs are typically under $1/month.
- Queries (on-demand): The first 1 TB of queries per month is free. After that, it's $6.25/TB. Most analytics queries scan well under 1 TB per month.
- Streaming export: Adds real-time data availability but costs approximately $0.05 per GB of streamed data ($0.01 per 200 MB). Only enable it if you genuinely need sub-hour data latency.
For most sites under 10M events/month, the total BigQuery cost is under $5/month — often effectively free within the free tier. This makes BigQuery one of the highest-ROI analytics investments you can make.
Common Mistakes to Avoid
- Not enabling export early enough: BigQuery export is not retroactive. It only captures data from the day you enable it forward. Every day without export is data you'll never get back.
- Choosing the wrong data location: Select a BigQuery region that matches your GCP project and is close to your team. Cross-region queries incur additional costs.
- Not understanding UNNEST: GA4's nested schema requires the UNNEST function to access event parameters. This is the #1 source of confusion for analysts new to BigQuery.
- Running expensive queries: Always use _TABLE_SUFFIX filters to limit the date range of your queries. Without date filtering, queries scan all historical data, which can become costly as your dataset grows.
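Because on-demand pricing bills by columns scanned rather than rows returned, selecting only the fields you need compounds with date filtering. A minimal sketch of the cheap-query pattern (table names assumed):

```sql
-- Cheap pattern: filter by date AND select only the columns you need.
-- BigQuery is columnar: SELECT * scans every column, and LIMIT does
-- not reduce the bytes scanned (or billed).
SELECT event_name, event_timestamp
FROM `project.dataset.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20250221' AND '20250228';
```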
NiceLookingData detects this
Our GA4 auditor automatically checks if BigQuery export is configured and warns you if it's missing — one of the first things every analytics team should set up. We also verify that the export is using the correct frequency and that data is actually flowing into BigQuery.
Key Takeaways
- BigQuery export provides raw, unsampled, unthresholded GA4 data — essential for serious analytics work.
- Setup takes 5 minutes and costs are negligible for most sites (under $5/month).
- Enable export as early as possible — it's not retroactive.
- Learn the UNNEST pattern for querying event parameters — it's the key to unlocking BigQuery's full potential.
- Use daily export for most cases; streaming export only when sub-hour latency is required.