Audit logs are available via Google Cloud Logging, where they can be immediately filtered to provide insights on specific jobs or queries, or exported to Google Cloud Pub/Sub, Google Cloud Storage, or BigQuery.
Using Cloud Logging to find large queries in BigQuery
You can leverage Cloud Logging to filter your audit logs to find specific jobs that match your criteria, whether it’s expensive queries, specific failure modes, or unintended access patterns. The following example shows you how to filter for large queries.
- Go to the Logs Viewer in the Cloud Platform Console.
- Select BigQuery from the list of resources that you want to filter.
- Switch to Advanced Filtering by clicking the drop-down arrow in the Filter box and selecting Convert to advanced filter.
- Enter filter statements to look for particular patterns of usage.

Example 1: Show events from users who have run queries exceeding 5 GB of billed bytes.

Enter the following filter statements in the Filter by box and click Submit Filter:

```
metadata.serviceName="bigquery.googleapis.com"
protoPayload.serviceData.jobCompletedEvent.job.jobStatistics.totalBilledBytes > 5000000000
```

Example 2: Show events related to complex queries (those requiring a higher billing tier).

Enter the following filter statements in the Filter by box and click Submit Filter:

```
metadata.serviceName="bigquery.googleapis.com"
protoPayload.serviceData.jobCompletedEvent.job.jobStatistics.billingTier > 1
```
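The same threshold check can also be sketched client-side. The following Python snippet is illustrative only: the sample entries are hypothetical flattened stand-ins for the `totalBilledBytes` field in the audit payload, not real Cloud Logging records.

```python
# Filter audit-log entries for queries billed at more than 5 GB.
# The entries below are hypothetical; real entries come from Cloud Logging
# and nest totalBilledBytes under jobCompletedEvent.job.jobStatistics.

LARGE_QUERY_BYTES = 5_000_000_000  # the 5 GB threshold from the filter above

entries = [
    {"principalEmail": "alice@example.com", "totalBilledBytes": 12_000_000_000},
    {"principalEmail": "bob@example.com", "totalBilledBytes": 40_000_000},
]

def large_queries(entries, threshold=LARGE_QUERY_BYTES):
    """Return entries whose billed bytes exceed the threshold."""
    return [e for e in entries if e["totalBilledBytes"] > threshold]

for e in large_queries(entries):
    print(e["principalEmail"], e["totalBilledBytes"])
```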
Querying audit logs using BigQuery
To analyze your aggregated usage data using SQL, set up export of audit logs back to BigQuery. For more information about setting up exports from Cloud Logging, see Overview of Logs Export in the Cloud Logging documentation.
Set up streaming of audit logs to BigQuery
- Go to the Logs Viewer in the Cloud Platform Console and click Exports in the navigation bar.
- Configure options under Select service:
  - Select BigQuery as the source of your export.
  - Click the Add Item button and select cloudaudit.googleapis.com/data_access.
- Set up Select export destinations:
  - Under Stream to BigQuery dataset, select Add new dataset.
  - Type the name of a new dataset to create for the audit logs (for example, AuditLogs) and click Create.
  - Click Save to finish setting up the export.

- Navigate to the dataset in the BigQuery web UI to start sending queries about your data access usage and charges.
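An equivalent sink can also be created from the command line. The following is a sketch only, with hypothetical project, sink, and dataset names, and it assumes an installed and authenticated gcloud CLI:

```shell
# Hypothetical sketch: create a log sink that streams BigQuery data-access
# audit logs into the AuditLogs dataset. "my-project", "bq-audit-to-bq",
# and "AuditLogs" are placeholders.
gcloud logging sinks create bq-audit-to-bq \
  bigquery.googleapis.com/projects/my-project/datasets/AuditLogs \
  --log-filter='logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access"'
```

Note that the sink's writer identity must be granted write access to the destination dataset before log entries start to flow.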
Sample audit queries in BigQuery
Example 1: Charges for BigQuery usage over the last 7 days
This query uses BigQuery legacy SQL syntax (note the `TABLE_DATE_RANGE` and `STRFTIME_UTC_USEC` functions):

```sql
SELECT
  query_date,
  ROUND(((total_bytes * 5) / 1000000000000), 2) AS Cost_In_Dollars
FROM (
  SELECT
    STRFTIME_UTC_USEC(metadata.timestamp, "%Y-%m-%d") AS query_date,
    SUM(protoPayload.serviceData.jobCompletedEvent.job.jobStatistics.totalBilledBytes) AS total_bytes
  FROM
    TABLE_DATE_RANGE(AuditLogs.cloudaudit_googleapis_com_data_access_,
      DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY'),
      CURRENT_TIMESTAMP())
  WHERE
    protoPayload.serviceData.jobCompletedEvent.eventName = 'query_job_completed'
  GROUP BY
    query_date)
```
The results show one row per day, with the total bytes billed for that day converted to an estimated cost in dollars.
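The dollar figure above is just billed bytes multiplied by the on-demand rate. As a sketch, assuming the $5-per-TB rate hard-coded in the query, the same conversion in Python is:

```python
# Convert billed bytes to an approximate on-demand query cost.
# Assumes the $5 per TB (10**12 bytes) rate used in the SQL above;
# check current BigQuery pricing before relying on this number.

RATE_PER_TB = 5.0
BYTES_PER_TB = 1_000_000_000_000

def cost_in_dollars(total_billed_bytes: int) -> float:
    """Mirror of ROUND((total_bytes * 5) / 1000000000000, 2)."""
    return round(total_billed_bytes * RATE_PER_TB / BYTES_PER_TB, 2)

print(cost_in_dollars(5_000_000_000_000))  # 5 TB billed -> 25.0
```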
Example 2: Charges for BigQuery by user over the last 7 days
```sql
SELECT
  protoPayload.authenticationInfo.principalEmail AS User,
  ROUND((total_bytes * 5) / 1000000000000, 2) AS Total_Cost_For_User,
  query_count AS Query_Count
FROM (
  SELECT
    protoPayload.authenticationInfo.principalEmail,
    SUM(protoPayload.serviceData.jobCompletedEvent.job.jobStatistics.totalBilledBytes) AS total_bytes,
    COUNT(protoPayload.authenticationInfo.principalEmail) AS query_count
  FROM
    TABLE_DATE_RANGE(AuditLogs.cloudaudit_googleapis_com_data_access_,
      DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY'),
      CURRENT_TIMESTAMP())
  WHERE
    protoPayload.serviceData.jobCompletedEvent.eventName = 'query_job_completed'
  GROUP BY
    protoPayload.authenticationInfo.principalEmail)
ORDER BY
  2 DESC
```
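The aggregation in Example 2 (sum bytes and count queries per user, then convert to dollars) can be sketched in plain Python over hypothetical (user, billed-bytes) pairs rather than real audit-log rows:

```python
from collections import defaultdict

# Per-user cost aggregation mirroring Example 2. The rows below are
# hypothetical; real values come from totalBilledBytes in the audit logs.

def cost_by_user(rows, rate_per_tb=5.0):
    """Return {user: (cost_in_dollars, query_count)}."""
    totals = defaultdict(lambda: [0, 0])  # user -> [total_bytes, query_count]
    for user, billed_bytes in rows:
        totals[user][0] += billed_bytes
        totals[user][1] += 1
    return {
        user: (round(b * rate_per_tb / 1_000_000_000_000, 2), n)
        for user, (b, n) in totals.items()
    }

rows = [
    ("alice@example.com", 2_000_000_000_000),
    ("alice@example.com", 1_000_000_000_000),
    ("bob@example.com", 500_000_000_000),
]
print(cost_by_user(rows))
```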
For more information about which elements can be queried in the audit logs, see AuditData in the Cloud Logging documentation.