# Connect to Google Cloud

Connect your Google Cloud (GCP) project to Finora to track AI workloads — Vertex AI, Gemini APIs, and any other AI service Google bills through its standard billing export.

> **Before you start**
>
> * You need access to a Google Cloud project where you can create a service account and grant BigQuery permissions.
> * **Standard billing export to BigQuery must already be enabled** on your billing account. If it's not, follow [Google's setup guide](https://cloud.google.com/billing/docs/how-to/export-data-bigquery-setup) — it's a 5-minute setup. After enabling, **wait about 24 hours** for the first export to appear before connecting Finora.

## Overview

Google Cloud doesn't have a single "billing API" the way OpenAI or Anthropic do. Instead, Google's recommended path is to export your billing data to a BigQuery table and read from there. Finora reads that table using a service account you create — a Google identity with read-only access to your billing dataset.
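Conceptually, the read path boils down to a SQL query against your export table. The sketch below shows the kind of query an integration like this might run — the project, dataset, and table names are placeholders, and the column names come from Google's standard export schema:

```sql
-- Sketch of a read-only query against the standard billing export
-- (project/dataset/table names are placeholders).
SELECT
  service.description AS service,
  sku.description     AS sku,
  usage_start_time,
  cost,
  currency
FROM `your-project.billing_export.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE DATE(usage_start_time) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
LIMIT 100;
```

Running a query like this yourself in the BigQuery console is also a quick way to confirm the export contains data before connecting Finora.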

## Step 1: Confirm your BigQuery billing export

1. In the [Google Cloud Console](https://console.cloud.google.com), open **Billing → Billing export**.
2. Confirm **Standard usage cost** export is **Enabled**, and note three things:
   * **Project ID** — the project that hosts the BigQuery dataset
   * **Dataset name** — typically `billing_export` (yours may be named differently)
   * **Table name** — looks like `gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`

> **Use the standard export, not the detailed export.** Finora is built around the standard export's data shape. The detailed export has a different layout and won't work.
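If you're unsure which export type a table belongs to, you can list the export tables in your dataset via BigQuery's `INFORMATION_SCHEMA` — the dataset name below is a placeholder:

```sql
-- List billing export tables in the dataset (dataset name is a placeholder).
-- Standard export tables are named gcp_billing_export_v1_...;
-- detailed export tables are named gcp_billing_export_resource_v1_...
SELECT table_name
FROM `your-project.billing_export.INFORMATION_SCHEMA.TABLES`
WHERE table_name LIKE 'gcp_billing_export%';
```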

## Step 2: Create a read-only service account

1. In the same project, open **IAM & Admin → Service accounts → + Create service account**.
2. Name it `finora-bigquery-reader`.
3. Click **Create and continue**.
4. Grant these two roles (and only these):
   * **BigQuery Data Viewer**
   * **BigQuery Job User**
5. Finish creating the service account.
6. Open the new service account, go to **Keys → Add key → Create new key → JSON**.
7. Download the JSON file Google generates. It looks like:

```json
{
  "type": "service_account",
  "project_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "finora-bigquery-reader@your-project.iam.gserviceaccount.com",
  "...": "..."
}
```

> Treat this JSON file like a password — anyone with it can read your billing dataset. Don't email it, paste it in chat, or commit it to source control.
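If you prefer the command line, the console steps above map to a few `gcloud` commands — the project ID here is a placeholder:

```shell
# CLI equivalent of the console steps in Step 2 (project ID is a placeholder).
gcloud iam service-accounts create finora-bigquery-reader \
    --project=your-project --display-name="Finora BigQuery reader"

# Grant the two read-only roles on the project hosting the export dataset.
gcloud projects add-iam-policy-binding your-project \
    --member="serviceAccount:finora-bigquery-reader@your-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"
gcloud projects add-iam-policy-binding your-project \
    --member="serviceAccount:finora-bigquery-reader@your-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"

# Create and download a JSON key.
gcloud iam service-accounts keys create finora-key.json \
    --iam-account=finora-bigquery-reader@your-project.iam.gserviceaccount.com
```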

## Step 3: Connect GCP to Finora

1. In Finora, click **Settings** in the sidebar, then open the **API Keys** tab.
2. From the **Provider** dropdown, choose **Google Cloud Platform**.
3. Fill in:
   * **Project ID** — from Step 1
   * **Dataset ID** — from Step 1 (e.g. `billing_export`)
   * **Table ID** — from Step 1 (the long `gcp_billing_export_v1_...` name)
   * **Service Account Key (JSON)** — open the JSON file in a text editor and paste the **entire contents** into the field.
4. Click **Save**.

Finora makes a test query to confirm everything works. On success, Google Cloud appears in your connected providers list.
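Before pasting the key, you can sanity-check the JSON file with a few lines of standard-library Python. The field names follow Google's service-account key format; the helper itself is illustrative and not part of Finora:

```python
import json

# Fields every Google service-account JSON key contains.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_key_file(path: str) -> list[str]:
    """Return a list of problems found in a service-account key file.
    An empty list means the file looks usable. Illustrative helper only."""
    with open(path) as f:
        key = json.load(f)
    problems = []
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if key.get("type") != "service_account":
        problems.append("'type' should be 'service_account'")
    if not key.get("private_key", "").startswith("-----BEGIN PRIVATE KEY-----"):
        problems.append("'private_key' does not look like a PEM private key")
    return problems
```

A common failure mode this catches is pasting a truncated file or the wrong JSON (for example, an OAuth client config instead of a service-account key).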

### Multiple GCP billing accounts (optional)

If you have several billing exports — for example, one per business unit — you can add labels for each. After saving the first connection, click **+ Add another GCP account** in the GCP row, give it a label like `marketing` or `engineering`, and repeat.

## What Finora tracks for Google Cloud

* Total cost converted to USD
* Date of each charge
* Which Google service and SKU was billed (e.g. a Vertex AI or Gemini line item)
* Project ID and labels from your billing data

Finora filters automatically to AI-relevant services (Vertex AI, Generative AI, Gemini APIs). Other GCP spend stays in your bill but isn't pulled in.
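That filtering can be pictured as a query along these lines — the exact service names matched are illustrative, not a statement of Finora's internal rules:

```sql
-- Sketch of narrowing the export to AI-related line items
-- (the service names matched here are illustrative).
SELECT
  DATE(usage_start_time) AS usage_date,
  service.description    AS service,
  sku.description        AS sku,
  project.id             AS project_id,
  SUM(cost)              AS cost
FROM `your-project.billing_export.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE service.description IN ('Vertex AI', 'Generative Language API')
GROUP BY usage_date, service, sku, project_id
ORDER BY usage_date;
```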

## FAQs

**Why a service account JSON key and not OAuth?** BigQuery exports are read on a regular schedule, not interactively. Service account authentication is the standard, audit-friendly approach for this kind of automated data access on Google Cloud.

**Saving failed with `403 Access Denied`.** Most common causes:

* The service account is missing the **BigQuery Job User** role (Data Viewer alone isn't enough — you need *both*).
* Wrong Project ID.
* The dataset is protected by a VPC Service Controls perimeter (or an organization policy) that blocks access from outside your network.

**Saving failed with `Not found: Table ...`.** Double-check the **Table ID**. Standard export tables have long names like `gcp_billing_export_v1_<billing-account-id>_...`. Copy it directly from BigQuery to avoid typos.

**Data is empty after the first refresh.** Newly enabled exports take up to 24 hours to populate with historical data. If running `SELECT * FROM your_table LIMIT 10` in the BigQuery console returns zero rows, the export hasn't backfilled yet — wait and try again.

**How do I rotate the service account key?** Create a new key in your GCP service account, paste it into Finora **Settings → API Keys → Update**, save, and then delete the old key in GCP IAM.
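With `gcloud`, rotation looks like this — the account name and key ID are placeholders, and creating the new key before deleting the old one avoids any gap in access:

```shell
# Rotation sketch (account name is a placeholder): create the new key first.
gcloud iam service-accounts keys create new-key.json \
    --iam-account=finora-bigquery-reader@your-project.iam.gserviceaccount.com

# After pasting new-key.json into Finora and saving,
# list the keys and delete the old one by its key ID.
gcloud iam service-accounts keys list \
    --iam-account=finora-bigquery-reader@your-project.iam.gserviceaccount.com
gcloud iam service-accounts keys delete OLD_KEY_ID \
    --iam-account=finora-bigquery-reader@your-project.iam.gserviceaccount.com
```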

## Removing the integration

Open **Settings → API Keys**, find the GCP row, click the trash icon, and confirm. Your stored connection is deleted. We recommend you also delete the corresponding service account key in GCP IAM afterwards.

