API Connection Guide

Everything you need to connect the TagDrishti dashboard to live data. Each section covers exactly which service to connect, what credential to get, which env variable to set, and what the dashboard will call.

ℹ️ Dashboard Demo Mode – The dashboard file ships with USE_DEMO_DATA: true so it works instantly with realistic sample data. Set this to false and fill in the CONFIG block once your backend is running to switch to live data.
1. Architecture Overview
How the dashboard, backend, and data sources connect
┌──────────────────────────────────────────────────────────────┐
│                    TagDrishti Architecture                   │
└──────────────────────────────────────────────────────────────┘

  Browser (dashboard.tagdrishti.com)
      │
      │  JWT from Clerk
      ▼
  Cloud Run API (gtm-monitor-api.run.app)
      │
      ├──► Supabase PostgreSQL  – tenants, workspaces, api_keys, alert_configs
      │
      ├──► BigQuery             – tag_events, security_events, consent_snapshots,
      │                           anomaly_alerts, web_vitals_rollups
      │
      ├──► Upstash Redis        – last 24h tag status cache (instant dashboard load)
      │
      ├──► Resend               – email alerts
      │
      └──► Stripe               – billing portal, usage metering


  GTM Monitor Script (on client website)
      │
      │  x-api-key header
      ▼
  Cloud Run API → Pub/Sub → Cloud Function → BigQuery
✅ What you already have: Cloud Run API running, BigQuery schema deployed, GTM Monitor v7.4 script. What this guide covers: wiring up all 6 services so the dashboard shows real data.

The dashboard makes all data calls to your Cloud Run API – it never directly queries BigQuery or Supabase from the browser. This means one URL to configure and one auth token to handle. All service credentials stay server-side.
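Concretely, every dashboard call can go through one small helper that prepends the Cloud Run URL and attaches the Clerk token. A minimal sketch of what the dashboard's apiGet() might look like – the CONFIG and STATE names mirror this guide, but the values and helper shape are placeholders, not the shipped implementation:

```javascript
// Minimal sketch of the dashboard's API helper. CONFIG.CLOUD_RUN_URL and
// STATE.authToken mirror names used elsewhere in this guide; the values
// below are placeholders.
const CONFIG = { CLOUD_RUN_URL: 'https://gtm-monitor-api-xyz.run.app' };
const STATE = { authToken: 'demo-jwt' };

// Build the full URL + auth header for one dashboard API call.
function buildRequest(path, params = {}) {
  const url = new URL(path, CONFIG.CLOUD_RUN_URL);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return { url: url.toString(), headers: { Authorization: `Bearer ${STATE.authToken}` } };
}

async function apiGet(path, params) {
  const { url, headers } = buildRequest(path, params);
  const res = await fetch(url, { headers });
  if (!res.ok) throw new Error(`API error ${res.status}`);
  return res.json();
}
```

Because every route shares this helper, switching from demo to live data only changes where the data comes from, not how any page fetches it.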

2. Environment Variables
Complete list – set in Cloud Run & dashboard CONFIG block

In Cloud Run: set these via gcloud run services update or the Cloud Run Console → Edit & Deploy → Variables. In the Dashboard HTML: update the CONFIG block at the top of the <script> section.

Dashboard HTML – CONFIG block

| Variable | Where to get it | Required? |
|---|---|---|
| CLOUD_RUN_URL | Cloud Run Console → Service URL (e.g. https://gtm-monitor-api-xyz.run.app) | Required |
| CLERK_PK | Clerk Dashboard → API Keys → Publishable Key | Required |
| USE_DEMO_DATA | Set false once backend is live | Required |

Cloud Run – backend/.env

| Variable | Where to get it | Required? |
|---|---|---|
| SUPABASE_URL | Supabase Dashboard → Settings → API → Project URL | Required |
| SUPABASE_SERVICE_KEY | Supabase → Settings → API → service_role key (not anon) | Required |
| CLERK_SECRET_KEY | Clerk → API Keys → Secret key (starts sk_live_) | Required |
| GOOGLE_CLOUD_PROJECT | GCP Console → Project ID | Required |
| PUBSUB_TOPIC | Set to gtm-monitor-events (matches schema) | Required |
| BQ_DATASET | Set to gtm_monitor (matches schema) | Required |
| RESEND_API_KEY | Resend Dashboard → API Keys | Required |
| RESEND_FROM | e.g. alerts@tagdrishti.com (verified domain) | Required |
| STRIPE_SECRET_KEY | Stripe → Developers → API Keys → Secret key | Required |
| STRIPE_WEBHOOK_SECRET | Stripe → Webhooks → your endpoint → Signing secret | Required |
| UPSTASH_REDIS_REST_URL | Upstash Console → your database → REST URL | Optional |
| UPSTASH_REDIS_REST_TOKEN | Upstash Console → your database → REST Token | Optional |
| TAGDRISHTI_NOTIFY_EMAIL | Your own email – gets CC on every new signup | Optional |
| FRONTEND_URL | Set to https://dashboard.tagdrishti.com | Optional |
🔐 Never commit .env to Git. Add .env to .gitignore. In Cloud Run, set variables via the Console or gcloud run services update --set-env-vars. Use Secret Manager for STRIPE_SECRET_KEY and SUPABASE_SERVICE_KEY in production.
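For example, non-secret variables can be set inline while secrets are mounted from Secret Manager. A sketch – the service name, region, and secret names below are placeholders for your own:

```shell
# Set plain env vars inline; pull secrets from Secret Manager.
# Service name, region, and secret names are placeholders.
gcloud run services update gtm-monitor-api \
  --region asia-south1 \
  --set-env-vars "BQ_DATASET=gtm_monitor,PUBSUB_TOPIC=gtm-monitor-events" \
  --set-secrets "STRIPE_SECRET_KEY=stripe-secret-key:latest,SUPABASE_SERVICE_KEY=supabase-service-key:latest"
```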
3. Cloud Run API Endpoints
Every endpoint the dashboard calls – add these to your Fastify server

These are the routes the dashboard's apiGet() / apiPost() calls expect. All are prefixed with your Cloud Run URL. All require Authorization: Bearer {clerk_jwt} except the health check.

GET /api/me – Called on init, loads tenant + workspaces

Returns the authenticated user's tenant record and all workspaces. Dashboard uses this to populate the sidebar domain picker, all filter dropdowns, and the settings page.

Response Shape

{
  "tenant": { "id", "name", "email", "plan", "plan_event_limit", "stripe_customer_id" },
  "workspaces": [{ "id", "name", "domain", "gtm_container", "status", "tags", "api_key" }]
}

GET /api/dashboard/overview – Main overview stats, chart, alerts

The most-called endpoint. Powers all 5 stat cards, the main chart, alert feed, tag table, domain health, and security mini-list on the Overview page.

| Query param | Type | Description |
|---|---|---|
| range | string | 1h / 24h / 7d / 30d – maps to BigQuery date filter |
| workspace_id | uuid or "all" | Filter to a single workspace or all workspaces for the tenant |

BigQuery queries this endpoint should run

-- stats: use v_tag_health_24h view for fast response
SELECT * FROM `gtm_monitor.v_tag_health_24h`
WHERE tenant_id = @tenant_id
  AND workspace_id = @workspace_id  -- or omit for all

-- chart: hourly bucketing
SELECT TIMESTAMP_TRUNC(received_at, HOUR) as hour,
  COUNTIF(tag_status='success') as fires,
  COUNTIF(tag_status='failure') as fails,
  AVG(tag_exec_time_ms) as latency
FROM `gtm_monitor.tag_events`
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
  AND tenant_id = @tenant_id
GROUP BY 1 ORDER BY 1

-- alerts: from anomaly_alerts table, last 6
SELECT * FROM `gtm_monitor.anomaly_alerts`
WHERE tenant_id = @tenant_id AND event_date >= CURRENT_DATE() - 1
ORDER BY event_date DESC, received_at DESC LIMIT 6

GET /api/tags – Full tag health table (Tag Health page)
| Query param | Type | Description |
|---|---|---|
| range | string | 24h / 7d / 30d |
| workspace_id | uuid or "all" | Filter by workspace |
| status | string | all / failure / critical / success / blocked_by_consent |

Source

Query v_tag_health_24h BigQuery view with tenant_id + tag_status filter. Return { tags: [...] }
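As a sketch, the route handler can assemble the view query and its BigQuery named parameters from the request's query string. buildTagsQuery is a hypothetical helper, not part of the shipped backend; pass its result to the BigQuery client's query call:

```javascript
// Hypothetical helper: turn /api/tags query params into a parameterised
// query against the v_tag_health_24h view.
function buildTagsQuery({ tenantId, workspaceId = 'all', status = 'all' }) {
  let sql = 'SELECT * FROM `gtm_monitor.v_tag_health_24h` WHERE tenant_id = @tenant_id';
  const params = { tenant_id: tenantId };
  if (workspaceId !== 'all') {
    sql += ' AND workspace_id = @workspace_id';
    params.workspace_id = workspaceId;
  }
  if (status !== 'all') {
    sql += ' AND tag_status = @status';
    params.status = status;
  }
  return { sql, params };
}
```

Named parameters keep tenant and status values out of the SQL string itself, so user-supplied filters can never be injected into the query.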

GET /api/security – Security events log (Security page)
| Query param | Type | Description |
|---|---|---|
| range | string | Date range for query |
| type | string | all / magecart_detected / csp_violation / sri_missing / unknown_script_domain / network_exfiltration |

Source

Query v_security_summary BigQuery view + security_events table. Return { stats:{total,critical,pci_scope,sri_missing,unknown_domains}, events:[...] }

GET /api/consent – Consent rates and compliance (Consent page)

Query v_consent_compliance BigQuery view. Return { analytics_rate, ads_rate, functional_rate, gpc_count, blocked_count, eu_pct, trend_labels[], trend_analytics[], trend_ads[], trend_gpc[] }

GET /api/vitals – Core Web Vitals P75 scores (Web Vitals page)
| Query param | Type | Description |
|---|---|---|
| workspace_id | uuid or "all" | Domain filter |
| device | string | all / mobile / desktop / tablet |

Source

Query web_vitals_rollups BigQuery table for yesterday's P75. Return { lcp, cls, inp, fcp, ttfb, lcp_good_pct, lcp_needs_pct, lcp_poor_pct, lcp_trend:[], days:[] }
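When computing the good / needs-improvement / poor percentages, the standard Core Web Vitals thresholds apply. A sketch – rateVital is an illustrative helper, not part of the shipped backend:

```javascript
// Standard Core Web Vitals thresholds: [good upper bound, poor lower bound].
// LCP and INP in milliseconds, CLS unitless.
const THRESHOLDS = {
  lcp: [2500, 4000],
  inp: [200, 500],
  cls: [0.1, 0.25],
};

// Classify a P75 value for one metric.
function rateVital(metric, p75) {
  const [good, poor] = THRESHOLDS[metric];
  if (p75 <= good) return 'good';
  if (p75 <= poor) return 'needs-improvement';
  return 'poor';
}
```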

GET /api/alerts – Alert history (Alerts page)

Query anomaly_alerts BigQuery table filtered by type + severity. Return { alerts: [...] } with full alert records including notification_sent flag.

GET /api/workspaces – List all workspaces for tenant

Read from Supabase workspaces + api_keys tables. Joins to get api_key per workspace for the settings page. Used by Domains page.

POST /api/workspaces – Create new workspace + API key

Body: { name, domain, gtm_container }. Creates Supabase workspace row + generates API key (td_{uuid}). Returns { workspace, api_key }.

POST /api/alert-config – Save alert configuration

Body: { workspace_id, alert_email, slack_webhook_url, fail_rate_threshold, critical_tag_alert }. Upserts to Supabase alert_configs table.

PATCH /api/tenants/me – Update account details

Body: { name?, email?, retention_days?, children_mode? }. Updates Supabase tenants table for authenticated tenant.

POST /api/billing/portal – Generate Stripe customer portal URL

Calls Stripe billingPortal.sessions.create with the tenant's stripe_customer_id. Returns { url } – the dashboard opens it in a new tab.

GET /api/bq/status – BigQuery connection status + usage

Runs a lightweight SELECT COUNT(*) on tag_events for today. Returns { status, rows_24h, last_write, usage:{events,domains,sessions} }. Used by BigQuery page.

4. Authentication – Clerk
JWT-based auth for dashboard → Cloud Run API
🔑 How it works: Clerk authenticates the user in the browser, provides a signed JWT, and the dashboard passes it in every API request. Your Cloud Run backend verifies the JWT using Clerk's JWKS endpoint and extracts the org_id to identify the tenant.
1. Create Clerk Application
Go to clerk.com → Create Application. Choose "Social Login + Email". Copy your Publishable Key (pk_live_…) and Secret Key (sk_live_…).
2. Add Clerk to Dashboard HTML
Add the Clerk JS SDK to the <head>, initialise with your publishable key, then get the token in initAuth().
3. Verify JWT in Cloud Run
Install @clerk/clerk-sdk-node. Use clerkClient.verifyToken(jwt) in a Fastify pre-handler. Extract sub (user ID) and look up tenant_id in Supabase by clerk_org_id.
// dashboard HTML – initAuth() replacement for production
import Clerk from '@clerk/clerk-js';

const clerk = new Clerk(CONFIG.CLERK_PK);
await clerk.load();

if (!clerk.user) {
  // Not signed in – redirect to sign-in
  clerk.redirectToSignIn({ returnBackUrl: window.location.href });
  return;
}

// Get fresh JWT for API calls
STATE.authToken = await clerk.session.getToken();

// Cloud Run Fastify – auth middleware
import { createClerkClient } from '@clerk/clerk-sdk-node';
const clerkClient = createClerkClient({ secretKey: process.env.CLERK_SECRET_KEY });

app.addHook('preHandler', async (req, reply) => {
  try {
    const token = req.headers.authorization?.replace('Bearer ', '');
    if (!token) return reply.code(401).send({ error: 'Unauthorized' });
    const payload = await clerkClient.verifyToken(token);
    // Look up tenant from Supabase by Clerk org/user ID
    const { data: tenant } = await supabase
      .from('tenants').select('*')
      .eq('clerk_org_id', payload.org_id || payload.sub).single();
    req.tenant = tenant;
  } catch (e) { reply.code(401).send({ error: 'Invalid token' }); }
});
5. Supabase – Application Database
Stores tenants, workspaces, API keys, alert config
1. Create Project
Go to supabase.com → New Project. Choose a region close to your Cloud Run deployment (e.g. ap-south-1 for India). Copy the Project URL and service_role key from Settings → API.
2. Run Schema
Go to SQL Editor → New Query. Paste the schema SQL from the Setup Guide (tenants, workspaces, api_keys, alert_configs tables + indexes). Click Run.
3. Set Env Variables
Set SUPABASE_URL and SUPABASE_SERVICE_KEY in Cloud Run. Use the service_role key – not the anon key. This key bypasses Row Level Security and is needed for server-side operations.
⚠️ Which key to use: The dashboard calls Cloud Run, not Supabase directly. Cloud Run uses the service_role key. Never expose this key in frontend code. The anon key is for client-side queries only – not needed here.

Tables used by dashboard

| Table | Used by endpoint | Data |
|---|---|---|
| tenants | /api/me, /api/tenants/me | Name, email, plan, Stripe IDs |
| workspaces | /api/workspaces, /api/me | Domain, GTM container, status |
| api_keys | /api/workspaces, event ingestion | Per-workspace API key for script |
| alert_configs | /api/alert-config | Email, Slack webhook, thresholds |
6. BigQuery – Event Storage & Analytics
All tag events, security events, consent, vitals, anomaly alerts
1. Deploy Schema
Run bq mk --dataset --location=asia-south1 gtm_monitor, then bq query --use_legacy_sql=false < tagdrishti-bigquery-schema.sql. This creates all 5 tables + 4 views + the scheduled rollup.
2. Enable Cloud Run BQ Access
Grant your Cloud Run service account roles/bigquery.dataViewer on the gtm_monitor dataset, plus roles/bigquery.jobUser on the project so it can run query jobs. Cloud Run uses these to run the SELECT queries behind the dashboard endpoints.
3. Set Project ID
Set the GOOGLE_CLOUD_PROJECT env var in Cloud Run. The BigQuery client picks this up automatically via ADC (Application Default Credentials) – no separate key file is needed on Cloud Run.
4. Wire Pub/Sub → Cloud Function → BQ
Deploy the bq-writer Cloud Function (included in the schema SQL) subscribed to the gtm-monitor-events Pub/Sub topic. It writes inbound events to the correct BQ table by message_type.
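The routing rule inside bq-writer can be as simple as a lookup from message_type to table name. A sketch – the table names come from the schema, but the message_type values here are assumptions; use the ones your schema SQL actually defines:

```javascript
// Assumed message_type → BigQuery table mapping. Table names match the
// schema in this guide; message_type values are illustrative.
const TABLE_BY_TYPE = {
  tag_event: 'tag_events',
  security_event: 'security_events',
  consent_snapshot: 'consent_snapshots',
  anomaly_alert: 'anomaly_alerts',
  web_vitals: 'web_vitals_rollups',
};

// Pick the destination table for one Pub/Sub message (null = log and drop).
function targetTable(message) {
  return TABLE_BY_TYPE[message.message_type] ?? null;
}
```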

Table → Dashboard page mapping

| BQ Table / View | Dashboard Page | Endpoint |
|---|---|---|
| v_tag_health_24h | Overview, Tag Health | /api/dashboard/overview, /api/tags |
| security_events | Security | /api/security |
| v_security_summary | Security stats | /api/security |
| v_consent_compliance | Consent page | /api/consent |
| web_vitals_rollups | Web Vitals page | /api/vitals |
| anomaly_alerts | Overview alert feed, Alerts page | /api/alerts, /api/dashboard/overview |
| v_daily_usage | BigQuery page usage bar, Settings plan | /api/bq/status |
7. Upstash Redis – Cache Layer
Optional but recommended – instant dashboard load, rate limiting
⚡ Why Redis: BigQuery queries take 1–3 seconds. Redis caches the last 24h tag status per tenant so the Overview page loads instantly. The cache is invalidated when new events arrive from Pub/Sub. Without Redis the dashboard still works, just with a slower initial load.
1. Create Upstash Database
Go to upstash.com → Create Database. Choose the region closest to your Cloud Run deployment (e.g. ap-south-1 / Mumbai for India). Copy the REST URL and REST Token.
2. Set Env Variables
Set UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN in Cloud Run.
// Cache pattern in /api/dashboard/overview endpoint
const cacheKey = `overview:${tenantId}:${workspaceId}:${range}`;
const cached = await redis.get(cacheKey);
if (cached) return JSON.parse(cached);

// ... run BigQuery queries ...

await redis.set(cacheKey, JSON.stringify(result), { ex: 60 }); // 60s TTL
return result;
8. Resend (Email) + Slack Alerts
Alert delivery when an anomaly_alert payload arrives
1. Create Resend Account
Go to resend.com → Create API Key → copy it. Add your sending domain (e.g. tagdrishti.com) under Domains and verify the DNS records. Set RESEND_API_KEY and RESEND_FROM=alerts@tagdrishti.com.
2. Configure Slack Webhook
In your Slack workspace: Apps → Manage → Build → Create New App → Incoming Webhooks → Activate → Add to Channel. Copy the webhook URL. Users save this in dashboard → Settings → Notifications. It is stored in Supabase alert_configs.slack_webhook_url.
3. Trigger Logic
When an inbound event has anomaly_alert set, Cloud Run reads the alert_config from Supabase for that workspace and fires both email (via Resend) and Slack (via HTTP POST to the webhook). The notification_sent flag in the anomaly_alerts BQ table tracks delivery.
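The decision itself can be a small pure function over the workspace's alert_configs row and the current window's stats. A sketch – the config field names follow the alert_configs columns above, while the stats shape is an assumption:

```javascript
// Decide whether to fire email + Slack for this workspace.
// config mirrors the alert_configs row; stats is an assumed rollup of
// the current window's tag events.
function shouldAlert(config, stats) {
  const total = stats.fires + stats.fails;
  const failRatePct = total > 0 ? (stats.fails / total) * 100 : 0;
  if (failRatePct >= config.fail_rate_threshold) return true;
  if (config.critical_tag_alert && stats.criticalFailed) return true;
  return false;
}
```

Keeping this pure makes the trigger logic trivially unit-testable before wiring in Resend and the Slack webhook.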
9. Stripe – Billing
Subscription management + Customer Portal
1. Create Stripe Products
Stripe Dashboard → Products. Create 4 plans: Starter ($50/mo), Starter+ ($89/mo), Agency ($399/mo), Enterprise ($699/mo). Copy each Price ID (price_…). Set these as environment variables in Cloud Run.
2. Add Webhook
Stripe → Developers → Webhooks → Add Endpoint. URL: https://YOUR-CLOUD-RUN-URL/api/billing/webhook. Events: checkout.session.completed, customer.subscription.updated, customer.subscription.deleted. Copy the signing secret → set it as STRIPE_WEBHOOK_SECRET.
3. Enable Customer Portal
Stripe → Settings → Billing → Customer Portal → Activate. This lets the /api/billing/portal endpoint generate a self-serve portal URL so users can change plan, update their card, or cancel from the dashboard.
ℹ️ Webhook → Supabase: When checkout.session.completed fires, Cloud Run reads the plan from metadata and updates tenants.plan in Supabase. The dashboard reads this from /api/me and shows the correct plan features in Settings.
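The webhook handler's core can be a pure mapping from a verified event to the tenants.plan update. A sketch – the metadata.plan key and the 'cancelled' downgrade value are assumptions about your Checkout setup, not fixed Stripe behaviour:

```javascript
// Map a verified Stripe webhook event to a tenants-table update.
// Returns null for event types this handler doesn't act on.
function planUpdateFromEvent(event) {
  switch (event.type) {
    case 'checkout.session.completed':
      // Plan name carried in the Checkout Session metadata (assumption).
      return { plan: event.data.object.metadata.plan };
    case 'customer.subscription.deleted':
      return { plan: 'cancelled' }; // assumed downgrade value
    default:
      return null;
  }
}
```

Always verify the event signature with STRIPE_WEBHOOK_SECRET before this mapping runs; never trust an unverified payload.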
10. Dashboard Go-Live Configuration
The 3 lines to change in the dashboard HTML to switch from demo to live
// ── In TagDrishti-Dashboard-Full.html ──
// Find the CONFIG block near the top of the <script> section
// Replace these 3 values:

const CONFIG = {
  CLOUD_RUN_URL: 'https://gtm-monitor-api-YOUR-HASH.run.app', // ← your Cloud Run URL
  CLERK_PK: 'pk_live_YOUR_PUBLISHABLE_KEY',                   // ← from Clerk dashboard
  SUPABASE_URL: 'https://YOUR_PROJECT_REF.supabase.co',       // ← only if direct Supabase access is needed (normally not)
  SUPABASE_ANON: 'YOUR_SUPABASE_ANON_KEY',                    // ← only if direct Supabase access is needed (normally not)
  USE_DEMO_DATA: false,                                       // ← CHANGE THIS to false
  REFRESH_INTERVAL: 30000
};
✅ That's it. Set USE_DEMO_DATA: false, fill in CLOUD_RUN_URL and CLERK_PK, and every API call in the dashboard switches from demo data to your live Cloud Run backend. All data flows, chart rendering, and table rendering are identical – only the data source changes.

Then add the Clerk JS SDK to the dashboard's <head> tag, implement the production initAuth() replacement shown in Section 4, and deploy to dashboard.tagdrishti.com (Vercel, Cloudflare Pages, or Firebase Hosting all work).

✓ Launch Checklist
Don't go live until every item is checked
1. Supabase schema deployed
4 tables visible in Table Editor: tenants, workspaces, api_keys, alert_configs
2. Cloud Run API running
GET /health returns {"status":"ok","version":"7.4"}
3. Clerk auth working
Sign in → /api/me returns the tenant record
4. BigQuery pipeline active
Pub/Sub → Cloud Function → BQ writing events. /api/bq/status returns status: "connected"
5. GTM Monitor script installed
Events appear in the BQ tag_events table within 30s of a page load
6. Email alert tested
Force a tag failure → anomaly_alert payload → email received via Resend within 2 minutes
7. Slack alert tested
Webhook URL saved in settings → alert delivered to the Slack channel
Stripe products created
3 products with correct prices. Checkout flow works end-to-end. Webhook updates Supabase plan field.
9. Dashboard deployed
USE_DEMO_DATA: false set. Deployed to dashboard.tagdrishti.com. Clerk auth redirects working.
10. First real customer onboarded
Client installs the script → you see their tag fires in the dashboard within seconds.

TagDrishti API Connection Guide · v7.4 · February 2026
Questions? contact@tagdrishti.com