ai marketing

How Ooty Works: The Architecture Behind AI-Native Marketing Tools

A transparent look at how Ooty routes data between Claude and marketing APIs -- the proxy pattern, authentication flow, credential security, and why this architecture was chosen.

By Finn Hartley
22 February 2026 · Updated 23 February 2026 · 7 min read
#architecture#mcp#technical#security#ooty#transparency#how-it-works

When you connect Claude to your Google Analytics account through Ooty's Compass MCP server and ask "What were my top traffic sources last month?", several things happen in the next two seconds. Understanding what happens -- and why it's designed that way -- matters if you're evaluating whether to trust Ooty with your marketing credentials.

This post explains the architecture honestly, including the trade-offs we made and the ones we're still working through.

The Core Problem: API Keys on User Machines

Most developer-facing tools that connect AI assistants to external APIs ask you to store your API keys locally. Your Google API key, your Meta app secret, your Amazon credentials -- all sitting in a config file on your laptop.

This approach has obvious appeal: no intermediary, no additional company with access to your credentials, complete control. It also has a serious problem.

API keys in local config files are:

  • Accessible to any process running as your user
  • Often accidentally committed to git repos (over 12.8 million secrets were detected in public GitHub commits in 2023 -- GitGuardian, 2024)
  • At risk when your machine is shared, lost, stolen, or compromised
  • Individual per-user, meaning every user needs to manage their own API key with every upstream service

For simple developer tooling, this trade-off is acceptable. For a product used by marketing professionals who aren't managing their own security infrastructure, it isn't.

Ooty is built around a server-side proxy pattern that keeps upstream API credentials off user machines entirely.
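The pattern is easiest to see in code. Here is a minimal, self-contained sketch — illustrative names and in-memory stand-ins, not Ooty's actual implementation — showing the key property: the client presents only a revocable license key, and the upstream credential is looked up and used entirely server-side.

```typescript
type ToolCall = { tool: string; args: Record<string, unknown> };

// Server-side state the client never sees (in-memory stand-ins for real stores)
const licenses = new Map([["lic_123", { userId: "u1", revoked: false }]]);
const upstreamTokens = new Map([["u1", "ya29.upstream-secret"]]);

// Stand-in for the real upstream API call
function callUpstream(token: string, call: ToolCall): { tool: string; rows: unknown[] } {
  if (token !== "ya29.upstream-secret") throw new Error("401 from upstream");
  return { tool: call.tool, rows: [] };
}

export function handleToolCall(licenseKey: string, call: ToolCall) {
  const lic = licenses.get(licenseKey);
  if (!lic || lic.revoked) throw new Error("401: invalid or revoked license");
  const token = upstreamTokens.get(lic.userId)!; // held and used server-side only
  return callUpstream(token, call);              // credential never leaves the server
}
```

Revoking a license key cuts off access without rotating any upstream credential — the two are decoupled by design.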

The Architecture

Three layers sit between your machine and the upstream APIs — credentials never leave our servers.

1. Your machine — Claude Desktop + MCP client

  • License key (revocable)
  • Session token (24h expiry)
  • No upstream API credentials

↓ HTTPS + JSON-RPC 2.0

2. ooty.io — Vercel Edge + Supabase

  • Auth layer — validate & rate-limit
  • Tool router — parse & route
  • Upstream proxy — decrypt tokens, call API

↓ Authenticated API calls

3. Upstream APIs — Google, Meta, Amazon, YouTube

  • OAuth tokens encrypted at rest
  • Minimum scopes per product
  • Data flows through, never stored

Typical round-trip: 400-800ms for cached sessions.

The Authentication Flow in Detail

When you start Claude and make your first tool call through an Ooty MCP server, here's what happens in the two seconds between asking a question and getting an answer, step by step.

  1. Tool call (client) — Claude calls get_search_console_data
  2. HTTP POST (client) — the MCP client sends the request to /api/mcp/octopus with a Bearer token
  3. Validate (server) — the auth layer checks the license key and creates or validates a session
  4. Rate limit (server) — per-session and per-license rate checks
  5. Route (server) — the tool router parses the call and validates params against the tool schema
  6. Proxy (server) — decrypt OAuth tokens, call the upstream API, format the response
  7. Response (client) — the JSON-RPC response flows back to Claude

The entire round-trip is typically 400-800ms for cached sessions, slightly longer for first-call session creation.
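On the wire, the client side of this flow is an ordinary JSON-RPC 2.0 `tools/call` request POSTed to the product endpoint. A sketch of what the MCP client sends — the argument names here are illustrative, not Ooty's actual tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_search_console_data",
    "arguments": { "site": "example.com", "days": 28 }
  }
}
```

Everything after this request — session validation, rate limiting, token decryption — happens server-side before the upstream API is touched.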

How OAuth Tokens Are Stored

When you connect your Google account through Ooty's OAuth flow:

  1. You authorise Ooty via Google's consent screen (we request only the scopes needed for that product)
  2. Google sends us an access token and refresh token
  3. We encrypt both tokens using AES-256-GCM with a key derived from your user ID and a server-side secret
  4. The encrypted tokens are stored in Supabase
  5. We never store tokens in plaintext
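The encrypt/decrypt step can be sketched with Node's standard crypto module. This is a hedged illustration of the scheme described above — HKDF to derive a per-user key from a server-side secret and the user ID, then AES-256-GCM — not Ooty's actual code; all names are ours.

```typescript
import { createCipheriv, createDecipheriv, hkdfSync, randomBytes } from "node:crypto";

// In production this would come from a secret manager, never the database
const SERVER_SECRET = "server-side-secret";

// Derive a 32-byte AES-256 key bound to this user via the HKDF "info" field
function deriveKey(userId: string): Buffer {
  return Buffer.from(hkdfSync("sha256", SERVER_SECRET, "", `oauth-token:${userId}`, 32));
}

export function encryptToken(userId: string, plaintext: string): string {
  const iv = randomBytes(12); // standard 96-bit GCM nonce, fresh per encryption
  const cipher = createCipheriv("aes-256-gcm", deriveKey(userId), iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Store iv + auth tag + ciphertext together as one opaque base64 string
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

export function decryptToken(userId: string, stored: string): string {
  const buf = Buffer.from(stored, "base64");
  const decipher = createDecipheriv("aes-256-gcm", deriveKey(userId), buf.subarray(0, 12));
  decipher.setAuthTag(buf.subarray(12, 28)); // GCM tag is 16 bytes
  return Buffer.concat([decipher.update(buf.subarray(28)), decipher.final()]).toString("utf8");
}
```

Because the key is derived per user, a token encrypted for one user fails GCM authentication if decrypted under another user's key — decryption throws rather than returning garbage.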

When making upstream API requests:

  1. We retrieve the encrypted tokens from Supabase
  2. Decrypt them in memory (they're never written to disk in plaintext)
  3. Use the access token for the API call
  4. If the access token is expired, use the refresh token to get a new one, then re-encrypt and store the updated tokens
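The refresh decision in step 4 reduces to an expiry check with a small safety margin. A hypothetical sketch — the actual refresh POST to the provider's token endpoint is stubbed out here, and the names are ours:

```typescript
type StoredTokens = { accessToken: string; refreshToken: string; expiresAt: number };

// Stand-in for POST https://oauth2.googleapis.com/token with grant_type=refresh_token
function refreshWithProvider(refreshToken: string): StoredTokens {
  return { accessToken: "new-access", refreshToken, expiresAt: Date.now() + 3_600_000 };
}

export function getUsableTokens(t: StoredTokens): StoredTokens {
  const skewMs = 60_000; // refresh a minute early to avoid expiry races mid-request
  return t.expiresAt - skewMs > Date.now() ? t : refreshWithProvider(t.refreshToken);
}
```

After a refresh, the updated tokens are re-encrypted and written back, so the stored record never lags behind what the provider considers current.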

This flow means your Google credentials are never accessible to anyone who compromises our database without also compromising the encryption keys -- which are separate secrets, not stored in the database.

What We Store and What We Don't

We store:

  • Your Ooty account credentials (email, hashed password)
  • Your license key records
  • Encrypted OAuth tokens for connected platforms
  • Session token records (for validation and revocation)
  • Usage logs (tool name, timestamp, duration -- not request payloads)
  • Rate limiting counters

We don't store:

  • The raw content of your marketing data
  • Your Search Console queries or Analytics data
  • Your ad creative or campaign content
  • Search results or API response payloads

Data flows through our servers but isn't persisted. We're a conduit, not a database. For a deeper look at the security considerations behind these decisions, see our MCP security guide.

The Six MCP Products

Each product connects to specific upstream APIs with minimum necessary OAuth scopes

Octopus

SEO & Search

Search Console

PageSpeed

Knowledge Graph

Indexing API

Falcon

Paid Advertising

Google Ads

Meta Ads

Echo

Social Media

Instagram

LinkedIn

X

Reddit

Iris

Video Analytics

YouTube Analytics

YouTube Data

Compass

Web Analytics

Google Analytics 4

Search Console

CrUX

Canopy

E-Commerce

Amazon PA-API

Keepa

Rainforest

Each product requests the minimum OAuth scopes needed for its tool set. Compass doesn't request Ads scopes. Iris doesn't request Search Console scopes. The scopes are deliberately narrow -- this is both a security best practice and a GDPR data minimisation requirement.
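As an illustration of what "narrow" means in practice, a per-product scope map might look like the following. The scope strings are real Google OAuth read-only scopes; the mapping itself is a hypothetical sketch, not Ooty's published scope list.

```typescript
// Hypothetical per-product scope map: read-only scopes only, and no product
// requests a scope another product needs.
const SCOPES: Record<string, string[]> = {
  compass: [
    "https://www.googleapis.com/auth/analytics.readonly",   // Google Analytics 4
    "https://www.googleapis.com/auth/webmasters.readonly",  // Search Console
  ],
  iris: [
    "https://www.googleapis.com/auth/yt-analytics.readonly", // YouTube Analytics
  ],
};
```

If a product's tools never touch a surface, its consent screen never asks for it — which is exactly what a user should verify on Google's consent screen before authorising.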

The Remote MCP Transport

Ooty uses the Streamable HTTP transport for MCP -- specifically the newer spec that supports both stateless and stateful connections. This means:

No local process required. Unlike stdio-based MCP servers that run as a process on your machine, Ooty's remote MCP servers run on our infrastructure. You point Claude at a URL and authenticate with a token. That's the entire client-side setup.
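Client-side, that setup is a few lines of config. The exact shape varies by MCP client and version, so treat this as an illustrative sketch rather than verbatim documentation:

```json
{
  "mcpServers": {
    "octopus": {
      "url": "https://ooty.io/api/mcp/octopus",
      "headers": { "Authorization": "Bearer <your-license-key>" }
    }
  }
}
```

There is nothing else to install or run locally — the URL and key are the whole integration surface.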

Works across machines. Your license key works on any machine where you're running an MCP-compatible client. No per-machine setup beyond pasting the URL and license key into your config.

Centrally updatable. When we add a new tool to Octopus or fix a bug, the update is live on our servers immediately. You don't need to update a local package.

Trade-off: network dependency. If our servers are down or there's network latency, your tool calls are affected. We run on Vercel's edge infrastructure with global distribution to minimise this, but it's a real dependency that a local-process architecture doesn't have.

Where the Architecture Has Trade-offs

It wouldn't be honest to present this as all upside. There are genuine trade-offs.

Trust surface. You're trusting Ooty with your marketing account credentials. We've designed the system to minimise risk (encryption, no payload storage, narrow scopes, revocable tokens), but you're trusting a company. If you're not comfortable with that trust relationship, a self-hosted architecture where you hold your own keys is the alternative.

Latency. Every tool call goes through our servers. Compared to a direct API call from your local machine, there's an extra network hop. In practice this adds 50-150ms, which is acceptable for interactive use but would matter in high-throughput automated systems. For when to use direct APIs instead, see our MCP vs API decision framework.

Scope to our products. The proxy architecture only supports the upstream APIs we've built proxy support for. If you need to connect Claude to an internal API or a platform we don't support, you'd need a different approach -- either a local MCP server you manage yourself or a custom MCP server built against the MCP spec directly.

No offline use. This is an internet-connected product. It doesn't work offline.

Why This Architecture Was Chosen

The alternative is what most early MCP tools do: give developers instructions for setting up their own API keys with every upstream service and running local MCP servers that hold those keys.

For developer audiences, this is completely appropriate. Developers are comfortable managing API keys, setting up OAuth apps, running local processes, and debugging credential issues.

For marketing professionals -- the audience Ooty is built for -- this creates too much friction. Setting up a Google Ads OAuth app, a Meta developer account, a YouTube API project, an Amazon PA-API application, all before seeing any value, is a significant barrier. Many marketers will stop before they start.

The proxy architecture means: sign up, choose your products, go through familiar OAuth consent screens, paste one URL and one key into Claude's config. That's it. No API key management, no OAuth app registration, no local process management. See how it works in practice in our getting started guide.

The trade-off is trusting us with your credentials. We've tried to make that trust as narrow as possible through the security design described above.

Transparency as a Design Principle

We publish how this works because we think users deserve to understand the systems they depend on. Security through obscurity isn't a design principle we believe in -- if the architecture can only be trusted when users don't understand it, we've built the wrong architecture.

If you have questions about the security model, want to see our data processing documentation, or have concerns about a specific aspect of how we handle credentials, reach out directly. We'd rather explain the design than have you make decisions based on assumptions.


Questions about the architecture or security model? Write to us at security@ooty.io.


Written by

Finn Hartley

Product Lead at Ooty. Writes about MCP architecture, security, and developer tooling.

