Phase 6: Multi-Tenancy & Production Hardening (PLANNED)
Timeline: Week 10-12
Status: ⚠️ PLANNED - NOT YET IMPLEMENTED
Prerequisites: All functional phases (0-5) complete
Objective: Prepare for multiple clients running simultaneously with proper security, deployment, and testing.
Current Production State (As-Is)
Implemented
- ✅ GoClaw managed mode configured
- ✅ Supabase multi-tenancy (separate customer records)
- ✅ Basic auth via Supabase JWT
- ✅ Docker Compose dev environment
- ✅ MCP server for tool calls
- ✅ Sentry integration configured
- ✅ Cloudflare Pages deployment setup
Not Yet Implemented
- ❌ Row Level Security (RLS) policies
- ❌ Per-client rate limiting
- ❌ Workspace isolation per tenant
- ❌ Audit logging
- ❌ Encrypted credential storage
- ❌ OAuth flow security hardening
- ❌ Production Docker Compose
- ❌ Health check endpoints
- ❌ Structured logging
- ❌ Backup automation
- ❌ Comprehensive test suite
6.1 — Multi-Tenant Isolation (Planned)
Tasks (To Be Implemented)
- [ ] Implement Row Level Security (RLS) on all tables
- [ ] Per-client LLM rate limiting
- [ ] Workspace isolation per client
- [ ] Audit logging for all actions
- [ ] Verify GoClaw tenant isolation
GoClaw Managed Mode Setup
Already configured in Phase 0, but verify:
{
  "database": {
    "mode": "managed",
    "postgres_dsn": "${GOCLAW_POSTGRES_DSN}"
  }
}
GoClaw uses Supabase's multi-tenancy via:
- Session isolation (separate sessions per client)
- Memory isolation (embeddings scoped by client_id)
- Workspace paths per tenant
Row Level Security Policies
Create front-end/supabase/migrations/20260315_rls_policies.sql:
-- Create the mapping and role tables first: CREATE POLICY validates the
-- relations referenced in its USING clause, so these must exist before the
-- policies below.

-- Maps auth users to the customers they may access
CREATE TABLE IF NOT EXISTS user_customer_mapping (
  id SERIAL PRIMARY KEY,
  user_id UUID REFERENCES auth.users(id) ON DELETE CASCADE,
  customer_id INTEGER REFERENCES customer_customer(id) ON DELETE CASCADE,
  role TEXT DEFAULT 'member' CHECK (role IN ('owner', 'admin', 'member')),
  created_at TIMESTAMPTZ DEFAULT NOW(),
  UNIQUE(user_id, customer_id)
);

-- System-wide roles
CREATE TABLE IF NOT EXISTS user_roles (
  id SERIAL PRIMARY KEY,
  user_id UUID REFERENCES auth.users(id) ON DELETE CASCADE,
  role TEXT NOT NULL CHECK (role IN ('admin', 'manager', 'viewer')),
  granted_at TIMESTAMPTZ DEFAULT NOW(),
  UNIQUE(user_id, role)
);

-- Enable RLS on all client-facing tables
ALTER TABLE customer_customer ENABLE ROW LEVEL SECURITY;
ALTER TABLE customer_campaign ENABLE ROW LEVEL SECURITY;
ALTER TABLE customer_posts ENABLE ROW LEVEL SECURITY;
ALTER TABLE customer_platform_post ENABLE ROW LEVEL SECURITY;
ALTER TABLE customer_email_template ENABLE ROW LEVEL SECURITY;

-- Create policies that enforce client isolation.
-- Policy: users can only see their own client's data.
-- Assumes auth.uid() is mapped to a customer_id via the user_customer_mapping table.

-- customer_customer policy
CREATE POLICY customer_isolation ON customer_customer
  FOR ALL
  USING (
    id IN (
      SELECT customer_id
      FROM user_customer_mapping
      WHERE user_id = auth.uid()
    )
  );

-- customer_campaign policy
CREATE POLICY campaign_isolation ON customer_campaign
  FOR ALL
  USING (
    customer_id IN (
      SELECT customer_id
      FROM user_customer_mapping
      WHERE user_id = auth.uid()
    )
  );

-- customer_posts policy
CREATE POLICY post_isolation ON customer_posts
  FOR ALL
  USING (
    customer_id IN (
      SELECT customer_id
      FROM user_customer_mapping
      WHERE user_id = auth.uid()
    )
  );

-- customer_platform_post policy (scoped via customer_posts)
CREATE POLICY platform_post_isolation ON customer_platform_post
  FOR ALL
  USING (
    customer_post_id IN (
      SELECT id FROM customer_posts
      WHERE customer_id IN (
        SELECT customer_id
        FROM user_customer_mapping
        WHERE user_id = auth.uid()
      )
    )
  );

-- customer_email_template policy
CREATE POLICY email_template_isolation ON customer_email_template
  FOR ALL
  USING (
    customer_id IN (
      SELECT customer_id
      FROM user_customer_mapping
      WHERE user_id = auth.uid()
    )
  );

-- Admin override: policies are permissive by default, so this ORs with the
-- isolation policy and lets users with the admin role see everything
CREATE POLICY admin_full_access ON customer_customer
  FOR ALL
  USING (
    EXISTS (
      SELECT 1 FROM user_roles
      WHERE user_id = auth.uid() AND role = 'admin'
    )
  );
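Before wiring these policies into Supabase, the intended access rules can be unit-tested as a pure TypeScript model of the USING clauses above. This is a sketch only; the row shapes are illustrative stand-ins for the real tables:

```typescript
// Pure model of the customer_isolation and admin_full_access policies,
// useful for testing the intended rules before committing them to SQL.
interface Mapping { user_id: string; customer_id: number }
interface Role { user_id: string; role: string }

export function canSeeCustomer(
  userId: string,
  customerId: number,
  mappings: Mapping[],
  roles: Role[] = []
): boolean {
  // admin_full_access: admins see every row (policies OR together)
  if (roles.some((r) => r.user_id === userId && r.role === 'admin')) return true
  // customer_isolation: the row's id must appear in the user's mappings
  return mappings.some((m) => m.user_id === userId && m.customer_id === customerId)
}
```

The same cases should then be re-verified against the live database with two distinct JWTs, since this model only documents intent.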
Per-Client LLM Rate Limiting
Add to api/src/utils/rate-limiter.ts:
import { createClient } from 'redis'

const redis = createClient({ url: process.env.REDIS_URL })
await redis.connect()

export async function checkRateLimit(
  customerId: number,
  endpoint: string,
  limit: number = 10,
  windowSeconds: number = 60
): Promise<{ allowed: boolean; remaining: number }> {
  const key = `ratelimit:${customerId}:${endpoint}`
  const count = await redis.incr(key)
  if (count === 1) {
    await redis.expire(key, windowSeconds)
  }
  const remaining = Math.max(0, limit - count)
  return {
    allowed: count <= limit,
    remaining,
  }
}
Use in API routes:
app.post('/generate_post', async (request, reply) => {
  const { customer_id } = request.body
  const { allowed, remaining } = await checkRateLimit(customer_id, 'generate_post', 50, 3600)
  reply.header('X-RateLimit-Remaining', remaining)
  if (!allowed) {
    return reply.code(429).send({
      error: 'Rate limit exceeded. Please try again later.',
      retry_after: 3600, // seconds: the window length (use the key's TTL for a tighter value)
    })
  }
  // Continue with generation...
})
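For unit tests where a Redis instance is unavailable, the same fixed-window logic can be mirrored in memory. A sketch (the class name and injectable clock are illustrative, not part of the planned API):

```typescript
// In-memory mirror of the Redis fixed-window limiter: one counter per key,
// reset when the window elapses. The clock is injectable for deterministic tests.
type WindowState = { count: number; resetAt: number }

export class FixedWindowLimiter {
  private windows = new Map<string, WindowState>()

  constructor(
    private limit: number,
    private windowMs: number,
    private now: () => number = Date.now
  ) {}

  check(key: string): { allowed: boolean; remaining: number } {
    const t = this.now()
    let w = this.windows.get(key)
    // Start a fresh window when none exists or the old one has expired
    if (!w || t >= w.resetAt) {
      w = { count: 0, resetAt: t + this.windowMs }
      this.windows.set(key, w)
    }
    w.count += 1
    return { allowed: w.count <= this.limit, remaining: Math.max(0, this.limit - w.count) }
  }
}
```

Note that both this and the Redis version share fixed-window semantics: a burst straddling two windows can briefly see up to 2x the limit, which is usually acceptable for LLM cost control.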
Workspace Isolation
Ensure each client gets isolated GoClaw workspace:
{
  "agents": {
    "defaults": {
      "workspace": "/app/workspace/${client_id}",
      "restrict_to_workspace": true
    }
  }
}
Create workspace directories on client creation:
import { promises as fs } from 'node:fs'

app.post('/client', async (request, reply) => {
  const { name, description } = request.body
  const { data: client, error } = await supabase
    .from('customer_customer')
    .insert({ name, description })
    .select()
    .single()
  if (error) {
    return reply.code(500).send({ error: error.message })
  }
  // Create an isolated workspace directory for the new client
  const workspacePath = `/app/workspace/${client.id}`
  await fs.mkdir(workspacePath, { recursive: true })
  return client
})
Audit Logging
Create table:
CREATE TABLE IF NOT EXISTS audit_log (
  id SERIAL PRIMARY KEY,
  user_id UUID REFERENCES auth.users(id),
  customer_id INTEGER REFERENCES customer_customer(id),
  action TEXT NOT NULL,
  resource_type TEXT NOT NULL,
  resource_id INTEGER,
  changes JSONB,
  ip_address INET,
  user_agent TEXT,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX idx_audit_log_customer ON audit_log(customer_id);
CREATE INDEX idx_audit_log_user ON audit_log(user_id);
CREATE INDEX idx_audit_log_created ON audit_log(created_at DESC);
Middleware for logging:
app.addHook('onResponse', async (request, reply) => {
  if (request.user && reply.statusCode < 400) {
    await supabase.from('audit_log').insert({
      user_id: request.user.id,
      customer_id: request.body?.customer_id,
      action: `${request.method} ${request.url}`,
      resource_type: extractResourceType(request.url),
      resource_id: extractResourceId(request.url),
      ip_address: request.ip,
      user_agent: request.headers['user-agent'],
    })
  }
})
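The hook above calls `extractResourceType` and `extractResourceId`, which are not defined anywhere yet. One possible implementation, assuming routes follow the `/resource/:id/...` shape used elsewhere in this plan:

```typescript
// Hypothetical helpers for the audit hook above; adjust the parsing to the
// actual route layout before relying on them.
export function extractResourceType(url: string): string {
  // "/client/42/context" -> "client"; drop any query string first
  const path = url.split('?')[0]
  const segments = path.split('/').filter(Boolean)
  return segments[0] ?? 'unknown'
}

export function extractResourceId(url: string): number | null {
  const path = url.split('?')[0]
  const segments = path.split('/').filter(Boolean)
  // Treat the first purely numeric path segment as the resource id
  for (const s of segments) {
    if (/^\d+$/.test(s)) return parseInt(s, 10)
  }
  return null
}
```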
6.2 — Security & Secrets (Planned)
Tasks (To Be Implemented)
- [ ] Move all API keys to Supabase Vault
- [ ] Implement OAuth flows for social platforms
- [ ] Secure Telegram pairing with confirmation codes
- [ ] Encrypt sensitive data at rest
- [ ] Security headers and CORS hardening
Supabase Vault for Secrets
Create table for encrypted credentials:
CREATE TABLE IF NOT EXISTS client_integrations (
  id SERIAL PRIMARY KEY,
  customer_id INTEGER REFERENCES customer_customer(id) ON DELETE CASCADE,
  platform TEXT NOT NULL CHECK (platform IN ('twitter', 'linkedin', 'facebook', 'mailchimp', 'ghost')),
  credentials JSONB NOT NULL, -- encrypted via pgsodium below
  oauth_refresh_token TEXT,
  oauth_expires_at TIMESTAMPTZ,
  is_active BOOLEAN DEFAULT TRUE,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW(),
  UNIQUE(customer_id, platform)
);

-- Use pgsodium (the extension behind Supabase Vault) to encrypt the credentials
-- column. Sketch only: crypto_aead_det_encrypt returns bytea (hence the bytea
-- column type) and in practice also takes associated data and a key id --
-- check the pgsodium docs before running this.
CREATE EXTENSION IF NOT EXISTS pgsodium;

ALTER TABLE client_integrations
  ALTER COLUMN credentials TYPE bytea
  USING pgsodium.crypto_aead_det_encrypt(credentials::text::bytea);
OAuth Flow Implementation
Install: pnpm add simple-oauth2
Create api/src/integrations/oauth.ts:
import crypto from 'node:crypto'
import { AuthorizationCode } from 'simple-oauth2'
// App-local modules -- adjust the paths to your project layout
import { env } from '../env.js'
import { supabase } from '../db.js'

const twitterOAuth = new AuthorizationCode({
  client: {
    id: env.TWITTER_CLIENT_ID,
    secret: env.TWITTER_CLIENT_SECRET,
  },
  auth: {
    tokenHost: 'https://api.twitter.com',
    tokenPath: '/2/oauth2/token',
    authorizePath: '/2/oauth2/authorize',
  },
})

export async function getTwitterAuthUrl(customerId: number): Promise<string> {
  const state = crypto.randomBytes(16).toString('hex')
  // Store state for verification in the callback
  await supabase.from('oauth_states').insert({
    customer_id: customerId,
    platform: 'twitter',
    state,
    expires_at: new Date(Date.now() + 10 * 60 * 1000), // 10 min
  })
  return twitterOAuth.authorizeURL({
    redirect_uri: `${env.API_BASE_URL}/oauth/twitter/callback`,
    scope: 'tweet.read tweet.write users.read',
    state,
  })
}

export async function handleTwitterCallback(code: string, state: string) {
  // Verify state and reject expired records
  const { data: stateRecord } = await supabase
    .from('oauth_states')
    .select('*')
    .eq('state', state)
    .eq('platform', 'twitter')
    .single()
  if (!stateRecord || new Date(stateRecord.expires_at) < new Date()) {
    throw new Error('Invalid or expired state')
  }

  // Exchange code for token
  const result = await twitterOAuth.getToken({
    code,
    redirect_uri: `${env.API_BASE_URL}/oauth/twitter/callback`,
  })
  const token = result.token

  // Store credentials (encrypted at rest via the Vault setup above)
  await supabase.from('client_integrations').insert({
    customer_id: stateRecord.customer_id,
    platform: 'twitter',
    credentials: {
      access_token: token.access_token,
    },
    oauth_refresh_token: token.refresh_token,
    oauth_expires_at: new Date(Date.now() + token.expires_in * 1000),
  })
  return { success: true }
}
Add routes:
app.get('/oauth/twitter/authorize/:customer_id', async (request, reply) => {
  const { customer_id } = request.params
  const authUrl = await getTwitterAuthUrl(parseInt(customer_id))
  reply.redirect(authUrl)
})

app.get('/oauth/twitter/callback', async (request, reply) => {
  const { code, state } = request.query
  await handleTwitterCallback(code as string, state as string)
  reply.redirect('/app/settings/integrations?success=twitter')
})
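Since `oauth_expires_at` is stored, the API will eventually need to decide when to refresh a token before calling the platform. A small helper for that decision, with a safety skew so a token is never used right at its expiry (name and defaults are illustrative):

```typescript
// Decide whether a stored OAuth token needs refreshing. The skew guards
// against using a token that expires mid-request; the clock is injectable
// for testing.
export function tokenNeedsRefresh(
  expiresAt: string | Date | null,
  skewMs = 60_000,
  now: () => number = Date.now
): boolean {
  if (!expiresAt) return true // no recorded expiry: refresh to be safe
  const expiry = new Date(expiresAt).getTime()
  return expiry - skewMs <= now()
}
```

A refresh path would then use `oauth_refresh_token` via `twitterOAuth` and update the row; that flow is not sketched here.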
Secure Telegram Pairing
Generate pairing codes:
import crypto from 'node:crypto'

app.post<{ Params: { id: string } }>(
  '/client/:id/generate_pairing_code',
  async (request, reply) => {
    const { id } = request.params
    // Generate a 6-digit code with a CSPRNG (Math.random is not suitable for secrets)
    const code = crypto.randomInt(100000, 1000000).toString()
    // Store with expiration
    await supabase.from('telegram_pairing_codes').insert({
      customer_id: parseInt(id),
      code,
      expires_at: new Date(Date.now() + 15 * 60 * 1000), // 15 min
    })
    return { code }
  }
)
Liaison agent verifies code during pairing:
## Secure Pairing Flow
1. User sends `/start`
2. You: "To get started, please enter your 6-digit pairing code. You can get this from your dashboard at [url]"
3. User: "123456"
4. You: Call web_fetch POST /telegram/pair with { code: "123456", telegram_chat_id: [...] }
5. If valid → paired
6. If invalid → "Code not found or expired. Please generate a new code from your dashboard."
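The server-side half of step 4 reduces to a small pure check: reject expired codes, then compare the submitted code in constant time so response timing leaks nothing about partial matches. A sketch (the record shape mirrors the telegram_pairing_codes table above; names are illustrative):

```typescript
import { timingSafeEqual } from 'node:crypto'

// Validate a submitted pairing code against a stored record.
export interface PairingRecord {
  code: string
  expires_at: string // ISO timestamp
}

export function validatePairingCode(
  submitted: string,
  record: PairingRecord,
  now: () => number = Date.now
): boolean {
  // Expired codes are always rejected
  if (now() >= new Date(record.expires_at).getTime()) return false
  const a = Buffer.from(submitted)
  const b = Buffer.from(record.code)
  // timingSafeEqual throws on length mismatch, so compare lengths first
  return a.length === b.length && timingSafeEqual(a, b)
}
```

The POST /telegram/pair handler would look the record up by customer, run this check, and delete the code row on success so each code is single-use.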
Security Headers
Add to api/src/index.ts:
import helmet from '@fastify/helmet'

await app.register(helmet, {
  contentSecurityPolicy: {
    directives: {
      defaultSrc: ["'self'"],
      styleSrc: ["'self'", "'unsafe-inline'"],
      scriptSrc: ["'self'"],
      imgSrc: ["'self'", 'data:', 'https:'],
    },
  },
  hsts: {
    maxAge: 31536000,
    includeSubDomains: true,
  },
})
Update CORS:
await app.register(cors, {
  origin: [env.FRONTEND_URL, env.DASHBOARD_URL],
  credentials: true,
  methods: ['GET', 'POST', 'PUT', 'DELETE'],
  allowedHeaders: ['Content-Type', 'Authorization'],
})
6.3 — Deployment (Planned)
Tasks (To Be Implemented)
- [ ] Create production Docker Compose config
- [ ] Set up health check endpoints
- [ ] Implement structured logging
- [ ] Configure error alerting (Sentry)
- [ ] Backup strategy for workspaces
Production Docker Compose
Create docker-compose.prod.yml:
version: '3.8'

services:
  goclaw:
    image: ai-marketing-goclaw:latest
    build:
      context: ./goclaw
      dockerfile: Dockerfile
    restart: unless-stopped
    ports:
      - '18790:18790'
    environment:
      - GOCLAW_HOST=0.0.0.0
      - GOCLAW_PORT=18790
      - GOCLAW_POSTGRES_DSN=${GOCLAW_POSTGRES_DSN}
      - GOCLAW_PROVIDER=anthropic
      - GOCLAW_ANTHROPIC_API_KEY=${GOCLAW_ANTHROPIC_API_KEY}
      - GOCLAW_GEMINI_API_KEY=${GOCLAW_GEMINI_API_KEY}
      - GOCLAW_TELEGRAM_TOKEN=${GOCLAW_TELEGRAM_TOKEN}
    volumes:
      - goclaw-data:/app/data
      - goclaw-workspace:/app/workspace
    networks:
      - ai-marketing-network
    healthcheck:
      test: ['CMD', 'curl', '-f', 'http://localhost:18790/health']
      interval: 30s
      timeout: 10s
      retries: 3

  api:
    image: ai-marketing-api:latest
    build:
      context: ./api
      dockerfile: Dockerfile
    restart: unless-stopped
    ports:
      - '8000:8000'
    environment:
      - NODE_ENV=production
      - SUPABASE_URL=${SUPABASE_URL}
      - SUPABASE_KEY=${SUPABASE_KEY}
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
      - GOCLAW_API_URL=http://goclaw:18790
      - REDIS_URL=${REDIS_URL}
      - SENTRY_DSN=${SENTRY_DSN}
    networks:
      - ai-marketing-network
    depends_on:
      - goclaw
    healthcheck:
      test: ['CMD', 'curl', '-f', 'http://localhost:8000/health']
      interval: 30s
      timeout: 10s
      retries: 3

  redis:
    image: redis:7-alpine
    restart: unless-stopped
    ports:
      - '6379:6379'
    networks:
      - ai-marketing-network
    volumes:
      - redis-data:/data

volumes:
  goclaw-data:
  goclaw-workspace:
  redis-data:

networks:
  ai-marketing-network:
    driver: bridge
Health Check Endpoints
Add to api/src/index.ts:
app.get('/health', async (request, reply) => {
  // Check Supabase connection
  const { error: dbError } = await supabase.from('customer_customer').select('id').limit(1)

  // Check Redis connection (if using)
  let redisOk = true
  try {
    await redis.ping()
  } catch {
    redisOk = false
  }

  const healthy = !dbError && redisOk
  return reply.code(healthy ? 200 : 503).send({
    status: healthy ? 'healthy' : 'unhealthy',
    timestamp: new Date().toISOString(),
    checks: {
      database: !dbError,
      redis: redisOk,
    },
  })
})
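As more dependencies get checked (GoClaw, the MCP server), the status/code decision can be factored into a pure helper so it stays unit-testable. A small sketch (the name is illustrative):

```typescript
// Collapse a map of named checks into an overall status and HTTP code:
// every check must pass for the service to report healthy.
export function summarizeHealth(
  checks: Record<string, boolean>
): { status: 'healthy' | 'unhealthy'; code: number } {
  const healthy = Object.values(checks).every(Boolean)
  return { status: healthy ? 'healthy' : 'unhealthy', code: healthy ? 200 : 503 }
}
```

The /health handler would then call `summarizeHealth({ database: !dbError, redis: redisOk })` and send the result.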
Structured Logging
Update api/src/index.ts:
const app = fastify({
  logger: {
    level: env.NODE_ENV === 'production' ? 'info' : 'debug',
    transport:
      env.NODE_ENV === 'production'
        ? undefined
        : {
            target: 'pino-pretty',
          },
    serializers: {
      req: (req) => ({
        method: req.method,
        url: req.url,
        headers: {
          host: req.headers.host,
          'user-agent': req.headers['user-agent'],
        },
      }),
      res: (res) => ({
        statusCode: res.statusCode,
      }),
    },
  },
})

// Attach a request ID to every log line and response
app.addHook('onRequest', (request, reply, done) => {
  const requestId = crypto.randomUUID()
  request.log = request.log.child({ requestId })
  reply.header('X-Request-ID', requestId)
  done()
})
Error Alerting
Install: pnpm add @sentry/node
import * as Sentry from '@sentry/node'

if (env.NODE_ENV === 'production' && env.SENTRY_DSN) {
  Sentry.init({
    dsn: env.SENTRY_DSN,
    environment: env.NODE_ENV,
    tracesSampleRate: 0.1,
  })

  app.setErrorHandler((error, request, reply) => {
    Sentry.captureException(error, {
      extra: {
        url: request.url,
        method: request.method,
        body: request.body,
      },
    })
    request.log.error(error)
    reply.code(500).send({ error: 'Internal server error' })
  })
}
Backup Strategy
Create backup script scripts/backup-workspaces.sh:
#!/bin/bash
set -euo pipefail

DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_DIR="/backups"
WORKSPACE_DIR="/app/workspace"

# Tar and compress the workspace
tar -czf "$BACKUP_DIR/workspace_$DATE.tar.gz" -C "$WORKSPACE_DIR" .

# Upload to S3 (or your cloud storage)
aws s3 cp "$BACKUP_DIR/workspace_$DATE.tar.gz" s3://your-bucket/backups/

# Remove local backups older than 30 days
find "$BACKUP_DIR" -name "workspace_*.tar.gz" -mtime +30 -delete

echo "Backup completed: workspace_$DATE.tar.gz"
Add to cron: 0 2 * * * /app/scripts/backup-workspaces.sh
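If the retention step is ever ported from `find` to Node, the 30-day rule can be expressed as a pure function over the `workspace_<date>_<time>.tar.gz` names the script produces, which makes the cutoff logic testable. A sketch (the name is illustrative):

```typescript
// Return the backup filenames older than the retention window, parsing the
// workspace_YYYYMMDD_HHMMSS.tar.gz pattern used by the backup script.
export function expiredBackups(
  filenames: string[],
  now: Date,
  retentionDays = 30
): string[] {
  const cutoff = now.getTime() - retentionDays * 24 * 60 * 60 * 1000
  return filenames.filter((name) => {
    const m = name.match(/^workspace_(\d{8})_(\d{6})\.tar\.gz$/)
    if (!m) return false // leave unrecognized files alone
    const [, d, t] = m
    const ts = Date.parse(
      `${d.slice(0, 4)}-${d.slice(4, 6)}-${d.slice(6, 8)}T${t.slice(0, 2)}:${t.slice(2, 4)}:${t.slice(4, 6)}Z`
    )
    return ts < cutoff
  })
}
```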
6.4 — Testing (Planned)
Tasks (To Be Implemented)
- [ ] API integration tests for all endpoints
- [ ] GoClaw agent conversation tests
- [ ] End-to-end test (Telegram → publish)
- [ ] Load testing
API Integration Tests
Create api/tests/integration/endpoints.test.ts:
import { test, describe, before, after } from 'node:test'
import assert from 'node:assert'
import { startServer } from '../../src/server.js'

let app: any

// node:test exposes before/after (not beforeAll/afterAll)
before(async () => {
  app = await startServer()
})

after(async () => {
  await app.close()
})

describe('Client Context Endpoint', () => {
  test('GET /client/:id/context returns client data', async () => {
    const response = await app.inject({
      method: 'GET',
      url: '/client/1/context',
      headers: { 'x-api-key': process.env.API_KEY },
    })
    assert.strictEqual(response.statusCode, 200)
    const data = JSON.parse(response.body)
    assert.ok(data.client_id)
    assert.ok(data.name)
    assert.ok(data.brand_voice)
  })
})

describe('Search Images Endpoint', () => {
  test('POST /search_images returns Unsplash URLs', async () => {
    const response = await app.inject({
      method: 'POST',
      url: '/search_images',
      headers: {
        'x-api-key': process.env.API_KEY,
        'content-type': 'application/json',
      },
      body: JSON.stringify({ query: 'coffee', count: 3 }),
    })
    assert.strictEqual(response.statusCode, 200)
    const data = JSON.parse(response.body)
    assert.strictEqual(data.images.length, 3)
  })
})
Run: node --test tests/integration/
Load Testing
Install: pnpm add -D autocannon
Create tests/load/generate-post.js:
import autocannon from 'autocannon'

const result = await autocannon({
  url: 'http://localhost:8000/generate_post',
  method: 'POST',
  headers: {
    'x-api-key': process.env.API_KEY,
    'content-type': 'application/json',
  },
  body: JSON.stringify({
    customer_id: 1,
    platforms: ['twitter'],
    prompt: 'Test post',
  }),
  connections: 10,
  duration: 30,
})

console.log(result)
Run: node tests/load/generate-post.js
Verification Checklist
- [ ] RLS policies enforce tenant isolation
- [ ] Rate limiting prevents abuse
- [ ] OAuth flows connect social accounts securely
- [ ] Telegram pairing requires confirmation codes
- [ ] All API keys stored in Supabase Vault
- [ ] Production Docker Compose starts all services
- [ ] Health checks return 200 for healthy services
- [ ] Structured logs include request IDs
- [ ] Sentry captures and reports errors
- [ ] Workspace backups run nightly
- [ ] Integration tests pass for all endpoints
- [ ] Load tests show acceptable performance
Next Steps
Once Phase 6 lands, the migration is complete. Monitor production metrics, gather user feedback, and iterate on features. Consider:
- Adding more social platforms
- Implementing A/B testing for content
- Building client-specific reporting dashboards
- Expanding proactive intelligence (competitor monitoring, trend detection)
- Multi-language support for international clients