
Multi-tenant architecture

Every user belongs to a tenant, and all data — connections, ingested content, graph nodes, chat sessions, observations — is scoped to that tenant via a tenant_id column that’s enforced on every query path. Tenants cannot access each other’s data.
Project-level isolation layers on top: within a tenant, data is further scoped to projects so teams can keep separate knowledge bases within the same org.
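The scoping described above can be sketched as a query pattern: every tenant-scoped read filters on tenant_id first, then project_id. A minimal sketch, using an in-memory SQLite table with illustrative names (not Fabric's actual schema):

```python
import sqlite3

# Hypothetical schema: every tenant-scoped table carries tenant_id and,
# where applicable, project_id. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE graph_nodes (id INTEGER, tenant_id TEXT, project_id TEXT, label TEXT)"
)
conn.executemany(
    "INSERT INTO graph_nodes VALUES (?, ?, ?, ?)",
    [
        (1, "tenant-a", "proj-1", "Invoice"),
        (2, "tenant-a", "proj-2", "Contract"),
        (3, "tenant-b", "proj-1", "Invoice"),
    ],
)

def nodes_for(tenant_id: str, project_id: str) -> list[str]:
    """Every read is scoped by tenant_id first, then project_id."""
    rows = conn.execute(
        "SELECT label FROM graph_nodes WHERE tenant_id = ? AND project_id = ?",
        (tenant_id, project_id),
    ).fetchall()
    return [label for (label,) in rows]

print(nodes_for("tenant-a", "proj-1"))  # → ['Invoice']
```

Because the filter is part of the query itself rather than application-level post-filtering, a request scoped to one tenant can never return another tenant's rows.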

Authentication

Email + password

JWT-based sessions, stored server-side and invalidated on logout.

Google OAuth

Single-click sign-in. Google credentials never stored; Fabric issues its own session token after Google verifies identity.

API keys

Tenant- and project-scoped. Issued under your account for programmatic access.

Email verification

Required for new accounts (configurable). An admin-enabled dev mode allows local development without auth.

Credential storage

OAuth tokens (Gmail, Drive, Slack, Fireflies)

Stored encrypted at rest. Refreshed automatically before expiry. Revoking access in the provider’s interface immediately stops Fabric from syncing that connector.
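"Refreshed automatically before expiry" means the refresh is proactive, triggered by a time margin rather than by a failed API call. A hedged sketch of that logic with the provider call stubbed out (the margin and names are assumptions, not Fabric's actual values):

```python
import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class OAuthToken:
    access_token: str
    refresh_token: str
    expires_at: float  # unix timestamp

REFRESH_MARGIN_S = 300  # assumed margin: refresh 5 minutes early

def ensure_fresh(token: OAuthToken,
                 refresh: Callable[[str], OAuthToken]) -> OAuthToken:
    """Refresh before expiry rather than reacting to a 401 mid-sync."""
    if token.expires_at - time.time() > REFRESH_MARGIN_S:
        return token  # still comfortably valid
    return refresh(token.refresh_token)  # provider exchange, stubbed here
```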

Database and IMAP credentials

Stored with per-tenant AES-256 encryption. Used only during sync and query execution. Never logged, never transmitted to third parties.
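Per-tenant encryption implies a distinct key per tenant derived from (or wrapped by) a master secret. The stdlib sketch below shows only the key-derivation half, HKDF-style via HMAC-SHA256; the AES-256-GCM encryption itself would use a crypto library and is omitted. All names here are illustrative:

```python
import hashlib
import hmac

# In production the master key comes from a secret store, never hard-coded;
# the zero key below exists only to make this sketch runnable.
MASTER_KEY = b"\x00" * 32

def tenant_key(tenant_id: str) -> bytes:
    """Derive a distinct 256-bit key per tenant from one master key."""
    return hmac.new(
        MASTER_KEY, f"tenant:{tenant_id}".encode(), hashlib.sha256
    ).digest()
```

The payoff of per-tenant keys: compromising ciphertext plus one tenant's key exposes only that tenant's credentials, and rotating a single tenant is possible without touching the others.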

Secret injection in production

1. Source of truth: AWS SSM Parameter Store. Secrets live in SSM as SecureString parameters at /copilot/fabric/{env}/secrets/*.

2. Pulled at container start. backend/scripts/start.sh runs aws ssm get-parameter for each secret before starting uvicorn, injecting the values into the container environment.

3. Never in images. Secrets are not baked into Docker images, not committed to git, and not present in CloudFormation templates.
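The pattern above can be sketched as a startup fragment. This is a hedged illustration, not the real backend/scripts/start.sh: the secret names, the COPILOT_ENVIRONMENT_NAME variable, and the uvicorn module path are assumptions.

```shell
#!/usr/bin/env sh
set -eu

ENV="${COPILOT_ENVIRONMENT_NAME:-prod}"

# Illustrative secret names; the real list lives under
# /copilot/fabric/{env}/secrets/* in SSM.
for name in DATABASE_URL JWT_SECRET LLM_API_KEY; do
  value=$(aws ssm get-parameter \
    --name "/copilot/fabric/${ENV}/secrets/${name}" \
    --with-decryption \
    --query 'Parameter.Value' \
    --output text)
  export "${name}=${value}"
done

# Secrets now exist only in this process's environment, never on disk
# or in the image. (Hypothetical app module path.)
exec uvicorn app.main:app --host 0.0.0.0 --port 8000
```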

Data residency

All data is stored on your own infrastructure. Fabric does not transmit your ingested content to external services, with one exception: the LLM used for knowledge extraction, observation extraction, and chat.
You configure which LLM endpoint to use — Anthropic, OpenAI, a private deployment, or a local model. The LLM endpoint is the only place your content leaves your infra, and you control it.
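As one concrete possibility, pointing the endpoint at a local OpenAI-compatible server keeps content entirely on-box. The variable names below are hypothetical, not Fabric's actual config keys:

```shell
# Illustrative configuration only. With a local model server, the "one
# exception" above disappears: content never leaves your infrastructure.
export LLM_BASE_URL="http://localhost:11434/v1"  # e.g. a local OpenAI-compatible server
export LLM_MODEL="llama3.1"
export LLM_API_KEY=""                            # often empty for local deployments
```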

API security

All endpoints require a valid session token or API key except the auth endpoints themselves. Tokens are validated on every request. Admin endpoints additionally require the admin role in the token claims.
CORS is configured per deployment; for production, restrict allowed origins to your frontend domain.
Rate limiting is applied at the ALB in AWS deployments; self-hosted deployments should add their own rate limiting at the reverse-proxy layer.
Tenant isolation is enforced in code: the tenant_id column appears in the WHERE clause of every production query that reads or writes tenant-scoped data. There’s no “global” query path.

Observability as a security feature

Langfuse traces every LLM call, embedding operation, and tool invocation — with tenant_id and project_id on every span. You can audit what the agent did, what data it saw, and what it cost.
Most competitors are black boxes. Fabric is inspectable. Structured logs include tenant context on every entry. In production these go to CloudWatch with retention policies; in dev, stdout.
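Attaching tenant context to every log entry is typically done once, at the logger level, rather than at each call site. A minimal sketch using the stdlib logging module (class and field names are illustrative, not Fabric's logger):

```python
import json
import logging

class TenantContextFilter(logging.Filter):
    """Stamp tenant/project context onto every record that passes through."""

    def __init__(self, tenant_id: str, project_id: str):
        super().__init__()
        self.tenant_id, self.project_id = tenant_id, project_id

    def filter(self, record: logging.LogRecord) -> bool:
        record.tenant_id = self.tenant_id
        record.project_id = self.project_id
        return True

class JsonFormatter(logging.Formatter):
    """Structured output: one JSON object per line, CloudWatch-friendly."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "message": record.getMessage(),
            "tenant_id": record.tenant_id,
            "project_id": record.project_id,
        })

logger = logging.getLogger("fabric.demo")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.addFilter(TenantContextFilter("tenant-a", "proj-1"))
logger.setLevel(logging.INFO)
logger.info("sync started")  # emitted with tenant_id/project_id attached
```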

What Fabric doesn’t do (yet)

SOC 2 / HIPAA

Not certified today.

Per-document ACLs

Tenant-level isolation today; sub-tenant ACLs on the roadmap.

Enterprise SSO

Google OAuth today; SAML and Okta on the roadmap.

Audit log API

Request logs exist; a dedicated audit export endpoint is on the roadmap as the enterprise surface expands.