Trust & Security
Last updated: May 8, 2026
This page is written for security and procurement teams evaluating PipeLedger AI as a vendor. It complements our Privacy Policy (focused on data subjects) and Terms of Service (license terms) by documenting our security posture, compliance roadmap, and operational controls. A Data Processing Agreement is available to Enterprise customers on request. See Section 9.
1. Compliance & Audit Roadmap
- SOC 2 Type 1. Targeted for Q3 2026. Engagement with an accredited auditor will be confirmed prior to scope finalization.
- SOC 2 Type 2. Targeted within twelve months of Type 1 issuance.
- GDPR & CCPA. PipeLedger acts as a Data Processor (see Privacy Policy §7). We support data subject access and deletion requests directed through the customer Administrator.
- HIPAA / PCI-DSS. Out of scope. PipeLedger is not designed to process Protected Health Information or payment card data; customer Administrators are responsible for ensuring such data is not extracted from upstream ERPs.
2. Sub-Processors
The following service providers process customer data on our behalf. We commit to providing 30 days’ advance notice of any material change to this list, including the addition of new sub-processors. Email alexander@pipeledger.ai to subscribe to sub-processor change notifications.
- Supabase. Authentication, organizational tables, audit logs, security rules. Region: `us-east-1` (Virginia, USA).
- Google Cloud. BigQuery (data warehouse), Cloud Run (workloads), Cloud Storage (DLQ + audit archive), Secret Manager (credentials). Region: `us-east4` (Virginia, USA).
- Vercel. Web application hosting and REST API delivery. The edge network is global; primary execution is in `iad1` (Virginia, USA).
- Sentry. Error monitoring and performance tracing. Receives application stack traces; financial data is excluded by `beforeSend` scrubbers and project-level data redaction. Region: United States.
- Dagster Cloud. Pipeline orchestration and scheduling. Receives pipeline metadata, asset names, and run timing; does not receive General Ledger row content. Region: United States.
3. Encryption & Data Protection
All customer data is protected by AES-256 encryption at rest and TLS 1.3 in transit, consistent with our Privacy Policy §5. Encryption keys for production datasets are managed by Google Cloud KMS and Supabase’s managed Postgres encryption layer; PipeLedger does not hold raw key material.
- Per-tenant data isolation. Each customer organization is provisioned its own BigQuery dataset prefix and Supabase Row Level Security (RLS) policies. Cross-tenant data access at the database layer is structurally impossible.
- HMAC tokenization secrets. Per-organization HMAC secrets used by masking operations are stored in Google Secret Manager, never in application code or environment variables.
- Append-only audit log. Every administrative action, security rule change, and data-egress event is recorded in an immutable audit trail enforced at the database layer (no UPDATE/DELETE rules).
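As an illustration of the HMAC-based tokenization described above, the following is a minimal sketch, not PipeLedger's implementation: the function name and `tok_` prefix are hypothetical, and production secrets live in Google Secret Manager rather than in code. It shows why per-organization secrets keep masked values deterministic within a tenant but non-joinable across tenants.

```python
import hashlib
import hmac

def mask_value(org_secret: bytes, raw_value: str) -> str:
    """Replace a sensitive field with a deterministic, non-reversible token.

    The same input under the same per-organization secret always yields the
    same token, so masked values stay joinable across tables within one
    tenant without exposing the underlying data.
    """
    digest = hmac.new(org_secret, raw_value.encode("utf-8"), hashlib.sha256)
    return "tok_" + digest.hexdigest()[:16]

# Different organizations hold different secrets, so identical inputs
# produce unrelated tokens across tenants.
org_a = mask_value(b"org-a-secret", "ACME Holdings Ltd")
org_b = mask_value(b"org-b-secret", "ACME Holdings Ltd")
assert org_a != org_b
assert org_a == mask_value(b"org-a-secret", "ACME Holdings Ltd")
```

Because HMAC is keyed, an attacker who sees only the masked output cannot brute-force the original value without the per-organization secret.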
4. Authentication & Access Controls
- Customer authentication. Email and password with bcrypt-hashed credentials, or Google OAuth, both via Supabase Auth. TOTP-based multi-factor authentication is available today; mandatory enrollment for every user account is on the security roadmap.
- API tokens. Outbound integration tokens are bcrypt-hashed at rest; the plaintext is shown only once at issuance and never persisted server-side. Tokens carry audience-binding metadata (RFC 8707) and are rejected when presented to a different resource.
- Optional IP allowlist. Each integration token may be scoped to a list of source IP addresses or CIDR ranges. Connections from outside the allowlist are rejected before any authorization or rate-limit logic runs.
- Three-dimensional authorization. Every integration token carries an explicit role (viewer / operator), scope (organization-wide / specific dimensions), and sensitivity ceiling (standard / restricted / highly restricted) enforced at query compilation. Customers configure these via the in-app Credentials interface.
- MCP spec 2025-11-25 conformance. The Model Context Protocol server (`mcp.pipeledger.ai/mcp`) implements RFC 9728 Protected Resource Metadata, RFC 8707 audience binding, Origin validation against a configured allowlist, `MCP-Protocol-Version` header negotiation, and `WWW-Authenticate` challenge headers on 401 responses.
- OAuth 2.1 + Dynamic Client Registration roadmap. Approved 2026-04-24; phased rollout in progress. PipeLedger acts as the authorization server; static tokens remain supported in parallel during the transition.
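The interaction between the IP allowlist and the three-dimensional authorization model can be sketched as follows. This is an illustrative model, not PipeLedger's code: the `TokenGrant` class, field names, and example CIDR ranges are all assumptions. It does show the ordering guarantee stated above, with the allowlist check running before any authorization logic.

```python
import ipaddress
from dataclasses import dataclass
from typing import FrozenSet, Optional, Tuple

# Sensitivity levels ordered least to most restricted; a token may read
# data at or below its ceiling.
SENSITIVITY = ("standard", "restricted", "highly_restricted")

@dataclass(frozen=True)
class TokenGrant:
    role: str                             # "viewer" or "operator"
    dimensions: Optional[FrozenSet[str]]  # None means organization-wide scope
    ceiling: str                          # highest sensitivity level readable
    allowlist: Tuple[str, ...] = ()       # optional source IPs / CIDR ranges

def ip_allowed(grant: TokenGrant, source_ip: str) -> bool:
    """Allowlist check: runs before any authorization or rate-limit logic."""
    if not grant.allowlist:
        return True  # no allowlist configured: accept any source
    addr = ipaddress.ip_address(source_ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in grant.allowlist)

def authorize(grant: TokenGrant, source_ip: str, action: str,
              dimension: str, sensitivity: str) -> bool:
    """Every check must pass: network origin, role, scope, sensitivity."""
    if not ip_allowed(grant, source_ip):
        return False
    if action == "write" and grant.role != "operator":
        return False  # viewers are read-only
    if grant.dimensions is not None and dimension not in grant.dimensions:
        return False  # outside the token's scoped dimensions
    return SENSITIVITY.index(sensitivity) <= SENSITIVITY.index(grant.ceiling)

viewer = TokenGrant(role="viewer",
                    dimensions=frozenset({"revenue"}),
                    ceiling="standard",
                    allowlist=("203.0.113.0/24",))

assert authorize(viewer, "203.0.113.5", "read", "revenue", "standard")
assert not authorize(viewer, "198.51.100.1", "read", "revenue", "standard")   # IP rejected
assert not authorize(viewer, "203.0.113.5", "read", "payroll", "standard")    # out of scope
assert not authorize(viewer, "203.0.113.5", "read", "revenue", "restricted")  # above ceiling
assert not authorize(viewer, "203.0.113.5", "write", "revenue", "standard")   # role too low
```

Because every check must pass independently, a token leaked outside its allowlisted network is useless, and even a valid connection can never read above its sensitivity ceiling.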
5. Vulnerability Management & Monitoring
- Dependency monitoring. GitHub Dependabot is enabled on all production repositories. High-severity advisories are triaged within 7 days; moderate within 30 days.
- Application error monitoring. Sentry is wired into every customer-facing service (web, MCP server, pipeline worker, orchestrator) with PII scrubbing on the egress path. Issues are reviewed daily.
- Uptime monitoring. Google Cloud Monitoring probes our public health endpoints from three geographic regions every 60 seconds. An alert fires when at least two of three regions report failure for three consecutive minutes.
- Service health endpoints. Both `app.pipeledger.ai/api/health` and `mcp.pipeledger.ai/health` can be probed without authentication, supporting customer-side uptime monitoring.
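The alerting rule above (at least two of three regions failing for three consecutive minutes, at one probe per minute) can be sketched as a simple quorum over sliding windows. The class and constants below are illustrative assumptions, not PipeLedger's monitoring code, which runs inside Google Cloud Monitoring.

```python
from collections import deque

PROBE_INTERVAL_SECONDS = 60  # one probe per region per minute
FAILURE_WINDOW = 3           # three consecutive failures ~= three minutes down
REGION_QUORUM = 2            # alert requires at least two of three regions

class UptimeAlerter:
    """Track recent probe results per region and decide when to page."""

    def __init__(self, regions):
        self.history = {r: deque(maxlen=FAILURE_WINDOW) for r in regions}

    def record(self, region, ok):
        self.history[region].append(ok)

    def region_down(self, region):
        h = self.history[region]
        # A region counts as down only after a full window of failures,
        # so a single dropped probe never pages anyone.
        return len(h) == FAILURE_WINDOW and not any(h)

    def should_alert(self):
        return sum(self.region_down(r) for r in self.history) >= REGION_QUORUM

alerter = UptimeAlerter(["region-a", "region-b", "region-c"])
for _ in range(3):
    alerter.record("region-a", False)
    alerter.record("region-b", False)
    alerter.record("region-c", True)
assert alerter.should_alert()  # two regions down for three consecutive probes
```

Requiring both consecutive failures and a multi-region quorum filters out transient network blips on any single probe path.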
6. Penetration Testing
PipeLedger has not yet completed a third-party penetration test. The first engagement is targeted for Q3 2026, scoped to the public web application, the REST API, and the MCP server. Findings will be remediated according to severity (critical and high within 30 days, moderate within 90 days). A summary of completed engagements will be published on this page after each test concludes.
7. Data Residency
All customer data is processed and stored in the United States (Virginia region). Specifically:
- Supabase Postgres in `us-east-1`.
- BigQuery datasets, Cloud Run workloads, Cloud Storage buckets, and Secret Manager entries in `us-east4`.
- Vercel application hosting in `iad1` (Washington, D.C. metro).
8. Incident Response & Breach Notification
In the event of a confirmed security incident affecting customer data, PipeLedger will notify the affected customer Administrator within 72 hours of confirmation. Notifications include the nature of the incident, the data classes affected (where known), the remediation steps taken, and the contact for follow-up.
Customers may report suspected vulnerabilities or security incidents to alexander@pipeledger.ai. We commit to acknowledging reports within one business day. A dedicated security@ alias will replace this address once organizational tooling is in place; mail to either address will reach the responsible party.
9. Data Processing Agreement
PipeLedger offers a standard Data Processing Agreement (DPA) for Enterprise customers covering our role as Data Processor, sub-processor commitments, retention and deletion obligations, security commitments, breach notification timing, audit rights, and US data residency. Bilateral amendments are accommodated.
To request a copy of the DPA for legal review, email alexander@pipeledger.ai with your organization name and procurement contact. We typically return a redlined version within two business days.
10. Contact
Security inquiries, vulnerability reports, and procurement questionnaires: alexander@pipeledger.ai.