Your keys live in your vault. LLM billing runs through our gateway; everything else never leaves your Mac.
Every tool asks for your HubSpot / SES / Apify / Stripe keys and stores them server-side. Switching tools means pasting every key again — and trusting a vendor you'll abandon.
Tool X marks up OpenAI tokens 3× and bundles them as "AI features." You pay both the tool's markup and your own provider bills for the same underlying infrastructure.
Roughly 30% of the "here's our shiny AI SaaS" tools launched in 2023 are already gone. Your workflows, prompts, and data went with them.
You want a Python cron to re-run a report using the Apify key you pasted in the dashboard. You can't — the key is in the vendor's DB, not on your machine.
22+ integrations, every one BYOK. Paste once in Integrations → Apify (or SES, GSC, Ghost, Unipile, Stripe, GitHub, …). Keys stay in ~/BlackMagic/.bm/integrations.json on your disk, forever.
Every saved integration also writes a plain KEY=value line to <vault>/.env. Your Python / Node / shell scripts just load_dotenv() and read APIFY_API_TOKEN, AWS_ACCESS_KEY_ID, FEISHU_WEBHOOK, SES_FROM, etc.
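If you'd rather not add even a python-dotenv dependency, the plain KEY=value format is simple enough to parse with the stdlib. A minimal sketch — `load_env` is a hypothetical helper, and the vault path follows this doc's examples:

```python
from pathlib import Path


def load_env(path: Path) -> dict[str, str]:
    """Parse the plain KEY=value lines the daemon writes to <vault>/.env,
    skipping blank lines and # comments."""
    env: dict[str, str] = {}
    if path.exists():
        for line in path.read_text().splitlines():
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                env[key.strip()] = value.strip()
    return env


# Adjust if your vault lives somewhere other than ~/BlackMagic.
vault_env = load_env(Path.home() / "BlackMagic" / ".env")
apify_token = vault_env.get("APIFY_API_TOKEN", "")
```

With python-dotenv installed, `load_dotenv()` does the same job in one line; the point is that any language's dotenv library (or a ten-line parser) reads the identical file.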
companies/*.md, contacts/*.md, deals/*.md, signals/*.md, playbooks/*.md, drafts/*.md. Readable in any text editor. Version-controllable with git. Movable to any Mac with rsync.
All tool execution — fetch, scrape, send_email, cms_create_draft, GSC query — runs in a local Node daemon. The only outbound traffic is the target API call + our LLM proxy for billed reasoning.
Stop paying us and the vault stays on your disk. Open it with a text editor. The keys in .env still work with whatever replaces us. Literally no lock-in.
Your vault is git-init'd by default. Diff a contact's history. Revert a bad enrichment. Branch a new ICP to experiment. It's just files.
Your credits token (`ck_...`) lives in .bm/config.toml and is read only by our codex runtime for billed LLM calls. Skills never see it. Integration tokens and the credits token never mix.
UI writes integrations.json. Daemon mirrors to .env. Scripts read .env. Any other script, cron, or custom tool on the machine can share the same keys without re-authentication.
Each Skill declares `requires: { integrations, us_files, cli }` in frontmatter. Point-of-run check + one-click fixes. Users learn exactly what each Skill needs — no hidden state.
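In practice a Skill's frontmatter might look something like this — the `requires` keys come from this doc, but the surrounding fields and exact YAML shape are an illustrative sketch, not the canonical schema:

```yaml
---
name: weekly-apify-report
requires:
  integrations: [apify, ses]
  us_files: [companies/*.md]
  cli: [python3]
---
```

The point-of-run check walks this list before execution and offers one-click fixes for anything missing.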
Canonical store at ~/BlackMagic/.bm/integrations.json. Per-provider `{ status, connectedAs, connectedAt, credentials }`. UI reads + writes, daemon consumes.
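Concretely, the canonical store might look like this — provider names and the four per-provider fields come from this doc, while the credential field names and all values are illustrative assumptions:

```json
{
  "apify": {
    "status": "connected",
    "connectedAs": "team@example.com",
    "connectedAt": "2025-01-15T09:30:00Z",
    "credentials": { "apiToken": "apify_api_xxxx" }
  },
  "ses": {
    "status": "connected",
    "connectedAs": "outbound@example.com",
    "connectedAt": "2025-01-16T14:02:00Z",
    "credentials": {
      "accessKeyId": "AKIAXXXX",
      "secretAccessKey": "xxxx",
      "region": "us-east-1"
    }
  }
}
```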
Every save regenerates <vault>/.env with predictable names — APIFY_API_TOKEN, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION, SES_FROM, FEISHU_WEBHOOK, GHOST_ADMIN_API_KEY, GHOST_ADMIN_API_URL, GSC_SERVICE_ACCOUNT_JSON, …
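After connecting Apify and SES, for example, the regenerated <vault>/.env would contain lines along these lines (all values are placeholders):

```
APIFY_API_TOKEN=apify_api_xxxxxxxx
AWS_ACCESS_KEY_ID=AKIAXXXXXXXX
AWS_SECRET_ACCESS_KEY=xxxxxxxx
AWS_REGION=us-east-1
SES_FROM=outbound@example.com
```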
zenn_api_key (your credits token) lives in .bm/config.toml and is never mirrored to .env. A Skill cannot accidentally use your billing token to bypass the proxy.
A company is a markdown file. A contact is a markdown file. A run is a markdown file. You can grep, git-log, and rm -rf with confidence.
SES requests use Node's native https module, sidestepping any Electron / Chromium network interference. GSC JWT signing uses Node crypto. No wrapper SDKs that phone home.
Apify charges you directly. AWS SES charges you directly. Unipile charges you directly. We charge you only for credits on the LLM proxy. No 3× markups.
No. Keys live in .bm/integrations.json on your Mac. Integration API calls (Apify → api.apify.com, SES → email.us-east-1.amazonaws.com, GSC → googleapis.com, etc.) go directly from your daemon to the provider. We route only LLM reasoning calls through our proxy — and even those carry your `ck_` credits token, not your integration keys.
Yes. Daemon logs every outbound request to ~/Library/Logs/BlackMagic AI/. Plus you can set a proxy and inspect every packet — SES, Apify, Feishu, and all CMS calls are plain HTTPS to the providers' public APIs.
The vault stays on your disk, integrations.json stays on your disk, .env stays on your disk. The desktop daemon stops receiving updates but continues to run. Skills you've customized are yours. Migrate everything to a text editor if you want — it's just .md + .json + .env.
Yes — the vault is designed to be rsync'able / Dropbox-syncable / git-pushable. Keys in integrations.json are machine-local by convention (don't check them into git), but if you want to sync them, that's your call.
.env is the universal interface. Python has load_dotenv(), Node has dotenv, Go has godotenv, and shell scripts can just source the file. Mirroring means any script you write in your vault picks up the same keys you pasted in the UI — zero glue code.