Why it matters: Most AI‑powered apps (calorie trackers, CRMs, study companions, relationship apps 👀) send every prompt to OpenAI, exposing all user data. With Maple Proxy, that data never leaves a hardware‑isolated enclave.
https://blossom.primal.net/21b4ee5b782d240d3eb06c3db41ffa73ec79734f104ee94113ff2dc6d7e771c3.mp4
How it works:
🔒 Inference runs inside a Trusted Execution Environment (TEE).
🔐 End‑to‑end encryption keeps prompts and responses private.
✅ Cryptographic attestation proves you’re talking to genuine secure hardware.
🚫 Zero data retention – no logs, no training data.
Ready‑to‑use models (pay‑as‑you‑go, billed per million tokens; example request after the list):
- llama3-3-70b – general reasoning
- gpt-oss-120b – creative chat
- deepseek-r1-0528 – advanced math & coding
- mistral-small-3-1-24b – conversational agents
- qwen2-5-72b – multilingual work & coding
- qwen3-coder-480b – specialized coding assistant
- gemma-3-27b-it-fp8-dynamic – fast image analysis
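Once the proxy is running locally (setup steps below), calling any of these models is an ordinary OpenAI‑style chat‑completions request against the local endpoint. A minimal sketch, assuming Maple Proxy mirrors the standard /v1/chat/completions route, with a placeholder API key:

```bash
# Minimal sketch: assumes the proxy is listening on localhost:8080 and mirrors the
# standard OpenAI chat-completions route; $MAPLE_API_KEY is your own key.
curl http://localhost:8080/v1/chat/completions \
  -H "Authorization: Bearer $MAPLE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3-3-70b",
    "messages": [{"role": "user", "content": "Plan a high-protein dinner under 600 calories."}]
  }'
```

Swap the model field for any ID in the list above to change capabilities without touching the rest of your code.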
Real‑world use cases:
🗓️ A calorie‑counting app replaces public OpenAI calls with Maple Proxy, delivering personalized meal plans while keeping dietary data private.
📚 A startup’s internal knowledge‑base search runs through the proxy, so confidential architecture details never leave the enclave.
👩‍💻 A coding‑assistant plug‑in for any IDE points to http://localhost:8080/v1 and suggests code, refactors, and explains errors without exposing proprietary code.
Getting started is simple:
Desktop app (fastest for local dev)
- Download from trymaple.ai/downloads
- Sign up for a Pro/Team/Max plan (starts at $20/mo)
- Purchase $10+ of credits
- Click “Start Proxy” → API key & localhost endpoint are ready.
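Once the proxy reports it is running, a quick smoke test from the terminal confirms the endpoint and key work; this is a sketch that assumes the standard OpenAI‑compatible model‑listing route is exposed:

```bash
# Sketch: list the models reachable through the local proxy, assuming the standard
# OpenAI-compatible /v1/models route; use the API key shown in the desktop app.
curl http://localhost:8080/v1/models \
  -H "Authorization: Bearer $MAPLE_API_KEY"
```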
Docker image (production‑ready)
- `docker pull ghcr.io/opensecretcloud/maple-proxy:latest`
- Run the container with your MAPLE_API_KEY and MAPLE_BACKEND_URL environment variables (see the example below)
- You now have a secure OpenAI‑compatible endpoint at http://localhost:8080/v1
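A minimal run command might look like the sketch below; the exact flags are assumptions (that the container reads MAPLE_API_KEY and MAPLE_BACKEND_URL from the environment and listens on port 8080), so treat the linked documentation as authoritative:

```bash
# Sketch only: substitute your real key and backend URL, and confirm the variable
# names and listening port against the Maple Proxy documentation.
docker run -d \
  -p 8080:8080 \
  -e MAPLE_API_KEY="your-maple-api-key" \
  -e MAPLE_BACKEND_URL="https://your-maple-backend" \
  ghcr.io/opensecretcloud/maple-proxy:latest
```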
Compatibility: Any library that lets you set a base URL works—LangChain, LlamaIndex, Amp, Open Interpreter, Goose, Jan, and virtually every OpenAI‑compatible SDK.
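For many of these tools the switch can be as small as two environment variables; recent official OpenAI SDKs read OPENAI_BASE_URL and OPENAI_API_KEY, while other clients expose an equivalent base‑URL field in their own settings:

```bash
# Point OpenAI-compatible clients at the local proxy. Recent official OpenAI SDKs
# read these variables; other tools typically have a base-URL option in their config.
export OPENAI_BASE_URL="http://localhost:8080/v1"
export OPENAI_API_KEY="$MAPLE_API_KEY"
```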
Need more detail? Check the technical write‑up, full API reference on GitHub, or join the Discord community for real‑time help.
https://blog.trymaple.ai/maple-proxy-documentation/
Start building with private AI today: download the app or pull the Docker image, upgrade to a plan, add a few dollars of credits, point your client to http://localhost:8080/v1, and secure all your apps.