Local-First Architecture Explained

AKARI's approach to privacy isn't a feature bolted on after the fact — it's a fundamental architectural decision that shapes everything about how the software works.

AKARI is a desktop application built with Tauri and Rust. It runs natively on your machine, and your data lives in a local project directory. There's no AKARI cloud server storing your projects, no remote database holding your content, and no background sync uploading your work without your knowledge.

This is what we mean by "local-first": your machine is the source of truth for all your data. The network is used only when explicitly needed — specifically, when your AI partner needs to communicate with an AI model API.

Local-first isn't just a privacy choice. It also means:

• Instant performance — no waiting for server responses
• Offline capability — most features work without internet
• True data ownership — your files are just files on your disk
• No vendor dependency — if AKARI disappeared tomorrow, your data remains

We believe this is the right architecture for creative tools. Your unreleased projects, client materials, and business data deserve the same level of protection as the most sensitive information on your computer — because that's exactly what they are.

What Data Stays on Your Device

Let's be explicit about what stays local. Everything listed below lives on your machine and never touches an external server:

Project Files and Creative Assets
Every video, image, audio file, document, font, and template in your projects. Whether you created it in AKARI or imported it from elsewhere, it stays on your disk.

Content Pool Database
The SQLite database that indexes your assets, stores metadata, and tracks relationships between content items. All the AI-generated tags, analysis results, and organizational data.

AI Memory — All 4 Layers
Session memory, project memory, user memory, and skill memory. Every preference the AI has learned about you, every decision it remembers from past projects, every workflow it has optimized — stored locally.

Business Data
If you use the Backstage modules: your customer list, transaction records, email subscriber data, analytics history, and funnel configurations. This data never leaves your machine.

Settings and Preferences
Your AI partner's name, personality configuration, model preferences, workspace layout, and every other setting you've customized.

Search Indexes and Vector Embeddings
The semantic search indexes that power AKARI's intelligent content discovery. These are computed locally and stored locally.

Conversation History
Every conversation you've had with your AI partner, including the context, decisions, and creative direction discussed.

The data that stays on your device is stored in standard, open formats — SQLite databases, Markdown files, JSON configurations, and your media files in their original formats. No proprietary encryption, no obfuscation. You can inspect everything with standard tools.
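Because the Content Pool is plain SQLite, "inspect everything with standard tools" is literal: any SQLite browser or scripting language can read it. A minimal sketch in Python — the database path, table name, and columns here are illustrative assumptions, not AKARI's actual schema:

```python
import sqlite3

# Hypothetical schema for demonstration — AKARI's real tables may differ.
# On a real project you would connect to e.g. "myproject/.akari/content_pool.db".
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE assets (path TEXT, kind TEXT, ai_tag TEXT)")
db.execute("INSERT INTO assets VALUES ('intro.mp4', 'video', 'b-roll')")
db.commit()

# List every table, then dump the rows — the same thing a GUI SQLite browser does.
tables = [r[0] for r in db.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
rows = list(db.execute("SELECT path, kind, ai_tag FROM assets"))
print(tables)  # ['assets']
print(rows)    # [('intro.mp4', 'video', 'b-roll')]
```

The same approach works for any of the local databases: point a standard tool at the file and read it directly, with no AKARI-specific format in the way.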

When Data Leaves Your Device

Transparency requires being specific about the one scenario where data does leave your device: when you interact with your AI partner.

What Gets Sent
When you send a message to your AI partner, AKARI transmits:

• Your current message (the prompt you typed)
• Relevant context selected by the AI partner (such as the current workspace state, selected elements, or relevant memory)
• System instructions that define the AI partner's behavior

What Does NOT Get Sent

• Your entire Content Pool or asset library
• Full conversation history (only relevant context is included)
• Business data (customer lists, transactions, analytics)
• Your complete memory database
• Files on your computer outside the current context
• Any personally identifiable information beyond what you include in your message

Where It Goes
The prompt is sent to the AI model provider you've configured:

• If using OpenRouter — to OpenRouter's API endpoint, which routes to the model provider
• If using direct API connections — directly to Claude (Anthropic), GPT (OpenAI), Gemini (Google), or whichever provider you've selected

Each provider has their own privacy policy and data handling practices. AKARI itself operates no servers that process your prompts or store your conversations.

Encryption in Transit
All API communications use HTTPS/TLS encryption. Your prompts are encrypted between your machine and the API endpoint. No intermediary can read the content.

You Control the Flow
You can review exactly what context is being sent before each request. AKARI provides a transparency mode that shows you the complete payload being transmitted, so you can verify that no unexpected data is included.
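The "what gets sent" principle can be sketched as payload assembly. The `build_payload` helper and its field names are hypothetical, not AKARI's actual wire format — the point is structural: only the message, explicitly selected context, and system instructions are assembled, so nothing else can ride along:

```python
import json

def build_payload(message, selected_context, system_instructions):
    # Only three things cross the wire. Nothing here reads the Content Pool,
    # the memory database, or business data — they simply aren't in scope.
    return {
        "system": system_instructions,
        "context": selected_context,   # explicitly chosen, not full history
        "message": message,
    }

payload = build_payload(
    message="Tighten the pacing of the intro clip.",
    selected_context={"workspace": "timeline", "selection": ["intro.mp4"]},
    system_instructions="You are a helpful editing partner.",
)

# A transparency mode would display exactly this serialized payload for review
# before the request is sent.
print(json.dumps(payload, indent=2))
```

Reviewing the serialized payload is precisely what lets you verify the "does NOT get sent" list: if a field isn't in the payload, it wasn't transmitted.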

The 4-Layer Memory System and Privacy

AKARI's memory system is one of its most powerful features — and one where privacy matters most. Here's exactly how each layer handles your data:

Layer 1: Session Memory
Storage: In-memory only (RAM)
Persistence: Cleared when you close the project
Privacy: Never written to disk beyond temporary undo history. When the session ends, it's gone.

Layer 2: Project Memory
Storage: SQLite database in your project's .akari/ directory
Persistence: As long as the project exists
Privacy: Fully inspectable with any SQLite browser. You can read, edit, or delete any record. The file lives alongside your project files.

Layer 3: User Memory
Storage: Markdown and JSON files in your user configuration directory
Persistence: Permanent, across all projects
Privacy: Human-readable files that you can open in any text editor. Your brand guidelines, style preferences, and personal settings are stored as plain text that you can edit directly.

Layer 4: Skill Memory
Storage: SQLite database with vector embeddings in your user directory
Persistence: Permanent, growing over time
Privacy: Inspectable with SQLite tools. Contains learned patterns and optimized workflows. No personal data beyond your creative preferences.

Important: No memory data is ever sent to an external server for storage. When the AI partner references memory in a conversation, only the specific relevant memories are included in the API context — not the entire database.

You can export, back up, or delete any memory layer at any time. There's no hidden copy, no remote backup, and no recovery mechanism that bypasses your control.
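The storage split between the layers can be sketched roughly as follows. The file name `.akari/memory.db` and the table layout are illustrative assumptions based on the description above; the point is that session memory is plain in-process state, while project memory is an ordinary SQLite file you can reopen with any tool:

```python
import sqlite3
import tempfile
from pathlib import Path

# Layer 1: session memory — just process state; gone when the process ends.
session_memory = {"selected_clip": "intro.mp4"}

# Layer 2: project memory — an ordinary SQLite file in the project directory.
project_dir = Path(tempfile.mkdtemp()) / ".akari"
project_dir.mkdir()
db_path = project_dir / "memory.db"

with sqlite3.connect(db_path) as db:
    db.execute("CREATE TABLE decisions (topic TEXT, choice TEXT)")
    db.execute("INSERT INTO decisions VALUES ('color-grade', 'warm')")

# Reopen the file independently — any SQLite browser could do the same,
# which is what makes the layer inspectable, editable, and deletable.
with sqlite3.connect(db_path) as db:
    remembered = db.execute("SELECT choice FROM decisions").fetchone()[0]

print(remembered)  # warm
```

Deleting the layer really is as simple as deleting `db_path` — there is no second copy anywhere.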

OpenRouter and API Key Security

Your API keys are the credentials that allow AKARI to communicate with AI model providers. Protecting them is critical.

How Keys Are Stored
AKARI stores your API keys in your operating system's native credential manager:

• macOS: Keychain Access (encrypted with your system password)
• Windows: Windows Credential Manager (encrypted with DPAPI)
• Linux: Secret Service / GNOME Keyring (encrypted by the system)

Your keys are never stored in plain text files, configuration files, or anywhere else on disk. They're encrypted at rest by your operating system's security infrastructure, which is the same system that protects your website passwords and other sensitive credentials.

How Keys Are Used
When AKARI needs to make an API call, it retrieves the key from the credential manager, includes it in the API request header, and immediately discards it from memory after the call completes. The key is never logged, cached, or written to disk.

OpenRouter as a Privacy Layer
If you use OpenRouter as your AI model provider, it serves as an additional privacy layer. OpenRouter routes your requests to various AI models without requiring you to have individual accounts with each provider. This means:

• One API key for access to many models
• OpenRouter's privacy policy applies to request handling
• You can switch between models without sharing keys with multiple providers

Direct API Connections
If you prefer not to use OpenRouter, you can connect directly to individual providers. Your key is sent only to that specific provider, and no intermediary handles your requests.

No AKARI Account Required
For the free tier, AKARI doesn't require you to create an account with us. You bring your own API key, and we have no knowledge of or access to it. We can't see your key, your usage, or your content.
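The retrieve-use-discard pattern described under "How Keys Are Used" can be sketched like this. The `CredentialStore` class is a stand-in for the OS credential manager (which a real application would reach through platform APIs or a library such as the `keyring` package); the key exists only inside the scope of a single call:

```python
class CredentialStore:
    """Stand-in for the OS credential manager (Keychain, DPAPI, Secret Service)."""

    def __init__(self):
        # On a real OS this value is encrypted at rest; the name is illustrative.
        self._vault = {"akari/openrouter": "sk-or-example-key"}

    def get(self, name):
        return self._vault[name]


def call_provider(store, prompt):
    # The key is fetched just-in-time, used for one request header, and then
    # falls out of scope — it is never logged, cached, or written to disk.
    api_key = store.get("akari/openrouter")
    headers = {"Authorization": f"Bearer {api_key}"}
    # ... perform the HTTPS request with `headers` here ...
    return {"status": "ok", "authorized": "Authorization" in headers}


result = call_provider(CredentialStore(), "Summarize my timeline.")
print(result)  # {'status': 'ok', 'authorized': True}
```

The design choice worth noting: because the key never touches a configuration file, there is nothing for a backup, sync tool, or crash dump of the config directory to leak.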

Content Provenance (C2PA Integration)

As AI becomes more prevalent in creative work, proving the authenticity and origin of content becomes increasingly important. AKARI is building toward compliance with C2PA — the Coalition for Content Provenance and Authenticity standard.

What C2PA Does
C2PA embeds cryptographically signed metadata into content files that records:

• What tool created or edited the content
• Whether AI was involved (and to what degree)
• When the content was created or modified
• The chain of edits and transformations applied

This metadata is tamper-evident — if someone modifies the file without updating the provenance record, the signature becomes invalid.

How AKARI Implements It
When you export content from AKARI, the provenance metadata includes:

• That AKARI was the creation tool
• Which AI models were used (e.g., "Claude was used for subtitle generation")
• Whether content elements are AI-generated, AI-assisted, or human-created
• The creation timestamp

What This Means for You

• Your audience can verify that your content is authentic
• Platforms can distinguish responsible AI use from deepfakes
• Your creative attribution is preserved even when content is shared
• You can prove the origin of your work in case of disputes

What AKARI Will NOT Record

• Your personal identity (unless you explicitly choose to include it)
• Your API keys or account information
• The specific prompts you gave to the AI
• Private project details

C2PA integration is being implemented progressively. Current builds include basic provenance support, with full compliance planned for upcoming releases.
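The tamper-evident property can be illustrated with a toy manifest. Real C2PA binds a structured manifest store to the asset with X.509 certificate signatures, not a bare HMAC, so treat this purely as a sketch of the verification idea — change the content without re-signing and the check fails:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"toy-signing-key"  # a real C2PA signer uses an X.509 private key


def sign_manifest(content: bytes, manifest: dict) -> str:
    # Bind the provenance record to the exact bytes of the content.
    payload = hashlib.sha256(content).hexdigest() + json.dumps(manifest, sort_keys=True)
    return hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()


def verify(content: bytes, manifest: dict, signature: str) -> bool:
    return hmac.compare_digest(sign_manifest(content, manifest), signature)


manifest = {"tool": "AKARI", "ai_assisted": True, "edits": ["subtitle generation"]}
content = b"...exported video bytes..."
sig = sign_manifest(content, manifest)

print(verify(content, manifest, sig))         # True  — untouched file verifies
print(verify(content + b"x", manifest, sig))  # False — modified file fails
```

Editing the manifest without re-signing fails the same way, which is what makes the provenance record trustworthy rather than merely informative.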

Data Portability and Backups

Your data should never be held hostage. AKARI is designed for complete data portability.

Project Portability
Your entire project is a directory on your filesystem. To move it to another computer, copy the directory. To back it up, include it in your regular backup routine. To share it with a collaborator, send them the folder. No export wizard, no proprietary archive format, no cloud sync dependency.

Standard File Formats
AKARI uses open, standard formats for everything:

• Media files: Your videos, images, and audio stay in their original formats (MP4, PNG, WAV, etc.)
• Database: SQLite — readable by thousands of tools and libraries
• Configuration: JSON — human-readable and universally supported
• Memory: Markdown — plain text that opens in any editor
• Exports: Standard formats for each content type (MP4, PNG, HTML, etc.)

No Proprietary Lock-In
There are no proprietary file formats in AKARI. Every piece of data you create or import can be accessed without AKARI installed. If you decide to switch to different tools, your data comes with you completely.

Backup Recommendations
Since everything is local, backup is your responsibility. We recommend:

• Include your AKARI project directories in Time Machine (macOS), File History (Windows), or your preferred backup solution
• For critical projects, keep an additional off-site backup
• Use version control (like Git) for text-based project components if you want detailed change history
• Export final deliverables to a separate archive for long-term preservation

Import and Export
AKARI can import from and export to standard formats at every level:

• Individual assets (drag and drop)
• Project archives (folder copy)
• Memory data (Markdown and JSON export)
• Settings and presets (JSON export)
• Content for publishing (platform-optimized exports)
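Because a project is just a directory, a backup is just a copy of that directory. A minimal sketch using only Python's standard library — the project layout shown (a `.akari/` folder with a JSON settings file next to the media) is an illustrative assumption:

```python
import shutil
import tempfile
from pathlib import Path

# Build a toy project directory (names are illustrative).
root = Path(tempfile.mkdtemp())
project = root / "client-video"
(project / ".akari").mkdir(parents=True)
(project / ".akari" / "settings.json").write_text('{"theme": "dark"}')
(project / "intro.mp4").write_bytes(b"...video bytes...")

# Backing up is a plain archive of the folder — no export wizard involved.
archive = shutil.make_archive(str(root / "client-video-backup"), "zip", project)

# Restoring is just unpacking it somewhere else.
restore_dir = root / "restored"
shutil.unpack_archive(archive, restore_dir)
restored_settings = (restore_dir / ".akari" / "settings.json").read_text()
print(restored_settings)  # {"theme": "dark"}
```

The same logic is why Time Machine, File History, or a plain `rsync` job can back up a project with no AKARI-specific support: to those tools it is just files.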

The AKARI Constitution on Privacy

AKARI's AI Constitution includes explicit privacy commitments that are not just policy statements — they're technical constraints built into how the AI operates.

Principle 2: Privacy as a Right
This principle manifests in several concrete ways:

No Training on Your Data
AKARI does not collect your content, conversations, or creative output to train AI models. The AI model providers have their own policies (which you should review), but AKARI itself adds no additional data collection layer.

No Analytics Phoning Home
AKARI does not send usage analytics, crash reports, or telemetry to our servers by default. If you opt into anonymous usage statistics (to help us improve the product), you can see exactly what data is collected and disable it at any time.

No Content Scanning
AKARI does not scan your content for advertising purposes, content moderation, or any form of surveillance. Your files are your files.

Principle 3: No Mass Surveillance or Manipulation
The AI partner is specifically constrained from:

• Creating tracking pixels or web beacons
• Building dark patterns designed to manipulate users
• Implementing data harvesting mechanisms
• Creating surveillance tools or monitoring systems

If you ask the AI to help you build something that violates these principles, it will explain why it can't help and suggest ethical alternatives.

Consent Before Transmission
Before any data leaves your device, AKARI requires your explicit action (sending a message to the AI partner). There is no background data transmission, no periodic check-ins, and no silent uploads.

These constraints are embedded in the AI's system prompts and are enforced at the application level. They're not promises that depend on trust — they're architectural decisions that are verifiable in the open-source code.

Comparison with Cloud-Based Alternatives

To put AKARI's privacy approach in context, here's how it compares to typical cloud-based creative tools and AI services:

Data Storage
Cloud tools: Your projects live on their servers. You access them through a browser.
AKARI: Your projects live on your computer. Period.

Data Access
Cloud tools: The service provider can access your data (for support, moderation, or improvement).
AKARI: Only you can access your data. We have no access mechanism.

Internet Dependency
Cloud tools: No internet = no work.
AKARI: No internet = no AI partner, but everything else works perfectly.

Data Portability
Cloud tools: Export is possible but often limited, lossy, or slow.
AKARI: Your data is already in standard formats on your disk. Nothing to export.

AI Data Usage
Cloud AI tools: Your conversations may be used to improve models (varies by provider).
AKARI: Your conversations are never stored on any server. AI providers handle only the individual API calls.

Service Continuity
Cloud tools: If the company shuts down, your data may be at risk.
AKARI: If AKARI development stops tomorrow, your data remains exactly where it is, in standard formats.

Cost of Privacy
Cloud tools: Privacy features are often premium add-ons.
AKARI: Full privacy is the default, free behavior. There is no "less private" option.

We're not saying cloud tools are bad. Many people prefer the convenience of cloud storage and collaboration. But if privacy is a priority — and for creative professionals handling client work, unreleased projects, and business data, it should be — AKARI's architecture provides guarantees that cloud tools structurally cannot offer.

Security Best Practices

While AKARI's architecture provides strong privacy by default, your overall security also depends on how you manage your machine and credentials. Here are our recommendations:

Protect Your API Keys

• Never share your API keys with others
• Use separate keys for different purposes if your provider supports it
• Rotate keys periodically, especially if you suspect compromise
• Monitor usage through your provider's dashboard for unexpected activity

Secure Your Machine

• Keep your operating system and AKARI updated
• Use full-disk encryption (FileVault on macOS, BitLocker on Windows)
• Use a strong login password and enable automatic screen lock
• Be cautious about installing unverified plugins or extensions

Back Up Regularly

• Your AKARI data is only as safe as your backup strategy
• Include project directories in automated backups
• Test your backups periodically — make sure you can actually restore from them
• Consider encrypted backups for sensitive client work

Network Security

• Use AKARI's AI features only on trusted networks
• Be aware that public Wi-Fi can expose metadata about your API calls (though HTTPS protects the content)
• If working with especially sensitive material, consider using a VPN

Review Periodically

• Check your AI partner's memory periodically to ensure it hasn't stored anything you'd prefer it didn't
• Review the Taste profile to confirm the AI's understanding of your style is accurate
• Audit your connected accounts and API keys — revoke any you no longer use

AKARI gives you the architecture for privacy. Good security practices ensure you get the full benefit of that architecture.

Our Commitment Going Forward

Privacy isn't a feature that can be "done." It's an ongoing commitment that must evolve as technology changes. Here's what we commit to:

Never Moving Away from Local-First
The local-first architecture isn't a temporary choice. It's a core design principle. Future features will be designed to work within this architecture, not to circumvent it. If we add optional cloud features (like cross-device sync), they will always be optional and transparent.

Open Source Accountability
AKARI's code is open source under AGPL-3.0. This means:

• Anyone can audit our privacy claims by reading the code
• Security researchers can identify and report vulnerabilities
• The community can verify that updates don't introduce unwanted data collection
• If we ever broke our promises, the code would prove it

Transparent Communication
If our privacy practices ever need to change, we will:

• Announce changes clearly and in advance
• Explain exactly what is changing and why
• Give users time to adjust or opt out
• Never make retroactive changes to existing data

Continuous Improvement
We actively look for ways to improve privacy:

• Reducing the amount of context sent to AI providers
• Exploring on-device AI models that eliminate network calls entirely
• Improving the transparency of what data flows where
• Making privacy controls more granular and accessible

Your trust is the foundation of everything we're building. We intend to keep earning it every day.