Cogitae Tutorials — Step-by-Step Guides for Mac AI Power Users
Cogitae is self-documenting — once you have a provider connected, you can ask it anything about itself and it will answer. Better yet, it has full access to its own internals: it can walk you through its own UI, help you accomplish tasks directly, and even debug issues with itself. These tutorials cover the features that make Cogitae genuinely different.
Add Another AI Provider
Cogitae works with any number of AI providers simultaneously — Anthropic Claude, OpenAI, Google Gemini, xAI Grok, Mistral, Perplexity, Groq, DeepSeek, Ollama, and more. You bring your own API keys, which means your conversations never pass through a middleman and you pay provider rates directly — no markup.
What You’ll Need
An API key from at least one provider. If you don’t have one yet, here are the fastest options to get started:
- Anthropic Claude — console.anthropic.com — Claude 3.5 Haiku is fast and cheap; Claude 3.7 Sonnet is the best all-rounder
- OpenAI — platform.openai.com — GPT-4o is capable and widely supported
- Google Gemini — aistudio.google.com — Gemini Flash is extremely fast and inexpensive; has the best prompt caching rates
Steps
1. Open Preferences
Press ⌘, or go to Cogitae → Preferences in the menu bar.
2. Select the AI tab
Click the AI tab in the preferences toolbar.
3. Add a Provider
Click the + button in the Providers section. A provider editor opens on the right.
4. Choose Your Provider
Select your provider from the dropdown (e.g., Anthropic). The form updates to show the fields that provider needs.
5. Enter Your API Key
Paste your API key into the API Key field. Cogitae stores it securely in the macOS Keychain — it never leaves your machine in plain text.
6. Choose a Default Model
Select a default model from the dropdown.
7. Save
Click Save. Your provider appears in the list and is immediately available in all conversations.
Setting a Summarization Provider
Back in the AI preferences, you’ll see a Summarization Provider dropdown. This is the provider Cogitae uses for lightweight background tasks — generating conversation summaries, creating references for bookmarked content, and similar housekeeping. Set it to your fastest, cheapest model (Gemini Flash or Claude Haiku are good choices) to keep background costs minimal.
You’re Ready
Open a new conversation (⌘N), select your provider from the toolbar, and start talking. Once you have a provider connected, Cogitae becomes fully self-documenting — just ask it how to use any feature and it will tell you, or do it for you.
Build a File Monitor Agent in 60 Seconds
Cogitae’s event-driven agent system lets you create autonomous agents that watch for things and act on them — no code required. This tutorial walks you through creating an agent that monitors a folder and summarizes what changed.
What You’ll Build
An agent that watches your Downloads folder, detects new files, and sends you a desktop notification with a plain-English summary of what arrived.
Steps
1. Open Agent Preferences
Go to Preferences → Agents and click + to create a new agent.
2. Name Your Agent
Give it a name like Downloads Monitor.
3. Choose the Event Source
Set the event source to FileSystem and select your ~/Downloads folder as the path to watch.
4. Write the Instruction
In the instruction prompt field, write something like:
A new file has appeared in my Downloads folder. Briefly describe what it is based on its filename and extension, and whether I likely need to do anything with it.
5. Set the Action Policy
Choose Act and Notify — the agent will run the AI analysis and send you a notification with the result.
6. Enable and Save
Toggle the agent on. From now on, every time a file lands in Downloads, Cogitae silently analyzes it and tells you what it is.
What’s Happening Under the Hood
Cogitae uses FSEvents (the same kernel-level file watching macOS uses for Spotlight) to detect changes. When a change is detected, your agent wakes up, runs a headless AI session with full tool access, and delivers the result — all without you touching the app.
Taking It Further
- Watch a project folder and get notified when build artifacts change
- Monitor a shared Dropbox folder and summarize new documents
- Watch a log directory and alert you when errors appear
- Ask Cogitae to write and install the agent for you: “Create an agent that watches my Downloads folder and tells me when something new arrives”
Socratic Reasoning: Getting Better Answers Through Debate
Cogitae includes a structured investigation system called Aristotle that uses a second, independent AI model, in the role of Socrates, to challenge its own conclusions. The result is analysis that’s been stress-tested before it reaches you — closer to how good human thinking works.
Why This Matters
Single-model responses are confident by default. The model generates a plausible answer and stops. It doesn’t backtrack, consider alternatives, or catch its own overreach.
Cogitae’s Aristotle forces that process to happen: one model investigates, builds a case, and reaches a conclusion — then a second model (Socrates) tears it apart looking for gaps, unsupported assumptions, logical fallacies, and unexplored alternatives. The cycle repeats until both models agree the conclusion is solid.
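The investigate-and-challenge cycle has a simple shape. The sketch below is illustrative only; `investigate` and `critique` are hypothetical stand-ins for the two model calls, not Cogitae's actual API.

```python
def socratic_investigate(question, investigate, critique, max_rounds=4):
    """Run an investigate/challenge loop until the critic is satisfied.

    `investigate(question, objections)` returns a conclusion string and
    `critique(conclusion)` returns a list of objections (empty means
    accepted). Both are hypothetical stand-ins for AI model calls.
    """
    objections = []
    conclusion = None
    for _ in range(max_rounds):
        conclusion = investigate(question, objections)  # Aristotle's pass
        objections = critique(conclusion)               # Socrates' challenge
        if not objections:
            return conclusion                           # consensus reached
    return conclusion  # best effort if the round limit is hit
```

Each round, the investigator must answer the outstanding objections before the conclusion can be accepted.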
When Cogitae Uses It
When you ask a diagnostic question — “Why is this happening?”, “What’s wrong with this approach?”, “What am I missing?” — Cogitae evaluates whether the problem is complex enough to warrant Socratic reasoning. If it is, it will ask your permission before launching the full investigation (since it uses a high-intelligence model and costs more tokens).
How to Trigger It Manually
Ask anything that starts with why, what’s wrong, or help me think through:
“Help me think through the architecture for this feature.” “Why does this algorithm behave differently at scale?” “What are the weaknesses in this business plan?”
You can also ask Cogitae to investigate explicitly:
“Investigate why my build times have gotten slower.” “Explore the tradeoffs between these two approaches.”
What You Get Back
A structured report with:
- Conclusions with confidence levels
- Evidence gathered during investigation
- Eliminated alternatives — what was considered and ruled out
- Residual uncertainty — what’s still unknown
- Recommended next step
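If it helps to picture the report, the fields above map onto a small record type. The names below are hypothetical, not Cogitae's actual schema.

```python
from dataclasses import dataclass

@dataclass
class InvestigationReport:
    """Hypothetical shape of a Socratic investigation report."""
    conclusions: list            # (statement, confidence) pairs
    evidence: list               # findings gathered along the way
    eliminated: list             # alternatives considered and ruled out
    residual_uncertainty: list   # open questions that remain
    next_step: str               # recommended follow-up
```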
Practical Example: Planning a Feature
Instead of asking “How should I implement X?” (which gets a confident single-pass answer), try:
“I want to add real-time collaboration to my app. Investigate the tradeoffs between CRDTs, operational transforms, and a simple lock-based approach for my use case.”
Cogitae will research each approach, have Socrates challenge its analysis, and return a structured comparison that’s been genuinely stress-tested.
Control Your Mac from Your iPhone
Cogitae has an iOS companion app that lets you start conversations, continue them, and — most powerfully — send instructions to the macOS app from your phone. Your Mac runs the task while you’re away from it.
What You Can Do
- Browse and continue any conversation from your Mac, on your iPhone
- Start new conversations that sync back to macOS
- Send natural language commands that execute on your Mac: “Read the file I was working on and summarize the last 50 lines”
- Trigger agents, change settings, run tools — all from iOS
Setup
On your Mac:
Cogitae uses Bonjour for local network discovery — no configuration needed if your iPhone and Mac are on the same Wi-Fi network.
On your iPhone:
- Open the Cogitae iOS app
- Tap the machine picker — your Mac appears automatically
- Select it to connect
Starting and Continuing Conversations
The iOS app shows your full conversation history, synced via iCloud. Tap any conversation to open it, read it, and continue it. Responses from the AI appear on both devices.
Sending Remote Commands
This is where it gets interesting. When connected to your Mac, messages you send from iOS are relayed to the macOS app and executed with full tool access — the same tools available in a desktop conversation.
From your phone, you can say:
“Check if my server is running and tell me the last 20 lines of the log” “What files did I add to my project today?” “Change the code highlighting to the dark theme” “Run my daily summary agent now”
Your Mac does the work. The result comes back to your phone.
A Practical Workflow
You’re in a meeting and remember you need to check something in your codebase. Pull out your phone, open Cogitae, connect to your Mac, and ask:
“Search my project for all uses of the deprecated API and list the files”
By the time you’re back at your desk, the answer is waiting.
Branching Conversations: Never Lose a Good Idea
Most AI clients give you a linear conversation. Cogitae gives you a tree. Every AI response, every note, every system message can be branched — you can explore a different direction without losing where you were.
Why Branching Changes Everything
In a linear chat, if you don’t like an answer and ask a follow-up, you’ve permanently altered the conversation. The original response is still there visually, but the context has moved on.
In Cogitae, you can branch from any AI response and take the conversation in a completely different direction — while the original branch stays intact and navigable. You end up with a map of how your thinking evolved, not just where it ended up.
Creating a Branch
- Hover over any AI response and click the branch icon
- Type your new message in the input field
- Submit — a new branch is created
You’ll see branch spinners (arrows) on the message, showing you how many siblings exist. Click them to move between branches.
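Under the hood, a branched conversation is simply a tree of messages, and only the active path is sent to the AI. A minimal sketch, with hypothetical names rather than Cogitae's internal model:

```python
class Node:
    """One message in a conversation tree (illustrative sketch)."""

    def __init__(self, role, text, parent=None):
        self.role, self.text = role, text
        self.parent = parent
        self.children = []  # each child starts a sibling branch
        if parent is not None:
            parent.children.append(self)

def active_path(leaf):
    """Walk parent links to rebuild the linear context sent to the AI."""
    path = []
    while leaf is not None:
        path.append(leaf)
        leaf = leaf.parent
    return list(reversed(path))
```

The branch spinners correspond to stepping between siblings in `children`; switching branches just changes which leaf's path is active.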
The Conversation Graph
Open the Table of Contents with ⌘L and switch to Graph View. You’ll see your entire conversation as a visual tree:
- Blue nodes: System messages
- Green nodes: Your messages
- Purple nodes: AI responses
- Orange nodes: Notes
Pan, zoom, and right-click any node to navigate to it or make a different branch active. For complex research or planning sessions, this view reveals structure you can’t see in a linear scroll.
Practical Uses
Exploring alternatives: Ask for three different approaches to a problem, then branch from each one to explore them independently.
Safe editing: Edit a user message to rephrase a question — Cogitae automatically creates a branch, so your original question and its response are preserved.
Parallel research: Branch at a system message to run the same conversation with two different AI providers and compare responses side-by-side.
Capturing tangents: If a conversation goes somewhere interesting but unrelated to your main thread, branch there and explore it without derailing the original.
Creating projects: Branch any number of conversations off the initial system prompt. Each aspect of your project gets its own branch, all grouped under a parent conversation, and each branch can have branches of its own.
Notes as Anchors
Notes are non-AI messages you can drop anywhere in the conversation tree — they don’t get sent to the AI, but they can be branched from and attached to queries as context. Notes have the same full markdown rendering as AI messages. Use them to annotate branches, mark decisions, or leave yourself reminders mid-investigation.
Context Optimizer: Use Only the Tools You Need
Cogitae supports a large and growing number of tools — file access, web search, code execution, shell commands, and more. Every enabled tool adds its full definition to the system prompt, which means more tokens per query. The Context Optimizer solves this by letting the AI discover and activate tools on demand, rather than loading everything upfront.
How It Works
When the Context Optimizer is enabled, Cogitae starts each conversation with a minimal toolset. Instead of injecting every tool definition into the system prompt, it provides a single catalog tool that the AI can call to:
- List available tools — see what’s available, grouped by category, with descriptions
- Enable specific tools — activate only the ones relevant to the current task
The AI reads your message, decides which tools it needs, enables them, and proceeds — all transparently. You don’t need to do anything differently.
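Conceptually, the catalog tool is a tiny two-method API. The sketch below uses hypothetical names to show the shape of the idea, not Cogitae's actual implementation:

```python
class ToolCatalog:
    """Sketch of the optimizer's single catalog tool (hypothetical names).

    Only this catalog is described in the system prompt; full tool
    definitions enter the context only after they are enabled.
    """

    def __init__(self, tools):
        self.tools = tools      # {name: short description}
        self.enabled = set()

    def list_tools(self):
        """What the AI sees when it asks what is available."""
        return dict(self.tools)

    def enable(self, names):
        """Activate the tools the AI decided it needs."""
        unknown = [n for n in names if n not in self.tools]
        if unknown:
            raise KeyError(f"unknown tools: {unknown}")
        self.enabled.update(names)
        return sorted(self.enabled)
```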
Turning It On
Go to Preferences → Tool Settings and enable Tool Context Optimizer. It applies to new conversations from that point forward.
When to Use It
The optimizer is most useful when you have many tools enabled — especially MCP tools or native plugins — but most conversations only use a handful of them. It reduces token usage by keeping unused tool definitions out of the context window.
The trade-off is a small overhead: the AI makes an extra tool call at the start of each session to discover and enable what it needs. For conversations with only a few tools enabled, the savings are minimal and you can leave it off.
Tips
Enable a wide variety of tools. The optimizer makes it practical to leave many tools enabled without paying the token cost for all of them in every conversation. Install what you might need and let the AI pick the right tools for the job.
Trust the AI’s tool selection. The optimizer shows the AI a catalog with descriptions. It will enable the tools relevant to your query — you don’t need to tell it which tools to use.
Spend Less, Think More: Automatic Cost Optimization
Cogitae is built around the idea that you shouldn’t have to think about AI costs — the app should handle that for you. It does this in three ways: automatic model selection, prompt caching, and live cost tracking.
Automatic Model Selection
When Cogitae spawns sub-agents (for Aristotle, Socrates, parallel research tasks, and more), it doesn’t just use your default model for everything. It scores each task against two dimensions:
- Intelligence Index — how complex is the reasoning required?
- Coding Index — does this task involve writing or analyzing code?
For a simple task like “search this file for a pattern,” it picks the cheapest model that can do it reliably. For a complex architectural analysis, it selects a frontier model. You get the right tool for each job automatically, without configuring anything.
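The selection logic amounts to "the cheapest model that clears both bars." A simplified sketch, with made-up index values and field names; Cogitae's real scoring is more nuanced.

```python
def pick_model(task_intelligence, task_coding, models):
    """Choose the cheapest model that meets both capability bars.

    `models` holds illustrative dicts; the indices and prices here
    are invented for the example.
    """
    capable = [m for m in models
               if m["intelligence"] >= task_intelligence
               and m["coding"] >= task_coding]
    # Among models capable enough for the task, minimize cost
    return min(capable, key=lambda m: m["cost_per_mtok"])["name"]
```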
Prompt Caching
Anthropic and Google both support prompt caching — when the same large context (your system prompt, conversation history, attached files) is sent repeatedly, the provider caches it and charges a fraction of the normal input price for subsequent requests.
Cogitae takes full advantage of this. Long conversations with Anthropic Claude or Google Gemini get significantly cheaper as the session progresses, because the context is being served from cache rather than re-processed each time.
You can see the cache hit ratio for any conversation:
“What’s my token usage?”
Cogitae reports both best-case cost (with full caching) and worst-case cost (no cache), so you know exactly where your money is going.
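To see why caching matters, here is a back-of-the-envelope estimate. The price and the 10% cache-read factor are illustrative assumptions, not any provider's published rates.

```python
def conversation_cost(context_tokens, turns, price_per_mtok, cache_read_factor=0.1):
    """Estimate input cost of re-sending a large context every turn.

    Assumes cached input costs `cache_read_factor` of the normal input
    price; all numbers here are illustrative.
    """
    per_turn = context_tokens / 1_000_000 * price_per_mtok
    worst = per_turn * turns  # no caching: full price every turn
    # first turn at full price, later turns served from cache
    best = per_turn + per_turn * cache_read_factor * (turns - 1)
    return best, worst
```

With a 100K-token context over 10 turns at an assumed $3 per million input tokens, the uncached cost is $3.00 versus roughly $0.57 with caching.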
Live Cost Tracking
The cost tool gives you per-session, per-message, and per-provider breakdowns:
“How much has this conversation cost so far?” “Compare the cost of my last 10 messages across providers”
Cogitae pulls live pricing from its cloud API (refreshed every 12 hours) so the numbers are always current — no stale hard-coded rates.
Practical Tips for Keeping Costs Down
Install a wide selection of models. Don’t add only the latest and greatest; include cheaper, older, and faster models too, from many different providers. The broader the selection, the better Cogitae can pick the right model for each job.
Use the cheapest model that works. For simple Q&A, drafting, or summarization, a fast cheap model (Claude Haiku, Gemini Flash, GPT-4o-mini) is often indistinguishable from the expensive ones. Reserve frontier models for genuine complexity.
Keep long-running conversations going rather than starting fresh. In a cached session, your accumulated context costs a fraction of what it would in a fresh conversation.
Let Cogitae pick the model. When using multi-agent features like Aristotle or parallel research, trust the automatic model selection. It’s optimized to minimize cost while maintaining quality.