Context window management

Manage Bob's 200k token context window to improve task performance and reduce Bobcoin usage.

Understanding the context window

The context window is the total amount of information Bob can process at once, measured in tokens.

The context window includes:

  • System instructions: Bob's core capabilities, tool definitions, and mode-specific instructions.
  • Conversation history: All messages, tool uses, and results from your current chat.
  • File contents: Any files you reference using @ mentions.
  • Tool outputs: Results from commands, file reads, searches, and other operations.
  • MCP tool definitions: Descriptions and parameters for all connected MCP server tools.
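
Because all of these components share the same 200k budget, it can help to think of context usage as a simple sum. The sketch below illustrates that arithmetic; the component sizes are made-up numbers for illustration, not Bob's actual overheads.

```python
# Hypothetical context budget breakdown. All component sizes below are
# illustrative assumptions, not measurements of Bob's real usage.
CONTEXT_LIMIT = 200_000

components = {
    "system_instructions": 3_000,    # core capabilities and mode instructions
    "mcp_tool_definitions": 8_000,   # grows with each connected MCP server
    "conversation_history": 45_000,  # messages, tool uses, and results
    "file_mentions": 20_000,         # files referenced with @ mentions
    "tool_outputs": 12_000,          # command results, file reads, searches
}

used = sum(components.values())
remaining = CONTEXT_LIMIT - used
print(f"Used {used:,} of {CONTEXT_LIMIT:,} tokens ({remaining:,} remaining)")
```

Note that only the conversation history and tool outputs shrink when Bob condenses the conversation; system instructions and tool definitions stay resident for the whole session.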

Monitoring your token usage

Bob displays the current token usage in the top-right corner of the chat panel. This counter shows how many tokens are being used in the current conversation, helping you track context consumption in real time.

Token counter in Bob's chat panel

Token limits

Bob's context window has a maximum capacity of 200,000 tokens. When you approach this limit, Bob automatically condenses the conversation at 140,000 tokens to maintain performance and prevent context overflow.

Automatic context condensation

When a conversation reaches 140,000 tokens, Bob automatically:

  1. Preserves the most recent and relevant context.
  2. Summarizes or removes older conversation segments.
  3. Maintains critical system instructions and tool definitions.
  4. Continues the conversation with condensed context.

Bob continues operating with the earlier conversation context summarized.
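
The steps above can be sketched as a simple threshold policy. This is an illustrative sketch only, not Bob's actual implementation: the threshold is taken from the documented 140,000-token trigger, while the `keep_recent` count and the `summarize` callback are hypothetical.

```python
# Sketch of an automatic-condensation policy. CONDENSE_THRESHOLD matches
# the documented trigger; keep_recent and summarize are assumptions made
# for illustration.
CONDENSE_THRESHOLD = 140_000

def condense(messages, token_counts, summarize, keep_recent=10):
    """Once total tokens reach the threshold, replace older messages
    with a single summary while keeping the most recent messages intact.
    System instructions and tool definitions live outside this list and
    are never removed."""
    if sum(token_counts) < CONDENSE_THRESHOLD:
        return messages  # under the limit: nothing to condense
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    return [summarize(older)] + recent
```

After condensation, the conversation continues as normal; only the oldest segments have been collapsed into the summary entry.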

Impact on Bobcoins

Context window usage affects your Bobcoin consumption. Every token processed by Bob, both input and output, contributes to your usage:

  • More tokens per request: Each interaction sends the full active context for processing.
  • Repeated processing: The same context is processed again with every message you send.
  • Cumulative effect: Long conversations with extensive context consume more Bobcoins per interaction.
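
Because the full active context is reprocessed on every message, total tokens processed over a conversation grow much faster than the conversation itself. The short calculation below illustrates this with made-up message sizes.

```python
# Cumulative token processing over a conversation. The per-message size
# (500 tokens) is an illustrative assumption.
context = 0          # tokens currently in the active context
total_processed = 0  # tokens processed across the whole conversation

for message_tokens in [500] * 20:  # 20 messages of ~500 tokens each
    context += message_tokens
    total_processed += context     # the full context is sent each turn

print(context)          # 10,000 tokens actually written
print(total_processed)  # 105,000 tokens processed in total
```

Only 10,000 tokens were ever written, but more than ten times that were processed, which is why starting fresh chats for new tasks reduces Bobcoin usage.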

Best practices for context management

Keep conversations focused

  • Start a new chat when switching to a different task or topic. This prevents unrelated context from consuming tokens and potentially confusing Bob.
  • Use one chat for implementing a new feature, a separate chat for fixing an unrelated bug, and another chat for refactoring a different component.
  • Avoid using the same chat for multiple unrelated tasks, continuing conversations indefinitely across different projects, or mixing debugging, feature development, and documentation in one chat.

Use context mentions strategically

Context mentions are useful but consume tokens. Be selective about what you include.

Consider the following examples:

Efficient mention usage:

@/src/components/Button.tsx Refactor this component to use TypeScript

Inefficient mention usage:

@/src @/tests @/docs Review everything and suggest improvements

The second example includes unnecessary content, increasing token and Bobcoin usage.

Provide targeted context

Instead of mentioning entire directories or large files, focus on specific sections:

Targeted approach:

@/src/utils/validation.ts:45-67 Fix the email validation logic

Broad approach:

@/src/utils Review all utility functions

Using a targeted approach gives Bob exactly what it needs without consuming unnecessary tokens.

You can also highlight text in your editor and use + L to add it to the chat.

Consider when to use MCP tools

Each MCP server tool you connect adds its definition to Bob's context window. While MCP tools extend Bob's capabilities, having numerous tools can consume significant tokens.

  • Only connect MCP servers you actively use.
  • Disconnect unused MCP servers to free up context space.
  • Consider using project-specific MCP configurations rather than global ones.

You can enable and disable specific tools for an MCP server to ensure you are only using the tools you need.

Working with large projects

Large projects call for additional context-management strategies to stay within token limits. When working with codebases that have many files or complex architectures:

  • Break down large tasks into smaller, focused subtasks.
  • Use specific file paths and function names rather than vague references.
  • Target specific sections of large files using line ranges (e.g., @/src/utils/validation.ts:45-67).
  • Make iterative changes, reviewing and approving each step before proceeding.

For detailed strategies and examples, see Working with large projects.
