
Context Window

The maximum amount of text an AI model can consider at once, including both your input and its response.


Why Context Window Matters

Imagine trying to write a book report when you can only remember the last three pages you read. That's what a small context window feels like for an AI.

A larger context window means the AI can:

  • Read longer documents
  • Remember earlier parts of a conversation
  • Consider more code files at once
  • Make connections between distant pieces of information

Context Window Sizes

Context windows are measured in tokens, the chunks of text a model actually processes (roughly 4 characters of English per token).

Small (4K-8K tokens): Older models. Can handle a few pages of text.

Medium (32K-100K tokens): Most modern models. Can handle long documents or conversations.

Large (200K+ tokens): Cutting-edge models. Can process entire books or large codebases.
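The 4-characters-per-token rule of thumb makes it easy to do back-of-envelope sizing. Here is a minimal sketch; the helper names, the reply-budget default, and the heuristic itself are illustrative assumptions, not any model's real tokenizer (real tokenizers such as BPE variants give exact, model-specific counts):

```python
# Rough token sizing using the ~4 characters/token rule of thumb.
# This is a quick estimate only; actual tokenizers vary by model.

CHARS_PER_TOKEN = 4  # rough average for English text


def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def fits_in_window(text: str, window_tokens: int,
                   reserve_for_reply: int = 1000) -> bool:
    """Check whether the input plus a reserved reply budget fits the window."""
    return estimate_tokens(text) + reserve_for_reply <= window_tokens


page = "word " * 500                   # ~2,500 characters, about one page
print(estimate_tokens(page))           # ~625 tokens
print(fits_in_window(page, 8_000))     # fits a small 8K window
print(fits_in_window(page, 1_500))     # too tight once the reply is reserved
```

Note the reserved reply budget: the window covers both input and output, so you can never hand the model the full window's worth of input.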

The Catch

Bigger isn't always better. Models can struggle to retrieve information buried in the middle of very long contexts (the "lost in the middle" problem). And because attention compares every token against every other token, processing also gets slower and more expensive as the context grows.
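Because of these costs, applications often trim old conversation turns rather than send the entire history. The sketch below shows one common strategy, keeping the system message plus the most recent turns that fit a token budget; the message format and the 4-chars/token estimate are assumptions for illustration, not any specific API:

```python
# A minimal sketch of history trimming: keep the first (system) message
# plus as many of the most recent turns as fit within a token budget.

CHARS_PER_TOKEN = 4  # rough average for English text


def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def trim_history(messages: list[dict], budget_tokens: int) -> list[dict]:
    """Drop the oldest turns until the remaining ones fit the budget."""
    system, turns = messages[0], messages[1:]
    used = estimate_tokens(system["content"])
    kept = []
    for msg in reversed(turns):               # walk newest -> oldest
        cost = estimate_tokens(msg["content"])
        if used + cost > budget_tokens:
            break                             # older turns won't fit either
        kept.append(msg)
        used += cost
    return [system] + kept[::-1]              # restore chronological order


history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "x" * 4000},  # ~1,000 tokens, oldest
    {"role": "user", "content": "x" * 400},   # ~100 tokens
    {"role": "user", "content": "x" * 400},   # ~100 tokens, newest
]
trimmed = trim_history(history, 300)
print(len(trimmed))  # system message plus the two newest turns
```

Dropping whole turns from the oldest end is the simplest policy; production systems often summarize the dropped turns instead, trading some fidelity for continuity.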

How Reviews Help

When evaluating AI tools, check how they handle long context. Does the tool maintain coherence across long documents? Does it actually use information from earlier in the conversation? Reviews from real users reveal whether the context window is genuinely useful or just a marketing number.
