Context Length
Definition
The maximum number of tokens a language model can process in a single prompt and response combined.
In-Depth Explanation
Context length determines how much text a model can "remember" within a single conversation. Early models supported 2K-4K token windows, while modern models support 128K tokens (GPT-4 Turbo) or even 1M tokens (Gemini). Longer contexts enable processing entire books or codebases in one pass, but require more memory and compute. Context length is a key differentiator between models.
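The budgeting described above can be sketched in a few lines of Python. This is a minimal illustration, not a real tokenizer: it uses the common rough heuristic of ~4 characters per token for English text (real tokenizers such as tiktoken give exact counts), and the `CONTEXT_LIMITS` values simply mirror the figures mentioned in this entry.

```python
# Sketch: check whether a prompt fits a model's context window.
# Token counts use a crude ~4-characters-per-token heuristic;
# production code should use the model's actual tokenizer.

CONTEXT_LIMITS = {       # illustrative limits from the text above
    "gpt-3": 4_096,
    "gpt-4-turbo": 128_000,
    "claude-3": 200_000,
}

def estimate_tokens(text: str) -> int:
    """Rough estimate: about 4 characters per token for English."""
    return max(1, len(text) // 4)

def fits_context(model: str, prompt: str, response_budget: int = 500) -> bool:
    """True if the prompt plus a reserved response budget fits the window.

    The limit covers prompt AND response combined, so we reserve
    `response_budget` tokens for the model's output.
    """
    limit = CONTEXT_LIMITS[model]
    return estimate_tokens(prompt) + response_budget <= limit

novel = "x" * 1_200_000                   # ~300K estimated tokens
print(fits_context("gpt-3", novel))       # False: far beyond a 4K window
print(fits_context("claude-3", novel))    # False: even 200K is not enough
short_doc = "y" * 8_000                   # ~2K estimated tokens
print(fits_context("gpt-3", short_doc))   # True: fits with room to respond
```

Note that the check reserves room for the response, since the context limit covers the prompt and the generated output combined.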
Real-World Example
Claude 3, with a 200K-token context, can analyze an entire novel in one prompt, whereas GPT-3, with roughly a 4K-token context, could only handle a few pages at a time.