Language Models
Context Window
Definition
The maximum amount of text (measured in tokens) that a language model can process in a single interaction.
In-Depth Explanation
Context window size determines how much information an LLM can consider at once. Larger windows let a model process longer documents and maintain extended conversations. GPT-4 Turbo has a 128K-token context window, while Claude 3 supports up to 200K tokens.
Real-World Example
A model with a 4K-token context window cannot summarize a 10,000-word document in one pass; the document must first be split into chunks that each fit within the window.
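The chunking described above can be sketched as follows. This is a minimal illustration, not a production tokenizer: it approximates tokens with whitespace-split words (real systems use the model's actual tokenizer for exact counts), and the `chunk_text` function and its overlap parameter are hypothetical names chosen for this example.

```python
def chunk_text(text, max_tokens=4000, overlap=200):
    """Split text into overlapping chunks that fit a context window.

    Words are used as a rough stand-in for tokens; an exact count
    requires the model's own tokenizer.
    """
    words = text.split()
    step = max_tokens - overlap  # advance by less than a full window
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_tokens]))
        if start + max_tokens >= len(words):
            break  # last chunk reached the end of the document
    return chunks

# A 10,000-word document exceeds a 4K window, so it is split:
doc = "word " * 10_000
pieces = chunk_text(doc, max_tokens=4000, overlap=200)
print(len(pieces))  # → 3
```

The small overlap between consecutive chunks helps preserve context that would otherwise be cut mid-thought at a chunk boundary; each chunk is then summarized separately and the partial summaries combined.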