Context Window
An LLM's short-term memory: the maximum number of tokens it can process in a single request.
From 10-AI-Concepts
- Early: ~2k tokens (GPT-3).
- Now: 100k–1M+ in frontier models.
- Impacts prompt design and conversation memory.
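A common consequence for conversation memory: old messages must be trimmed so the history fits the window. A minimal sketch, approximating token counts with whitespace-split word counts (a real system would use the model's tokenizer, e.g. tiktoken for OpenAI models), and with all names hypothetical:

```python
def approx_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: one token per word.
    return len(text.split())

def trim_history(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages whose combined size fits the window."""
    kept, used = [], 0
    for msg in reversed(messages):       # walk newest to oldest
        cost = approx_tokens(msg)
        if used + cost > max_tokens:
            break                        # oldest messages fall out of the window
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

history = ["hello there", "how are you today", "tell me about context windows"]
print(trim_history(history, max_tokens=10))  # drops the oldest message
```

More sophisticated strategies (summarizing dropped turns, retrieval over past messages) exist, but all are workarounds for the same fixed window.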