what's a "context window" and why do you keep saying it when talking about vibe coding and LLMs?
what is a context window? a context window in a Large Language Model (LLM) is the amount of text, measured in tokens, that the model can process at one time to understand input and generate output. think of it as the model's short-term memory.
we measure and display this in #shakespeare with a percentage indicator.
the percentage indicates how much of your context window is currently in use. as the window fills up, LLMs can sometimes start making unintended changes, and costs grow too, since more text is sent as input and processed with every request.
clearing your context window can sometimes fix these issues and save on costs.
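to make the percentage idea concrete, here's a minimal sketch of how a context-usage indicator like the one in #shakespeare could work. the 4-characters-per-token heuristic and the 200,000-token window size are illustrative assumptions, not the app's actual values.

```python
def estimate_tokens(text: str) -> int:
    # rough heuristic: ~4 characters per token for English text
    # (real apps use the model's actual tokenizer; this is an assumption)
    return max(1, len(text) // 4)

def context_usage_percent(conversation: list[str], window_tokens: int = 200_000) -> float:
    # every prior message stays in the window, so usage grows with history
    used = sum(estimate_tokens(msg) for msg in conversation)
    return min(100.0, 100.0 * used / window_tokens)

history = ["fix the login bug", "here is the diff " * 50]
print(f"{context_usage_percent(history):.1f}% of context used")
```

this also shows why costs climb as a conversation gets longer: the whole history is re-sent as input each turn, so the token count (and the bill) keeps growing until you clear the window.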