Discussion
atonse: CC seems to have gotten pretty good with auto compacting and continuing where it left off. Are there any good use cases for this? I guess it would be to avoid tool use?
satring: The big win is multi-file refactors where the model needs to hold the full dependency graph in context simultaneously. With 200k, compaction kicks in mid-task and the model loses track of which files it already changed, leading to inconsistencies or repeated work. With 1M you can load an entire module (tests, types, implementation) and get a coherent refactor in one pass. Also useful for long debugging sessions where the conversation history itself is the context. Compaction preserves a summary, but summaries lose the exact error messages, stack traces, and failed approaches that help the model avoid going in circles.
atonse: But interestingly, every now and then I look at the compaction result and it now says that if you need to reference the previous conversation you can open <file>. So technically that context is still reachable. I’ve noticed MCPs get unstable after compaction, but even that’s been less so lately.
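The behavior described above (replacing older messages with a summary plus a pointer to the full transcript on disk, so exact details stay recoverable) can be sketched roughly like this. Everything here is illustrative: `compact`, `TOKEN_BUDGET`, the file layout, and the token estimate are assumptions, not Claude Code's actual implementation.

```python
import json
import os
import tempfile

TOKEN_BUDGET = 50  # toy budget; real context budgets are ~200k tokens
KEEP_RECENT = 2    # most recent messages kept verbatim

def tokens(msg):
    # crude token estimate: whitespace-split word count
    return len(msg["content"].split())

def compact(history, transcript_dir):
    """Return a compacted history if over budget, else unchanged."""
    if sum(tokens(m) for m in history) <= TOKEN_BUDGET:
        return history
    # persist the full transcript so exact error messages and stack
    # traces can be re-read from disk after compaction
    path = os.path.join(transcript_dir, "transcript.json")
    with open(path, "w") as f:
        json.dump(history, f)
    old, recent = history[:-KEEP_RECENT], history[-KEEP_RECENT:]
    summary = {
        "role": "system",
        "content": (f"[Compacted {len(old)} earlier messages. "
                    f"Full transcript: {path}]"),
    }
    return [summary] + recent

history = [
    {"role": "user", "content": "word " * 30},
    {"role": "assistant", "content": "word " * 30},
    {"role": "user", "content": "word " * 30},
    {"role": "user", "content": "latest question"},
]

with tempfile.TemporaryDirectory() as d:
    compacted = compact(history, d)
    print(len(compacted))  # 4 messages collapse to summary + 2 recent
```

The key design point is that the summary message carries a file path, so the compacted context is a pointer rather than a lossy dead end; the model can reopen the transcript when a summary isn't enough.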