I spend my time trying to tune the voice+webapp experience: i.e. how well it can explain things, whether it can properly surface thinking tokens from Claude's tools, etc. The sweat, blood, and voice go into the `/create_research -> /create_plan` loop before `/implement_plan`. Sometimes I also copy the research and paste it into ChatGPT for review or comments.
I generally use the MCP to get it to follow commands and explain things to me as I make progress through this cycle, and I often pause it and ask it to draw me a mermaid sequence diagram of the events, or a block diagram showing how the pieces fit together, something like the sketch below.
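For example, a prompt like "draw me a mermaid sequence diagram of the voice round-trip" tends to produce something along these lines (the participant names here are hypothetical, just to show the kind of output I'm asking for, not the actual architecture):

```mermaid
sequenceDiagram
    participant User as Voice/Webapp user
    participant App as Webapp
    participant Claude as Claude (via MCP)
    User->>App: speak / type request
    App->>Claude: forward prompt + context
    Claude-->>App: thinking tokens + response
    App-->>User: spoken/rendered answer
```

Having the diagram in front of me makes it much easier to spot where an explanation or a handoff step is missing before I move on to `/implement_plan`.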