One thing I like about Aider is that I can control the context by using /add explicitly on a subset of files. Can you achieve the same with OpenCode?
I'm curious: I've never touched cloud models beyond a few seconds. I run an AMD 395+ with the new Qwen Coder. Is there any intelligence difference, or is it just speed and context? At 128GB, it takes quite a while before hitting the context wall.
There's a difference in intelligence. However, for 90% of what I'm doing I don't really need it. The online models are just faster.
I just did a one-hour vibe session today, ripping out a library dependency, replacing it with another, and pushing the library to PyPI. I should take my task list, let the local model replicate the work, and see how it turns out.