What local model keeps NanoClaw useful on a 32 GB MacBook Pro?
A draft on AI and agents arguing that a small local coder model served through a single Ollama runtime is the most honest starting point for NanoClaw on a 32 GB Mac when you want autonomy without starving the rest of the machine.
Mar 16, 2026