Local AI is not ready, and if you think it is, prove me wrong with a detailed guide for commodity hardware, with complete setup steps, that can run a decently sized model.
I spent 2 weeks trying to get anything running on an 8 GB RX 550 XT, 12 GB RAM, and an 8-core CPU. I even tried turboquant to lower memory utilization and still couldn't get a 3B or 4B model loaded, and anything smaller won't suit my needs (3B/4B are even pushing it).
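For reference, here's the back-of-envelope arithmetic on whether a 3B model should fit in that hardware at all. This is a rough sketch with assumed overheads, not a measurement of any particular runtime or quantization tool:

```python
# Back-of-envelope memory estimate for a locally hosted LLM.
# The 1-2 GB runtime/KV-cache overhead is an assumption, not a measured figure.

def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB (using 1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 3B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"3B @ {bits}-bit weights ~ {model_size_gb(3, bits):.1f} GB")
# 3B @ 16-bit weights ~ 6.0 GB
# 3B @ 8-bit weights  ~ 3.0 GB
# 3B @ 4-bit weights  ~ 1.5 GB
```

At 4-bit quantization the weights alone are around 1.5 GB, so even with a couple of GB of runtime and context-cache overhead, a 3B model should in principle fit in 12 GB of system RAM running CPU-only. Whether a given quantization tool actually achieves that on this specific card is a separate question.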
Spending a humongous amount of money on a machine that'll feel obsolete in 2 years? I don't know.
You're like the kid showing up to a test without a pencil.
It's ridiculous for you to suggest that an advanced AI model needs to run on your budget 7-year-old graphics card that is already out of date even for today's gaming. My parents spent $2500 on a computer in 1995, and that was a 166 MHz Pentium 1. If they spent that money today it would be $5261. Think of what you can get for that amount of money. Then you're over here saying a budget graphics card needs to somehow compete with the bleeding edge of computer innovation.
You do, in fact, need to spend money on appropriate gear if you expect to participate.