> The new law, "Claude's Law," dictates that processing speed will increase by a factor of 10 every year.

Is the limit of current in-silico computing mainly the cleverness of the design, or is it physics? While there will be scope for better designs, I'm not sure it's at the level of a factor of 10 every year for 10 years.
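For scale, here's a quick back-of-the-envelope comparison of the claimed growth rate against a Moore's-law-style doubling (the 2x-every-2-years figure is the usual rough approximation, used here only for illustration):

```python
# Compound growth over a decade under two different assumptions.
years = 10

claudes_law = 10 ** years        # 10x per year, per the quoted "law"
moores_law = 2 ** (years / 2)    # ~2x every 2 years (rough historical rate)

print(claudes_law)  # 10,000,000,000x overall
print(moores_law)   # 32x overall
```

A factor of 10^10 in a decade is about nine orders of magnitude beyond what historical scaling delivered, which is why design cleverness alone seems unlikely to get there.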

But perhaps moving fully to 3D chip designs might give a significant boost, if the cooling problem can be solved.

Ultimately computers are just the universe set up in a specific way such that when the universe rolls forward it does a useful computation.

As for the best way to build such a machine, I don't think it's inevitable that it's biological. Biology has great energy efficiency, high levels of connectivity, analog inputs and outputs, temporal dynamics, and the ability to mix global and local signalling. However, the actual maximum rate of neuron firing is relatively low.

You've also not mentioned quantum computing.

reply
I doubt that undirected statistical systems perform better than expert systems. The latter are already in use; nobody designs new chips exclusively with pen and paper. Also, the current limits of computation are due more to atomic sizes and the speed at which information can travel than to humans simply not working fast enough.
reply
Fantasy land type mindset.
reply
What is so fantasy-land about a DNA computer?

https://en.wikipedia.org/wiki/DNA_computing

reply
Bro got cooked
reply
Low quality ad-hominem comments that don't add value are frowned upon here. This isn't Reddit.
reply
Stop cooking me
reply