But most "normal" neural networks are feed-forward, so they are guaranteed to terminate in a bounded amount of time. That rules out Turing completeness right away. And even a recurrent NN run for a fixed number of steps can be "unfolded" into a feed-forward equivalent, so in that bounded setting it is not TC either.
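A toy sketch of the unfolding argument (the RNN cell and weights here are made up for illustration): running a recurrent cell for exactly three steps is the same computation as a fixed three-layer feed-forward network, which is precisely why the trick only works when the step count is bounded in advance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RNN cell: h' = tanh(W h + U x); weights are arbitrary, for illustration.
W = rng.normal(size=(4, 4))
U = rng.normal(size=(4, 3))

def rnn_step(h, x):
    return np.tanh(W @ h + U @ x)

def run_rnn(xs):
    """Recurrent form: loop over the inputs, carrying hidden state."""
    h = np.zeros(4)
    for x in xs:
        h = rnn_step(h, x)
    return h

def run_unfolded_3(x1, x2, x3):
    """The same computation "unfolded" into a fixed 3-layer feed-forward net.
    This only works because the number of steps (3) is fixed up front."""
    h0 = np.zeros(4)
    h1 = rnn_step(h0, x1)  # layer 1
    h2 = rnn_step(h1, x2)  # layer 2
    h3 = rnn_step(h2, x3)  # layer 3
    return h3

xs = [rng.normal(size=3) for _ in range(3)]
```

Both forms produce the same hidden state for the same three inputs; there is just no single feed-forward net that covers every possible sequence length at once.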

You need a memory element the network can interact with, much as an ALU by itself is not TC, but a barebones stateful CPU (ALU + registers + a program counter with conditional jumps) is.
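To make the ALU analogy concrete, here is a minimal sketch (the instruction format is invented for illustration): the ALU alone is a pure function that always terminates, but wrap it with a register, a program counter, and a conditional jump and the same ALU can drive loops of unbounded length.

```python
# A pure "ALU": a fixed function of its inputs -- always terminates,
# so any circuit built only from such pieces terminates too.
def alu(op, a, b):
    return {"add": a + b, "sub": a - b}[op]

# Add state (a register r, a program counter pc) and a conditional jump,
# and the loop below can run for an input-dependent, unbounded number of
# steps. Hypothetical instruction format: (op, operand, jump_target),
# where jump_target is taken whenever the register is nonzero.
def run(program, r):
    pc = 0
    while 0 <= pc < len(program):
        op, operand, jump_target = program[pc]
        r = alu(op, r, operand)
        pc = jump_target if (jump_target is not None and r != 0) else pc + 1
    return r

# Count down to zero: subtract 1, jump back to instruction 0 while nonzero.
countdown = [("sub", 1, 0)]
```

`run(countdown, 5)` loops five times before halting, `run(countdown, 1000)` loops a thousand times: the running time now depends on the data, which is exactly what a bounded feed-forward circuit can never do.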
