Single Bit Neural Nets Did Not Work - https://fpga.mit.edu/videos/2023/team04/report.pdf
> We originally planned to make and train a neural network with single bit activations, weights, and gradients, but unfortunately the neural network did not train very well. We were left with a peculiar looking CPU that we tried adapting to mine bitcoin and run Brainfuck.
Straightforward quantization, just to one bit instead of 8, 16, or 32. Training a one-bit neural network from scratch is apparently an unsolved problem, though.
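A minimal sketch of why this is hard (my own illustration, not from the linked report): binarizing by sign works fine for the forward pass, but the derivative of sign() is zero almost everywhere, so exact gradients vanish and backprop stalls. The usual workaround (BinaryConnect/BNN-style) keeps full-precision latent weights and uses a straight-through estimator for the gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(x):
    # One-bit quantization: map each value to -1 or +1 by sign.
    return np.where(x >= 0, 1.0, -1.0)

# Toy forward pass through one binary linear layer.
w = rng.normal(size=(4, 3))        # latent full-precision weights
x = binarize(rng.normal(size=4))   # binary inputs
y = binarize(x @ binarize(w))      # binary activations out

# The catch for training: d/dx sign(x) = 0 almost everywhere, so the
# exact gradient through binarize() is zero. The straight-through
# estimator pretends d sign/dx = 1 on [-1, 1] and 0 outside it:
def ste_grad(x):
    return (np.abs(x) <= 1).astype(float)
```

Note this only makes weights and activations one-bit; the gradients and latent weights stay full precision, which is exactly the part the team was trying to eliminate.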
> The trees that correspond to the neural networks are huge.
Yes, if the task is inherently 'fuzzy'. Many neural networks are effectively large decision trees in disguise, and those are the ones with potential for this kind of approach.
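The "decision tree in disguise" claim can be made concrete for ReLU nets (a standard observation, sketched here with a made-up toy net): each hidden unit asks "which side of my hyperplane is the input on?", so the on/off activation pattern acts like a path through a binary tree, and all inputs sharing a pattern get the same linear map (the "leaf"):

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 3)), rng.normal(size=3)  # toy hidden layer

def relu_region(x):
    # The binary on/off pattern of the hidden ReLUs identifies which
    # linear region ("leaf") the input x falls into.
    pre = x @ W1 + b1
    return tuple((pre > 0).astype(int))

region = relu_region(np.array([0.1, 0.2]))
```

With n hidden units there are up to 2^n such regions, which is why the equivalent trees are huge.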
I don't think it's correct to call it unsolved. The established methods are much less efficient than those for "regular" neural nets but they do exist.
Also note that the usual approach when going binary is to make the units stochastic. https://en.wikipedia.org/wiki/Boltzmann_machine#Deep_Boltzma...
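The stochastic-unit idea is simple to sketch (my illustration, not code from the linked article): instead of thresholding deterministically, a unit fires 1 with probability sigmoid(pre-activation), so the binary output is random but correct in expectation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_binary_unit(z, rng):
    # Boltzmann-machine-style unit: "on" with probability sigmoid(z).
    return (rng.random(z.shape) < sigmoid(z)).astype(float)

z = np.array([-4.0, 0.0, 4.0])
samples = np.stack([stochastic_binary_unit(z, rng) for _ in range(10000)])
rates = samples.mean(axis=0)  # empirical firing rates approach sigmoid(z)
```

Averaging over samples recovers the smooth activation, which is what makes gradient-style learning possible despite the one-bit outputs.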
It was until recently, but there is a new method that trains them directly, without any floating-point math, using "Boolean variation" instead of Newton/Leibniz differentiation:
https://proceedings.neurips.cc/paper_files/paper/2024/hash/7...