I don't understand why you would train a NN for an operation like sqrt that the GPU supports in silicon.
I see it as a practical joke or a fun hack, like CPUs implemented in the Game of Life, or in Minecraft.