They used a custom neural network built around convolutional autoencoders, trained on data from previous experiments.
https://arxiv.org/html/2411.19506v1
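For anyone who hasn't run into the architecture: a convolutional autoencoder just compresses the input through strided convolutions and then expands the code back out. The paper doesn't spell out its layer sizes in the thread here, so this is only a toy 1-D forward pass in NumPy with made-up kernel sizes, to show the shape of the thing being discussed, not their actual model:

```python
# Toy sketch of a 1-D convolutional autoencoder forward pass.
# Kernel sizes, stride, and the tanh nonlinearity are all assumptions
# for illustration -- not taken from the linked paper.
import numpy as np

def conv1d(x, w, stride=1):
    """Valid 1-D convolution (cross-correlation) of signal x with kernel w."""
    k = len(w)
    out_len = (len(x) - k) // stride + 1
    return np.array([np.dot(x[i * stride:i * stride + k], w)
                     for i in range(out_len)])

def upsample(x, factor=2):
    """Nearest-neighbour upsampling, the simplest decoder-side expansion."""
    return np.repeat(x, factor)

rng = np.random.default_rng(0)
signal = rng.normal(size=64)   # stand-in for one experimental trace
enc_w = rng.normal(size=4)     # encoder kernel (random, untrained)
dec_w = rng.normal(size=4)     # decoder kernel (random, untrained)

# Encoder: strided conv shrinks 64 samples to a 31-sample latent code.
latent = np.tanh(conv1d(signal, enc_w, stride=2))
# Decoder: upsample the code and convolve back toward the input length.
recon = conv1d(upsample(latent, 2), dec_w)

print(latent.shape, recon.shape)
```

Training would then minimize reconstruction error against the experiment traces; the compressed latent is what makes it useful for anomaly detection or denoising on constrained hardware like an FPGA.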
Why is it so hard to spell out which AI algorithm or technique they integrated? That would have made the article much better.
It seems like most of the implementation is on an FPGA, which I wouldn’t call “physically burned into silicon.” That’s quite a stretch of language.
Because if it’s not an LLM, it’s not good for the current hype cycle. Calling everything AI makes the line go up.
Thanks for tracking this down. I too am annoyed when so-called technical articles omit the actual techniques.
Because it does not align with LLM Uber Alles.
I'm half expecting to see "AI model" appearing as stand-in for "linear regression" at this point in the cycle.