Hacker News

menaerus · yesterday at 8:45 AM

Not sure what you're basing that on, but I think what you said is incorrect. You can use a high-density, HBM-enabled FPGA with (LP)DDR5 and a sufficient number of logic elements to implement the inference. The reason we don't see it in practice is most likely that such FPGAs are insanely expensive and nowhere near as available off-the-shelf as GPUs are.
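For context on why memory type matters so much here: LLM decode is memory-bandwidth-bound, so the ceiling on tokens/s is roughly bandwidth divided by the bytes of weights read per token, regardless of whether the compute is an FPGA or a GPU. A rough sketch (the 460 GB/s HBM2e figure and the 7B-parameter int8 model are illustrative assumptions, not from the thread):

```python
def max_tokens_per_sec(mem_bw_gb_s: float, weight_gb: float) -> float:
    """Roofline estimate for single-stream decode: every generated
    token streams all weights through memory once, so throughput is
    capped by bandwidth / model size."""
    return mem_bw_gb_s / weight_gb

# Assumed numbers: one HBM2e stack at ~460 GB/s, 7B params in int8 (~7 GB).
print(max_tokens_per_sec(460, 7))  # ~65 tokens/s upper bound
```

By the same arithmetic, (LP)DDR5 at tens of GB/s caps out an order of magnitude lower, which is why the HBM-enabled parts are the interesting ones.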


Replies

wmf · yesterday at 6:50 PM

Yeah, FPGA+HBM works, but it has no advantage over GPU+HBM. If you want to store weights in FPGA LUTs/SRAM for insane speed, you're going to need a lot of FPGAs, because each one has very little on-chip capacity.
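A quick count makes the capacity problem concrete. The ~55 MB of usable on-chip SRAM per device is an assumption for a large FPGA class (on the order of what the biggest BRAM/UltraRAM-heavy parts offer), and the 7B int8 model is illustrative:

```python
def fpgas_needed(weight_bytes: int, sram_bytes_per_fpga: int) -> int:
    """Ceiling division: how many devices to hold all weights
    entirely in on-chip SRAM."""
    return -(-weight_bytes // sram_bytes_per_fpga)

weights = 7_000_000_000        # 7B params at int8 (assumption)
sram = 55 * 1024 * 1024        # ~55 MB on-chip per large FPGA (assumption)
print(fpgas_needed(weights, sram))  # 122 devices
```

Even with generous assumptions, an all-SRAM deployment of a modest model runs to a hundred-plus of the most expensive FPGAs on the market, which is the "you're going to need a lot of FPGAs" point in numbers.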