Hacker News

hedora · yesterday at 8:12 PM

So, they're selling this as an AI accelerator, with drop-in compatibility with existing boards, and no boost to RAM bandwidth.

As I understand things, it would be extremely unusual to ship a chip that was bound by floating-point throughput rather than by uncached memory access, especially in the desktop/laptop space.

I haven't been following the Intel server space too carefully, so this is an honest question: was the old part compute-limited rather than bandwidth-limited, or is this going to run inference at the same throughput (though maybe with lower power consumption)?
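The compute-bound vs bandwidth-bound question can be sanity-checked with a roofline back-of-envelope. All numbers below are assumed, illustrative figures, not specs for any actual Intel part:

```python
# Roofline back-of-envelope: is batch-1 inference compute- or bandwidth-bound?
# All figures are hypothetical placeholders, not real chip specs.
peak_flops = 4e12   # assumed 4 TFLOP/s peak compute for a server CPU
mem_bw = 300e9      # assumed 300 GB/s memory bandwidth

# Arithmetic intensity (FLOPs per byte moved) where the roofline crosses over:
# below this, the chip is bandwidth-bound; above it, compute-bound.
crossover = peak_flops / mem_bw  # ~13.3 FLOPs/byte

# Batch-1 transformer inference streams every weight once per token.
# A matrix-vector product does ~2 FLOPs (multiply + add) per weight,
# so at 2 bytes/weight (FP16) the arithmetic intensity is ~1 FLOP/byte.
matvec_intensity = 2 / 2  # 1.0 FLOP/byte

print(f"crossover: {crossover:.1f} FLOPs/byte, matvec: {matvec_intensity:.1f}")
```

With these assumed numbers the matvec intensity sits far below the crossover point, which is why single-stream inference tends to be limited by memory bandwidth, not FLOPs; larger batch sizes raise the intensity and can shift it back toward compute-bound.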


Replies

Tepix · yesterday at 8:18 PM

No, they're not selling this as an "AI accelerator". Here is the quote:

"The company says operators deploying 5G Advanced and future 6G networks increasingly rely on server CPUs for virtualized RAN and edge AI inference, as they do not want to re-architect their data centers in a bid to accommodate AI accelerators."

Edge AI usually means very small models that run fine on CPUs.
