Hacker News

AlexCoventry | 04/13/2026 | 0 replies

I have a transformer attention mechanism which seems to be more data-efficient than the usual dot product, and I'm trying to write a performant backward kernel for it.
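The poster doesn't describe their mechanism, so as a stand-in, here is a minimal sketch of what "writing a backward kernel" means for standard scaled-dot-product attention: deriving the gradients of the output with respect to Q, K, and V by hand instead of relying on autograd. The function names and shapes are illustrative assumptions, not the poster's code; a custom score function would change `attn_forward` and the corresponding `ds` term.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attn_forward(q, k, v):
    # Standard scaled dot-product attention (stand-in for the
    # poster's unspecified mechanism). Returns the output and the
    # attention weights, which the backward pass reuses.
    scale = 1.0 / np.sqrt(q.shape[-1])
    s = q @ k.T * scale          # scores
    a = softmax(s)               # attention weights
    return a @ v, a

def attn_backward(q, k, v, a, d_out):
    # Hand-derived gradients for out = softmax(Q K^T / sqrt(d)) V.
    scale = 1.0 / np.sqrt(q.shape[-1])
    dv = a.T @ d_out                                    # grad wrt V
    da = d_out @ v.T                                    # grad wrt weights
    # Softmax backward: dS = A * (dA - sum(dA * A, axis=-1))
    ds = a * (da - (da * a).sum(axis=-1, keepdims=True))
    dq = ds @ k * scale                                 # grad wrt Q
    dk = ds.T @ q * scale                               # grad wrt K
    return dq, dk, dv
```

A quick way to validate such a kernel before optimizing it is a finite-difference check: perturb one input entry, recompute the forward pass, and compare the numerical slope against the analytic gradient.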