Hacker News

exabrial · yesterday at 9:57 PM

This is where I see the economy of AI going:

* Inference becomes cheap
  - Specialty accelerators hit the market and a race to the bottom begins
* Training remains expensive
  - This works out for Anthropic/OpenAI, who go into the business of training
* Models become rental units or purchasable assets that you run on inference hardware
  - Rent or own the inference hardware
* Or you pay someone to do all of the above for you, at a premium


Replies

kcb · yesterday at 10:17 PM

There's no magic bullet for inference on cheap accelerators. Any accelerator will still require large amounts of high-bandwidth memory.
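The bandwidth point can be made concrete with a rough back-of-envelope calculation: during autoregressive decoding, each generated token requires streaming essentially all model weights from memory, so single-stream throughput is bounded by memory bandwidth divided by model size. The model size and bandwidth figures below are illustrative assumptions, not measurements of any specific chip.

```python
# Rough bound on decode throughput for memory-bandwidth-bound inference.
# Assumption: each token reads every weight once, so
#   tokens/sec <= memory bandwidth / model size in bytes.

def max_tokens_per_sec(params_billion: float, bytes_per_param: float,
                       bandwidth_gb_s: float) -> float:
    """Upper bound on single-stream decode speed, in tokens/second."""
    model_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Hypothetical 70B-parameter model quantized to 8 bits (1 byte/param).
ddr5 = max_tokens_per_sec(70, 1.0, bandwidth_gb_s=80)    # commodity DDR5-class
hbm = max_tokens_per_sec(70, 1.0, bandwidth_gb_s=3000)   # HBM-class accelerator

print(f"DDR5-class: ~{ddr5:.1f} tok/s, HBM-class: ~{hbm:.1f} tok/s")
```

Under these assumptions a commodity-memory system tops out near one token per second while an HBM-class part manages tens, which is why a cheap accelerator without expensive memory doesn't change the picture much.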
