Hacker News

drob518 · today at 2:19 AM

I’m really curious how this scales up. Bonsai delivers an 8B model in 1.15 GB. How large would a 27B or 35B model be, and would it still retain the accuracy of those larger models? If the scaling holds, we could see 100B+ models running in 64 GB of RAM.
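The arithmetic behind that speculation can be sketched out. A minimal back-of-the-envelope calculation, assuming the per-parameter footprint stays constant at larger scales and ignoring embedding tables and runtime overhead (both assumptions, not claims from the Bonsai release):

```python
# Back-of-the-envelope scaling: an 8B-parameter model in 1.15 GB works out
# to roughly 1.15 bits per parameter (close to ternary ~1.58-bit weights).
# Assumption: footprint per parameter is constant across model sizes.

GB = 1e9  # decimal gigabytes, as used in the comment

def projected_size_gb(params: float, bits_per_param: float) -> float:
    """Projected model size in GB for a given parameter count."""
    return params * bits_per_param / 8 / GB

bits_per_param = 1.15 * GB * 8 / 8e9  # = 1.15 bits/param

for n in (27e9, 35e9, 100e9):
    print(f"{n/1e9:.0f}B -> {projected_size_gb(n, bits_per_param):.1f} GB")
```

Under those assumptions a 27B model would land around 3.9 GB and a 100B model around 14 GB, well inside 64 GB of RAM even with generous overhead.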


Replies

cubefox · today at 3:21 AM

It also depends on how expensive these models are to train. Training is probably at least as expensive as for full-precision models; otherwise they would have mentioned it.
