Hacker News

Show HN: 1-Bit Bonsai, the First Commercially Viable 1-Bit LLMs

388 points by PrismML, yesterday at 9:01 PM | 146 comments

Comments

hatthewyesterday at 11:52 PM

I feel like it's a little disingenuous to compare against full-precision models. Anyone concerned about model size and memory usage is surely already using at least an 8-bit quantization.
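A quick back-of-envelope calculation illustrates why the 8-bit baseline matters: the memory savings from 1-bit weights look far less dramatic when compared against an already-quantized model rather than full precision. This is a minimal sketch assuming a hypothetical 7B-parameter model and counting weight storage only (it ignores activations, KV cache, and any packing overhead):

```python
# Approximate weight-only memory footprint of a hypothetical
# 7B-parameter model at several bit widths.
PARAMS = 7e9

def weight_gib(bits_per_param: float) -> float:
    """Bytes of weight storage, converted to GiB."""
    return PARAMS * bits_per_param / 8 / 2**30

for bits in (32, 16, 8, 4, 1.58):
    print(f"{bits:>5} bits/param: {weight_gib(bits):6.2f} GiB")
```

Against FP32 (~26 GiB) a 1.58-bit model looks like a ~16x win, but against an 8-bit quantization (~6.5 GiB) it is closer to 4x, which is the comparison the comment argues the authors should be making.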

Their main contribution seems to be hyperparameter tuning, and they don't compare against other quantization techniques of any sort.