When AI starts accidentally training itself on AI-generated content, we all lose...
Meanwhile on the HN main page right now: "Embarrassingly Simple Self-Distillation Improves Code Generation" https://news.ycombinator.com/item?id=47637757
Don't we already have "RLHF on synthetic data"?