Hacker News

yorwba · today at 6:33 AM

What is classic about "skip updating parameters with high gradient/loss variance in multiple batches/samples"? Do you have a particular algorithm in mind that uses this heuristic?