Hacker News

qnleigh · yesterday at 11:45 PM

> That will help save enormous amounts of power: up to 48 percent on a single charge,

Why does refresh rate have such a large impact on power consumption? I understand that the control electronics are 60x more active at 60 Hz than 1 Hz, but shouldn't the light emission itself be the dominant source of power consumption by far?


Replies

alok-g · today at 3:25 AM

I used to be a display architect about 15 years back (for Qualcomm mirasol, et al), so my knowledge of the specifics / numbers is outdated. Sharing what I know.

High-pixel-density displays have disproportionately higher refresh power: it grows faster than the total number of pixels, because the column-line capacitances need to be driven again for every row of pixels written. This was an important concern as high pixel densities were coming along.
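That scaling can be sketched numerically. The capacitance value and drive voltage below are illustrative assumptions (not from the comment); the point is only that a column line spans every row, so its capacitance grows with row count, and each refresh rewrites every row:

```python
# Back-of-envelope sketch of panel drive power. cap_per_row_crossing_f
# (~50 fF per pixel crossing) and swing_v are assumed, illustrative values.
def drive_power_watts(refresh_hz, rows, cols,
                      cap_per_row_crossing_f=50e-15, swing_v=5.0):
    col_line_cap = rows * cap_per_row_crossing_f           # line spans all rows
    energy_per_row_write = cols * col_line_cap * swing_v ** 2  # ~C*V^2 per swing
    return refresh_hz * rows * energy_per_row_write        # every row, every refresh

lo = drive_power_watts(60, 1080, 1920)
hi = drive_power_watts(60, 2160, 3840)   # 2x linear density = 4x the pixels
print(hi / lo)  # → 8.0: drive power grows ~8x for 4x the pixels
```

Doubling linear density quadruples the pixel count but octuples the drive power in this model — the "disproportionate" part.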

Displays need fast refreshing not just because pixels would lose charge, but because a slow refresh can be visible or result in flicker. Some pixel technologies require flipping polarity on each refresh, but the drive curves are not exactly symmetric between polarities, and further, this mismatch can vary across the panel. A fast enough refresh hides it.

blovescoffee · yesterday at 11:58 PM

There are definitely a few reasons, but one of them is that you ask the GPU to do ~60x less work when you render 60x fewer frames.

Veedrac · today at 2:12 AM

I think the idea is that in an always-on display mode, most of the screen is black and the rest is dim, so the drive circuitry becomes a much larger fraction of the total power budget.

veqq · today at 6:37 PM

Really disappointing to only learn this after a decade, but on Linux, switching from 60 Hz to 40 Hz decreased my power draw by 40% in the hour since reading this comment.
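A measurement like this can be reproduced by reading the kernel's battery sysfs interface. The sketch below assumes a battery named `BAT0`; which files exist varies by machine (some kernels expose `current_now`/`voltage_now` instead of `power_now`):

```python
from pathlib import Path

def battery_power_watts(bat="BAT0", root="/sys/class/power_supply"):
    """Instantaneous battery discharge power in watts, read from sysfs.

    Sketch only: the battery name (BAT0/BAT1) and the available files
    differ across machines and kernel versions."""
    base = Path(root) / bat
    power = base / "power_now"
    if power.exists():
        return int(power.read_text()) / 1e6          # microwatts -> watts
    ua = int((base / "current_now").read_text())     # microamps
    uv = int((base / "voltage_now").read_text())     # microvolts
    return ua * uv / 1e12
```

Sampling this before and after changing the refresh rate (e.g. via `xrandr`) gives a rough before/after comparison like the one above.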

perching_aix · yesterday at 11:52 PM

I interpreted that bit as E2E system uptime being up by 48%. Sounds more plausible to me, as there'd be fewer video frames to produce and push out.

elif · today at 3:56 PM

Your GPU rendering 1 frame vs your GPU rendering 60 frames.

hedora · today at 12:28 AM

This is an OLED display, so I don't think the control electronics are actually any less active. (They would be for LCD, which is where most of these low-refresh-rate optimizations make sense.)

The connection between the GPU and the display has been run-length encoded (or better) for ages, since that reduces the amount of energy used to send the next frame to the display controller. Maybe by "1 Hz" they mean they also only send diffs between frames? That'd be a bigger win than "1 Hz" for most use cases.
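The "only send diffs" idea can be shown with a toy example. Real embedded DisplayPort links do something along these lines with Panel Self Refresh and partial updates; this just illustrates the principle on made-up frames:

```python
# Toy diff: only rows that changed between frames get "sent" to the panel.
def changed_rows(prev, curr):
    """Return (row_index, row) pairs that differ between two frames,
    where each frame is a list of equal-length pixel rows."""
    return [(i, row) for i, (old, row) in enumerate(zip(prev, curr))
            if old != row]

# A mostly-static frame: only one of three rows changed.
prev = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
curr = [[0, 0, 0], [9, 1, 1], [2, 2, 2]]
print(changed_rows(prev, curr))  # → [(1, [9, 1, 1])]
```

For an idle desktop, almost every frame diffs to nothing, which is where the transmission savings would come from.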

But, to answer your question, the light emission and computation of the frames (which can be skipped for idle screen regions, regardless of frame rate) should dwarf the transmission cost of sending the frame from the GPU to the panel.

The more I think about this, the less sense it makes. (The next step in my analysis would involve computing the wattage requirements of the CPU, GPU, and light emission, then comparing that to the Wh capacity of the laptop battery and the advertised battery life.)
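That sanity check can be sketched with made-up numbers. The 70 Wh battery and 15 h rated life below are assumptions, not from the article, and "up to 48 percent" is read as extra runtime (perching_aix's interpretation):

```python
# Back-of-envelope: what average power saving would a 48% runtime gain imply?
battery_wh = 70.0                    # assumed battery capacity
hours_60hz = 15.0                    # assumed rated battery life at 60 Hz
hours_1hz = hours_60hz * 1.48        # "up to 48 percent" as extra runtime

avg_power_60hz = battery_wh / hours_60hz   # ~4.67 W average draw
avg_power_1hz = battery_wh / hours_1hz     # ~3.15 W
print(f"refresh-related saving: {avg_power_60hz - avg_power_1hz:.2f} W")
# → refresh-related saving: 1.51 W
```

So the claim implies roughly 1.5 W attributable to refresh under these assumptions — which is the number to compare against CPU, GPU, and emission power.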


dealfinder994 · today at 7:51 AM

Great discussion! This reminds me of similar challenges in AI development.

jdub · today at 12:41 AM

Before OLED (and similar), most displays were lit with LEDs (behind or around the screen, through a diffuser, then through liquid crystals), and that backlight was indeed the dominant power draw... like 90% or so!

But the article is about an OLED display, so the pixels themselves are emitting light.

mmcnl · today at 2:55 PM

It doesn't. They take extreme use cases, such as watching video at maximum brightness until the battery depletes, where 90% of power consumption is the display. But in realistic use cases, where the CPU is actually doing things, the fraction of power drawn by the display is much smaller.
