Hacker News

zhivota today at 3:44 AM

Typical hybrid inverters have an output rating around half the theoretical maximum input of the panels. This is because the panels' theoretical maximum output is very rare or even impossible under normal earth conditions, an attached battery soaks up part of the input, and solar equipment involves a general cost-benefit trade-off (more throughput means more heat, which means bigger heatsinks, which means heavier and more expensive hardware).

You can definitely get equipment that can do symmetrical input/output, but if you actually model the supply and demand curves of the system, it's usually not worth the extra up-front expense: peak input occupies only a small portion of the day, and that extra hardware would mostly sit idle.

For that matter, people often design systems where peak input exceeds what the inverter can accept and the excess power is simply clipped, because it's more valuable to have steady output over a long period than to maximize the daily peak.
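You can get a feel for why undersizing the inverter loses so little energy with a back-of-the-envelope calculation. Here's a rough sketch (not from the thread; the array size, inverter rating, and idealized half-sine production curve are all made-up assumptions) comparing the energy a capped inverter captures against the array's uncapped potential:

```python
import math

# Hypothetical numbers for illustration only.
ARRAY_KW = 10.0      # DC array nameplate
INVERTER_KW = 7.5    # AC inverter rating (0.75 DC/AC ratio)

def dc_output_kw(hour):
    """Idealized clear-sky production: a half-sine between 06:00 and 18:00."""
    if 6 <= hour <= 18:
        return ARRAY_KW * math.sin(math.pi * (hour - 6) / 12)
    return 0.0

step = 0.1  # integration step in hours
hours = [h * step for h in range(int(24 / step))]

# Energy the array could produce vs. what a capped inverter passes through.
potential = sum(dc_output_kw(h) * step for h in hours)
captured = sum(min(dc_output_kw(h), INVERTER_KW) * step for h in hours)

print(f"potential: {potential:.1f} kWh/day")
print(f"captured:  {captured:.1f} kWh/day")
print(f"clipped:   {100 * (1 - captured / potential):.1f}%")
```

Under these assumptions, capping at 75% of the array's nameplate clips only on the order of 10% of the daily energy, since the curve spends little time near its peak. Real production curves (clouds, panel temperature, orientation) flatten the peak further, which is why installers accept clipping.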


Replies

minitoar today at 3:58 AM

Yes, my grid-tied system is like this. The panels are ~410W and each one has a microinverter with a ~390W maximum or something. The more expensive inverters weren't worth it to capture the peak. You're better off putting that money into more panels.

margalabargala today at 3:50 AM

In the US, most home solar installations do not have an in-home battery. It is not uncommon for rooftop solar to produce >90% of nominal max for hours at a time.

I know multiple people with solar and have discussed their specs with them extensively. Zero of them have inverters or microinverters sized below the theoretical max of their array.

Are you thinking of a purely off-grid setup without actually saying so?
