Hi all,
I would like to start this topic to collect efficiency data. If you have efficiency measurements for MPPT chargers, I encourage you to post them here. The measurement methods used are also of interest.
What I did was the following:
- I use a lab power supply and set its current limit such that the charger's output current equals the current at which I want to determine the efficiency.
- I set the voltage to 17 V because that approximates the maximum power point (MPP) voltage of a 12 V panel.
- I make sure the battery is sufficiently discharged so that charging stays in constant-current (CC) mode.
- I use two power meters, one in front of the DUT and one behind it. I calibrated them against each other by measuring the energy of the same load over a fixed time at the current of interest.
- I measure the energy over a period long enough to give sufficient accuracy and divide the outgoing energy (Wh) by the incoming energy (Wh). This ratio is the end-to-end conversion efficiency (see the small sketch after this list).
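To make the last step concrete, here is a minimal sketch of that calculation. The energy readings are made-up example values (chosen only to mirror the ~94.8% result below), not actual meter data:

```c
#include <stdio.h>

int main(void)
{
    /* Energy readings over the same interval, from the meter in front of
       the DUT and the meter behind it (hypothetical example values). */
    double e_in_wh  = 10.00;   /* incoming energy in Wh */
    double e_out_wh =  9.48;   /* outgoing energy in Wh */

    /* End-to-end conversion efficiency is simply the ratio. */
    double efficiency = e_out_wh / e_in_wh;
    printf("efficiency = %.3f (%.1f %%)\n", efficiency, 100.0 * efficiency);
    return 0;
}
```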
Today I tried this setup. The DUT was the Genasun GV-10. The result at an output current of 5 A was 0.948, i.e. 94.8%. Not quite the 96-98% they claim.
I posted some efficiency measurements in this issue: https://github.com/LibreSolar/MPPT-Charger_20A/issues/23
I’m in a hurry, but can provide measurement details in the coming days.
Looks very good. At lower currents I suspect the converter goes into DCM (discontinuous conduction mode). Avoiding that would require a larger inductor, which would increase the DCR. So I think it is an excellent compromise. The STM32 runs at 48 MHz, so with 8-bit PWM resolution you could go up to about 180 kHz, correct?
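For reference, a minimal sketch of that resolution/frequency trade-off, assuming edge-aligned PWM and that the timer runs at the full 48 MHz clock (both assumptions on my part):

```c
#include <stdio.h>

int main(void)
{
    const double f_clk = 48e6;   /* assumed timer clock: 48 MHz */
    const int    bits  = 8;      /* desired PWM resolution in bits */

    /* Edge-aligned PWM: one period takes 2^bits timer ticks, so the
       maximum switching frequency at this resolution is: */
    double f_pwm = f_clk / (double)(1 << bits);   /* 48 MHz / 256 = 187.5 kHz */

    printf("max PWM frequency at %d-bit resolution: %.1f kHz\n",
           bits, f_pwm / 1e3);
    return 0;
}
```

This gives 187.5 kHz, consistent with the "up to about 180 kHz" figure above.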
I made a little mistake in how I hooked up the power meters. Now I measure an efficiency of 95.2%. I made a small video on YouTube: https://www.youtube.com/watch?v=m-hJIyYFY5Q&t=2s
Thanks for the video. Very interesting that they don’t reach the claimed efficiency.
Your PWM resolution calculation is correct. But it’s not necessarily beneficial to increase the frequency too much: to keep switching losses low at higher frequencies you need faster switching edges, which creates more EMC issues.
Here is a bit more info on my efficiency measurement, as posted in the linked GitHub issue (a sketch of the resulting calculation follows the list):
- Input power: EA PSI9200-50 power supply with Kelvin (4-wire) measurement at the charge controller's solar terminal
- Output power: Li-ion battery at approx. 13.5 V; current measured with a Fluke 175 multimeter, voltage at the battery terminal of the charge controller with a Benning MM 1-3 multimeter
- Duty cycle: 50% fixed (no MPPT active)
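A minimal sketch of how these readings combine, assuming an ideal buck relation for the fixed duty cycle; the meter readings below are placeholders, not the actual measured values:

```c
#include <stdio.h>

int main(void)
{
    /* Placeholder meter readings (not the actual measurement data). */
    double v_in  = 27.0;   /* V at solar terminal (Kelvin measurement) */
    double i_in  =  2.6;   /* A input current                          */
    double v_out = 13.5;   /* V at battery terminal                    */
    double i_out =  5.0;   /* A battery current                        */

    double efficiency = (v_out * i_out) / (v_in * i_in);
    printf("efficiency = %.3f\n", efficiency);

    /* With a fixed 50 % duty cycle, an ideal buck converter satisfies
       Vout = D * Vin, so a 13.5 V battery implies roughly 27 V input. */
    double duty = 0.5;
    printf("expected input voltage: %.1f V\n", v_out / duty);
    return 0;
}
```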
Well, that is a serious power supply, around 3.5k euro. The input voltage of 27 V is too pessimistic, though: for a 34-36 cell panel, Vmp would be around 17 V (or double that if you connect two in series).
From your graph I gather that from 2-6 A the switching losses seem to cost roughly 0.5% in efficiency. I am wondering where that loss comes from. At these frequencies it is probably not core loss, given the low-permeability core material, and the MOSFET Qg is very low. In the software I see a dead time of 300 ns, which strikes me as a little high; how was it determined?
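Not an answer to how it was determined, but as a rough cross-check: assuming the dead-time generator clock equals the 48 MHz timer clock (an assumption on my part), 300 ns works out to about 14 timer ticks:

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Assumption: the dead-time generator clock (t_DTS) equals the
       48 MHz timer clock, i.e. no clock division. */
    const double f_dts     = 48e6;
    const double t_dead_ns = 300.0;            /* value seen in the software */

    double t_tick_ns = 1e9 / f_dts;            /* ~20.8 ns per tick */
    int    ticks     = (int)round(t_dead_ns / t_tick_ns);

    printf("%.0f ns ~ %d ticks -> actual dead time %.1f ns\n",
           t_dead_ns, ticks, ticks * t_tick_ns);
    return 0;
}
```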
I agree that increasing the frequency generally means more switching loss, but it also means smaller inductors and capacitors, reducing volume, mass, and cost.
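To put a rough number on the inductor-size argument, here is a minimal sketch using the standard buck ripple formula L = (Vin - Vout) * D / (f_sw * dI); all values are illustrative assumptions, not taken from the actual design:

```c
#include <stdio.h>

/* Required buck inductance for a given peak-to-peak current ripple:
   L = (Vin - Vout) * D / (f_sw * dI) */
static double buck_inductance(double v_in, double v_out, double f_sw, double di)
{
    double duty = v_out / v_in;   /* ideal buck duty cycle */
    return (v_in - v_out) * duty / (f_sw * di);
}

int main(void)
{
    /* Illustrative numbers: 27 V in, 13.5 V battery, 1 A ripple (20 % of 5 A). */
    double l_70k  = buck_inductance(27.0, 13.5, 70e3,  1.0);
    double l_180k = buck_inductance(27.0, 13.5, 180e3, 1.0);

    printf("L at  70 kHz: %.0f uH\n", l_70k  * 1e6);   /* ~96 uH */
    printf("L at 180 kHz: %.0f uH\n", l_180k * 1e6);   /* ~38 uH */
    return 0;
}
```

With these (assumed) numbers, the required inductance roughly halves when the switching frequency goes from 70 kHz to 180 kHz, which is the volume/mass/cost trade-off mentioned above.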