SOC measurement and how it works

Thank you for the invitation link. As I understand SOC estimation, the SOC is first estimated when the BMS is powered on and is then tracked by Coulomb counting (CC) of the actual current afterwards. But at what point in the code does this switch-over occur? To rate a battery system correctly and let the BMS calculate the SOC, what needs to happen after first boot? Does the battery need a full charge until the BMS terminates charging, and then a full discharge until the BMS terminates at the low-voltage cut-off? If that is the case, should these figures be 2.5 V per cell for the low-voltage disconnect and 3.65 V per cell for the high-voltage disconnect? And after one full charge and one full discharge, does the BMS then switch to Coulomb counting?

// SOC calculation based on average cell open circuit voltage

void bq769x0::resetSOC(int percent)
{
    if (percent <= 100 && percent >= 0)
    {
        coulombCounter = nominalCapacity * percent / 100.0;
    }
    else // reset based on OCV
    {
        printf("NumCells: %d, voltage: %d V\n", getNumberOfConnectedCells(), getBatteryVoltage());
        int voltage = getBatteryVoltage() / getNumberOfConnectedCells();

        coulombCounter = 0;  // initialize with totally depleted battery (0% SOC)

        for (int i = 0; i < NUM_OCV_POINTS; i++)
        {
            if (OCV[i] <= voltage) {
                if (i == 0) {
                    coulombCounter = nominalCapacity;  // 100% full
                }
                else {
                    // interpolate between OCV[i] and OCV[i-1]
                    coulombCounter = (double) nominalCapacity / (NUM_OCV_POINTS - 1.0) *
                        (NUM_OCV_POINTS - 1.0 - i + ((float)voltage - OCV[i]) / (OCV[i-1] - OCV[i]));
                }
                return;
            }
        }
    }
}
Based on this code, my assumption is that the OCV points are only used when the function is called with a value over 100 or under 0. However, in a test I just conducted, the cells reached the 2.5 V range and the SOC still read 32%. Could the cells that are ignored and reporting 0.000 V be interfering with the SOC calculation, even though bms.getcellvoltages specifies only the valid cells?

There are two typical ways for SOC calculation:

  1. Measure the open circuit voltage and calculate the SOC based on known (stored) OCV map of the used cell.
  2. Coulomb counting, starting from a known SOC.

Normally, a combination of both is used.

Especially for LiFePO4 cells, method 1 is quite difficult, as the OCV curve is very flat. So after startup, the estimation will always have a low accuracy.

After a full charge of the battery, the SOC is reset to 100% and coulomb counting starts to get a more accurate SOC.

Coulomb counting accuracy needs a correct setting of the battery capacity, of course. Unfortunately, it suffers from SOC drift, as current measurement errors accumulate. So it needs a full charge to re-calibrate every now and then.
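
As a minimal sketch of how the two methods fit together (illustrative only, with made-up names, not the actual library code): the Coulomb counter integrates the measured current, and a detected full charge resets it to the configured capacity:

// Illustrative combination of Coulomb counting and full-charge re-calibration.
// Units: ampere-seconds (As) for both the counter and the capacity.

static float coulombCounter = 0.0f;               // charge currently counted as stored
static float nominalCapacity = 100.0f * 3600.0f;  // e.g. 100 Ah pack, in As

void updateSOC(float current_A, float dt_s, bool fullChargeReached)
{
    // Coulomb counting: integrate the pack current (charging positive, discharging negative)
    coulombCounter += current_A * dt_s;

    if (fullChargeReached) {
        // a completed full charge defines 100% SOC and removes accumulated drift
        coulombCounter = nominalCapacity;
    }

    // clamp so that measurement errors cannot push the SOC outside 0..100%
    if (coulombCounter > nominalCapacity) coulombCounter = nominalCapacity;
    if (coulombCounter < 0.0f) coulombCounter = 0.0f;
}

float getSOC()
{
    return 100.0f * coulombCounter / nominalCapacity;
}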

Hi,

to clarify things a bit: the current bq769x0 library source code has no SoC calibration built in. resetSOC() is only called once at startup. From then on, the coulomb counter is used to calculate the SoC. If resetSOC() is called with an argument between 0 and 100, this value is used as the SoC without further calculation. If it's called without an argument, like in the original main.cpp, it calculates the SoC based on the given OCV points.
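
For illustration, assuming bms is an existing bq769x0 object and the declaration uses a default argument (something like int percent = -1; check the actual header), the two ways of calling it look like this:

// 1. Set the SoC directly: an argument between 0 and 100 is taken as-is
bms.resetSOC(100);   // e.g. after a detected full charge

// 2. Call it without an argument (as in the original main.cpp):
//    the SoC is then estimated from the stored OCV points and the measured voltage
bms.resetSOC();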
I tried to develop a simple calibration routine myself, but I'm not entirely happy with it yet. Stay tuned…

Regards
Frank

Hello, good day, and many thanks for this information. I have tried my best to read the code and understand in more detail how this ties in. Martin, you mentioned what happens when the battery is charged; in my case, once it is fully charged, the SOC is sometimes 103% and sometimes 120%. If I leave the system like this, with no current passing into or out of the battery, no further changes take place. However, if I discharge the battery, the SOC does decrease.

Now, in this case I don't get a reset of the SOC to 100%.
What should technically take place in the measurement, and how should this relate to the Coulomb counting thereafter?
As I gather, it should be OCV when the BMS is first powered on, then Coulomb counting after reaching either 0% or 100% SOC? But if the OCV estimate gives an SOC of 110% at the end of a charge, how should subsequent charge and discharge cycles work? The battery might keep going to 110% SOC repeatedly, which would indicate that the actual battery capacity is much larger than the capacity in mAh stated in the BMS settings…

Some clarification would be good on SOC vs. capacity, and on where the first initial charge-up should land the SOC percentage.
Is more than 100% SOC okay? Should this be the point when the BMS is powered off and then on again, so that the SOC reads 100%?

The BMS doesn't need to be powered off if the calculated SOC is >100%, as long as the voltage limits are not exceeded.

If the SOC reaches values >100% at end of charge, it should be reset to 100% and ideally, the actual capacity should be adapted so that the next time it is fully charged the SOC matches better.
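
A minimal sketch of that end-of-charge handling (illustrative only; the function name, the learnRate factor and the unit choice are made up, and a real implementation would also check that the charge started from a known SOC):

// Illustrative end-of-charge correction, units in ampere-hours.
static float coulombCounter = 0.0f;     // charge currently counted as stored (Ah)
static float nominalCapacity = 100.0f;  // configured capacity (Ah)

void onFullChargeDetected()
{
    if (coulombCounter > nominalCapacity) {
        // SOC ended above 100%: the configured capacity is too small, so
        // nudge it towards the observed value instead of jumping immediately
        const float learnRate = 0.5f;   // made-up smoothing factor
        nominalCapacity += learnRate * (coulombCounter - nominalCapacity);
    }

    // a completed full charge always defines 100% SOC
    coulombCounter = nominalCapacity;
}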

I started to work on the BMS firmware again, as I designed a new board based on the ISL94202 (you might have seen it). So I will also have a look at the SOC calculation. But the first priority is to support the core features of the ISL94202.

Thank you very much for this information. It has really helped clear up my understanding of the code and of the actual user experience too.
My actual problem stems from the fact that when a battery such as a LiFePO4 is under charge and the voltage exceeds the 100% OCV point (point 21), the SOC reads 100% straight away. From here onwards the value increases above 100%. The actual problem is then: if the full charge completes with the SOC reading something like 120%, what should be done as a protocol? Each time this occurs I am actually charging a fresh battery on which I want to perform tests, and so the capacity starts to count down from an extremely high value of 120+%… and this in turn has a negative impact on the Coulomb counting when discharging…
In this case, once the correct battery capacity is entered and the BMS is powered off and on again, the SOC resets to 100% and a discharge and charge will then function appropriately.

In this case I think it would be safe to say we should charge the battery up to 3.65 V per cell, or anything above the highest OCV value.
Then the BMS is powered off and on to reset the SOC to 100%, and an actual discharge can be performed to ensure the SOC reading on the BMS is a true reflection of the battery's real capacity.
This could be termed the primary initialization charge, and any discharge and charge thereafter will track the SOC correctly, provided the battery capacity is accurately stated in the config!

If I understood correctly, you reset the SOC during charging? That should not be done. It should be reset only during idle, so that it gets the actual OCV and not some random other voltage.

Negative, I did not reset during charging.
To put it this way: the OCV table predicts the cell is at 100% SOC when it is under charge at 3.43 V, when in fact the cell will still take in energy, reaching around 130% of the counted capacity after hitting 100% at about 3.4 V per cell.
So the theory is to charge the battery up for the first charge (even when the Ah capacity is stipulated correctly), then switch the BMS off and on to get the SOC to read 100%, and only then start cycling the battery pack; otherwise the SOC will always be offset incorrectly from that first primary kick-off charge!

If my theory makes sense, is it always best to reset the BMS after the initial charge when mating a fresh BMS to fresh cells, and then commence with cycling the battery pack as a unit?

Hello,
SOC is not a simple Coulomb count; that method is poor and ineffective in the long run.
This is because SOC varies with temperature, the level of charge and discharge current, rest time (leakage current and storage with a temperature gradient over a few weeks or months), the age and life history of the cells, the quality of the assembly, etc.
So if you want a specific SOC level, you have 2 main solutions:
→ addition of a SOC device like MAX17205 or BQ34Z100
→ design of a Kalman filter after definition of an electric model with 3 RC cells min.
The first solution is simple and sufficient for recreational use.
Regards.

Hello. Those chips look quite interesting. However, I think it should be possible to implement the same algorithms also in software on the microcontroller, as we’ve got the current and voltage measurements already.

Coulomb counting may not be the most sophisticated approach, but if properly reset after a full charge cycle it’s at least better than nothing. SOC estimation was not my primary focus so far. But I’d be happy to know more about your ideas for improvements.

Hi Martin,
before these specialized circuits were available, we had no other choice and had to write SOC algorithms ourselves. It took a long time to develop, because it required a laborious theoretical study beforehand, which then had to be refined by a long period of tests and validation.
For lithium cells, temperature is the most critical tuning variable of the algorithm for obtaining a good-quality SOC; the voltage and current measurements also need to be very precise and, above all, consistent over the entire thermal range of use.
For a large company aiming at a large production volume, spending a lot of time on this tuning makes sense because high precision and great robustness are required; for small and medium quantities and for standard cells, these specialized circuits are a real godsend. I don't know if we can present these solutions here, but when I have a moment I will not hesitate to post some applications as examples. Have a good evening.

Hi,

I am currently looking at implementing a Sigma-point (Unscented) Kalman filter (SPKF/UKF) for my master's thesis, as described in the lecture chapter “Battery Management and Control” by Dr. Gregory L. Plett: http://mocha-java.uccs.edu/ECE5720/ECE5720-Notes03.pdf

It should work for any battery chemistry including lead-acid, by defining the enhanced self-correcting (ESC) model for the cell chemistry.
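
For reference, the process (prediction) model at the core of such filters is still basically Coulomb counting; a rough sketch in code (my own illustration, not taken from the lecture notes or the firmware; the full ESC model adds further states for diffusion voltages and hysteresis):

// Simplified SOC prediction step: soc[k+1] = soc[k] - (eta * dt / Q) * i[k]
// soc: 0..1, eta: coulombic efficiency, dt: time step in s,
// capacity: cell capacity in As, current: cell current in A (discharge positive)
float predictSOC(float soc, float current_A, float dt_s, float capacity_As, float eta)
{
    return soc - (eta * dt_s / capacity_As) * current_A;
}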

You may have a look at the related github issue as well: Add SOC algorithm for lead acid and Li-ion batteries · Issue #3 · LibreSolar/charge-controller-firmware · GitHub

@HULK28 could you give me your perception of the chance of success with this method?

In the end we developed a somewhat hacky way of getting this to work. The first step was to remove some percentage of the counted coulombs during charge to account for charge inefficiency; for example, we removed around 3-5% over a full charge cycle of 100 Ah. What this does is allow a current-taper and voltage-trigger process.
By doing this we avoid needing temperature compensation or an exact charge-efficiency model. At high charge current more heat is generated and the cells become less effective at storing energy, so efficiency drops to around 95% round trip, yet at low C-rates of say 10 A on a 100 Ah cell you get closer to 98% efficiency. Modelling this properly would require a complex table in code and adds a lot of complexity to the R&D.
So by removing some Ah, we are forced to use voltage as one part of a two-part criterion to mark the battery as full.
The second flag in the function is current: if a LiFePO4 cell reaches say 3.5 V we know it is close to full, but only once the current also falls below 1 A.
So the battery can charge up to say 96% and hover at that value while the cells sit at 3.5 V and current is fed in slowly until the cell's internal resistance increases (it takes in less energy). When the current stays below 1 A for a preset, user-defined time, the SOC marker is set to 100%. This has worked for us and I would suggest making it part of the code for the new BMS.
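
A minimal sketch of that two-part end-of-charge detection, based on the description above (my own illustration; all names, thresholds and the efficiency factor are example values only):

// Illustrative full-charge detection: voltage threshold + current taper + hold time.
static float coulombCounter = 0.0f;      // counted charge (Ah)
static float nominalCapacity = 100.0f;   // configured capacity (Ah)
static float fullHoldTime_s = 0.0f;      // how long the taper condition has held

const float chargeEfficiency = 0.96f;    // remove ~4% of the charged coulombs
const float cellFullVoltage = 3.5f;      // V per cell (LiFePO4 example)
const float taperCurrent = 1.0f;         // A
const float fullHoldRequired_s = 60.0f;  // user-defined hold time

void updateCharge(float cellVoltage_V, float current_A, float dt_s)
{
    // Coulomb counting with a flat efficiency factor applied while charging
    if (current_A > 0.0f) {
        coulombCounter += chargeEfficiency * current_A * dt_s / 3600.0f;
    } else {
        coulombCounter += current_A * dt_s / 3600.0f;
    }

    // two-part full detection: cell voltage above threshold AND charge current below taper limit
    if (cellVoltage_V >= cellFullVoltage && current_A > 0.0f && current_A < taperCurrent) {
        fullHoldTime_s += dt_s;
        if (fullHoldTime_s >= fullHoldRequired_s) {
            coulombCounter = nominalCapacity;   // set the SOC marker to 100%
        }
    } else {
        fullHoldTime_s = 0.0f;
    }
}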