A fully charged battery is 12.7V:
Your alternator should put out in the region of 14.7-15.1V depending on whether it is cold or hot. The voltage may rise slightly towards the upper end under certain conditions such as load, especially at idle. The battery is almost certainly a calcium/silver-calcium type, which refers to the doping on the plates used to prevent sulphation. "Normal" lead-acid batteries have lead-antimony doping and need 13.8-14.4V to charge; the calcium raises the charging voltage by about 0.1V per cell, and with 6 cells of 2.2V each (13.2V total) you can see where the extra 0.6V of charging voltage comes from.
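A rough sketch of that arithmetic in Python, using only the nominal figures quoted above (cell count, per-cell voltage and the 0.1V/cell calcium offset), not manufacturer data:

    # Charging-voltage arithmetic from the explanation above (illustrative figures only)
    CELLS = 6                        # a 12V battery is 6 cells in series
    NOMINAL_PER_CELL = 2.2           # nominal cell voltage (V)
    ANTIMONY_CHARGE = (13.8, 14.4)   # charging range for lead-antimony plates (V)
    CALCIUM_EXTRA_PER_CELL = 0.1     # extra charging voltage per calcium cell (V)

    nominal_battery = CELLS * NOMINAL_PER_CELL          # 13.2V
    calcium_extra = CELLS * CALCIUM_EXTRA_PER_CELL      # 0.6V
    calcium_charge = tuple(v + calcium_extra for v in ANTIMONY_CHARGE)

    print(f"Nominal battery voltage: {nominal_battery:.1f}V")
    print(f"Calcium charging range: {calcium_charge[0]:.1f}-{calcium_charge[1]:.1f}V")
    # -> roughly 14.4-15.0V, in line with the alternator output mentioned above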
With so much electronics on cars these days, so many things referenced to a stabilised supply of 5V, 10V, 12V etc, and a variable battery voltage, it's easy to see that if the battery drops below a certain level of charge, strange things will happen. To put this into perspective, many sensors work on the potential divider principle, where two resistors in series divide the battery voltage. One of those resistors is the sensor, and the voltage derived across it is compared to a known voltage from one of the stabilised supplies. To make the figures easy, let's say one of the stabilised reference voltages is 6V, and the sensor has a resistance of 6k fed by a 6k current-limiting resistor, giving 12k Ohms in total. Call the sensor R1 and the limiting resistor R2: the voltage developed across R1 is VR1 = Vb x R1/(R1 + R2), so in this example VR1 = 6/12 x Vb (Vb being battery voltage), and at Vb = 12V, VR1 will be 6V.
If the threshold for a fault is 5.9V compared against the stabilised 6V supply, no fault will be shown, but if the battery voltage drops to 11.5V, VR1 drops to 5.75V and a fault is registered.
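A minimal sketch of that divider example in Python; the 6k/6k resistors, 6V reference and 5.9V fault threshold are just the illustrative figures above, not values from any real ECU:

    # Potential divider: sensor R1 fed through current-limiting resistor R2
    R1 = 6000.0            # sensor resistance (ohms)
    R2 = 6000.0            # current-limiting resistor (ohms)
    FAULT_THRESHOLD = 5.9  # below this (vs the stabilised 6V reference) a fault is flagged

    def divider_voltage(vb):
        """Voltage developed across the sensor R1 for a given battery voltage Vb."""
        return vb * R1 / (R1 + R2)

    for vb in (12.7, 12.0, 11.5):
        vr1 = divider_voltage(vb)
        print(f"Vb = {vb:4.1f}V -> VR1 = {vr1:.2f}V  fault: {vr1 < FAULT_THRESHOLD}")

    # Vb = 12.7V -> VR1 = 6.35V  fault: False
    # Vb = 12.0V -> VR1 = 6.00V  fault: False
    # Vb = 11.5V -> VR1 = 5.75V  fault: True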
Bit long-winded for a simple explanation but hopefully gives an insight into how these things work!