Hi everyone. I enjoy reading the messages on the forum; this is my first post, however.
I have some solar panels wired to a 24V system and wish to use an 18X to log the voltage produced by the panels (in sun, panel voltage can be up to 35V) and also the voltage at my lead-acid battery bank. I have the 18X connected and working with the Rev-Ed serial LCD kit and a voltage divider (as per the www.thebackshed.com datalogger). Voltage to the ADC pin on the 18X is taken from the junction of a 39k and 10k (to ground) divider. I have a 4V7 zener for protection against high voltage, and a smoothing capacitor.
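For reference, here is the nominal arithmetic for that divider. This is only a sketch assuming ideal resistor values and no loading from the zener or the ADC pin; the function and variable names are mine, not from any existing code.

```python
# Nominal behaviour of a 39k (top) / 10k (to ground) voltage divider.
# Assumes ideal resistors and ignores any loading by the 4V7 zener
# or the ADC input itself.

R_TOP = 39_000     # 39k from the panel voltage to the ADC pin
R_BOTTOM = 10_000  # 10k from the ADC pin to ground

def junction_voltage(v_in):
    """Ideal voltage at the divider junction for a given input voltage."""
    return v_in * R_BOTTOM / (R_TOP + R_BOTTOM)

# With 25.5V in, the ideal junction voltage is about 5.20V,
# and at 35V it is about 7.14V -- both at or above the 5V ADC range,
# and into the region where a 4V7 zener begins to conduct.
print(round(junction_voltage(25.5), 2))  # ~5.2
print(round(junction_voltage(35.0), 2))  # ~7.14
```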
I have seen the threads on calculating voltage from the ADC value when your input voltage is under 5V. However, if I input 25.5V, the voltage at the junction of the 39k and 10k reads 3.44V and I get an ADC reading of 700, which calculates correctly (allowing for the accuracy of my multimeter, etc.). I have noticed that if I use a variable (regulated) power supply and crank up the voltage, there does not seem to be a linear relationship between my input voltage and the ADC reading over the 20V to 36V range. Can someone suggest a method I could use to display a reasonably accurate voltage in that range on the LCD screen, whether by changing the resistor divider values or in code?
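To show the calculation I am describing: a sketch of converting a 10-bit ADC reading back to the input voltage via the divider ratio. The names, the 5V reference, and the use of a single scale factor are my assumptions for illustration, not code from the datalogger project.

```python
# Convert a 10-bit ADC reading (0-1023, e.g. from PICAXE readadc10)
# back to the voltage at the panel/battery input, using the nominal
# 39k/10k divider ratio. Names and constants are illustrative.

VREF = 5.0      # assumed ADC reference (the 5V supply)
ADC_MAX = 1023  # 10-bit full scale
DIVIDER = (39_000 + 10_000) / 10_000  # nominal 4.9:1 ratio

def adc_to_pin_volts(adc):
    """Voltage actually seen at the ADC pin."""
    return adc * VREF / ADC_MAX

def adc_to_input_volts(adc, scale=DIVIDER):
    # In practice `scale` would be a measured calibration factor
    # (a known input voltage divided by the pin voltage it produces),
    # rather than the nominal resistor ratio.
    return adc_to_pin_volts(adc) * scale

# An ADC reading of 700 corresponds to about 3.42V at the pin,
# consistent with the 3.44V measured with a multimeter.
print(round(adc_to_pin_volts(700), 2))  # ~3.42
```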
Many thanks
Mike