The STM32F1xx has an internal temperature sensor hooked up to ADC1, so it should be easy to read its value, but after many tries I am still getting wrong values. My code returns values between 204 … 207 degrees Celsius, which is surely wrong…
Could somebody tell me where the problem with this code is?
Although it is not crystal clear, the spec sheet implies the temperature coefficient of the voltage from the sensor should be positive. If so, then the sign in your calculation should be the other way round, e.g. it should be (ADC_V - Offset_25) / Volts_degree + 25. Have you seen anything that implies a negative coefficient?
I realise this would give totally screwy results with the ADC values you read, but maybe that is because the ADC is not returning correct values yet.
The reference manual RM0008 says “The temperature sensor output voltage changes linearly with temperature. The offset of this line varies from chip to chip due to process variation (up to 45 °C from one chip to another).” and gives this formula:
Temperature (in °C) = {(V25 - VSENSE) / Avg_Slope} + 25
The datasheets for the STM32F103xB (Nano) and STM32F103xC (Quad) chips both say V25 is 1.34–1.52 V and Avg_Slope is 4.0–4.6 mV/°C.
AFAICS, your formula in the code is correct. What are your ADC raw values? Mine are around 1860, which would give 7°C. My ADC reading goes down after power-on, while I expect the chip temperature to rise, so the negative coefficient seems valid.
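For what it's worth, the conversion in plain C would look something like this minimal sketch (assuming a 12-bit right-aligned result, VDDA taken as 3.3 V, and typical datasheet constants, which vary from chip to chip):

```c
#include <stdint.h>

/* RM0008 formula: T = (V25 - VSENSE) / Avg_Slope + 25
 * The constants below are typical datasheet values (V25 = 1.34..1.52 V,
 * Avg_Slope = 4.0..4.6 mV/degC) and differ per chip, so per-chip
 * calibration is needed for accuracy. VDDA = 3.3 V is an assumption. */
static float temp_from_adc(uint16_t adc_raw)
{
    const float vdda      = 3.3f;     /* assumed ADC reference voltage   */
    const float v25       = 1.43f;    /* typical sensor voltage at 25 C  */
    const float avg_slope = 0.0043f;  /* typical slope: 4.3 mV per deg C */

    float vsense = ((float)adc_raw * vdda) / 4096.0f;
    return (v25 - vsense) / avg_slope + 25.0f;
}
```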
Thank you guys. I finally found the problem. I tried to set up the ADC, select the channel, and enable the temperature sensor all inside a single function. I hadn’t read the datasheets deeply, but I tried placing a delay between the ADC initialization and the reading, and it seems to be working now: I am getting values between 2065…2072 at 21.3 °C. The variation of the converted value is pretty big, so filtering is a must. The datasheet says (thank you bobtidey) that the dependency of temperature on the junction voltage is linear, but it needs to be calibrated (the absolute offset and also the slope). Currently I am experimenting with a two-point calibration curve… but the temperature of the CPU slowly rises, which makes all of this useless.
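As an illustration of the two-point idea mentioned above, a minimal sketch of the arithmetic could look like this (all names are illustrative, and the two reference points are assumed to come from an external thermometer):

```c
#include <stdint.h>

/* Two-point linear calibration: given raw ADC readings taken at two known
 * temperatures, derive the slope and offset of the sensor line and use
 * them for conversion. Names here are illustrative only. */
typedef struct {
    float slope;   /* degrees C per ADC count */
    float offset;  /* degrees C at ADC count 0 */
} temp_cal_t;

static temp_cal_t temp_calibrate(uint16_t raw1, float temp1,
                                 uint16_t raw2, float temp2)
{
    temp_cal_t cal;
    cal.slope  = (temp2 - temp1) / (float)(raw2 - raw1);
    cal.offset = temp1 - cal.slope * (float)raw1;
    return cal;
}

static float temp_convert(const temp_cal_t *cal, uint16_t raw)
{
    return cal->slope * (float)raw + cal->offset;
}
```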
Here is the fixed code, in case someone faces the same problem as me:
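A minimal sketch of the kind of init-then-wait-then-read sequence described above (not the poster’s actual code), assuming the STM32F10x Standard Peripheral Library and a crude busy-wait delay:

```c
#include "stm32f10x.h"

/* Sketch only: enable ADC1, switch on the internal temperature sensor,
 * give it time to settle, then read channel 16 with a long sample time.
 * Assumes the ADC clock prescaler has already been set so ADCCLK <= 14 MHz. */
static void temp_adc_init(void)
{
    ADC_InitTypeDef adc;

    RCC_APB2PeriphClockCmd(RCC_APB2Periph_ADC1, ENABLE);

    ADC_StructInit(&adc);
    adc.ADC_Mode               = ADC_Mode_Independent;
    adc.ADC_ScanConvMode       = DISABLE;
    adc.ADC_ContinuousConvMode = DISABLE;
    adc.ADC_ExternalTrigConv   = ADC_ExternalTrigConv_None;
    adc.ADC_DataAlign          = ADC_DataAlign_Right;
    adc.ADC_NbrOfChannel       = 1;
    ADC_Init(ADC1, &adc);

    ADC_TempSensorVrefintCmd(ENABLE);   /* switch on temp sensor + Vrefint */
    ADC_Cmd(ADC1, ENABLE);

    /* Crude settling delay: the sensor needs a few microseconds to start
     * up, and the ADC needs its stabilization time after being enabled. */
    for (volatile uint32_t i = 0; i < 10000; i++) { }
}

static uint16_t temp_adc_read(void)
{
    /* Channel 16 is the internal temperature sensor; use the longest
     * sample time so the required ~17 us sampling time is met. */
    ADC_RegularChannelConfig(ADC1, ADC_Channel_16, 1, ADC_SampleTime_239Cycles5);
    ADC_SoftwareStartConvCmd(ADC1, ENABLE);
    while (ADC_GetFlagStatus(ADC1, ADC_FLAG_EOC) == RESET) { }
    return ADC_GetConversionValue(ADC1);
}
```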
I use this code gitorious.org/dsonano/dso-bootl … .c#line132, with stuff mostly copy-pasted from some examples I have seen. As you can see, there is an ADC calibration phase in there. Maybe you won’t need a magic delay if you do it like this.
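For reference, the calibration phase in question, expressed in Standard Peripheral Library terms (a sketch, not the exact code behind the link), typically looks like this:

```c
/* ADC self-calibration: run once after the ADC has been enabled and has
 * had time to stabilize; it compensates the internal offset errors. */
ADC_ResetCalibration(ADC1);
while (ADC_GetResetCalibrationStatus(ADC1)) { }
ADC_StartCalibration(ADC1);
while (ADC_GetCalibrationStatus(ADC1)) { }
```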
Note also that the data sheet from bobtidey is for another chip series (STM32L) which for instance has calibration data burned in during fabrication.