Getting temperature from the internal CPU sensor

Hello,

The STM32F1xx has an internal temperature sensor connected to ADC1, so it should be easy to read its value, but after several tries I am still getting wrong values. My code returns values between 204 … 207 degrees Celsius, which is surely wrong…

Could somebody tell me where the problem is in this code?

Thanks!

Gabriel

int BIOS::SYS::GetTemperature()
{
  RCC_APB2PeriphClockCmd(RCC_APB2Periph_ADC1, ENABLE); 

  ADC_InitTypeDef ADC_InitStructure;
  /* ADC1 configuration ------------------------------------------------------*/
  ADC_InitStructure.ADC_Mode = ADC_Mode_Independent;	 
  ADC_InitStructure.ADC_ScanConvMode = DISABLE;			  
  ADC_InitStructure.ADC_ContinuousConvMode = DISABLE;	 
  ADC_InitStructure.ADC_ExternalTrigConv = ADC_ExternalTrigConv_None; 
  ADC_InitStructure.ADC_DataAlign = ADC_DataAlign_Right;		   
  ADC_InitStructure.ADC_NbrOfChannel = 1;
  ADC_Init(ADC1, &ADC_InitStructure);

  /* ADC1 regular channel 16 configuration */
  ADC_RegularChannelConfig(ADC1, ADC_Channel_16, 1, ADC_SampleTime_239Cycles5);  
  /* Enable the temperature sensor and vref internal channel */ 
  ADC_TempSensorVrefintCmd(ENABLE);    
  /* Enable ADC1 */
  ADC_Cmd(ADC1, ENABLE);
  /* Enable ADC1 reset calibration register */
  ADC_ResetCalibration(ADC1);
  /* Check the end of ADC1 reset calibration register */
  while(ADC_GetResetCalibrationStatus(ADC1));
  /* Start ADC1 calibration */
  ADC_StartCalibration(ADC1);
  /* Check the end of ADC1 calibration */
  while(ADC_GetCalibrationStatus(ADC1));  
  /* Start ADC1 software conversion */
  ADC_SoftwareStartConvCmd(ADC1, ENABLE);
  while (ADC_GetFlagStatus(ADC1, ADC_FLAG_EOC) != SET);

  int ADCConvertedValue = ADC_GetConversionValue(ADC1);
  float fTemp = (1.42f - ADCConvertedValue*3.3f/4096)*1000/4.35f + 25;
  // 351.43 - 0.18521*ADC
  ADC_TempSensorVrefintCmd(DISABLE);

  // return ADCConvertedValue;
  return (int)(fTemp);
}

I cannot see the real problem, but here are a few ideas:

  • The supply voltage is 2.8 V, not 3.3 V
  • The temperature sensor accuracy is pretty bad, ±20 °C without calibrating the offset
  • Try reading the internal voltage reference or the battery voltage (AIN12) to see whether your ADC code is working (see the sketch below)
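
A quick sanity check along those lines, reusing the ADC1 setup and calibration from the code above (a rough sketch; the internal reference Vrefint sits on channel 17 and is typically about 1.20 V):

/* Assumes ADC1 is already initialized and calibrated as in the code above. */
ADC_TempSensorVrefintCmd(ENABLE);   /* enables both the temp sensor and Vrefint */
ADC_RegularChannelConfig(ADC1, ADC_Channel_17, 1, ADC_SampleTime_239Cycles5);
ADC_SoftwareStartConvCmd(ADC1, ENABLE);
while (ADC_GetFlagStatus(ADC1, ADC_FLAG_EOC) != SET);
int raw = ADC_GetConversionValue(ADC1);
/* Vrefint is ~1.20 V, so raw should be roughly 1.20 * 4096 / VDDA:
   about 1755 counts at VDDA = 2.8 V, about 1490 counts at 3.3 V.  */

If that number is in the right ballpark, the ADC itself is fine and the problem is on the temperature sensor side.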

Although it is not crystal clear, the spec sheet implies the temperature coefficient of the voltage from the sensor should be positive. If so, then the sign in your calculation should be the other way round, e.g. it should be (ADC_V - Offset_25) / Volts_degree + 25. Have you seen anything that implies a negative coefficient?

I realise this would give totally screwy results with the ADC values you read, but maybe that is because the ADC is not returning correct values yet.

This app note and related code may be useful here.

st.com/internet/com/TECHNICA … 035957.pdf

The reference manual RM0008 says “The temperature sensor output voltage changes linearly with temperature. The offset of this line varies from chip to chip due to process variation (up to 45 °C from one chip to another).” and gives this formula:
Temperature (in °C) = {(V25 - VSENSE) / Avg_Slope} + 25

The datasheets for the STM32F103xB chips (Nano) and STM32F103xC (Quad) both say V25 is 1.34–1.52 V and Avg_Slope is 4.0–4.6 mV/°C.

AFAICS, your formula in the code is correct. What are your ADC raw values? Mine are around 1860, which would give 7°C. My ADC reading goes down after power-on, while I expect the chip temperature to rise, so the negative coefficient seems valid.
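
For what it's worth, here is that formula turned into code with the constants from the original post (V25 = 1.42 V, Avg_Slope = 4.35 mV/°C, VDDA assumed to be 3.3 V); this is just an illustration, not something from the thread's code:

/* RM0008 formula: Temperature = (V25 - VSENSE) / Avg_Slope + 25,
   using V25 = 1.42 V and Avg_Slope = 4.35 mV/degC as in the code above. */
float AdcToCelsius(int raw)
{
    float vsense = raw * 3.3f / 4096.0f;          /* ADC counts -> volts */
    return (1.42f - vsense) / 0.00435f + 25.0f;
}
/* AdcToCelsius(1860) gives roughly 7 degC, matching the reading above. */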

EDIT: tested temperature reading on my Nano

Thank you guys. I finally found the problem. I tried to set up the ADC, select the channel, and enable the temperature sensor inside a single function. I hadn't been reading the datasheets deeply, but I tried placing a short delay between ADC initialization and reading, and it seems to work now: I am getting values between 2065…2072 at 21.3 °C. The variation of the converted value is pretty big, so filtering is a must. The datasheet says (thank you bobtidey) that the dependency of temperature on junction voltage is linear, but it needs to be calibrated (both the absolute offset and the slope). Currently I am experimenting with a two-point calibration curve… but the temperature of the CPU slowly rises and makes all of this useless :slight_smile:
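
For the filtering, even a plain average of a burst of conversions smooths the reading a lot. A minimal sketch (ReadTempRaw() is a hypothetical helper standing for one conversion as in the fixed code below):

/* Average several raw conversions to tame the noise.
   ReadTempRaw() is a hypothetical wrapper around one channel-16 conversion. */
int ReadTempFiltered(int samples)
{
    long sum = 0;
    for (int i = 0; i < samples; i++)
        sum += ReadTempRaw();
    return (int)(sum / samples);
}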

Here is the fixed code in case someone faces the same problem as me:

ADC_InitTypeDef ADC_InitStructure;
ADC_InitStructure.ADC_Mode = ADC_Mode_Independent;	 
ADC_InitStructure.ADC_ScanConvMode = DISABLE;			  
ADC_InitStructure.ADC_ContinuousConvMode = DISABLE;	 
ADC_InitStructure.ADC_ExternalTrigConv = ADC_ExternalTrigConv_None; 
ADC_InitStructure.ADC_DataAlign = ADC_DataAlign_Right;		   
ADC_InitStructure.ADC_NbrOfChannel = 1;
ADC_Init(ADC1, &ADC_InitStructure);
ADC_TempSensorVrefintCmd(ENABLE);    
ADC_Cmd(ADC1, ENABLE);

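/* Give the ADC and the temperature sensor some time to start up before the
   first conversion (this delay is what fixed the wrong readings). */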
BIOS::SYS::DelayMs(1);

ADC_RegularChannelConfig(ADC1, ADC_Channel_16, 1, ADC_SampleTime_55Cycles5);  
ADC_SoftwareStartConvCmd(ADC1, ENABLE);	
while (ADC_GetFlagStatus(ADC1, ADC_FLAG_EOC)!=SET);
int ADCConvertedValue = ADC_GetConversionValue(ADC1);
return ADCConvertedValue;
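
And a rough sketch of the two-point calibration mentioned above (raw1/temp1 and raw2/temp2 are two readings taken against an external reference thermometer; the names are just illustrative):

/* Two-point calibration: fit a line through two known (raw, temperature) pairs. */
float RawToCelsius(int raw, int raw1, float temp1, int raw2, float temp2)
{
    float slope = (temp2 - temp1) / (float)(raw2 - raw1);   /* degC per count */
    return temp1 + slope * (raw - raw1);
}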

I use this code gitorious.org/dsonano/dso-bootl … .c#line132 with stuff mostly copy-pasted from some examples I have seen. As you can see, there is an ADC calibration phase in there. Maybe you won't need a magic delay if you do it like this.

Note also that the datasheet from bobtidey is for another chip series (STM32L), which for instance has calibration data burned in during fabrication.