Jun 20 2014
Sensors are devices used to detect and measure physical quantities. The properties and specifications of a sensor can be described with certain standard terminology, which details the sensor's ability to detect and respond to changes in the physical quantity being measured.
Basic Sensor Terminology
Sensitivity – The minimum value of an input parameter capable of producing a detectable output change. For gas sensors, this determines the smallest concentration of gas that can be detected.
Range – The maximum and/or minimum values of the input parameter that can be measured. The dynamic range is defined as the total range of the sensor, i.e., from minimum to maximum.
Precision – The degree of reproducibility of a measurement by a sensor. A very precise sensor would always give very similar values when making the same measurement, though this may not reflect the ‘true’ value – that is determined by the accuracy.
Resolution – The smallest detectable incremental change of the input parameter that can be measured.
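For digitally sampled sensors, resolution is often set by the analogue-to-digital converter. As a minimal sketch with assumed figures (a hypothetical 10-bit ADC over a 0–5 V range), the smallest detectable step is one least-significant bit:

```python
# Resolution of a hypothetical ADC-based sensor readout.
# A 10-bit converter spanning 0-5 V cannot distinguish input changes
# smaller than one least-significant bit (LSB).
def adc_resolution(full_scale_volts, n_bits):
    """Smallest detectable input change (one LSB) in volts."""
    return full_scale_volts / (2 ** n_bits)

print(adc_resolution(5.0, 10))  # one LSB for a 10-bit, 0-5 V converter
```

Here that works out to roughly 4.9 mV; an input change smaller than this produces no change in the digital output.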
Accuracy – The maximum difference between the sensor's output and the true value of the measured quantity.
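The distinction between precision and accuracy can be made concrete with a short sketch. The readings below are hypothetical repeated measurements from a temperature sensor whose true value is assumed to be 25.0 °C:

```python
import statistics

# Hypothetical repeated readings; the true temperature is taken as 25.0 degC.
true_value = 25.0
readings = [25.4, 25.5, 25.4, 25.6, 25.5]

# Precision: reproducibility, i.e. how tightly the readings cluster.
precision = statistics.stdev(readings)

# Accuracy: worst-case deviation of any reading from the true value.
accuracy_error = max(abs(r - true_value) for r in readings)

print(f"precision (std dev): {precision:.3f} degC")
print(f"max error vs true value: {accuracy_error:.3f} degC")
```

These readings are tightly clustered (precise, spread under 0.1 °C) yet biased about 0.5 °C high (poor accuracy), illustrating that a sensor can be precise without being accurate.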
Offset – Offset is the indicated output that exists when the input is zero. For example, light sensors may still show some output signal in a completely dark room – this value is the offset.
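In practice the offset is measured with zero input and subtracted from subsequent readings. A minimal sketch with assumed values for the light-sensor example:

```python
# Hypothetical light-sensor readings: the sensor reports a small
# non-zero signal even in total darkness. Subtracting that dark
# reading (the offset) calibrates later measurements.
dark_reading = 0.12                      # output at zero light input (offset)
raw = [0.12, 0.55, 1.30, 2.05]           # assumed raw readings

corrected = [r - dark_reading for r in raw]
print(corrected)  # the dark sample becomes ~0 after offset removal
```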
Linearity – The extent to which the actual output curve of the sensor deviates from the ideal straight-line curve, expressed as a percentage of non-linearity. Dynamic linearity is the ability of the sensor to follow rapid input changes.
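One common way to quantify this is to fit a least-squares line to calibration points and report the worst deviation from that line as a percentage of the full-scale output. A sketch with assumed calibration data:

```python
# Percentage non-linearity from hypothetical calibration points.
inputs  = [0.0, 1.0, 2.0, 3.0, 4.0]
outputs = [0.0, 1.1, 2.0, 2.8, 4.0]   # assumed sensor readings

# Least-squares fit of a straight line y = slope * x + intercept.
n = len(inputs)
mean_x = sum(inputs) / n
mean_y = sum(outputs) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(inputs, outputs)) \
        / sum((x - mean_x) ** 2 for x in inputs)
intercept = mean_y - slope * mean_x

# Worst deviation from the fitted line, as a fraction of full scale.
full_scale = max(outputs) - min(outputs)
max_dev = max(abs(y - (slope * x + intercept)) for x, y in zip(inputs, outputs))
print(f"non-linearity: {100 * max_dev / full_scale:.2f} % of full scale")
```

Other conventions exist (e.g. deviation from an endpoint line rather than a best-fit line); the datasheet should state which is used.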
Hysteresis – The characteristic of a sensor to produce different outputs for the same input value depending on whether the input is increasing or decreasing.
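Hysteresis is typically characterised by sweeping the input up and then back down and comparing the two output curves. A minimal sketch with assumed sweep data:

```python
# Hypothetical calibration sweep: outputs recorded while the input rises
# through 0..4 and again while it falls back. Hysteresis is taken as the
# largest output difference at the same input between the two sweeps.
inputs      = [0, 1, 2, 3, 4]
rising_out  = [0.00, 0.95, 1.90, 2.90, 4.00]
falling_out = [0.10, 1.10, 2.10, 3.05, 4.00]

hysteresis = max(abs(up - dn) for up, dn in zip(rising_out, falling_out))
print(round(hysteresis, 3))  # largest gap between the up and down sweeps
```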
Response Time – The time required for the sensor output to change from its previous value to a value reflecting the new input, usually specified as the time to settle within a stated fraction of the final value.
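For a sensor that behaves like a first-order system, response time is often quoted as the time to reach 90 % of the final value after a step input. A sketch with an assumed time constant:

```python
import math

tau = 2.0            # hypothetical sensor time constant, seconds
final_value = 1.0    # step target

def output(t):
    """First-order step response: y(t) = final * (1 - exp(-t / tau))."""
    return final_value * (1 - math.exp(-t / tau))

# First sampled time (10 ms steps) at which the output crosses 90 %.
t90 = next(t / 100 for t in range(0, 2000) if output(t / 100) >= 0.9 * final_value)
print(f"90% response time ~ {t90:.2f} s (theory: {tau * math.log(10):.2f} s)")
```

For a first-order response the 90 % time equals tau * ln(10), so a faster sensor (smaller tau) responds proportionally sooner.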
This article was updated on 1 August 2019.