International System of Units and Common Metrology Terms

Metrology is a science based on comparisons. A measurement of voltage, length, pressure or force must be compared to a well-defined reference value in order to produce a valid result. Performing such comparisons requires references for every measurement quantity. These references and their derived quantities are defined in the International System of Units (SI).

The International System of Units (SI) is the foundation of modern metrology. The abbreviation SI comes from the French name, Système International d’Unités; it is the modern form of the metric system. The SI was established in 1960 by the General Conference on Weights and Measures.

The SI units are used internationally and form the basis of all modern measurements. There are also customary units (inch, foot, pound and yard), which are defined in terms of the SI units. For example, an inch is defined as exactly 2.54 centimetres in length.
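Because customary units are defined by exact SI factors, conversions are simple arithmetic. A minimal Python sketch (the constant and function names are ours, chosen for illustration):

```python
# Sketch: customary length units defined exactly in terms of SI.
# 1 inch = 2.54 cm exactly; a foot is 12 inches.
CM_PER_INCH = 2.54

def inches_to_cm(inches):
    """Convert a length in inches to centimetres (exact definition)."""
    return inches * CM_PER_INCH

def feet_to_metres(feet):
    """Convert feet to metres via the exact inch definition."""
    return feet * 12 * CM_PER_INCH / 100

assert inches_to_cm(1) == 2.54  # the defining relation itself
```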

National laboratories perform experiments to realize the SI units as defined. Some of these experiments lead to a representation of the unit. For example, the representation of the volt is realized with a Josephson junction array.

The System of Units

The SI consists of 29 units:

  • 7 Base Units
  • 2 Supplementary Units
  • 20 Derived Units

Base Units

The seven base units, from which all other measurement parameters are traced, are: length, mass, time, electric current, thermodynamic temperature, luminous intensity and amount of substance.


Length

The meter (m) is the SI unit of measurement for length. It is defined as the distance travelled by light in vacuum during a time interval of 1/299 792 458 of a second.
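The definition amounts to fixing the speed of light at exactly 299 792 458 m/s, so light travels exactly one metre in 1/299 792 458 of a second. A small illustrative sketch (not official metrology code), using exact rational arithmetic to avoid rounding:

```python
# Sketch: the metre recovered from the fixed speed of light.
from fractions import Fraction

C = 299_792_458  # speed of light in vacuum, m/s (exact by definition)

def distance_travelled(seconds):
    """Distance light travels in vacuum in the given time, in metres."""
    return C * seconds

# One metre, exactly, from the defining time interval:
one_metre = distance_travelled(Fraction(1, C))
assert one_metre == 1
```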


Mass

The kilogram (kg) is the SI unit of measurement for mass. It is the only base unit still defined by a physical artefact: the International Prototype Kilogram, a cylinder of platinum-iridium alloy kept by the BIPM in Paris, France.


Time

The SI unit of time is the second (s). It is defined as the duration of 9 192 631 770 cycles of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom.

Electric Current

The SI unit of measurement for electric current is the ampere (A). It is defined as the constant electric current which, if maintained in two straight parallel conductors of infinite length placed one meter apart in vacuum, produces between them a force of 2×10⁻⁷ newtons per meter of length.

Thermodynamic Temperature

The Kelvin (K) is the SI unit of measurement for thermodynamic temperature. It is defined as the fraction 1/273.16 of the thermodynamic temperature of the triple point of water.

Luminous Intensity

The Candela (cd) is the SI unit of measurement for luminous intensity. It is defined as the luminous intensity, in a given direction, of a source that emits monochromatic radiation at a frequency of 540×10¹² hertz and has a radiant intensity in that direction of 1/683 watt per steradian.

Amount of Substance

The SI unit of measurement for an amount of substance is the mole (mol). It is defined as the amount of substance of a system that contains as many elementary entities as there are atoms in 0.012 kilogram of carbon-12.

Supplementary Units

Plane angles and solid angles are the two supplementary units in the SI system. They are dimensionless quantities and are defined as follows:

Plane Angles

The SI unit of measurement for plane angles is the radian (rad). It is defined as a plane angle with vertex at the centre of a circle that is subtended by an arc equal in length to the radius.

Solid Angles

The steradian (sr) is the SI unit of measurement for solid angles. It is defined as the solid angle with vertex at the centre of a sphere that is subtended by a portion of the sphere’s surface equal in area to that of a square with sides equal in length to the radius.

Derived Units

There are 20 derived units which are obtained by combining the seven SI base units with each other and with other derived or supplementary units. The table below presents the derived units and their relation to the base units. As time goes on, more derived units may be added in order to cover the needs of science.

Parameter | Unit | In Terms of SI Base Units | In Terms of Other SI Units
Frequency | Hertz (Hz) | [math]\frac{1}{s}[/math] |
Force | Newton (N) | [math]\frac{kg \cdot m}{s^2}[/math] |
Pressure | Pascal (Pa) | [math]\frac{kg}{m \cdot s^2}[/math] | [math]\frac{N}{m^2}[/math]
Work or Energy | Joule (J) | [math]\frac{kg \cdot m^2}{s^2}[/math] | [math]N \cdot m[/math]
Power | Watt (W) | [math]\frac{kg \cdot m^2}{s^3}[/math] | [math]\frac{J}{s}[/math]
Electrical Potential | Volt (V) | [math]\frac{kg \cdot m^2}{s^3 \cdot A}[/math] | [math]\frac{W}{A}[/math]
Electrical Resistance | Ohm (Ω) | [math]\frac{kg \cdot m^2}{s^3 \cdot A^2}[/math] | [math]\frac{V}{A}[/math]
Quantity of Charge | Coulomb (C) | [math]s \cdot A[/math] |
Electrical Capacitance | Farad (F) | [math]\frac{s^4 \cdot A^2}{kg \cdot m^2}[/math] | [math]\frac{C}{V}[/math]
Conductance | Siemens (S) | [math]\frac{s^3 \cdot A^2}{kg \cdot m^2}[/math] | [math]\frac{A}{V}[/math]
Magnetic Flux | Weber (Wb) | [math]\frac{kg \cdot m^2}{s^2 \cdot A}[/math] | [math]V \cdot s[/math]
Magnetic Flux Density | Tesla (T) | [math]\frac{kg}{s^2 \cdot A}[/math] | [math]\frac{Wb}{m^2}[/math]
Inductance | Henry (H) | [math]\frac{kg \cdot m^2}{s^2 \cdot A^2}[/math] | [math]\frac{Wb}{A}[/math]
Celsius Temperature | Degrees Celsius (°C) | [math]K[/math] |
Luminous Flux | Lumen (lm) | [math]cd[/math] | [math]cd \cdot sr[/math]
Illuminance | Lux (lx) | [math]\frac{cd}{m^2}[/math] | [math]\frac{lm}{m^2}[/math]
Radioactivity | Becquerel (Bq) | [math]\frac{1}{s}[/math] |
Absorbed Dose | Gray (Gy) | [math]\frac{m^2}{s^2}[/math] | [math]\frac{J}{kg}[/math]
Equivalent Dose | Sievert (Sv) | [math]\frac{m^2}{s^2}[/math] | [math]\frac{J}{kg}[/math]
Catalytic Activity | Katal (kat) | [math]\frac{mol}{s}[/math] |
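The relations in the table can be checked mechanically by treating each unit as a set of base-unit exponents: multiplying units adds exponents, dividing subtracts them. A minimal sketch of this idea in Python (the representation and names are our own, for illustration):

```python
# Sketch: derived SI units as dictionaries of base-unit exponents.
# e.g. the newton (kg·m/s²) is {"kg": 1, "m": 1, "s": -2}.

def mul(a, b):
    """Multiply two units: add their base-unit exponents."""
    out = dict(a)
    for base, exp in b.items():
        out[base] = out.get(base, 0) + exp
    return {base: exp for base, exp in out.items() if exp != 0}

def div(a, b):
    """Divide unit a by unit b: subtract exponents."""
    return mul(a, {base: -exp for base, exp in b.items()})

metre = {"m": 1}
ampere = {"A": 1}
newton = {"kg": 1, "m": 1, "s": -2}
pascal = {"kg": 1, "m": -1, "s": -2}
watt = {"kg": 1, "m": 2, "s": -3}
volt = {"kg": 1, "m": 2, "s": -3, "A": -1}
ohm = {"kg": 1, "m": 2, "s": -3, "A": -2}

assert div(watt, ampere) == volt                 # V = W/A
assert div(volt, ampere) == ohm                  # Ω = V/A
assert div(newton, mul(metre, metre)) == pascal  # Pa = N/m²
```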

Common Metrology Terms

Everybody who is involved in metrology, from calibration technicians to customers in search of the suitable calibration laboratory, must have a good knowledge of the most commonly used metrology terms. Some of these terms are defined below in simple words.

  • Accuracy (Measurement Accuracy): A number which indicates the closeness of a measured value to the true value.
  • Adjustment: An operation performed to establish, or later restore, an instrument’s specified performance level.
  • Calibration: A set of operations performed in accordance with a definite and documented procedure that compares the measurements performed by an instrument, to those made by a more accurate instrument or standard, for the purpose of detecting and reporting or eliminating by adjustment any errors in the instrument tested.
  • Calibration Interval: A specified or designated period of time between calibrations of an instrument. During this interval the instrument should remain within specified performance levels.
  • Calibration Label: A label affixed to an instrument to show its calibration status. Usually the label contains the instrument’s identification, the person who performed the last calibration, the date of the last calibration and the date of the next calibration.
  • Calibration Laboratory: A work space equipped with the appropriate test instruments, controlled environment, trained personnel and documented calibration procedures. Cal Labs perform many routine calibrations.
  • Calibration Report: A document which describes the calibration, provides the calibration results, and identifies the person responsible for the calibration, the conditions of measurement, the equipment used, the measurement procedure and the measurement uncertainties.
  • Error (Measurement Error): The difference between the measured value and the true value of a measurement. The actual value of an error can never be known exactly, only estimated.
  • Measurement: A set of operations performed on a physical object or system according to an established, documented procedure, in order to determine the value of a quantity of the object or system.
  • Metrology: The science of measurement. It covers everything that has to do with measurement: designing, performing and documenting the measurement, evaluating and analyzing the results, and calculating the measurement uncertainties.
  • Seal (Tamper Seal): A seal of appropriate design and material that is attached to an instrument to clearly indicate tampering. Its purpose is to guarantee the validity of the calibration.
  • Specification: A documented presentation of the parameters, including accuracy or uncertainty, describing the capability of an instrument.
  • Standard (1) (Measurement Standard): An object, artifact, instrument, system or experiment that stores, represents or provides a physical quantity which serves as the basis for measuring the quantity.
  • Standard (2) (Paper Standard): A document describing the operations and processes that must be performed in order for a particular goal to be achieved. A well known standard is ISO 17025 which describes the requirements for the competence of testing and calibration laboratories.
  • Tolerance: The limits of the range of values that apply to a properly functioning measuring instrument. This term is strongly connected to the accuracy term.
  • Traceability: A calibration is traceable when each instrument and standard, in hierarchy stretching back to the national standards, was itself properly calibrated and the results properly documented. The documentation provides the information needed to prove that all calibrations in the calibration chain were properly performed.
  • Uncertainty: An estimate of the range of values, usually centered on the measurement value, which contains the true value of a measured quantity, with a stated probability.
  • Verification: The set of operations that assures that specified requirements have been met, or leads to a decision to perform adjustments, repair, downgrade performance or remove from use.
  • ppm (parts per million): A convenient way of expressing small fractions and percentages. For example, 15 ppm = 15/1 000 000 = 0.000015 = 0.0015%.
  • Unit Under Test (UUT): The instrument that is being tested / calibrated.
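The ppm arithmetic described in the list above is easy to automate; a small Python sketch (function names are illustrative):

```python
# Sketch: converting parts per million to a plain fraction and a percentage.
def ppm_to_fraction(ppm):
    """e.g. 15 ppm -> 0.000015"""
    return ppm / 1_000_000

def ppm_to_percent(ppm):
    """e.g. 15 ppm -> 0.0015 (percent)"""
    return ppm_to_fraction(ppm) * 100

assert ppm_to_fraction(15) == 1.5e-5
```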

Written by Sofia