
How to Evaluate a Benchtop NMR Instrument’s Technical Performance Part 3: 1H sensitivity

July 4th, 2019

In my recent posts on evaluating benchtop NMR system performance, I discussed the fundamental role that static (B0) magnetic field homogeneity plays in defining the lineshape, and with it the resolution, of the instrument. However, the quality of the magnetic field affects much more than lineshape and resolution: because lines broadened by B0 inhomogeneity are also lower in amplitude, field quality directly affects the instrument's sensitivity as well. In this post I explore the concept of instrument sensitivity in more detail and look at how to measure 1H sensitivity.
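To see why a broader line directly lowers the measured amplitude, consider a Lorentzian line of fixed integrated area: its peak height scales inversely with its width. The short Python sketch below is a minimal illustration of this point only, not tied to any particular instrument's software, and the linewidth values are arbitrary.

```python
import numpy as np

def lorentzian(freq_hz, center_hz, fwhm_hz, area=1.0):
    """Lorentzian lineshape with a fixed integrated area."""
    hwhm = fwhm_hz / 2.0
    return (area / np.pi) * hwhm / ((freq_hz - center_hz) ** 2 + hwhm ** 2)

freq = np.linspace(-50, 50, 20001)      # frequency axis in Hz
for fwhm in (0.5, 1.0, 2.0):            # arbitrary linewidths in Hz
    peak = lorentzian(freq, 0.0, fwhm).max()
    print(f"FWHM = {fwhm:3.1f} Hz -> peak height = {peak:.3f}")

# Doubling the linewidth halves the peak height because the area stays constant,
# so any B0 inhomogeneity that broadens the line also lowers the signal amplitude.
```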


What is Meant by Sensitivity?

A formal definition of sensitivity is the ability of an instrument to detect a target analyte. In NMR this is usually expressed as the signal-to-noise ratio (SNR) obtained for a defined concentration of a reference substance. Simply put, the more sensitive the NMR spectrometer, the less sample you need to reach the same SNR in your spectrum. Two factors govern any analytical measurement: the noise level and the intensity of the signal the instrument's detector records for a sample of a given concentration. With modern electronics, noise levels are consistent and should not vary much between instruments, so sensitivity depends primarily on the signal amplitude, which in turn depends on the lineshape and resolution of the instrument. A poor lineshape produces spectra with broad, low-amplitude lines, which lowers the SNR; this degrades the instrument's sensitivity and increases the amount of sample and/or measurement time needed to reach a given SNR, as we will see below.
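As a rough illustration of how an SNR value might be extracted from a measured spectrum, the sketch below takes the tallest point in a peak region and compares it with the noise in a signal-free region. Conventions vary between vendors and standards; this sketch uses peak amplitude over twice the RMS noise, and the chemical-shift windows and synthetic data are placeholders rather than a prescribed test procedure.

```python
import numpy as np

def estimate_snr(spectrum, ppm, peak_window, noise_window):
    """Estimate SNR as peak amplitude divided by twice the RMS noise
    (one common convention; others use peak-to-peak noise instead)."""
    peak_mask = (ppm >= peak_window[0]) & (ppm <= peak_window[1])
    noise_mask = (ppm >= noise_window[0]) & (ppm <= noise_window[1])
    signal = spectrum[peak_mask].max()
    noise = spectrum[noise_mask]
    rms_noise = np.sqrt(np.mean((noise - noise.mean()) ** 2))
    return signal / (2.0 * rms_noise)

# Synthetic example: a single peak at 1.2 ppm on top of Gaussian noise.
rng = np.random.default_rng(0)
ppm = np.linspace(0.0, 10.0, 10000)
spectrum = 100.0 * np.exp(-((ppm - 1.2) / 0.01) ** 2) + rng.normal(0.0, 1.0, ppm.size)
print(f"SNR ~ {estimate_snr(spectrum, ppm, (1.0, 1.4), (8.0, 10.0)):.0f}")

# SNR grows only as the square root of the number of co-added scans, so
# recovering a factor-of-two loss in sensitivity costs four times the measurement time.
```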
