After 1H, 13C is easily the next most important nuclide in the NMR periodic table; 13C measurements can provide a wealth of valuable structural information. Unfortunately, with a receptivity around 5,500 times lower than that of 1H, 13C is a much less sensitive nuclide. This lower sensitivity demands maximum performance from the NMR spectrometer to keep measurement times and sample concentrations within practical limits. Because 13C NMR has a reputation for being challenging even on high-field spectrometers, people tend to assume that only overnight experiments are feasible on benchtop systems. In the first example below we want to show that high-quality 13C spectra can be acquired in a single scan even at frequencies of 43, 60 or 80 MHz. If your goal is to teach the principles of 13C NMR to students, it is worth knowing that good 13C spectra can be acquired on concentrated organic liquid samples in under a minute, and that students can collect a whole set of powerful multidimensional heteronuclear experiments in well under an hour. The spectrum below of neat propyl benzoate could serve as a useful example for teaching 13C NMR in an educational setting.
Figure 1: 1D 13C NMR spectra of neat propyl benzoate acquired with a single scan (blue), 4 scans (green) and 16 scans (red), totalling 5, 20 and 80 seconds of acquisition time respectively.
As the sample concentration decreases, the experiment takes longer, depending on the concentration and on the 13C sensitivity of your instrument. As described in the previous post, lower sensitivity can increase the experiment time dramatically: the SNR grows only with the square root of the number of scans, so halving the per-scan signal requires four times as many scans to reach the same SNR. Because 13C sensitivity is such a critical parameter, we want to provide some standards that can be used as a reference to understand and evaluate 13C measurements when considering a benchtop NMR instrument.
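The square-root relationship can be made concrete with a short sketch. The function and the SNR figures below are illustrative assumptions, not measured values from any instrument; the point is only the quadratic cost of lost per-scan signal.

```python
import math

def scans_for_target_snr(snr_single_scan: float, target_snr: float) -> int:
    """SNR of co-added scans grows as sqrt(N), so reaching a target SNR
    requires N = (target / single-scan SNR)^2 scans, rounded up."""
    return math.ceil((target_snr / snr_single_scan) ** 2)

# Hypothetical numbers: a neat sample giving SNR ~ 50 per scan, and the
# same sample diluted 10x, giving SNR ~ 5 per scan.
print(scans_for_target_snr(50, 100))  # 4 scans
print(scans_for_target_snr(5, 100))   # 400 scans
```

A 10x dilution thus costs roughly 100x the measurement time for the same SNR, which is why per-scan sensitivity is the parameter to scrutinize when comparing instruments.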
In my recent posts on evaluating benchtop NMR system performance, I discussed the fundamental role the static (B0) magnetic field homogeneity plays in defining the lineshape and, with it, the resolution performance of the instrument. However, the quality of the magnetic field affects much more than just the instrument's lineshape and resolution: since B0 inhomogeneity broadens the lines and thereby lowers their amplitude, the quality of the field also directly affects the instrument's sensitivity. In this post I explore the concept of instrument sensitivity in more detail and look at how to measure 1H sensitivity.
What is Meant by Sensitivity?
A formal definition of sensitivity is the ability of an instrument to detect a target analyte. In NMR this is usually expressed as the signal-to-noise ratio (SNR) for a defined concentration of a reference substance. Simply put, the more sensitive the NMR spectrometer, the less sample you need to reach the same SNR in your spectrum. The two principal enemies of any analytical measurement are higher noise levels and lower signal intensity at the detector for a sample of given concentration. With modern electronics the noise levels are consistent and should not vary much between instruments, so the sensitivity depends primarily on the signal amplitude, which in turn depends on the lineshape and resolution of the instrument. A poor lineshape produces broad lines that are lower in amplitude; this decreases the SNR, degrading the instrument's sensitivity and increasing the amount of sample and/or measurement time required to reach a given SNR, as we will see below.