The imaging modalities available for the diagnosis of human disease include planar X-ray, computed tomography (CT), ultrasound (US), magnetic resonance imaging (MRI), optical coherence tomography (OCT), single-photon emission computed tomography (SPECT), positron emission tomography (PET), and optical imaging. MRI in particular is a non-invasive, non-ionizing technique that provides images with high spatial resolution and an excellent contrast-to-noise ratio (CNR) for soft tissues. Owing to these advantages, MRI has developed into a critical research and diagnostic tool since its invention in the early 1970s.1

A key limitation of MRI is that its sensitivity is lower than that of PET or SPECT. The MR signal arises from a net magnetization due to the small population difference between the nuclear Zeeman energy levels of spin-½ nuclei in an external magnetic field. The polarization of spin-½ nuclei is defined as the fraction of excess nuclei in the lower energy level.2  Conventional MRI detects hydrogen nuclei (1H), and typical 1H polarization values at body temperature and clinical magnetic field strengths are 4.9×10−6 at 1.5 T and 9.9×10−6 at 3 T. Since polarization scales linearly with magnetic field strength, the sensitivity, and thus the signal-to-noise ratio (SNR), of conventional MRI can be improved by increasing the field strength. The strongest magnet currently used for whole-body human MRI research is 9.4 T (Max Planck Institute, Tübingen, Germany).3  However, increasing the magnetic field strength brings additional challenges, including a higher specific absorption rate (SAR) and interactions between the radio frequency (RF) field and tissue, which can cause image inhomogeneities.4
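The thermal polarization values quoted above follow from the Boltzmann distribution over the two Zeeman levels, P = tanh(γħB₀/2kBT), which reduces to γħB₀/2kBT in the high-temperature limit and hence scales linearly with B₀. A minimal sketch of this calculation (using CODATA values for the constants and assuming a body temperature of 310 K) is:

```python
import math

# Physical constants (SI units, CODATA values)
HBAR = 1.054571817e-34     # reduced Planck constant, J s
K_B = 1.380649e-23         # Boltzmann constant, J/K
GAMMA_1H = 2.6752218744e8  # 1H gyromagnetic ratio, rad s^-1 T^-1

def thermal_polarization(b0_tesla, temp_kelvin=310.0):
    """Boltzmann polarization of spin-1/2 nuclei: P = tanh(gamma*hbar*B0 / (2*kB*T))."""
    return math.tanh(GAMMA_1H * HBAR * b0_tesla / (2 * K_B * temp_kelvin))

# Polarization at clinical and research field strengths, at body temperature
for b0 in (1.5, 3.0, 9.4):
    print(f"{b0:>4.1f} T : P = {thermal_polarization(b0):.2e}")
```

At 310 K this reproduces the values in the text, about 4.9×10⁻⁶ at 1.5 T and 9.9×10⁻⁶ at 3 T, and illustrates why even the 9.4 T research magnet leaves only a few spins per million contributing to the net magnetization.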
