Chapter 1: Introduction
B. Lee Drake, B. L. MacDonald, and R. F. Shannon, Jr., in Advances in Portable X-ray Fluorescence Spectrometry
This chapter begins with a brief introduction to portable X-ray fluorescence and what distinguishes it from related laboratory instrumentation. It also covers a short history of the technique's development.
1.1 Introduction
In the original Star Trek series, which aired from 1966 to 1969, Science Officer Mr. Spock carried a small handheld device, the tricorder, capable of reading the elements within any material. This device acted as a universal analyzer, able to produce readings on any material almost instantaneously. In a world in which laboratories required large systems to achieve the same result, such a device would have seemed like a dream.
Over the past three decades, multiple compositional analysis methods have reached a level of miniaturization that approaches this dream, including energy-dispersive X-ray fluorescence (ED-XRF), laser-induced breakdown spectroscopy (LIBS), and Fourier-transform infrared spectroscopy (FTIR). Each system can obtain a spectrum with the potential to characterize the composition of any sample (solid, liquid, gas, or plasma). With these systems, it is conceivable that an art conservator could perform a full evaluation of the Mona Lisa without touching it or moving it from its display. An archaeologist could evaluate the elemental composition and molecular bonds of artifacts in situ, without having to remove any from their context. A geologist could scan a cross-section of core without sampling any portion.
But there is a compromise to using each of these tricorder-style instruments, and one that is quite serious. The standardized scientific units employed in laboratories for decades assume homogenization of the material – an archaeological ceramic transformed into a fused glass bead, or a small sample of pigment dissolved in acid. It was this homogenization work that made quantification possible. An inductively coupled plasma mass spectrometer requires acid digestion into solution to create a uniform matrix for analysis. Wavelength-dispersive X-ray fluorescence benefits from similar homogenization but takes the additional step of forming a fused glass bead. Earlier systems such as flame spectroscopy aspirated acid-leached solutions into a flame and determined elemental composition from the wavelengths of light emitted or absorbed. Each technique used different wavelengths (energies) of light to accomplish its task and thus produced very different spectra. This of course creates a problem – how to reconcile all the different methods so that the products of their analyses are comparable.
The answer is quantification, so that the results from each system are in the same standardized units, typically expressed as parts-per-hundred (%), parts-per-million (ppm), or parts-per-billion (ppb) of either mass or volume. Studies of metals and geological materials typically report mass %, while atmospheric sciences and some liquid analyses use volume %. The use of mass % in particular traces back to common metallurgical processes which began almost 4000 years ago in the early Bronze Age, when 9 parts copper were added to 1 part tin to form a strong, durable metal.
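These units differ only by powers of ten, and the Bronze Age recipe maps directly onto mass fractions – a quick worked example:

```latex
% Mass-fraction units are related by powers of ten:
1\% = 10^{4}\ \mathrm{ppm} = 10^{7}\ \mathrm{ppb}
% Bronze at 9 parts copper to 1 part tin, expressed as mass %:
w_{\mathrm{Cu}} = \frac{9}{9+1} = 90\%, \qquad w_{\mathrm{Sn}} = \frac{1}{9+1} = 10\%
```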
The challenge then was to reconcile the spectra from homogenized samples with the traditional quantitative units that had been used for millennia. This was accomplished by using linear and non-linear models based on the correlation of peak height to a known concentration using statistical approaches,1 or by using more sophisticated algorithms which incorporated variation from the instrument into the resulting data.2 Regardless of the technique used, validation of the equipment against accepted international standards was a critical last step. Not only were different types of equipment converging on the same standardized units, but they could also be calibrated to a common definition. Regardless of whether you used X-rays or infrared photons, you could get a comparable result. The spectra can be thought of as raw analytic output, while the quantitative units are a synthetic combination of spectral data and outside information, whether empirical or based on the fundamental parameters of spectrum acquisition.
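To make the empirical approach concrete, here is a minimal sketch of a linear calibration of the kind referenced above – not any instrument's actual software, and with all intensity and concentration values invented for illustration:

```python
import numpy as np

# Hypothetical net peak intensities (counts/s) for a Cu K-alpha line,
# measured on standards of known copper concentration (mass %).
concentrations = np.array([0.5, 1.0, 2.5, 5.0, 10.0])          # known values
intensities = np.array([120.0, 236.0, 610.0, 1190.0, 2405.0])  # measured

# Least-squares fit of a straight line: intensity = slope * conc + intercept.
slope, intercept = np.polyfit(concentrations, intensities, 1)

# Invert the calibration to estimate an unknown sample's concentration.
unknown_intensity = 850.0
estimated = (unknown_intensity - intercept) / slope
print(f"Estimated Cu concentration: {estimated:.2f} mass %")
```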
Compare the effort of homogenization and calibration to Spock's use of the tricorder. He didn't dissolve a rock sample in acid or pulverize plant material to homogenize it on an alien planet; instead, he simply pointed at the surface of a strange object and it gave a result. The key difference between laboratory systems and portable non-destructive applications is not in the details of their components so much as the reality of the materials under study. The vast majority of the material world is heterogeneous. If you were to walk outside, grab the first rock you found, and analyze it in multiple spots with a handheld system, you would almost certainly see variation from one spectrum to the next. This is not a deficiency in the equipment or an error in the calibrations – the instrument is working exactly as intended. Rather, it is a problem with reality – the distribution of elements is not even.
The use of portable equipment thus disrupts the very heart of the laboratory approach. The key difference is not in the components: a laboratory and a portable XRF system will both include a high-energy source (typically an X-ray tube) and a detector capable of producing a spectrum. The details matter – a higher- vs. lower-energy tube or X-ray source affects the range of elements analyzed as well as how quickly the spectrum is obtained, and the quality of the detector affects the resolution and rate of acquisition of the spectrum, as well as the range of energies analyzed. But the knife cuts both ways: today, portable systems can often be just as capable as laboratory systems from a purely spectroscopic view, and the advantage of a more powerful tube can be offset by longer assay times. The difference between the two emerges from how they are used rather than from their components. The biggest difference is not the equipment, but the sample under study.
The distinction between portable and laboratory systems shrinks with every technological advance – the true differences are much more nuanced, relating to the geometry of the tube and detector, the quality of the detector, and sample presentation. There are necessary sacrifices when miniaturizing equipment. This, ultimately, is what makes any tricorder-style measurement much more complex than a laboratory one. If we measure non-destructively, we take the world as it is, heterogeneity and all. While in some circumstances we can still express the results of these analyses as mass or volume %, this depends on the sample and its preparation, not the instrument.
However, using a tricorder as an inexpensive substitute for a laboratory instrument completely misses the point. Using it to reproduce the same percent estimates that have been in place since the Bronze Age takes the narrow view. The world is not a fused glass bead, but a vibrant, heterogeneous place. Using portable instrumentation to investigate it means that we must adjust our methods to account for variation, rather than adjust the sample to fit the instrument.
Here, we will approach XRF in a very different manner than has been done in the past. Rather than view the method exclusively as a quantitative one, we will also take a broader qualitative look at the technique and how it can be used non-destructively on cultural heritage items, geologic profiles, and in spatial mapping. We will also explore how the composition of objects influences spectra beyond elemental fluorescence lines, such as the implications of Compton shift and coherent scattering. Finally, we will approach the subject of dimensionality in XRF measurements. But first, we must situate XRF in the broader spectrum of scientific instrumentation and understand how it works relative to other approaches.
1.2 Categorization
In science, it is common for a continuous phenomenon to be broken up into discrete units. We take something like distance and divide it into kilometers, meters, and millimeters. The same is true of our analysis of photons: electronvolts (eV) or kiloelectronvolts (keV) are used as a measure of energy. In everyday language, however, we use names such as infrared or ultraviolet to cover these ranges of photon energies.
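Energy and wavelength are two descriptions of the same photon, related by the standard expression below (with hc ≈ 1239.84 eV nm):

```latex
E = \frac{hc}{\lambda} \approx \frac{1239.84\ \mathrm{eV\,nm}}{\lambda}
% Example: an 8 keV X-ray photon has wavelength
\lambda \approx \frac{1239.84\ \mathrm{eV\,nm}}{8000\ \mathrm{eV}} \approx 0.155\ \mathrm{nm}
```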
At their heart, many scientific instruments record the interaction of a photon with an electron, and these interactions occur at increasingly small scales with increasingly higher energies. An XRF system and an FTIR spectrometer primarily differ in the energy of the photons used and, in turn, the location of the electrons with which they interact.
Given that wavelength and energy are inversely related, what are the differences between wavelength-dispersive and energy-dispersive XRF (Figure 1.1)? Wavelength-dispersive XRF instruments have better spectral resolution and use Bragg scattering, which splits the wavelengths out as a function of angle after scattering the signal off an analyzer crystal. In other words, wavelength-dispersive instruments that use an X-ray tube typically also use a crystal, called a monochromator, to single out a specific wavelength (energy) as the excitation source. The source wavelength can be varied to improve the signal for a given fluorescence line of interest, and the spectrum is collected over the range of the lines of interest. It is also possible to fix the excitation, or not to monochromatize the source at all, although it is typical to do so. For energy-dispersive XRF, all of the energies are detected simultaneously (no analyzer crystal), and a digital pulse processor (DPP) or multi-channel analyzer (MCA) is used to electronically separate the energies. In this case the source, if it is an X-ray tube, does not normally use a monochromator. Radioactive sources are more monochromatic by nature and can be used in either wavelength- or energy-dispersive instruments.
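The analyzer-crystal geometry follows Bragg's law. As a standard worked example using published values, a LiF(200) crystal (2d ≈ 0.4027 nm) selects the Cu Kα line (λ ≈ 0.1541 nm) at a diffraction angle of about 22.5°:

```latex
n\lambda = 2d\sin\theta
% For n = 1, LiF(200) with 2d = 0.4027 nm, and Cu K\alpha at 0.1541 nm:
\sin\theta = \frac{0.1541\ \mathrm{nm}}{0.4027\ \mathrm{nm}} \approx 0.383
\quad\Rightarrow\quad \theta \approx 22.5^{\circ}
```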
While terminology and even base units (eV vs. wavelength) vary from technique to technique, they can all be generalized in the following statement: we send photons into atoms and observe an interaction. Whether that interaction is fluorescence, disruption, or absorption is a matter of detail; the point is that we are generally looking at photon–electron interactions for nearly all of the aforementioned techniques.
Broadly speaking, we can think of energy as carried by photons and matter as made of fermions. The photon–fermion interactions vary not just by energy but by how we measure them. For example, X-ray fluorescence, X-ray diffraction, and X-radiography all use the same energy range of photons, but one uses the fluorescence of those X-rays to identify elements, the second measures the diffraction of X-rays off or through the crystalline structure of minerals to identify them, and the third simply looks at how many pass through to determine the density of a heterogeneous object. Same photons, same electrons, but very different detector angles. Put the detector at a 62–90° angle from the sample and you have X-ray fluorescence. Place it at different, shallower angles, and you have X-ray diffraction. Put the detector on the other side of the object being analyzed, and you have X-radiography.
The presentation of results is generally a bivariate plot, with the photon's energy (or wavelength) along the x-axis and the quantity of those photons on the y-axis. This is known as a spectrum. The range of energies and the resolution can vary wildly, but all such diagrams convey the same basic information about photon–electron interactions. In some cases, photons are obtained via excitation by electrons (EDS) or even neutrons (INAA), but the result is still a photon spectrum. As such, understanding the process of fluorescence lays the groundwork for interpreting spectra. While this guide is principally focused on XRF spectra, the principles also extend to electron microscopes.
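As a sketch of what such a plot contains, the following toy example builds a synthetic ED-XRF spectrum: Gaussian peaks at the Fe Kα (~6.40 keV) and Cu Kα (~8.05 keV) line energies sitting on a smooth scatter continuum. The peak amplitudes, widths, and background shape are invented for illustration only:

```python
import numpy as np
import matplotlib.pyplot as plt

energy = np.linspace(0.0, 20.0, 2000)  # x-axis: photon energy in keV

def gaussian(e, center, amplitude, fwhm):
    """Gaussian peak: the typical shape of a fluorescence line in ED-XRF."""
    sigma = fwhm / 2.355  # convert FWHM to standard deviation
    return amplitude * np.exp(-0.5 * ((e - center) / sigma) ** 2)

# Toy spectrum: a smooth scatter continuum plus two characteristic lines.
continuum = 50.0 * np.exp(-energy / 8.0)           # invented background shape
spectrum = (continuum
            + gaussian(energy, 6.40, 800.0, 0.15)  # Fe K-alpha (~6.40 keV)
            + gaussian(energy, 8.05, 400.0, 0.16)) # Cu K-alpha (~8.05 keV)

plt.plot(energy, spectrum)
plt.xlabel("Energy (keV)")
plt.ylabel("Counts")
plt.title("Synthetic ED-XRF spectrum (illustrative only)")
plt.show()
```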
1.3 A Brief History of Portable XRF
A short but detailed history of X-ray fluorescence can be found in most instructional books, such as "Quantitative X-ray Spectrometry", part of the Practical Spectroscopy Series.3 A brief early history starts with wavelength-dispersive X-ray fluorescence (WD-XRF) in the 1950s and 60s, some 40 years after Barkla first discussed it.4 In the 1970s, energy-dispersive X-ray fluorescence (ED-XRF) became popular.3 Federal laws on the prevention of lead poisoning, beginning in 1971 and continuing up to and past Title X in 1992, increased the need for, and the economic feasibility of, handheld or portable ED-XRF units for quick testing. Further governmental regulations, such as RoHS (Restriction of Certain Hazardous Substances), adopted in 2003 by the European Union and put in place in 2006, as well as similar regulations from other countries, have increased the demand for XRF. The largest share of the portable ED-XRF market is in alloy analysis. Shortly after the development of portable ED-XRF in the early 1980s, these units became popular in the metal recycling industry, which has seen continued growth. Early units from the 1980s and 1990s used radioactive sources. In the 1990s the units became more compact and lighter. The older instruments frequently came in two parts: one, held in the hand, containing the source and detector, and a heavier part with the batteries and everything else, carried by shoulder straps.
Shortly after the start of the 21st century, X-ray-tube-based portable systems were developed. There were some in the field who rightly argued that a single-energy excitation source such as Fe-55 or Cd-109 provides a much cleaner spectrum, since X-ray tubes produce a broad energy range of X-rays from Bremsstrahlung radiation that are scattered off the sample into the detector. However, the added versatility and reduced regulatory paperwork gave X-ray tubes the advantage. X-ray tubes can be operated at different voltages and with different filters that "harden" the X-ray source. This ability allows an X-ray-tube-sourced instrument to measure a large range of elements with optimal performance by changing the tube's voltage. In contrast, source-based XRF instruments require multiple radioactive sources to extend the usable elemental range; they were, and still are, lighter, but in most cases require routine replacement of the source. Today most XRF units use X-ray tubes.
The next major advancement was the development of silicon drift detectors (SDDs), which became commercially available around 2006 to 2007, although the original idea dates to about 1983.5 Silicon PIN (Si-PIN) detectors were a great improvement in their time, but larger-area Si-PIN detectors had poorer resolution and lower maximum count rates. Silicon drift detectors suffer very little loss in performance with miniaturization and have maximum count rates 10 or more times that of a traditional Si-PIN detector, while improving energy resolution by 20%. Additionally, they have a better response to low-energy X-rays. Although elements as light as magnesium could be measured with a Si-PIN detector, it did not perform nearly as well as a silicon drift detector. A detailed overview of these technological advancements is given in Chapter 12, using examples from the analysis of European oil paintings in Italy.
1.4 The Adoption of pXRF and Topics in This Volume
As evidenced by the diversity of chapters in this book, portable XRF instruments have been rapidly adopted in a remarkably wide range of research and industrial fields. Technological advances in instrument design, the miniaturization of components, and increased efficiencies in detector processing capabilities have resulted in cost-effective manufacturing, which in turn has made pXRF widely and affordably available. The fields that have adopted the technique have, in turn, continued to refine increasingly sophisticated applications. A key subtext of this book, one we continuously reinforce, is that analysts today are increasingly empowered with the right tools and information to use pXRF effectively. The intention of this book is to provide readers with guidance on contemporary standards and best practices for those who wish to include pXRF as part of their regular analytical toolkit, whether they are engaged in industrial, academic, or hybrid fields.
This volume is organized into three main sections beginning with introductory chapters that provide overviews of fundamental concepts, followed by a series of contributed chapters that detail aspects of industrial and academic applications. It closes with a section on technical topics, such as reproducibility, standardizations, inter-instrument comparisons, quantifying uncertainty in pXRF measurements, and an overview of promising new advances in artificial intelligence and machine learning to enhance pXRF data analysis.
The book begins with an introductory overview of the essential principles of XRF and its safe and effective use for research and industrial applications, including qualitative and quantitative analytical approaches. Chapter 2 covers X-ray physics fundamentals such as the electromagnetic spectrum, interactions between photons and matter, the typical instrument parameters for most manufactured models currently on the market, and the ideal conditions required for effective spectrometry. The following two chapters dive into detail on qualitative (Chapter 3) and quantitative (Chapter 4) analysis, respectively, using examples from measurements of oil paintings, pottery, and obsidian. Introductory users will find the discussions of sample preparation, considerations for heterogeneous materials, how to optimize measurements for a given sample matrix, and peak identification highly useful. The latter chapter reviews core concepts of standardization and calibration and surveys the most common mathematical approaches to quantifying spectral data, such as empirical calibrations, fundamental parameters (FP), and Compton normalization. Chapter 5 provides an important overview of the main features of portable and handheld XRF instrumentation that ensure user safety, including key recommendations on safe operation when analyzing biological samples (including human hair and skin) and on the non-destructive testing of archaeological objects.
Section two begins with a contribution on industrial applications (Chapter 6). This chapter offers a comprehensive overview of current applications, including metals and alloy analysis, material grade identification, positive material identification (PMI), and the best practices and regulatory standards associated with their use. Other applications such as precious metal identification, consumer product testing, and environmental regulatory practices are covered, including a review of industry-specific standards for national and international use. Chapter 7 covers regulatory applications of the rapid screening of consumer products using pXRF, with a focus on FDA (Food and Drug Administration) regulations for food products, dietary supplements, pharmaceuticals, and cosmetics. Useful tips on quantitative analysis using standards, and on how to avoid false positives and false negatives, are provided.
Chapters 8 and 9 delve into the details of applying pXRF to water and organic matrices. In "Chemical Analysis of Natural Waters" (Chapter 8), the authors review considerations for calibrated XRF analysis of water, including sample preparation, experimental conditions, and a novel application of pXRF spectra to quantify fluid density. These parameters are applicable to a range of contexts, including the testing of solutes in lakes, streams, and groundwater for human consumption, agricultural applications, and environmental health and safety. Chapter 9 focuses on agricultural applications, such as the analysis of soils, plants, manure, fertilizers, and animal feed, outlining optimal excitation parameters and experimental conditions, such as those recommended by ICRAF (International Centre for Research in Agroforestry) industry standards.
Chapter 10 turns the reader's attention to the energy sector, focusing on applications in oil and gas exploration. Details relevant to this sector, including typical experimental conditions, spectral interpretation, quantification methods, chemofacies classifications, and mathematical approaches to data interpretation are covered. In a different type of exploration, Chapter 11 highlights the fascinating realm of portable X-ray spectrometers in extraterrestrial contexts, including the most recent developments from the Mars Rover program. This chapter outlines the unique challenges associated with X-ray spectrometry in outer space, such as data storage and power supply, variations in thermal environments, issues with dust, radiation, and other atmospheric effects typical of this dynamic environment. Key discoveries are highlighted, including ground-truthing remote sensing observations and integration with other methods (e.g. onboard X-ray diffraction) in a Martian context.
Chapters 12 through 15 return our attention to Earth, specifically to the analysis of ancient and historical materials by pXRF in the fields of technical art history and archaeology. Chapter 12 provides a retrospective look at the evolution of portable ED-XRF instrumentation used to analyze works of art through the lens of museum-based applications in Italy. This chapter offers a detailed introduction to the unique challenges associated with working with objects of heritage significance, including metal sculptures (brass, bronze, gold, silver) and their patinas, and the qualitative analysis of artists' pigments used to produce paintings. Interestingly, this chapter describes the specifications of the predecessors of the portable devices that we know and use today, such as radioactive-source-based instruments, and the incremental, yet crucial, improvements made in detector technologies and multi-channel analyzers (MCAs) to improve energy resolution, decrease spectral interferences, and minimize radiation damage to precious objects. The authors describe four key revolutions in portable XRF technology: (1) the transition from gas proportional counters to liquid-nitrogen-cooled, and later Peltier-cooled, systems, which enabled increased spectral resolution and accuracy; (2) the transition from less stable mercury iodide (HgI2) detectors to silicon-PIN (Si-PIN) detectors, and later (3) to the silicon drift detectors that are presently the most common in use. They argue that the most recent (4) revolution in pXRF technology has been advanced by collimators, moveable stages, and enhanced detection systems that enable macro- and micro-XRF mapping capabilities, an especially useful technique for analyzing paintings.
Chapter 13 is a must-read for those interested in issues related to the analysis of archaeological and historical brass, such as the limitations of pXRF, common spectral interferences and matrix effects and how to overcome them, and best practices for quantitative analysis. A particularly useful aspect of this chapter is its attention to the use of reference materials and empirical calibrations, including a thorough evaluation of relevant components of the well-known CHARM open-source calibration set.6–8 A case study on the analysis of late historic period brasses from Ontario, Canada is provided. Following this, Chapter 14 covers similar concepts for the analysis of heritage glass, including typical procedures, best practices for effective spectral acquisition, calibration methods, and the beneficial information that can be gleaned from a strictly qualitative approach. Finally, Chapter 15 closes this section with the analysis of a different type of glass: obsidian. This chapter uses the example of the MURR2 obsidian reference set9 and two case studies of previously characterized obsidian sources in Belize and Ethiopia to delve into important issues of data reproducibility between units and between manufacturers. The authors explore the highly relevant topic of inter-instrument compatibility, as well as recommendations for shared calibrations and source materials, and best practices for calibration and publishing datasets.
The final section of this book addresses the advanced topics of estimating uncertainties in pXRF measurements (Chapter 16) and novel uses of artificial intelligence (AI) and machine learning in pXRF (Chapter 17). In Chapter 16, the authors begin with the premise that most commercially available pXRF instruments have onboard quantification models whose outputs often include reported uncertainty values, yet how those values were quantified is seldom disclosed to end users. In a scenario where manufacturer calibration models are often proprietary, how can an analyst understand or verify those uncertainty values? The chapter addresses this by defining what uncertainties truly measure, delving into the mathematics behind calculating limits of detection and the various mathematical approaches for quantifying uncertainties. Chapter 17 focuses on AI and machine learning applications, providing a detailed overview of the value of these approaches for improving the identification of elements and lowering the limits of quantification in pXRF spectra, as well as a frank assessment of the strengths and weaknesses of the most commonly encountered model types: neural networks, random forests, gradient boosting, and support vector machines. Examples are given for the qualitative classification of wood and dinosaur bone, and the quantitative analysis of brine chemistry and archaeological ceramics. The free-to-access electronic supplementary information is organized as a user reference guide for understanding the elements and their interactions with X-rays. Core concepts such as infinite thickness and interaction volume are described, followed by an index of all elements, organized by atomic mass, including their characteristic features: line energies (keV) and types (Kα1, Kβ1, etc.), absorption edges, fluorescence efficiency, environmental abundance, common spectral interferences, and tips for effective spectrometry of different matrices.
We predict that the presence of pXRF instrumentation will only continue along its upward trajectory toward mainstream use in industrial and academic settings. It is crucial that practitioners continue to learn, share, and implement informed and effective use of this technique in order to promote and maintain important scientific standards. It is our sincere hope that the information given in this book provides readers with tips, tools, and inspiration to adopt pXRF into their own analytical toolkits.