A numerical procedure for understanding self-absorption effects in laser-induced breakdown spectroscopy
Abstract
Optical emission spectroscopic techniques, such as laser-induced breakdown spectroscopy (LIBS), require an optimal plasma state for accurate quantitative elemental analysis. Three fundamental assumptions must be satisfied for the analytical results to be accurate: local thermodynamic equilibrium (LTE), optically thin plasma, and stoichiometric ablation. However, real plasmas often fail to satisfy these conditions, particularly optical thinness, which leads to reabsorption of the emitted radiation, known as self-absorption. To study the self-absorption effect, we simulated an optically thick emission spectrum under typical laser-produced plasma conditions. The simulation proceeds in four stages: estimating the number-density ratios of the various ionisation states in the plasma from the Saha–Boltzmann equation, the peak intensity of each spectral line from the Boltzmann equation, and the full width at half maximum of each line from Stark broadening, and finally generating the full spectrum by assigning a Lorentzian profile to each line. Self-absorption is then applied to the simulated spectrum. We investigated the dependence of the self-absorption coefficient on the plasma temperature, optical path length, and element concentration in the sample. Self-absorption decreases with increasing plasma temperature and increases with increasing optical path length and analyte concentration. We also investigated the role of self-absorption in the quantitative analysis of a binary alloy (Mg 50% and Ca 50%) by calibration-free LIBS, with and without resonance lines. Excluding the resonance lines reduced the error in the estimated composition drastically, from 27% to 2%.
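
As a rough illustration of the last two stages outlined above (Lorentzian line profiles and the self-absorption step), the following Python sketch builds an optically thin spectrum from two hypothetical lines and rescales it by the escape factor (1 - e^(-tau))/tau of a homogeneous plasma slab. The line parameters, partition function, path length, Stark width, and peak optical depth are illustrative assumptions, not values from the paper.

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def lorentzian(wl, center, fwhm):
    """Area-normalised Lorentzian line profile (per nm)."""
    gamma = fwhm / 2.0
    return (gamma / np.pi) / ((wl - center) ** 2 + gamma ** 2)

def boltzmann_peak_intensity(g_k, A_ki, E_k_eV, T_K, partition):
    """Relative line emissivity ~ g_k * A_ki * exp(-E_k / kT) / U(T)."""
    return g_k * A_ki * np.exp(-E_k_eV / (K_B_EV * T_K)) / partition

def self_absorbed(thin_spectrum, optical_depth):
    """Scale the optically thin spectrum by (1 - exp(-tau)) / tau,
    the escape factor of a homogeneous slab (tends to 1 as tau -> 0)."""
    tau = np.clip(optical_depth, 1e-12, None)
    return thin_spectrum * (1.0 - np.exp(-tau)) / tau

# Hypothetical line list: (wavelength nm, g_k, A_ki s^-1, E_k eV)
lines = [
    (285.2, 3, 4.9e8, 4.35),  # strong, resonance-like line (illustrative)
    (518.4, 5, 5.6e7, 5.11),  # weaker, non-resonance line (illustrative)
]

T = 10_000.0        # plasma temperature, K (assumed)
U_T = 1.0           # partition function placeholder (assumed)
path_length = 1.0   # optical path length, relative units (assumed)
fwhm = 0.05         # Stark FWHM, nm, taken equal for all lines (assumed)
TAU0 = 2.0          # assumed peak optical depth of the strongest line

wl = np.linspace(280.0, 525.0, 20_000)
thin = np.zeros_like(wl)
tau = np.zeros_like(wl)

# Relative peak intensities from the Boltzmann factor, normalised to the strongest line
peaks = np.array([boltzmann_peak_intensity(g, A, E, T, U_T) for _, g, A, E in lines])
peaks /= peaks.max()

for (center, *_), peak in zip(lines, peaks):
    shape = lorentzian(wl, center, fwhm)
    shape /= shape.max()                      # unit-peak Lorentzian shape
    thin += peak * shape                      # optically thin spectrum
    tau += TAU0 * path_length * peak * shape  # optical depth follows the line shape

thick = self_absorbed(thin, tau)

# Stronger lines are suppressed more: the escape factor at line centre is smaller
for center, *_ in lines:
    i = int(np.argmin(np.abs(wl - center)))
    print(f"{center} nm: escape factor at line centre = {thick[i] / thin[i]:.3f}")
```

Running the sketch shows the qualitative trend discussed in the abstract: the stronger (resonance-like) line is attenuated far more than the weaker line, and increasing `path_length` or the line strength deepens the self-absorption, while raising the optical depth scale less (e.g. via higher temperature spreading population over more states) reduces it.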