A method for improving the accuracy of calibration-free laser-induced breakdown spectroscopy (CF-LIBS) using plasma temperature determined by a genetic algorithm (GA)
Abstract
Accuracy remains a challenge for classical calibration-free laser-induced breakdown spectroscopy (CF-LIBS) quantitative analysis, because purely theoretical calculations and mathematical models cannot compensate for the self-absorption effect and plasma temperature variability. The aim of this research is to obtain a more accurate plasma temperature, which in turn enables a more precise determination of the elemental composition of unknown samples by CF-LIBS. Herein, an internal reference-external standard with iteration correction (IRESIC) method is proposed to correct for both the self-absorption effect and the plasma temperature in CF-LIBS, based on an internal reference line and a single standard sample. For self-absorption correction, the spectral intensities of each species are iteratively corrected against an internal reference line that suffers only negligible self-absorption. Furthermore, one standard sample matrix-matched with the unknown samples is combined with a genetic algorithm (GA) to determine an accurate plasma temperature for the unknown samples. To our knowledge, this is the first time a GA has been applied to iterative plasma temperature correction in CF-LIBS. Owing to the integrated merits of the internal reference line and accurate plasma temperature evaluation, the proposed method achieves a significant improvement in accuracy over classical CF-LIBS in the quantitative analysis of aluminum-bronze alloy samples.
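To illustrate the temperature-correction idea described above, the sketch below shows a minimal real-coded genetic algorithm that searches for the plasma temperature minimizing the mismatch between CF-LIBS-predicted and certified concentrations of the matrix-matched standard sample. This is only an illustrative sketch, not the paper's implementation: the error function `concentration_error`, the temperature search range, and all GA parameters are hypothetical placeholders standing in for the actual CF-LIBS closure calculation.

```python
import numpy as np

# Hypothetical placeholder: in practice this would run the CF-LIBS closure
# calculation at temperature T for the matrix-matched standard sample and
# return the total relative error between predicted and certified
# concentrations. A synthetic error with a minimum near 10 500 K stands in
# for that calculation here.
def concentration_error(T):
    return (T - 10500.0) ** 2 / 1e6

def ga_plasma_temperature(error_fn, t_min=5000.0, t_max=20000.0,
                          pop_size=40, generations=100,
                          mutation_sigma=200.0, elite_frac=0.2, seed=0):
    """Minimize error_fn(T) over [t_min, t_max] with a simple real-coded GA."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(t_min, t_max, size=pop_size)
    n_elite = max(1, int(elite_frac * pop_size))
    for _ in range(generations):
        scores = np.array([error_fn(t) for t in pop])
        order = np.argsort(scores)              # lower error = fitter
        elites = pop[order[:n_elite]]
        # Crossover: blend pairs of randomly chosen elite parents
        parents_a = rng.choice(elites, size=pop_size - n_elite)
        parents_b = rng.choice(elites, size=pop_size - n_elite)
        alpha = rng.uniform(size=pop_size - n_elite)
        children = alpha * parents_a + (1.0 - alpha) * parents_b
        # Mutation: Gaussian perturbation, clipped to the search range
        children += rng.normal(0.0, mutation_sigma, size=children.size)
        children = np.clip(children, t_min, t_max)
        pop = np.concatenate([elites, children])
    scores = np.array([error_fn(t) for t in pop])
    return pop[np.argmin(scores)]

T_best = ga_plasma_temperature(concentration_error)
print(f"GA-estimated plasma temperature: {T_best:.0f} K")
```

In an actual IRESIC workflow, the temperature returned by the GA would then be fed back into the iterative internal-reference self-absorption correction until the predicted composition of the standard sample converges; that coupling is omitted from the sketch for brevity.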