Spectral stability improvement in laser-induced breakdown spectroscopy based on an image auxiliary data preprocessing method
Abstract
Owing to the complex laser–material interaction, large spectral fluctuation is one of the main causes of the relatively poor quantitative analysis performance of laser-induced breakdown spectroscopy (LIBS). In this study, a data preprocessing method, namely minimum distance and subsequent averaging (MD & A) using plasma image information, is proposed to identify effective spectra and improve the spectral stability of LIBS. The correlations between plasma image features and spectral line intensities were analyzed, indicating that the average area intensity of the plasma image correlates strongly with the plasma state. To evaluate the performance of the proposed method, the original spectra were also preprocessed by traditional screening methods for comparison, and quantitative models for Mn, Cu, Cr and Si in steel samples were established separately using the partial least squares regression (PLSR) algorithm. Compared with the original spectra, the coefficient of determination R2 improves from about 93% to nearly 99%, while the mean root mean squared error of prediction (RMSEP) and the mean average relative error of prediction (AREP) are both reduced by over 50%. In addition, with MD & A the mean relative standard deviation (RSD) of different spectral line intensities is reduced from the range of 3.76%–18.48% to the range of 1.07%–3.12%. These results indicate that the proposed method is easy to implement and, compared with traditional methods, significantly improves both the spectral stability and the quantitative analysis performance of LIBS.
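The abstract only names the MD & A procedure; the sketch below is one plausible reading of it, not the authors' implementation. It assumes a per-shot plasma-image feature (e.g. average area intensity) is available, screens shots by their minimum distance to the central feature value, and averages the retained spectra. The function name, the median as the reference value, and the `keep_fraction` parameter are all illustrative assumptions.

```python
import numpy as np

def md_and_a(spectra, image_features, keep_fraction=0.5):
    """Sketch of minimum distance and subsequent averaging (MD & A).

    spectra:        (n_shots, n_channels) array of single-shot spectra
    image_features: (n_shots,) plasma-image feature per shot, e.g. the
                    average area intensity of the plasma image (assumed)
    keep_fraction:  fraction of shots retained (an assumed parameter)
    """
    features = np.asarray(image_features, dtype=float)
    # Distance of each shot's image feature from a central (median) value;
    # shots whose plasma state deviates least are treated as "effective".
    distances = np.abs(features - np.median(features))
    n_keep = max(1, int(len(features) * keep_fraction))
    keep = np.argsort(distances)[:n_keep]
    # Average the retained spectra to suppress shot-to-shot fluctuation.
    return np.asarray(spectra, dtype=float)[keep].mean(axis=0)
```

Screening before averaging is what distinguishes this from plain ensemble averaging: shots produced by anomalous plasma states are excluded, so the averaged spectrum has a lower RSD than the average of all shots.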