Influence of baseline subtraction on laser-induced breakdown spectroscopic data
Abstract
Spectra acquired by laser-induced breakdown spectroscopy (LIBS) comprise both a valuable analyte signal and an undesired background signal originating from various sources. The latter is commonly suppressed, at least partially, by gating the signal acquisition, or is treated numerically during data processing. Depending on the methodology applied, the numerical treatment may lead to loss of information or introduce spectral artefacts. Consequently, background subtraction can significantly influence both univariate and multivariate analysis of LIBS data. While various baseline correction methods have recently been studied and compared for multivariate LIBS analysis, their influence on the underlying LIBS data itself remains unexplored. Therefore, the present work aims to elucidate the effects of numerical background estimation and subtraction on LIBS data in terms of limits of detection estimated by the signal-to-noise ratio method, considering several fundamentally different background estimation algorithms: polynomial fitting, heuristic estimation, wavelet smoothing, and non-parametric modelling. A threshold gate delay value was observed above which the numerical treatment of the spectral background has to be performed cautiously. In addition, it was found that the optimal measurement parameters and the selection of the emission line yielding the best results depend on the planned spectral processing.
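As a minimal illustration of the kind of processing the abstract refers to, the sketch below applies one of the named approaches, iterative polynomial fitting with peak clipping, to a synthetic spectrum and then computes a signal-to-noise ratio of the corrected line. All numbers (wavelength range, line position, polynomial order, iteration count) are illustrative assumptions, not parameters from this study.

```python
import numpy as np

def polynomial_baseline(wavelengths, intensities, order=3, iterations=5):
    """Iteratively fit a polynomial baseline, clipping points above the fit
    so emission peaks do not pull the baseline upward (a common heuristic)."""
    y = intensities.copy()
    for _ in range(iterations):
        coeffs = np.polyfit(wavelengths, y, order)
        baseline = np.polyval(coeffs, wavelengths)
        y = np.minimum(y, baseline)  # clip peaks before the next fit
    return baseline

# Synthetic spectrum: sloped continuum background, one Gaussian emission
# line, and Gaussian noise (purely illustrative, not measured LIBS data)
rng = np.random.default_rng(0)
wl = np.linspace(300.0, 310.0, 500)            # wavelength axis, nm
continuum = 50.0 + 2.0 * (wl - 300.0)          # assumed background shape
line = 40.0 * np.exp(-0.5 * ((wl - 305.0) / 0.05) ** 2)
spectrum = continuum + line + rng.normal(0.0, 1.0, wl.size)

corrected = spectrum - polynomial_baseline(wl, spectrum)

# Signal-to-noise ratio: peak height of the corrected line over the
# standard deviation of a line-free region of the corrected spectrum
signal = corrected.max()
noise = corrected[wl < 303.0].std()
snr = signal / noise
```

The clipping step is what makes the fit a background estimate rather than a fit to the whole spectrum; with too few iterations the line still biases the baseline upward, while aggressive clipping on noisy data can bias it downward, which is one way such numerical treatment can distort the analyte signal.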