Analyzing high-dimensional data is a considerable challenge in statistics and data science. Issues such as multicollinearity and outliers often arise, leading to unstable coefficients and diminished model performance. Continuum regression (CR) is a useful method for calibration models because it handles multicollinearity by compressing the data into independent latent variables, yielding a more stable, precise, and reliable model. This dimension reduction preserves the important information in the original data, which makes CR effective for calibration modeling. In the initial phase, reducing dimensionality through variable selection is crucial.

This study builds and evaluates CR calibration models using LASSO and SIR-LASSO as variable-selection preprocessing methods. SIR-LASSO integrates sliced inverse regression (SIR) with the variable-selection capability of LASSO: SIR identifies relevant low-dimensional structure in high-dimensional data, while the LASSO penalty on the regression coefficients shrinks less informative or redundant variables. The combination is intended to improve SIR's effectiveness on high-dimensional data while enhancing model stability and interpretability, addressing both multicollinearity and model instability.

We conducted simulations on both low-dimensional and high-dimensional datasets to compare CR LASSO and CR SIR-LASSO. The analysis was carried out in R version 4.1.3: the "MASS" package generated multivariate normal data, the "glmnet" package performed LASSO variable selection, and the "LassoSIR" package performed SIR-LASSO variable selection.
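The simulation pipeline was implemented in R with the packages named above. As an illustrative analogue only, the first two steps (generating multicollinear multivariate normal predictors, then selecting variables with a LASSO penalty) can be sketched in Python; the sample size, dimension, correlation level, and coefficient values below are our own assumptions, not the study's actual design.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(42)

# Illustrative settings (not the study's actual scenarios):
n, p, rho = 100, 50, 0.9  # samples, predictors, pairwise correlation

# Equicorrelated covariance induces strong multicollinearity,
# mimicking data drawn with MASS::mvrnorm in R.
Sigma = np.full((p, p), rho) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Sparse true coefficients: only the first five predictors matter.
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]
y = X @ beta + rng.normal(scale=1.0, size=n)

# LASSO with a cross-validated penalty, analogous to glmnet's cv.glmnet.
lasso = LassoCV(cv=5, random_state=0).fit(X, y)
selected = np.flatnonzero(lasso.coef_ != 0)
print("selected predictor indices:", selected)
```

The selected columns of `X` would then feed the continuum regression step, which the R workflow handles separately.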
In the simulations, CR LASSO outperforms CR SIR-LASSO, yielding the lowest RMSEP in every scenario. SIR-LASSO becomes less stable as dimensionality grows, suggesting sensitivity to large numbers of variables. Across sample sizes and scenarios, CR LASSO generally predicts better, as shown by its lower median RMSEP values. Its RMSEP distributions are also consistently tighter, indicating more stable and reliable performance, whereas SIR-LASSO's results show more outliers and greater variation. LASSO maintains this advantage as the sample size grows, particularly when the continuum parameter is set to 0.5. SIR-LASSO, although occasionally competitive, generally produces more variable results, especially at larger sample sizes. Overall, LASSO appears to be the more reliable variable-selection preprocessing choice for the CR model.
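The comparisons above rest on RMSEP, the root mean squared error of prediction on held-out data. A minimal sketch of the metric (function and variable names are ours):

```python
import numpy as np

def rmsep(y_true, y_pred):
    """Root mean squared error of prediction on a hold-out set."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Across simulation replicates, a lower median RMSEP indicates better
# accuracy, and a tighter RMSEP distribution indicates more stable
# performance of the preprocessing choice.
print(rmsep([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # sqrt(4/3) ≈ 1.1547
```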