Regression analysis is a statistical technique used to model the relationship between predictor and response variables. One of the parameter estimation methods most commonly used in regression analysis is Ordinary Least Squares (OLS), which, under the classical assumptions, produces unbiased and efficient estimates known as the Best Linear Unbiased Estimator (BLUE). In multiple linear regression, which involves more than one predictor variable, model assumptions such as the absence of multicollinearity must be met. Multicollinearity is a condition in which predictor variables are highly correlated, which destabilizes the parameter estimates. The Ridge Regression and Jackknife Ridge Regression methods were therefore used to address this issue; both modify least squares estimation by adding a bias constant. This research uses Open Unemployment Rate (OUR) data for Sumatra in 2022, in which three predictor variables exhibit multicollinearity. Based on a comparison of Mean Squared Error (MSE) values, the Jackknife Ridge Regression method yields the smallest MSE, 0.004. Both methods are effective in addressing multicollinearity and in identifying significant predictor variables for OUR on Sumatra Island, namely the Human Development Index (HDI), average years of schooling, the number of poor people, Life Expectancy (LE), population density, and the inactive population.
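To illustrate the idea of adding a bias constant to the least squares estimator, the sketch below computes the closed-form ridge estimator on synthetic collinear data and reports its MSE. This is a minimal illustration, not the authors' analysis: the data, the bias constant k = 0.1, and all variable names are assumptions made for the example, and the Jackknife Ridge refinement is not shown.

```python
# Minimal sketch (illustrative only): ridge regression via the closed-form
# estimator beta_ridge = (X'X + kI)^{-1} X'y on synthetic collinear data.
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Two highly correlated predictors plus an independent one (multicollinearity).
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 + 0.5 * x3 + rng.normal(scale=0.5, size=n)

# Standardize the predictors and center the response, as is usual for ridge.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
yc = y - y.mean()

k = 0.1                                    # bias constant (hypothetical value)
p = Xs.shape[1]
beta_ridge = np.linalg.solve(Xs.T @ Xs + k * np.eye(p), Xs.T @ yc)

# Compare fits by the mean squared error of the residuals.
mse = np.mean((yc - Xs @ beta_ridge) ** 2)
print("ridge coefficients:", beta_ridge)
print("MSE:", mse)
```

In the same spirit, the Jackknife Ridge estimator would be obtained by resampling (leaving out one observation at a time) around the ridge estimate to further reduce its bias, and the resulting MSE values would then be compared as in the study.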