This study investigates and compares the predictive performance of Linear Regression and XGBoost algorithms in estimating Graphics Processing Unit (GPU) prices from their technical specifications. GPU prices are known for their high volatility, influenced not only by hardware characteristics such as memory capacity, clock speed, and bandwidth, but also by external market factors including demand from the gaming industry, machine learning applications, and cryptocurrency mining. The dataset used in this research comprises 475 GPU units from three leading manufacturers (NVIDIA, AMD, and Intel's Arc line), each described by 15 technical attributes obtained from publicly accessible data sources. Adopting an experimental quantitative approach, the dataset was divided into training and testing subsets using an 80:20 ratio. The data preprocessing phase involved handling missing values, detecting outliers with the Interquartile Range (IQR) method, normalizing numerical features, and encoding categorical features. The models were evaluated using four performance metrics: the Coefficient of Determination (R²), Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE). The results demonstrate that XGBoost outperforms Linear Regression, achieving an R² of 0.8129, an MAE of 85.07 USD, an RMSE of 122.03 USD, and a MAPE of 35.23%. In comparison, the Linear Regression model recorded an R² of 0.7629, an MAE of 106.59 USD, an RMSE of 137.38 USD, and a MAPE of 56.04%. The superior performance of XGBoost can be attributed to its ability to model non-linear relationships and capture complex interactions among GPU specification features.
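
The evaluation workflow summarized above can be sketched roughly as follows. This is a minimal illustration, not the study's exact configuration: the file name gpu_specs.csv, the target column price_usd, the imputation strategies, and the XGBoost hyperparameters are all assumptions introduced only for the example.

```python
# Illustrative sketch: IQR outlier filtering, preprocessing, 80:20 split,
# and comparison of Linear Regression vs. XGBoost on the four reported metrics.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LinearRegression
from sklearn.metrics import (mean_absolute_error, mean_absolute_percentage_error,
                             mean_squared_error, r2_score)
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from xgboost import XGBRegressor

gpu_df = pd.read_csv("gpu_specs.csv")   # hypothetical file name
target = "price_usd"                    # hypothetical target column

# Drop price outliers with the 1.5 * IQR rule described in the preprocessing step.
q1, q3 = gpu_df[target].quantile([0.25, 0.75])
iqr = q3 - q1
gpu_df = gpu_df[gpu_df[target].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)]

X, y = gpu_df.drop(columns=[target]), gpu_df[target]
num_cols = X.select_dtypes(include="number").columns
cat_cols = X.select_dtypes(exclude="number").columns

# Impute missing values, normalize numeric features, one-hot encode categorical ones.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), num_cols),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("encode", OneHotEncoder(handle_unknown="ignore"))]), cat_cols),
])

# 80:20 train/test split, as in the experimental design.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

models = {
    "Linear Regression": LinearRegression(),
    "XGBoost": XGBRegressor(n_estimators=300, learning_rate=0.1, max_depth=5),
}

for name, model in models.items():
    pipe = Pipeline([("prep", preprocess), ("model", model)])
    pipe.fit(X_train, y_train)
    pred = pipe.predict(X_test)
    print(f"{name}: R2={r2_score(y_test, pred):.4f}  "
          f"MAE={mean_absolute_error(y_test, pred):.2f}  "
          f"RMSE={np.sqrt(mean_squared_error(y_test, pred)):.2f}  "
          f"MAPE={mean_absolute_percentage_error(y_test, pred) * 100:.2f}%")
```

The printed metrics correspond directly to the R², MAE, RMSE, and MAPE values reported for the two models; actual scores depend on the dataset and hyperparameters used.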