This study develops and evaluates a machine learning model for predicting optimal irrigation schedules using real-time environmental data collected from an Internet of Things (IoT) system. Building upon a previously validated smart farming monitoring system that provided real-time readings of temperature, humidity, and soil moisture, this research addresses the next step: moving from monitoring to predictive analytics. Data collected over a six-day period from DHT11 temperature and humidity sensors and from soil moisture sensors were used to train a predictive model. The model forecasts future soil moisture levels, thereby providing farmers with proactive irrigation recommendations. A Long Short-Term Memory (LSTM) neural network was employed to capture the temporal dependencies between atmospheric conditions and soil moisture. The model was trained on a portion of the collected data and validated on a separate, unseen dataset. The evaluation yielded a Mean Absolute Error (MAE) of 2.5%, a Root Mean Square Error (RMSE) of 3.1%, and a coefficient of determination (R²) of 0.92, demonstrating high predictive accuracy. This approach aims to enhance water resource management, reduce manual intervention, and improve crop health by ensuring water is supplied only when necessary. The results indicate that the machine learning model can accurately predict irrigation needs, offering a significant improvement over traditional, reactive monitoring systems and marking a substantial step towards data-driven precision agriculture.
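The three reported evaluation metrics follow standard definitions and can be reproduced directly from predicted and observed soil moisture values. A minimal NumPy sketch, using illustrative soil-moisture readings rather than the study's actual dataset:

```python
import numpy as np

def evaluate_forecast(y_true, y_pred):
    """Compute MAE, RMSE, and R² for soil-moisture predictions (in %)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                      # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))               # root mean square error
    ss_res = np.sum(err ** 2)                       # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                      # coefficient of determination
    return mae, rmse, r2

# Illustrative soil-moisture readings (%), not the study's data
actual    = [30, 32, 35, 40, 38]
predicted = [31, 31, 36, 39, 39]
mae, rmse, r2 = evaluate_forecast(actual, predicted)
print(f"MAE={mae:.2f}%  RMSE={rmse:.2f}%  R²={r2:.3f}")
```

These are the same metrics the abstract reports (MAE 2.5%, RMSE 3.1%, R² 0.92); applying this function to the study's held-out validation set would yield those values.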