Multi-task learning (MTL) aims to improve generalization by leveraging shared information across related tasks. However, conventional methods often rely on restrictive, pre-defined assumptions about task relationships, limiting their effectiveness in complex, heterogeneous environments. This paper introduces a Hierarchical Bayesian Model for Adaptive Multi-Task Learning (HB-MTL), a fully integrated probabilistic framework that learns the inter-task relationship structure directly from the data. By placing hyper-priors on the parameters of a shared task distribution, the model can flexibly capture a rich mosaic of relationships, including positive, negative, and null correlations. We employ variational inference for tractable posterior approximation. We validate our approach on a challenging synthetic benchmark, "MetroSim," designed to emulate the structural complexities of real-world systems. The results demonstrate that our model significantly outperforms a suite of strong baselines, particularly in its ability to exploit negative correlations and to avoid negative transfer from unrelated tasks. The framework not only yields superior predictive accuracy but also provides an interpretable map of the learned task structure and robust uncertainty quantification, making it a powerful tool for practical applications.
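The abstract does not state the model's equations; as a minimal sketch of one standard way such a hierarchy can be set up (all symbols below, including $W$, $\Omega$, $\nu$, $\Psi$, $\sigma$, and the linear per-task predictors, are illustrative assumptions rather than the paper's notation):

\begin{align*}
  \Omega &\sim \mathcal{IW}(\nu, \Psi)
      && \text{hyper-prior on the } T \times T \text{ task covariance} \\
  W \mid \Omega &\sim \mathcal{MN}(0,\, I_d,\, \Omega)
      && \text{task weight matrix } W \in \mathbb{R}^{d \times T},\; w_t = W_{:,t} \\
  y_{t,i} \mid W &\sim \mathcal{N}\!\left(w_t^\top x_{t,i},\, \sigma^2\right)
      && \text{observation } i \text{ of task } t
\end{align*}

Under a construction of this kind, positive, negative, or near-zero off-diagonal entries of $\Omega$ would realize the positive, negative, and null inter-task correlations the abstract describes, and variational inference would approximate the joint posterior $p(W, \Omega \mid \mathcal{D})$ with a tractable factorized family.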