Transformer-Based Tabular Foundation Models: Outperforming Traditional Methods with TabPFN
Babu, R Anand; Priya V, Vishwa; Kumar Mishra, Manoj; Ramesh Raja, Inakoti; Kiran Chebrolu, Surya; Swarna, B
International Journal of Engineering, Science and Information Technology Vol 5, No 3 (2025)
Publisher : Malikussaleh University, Aceh, Indonesia

DOI: 10.52088/ijesty.v5i3.1146

Abstract

Scientific research and commercial applications rely heavily on tabular data, yet modelling this data efficiently has remained a persistent challenge. For over twenty years, the standard approach in machine learning has been traditional models, led by gradient-boosted decision trees (GBDTs). Despite recent advances in deep learning, neural networks often fail to deliver satisfactory results on compact tabular datasets due to factors such as overfitting, insufficient data, and intricate feature relationships. This study presents TabPFN, a Tabular Prior-data Fitted Network: a transformer-based foundation model developed by meta-learning on more than one million sequentially generated synthetic datasets, designed to tackle these limitations. Drawing inspiration from the success of GPT-like models in natural language processing, TabPFN learns to predict good solutions for tabular problems without retraining or hyperparameter optimization. On small to medium-sized datasets, its state-of-the-art inference speed and accuracy outperform those of traditional methods. TabPFN redefines efficient and scalable tabular data modelling, offering generative capabilities, few-shot learning, and rapid adaptation.
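The workflow the abstract describes can be illustrated with a short sketch. TabPFN's published implementation follows the scikit-learn estimator interface (`fit`/`predict`) and, unlike a GBDT, requires no hyperparameter tuning: `fit` simply stores the training set as in-context examples for a single forward pass. The runnable part below trains a traditional gradient-boosted baseline on a small tabular dataset; the commented lines show how a TabPFN classifier would slot into the same pipeline (this assumes the `tabpfn` package is installed, and is a sketch rather than a reproduction of the paper's experiments).

```python
# Sketch: GBDT baseline vs. the drop-in TabPFN interface on a small
# tabular dataset. Only the GBDT part is executed here.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Traditional baseline: a gradient-boosted decision tree ensemble,
# which would normally also need hyperparameter search.
gbdt = GradientBoostingClassifier(random_state=0)
gbdt.fit(X_train, y_train)
acc = accuracy_score(y_test, gbdt.predict(X_test))
print(f"GBDT accuracy: {acc:.3f}")

# TabPFN drop-in (assumption: `tabpfn` package available; no tuning,
# fit() stores the training set as context for one forward pass):
# from tabpfn import TabPFNClassifier
# clf = TabPFNClassifier()
# clf.fit(X_train, y_train)
# print("TabPFN accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The key design point the paper emphasizes is that the "training" cost for TabPFN is paid once, during meta-learning on synthetic data; per-dataset adaptation then happens in-context at inference time.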