Show HN: TabPFN-2.5 – SOTA foundation model for tabular data
- #foundation-models
- #tabular-data
- #machine-learning
- TabPFN-2.5 is the next-generation tabular foundation model, scaling to 20× more data cells than TabPFNv2.
- It outperforms tuned tree-based models and matches AutoGluon 1.4's accuracy on benchmarks with up to 50,000 data points and 2,000 features.
- A new distillation engine converts TabPFN-2.5 into compact MLP or tree ensembles, reducing latency while maintaining accuracy.
- Tabular foundation models (TFMs) like TabPFN-2.5 offer training-free prediction with strong calibration and generalization, eliminating the need for extensive tuning (a usage sketch follows this list).
- TabPFN-2.5 supports messy, heterogeneous data, including categorical features, missing values, and outliers, at scales of up to 50,000 samples and 2,000 features.
- Through distillation, the model achieves much lower inference latency and greater deployment flexibility (see the distillation sketch below).
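
Here is a minimal sketch of the training-free workflow. It assumes the sklearn-style `TabPFNClassifier` from the open-source `tabpfn` Python package and uses a small scikit-learn dataset for illustration; exact class names and defaults for TabPFN-2.5 may differ.

```python
# Minimal sketch of training-free prediction, assuming the open-source `tabpfn`
# package's sklearn-style TabPFNClassifier; TabPFN-2.5 specifics may differ.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "fit" only stores the training set as in-context examples; no gradient training runs.
clf = TabPFNClassifier()
clf.fit(X_train, y_train)

# A forward pass over the context yields calibrated class probabilities.
proba = clf.predict_proba(X_test)
print("accuracy:", clf.score(X_test, y_test))
```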
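
The distillation idea can be illustrated with a generic teacher-student sketch. This is not the TabPFN-2.5 distillation engine itself, just plain knowledge distillation into a small scikit-learn MLP; the `tabpfn` classifier, the synthetic dataset, and all parameter choices are assumptions made for illustration.

```python
# Illustrative teacher-student distillation, NOT the TabPFN-2.5 distillation engine:
# train a compact MLP on the foundation model's predictions to cut inference latency.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from tabpfn import TabPFNClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Teacher: TabPFN conditions on the labeled context; no gradient-based training step.
teacher = TabPFNClassifier()
teacher.fit(X_train, y_train)

# Transfer set: the training data plus lightly jittered copies to densify coverage.
rng = np.random.default_rng(0)
X_transfer = np.vstack([X_train, X_train + 0.05 * rng.normal(size=X_train.shape)])

# Student: a small MLP fit to the teacher's predicted labels on the transfer set.
student = MLPClassifier(hidden_layer_sizes=(128, 128), max_iter=500, random_state=0)
student.fit(X_transfer, teacher.predict(X_transfer))

print("teacher accuracy:", teacher.score(X_test, y_test))
print("student accuracy:", student.score(X_test, y_test))
```

The announced engine reportedly distills into MLPs or tree ensembles while maintaining accuracy; the sketch above only shows the general shape of that latency-for-capacity trade-off.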