Job Description
WHO WE ARE
Foundation models have transformed text and images, but structured data - the largest and most consequential data modality in the world - has remained untouched. Tables power every clinical trial, every financial model, every scientific experiment, every business decision. No one has built a foundation model that truly understands them.
Until now. What LLMs did for language, we're doing for tables. The next modality shift in AI is happening - and we're hiring the team that makes it.
Momentum: We pioneered tabular foundation models and are now the world-leading organization in structured data ML. Our TabPFN v2 model was published in Nature https://www.nature.com/articles/s41586-024-08328-6 and set a new state-of-the-art for tabular machine learning. Since its release, we've scaled model capabilities more than 20x, reached 3M+ downloads, 6,000+ GitHub stars, and are seeing accelerating adoption across research and industry - from detecting lung disease with Oxford Cancer Analytics https://www.oxcan.org/news/prior-labs-and-oxford-cancer-analytics-partner-to-advance-liquid-biopsy-and-clinical-decision-making-in-lung-disease to preventing train failures with Hitachi https://siliconangle.com/2025/12/01/prior-labs-debuts-tabular-ai-foundation-model-scales-10-million-rows/ to improving clinical trial decisions with BostonGene https://priorlabs.ai/case-studies/boston-gene.
The hardest work is in front of us. We're scaling tabular foundation models to handle millions of rows, thousands of features, real-time inference, and entirely new data modalities - while building the infrastructure to deploy them in production across some of the most demanding industries on earth. These are open problems no one else is working on at this level.
Our team: We’re a small, highly selective team https://priorlabs.ai/about of 20+ engineers, researchers, and GTM specialists, selected from over 5,000 applicants, with backgrounds spanning Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN. We're led by Frank Hutter https://www.linkedin.com/in/frank-hutter-9190b24b/, Noah Hollmann https://www.linkedin.com/in/noah-hollmann-668b9010b/, and Sauraj Gambhir https://www.linkedin.com/in/sauraj-g/, and advised by world-leading AI researchers such as Bernhard Schölkopf and Turing Award winner Yann LeCun. We ship fast, create top-tier research, and hold each other to an extremely high bar.
What’s Next: In 2025, we raised a €9m pre-seed led by Balderton Capital, backed by leaders from Hugging Face, DeepMind, and Black Forest Labs. The next phase of growth is here, which makes this an ideal time to join.
ABOUT THE ROLE
Tabular data breaks the assumptions that make scaling work for language and vision. There's no natural sequence, no spatial structure, no shared vocabulary across datasets. The architectures and scaling laws that power LLMs don't transfer. We've made the first breakthrough with TabPFN — but the hardest problems are still ahead.
At Prior Labs, Research Scientists drive the core model agenda. You'll define research directions, design novel architectures, and publish work that advances the field — while ensuring your ideas translate into models that actually ship. Research and production aren't separate tracks here: the same people do both. As an early team member, you'll have significant technical ownership and room to grow as we scale.
The problems we're solving:
- Scaling transformer architectures from 10K to 1M+ samples — without the structural assumptions that make language models scale
- Building multimodal models that combine tabular, text, and numerical understanding
- Making models efficient enough for real-world deployment — not just accurate enough for a paper
- Designing architectures for time series, forecasting, anomaly detection, and multiple related tables
- Researching causal understanding in foundation models
What We're Looking For
- PhD in Computer Science, Applied Mathematics, Statistics, Electrical Engineering, or a closely related field, or equivalent research experience with demonstrated impact
- Publications at top-tier ML venues (NeurIPS, ICML, ICLR, etc.) or equivalent impact through widely used open-source software, benchmarks, or deployed systems
- Strong experience building and analyzing machine learning models, including transformer or other sequence-based architectures, using PyTorch
- Solid understanding of training dynamics, generalization, scaling behavior, and common failure modes in deep learning systems
- Excellent engineering fundamentals and strong Python skills, with a track record of writing high-quality research code
Nice to Have
- Experience at an early-stage startup or research lab with a shipping culture
- Contributions to open-source ML libraries or tools
- Experience with model distillation, inference optimization, or efficient architectures
- Background in tabular data, time series, or other structured data
Life at Prior Labs
We're a small, ambitious team solving one of the hardest problems in AI, and we're just getting started. You'll work closely with world-class researchers and builders who care deeply about the quality of their craft, the impact of their work, and the people they work with.
We move fast, we think rigorously, and we take the time to do things right. If you're excited by hard problems, motivated by real-world impact, and want to be part of building something that matters, we'd love to hear from you.
Our Commitments
We believe the best products and teams come from a wide range of perspectives, experiences, and backgrounds. That's why we welcome applications from people of all identities and walks of life, especially anyone who's ever felt discouraged by "not checking every box."
We're committed to creating a safe, inclusive environment and providing equal opportunities regardless of gender, sexual orientation, origin, disability, or any other trait that makes you who you are.