We want future AI systems to have superhuman epistemics: the ability to parse evidence at enormous scale and draw rigorous conclusions for both themselves and their users. Search is the capability that determines whether a model can pick a signal out of noise, weigh conflicting evidence, and know what it doesn't know. Every higher-order capability we care about depends on search being trustworthy. If we want Claude to be a trustworthy collaborator on real knowledge work, it has to be a trustworthy searcher.

We're hiring a Research Engineer to advance the science and engineering behind making Claude that trustworthy searcher. This is a research role for someone who is unusually rigorous: you'll define hypotheses about what makes a model an epistemically sound searcher, design the experiments that test them, and turn search post-training from a craft into a measurable science. You'll be the person who insists on cleanly isolated variables, calibrated metrics, and reproducible signal, and you'll have the engineering skill to build the infrastructure needed to get them. This work sits at the intersection of reinforcement learning, retrieval, and evaluation, and it directly shapes how Claude behaves in any setting where evidence matters: research, analysis, agentic workflows, and beyond.
Job Type: Full-time
Career Level: Senior