Program Manager, Data Operations


Role Overview

This is a senior-level Program Manager role focused on scaling data operations for AI research. Day-to-day, the hire will scope and execute expert-driven data programs, design pipelines to transform expert input into RL-ready datasets, and build tools to automate workflows. They will work closely with research engineering and domain experts to ensure high-quality, consistent data delivery that directly impacts the training of advanced AI models.

Perks & Benefits

The role is hybrid, based in Mountain View, CA or Bangalore, India, with flexible work arrangements that allow flexibility in location and time zones. Benefits include a competitive salary and equity, and health coverage. There is also the opportunity to help define a new product category in AI infrastructure, with significant room for career growth and impact.

Full Job Description

About Bespoke Labs

Bespoke Labs is an applied AI research lab pioneering data and RL environment curation for training and evaluating agents.

Recently, we curated Open Thoughts, one of the best open reasoning datasets used by multiple frontier labs, trained SOTA specialized models such as Bespoke-MiniChart-7B and Bespoke-MiniCheck, and taught agents to do multi-turn tool-calling with reinforcement learning.

Bespoke is uniquely positioned to capture a large market share of data and RL environment curation.

About the Role

We’re looking for a Program Manager, Data Operations to scale the data ecosystem behind frontier AI models—powering next-generation Reinforcement Learning (RL) environments, agentic tasks, and datasets that directly shape how advanced models learn, reason, and act in the world. This role owns the execution of expert-driven data programs, combining program management with hands-on workflow design, quality definition, and delivery of RL-ready datasets.

What you'll do

  • Scope and execute expert-driven data programs, from task design to delivery

  • Onboard and support domain experts to produce consistent outputs with speed

  • Design pipelines that transform expert input into reviewed, structured RL data

  • Define quality benchmarks, throughput targets, and dataset acceptance criteria

  • Translate model needs into scalable data workflows with Research Engineering

  • Build learning-driven project experiences that help experts improve during delivery

  • Build tools/systems to standardize and automate workflows to scale operations

  • Track execution, unblock operations issues, and keep projects ahead of schedule

  • Monitor and optimize throughput, quality, and cost across Data Ops projects

  • Measure program success using quality, consistency, and delivery metrics

What we're looking for

  • Comfortable balancing hands-on workflow design with program coordination

  • Strong program/project management in data, ML, or ops-heavy environments

  • Ability to turn ambiguous research needs into clear guidelines and datasets

  • Experience working with domain experts and distributed contributors

  • Familiarity with ML/RL data pipelines, evaluation, and agent workflows

  • Detail-oriented with high standards for data quality and consistency

  • Systems thinker who can scale processes by orders of magnitude

  • Clear communicator across technical and non-technical stakeholders

  • Bias for ownership, speed, and continuous excellence

Logistics

Location: Mountain View, CA or Bangalore, India (hybrid).

Compensation: Competitive salary and equity based on experience.

Benefits: Health coverage, flexible work arrangements, and the opportunity to help define a new product category in AI infrastructure.

We encourage you to apply even if you don't meet every qualification. We value diverse perspectives and are happy to discuss how your unique background could be a great fit for this role.
