Lead Data Engineer


Role Overview

As a Lead Data Engineer at AirOps, you will be responsible for owning and scaling the data platform that drives insights on AI search visibility and content performance. This senior role involves designing and operating data pipelines, defining company-wide data models, and leading a high-output team to ensure data quality and observability. Your work will directly shape customer-facing analytics and product features, and will give content and growth teams a clear path from strategy to execution.

Perks & Benefits

AirOps offers equity in a rapidly growing startup along with a competitive benefits package tailored to your location. The flexible time off policy and generous parental leave reflect a supportive work culture. With a fully remote setup and a fun-loving team that values quality and curiosity, this role provides opportunities for career growth and collaboration across various functions.


Full Job Description

About AirOps

AirOps is the first end-to-end content engineering platform built for the AI era. In a world where discovery is shifting from traditional search to AI-driven platforms, we help brands get found—and stay found. We are currently in a phase of hyper-growth, having 5x’d our revenue in the last year by helping marketing teams at Ramp, Chime, Carta, and Rippling turn content quality into a durable competitive advantage.

Our platform equips marketers to navigate the new discovery landscape, prioritize high-impact opportunities, and create accurate, on-brand content that earns citations from AI and trust from humans. Backed by Greylock, Unusual Ventures, Wing VC, and Founder Collective, we are building the intelligent systems that will empower the next generation of marketing leaders. AirOps is headquartered in San Francisco, New York, and Montevideo.

About the Role

As Lead Data Engineer, you will own and scale the data platform that powers AirOps insights on AI search visibility and content performance. You will set technical direction, write production code, and build a small, high-output team that turns raw web, content, and AI agent data into trustworthy datasets.

Your work will drive customer-facing analytics and product features while giving our content and growth teams a clear path from strategy to execution. You value extreme ownership, sweat the details on data quality, and love partnering across functions to ship fast without losing rigor.

Key Responsibilities

  • Data platform ownership: design, build, and operate batch and streaming pipelines that ingest data from crawlers, partner APIs, product analytics, and CRM.

  • Core modeling: define and maintain company-wide models for content entities, search queries, rankings, AI agent answers, engagement, and revenue attribution.

  • Orchestration and CI: implement workflow orchestration with Airflow or Prefect, along with dbt-based transformations, version control, and automated testing.

  • Data quality and observability: set SLAs, add tests and data contracts, monitor lineage and freshness, and lead root cause analysis.

  • Warehouse and storage: run Snowflake or BigQuery and Postgres with strong performance, cost management, and partitioning strategies.

  • Semantic layer and metrics: deliver clear, documented metrics datasets that power dashboards, experiments, and product activation.

  • Product and customer impact: partner with Product and Customer teams to define tracking plans and measure content impact across on-site and off-site channels.

  • Tooling and vendors: evaluate, select, and integrate the right tools for ingestion, enrichment, observability, and reverse ETL.

  • Team leadership: hire, mentor, and level up data and analytics engineers; establish code standards, review practices, and runbooks.
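By way of illustration, the data-quality work described above (freshness SLAs, monitoring, alerting on breaches) often boils down to checks like the following minimal Python sketch. The table names and SLA windows here are invented for the example, not taken from the listing:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-table freshness SLAs; real values would live in config.
FRESHNESS_SLA = {
    "search_rankings": timedelta(hours=6),
    "ai_agent_answers": timedelta(hours=24),
}

def check_freshness(last_loaded: dict, now: datetime = None) -> list:
    """Return the names of tables whose latest load breaches its SLA.

    last_loaded maps table name -> timestamp of the most recent load;
    a missing table counts as stale.
    """
    now = now or datetime.now(timezone.utc)
    stale = []
    for table, sla in FRESHNESS_SLA.items():
        loaded_at = last_loaded.get(table)
        if loaded_at is None or now - loaded_at > sla:
            stale.append(table)
    return stale
```

In practice a check like this would run on a schedule (e.g. as an Airflow task) and page the on-call engineer or fail the pipeline when it returns a non-empty list.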

Qualifications

  • 5+ years in data engineering with 2+ years leading projects

  • Expert SQL and Python with deep experience building production pipelines at scale

  • Hands-on with dbt and a workflow manager such as Airflow or Prefect

  • Strong background in dimensional and event-driven modeling, including building a company-wide metrics layer

  • Experience with Snowflake or BigQuery, plus Postgres for transactional use cases

  • Track record building data products for analytics and customer reporting

  • Cloud experience on AWS or GCP and infrastructure-as-code tooling such as Terraform

  • Domain experience in SEO, content analytics, or growth experimentation is a plus

  • Clear communicator with a bias for action, curiosity, and a high bar for quality

Our Guiding Principles

  1. Extreme Ownership

  2. Quality

  3. Curiosity and Play

  4. Make Our Customers Heroes

  5. Respectful Candor

Benefits

  • Equity in a fast-growing startup

  • Competitive benefits package tailored to your location

  • Flexible time off policy

  • Parental leave

  • A fun-loving and (just a bit) nerdy team that loves to move fast!
