Data Engineer


Role Overview

This is a mid-level Data Engineer role on the Data Platform team at Snowflake, requiring 1-3 years of experience. Day-to-day responsibilities include designing and building complex data pipelines, developing data products, and collaborating with data science and engineering teams to scale infrastructure. The hire will have high impact by shaping data tools and informing product decisions across the company.

Perks & Benefits

The job is hybrid: three days a week in the Menlo Park office, with flexibility on the remaining days. It offers opportunities for career growth in a fast-moving, innovative environment, with a culture that values low-ego, experimental mindsets and collaboration across functions like Engineering and Product Management.

Full Job Description

At Snowflake, we are powering the era of the agentic enterprise. To usher in this new era, we seek AI-native thinkers across every function who are energized by the opportunity to reinvent how they work. You don’t just use tools; you possess an innate curiosity, treating AI as a high-trust collaborator that is core to how you solve problems and accelerate your impact. We look for low-ego individuals who thrive in dynamic and fast-moving environments and move with an experimental mindset — who rapidly test emerging capabilities to discover simpler, more powerful ways to deliver results. At Snowflake, your role isn't just to execute a function, but to help redefine the future of how work gets done.

We’re looking for a talented Data Engineer for our Data Platform team. In this role, you will work closely with data science, analytics, and engineering teams across the company to build best-in-class data infrastructure and tools. This is a high-impact role that will also help shape the future of Snowflake products and services.

Location: Menlo Park (must be in-office 3 days/week)

IN THIS ROLE YOU WILL:

  • Design, build, and own the key data infrastructure to manage data and usage across our Snowflake account

  • Interface with data scientists, product managers, and business stakeholders to understand data needs and help build data products and tools that scale across the company

  • Design, implement, and maintain custom data ingestion pipelines sourcing data from application APIs

  • Collaborate closely with Engineering, Product Management, IT, Finance, and Compliance to inform product decision making with data and to identify opportunities for system improvements.

  • Answer questions from the executive team for board reporting, publications, and industry reports

  • Think creatively to find optimal solutions to our complex, often unstructured and ambiguous problems

WHAT YOU WILL NEED:

  • 1-3 years of experience, including:

    • Designing and building complex data pipelines in a large-scale data environment

    • Familiarity with developing data products to drive data science and analytics initiatives

    • Experience with ETL and sourcing data from a variety of sources

    • Experience with Apache Airflow is desirable

    • Experience using Snowflake a plus

  • BS in a technical field (CS, Physics, Math, etc.); MS/PhD a plus

  • Proficiency in Python & SQL

  • Ability to clearly present learnings to business leaders and technical stakeholders

  • The ability to thrive in a dynamic environment. That means being flexible and willing to jump in and do whatever it takes to be successful.

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.

How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
