Contract: Senior Data Engineer
Role Overview
This senior-level contract role involves building and operating a Data Platform as a Service for internal teams, focusing on scalable, secure, and well-governed data products. The engineer will work closely with data, analytics, and AI teams to enhance platform reliability, developer experience, and delivery speed through CI/CD, automation, and observability practices. Key responsibilities include enabling data mesh patterns, improving data quality frameworks, and optimizing platform performance and cost.
Perks & Benefits
The position is fully remote and open to candidates across Latin America, offering flexibility in work location. It provides opportunities for career growth through collaboration with diverse teams and the chance to influence data platform standards at a global company. The role emphasizes a trust-driven, inclusive culture with a focus on innovation and professional development in a remote tech environment.
Full Job Description
Upwork ($UPWK) is the world’s work marketplace. We serve everyone from one-person startups to over 30% of the Fortune 100 with a powerful, trust-driven platform that enables companies and talent to work together in new ways that unlock their potential.
Last year, more than $3.8 billion of work was done through Upwork by skilled professionals who are gaining more control by finding work they are passionate about and innovating their careers.
This is an engagement through Upwork's Hybrid Workforce Solutions (HWS) Team, a global group of professionals who support Upwork's business, with team members located all over the world.
In this hybrid engagement, the engineer will help build and operate Data Platform as a Service capabilities for internal teams. The role focuses on enabling scalable, secure, reliable, and well-governed data products through platform engineering practices: CI/CD for data, data mesh enablement, automation, observability, and self-service workflows. The engineer will partner closely with data engineering, analytics, and AI teams to improve platform reliability, developer experience, and time-to-delivery.
Work/Project Scope:
- Build and operate platform services that enable teams to deliver data products reliably (pipelines, transformations, orchestration, metadata, governance).
- Design and implement CI/CD for data (tests, deployments, promotion workflows, rollback strategies, versioning).
- Improve data platform reliability through observability, SLAs/SLOs, alerting, incident response, and runbooks.
- Enable data mesh patterns: domain ownership, standardized interfaces, reusable templates, and paved paths.
- Develop internal tooling and automation for onboarding datasets, creating standardized pipelines, and enforcing best practices (quality, security, lineage).
- Implement or enhance data quality and validation frameworks (contract testing, reconciliation, anomaly detection).
- Optimize platform performance and cost (warehouse optimization, job efficiency, resource scaling).
- Collaborate with Security/Compliance to ensure encryption, access control, auditability, and least-privilege practices.
- Partner with AI teams to ensure data products are fit for AI/ML workloads (feature readiness, dataset versioning, reproducibility, governance).
- Improve and maintain Airflow orchestration, including DAG design, dependency management, and operational reliability for dbt and analytics workflows.
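For illustration only (not part of the formal scope), the Airflow/dbt item above might translate into something like the following minimal DAG sketch. It assumes Airflow 2.4+ with the dbt CLI available on the worker; the DAG name, dbt project path, schedule, and task breakdown are all hypothetical.

    # Minimal sketch: an Airflow DAG orchestrating dbt build stages.
    # Assumes Airflow 2.4+ and a dbt CLI on the worker; the project
    # path and task breakdown here are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_analytics_daily",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 2},       # basic operational reliability
    ) as dag:
        # Load static reference data (CSV seeds) into the warehouse.
        dbt_seed = BashOperator(
            task_id="dbt_seed",
            bash_command="cd /opt/dbt/analytics && dbt seed",
        )
        # Build the transformation models.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/analytics && dbt run",
        )
        # Data quality gate: schema and data tests must pass before
        # downstream consumers rely on the output.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/analytics && dbt test",
        )
        dbt_seed >> dbt_run >> dbt_test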
Must Haves (Required Skills):
- Strong software engineering foundation in building production systems (Python and/or Rust preferred; strong APIs/services mindset).
- Proven experience in data platform engineering (not just building pipelines—building platforms for others).
- Hands-on experience with CI/CD, Infrastructure as Code, and automation.
- Experience with observability and reliability engineering (metrics, logs, tracing, SLOs, on-call readiness).
- Strong knowledge of modern data ecosystem patterns (data modeling, orchestration, warehousing/lakehouse concepts).
- Practical experience enabling data mesh or self-service platform capabilities.
- Ability to work through ambiguity, drive delivery, and influence standards.
Preferred Background/Experience:
- Experience with Snowflake and modern orchestration/testing patterns (dbt/SQLMesh-like workflows, Airflow/Dagster, data quality tools).
- Experience with Kubernetes and cloud-native deployments.
- Experience integrating metadata/catalog/lineage tooling (Atlan/Collibra/Amundsen/OpenMetadata, etc.).
- Familiarity with AI data requirements (dataset governance, experiment reproducibility, feature pipelines).
Upwork is an Equal Opportunity Employer committed to recruiting and retaining a diverse and inclusive workforce. We do not discriminate based on race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, or other legally protected characteristics under federal, state, or local law.
Please note that a criminal background check may be required once a conditional job offer is made. Qualified applicants with arrest or conviction records will be considered in accordance with applicable law, including the California Fair Chance Act and local Fair Chance ordinances. The Company is committed to conducting an individualized assessment and giving all individuals a fair opportunity to provide relevant information or context before making any final employment decision.
To learn more about how Upwork processes and protects your personal information as part of the application process, please review our Global Job Applicant Privacy Notice.