Data Platform Engineer
Role Overview
This senior-level Data Platform Engineer role involves designing, building, and maintaining a declarative data platform to handle ingestion, transformation, and consumption at Lovable. You will develop core data infrastructure, ensure data quality and security, and collaborate with platform and domain teams to implement data pipelines. The role focuses on empowering teams to create scalable, reliable, and user-friendly data systems, impacting the company's data-driven operations.
Perks & Benefits
The job is fully remote, likely with flexible hours, though time zone expectations may align with Stockholm for collaboration. It offers a high-velocity, low-ego culture with opportunities for career growth in a talent-dense team at a generation-defining company. Benefits include working with cutting-edge tech and a culture of extreme ownership and impactful work, typical of tech startups.
Full Job Description
TL;DR - We are looking for data engineers to design, build, and maintain the end-to-end declarative data platform that powers ingestion, transformation, and consumption across Lovable. You are obsessed with empowering teams to ensure data systems are scalable, reliable, safe, and easy to use.
Why Lovable?
Lovable lets anyone and everyone build software with plain English. From solopreneurs to Fortune 100 teams, millions of people use Lovable to transform raw ideas into real products - fast. We are at the forefront of a foundational shift in software creation, which means you have an unprecedented opportunity to change the way the digital world works. Over 2 million people in 200+ countries already use Lovable to launch businesses, automate work, and bring their ideas to life. And we’re just getting started.
We’re a small, talent-dense team building a generation-defining company from Stockholm. We value extreme ownership, high velocity and low-ego collaboration. We seek out people who care deeply, ship fast, and are eager to make a dent in the world.
What we are looking for:
Strong experience with cloud-based data platforms (Snowflake, BigQuery, Redshift, Databricks or similar).
Deep understanding of data ingestion frameworks, orchestration (Airflow, Dagster, Prefect or similar), and infrastructure as code (Terraform, Pulumi or similar).
Familiarity with event-driven architectures (Kafka, Pub/Sub, Kinesis or similar).
Proficiency in SQL and distributed systems. Knowledge of Go is a plus.
Strong communication and documentation skills, with the ability to align technical details with strategic goals.
What you will do:
Develop and maintain core data infrastructure (data lake, warehouse, orchestration, observability, governance).
Partner with other platform and domain teams to design data pipelines and implement declarative configuration for ingestion and transformation.
Ensure data quality, lineage, and security standards are embedded across all workflows.
Collaborate with Analytics Engineers to define modeling conventions and enforce schema governance.
Implement CI/CD pipelines for data systems and assist with cost optimization and performance tuning.
Define data platform best practices and support self-service enablement.
Our tech stack
We're building with tools that both humans and AI love:
Frontend: React
Backend: Golang and Rust
Cloud: Cloudflare, GCP, AWS, and multiple LLM providers
DevOps & Tooling: GitHub Actions, Grafana, OTEL, infrastructure-as-code (Terraform)
And always on the lookout for what's next!
How we hire
Fill in a short form, then jump on an intro call with a recruiter.
Complete the general programming exercise.
Show us how you approach problems during several technical interviews.
Tell us about your most impressive project.
About your application
Please submit your application in English - it’s our company language, so you’ll be speaking lots of it if you join.
We treat all candidates equally - if you’re interested, please apply through our careers portal.