Data Engineer

Europe
Full-time

Role Summary

As a Data Engineer at Block Labs, you will lead the design and maintenance of a unified analytical data warehouse on ClickHouse. This senior-level role involves building scalable data pipelines, ensuring data accuracy, and collaborating with the data team to support business intelligence and advanced analytics initiatives. Your work will directly impact the efficiency and effectiveness of data-driven decision-making within the organization.

Benefits & Culture

Block Labs promotes a mature, mission-driven culture that values clarity and outcomes. The remote work setup allows for flexibility, making it suitable for various time zones across Europe. You'll have the opportunity for career growth in a low-ego environment, collaborating with top experts in the field while contributing to innovative Web3 projects.

Full Job Description

About Block Labs

Block Labs is a leading force in the Web3 space, incubating, investing in, and accelerating top-tier fintech, crypto, and iGaming projects. With a mission to shape the future of decentralized technology, we partner with visionary startups to raise funding, refine product-market fit, and grow their audiences. As we continue to expand, we are looking for an ambitious and self-driven individual to join our rapidly growing team!

About The Role

We are seeking a highly skilled Data Engineer to lead the design, implementation, and maintenance of our unified analytical data warehouse on ClickHouse. The ideal candidate is experienced in building scalable data pipelines, working with modern orchestration tools, and leveraging AWS services for data storage and processing. You will collaborate closely with the data team to ensure data accuracy, reliability, and accessibility.

Key Responsibilities

- Design, develop, and maintain a unified analytical data warehouse (DWH) on ClickHouse to support business intelligence and advanced analytics.
- Participate in DWH architecture design.
- Analyze and research current system functionality.
- Update documentation based on findings.
- Write automated tests (a sample data-quality test appears after this description).
- Develop and manage workflows and data orchestration using tools like Apache Airflow (a minimal pipeline sketch appears after this description).
- Ensure seamless integration of AWS services (e.g., Redshift, S3, Lambda) with ClickHouse for data storage and transformation.
- Implement CI/CD pipelines for data workflows to ensure quality and agility in deployments.
- Monitor, debug, and resolve issues related to data pipelines and systems.
- Maintain documentation for data infrastructure, processes, and best practices.

Required Skills and Experience

Technical Expertise:

- Python: proficiency in Python for building and optimizing data pipelines.
- Data Orchestration: hands-on experience with tools like Apache Airflow.
- CI/CD: familiarity with CI/CD tools (e.g., GitHub Actions, GitLab CI/CD) for automating data workflows.
- Data Modeling: expertise in designing efficient and scalable data models, e.g. star/snowflake schemas (a schema sketch appears after this description).

General Skills:

- Proven experience building and maintaining scalable DWHs and reliable ETL/ELT pipelines.
- Strong SQL skills for querying and performance optimization.
- Ability to work in an agile environment and adapt to evolving priorities.
- Excellent problem-solving skills with attention to detail.
- Effective communication and collaboration skills for working with technical and non-technical stakeholders.

Preferred Qualifications

- Cloud Platforms: experience with AWS services, including but not limited to Redshift, S3, Lambda, and Glue.
- ClickHouse: practical experience with ClickHouse for data processing and analysis.
- Experience building reports in Tableau or Power BI.
- Experience with data governance, quality frameworks, and security best practices.

What kind of culture can I expect?

Mature, mission-driven, and low-ego. We value clarity over noise, outcomes over theatrics, and pace without chaos. If you're one of the smartest minds in your craft and want to build with other experts, you'll feel at home here.
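
What does the work look like in practice?

The sketches below are illustrative only, not a description of our actual stack. First, a minimal example of the kind of orchestration work listed above: an Airflow DAG that loads a day of events from S3 into ClickHouse. It assumes Airflow 2.4+ and the clickhouse-connect client; the bucket, host, database, and table names are all hypothetical.

    # Minimal sketch: a daily Airflow DAG loading S3 data into ClickHouse.
    # Bucket, host, and table names are hypothetical placeholders.
    from datetime import datetime

    import clickhouse_connect  # assumption: the clickhouse-connect client library
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def load_events_to_clickhouse(ds: str, **_) -> None:
        # `ds` is Airflow's built-in logical-date string (YYYY-MM-DD).
        client = clickhouse_connect.get_client(host="clickhouse.internal")  # hypothetical host
        # ClickHouse can read S3 objects directly via its s3() table function,
        # so the transfer happens server-side rather than through the worker.
        # (Assumes a readable bucket; credentialed access takes extra s3() args.)
        client.command(
            "INSERT INTO analytics.events "
            "SELECT * FROM s3("
            f"'https://example-bucket.s3.amazonaws.com/events/{ds}/*.parquet', "
            "'Parquet')"
        )


    with DAG(
        dag_id="daily_events_to_clickhouse",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # `schedule` requires Airflow 2.4+
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="load_events",
            python_callable=load_events_to_clickhouse,
        )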
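For the data-modeling requirement, here is a hedged sketch of a small star schema on ClickHouse: one fact table and one dimension table, with the partitioning and sort keys that make time-bounded analytical queries cheap. Again, every table and column name is hypothetical.

    # Sketch of a star-schema layout on ClickHouse. Names are hypothetical.
    import clickhouse_connect

    client = clickhouse_connect.get_client(host="clickhouse.internal")  # hypothetical host

    # Dimension table: one row per user; ReplacingMergeTree collapses
    # re-inserted rows with the same sort key during merges.
    client.command("""
        CREATE TABLE IF NOT EXISTS analytics.dim_user (
            user_id     UInt64,
            country     LowCardinality(String),
            signup_date Date
        )
        ENGINE = ReplacingMergeTree
        ORDER BY user_id
    """)

    # Fact table: one row per transaction, partitioned by month so that
    # date-filtered queries scan only the relevant parts.
    client.command("""
        CREATE TABLE IF NOT EXISTS analytics.fact_transactions (
            txn_id     UInt64,
            event_date Date,
            user_id    UInt64,
            amount     Decimal(18, 2),
            currency   LowCardinality(String)
        )
        ENGINE = MergeTree
        PARTITION BY toYYYYMM(event_date)
        ORDER BY (event_date, user_id)
    """)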
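Finally, a sketch of the automated tests mentioned under the responsibilities: pytest checks asserting that yesterday's load of the hypothetical fact table above actually arrived and contains no duplicate keys.

    # Sketch of automated data-quality tests with pytest.
    # Targets the hypothetical analytics.fact_transactions table above.
    import clickhouse_connect
    import pytest


    @pytest.fixture
    def client():
        return clickhouse_connect.get_client(host="clickhouse.internal")  # hypothetical host


    def test_yesterday_partition_is_not_empty(client):
        # Freshness check: the daily load should have produced rows.
        rows = client.query("""
            SELECT count() FROM analytics.fact_transactions
            WHERE event_date = yesterday()
        """).result_rows
        assert rows[0][0] > 0, "no rows loaded for yesterday"


    def test_no_duplicate_transaction_ids(client):
        # Uniqueness check: txn_id should identify exactly one row.
        dupes = client.query("""
            SELECT count() FROM (
                SELECT txn_id, count() AS c
                FROM analytics.fact_transactions
                WHERE event_date = yesterday()
                GROUP BY txn_id
                HAVING c > 1
            )
        """).result_rows
        assert dupes[0][0] == 0, "duplicate txn_id values found"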
