DevOps
Antal Sp. z o.o.
⚲ Kraków
180 - 210 PLN net (B2B)
Requirements
- BigQuery
- GCP
- Apache Airflow
- ETL tools
- ETL
Job description
DevOps Cloud Engineer (GCP / Data Platform / Airflow)
📍 Hybrid (2 days per week from Cracow)

We are looking for an experienced DevOps Cloud Engineer to join a team responsible for building and evolving a scalable, multi-tenant data integration platform on Google Cloud Platform (GCP). The platform leverages Apache Airflow to orchestrate ETL/ELT pipelines, enabling reliable data movement from source systems through the platform into downstream vendor systems. A key focus of the role is improving automation and deployment pipelines, and implementing end-to-end data lineage for full visibility of data flows.

Key responsibilities
• Design, develop, test, and maintain ETL/ELT pipelines orchestrated via Apache Airflow (DAGs)
• Build and enhance platform capabilities supporting multi-tenant data integration use cases
• Design and implement scalable deployment pipelines for Apache Airflow environments
• Develop and optimise SQL-based transformations and integrations, including work with Google BigQuery
• Implement data lineage capabilities to enable end-to-end traceability of data flows
• Automate operational and engineering processes to improve reliability, scalability, and efficiency
• Build and maintain test automation frameworks (regression and performance testing)
• Work in an Agile delivery environment, contributing to iterative development and continuous improvement
• Troubleshoot complex issues across infrastructure, pipelines, and data flows; drive root cause analysis and preventive solutions
• Collaborate with engineers, product stakeholders, and partner teams to ensure alignment and delivery

Required skills & experience
• 10+ years of experience in software engineering, DevOps, and/or cloud engineering roles
• Strong hands-on experience with Google Cloud Platform (GCP), including BigQuery and related services
• Proven experience building and maintaining ETL/ELT pipelines using Apache Airflow (DAG design, scheduling, monitoring, scaling)
• Advanced SQL skills for
data transformation, validation, and performance optimisation
• Experience designing multi-tenant architectures and scalable, resilient platforms
• Strong background in CI/CD and deployment automation, ideally for data platforms or Airflow environments
• Experience with test automation, including regression and performance testing frameworks
• Solid understanding of data concepts: data quality, metadata, governance, and data lineage
• Familiarity with data visualisation tools for operational or lineage insights
• Strong problem-solving skills with a focus on automation and reliability improvements
• Excellent communication skills and ability to work in cross-functional Agile teams

Why apply for an Antal job offer?
When your application is successful, you will be supported by a dedicated Consultant who will stay in regular contact with you (via email or phone), help you prepare for interviews with your future employer, and ensure a smooth and professional recruitment process.

About Antal
Antal is a leading recruitment and HR advisory company, present in Poland since 1996, with later expansion to the Czech Republic and Hungary. Across the CEE region, we employ around 150 professionals who deliver a full range of services – from specialist and executive recruitment, employee outsourcing and HR consulting, to employer branding and market research. Our division-based structure combines deep industry expertise with functional specialisation, enabling us to provide tailored solutions for companies in every sector. We act as a trusted partner for both employers and candidates, sharing our knowledge and guiding them through every stage of the talent journey. We connect exceptional people with the right opportunities and help organisations build successful teams.
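For candidates less familiar with Airflow's model: the DAGs mentioned in the responsibilities are simply dependency graphs of tasks, which the scheduler executes in topological order. A minimal, library-free Python sketch of that idea (the task names are illustrative only, not taken from the actual platform):

```python
from graphlib import TopologicalSorter

# Hypothetical ETL pipeline: each task maps to the set of tasks it
# depends on, just as an Airflow DAG encodes dependencies via `>>`.
pipeline = {
    "extract_source": set(),
    "validate_schema": {"extract_source"},
    "transform_bq": {"validate_schema"},
    "load_vendor": {"transform_bq"},
    "record_lineage": {"transform_bq"},
}

# A valid execution order: every task appears after its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

In a real Airflow deployment the same structure is declared with operators inside a `DAG` object, and the scheduler handles retries, scheduling intervals, and parallel execution of independent tasks (here, `load_vendor` and `record_lineage`).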