Mid Data Engineer (AWS)
TQLO SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
Warszawa
150–170 PLN/h net (B2B)
Requirements
- AWS
Job description
Our Client is an international organization developing modern data-driven solutions in a cloud environment. The project focuses on building and developing scalable data pipelines and a data platform based on AWS. We are looking for a Data Engineer who wants to grow in the data space and have a real impact on data architecture and quality.

WORK MODE: 100% remote

WHAT WILL YOU DO?
• Design and implement batch data pipelines in AWS (Amazon Web Services) using AWS Glue
• Develop and optimize data transformations in Apache Spark and SQL (Structured Query Language), with a focus on performance and maintainability
• Co-create data models for analytical solutions (data lake / data warehouse)
• Ensure data quality, monitoring, and reliability of pipelines in distributed environments
• Collaborate with business and technical teams to understand requirements

WHAT ARE WE LOOKING FOR?
• Min. 4 years of experience as a Data Engineer
• Strong knowledge of SQL and Python (production-level coding)
• Hands-on experience with AWS (Glue, S3, Lambda, Redshift)
• Experience with ETL/ELT pipelines (Extract, Transform, Load / Extract, Load, Transform)
• Experience with CI/CD (e.g. GitLab) and Terraform (Infrastructure as Code)
• Experience working in Agile environments and communicative English (min. B2/C1)

Nice to have:
• Knowledge of Power BI
• Experience with data governance and data quality practices
• A proactive mindset and willingness to improve existing solutions

WHY IS IT WORTH JOINING?
• 100% remote work in an international environment
• Project for a minimum of 6 months with possible extension
• Real impact on data architecture and technical decisions
• Modern tech stack (AWS, Spark, Terraform)
• High level of autonomy and a collaborative culture

Thank you for all applications! We will contact selected candidates.

TQLO Sp. z o.o. – Employment Agency (KRAZ No. 33580)