JustJoin.IT | Remote work | Senior

GCP Data Engineer

Link Group

Warszawa, Wrocław, Kraków, Poznań, Gdańsk

130-170 PLN/h net (B2B)

Requirements

  • ETL
  • Data
  • GCP
  • Big Data

Job description

We are looking for a hands-on Data Engineer to build and maintain robust, scalable data solutions. Your primary focus will be developing complex data pipelines within the GCP ecosystem, ensuring high performance in both real-time and batch processing. You will work closely with architects to translate modern data strategies (such as Data Mesh or Data Vault) into working technical solutions.

Core Responsibilities

  • Data Pipeline Development: Build, deploy, and optimize end-to-end ETL/ELT processes using GCP-native technologies (BigQuery, Dataflow, etc.).
  • Advanced Data Modeling: Implement sophisticated relational and Big Data structures tailored for high-volume environments.
  • Modern Architecture Implementation: Practically apply frameworks such as Data Fabric, Data Mesh, and Data Vault within the data platform.
  • Real-time Processing: Develop and maintain systems for stream processing and data warehouse aggregations to support instant analytics.
  • Technical Collaboration: Actively participate in design workshops, providing technical insights and ensuring the feasibility of architectural choices.

Technical Requirements

  • Professional Experience: 5+ years of hands-on experience in engineering complex, enterprise-grade data solutions.
  • GCP Expertise: Minimum 3 years of practical experience specifically within the Google Cloud Platform.
  • Tech Stack: Strong command of BI, ETL, and Big Data technologies (e.g., Spark, Airflow, dbt).
  • Data Design: Deep understanding of modern data warehousing, including real-time data flows and complex aggregations.