NoFluffJobs · Remote work · Senior

GCP Data Engineer

Link Group

Remote

21 840 - 28 560 PLN (B2B)

Requirements

  • GCP (Google Cloud Platform)
  • BI
  • ETL
  • Big data
  • Spark
  • Airflow
  • dbt
  • modern data warehousing

Job description

About the project

We are looking for a hands-on Data Engineer to build and maintain robust, scalable data solutions. Your primary focus will be developing complex data pipelines within the GCP ecosystem, ensuring high performance in both real-time and batch processing. You will work closely with architects to translate modern data strategies (such as Data Mesh or Data Vault) into a functional technical reality.

Requirements

  • Professional experience: 5+ years of hands-on experience engineering complex, enterprise-grade data solutions.
  • GCP expertise: minimum 3 years of practical experience specifically within the Google Cloud Platform.
  • Tech stack: strong command of BI, ETL, and Big Data technologies (e.g., Spark, Airflow, dbt).
  • Data design: deep understanding of modern data warehousing, including real-time data flows and complex aggregations.

Daily tasks

  • Data pipeline development: build, deploy, and optimize end-to-end ETL/ELT processes using GCP-native technologies (BigQuery, Dataflow, etc.).
  • Advanced data modeling: implement sophisticated relational and Big Data structures tailored for high-volume environments.
  • Modern architecture implementation: practically apply frameworks such as Data Fabric, Data Mesh, and Data Vault within the data platform.
  • Real-time processing: develop and maintain systems for stream processing and data warehousing aggregations to support instant analytics.
  • Technical collaboration: actively participate in design workshops, providing technical insights and ensuring the feasibility of architectural choices.
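The posting itself contains no code, but to give candidates a feel for the "real-time processing" responsibility above, here is a minimal pure-Python sketch of a tumbling-window aggregation, the core operation a streaming engine such as Dataflow performs at scale. All names, the 60-second window size, and the event shape are hypothetical illustrations, not part of the role's actual stack.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # hypothetical tumbling-window size


def window_start(ts: int) -> int:
    """Map an event timestamp (seconds) to the start of its tumbling window."""
    return ts - (ts % WINDOW_SECONDS)


def aggregate(events):
    """Sum event amounts per window; events are (timestamp, amount) pairs."""
    totals = defaultdict(float)
    for ts, amount in events:
        totals[window_start(ts)] += amount
    return dict(totals)


# Three events falling into two 60-second windows (0-59 and 60-119).
events = [(5, 10.0), (42, 2.5), (61, 7.0)]
print(aggregate(events))  # {0: 12.5, 60: 7.0}
```

A production pipeline would express the same grouping declaratively (e.g., as windowed aggregations in Dataflow or scheduled incremental models in dbt on BigQuery), but the underlying bucketing logic is the same.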