JustJoin.IT · Remote work · Senior

Senior Data Engineer (with GCP)

Spyrosoft

⚲ Warszawa, Wrocław, Kraków

140–170 PLN/h net (B2B)

Requirements

  • Data Fabric
  • ETL tools
  • Data Vault 2.0
  • BI
  • Dataflow
  • BigQuery
  • SQL

Job description

Project description:

We are looking for a Senior Data Engineer to join our team and lead the design and implementation of high-performance, scalable data architectures. You will be a key player in shaping our data ecosystem, leveraging the full power of Google Cloud Platform (GCP) to solve complex business challenges.

Tech stack:

  • Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, etc.)
  • BI/ETL tools & Big Data technologies
  • Relational databases (SQL) & NoSQL/Big Data storage
  • Data Vault 2.0, Data Mesh, Data Fabric

Requirements:

  • 5+ years of hands-on experience building complex, production-grade data solutions.
  • 3+ years of dedicated experience within the Google Cloud Platform ecosystem.
  • Proven track record with BI/ETL and Big Data technologies.
  • Strong understanding of relational database design and big data architectural patterns.
  • Experience with real-time data processing and sophisticated data warehousing aggregation.
  • Deep understanding of the Data Fabric, Data Mesh, and Data Vault methodologies.
  • Ability to lead technical discussions and explain complex architectural choices to stakeholders.

Main responsibilities:

  • Design and develop end-to-end, complex data solutions and ETL/ELT pipelines.
  • Utilize GCP services to manage big data workloads and real-time processing.
  • Implement data warehousing aggregations and real-time streaming solutions.
  • Facilitate design discussions, document architectural decisions, and ensure best practices across the data lifecycle.
  • Apply Data Vault, Data Mesh, and Data Fabric principles to create a future-proof data environment.
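For candidates unfamiliar with Data Vault 2.0 (listed in both the tech stack and the requirements), its core idea is that hubs, links, and satellites are joined on deterministic hash keys computed from normalized business keys. The sketch below is a minimal illustration of that technique, not part of the role description; the function name `hash_key`, the `||` delimiter, and the choice of MD5 are illustrative assumptions.

```python
import hashlib


def hash_key(*business_keys: str, delimiter: str = "||") -> str:
    """Compute a Data Vault 2.0-style hash key (illustrative sketch).

    Business keys are trimmed, upper-cased, and joined with a delimiter
    before hashing, so the same logical key produces the same surrogate
    regardless of source-system formatting.
    """
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest().upper()


# The same business key hashes identically across differently formatted sources:
hash_key("  cust-001 ") == hash_key("CUST-001")  # True
```

Because the key is derived purely from the business key, loads into hubs and satellites can run in parallel across sources without a central sequence generator.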