Senior GCP Data Engineer
Spyrosoft
Location: Wrocław
23 520 - 28 560 PLN (B2B)
Requirements
- Google Cloud Platform
- BI
- ETL
- Relational database
- Data Vault
Job description
About the project

We are looking for a Senior Data Engineer to join our team and lead the design and implementation of high-performance, scalable data architectures. You will be a key player in shaping our data ecosystem, leveraging the full power of Google Cloud Platform (GCP) to solve complex business challenges.

Tech stack:
- Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, etc.)
- BI/ETL tools & Big Data technologies
- Relational databases (SQL) & NoSQL/Big Data storage
- Data Vault 2.0, Data Mesh, Data Fabric

About Spyrosoft

Spyrosoft is an authentic, cutting-edge software engineering company, established in 2016. In 2021 and 2022 we were among the fastest-growing technology companies in Europe, according to the Financial Times. We were founded by a group of tech experts with established backgrounds in software engineering, who created an ‘engineer-to-engineer’ workplace powered by enthusiasm, fairness and authentic relationships. With a unique offering that bridges the gap between technology and business, we specialise in technology solutions for Industry 4.0, automotive, geospatial, healthcare & life sciences, employee experience & education, and financial services.

Requirements:
- 5+ years of hands-on experience building complex, production-grade data solutions.
- 3+ years of dedicated experience within the Google Cloud Platform ecosystem.
- Proven track record with BI/ETL and Big Data technologies.
- Strong understanding of relational database design and big data architectural patterns.
- Experience with real-time data processing and sophisticated data warehousing aggregation.
- Deep understanding of Data Fabric, Data Mesh, and Data Vault methodologies.
- Ability to lead technical discussions and explain complex architectural choices to stakeholders.

Daily tasks:
- Design and develop end-to-end, complex data solutions and ETL/ELT pipelines.
- Utilize GCP services to manage big data workloads and real-time processing.
- Implement data warehousing aggregations and real-time streaming solutions.
- Facilitate design discussions, document architectural decisions, and ensure best practices across the data lifecycle.
- Apply Data Vault, Data Mesh, and Data Fabric principles to create a future-proof data environment.
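For candidates less familiar with the Data Vault methodology named above, a minimal sketch of its hash-key convention may help: hub and link keys are typically derived by trimming and upper-casing the business key(s), joining them with a fixed delimiter, and hashing the result. The function name, delimiter, and hash algorithm here are illustrative assumptions, not Spyrosoft's actual implementation:

```python
import hashlib

def hash_key(*business_keys, delimiter="||"):
    """Derive a deterministic Data Vault-style hash key from business keys.

    Normalization (trim + upper-case) makes the key stable across source
    systems that differ only in whitespace or letter case.
    """
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Same business key in different source formats yields the same hub key:
print(hash_key("cust-001 ") == hash_key("CUST-001"))
```

Because the key is a pure function of the business key, independent pipelines can load hubs, links, and satellites in parallel without sequence lookups, which is one reason the pattern suits the GCP/BigQuery workloads described in this role.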