GCP Data Platform Engineer - Automation & Innovation Department
T-Mobile
Warszawa, Mokotów
Requirements
- GCP
- BigQuery
- Cloud Storage
- Pub/Sub
- Dataproc
- Composer
- Cloud Run
- Looker
- Vertex AI
- IaC
- Terraform
- Terragrunt
- Python
- Docker
- Kubernetes
- GitLab
Job description
Our requirements:
- 3+ years of experience as a Data Platform Engineer in a data-driven environment (preferably with GCP)
- Experience developing enterprise-ready solutions based on GCP data services (BigQuery, Cloud Storage, Pub/Sub, Dataproc, Composer, Cloud Run, Looker, Vertex AI)
- Experience in large-scale data migration or cloud transformation projects
- Experience with modern data platform patterns, including data lakehouse architectures on GCP (Cloud Storage + BigQuery)
- Hands-on experience with Infrastructure-as-Code (IaC) tools, including Terraform/Terragrunt
- Proficiency in Python
- Experience with Linux, Docker/Kubernetes and GitLab CI/CD pipelines
- Very good command of English (spoken and written)
- Strong communication skills, with the ability to explain complex technical concepts to business stakeholders
Nice to have:
- ISTQB Foundation Level certificate
- Basic knowledge of SQL
- Experience in testing web applications (e.g. UI/UX, regression and smoke tests)
- Ability to use AI tools in the development process
About the project: Join a new, strategic data transformation project in which we are moving analytics from on-premise to GCP and building our data architecture and data model from the ground up, with a strong focus on business value creation and the customer experience (CX) of our customers. We work with technologies such as GCP, Spark, Python, Kubernetes, BigQuery, Vertex AI, Terraform and Looker. We integrate diverse, high-volume data sources; design streaming and batch processing layers; implement data governance, lineage, data quality and data security; and set up CI/CD and monitoring/SLOs to shorten the path from question to answer for the business and create a solid foundation for AI/LLM-driven solutions. We are looking for people who combine architecture with hands-on engineering, understand business needs, bring proactivity, energy and fresh ideas, and want to actively shape the standards, patterns and long-term direction of our data platform.
Responsibilities:
- Develop reusable frameworks for data processing and testing on GCP (e.g. BigQuery, Dataflow/Dataproc, Composer)
- Build and maintain batch and streaming data ingestion pipelines from various sources (databases, Kafka/MQ, APIs, files) into GCP
- Implement automated tests and data quality checks for data pipelines
- Collaborate with analysts and data scientists to deliver reliable, well-documented datasets
- Monitor, optimize and secure data pipelines in line with data governance and compliance standards