Data Technical Lead
GFT Poland
⚲ Kraków
17 420 - 26 710 PLN (PERMANENT)
Requirements
- Data engineering
- Data modelling
- Google Cloud Platform
- Airflow
- Hadoop
- ETL
- SQL
- Unix
- Linux
- CI/CD
- Git
- GitHub
- Jenkins
- Ansible
- Cloud
- Communication skills
- Spark
- Elasticsearch (nice to have)
- Java (nice to have)
- DevOps (nice to have)
Job description
About the project:
- Hybrid work in Kraków (2 office days per week)
- Working in a highly experienced and dedicated team
- Benefit package tailored to your needs (medical, sport, lunch subsidy, life insurance, etc.)
- Online training and certifications
- Access to an e‑learning platform
- Mindgram wellbeing platform
- Work From Anywhere (up to 140 days/year abroad)
- Social events

What will you do?
You will take a leading role in designing and developing large‑scale data engineering solutions within a modern cloud‑based environment. Working closely with engineers, analysts and business stakeholders, you will shape the architecture, mentor the team, introduce best practices and ensure that data pipelines meet high standards of performance, security and reliability. You will also contribute to planning, decision‑making and continuous improvement across the engineering function.

Your skills
- 10+ years of experience in data engineering and software development
- Proficiency in PySpark or Scala
- Practical experience with Google Cloud Platform
- Experience with Airflow or similar scheduling tools
- Strong understanding of the Hadoop ecosystem and ETL frameworks
- Strong SQL and data modelling skills
- Hands‑on experience with Python and Unix/Linux environments
- Experience with version control and CI/CD tools (Git, GitHub, Jenkins, Ansible)
- Ability to diagnose and resolve complex technical issues
- Knowledge of cloud architecture design patterns
- Experience leading engineers or acting as a pod lead
- Strong English communication skills

Nice to have
- Knowledge of Elasticsearch
- Experience developing Java APIs
- Experience with ingestion frameworks
- Familiarity with Agile and DevOps methodologies

Daily tasks:
- Lead cloud‑native data engineering architecture and design
- Develop scalable data pipelines using PySpark or Scala
- Promote development standards, conduct code reviews and mentor engineers
- Act as an SME for data migration and complex technical initiatives
- Work with analysts to confirm and refine technical requirements
- Participate in planning meetings, sprint reviews and retrospectives
- Build and maintain tooling for monitoring, performance and automation
- Provide troubleshooting support across development and production pipelines
- Drive improvements in CI/CD processes and infrastructure automation
- Manage scheduling and orchestration through Airflow or similar tools