PySpark Data Engineer
iTeamly
⚲ Kraków, Warszawa, Łódź, Gdańsk, Wrocław
21 000 - 30 000 PLN (B2B)
Requirements
- Data engineering
- PySpark
- Databricks
- Azure Data Factory
- Azure SQL
- Azure Data
- CI/CD
- Power BI
Job description
About the project:

🌟 What we offer:
- Work on modern data solutions in an Azure and Databricks environment
- Flexible working model and stable long-term cooperation
- Exposure to international projects and stakeholders
- Support for professional growth and continuous learning

Requirements:

🧠 Our requirements:
- Minimum 3 years of experience in Data Engineering
- Strong hands-on experience with PySpark (DataFrames, SparkSQL optimization, partitioning)
- Practical experience with Databricks and Azure Data Factory
- Knowledge of Azure SQL and core Azure services
- Experience with CI/CD processes
- Ability to work independently and solve problems with minimal supervision
- Strong written and spoken English
- Experience with Power BI

Daily tasks:
- Design, build, and maintain scalable data pipelines
- Develop data processing and transformation workflows
- Support data ingestion from multiple sources into cloud environments
- Ensure performance, reliability, and data quality
- Collaborate with business and technical stakeholders to translate requirements into data solutions
- Contribute to documentation and continuous improvement of data processes
- Work in a distributed team with a focus on product and business goals