Data Engineer
emagine Polska
⚲ Kraków
Requirements
- Airflow
- Google Cloud Platform
- PySpark
- Tableau
- Hadoop
- Data modeling
Job description
Location: Cracow / Hybrid (6 times per month from the office)
Type of contract: B2B

We are searching for a Data Engineer with experience in developing BI dashboards, particularly using Tableau. This role requires a strong technical foundation and expertise in data modeling to drive data integration and usability.

Main Responsibilities
- Develop Component Data Artifacts (CDAs).
- Break down Master Data Artifacts (MDAs) and Global Data Artifacts (GDAs) into CDAs for integration.
- Focus on data modeling, data reuse, and understanding data domains.
- Implement data normalization concepts.
- Query data efficiently and prepare it for integration into the data lake using Juniper pipelines.

Key Requirements
- Experience with Tableau.
- Proficiency in data modeling concepts.
- Strong technical understanding of GCP.
- Experience with data normalization.
- Knowledge of data integration techniques.
- Familiarity with Hadoop and Airflow.
- Proficiency in PySpark.
- Understanding of Juniper pipelines.
- Knowledge of refinery data processes.

Other Details
- The Data Engineer will work closely with GCP engineers and platform teams.
- The focus is on securing a consultant capable of managing data modeling and ensuring data reusability across various platforms.