NoFluffJobs · Remote work · Mid

Cloud Data Engineer (Azure)

Verita HR

⚲ Remote

20 000 - 22 000 PLN (B2B)

Requirements

  • Azure
  • SQL
  • Python
  • ETL
  • Data engineering
  • Azure Data Factory
  • Azure Databricks

Job description

About the project:

  • 📍 Client: European bank
  • 🗣️ Recruitment: phone screen with our recruiter + one online meeting with the hiring managers
  • 🗺️ Remote work with availability within Central European Time (CET)

Verita HR is an international company providing recruitment support within the #Fintech, #Finance and #Banking markets in EMEA. We connect the most innovative organizations with the best people in the market. We conduct systematic market research, which allows our Digital Teams to stay a step ahead of the competition.

This role sits at the heart of a large digital transformation within a European banking institution, where data is treated as a real business asset, not an afterthought. As a Data Engineer, you will help build and run a modern data platform that supports analytics, reporting, and business decision-making across the organization. You will work on a mix of cloud-based and on-premises solutions, focusing on reliable data pipelines, data quality, and scalable data processing.

What's in it for you:

  • An exciting role at a leading European bank
  • 6-month assignment on a B2B contract
  • Fully remote role with availability within Central European Time (CET)
  • Work with cutting-edge IT technologies
  • Personal growth and development opportunities

Requirements:

  • 2 to 4 years of hands-on experience as a Data Engineer in cloud environments
  • Strong SQL skills
  • Practical experience with Python in data engineering use cases
  • Experience with Azure Data Factory and Databricks
  • Good understanding of data pipelines, ETL/ELT, and the data lifecycle
  • Ability to communicate clearly with both technical and non-technical stakeholders
  • Experience working in Agile teams
  • Cost awareness when designing data solutions
  • Curiosity and willingness to learn continuously

Tech stack you will use:

  • SQL for data processing and analytics
  • Python for transformations and data quality logic
  • Azure Data Factory for ingestion (1-to-1 pipelines)
  • Databricks for transformations and data quality implementation
  • Azure-based data platform services
  • Git or similar source code management tools
  • CLI environments (bash, PowerShell)

Daily tasks:

  • Building and maintaining end-to-end data pipelines
  • Ingesting data from multiple sources (structured, semi-structured, and unstructured)
  • Transforming raw data into clean, usable data sets
  • Implementing data quality rules and validation logic (see the sketch after this list)
  • Serving data to business users, analysts, and internal applications
  • Managing storage as a core foundation of the data platform
  • Working closely with technical and non-technical stakeholders
  • Operating in Agile, DataOps and DevOps environments
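To give a flavour of the day-to-day work, below is a minimal sketch of the kind of data-quality rule described above, assuming PySpark in a Databricks-style Spark environment. The dataset, column names, and the rule itself are illustrative only, not taken from the project.

    # Minimal PySpark sketch: check mandatory fields, then split the
    # batch into clean records and quarantined rejects.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-sketch").getOrCreate()

    # Illustrative raw input, e.g. as landed by an ingestion pipeline.
    raw = spark.createDataFrame(
        [("1001", "EUR", 250.0), ("1002", None, 99.9), (None, "PLN", 10.0)],
        ["account_id", "currency", "amount"],
    )

    # Data-quality rule: account_id and currency must be present.
    checked = raw.withColumn(
        "dq_passed",
        F.col("account_id").isNotNull() & F.col("currency").isNotNull(),
    )

    valid = checked.filter(F.col("dq_passed")).drop("dq_passed")  # serve downstream
    rejected = checked.filter(~F.col("dq_passed"))                # quarantine for review

    valid.show()
    rejected.show()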