
Data Engineer with MLOps

Experis Manpower Group

Location: Warszawa

160–170 PLN/h net (B2B)

Requirements

  • Databricks
  • MLOps

Job description

Senior Data Engineer (MLOps / Azure Databricks)

Location: Poland (remote, occasional office visits)
Contract type: B2B
Rate: 160–170 PLN/h + VAT
Project: Long-term international cooperation

About the Role

We are looking for a Senior Data Engineer with strong MLOps expertise to join an international team building production-grade data and machine learning pipelines on Azure Databricks. In this role, you will work at the intersection of data engineering, machine learning, and infrastructure automation, transforming experimental ML workflows into scalable, observable, and cost-efficient production systems. The position focuses on designing robust data architectures, automating ML pipelines, and ensuring high reliability and performance in modern cloud environments.

Key Responsibilities

  • Design and maintain end-to-end data and ML pipelines using Azure Databricks, Delta Lake, and Unity Catalog
  • Build reproducible ML training and deployment workflows integrated with experiment tracking and model registry tools
  • Implement data quality frameworks and observability metrics following industry best practices
  • Develop dashboards to monitor data quality, model performance, and operational metrics (e.g., Lakeview, Grafana, or similar tools)
  • Automate data ingestion and feature engineering pipelines using PySpark, SQL Warehouses, and Databricks Asset Bundles (DAB)
  • Build CI/CD pipelines for data and ML workflows using GitHub Actions or Azure DevOps
  • Manage data access, security, and governance policies
  • Optimize compute performance and infrastructure costs (cluster tuning, autoscaling, caching, partitioning)
  • Implement automated validation pipelines for data quality, model evaluation, and telemetry-driven updates
  • Monitor ML models for drift detection, feature stability, and prediction quality
  • Ensure environment consistency using Infrastructure as Code (e.g., Terraform) and containerization

Required Skills & Experience

  • Strong experience in Data Engineering and MLOps
environments
  • Hands-on experience with Azure Databricks and PySpark
  • Experience designing production-grade data and ML pipelines
  • Strong knowledge of Delta Lake architecture and data layering (bronze / silver / gold)
  • Experience with CI/CD pipelines for data and ML workflows
  • Experience with data quality frameworks and monitoring solutions
  • Knowledge of Infrastructure as Code tools such as Terraform
  • Experience with data security, governance, and access control
  • Strong analytical and problem-solving skills
  • Fluent English

Nice to Have

  • Experience with Databricks Workflows, Unity Catalog, and Databricks Model Serving
  • Experience building monitoring dashboards (Lakeview, Grafana, or similar)
  • Experience implementing ML observability frameworks
  • Experience optimizing large-scale distributed data pipelines

What We Offer

  • Private medical care (Medicover)
  • Sports card (Multisport or equivalent)
  • Life insurance
  • Flexible benefits platform
  • Training and certification opportunities
  • Opportunity to work with modern Data & AI technologies
  • International project environment
  • Long-term, stable cooperation

If you are passionate about building scalable ML and data platforms in Azure, and enjoy working at the intersection of data engineering, MLOps, and cloud automation, we would love to hear from you.
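For context on the Databricks Asset Bundles (DAB) work named above: bundle-based deployment is typically driven by a `databricks.yml` file at the project root that declares targets and job resources. A minimal sketch follows; the bundle name, workspace URL, notebook path, and cluster settings are illustrative assumptions, not details of this project.

```yaml
# databricks.yml -- illustrative sketch; all names and values below are placeholders
bundle:
  name: ml-pipelines

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-0000000000000000.0.azuredatabricks.net  # placeholder workspace URL

resources:
  jobs:
    feature_pipeline:
      name: feature-pipeline
      tasks:
        - task_key: bronze_to_silver            # hypothetical task promoting bronze data to silver
          notebook_task:
            notebook_path: ./notebooks/bronze_to_silver
          job_cluster_key: small
      job_clusters:
        - job_cluster_key: small
          new_cluster:
            spark_version: 15.4.x-scala2.12     # example LTS runtime
            node_type_id: Standard_DS3_v2       # example Azure node type
            num_workers: 2
```

A bundle defined this way would be validated and deployed with the Databricks CLI, e.g. `databricks bundle deploy -t dev`, which is how DAB-based workflows are usually wired into GitHub Actions or Azure DevOps pipelines.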