JustJoin.IT | Remote work | Senior

Data Engineer - Microsoft Fabric

Entrada AI

Location: Kraków, Katowice, Wrocław, Poznań, Warszawa

160 - 210 PLN/h net (B2B)

Requirements

  • MS Fabric
  • OneLake
  • PySpark
  • Python
  • Delta Lake
  • T-SQL

Job description

Allocation: Full-Time (40 hrs/week)
Core Tech: Microsoft Fabric, OneLake, PySpark, Delta Lake
Focus: Legacy Migration & High-Capacity Engineering

ABOUT ENTRADA AI

Entrada AI is a specialized consulting partner and a strategic portfolio company of Databricks Ventures. This is a rare distinction: Databricks invested in us because of our technical excellence, placing us in the "inner circle" of the ecosystem. For our engineers, this means direct access to product roadmaps, private previews, and the teams building the platform. You will join a team of industry veterans and Databricks MVPs. We look for engineers who value clean architecture over quick fixes. We don't just maintain pipelines; we solve complex architectural challenges for Fortune 500 clients, often using features before they are widely available.

The Mission

We are seeking a high-capacity Data Engineer to serve as the technical engine for a series of mission-critical internal Microsoft Fabric initiatives. This is an execution-focused role where you will operate under the direction of the Customer Project Manager to modernize a complex data estate. Your goal is clear: transition legacy ETL processes into a high-performance, Spark-driven Fabric environment. You will be responsible for ensuring that every pipeline and model you touch is not just functional but "Production-Ready" according to the highest engineering standards.

What You'll Do

Fabric Implementation & Engineering

  • OneLake Architecture: Build and manage large-scale data environments, ensuring seamless integration across the Fabric ecosystem.
  • Pipeline Development: Design and tune complex data models using Fabric Notebooks (PySpark/Python) and Data Factory pipelines.
  • Medallion Excellence: Strictly implement the Bronze → Silver → Gold architecture to maintain data integrity and clear lineage.
  • Performance Tuning: Execute advanced Delta Lake optimization patterns, including V-Order, partitioning, and compaction, to ensure sub-second data access for downstream analytics.

Migration & Logic Validation

  • Reverse Engineering: Deconstruct legacy business logic from platforms like SQL Server or Hadoop and rebuild it into modern, production-ready Spark or SQL workloads.
  • Technical Troubleshooting: Independently resolve hurdles related to schema evolution, data quality, and platform compatibility during the migration phase.
  • Validation & Handoff: Rigorously validate all models and pipelines to ensure project acceptance within a 15-day review window.

Requirements

  • The Fabric Foundation (Non-Negotiable): Extensive hands-on production experience with Microsoft Fabric (OneLake, Lakehouse, Data Factory, and Spark).
  • Engineering Depth: 3+ years of intensive Data Engineering experience (5+ years in the broader Data/BI space).
  • Spark Mastery: Advanced command of PySpark and Python for complex transformations and model building.
  • Migration Specialist: Proven track record of successfully moving ETL workloads from legacy platforms into cloud-native environments.
  • SQL Fluency: Advanced SQL skills for deep-dive data validation and logic extraction.
  • Documentation: Ability to produce technical workflows that are clear, concise, and ready for client-side transition.

Nice to Haves

  • Certified Expert: Microsoft Certified: Fabric Analytics Engineer Associate (DP-600).
  • Model Experience: Prior experience operating in a "Fixed Capacity" staff augmentation model, demonstrating high reliability and self-management.

THE OFFER

  • Competitive Salary: 160 - 210 PLN / hour, based on your experience and skills.
  • Stable Employment: Contract signed with our Polish entity (Entrada AI Poland).
  • 100% Remote: Full flexibility to work from anywhere in Poland.
  • Apple Hardware: MacBook Air M4 15" provided.
  • Referral Bonus: Bonus for bringing other top-tier engineers to the team.
  • Professional Growth

RECRUITMENT PROCESS

  • Introductory Call (20 min): Short conversation with our Recruiter to discuss your background and expectations.
  • Technical Interview (60 min): Deep dive into your technical skills with our engineering team.
  • Optional Client Interview: Required only in specific cases.
  • Decision & Offer: We aim to close the process efficiently.