Senior Data Engineer (Databricks / Azure)
Remodevs
⚲ Warszawa, Wrocław, Gdańsk, Poznań, Kraków
160-180 PLN/h net (B2B)
Requirements
- PySpark
- Apache Spark
- Databricks
- SQL
Job Description
About Dateonic

Dateonic is a boutique data engineering consultancy focused on building modern data platforms on Azure and Databricks for enterprise clients across Europe. We work hands-on with clients from architecture and implementation through to production rollout and knowledge transfer. Our projects are practical, fast-moving, and focused on delivering systems that remain maintainable long after handoff. We're a small senior team with minimal bureaucracy, direct communication, and real ownership from day one.

The Role

We're looking for a Senior Data Engineer with strong Databricks and Azure experience to design and build scalable data platforms for enterprise clients. This is a hands-on engineering role. You'll work across ingestion, transformation, governance, deployment, and client collaboration, helping shape both technical architecture and delivery standards. You'll work on greenfield and modernization projects using technologies such as PySpark, Delta Lake, Unity Catalog, and Databricks Asset Bundles.
What You'll Do
- Design and build scalable Bronze → Silver → Gold data pipelines using PySpark and Delta Lake
- Develop and optimize data transformations on Azure Databricks
- Build ingestion frameworks for structured and semi-structured data sources
- Implement data quality checks, validation logic, and monitoring
- Configure and work with Unity Catalog, including permissions and governance
- Collaborate with clients to understand requirements and translate them into technical solutions
- Contribute to architecture decisions and platform design
- Implement CI/CD and deployment standards using Databricks Asset Bundles
- Support downstream reporting and analytics integrations
- Document solutions and support knowledge transfer to client teams

Must Have
- 5+ years of experience in data engineering
- Strong hands-on experience with Apache Spark / PySpark
- Commercial experience with Azure Databricks
- Strong SQL skills and experience building complex transformations
- Understanding of medallion architecture and modern data platform design
- Experience working with Delta Lake
- Experience with Azure services such as ADLS Gen2, Key Vault, and Entra ID
- Solid software engineering practices: Git, code reviews, and testing
- Strong communication skills and confidence working directly with clients
- Ability to work independently in a remote environment

Nice to Have
- Unity Catalog experience
- Databricks Asset Bundles (DABs)
- Terraform or other Infrastructure as Code experience
- Power BI integration experience
- CI/CD pipelines using GitHub Actions or Azure DevOps
- Databricks certifications
- Consulting or client-facing delivery experience

What We Offer
- 100% remote work
- High ownership and autonomy
- Direct collaboration with senior engineers and architects
- Opportunity to influence technical standards and delivery approaches
- Exposure to enterprise-scale Databricks projects
- Flexible working hours focused on outcomes, not time tracking
- Certification and professional development support
- Clear growth path toward Lead or Architect roles

Recruitment Process
- Introductory conversation
- Technical discussion with the engineering team
- Decision

No lengthy recruitment processes or take-home assignments.