NoFluffJobs | Remote work | Senior

Data Platform Engineer

Xebia sp. z o.o.

Location: Wrocław, Rzeszów, Gdańsk, Warszawa

22 000 - 31 500 PLN (B2B)

Requirements

  • Python
  • Data engineering
  • Kubernetes
  • Docker
  • Helm
  • GitLab
  • Linux
  • Big data
  • SQL
  • CI/CD
  • Airflow (nice to have)
  • MLflow (nice to have)
  • Superset (nice to have)
  • ArgoCD (nice to have)
  • HashiCorp (nice to have)
  • FastAPI (nice to have)
  • AI (nice to have)

Job description

About the project

Who We Are
While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do
We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data and AI solutions, and cutting-edge applications that shape the future of tech. Our clients include McLaren, Aviva, Deloitte, Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, InPost, and many more. We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects
What makes Xebia special? Our community. We support tech communities, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets – for both tech and soft skills. It’s not just a job. It’s a place to grow. What sets us apart? Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.

Requirements

Your profile:

  • strong experience with Python, used for building data-related applications and services
  • a background in data engineering or in working with data/analytics platforms
  • very good knowledge of Kubernetes, including deployment, operations, and troubleshooting
  • hands-on experience with Docker and Helm
  • experience with Git-based workflows (e.g. GitLab)
  • solid understanding of Linux-based environments
  • ability to work across development and infrastructure domains
  • experience with big data technologies such as Spark or HDFS
  • familiarity with SQL engines (e.g. Trino)
  • experience operating open-source data tools in production environments
  • understanding of CI/CD practices in platform or engineering teams
  • good communication skills and a strong sense of ownership
  • good command of Polish (C1) and English (min. B2)
  • practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery

Work from the European Union region and a work permit are required.

Nice to have:

  • familiarity with tools such as Airflow, MLflow, or Superset
  • experience with platform and DevOps tools (e.g. ArgoCD, HashiCorp Vault)
  • knowledge of data governance or security tools (e.g. Apache Ranger, metastore solutions)
  • experience with frameworks such as FastAPI or Streamlit
  • previous experience transitioning from a data engineering role into a platform-focused role
  • experience applying GenAI in a structured way within the SDLC, including defined workflows, prompt patterns, or tool integrations embedded into daily work
  • interest in and familiarity with emerging AI-driven practices (e.g. agent-based workflows, automation patterns, AI-augmented development), with a willingness to explore and experiment beyond standard approaches

Daily tasks:

  • designing, deploying, and maintaining applications running on Kubernetes
  • integrating open-source components into a unified analytics platform
  • developing internal services and tools in Python (e.g. FastAPI, Streamlit)
  • managing deployments using Helm, Docker, Git (GitLab), and ArgoCD
  • supporting and evolving tools used by analysts and data scientists (Airflow, JupyterHub, Superset, MLflow, Trino)
  • ensuring platform stability, scalability, and security
  • collaborating closely with data, analytics, and infrastructure teams
  • participating in architectural decisions and technology choices
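To give a concrete flavour of the deployment work described above, here is a minimal sketch of a Kubernetes Deployment manifest for a Helm-managed internal Python service. All names, images, and values are hypothetical illustrations, not taken from the actual project:

```yaml
# Hypothetical example: a minimal Deployment for an internal FastAPI-style
# service, of the kind a Helm chart on such a platform might template.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: internal-data-api        # hypothetical service name
  labels:
    app: internal-data-api
spec:
  replicas: 2                    # two pods for basic availability
  selector:
    matchLabels:
      app: internal-data-api
  template:
    metadata:
      labels:
        app: internal-data-api
    spec:
      containers:
        - name: api
          image: registry.example.com/internal-data-api:1.0.0  # hypothetical image
          ports:
            - containerPort: 8000   # typical port for a uvicorn-served FastAPI app
          resources:
            requests:
              cpu: 100m
              memory: 256Mi
            limits:
              cpu: 500m
              memory: 512Mi
```

In a setup like the one described in the posting, such a manifest would typically be templated in a Helm chart, built and pushed from a GitLab CI pipeline, and rolled out either with `helm upgrade --install` or by ArgoCD syncing the chart from the repository.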