NoFluffJobs · Remote work · Senior

Senior Data Engineer

Xebia sp. z o.o.

⚲ Rzeszów, Wrocław, Gdańsk

22 000 - 28 500 PLN (B2B)

Requirements

  • Python
  • JVM
  • GCP
  • BigQuery
  • dbt

Job description

About the project

Who We Are

While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data and AI solutions, and cutting-edge applications to shape the future of tech. Our clients include McLaren, Aviva, Deloitte, Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, InPost, and many, many more.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects

What makes Xebia special? Our community. We support tech communities, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets – for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart? Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.
Requirements

Your profile:

  • strong commercial experience as a Senior Data Engineer,
  • very good programming skills in Python and one of the JVM languages: Java, Kotlin, or Scala,
  • proven experience in building and maintaining streaming data pipelines,
  • hands-on experience with Google Cloud Platform, especially BigQuery, Cloud Composer, Dataproc, Dataflow, Cloud Run,
  • advanced knowledge of SQL and dbt,
  • solid understanding of software engineering best practices: TDD, Clean Code, CI/CD pipelines,
  • very good command of Linux/Unix environments,
  • experience with Docker and daily use of Git or other version control systems,
  • strong teamwork skills and a proactive, positive attitude,
  • fluent Polish and English at B2+ level,
  • readiness to participate in an on-call rotation (24/7, 7 days per week, every 5-8 weeks),
  • practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery.

Work from the European Union region and a work permit are required.

Nice to have:

  • experience with Spring Framework,
  • knowledge of Apache Beam, Scio, and Google Dataflow,
  • practical experience with Apache Flink,
  • experience with Infrastructure as Code, especially Terraform,
  • experience with Spark or PySpark (not necessarily in a Hadoop environment),
  • experience applying GenAI in a more structured way within the SDLC, including defined workflows, prompt patterns, or tool integrations embedded into daily work,
  • interest in and familiarity with emerging AI-driven practices (e.g. agent-based workflows, automation patterns, AI-augmented development), with a willingness to explore and experiment beyond standard approaches.
Daily tasks:

  • developing and automating tools within the Data Platform, including clickstream data processing, building and evolving an ID Graph service, and implementing and enhancing Self-Service Analytics solutions,
  • designing, building, and maintaining effective streaming data pipelines,
  • supporting product, engineering, and analytics teams in building data-driven products and services,
  • solving technical and analytical issues reported via internal communication channels (e.g. Slack, email, incident management tools),
  • applying software engineering best practices such as Clean Code, TDD, and CI/CD,
  • participating in an on-call / incident response rotation.
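To give candidates a feel for the clickstream work listed above, here is a minimal, self-contained sketch of fixed-window click counting in plain Python. The event data and function name are invented for illustration; the production pipelines described in the posting would do this at scale with tools such as Apache Beam on Dataflow rather than in-memory code.

```python
from collections import defaultdict

# Hypothetical clickstream events: (user_id, page, epoch_seconds).
# Invented sample data for illustration only.
events = [
    ("u1", "/home", 0),
    ("u1", "/cart", 30),
    ("u2", "/home", 45),
    ("u1", "/checkout", 95),
]

def clicks_per_window(events, window_size=60):
    """Count clicks per user within fixed (tumbling) time windows.

    Each event is assigned to the window starting at the largest
    multiple of window_size that is <= its timestamp.
    """
    counts = defaultdict(int)
    for user, _page, ts in events:
        window_start = (ts // window_size) * window_size
        counts[(user, window_start)] += 1
    return dict(counts)

print(clicks_per_window(events))
# {('u1', 0): 2, ('u2', 0): 1, ('u1', 60): 1}
```

In a streaming engine the same logic becomes a window assignment plus a keyed combine, with watermarks handling late-arriving events; the toy version above only shows the aggregation shape.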