👉 Flink Data Engineer
Xebia sp. z o.o.
⚲ Wrocław, Gdańsk, Rzeszów, Warszawa
20 000 - 33 500 PLN net (B2B)
Requirements
- Apache Flink
- BigQuery
- Data Cloud Storage
- GCP
Job description
🟣 You will be:
• developing and enhancing real-time streaming pipelines using Apache Flink,
• migrating existing Flink jobs using the DataStream API and adapting them to newer platform standards,
• leading and executing the upgrade of the Flink platform to version 2.0,
• designing, optimizing, and maintaining high-throughput, fault-tolerant streaming architectures,
• migrating large-scale datasets from BigQuery (BQ) to Data Cloud Storage (DCS),
• scaling and automating ongoing data migration processes to support growing data volumes,
• converting datasets from Avro to Parquet format, with attention to performance, schema evolution, and storage optimization,
• leveraging AI-powered tools to accelerate migration, validation, and transformation workflows,
• ensuring data quality, integrity, and minimal downtime during migrations,
• collaborating with cross-functional teams and clearly communicating technical concepts to non-technical stakeholders.

🟣 Your profile:
• strong hands-on experience with Apache Flink, including development using the DataStream API,
• proven experience maintaining and upgrading Flink environments, ideally including exposure to Flink 2.0,
• deep understanding of streaming pipeline architecture, performance tuning, state management, and fault tolerance,
• experience migrating large-scale datasets from BigQuery (BQ) to Data Cloud Storage (DCS),
• strong proficiency in data format conversion, particularly Avro to Parquet,
• ability to design, scale, and automate migration workflows while ensuring data integrity and minimal service disruption,
• solid knowledge of Google Cloud Platform (GCP) and its data services,
• good understanding of distributed systems, schema evolution, and storage optimization strategies,
• ability to break down complex migration and platform challenges into clear, actionable steps,
• proactive mindset with strong ownership of solutions and risk identification,
• clear and effective communication skills, especially when explaining technical topics to non-technical stakeholders,
• understanding of how machine learning or intelligent automation can be applied to optimize and monitor data workflows,
• practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery.

🟣 Nice to have:
• experience working on high-scale, consumer-facing data platforms,
• background in long-running migration programs involving multiple data sources and formats,
• familiarity with observability, monitoring, and alerting for streaming systems,
• interest in and familiarity with emerging AI-driven practices (e.g. agent-based workflows, automation patterns, AI-augmented development), with a willingness to explore and experiment beyond standard approaches.

Work from the European Union region and a work permit are required.

🟣 Recruitment Process: CV review – HR Call – Interview – Client Interview – Decision

🎁 Benefits 🎁

✍ Development:
• development budget of up to 6,800 PLN,
• funded certifications, e.g. AWS, Azure, ISTQB, PSM,
• access to Udemy, Safari Books Online and more,
• events and technology conferences,
• technology guilds,
• internal training,
• Xebia Library,
• Xebia Upskill.

🩺 We take care of your health:
• private medical healthcare,
• subsidised MultiSport card,
• mental health support.

🤸‍♂️ We are flexible:
• flexible working hours,
• B2B or permanent contract,
• contract for an indefinite period.
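For candidates curious about the "data quality, integrity, and minimal downtime" responsibility above: the posting does not describe the team's actual tooling, but one common pattern is an order-independent, row-level integrity check run after each migration batch. The sketch below is a hypothetical, dependency-free illustration of that idea (function names and the hashing approach are assumptions, not the team's real workflow):

```python
import hashlib
import json
from collections import Counter

def record_fingerprint(record: dict) -> str:
    """Stable hash of a record, independent of dict key order."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_migration(source_rows, target_rows):
    """Compare two datasets as multisets of row fingerprints.

    Returns (ok, missing, extra), where `missing` are rows present only
    in the source (lost during migration) and `extra` are rows present
    only in the target.
    """
    src = Counter(record_fingerprint(r) for r in source_rows)
    dst = Counter(record_fingerprint(r) for r in target_rows)
    missing = src - dst
    extra = dst - src
    return (not missing and not extra, missing, extra)

# A lossless migration passes even if row order changed;
# a lossy one is flagged.
source = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
ok, _, _ = verify_migration(source, list(reversed(source)))
print(ok)  # True: contents match regardless of order
ok, missing, _ = verify_migration(source, source[:1])
print(ok)  # False: one row was lost
```

In a real BQ-to-DCS pipeline this kind of check would typically run per partition or per batch, so that discrepancies are caught early and downtime stays minimal.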