JustJoin.IT · Hybrid · Senior

Product Owner (Data Engineering)

TQLO SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ

📍 Warszawa

140 - 165 PLN/h net (B2B)

Requirements

  • Airflow
  • ETL tools
  • SQL
  • Python

Job description

For our Client – a leading international technology organization building advanced data platforms at scale – we are looking for an experienced Product Owner to join a strategic Data Mesh initiative. This role sits at the intersection of data engineering, analytics, and product thinking. The focus is not on building data pipelines themselves (a dedicated platform team handles that), but rather on designing and delivering high-quality data products, implementing business logic, and working closely with stakeholders to translate business needs into analytical data solutions. The organization is continuously developing its Data Mesh architecture, offering the opportunity to work on a mature, well-designed data ecosystem while contributing to the evolution of data products that support key business processes.

📍 Location: Warsaw (preferred due to occasional team meetings)
💻 Work model: Remote work possible; candidates able to occasionally visit the Warsaw office are preferred

Responsibilities

  • Build and maintain advanced analytical data layers based on the Data Mesh methodology
  • Design and implement data products, including analytical tables and business dashboards
  • Translate business requirements into data models, logic, and analytical solutions
  • Implement and orchestrate data transformations using dbt and Airflow
  • Ensure data quality through validation mechanisms, anomaly detection, and automated profiling processes
  • Design analytical entities optimized for performance, scalability, and cost efficiency
  • Work closely with business stakeholders to understand processes and identify opportunities for data-driven improvements
  • Optimize infrastructure usage and control Snowflake platform costs through query optimization and efficient pipeline design
  • Support the evolution of the organization's data architecture and analytical capabilities

Requirements

  • Minimum 6 years of professional experience in roles such as Data Analyst, Data Engineer, or Data Product Engineer
  • Strong expertise in SQL, including query optimization
  • Experience with data modeling, data processing, storage strategies, and aggregation of large datasets
  • Practical experience implementing and orchestrating complex ETL/ELT workflows
  • Ability to apply statistical methods (e.g., percentiles, distributions) for data validation and root cause analysis
  • Experience working with dbt (models, macros, tests) and modern data platforms
  • Understanding of data architecture concepts, including relational and analytical modeling
  • Experience working with business stakeholders and translating requirements into data solutions
  • Strong communication skills and ability to operate in a self-driven role

Nice to have

  • Experience using Python for data processing and analysis
  • Experience with Snowflake / Snowpark
  • Ability to design scalable dbt model architectures
  • Experience implementing advanced dbt testing strategies, including unit tests and custom macros
  • Experience optimizing Snowflake infrastructure cost

What we offer

  • Opportunity to work on a large-scale Data Mesh environment
  • Real impact on data products and analytical architecture
  • Collaboration with experienced international data teams
  • Long-term B2B cooperation
  • Work with modern data technologies and large datasets

📍 Please note that we will contact only selected candidates. Recruitment process conducted by TQLO Sp. z o.o. – Employment Agency (KRAZ 33580).
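The requirements mention applying statistical methods such as percentiles and distributions for data validation. As a rough illustration only (not part of the posting, and the function name `flag_outliers` is our own), a percentile-band check of the kind used to profile a metric column before publishing a data product might look like this in plain Python:

```python
from statistics import quantiles

def flag_outliers(values, lower_pct=1, upper_pct=99):
    """Return values falling outside the [lower_pct, upper_pct] percentile band.

    A simple profiling check: compute percentile cut points for a numeric
    column and flag anything outside the expected band for review.
    """
    # quantiles(n=100) returns the 1st..99th percentile cut points
    cuts = quantiles(values, n=100)
    lo = cuts[lower_pct - 1]   # e.g. 1st percentile
    hi = cuts[upper_pct - 1]   # e.g. 99th percentile
    return [v for v in values if v < lo or v > hi]

# Example: a column of mostly small values with one extreme spike;
# the spike lands above the 99th percentile and is flagged.
sample = list(range(1, 100)) + [10_000]
print(flag_outliers(sample))
```

In a real pipeline a check like this would typically run as a dbt test or an automated profiling step rather than ad hoc Python, but the statistical idea is the same.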