JustJoin.IT · Remote work · Senior

Data Engineer

PTT Consulting Sp. z o. o.

Warszawa

20 160 – 25 200 PLN net (B2B)

Requirements

  • End-to-end building of data platform
  • Python
  • SQL
  • Data integration (incl. SEI integration)
  • Data modeling
  • Power BI
  • Cloud (AWS /Azure / GCP)
  • Data Warehousing
  • Data orchestration tools
  • TBM Studio

Job description

Working hours: 12:00 PM – 8:00 PM (CEST)

Requirements

  • Strong experience in building data platforms (end-to-end, not only pipelines).
  • Advanced Python for data processing and pipeline development.
  • Strong SQL (CTEs, window functions, query optimization).
  • Experience with data integration projects (SEI integration strongly expected).
  • Strong knowledge of data modelling (3NF, star/snowflake schemas).
  • Experience with Power BI (mandatory) – building dashboards and transforming data for reporting.
  • Experience with cloud providers (Azure / AWS / GCP).
  • Experience with data warehousing platforms (e.g. Snowflake, BigQuery, Redshift, Synapse).
  • Understanding of ETL vs ELT and batch vs near real-time processing.
  • Experience building data pipelines and orchestrating workflows (e.g. Airflow, Prefect, Dagster, Azure Data Factory).
  • Experience working in multi-team engineering environments.

Nice to have

  • TBM Studio knowledge (highly valued; 3+ years of experience is a strong advantage).
  • Experience with Apache Spark (PySpark / Spark SQL).
  • Understanding of Hadoop ecosystem fundamentals.
  • Experience with streaming / event-driven data processing (Kafka, Kinesis, Pub/Sub, Event Hubs).
  • Programming in Java or Scala.
  • Experience with Bash / shell scripting.
  • Familiarity with dbt.
  • Awareness of BI tools (Tableau, Looker).
  • Experience with API / SaaS data integrations.
  • Understanding of cloud cost optimization (FinOps).
  • Exposure to data governance, compliance, or GRC frameworks.

Responsibilities

  • Design, build, and maintain a data platform from scratch.
  • Develop and maintain production-grade data pipelines (ETL/ELT) for data ingestion.
  • Build data integration pipelines across multiple systems (incl. SEI integration).
  • Design and implement end-to-end data architecture for analytics and reporting.
  • Create and maintain data models (3NF, star/snowflake).
  • Deliver and support Power BI dashboards and the reporting layer (mandatory).
  • Ensure data quality, validation, and consistency across pipelines and systems.
  • Implement and maintain orchestration, scheduling, dependencies, and failure recovery.
  • Monitor and optimize performance, scalability, and cost efficiency.
  • Collaborate with DevOps, Data Architects, Business Integration Engineers, and Testers.
  • Apply software engineering best practices (version control, CI/CD, testing).

Client

A global leader with a sharp focus on lottery solutions. Building on a long history of delivering safe and secure technology, the company is strongly committed to its customers as a dedicated lottery service provider. It leverages collective insight, experience, and expertise to create reliable and engaging solutions that help lottery clients achieve their objectives, meet player needs, and deliver meaningful benefits to their communities.