Data Engineer
TQLO SPÓŁKA Z OGRANICZONĄ ODPOWIEDZIALNOŚCIĄ
⚲ Warsaw
22,000–28,000 PLN gross (employment contract, UoP)
Requirements
- PySpark
- Azure
- Databricks
- Python
Job description
Our Client is a global financial organization developing modern solutions that support Trading, Risk Management, and advanced analytical models. The project focuses on building a modular XVA computation engine, enabling large-scale data processing, risk model development, and regulatory initiatives (SA-CCR, IMM). We are looking for an experienced Data Engineer to strengthen an international XVA team and support the development of modern Big Data solutions.

📍 WORK MODE
Warsaw – hybrid (2–3 days per week in the office)

🧑‍💻 RESPONSIBILITIES
• Designing, building, and maintaining complex ETL pipelines using Python and PySpark, processing large-scale, distributed datasets
• Working within the Hadoop ecosystem (HDFS, Hive) and the Azure Databricks cloud environment, ensuring high performance, data integrity, and reliability
• Supporting regulatory initiatives in Counterparty Credit Risk (SA-CCR, IMM) by developing and enhancing XVA computational components
• Actively collaborating with Quant, Credit Risk, Market Risk, IT, and Trading Desk teams to deliver scalable, business-relevant, data-driven solutions
• Applying best practices in data governance, security, performance tuning, and team collaboration tools (Git, GitHub, Jira, Confluence)

🔍 REQUIREMENTS
• Minimum 3 years of commercial experience as a Data Engineer in the banking sector
• Very good knowledge of Python and hands-on experience with PySpark
• Experience with Azure Databricks, Hadoop, S3, Cloudera, and large-scale data processing
• Ability to work with SQL and to analyze Java code (legacy systems)
• Familiarity with development tools: Git, GitHub, Jenkins, Sonar, Jira, Confluence
• English at C1 level (international environment)

Nice to have
• Experience with workflow orchestration tools (e.g. Airflow, Oozie, Control-M)
• Hands-on experience with AWS, Azure, or GCP
• Knowledge of streaming platforms (e.g. Kafka)
• Experience with ServiceNow and Starburst

🤝 WHY JOIN?
• Employment based on a permanent contract, plus AKUP up to 75% and an annual bonus
• Stable project in the risk and trading domain with real influence on architecture and technical solutions
• Work in an international structure, collaborating closely with experts from Europe and global centers of excellence
• Experienced international team working with a modern Big Data technology stack
• Real career growth towards data architecture, scalable computing, and risk models
• Friendly atmosphere, high coding standards, and a supportive manager

TQLO Sp. z o.o. – Employment Agency (KRAZ No. 33580)
Thank you for all applications. We will contact selected candidates.