Data Engineer (mid)
Link Group
Cracow
17 000 - 20 000 PLN (B2B)
Requirements
- GCP
- BigQuery
- Pub/Sub
- dbt
- Cloud
- Oracle
- SQL
- Python
- Java
- Scala
- DevOps
- Jenkins
- GitLab
- Kubernetes
- Infrastructure as Code
- ETL
- Kanban
- Jira
- Terraform (nice to have)
- Data modelling (nice to have)
Job description
About the project:
You will join an international consulting organization that supports both private and public sector clients in solving complex challenges related to people, processes, and digital transformation.

Benefits:
- medical care
- multisport card
- insurance

Office location: Cracow

Requirements:
- Strong hands-on experience with GCP tools (e.g., BigQuery, Dataflow, Pub/Sub, Dataplex, dbt) and modern cloud data ecosystems
- Advanced Oracle SQL and PL/SQL skills, including work with complex stored procedures and large-scale datasets
- Programming experience in languages such as Python, Java, or Scala
- Familiarity with messaging, streaming, and DevOps tooling (e.g., Jenkins, GitLab) and container orchestration using Kubernetes
- Practical experience with Infrastructure as Code solutions, preferably Terraform
- Solid understanding of data modelling concepts, data warehousing architectures, and ETL/ELT best practices
- Experience working in Agile delivery environments (Scrum, Kanban, Jira)
- Exposure to financial markets, trading platforms, or high-performance data environments is considered a strong advantage

Daily tasks:
- Design, develop, and maintain scalable data pipelines within Google Cloud Platform environments
- Build and manage data integration and transformation processes connecting cloud services with on-premise Oracle databases
- Collaborate closely with trading, risk, and analytics stakeholders to gather requirements and deliver both real-time and batch data solutions
- Monitor and optimize data platform performance, particularly for latency-sensitive trading use cases
- Work within Agile/Scrum teams to deliver business-critical data initiatives in cross-functional environments
- Ensure proper data governance, lineage tracking, and regulatory compliance aligned with standards such as MiFID II and FCA requirements
- Automate infrastructure and workflows using Infrastructure as Code, CI/CD pipelines, and containerization technologies like Docker and Kubernetes