Data Engineer
Spyrosoft
⚲ Warszawa
20 160 - 21 000 PLN (B2B)
Requirements
- Data engineering
- Cloud platform
- Python (nice to have)
- SQL (nice to have)
- BI (nice to have)
- Power BI (nice to have)
- BigQuery (nice to have)
- Project management (nice to have)
Job description
About the project: You will join a data-driven organization focused on building scalable and efficient data platforms to support business decision-making. The project involves designing and implementing modern data solutions within a cloud-based environment, with a strong emphasis on data quality, performance optimization, and cost efficiency. The role requires close collaboration with Data Analysts and business stakeholders to translate complex business needs into robust technical solutions. You will also work in an international environment, cooperating with global teams across different regions.

Additional information:
- Duration: 3 months, starting in June, with the possibility of extension
- Type of cooperation: B2B contract
- Equipment: provided by the client

About Spyrosoft
Spyrosoft is an authentic, cutting-edge software engineering company established in 2016. In 2021 and 2022, we were among the fastest-growing technology companies in Europe, according to the Financial Times. We were founded by a group of tech experts with established backgrounds in software engineering, who created an ‘engineer-to-engineer’ workplace powered by enthusiasm, fairness and authentic relationships. With a unique offering that bridges the gap between technology and business, we specialise in technology solutions for the Industry 4.0, automotive, geospatial, healthcare & life sciences, employee experience & education, and financial services industries.
Requirements:
- At least 3 years of professional experience in data engineering roles, preferably in retail or consulting environments
- Strong knowledge of Python and SQL, with hands-on experience in building and maintaining data pipelines
- Experience working with cloud platforms, preferably GCP
- Familiarity with BI tools such as Power BI or Looker
- Experience with BigQuery and Airflow
- Solid understanding of Agile methodologies and project management in Agile environments
- Strong analytical and statistical skills, including a proven ability to validate and ensure data accuracy and quality
- English and Polish proficiency at minimum B2 level

Daily tasks:
- Designing and developing data pipelines (both batch and streaming) to ensure reliable data flow across systems
- Optimizing data processing performance and controlling infrastructure costs
- Implementing monitoring systems, alerts, and data quality control mechanisms
- Collaborating with Data Analysts and business stakeholders to gather and refine data requirements
- Supporting junior team members through mentoring and code reviews
- Actively participating in Agile ceremonies, including sprint planning and effort estimation
- Working closely with global teams to align on solutions and best practices