Pracuj.pl · Hybrid · Senior · New

Senior Data Engineer

Allegro

Warszawa, Wola

PLN 18,400–25,450 gross / month

Requirements

  • Scala
  • Java
  • Python
  • Google Cloud Platform
  • Azure
  • AWS
  • Linux

Job description

Our requirements

You:

  • Program in languages such as Scala, Java, or Python
  • Have a strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam
  • Have knowledge of GCP (especially Dataflow and Composer) or other public cloud environments such as Azure or AWS
  • Use good practices (clean code, code review, TDD, CI/CD)
  • Navigate efficiently within Unix/Linux systems
  • Have a positive attitude and team-working skills
  • Are eager for personal development and keep your knowledge up to date
  • Know English at B2 level

About the project

  • Flexible working hours in the hybrid model (4/1): working hours start between 7:00 a.m. and 10:00 a.m. We also offer 30 days of occasional remote work per year.
  • The salary range for this position, depending on your skill set (contract of employment, tax-deductible costs): Senior Data Engineer: PLN 18,400–25,450
  • An annual bonus based on your performance and company results
  • Our team is based in Warsaw

Responsibilities

As part of the Data & AI area, we implement projects based on practical applications of data science and artificial intelligence on a scale unprecedented in Poland. Data & AI is a group of over 150 experienced engineers organized into more than a dozen teams with various specializations. Some of them build dedicated tools for creating and running Big Data processes or deploying ML models for the entire organization. Others work closer to the customer and are responsible for implementing the search engine, creating recommendations, building a buyer profile, or developing an experimentation platform. The area also includes research teams whose aim is to solve non-trivial problems that require machine learning.

We are looking for Data Engineers who want to build highly scalable and fault-tolerant data ingestion for millions of Allegro customers. The platform collects 5 billion clickstream events every day (up to 150k/sec) from all Allegro sites and Allegro mobile applications. It is a hybrid solution using a mix of on-premise and Google Cloud Platform (GCP) services such as Spark, Kafka, Beam, BigQuery, Pub/Sub, and Dataflow.

We offer

  • Well-located offices (with, e.g., fully equipped kitchens, bicycle parking, and terraces full of greenery) and excellent work tools (e.g., height-adjustable desks, ergonomic chairs, and interactive conference rooms)
  • A 16" or 14" MacBook Pro, or a corresponding Dell with Windows (if you don't like Macs), and all the necessary accessories
  • A wide selection of fringe benefits in a cafeteria plan, where you choose what you like (e.g., medical, sports, or lunch packages, insurance, purchase vouchers)
  • English classes, paid by us, related to the specific nature of your job
  • A training budget, inter-team tourism, hackathons, and an internal learning platform with a wide range of trainings
  • An additional day off for volunteering, which you can use alone, with a team, or with a larger group of people connected by a common goal
  • Social events for Allegro people: Spin Kilometers, Family Day, Fat Thursday, Advent of Code, and many other occasions we enjoy
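As a rough sanity check on the scale quoted for the clickstream platform, the two figures in the posting (5 billion events per day, peaks of up to 150k events per second) imply an average rate of about 58k events per second, so the stated peak is roughly 2.6 times the average. A minimal sketch of that arithmetic, using only the numbers from the posting:

```python
# Scale estimate based on the figures quoted in the posting:
# 5 billion clickstream events per day, peaks up to 150k events/sec.
EVENTS_PER_DAY = 5_000_000_000
PEAK_PER_SEC = 150_000

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

# Average sustained ingest rate implied by the daily volume.
avg_per_sec = EVENTS_PER_DAY / SECONDS_PER_DAY

# How much headroom the quoted peak represents over the average.
peak_to_avg = PEAK_PER_SEC / avg_per_sec

print(f"average rate: {avg_per_sec:,.0f} events/sec")   # ~57,870 events/sec
print(f"peak-to-average ratio: {peak_to_avg:.2f}x")     # ~2.59x
```

Numbers like these are what "highly scalable and fault-tolerant ingestion" means in practice: the pipeline has to absorb bursts several times larger than its steady-state load.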