Pracuj.pl · Hybrid · Mid

Data Engineer

Allegro

Warszawa, Wola

14,200–20,200 PLN gross / month

Requirements

  • SQL
  • BigQuery
  • Snowflake Data Cloud
  • Python
  • PySpark
  • Airflow

Job description

Our requirements

  • At least 3 years of experience as a Data Engineer working with large datasets.
  • Expert proficiency in SQL.
  • Strong understanding of data modeling and cloud data warehouses (DWH).
  • Hands-on experience with cloud warehouses such as BigQuery or Snowflake (BigQuery preferred).
  • Experience in designing and maintaining ETL/ELT processes.
  • Experience working with large datasets in Python, using data processing tools such as PySpark.
  • Familiarity with data orchestration tools such as Airflow.
  • A high degree of autonomy and a passion for translating complex business requirements into impactful data solutions.
  • Familiarity with and adherence to good software engineering practices (clean code, code review, CI/CD).
  • Proven ability to optimize the cost and efficiency of data processing.
  • Demonstrated ability to leverage AI/ML tooling to enhance data pipelines and analytical products.
  • English proficiency at B2 level or higher.

About the project

Delivery Experience is one of the fastest-growing areas of Allegro, where we take on new, complex projects to enhance logistics and warehousing processes. We are building technology that makes Allegro's deliveries easy, cost-effective, fast, and predictable. Our team looks after critical services along the Allegro shopping journey: predicting delivery times using statistical algorithms and machine learning, selecting the delivery methods best suited to each customer, and integrating with carrier companies. In this position, you will help us grow our cutting-edge cloud analytical platforms, laying the groundwork for the future of logistics intelligence. If you are up for the challenge and would like to build the future of logistics, make sure to apply!

Responsibilities

  • Develop and maintain data ingestion and processing pipelines for large volumes of data using BigQuery, Airflow, and dbt.
  • Design and streamline the data architecture that powers analytical products.
  • Help drive the adoption of AI in analytics and unlock new possibilities on top of our data mesh.
  • Develop tooling for monitoring and improving data quality and integrity.
  • Manage and optimize costs related to data processing and infrastructure.

What we offer

  • Well-located offices (with, e.g., fully equipped kitchens, bicycle parking, and terraces full of greenery) and excellent work tools (e.g., raised desks, ergonomic chairs, interactive conference rooms).
  • A 16" or 14" MacBook Pro and all the necessary accessories.
  • A wide selection of fringe benefits in a cafeteria plan - you choose what you like (e.g., medical, sports, or lunch packages, insurance, purchase vouchers).
  • English classes that we pay for, tailored to the specific nature of your job.
  • A training budget, inter-team tourism, hackathons, and an internal learning platform with a wide range of trainings.
  • An additional day off for volunteering, which you can use alone, with your team, or with a larger group of people connected by a common goal.
  • Social events for Allegro people - Spin Kilometers, Family Day, Fat Thursday, Advent of Code, and many other occasions we enjoy.