Junior Data Engineer
Allegro
⚲ Warszawa, Wola
10 700–14 900 zł gross / month
Requirements
- SQL
- Snowflake Data Cloud
- BigQuery
- PySpark
- Airflow
Job description
Our requirements:
- You have at least 1 year of experience as a Data Engineer working with large datasets.
- You are proficient in SQL and have a strong understanding of data modeling and cloud DWH architecture.
- You have experience with cloud warehouses such as Snowflake or BigQuery.
- You have worked with large data sets in Python, using tools like PySpark or Airflow.
- You are familiar with containerization and CI/CD, ideally with GitHub Actions.
- You have a product mindset and are not afraid to translate business requirements into software and data solutions.
- You can confidently use AI tools in your daily work.
- You know English at a B2 level or higher.

About the project:
- Flexible working hours in the hybrid model (4/1); working hours start between 7:00 a.m. and 10:00 a.m. We also offer 30 days of occasional remote work.
- The salary range for this position, depending on your skill set, is PLN 10 700–14 900 (contract of employment, tax-deductible cost).
- Our team is based in Warsaw.

Allegro Pay is the largest fintech in Central Europe. We are growing fast and need engineers who want to learn and develop while solving problems related to serving thousands of requests per second (RPS). If, like us, you enjoy flexing your mental muscles on complex problems and would be happy to co-create the infrastructure that underpins our solutions, make sure you apply! In this role, you will be a contributor helping us expand our modern cloud-based analytical solutions. We embrace challenging and interesting projects and take quality very seriously. Depending on your preference, your position may be more business-oriented or platform-oriented.

Responsibilities:
- You will be actively responsible for developing and maintaining processes for handling large volumes of data using state-of-the-art tooling such as Snowflake, Airflow, and DuckDB.
- You will help drive the adoption of AI in analytics, unlocking new possibilities on top of our data mesh. We build and deploy agents for our daily work and to support our products.
- You will streamline and develop the data architecture that powers analytical products, working alongside a team of experienced analysts.
- You will develop tooling for monitoring and enhancing the quality and integrity of the data.
- You will manage and optimize costs related to our infrastructure and data processing.

We offer:
- Well-located offices (with, e.g., fully equipped kitchens, bicycle parking, and terraces full of greenery) and excellent work tools (e.g., height-adjustable desks, ergonomic chairs, interactive conference rooms).
- A 16" or 14" MacBook Pro, or a corresponding Dell with Windows (if you don't like Macs), and all the necessary accessories.
- A wide selection of fringe benefits in a cafeteria plan; you choose what you like (e.g., medical, sports, or lunch packages, insurance, purchase vouchers).
- English classes that we pay for, tailored to the specific nature of your job.
- A training budget, inter-team tourism, hackathons, and an internal learning platform where you will find multiple trainings.
- An additional day off for volunteering, which you can use alone, with your team, or with a larger group of people connected by a common goal.
- Social events for Allegro people: Spin Kilometers, Family Day, Fat Thursday, Advent of Code, and many other occasions we enjoy.