Data Engineer - Smart City
Incubly
⚲ Łódź
1,000 - 1,400 PLN/day net (B2B)
Requirements
- AWS
- Airflow
- CI/CD
- DBT
- Snowflake
- Kafka
- SQL
- Python
Job description
Hey! Nice to see you! Let's share our stories to get to know each other a bit better. We are business and technology enthusiasts, constantly hungry for new challenges and self-development, and nothing motivates us more than great software products and happy customers. At Incubly, we believe that great people want to work with great people, so we started building a company that would attract great minds and allow us to achieve everything without feeling like we're just working, but rather having fun. Our mission is to support tech companies and startups (scale-ups) in fast, high-quality scaling of their teams and in boosting their product development, testing, and deployment, so that we can succeed together.

We are currently working with Arrive, a leading global mobility platform that includes brands like EasyPark, Flowbird, RingGo, ParkMobile, and Parkopedia. Active in over 90 countries and 20,000 cities, Arrive helps make urban mobility smarter and travel easier through solutions like smart payments, optimized parking, and EV charging. It's about more than movement: it's about creating more livable cities and better travel experiences.

If you're interested in working with us, here are the competencies we are looking for in a Mid/Senior Data Engineer.
Your daily responsibilities
- Cooperate with the Data Scientists and Architects to understand the inputs and outputs of the models, as well as the transformations required to design the features
- Part of our stack is based on open-source solutions; as we are migrating to AWS, you will participate in moving and extending data pipelines on Snowflake
- Participate in the design phase of the data processing flow to fulfil the use cases' requirements
- Develop data flows by creating DBT models and transformations
- Implement DBT tests based on the product-defined test strategy
- Implement and configure Airflow DAGs to meet data flow quality attributes
- Cooperate with the DevOps team to deliver on the platform constraints and improve the CI/CD pipeline

We need you to have
- A Master's / Engineering degree specialised in the data domain
- 3-5 years of experience in a similar domain
- Knowledge of:
  - SQL - must
  - DBT - must
  - Python - must
  - Snowflake - must
  - Airflow - good to have
  - Trino - good to have
  - S3 - good to have
  - Iceberg - good to have
  - Hive Metastore - good to have
  - Kafka with Redpanda - good to have
  - Avro - nice to have
  - Druid - nice to have
  - Kubernetes - understanding of the concepts and the ability to use it
  - AWS - good to have

Our Architecture and Technology Stack
- Python
- Kafka on Redpanda
- Snowflake
- Postgres
- DBT
- Trino
- S3
- Apache Druid
- Apache Superset
- Argo Workflows
- Argo CD
- Helm charts
- Kubernetes

Our offer
In addition to great company and challenging projects, we can offer much, much more, e.g.:
- Knowledge sharing within our company
- An agile and friendly atmosphere, non-violent communication, and full respect for diversity
- Hybrid work model, from our Łódź office
- Remuneration offered: 1,000 - 1,400 PLN net/day on B2B