Data Engineer
MindFly Technologies Kft.
Budapest
25 368 - 31 248 PLN (B2B)
Requirements
- Communication skills
- Data engineering
- Python
- Databricks
- SQL
- Testing
- pandas
- Azure
- SAFe
- Test management
- PySpark
- SDLC
Job description
About the project: The goal is to design and deliver a fully integrated end-to-end data platform that enables reliable, scalable, and high-quality energy monitoring and analytics across the organization. As a Data Engineer, you will play a key role in shaping the overall data architecture and data product landscape, ensuring that data flows seamlessly from source systems through transformation layers into a centralized, harmonized data model.

You will work in a highly collaborative, agile environment and contribute not only to implementation but also to the platform's overall technical strategy and architecture. This role requires a strong end-to-end ownership mindset, combining hands-on engineering with architectural thinking to drive a complex, business-critical enterprise data initiative forward. The position offers a high degree of creative freedom, diverse responsibilities, and the opportunity to actively contribute to challenging, long-term digitalization projects and the design of modern data architectures.

We expect an on-site presence at the project location in Austria of 3–4 days per month; travel expenses will be covered by us. If this opportunity sounds interesting to you, feel free to get in touch – we will be happy to provide you with more detailed information about the position and our company.

About us: We are a young, flat-structured IT services company with a strong focus on software testing, software development, and data solutions. To strengthen our team, we are currently looking for a Data Engineer, either as a freelancer or in permanent employment. We are looking for a highly skilled Data Engineer with strong expertise in end-to-end data platform development and modern cloud-based data architectures.
Must-have skills and experience:
- Strong experience in data architecture (logical and technical design)
- Excellent SQL skills
- Advanced programming skills in Python (PySpark, pandas)
- Experience with Databricks and Microsoft Azure data services
- Strong understanding of the Software Development Lifecycle (SDLC)
- Experience in agile software development environments and data management projects
- Proven ability to work in cross-functional agile delivery teams (SAFe or similar frameworks)
- Experience with release processes and test management in data environments
- Ability to explain and visualize complex technical concepts in a clear, structured way

Daily tasks:
- Design, build, and maintain production-grade end-to-end data pipelines
- Develop scalable data processing solutions across multiple layers (Data Lake, harmonized models, data products)
- Design and implement ETL/ELT workflows in collaboration with system and architecture teams
- Work on a centralized data platform serving multiple downstream consumers
- Ensure data quality, reliability, and performance across all data processing layers
- Collaborate with Product Owners, Architects, and Engineers in an agile SAFe environment
- Support integration of new data sources into the energy monitoring data ecosystem
- Participate in code reviews, release cycles, and testing activities