NoFluffJobs · On-site · Senior

Senior Hadoop Big Data Developer

Braver IT S.A.

Warszawa, Gdynia, Gdańsk

23 000 - 28 000 PLN (B2B)

Requirements

  • Scala
  • Spark
  • Linux Shell Scripting
  • SQL
  • Hadoop
  • Yarn
  • Sqoop
  • Hive
  • Impala
  • MapReduce
  • Oozie
  • CI/CD
  • Git
  • Ansible
  • Bamboo
  • Jenkins
  • Java (nice to have)
  • Flink (nice to have)
  • Kafka (nice to have)
  • Data analysis (nice to have)
  • SAFe Agile (nice to have)

Job description

About the project

Offer:

  • Location: Gdynia, Gdańsk, Warszawa, Poland (hybrid, 3 days per week in the office)
  • Growth opportunity: potential to evolve into an Expert IT Developer role
  • Work environment: customer-focused mindset, with a strong emphasis on collaboration and ownership

Requirements:

  • 5+ years of experience with Scala and functional programming techniques
  • Strong experience with Spark for big data processing
  • Proficiency in Linux shell scripting
  • Solid knowledge of SQL
  • Expertise in the Hadoop ecosystem, including YARN, Sqoop, Hive, Impala, MapReduce, and Oozie
  • Experience with version control and CI/CD tools, including Git, Ansible, Bamboo, and Jenkins

Nice-to-have skills:

  • Java experience
  • Familiarity with streaming technologies such as Flink and Kafka
  • Basic knowledge of data analysis
  • Familiarity with the SAFe Agile methodology

Daily tasks:

  • Work within the Backend Application capability team to support key projects in Nordea's Business Support IT/Technology cluster
  • Develop and maintain applications using the Hadoop Big Data ecosystem and its technologies (YARN, Sqoop, Hive, Impala, MapReduce, Oozie)
  • Work within an Agile SAFe environment to deliver value on key projects
  • Write and optimize Spark jobs and Scala code for processing large-scale data
  • Collaborate with cross-functional teams to ensure a customer-focused approach to solution delivery
  • Contribute to the CI/CD pipeline using Git, Ansible, Bamboo, and Jenkins to ensure seamless deployment and continuous improvement
  • Ensure the infrastructure and data pipelines are scalable, optimized, and maintainable