AWS Data Engineer
Capgemini Polska
Location: Gdańsk (Oliwa), Kraków (Grzegórzki)
Requirements
- AWS
- Glue
- Redshift
- Lambda
- Athena
- S3
- Snowflake
- Docker
- Terraform
- CloudFormation
- Kafka
- Airflow
- Spark
- Python
- Scala
- Java
- Bash
- Spark Streaming
Job description
Our requirements:
- Proven experience in Big Data or Cloud projects involving the processing and visualization of large, unstructured datasets, across various phases of the SDLC.
- Hands-on experience with the AWS cloud, including Storage, Compute (and Serverless), Networking, and DevOps services.
- Solid understanding of AWS services, ideally supported by relevant certifications.
- Familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark.
- Basic proficiency in at least one of the following programming languages: Python, Scala, Java, or Bash.
- Very good command of English (German language skills would be an advantage).

Nice to have:
- Experience with orchestration tools (e.g., Airflow, Prefect).
- Exposure to CI/CD pipelines and DevOps practices.
- Knowledge of streaming technologies (e.g., Kafka, Spark Streaming).
- Experience working with Snowflake or Databricks in a production or development environment.
- Relevant certifications in AWS, data engineering, or big data technologies.

About the project:
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your role:
Join our Insights & Data team, a community of over 400 professionals delivering impactful, data-driven solutions. We specialize in building scalable, cloud-native data architectures and pipelines that power analytics and machine learning across various industries. Our work spans the full Software Development Life Cycle (SDLC), using modern data frameworks, agile practices, and DevOps principles.
Responsibilities:
- Design and implement solutions for processing large-scale, unstructured datasets (Data Mesh, Data Lake, or Streaming architectures).
- Develop, optimize, and test modern DWH/Big Data solutions based on the AWS cloud platform within CI/CD environments.
- Improve data processing efficiency and support migrations from on-premises systems to public cloud platforms.
- Collaborate with data architects and analysts to deliver high-quality cloud-based data solutions.
- Ensure data quality, consistency, and performance across AWS services and environments.
- Participate in code reviews and contribute to technical improvements.

We offer:
- Practical benefits: a yearly financial bonus, private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance, and access to the NAIS benefit platform.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Excel, Business Analysis, Project Management) on our NEXT platform.
- A world of knowledge at your fingertips: free access to the Education First language platform, Pluralsight, TED Talks, Coursera, and Udemy Business materials and trainings.
- Cutting-edge technology: position yourself at the forefront of IT innovation, working with the latest technologies and platforms. Capgemini partners with top global enterprises, including 145 Fortune 500 companies.
- A hybrid working model that fits your life: after completing onboarding, combine work from a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.