Big Data Engineer
With a defined Purpose to lead us, Deloitte acts with courage and conviction to make an impact that matters every day - across our businesses and around the world - by serving the most sophisticated clients, tackling the world's most complex problems, and promoting integrity and trust in the marketplace.
Deloitte is a diverse community of people, experiences, industries, and services delivered in 150 countries around the world. It offers an intellectual challenge, a strong starting point for your career, and an excellent opportunity for continuous development and valuable life experience.
The European Regional Delivery Center (ERDC) provides IT services to Deloitte clients across Europe and the wider EMEA region, and is part of a strongly integrated network of Global Delivery Centers operating around the world.
Requirements:
2+ years’ experience in a similar role
Good knowledge of at least one major programming language, preferably Scala, Python, or Java
Experience with Spark, Kafka, Hive, HBase, HDFS, Oozie, Ranger, etc.
Working knowledge of Linux-based environments, the command line, typical CLI tools, and basic administration skills
Experience with Scrum/Kanban methodologies
Degree in Computer Science or equivalent
Good communication skills with the ability to develop strong client relationships
Responsibilities:
Participate in the full development cycle of complex data platforms
Take ownership of solution components
Deliver Big Data solutions from the Hadoop ecosystem
Perform data ingestion in batches or real-time
Drive change by staying up to date with the latest tools and technologies
Collaborate with senior developers and architects to ship fully fledged features
Improve Agile delivery processes by continuously providing feedback and contributing new ideas