Big Data Engineer with Hadoop

Employer: Inetum Romania
Domain:
  • IT Hardware
  • IT Software
Job type: full-time
Job level: 1 - 5 years of experience
Location: Bucharest
Updated at: 30.11.2021
    Short company description

    Inetum is a global IT services group that provides digital services and solutions, helping companies and institutions get the most out of digital flow. The Inetum group is committed to helping all these players innovate, continually adapt and stay ahead. With its multi-expert profile, Inetum offers its clients a unique combination of proximity, sector-specific organisation and industrial-quality solutions. Operating in more than 26 countries, the Group has nearly 27,000 employees and in 2020 generated revenues of €1.965 billion.

    Inetum Romania is an important player in the Romanian IT services and solutions market, with over 14 years of activity. It is a stable, growing and profitable company with over 500 employees who provide, from its service centers in Bucharest, Pitesti and Constanta, IT consulting, infrastructure and software development services, digital services, and Smart City solutions.

    Requirements

    • At least 5 years of experience in developing SAS language code
    • Very good SQL and/or PL/SQL skills
    • Good knowledge of ETL flows (preferably SAS Data Integration Studio)
    • Working experience with QlikView is a plus
    • Working experience with IBM Mainframe z/OS would be a plus
    • Experience with the Hadoop platform and ecosystem at enterprise scale
    • Proven experience with HDFS, YARN, MapReduce and analytical techniques/models (Machine Learning, Data Modelling and Visualisation)
    • Two or more analytical programming and scripting languages, e.g. SQL, Python, Linux shell scripting, Java
    • Data-driven mindset, evidenced by strong creativity, analysis and problem-solving skills; able to use data visualisation techniques to illustrate issues and challenges
    • Excellent knowledge of administering Big Data systems in a production environment
    • Cloudera Certified Hadoop Administrator/Developer or comparable qualifications
    • Experience working closely in a team/squad/tribe using Agile methodologies (Scrum and/or Kanban), practising DevOps and Continuous Delivery/Integration
    • Interpersonal and team skills to work in a high-performing Agile team
    • Open-minded, with clear solution-oriented thinking and communicative behaviour

    Responsibilities

    • Design, develop, construct, install, test and maintain complete data management & processing systems
    • Contribute to the continuous optimisation of the Big Data Platform and related infrastructure, network, database and middleware capabilities to support and enable the development and operation of data modules and solutions
    • Discover opportunities for data acquisition and explore new ways of using existing data
    • Contribute to improving the data quality, reliability and efficiency of the whole system
    • Create data models to reduce system complexity and hence increase efficiency and reduce cost
    • Monitor, maintain and support the operational capacity, availability and performance of the Big Data Platform solutions against SLAs, from a level-two and level-three support perspective
    • Contribute to technical discussions with Teradata Platform service providers (on-/off-prem) to understand forecast and right-sizing impacts for short-, mid- and long-term capacity and performance requirements, iterating regularly