Big Data Engineer with Hadoop

Employer: Euro-Testing Software Solutions
Domain: Telecommunication
Job type: full-time
Job level: 1 - 5 years of experience
Location: Bucharest / nationwide
Updated at: 06.02.2023

Short company description

    Euro-Testing Software Solutions is a privately-owned software company specialized in Full-Service Software Testing, Penetration Testing, Vulnerability Identification & Management, Application and Data Security, Static & Dynamic Code Analysis, as well as DevOps/DevSecOps, Robotic Process Automation, and Implementation and Customization for Atlassian and Micro Focus (HPE) products.

    Requirements

    − Hadoop
    − Python
    − HDFS
    − Experience with Hadoop Platform & Ecosystem at Enterprise scale
    − Proven experience with: HDFS, YARN, MapReduce
    − Analytical techniques/models including Machine Learning, Data Modelling and Visualization
    − Two or more analytical programming and scripting languages, e.g., SQL, Python, Linux shell scripting, Java
    − Data-driven mindset, evidenced through strong creativity, analysis, and problem-solving skills
    − Ability to use data visualization techniques to illustrate issues and challenges
    − Excellent knowledge of administering Big Data systems in a production environment
    − Cloudera Certified Hadoop Administrator/Developer or comparable qualifications
    − Experience working closely in a team/squad/tribe using agile methodologies (Scrum and/or Kanban), practicing DevOps and Continuous Delivery/Integration

    Flexible working plan, not fully remote
    Regular working hours; occasional overtime may be required

    Responsibilities

    − Provide and operate IT infrastructure and application services for Teradata, Vantage and Aster systems related to VF European Local Markets
    − Constantly optimize Teradata, Vantage and Aster infrastructure/application whilst delivering cost efficient services of high quality
    − Design, develop, construct, install, test, and maintain the complete data management & processing systems
    − Contribute to the continuous optimization of the Big Data Platform and related infrastructure, network, database, and middleware capabilities to support and enable the development and operations of Data modules and solutions
    − Discover opportunities for data acquisitions and explore new ways of using existing data.
    − Contribute to improving data quality, reliability & efficiency of the whole system
    − Create data models to reduce system complexity and hence increase efficiency & reduce cost