Data Engineer (Hadoop, Spark/Scala)

Employer: Luxoft Romania
Domain: IT Software
Job type: full-time
Job level: 1 - 5 years of experience
Location: Bucharest
Updated at: 22.04.2020
Remote work: On-site

Short company description

About Luxoft
Luxoft, a DXC Technology Company (NYSE: DXC), is a digital strategy and software engineering firm providing bespoke technology solutions that drive business change for customers the world over. Acquired by U.S. company DXC Technology in 2019, Luxoft is a global operation in 44 cities and 23 countries with an international, agile workforce of nearly 18,000 people. It combines a unique blend of engineering excellence and deep industry expertise, helping over 425 global clients innovate in the areas of automotive, financial services, travel and hospitality, healthcare, life sciences, media and telecommunications. DXC Technology is a leading Fortune 500 IT services company which helps global companies run their mission-critical systems. Together, DXC and Luxoft offer a differentiated customer-value proposition for digital transformation by combining Luxoft’s front-end digital capabilities with DXC’s expertise in IT modernization and integration. Follow our profile for regular updates and insights into technology and business needs.
Luxoft Romania was established in 2001. We currently have approximately 2,500 employees working from different locations across the country.

    Requirements

Mandatory Skills:
• Expertise in the implementation of end-to-end data processing chains (a short illustrative sketch follows this list)
• Mastery of distributed development
• Knowledge of ingestion frameworks
• Knowledge of Beam and its different execution modes on DataFlow
• Knowledge of Spark and its different modules
• Mastery of Java (+ Scala)
• Knowledge of the NoSQL database ecosystem
• Experience building data product APIs
• Knowledge of data visualization (Dataviz) tools and libraries
• Comfortable debugging Beam (and Spark) jobs and distributed systems
• Ability to explain complex systems in accessible terms
• Proficiency with data notebooks
• Expertise in data testing strategies
• Strong problem-solving skills, intelligence, initiative and the ability to work under pressure
• Excellent interpersonal and communication skills (including the ability to go into detail)
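
To make the first requirement more concrete, below is a minimal, purely illustrative Spark/Scala sketch of an end-to-end batch chain (ingest raw files, prepare them, publish a query-ready dataset). The paths and column names are invented for the example and are not part of the role description.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object OrdersBatchChain {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-batch-chain")
      .getOrCreate()

    // Ingestion: read raw CSV files landed in the Datalake (hypothetical path)
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/datalake/raw/orders/")

    // Preparation: clean and enrich the records (hypothetical columns)
    val prepared = raw
      .filter(col("order_id").isNotNull)
      .withColumn("order_date", to_date(col("order_ts")))
      .withColumn("total_amount", col("quantity") * col("unit_price"))

    // Exposure: publish a partitioned, query-ready dataset for downstream products
    prepared.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("/datalake/prepared/orders/")

    spark.stop()
  }
}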

    Responsibilities

    Project Description:
Contribute to the business value of data-oriented products built on an on-premise Datalake by implementing end-to-end data processing chains, from ingestion to API exposure and data visualization.
General responsibility: quality of the data transformed in the Datalake, proper functioning of the data processing chains, and optimization of the on-premise or cloud cluster resources used by those chains.
General skills: experience implementing end-to-end data processing chains and Big Data architectures; mastery of languages and frameworks for processing massive data, particularly in streaming mode (Java, Spark/Scala); practice of agile methods.
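
Because the project emphasizes processing massive data in streaming mode with Spark/Scala, a minimal Structured Streaming sketch is given below. The Kafka broker, topic and JSON field names are assumptions made only for illustration, not details of the actual project.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ClickstreamStreamingChain {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("clickstream-streaming-chain")
      .getOrCreate()
    import spark.implicits._

    // Ingestion: subscribe to a Kafka topic (broker and topic are hypothetical)
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "clickstream")
      .load()
      .selectExpr("CAST(value AS STRING) AS json")

    // Preparation: parse the payload and count events per page over 1-minute windows
    val counts = events
      .select(
        get_json_object($"json", "$.page").as("page"),
        get_json_object($"json", "$.ts").cast("timestamp").as("ts"))
      .withWatermark("ts", "5 minutes")
      .groupBy(window($"ts", "1 minute"), $"page")
      .count()

    // Exposure: write the aggregates to the console for the sketch; a real chain
    // would target a sink such as a NoSQL store or an API-backed serving layer
    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .start()

    query.awaitTermination()
  }
}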


Main responsibilities
    During the definition of the project
    • Design of data ingestion chains
    • Design of data preparation chains
    • Data product design
    • Design of NoSQL data models
    • Data visualization design
• Participation in the selection of services/solutions to be used, according to the use case
    • Participation in the development of a data toolbox
During the iterative build phase
    • Implementation of data ingestion chains
    • Implementation of data preparation chains
    • Implementation of data visualizations
    • Implementation of data products
• Exposure of data products via APIs
    • Configuration of NoSQL databases
    • Distributed processing implementation
    • Use of functional languages
    • Debugging distributed processing and algorithms
    • Identification and cataloging of reusable items
    • Contribution to the evolution of work standards
    • Contribution and advice on data processing problems
    During integration and deployment
    • Participation in problem-solving
During the run phase (production)
• Participation in the monitoring of operations (an illustrative data-quality check is sketched after this list)
• Participation in problem-solving
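
As an illustration of the data testing and run-phase monitoring mentioned above, the sketch below applies two simple data-quality rules to a prepared dataset. The dataset path, key column and amount column are hypothetical.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object PreparedOrdersCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("prepared-orders-check")
      .getOrCreate()

    // Hypothetical prepared dataset produced by an upstream preparation chain
    val prepared = spark.read.parquet("/datalake/prepared/orders/")

    // Rule 1: the business key must be unique
    val duplicates = prepared
      .groupBy("order_id")
      .count()
      .filter(col("count") > 1)
      .count()

    // Rule 2: amounts must be non-negative
    val negativeAmounts = prepared.filter(col("total_amount") < 0).count()

    // Fail the run (and alert monitoring) if either rule is violated
    require(duplicates == 0, s"$duplicates duplicate order_id values found")
    require(negativeAmounts == 0, s"$negativeAmounts rows with negative total_amount")

    spark.stop()
  }
}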

    Other info

    Reasons to join us

    • Attractive salary and benefits package
• We invest in your professional training, including business domain knowledge, and support the growth of your professional career.
• We encourage creative thinking in an open-minded work environment; the relaxation rooms are frequently where the most ambitious ideas are born.
• We are not just professional teams, we are also friends who have fun working together.
If you are an active person motivated by creating and developing software solutions, then this is the place to be: you will not get bored.
