Big Data Engineer
At Atos, we are striving to create the firm of the future by bringing together people, technology and business. Every day we power progress for our clients and partners, the wider community and ourselves. It is our unique approach that makes this possible.
Atos is a global leader in digital transformation, with employees in 72 countries and annual revenue of around € 12 billion. Atos Romania operates in Bucharest, Timişoara and Braşov and employs more than 2100 people.
The European number one in Big Data, Cybersecurity, High Performance Computing and Digital Workplace, the Group provides Cloud services, Infrastructure & Data Management, Business & Platform solutions, as well as transactional services through Worldline, the European leader in the payment industry.
With its cutting-edge technologies, digital expertise and industry knowledge, Atos supports the digital transformation of its clients across various business sectors: Defense, Financial Services, Health, Manufacturing, Media, Energy & Utilities, Public sector, Retail, Telecommunications and Transportation. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos Consulting, Atos Worldgrid, Bull, Canopy, Unify and Worldline. Atos SE (Societas Europaea) is listed on the CAC40 Paris stock index.
For more information and to see our current vacancies, visit the Career section of our website.
We also invite you to connect with us on our Facebook page.
What are your tasks?
- Building and maintaining Data Lake environments, including Cloud environments
- Supporting deployment, customization, upgrades and monitoring via DevOps tools
- Creating automation across the Big Data environments
- Participating in R&D projects related to Big Data and Cyber Security
- Tracking trends and the latest issues in the domain of the conducted projects
- Creating processes and procedures in the environments
- Creating technical documentation
What do you bring with you?
- Familiarity with technologies like Kibana, Elasticsearch, Hadoop, HDFS, HBase, Spark
- Readiness to work with Data Lake and Fast Data (processing tens/hundreds of thousands of events per second in a cluster/cloud environment)
- Good working knowledge in at least one of the following fields:
➢ Experience with ELK, Spark streaming, Kafka, Nifi, Flume, ZooKeeper, Hive, Hawq, Cassandra, Impala, Scala, Java
➢ Familiarity with languages (especially R or Python)
➢ Knowledge of application architecture patterns, development patterns
➢ Knowledge of Ansible, PostgreSQL
➢ Know-how in Big Data environments: familiarity with technologies like Hadoop, MapR
- Structured work organization, enjoyment of working in teams and a high level of service orientation