Big Data DevOps Engineer
Vauban joined the GFI Group at the beginning of January 2019. With more than 450 consultants in Romania, Vauban is a leading provider of IT services and innovative applications. Created in 2007, the company has experienced strong growth and has established itself as a reference partner for key accounts in the Banking, Telecom, Industry and Energy sectors.
Requirements
Mid/senior level, with at least 3 years of experience working with the Cloudera data platform
Responsibilities
Configure and troubleshoot all components of the Hadoop ecosystem: Cloudera, Cloudera Manager, HDFS, Hive, Impala, Oozie, YARN, Sqoop, Zookeeper, Flume, Spark, Spark standalone, Kafka (incl. Kafka Connect), Apache Kudu, Cassandra, HBase
Develop and maintain documentation relating to Hadoop Administration tasks (upgrades, patching, service installation and maintenance).
Understand Hadoop’s Security mechanisms and implement Hadoop Security (Apache Sentry, Kerberos, Active Directory, TLS/SSL).
Understand the role of Certificate Authorities, the setup of certificates, and their configuration in relation to Linux and TLS/SSL.
Intermediate programming/scripting skills, ideally in Java or Python, plus ksh/bash.
Understanding of networking principles and ability to troubleshoot (DNS, TCP/IP, HTTP).
Nice to have / a plus:
Knowledge of one or more of the following is highly appreciated: ElasticSearch, Kibana, Grafana, git/SCM, Atlassian Suite (Confluence, Jira, Bitbucket), Jenkins/TeamCity, Docker and Kubernetes
Experience in scripting for automation requirements (e.g. Shell, Python, Groovy)
Work on and continuously improve the DevOps pipeline and tooling.
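As an illustration of the kind of small automation scripting mentioned above, here is a minimal, hypothetical sketch in Python: it scans Hadoop-style service log lines and counts WARN/ERROR entries per logging component. The log format and component names are assumptions for the example, not part of this posting.

```python
import re
from collections import Counter

# Assumed log format: "<date> <time> <LEVEL> <component>: <message>"
LOG_LINE = re.compile(r"^\S+ \S+ (WARN|ERROR) (\S+):")

def summarize_log(lines):
    """Count WARN/ERROR occurrences per logging component."""
    counts = Counter()
    for line in lines:
        match = LOG_LINE.match(line)
        if match:
            level, component = match.groups()
            counts[(level, component)] += 1
    return counts

# Hypothetical sample lines in the assumed format.
sample = [
    "2024-01-15 10:00:01 INFO org.apache.hadoop.hdfs.DataNode: heartbeat",
    "2024-01-15 10:00:02 ERROR org.apache.hadoop.hdfs.DataNode: disk failure",
    "2024-01-15 10:00:03 WARN org.apache.zookeeper.Server: slow fsync",
    "2024-01-15 10:00:04 ERROR org.apache.hadoop.hdfs.DataNode: disk failure",
]
for (level, component), n in summarize_log(sample).items():
    print(f"{level} {component}: {n}")
```

In practice such a script would read rotated log files from disk or a log aggregator rather than an in-memory list.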
Benefits
Attractive salary conditions
Open-ended employment contract
Career plan (professional, academic and financial)
Professional and friendly working environment.