Medior Java/Scala Developer (CIB ICT Digital)
UniCredit Services is the global service company of UniCredit, providing solutions in the Information & Communication Technology (ICT), Back Office & Middle Office, Real Estate, Security and Global Sourcing areas. The Company has about 11,000 colleagues and oversees activities in Austria, Germany, Italy, Poland, the United Kingdom, the Czech Republic, Romania, Slovakia and Hungary, plus two branches: one located in New York and one in Singapore.
Requirements
1.5 years of development experience with Java SE and/or EE;
Experience with SQL and NoSQL databases;
Experience with the Apache Spark framework is a plus;
General knowledge of distributed computing platforms such as Hadoop is a plus;
Experience with data warehouse (DWH) solutions such as Hive/Impala is a plus;
Experience with Solr, Elasticsearch or similar solutions is a plus;
Experience with the Akka framework is a plus;
Team player focused on accomplishing team objectives;
Good communication and presentation skills;
High attention to detail; fast learner;
Availability to travel on occasion between Iasi, Bucharest and Milan;
Our offer to you
We are looking to extend the team managing a Scala/Spark/Hadoop project in Iasi with a Java/Scala developer. As part of the backend team, you will be involved in the design, development, integration with other computational modules, maintenance and DevOps activities of a financial application heavily based on a Spark computational engine that runs in a Hadoop environment.
Blink - a CRM for the Corporate Companies area in the CEE countries (11 clients). The application calculates financial indicators in batch (a brief illustrative sketch of this kind of job follows the project descriptions). The project is developed in Scala and Spark and runs in a distributed environment: a Hadoop cluster on the Cloudera technology stack.
EGM - a project that builds advanced corporate groups. It uses Scala and Spark for the group-construction computational engine and Java SE/EE for the API.
Tableau - integration of the Blink backend with a Business Intelligence tool. The project uses Scala, Spark, Oracle and Tableau Server and runs in a distributed environment: a Hadoop cluster on the Cloudera technology stack.
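To give a concrete flavour of the work, here is a minimal, illustrative Scala/Spark batch job in the spirit of the indicator calculations described above. It is a sketch only; the table and column names (transactions, clientId, amount, client_indicators) are hypothetical and not taken from the actual projects.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IndicatorJob {
  def main(args: Array[String]): Unit = {
    // A Spark session with Hive support, as would be typical on a
    // Cloudera/Hadoop cluster like the one described above.
    val spark = SparkSession.builder()
      .appName("indicator-batch")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical input table: one row per client transaction.
    val transactions = spark.table("transactions")

    // Example financial indicators: total and average transaction
    // amount per client, computed in a single batch aggregation.
    val indicators = transactions
      .groupBy("clientId")
      .agg(
        sum("amount").as("totalAmount"),
        avg("amount").as("avgAmount")
      )

    // Persist the results back to the warehouse for downstream use.
    indicators.write.mode("overwrite").saveAsTable("client_indicators")

    spark.stop()
  }
}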
Key tasks and responsibilities
Understanding of functional specifications in order to support technical analysis;
Development of new application features/modules;
Maintenance of application/modules;
Involvement in deployment activities such as installation, configuration, integration for different environments;
Application/modules performance profiling and optimization;
Improvement of internal DevOps tools;
Documentation of features/modules/procedures;
Keeping up to date with Big Data technologies in order to continuously improve the product's technical solutions.
Note: Please be informed that, in order to include you in the recruiting process, we will need your specific consent in accordance with the General Data Protection Regulation (GDPR).