Full Stack Engineer

This job is no longer active!



Employer: Intralinks
Domain: IT Software
Job type: full-time
Job level: > 5 years of experience
Location: Bucharest; nationwide
Updated at: 14.07.2016

Short company description

When teamwork and collaboration really matter – there’s Intralinks. Since 1996.

Trusted globally for more than 18 years, we bring collaboration and document sharing that’s safe, secure, compliant and fully auditable.
We offer all of our products through the cloud, but unlike other clouds, ours is totally secure and what happens in it stays safe.
Gartner likes us too. They’ve crowned us top supplier of enterprise collaboration and social software solutions.

Requirements

Required Experience and Skills:

• Motivated self-starter with the ability and strong desire to learn new technologies and methodologies as required
• Strong background in JavaScript technologies such as Node.js
• Must have hands-on experience implementing data extraction and transformations from a variety of data sources such as SQL, NoSQL and Hadoop-like systems
• Ability to build and optimize data transformations, data quality routines and workflows
• Can write custom transformations in Java to process data in multiple formats (relational, S3, Swift, etc.) into the structure required for the data marts
• Experienced in designing and building logical and physical data models of operational data marts/data warehouses based on analytics data store technology such as Vertica, Oracle or EMC Greenplum
• Strong data modeling skills, specifically in designing and building snowflake and star schema models to achieve manageable, high-performance reporting data stores
• Ideally has worked on a large data mart or data warehouse implementation (terabytes in size)
• Strong SQL skills; programming experience in a higher-level language such as Java, as well as PL/SQL, is a must
• Familiarity with the data management disciplines required to design, build and implement high-quality data marts
• Familiarity with Big Data analytics architecture and patterns using highly scalable, parallel processing data platforms
• Hadoop MapReduce and Spark experience is a strong plus
• Demonstrable experience in creating ETL solutions covering data extraction from source systems all the way through to analytics-friendly database tables and views
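The bullets above ask for custom Java transformations that feed star-schema data marts. As an illustration only — the class, field names and record layout below are assumptions, not anything from the posting — a minimal sketch of assigning surrogate dimension keys while loading fact rows might look like:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: flatten raw "product,region,amount" records into
// star-schema-style fact rows, assigning surrogate keys to dimension
// values the first time each one is seen.
public class StarSchemaLoader {
    // dimension value -> surrogate key (one map per dimension table)
    private final Map<String, Integer> productDim = new LinkedHashMap<>();
    private final Map<String, Integer> regionDim = new LinkedHashMap<>();
    private final List<int[]> factRows = new ArrayList<>(); // {productKey, regionKey, amount}

    private static int keyFor(Map<String, Integer> dim, String value) {
        // next surrogate key = current dimension size + 1
        return dim.computeIfAbsent(value, v -> dim.size() + 1);
    }

    // Accepts one raw comma-separated record and appends a fact row.
    public void load(String rawRecord) {
        String[] parts = rawRecord.split(",");
        int productKey = keyFor(productDim, parts[0].trim());
        int regionKey = keyFor(regionDim, parts[1].trim());
        factRows.add(new int[]{productKey, regionKey, Integer.parseInt(parts[2].trim())});
    }

    public List<int[]> facts() { return factRows; }
    public Map<String, Integer> products() { return productDim; }

    public static void main(String[] args) {
        StarSchemaLoader loader = new StarSchemaLoader();
        loader.load("widget, EMEA, 100");
        loader.load("gadget, APAC, 250");
        loader.load("widget, APAC, 75");
        // "widget" appears twice but keeps a single surrogate key
        System.out.println(loader.products()); // {widget=1, gadget=2}
        System.out.println(loader.facts().size()); // 3
    }
}
```

In a real pipeline the dimension maps would be backed by dimension tables in the target store (Vertica, Oracle, Greenplum) rather than kept in memory; the point is only the shape of the transformation.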

Responsibilities

Overview

The successful candidate will be responsible for designing and building logical and physical data models of operational data marts/data warehouses based on analytics data store technology. This includes implementing data extraction and transformations from a variety of data sources, as well as building and optimizing data transformations, data quality routines and workflows. This is a technical role that ideally requires familiarity with Big Data analytics architecture and patterns using highly scalable, parallel processing data platforms. The candidate will interact with the product management team as well as the internal architecture team, and will work towards designing and building the logical and physical data models needed for our reporting and analytics engine.
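The data quality routines mentioned in the overview can be as simple as filtering malformed records before they reach the warehouse. A hypothetical Java sketch — the class name, record layout and validity rules are all assumptions for illustration:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical data quality routine: keep only records that have three
// non-empty comma-separated fields and a numeric amount in the last one.
public class DataQualityCheck {
    public static boolean isValid(String record) {
        String[] parts = record.split(",", -1); // -1 keeps trailing empties
        if (parts.length != 3) return false;
        for (String p : parts) {
            if (p.trim().isEmpty()) return false; // reject blank fields
        }
        try {
            Double.parseDouble(parts[2].trim()); // amount must parse
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    // Partition step of a workflow: pass through only the clean records.
    public static List<String> clean(List<String> records) {
        return records.stream()
                      .filter(DataQualityCheck::isValid)
                      .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = List.of(
            "widget,EMEA,100",
            "gadget,,250",         // missing region -> rejected
            "widget,APAC,notANum"  // non-numeric amount -> rejected
        );
        System.out.println(clean(raw)); // [widget,EMEA,100]
    }
}
```

A production workflow would route the rejected rows to a quarantine table for review rather than silently dropping them; this sketch only shows the validation step.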