Big Data Developer

Employer: Brainspotting
Domain: IT Software
Job type: full-time
Job level: 1 - 5 years of experience
Location: Bucharest; nationwide
Updated at: 11.08.2019
    Short company description

    We are the leading technology recruitment and selection consultancy in Romania, working on highly specialized technologies for permanent and interim positions. Since our inception in 2001 as the first specialized IT recruitment agency in Romania, we have supported over 400 national and global clients in acquiring strategic IT talent for their business.

    We are deeply passionate about technology and take great responsibility in our work. We understand the pressures of business and the fact that you need fast, quality results. That is why we always go the extra mile to deliver the best IT&C talent for your business, no matter how challenging the project may seem.

    Requirements

    To complete the ideal candidate profile, you need to have:
    • BS or higher in Computer Science or related discipline
    • 1.5+ years of experience in software development and general database concepts
    • Experience with object-oriented or functional scripting languages such as Python, Java, or Scala
    • Experience with relational database internals, including both query processing and query planning, or other data processing infrastructure
    • Basic knowledge of key data structures and algorithms
    • Knowledge of data modelling and understanding of different data structures and their benefits and limitations under particular use cases
    • Proficient understanding of distributed computing principles
    • Mandatory experience with Apache Spark and HDFS; Apache Kafka, Kerberos, and Elasticsearch are a plus
    • Experience with NoSQL databases such as HBase, Cassandra, or MongoDB (preferably HBase)
    • Good knowledge of big data querying tools such as Pig, Hive, and Impala (preferably Hive)
    • Experience with managing a Hadoop cluster, including all of its services, is a plus
    • Good knowledge of data warehousing solutions
    • Ability to quickly become familiar with unfamiliar code in order to analyze and improve it
    • Experience with version control software (preferably Git)
    • Experience with Agile methodologies
    • Good English skills (written and spoken)

    Responsibilities

    • Design and build infrastructure focused on large-scale extraction, preparation, and loading of data from a variety of sources, turning information into insights across multiple platforms.
    • Work closely with other data and analytics team members to design, develop, maintain and evaluate big data solutions.
    • Develop prototypes and proof of concepts for the selected solutions
    • Extract data from a variety of sources, such as relational databases, NoSQL databases, and distributed file systems.
    • Write clean, well-engineered, maintainable code that conforms to accepted standards
    • Develop quality code through unit and functional testing
    • Participate in iteration planning and team stand-up meetings
    • Work with talented and determined engineers and designers to shape the future of big data solutions
    • Discover continuous learning opportunities in your everyday activity
    • Add your own mix of flavours to our dynamic and innovative team
    • Connect with passionate people in our open and friendly environment.

    Other info

    Our client is a product development consulting company focused on designing and developing scalable, high-performance web and mobile applications. For them, creating software is more than writing clean, well-engineered, maintainable code: it is about building a team that consistently delivers on time and within budget.
    They target a continuous deployment process, constantly striving to improve their code quality and test coverage and to choose the best stack for the job, while refining estimates and hitting milestones.