ML Data Architect (6-month contract; remote)

Employer: FreelancerIT
Field:
  • IT Software
  • Job type: project-based
    Experience level: over 5 years of experience
    Cities:
  • Nationwide
    Updated on: 04.06.2020
    Short company description

    First and foremost, we believe in the people on this site: those who want to develop and grow, whether small companies or IT professionals, and we know that we can create something beautiful together. We believe in their dreams, we believe in your dreams, and we intend to prove the quality of the Romanian IT industry.

    Requirements

    -BS or MSc degree in a relevant field with at least 3 years of relevant industry experience
    -Experience as a data architect designing and implementing enterprise data platforms
    -Experience as a hands-on developer with Python, SQL, and Linux scripting
    -Detail orientation with strong analytical and troubleshooting skills
    -Self-motivated and focused
    -Comfortable collaborating with geographically dispersed teams
    -Excellent written and spoken communication skills
    -Conversant in machine learning fundamentals
    -Take pride in finding ways to engineer things better, faster, and more correctly, with a focus on automation

    Nice to have:
    -Experience with open-core state/configuration management software such as Puppet (Ruby-based)
    -Experience with Snowflake and other MPP columnar storage SaaS solutions
    -Experience with RStudio, MATLAB, Databricks, or Anaconda ML solutions
    -Experience with data management for entertainment and SVOD/AVOD streaming
    -Experience with live streaming gaming or esports data collection and analytics
    -Experience architecting user interfaces for custom HTTPS applications

    Responsibilities

    -Architect and integrate data versioning platforms such as DVC (similar to Git for code)
    -Architect and develop international MAM and DAM data platforms and integrations for digital media and other ML assets
    -Architect, integrate and implement data transfer software such as Aspera or Signiant
    -Architect, integrate and implement workflow management with Airflow, Luigi or Jenkins
    -Develop API data connections with NAS, RAID, cloud, and HDFS data stores
    -Develop application REST API and SOAP web service integrations
    -Script SQL and NoSQL queries for acquiring, retrieving and augmenting data assets and metadata from NAS, DynamoDB, MongoDB, Snowflake, S3 and Blob storage
    -Python scripting for workflows, jobs, ETLs and machine learning execution
    -Architect and integrate enterprise NAS storage
    -Familiarity with PyTorch and/or TensorFlow frameworks
    -Creation, population and maintenance of XML and JSON data schemas
    -Develop and automate third-party application REST APIs
    -Scripting of runtime languages like Node.js
    -Architect and implement Kubernetes and Docker integrations
    -Architect and support GPU ML infrastructure and data integrations
    -Architect and support data backups, replication and failover