Data Engineer – DHW Cloud
This job is no longer active.
| Employer: | Evolve today |
| Domain: | |
| Job type: | full-time |
| Job level: | 1–5 years of experience |
| Location: | |
| Updated at: | 06-12-2025 |
| Remote work: | Hybrid |
Short company description
At Evolve today we offer comprehensive Human Resources services, from business consulting to headhunting and recruitment. We have extensive experience in finding talented professionals across a variety of industries, especially the IT sector.
We started with a dream: to create a Human Resources company with a strong, unique profile on the market, deeply shaped by our core values: commitment to performance, professionalism, a balanced approach, and deep satisfaction for our clients.
Since then, we have nurtured special relationships and evolved continuously, staying flexible while remaining true to our values. We aim ever higher, and we invite you to EVOLVE with us!
Requirements
We’re recruiting for a forward-thinking tech team that’s building one of the most modern cloud analytics platforms in Europe. As a Data Engineer focused on Cloud Data Warehousing, you’ll help shape a scalable data mesh architecture that powers high-impact AI and business intelligence solutions.
You’ll work with terabyte-scale relational data, enabling advanced data services—from deep learning models that predict customer demand to real-time analytics that drive strategic decisions.
📍 Hybrid Program – Bucharest | 🌐 GCP Ecosystem | 🧠 AI & Analytics-Driven
Ready to turn data into decisions? Apply now and let’s build the future together.
Your Profile:
- 5+ years of experience in data engineering within a cloud environment (GCP preferred).
- Strong Python skills and a deep understanding of database architectures (MPP experience is a plus).
- Solid background in relational data management and data warehousing at scale.
- Familiarity with streaming/messaging tools (Pub/Sub, Kafka).
- Analytical mindset with a structured, solution-oriented approach.
- Agile, proactive, and accountable work style.
- Fluent English; conversational German (A2+) is a bonus.
Responsibilities
What You’ll Do:
- Build and optimize data pipelines in Google Cloud Platform using BigQuery, Dataflow, Python, and Kubernetes.
- Design and evolve a custom data management framework with a layered architecture (data vault + business layer).
- Collaborate with Data Scientists, Analysts, and fellow Engineers to deliver clean, scalable, business-ready data.
- Contribute to a self-built architecture using Google APIs/SDKs, GitLab CI/CD, and modern DevOps practices.
- Support streaming and messaging integrations (Pub/Sub, Kafka) for real-time data flows.
- Ensure high data quality and service reliability across the mesh and stakeholder-facing layers.
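To give a feel for the "data vault + business layer" pattern mentioned above, here is a minimal, illustrative Python sketch (not the team's actual framework; all class and field names are hypothetical). It models a raw-vault hub with hashed business keys and satellites carrying descriptive attributes, then projects them into a flat business-layer record:

```python
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone

def business_key_hash(key: str) -> str:
    """Deterministic hash of a business key, as commonly used for hub keys in a data vault."""
    return hashlib.sha256(key.encode("utf-8")).hexdigest()

@dataclass
class Hub:
    """Raw-vault hub: one row per unique business key."""
    business_key: str
    hash_key: str = ""
    load_ts: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def __post_init__(self) -> None:
        self.hash_key = business_key_hash(self.business_key)

@dataclass
class Satellite:
    """Raw-vault satellite: descriptive attributes attached to a hub by its hash key."""
    hub_hash_key: str
    attributes: dict
    load_ts: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def to_business_layer(hub: Hub, sats: list[Satellite]) -> dict:
    """Flatten a hub and its satellites into one business-layer record;
    later satellite loads overwrite earlier attribute values."""
    record = {"customer_id": hub.business_key}
    for sat in sorted(sats, key=lambda s: s.load_ts):
        record.update(sat.attributes)
    return record

# Example: load one customer into the raw vault, then project it.
hub = Hub(business_key="CUST-001")
sats = [
    Satellite(hub.hash_key, {"name": "Ada"}),
    Satellite(hub.hash_key, {"segment": "enterprise"}),
]
print(to_business_layer(hub, sats))
# → {'customer_id': 'CUST-001', 'name': 'Ada', 'segment': 'enterprise'}
```

In a real pipeline the hubs and satellites would live in BigQuery tables and be loaded via Dataflow or streaming inserts; the sketch only shows the separation between the immutable raw vault and the consumer-facing business layer.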


