Data Engineer @Tremend
This job is no longer active.
| Employer: | Tremend |
| Domain: | |
| Job type: | full-time |
| Job level: | 1–5 years of experience |
| Location: | Timisoara |
| Updated at: | 23-10-2025 |
| Remote work: | On-site |
Job Description
Tremend is looking for a Data Engineer to join our team of bright thinkers and doers. You’ll use your problem-solving creativity to solve our clients’ most complex and challenging problems across different industries. We are on a mission to transform the world, and you will be instrumental in shaping how we do it with your ideas, thoughts, and solutions.
Responsibilities:
- Design, implement, and test data pipelines, ETL/ELT processes, and data storage solutions
- Build and maintain scalable, reliable, and high-performance pipelines for structured and unstructured data
- Work with big data technologies and distributed systems to process and transform large datasets
- Collaborate with data modelers, data scientists, and solution architects to ensure efficient data flows and optimal storage
- Partner with data scientists and analysts to deliver clean, well-structured, and accessible data
- Coordinate with integration teams and data source owners for smooth ingestion of data from multiple systems
- Automate provisioning, deployments, and environment management for data platforms
- Create and support APIs and data services for internal and external consumers
- Ensure solutions meet requirements for scalability, performance, data quality, and governance
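The pipeline work described above can be sketched in miniature. This is a hedged, standard-library-only illustration of an extract-transform-load flow; the records, field names, and table schema are hypothetical, not part of the posting:

```python
import sqlite3

# Extract: pretend these raw records arrived from an upstream source
# (hypothetical data, for illustration only).
raw_events = [
    {"user": " Alice ", "amount": "10.5"},
    {"user": "bob", "amount": "3"},
]

def transform(record):
    # Transform: normalize the name and cast the amount to a float.
    return (record["user"].strip().title(), float(record["amount"]))

rows = [transform(r) for r in raw_events]

# Load: write the cleaned rows into a relational store (in-memory SQLite here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
```

In a production setting the same extract/transform/load shape would typically be expressed with tools named in this posting, such as Spark jobs orchestrated by Airflow, rather than plain Python and SQLite.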
Requirements:
- Hands-on experience with ETL/ELT design and implementation
- Familiarity with big data ecosystems (e.g., Spark, Hadoop, Kafka, Flink)
- Experience with one or more cloud platforms (Azure, AWS, GCP) and modern data engineering tools (e.g., Databricks, Snowflake, BigQuery, Synapse)
- Proficiency in programming languages such as Python, Java
- Solid understanding of SQL and experience with both relational and non-relational databases
- Experience with data modeling, schema design, and data partitioning strategies
- Knowledge of workflow orchestration tools (e.g., Airflow, Data Factory) is a plus
- Experience with version control (e.g., Git) and CI/CD practices
- Familiarity with Agile development methodologies and test management basics
- Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience)
- 2–3 years of professional experience in data engineering or related roles
- Strong teamwork, communication, analytical thinking, and problem-solving skills
- Fluent in English, both spoken and written
- Curiosity and drive to continuously learn, adapt, and share knowledge
What we offer:
- A fast-paced tech environment
- Continuous growth & learning
- Open feedback culture
- Room for own initiative & ideas
- Transparency about results & strategy
- Recognition & reward for hard work
- Working with a flexible schedule
- Medical subscription
- Meal tickets
- Extra vacation days - starting with 25 vacation days
- Many other perks
Benefits
- Performance bonus
- Annual bonus
- Medical subscription
- Dental subscription
- Extra days off
- Flexible work schedule
- Meal vouchers
- Holiday vouchers


