Tuesday, October 28, 2025
Hipo.ro

Cloud Data Engineer @Suvoda

Posted: 5 days ago

Job Description

Short Company Description

Our team of HR specialists constantly selects the most interesting professional opportunities, permanently monitoring the websites of companies in Romania and abroad. This gives you quick and easy access to valuable jobs that you might otherwise miss.

Requirements

Suvoda is seeking a skilled and driven Data Engineer to help evolve our data platform towards a data mesh architecture. In this role, you'll design and build domain-oriented data products and support near real-time reporting. You'll work on building and optimizing ETL/ELT pipelines using AWS Glue and PySpark, ensuring scalable, high-performance data processing across our platform.

Responsibilities:

  • Contribute to the design and implementation of a data mesh architecture using GraphQL APIs to expose domain-owned data products
  • Build and maintain a modern AWS-based data lake using S3, Glue, Lake Formation, Athena, and Redshift
  • Develop and optimize ETL/ELT pipelines using AWS Glue and PySpark to support batch and streaming data workloads
  • Implement AWS DMS pipelines to replicate data into Aurora PostgreSQL for near real-time analytics and reporting
  • Support data governance, quality, observability, and API design best practices
  • Collaborate with product, engineering, and analytics teams to deliver robust, reusable data solutions
  • Contribute to automation and CI/CD practices for data infrastructure and pipelines
  • Stay current with emerging technologies and industry trends to help evolve the platform

Requirements:

  • Bachelor's degree in a technical field such as Computer Science or Mathematics
  • At least 4 years of experience in data engineering, with demonstrated ownership of complex data systems
  • Solid experience with AWS data lake technologies (S3, Glue, Lake Formation, Athena, Redshift)
  • Understanding of data mesh principles and decentralized data architecture
  • Proficiency in Python and SQL
  • Experience with data modeling, orchestration tools (e.g., Airflow), and CI/CD pipelines
  • Strong communication and collaboration skills

Preferred Qualifications:

  • Master's degree, especially with a focus on data engineering, distributed systems, or cloud architecture
  • Hands-on experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation)
  • Expertise in AWS Glue and PySpark for scalable ETL/ELT development
  • Experience with event-driven architectures (e.g., Kafka, Kinesis)
  • Familiarity with data cataloging and metadata management tools
  • Knowledge of data privacy and compliance standards (e.g., GDPR, HIPAA)
  • Background in agile development and DevOps practices
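The responsibilities above center on ETL/ELT pipelines. For candidates unfamiliar with the pattern, the sketch below shows the extract-transform-load steps such a pipeline performs, in plain Python with no AWS or Spark dependencies; all record shapes and field names (`patient_id`, `visit_count`) are hypothetical and not taken from the posting.

```python
# Minimal extract-transform-load sketch. In a Glue/PySpark job, each stage
# would operate on DataFrames against S3/Redshift; here plain Python lists
# stand in so the flow is easy to follow. All names are illustrative.

def extract(raw_rows):
    """Extract: parse raw CSV-style lines into dicts."""
    header, *rows = raw_rows
    cols = header.split(",")
    return [dict(zip(cols, r.split(","))) for r in rows]

def transform(records):
    """Transform: drop incomplete records, cast types, derive a flag."""
    cleaned = []
    for rec in records:
        if not rec.get("patient_id"):   # data-quality rule: id required
            continue
        visits = int(rec["visit_count"])
        cleaned.append({
            "patient_id": rec["patient_id"],
            "visit_count": visits,
            "active": visits > 0,       # derived field
        })
    return cleaned

def load(records, sink):
    """Load: append cleaned records to a destination (here, a list)."""
    sink.extend(records)
    return len(records)

raw = ["patient_id,visit_count", "p1,3", ",2", "p2,0"]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # → 2 (the record with a missing patient_id is dropped)
```

The same shape scales up: swap the list operations for PySpark DataFrame transforms and the in-memory sink for a Redshift or S3 target, and the pipeline structure is unchanged.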

