Monday, October 27, 2025

Job Description

Responsibilities

  • Build and maintain robust ETL/ELT pipelines to transform raw data into clean, structured datasets.
  • Ensure data solutions are reliable, high-performing, and production-ready.
  • Ensure strong data governance, security, and quality control across all pipelines and datasets.
  • Write clean, well-documented code and support testing, automation, and version control practices.
  • Collaborate closely with analysts, business stakeholders, and IT teams to understand data needs and deliver user-friendly data solutions.
  • Contribute to a knowledge-sharing culture and help raise the technical standard of the team.
  • Continuously refine and optimize data models (e.g., star schema) to improve reporting efficiency.
  • Proactively identify and address performance bottlenecks and inefficiencies in data workflows.

Requirements

  • Minimum 4-5 years of experience in data engineering, with at least 2 years working in AWS cloud-based environments.
  • Experience building and managing data models and data pipelines for large-scale data environments.
  • Proficiency in AWS tools: S3, Glue, Redshift, Lambda, Step Functions.
  • Strong SQL and Python skills for data transformation and automation.
  • Hands-on experience working with various data sources, including structured and unstructured data.
  • Solid grasp of data warehousing concepts, data modeling, and modern ELT/ETL practices.
  • Strong problem-solving skills and ability to work independently within a cross-functional team.

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
