Gmass HR Solutions VietNam

Data Engineer Lead (Snowflake, Lakehouse Architecture)


Job Description

Salary: Up to 70 Mil

  • Lead the full architecture and development of a modern, cloud-native data platform on Snowflake, aligned with Lakehouse principles.
  • Build and maintain advanced ETL/ELT pipelines with strong reliability, performance, scalability, and reusability.
  • Design and optimize analytical data models (star schema, normalized models) for BI, analytics, and machine learning use cases.
  • Partner closely with data scientists and AI engineers to deliver training datasets, model-scoring pipelines, and data services supporting AI/agent applications.
  • Manage ingestion and integration of both structured and unstructured data while ensuring robust data governance, security, and compliance.
  • Promote best practices in code quality, documentation, testing, and CI/CD across the data engineering team.
  • Mentor junior and mid-level engineers, perform code reviews, and cultivate a high-standards engineering culture.
  • Collaborate directly with global product and business teams in fluent English, covering requirement gathering, solution design, and presenting architectural recommendations.
  • Support the development of AI agents and intelligent data services leveraging real-time data and vector-based retrieval.
  • Lead performance optimization, cost control, and scalable infrastructure management across the data ecosystem.

Required Working Experience

  • 5+ years of hands-on experience in data engineering or large-scale data platform development.

Required Languages

  • Excellent English communication skills; comfortable working with international teams and senior stakeholders.

Required Knowledge and Skills

  • Demonstrated expertise with Snowflake, including architecture, performance tuning, cost optimization, and ELT design. (Experience with Databricks/Spark/Delta Lake is a plus.)
  • Advanced proficiency in Python and SQL with the ability to deliver production-quality code.
  • Practical experience building ETL/ELT workflows using Airflow, dbt, or similar orchestration frameworks.
  • Strong knowledge of Lakehouse architecture, data warehousing concepts, and large-scale performance optimization.
  • Solid understanding of cloud platforms (AWS, Azure, or GCP).
