Hakkōda, an IBM Company

Senior Consultant - Data Engineer

Posted: 1 hour ago


Job Description

Introduction

A career in IBM Consulting is built on long-term client relationships and close collaboration worldwide. You’ll work with leading companies across industries, helping them shape their hybrid cloud and AI journeys. With support from our strategic partners, robust IBM technology, and Red Hat, you’ll have the tools to drive meaningful change and accelerate client impact. At IBM Consulting, curiosity fuels success. You’ll be encouraged to challenge the norm, explore new ideas, and create innovative solutions that deliver real results. Our culture of growth and empathy focuses on your long-term career development while valuing your unique skills and experiences.

Your Role And Responsibilities

What we are looking for: We are looking for a Sr. Consultant, Data Engineer to join our growing team of experts. This position involves the design and development of Snowflake Data Cloud solutions, including data ingestion pipelines, data architecture, data governance, and security. The ideal candidate is an experienced builder of data pipelines and migrations who enjoys optimizing data systems and building them from the ground up.

Key Responsibilities

Data Engineering & Solution Delivery

  • Build and maintain scalable data pipelines, ingestion frameworks, and ELT/ETL processes using modern cloud technologies.
  • Implement ingestion and CDC patterns using tools like Fivetran, Qlik Replicate, or native cloud services.
  • Develop transformation logic using SQL, dbt, Python, and DataOps best practices.
  • Engineer data models across Dimensional, Data Vault, Lakehouse, and semantic layers.
  • Optimize pipelines for performance, reliability, cost efficiency, and scalability.

Platform Engineering

  • Develop and optimize cloud-native data platforms (Snowflake, Databricks, AWS, Azure, GCP).
  • Implement warehouse/lakehouse structures, storage standards, governance rules, and data quality frameworks.
  • Leverage orchestration tools such as Airflow, dbt Cloud, or native cloud schedulers to deploy and operate pipelines.

Consulting & Client Partnership

  • Engage with client technical and business teams to gather requirements, define solutions, and ensure clarity.
  • Communicate progress, blockers, and recommendations clearly and professionally.
  • Participate in architecture reviews and support Solution Architects in defining best-fit approaches.
  • Serve as a trusted advisor during delivery, balancing technical execution with client expectations.

Data Governance, Quality & Security

  • Implement data validation, observability, and monitoring frameworks to ensure trust and reliability.
  • Apply RBAC, data classification, masking, encryption, and compliance best practices.
  • Document technical solutions, lineage, standards, and handover materials.

Mentorship & Team Leadership

  • Support and mentor junior and mid-level engineers, promoting Hakkoda’s engineering standards and delivery excellence.
  • Conduct code reviews, provide technical guidance, and contribute to the team’s continuous improvement.
  • Collaborate on internal accelerators, reusable templates, and Hakkoda-branded methodologies.

Innovation & Continuous Learning

  • Stay current with emerging technologies across Snowflake, Databricks, AI/ML, streaming platforms, and modern data tooling.
  • Contribute to internal communities of practice, accelerators, and thought leadership.
  • Share learnings and promote a culture of experimentation and continuous improvement.

Required Technical And Professional Expertise

  • 6+ years of experience in Data Engineering or related technical roles.
  • Strong SQL expertise and experience with ELT/ETL pipeline development.
  • Hands-on experience with at least one major cloud platform (AWS, Azure, GCP).
  • Solid experience with modern data platforms (Snowflake, Databricks, BigQuery, Redshift, Synapse).
  • Proficiency in Python or similar languages for data engineering tasks.
  • Knowledge of data modeling (Dimensional, Data Vault, Lakehouse).
  • Experience with CI/CD workflows and version control (GitHub, Bitbucket).
  • Consulting or client-facing delivery experience.
