Linkedprime
PwC Middle East

ETIC, Data Engineer (Snowflake), Senior Associate


Job Description

Line of Service: Advisory
Industry/Sector: Technology
Specialism: Advisory - Other
Management Level: Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Key Responsibilities

  • Design and build robust ELT pipelines using Snowflake, integrating data from various structured and unstructured sources.
  • Develop and maintain data models, including star/snowflake schemas, data marts, and enterprise data warehouses on Snowflake.
  • Implement data transformation logic using Snowflake SQL, Streams & Tasks, and tools like DBT (Data Build Tool).
  • Optimize the performance of Snowflake warehouses, queries, and materialized views to support business-critical analytics.
  • Implement data governance, security, and access control policies within Snowflake using RBAC and masking policies.
  • Work closely with data analysts, BI developers, and data scientists to understand data requirements and deliver high-quality datasets.
  • Automate CI/CD workflows for Snowflake objects and transformations using tools like GitHub, Azure DevOps, or GitLab.
  • Monitor and troubleshoot data pipeline issues, ensuring reliability, consistency, and accuracy.
  • Document data workflows, lineage, and architecture for maintainability and knowledge sharing.

Required Skills & Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
  • Strong knowledge of SQL and data modeling best practices (e.g., Kimball, Inmon).
  • Hands-on experience with Snowflake-specific features: Virtual Warehouses, Streams, Tasks, Time Travel, Cloning, and Secure Views.
  • Experience with ETL/ELT tools such as DBT, Matillion, Fivetran, Informatica, or Azure Data Factory.
  • Familiarity with cloud platforms (Azure, AWS, or GCP), especially their Snowflake integrations.
  • Proficiency in at least one scripting/programming language: Python, Scala, or Java.
  • Strong understanding of data governance, lineage, and metadata management.
  • Version control and CI/CD experience using Git, Terraform, or similar tools.

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required:
Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills: Azure Data Lake, Microsoft Azure, Microsoft Azure SQL Database, Snowflake Data Warehouse

Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
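For candidates unfamiliar with the Streams & Tasks pattern named in the responsibilities, here is a minimal Snowflake SQL sketch of a change-capture ELT step. All object names (raw_orders, orders_stream, merge_orders_task, transform_wh, curated.orders) are hypothetical, not from the posting:

```sql
-- Capture row-level changes on a hypothetical staging table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- A scheduled task that merges new changes into a curated table,
-- running only when the stream actually contains data.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO curated.orders AS tgt
  USING orders_stream AS src
    ON tgt.order_id = src.order_id
  WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount)
    VALUES (src.order_id, src.amount);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;
```

Consuming the stream inside a DML statement (the MERGE) advances its offset, so each run processes only rows changed since the previous run.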
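The RBAC and masking-policy work mentioned above can likewise be sketched in a few statements. Again, the role, database, and column names (analyst_role, analytics.curated.customers, email) are illustrative assumptions only:

```sql
-- A role-based dynamic masking policy: only the analyst role sees raw emails.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_ROLE') THEN val
    ELSE '***MASKED***'
  END;

-- Attach the policy to a column.
ALTER TABLE analytics.curated.customers
  MODIFY COLUMN email SET MASKING POLICY email_mask;

-- RBAC: grant the analyst role read access to the schema and table.
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.curated TO ROLE analyst_role;
GRANT SELECT ON TABLE analytics.curated.customers TO ROLE analyst_role;
```

Because the policy evaluates CURRENT_ROLE() at query time, masking follows the role hierarchy automatically and no per-user configuration is needed.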

