BrickxAI

Data Warehouse Engineer (dbt + Snowflake Specialist)

Posted: just now

Job Description

We are looking for an experienced Data Warehouse Engineer with strong expertise in dbt (data build tool), Snowflake, and ELT pipeline development to build an end-to-end, analytics-ready data platform for a fintech lending product. You will design and implement star schema data models, create fact and dimension tables, and develop incremental dbt transformations to support portfolio analytics, customer segmentation, and risk tracking.

The role involves creating scalable and auditable pipelines, ensuring data governance, and integrating with BI tools such as Tableau and Power BI. You’ll also establish CI/CD workflows, manage environment segregation, and automate orchestration through Airflow or dbt Cloud.

Key Responsibilities:

  • Design and implement Snowflake-based data warehouses using dbt.
  • Build staging, intermediate, and marts dbt models.
  • Enforce data quality via dbt tests (unique, not_null, relationships).
  • Automate data ingestion through Fivetran, Airbyte, or Snowpipe.
  • Support analytics and dashboarding teams with clean data models.

Qualifications:

  • 3+ years of experience with dbt, Snowflake, SQL, and data modeling.
  • Proven experience with ETL/ELT architecture and CI/CD pipelines.
  • Knowledge of data governance, version control, and orchestration tools.
  • Background in fintech or lending analytics preferred.
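For candidates less familiar with this stack, the kind of incremental dbt model and schema tests the posting describes might look like the sketch below. All model and column names (fct_loan_payments, stg_payments, dim_loans, payment_id, and so on) are hypothetical illustrations, not taken from the role:

```sql
-- models/marts/fct_loan_payments.sql (hypothetical model name)
-- Incremental materialization: on incremental runs, only rows newer than
-- the current max(payment_ts) already in the target table are processed.
{{ config(materialized='incremental', unique_key='payment_id') }}

select
    payment_id,
    loan_id,
    customer_id,
    payment_amount,
    payment_ts
from {{ ref('stg_payments') }}

{% if is_incremental() %}
  -- applies only on incremental runs, not on full refreshes
  where payment_ts > (select max(payment_ts) from {{ this }})
{% endif %}
```

The dbt tests named above (unique, not_null, relationships) are declared alongside the model in a YAML file, for example:

```yaml
# models/marts/schema.yml (hypothetical)
version: 2
models:
  - name: fct_loan_payments
    columns:
      - name: payment_id
        tests: [unique, not_null]
      - name: loan_id
        tests:
          - relationships:
              to: ref('dim_loans')
              field: loan_id
```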

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
