Job Description

About the Role

We are seeking an experienced Azure Databricks & Airflow Engineer to join our Data Engineering team on a 6-month contract. The ideal candidate will have strong hands-on expertise in Data Engineering, Databricks, Apache Airflow, PySpark, Python, Unity Catalog, and Data Modelling, ideally within banking or financial services. The role involves designing, building, and optimizing scalable data pipelines, ensuring data governance, and enabling robust analytics and reporting capabilities.

Key Responsibilities

  • Develop and maintain data pipelines using Azure Databricks, Apache Airflow, PySpark, Python, and related Azure services
  • Implement and manage Unity Catalog for data governance, lineage, access controls, and cataloging of datasets
  • Design and optimize data models (Dimensional, Star, and Snowflake schemas) for analytical reporting and data warehousing in financial environments
  • Monitor, optimize, and troubleshoot ETL/ELT workflows, ensuring high performance, reliability, and security
  • Collaborate with Data Architects, Analysts, and business stakeholders to understand requirements and translate them into scalable technical solutions
  • Implement best practices for DevOps, CI/CD, version control (Git), and deployment automation
  • Ensure data quality, integrity, and compliance with security policies and regulatory standards (GDPR, SOX, BCBS, etc.)
  • Work with cross-functional teams to support AI/ML readiness and prepare datasets for advanced analytics
  • Provide documentation, knowledge transfer, and support for ongoing operations and handover to internal teams

Required Skills & Experience

Technical Skills:

  • Strong hands-on experience with Azure Databricks (Delta Lake, Unity Catalog)
  • Proven experience with Apache Airflow (scheduling, orchestration, DAG development)
  • Advanced programming skills in PySpark and Python
  • Solid knowledge of Data Modelling (OLTP, OLAP, Dimensional Modelling)
  • Experience with Azure Data Lake, Azure Data Factory, Azure SQL, and Azure DevOps
  • Strong understanding of data governance, data lineage, access management, and metadata cataloging
  • Experience in SQL performance optimization, ETL frameworks, and Delta Lake architecture

Domain & Other Skills:

  • Experience in Banking, Insurance, or Financial Services is strongly preferred
  • Familiarity with regulatory/compliance requirements (e.g., GDPR, SOX, BCBS, PCI-DSS)
  • Strong analytical and problem-solving skills
  • Excellent communication and documentation abilities
  • Ability to work in agile, collaborative team environments

Nice to Have

  • Experience with MLflow, Azure Cognitive Services, or AI data pipelines
  • Knowledge of Power BI, SSIS, or Synapse Analytics
  • Exposure to CI/CD pipelines using Azure DevOps or GitHub Actions

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period