Monday, October 27, 2025

Job Description

We are seeking a skilled and detail-oriented data professional to design, build, and maintain scalable data solutions that enable data-driven decision-making across the organization. The ideal candidate will have hands-on experience with Azure Data Factory (ADF), Azure Databricks, and CI/CD pipelines, along with strong analytical and data modeling capabilities.

Key Responsibilities

  • Design, develop, and maintain robust data pipelines using Azure Data Factory (ADF) and Databricks.
  • Collaborate with business and technical stakeholders to gather, analyze, and translate data requirements into scalable solutions.
  • Develop and optimize ETL/ELT workflows to integrate data from various sources into centralized data models or data warehouses.
  • Build and maintain data models (conceptual, logical, and physical) that ensure the consistency, quality, and accessibility of enterprise data.
  • Implement and maintain CI/CD pipelines for automated deployment and version control of data solutions.
  • Conduct data analysis and validation to ensure the accuracy, integrity, and completeness of data across environments.
  • Create dashboards, reports, or analytics views to support business insights and data visualization needs.
  • Work closely with cross-functional teams (data scientists, analysts, business users) to improve data quality, performance, and governance.
  • Ensure security, compliance, and documentation standards are followed throughout the data lifecycle.

Required Skills & Experience

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Statistics, or a related field.
  • 3–8 years of experience in data engineering, analytics, or modeling (depending on role level).
  • Strong proficiency with Azure Data Factory (ADF) and Azure Databricks for pipeline orchestration and data processing.
  • Hands-on experience with SQL, Python, and data transformation frameworks.
  • Solid understanding of data warehousing concepts, data modeling (dimensional, 3NF), and ETL best practices.
  • Experience with CI/CD tools (Azure DevOps, Git, or similar) for automated deployment.
  • Knowledge of cloud platforms (Azure, AWS, or GCP) and related data ecosystem tools.
  • Strong analytical thinking, problem-solving, and communication skills.

Preferred (Nice-to-Have)

  • Experience with Power BI, Synapse Analytics, or Snowflake.
  • Exposure to Data Vault, Medallion Architecture, or Lakehouse design principles.
  • Understanding of data governance, metadata management, and data security practices.
  • Familiarity with Agile/Scrum project methodologies.

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period