Data & Databricks Test Automation Engineer
Job Description
Company Overview
Citco is a global leader in financial services, delivering innovative solutions to some of the world's largest institutional clients. We are seeking a Test Automation Engineer specializing in Databricks and data platforms to ensure the quality and reliability of our data solutions.

Role Description
As a Data & Databricks Test Automation Engineer, you will be responsible for developing and implementing automated testing frameworks for Databricks-based solutions, data pipelines, and data quality validation. You will work closely with data engineering teams to ensure data accuracy and reliability across our Lakehouse architecture.

Key Responsibilities

Databricks Testing
- Design and implement automated testing for Databricks notebooks and workflows
- Create test frameworks for Delta Lake tables and ACID transactions
- Develop automated validation for structured streaming pipelines
- Validate Delta Live Tables implementations

Data Pipeline Testing
- Automate testing for ETL/ELT processes in Databricks
- Implement Spark job testing and optimization validation
- Create test cases for data ingestion and processing workflows
- Develop automated checks for data transformations
- Test Unity Catalog features and access controls

Quality Assurance
- Design and execute data quality test strategies
- Implement automated data reconciliation processes
- Develop performance testing for large-scale Spark jobs

Monitoring & Reporting
- Implement pipeline monitoring test frameworks
- Create automated test dashboards
- Generate quality metrics and testing reports
- Maintain comprehensive test documentation

Requirements & Qualifications

Educational Background
- Bachelor's degree in Computer Science, Data Science, or a related field
- Relevant certifications in Databricks or data testing are a plus

Technical Experience
- 2+ years of hands-on experience with Databricks (Spark)
- Strong programming skills in Python (PySpark) and SQL
- Experience with data testing frameworks and tools
- Knowledge of AWS services (S3, Glue, Lambda)
- Understanding of Delta Lake and Lakehouse architecture
- Experience with version control systems (Git)

Additional Skills
- Strong analytical and problem-solving abilities
- Experience with large-scale data processing
- Knowledge of data quality best practices
- Understanding of data governance and compliance requirements
- Experience with Agile methodologies

Platform Knowledge
- Databricks workspace and notebook development
- Delta Lake and Delta Live Tables
- Unity Catalog for governance testing
- Spark optimization and performance testing
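
Example of the Work
To give candidates a concrete sense of the automated data-quality and reconciliation testing described above, here is a minimal sketch using PySpark with pytest. The table, column, and test names are illustrative assumptions (they are not taken from the posting), and a plain local SparkSession with in-memory data stands in for the Delta tables a real suite would target.

```python
# Minimal sketch of automated data-quality checks in PySpark + pytest.
# Assumptions (illustrative, not from the posting): pytest as the test
# runner, a local SparkSession, and made-up trade data/column names.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


@pytest.fixture(scope="session")
def spark():
    # Local SparkSession for the test run; on Databricks one is provided.
    session = (
        SparkSession.builder
        .master("local[2]")
        .appName("data-quality-tests")
        .getOrCreate()
    )
    yield session
    session.stop()


@pytest.fixture()
def trades(spark):
    # Small in-memory stand-in for a curated (e.g. Delta) table.
    return spark.createDataFrame(
        [("T1", "EUR", 100.0), ("T2", "USD", 250.5), ("T3", "USD", 75.25)],
        ["trade_id", "currency", "amount"],
    )


def test_key_columns_have_no_nulls(trades):
    # Data-quality rule: business-key columns must always be populated.
    null_rows = trades.filter(
        F.col("trade_id").isNull() | F.col("currency").isNull()
    ).count()
    assert null_rows == 0


def test_source_to_target_reconciliation(spark, trades):
    # Reconciliation rule: row count and total amount must match the source.
    source = spark.createDataFrame(
        [("T1", 100.0), ("T2", 250.5), ("T3", 75.25)],
        ["trade_id", "amount"],
    )
    assert trades.count() == source.count()
    target_total = trades.agg(F.sum("amount")).first()[0]
    source_total = source.agg(F.sum("amount")).first()[0]
    assert target_total == pytest.approx(source_total)
```

In practice, the same pattern would read the post-transformation Delta tables instead of in-memory data and run as part of the pipeline's CI or scheduled validation jobs.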