Key Responsibilities
· Build and optimize ETL workflows using Databricks (PySpark, SQL); a sketch of this kind of pipeline follows the lists below
· Develop data warehousing solutions in Snowflake, ensuring high performance
· Implement data quality, lineage, and governance practices
· Ingest data from multiple banking sources (APIs, FTP, transaction systems)
· Collaborate with analysts, product owners, and compliance teams
· Ensure encryption, role-based access, and data masking for sensitive banking data
· Automate workflows and CI/CD pipelines using tools like Airflow, Azure DevOps, or Terraform; see the Airflow sketch below

Required Skills
· Expertise in Snowflake, Databricks, and SQL
· Experience working with financial data (banking domain preferred)
· Proficiency in Python, PySpark, and cloud platforms (Azure, AWS, or GCP)
· Familiarity with data governance, security standards, and compliance frameworks
· Strong understanding of data modeling and performance tuning in large-scale systems
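To make the first responsibility concrete, here is a minimal PySpark sketch of a Databricks-style ETL step: ingest raw transactions, apply simple quality gates, mask the account identifier, and write curated output. The paths, column names (transaction_id, account_id, amount, business_date), and the masking rule are illustrative assumptions, not part of the posting.

    # Minimal sketch of a Databricks-style ETL step; all paths,
    # column names, and the masking rule are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("banking-etl-sketch").getOrCreate()

    # Ingest raw transactions (hypothetical landing path and schema).
    raw = spark.read.json("/mnt/landing/transactions/")

    # Quality gates: drop records missing a key, reject non-positive amounts.
    clean = (
        raw.dropna(subset=["transaction_id", "account_id"])
           .filter(F.col("amount") > 0)
    )

    # Mask the account identifier, keeping only the last four
    # characters for traceability.
    masked = clean.withColumn(
        "account_id",
        F.concat(F.lit("****"), F.substring("account_id", -4, 4)),
    )

    # Write curated output partitioned by business date (hypothetical target;
    # assumes a business_date column exists in the source).
    masked.write.mode("overwrite").partitionBy("business_date").parquet(
        "/mnt/curated/transactions/"
    )

In practice the masking rule would come from the bank's data-governance policy rather than being hard-coded in the job; the point of the sketch is the shape of the work, not a specific standard.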
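For the workflow-automation bullet, a minimal Airflow sketch of a daily DAG with one placeholder task. The DAG id, schedule, and run_etl callable are hypothetical, and the keyword argument schedule assumes Airflow 2.4 or later (earlier versions use schedule_interval).

    # Minimal Airflow DAG sketch; DAG id, schedule, and the
    # run_etl callable are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def run_etl():
        """Placeholder for triggering the Databricks/Snowflake ETL step."""
        print("ETL step would run here")


    with DAG(
        dag_id="banking_etl_sketch",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="run_etl", python_callable=run_etl)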