XPT Software

Data Solution Designer

Posted: 7 hours ago

Job Description

Role Overview:
We are seeking an experienced Data Solution Designer to design and implement scalable, secure, and high-performing data solutions for our financial services client. The ideal candidate will have strong expertise in Data Warehousing, Data Lakes, and modern data platforms such as Databricks, with a proven ability to translate complex business requirements into robust technical architectures.

Key Responsibilities:

  • Design end-to-end data architecture and solutions aligned with business and regulatory needs in the financial domain.
  • Develop and optimize Data Lake and Data Warehouse solutions to support analytics, reporting, and data science use cases.
  • Collaborate with stakeholders, data engineers, and analysts to define data models, ETL/ELT pipelines, and data governance frameworks.
  • Leverage Databricks and cloud platforms (Azure/AWS/GCP) for scalable data processing, integration, and analytics.
  • Ensure compliance with financial data security, privacy, and regulatory requirements.
  • Define data standards, architecture principles, and best practices across data platforms.
  • Conduct performance tuning, capacity planning, and solution reviews to ensure efficiency and scalability.

Required Skills and Experience:

  • 8+ years of experience in data architecture, solution design, or data engineering roles.
  • Strong hands-on experience with Data Warehousing (e.g., Snowflake, Synapse, Redshift, BigQuery) and Data Lake technologies.
  • Proficiency with Databricks, Spark, and distributed data processing frameworks.
  • Deep understanding of ETL/ELT design, data modeling (dimensional and relational), and data governance.
  • Experience in the financial services, banking, or insurance domains, with knowledge of data security and compliance.
  • Strong understanding of cloud-native data architectures.
  • Excellent communication and documentation skills for working with technical and business stakeholders.

Preferred Qualifications:

  • Experience with data cataloging, metadata management, or data lineage tools.
  • Knowledge of Python, SQL, and CI/CD for data pipelines.
  • Certification in Azure Data Engineer, Databricks, or AWS Data Analytics (or equivalent).

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
