geidea

Data Architect Sr. Manager

Posted: just now

Job Description

Established in 2008, Geidea epitomises customer-focused empowerment and commercial success through continuous innovation. Geidea makes best-in-class digital payment solutions available for all by attracting and leveraging the best creative & entrepreneurial talent in the market. Our solutions give any business the chance to get ahead and reach for more, no matter their size or maturity. Our technology mirrors our people - smart, innovative & forward thinking.

www.geidea.net

To maintain our competitive advantage as we grow, we are currently looking for a new Data Architect Sr. Manager.

Job purpose:

We are seeking a highly skilled Data Architect to design and lead the development of scalable, secure, and high-performance data platforms across the enterprise. This role is central to shaping the data ecosystem supporting our fintech services, including real-time financial transactions, credit scoring models, regulatory reporting, and customer analytics. The Data Architect will work across engineering, analytics, and product teams to build modern data infrastructure incorporating Big Data platforms, data lakes, ELT/ETL pipelines, and data warehouses. Your solutions must ensure high availability, governance, security, and performance, compliant with SAMA and other regulatory frameworks.

Key accountabilities and decision ownership:

Data Architecture Design
  • Lead the end-to-end architecture of enterprise data platforms, including data lakes, lakehouses, and data warehouses
  • Design and maintain canonical data models (conceptual, logical, and physical) for structured, semi-structured (JSON, XML), and unstructured data
  • Develop architectural blueprints for hybrid and cloud-native solutions leveraging AWS, Azure, or GCP
  • Standardize data ingestion, transformation, and serving layers for streaming and batch use cases

Big Data & Distributed Systems Engineering
  • Architect Big Data processing solutions using Apache Spark, Flink, Presto, Trino, or Databricks for large-scale processing of financial and behavioural data
  • Implement distributed file systems (e.g., HDFS, S3, ADLS) and optimize file formats such as Parquet, ORC, and Avro
  • Ensure scalable compute using EMR, Dataproc, or AKS/Kubernetes-based platforms

Data Lakes and Lakehouse Architecture
  • Design and operationalize data lakes as central repositories for raw and curated data assets
  • Build Delta Lake, Apache Hudi, or Iceberg-based architectures to support ACID transactions, schema evolution, and time travel
  • Define governance standards across the raw, staged, curated, and analytics layers of the lake architecture

ELT/ETL Frameworks & Pipelines
  • Develop robust ELT/ETL pipelines using tools such as Apache Airflow, dbt, AWS Glue, Azure Data Factory, or Kafka Connect
  • Optimize data transformations for performance, reusability, and modular design (e.g., using SQL/Scala/Python)
  • Ensure orchestration of dependencies, retries, alerting, and logging in a fully observable pipeline ecosystem

Data Warehousing & BI Integration
  • Architect cloud-based data warehouses such as Snowflake, BigQuery, Redshift, or Synapse Analytics
  • Define dimensional models (star, snowflake), facts/dimensions, and materialized views optimized for analytics and dashboarding tools (e.g., Power BI, Tableau, Looker)
  • Enable self-service analytics by integrating semantic layers, metadata management, and data cataloging tools

Security, Governance & Compliance
  • Implement data encryption (at rest/in transit), tokenization, and row/column-level security mechanisms
  • Define and enforce data governance policies, including data lineage, classification, and auditing aligned with SAMA, NCA, GDPR, and internal policies
  • Integrate with data catalogs (e.g., Alation, DataHub, Apache Atlas) and governance tools

DevOps & Observability
  • Support CI/CD for data pipelines and infrastructure as code using Terraform, CloudFormation, or Pulumi
  • Implement observability practices via data quality checks, SLAs/SLIs, and monitoring tools such as Prometheus, Grafana, or Datadog

Must-have technical/professional qualifications:
  • Bachelor's or master's degree in Computer Science, Data Engineering, or a related technical field
  • 8+ years of experience in data architecture, with significant contributions to production-grade data systems
  • Proven track record in designing and deploying petabyte-scale data infrastructure in fintech, banking, or RegTech environments

Technical Expertise
  • Strong command of Big Data technologies: Apache Spark, Hive, Hudi, Kafka, Delta Lake, Flink, Beam
  • Proficiency in Python and SQL, and optionally Scala/Java
  • Experience with cloud-native services on AWS (S3, Glue, Redshift, EMR, Lake Formation), Azure (Data Lake, Synapse, ADF), or GCP (BigQuery, Dataproc, Pub/Sub)
  • Mastery of data modeling (3NF, dimensional, data vault), data versioning, and schema registries
  • Familiarity with ML feature stores, stream processing, and event-driven architectures is a plus

Soft Skills
  • Strategic thinking and the ability to balance long-term vision with short-term delivery
  • Strong documentation, architecture diagramming (UML, ArchiMate), and presentation skills
  • Excellent English communication (Arabic a plus)
  • Experience leading architecture reviews, engaging with senior stakeholders, and mentoring data engineers

Certifications:
  • AWS Certified Data Analytics – Specialty
  • Microsoft Azure Solutions Architect / Data Engineer
  • Google Cloud Professional Data Engineer
  • Databricks Certified Data Engineer Professional
  • TOGAF or DAMA CDMP

Our values guide how we think and act - they describe what we care about the most:
  • Customer first - It's embedded in our design thinking and customer service approach
  • Open - Openness allows us to constantly improve and evolve
  • Real - No jargon and no excuses!
  • Bold - Constantly challenging ourselves and our way of thinking
  • Resilient - If we fail, we bounce back stronger than before
  • Collaborative - We know that we can achieve a lot more as a team

We are changing lives by constantly striving for a better solution.

Click apply below and become part of the Geidea story.

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
