Confidential

Data Engineer

Posted: Nov 1, 2025

Job Description

Responsibilities

• Develop and maintain ELT/ETL pipelines from source systems (e.g., SQL Server) to analytics stores (e.g., ClickHouse/TiDB).
• Build streaming data flows and schedule batch workflows (e.g., Airflow or similar); see the DAG sketch after the Qualifications list.
• Model data (3NF/star/snowflake) and implement partitioning, clustering, and indexes for performance and cost efficiency.
• Implement robust data quality checks (tests, constraints, anomaly detection) and SLAs with alerting (see the quality-gate sketch below).
• Create and maintain dimensional models and curated datasets for BI (e.g., Metabase) with governed semantics.
• Optimize SQL and storage layouts; design materialized views to accelerate critical dashboards and APIs (see the materialized-view sketch below).
• Harden data security (RBAC, masking, encryption) and enforce privacy/compliance for PII and financial data.
• Build observability: job monitoring, lineage metadata, and failure diagnostics; document runbooks.
• Collaborate with analysts, data scientists, and product teams to define requirements, schemas, and acceptance tests.
• Contribute to CI/CD for data code; write clean, version-controlled SQL/Python with code reviews and linters.
• Automate backfills, reprocessing, and schema migrations with minimal downtime and clear rollback plans.
• Manage cloud resources (storage, compute) for cost and performance; propose improvements proactively.
• Provide on-call/after-hours support on a rotation for critical pipelines and releases.
• Maintain clear documentation (ERDs, pipeline diagrams, data contracts) and knowledge-sharing artifacts.
• Support feature engineering and data readiness for ML with MLOps teams (feature stores, reproducibility).
• Work with cloud/container foundations (Huawei Cloud CCE/Kubernetes, Docker).

Qualifications

• Bachelor’s degree in Computer Science, Software/Data Engineering, or a related field.
• 4–8 years in data engineering with production ELT/ETL, data modeling, and performance tuning.
• Certified membership in the Saudi Council of Engineering.
• Strong organizational and problem-solving skills.
• Strong SQL and Python; experience with workflow orchestration.
• Hands-on with modern analytics stores (ClickHouse/TiDB/columnar DBs) and OLTP sources.
• Experience with streaming (Kafka) and CDC (e.g., Debezium) is a plus.
• Data governance basics: quality checks, lineage/metadata, documentation, and access control.
• Ability to work effectively in a fast-paced environment.
• Experience in the logistics or supply chain industry is a plus.
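
For the batch-workflow responsibility above, a minimal sketch of a daily Airflow 2.x DAG. The DAG ID, task names, and retry policy are hypothetical placeholders, not details from this posting; the extract/load bodies are stubs.

```python
# Minimal daily batch pipeline in Airflow 2.x. All names here
# (dag_id, task IDs, callables, tables) are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Stub: pull yesterday's rows from the OLTP source (e.g., SQL Server).
    ...


def load_orders(**context):
    # Stub: bulk-insert the extracted batch into the analytics store.
    ...


with DAG(
    dag_id="orders_daily_elt",
    start_date=datetime(2025, 11, 1),
    schedule="@daily",  # "schedule_interval" on Airflow < 2.4
    catchup=False,      # do not backfill past runs automatically
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> load  # load runs only after a successful extract
```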
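
For the data-quality bullet, a sketch of a post-load quality gate: each check runs a scalar SQL assertion and raises on failure, which is what lets the orchestrator alert and block downstream tasks. The run_query helper, table names, and thresholds are assumptions for illustration.

```python
# Post-load data quality gate (sketch). run_query, the table, and the
# thresholds are hypothetical; wire run_query to your warehouse client.
from typing import Callable


class DataQualityError(Exception):
    """Raised when a check fails, so the run fails loudly and alerting fires."""


def check(name: str, sql: str, ok: Callable[[int], bool],
          run_query: Callable[[str], int]) -> None:
    value = run_query(sql)  # expected to return a single scalar
    if not ok(value):
        # A real setup would also page on-call or post to an alert channel here.
        raise DataQualityError(f"check '{name}' failed: got {value!r}")


def run_quality_gate(run_query: Callable[[str], int]) -> None:
    # Freshness: yesterday's partition must not be empty.
    check("rows_loaded_yesterday",
          "SELECT count() FROM shipments_raw WHERE toDate(event_time) = yesterday()",
          lambda n: n > 0, run_query)
    # Integrity: business keys must be unique.
    check("duplicate_shipment_ids",
          "SELECT count() - uniqExact(shipment_id) FROM shipments_raw",
          lambda n: n == 0, run_query)
```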
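
And for the materialized-view bullet, a ClickHouse-flavored sketch: raw events roll up into a small daily table that dashboards read instead of scanning raw data. It assumes the clickhouse-driver Python package; table and column names are invented.

```python
# Sketch: pre-aggregate raw events into a daily rollup with a ClickHouse
# materialized view so dashboards scan the small table. Assumes the
# clickhouse-driver package; table and column names are hypothetical.
from clickhouse_driver import Client

client = Client(host="localhost")  # placeholder connection details

# Target table: one row per (day, warehouse); SummingMergeTree folds
# rows with the same sort key at merge time, summing `shipments`.
client.execute("""
    CREATE TABLE IF NOT EXISTS shipments_daily (
        day        Date,
        warehouse  LowCardinality(String),
        shipments  UInt64
    )
    ENGINE = SummingMergeTree
    ORDER BY (day, warehouse)
""")

# Materialized view: ClickHouse applies this SELECT to every block
# inserted into shipments_raw and appends the result to shipments_daily.
client.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS shipments_daily_mv
    TO shipments_daily AS
    SELECT
        toDate(event_time) AS day,
        warehouse,
        count() AS shipments
    FROM shipments_raw
    GROUP BY day, warehouse
""")

# Note: merges are asynchronous, so dashboard queries should still use
# SELECT day, warehouse, sum(shipments) ... GROUP BY day, warehouse.
```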

