RiDiK (a Subsidiary of CLPS. Nasdaq: CLPS)

Senior Data Engineer (Python)


Job Description

About the Company

RiDiK is a global technology solutions provider and a subsidiary of CLPS Incorporation (NASDAQ: CLPS), delivering cutting-edge end-to-end services across banking, wealth management, and e-commerce. With deep expertise in AI, cloud, big data, and blockchain, we support clients across Asia, North America, and the Middle East in driving digital transformation and achieving sustainable growth. Operating from regional hubs in 10 countries and backed by a global delivery network, we combine local insight with technical excellence to deliver real, measurable impact. Join RiDiK and be part of an innovative, fast-growing team shaping the future of technology across industries.

Responsibilities

  • Develop and maintain data pipelines and ETL/ELT processes, ensuring maintainability through unit and integration testing
  • Collaborate with data teams to understand requirements and automate deployment and monitoring
  • Optimize data storage and troubleshoot issues to enhance performance

Qualifications

  • Degree in Computer Science, Information Technology, or a related field
  • At least 8 years of relevant working experience in software or data engineering, with proficiency in Python

Required Skills

  • Strong experience in unit and integration testing
  • Familiarity with DevOps practices and Agile methodologies
  • Strong software engineering, analytical, and problem-solving skills
  • Good team player with strong communication skills

Preferred Skills

  • Experience with AWS and Kubernetes (K8s)
  • Familiarity with data platforms such as Snowflake, Apache Spark, or Apache Hive, and with orchestration tools like Apache Airflow, Dagster, or Prefect
  • Familiarity with GitHub workflows and Datadog

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
