Luxoft

Senior Data Engineer

Job Description

Project Description:

Join our Development Center and become a member of our open-minded, progressive, and professional team. You will have the chance to grow your technical and soft skills and to build thorough expertise in our client's industry. In this role you will work on projects for one of our world-famous clients, a large international investment bank. On top of an attractive salary and benefits package, Luxoft will invest in your professional training and help you grow your career.

Responsibilities:

• Solution Design: Architect data pipelines down to the low-level elements, ensuring clarity and precision in implementation.
• Data Sourcing: Extract data from diverse repositories including relational databases (Oracle, PostgreSQL), NoSQL stores, file systems, and other structured/unstructured sources.
• Data Transformation: Design and implement ETL/ELT workflows to standardize and cleanse data using best practices in data engineering.
• Pipeline Development: Build scalable, fault-tolerant data pipelines that support batch and streaming use cases.
• Cloud Data Processing: Load transformed data into GCP destinations such as BigQuery or Cloud Storage using tools like Dataproc, Dataflow, and other GCP-native services (see the Beam/Dataflow sketch below).
• Workflow Orchestration: Design and manage workflows using orchestration tools such as Apache Airflow or Cloud Composer (see the Airflow sketch below).
• Data Format Expertise: Work with various data formats including JSON, Avro, Parquet, CSV, and others.
• Optimization & Monitoring: Ensure the performance, reliability, and cost-efficiency of data pipelines through continuous monitoring and tuning.
• Collaboration: Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.

Mandatory Skills:

• Proven experience in data engineering across hybrid environments (on-premise and cloud).
• Strong proficiency in SQL and Python or Java/Scala.
• Hands-on experience with ETL/ELT tools and frameworks.
• Deep understanding of GCP data services: BigQuery, Dataproc, Dataflow, Cloud Storage.
• Familiarity with data modeling, schema design, and metadata management.
• Experience with workflow orchestration tools (e.g., Apache Airflow, Cloud Composer).
• Knowledge of data governance, security, and compliance best practices.
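To make the orchestration and loading duties concrete, here is a minimal sketch (not part of the posting) of an Airflow DAG of the kind the role describes: it extracts one day of rows from PostgreSQL, stages them as Parquet in Cloud Storage, and loads them into BigQuery. It assumes Airflow 2.4+ with the Google provider, pandas, gcsfs, and pyarrow installed; the DAG id, DSN, bucket, and table names are all hypothetical placeholders.

```python
# Illustrative sketch only -- every id, path, and credential is a placeholder.
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.decorators import task
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="trades_daily",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:

    @task
    def extract_to_gcs(ds: str | None = None) -> str:
        """Pull one day of trades from Postgres and stage them as Parquet in GCS."""
        df = pd.read_sql(
            "SELECT * FROM trades WHERE trade_date = %(ds)s",  # psycopg2 paramstyle
            "postgresql+psycopg2://user:password@host/db",     # placeholder DSN
            params={"ds": ds},
        )
        uri = f"gs://example-staging-bucket/trades/{ds}.parquet"
        df.to_parquet(uri)  # pandas writes straight to GCS when gcsfs is installed
        return uri

    staged = extract_to_gcs()

    load = GCSToBigQueryOperator(
        task_id="load_to_bq",
        bucket="example-staging-bucket",                       # hypothetical bucket
        source_objects=["trades/{{ ds }}.parquet"],            # same object the task wrote
        destination_project_dataset_table="analytics.trades",  # hypothetical table
        source_format="PARQUET",
        write_disposition="WRITE_APPEND",
    )

    staged >> load  # stage first, then load
```

The same shape carries over to Cloud Composer, which is managed Airflow, so a DAG file like this deploys there unchanged.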
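For the Dataflow side, batch and streaming pipelines are typically written with Apache Beam's Python SDK. The sketch below (again illustrative, with a hypothetical bucket, project, and schema) reads CSV files from Cloud Storage, parses each row, and appends the result to a BigQuery table; passing --runner=DataflowRunner on the command line runs it on Dataflow instead of locally.

```python
# Illustrative Beam pipeline: CSV in Cloud Storage -> parsed rows -> BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Turn one CSV line into a dict matching the BigQuery schema below."""
    trade_id, amount = line.split(",")
    return {"trade_id": trade_id, "amount": float(amount)}


def run() -> None:
    # PipelineOptions picks up --runner/--project/--region etc. from sys.argv.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText(
                "gs://example-bucket/trades/*.csv", skip_header_lines=1
            )
            | "Parse" >> beam.Map(parse_row)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.trades",
                schema="trade_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```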
