Luxoft

Senior Data Engineer (SQL and GCP)

Posted: 6 days ago

Job Description

Project Description:

Join our Development Center and become a member of our open-minded, progressive, and professional team. You will have the chance to grow your technical and soft skills and build thorough expertise in our client's industry. In this role you will work on projects for one of our world-famous clients, a large international investment bank. On top of an attractive salary and benefits package, Luxoft will invest in your professional training and support the growth of your professional career.

Responsibilities:

• Solution Design: Architect data pipelines down to the low-level elements, ensuring clarity and precision in implementation.
• Data Sourcing: Extract data from diverse repositories including relational databases (Oracle, PostgreSQL), NoSQL stores, file systems, and other structured/unstructured sources.
• Data Transformation: Design and implement ETL/ELT workflows to standardize and cleanse data using data engineering best practices.
• Pipeline Development: Build scalable, fault-tolerant data pipelines that support batch and streaming use cases.
• Cloud Data Processing: Load transformed data into GCP destinations such as BigQuery or Cloud Storage using tools like Dataproc, Dataflow, and other GCP-native services.
• Workflow Orchestration: Design and manage workflows using orchestration tools such as Apache Airflow or Cloud Composer (an illustrative sketch follows the requirements below).
• Data Format Expertise: Work with various data formats including JSON, Avro, Parquet, CSV, and others.
• Optimization & Monitoring: Ensure performance, reliability, and cost-efficiency of data pipelines through continuous monitoring and tuning.
• Collaboration: Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.

Skills Description:

• Proven experience in data engineering across hybrid environments (on-premise and cloud).
• Strong proficiency in SQL and in Python or Java/Scala.
• Hands-on experience with ETL/ELT tools and frameworks.
• Deep understanding of GCP data services: BigQuery, Dataproc, Dataflow, Cloud Storage.
• Familiarity with data modeling, schema design, and metadata management.
• Experience with workflow orchestration tools (e.g., Apache Airflow, Cloud Composer).
• Knowledge of data governance, security, and compliance best practices.

Nice-to-Have Skills Description:

• GCP certification (e.g., Professional Data Engineer) is a major plus.
• Experience with CI/CD for data pipelines.
• Exposure to containerization and Kubernetes.
• Familiarity with data cataloging tools / metadata management.
• Experience with Big Data technologies.
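To make the orchestration and cloud-loading responsibilities above more concrete, here is a minimal sketch of the kind of pipeline this role describes: an Apache Airflow DAG (Airflow 2.4+) that loads a daily drop of Parquet files from Cloud Storage into BigQuery. It is illustrative only, not a description of the client's actual systems; the DAG id, bucket, project, dataset, and table names are hypothetical placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bigquery_daily",     # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                  # Airflow 2.4+; one run per day
        catchup=False,
    ) as dag:
        # Load one day's Parquet drop from a landing bucket into a raw table.
        # Bucket, project, dataset, and table names are placeholders.
        load_trades = GCSToBigQueryOperator(
            task_id="load_trades",
            bucket="example-landing-bucket",
            source_objects=["trades/{{ ds }}/*.parquet"],  # {{ ds }} = run date
            destination_project_dataset_table="example-project.raw.trades",
            source_format="PARQUET",
            write_disposition="WRITE_APPEND",  # append each daily partition
            autodetect=True,                   # infer schema from the Parquet files
        )

In practice a pipeline like this would also carry schema management, data-quality checks, and alerting; Cloud Composer runs the same DAG code as a managed Airflow service.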
