Hepsiburada (NASDAQ: HEPS)

Data Platform Engineer

Posted: 3 hours ago

Job Description

At Hepsiburada, we are driven by our mission to improve people's lives by developing innovative products and services. Prioritizing customer satisfaction, we offer over 280 million products across more than 30 categories. Through our marketplace model, we bring together over 100,000 businesses. With Türkiye's and the region's largest Smart Operations Center, industry-leading R&D initiatives, and innovative solutions, we contribute significantly to the growth of the e-commerce ecosystem. For the past two years, we have proudly held the title of Türkiye's most recommended e-commerce platform.

Through our innovative services like HepsiJET, Hepsipay, HepsiLojistik, HepsiAd, and Hepsiburada Global, we create value for all our stakeholders. Committed to harnessing technology for social benefit, our "Technology Empowerment for Women Entrepreneurs" program has connected thousands of women entrepreneurs with e-commerce and supported their growth. Our goal is to leverage digitalization and e-commerce to enable greater economic participation.

With 25 years of experience driven by innovation and entrepreneurship, we proudly continue our journey as "Türkiye's Hepsiburada" and the first and only Turkish company listed on NASDAQ, the world's leading technology exchange.

For our colleagues, we offer a work environment that supports creating together and adding more meaning to our work. As a team full of opportunities, we are happy to develop, produce, and succeed together. If you want to be part of a team that creates value for everyone and makes life easier through innovative products and services, and to contribute to exciting success stories, the future starts here.

We are seeking a motivated Mid-Level Data Platform Engineer to join our central Data Infrastructure team. This is not a typical data engineering role where you just build pipelines: you will be a core builder of our entire data ecosystem.

Responsibilities:

  • Design & Develop Core Components: Build and maintain robust, reusable data connectors (in Java) and data processing libraries (in Python) for various data sources (Kafka, MongoDB, Elasticsearch, etc.) to feed our BigQuery DWH.
  • Standardize & Champion Best Practices: Define, document, and promote best practices for critical data engineering tools such as Airflow and dbt (see the sketch after this list).
  • Own & Operate Key Systems: Take ownership of our central data catalog (DataHub), ensuring it remains a reliable source of truth. You will also play a key role in the administration and support of our new analytics platform, SAS Viya.
  • Automate & Optimize: Develop systems to monitor and control DWH costs, measure data quality across the organization, and ensure data security standards are met.
  • Embrace DevOps Culture: Manage and scale our data infrastructure on Docker and Kubernetes, and build sophisticated CI/CD pipelines in GitLab to automate testing and deployment.
  • Innovate & Explore: Conduct proof-of-concept (POC) projects for new data technologies and solutions to continuously evolve our data stack.
  • Empower Users: Develop small full-stack web applications and tools that enable non-technical users to interact with data in a self-service manner.
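To give a concrete flavour of the orchestration and standardization work described above, here is a minimal sketch of an Airflow DAG that runs a dbt build feeding a BigQuery DWH. The DAG name, project path, target profile, and schedule are illustrative assumptions, not details from this posting.

```python
# Minimal sketch (Airflow 2.x): orchestrating a dbt build that feeds a BigQuery DWH.
# All names below (DAG id, project dir, target, schedule) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # assumed cadence
    catchup=False,
    tags=["dbt", "bigquery"],
) as dag:
    # Build dbt models; the target profile is assumed to point at BigQuery.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target prod",
    )

    # Run dbt tests only after the models build successfully.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --target prod",
    )

    dbt_run >> dbt_test
```

Packaging this kind of DAG as a documented, reusable template is one way a platform team turns tool knowledge into shared standards.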
Qualifications:

  • 2-5 years of hands-on experience in a Data Engineering, Software Engineering, or similar platform-focused role.
  • Strong programming skills in either Python or Java; proficiency in both is a huge plus.
  • Expert-level knowledge of SQL and a deep understanding of data warehousing concepts, data modeling, and ELT processes. Direct experience with BigQuery is highly desirable.
  • Hands-on experience with a workflow orchestration tool, preferably Airflow.
  • Familiarity with containerization technologies (Docker) and orchestration (Kubernetes).
  • A "platform mindset": you enjoy building tools and creating standards that make other engineers more productive.
  • Experience with version control (Git) and CI/CD practices (preferably GitLab CI).
  • Excellent problem-solving skills and a passion for learning new technologies.
  • Strong communication skills and proficiency in English.

Nice-to-Haves:

  • Event-driven architectures and messaging systems like Kafka or RabbitMQ.
  • Modern data transformation tools like dbt.
  • Data catalog solutions like DataHub or Amundsen.
  • NoSQL databases such as MongoDB or Elasticsearch.
  • Experience in cloud cost monitoring and optimization (see the sketch below).
  • A willingness to learn and own the administration of new enterprise solutions like SAS Viya (prior experience is not required, but curiosity is!).

https://medium.com/hepsiburadatech
https://stackshare.io/hepsiburada

Personal data collected through applications submitted for this posting will be processed by automated means by the data controller D-Market Elektronik Hizmetler ve Ticaret A.Ş. ("Hepsiburada") for the purpose of conducting job application processes, in accordance with Law No. 6698 on the Protection of Personal Data ("KVKK") and related legislation. The privacy notice (Aydınlatma Metni) is available at https://www.hepsiburada.com/kisisel-verilerin-korunmasi.
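For candidates who want a concrete sense of the cost-monitoring and BigQuery work mentioned above, here is a minimal sketch that uses the BigQuery Python client to summarise bytes billed per user over the last day via INFORMATION_SCHEMA. The region qualifier and the reporting approach are assumptions for illustration only.

```python
# Minimal sketch: summarising BigQuery query spend per user for the last 24 hours.
# The region qualifier ("region-eu") and the reporting style are illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

QUERY = """
SELECT
  user_email,
  SUM(total_bytes_billed) / POW(1024, 4) AS tib_billed
FROM `region-eu`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND job_type = 'QUERY'
GROUP BY user_email
ORDER BY tib_billed DESC
"""

for row in client.query(QUERY).result():
    # Surfacing the heaviest consumers is one simple input to a DWH cost-control process.
    print(f"{row.user_email}: {row.tib_billed:.2f} TiB billed")
```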

Job Application Tips

  • Tailor your resume to highlight relevant experience for this position
  • Write a compelling cover letter that addresses the specific requirements
  • Research the company culture and values before applying
  • Prepare examples of your work that demonstrate your skills
  • Follow up on your application after a reasonable time period
