We are searching for a dedicated and results-oriented Lead Data DevOps Engineer to drive the development and operationalization of an Enterprise Data Platform (EDP) Foundation. This role focuses on implementing an on-premises data platform, integrating cutting-edge technologies, and ensuring automation, scalability, and multi-tenancy capabilities while maintaining security and operational excellence.
Responsibilities
- Install and configure platform components to enable seamless integration across the EDP stack
- Set up Role-Based Access Control (RBAC) to ensure granular permissions and enforce security standards
- Design and maintain CI/CD pipelines using GitLab CI or equivalent tools to streamline build, test, and deployment activities
- Integrate infrastructure-as-code tools such as Terraform into pipelines to ensure predictable and auditable deployments
- Deploy robust logging and monitoring mechanisms using the LGTM stack (Loki, Grafana, Tempo, Mimir) to enhance observability
- Create and manage a centralized Single Management Console for unified platform oversight
- Enable secure multi-tenancy configurations that allow independent data environments for diverse users and teams
- Automate workflows and pipelines for data ingestion, transformation, and analytics frameworks
- Oversee infrastructure management with Kubernetes and Red Hat OS to ensure system reliability and scalability
- Implement security policies leveraging HashiCorp Vault and Open Policy Agent (OPA) for consistent compliance and access control
- Diagnose and resolve issues across platform components to ensure optimal performance and availability
- Collaborate with diverse stakeholders, including Data Engineering teams and customer technical groups, to align deliverables and achieve milestones

Requirements
- 5+ years of experience in DevOps, Data Engineering, or similar roles involving large-scale platform deployment
- Expertise in technologies such as Apache Kafka, Apache Spark, and Apache Iceberg for data streaming and processing workflows (a minimal ingestion sketch follows the Nice to have list below)
- Proficiency in SQL and data storage technologies including PostgreSQL and Trino for querying and analytical processing
- Strong background in containerization and orchestration with Kubernetes, alongside experience working with Red Hat OS
- Deep understanding of infrastructure-as-code practices using tools like Terraform and security management with HashiCorp Vault
- Competency in deploying observability solutions through the LGTM stack (Loki, Grafana, Tempo, Mimir)
- Capability to implement policy enforcement strategies using Open Policy Agent (OPA)
- Demonstrated ability to design and manage multi-tenancy within enterprise-grade platforms
- Collaboration skills to work cross-functionally with technical and business stakeholders

Nice to have
- Familiarity with MinIO for object-based storage and compatibility with S3 environments
- Background in distributed SQL query engines like Trino for big data applications
- Proficiency in working with Apache Spark Streaming for real-time data processing
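To make the streaming requirements above concrete, here is a minimal PySpark sketch of a Kafka-to-Iceberg ingestion job of the kind this role would build and automate. It is an illustration only: the broker address, topic, checkpoint location, and table name are hypothetical, and it assumes the Iceberg Spark runtime and an Iceberg catalog named edp are already configured.

```python
from pyspark.sql import SparkSession

# Assumes the Iceberg Spark runtime is on the classpath and a Spark
# catalog named "edp" is configured; all endpoints and names below
# are placeholders, not values from the actual platform.
spark = SparkSession.builder.appName("edp-kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka.example.internal:9092")  # hypothetical broker
    .option("subscribe", "edp.raw.events")                             # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers keys and values as binary; cast them before storage.
parsed = events.selectExpr(
    "CAST(key AS STRING) AS event_key",
    "CAST(value AS STRING) AS payload",
    "timestamp AS ingested_at",
)

query = (
    parsed.writeStream.format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3a://edp-checkpoints/raw-events")  # e.g. a MinIO bucket
    .toTable("edp.raw.events_stream")  # hypothetical Iceberg table
)
query.awaitTermination()
```

In production, a job like this would itself be packaged and rolled out through the GitLab CI and Terraform pipelines described under Responsibilities.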
Technologies

Data Platform Components:
- Apache Kafka
- Apache Spark (including Spark Streaming)
- MinIO (object storage compatible with S3)
- Apache Iceberg (table format for analytical datasets)
- PostgreSQL
- Trino (distributed SQL query engine for big data; a query sketch follows this list)

Infrastructure & Security:
- Red Hat OS
- Kubernetes for container orchestration
- HashiCorp Vault for secrets and credential management (a retrieval sketch follows this list)
- Open Policy Agent (OPA) for policy enforcement

Logging and Monitoring:
- LGTM stack (Loki, Grafana, Tempo, Mimir)
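As a rough illustration of the serving layer, the following sketch queries the hypothetical Iceberg table from the ingestion example through Trino, using the official Trino Python client. The coordinator host, user, and catalog/schema layout are assumptions for the example.

```python
import trino  # official Trino Python client: pip install trino

# Hypothetical coordinator endpoint and catalog/schema layout.
conn = trino.dbapi.connect(
    host="trino.example.internal",
    port=8080,
    user="edp-analyst",
    catalog="iceberg",
    schema="raw",
)

cur = conn.cursor()
cur.execute(
    "SELECT event_key, count(*) AS events "
    "FROM events_stream "
    "GROUP BY event_key "
    "ORDER BY events DESC "
    "LIMIT 10"
)
for event_key, events in cur.fetchall():
    print(event_key, events)
```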
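In the same spirit, here is a minimal sketch of fetching a credential from HashiCorp Vault with the community hvac client, assuming a KV v2 secrets engine and a hypothetical mount point and secret path. A real pipeline on this platform would more likely authenticate via a method such as Kubernetes auth than a raw token.

```python
import os

import hvac  # community Vault client: pip install hvac

# VAULT_ADDR and VAULT_TOKEN are Vault's standard environment variables;
# the mount point and secret path below are hypothetical.
client = hvac.Client(
    url=os.environ["VAULT_ADDR"],
    token=os.environ["VAULT_TOKEN"],
)
assert client.is_authenticated()

secret = client.secrets.kv.v2.read_secret_version(
    mount_point="edp",
    path="postgres/readonly",
)
db_password = secret["data"]["data"]["password"]
```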
We offer

We connect like-minded people:
- Delivering innovative solutions to industry leaders, making a global impact
- Enjoyable working environment, whether it's the vibrant office or the comfort of your own home
- Opportunity to work abroad for up to two months per year

We invest in your growth:
- Leadership development, career advising, soft skills and well-being programs
- Certifications, including GCP, Azure and AWS
- Unlimited access to LinkedIn Learning and Get Abstract
- Free English classes with certified teachers

We cover it all:
- Monetary bonuses for taking part in the referral program
- Comprehensive medical & family care package
- Seven trust days per year (sick leave without a medical certificate)
- Discounts from 800+ partners (sports activities, restaurants, stores and services)

At EPAM Belarus, employees have the flexibility to choose the environment that suits them best. You can work from any location in Belarus, whether it's your home or our offices in Minsk, Grodno, Brest, Gomel, Mogilev or Vitebsk.

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our clients, our employees, and our communities.
We embrace a dynamic and inclusive culture. Here, you will collaborate with multinational teams, contribute to numerous innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to learn and grow continuously. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.