We are looking for a talented and motivated Data DevOps Engineer to drive the operationalization and maintenance of an Enterprise Data Platform (EDP) Foundation, ensuring a seamless and secure environment for data-driven operations at scale.
Responsibilities
- Install and configure platform components to streamline integration within the Enterprise Data Platform stack
- Set up RBAC to enforce granular permissions and ensure adherence to security best practices
- Design and maintain CI/CD pipelines using tools like GitLab CI, integrating infrastructure provisioning with technologies such as Terraform
- Implement Logging and Monitoring capabilities with the LGTM stack for enhanced observability and performance tracking
- Build and deploy a centralized Single Management Console for efficient platform management
- Implement multi-tenancy capabilities to offer independent and secure environments for various users, teams, and locations
- Automate data ingestion, transformation, and querying workflow pipelines (a brief illustrative sketch of such a pipeline follows the requirements list below)
- Optimize platform infrastructure using Kubernetes and Red Hat OS to ensure high availability and scalability
- Use HashiCorp Vault and Open Policy Agent to enforce robust security policies and maintain compliance standards
- Work closely with engineers and platform teams to improve release processes and foster continuous delivery
- Monitor, troubleshoot, and resolve issues to guarantee operational reliability and system performance
- Collaborate with customer technical teams to meet project goals and milestones effectively

Requirements
- 3+ years of experience working with Kubernetes for container orchestration and Red Hat OS in enterprise environments
- Experience with Apache Kafka, MinIO, Apache Iceberg, and Apache Spark, including Spark Streaming
- Proficiency in managing distributed SQL query engines such as Trino and databases such as PostgreSQL
- Background in Terraform for infrastructure provisioning and GitLab CI or similar tools for automated deployment pipelines
- Knowledge of HashiCorp Vault and Open Policy Agent for implementing secure access control and policy enforcement
- Expertise in Logging and Monitoring tools, particularly the LGTM stack (Loki, Grafana, Tempo, Mimir)
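To make the scope above concrete, here is a minimal sketch (not taken from the vacancy itself) of the kind of ingestion pipeline this role automates: Spark Structured Streaming reads events from Kafka and appends them to an Iceberg table stored on MinIO. The broker address, topic, checkpoint bucket, catalog, and table names are hypothetical placeholders, and the Kafka connector and Iceberg Spark runtime are assumed to be available on the cluster.

```python
# Illustrative sketch only: Kafka -> Spark Structured Streaming -> Iceberg on MinIO.
# All names below (broker, topic, bucket, catalog, table) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("edp-ingestion-sketch")
    # Assumes the Kafka connector and Iceberg Spark runtime packages are on the classpath
    # and that an Iceberg catalog named "edp" is configured against MinIO (S3-compatible).
    .getOrCreate()
)

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")   # placeholder broker
    .option("subscribe", "edp.events")                 # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers key/value as binary; cast to strings before writing out.
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

query = (
    events.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3a://edp-checkpoints/events")  # placeholder MinIO bucket
    .toTable("edp.raw.events")                                     # placeholder Iceberg table
)
query.awaitTermination()
```

In practice, a pipeline like this would be provisioned and promoted through the GitLab CI and Terraform workflow described above rather than launched by hand.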
Technologies

Data Platform Components:
- Apache Kafka
- Apache Spark (including Spark Streaming)
- MinIO (S3-compatible object storage)
- Apache Iceberg (table format for analytical datasets)
- PostgreSQL
- Trino (distributed SQL query engine for big data; a brief usage sketch follows this section)

Infrastructure & Security:
- Red Hat OS
- Kubernetes for container orchestration
- HashiCorp Vault for secrets and credential management
- Open Policy Agent (OPA) for policy enforcement

Logging and Monitoring:
- LGTM stack (Loki, Grafana, Tempo, Mimir)
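As a rough illustration of how the security and query layers listed above typically fit together (a sketch under assumptions, not part of the vacancy text), the snippet below reads a credential from HashiCorp Vault with the hvac client and uses it to query an Iceberg table through Trino. The Vault address, KV path, Trino host, catalog, schema, and table are invented placeholders.

```python
# Illustrative sketch only: fetch a credential from Vault, then query Trino.
# Every hostname, path, catalog, schema, and table name here is a hypothetical placeholder.
import os

import hvac
import trino

# Token auth is shown for brevity; in-cluster workloads would more typically use the
# Kubernetes auth method.
vault = hvac.Client(url="https://vault.example.internal:8200", token=os.environ["VAULT_TOKEN"])
secret = vault.secrets.kv.v2.read_secret_version(path="edp/trino")  # placeholder KV v2 path
username = secret["data"]["data"]["username"]

# Trino federates the Iceberg tables stored on MinIO; "iceberg"/"raw" are assumed names.
conn = trino.dbapi.connect(
    host="trino.example.internal",
    port=8080,
    user=username,
    catalog="iceberg",
    schema="raw",
)
cur = conn.cursor()
cur.execute(
    "SELECT count(*) FROM events "
    "WHERE event_time > current_timestamp - interval '1' day"
)
print(cur.fetchall())
```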
We offer

We connect like-minded people:
- Delivering innovative solutions to industry leaders, making a global impact
- Enjoyable working environment, whether it's the vibrant office or the comfort of your own home
- Opportunity to work abroad for up to two months per year

We invest in your growth:
- Leadership development, career advising, soft skills and well-being programs
- Certifications, including GCP, Azure and AWS
- Unlimited access to LinkedIn Learning and Get Abstract
- Free English classes with certified teachers

We cover it all:
- Monetary bonuses for taking part in the referral program
- Comprehensive medical & family care package
- Seven trust days per year (sick leave without a medical certificate)
- Discounts from 800+ partners (sports activities, restaurants, stores and services)

At EPAM Belarus, employees have the flexibility to choose the environment that suits them best. You can work from any location in Belarus, whether it's your home or our offices in Minsk, Grodno, Brest, Gomel, Mogilev or Vitebsk.
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our clients, our employees, and our communities. We embrace a dynamic and inclusive culture. Here, you will collaborate with multi-national teams, contribute to numerous innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to learn and grow continuously. No matter where you are located, you will join a dedicated, creative, diverse community to help you discover your fullest potential.