Responsibilities
- Help design, build, and continuously improve the client's online platform.
- Research, suggest, and implement new technology solutions, following best practices and standards.
- Take responsibility for the resiliency and availability of the different products.
- Be a productive member of the team.
- Own the end-to-end lifecycle of data products, from design through deployment and ongoing optimization, aligned with the target architecture and business goals.
- Champion a DevOps mindset, managing CI/CD pipelines, infrastructure as code (Terraform), and GCP cloud services.
- Transform data into meaningful, actionable insights using DBT and BigQuery (see the sketch after this list).
- Design and implement modular, scalable, and secure data products that operate independently and meet non-functional requirements.
- Collaborate closely with Product Owners and stakeholders to evolve the vision for current data products and identify opportunities for new ones.
- Partner with product teams across domains to support data mesh implementation and interoperability.
- Drive continuous improvement by identifying and reducing technical debt and implementing best practices.
- Conduct root cause analysis and manage problem resolution for data-related incidents.
- Stay up to date with advancements in cloud, data engineering, and analytics technologies.
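To give a flavor of the DBT/BigQuery transformation work described above, here is a minimal Python sketch that runs an aggregation in BigQuery with the google-cloud-bigquery client. The project, dataset, and table names (`example-project.analytics.orders`) are hypothetical placeholders; in practice a transformation like this would typically live in a DBT model rather than in ad-hoc Python.

```python
# Minimal sketch: run an aggregation in BigQuery from Python.
# Assumes Application Default Credentials are configured; the
# project/dataset/table names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT customer_id,
           COUNT(*) AS order_count,
           SUM(total_amount) AS lifetime_value
    FROM `example-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
    LIMIT 10
"""

# client.query() submits the job; result() blocks until it finishes.
for row in client.query(query).result():
    print(row.customer_id, row.order_count, row.lifetime_value)
```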
Requirements
- 4+ years of experience as a Data Engineer on modern cloud data platforms or in advanced analytics environments.
- Strong passion for data, technology, and teamwork.
- Proficient in DBT for transformation and pipeline orchestration.
- Experience with data formats such as Avro and Parquet (see the first sketch below).
- Proficient in SQL and other data query languages.
- Skilled in data-centric programming using Python, Java, or Scala.
- Solid understanding of data modeling techniques and their trade-offs.
- Familiar with both NoSQL and relational databases (RDBMS).
- Strong collaboration and communication skills, with a co-creative and proactive mindset.
- Comfortable making independent decisions in a decentralized, agile environment.
- Experience with data visualization tools.
- Experience with GCP tools such as Dataflow, Dataproc, and BigQuery.
- Familiar with data processing frameworks such as Apache Beam, Spark, Hive, or Flink (see the pipeline sketch below).
- Fluent in English, both written and spoken.
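To make the Avro/Parquet requirement concrete, here is a small Python sketch that writes and reads a Parquet file with pyarrow; the file name and column names are made up for illustration.

```python
# Minimal sketch: write and read a Parquet file with pyarrow.
# File and column names are illustrative only.
import pyarrow as pa
import pyarrow.parquet as pq

# Build a small in-memory table.
table = pa.table({
    "customer_id": [1, 2, 3],
    "total_amount": [19.99, 5.00, 42.50],
})

# Parquet is columnar: the schema and per-column encodings are
# preserved on disk, which keeps analytical scans cheap.
pq.write_table(table, "orders.parquet")

# Reading back only the columns you need avoids deserializing the rest.
loaded = pq.read_table("orders.parquet", columns=["customer_id"])
print(loaded.to_pydict())
```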
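And as a sketch of the kind of pipeline the Beam/Spark/Hive/Flink requirement points at, the snippet below aggregates per-key totals with the Apache Beam Python SDK on the local DirectRunner; the input records are invented.

```python
# Minimal sketch: a per-key aggregation with Apache Beam.
# Runs locally on the DirectRunner; input records are made up.
import apache_beam as beam

orders = [
    ("alice", 19.99),
    ("bob", 5.00),
    ("alice", 42.50),
]

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "CreateOrders" >> beam.Create(orders)        # bounded test input
        | "SumPerCustomer" >> beam.CombinePerKey(sum)  # (customer, total)
        | "Print" >> beam.Map(print)
    )
```

Swapping the DirectRunner for the DataflowRunner moves execution to GCP without changing the transforms, which is how this requirement pairs with the Dataflow experience listed above.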
Benefits
- A challenging, innovative environment.
- Opportunities for learning where needed.