Job Title: Data Engineer – GCP & DBT
Location: Stockholm, Sweden
Hybrid: 2–3 days per week on-site
Experience: 4–6 Years
Domain: Digital – Google Data Engineering
Start: August/September

Role Overview:
We are looking for an experienced Data Engineer who thrives in modern cloud environments and is passionate about building scalable, secure, and high-performing data products. You will take full ownership of end-to-end data product development, optimization, and support, from design through deployment, working closely with stakeholders and product teams to drive value across the organization.

Key Responsibilities:
- Take full accountability for building, optimizing, and supporting both existing and new data products aligned with the target vision.
- Lead by example in promoting a DevOps mindset, managing CI/CD pipelines and infrastructure as code using Terraform.
- Build, transform, and serve meaningful insights from data using DBT and BigQuery on Google Cloud Platform (GCP); see the sketch after this list.
- Ensure data products are independently deployable and meet high standards for security, scalability, observability, and performance.
- Collaborate closely with Product Owners and stakeholders to define the vision of data products and identify opportunities to support evolving customer needs.
- Engage with internal and external product teams in the context of data mesh architecture and best practices.
- Drive continuous improvement initiatives, reduce technical debt, and optimize data systems.
- Stay up to date with the latest trends in data engineering, analytics, and cloud technologies.
- Conduct root cause analysis and problem resolution as part of support activities.
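For applicants wondering what the DBT-on-BigQuery work above looks like in practice: in dbt the transformation would live in a versioned SQL model, but as a rough, non-authoritative illustration of the underlying transform-and-materialize pattern, here is a minimal Python sketch using the google-cloud-bigquery client. The project id, datasets, and table names are placeholders, not this team's actual stack:

# Minimal sketch: materialize an aggregate table in BigQuery.
# Assumes google-cloud-bigquery is installed and authenticated;
# "my-project", raw.orders, and analytics.daily_orders are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# In a dbt project this SELECT would be a .sql model file; the direct
# client call below just shows the same pattern end to end.
sql = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM raw.orders
GROUP BY order_date
"""

job = client.query(sql)  # starts the query job
job.result()             # blocks until the job completes
print("Materialized analytics.daily_orders")

In a real project, dbt would additionally handle dependency ordering, testing, and documentation around such a SELECT.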
Required Skills & Competencies:
- 4+ years of hands-on experience in data engineering on modern cloud data platforms or advanced analytics environments.
- Strong hands-on expertise with DBT (Data Build Tool).
- Solid experience with data query languages (e.g., SQL) and data-centric programming using Python, Java, or Scala.
- Experience with various data formats such as Avro and Parquet; see the Parquet sketch below.
- Deep understanding of data modeling techniques and their trade-offs.
- Familiarity with both NoSQL and relational databases (RDBMS).
- Experience working with CI/CD pipelines, Terraform, and cloud infrastructure, especially GCP.
- Strong collaborative mindset with excellent communication skills in English (written and verbal).
- Self-driven, with the ability to work independently and make informed decisions.
- Prior experience with data visualization tools is a plus.

Desirable Skills:
- Hands-on experience with GCP tools: BigQuery, Dataflow, Dataproc.
- Exposure to data processing frameworks such as Apache Beam, Spark, Hive, and Flink; a minimal Beam sketch follows.
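For context on the columnar formats named in the requirements, here is a minimal, illustrative Python sketch that round-trips a small table through Parquet with pyarrow; the file and column names are invented for the example:

# Minimal sketch: write and read back a Parquet file.
# Assumes pyarrow is installed; names are illustrative only.
import pyarrow as pa
import pyarrow.parquet as pq

orders = pa.table({
    "order_id": [1, 2, 3],
    "amount": [9.5, 12.0, 7.25],
})

pq.write_table(orders, "orders.parquet")    # columnar, compressed on disk
restored = pq.read_table("orders.parquet")  # the schema travels with the file
assert restored.equals(orders)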
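Similarly, a minimal Apache Beam pipeline on the local DirectRunner, assuming the apache-beam package is installed; the inlined input records are illustrative only:

# Minimal sketch: sum integer values per key with Apache Beam.
import apache_beam as beam

with beam.Pipeline() as pipeline:  # DirectRunner by default
    (
        pipeline
        | "Create" >> beam.Create(["a,1", "b,2", "a,3"])
        | "Parse" >> beam.Map(lambda line: tuple(line.split(",")))
        | "ToInt" >> beam.MapTuple(lambda key, value: (key, int(value)))
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )

The same pipeline code would run on Dataflow by switching the runner and pipeline options, which is the usual Beam-to-GCP path.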