SHERPANY

Data Engineer (9-month contract, remote in Europe) - Sherpany

Posted: 3 hours ago

Job Description

Datasite and its associated businesses form a global center for economic value creation, from data rooms to AI deal sourcing and more. Here you’ll find some of the finest technological pioneers: Datasite, Blueflame AI, Firmex, Grata, and Sherpany. Together, they define the future of business growth. Apply for one position or as many as you like. Talent doesn’t always go in one direction or fit in a single box. We’re happy to see whatever your superpower is and find the best place for it to flourish. Get started now; we look forward to meeting you.

Sherpany by Datasite is the leading Swiss meeting management solution, designed to meet the unique needs of board, board committee, and executive meetings. Our solution streamlines the entire meeting process, making meetings more productive and thereby enhancing company performance. Our customers include well-known medium-sized to large companies across all industries, such as Axpo, Raiffeisen Bank, and Calida Group. More than 400 companies already use Sherpany. We’ve come a long way since 2011: Sherpany is now a team of 150 talented individuals working from all around the world. Our culture is rooted in trust and responsibility, and we’re proud of the productive and healthy nature of our work environment.

About the Role

We’re looking for a hands-on data professional to support ongoing data pipeline and modeling work within our data warehouse. You’ll collaborate closely with the analytics team to maintain and improve existing workflows, data quality, dbt models, KPI definitions, and documentation practices.

Please note this is a temporary 9-month contract with the possibility of extension after this period.
Key Responsibilities

  • Maintain and optimize end-to-end ELT pipelines to ensure reliable, well-structured data flows into the warehouse.
  • Develop, test, and maintain scalable data models following modern data modeling best practices.
  • Support data validation, testing, documentation, and governance across environments.
  • Contribute to metadata and lineage tracking, as well as standards for KPI modeling and data quality.

Requirements

  • Strong proficiency in SQL and Python (PySpark preferred).
  • Solid experience with dbt Core and GitHub-based workflows (pull requests, CI/CD).
  • Familiarity with the Azure data ecosystem and Delta Lake architecture.
  • Experience with ELT/ETL tools (Airbyte, ADF, or similar, in addition to dbt).
  • Detail-oriented, collaborative, and comfortable working in a remote setup.

Nice to Have

  • Experience building or maintaining Tableau data sources and dashboards.
  • Understanding of data governance, documentation, and cataloging practices (e.g., Collibra, dbt docs).
  • Exposure to master data management tools (e.g., Reltio).

Our company is committed to fostering a diverse and inclusive workforce where all individuals are respected and valued. We are an equal opportunity employer and make all employment decisions without regard to race, color, religion, sex, gender identity, sexual orientation, age, national origin, disability, protected veteran status, or any other protected characteristic. We encourage applications from candidates of all backgrounds and are dedicated to building teams that reflect the diversity of our communities.
