TechMagic

BI Consultant/Lead (Azure Fabric and/or Open-Source Stack)

Posted: Nov 7, 2025

Job Description

💼 Part-time (2–3 days/week) | Remote | Long-term

A fast-growing Swiss-based consulting and technology company specializing in data analytics, AI implementations, and ERP systems is looking for a BI Consultant / Lead to strengthen its team. The company builds intelligent data and analytics solutions for international clients, combining modern cloud technologies with open-source flexibility.

Founded in 2021 and headquartered in Switzerland, it partners with organizations across industries to help them unlock the value of their data, from data integration and modeling to analytics and AI-driven insights. You'll join a dynamic environment where experts in Azure Fabric, data engineering, and BI work side by side with business stakeholders to deliver measurable business impact.

We Mainly Run Two Stacks

  • Azure Fabric: OneLake/Lakehouse, Fabric/Synapse pipelines, SQL endpoints, Power BI.
  • Open-Source: Kafka, Apache NiFi, Iceberg, dbt, ClickHouse, Postgres.

Scope/Impact

  • Own the conversation with the business, shape the analytics roadmap/KPIs, and define the semantic model.
  • Build the critical pieces hands-on or steer our (very capable) data engineering team end-to-end.
  • Translate questions into robust data models, data contracts, and reliable dashboards; set QA/governance standards.

Must-Haves (either track)

  • Client-facing BI leadership; ability to frame problems and land simple, valuable solutions.
  • Strong SQL and dimensional modelling (star/snowflake); comfortable defining metrics/semantics.
  • Backlog writing for engineers (dbt tasks/ELT, data contracts, acceptance criteria).

Azure Fabric Track (nice-to-haves become musts if choosing this track)

  • Fabric/Synapse pipelines, Lakehouse/OneLake, SQL endpoints.
  • Power BI (dataflows, DAX, tabular modeling, Row-Level Security).
  • CI/CD for BI (deployment pipelines, versioning, environments).

Open-Source Track (nice-to-haves become musts if choosing this track)

  • Kafka event streams, Apache NiFi flows (ingest/enrichment/routing).
  • dbt (models/tests/docs), ClickHouse for analytics at scale; familiarity with OLAP engines.
  • Apache Superset (dashboards, permissions) and basics of orchestration (e.g., Airflow/Argo) - nice to have.

Cross-Cutting Pluses

  • Python for data wrangling; testing (dbt tests, unit/contract tests).
  • Data quality & governance (SLAs, monitoring, lineage); GDPR awareness.
  • Finance analytics background (P&L, margin, cash, working capital) is a strong plus.

Work Format

B2B contract, paid days off (holidays, sick leave, public holidays), budget for a laptop.
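As a rough, hypothetical illustration of the dimensional-modelling and metric-definition skills listed above (all table and column names here are invented for the example, not taken from any client schema), a minimal star-schema query in Python with sqlite3 might look like:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimension tables.
# All names (dim_customer, fact_sales, ...) are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales (
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        date_id INTEGER REFERENCES dim_date(date_id),
        revenue REAL
    );
    INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
    INSERT INTO dim_date VALUES (20250101, 2025), (20250102, 2025);
    INSERT INTO fact_sales VALUES
        (1, 20250101, 100.0),
        (1, 20250102, 50.0),
        (2, 20250101, 80.0);
""")

# A metric defined once over the star schema: total revenue by region and year.
rows = conn.execute("""
    SELECT c.region, d.year, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_customer c USING (customer_id)
    JOIN dim_date d USING (date_id)
    GROUP BY c.region, d.year
    ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 2025, 80.0), ('EMEA', 2025, 150.0)]
```

In practice the same pattern scales up: facts stay narrow and additive, dimensions carry the descriptive attributes, and metrics like `revenue` are defined once in the semantic layer (dbt model or Power BI measure) rather than re-derived per dashboard.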
