interactive investor

Data Engineer

Posted: 28 minutes ago

Job Description

WHO WE ARE:

interactive investor is an award-winning investment platform that puts its customers in control of their financial future. We've been helping investors for nearly 30 years. We've seen market highs and lows and been resilient throughout. We're now the UK's number one flat-fee investment platform, with assets under administration approaching £75 billion and over 450,000 customers.

For a simple, flat monthly fee we provide a secure home for your pensions, ISAs and investments. We offer a wide choice of over 20,000 UK and international investment options, including shares, funds, trusts and ETFs. We also bring impartial, expert content from our award-winning financial journalists, highly engaged community of investors, and daily newsletters and insights.

PURPOSE OF ROLE:

The Data Engineer role will help ii design, build, and continually improve the firm's Data Platform, consolidating datasets such as customer, transaction, marketing, web analytics, and market data into a trusted, comprehensive source for analytics and Data Science/ML/AI.
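Purely for illustration, the kind of consolidation described above can be sketched in plain Python; every source name, field, and function below is hypothetical rather than part of ii's actual platform:

```python
# Illustrative sketch only: conforming records from two hypothetical
# sources (a CRM extract and a web-analytics extract) into one view.

def conform_customer(crm_row: dict, web_row: dict) -> dict:
    """Merge a CRM record and a web-analytics record into one conformed record."""
    return {
        "customer_id": crm_row["id"],
        "email": crm_row["email"].strip().lower(),  # normalise the join key
        "segment": crm_row.get("segment", "unknown"),
        "last_visit": web_row.get("last_visit"),  # absent for offline-only customers
    }

crm = {"id": 42, "email": " Jane@Example.com ", "segment": "ISA"}
web = {"last_visit": "2024-05-01"}
print(conform_customer(crm, web))
```

Normalising join keys (here, lower-cased, trimmed emails) before merging is what makes the consolidated dataset trustworthy for downstream analytics.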
You will design, build, and run robust Python/SQL pipelines, orchestrated with Dagster, to deliver and transform data in our Snowflake Data Warehouse. The role also partners with the wider Data and Innovation team and business stakeholders on Intelligent Automation, embedding AI agents within data workflows to replace manual, data-heavy processes, while maintaining high standards of data quality, security, and governance.

While our Data Analysts primarily build and maintain reporting outputs, you should be comfortable presenting data via Streamlit and occasionally Power BI. Work may span Microsoft Azure, Amazon Web Services, and Google Cloud, leveraging their AI agent feature sets where appropriate.

We are seeking candidates with a range of experience levels, from Junior to Senior and Lead positions.

Requirements

KEY RESPONSIBILITIES:

  • Support and monitor the daily Data Platform build; investigate and resolve issues from overnight jobs
  • Orchestrate reliable ELT/ETL pipelines using Dagster (assets, jobs, schedules, sensors), implementing dependency management, retries, backfills, SLAs, and alerting to populate the warehouse (star schemas, snapshot tables, slowly changing dimensions)
  • Provide reusable SQL queries and data extracts to Data Analysts and business users; promote self-service patterns
  • Monitor and triage Data Request tickets for ad-hoc data needs across business functions, including legacy transaction record requests
  • Maintain clear data lineage and up-to-date data cataloguing and dictionaries; advise on the most appropriate fields for specific use cases
  • Partner with business stakeholders to identify manual, data-heavy processes and redesign them as automated pipelines, incorporating AI agents into data workflows (e.g., classification, enrichment, reconciliation, document processing, alerting)
  • Integrate pipelines with business systems, APIs, and webhooks; implement scheduling, retries, and alerting through orchestration
  • Apply data quality checks, validation rules, and observability (e.g., automated tests, SLAs, anomaly detection)
  • Tune performance and cost (query optimisation, partitioning/clustering in Snowflake, indexing, caching, efficient storage formats such as Parquet/Delta)
  • Practise strong DataOps: Git-based version control, pull requests/code reviews, CI/CD, environment promotion
  • Ensure compliance with data privacy, security, and regulatory requirements; uphold access controls, encryption, and auditability
  • Maintain documentation for the Data Warehouse, including design documentation to accompany new scripts/processes and corresponding Data Dictionaries
  • As required, develop and support new and existing data outputs via Streamlit and, occasionally, Power BI dashboards/reports for operational MI and ad-hoc analysis

SKILLS & EXPERIENCE REQUIRED:

  • In-depth knowledge of fundamental database concepts (design and queries) and strong knowledge of advanced topics (management and optimisation)
  • Advanced SQL knowledge: can write new, and interpret existing, complex multi-join aggregation queries
  • Advanced understanding of data mart concepts: star/snowflake schemas, snapshot tables, slowly changing dimensions
  • Strong knowledge of Python, with a focus on scripting data collection and transformation
  • Experience with Dagster (preferred) for orchestration: assets, jobs, schedules, and sensors for event-based and scheduled pipelines; familiarity with comparable cloud services is a plus
  • Experience with Snowflake (or similar cloud data warehouses)
  • Strong DataOps practices: Git-based version control, pull requests, code reviews, CI/CD for data pipelines and infrastructure, environment promotion
  • Experience developing data outputs using Streamlit (primary) and BI visualisation tools such as Power BI (occasional)
  • Understanding of data quality frameworks, observability, and operational monitoring/alerting
  • Exposure to Intelligent Automation in data workflows, including safe use of AI agents for enrichment, classification, or document processing; familiarity with cloud AI agent feature sets (e.g., Azure AI/Agents including Azure OpenAI, AWS Agents for Bedrock, Google Vertex AI Agents)
  • Familiarity with security, governance, and privacy concepts
  • Able to translate high-level business requirements into clear data requirements and robust technical designs

Benefits

  • Group Personal Pension Plan - 8% employer contribution and 4% employee contribution
  • Life Assurance and Group Income Protection
  • Private Medical Insurance - provided by Bupa
  • 25 Days Annual Leave, plus bank holidays
  • Staff Discounts on our investment products
  • Personal & Well-being Fund - supporting your physical and mental wellness
  • Retail Discounts - savings at a wide range of high street and online retailers
  • Voluntary Flexible Benefits - tailor your benefits to suit your lifestyle

Please Note: We will do our utmost to respond to all applicants. However, due to the high volume of applications we are currently receiving, if you have not been contacted within 30 days of applying, please consider your application unsuccessful.

interactive investor operates in accordance with the UK Equality Act 2010. We welcome applications from individuals of all ages, disabilities, gender identities, marital statuses, pregnancy/maternity, races, religions or beliefs, sexes, and sexual orientations. We are committed to treating all applicants fairly and making reasonable adjustments where needed to support disabled applicants. We actively prevent all forms of discrimination, harassment, and victimisation, whether direct, indirect, associative, or perceptive.
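For candidates unfamiliar with the warehouse patterns named in the responsibilities, a rough, purely illustrative sketch of a Type 2 slowly changing dimension update follows; the field names are hypothetical, and in practice this would likely be a Snowflake MERGE run from a Dagster job rather than in-memory Python:

```python
from datetime import date

# Illustrative Type 2 SCD update: when a tracked attribute changes,
# close out the current dimension row and append a new versioned row.

def scd2_upsert(dim_rows: list[dict], incoming: dict, today: date) -> list[dict]:
    """Return an updated dimension table after applying one incoming record."""
    out = []
    changed = False
    for row in dim_rows:
        current = row["customer_id"] == incoming["customer_id"] and row["valid_to"] is None
        if current and row["segment"] != incoming["segment"]:
            row = {**row, "valid_to": today}  # close the superseded version
            changed = True
        out.append(row)
    is_new = not any(r["customer_id"] == incoming["customer_id"] for r in dim_rows)
    if changed or is_new:
        out.append({"customer_id": incoming["customer_id"],
                    "segment": incoming["segment"],
                    "valid_from": today, "valid_to": None})
    return out
```

The key property, preserved history, is that the old row survives with a closed `valid_to` date, so point-in-time queries against the dimension remain correct.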

