Current Vacancies

Data Engineer

Location - Newcastle
Employment Type - Full Time - Permanent
Salary - £55,000 - £68,500 per annum
Hours Per Week - 37.5

We're building something special — and we need a talented Data Engineer to help bring our Azure data platform to life. 

This is your chance to work on a greenfield Enterprise Data Warehouse programme in the insurance sector, shaping data pipelines and platforms that power smarter decisions, better pricing, and sharper customer insights. 

The Data Engineer will design, build, and optimise scalable data pipelines within Azure Databricks, ensuring high-quality, reliable data is available to support pricing, underwriting, claims, and operational decision-making. This role is critical in modernising SBG’s cloud-based data infrastructure, ensuring compliance with FCA/PRA regulations, and enabling AI-driven analytics and automation.

By leveraging Azure-native services, such as Azure Data Factory (ADF) for orchestration, Delta Lake for ACID-compliant data storage, and Databricks Structured Streaming for real-time data processing, the Data Engineer will help unlock insights, enhance pricing accuracy, and drive innovation. The role also includes optimising Databricks query performance, implementing robust security controls (RBAC, Unity Catalog), and ensuring enterprise-wide data reliability.

Working closely with Data Architects, Pricing Teams, Data Analysts, and IT, this role will ensure our Azure Databricks data ecosystem is scalable, efficient, and aligned with business objectives. Additionally, the Data Engineer will contribute to cost optimisation, governance, and automation within Azure’s modern data platform.

Key Responsibilities:

  • Data Pipeline Development – Design, build, and maintain scalable ELT pipelines using Azure Databricks, Azure Data Factory (ADF), and Delta Lake to automate real-time and batch data ingestion.
  • Cloud Data Engineering – Develop and optimise data solutions within Azure, ensuring efficiency, cost-effectiveness, and scalability, leveraging Azure Synapse Analytics, ADLS Gen2, and Databricks Workflows.
  • Data Modelling & Architecture – Implement robust data models to support analytics, reporting, and machine learning, using Delta Lake and Azure Synapse.
  • Automation & Observability – Use Databricks Workflows, dbt, and Azure Monitor to manage transformations, monitor query execution, and implement data reliability checks.
  • Data Quality & Governance – Ensure data integrity, accuracy, and compliance with industry regulations (FCA, Data Protection Act, PRA) using Databricks Unity Catalog and Azure Purview.
  • Collaboration & Stakeholder Engagement – Work closely with Data Scientists, Pricing, Underwriting, and IT to deliver data-driven solutions aligned with business objectives.
  • Data Governance & Security – Implement RBAC, column-level security, row-access policies, and data masking to protect sensitive customer data and ensure FCA/PRA regulatory compliance.
  • Innovation & Continuous Improvement – Identify and implement emerging data technologies within the Azure ecosystem, such as Delta Live Tables (DLT), Structured Streaming, and AI-driven analytics to enhance business capabilities.

Required Skills/Experience:

  • Hands-on experience in building ELT pipelines and working with large-scale datasets using Azure Data Factory (ADF) and Databricks.
  • Strong proficiency in SQL (T-SQL, Spark SQL) for data extraction, transformation, and optimisation.
  • Proficiency in Azure Databricks (PySpark, Delta Lake, Spark SQL) for big data processing.
  • Knowledge of data warehousing concepts and relational database design, particularly with Azure Synapse Analytics.
  • Experience working with Delta Lake for schema evolution, ACID transactions, and time travel in Databricks.
  • Strong Python (PySpark) skills for big data processing and automation.
  • Experience with Scala (optional but preferred for advanced Spark applications).
  • Experience working with Databricks Workflows & Jobs for data orchestration.
  • Strong knowledge of feature engineering and feature stores, particularly the Databricks Feature Store for ML training and inference.
  • Experience with data modelling techniques to support analytics and reporting.
  • Familiarity with real-time data processing and API integrations (e.g., Kafka, Spark Streaming).
  • Proficiency in CI/CD pipelines for data deployment using Azure DevOps, GitHub Actions, or Terraform for Infrastructure as Code (IaC).
  • Understanding of MLOps principles, including continuous integration (CI), continuous delivery (CD), and continuous training (CT) for machine learning models.
  • Experience with performance tuning and query optimisation for efficient data workflows.
  • Strong understanding of query optimisation techniques in Databricks (caching, partitioning, indexing, and auto-scaling clusters).
  • Experience monitoring Databricks workloads using Azure Monitor, Log Analytics, and Databricks performance insights.
  • Familiarity with cost optimisation strategies in Databricks and ADLS Gen2 (e.g., managing compute resources efficiently).
  • Problem-solving mindset – Ability to diagnose issues and implement efficient solutions.
  • Experience implementing Databricks Unity Catalog for data governance, access control, and lineage tracking.
  • Understanding of Azure Purview for data cataloguing and metadata management.
  • Familiarity with object-level and row-level security in Azure Synapse and Databricks.
  • Experience working with Azure Event Hubs, Azure Data Explorer, or Kafka for real-time data streaming.
  • Hands-on experience with Databricks Structured Streaming for real-time and near-real-time data processing.
  • Understanding of Delta Live Tables (DLT) for automated ELT and real-time transformations.
  • Analytical thinking – Strong ability to translate business needs into technical data solutions.
  • Attention to detail – Ensures accuracy, reliability, and quality of data.
  • Communication skills – Clearly conveys technical concepts to non-technical stakeholders.
  • Collaboration – Works effectively with cross-functional teams, including Pricing, Underwriting, and IT.
  • Adaptability – Thrives in a fast-paced, agile environment with evolving priorities.
  • Stakeholder management – Builds strong relationships and understands business requirements.
  • Innovation-driven – Stays up to date with emerging technologies and industry trends.

Qualifications:

  • A degree in Computer Science, Data Engineering, Mathematics, Statistics, or a related field (or equivalent experience).
  • Snowflake SnowPro Core Certification.
  • Snowflake SnowPro Advanced: Data Engineer Certification (preferred).
  • Microsoft Certified: Azure Data Engineer Associate.
  • Awareness of regulatory requirements, including FCA, GDPR, and PRA compliance, is advantageous.
  • Optional certifications in Agile or DevOps, such as Scrum Master, SAFe, or CI/CD-related courses.
  • Strong hands-on experience may be considered in place of formal qualifications.

Our Benefits:

  • Hybrid working – 2 days in the office and 3 days working from home
  • 25 days annual leave, rising to 27 days over 2 years’ service and 30 days after 5 years’ service. Plus bank holidays!
  • Discretionary annual bonus
  • Pension scheme – 5% employee, 6% employer
  • Flexible working – we will always consider applications for those who require less than the advertised hours
  • Flexi-time
  • Healthcare Cash Plan – claim cashback on a variety of everyday healthcare costs
  • Electric vehicle – salary sacrifice scheme
  • Hundreds of exclusive retailer discounts
  • Professional wellbeing, health & fitness app - Wrkit
  • Enhanced parental leave, including time off for IVF appointments
  • Religious bank holidays – if you don’t celebrate Christmas and Easter, you can use these annual leave days on other occasions throughout the year.
  • Life Assurance - 4 times your salary
  • 25% Car Insurance Discount
  • 20% Travel Insurance Discount
  • Cycle to Work Scheme
  • Employee Referral Scheme
  • Community support day
  • Christmas and Summer parties

Working at SBG

At Somerset Bridge Group we aim to build a sustainable and innovative business focused on the underwriting, broking and claims handling of UK motor insurance, offering transparent products and an efficient, fair service to our policyholders.

We are very proud to have been awarded a Silver Accreditation from Investors in People! We recognise that all of our people contribute to our success. That's why we are always looking for talented people to join our team - people who share our vision, who are passionate about what they do, and who want to be part of something special. 

Equal Opportunity Employer

Somerset Bridge Group is committed to creating a diverse environment and is proud to be an Equal Opportunity Employer. We prohibit discrimination or harassment of any kind based on race, colour, religion, national origin, sexual orientation, gender, gender identity or expression, age, pregnancy, physical or mental disability, genetic factors or other characteristics protected by law. SBG makes hiring decisions based solely on qualifications, skills and business requirements.