We combine passionate people with sector-specific expertise. Our approach is collaborative, creative and human. Our outputs are transparent, robust and transformative.
From problem identification, data management and data science, to visualisation, interpretation and the delivery of actionable intelligence – we work across the entire data lifecycle.
Transformation in the healthcare sector represents more than just technical advancement – it represents progress for patients. This drives us forward.
Balancing technical breakthroughs and emerging technologies with transparency and compliance is a delicate exercise. When we succeed, society benefits.
Data progress at a local level enables a proactive response to the specific needs of the community. We help realise this potential.
As the defence sector responds to ever-evolving global risk, it demands increased data efficiency and deeper insights. Our expertise is up to the challenge.
Explore the offers that, together, make up our complete data lifecycle service. Supporting case studies evidence our expertise.
We’re always looking for fresh ideas that can solve our clients’ problems and move our expertise forward. In return, we provide a collaborative, human and open place to work. One that celebrates individual talent and collective success.
Explore our opportunities
A robust strategy can unlock all the benefits your data has to offer. By harnessing emerging technology. By exploiting new techniques. By maximising value from your data estate.
Our strategy experts deliver the framework and direction for all data-led initiatives to follow. Guiding you on the journey to becoming a data-driven organisation.
We solve challenges through collaboration and innovation.
Group sessions, interviews and research provide our strategy experts with a
comprehensive picture of your data and how you use it. We need to understand the
decisions it informs (and, as importantly, those it does not) and its value to your
organisation.
With these insights, we can design a data strategy that considers regulatory
environment and related constraints (for example, GDPR); opportunity analysis driven
by your ambitions; and an artefact review with the potential development of a Target
Operating Model (TOM).
From our findings, we can create a joint roadmap to support a successful strategy
rollout.
For clients who have an existing strategy, we offer a range of standalone packages
to enhance the data journey. These include cloud and AI readiness assessments; data science roadmaps; and data and information governance models.
Data strategy
Target Operating Model (roles, functions, technology and investment)
Implementation roadmap
For a client within the defence sector, we established a Data Science and AI Centre
of Excellence. We embedded its working practices, culture, standards, ethics and
governance protocols.
As part of the data strategy, we’ve developed a Target Operating Model and a Front
Door for data science and AI project requests. Our goal is always to build a long-term service that the client can take full control of.
Building from this success, we’re now delivering AI, CNN and machine learning solutions for a range of projects including NLP and algorithmic test beds.
Evolution starts with exploration. Whether solving a specific problem or implementing a broader strategy, our user-centric discovery process dissects your existing data solutions to reveal possible barriers and identify untapped potential.
Both are essential to plotting the best course forward for data-led success.
Our discovery phase can be focussed on a specific problem or broader business goals.
We have two separate approaches to meet differing needs:
We can apply sector-dependent requirements such as the Government Digital Service (GDS) standards – a public sector methodology comprising user research, service design, data architecture and prototyping.
Or, if we’re informing a specific decision or particular
problem, we can collaborate to identify, define and agree on the specific questions
that require answering. We then begin the discovery, understanding your current data
and technical solutions to map the way forward.
A service aligned to and meeting user needs
Service concepts (an MVP or Alpha) with which to test and learn
Investment efficiency on a specific service and/or project
Guidance on how to proceed and procure, build and deliver a service/project
For East and North Hertfordshire CCG, we were asked to explore an approach to
population health analytics that would enable more proactive healthcare.
There were two linked purposes: to use data at a population level to inform service
planning and design; and to leverage person-level data to inform direct
care.
We embarked on a six-week phase of user research. We wanted to understand exactly how the CCG’s users wished to consume data and what, ultimately, they wanted to do with it.
Our research was broad enough to encompass their current data
assets, environment, and how they could both be developed. We also explored
predictive models with the aim of being able to plan model development for local
purposes.
Our discovery outputs were comprehensive. We provided the build-phase requirements and key objectives, such as achieving increased usage of a linked dataset for population health monitoring; self-service access to underlying data; robust, updated long-term governance agreements; and a cloud-based platform design that’s scalable, extensible and interoperable, incorporating the latest security and governance approaches.
A client in the defence sector is responsible for oversight and delivery of
equipment assets around the world. If an item becomes obsolete or isn’t working,
it’s either disposed of or sent to storage.
The client had a question: they
wanted to know if their systems could be linked together to understand the asset
journey in more detail. To help answer it, we provided insight and advice using a
data science discovery.
Our exploration focused on a number of areas: improved approaches to enable better analysis of data and insight; mapping the end-to-end process of disposals, spanning pan-defence systems and teams for the first time; building a backlog of disposal hypotheses relating to inefficiencies in the process and areas of fraud risk; and, finally, using a data-driven approach to test those hypotheses and propose recommendations based on the results of machine learning analysis.
The discovery outcomes highlighted the areas within the previous process that wasted both time in disposing of equipment and money on renting equipment storage space. With this knowledge, we were able to provide visibility of the disposal unit and movement of assets, as well as a process map documenting best practice for asset disposal.
Your data strategy and user needs drive our governance-first design approach. We work to understand the constraints, regulations and ethical consequences of your desired solution.
Then we translate your strategy into technical designs that are secure, robust and scalable – building a solid foundation for future delivery.
We start with a clear problem to be solved or solution to be found. Then, working
closely with you, we ensure two things: that your governance requirements are met;
and the ethics of deriving value from your data are considered.
At the heart
of this collaboration is a mix of group sessions, interviews and research. From this
deep dive, we’ll put together a business case and implementation roadmap – spanning
from skills, capacity development and technical requirements, right through to
investment.
If it supports the solution, we can also take advantage of our
formal accredited partnerships with organisations such as Microsoft, Amazon and
Tableau.
ETHICS
Understanding and developing:
ARCHITECTURE
Design, build and implementation enabling:
GOVERNANCE
Understanding and bringing together:
INFRASTRUCTURE
Delivering value including:
DHSC required a new medical examiners platform to collect, secure, search and
analyse person-identifiable data on all deaths in the NHS in England and Wales. The
platform was to be a key part of the Department’s long-term plans to reform how
death certifications work in England and Wales.
Working to GDS standards, we
took the non-technical client team through an end-to-end service design process. Our
solution started with data strategy and discovery, before confirming and
articulating the data architecture, governance and infrastructure designs. Finally,
we delivered Alpha and Beta builds from our data engineering and interaction
teams.
From an architecture standpoint, MedEx is a PaaS web application with
an API application tier and an Azure Cosmos DB backend. The agreed n-tier architecture comprises technology, CI/CD and patterns that are leading-edge within the NHS.
The web-based production environment is now used by hospitals
across the UK to input real-world case data and interactions with clinicians and the
bereaved. It’s a 24/7 service with an SLA aligned to a range of stakeholders, meeting
user need and commercial constraints.
As the data is highly sensitive, data governance was essential. Covering the deceased, their next of kin, the medical team and other interested parties relevant to the Medical Examiner’s investigations, a comprehensive Data Protection Impact Assessment (DPIA) supported by robust data security was required.
National regulatory requirements were reviewed and
assessed constantly to ensure compliance, along with best practice and GDS
protocols. Using encryption, database auditing and other measures, data was
safeguarded in support of organisational security commitments and compliance
requirements defined by NHS Digital.
As your strategy is deployed, higher volumes of data are produced from an increasing number of sources.
We create the processes, infrastructure and pipelines that will bring that data together and prepare it for data science and AI.
Data engineering typically begins at the point where the artefacts of the solution design are known (such as the high- and low-level architecture, data model, logical schemas and where the data resides). The rules that govern how data can be processed, and the environments and toolsets where data will reside, are established.
We
then build the foundational data infrastructure and tooling to integrate,
consolidate and make the data suitable for analysis in a data science project. We do
this using data pipelines, Extract Transform Load (ETL) scripts, data
cleansing/profiling, and data warehouses or data lakes. We then apply reporting and
visualisation tooling as required.
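To give a flavour of the extract-transform-load pattern described above, here is a minimal, purely illustrative sketch in Python. All field names are invented, and an in-memory SQLite database stands in for the warehouse or lake – this is not a client implementation:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list:
    """Extract: read raw records from a source system's CSV export."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    """Transform: cleanse and profile -- drop incomplete rows, normalise types."""
    cleaned = []
    for row in rows:
        if not row.get("asset_id") or not row.get("value"):
            continue  # profiling step: discard records that fail completeness checks
        cleaned.append((row["asset_id"].strip(), float(row["value"])))
    return cleaned

def load(records: list, conn: sqlite3.Connection) -> None:
    """Load: write cleansed records to the analytical store (SQLite stand-in)."""
    conn.execute("CREATE TABLE IF NOT EXISTS assets (asset_id TEXT, value REAL)")
    conn.executemany("INSERT INTO assets VALUES (?, ?)", records)

# One pipeline run: the incomplete A2 record is dropped during transformation.
raw = "asset_id,value\nA1,10.5\nA2,\nA3,7.25\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(value) FROM assets").fetchone()
```

In a production pipeline each stage would typically be a separate, orchestrated job, but the extract/transform/load separation is the same.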
Typically, our approach follows an Agile
development methodology to accelerate the process. This enables us to deliver
functionality and capabilities, increase user engagement, and be responsive to
changing sprint requirements with greater efficiency. Depending on your preference,
our projects also follow CI/CD DevOps processes.
We strive to adopt a technology agnostic approach. The technology used is chosen as part of a solution design and often forms the input to a specific data engineering project. Our approach will typically comprise:
Bespoke data platforms
Database engineering including:
APIs
Verification & validation
Test engineering
Data migration
Data-centric software engineering
Foundations enabling interrogation and analysis of data
Delivering value downstream via high quality data engineering
Providing infrastructure required to become a data-driven organisation
A resilient, secure and performant environment
The DVLA are in the process of modernising their existing systems. This involves
migrating mainframes, bespoke applications and other technology to a modern
cloud-based platform.
Using an Agile methodology, our solution supplements and upskills their existing team and prototype build. To achieve this, we called on our experienced Azure engineers and technical specialists in the Microsoft analytics suite, working in the existing Azure tenant, to build a secure MI/BI data engineering platform.
The conceptual design involved a number of components. Event Hub was used to ingest data from a variety of sources (JSON, CSV, RDBMS etc.). Stream Analytics was used to process and store data in Data Factory and Azure SQL. And Analysis Services allowed for ultimate display in multiple potential reporting suites such as Power BI.
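The ingest-and-normalise step such a design implies – mapping mixed-format source events (JSON, CSV) onto one common schema before storage – can be sketched in plain Python. The field names are invented for illustration; in the real platform this work is done by Event Hub and Stream Analytics, not local code:

```python
import csv
import io
import json

CSV_FIELDS = ["vehicle_id", "event_type", "timestamp"]  # hypothetical schema

def normalise(payload: str, fmt: str) -> dict:
    """Map a raw event from any supported source format onto a common schema."""
    if fmt == "json":
        record = json.loads(payload)
    elif fmt == "csv":
        record = next(csv.DictReader(io.StringIO(payload), fieldnames=CSV_FIELDS))
    else:
        raise ValueError(f"unsupported source format: {fmt}")
    # The common schema consumed downstream by storage and reporting.
    return {
        "vehicle_id": str(record["vehicle_id"]),
        "event_type": str(record["event_type"]),
        "timestamp": str(record["timestamp"]),
    }

# The same event arriving in two formats normalises to an identical record.
a = normalise('{"vehicle_id": "V42", "event_type": "start", '
              '"timestamp": "2020-01-01T00:00:00"}', "json")
b = normalise("V42,start,2020-01-01T00:00:00", "csv")
```

Normalising at the point of ingest is what makes event-based, real-time reporting possible downstream: every consumer sees one schema, whatever the source.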
The platform went live in 2020. Its first source system involved the surfacing of data from tachographs used on passenger-carrying and goods vehicles. This represented a significant shift from taking snapshots at set intervals to real-time, event-based processing – enabling faster visibility of data and a higher degree of reporting accuracy at any given time.
Our data science services break down the silos of internal and external data sets to deliver more powerful insights, enabling smarter decisions. Expertise we can transfer to you – growing your capability and confidence.
We achieve this by helping you frame a data-related problem statement. We then apply the latest predictive modelling, AI techniques and statistical approaches to deliver intelligent, actionable outputs for you and your organisation.
We use AI techniques such as machine learning and neural networks to offer a clearer
and quicker way to solve problems.
Our deep understanding of analytical
methodologies allows us to design a solution that fits your needs. Data science and
AI approaches are different to traditional mathematical and statistical techniques.
They use the power of technology to deliver faster, more accurate and less obvious
answers to questions – enabling data-driven decision making.
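To give a concrete flavour of what “learning from data” means, here is a deliberately tiny example – a nearest-centroid classifier in plain Python. The data points and labels are invented, and this stands in for no client model:

```python
def fit_centroids(samples: list, labels: list) -> dict:
    """Learn one centroid (mean point) per class from labelled training data."""
    sums = {}
    for (x, y), label in zip(samples, labels):
        sx, sy, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, n + 1)
    return {label: (sx / n, sy / n) for label, (sx, sy, n) in sums.items()}

def predict(centroids: dict, point: tuple) -> str:
    """Classify a new point by the nearest learned centroid."""
    def dist2(c):
        return (c[0] - point[0]) ** 2 + (c[1] - point[1]) ** 2
    return min(centroids, key=lambda label: dist2(centroids[label]))

# Train on four labelled points, then classify an unseen one.
centroids = fit_centroids([(0, 0), (1, 1), (9, 9), (10, 10)],
                          ["low", "low", "high", "high"])
label = predict(centroids, (8, 8))
```

Real engagements use far richer models, but the shape is the same: patterns learned from historical data drive decisions on new data.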
Our expert team of data scientists works collaboratively to enable seamless, ongoing knowledge transfer, whether on-site or remote. Our mixed-delivery model combines permanent
staff and carefully selected, vetted associates. This ensures access to a large pool
of technically accredited experts, and a broad range of skills and experience to
call on.
Typically, we combine an Agile methodology to scope and agree
packages of work, with rapid turnaround of deliverables over several iterations.
Sector Lead
Business Analyst
Delivery Content/Service Design
Technical/Data Architect
Building confidence, capability, capacity and knowledge
Becoming a proactive vs reactive organisation
Providing insight to inform clear questions and outcomes
Delivering robust and explainable methodologies
Delivering clear and actionable findings
The CQC held vast quantities of statutory notification returns that were inconsistently categorised, variably complete and largely free-text. Traditionally, they were manually interpreted by inspectors who decided on an appropriate risk response.
CQC was seeking a decision support tool using
artificial intelligence (AI) to help their inspectors sift, present and link
information to support decision making. Success would be measured against four key
indicators: increased efficiency, assessable data, tacit knowledge leveraged and
increased consistency.
The project involved processing a significant amount
of unstructured data to apply a range of natural language processing (NLP) and
textual analytical techniques. The processed output notification data contained a
number of structured, classified risk and temporal aspects that provided meaning,
along with risk or safeguarding impacts. Powerful insights that could inform
inspector decisions and help them manage and prioritise their workload.
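A highly simplified sketch of the kind of rule-based extraction such processing includes – deriving structured risk and temporal aspects from free text. The keyword lists and categories are invented for illustration; the real project used a much broader range of NLP techniques:

```python
import re

# Illustrative keyword-to-risk-category mapping (invented, not CQC's taxonomy).
RISK_TERMS = {"fall": "physical-harm", "medication": "clinical", "abuse": "safeguarding"}

def classify_notification(text: str) -> dict:
    """Derive structured risk and temporal aspects from a free-text notification."""
    lowered = text.lower()
    risks = sorted({category for term, category in RISK_TERMS.items()
                    if term in lowered})
    # Temporal aspect: pick out any ISO-style dates mentioned in the text.
    dates = re.findall(r"\d{4}-\d{2}-\d{2}", text)
    return {"risk_categories": risks, "dates": dates}

result = classify_notification(
    "Resident had a fall on 2019-03-04; medication review requested.")
```

Structured outputs like these are what make free-text returns sortable, linkable and comparable across providers – the basis of the decision support described above.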
Outcomes included: previously untapped information being made available to
inspectors for review; consistency of processing unstructured data and analysing
across boundaries, provider organisations and types; automatic processing of manual
and time intensive tasks; and, finally, providing a methodology which allows CQC to
improve organisational memory and to support inspectorate decision making.
We’ve been working to design improved development and business processes, data
science methodologies and governance frameworks for the existing data
platforms.
As an independent analytical function, we provide real-time
analysis about infection outbreaks through the provision of data science and data
engineering teams.
Data in isolation has no power. It’s how the user interacts with the data that determines how impactful and effective it will be.
Combining user needs and experience design, we provide data input, visualisation and reporting services for your organisation. Giving your data the context it needs to inform decisions and drive progress.
We build intuitive frontends that allow users to easily collect, secure, analyse and
understand complex data.
Our product experts help you to explore your ideas
– from capturing needs and wireframing, through to a fully embedded, live solution.
We can also create static reports, presenting a clear, intuitive and faithful
representation of our analysis.
Our experience, gained from years of working across multiple sectors, helps us to interpret the outputs of analysis. We tease out
the learning and work with you to identify the best path forward.
1. Understanding need
2. Requirements exploration and definition
3. Service design
4. Prototyping
5. User testing
6. Validation
7. Service deployment
8. Support & Knowledge Transfer
User-Centric Design
Microsoft Power BI
Amazon QuickSight
Tableau
Qlik
D3.JS
R-Shiny
Python
Jupyter Notebooks
Facilitating better, faster and lower-cost decisions
Making data transparent and accessible
Enabling self-serve business intelligence
Interactive tools and services which appeal to users
Driving organisation-wide success
GIRFT is a national NHSI programme run in partnership with the Royal National
Orthopaedic Hospital. It aims to improve the quality of care and investment in the
NHS by reducing unwarranted variation.
We worked with multiple specialties
and clinical leads, supporting the programme across more than 30 health related
services. Our role was developing specialist analytics data packs. Each one
contained a suite of data visualisations showcasing variation across a specific
clinical area.
Alongside data packs, we delivered national specialty reports
working with the GIRFT team and national clinical leads. Each report shared findings
from dozens of hospital visits and interrogated the data to demonstrate anecdotal
findings and drivers of variation in outcomes and efficiency.
Based on the
findings, we worked with the National Clinical Lead to develop potential service
changes. These included: clinical practice changes through education and evidence,
service model changes, and organisation and service structural changes.
We assisted healthcare technology provider HasTech with migrating their PAMMS and Covid-19 data to a new Azure cloud data platform, as well as developing Covid-19 intelligence dashboards in Power BI.
Our data engineering and
interaction team developed a pipeline to ingest National Food Standards Agency
(NFSA) ratings data to Azure, via API, using Data Lake, Data Factory and Azure
SQL.
We then built three new Covid-19 intelligence dashboards in Power BI,
re-designing 12 prototype reports. We supported the ingest of NHS England’s (NHSE)
new Capacity Tracker data, providing local authorities across Greater London with
reporting not currently available.
The outcomes were far reaching and
impactful. We harmonised data reporting from 32 local authorities and other external
sources, such as the Care Quality Commission (CQC). We provided users with a clean,
interactive display to analyse Covid-19 information in care homes. And we
future-proofed the collection, collation and display of information.