
In life sciences, as in any complex business, it’s too easy to get hold of the wrong end of the stick in the event of a quality issue.
That’s because managers typically aren’t taking the whole picture into account when assessing what has gone wrong. The tendency is to home in on a specific part of the business or a particular process, to establish why a fault occurred or something important was overlooked.
But in reality there may have been a whole series of subtler contributors to the problem, which such an analysis would miss. If there has been a deviation at factory level or with an IT system, the ensuing root-cause analysis may conclude that an omission in the associated documentation is to blame; or, if a mistake has been made, that a gap in training is responsible.
A wider perspective, which involves looking for patterns across similar incidents, might reveal that processes are more prone to errors during the night shift due to inadequate lighting, or to fatigue from excessive overtime. Rather than pointing to a need for refresher training, the evidence may show that the solution is HR-related.
If the issue is linked to machine failure, trend analysis might reveal that such faults routinely occur after a production run of 10,000 units – and that pre-emptive maintenance after 9,000 units provides a robust solution.
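Once deviation records are consolidated, this kind of trend analysis can be quite simple. Here is a minimal sketch in Python using pandas, with invented record names and figures, of how fault events might be related to units produced since the last maintenance:

```python
# A minimal sketch of the trend analysis described above, using pandas
# over hypothetical deviation records; all names and figures are invented.
import pandas as pd

# Each deviation record notes how many units the line had produced
# since its last maintenance when the fault occurred.
deviations = pd.DataFrame({
    "fault_id": ["D-101", "D-102", "D-103", "D-104", "D-105"],
    "units_since_maintenance": [10250, 9980, 10510, 10120, 9890],
})

# Summarise where in the production run faults tend to cluster.
print(deviations["units_since_maintenance"].describe()[["min", "mean", "max"]])

# If faults consistently appear past ~10,000 units, schedule
# pre-emptive maintenance before that threshold is reached.
THRESHOLD = 9000
print(f"Recommend maintenance every {THRESHOLD} units")
```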
Joining the dots
To gain the fuller perspective and be able to make more accurate and effective judgements, companies must first have access to all of the information from multiple sources.
Yet too often teams run separate systems for Quality – one in R&D, one in Manufacturing, and another for Corporate-level quality management – preventing managers from seeing the complete picture.
Quality capabilities and insights might be strewn across different documents, departmental QMS repositories, process automation rules, specific analytics tools and more. However thorough these different resources might be in the data they hold collectively, that combined picture will have limited value as long as insights have to be pieced together and pored over manually.
Dropping the ball
Drops in quality often occur where processes are handed over from one department or system to another – for instance, as Regulatory Affairs completes a marketing authorisation submission which is accepted and passes to Manufacturing; or as a Corporate-level quality process is handed on to a particular department or site.
As long as Quality is managed and viewed in a fragmented way, managers will never be in a position to identify and remedy the full set of circumstances causing problems. However intensive the introspective analysis, and however comprehensive the fault-apportioning meetings, the risk of reaching the wrong conclusions will remain until that analysis moves up a level.
Even where companies have identified the issue and are working towards a more consolidated, cross-functional view of Quality, it’s common for associated documents and QMS data to exist separately, preventing a joined-up approach to quality management. It doesn’t help that most Quality systems are set up to work that way – perpetuating islands of insight which hamper effective improvement.
Live data, meaningful visualisation
The future state of quality management depends on all information sources and real-time readings feeding into the same knowledge pool – from continuous machine and temperature readings, to geotags attached to failure events, customer complaints and medical enquiries, which allow location-based fault patterns and clusters of pharmacovigilance (PV) activity to be discerned.
The ability to visualise findings in different ways becomes important here, too. For instance, a text-based list of issues seemingly spanning four countries will not obviously reveal that those locations all lie within 25 miles of a single facility, whose medicines storage unit is operating at the wrong temperature because of an air-conditioning fault. Improved analytics and the ability to view information in different ways – using graphs, clickable charts, geotagged map views and other advanced data visualisation options – would allow this at-a-glance insight.
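To make that example concrete, here is a minimal sketch using the standard haversine formula and invented coordinates, showing how geotagged complaint records logged in four countries could be checked for clustering around a single facility:

```python
# A minimal sketch of the location-based pattern detection described
# above. The facility, coordinates and country labels are all invented.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3956 * asin(sqrt(a))  # mean Earth radius ~3,956 miles

facility = (50.85, 5.69)  # hypothetical storage site

# Complaints logged in four different countries (invented geotags).
complaints = [
    ("Country A", 50.88, 5.58),
    ("Country B", 50.91, 5.79),
    ("Country C", 50.78, 6.08),
    ("Country D", 50.63, 5.57),
]

for country, lat, lon in complaints:
    d = haversine_miles(facility[0], facility[1], lat, lon)
    print(f"{country}: {d:.1f} miles from facility")

# If every complaint falls within ~25 miles of one site, investigate
# that site first (for example, its temperature-controlled storage).
```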
As long as data has to be actively exported and imported between different systems, the potential for next-generation quality management will always be compromised.
Be ahead of the curve
The authorities are steadily pushing companies towards a data-first quality management strategy, but life sciences companies do not have to wait for regulatory mandates to get their own house in order.
Consider how inefficient it is to prepare documents such as Annual Product Quality Reviews today – manually refreshing and collating all of the data, and preparing graphs, each time. The days of doing this are numbered: systems such as our CARA Life Science Platform allow such reports to be generated automatically from live data, an approach from which companies could realise significant time savings and error reductions today.
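As a generic illustration only (not the CARA API), a report chart that is today assembled by hand could instead be regenerated on demand from live data, along these lines:

```python
# A generic sketch, not the CARA API: regenerating a report chart from
# quality data instead of collating it by hand. The figures here are
# fabricated; in practice they would be queried from the live QMS.
import pandas as pd
import matplotlib.pyplot as plt

monthly = pd.DataFrame({
    "month": [f"2023-{m:02d}" for m in range(1, 13)],
    "deviations": [4, 6, 3, 5, 7, 2, 4, 8, 3, 5, 4, 6],
})

fig, ax = plt.subplots(figsize=(8, 3))
ax.bar(monthly["month"], monthly["deviations"])
ax.set_title("Deviations per month (hypothetical data)")
ax.set_ylabel("Count")
plt.setp(ax.get_xticklabels(), rotation=45, ha="right")
fig.tight_layout()
fig.savefig("apqr_deviation_trend.png")  # dropped straight into the review
```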
On top of the administrative savings, the chance to act on more holistic insights – reducing risk and costs through better-targeted, more timely action – would soon add up.
That’s a future we’re helping to deliver today.
- Luisa Burggraf, Generis
