
Driving Efficiency in Life Sciences Business Processes with Standardised In-Flight Data

Max Kelleher, COO, Generis
Remco Munnik, Director, Iperion, a Deloitte business


Pharmaceutical companies recognise that the absence of standardised data can hinder their agility and innovation. This is particularly applicable to in-flight data as it travels between different software solutions. Max Kelleher, Chief Operating Officer at Generis, and Remco Munnik, a Director at Iperion, a Deloitte business, offer practical guidance on mapping and managing operational data to enhance business operations.



Leveraging live company master data more effectively and strategically, thereby creating a flow of broader data and insights between functions, will enhance a range of different use cases. Much of this ‘in-flight’ data is incidental information captured as part of a task, yet its value in providing oversight, traceability and impact assessment to senior management could be considerable – if only companies could find a way to harness and control it more systematically.


The handover of data between point software solutions, such as regulatory information management (RIMS), clinical trial management (CTMS) and pharmacovigilance (PV) systems, is where gaps and discrepancies in information between systems occur, leading to operational blind-spots and strategic oversights at best, or regulatory non-compliance at worst. This makes change management hard work, and can mean that product development information and patient safety events are not fully traceable.


Overcoming the silos, interconnecting the data, and keeping those connections dynamic and smart is the next big opportunity, and provides the key to using everyday operational data to drive business improvements. But how?


The answer lies in understanding where key data is generated, and what the supply and demand of that data look like across the 'chain of custody' as it is re-used in different ways. A plan can then be devised for improving the connection and flow of more unified data (one enhanced source, rather than inconsistent duplicates) across departmental divides.



For young biotech companies starting from scratch, there is a clear opportunity to establish clean, consistent and definitive data from the outset, whereas for larger and more established companies the best options may lie in intelligently mapping existing data sources and data flows. Interconnections and interdependencies can then be identified and managed more effectively, until such time as data remediation and end-to-end standardisation can be achieved (e.g. bringing data fields and formats into line).


It is in this context that leveraging ontologies is attracting interest as an option: they allow inconsistently formatted data to coexist, while recognising that the items referenced are the same, and linking them. This is a useful first step in the move to treat all data as one joined-up resource, so that it can drive new actionable insights, decisions and processes. A more thorough overhaul of the data can then happen gradually over time.
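To make the idea concrete, the sketch below (in Python, using hypothetical system labels and substance names) shows how a simple concept map can let differently formatted records coexist in their source systems while still resolving to, and linking through, the same canonical entity. It illustrates the principle only, not any particular product's implementation.

```python
# A minimal sketch of the ontology idea: differently formatted records from
# separate systems keep their own values, while a shared concept map links
# them to the same canonical entity. All names and labels are hypothetical.

# Canonical concept -> known variants used by different systems
CONCEPT_SYNONYMS = {
    "SUBSTANCE:paracetamol": {"paracetamol", "acetaminophen", "PARACETAMOL 500MG"},
}

def canonical_concept(raw_value: str):
    """Return the canonical concept a raw, system-specific value refers to."""
    normalised = raw_value.strip().lower()
    for concept, variants in CONCEPT_SYNONYMS.items():
        if normalised in {v.lower() for v in variants}:
            return concept
    return None

# Records from two hypothetical systems, formatted differently
rims_record = {"system": "RIMS", "active_substance": "Paracetamol"}
pv_record = {"system": "PV", "drug_name": "ACETAMINOPHEN"}

# Both records coexist unchanged, yet resolve to the same linked concept
assert canonical_concept(rims_record["active_substance"]) == \
       canonical_concept(pv_record["drug_name"])
```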


With all of this in mind, here are some considerations and tips for tackling internal data transformation.



Where to Start with Legacy Systems

Unless they are young biotechs with largely greenfield tech set-ups, life sciences companies will be approaching the road to data-based operational agility with a considerable amount of baggage.


Large legacy systems, vast volumes of data, and the variable quality and availability of that data will make it hard to know where to start in transforming its contribution and value. Rather than trying to tackle everything everywhere all at once, the prudent choice is to identify some tangible gains from higher-quality, interconnected data which, once cleaned and combined, will tell a fuller story. That might mean linking supply chain data to regulatory data to enable serialisation, (semi-)automated batch release, and the mitigation of shortage reporting, for instance. Or perhaps the aim is to shave a week off clinical development timescales, complete eCTD applications, or submit variations more speedily, depending on the priorities and size of the company.



Identify Target Processes


Mapping what data exists, and where, is the best place to start with all of this. It is only through visualising the current spread of information assets and associated use cases that companies will appreciate the potential for greater uniformity and fluidity of data use between the different departments.


This will help the company establish which key processes to transform, for quick yet potentially far-reaching wins for the business. An effective map will chart where a given piece of data is used along a process, including its creation, modification and re-use by different teams and systems.
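As an illustration of what such a map might capture, the following sketch records who creates, modifies and re-uses a given piece of data along a process, so the systems it passes through can be read off directly. The data element, systems and teams shown are hypothetical.

```python
# A minimal sketch of a cross-functional data map, using hypothetical
# data elements, systems and teams purely for illustration.
from dataclasses import dataclass, field

@dataclass
class DataUsage:
    system: str   # where the data lives at this step
    team: str     # who touches it
    action: str   # "creates", "modifies" or "re-uses"

@dataclass
class DataElement:
    name: str
    usages: list = field(default_factory=list)

    def systems(self):
        """All systems this data element passes through along the process."""
        return {u.system for u in self.usages}

# Chart one element's journey across the process (hypothetical steps)
shelf_life = DataElement("approved_shelf_life")
shelf_life.usages += [
    DataUsage("RIMS", "Regulatory Affairs", "creates"),
    DataUsage("QMS", "Quality", "re-uses"),
    DataUsage("ERP", "Supply Chain", "re-uses"),
]

print(shelf_life.systems())   # {'RIMS', 'QMS', 'ERP'}
```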


Where there is an existing process optimisation or digital transformation team in place, or consultants advising on associated initiatives, these professionals would be the ideal drivers of a cross-functional data map, in partnership with key functions such as Regulatory Affairs, Quality, and so on. Companies that have already appointed Chief Data Officers, or equivalents, will have a head start, as these roles typically take a broader view of the commercial value of data. Regulatory Affairs, by contrast, might not be the direct creator of the data but more its guardian, the spider in the web, providing a more detailed perspective on the data's links and touch points.



Create a Chain of Data Custody


A lot has been said and written already about the importance of improving and maintaining high data quality, as its day-to-day value in supporting real-time business processes increases.


While some arguments favour a strong sense of data ownership within the functions most involved with the given data, it can be more powerful to encourage everyone across the company to buy into the value of consistent data, so that all functions and teams play their own part in keeping data clean, compliant, comprehensive and current.


Effective strategies here involve strong, broad communication of the associated benefits of robust data, and incentives (recognition and reward) for those who actively play their part. Instead of data ‘ownership’, think in terms of a ‘chain of data custody’ spanning multiple groups of data processors and guardians over time.
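One lightweight way to picture such a chain is a simple custody log, sketched below with hypothetical functions and data items: each handler appends an event recording what it did and when, so traceability accumulates across the groups involved rather than resting with a single 'owner'.

```python
# A minimal sketch of a 'chain of data custody' log, in which each function
# records its handling of a data item rather than claiming ownership.
# Functions, data items and actions below are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class CustodyEvent:
    data_item: str       # which piece of data was handled
    custodian: str       # the function or team acting as guardian
    action: str          # what they did with it
    timestamp: datetime  # when custody was exercised

custody_chain = []

def record_custody(data_item: str, custodian: str, action: str) -> None:
    """Append a custody event so every handover stays traceable."""
    custody_chain.append(
        CustodyEvent(data_item, custodian, action, datetime.now(timezone.utc))
    )

record_custody("batch_release_status", "Quality", "created")
record_custody("batch_release_status", "Regulatory Affairs", "verified")
record_custody("batch_release_status", "Supply Chain", "re-used")

# Trace who has handled the item over time
for event in custody_chain:
    print(event.custodian, event.action, event.timestamp.isoformat())
```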


Once companies can more readily visualise their current data position, and the full scale of the task ahead of them in making their data work harder for the organisation, it is time to decide on the most prudent way forward.


In the case of large pharma companies with extensive product portfolios and vast system and data legacies, comprehensive data remapping and/or investment in master data management is likely to be an overwhelming undertaking that could take many years.




A Broader Opportunity


Regulators, through their adoption of data standards, are championing global identifiers for medicinal products and their active substances. Life sciences companies that invent and develop these products and substances would benefit greatly from adopting data standards consistently, from early development through their marketing authorisation/registration information and variations submissions.
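The sketch below illustrates, in the spirit of standards such as ISO IDMP rather than as a faithful rendering of any standard, how carrying globally standardised product and substance identifiers alongside local names lets the same product be recognised wherever its data flows. All identifier values and field names are hypothetical placeholders.

```python
# A minimal sketch of pairing local product names with globally standardised
# identifiers so records can be matched across systems and submissions.
# Identifier values and field names are hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class MedicinalProductRecord:
    local_name: str     # how an internal system labels the product
    product_id: str     # globally standardised product identifier
    substance_id: str   # globally standardised active-substance identifier

rims_view = MedicinalProductRecord("Examplex 10 mg tabs", "MPID-0001", "SUB-12345")
erp_view = MedicinalProductRecord("EXAMPLEX10", "MPID-0001", "SUB-12345")

# Different local names, same standardised identifiers: the two views can be
# matched automatically wherever the data flows.
assert (rims_view.product_id, rims_view.substance_id) == \
       (erp_view.product_id, erp_view.substance_id)
```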


By implementing data standards internally, ensuring consistent data and sharing it reliably in-flight throughout extended processes, life sciences companies can leverage their company master data to reveal broader insights, enhancing productivity and, ultimately, patient outcomes.



About the Authors


Max Kelleher is Chief Operating Officer at Generis and formerly the company's Head of European Operations. He is passionate about providing a viable, pragmatic path for modernising enterprise information management in regulated industries. His close work with both pharma companies and specialist solution partners has afforded him deep insight into the critical modern-day challenges that traditional approaches to business processes and information use fail to address in complex industries like life sciences.



Remco Munnik is a Director at Iperion, a Deloitte business, and a respected subject matter expert in RIM, eCTD, xEVMPD and ISO IDMP. He is Chair of the Medicines for Europe Telematics Group and President of the IRISS Forum, a global, open, multidisciplinary, non-profit networking organisation for life science professionals, by life science professionals. Iperion, a Deloitte business, is a globally operating life sciences consultancy firm which is paving the way to digital healthcare by supporting standardisation and ensuring the right technology, systems and processes are in place to enable insightful business decision-making and innovation.







