Digitalising HPLC methods: the path to interoperability
Posted: 25 July 2023 | European Pharmaceutical Review
Transferring analytical methods between companies is often a challenging, time-consuming manual process. In this Q&A, Dr Birthe Nielsen, Project Lead at The Pistoia Alliance, speaks about an ongoing project to transfer HPLC methods digitally between systems from different vendors.
What are the key challenges when it comes to sharing data?
Companies frequently share data both internally and externally, including with partners and contract research organisations (CROs). That data is often captured and formatted in a way that is specific to the company of origin, or to the manufacturer of the lab equipment used, and is typically not interoperable. This means a significant amount of work must be done to harmonise data before it can be analysed – for example, the manual re-keying of data, which is time consuming and carries a considerable risk of human error. Such errors also exacerbate the reproducibility crisis.
How can these issues be overcome?
Greater digitalisation of paper-based processes will help overcome these issues. Creating a standardised means of capturing and recording data – be it experiment results or methods – will enable that data to comply with the FAIR principles (Findable, Accessible, Interoperable, Reusable). This sets the stage for future data science technologies, such as artificial intelligence (AI) and machine learning, by ensuring data is machine-readable from the point of its creation.
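As a purely illustrative sketch of what "machine-readable from the point of creation" can look like in practice, the snippet below captures a single HPLC result as structured, self-describing JSON rather than free text; the field names, vocabulary and identifiers are assumptions for illustration, not an official FAIR or Pistoia Alliance schema.

```python
# Hypothetical sketch: a result captured as structured, self-describing data
# at the moment it is generated. Field names are illustrative assumptions.
import json
import uuid
from datetime import datetime, timezone

record = {
    "id": str(uuid.uuid4()),                      # Findable: globally unique identifier
    "created": datetime.now(timezone.utc).isoformat(),
    "technique": "HPLC-UV",                       # Interoperable: controlled vocabulary term
    "analyte": {"name": "caffeine", "inchikey": "RYYVLZVUVIJVGH-UHFFFAOYSA-N"},
    "result": {"value": 12.4, "unit": "mg/L"},    # Reusable: value always paired with its unit
    "instrument": {"vendor": "VendorA", "model": "ExampleLC-1"},  # hypothetical names
    "license": "CC-BY-4.0",                       # Accessible/Reusable: explicit reuse terms
}

print(json.dumps(record, indent=2))               # plain JSON keeps the record system-agnostic
```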
What are some additional challenges related to sharing experimental data?
Consistent methods must be employed across organisations to facilitate regulatory approvals, but establishing that consistency can itself slow drug development. Often, experiment methods are simply shared over email or physically posted, with no centralised repository that scientists can consistently access. This means CROs must re-establish and revalidate methods, which is labour-intensive and open to human interpretation or error. Additionally, experiment information is typically stored locally on a PC or electronic lab notebook (ELN) that is not backed up, meaning a single cyberattack can wipe it out instantly and the work is lost. Improved methods of exchange and storage will improve flexibility, reproducibility and efficiency, and even strengthen security.
How can open data standards be used to harmonise data from multiple systems?
The FAIR data principles ensure data is standardised at the point of its creation and capture, so that datasets are closer to being harmonised on arrival. Labs and companies will always use different systems, so any opportunity for collaboration is advantageous for every stakeholder.
The Alliance has several projects focused on creating data standards – for example, the DataFAIRy project, which aims to convert bioassay data into machine-readable formats that adhere to the FAIR principles. There’s also the IDMP Ontology Project, which has created a freely available data model to ensure new drug submissions comply with European Medicines Agency (EMA) regulations, with less manual data preparation needed. Pre-competitive collaboration in this space benefits everyone: whether you are a biopharmaceutical company, vendor, publisher or equipment manufacturer, you are invested in ensuring the research ecosystem works seamlessly and drives new breakthroughs.
Tell us about the Methods Hub project and its aims…
The Methods Hub project was launched to facilitate easier sharing of experiment methods data between different organisations. Data integrity, method reproducibility and interoperability are increasingly valued across the pharma industry and by regulators, but reproducing a method on another organisation’s systems remains challenging.
Scientists often have more than 30 instruments in a single lab, each with a different user interface, and spend considerable time manually entering parameters into each one to reproduce a single experiment. The Methods Hub project seeks to build a bridge for analytical methods to transition from text-based instructions to fully digitalised, machine-readable ones, in a standardised format that can be adopted industry-wide. The proof of concept (PoC) successfully completed a cloud-based digital transfer of analytical high-performance liquid chromatography (HPLC) methods, proving it is possible to move analytical methods securely and easily between chromatography data systems (CDS) from two different vendors.
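To make "fully digitalised and machine-readable instructions" concrete, here is a minimal sketch of how a vendor-neutral HPLC method might be represented and serialised; the classes, field names and values are hypothetical and do not reflect the actual Methods Hub data model or any vendor's CDS format.

```python
# Hypothetical sketch of a vendor-neutral, machine-readable HPLC method.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class GradientStep:
    time_min: float        # minutes from injection
    percent_b: float       # % of mobile phase B
    flow_ml_min: float     # flow rate in mL/min

@dataclass
class HplcMethod:
    name: str
    column: str
    mobile_phase_a: str
    mobile_phase_b: str
    column_temp_c: float
    detection_nm: float
    injection_volume_ul: float
    gradient: list[GradientStep] = field(default_factory=list)

method = HplcMethod(
    name="Assay-X impurity profile",          # hypothetical method name
    column="C18, 150 x 4.6 mm, 3.5 um",
    mobile_phase_a="0.1% formic acid in water",
    mobile_phase_b="acetonitrile",
    column_temp_c=30.0,
    detection_nm=254.0,
    injection_volume_ul=10.0,
    gradient=[GradientStep(0.0, 5.0, 1.0),
              GradientStep(10.0, 95.0, 1.0),
              GradientStep(12.0, 5.0, 1.0)],
)

# Serialising to JSON gives a single text artefact that, in principle, any
# CDS adapter could translate into its own instrument settings.
print(json.dumps(asdict(method), indent=2))
```

Because the method lives as one structured document rather than as settings typed into each instrument’s interface, a receiving system could in principle parse it and populate its own acquisition parameters automatically.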
What were the main challenges you encountered?
One of the main challenges was finding the right balance of technology and domain expertise to build the data model. The Alliance’s network was able to bring together the right mix of expertise.
How will this project be expanded in future?
In the future, we hope to link both analytical methods and results data in a single platform. This would significantly aid data analytics by enabling information from multiple vendors to be pooled and visualised simultaneously – an industry first. We also need to extend the solution to all major CDS vendors and to cover the most commonly used hardware. Our ultimate goal is to make Methods Hub an integral part of the system infrastructure in every analytical lab.
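As a rough sketch of what pooling results from multiple vendors might involve, the example below maps two hypothetical vendor exports onto a common schema before combining them; the column names and mapping are illustrative assumptions, not real CDS export formats.

```python
# Hypothetical sketch: harmonising results exported from two different CDS
# vendors into one pooled table for joint analysis and visualisation.
import pandas as pd

# Simulated exports with vendor-specific column names (illustrative only)
vendor_a = pd.DataFrame({"SampleID": ["S1", "S2"], "RT_min": [3.42, 3.44], "Area": [10234, 9987]})
vendor_b = pd.DataFrame({"sample": ["S3", "S4"], "ret_time": [3.41, 3.45], "peak_area": [10110, 10050]})

# Map each export onto a single common schema before combining
common_a = vendor_a.rename(columns={"SampleID": "sample_id", "RT_min": "rt_min", "Area": "area"})
common_b = vendor_b.rename(columns={"sample": "sample_id", "ret_time": "rt_min", "peak_area": "area"})

pooled = pd.concat([common_a.assign(source="VendorA"),
                    common_b.assign(source="VendorB")], ignore_index=True)
print(pooled)
```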
What is your vision for the lab of the future? How will projects like the Methods Hub support this vision?
Digital technologies will be the driving force behind the lab of the future, enabling greater pre-competitive collaboration and accelerated drug development pipelines. While technologies like AI or virtual reality may steal the headlines, underpinning their success will be strong data standards. That’s where Methods Hub comes in – enabling seamless sharing and integration of methods data to get experiments off the ground and accelerate the early phases of development.
Birthe Nielsen, PhD
Birthe is a consultant and project lead at life sciences not-for-profit organisation The Pistoia Alliance. Prior to her role with the Alliance, she worked as a Principal Lecturer in Analytical Science at the University of Greenwich. Birthe is based in London but is originally from Denmark, where she completed a Master’s in Engineering (Biotechnology). She completed her industry-funded PhD at the School of Pharmacy, University of Portsmouth (UK), in 2007.
Related topics
Analytical techniques, Chromatography, Data Analysis, Data integrity, HPLC, Process Analytical Technologies (PAT), Technology