Future Trends in Bioanalytical Outsourcing
Paul Denny-Gouldson heads the overall strategic planning for the various market verticals and scientific domains at IDBS. He obtained his Ph.D. in Computational Biology from Essex University in 1996, and has authored more than 25 scientific papers and book chapters.
In this commentary, Paul Denny-Gouldson outlines some of the future trends we can expect to see in the bioanalytical outsourcing sphere.
Pharma and biotech transformation is a hot topic and, as the industries evolve, we expect to see significant shifts in organizational and economic models. We are already seeing a change in mindset at senior levels, with acceptance and adoption of collaboration and outsourcing on the rise. While this will inevitably take a while to become the norm, all the signs are that it undoubtedly will.
Part of the transformation we are likely to see is the use of outsourcing across all areas of the drug research and development process, not just in areas such as clinical trials and manufacturing, where outsourcing has historically been used to great effect.
Preclinical development has seen a similar change in recent years, with a greater acceptance and use of contract services. Bioanalytical (BioA) testing is no exception – and many service providers have emerged following site closures and the reorganization of large pharma companies.
Contract service providers have also invested heavily in provisioning new services to cope with the demand. Estimates suggest the top 10 global clinical research organizations have a combined turnover in excess of $21 billion per year, and the combined revenue of the next 50 contract research organizations (CROs) is double this amount. It is not clear what fraction of this is attributable specifically to BioA, but the overall numbers are likely to be significant.
But while things are changing rapidly, and at considerable scale, barriers and opportunities will remain as the transformation continues.
‘Efficiency, efficiency, efficiency’ is a rallying cry we have heard in BioA for a long time, and it will continue. Speed of sample analysis, method validation and reporting all offer opportunities for improvement – but that only takes you so far, focusing as it does on laboratory processes rather than the overall process of outsourcing.
Looking at the process of collaborating and outsourcing will offer opportunities for efficiency gains. So how could that play out?
Simply, through better support for the whole outsourced process: from work order and sample exchange, through analysis execution, data analysis and study reporting, to data repatriation (the integration of structured data into internal systems for downstream decision support). Some of these areas have already been addressed with software, but the tools tend to be ‘in-house’ centric and don’t tackle work ordering and request tracking, sample exchange or data repatriation.
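How might such end-to-end support look in software? As a minimal sketch – assuming a simple, illustrative status model, with every name here hypothetical – the whole outsourced process could be tracked as explicit stages with an auditable history shared by both parties:

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum, auto


class Stage(Enum):
    """Illustrative stages of an outsourced BioA work request."""
    WORK_ORDER = auto()
    SAMPLE_EXCHANGE = auto()
    ANALYSIS_EXECUTION = auto()
    DATA_ANALYSIS_AND_REPORTING = auto()
    DATA_REPATRIATION = auto()


@dataclass
class WorkRequest:
    """Tracks a single outsourced study across both organizations."""
    request_id: str
    sponsor: str    # requesting organization
    provider: str   # contract service provider
    stage: Stage = Stage.WORK_ORDER
    history: list = field(default_factory=list)

    def advance(self, new_stage: Stage) -> None:
        # Record every transition so both parties share one audit trail.
        self.history.append((datetime.utcnow(), self.stage, new_stage))
        self.stage = new_stage
```

It’s not just about ‘efficiency, efficiency, efficiency’ either – there are other elements and opportunities that also need consideration.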
Reducing errors and exceptions is important in the BioA space and, given its regulated nature, it is vital to flag them appropriately when they occur so they can be actioned quickly. Automation of lab execution is a trend that has been discussed previously – and the evolution of lab designs, bespoke laboratory workspaces and robotics all play a role in the ‘robustness of lab execution’. But, with the outsourcing model, the opportunities for errors to occur are obvious: samples, study parameters and the like must be transferred between parties, information will be transcribed between systems, and data will be sent back and forth.
The result? Areas previously well-managed in a ring-fenced internal operation become areas of concern in an outsourced process.
Data is a foundational part of any scientific process, and clearly BioA is no different. The results of analytical runs provide critical information and data used to make decisions on the progress of a new therapeutic. This is combined with other data and information up and down the R&D process to support discovery and FDA reporting and filings (INDs, NDAs, BLAs).
In the ‘old’ world, the systems used to aggregate and manage this data and information were internal, and integrated where possible to reduce manual transcription. The outsourcing model makes this a whole lot more complicated and puts pressure on a number of different parties.
There’s pressure on the analytical service provider, which needs to get the data and information to the requesting organization in a suitable format. And there’s pressure on the requesting organization, which needs to take that data and information from the service provider and integrate it into its internal systems.
Why is this challenging? Many of these integrations are not externalized, often due to security concerns or technology constraints, and are not ‘adaptable’ to this new ‘open process’ workflow. These barriers can be overcome, but doing so requires effort and technology changes.
A further challenge relates to the data formats that are exchanged, and the method for doing so. Essentially, there are two models:
- Contracted providers are told by requesting organizations what the data should look like and how it should be delivered – making it easy to repatriate data; and
- Contracted providers supply standard reports and data formats to requesting parties – simplifying the data management systems of the service providers.
There are obvious benefits to each party depending on which model is used: the first benefits the requesting organization, the second the service provider. Data brokering – translating between the two – can reduce the business impact on both parties, and we will see rapid evolution in this area as the pressure to operate ever more efficiently and effectively builds.
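As an illustration of that brokering step – a minimal sketch in which the provider’s report columns and the internal field names are entirely hypothetical – repatriation can be reduced to translating a provider report into the requester’s schema:

```python
import csv
import io

# Hypothetical mapping from a provider's report columns to the
# requesting organization's internal field names.
COLUMN_MAP = {
    "Sample ID": "sample_id",
    "Analyte": "analyte",
    "Conc. (ng/mL)": "concentration_ng_ml",
    "Run ID": "analytical_run_id",
}


def repatriate(report_csv: str) -> list[dict]:
    """Translate a provider report into the requester's schema."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return [
        {internal: row[external] for external, internal in COLUMN_MAP.items()}
        for row in reader
    ]
```

A broker sitting between the parties can maintain such mappings for each provider, so neither side has to change its own systems.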
There’s another aspect too, concerning what data is exchanged. This brings more issues.
Most applications and instruments used in the lab do not deal with open data formats and standards. Fact data, like values and conditions, can be exchanged – but linking analytical data to fact data can present a considerable challenge.
The scientific rigour applied to data is only increasing, and requesting organizations will want all the data associated with a sample analysis so they can re-examine the results should they want, or need, to at a later date. Without standard data formats this is very difficult, as the requestor must have the same “applications” as the service provider in order to open and interrogate the data files. There are options available – such as AnIML ( https://www.animl.org/overview ) and the Allotrope Foundation ( www.allotrope.org ), which aim to reduce these data exchange barriers – but their adoption is not ubiquitous.
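To illustrate why such standards help, here is a minimal sketch of pulling sample metadata out of an AnIML document using only the Python standard library; the namespace URN and the element and attribute names follow one reading of the draft AnIML core schema and should be verified against the current specification:

```python
import xml.etree.ElementTree as ET

# Namespace for the AnIML core schema (version-specific; treated here
# as an assumption -- check the specification at animl.org).
NS = {"animl": "urn:org:astm:animl:schema:core:draft:0.90"}


def list_samples(animl_path: str) -> list[dict]:
    """Pull basic sample metadata out of an AnIML document."""
    tree = ET.parse(animl_path)
    samples = []
    for sample in tree.getroot().iterfind(".//animl:SampleSet/animl:Sample", NS):
        samples.append({
            "name": sample.get("name"),
            "sample_id": sample.get("sampleID"),
        })
    return samples
```

The point is that any party with a standards-aware parser can interrogate the file – no proprietary application required.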
Another area likely to develop is the provision of greater insight into where a work request stands in terms of tasks and jobs. Real-time insight into work progress – with updates provided to the requesting organization by the service provider – is common in many other industries. Just look at the Domino’s pizza purchase and delivery app to see how this can work in practice!
Okay, the bioanalytical area is a lot more complex than takeaway pizza, but the idea is still valid. The trick is not to overburden the scientists on either side with extra work – having to update progress at a micro scale would become tedious. Conversely, too little information about progress would render the updates worthless. But, if optimized appropriately, such insight could be used to monitor the progress of work and potentially serve as a communication vehicle for job scheduling and the fixing of issues should they occur: a notifications and issue-resolution framework rather than a disconnected set of emails, phone calls and documents.
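What might such a framework exchange? A minimal sketch – the milestone and status vocabulary is entirely illustrative – of a coarse-grained progress event a provider could push to the requesting organization:

```python
import json
from dataclasses import asdict, dataclass
from datetime import datetime


@dataclass
class ProgressEvent:
    """One coarse-grained update pushed from provider to sponsor.

    Field names and vocabulary are illustrative; the point is to
    report at milestone granularity, not micro-task level.
    """
    request_id: str
    milestone: str   # e.g. "samples_received", "run_complete"
    status: str      # "ok" | "issue" | "resolved"
    detail: str = ""
    timestamp: str = ""

    def to_json(self) -> str:
        return json.dumps(asdict(self))


event = ProgressEvent(
    request_id="REQ-0042",
    milestone="run_complete",
    status="issue",
    detail="QC sample outside acceptance criteria; rerun scheduled",
    timestamp=datetime.utcnow().isoformat(),
)
print(event.to_json())
```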
Some providers are taking this task-request approach one step further by integrating lab instrumentation into the framework using Internet of Things technologies, taking the opportunities for monitoring and issue resolution to another level. Couple this real-time data with analytics and the BioA lab moves on again, into the realm of predictive analytics, where issues can be predicted and resolved before they even happen – flagging instrument calibration drift based on detailed, automatic analysis of historical data, for instance.
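As a toy illustration of the calibration-drift idea – the data, tolerance and model here are all invented for the example – a simple trend fit over historical QC checks can project when an instrument will exceed its acceptance limit:

```python
from statistics import linear_regression  # Python 3.10+

# Hypothetical daily calibration-check results for one instrument,
# expressed as percent deviation from nominal.
days = [0, 1, 2, 3, 4, 5, 6]
deviation_pct = [0.1, 0.2, 0.2, 0.4, 0.5, 0.7, 0.8]

slope, intercept = linear_regression(days, deviation_pct)

# Flag the instrument before it drifts past an (assumed) 2% tolerance.
TOLERANCE_PCT = 2.0
if slope > 0:
    days_to_limit = (TOLERANCE_PCT - intercept) / slope
    print(f"Projected to exceed {TOLERANCE_PCT}% deviation in "
          f"{days_to_limit:.1f} days -- schedule recalibration.")
```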
However, some organizations that have historically outsourced all their BioA studies are now ‘insourcing’ and rebuilding their in-house resources. Having BioA scientists and DMPK scientists in close proximity allows for better collaboration on new method innovation. Remember, once developed, the methods can still be outsourced – but the development, validation and optimization occur internally.
The above merely scratches the surface of what could result from increased outsourcing of BioA study execution. New business models and services will emerge as providers jostle to differentiate themselves from their competitors, and overall services will be broken down into component parts, with the goal of delivering the best insight and information for a given service request. Data and information will always be at the heart of transformation, and we are already seeing requirements for more standards-based exchange and greater “data detail” – a trend that is set to continue.
The pace of change and the demand for innovation in BioA will create the need to address many new technological challenges which, in turn, will drive innovation – but to tackle these issues effectively, organizations will have to get used to working together.