High throughput materials discovery “reducing product time to market”
Posted: 23 January 2008
With ever-mounting market pressure from increasing global competition, together with consumer demand for value for money and improved performance, there is a growing imperative to stay one step ahead by reducing product time to market. This impetus obliges many companies to continually improve existing formulations and launch new products in order to expand their product portfolios and market share. Although the use of automation to accelerate product development and reduce research bottlenecks is already well established in the pharmaceutical, biotechnology and catalysis industries, automation in many other large industries, especially in the area of materials research, is still largely under-utilised and many processes are still performed manually.
High throughput equipment and research methodologies offer a great number of benefits to an organisation that chooses to implement them. The more apparent advantages are accelerated sample throughput, leading to reduced project time and cost, which enables companies to compete more effectively. They also provide the opportunity to map out a larger experimental space than is possible with manual approaches, allowing an organisation to tackle research tasks previously thought to be beyond its capabilities and thereby broaden its product patent coverage. One of the less acknowledged benefits of high throughput techniques is the “faster-to-no” result, which allows companies to perform more speculative research and to rapidly identify and terminate unsuccessful projects.
Despite these advantages, many obstacles have been reported by organisations looking to adopt high throughput practices. A common barrier, especially for smaller companies, is justifying the capital expenditure for the required equipment and software, followed by a lack of in-house knowledge and the need to train internal staff in these techniques. An additional concern is the difficulty of handling the large quantity of data these techniques generate. The most frequent stumbling block, however, for many companies from industries not commonly practising these techniques, is a lack of awareness of the available equipment and its capabilities, and therefore a belief that these approaches cannot be applied to their particular problems. Of course, it is not always possible to automate every process, owing to difficulties in sample manipulation and the need for user interpretation to direct some processes. This is especially true in the materials industry, where a diverse range of problems can arise from the variety of physical properties of materials, such as “sticky” solids, high viscosities, corrosive reagents, and temperature- and air-sensitive chemicals, all of which contribute to the difficulty of automating synthesis, formulation and testing processes.
The manipulation of some liquids and mixtures can even alter their properties, as not all liquids behave as Newtonian fluids, which adds another dimension of complexity to automating materials liquid handling. Some fluids, when placed under shear as experienced during aspiration and dispensing through a needle or pipette tip, can display viscosity changes. These materials are commonly referred to as pseudoplastic (shear thinning; for example, shampoo, paint and most emulsions), dilatant (shear thickening; for example, mixtures of water and sand, or cornstarch suspensions) and plastic (for example, ketchup), and their behaviour can lead to inaccuracies in automated dispensing. For these reasons, industrial research that requires the movement, combination and/or mixing of liquid and solid samples is still largely conducted by hand. With the recent advent of new technologies, methodologies and techniques pioneered to handle these types of materials (Figure 1), significant opportunities exist to accelerate materials research, thereby increasing throughput, sensitivity, accuracy and reproducibility, accelerating product development and ultimately drastically reducing product time to market.
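The shear-rate dependence described above is often captured by the power-law (Ostwald–de Waele) model. The sketch below illustrates the idea; the consistency index and flow behaviour index values are illustrative assumptions, not measured data for any real fluid.

```python
# Power-law (Ostwald-de Waele) model for shear-dependent viscosity:
#   apparent viscosity  eta = K * gamma_dot**(n - 1)
# n < 1: pseudoplastic (shear thinning), n > 1: dilatant, n = 1: Newtonian.
# K (consistency index) and n (flow behaviour index) below are illustrative.

def apparent_viscosity(K: float, n: float, shear_rate: float) -> float:
    """Apparent viscosity (Pa.s) at a given shear rate (1/s)."""
    return K * shear_rate ** (n - 1)

# A hypothetical shampoo-like fluid (shear thinning) vs a Newtonian fluid:
# as the shear rate rises (e.g. during aspiration through a narrow tip),
# the shear-thinning fluid's apparent viscosity drops sharply.
for gamma_dot in (1.0, 10.0, 100.0, 1000.0):
    thinning = apparent_viscosity(K=5.0, n=0.4, shear_rate=gamma_dot)
    newtonian = apparent_viscosity(K=5.0, n=1.0, shear_rate=gamma_dot)
    print(f"shear rate {gamma_dot:7.1f} 1/s: "
          f"shear-thinning {thinning:8.4f} Pa.s, Newtonian {newtonian:.4f} Pa.s")
```

This is why a pipetting speed calibrated for water can over- or under-deliver a surfactant or polymer solution: the viscosity the tip "sees" changes with the aspiration rate itself.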
In principle, by simply employing flexible and economical solutions it should be possible to drastically accelerate a number of common manual processes used in materials research. Herein we present an example of how an economical process was developed to accelerate a standard analysis required by a number of industries.
High throughput surface tension analysis
Understanding the surface tension properties of aqueous surfactants, small molecules and polymeric materials is important for the pharmaceutical, coatings, paints, inks, surfactant and household products industries. The conventional method of measuring surface tension is rather tedious and time-consuming, and only a few compounds can be measured by a single user per day. With the commercial drive to reduce costs and increase throughput, the following process explains the background behind the measurement of surface tension and the manner in which a generic high throughput process for conducting this measurement was developed [2]. By using an Eppendorf epMotion 5075LH [3] automated liquid handling workstation, large numbers of samples can be prepared quickly and accurately in 96 well microplates. These are then fed into a Kibron Delta-8 surface tension plate reader, where the surface tension of each well is measured [4]. As shown in Figure 2, the Kibron Delta-8 has eight du Noüy probes mounted in parallel on microbalances to fit the column spacing of a standard 96 well plate format. Between measurements, the surfaces of these probes are cleaned automatically using a furnace, allowing 96 surface tension measurements to be conducted within three minutes.
By using the epMotion 5075LH liquid handler, a highly flexible and user-friendly process has been created, allowing researchers to deliver their surfactant, small molecule or polymer samples of known concentration directly from the laboratory in convenient and easy-to-handle formats, such as 1.5 ml centrifuge tubes, bar-coded sample vials or 96 well micro tube plates. These are placed directly onto the deck, where the epMotion 5075LH takes an accurate aliquot (60 µl – 300 µl) from each sample and conducts a serial dilution in a standard 96 well microtitre plate using an aqueous stock solution. A replicate daughter plate is then carefully produced in a Teflon-coated 96 well Kibron surface tension plate, which allows the retention of a droplet-shaped meniscus with which the surface tension probes can make contact, as displayed in Figure 3.
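The serial dilution step above can be sketched as a simple worklist generator. The well labels, plate layout and volumes below are illustrative assumptions for a twofold dilution across one plate row, not the epMotion's actual programming interface.

```python
# Minimal sketch of planning a twofold serial dilution across one row of a
# 96-well plate, as a liquid-handler worklist. Well labels, volumes and the
# dilution ratio are illustrative assumptions.

def serial_dilution_plan(stock_conc: float, n_wells: int = 12,
                         well_volume_ul: float = 200.0, ratio: float = 0.5):
    """Return (well, concentration, transfer_volume_ul) tuples for row A."""
    plan = []
    conc = stock_conc
    # Transferring V_t into a well already holding well_volume_ul of diluent
    # gives a dilution of V_t / (V_t + well_volume_ul); solving for the
    # transfer volume that yields the requested ratio:
    transfer = well_volume_ul * ratio / (1.0 - ratio)
    for i in range(n_wells):
        plan.append((f"A{i + 1}", conc, transfer))
        conc *= ratio
    return plan

for well, conc, vol in serial_dilution_plan(stock_conc=1.0):
    print(f"{well}: {conc:.3e} w/v %, transfer {vol:.0f} uL")
```

A 1:2 ratio with 200 µl of diluent per well works out to a 200 µl transfer each step, which is why equal-volume transfers are the usual choice for twofold series.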
The acquired adsorption isotherms allow the calculation of surfactant critical micelle concentrations (CMCs), limiting surface tension, cross-sectional surface area (As) and air-water partition coefficient (Kaw) for each sample. The same approach can also be used to show the effects of polymer and small molecule mixtures on surface tension with respect to changes in pH, simply by repeating the measurement process using dilution media of different pH in the reservoir racks (Figure 4). The ability to produce a pH profile can indicate the activity and absorption properties of a molecule in response to pH changes, such as those encountered in the gastrointestinal tract [5]. This enables rapid research to be conducted on selective pH-responsive materials such as drug release technologies.
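One common way to extract a CMC from such an isotherm is to intersect the descending branch (surface tension falling roughly linearly with log concentration) with the post-CMC plateau. The sketch below illustrates this under simplifying assumptions; the synthetic isotherm, its assumed CMC of 1 × 10⁻³ and the plateau-detection threshold are invented for illustration, and the Kibron software may use a different algorithm.

```python
import math

# Hedged sketch: estimate the critical micelle concentration (CMC) from a
# surface tension isotherm by intersecting the descending branch with the
# plateau. Synthetic data below assumes a CMC of 1e-3; not measured data.

def fit_line(xs, ys):
    """Ordinary least-squares line fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def estimate_cmc(concs, tensions, plateau_points=3):
    """Intersect the descending branch with the plateau level."""
    logc = [math.log10(c) for c in concs]
    plateau = sum(tensions[-plateau_points:]) / plateau_points
    # Points clearly above the plateau belong to the descending branch
    # (the 1.0 mN/m margin is an arbitrary illustrative threshold).
    branch = [(x, y) for x, y in zip(logc, tensions) if y > plateau + 1.0]
    slope, intercept = fit_line([b[0] for b in branch], [b[1] for b in branch])
    return 10 ** ((plateau - intercept) / slope)

# Synthetic isotherm: linear decline below an assumed CMC of 1e-3, flat above
concs = [10 ** e for e in range(-6, 0)]
tensions = [72 - 12 * (math.log10(c) + 6) if c < 1e-3 else 36.0 for c in concs]
print(f"estimated CMC ~ {estimate_cmc(concs, tensions):.1e}")  # ~ 1.0e-03
```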
In the following case, the simultaneous use of a liquid handling robot and a high throughput surface tensiometer allowed the determination of polymer surfactant adsorption isotherms with unprecedented speed and accuracy. Five amphiphilic block copolymer solutions at 1.0 w/v % were prepared using distilled water and loaded into the liquid handling robot in 1.5 ml centrifuge tubes. The robot was programmed to perform 22 dilutions at 50% of each polymer solution (concentrations from 1.0 to 4.8 × 10⁻⁸ w/v %) into standard 96 well microtitre plates. 50 µl of each solution was then transferred into Kibron 96 well plates and measured using the Kibron Delta-8 surface tensiometer. The 176 measurements were performed in approximately six minutes. The data from the serial dilutions were combined using standard Kibron Delta-8 software procedures (Figure 5 and Table 1). The remaining sample dilutions in the mother plate can be retained for further repeat measurements to gauge reproducibility, or used for other tests such as particle size, solubility and stability measurements, thus maximising the information generated by a single process.
This simple and compact automated work cell is able to conduct high throughput surface tension analysis of industrially viable materials in a cost-effective manner. Currently, 96 samples can be prepared and analysed every hour with a high degree of accuracy and reproducibility. This method is not only over 500 times faster than conventional methods, but is also conducted on a smaller and more economical scale. On average, a hundred CMC determinations require a total sample volume of only ~60 ml, whereas using conventional methods the respective total liquid volume would run to multiple litres, so large savings in both reagent costs and waste are clear.
Rapid optimisation of liquid and solid dispensing
The effects of morphology and chemical functionality bring about huge variability in the bulk physical properties of liquids and solids. This leads to enormous problems in handling and dispensing these substances accurately using automation, and is a problem faced by many industries, especially in the area of formulation research. As a result, a development project was undertaken to design and implement a universal process to automatically optimise the accuracy and reproducibility of the manipulation and dispensing of materials. These processes use Design of Experiments (DoE) and multivariate analysis software [9] to rapidly optimise the dispensing algorithms for solids and liquids on a number of automated platforms.

The process was first employed to optimise solid dispensing in a wide range of vial formats and reactors on Chemspeed Accelerator SLT106 Synthesis and VLT 100 and SWAVE formulation platforms. When a new solid is encountered, it is placed on the platform in a dispensing cartridge chosen to suit the morphology of the solid and dispensed using a program that varies and evaluates all the parameters used to dispense the solid, thus producing an optimised dispensing algorithm. Up to 72 solids can be placed on the deck to be tested in series, at a rate of one solid optimised every 5 – 10 minutes. This is followed by a reproducibility investigation with the new dosing parameters; if the required limits are achieved, the optimised dosing parameters are recorded, along with the substance information and batch number, through the LIMS system into a solid dosing database. The same principle was subsequently applied to the dispensing of liquids for high throughput formulation research. Currently, manufacturers of automated liquid handling platforms provide a set of “optimised” liquid classes for the operator to choose from when dispensing the classic liquid media used in the biotech industry, such as water, DMSO, and so on.
However, in industrial formulation research, such as for household care products, it is rare to dispense only pure media; instead, it is usually necessary to dispense a mixture of chemical components in a base medium. The components of these mixtures can significantly alter the physical properties of the solution and thus affect dispensing accuracy if classic liquid class pipetting parameters are used. This is especially a problem when dispensing polymer- and surfactant-containing solutions, which can be difficult to handle accurately due to changes in viscosity or the formation of bubbles. This phenomenon is common even in the biotech and pharmaceutical industries, where components of broths, sera and growth media have a surface-active effect that causes bubbles to form during pipetting, resulting in discrepancies in dispensing accuracy, especially at low volumes. Nearly all automated liquid handling software allows the customisation of these pipetting parameters to generate new liquid classes, but this is usually carried out by experienced operators, typically using a best-guess or trial-and-error approach. Although laborious, this strategy may be acceptable for some basic solvents, but it is not suitable for complex solutions such as non-Newtonian fluids, low vapour pressure solvents, formulations and surfactants. A process for rapidly generating and optimising liquid dispensing therefore has to be both user-friendly and flexible enough to accommodate all these types of solutions. Ideally, the system should also be applicable to any type of liquid handling technique, such as syringe pump, positive displacement or air displacement. The system that was developed uses the standard control software of the robot itself to rapidly optimise and generate an algorithm for accurate dispensing, and the technique is therefore applicable to any type of liquid handling platform.
The system can follow either of the ISO standard procedures for liquid handling verification: gravimetric (ISO 8655-6:2002) or photometric/titrimetric (ISO 8655-7:2005). Gravimetric analysis, although slower, was found to be the more reliable method for optimising viscous fluids, formulations and mixtures for materials research, as accurate bulk density measurements can be made. As with the solid dosing optimisation, the strategy employed was to create a standard test program in which the variable parameters (factors) within the particular automated liquid handling workstation's software, such as aspiration speed, dispensing speed and air gaps, are statistically varied (in simple cases between high, medium and low settings) according to the layout of a DoE work list. The program is then used to pipette the test liquid at high, medium and low volume ranges into pre-weighed vessels for gravimetric analysis, or into a microplate for calibrated absorbance analysis of dispensing accuracy. The resulting calculated accuracies in dispensed volume (responses) are fed back into the work list to produce a statistical model displaying the synergistic and antagonistic effects of the variable parameters on the dispensing accuracy of the test liquid (see Figure 7). In the case of a surfactant stock solution, the response surface model clearly shows that to minimise the liquid handling error caused by bubble formation, a high aspiration speed and a slow dispensing speed are required. This, coupled with information from other software pipetting variables also examined by the DoE program, necessitates a short, fast blowout after dispensing to break the surface tension of any bubbles that may have formed at the end of the slow dispensing step, thereby minimising pipetting inaccuracy. This method can even be used to test multichannel pipette modules through the parallelisation of the test program for each channel.
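The DoE work list described above can be sketched as a three-level full factorial crossing of the pipetting factors. The factor names and level values below are illustrative assumptions; a real study (for example, one set up in MODDE) would use the platform's own parameter ranges, and a fractional design where the full factorial grows too large.

```python
from itertools import product

# Minimal sketch of a three-level full-factorial DoE work list for liquid
# class optimisation. Factor names and level values are illustrative.

factors = {
    "aspiration_speed_ul_s": (10, 50, 100),  # low / medium / high
    "dispense_speed_ul_s":   (10, 50, 100),
    "air_gap_ul":            (0, 5, 10),
}

def build_worklist(factors):
    """Cross all factor levels into one run per combination."""
    names = list(factors)
    runs = []
    for levels in product(*(factors[n] for n in names)):
        runs.append(dict(zip(names, levels)))
    return runs

worklist = build_worklist(factors)
print(f"{len(worklist)} runs")  # 3 factors x 3 levels -> 27 runs
print(worklist[0])
```

Each run is executed at high, medium and low test volumes, and the measured dispensing errors become the responses fitted by the DoE software to produce the response surface model.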
The model can then be used to predict the optimal parameter settings giving the best dispensing accuracy for the test liquid. These settings are then used in a reproducibility study of multiple dispenses at high, medium and low volume ranges to calculate the new coefficient of variation (%CV). Multiple test liquids can be optimised in series on the workstation at a rate of one every 10 – 15 minutes when using absorbance analysis and 15 – 25 minutes when using automated gravimetric analysis. This method gives operators the capability to rapidly generate and optimise their own liquid classes as and when required. It was also used to improve the standard liquid classes set in the software of a number of makes of liquid handling robotic platform. In the example below, the manufacturer's optimised liquid parameters for water, using a 50 µl single channel pipette tool, were compared with those obtained after the optimisation method described above.
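The gravimetric accuracy and %CV calculations used in such a reproducibility study reduce to a few lines: replicate dispense masses are converted to volumes via the liquid's density, then the mean volume, accuracy error and coefficient of variation are reported. The replicate masses below are invented illustration values, not measured data.

```python
import statistics

# Hedged sketch of a gravimetric accuracy/precision check: convert replicate
# dispense masses (mg) to volumes (uL) via density, then report mean volume,
# accuracy error versus the target, and %CV. Masses below are invented.

def cv_percent(values):
    """Coefficient of variation as a percentage (sample std dev / mean)."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def gravimetric_check(masses_mg, density_mg_per_ul, target_ul):
    volumes = [m / density_mg_per_ul for m in masses_mg]
    mean_v = statistics.mean(volumes)
    accuracy_err = 100.0 * (mean_v - target_ul) / target_ul
    return mean_v, accuracy_err, cv_percent(volumes)

# Ten invented replicates for a nominal 50 uL dispense of water (~0.998 mg/uL)
masses = [49.8, 50.1, 49.9, 50.2, 49.7, 50.0, 50.3, 49.9, 50.1, 50.0]
mean_v, err, cv = gravimetric_check(masses, density_mg_per_ul=0.998,
                                    target_ul=50.0)
print(f"mean {mean_v:.2f} uL, accuracy {err:+.2f} %, CV {cv:.2f} %")
```

Accuracy (systematic error against the target volume) and %CV (random scatter between replicates) are reported separately, since an optimisation can improve one without the other.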
An improvement of over 15% in the %CV values was observed across all volume ranges, including the minimum dispensing volume of the workstation. This method allows operators to increase the flexibility and precision of the equipment available to them, not only to get the best out of their research but also to expand the variety of research that can be conducted. As with the solid dosing optimisation, we anticipate that these methods will be incorporated into manufacturers' software in the near future, allowing optimisation to be carried out automatically during a process to maintain or improve accuracy, or as an “on the fly” operation to generate an optimised liquid class database for each new liquid, leading to a step change in automated liquid handling flexibility and functionality. The use of automation in a number of other materials research processes is currently under development, and by applying new methodologies and practices the problems discussed above can be overcome. As demonstrated by the preceding examples, large opportunities exist to greatly accelerate research in the materials industry, thereby increasing throughput, accuracy and reproducibility, and drastically reducing product time to market.
References
1. Chemspeed Technologies, http://www.chemspeed.com
2. Centre for Materials Discovery, http://www.materialsdiscovery.com
3. Eppendorf AG, http://www.eppendorf.com
4. Kibron Inc., http://www.kibron.com
5. K. Kiehm, M. Brewster, J. Peeters and J.B. Dressman. Estimating intestinal permeability by surface activity profiling. AAPS Annual Meeting and Exposition, Nashville, 6–10 November 2005.
6. K. Kiehm, M. Brewster, J. Peeters and J.B. Dressman. Comparison of drug partitioning between octanol/water and the air/water interface. AAPS Annual Meeting and Exposition, Nashville, November 2005.
7. C. Johans, T. Soderlund, P. Suomalainen and P.K.J. Kinnunen. Surface activity profiling of drugs applied to predict blood-brain barrier permeability. J. Med. Chem., 2004, 47(7), 1783.
8. C. Johans, I. Palonen, P. Suomalainen and P.K.J. Kinnunen. Making surface tension measurement a practical utility for modern industrial R&D. American Laboratory News, December 2005.
9. MODDE 7.0, Umetrics Inc., http://www.umetrics.com
Dr. Neil Campbell
Senior Experimental Officer, Automation and Process Development Specialist, Centre for Materials Discovery, University of Liverpool
Neil gained his BSc (Hons) in Chemistry, Drug Design and Toxicology at the University of Hull in 1998, followed by an industrially funded doctorate (QinetiQ) in LCD and OLED display materials. After his PhD, he expanded his research into the area of nanotechnology, and in 2002 he was invited to work in Japan at the renowned science city of Tsukuba. "The Yokoyama Nano-structured Liquid Crystal Project" consisted of a multinational team of leading industrial and academic scientists under the direction of the leading nanotechnology expert Prof. Hiroshi Yokoyama. In 2004 he joined Professor Andy Cooper's research team at Liverpool to develop nanoparticle release formulations, which led to the generation of a start-up company. Neil then joined the Centre for Materials Discovery (CMD) during its creation in June 2005, and with a wide background in industrial research, development and processing techniques, he currently specialises in automation and process development at the CMD.