Technological advances have been pivotal in expanding scientific research and contributing to an exponential increase in data output. A primary driver of this rise has been the shift from manual bench work to high-throughput technologies. In turn, the generation of such enormous volumes of data has led to breakthroughs in microprocessing, data management and the AI used to interpret the data. These advances have further spurred the development of high-throughput laboratory technologies that can keep pace with data processing speeds.1 As such, high-throughput technologies have become crucial to research areas including drug discovery, genomics and molecular biology.
High-throughput technologies have enabled researchers to take on ambitious projects that are expected to transform medicine. For example, Prof. Michael Snyder, chair of the Department of Genetics and director of the Center for Genomics and Personalized Medicine at Stanford Medicine, is currently running the Integrated Personal Omics Profiling (iPOP) study. iPOP involves unprecedented deep biochemical profiling of approximately 100 individuals generally classed as healthy. In doing so, the study hopes to determine what “normal” biochemical and physiological profiles look like at an individual level. This information will enable researchers to understand which processes are affected in various disease states and ultimately improve diagnoses, disease monitoring and the success of targeted therapies. When asked about the role of high-throughput technologies in this study, Snyder said, “nearly all assays are high throughput, including genomics approaches, many other omics assays and wearables. We believe that genome sequencing and other omics technologies, as well as wearables, will be routinely employed by the healthcare sector. Most importantly, we hope to transform what is presently sick care into healthcare where we focus on keeping people healthy rather than treating people when they are ill.”
High-throughput systems are well established in drug discovery, where hundreds of thousands of potential drug candidates need to be screened.1 In addition, high-throughput sequencing technologies have transformed genetics and genomics, with over one million human genomes sequenced by 2020.2 This sequence data has provided the pharmaceutical industry with many potential drug targets that warrant further exploration, fueling the need for greater throughput.3
One of the main mechanisms for increasing throughput is miniaturization: smaller experimental platforms and reduced reagent volumes allow more experiments to be conducted with fewer resources, thereby lowering costs.4,5 The push to miniaturize experiments has given rise to microfluidics, the investigation and manipulation of fluids at the submillimeter scale. Working with smaller sample and reagent volumes is also advantageous when samples are precious. In addition, microfluidics leverages microscale fluid phenomena, giving researchers far greater control over the spatiotemporal dynamics of an experiment.6
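To illustrate why fluid behavior at this scale is so predictable, the short sketch below runs a back-of-the-envelope Reynolds number calculation, assuming water-like fluid properties and illustrative channel dimensions and flow speeds (none of these values are drawn from the cited studies).

```python
# Back-of-the-envelope illustration of microscale flow: Reynolds number
# Re = rho * v * L / mu. All values below are illustrative assumptions
# (water-like fluid, arbitrary channel sizes and velocities).

def reynolds_number(velocity_m_s, length_scale_m, density_kg_m3=1000.0, viscosity_pa_s=1e-3):
    """Dimensionless ratio of inertial to viscous forces for a flow."""
    return density_kg_m3 * velocity_m_s * length_scale_m / viscosity_pa_s

# A 100-micrometer microfluidic channel at 1 mm/s versus a 1 cm tube at 10 cm/s
re_microchannel = reynolds_number(velocity_m_s=1e-3, length_scale_m=100e-6)
re_benchtop_tube = reynolds_number(velocity_m_s=0.1, length_scale_m=1e-2)

print(f"Microchannel:  Re ~ {re_microchannel:.2f}")   # ~0.1: viscous forces dominate, flow is laminar
print(f"Benchtop tube: Re ~ {re_benchtop_tube:.0f}")  # ~1000: inertial effects are far more significant
```

At Reynolds numbers this low, microchannel flow stays laminar and predictable, which is part of what gives researchers the fine spatiotemporal control described above.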
Other technologies that enable high throughput include automation and robotics, particularly liquid-handling robots. This article will explore how high-throughput technologies are continuing to advance research in the life sciences and highlight areas directly benefiting, including drug discovery, genomics and molecular biology.
High-throughput screening in drug discovery
Drugs targeting the central nervous system (CNS) have a notoriously high failure rate; for clinical trials of neurodegenerative disorders, such as Alzheimer’s disease, it can exceed 99%.7 In addition, preclinical testing of these drugs still depends largely on in vivo screening that is expensive and time-consuming. Therefore, there has been considerable focus on developing in vitro model systems that recapitulate the blood-brain barrier (BBB) to assess drug delivery to the brain. These cell culture systems have significantly reduced the time and money required to conduct drug screening and have helped to address ethical concerns surrounding animal testing. In recent years, high-throughput technologies have further enabled these assays to keep pace with the speed of modern drug development.
For example, researchers at the University of Artois in France have miniaturized a patented BBB in vitro model, enabling high-throughput screening of compounds using automated technology. The model, previously run manually in a 12-well format, was miniaturized to a 96-well design and automated with robotic cell seeding, assay testing and imaging. These changes increased the number of experimental points that could be generated from the same cell culture volume. Furthermore, automation increased precision and reduced the time researchers spent performing assays. Ultimately, this allows more compounds to be screened in the early stages of drug discovery.5
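To give a rough sense of what the format change alone buys, the sketch below compares the two plate layouts using typical working volumes for each format; the volumes are illustrative assumptions, not figures reported by the Artois group.

```python
# Illustrative comparison of 12-well and 96-well plate formats.
# Working volumes are typical values assumed for this example,
# not taken from the cited study.

plate_formats = {
    "12-well": {"wells": 12, "working_volume_ml": 1.0},
    "96-well": {"wells": 96, "working_volume_ml": 0.2},
}

for name, fmt in plate_formats.items():
    total_volume = fmt["wells"] * fmt["working_volume_ml"]
    print(f"{name}: {fmt['wells']} experimental points per plate, "
          f"{fmt['working_volume_ml']} mL per point, {total_volume:.1f} mL per plate")

# Under these assumptions, a 96-well plate yields 8x as many experimental
# points while each point consumes 5x less medium and reagent.
```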
The automation of screening assays has given the drug discovery field a tremendous boost in productivity, with particular focus on the automation of drug synthesis, which is anticipated to increase speed and capacity significantly.8 Progress has already been made in automating the synthesis of organic compounds; however, these processes still rely on manual intervention and on chemists to design synthetic strategies. In applying high-throughput technologies to compound synthesis, the next step involves incorporating artificial intelligence (AI) to assume parts of the synthetic planning.9 Scientists at the Massachusetts Institute of Technology have recently developed a combined AI-driven synthesis planning and robotically controlled experimental platform. This system effectively relieves the chemist of routine tasks and represents a major step toward fully autonomous chemical synthesis.10 The ever-increasing demand to improve efficiency and reduce research costs in drug discovery will spur further research and development in high-throughput technologies.
High-throughput technologies for genomics research
High-throughput sequencing has greatly assisted rapid whole-genome sequencing during infectious disease outbreaks, such as the COVID-19 pandemic, and the tracking of antimicrobial resistance.11,12 However, even though considerable advances have been made in sequencing itself, the upstream techniques used to prepare samples for sequencing have lagged behind.
Sample preparation presents a bottleneck for high-throughput sequencing and has implications for its use in molecular diagnostics. For example, a study in the United Kingdom found that the turnaround time for routine cancer diagnosis using genomic analysis can be as long as six days in all-manual laboratories. While some aspects of the sample preparation workflow, such as nucleic acid extraction, have been successfully automated, preparing reagents and plates continues to be performed manually. Automating liquid handling has been identified as a cost-effective way of improving the performance of high-throughput laboratories: the cost of laboratory personnel in the genome sequencing workflow can be reduced from 15% of the total cost in conventional laboratories to just 4% when automated sample preparation steps are introduced. This reduces the time highly skilled life scientists spend pipetting liquids, allowing them to refocus their attention on data analysis. Automating these repetitive tasks also increases the overall accuracy and precision of experiments, boosting repeatability and reproducibility. This has a direct impact on the rate and success of drug development while decreasing cost.2,13
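To make concrete what automating liquid handling involves, the sketch below generates a simple pipetting worklist that maps samples from source tubes into a 96-well destination plate. The CSV layout, sample names and transfer volume are hypothetical; real liquid-handling robots each define their own worklist formats.

```python
import csv
from string import ascii_uppercase

# Hypothetical example: build a pipetting worklist mapping source tubes to a
# 96-well plate, filling column by column. The CSV columns and volume used
# here are illustrative only.

def well_name(index, rows=8):
    """Map a running sample index (0-based) to a well label like 'A1', 'B1', ..., 'H12'."""
    row = ascii_uppercase[index % rows]
    column = index // rows + 1
    return f"{row}{column}"

samples = [f"sample_{i:03d}" for i in range(1, 25)]   # 24 samples for the demo
transfer_volume_ul = 5.0                              # arbitrary illustrative volume

with open("worklist.csv", "w", newline="") as handle:
    writer = csv.writer(handle)
    writer.writerow(["source_tube", "destination_well", "volume_ul"])
    for i, sample in enumerate(samples):
        writer.writerow([sample, well_name(i), transfer_volume_ul])

print(f"Wrote {len(samples)} transfers to worklist.csv")
```

Generating worklists programmatically removes the manual transcription step where pipetting and plate-layout errors typically creep in, which is part of how automation improves accuracy and reproducibility.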
Even in sample preparation steps where automation is established, namely nucleic acid extraction, high-throughput techniques can make a huge impact. Liquid-handling robots and microfluidics have been the leading technologies used to enhance the performance of nucleic acid extraction. However, these methods often require high DNA input, making them unsuitable for preparing low-quantity samples. These input requirements have also been a significant barrier to using whole-genome sequencing as a rapid diagnostic testing strategy or in environmental microbiology and natural product discovery. This obstacle has recently been overcome by the development of a novel microfluidic sample preparation platform with significantly reduced DNA input requirements. The technology is capable of low-input (10,000 cells) whole-genome shotgun (WGS) sequencing. In addition, its throughput is scalable, as evidenced by its application in the sequencing of over 400 clinical Pseudomonas aeruginosa libraries.12 Such novel high-throughput technologies can help advance genomics by broadening its application in scientific research.
High-throughput tumor profiling
High-throughput technologies have enabled researchers to design targeted therapies for diseases that have a high degree of heterogeneity and that would otherwise be treated uniformly. Targeted therapies work by acting on specific drivers of disease and are informed by an awareness of differences in disease between patients. By tailoring treatments to a particular disease profile, the safety and efficacy of drugs can be enhanced. One field benefiting greatly from a targeted therapeutic approach is oncology. High-throughput technologies enable clinicians to gain a clearer and more comprehensive understanding of molecular heterogeneity between and within individuals with cancer. For example, high-throughput tumor profiling that incorporates genomics and transcriptomics can be used to identify heterogeneity in markers that influence prognosis, drug resistance and sensitivity, and adverse effects. Understanding these differences can then be used to tailor treatments to the patient’s specific requirements. High-throughput tumor profiling can go one step further by allowing researchers to characterize the cellular heterogeneity that exists inside tumors; such intratumor heterogeneity has been detected both spatially and temporally.14 These findings highlight the importance of individualized treatments for cancer patients with heterogeneous tumors and the need for high-throughput technologies to facilitate such therapies.
While understanding cancer heterogeneity can help in designing individualized treatments, this approach is currently applicable to only a limited number of cancers. One reason is that very few gene mutation/drug pairs have been established. Therefore, to expand personalized medicine, drugs need to be evaluated using primary tumor tissue. Unfortunately, patient-derived xenografts are extremely costly and time-consuming to prepare and only allow a limited number of drugs to be assessed.
Researchers at the University of California, Los Angeles, led by principal investigator Assistant Prof. Alice Soragni, are experimenting with three-dimensional spheroids and organoids to overcome these challenges. The advantage of organoids is that they can be readily established from primary cancers, making direct assessment of drugs possible. Assays involving these structures can also be easily automated for high throughput. In one example, the researchers screened a library of 240 kinase inhibitor compounds to establish clinically actionable drug sensitivities for tumors obtained from surgeries.15 When asked how far this platform has progressed, Soragni said, “we have definitely expanded our programs which are for now research-related16*… we are in the initial phases of clinical translation, and are working on designing a clinical trial to confirm if we can predict therapeutic outcomes with our system… it is a step-by-step process.”
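For a sense of how drug sensitivities are typically quantified in screens like this, the sketch below fits a four-parameter logistic (Hill) curve to simulated viability data to estimate an IC50. This is a generic analysis commonly applied to high-throughput viability screens, shown here with made-up numbers; it is not the specific pipeline used by the Soragni lab.

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter logistic (Hill) curve, a standard way to summarize drug
# sensitivity (e.g., IC50) from viability readouts in high-throughput screens.
# Generic sketch with simulated data, not the cited studies' analysis pipeline.

def four_param_logistic(log_dose, bottom, top, log_ic50, hill):
    """Viability as a function of log10(dose)."""
    return bottom + (top - bottom) / (1.0 + 10 ** (hill * (log_dose - log_ic50)))

# Simulated viability fractions for one compound over an 8-point dose range (uM)
doses_um = np.array([0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0, 3.0])
viability = np.array([0.98, 0.97, 0.93, 0.80, 0.55, 0.30, 0.15, 0.10])

params, _ = curve_fit(
    four_param_logistic,
    np.log10(doses_um),
    viability,
    p0=[0.1, 1.0, np.log10(0.1), 1.0],  # initial guesses: bottom, top, log10(IC50), Hill slope
)
bottom, top, log_ic50, hill = params
print(f"Estimated IC50 ~ {10 ** log_ic50:.3f} uM (Hill slope {hill:.2f})")
```

Summary metrics such as IC50 are what allow hundreds of compounds to be ranked against a single patient-derived organoid line.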
In addition to pursuing clinical trials to validate the platform, the lab is continuing to develop it. In a recent publication, high-speed live cell interferometry was built into the platform.17*
Soragni elaborated on this: “We’ve added a couple of different innovations to the platform, one is bioprinting which we really think is going to be a game-changer for us. It removes all manual manipulation that you can think of [from the platform].” The second, developed in collaboration with the Teitell Lab at UCLA, introduces an alternative way to measure how the organoids are responding to therapies: high-speed live cell interferometry. Soragni highlighted that, “it is non-invasive and label-free, so you can measure [the cells] over and over again … and we believe it has the potential to be a really sensitive approach to detect how organoids react to therapy.” High-throughput drug screening using patient-derived organoids has the potential to benefit individual patients and patient groups by rapidly identifying promising novel compounds or existing drugs that can be repurposed; multiple drugs within a class can be assessed to determine the most promising treatments warranting further optimization or selection for clinical trials.15
However, Soragni warns that to reach a point where these models can be used to guide therapy, it is important to consider tumor heterogeneity: “we can introduce a sampling bias based on where the sample is taken from.” She explains that while this isn’t as much of a problem when working with therapy-naïve tumors, if tumors have been treated with various therapies and have evolved under therapeutic pressure, heterogeneity is a bigger concern and can fuel drug resistance. “We have the ability to sample the same patient repeatedly to follow the tumor’s natural history, and we can sample various locations and metastases. This allows us to see quite substantial differences,” explained Soragni.
Prospects for high-throughput technologies
The processing power of computers has transformed the way scientists generate and interpret data. To keep pace with these developments, life science laboratories have introduced high-throughput technologies to increase performance. Undoubtedly, as high-throughput technologies become further integrated into laboratories, more benefits will emerge.
References
1. Schneider G. Automating drug discovery. Nat. Rev. Drug Discov. 2018;17(2):97-113. doi: 10.1038/nrd.2017.232
2. Tegally H, San JE, Giandhari J, de Oliveira T. Unlocking the efficiency of genomics laboratories with robotic liquid-handling. BMC Genom. 2020;21(1):1-15. doi: 10.1186/s12864-020-07137-1
3. Lappalainen T, Scott AJ, Brandt M, Hall IM. Genomic analysis in the age of human genome sequencing. Cell. 2019;177(1):70-84. doi: 10.1016/j.cell.2019.02.032
4. Mayr LM, Fuerst P. The future of high-throughput screening. J. Biomol. Screen. 2008;13(6):443-448. doi: 10.1177/1087057108319644
5. Moya EL, Vandenhaute E, Rizzi E, et al. Miniaturization and automation of a human in vitro blood–brain barrier model for the high-throughput screening of compounds in the early stage of drug discovery. Pharmaceutics. 2021;13(6):892. doi: 10.3390/pharmaceutics13060892
6. Sackmann EK, Fulton AL, Beebe DJ. The present and future role of microfluidics in biomedical research. Nature. 2014;507(7491):181-189. doi: 10.1038/nature13118
7. Mehta D, Jackson R, Paul G, Shi J, Sabbagh M. Why do trials for Alzheimer’s disease drugs keep failing? A discontinued drug perspective for 2010-2015. Expert Opin Investig Drugs. 2017;26(6):735-739. doi: 10.1080/13543784.2017.1323868
8. Green CP, Spencer PA, Sarda S. Advancing automation in Compound Management: a novel industrial process underpinning drug discovery. Drug Discovery Today. 2021;26(1):5-9. doi: 10.1016/j.drudis.2020.09.032
9. Wang Z, Zhao W, Hao G, Song B. Automated synthesis: current platforms and further needs. Drug Discovery Today. 2020;25(11):2006-2011. doi: 10.1016/j.drudis.2020.09.009
10. Coley CW, Thomas DA, Lummiss JA, et al. A robotic platform for flow synthesis of organic compounds informed by AI planning. Science. 2019;365(6453). doi: 10.1126/science.aax1566
11. Baker DJ, Aydin A, Le-Viet T, et al. CoronaHiT: high-throughput sequencing of SARS-CoV-2 genomes. Genome Med. 2021;13(1):1-11. doi: 10.1186/s13073-021-00839-5
12. Kim S, De Jonghe J, Kulesa AB, et al. High-throughput automated microfluidic sample preparation for accurate microbial genomics. Nat. Commun. 2017;8(1):1-10. doi: 10.1038/ncomms13919
13. Miles B, Lee PL. Achieving reproducibility and closed-loop automation in biological experimentation with an IoT-enabled lab of the future. SLAS Technol. 2018;23(5):432-439. doi: 10.1177/2472630318784506
14. Govindarajan M, Wohlmuth C, Waas M, Bernardini MQ, Kislinger T. High-throughput approaches for precision medicine in high-grade serous ovarian cancer. J. Hematol. Oncol. 2020;13(1):1-20. doi: 10.1186/s13045-020-00971-6
15. Phan N, Hong JJ, Tofig B, et al. A simple high-throughput approach identifies actionable drug sensitivities in patient-derived tumor organoids. Commun. Biol. 2019;2(1):1-11. doi: 10.1038/s42003-019-0305-x
16. Shihabi AA, Davarifar A, Nguyen HTL, et al. Personalized chordoma organoids for drug discovery studies. bioRxiv 2021:2021.05.27.446040. doi: 10.1101/2021.05.27.446040 *Preprint
17. Tebon PJ, Wang B, Markowitz AL, et al. High-speed live cell interferometry for screening bioprinted organoids. bioRxiv. 2021. doi: 10.1101/2021.10.03.462896 *Preprint