Podium presentations are organized into 10 educational tracks. Podium abstracts and speaker information are organized first by track and then by session below.
To search for a specific speaker use the ‘Find’ functionality in your browser (usually Ctrl + F).
To view a complete schedule of podium presentations and schedule of events for SLAS2018 and to view speaker bios and photos, please visit the SLAS2018 Event Scheduler.
Track Chairs: Melanie Leveridge, GSK and Shaun McLoughlin, AbbVie
Session Chair: Andreas Luippold, Boehringer Ingelheim Pharma GmbH & Co KG
MALDI-TOF-MS – A Label-Free Technology for High-Throughput Screening
Frank Buettner, Boehringer Ingelheim Pharma GmbH & Co. KG
Mass spectrometry (MS) is an emerging technology for identifying and characterizing molecules that modulate biological targets, offering a label-free, direct detection method. This technology enables the application of more physiologically relevant assays and reduces time and costs compared to current classical approaches, increasing the efficiency of the drug discovery process.
In the past, the throughput of MS-based assay technologies was limited, but recent developments in MALDI-TOF-MS devices and spotting technologies have substantially increased the potential for miniaturization and the speed of such approaches. However, MALDI depends on a matrix-compatible sample preparation step and is limited to a certain analyte space. This requires the identification of MALDI-compatible, physiologically relevant assay conditions, as well as the development of fast and reproducible liquid handling procedures.
This talk will shed light on the challenges in this process and present results from applying the technology in high throughput screening projects.
Label Free Approaches to Quantify Small Molecule-Receptor Binding
David McLaren, Merck & Co, Inc.
Ligand-receptor binding assays are commonly employed to study the thermodynamics and kinetics of receptor-ligand interactions. Conventional binding assays use labeled ligands which can be resource-intensive to prepare and are not always suitable. We have developed label-free LC-MS based assays to quantitate small molecule binding to both soluble and membrane associated proteins. In direct binding mode, the methodology enables rapid assessment of ligand affinity to its molecular target as well as the concentration of the bound ligand. Structure-activity relationships can also be assessed in competition binding mode. This presentation will describe the experimental design, validation and application of equilibrium-based, direct and competition binding assays using LC-MS for bound ligand quantitation, and the utility of the method for assessing binding kinetics.
Label-Free High-Throughput ESI-MS: A Novel Sampling Interface for ADME and HTS
Hui Zhang, Pfizer Inc.
Label-free LC/MS-based screening technology is routinely used in the pharmaceutical industry for hit discovery and various ADME profiling applications. Although the current analysis speed of less than 30 seconds per sample is quite promising, it still cannot match the throughput provided by plate-reader-based HTS platforms. In this study, direct injection is coupled with an open-port probe (OPP) for direct sampling into a standard ESI ion source. Screening speeds of <2 seconds per sample were demonstrated with high sensitivity (attomole loading), good quantitation capability (>3 orders of magnitude), and broad compound coverage (from small molecule pharmaceuticals to peptides and antibodies).
The use of a “classic” ESI ion source for MS analysis yielded a Gaussian-shaped signal peak with a baseline width of 0.8 - 1.5 seconds. High sensitivity and reproducibility were demonstrated for this approach, with linearity over three orders of magnitude and sensitivity at attomole loading for small molecules and sub-femtomole loading for an intact antibody. The continuous flow of carrier solvent for the OPP maintained ionization stability and actively cleaned the entire flow system, resulting in no observed carry-over. The advantages of this integrated system approach were demonstrated with a Drug-Drug Interaction (DDI) assay, where various substrates/metabolites were monitored and compared to conventional analysis.
Small molecule direct binding by use of ASMS for target tractability assessment and high throughput hit identification
Geoff Quinque, GlaxoSmithKline
Affinity Selection Mass Spectrometry (ASMS), a label free assay that connects a binding event to the accurate mass identity of the ligand involved, is an established HTS triage platform at GSK that has been used to generate hit qualification data on more than 60 targets during the past three years. As part of a paradigm shift to screen novel targets, we are exploring the use of ASMS for hit identification, target tractability assessments and tool compound identification. The benefits include reduced cycle time through streamlined assay development, and reduced attrition through identification of compounds that directly engage the target protein. A mass-encoded 180,000 compound library has been created for ASMS screening, and is comprised of compounds that represent aspirational chemical space in terms of molecular weight, cLogP and property forecast index. The output of the ASMS platform has been evaluated against existing target-specific biochemical and biophysical data to develop a better methodology that maximizes the identification of biochemically active compounds while minimizing the overall hit rate. Nearly 85% of compounds with known biochemical and/or biophysical activity showed binding to a protein target with our platform. A sub-set of the full library is being used to evaluate target tractability, and has been used to screen 30+ historical targets, with the goal of correlating compound binding to tractability predictions. Overall, ASMS tractability outcomes align well with Encoded Library Technology (ELT) and HTS tractability observations. From a methods optimization perspective, continued development of the sample preparation protocols and the LC-MS platform are being targeted to maximize sensitivity and increase platform throughput. Furthermore, the development of an end-to-end informatics solution will complement the analytical platform. 
This presentation will highlight the ASMS platform developed for hit identification and target tractability assessments and illustrate its application to a kinase screening campaign as a proof of concept.
Session Chair: Christina Rau, Cellzome GmbH, a GSK Company
A highly-reproducible automated protein sample preparation workflow for quantitative mass spectrometry in plasma or blood
Jennifer Van Eyk, Cedars-Sinai Medical Center
BACKGROUND. Sample preparation for protein quantification by mass spectrometry requires multiple processing steps, including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples, such as plasma, can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions where technical variation can be introduced and propagated. We have therefore automated this digestion workflow and adapted it to include the preparation of dried blood obtained from remote sampling devices, allowing high throughput analysis of both archived and “real-time” samples for our pathological surveillance biomarkers. Our pathological surveillance biomarker assay is composed of 72 plasma proteins that screen for 8 pathological signatures. METHODS. We established an automated sample preparation workflow with a total processing time for 96 plasma or blood samples of 5 hours, including a 2-hour incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. RESULTS. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intra-day CVs for 5 samples ranged from 5.5%-8.9% for serum and 3.9%-7.2% for plasma, and mean inter-day CVs over 5 days ranged from 5.8%-10.6% for serum and 3.9%-6.0% for plasma. Likewise, for the highly multiplexed surveillance biomarker assay, 90% of the transitions from 6 plasma samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. In an analysis of plasma samples from 48 individuals (disease and healthy), the average CV for spiked β-galactosidase was < 15%.
The workflow was adapted for the direct processing of remote blood sampling devices (Neoteryx) and achieved equivalently high performance for spiked β-galactosidase as part of 10- and 72-protein SRM assays.
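As an aside for readers reproducing this kind of precision assessment, the intra- and inter-day CV metrics quoted above reduce to a simple calculation. A minimal sketch; the replicate peak areas below are invented for illustration and are not taken from the study:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample standard deviation as % of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical peak areas for one SRM transition in 5 replicate plasma
# digests prepared on the same day (intra-day precision).
intra_day = [10200, 9800, 10500, 9900, 10100]
print(f"intra-day CV: {cv_percent(intra_day):.1f}%")

# Hypothetical mean areas of the same transition on 5 separate days
# (inter-day precision).
inter_day = [10100, 10700, 9600, 10300, 9900]
print(f"inter-day CV: {cv_percent(inter_day):.1f}%")
```

A transition would pass the 20% threshold used in the abstract if `cv_percent` stays below 20 across both replicate sets.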
The human secretome and kidney fibrosis – identification of novel fibroblast biology through exploitation of the human secretomics library
Douglas Ross-Thriepland, AstraZeneca
Diabetic nephropathy is the world’s leading and most rapidly growing cause of end stage renal disease, with up to 40% of all diabetic patients developing chronic kidney damage – a common manifestation of which is tubulointerstitial fibrosis, driven by overactive fibroblast cells. The human secretomics project is a collaboration between KTH and AstraZeneca to develop methods to purify from cell factories all 6,400 human membrane and secreted proteins. The resultant library of highly bioactive molecules will enable the exploration of novel biology and the identification of tractable protein targets for drug discovery. We have, for the first time, used a 700-protein set from the secretomics library as a novel modality to probe the biology of the causative cell type in chronic kidney disease, fibroblasts. By building a high content imaging assay to follow the phenotypic fate of primary human kidney fibroblasts, we have screened the secretomics library alongside a standard small molecule campaign. This innovative use of the human secretome library enabled us to identify novel regulators of fibroblast biology, whose identification will now enable strategies to develop next-generation biological treatments that halt or slow down disease progression in patients with chronic kidney disease.
Integrating high resolution mass spectrometry with cheminformatics for standardized, routine non-targeted metabolomics
Oliver Fiehn, UC Davis
Over the past 20 years, metabolomics has evolved into using either multi-targeted assays, usually with nominal mass resolution spectrometers, or non-targeted approaches with high resolution mass spectrometry. We will show here how to merge targeted approaches with high quality non-targeted discovery metabolomics. We will highlight the importance of advanced, open access data processing, the proper use of quality controls and internal standards, and full reporting of raw data as well as result data.
At the NIH West Coast Metabolomics Center, we use 17 mass spectrometers in the central facility for providing data, informatics services and collaborative research for over 400 projects and more than 25,000 samples per year. These services include commercial assays for plasma analytics, the p180 kit, in addition to steroid, bile acid and oxylipin assays for more than 100 target compounds. Most projects, however, use our three integrated non-targeted metabolomics assays: primary metabolism for up to 200 identified compounds per study using GC-TOF MS, complex lipids for more than 600 identified lipids per study using high resolution liquid chromatography / tandem mass spectrometry and more than 150 identified compounds per study for biogenic amines using hydrophilic interaction chromatography/ high resolution mass spectrometry.
We use standardized data processing in the free-access MS-DIAL 2.0 software, which is far superior to standard solutions with respect to data deconvolution, compound identification, and false positive/false negative peak detection. This software is now integrated with the MS-FINDER 2.0 software for predicting and annotating spectra of biomarkers with unknown chemical structures. Both programs work excellently for high resolution GC-MS and LC-MS data. In addition, we harness the power of legacy data from more than 2,000 projects acquired since 2004, which is available to the biomedical and biological research community at large through the BinVestigate interface to our BinBase metabolome database. We showcase how the integrated use of these resources identified novel epimetabolites in cancer metabolism, both on a prospective cohort scale (in lung cancer) and as new epitranscriptome metabolites from modified RNA molecules (in a range of cancers, excepting liver cancer).
Combining 3D liver microtissues with lipid loading and lipidomics as a screening model for non-alcoholic fatty liver disease
Patrick Guye, InSphero AG
Non-alcoholic fatty liver disease (NAFLD) is the most common chronic liver disease in the world, affecting all racial, ethnic, and age groups without sex predilection. NAFLD is characterized by an excessive accumulation of lipids in hepatocytes (steatosis) and in combination with inflammatory processes (NASH) progressively develops into end stage liver disease, making it a major clinical concern.
Here, we describe a novel, screening-compatible human liver microtissue in vitro model for studying the etiology of steatosis and therapeutic strategies in a 3D configuration. The steatosis model is based on incubation with Oleate/Palmitate and displays a distinct and quantifiable accumulation of macro- and/or microvesicular lipid droplets within the hepatocytes. It maintains prolonged viability and liver-specific functionality in comparison to 2D cultures and can be produced in a 96-well SBS-compatible format.
Lipid accumulation was studied using fluorescent imaging followed by algorithmic analysis as well as by a novel LC/MS method for a full lipidomics analysis using unsupervised learning techniques.
Oleate as well as Palmitate induced a time- and concentration-dependent lipid accumulation, preferentially causing microvesicular (Oleate) or macrovesicular (Palmitate) steatosis. The highest lipid accumulation was observed after 7 days of Oleate treatment. The combination of both fatty acids in a physiologically relevant 2:1 (Oleate:Palmitate) ratio resulted in a mixed phenotype. Lipidomics analysis confirmed increased concentrations of di- (18:1/18:1) and triglycerides (18:1/18:1/18:1) in microtissues upon treatment with Oleate or Oleate/Palmitate compared to medium and BSA controls. In microtissues treated with Palmitate, increased concentrations of triglycerides (14:0/16:0/16:0) were observed. Lipidome principal component analysis allows for a clear distinction between the different treatment groups through corresponding clustering.
This 3D human liver microtissue model is particularly well-suited to study the formation as well as the prevention of steatosis by whole lipidome profiling, and is highly amenable for running comparisons to clinical samples. Moving beyond steatosis, the immune-competent status of these microtissues may even serve as starting point to study the etiology of NASH when combined with inflammatory stimuli.
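The principal component analysis used above to distinguish treatment groups can be sketched in a few lines. The lipid-intensity matrix below is synthetic, with group shifts chosen purely to illustrate the clustering behaviour; none of the numbers reflect the actual lipidomics data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic lipid-intensity matrix: 8 microtissue samples per treatment
# group, 50 lipid species; group mean shifts are illustrative only.
groups = {"control": 0.0, "oleate": 3.0, "palmitate": -3.0}
X = np.vstack([rng.normal(loc=shift, size=(8, 50)) for shift in groups.values()])
labels = [name for name in groups for _ in range(8)]

# PCA via SVD of the mean-centred matrix: rows of Vt are the principal
# axes, and projecting onto the first two gives the component scores.
X_c = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_c, full_matrices=False)
scores = X_c @ Vt[:2].T

# Treatment groups separate along PC1 when the lipidome shift dominates
# the within-group noise.
for name in groups:
    idx = [i for i, lab in enumerate(labels) if lab == name]
    print(f"{name:10s} mean PC1 score: {scores[idx, 0].mean():+.1f}")
```

In practice the scores would be plotted per sample and coloured by treatment to visualise the clusters reported in the abstract.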
Session Chair: Shaun McLoughlin, AbbVie
Hit Triage and Mechanism Validation for Phenotypic Screening: Considerations and Strategies
Fabien Vincent, Pfizer
Phenotypic drug discovery approaches can positively affect the translation of preclinical findings to patients. However, significant differences exist between target-based and phenotypic screening, prompting a need to re-assess our strategies and processes to most effectively prosecute phenotypic projects. First, phenotypic screens have dual goals of delivering both efficacious compound series as well as novel molecular targets for diseases of interest whereas only desirable chemical matter is sought for target screens. Second, while confirming binding and functional impact is sufficient for target screening hits, the situation is noticeably more complex for phenotypic screening hits. Here, hits acting through a number of (largely unknown) mechanisms in a large and often poorly understood biological space need to be triaged to differentiate desirable mechanisms from undesirable ones.
Given these fundamental differences, the hit triage and validation process was critically re-evaluated in light of the unique characteristics of phenotypic screening. Key considerations and specific strategies will be shared and exemplified by in house and literature case studies.
Genome-wide CRISPR-mediated Gene Disruption Presents a Shortcut to Acquired Resistance that Reveals Small Molecule Mechanism of Action
Jon Oyer, AbbVie
Phenotypic screening in small molecule drug discovery presents the opportunity to discover novel therapies, but thorough identification of a small molecule target remains an obstacle. To address this challenge we applied whole-genome pooled CRISPR screening as a Shortcut To Acquired Resistance in Search of mechanism (STAR-Search). This strategy uses CRISPR to generate a population where individual cells each possess a distinct targeted mutation. This comprehensive pool of mutations is then subjected to positive selection, which enriches cells that acquire resistance to compound treatment. The resistance is caused by targeted mutations that are readily identified by sequencing the stably integrated targeting construct. We hypothesize that the identity of gene disruptions underlying resistance can reveal mechanism of action or factors proximal to the direct target. Our group has successfully applied STAR-Search to multiple phenotypic screening hits, thus demonstrating its strong potential as a tool in target identification/validation.
Our application of STAR-Search examined three small molecules, each of which elicits cytotoxic effects against a unique spectrum of cancer lines. CGS-18, which preferentially induces apoptosis in breast cancer lines, was dosed onto MDA-MB-468 cells stably transduced with the Brunello CRISPR gRNA library. Cells that survived CGS-18 selection showed enrichment of gRNAs targeting a single gene, SULT1A1. MDA-MB-468 cells also undergo apoptosis in response to CGS-59 treatment, so this positive selection was performed in parallel with the previous screen. In this selected population, gRNAs targeting MGST1 were the most highly enriched. Validation experiments have confirmed that individual disruption of SULT1A1 or MGST1 confers resistance to CGS-18 or CGS-59, respectively. The third small molecule, CGS-85, displayed selective killing of multiple myeloma cell lines. This compound was profiled in the BioMap Diversity+ Panel, where its phenotypic effects showed a strong correlation to the reference database profile generated by the oxidative phosphorylation inhibitor oligomycin. LP-1 cells transduced with the CRISPR library that survived either CGS-85 or oligomycin selection showed enrichment of gRNAs targeting a large number of genes, but this group converged on a common mechanism: mitochondrial oxidative phosphorylation. Despite substantial overlap between the majority of screening hits, prominent differences suggested distinct direct molecular targets. Subsequent enzyme assays showed that CGS-85 potently inhibits isolated mitochondrial complex I, whereas oligomycin was confirmed as an inhibitor of complex V. Together these examples illustrate the potential of STAR-Search to reveal small molecule mechanisms of action and specifically uncover novel biological connections, owing to the comprehensive and systematic nature of genome-wide CRISPR targeted disruptions.
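The positive-selection readout described above ultimately reduces to comparing gRNA abundances before and after compound selection and aggregating guide-level enrichment to genes. A minimal sketch of that bookkeeping, with entirely hypothetical guide names and read counts (AAVS1 stands in for a non-targeting control locus):

```python
import math
from collections import defaultdict

# Hypothetical sequencing read counts per integrated gRNA, before and
# after positive selection with the compound -- illustrative only.
before = {"SULT1A1_g1": 120, "SULT1A1_g2": 95, "MGST1_g1": 110,
          "AAVS1_g1": 130, "AAVS1_g2": 105}
after = {"SULT1A1_g1": 9000, "SULT1A1_g2": 7500, "MGST1_g1": 90,
         "AAVS1_g1": 140, "AAVS1_g2": 100}

def log2_enrichment(guide, pseudo=1):
    """Depth-normalised log2 fold change of one guide across selection."""
    n_before, n_after = sum(before.values()), sum(after.values())
    f0 = (before[guide] + pseudo) / n_before
    f1 = (after[guide] + pseudo) / n_after
    return math.log2(f1 / f0)

# Aggregate guide-level enrichment to the gene level; genes whose guides
# are consistently enriched are candidate resistance mechanisms.
per_gene = defaultdict(list)
for guide in before:
    per_gene[guide.rsplit("_", 1)[0]].append(log2_enrichment(guide))

for gene, lfcs in sorted(per_gene.items(), key=lambda kv: -sum(kv[1])):
    print(f"{gene}: mean log2FC {sum(lfcs) / len(lfcs):+.2f}")
```

Production analyses would add replicate screens and a statistical test (e.g. MAGeCK-style ranking) on top of this fold-change core.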
UPT and SCLS: Two Unique Workflows for Drug Target Identification
Chaitanya Saxena, Shantani Proteome Analytics Pvt. Ltd.
The long-standing need to identify the targets of bioactive molecules has recently been fuelled by the resurgence of phenotypic screening in drug discovery. We will present the advantages of, and case studies for, two of our proprietary target identification technologies that can be applied in tandem at different stages of drug development. At the ‘Hit’ stage, where compound SAR information is limited, Universal Unique Polymer Technology (UPT), which allows enrichment of the targets of underivatized molecules, can be applied to narrow down or identify the targets of the bioactive molecules. UPT relies on immobilizing the compound through its non-covalent, weak-interaction forces on a polymer surface that provides complementary weak-interaction forces for immobilization. Compound immobilized on the polymer is quantified, and the compound-specific matrices thus prepared are incubated with biological lysate for affinity capture of the target. Captured target proteins are eventually identified using mass spectrometry, and the specificity of capture is assigned by comparing the proteins identified from multiple compound-loaded and control polymer matrix surfaces. The key advantage of UPT is that it allows affinity enrichment of the target without compound derivatization. For compounds that have advanced to the ‘lead’ stage of development, where the SAR of the compound is well defined, a SubCellular Location Specific (SCLS) target capture technology is used to confirm the identity and the subcellular location of the target. In SCLS, the compound of interest is tagged with different subcellular-location-specific peptide probes. In multiple experiments, the probes localize the compound at different cellular locations and the functional activity of the compound is recorded. The subcellular location that shows the maximum functional response is then chosen as the target-enriched compartment and used for target capture experiments.
An antibody against the peptide allows recovery of the probe and the affinity-captured protein targets. The captured target proteins are eventually identified using mass spectrometry. The key advantage of SCLS is that it allows the target and mechanism of action to be investigated in a subcellular-location-specific manner. For a critical evaluation of these new methods, their limitations will be presented alongside the success examples.
Therapeutic Targeting of the Unfolded Protein Response to Treat Disease
Luke Wiseman, PhD, The Scripps Research Institute
Imbalances in proteostasis are implicated in the onset and pathogenesis of etiologically-diverse disorders including systemic amyloid diseases and ischemic heart disease. Recent work has shown that activation of the unfolded protein response (UPR)-associated transcription factor ATF6 ameliorates imbalances in proteostasis associated with these disorders. However, the lack of pharmacologic approaches to selectively activate ATF6 has limited the development of this approach to intervene in disease. We employed a high-throughput screening approach to identify first-in-class compounds that preferentially activate the ATF6 arm of the UPR. Here, we will describe the mechanism of action for these compounds and highlight their therapeutic potential to correct pathologic imbalances in proteostasis. Collectively, these results will show that pharmacologic ATF6 activation is a broadly applicable strategy to therapeutically intervene in diverse types of disease.
Track Chairs: Ed Ainscow, Carrick Therapeutics and Ralph Garripa, MSKCC
Session Chair: Gianluca Pegoraro, National Cancer Institute/NIH
A High Throughput Imaging Assay for the Quantification of Gene Expression Dynamics at the Single Cell Level
Gianluca Pegoraro, National Cancer Institute/NIH
The establishment and maintenance of gene expression programs is essential for cellular differentiation and organism development. For this reason, gene expression is tightly regulated at the level of mRNA transcription, splicing, and translation. Recently, a combination of genetically encoded fluorescent reporters capable of binding and visualizing mRNA transcripts in living cells, such as MS2 stem loops and MS2-GFP, and of image processing techniques to detect, track and measure these transcripts has enabled the characterization of the dynamic regulation of these processes in live cells. We will describe the design and implementation of a high-throughput imaging assay consisting of panels of cell lines stably expressing a variety of endogenous genes tagged with MS2-stem loops, automated live-cell confocal microscopy for the long-term visualization of the expression dynamics of these genes at the single allele level, automated image processing for cell and transcription site tracking in time-lapse series, and the generation of gene expression trajectories for hundreds of cells per sample. Furthermore, we will show practical implementations of this imaging-based assay to measure the transcriptional kinetics of several independently MS2-repeats-tagged genes, and to quantify changes in transcriptional on/off cycles for a glucocorticoid receptor (GR) regulated locus. Overall, the development of this approach opens the possibility of screening focused chemical or oligo siRNA libraries to identify and characterize novel molecular mechanisms regulating gene expression dynamics.
Collaborative Phenotyping at King's College London: HipSci and the Stem Cell Hotel
Davide Danovi, King's College London
We work within the framework of the Human Induced Pluripotent Stem Cells Initiative (HipSci) project, funded by the Wellcome Trust and MRC (www.hipsci.org). Here, we will present in particular the characterisation of a large panel of human induced pluripotent stem cells, focusing on the integration of high content imaging data with genomics. Imaging over 100 human iPS cell lines from healthy donors, we have observed evidence of inter-individual variability in cell behaviour. Cells were plated on different concentrations of fibronectin, and phenotypic features describing cell morphology, proliferation, and adhesion were obtained by high content imaging as in our previously reported method. Furthermore, we have used dimensionality reduction approaches to understand how different extrinsic (fibronectin concentration), intrinsic (cell line or donor), and technical factors affect variation. With our platform, we have identified specific RNAs associated with intrinsic or extrinsic factors, and single nucleotide variants that account for outlier cell behaviour. We will also mention significant progress in the integration of dynamic imaging data with other datasets. By leveraging the expertise derived from this project, we now provide internal and external scientists with a dedicated laboratory space for collaborative cell phenotyping, to study how intrinsic and extrinsic signals impact human cells, to develop assays for disease modeling and drug discovery, and to identify new disease mechanisms.
Identifying molecules for loss of function genetic diseases and pathological secreted factors using high-dimensional morphological profiling
Chadwick Davis, Recursion Pharmaceuticals
The drug discovery flowchart can be a long and labour-intensive process with dozens of single endpoint assays to characterize compound behaviour. At Recursion Pharmaceuticals, we have developed an image-based drug discovery platform that enables the rapid evaluation of compounds using high-dimensional phenotypic signatures that provide efficacy, undesirable toxicity, and potential cellular MoA earlier in the flowchart, at the hit finding stage. Diseases are modelled in human cells by the addition of specific disease-relevant perturbations such as gene disruption, inflammatory cytokines, infectious agents, and others. The cells are labelled with a proprietary set of cellular stains designed to cover a broad range of morphological features and inform on a large scope of biology. Deep learning and additional computer vision methods are used to extract high-dimensional, disease-specific signatures from our images which accurately represent distinct and subtle cellular responses to disease perturbants and therapeutic candidates. This unbiased, high-dimensional, phenotypic platform enables us to discover highly disease-specific drug candidates that act through both known and novel biology, and allows us to screen disease models at an unprecedented rate.
Automated High Content Confocal Imaging of Organ-Chips
Samantha Peel, AstraZeneca
Microphysiological systems are in vitro models that aim to accurately recapitulate the organ microenvironment by including additional physiological cues, such as shear stress from the microfluidic component. Implementation of microphysiological systems within the pharmaceutical industry aims to improve the probability of success of drugs by generating models that are human- and disease-relevant. AstraZeneca, in collaboration with Emulate, has invested in the use of ‘organs-on-chips’ for pre-clinical efficacy and toxicity prediction. Whilst capturing cellular phenotype via imaging in response to drug exposure is a useful readout in these models, application has been limited by difficulties in imaging the chips robustly and at scale.
We designed and 3D printed bespoke adaptors to allow compatibility of such chips with a high throughput, high content confocal microscope (Yokogawa CV7000). We also implemented a workflow that incorporates intelligent scanning to map out the cell chambers within each chip ready for higher magnification imaging. At the higher resolution, Z stack slices through the different cell layers within each chip are captured. The process from mapping of the chip regions to higher resolution Z stacking of the cells within each chip is fully automated. By automating this process, we have reduced acquisition time from >16 hours per 8 chips (manual) to just 50 mins (95% time saving), as well as reducing variability, improving quality, and removing user bias. Furthermore, it is now possible to examine whether cells show differential responses depending upon their location on the chip.
The setup of this automated workflow will be described along with application to a multi-cellular, liver Organ-Chip system where detailed cellular phenotype (including morphology, proliferation, apoptosis, and mitochondrial function) in response to a proprietary AstraZeneca compound exposure was captured.
Session Chair: Keith Olson, Corning
Leveraging label-free dynamic mass redistribution technology to study G protein-coupled receptor ligand pharmacodynamics
Chris Hague, University of Washington - Pharmacology
Label-free dynamic mass redistribution (DMR) technology represents a powerful approach to studying G protein-coupled receptor (GPCR) signaling in cultured cells. Recently, our laboratory has leveraged DMR to study multiple facets of human adrenergic receptor biology, including:
To conclude, label-free DMR technology is a diverse, powerful tool that can be used to study both transfected and endogenous GPCRs in cultured cells; to deconvolute functional modules of GPCR macromolecular complexes; to address the importance of specific structural domains for GPCR function; and can be combined with traditional analytical methods to facilitate pharmacological characterization of ligand-receptor interactions.
Virus Particle Mobility Measurement for High Throughput Analysis
Jerome Schmitt, NanoEngineering Corporation
In initial tests, a newly demonstrated differential mobility analyzer (DMA) air-electrophoresis instrument now achieves high resolution (1/FWHM ~30) for particle size classification in the 0-70 nm size range. The target application of this size range is to survey insect viruses (20-60 nm diameter). The integrated analytical process sequence exploits the extreme size uniformity of virion particles within a virus species. This relatively monodisperse distribution is readily separated from the matrix. Parent samples may be pre-processed using centrifugation, ultrafiltration, dialysis, etc. This pre-processing generates a refined retentate biofluid enriched as much as 1000-fold in virion content. This tool is favored in mosquito analysis by two factors: 1) infected insects display a high viral load, thereby enhancing signal strength, and 2) insects are infected by only one virus at a time, thus reducing the potential complexity of virion particle size distribution spectra. Once the sample is preprocessed, the instrument operates as follows: (i) dissolved virus particles are introduced into the gas phase via electrospray (ES); (ii) the large initial charge on the virus particles is reduced to one elementary unit via carefully controlled contact with oppositely charged ions; (iii) the particle sizes present are determined by mobility analysis in the gas using the DMA; (iv) particles at each selected size are individually counted with a condensation nucleus counter (CNC), providing information on virus concentration. Step (ii) neutralization is now possible using a bipolar electrospray neutralizer source of free electrons, a relatively simple, compact device that eliminates the previous reliance on radioisotope neutralizers, permitting safe, portable operation. The air-recirculating, fully-contained DMA instrument (no high vacuum) is intended for service in high throughput analysis.
It may be challenged with large series of unique refined biofluid samples, with a target throughput of 10 samples per hour alongside frequent calibration runs and a target cost of ≤$1 per analyte. Principles of device operation will be reviewed. Representative high-resolution DMA viral spectra will be presented along with an outline of future work.
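Step (iii) rests on the standard relation between a particle's electrical mobility and its diameter, Z = n·e·Cc(d)/(3π·η·d), where Cc is the Cunningham slip correction. A minimal sketch of converting a measured mobility back to a mobility diameter is shown below; the constants are generic values for air near ambient conditions and are not taken from the talk.

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
AIR_VISC = 1.81e-5           # dynamic viscosity of air near 20 C, Pa*s
MEAN_FREE_PATH = 66.5e-9     # mean free path of air molecules, m

def slip_correction(d):
    """Cunningham slip correction Cc for a particle of diameter d (m)."""
    kn = 2.0 * MEAN_FREE_PATH / d          # Knudsen number
    return 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def electrical_mobility(d, n_charges=1):
    """Electrical mobility Z (m^2 V^-1 s^-1): Z = n e Cc(d) / (3 pi eta d)."""
    return n_charges * E_CHARGE * slip_correction(d) / (3.0 * math.pi * AIR_VISC * d)

def mobility_diameter(z, n_charges=1, lo=1e-9, hi=1e-6):
    """Invert Z -> d by bisection; Z decreases monotonically with d in this range."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if electrical_mobility(mid, n_charges) > z:
            lo = mid       # mobility too high -> particle is larger than mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because single-charged particles in the 20-60 nm range map one-to-one onto mobility, the DMA voltage scan can be converted directly into the size spectra the abstract describes.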
Measuring cell-free, label-free drug binding kinetics to native integral membrane protein targets in the membrane
Jaime Arenas, Nanotech Biomachines
Numerous valuable therapeutic targets are integral membrane proteins (IMP) such as GPCRs, ion channels, and transporters. However, IMPs are unstable when purified away from the membrane and cannot be isolated in their native state to evaluate the binding of drug candidates. This limitation forces lead development programs to introduce mutations and use detergent-based solubilization and reconstitution techniques to stabilize purified IMPs for ligand binding characterization. This expensive and time-consuming approach has limited success and introduces a significant risk of yielding inaccurate kinetic binding information for lead selection. To address this challenge, we have developed a graphene bio-electronic sensor technology (GBEST) specifically designed to measure label-free kinetic binding of drug candidates to native IMP targets without removing them from the cell membrane.
A simple cell membrane preparation, expressing the target IMP, is used to cover the GBEST sensor surface with a flat membrane. A small voltage across the sensor creates a current that is highly sensitive to electrostatic changes in the immediate vicinity of the sensor surface. Thus, electrostatic changes caused by the binding of drugs to target IMPs in the membrane can be monitored in real time to generate label-free kinetic binding data in a cell-free system that includes all the natural components of the cell membrane.
The fluidity of the membrane is critical to support the IMP’s native conformation and function. We have successfully demonstrated that cell membranes deposited on the GBEST sensors remain fluid and show lateral movement of target IMPs in the plane of the membrane. Furthermore, direct demonstration of IMP fluidity confirms that its native conformation has not been altered by adsorption to the sensor surface.
We show that the GBEST biosensor detects real-time kinetic binding of small molecule and biotherapeutic drugs to IMPs, and we show that the KD value obtained with GBEST correlates well with the expected benchmark value.
We will present recent data sets and discuss the capabilities of the GBEST biosensor to enable lead characterization with native IMPs. The use of GBEST in lead discovery will enable lead characterization directly on native membranes to generate biologically relevant data for lead selection.
Novel Graphene Field Effect Biosensing Technology for Binding Kinetics
Brett Goldsmith, Nanomedical Diagnostics
We introduce a breakthrough electrical label-free biosensor that provides a new approach to measuring binding kinetics. This approach uses a label-free technique called Field Effect Biosensing (FEB) to measure biomolecular interactions. Field effect biosensors use a semiconducting material to monitor changes in the binding potential of biomolecules such as proteins, nucleotides, peptides, and small molecules conjugated to the semiconductor surface. Practical use of this technology for biology requires a biocompatible semiconductor such as graphene. Graphene is a 2-dimensional sheet of sp2-hybridized carbon that is well known for its excellent electrical conductivity, high surface area, and unique biocompatibility. Basic electronic devices using graphene were first demonstrated in 2004; this work won the Nobel Prize in 2010. In nanotechnology labs, graphene biosensors have pushed existing limits of detection for label-free sensors and have shown the ability to measure a large range of biochemical interactions, from detecting DNA SNPs to small molecules binding to GPCRs.
We will present our architecture and implementation of graphene based FEB biosensors for label free kinetics. In our architecture, FEB measures the current through a graphene biosensor with targets conjugated to the surface and used as a functional active-biology gate dielectric. Any interaction or binding that occurs with the target causes a change in conductance that is monitored in real-time.
We will also present data from our recently published research demonstrating sensitivity into the pM range to inflammation markers (IL-6) and Zika viral antigen (ZIKV NS1). High precision measurements of protein kinetics captured using this technology, commercially available as the Agile R100, are comparable to both ELISA and standard label free biomolecule characterization tools. Specifically, we show an improvement in signal-to-noise and in lower limit of detection. These results demonstrate that graphene-based platforms are highly attractive biological sensors for next generation kinetics characterization.
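Both graphene-sensor abstracts report extracting kinetic constants from real-time binding traces. For the simplest 1:1 Langmuir model, the observed association rate is kobs = kon·C + koff, so a concentration series yields kon and koff (and KD = koff/kon) by linear regression. A minimal sketch with invented rate constants (neither company's data or method is described at this level of detail):

```python
import math

def assoc_response(t, conc, kon, koff, rmax=1.0):
    """1:1 Langmuir association-phase sensor response at time t (s)
    for an analyte concentration conc (M)."""
    kobs = kon * conc + koff
    req = rmax * conc / (conc + koff / kon)   # equilibrium response
    return req * (1.0 - math.exp(-kobs * t))

def fit_kon_koff(concs, kobs_values):
    """Least-squares line through (C, kobs): slope = kon, intercept = koff."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(kobs_values) / n
    kon = (sum((c - mx) * (k - my) for c, k in zip(concs, kobs_values))
           / sum((c - mx) ** 2 for c in concs))
    return kon, my - kon * mx

# Invented constants: kon = 1e5 M^-1 s^-1, koff = 1e-2 s^-1 -> KD = 100 nM
concs = [25e-9, 50e-9, 100e-9, 200e-9]
kobs = [1e5 * c + 1e-2 for c in concs]
kon, koff = fit_kon_koff(concs, kobs)
kd = koff / kon
```

In practice the full traces would be fit per concentration to obtain each kobs, but the linearization above shows why a concentration series, rather than a single sensorgram, pins down both rate constants.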
Session Chair: Deb Nguyen, Cellular Approaches
ExVive™ 3D Bioprinted Tissue Modeling of Liver Injury and Disease In Vitro
Kelsey Retting, Organovo Inc.
Successful prediction of candidate drugs can be hampered by the lack of in vitro tools to model complexities of human tissue biology. This translational challenge can result in low safety and efficacy predictability and contribute to attrition in drug development. To bridge this gap, the Organovo NovoGen® Bioprinting Platform was utilized to develop ExVive™ 3D Bioprinted Tissues, fully cellular 3D tissue models fabricated by automated spatially controlled cellular deposition. The multicellular architecture of the 3D model can better recapitulate native tissue structure and function compared to standard in vitro models, allowing for complex, tissue-level phenotypes associated with chronic injury and disease. Biochemical and histological characterization demonstrates that ExVive™ Human Liver Tissue enables detection of compound-induced progressive liver fibrogenesis. Here we characterize progression of TGFβ-mediated induction of fibrosis and blockade by Galunisertib as evidenced by key biomarkers, cytokine production, regulation of fibrotic genes, and collagen deposition, collectively demonstrating the utility of the model for the evaluation of interventional strategies and providing evidence of clinically relevant pathway modulation in vitro.
Development Of 3-Dimensional Human Cortical Spheroid Platforms With High Homogeneity And Functionality For High Throughput And High Content Screening
Cassiano Carromeu, StemoniX
The human cerebral cortex is organized in a complex 3-dimensional (3D) structure comprising different neural cell types. The coordinated work of these different cell types is key for brain function and homeostasis. Recently, much work has been focused on obtaining 3D brain organoids in an attempt to better recapitulate brain development and function in vitro. However, current protocols may lead to variable organoid size and function, making the use of these powerful tools impractical in a drug screening scenario. Here we describe the development of a highly homogeneous human induced pluripotent stem cell (hiPSC)-derived cortical spheroid screening platform in 384-well format, composed of cortical neurons and astrocytes. Immunofluorescence analysis indicated that these derived neurons and astrocytes display key markers of cellular identity as well as maturity, such as synaptic proteins and glutamate transporters. Viability assays carried out with compounds with known mechanisms of action indicated scalability and feasibility of the assays, with results comparable to a standard 2D model employing the same culture composition. Kinetic, high-throughput calcium flux analysis performed in a Fluorometric Imaging Plate Reader (FLIPR) highlighted that the spheroids present quantifiable, robust and uniform spontaneous calcium oscillations. The calcium signal was modified with excitatory and inhibitory modulators coherently and in a highly reproducible fashion, confirming the presence of a functionally integrated glutamatergic/GABAergic circuit. High-speed confocal imaging confirmed homogeneous calcium oscillations at the cellular level, whereas multielectrode array (MEA) analysis demonstrated robust synchronous neurophysiological activity at the network level. Additionally, these cortical organoids are amenable to immunostaining in suspension, enabling scalable high-content image-based assays focused on key protein markers.
Altogether, the developed 3D cortical spheroid platform can be readily implemented as a reliable high-throughput screening platform to investigate complex cortical phenotypes in vitro, supporting toxicology studies, disease modeling and drug testing.
Multi-Sensor Integrated Multi-Organ-on-Chips Platforms
Y. Shrike Zhang, Brigham and Women's Hospital, Harvard Medical School
We have developed a fully integrated multi-organ-on-a-chip platform consisting of both biomimetic human organoid models and auxiliary sensing units for continuous, in situ monitoring of organoid responses to pharmaceutical compounds in an automated and uninterrupted manner. The platform was designed to be modular, consisting of a breadboard for microfluidic routing via built-in pneumatic valves, microbioreactors for hosting organoids, a physical sensing unit for measurement of microenvironment parameters, one or more electrochemical sensing units for detection of soluble biomarkers secreted by the organoids, a medium reservoir, and bubble traps. Individual modules were interconnected with Teflon tubes to allow for fluid flow. The multi-organ-on-chips platform was further hosted in a custom-designed benchtop incubator capable of maintaining the ambient temperature and, when necessary, controlling carbon dioxide as well. On-chip valving was achieved by pneumatic pressure applied through a programmable Wago controller and Festo valves, using a set of MATLAB codes. These codes were also written to drive the electrochemical station, which was annexed to a multiplex detector, for automated electrochemical measurements at pre-determined time points. Physical sensing was achieved using a data acquisition (DAQ) card connected to a LabVIEW program and was continuous during the entire course of the experiments at the desired data sampling rate. Miniature microscopes were fitted directly at the bottom of the microbioreactors to achieve in situ imaging. In addition, we successfully fabricated functional liver, liver cancer, and cardiac organoids within the microbioreactors.
The liver and liver cancer organoids were fabricated through photopatterning of human primary hepatocytes and human HepG2 hepatoma cells in liver lobule-mimicking structures, while the cardiac organoids were produced by culturing human induced pluripotent stem cell-derived cardiomyocytes on photopatterned aligned grooves/ridges. Using the integrated approach, all sensing was performed in situ in an uninterrupted and automated manner, allowing for long-term monitoring of drug-induced organoid toxicity in a human liver-and-heart-on-a-chip platform insulted by acetaminophen for 5 days and a human liver-cancer-and-heart-on-a-chip platform challenged with doxorubicin for 24 h; in both cases, morphological changes of the organoids and alterations to their biomarker secretion were clearly observed. We believe that our novel platform provides a new method for integrating existing biomimetic organoid models, with the potential to achieve large-scale automation in the drug screening process.
Complex Tissue Biology and Throughput in a Microplate-Based Organ-on-a-Chip System
Jos Joore, MIMETAS — the Organ-on-a-Chip Company
The OrganoPlate is a microfluidic tissue culture platform, which enables high-throughput culture of microtissues in miniaturized organ models. In the OrganoPlate(1), extracellular matrix (ECM) gels can be freely patterned in microchambers through the use of phaseguide technology. Phaseguides (capillary pressure barriers) define barrier-free channels in microchambers that can be used for ECM deposition or medium perfusion. The microfluidic channel dimensions not only allow solid tissue and barrier formation, but also perfused tubular epithelial vessel structures can be grown. We have developed a range of multi-cellular organ- and tissue models for drug efficacy and toxicity studies, including blood vessels, brain, gut(2), and kidney.
In this presentation, I will focus on biological and technological aspects of both healthy and diseased tissue models in the OrganoPlate, including platform specific assays, such as a barrier integrity assay and neuronal network activity assays. I will present data demonstrating that 3D tissues cultured in the OrganoPlate are suitable for any-throughput drug efficacy and toxicity screening, trans-epithelial transport studies, and complex co-culture models in an in vivo-like microenvironment.
1. S. J. Trietsch, G. D. Israëls, J. Joore, T. Hankemeier, and P. Vulto, “Microfluidic titer plate for stratified 3D cell culture.,” Lab Chip, vol. 13, no. 18, pp. 3548–54, Sep. 2013
2. S. J. Trietsch, E. Naumovska, D. Kurek, M. C. Setyawati, M. K. Vormann, K. J. Wilschut, H. L. Lanz, A. Nicholas, C. P. Ng, J. Joore, S. Kustermann, A. Roth, T. Hankemeier, A. Moisan, P. Vulto, "Membrane-free culture and real-time barrier integrity assessment of perfused intestinal epithelium tubes," Nature Communications, 2017 (accepted)
Session Chair: Fred King, GNF
Complex Assays for Complex Targets: Next-gen Oncology Drug Discovery
Serena Silver, Novartis Institute for Biomedical Research
Simple assays measuring cell viability have been a workhorse of the cancer research world, enabling high-throughput biology endeavors to identify new targets and new drugs. However, the limited success of some of these molecules as single agents in the clinic has underscored the need for drug combinations to effect lasting results in patients. In addition, recent advances toward new types of cancer targets, such as those in immuno-oncology, often require more complex assays to interrogate the biology. Due to the nature of combination studies and high-content assays, these efforts are often limited by practical considerations in the quantity and scope of compounds that can be assessed. In these cases, considerable effort in rational screen design is required up front to ensure that results will be actionable, drawing on a deep understanding of the MOA of our compound toolkit and on multiple chemotypes targeting key functions. I will present several methodologies and assays we have used to better understand the action of our molecules as well as to identify effective combination partners.
Designed diversity and bioannotated compound libraries for obtaining maximal value from screens in induced pluripotent stem cell-derived cells and other complex biological systems
Tim James, Evotec
As part of the current resurgence in phenotypic screening, early stage assay systems are becoming increasingly complex in an attempt to better model the disease state and therefore improve translation to the clinic. One downside to this complexity is that it substantially reduces the throughput of such systems, and it is generally not feasible to screen millions of molecules in them as would be common in a more traditional HTS campaign. In response to this constraint we have developed two focused compound collections for use in phenotypic screening projects. The first is a bioannotated or chemogenomic library containing molecules with well-characterised pharmacology, intended for use in repurposing and knowledge-driven target deconvolution approaches. The second is a diverse phenotypic library that attempts to maximise chemical and pharmacological diversity within a compact set of 20,000 compounds, intended as a representative sample of our regular HTS collection. In this presentation the design, construction and annotation of these libraries will be outlined. Some examples of our experiences with screening them in drug discovery projects will then be discussed. Finally, analysis of the extent to which these first generation collections have fulfilled their design criteria to date will be presented, and directions of future enhancements reviewed.
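One common way to "maximise chemical and pharmacological diversity within a compact set" is greedy MaxMin selection over a molecular distance (for example, 1 minus Tanimoto similarity on fingerprints). The sketch below illustrates the idea with a generic distance function; the actual Evotec design process is not described at this level of detail, so this is purely illustrative.

```python
def maxmin_pick(distance, candidates, k):
    """Greedy MaxMin diversity selection: starting from the first candidate,
    repeatedly add the candidate whose minimum distance to the already-picked
    set is largest."""
    picked = [candidates[0]]
    while len(picked) < k:
        remaining = [c for c in candidates if c not in picked]
        best = max(remaining,
                   key=lambda c: min(distance(c, p) for p in picked))
        picked.append(best)
    return picked
```

With compounds represented as points on a line and absolute difference as the distance, `maxmin_pick(lambda a, b: abs(a - b), [0, 1, 2, 10, 11, 20], 3)` spreads the picks across the whole range rather than clustering them.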
Application of a robustness set to guide the development of 1536-well format assays for HTS
Steven van Helden, Pivot Park Screening Centre B.V.
High Throughput Screening is prone to picking up compounds that show activity across a range of assay platforms and against a range of proteins, so-called frequent hitters. The most common causes of frequent-hitter behavior are metal chelation, chemical aggregation, redox activity, compound fluorescence, cysteine oxidation and promiscuous binding. By changing assay conditions, the effect of some of these liabilities can be minimized.
For this purpose, a robustness set was developed to evaluate an assay prior to screening on the likelihood of picking up frequent hitters. The robustness set consists of a 1536-well plate containing eight types of compounds based on their chemical/physical properties: Aggregators, Chelators, Coloured, Fluorescent, Luciferase, Reactive, Redox and Clean compounds, all dissolved in DMSO.
A statistical workflow was developed to analyse the results of the robustness set in a standardized manner.
Examples will be shown on how the application of the robustness set has helped to select the optimal assay conditions for screening and how it has helped to design an efficient hit-triaging cascade to deselect artefacts from the primary screen.
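The standardized analysis can be pictured as comparing each liability class against the Clean compounds. A minimal sketch of one such check, flagging classes whose mean readout departs from the Clean class by several standard deviations (the class names come from the abstract; the readouts and the specific statistic are invented for illustration and are not the Pivot Park workflow):

```python
from statistics import mean, stdev

def flag_interference(results, z_threshold=3.0):
    """results: dict mapping compound class -> list of % activity readouts.
    Flag any liability class whose mean readout departs from the 'Clean'
    class mean by more than z_threshold clean-class standard deviations."""
    clean = results["Clean"]
    mu, sd = mean(clean), stdev(clean)
    return {cls: abs(mean(vals) - mu) > z_threshold * sd
            for cls, vals in results.items() if cls != "Clean"}

# Invented readouts for three of the eight classes
data = {
    "Clean":     [1.2, -0.8, 0.5, 0.1, -1.1],
    "Redox":     [45.0, 52.3, 38.7, 61.2, 49.9],   # strong assay interference
    "Chelators": [0.9, 1.4, -0.3, 0.7, 0.2],        # behaves like Clean
}
flags = flag_interference(data)
```

A flagged class suggests the assay, as configured, is susceptible to that liability, prompting either a change of assay conditions or a matching deselection filter in the hit-triaging cascade.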
Label-free Raman spectroscopy for rapid identification of biologics
Santosh Paidi, Department of Mechanical Engineering, Johns Hopkins University
Monoclonal antibody-based biologics are gaining immense popularity as therapeutic agents to treat a wide array of diseases such as cancer and inflammation. As a result, an increasing number of biologics is currently undergoing clinical development and approval for clinical translation. Such a rise in demand for biologics necessitates the development of rapid, label-free and automated characterization tools to meet stringent quality-control requirements during manufacturing. To meet regulatory requirements and reduce the business risk associated with fill operations, accurate identification of drug products is a critical and necessary analysis during multiple stages of manufacturing and distribution. However, due to the high similarity in the chemical structures of these drugs, establishing product identification is challenging, and the traditional wet-lab techniques are destructive, labor-intensive and expensive to perform multiple times during production and, even more so, for fill-finish testing. Therefore, there is an urgent need for quick, inexpensive and reliable methods for biologics identification. Here, we report the first application of spontaneous and plasmon-enhanced, label-free Raman spectroscopy coupled with multivariate data analysis for identification of a cohort of closely related human and murine antibody drugs. Building on finite difference time domain simulations, we synthesized nanoparticles of optimal morphology to compare the feasibility of performing SERS-based bulk sample detection with that of spontaneous Raman spectroscopy. We have developed partial least squares-discriminant analysis derived decision algorithms that provide near-perfect classification accuracy in predicting the identity of these drugs based on the subtle, but consistent, differences in their spectra, which are otherwise invisible to gross visual inspection.
We have shown that the performance of the decision algorithm with plasmon-enhanced Raman spectroscopy, even at much lower biologic concentrations, is comparable with that of spontaneous Raman spectroscopy. Together, these results establish the feasibility of developing an automated non-perturbative spectroscopic pipeline for rapid identification and quality control during manufacturing and fill-finish testing of biologics – thus alleviating the principal limitations of conventional wet chemistry analyses.
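The full PLS-DA decision algorithm is more involved, but its core step, assigning a measured spectrum to the most similar drug class learned from training spectra, can be illustrated with a minimal nearest-centroid classifier using Pearson correlation as the similarity measure. The spectra below are tiny synthetic stand-ins, not the authors' data:

```python
def correlation(a, b):
    """Pearson correlation between two equal-length spectra."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def train_centroids(spectra_by_drug):
    """Average replicate spectra per drug to form one class centroid each."""
    return {drug: [sum(col) / len(spectra) for col in zip(*spectra)]
            for drug, spectra in spectra_by_drug.items()}

def classify(spectrum, centroids):
    """Assign the spectrum to the drug whose centroid correlates best with it."""
    return max(centroids, key=lambda drug: correlation(spectrum, centroids[drug]))
```

PLS-DA improves on this sketch by projecting the high-dimensional spectra onto a few latent variables that maximize covariance with the class labels, which is what lets it separate drugs whose spectral differences are subtle but consistent.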
Session Chair: Ann Hoffman, GSK
Tumor Organoids for Therapeutic Discovery and Personalized Medicine
Dan LaBarbera, University of Colorado Anschutz Medical Campus
The past decade has seen a revolution in developing 3D tissue models of organ function, anatomy, and disease, which can be employed for markedly improved drug screening approaches. These models are referred to as organoid, organotypic, or spheroid, and the terms are used interchangeably in the literature. Organoids are defined by their ability to mimic in vivo organ function and/or disease; they can be engineered with multiple cell types and microenvironment components, and they have the distinct ability to self-assemble. Like organoids, tumor organoids mimic in vivo tumor biology and recapitulate key interactions between extracellular matrix (ECM) molecules and tumor cell receptors that initiate signaling events regulating and promoting cancer.
Tumor organoids prove to be very useful for modeling epithelial-mesenchymal transition (EMT), a reversible process that allows adherent epithelial cells to undergo morphological changes acquiring the motile mesenchymal cell phenotype. This phenotypic plasticity is essential for human and animal body development and wound healing, allowing cells to shed from the epithelium and invade and migrate through the microenvironment to specific locations where mesenchymal cells differentiate or induce differentiation of other cells into specialized cell types and stem cells to produce tissues, organs and bones. However, aberrant EMT is linked as a major driving force in the pathology of the most prominent human diseases: fibrosis, cardiovascular disease, inflammatory disease, eye disease, and cancer progression and metastasis. Therefore, drug discovery targeting EMT has become an attractive strategy towards more effective therapies, particularly for the treatment of malignant cancers.
This presentation will discuss our recent innovative tumor organoid models of various cancers, which we have engineered for 3D HCS drug discovery targeting EMT. Our tumor organoid models feature an innovative dual fluorescent biomarker reporter of EMT that can effectively track the forward and reverse EMT transition in live tumor organoids. Moreover, we will discuss hit confirmation approaches and secondary assays used to validate compounds that specifically modulate or reverse EMT. Finally, we will discuss our most recent approaches to develop patient derived tumor organoid (PDTO) models suitable for high-content analysis and screening towards achieving the goal of personalized medicine in cancer.
The 4th Dimension: Exploring the temporal effect of chemotherapeutic regimens in physiologically relevant 3D solid primary tumor and metastatic models.
Madhu Lal-Nag, NIH/NCATS
The wide use of 2D monolayer cultures for cancer drug discovery reflects the technical ease of implementation for drug screening, and the view that oncogenes or tumor suppressor genes are the key genetic drivers of cancer cell proliferation, and therefore, that inhibiting these tumor driver genes with drugs should prevent tumor growth. However, there is now ample evidence that the cellular and physiological context in which these oncogenic events occur plays a key role in how they drive tumor growth in vivo, and therefore, in how a tumor responds to drug treatments. In vitro three-dimensional (3D) spheroid cell culture tumor models are being developed to potentially enhance the predictability and efficiency of drug discovery in cancer. Furthermore, the insight that primary tumors are vastly different from their metastatic counterparts has necessitated a paradigm shift in the development of HTS models to efficiently recapitulate key components of primary and metastatic disease. Here we describe the development of an HTS-compatible 3D single and multi-cell type tumor spheroid-based assay screening platform representing primary disease, using libraries of small molecule inhibitors, genome editing and gene silencing reagents (RNAi/CRISPR). One such platform described here generates one spheroid per well with a size of approximately 400 µm. The real-time study of the effect of chemotherapeutic agents on the proliferation and morphology of 3D spheroids adds a temporal component, "time", as the fourth dimension, which we hope will better replicate pharmacological effects observed in tumors in vivo, including onset and duration of efficacy and resistance. A modification of this screening platform to accurately represent the metastatic niche is also described. This platform was developed with the objective of finding modulators that are differentially active in 2D vs 3D conditions over time in preventing metastasis and invasion.
The ability to increase the throughput of 3D spheroid assays will enable the generation of pharmacological profiles of chemotherapeutic agents and will hopefully identify more effective therapies that might have been missed in 2D, and deprioritize treatment options that might have looked very potent in 2D but have no efficacy in 3D. This approach to cell biology has the potential to improve the physiological relevance of cell-based assays and advance the quantitative modeling of biological systems from cells to organisms.
Microtissues in 4D to Improve Drug Toxicity Risk Assessment
James Pilling, AstraZeneca
Drug discovery and development is often halted or delayed due to toxicological risks associated with the candidate drug, and in particular those associated with cardiac toxicity. Cellular models that enable early and accurate assessment of compound liability are required. We have developed a suite of 384-well based 3D spheroid multicellular model systems and applied novel kinetic imaging and analytical approaches to better assess compound toxicity risk early in drug discovery. Cardiac microtissues utilised tri-cultures of human primary and iPS cell-derived cells to better represent cardiac tissue structure and functional activity. Characterisation of these model systems shows physiologically relevant responses. Cardiac microtissues had a spontaneous beat rate of 62 ± 24 beats/minute (mean ± SD) and maintained synchronized contraction transients following stimulation at 1, 2 and 3 Hz. To study and quantify changes in cardiac contractility we developed a bespoke fast frame-rate widefield image acquisition methodology coupled with optical flow image analysis and wavelet decomposition. Structural cardiotoxicity was assessed using cytotoxicity and live-cell high-throughput confocal microscopy, combined with analysis of endoplasmic reticulum integrity and mitochondrial membrane potential from all-in-focus images. Validation against a panel of in vivo clinical and pre-clinical compounds representing diverse mechanisms of toxic effect showed improved sensitivity and specificity over 2D model systems. 73% of internal compounds stopped due to changes in cardiac pathology between first GLP dose and FTiH (2001-2014) were detected using this live-cell imaging and cytotoxicity approach for structural cardiotoxicity, with functional cardiotoxicants identified at 91% sensitivity and 80% specificity.
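The contractility readout ultimately reduces each movie to a motion-magnitude trace in which contraction peaks are counted to give a beat rate. A minimal peak-counting sketch on a synthetic 1 Hz trace is shown below; the AstraZeneca pipeline additionally uses optical flow and wavelet decomposition, which this deliberately omits.

```python
import math

def beat_rate(trace, fps, min_height):
    """Count contraction peaks in a motion-magnitude trace (one sample per
    frame) and convert to beats per minute. A peak is a local maximum above
    min_height; ties on a flat top are counted once."""
    peaks = [i for i in range(1, len(trace) - 1)
             if trace[i] > min_height
             and trace[i] > trace[i - 1]
             and trace[i] >= trace[i + 1]]
    duration_min = len(trace) / fps / 60.0
    return len(peaks) / duration_min

# Synthetic trace: 10 s of 1 Hz "beating" sampled at 30 fps, values in [0, 1]
fps = 30
trace = [0.5 * (1 + math.sin(2 * math.pi * 1.0 * i / fps)) for i in range(10 * fps)]
```

On the synthetic trace this recovers 60 beats/minute; on real microtissue data the same logic runs on the optical-flow magnitude rather than raw intensity, and the frame rate must comfortably exceed the pacing frequency being tested.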
These developed models and imaging-based screening systems are in use at a scale enabling full dose-response testing of compounds to enable effective decisions to be made early in a drug project lifecycle. Our results demonstrate the potential to use sophisticated imaging and machine learning analysis techniques to interrogate increasingly complex cellular systems such as microtissues to assess and mitigate for toxicity risk in preclinical drug discovery.
High-throughput 3D Assays
Anthony Frutos, Corning Incorporated
Traditional methods for 3D cell culture are often time consuming, display variability, and lack the necessary throughput for screening applications. To address these concerns, we have developed a set of assay plate and flask formats that enable highly reproducible formation and screening of spheroids. In this talk we highlight, through a series of case studies, how these tools enable a variety of applications including organoid models, immuno-oncology, and hepatotoxicity models. Specifically, we show how a novel combination of Spheroid and Transwell plates enables the investigation of immune cell homing, tumor cytotoxicity, and tumor immune evasion in an easy-to-use 3D high-throughput assay. We also highlight the formation of gastrointestinal organoids derived from human induced pluripotent stem cells, and we demonstrate how primary human hepatocyte models can be improved with these tools to correlate more closely with in vivo toxicity data.
Session Chair: David Piper, Thermo Fisher Scientific
The Open Targets Cell Line Epigenome Project: Determining the Biological Relevance of Cellular Assay Models through Epigenetic Analysis
Rebecca Randle, Screening Profiling & Mechanistic Biology, GlaxoSmithKline
The Open Targets Cell Line Epigenome Project addresses the challenge of selecting cellular models for target validation and drug screening that are sufficiently relevant to the pathways and phenotypes associated with a particular disease or biology. The implementation of more complex, disease-relevant models through the use of 3D culture, tissue slices and primary cells is improving the predictive power of in vitro assays. However, due to limitations in cell and tissue supply, scalability, assay reproducibility and amenability to genetic manipulation, there often remains a need to use transformed cell lines, particularly for higher-throughput cellular screens and gene-editing studies. Currently, cell lines are often chosen for these purposes based on historical usage, even if they are a poor substitute for the cell type or tissue of interest. To address the gap in data-driven cell line and model selection, we have developed a novel systematic approach to determine biological relevance through the generation and analysis of transcriptomic and epigenomic data. Epigenomic and transcriptomic profiles (RNA/ChIP/ATAC-seq) from common immortalised cell models have been generated and integrated with publicly available reference data from primary cells. Statistical methods have been developed to score cells by distance/similarity at the global genome level or across more specific gene sets, signaling pathways or genomic loci of interest. The data and tools we have generated provide a framework enabling biologists to select the most appropriate, predictive cellular model for their research and to better establish optimal assay critical paths for translating target biology and compound pharmacology to the clinic. In this presentation we will demonstrate how we have used this approach to quantify the impact of 2D vs. 3D culture in cellular models of liver drug toxicity, identify the most relevant immune cell models of inflammatory response, and validate immortalised cell surrogates for genome-wide gene-editing studies.
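A simple instance of scoring a cell line against primary-cell reference data is to correlate expression profiles over shared genes, optionally restricted to a pathway of interest. The sketch below uses Pearson correlation on invented toy profiles; the project's actual statistical methods (and its integration of ChIP/ATAC-seq data) are considerably richer than this.

```python
def score_similarity(reference, model, gene_set=None):
    """Pearson correlation between a candidate cell line and a primary-cell
    reference over shared genes (dicts of gene -> expression, e.g. log2 TPM),
    optionally restricted to a pathway gene set."""
    genes = sorted(set(reference) & set(model)
                   & (set(gene_set) if gene_set is not None else set(reference)))
    xs = [reference[g] for g in genes]
    ys = [model[g] for g in genes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def rank_models(reference, models, gene_set=None):
    """Rank candidate cell lines by similarity to the reference, best first."""
    return sorted(models,
                  key=lambda m: score_similarity(reference, models[m], gene_set),
                  reverse=True)
```

The gene_set argument mirrors the abstract's point that relevance can be assessed globally or only over the pathway a screen actually interrogates: a line may be a good global match yet a poor match on the disease pathway, or vice versa.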
New Functional Genomics toolsets: Arrayed loss of function screening with LentiArray CRISPR libraries
Jon Chesnut, Thermo Fisher Scientific
Identifying and validating targets that underlie disease mechanisms and can be addressed to provide efficacious therapies remains a significant challenge in the drug discovery and development process. RNAi mechanisms have enabled the use of siRNA and shRNA to knock down RNA and suppress gene function. However, depending on the nature of the targets, cells, biology and end-point assays, these approaches may suffer variously from their transient nature, design complexity, incomplete knock-down or off-target effects. The use of CRISPR (clustered regularly interspaced short palindromic repeat)-associated Cas9 nuclease and guide RNA (gRNA) provides a strong alternative that can produce transient or long-lasting impact, straightforward design, knock-out of genes and increased specificity. A number of laboratories have already published reports demonstrating how pools of gRNA can be delivered to cells and "hits" can be established through enrichment or depletion of cells following a "survival" assay and identified by sequencing the introduced gRNAs in the remaining cell population. Here we demonstrate a knock-out screening approach that utilizes the Invitrogen™ LentiArray™ CRISPR library to interrogate the impact of individual gene knock-outs on the NFκB pathway as measured by a functional cell-based assay. We describe the library design concepts, the assay development, initial screening results and validation of specific identified hits. We elucidate the key factors in developing a robust assay, including both transduction and assay optimization to achieve the highest levels of transduction efficiency and assay window, and provide data from initial screens using the Invitrogen™ LentiArray™ CRISPR kinome library. We expect these approaches to be scalable to the entire human genome and portable to multiple cell types and end-point assays, including both high-throughput plate-based assays and high-content imaging-based assays.
NGS-based Genome-wide Genetic Screens for RNA Processing Regulators
Hai-Ri Li, University of California, San Diego
High throughput genetic and chemical screens have been powerful tools for comprehensively identifying regulators of specific cellular pathways, and drug leads, in both industry and academia. However, most screens rely on one or a few functional readouts, even so-called high content screens. We have developed a robust and fully automatable screening platform coupled with NGS (next-generation sequencing) to monitor the expression of hundreds to thousands of endogenous genes associated with a phenotype, without the need to purify RNA. The platform is based on annealing a cohort of specific oligonucleotides to specific transcripts and/or their isoforms, followed by solid-phase selection, RNA-templated oligonucleotide ligation, and PCR amplification using bar-coded primers. We refer to this assay strategy as RASL-seq (RNA Annealing, Selection, Ligation coupled with NGS). Pooled samples from up to 1,500 reactions in 384-well plates can then be sequenced in a single lane of an Illumina sequencing flowcell to obtain quantitative information on targeted transcripts.
We previously identified a gene signature associated with activated androgen receptor (AR) in prostate cancer cells and applied RASL-seq to identify chemicals that can specifically inactivate the AR pathway, thus establishing the proof-of-concept for gene signature-based chemical screens, which are equally applicable to druggable and non-druggable targets. We have also utilized the RASL-seq platform to screen for regulators involved in the regulation of pre-mRNA splicing and alternative polyadenylation in mammalian cells. Coupled with our efforts in technology development, we have developed a bioinformatics pipeline to process the data for quantitative and network analyses. The RASL-seq platform thus offers a general solution to pathway dissection in both genetic and chemical screens.
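The core counting step behind a RASL-seq readout can be sketched as demultiplexing reads by well barcode and tallying counts per targeted transcript or isoform. The read layout, barcode length and all sequences below are illustrative assumptions, not the authors' actual probe design:

```python
from collections import Counter, defaultdict

# Hypothetical read layout: [8-nt well barcode][40-nt ligated probe-pair sequence]
BARCODE_LEN = 8

def count_rasl_reads(reads, well_barcodes, probe_sequences):
    """Tally probe-pair counts per well from pooled sequencing reads."""
    counts = defaultdict(Counter)
    for read in reads:
        barcode, insert = read[:BARCODE_LEN], read[BARCODE_LEN:]
        # Keep only reads whose barcode and ligated insert match expected sequences
        if barcode in well_barcodes and insert in probe_sequences:
            counts[well_barcodes[barcode]][probe_sequences[insert]] += 1
    return counts

# Toy data: two wells, two probe pairs targeting specific exon junctions
reads = ["AAAACCCC" + "ACGT" * 10, "AAAACCCC" + "ACGT" * 10,
         "GGGGTTTT" + "TGCA" * 10]
wells = {"AAAACCCC": "A01", "GGGGTTTT": "A02"}
probes = {"ACGT" * 10: "GENE1_exon2|exon3", "TGCA" * 10: "GENE2_exon1|exon2"}
result = count_rasl_reads(reads, wells, probes)
```

A real pipeline would additionally tolerate sequencing errors (e.g. barcode mismatch correction) and normalize counts across wells.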
The use of DNA Encoded Library Technology to identify hits for less tractable targets
Stephen Rees, AstraZeneca
Over the last decade there has been increasing application of DNA Encoded Library (DEL) technology to complement traditional high throughput screening for hit discovery. DNA Encoded Libraries consist of hundreds of millions of molecules synthesised on a DNA tag, such that the structure of each small molecule is genetically encoded by the sequence of its tag. These libraries are tested against molecular targets in an affinity-based selection method. Through the use of such libraries it is possible to test huge numbers of small molecules without incurring the costs of creating a traditional small molecule library, and without the automation infrastructure required to store and test such compounds. Rapid advances in Next Generation Sequencing technologies have enabled this screening paradigm, allowed screening data to be mined in depth, and reduced the costs of screening. DEL technology has been adopted by many organisations, including AstraZeneca; in partnership with X-Chem we have screened over 40 targets using this screening paradigm. In this presentation I will describe the principles of the DEL platform and how AstraZeneca has applied it as part of our integrated hit discovery strategy, providing examples of the identification of hit series for less tractable targets and of the use of the DEL platform to identify novel binding sites on target proteins.
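One simple way NGS data from such an affinity selection can be mined is to estimate per-tag enrichment: the frequency of each DNA tag after selection relative to its frequency in the input library, with a pseudocount for tags that drop out. This is a deliberately simplified sketch, not X-Chem's or AstraZeneca's actual analysis pipeline:

```python
def del_enrichment(selected_counts, input_counts, pseudocount=1.0):
    """Per-compound enrichment: selected-pool tag frequency over input frequency."""
    sel_total = sum(selected_counts.values()) or 1
    inp_total = sum(input_counts.values()) or 1
    scores = {}
    for tag in input_counts:
        sel = selected_counts.get(tag, 0) + pseudocount  # pseudocount avoids zeros
        inp = input_counts[tag] + pseudocount
        scores[tag] = (sel / sel_total) / (inp / inp_total)
    return scores

# Toy sequencing counts: cmpdA dominates the selected pool despite being rare
# in the input library, so it scores as strongly enriched (a putative binder).
selected = {"cmpdA": 90, "cmpdB": 5, "cmpdC": 5}
naive = {"cmpdA": 10, "cmpdB": 45, "cmpdC": 45}
scores = del_enrichment(selected, naive)
```

Real DEL analyses also account for synthon-level (building-block) enrichment and sequencing depth, which this sketch ignores.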
Track Chairs: Taosheng Chen, St Jude and Louis Scampavia, Scripps
Session Chair: Franck Madoux, Amgen Thousand Oaks
SLAS2018 Innovation Award Finalist: Ultrafast all-optical laser-scanning imaging - Enabling deep single-cell imaging and analysis
Kevin Tsia, The University of Hong Kong
Studying cell populations, their transition states and their functions at the single-cell level is critical for understanding normal tissue development and the pathogenesis of disease. However, current platforms for single-cell analysis (SCA) lack a practical combination of throughput and precision, constrained by the prohibitive cost and time of performing SCA, which very often involves thousands to millions of individual cells; this largely explains the limited applications of SCA to date. For creating new scientific insights and enriching the diagnostic toolset, it is valuable to explore alternative biomarkers, notably biophysical markers, whose label-free nature maximizes the cost-effectiveness of SCA. Because they are closely tied to many cellular behaviours, biophysical markers can also complement and correlate with the information retrieved by existing biochemical markers with high statistical precision, providing a comprehensive catalogue of single-cell properties and thus a new landscape of the “Cell Atlas”.
Optical microscopy is an effective tool for visualizing cells with high spatiotemporal resolution. However, its full adoption for high-throughput SCA has been hampered by the intrinsic speed limits of the prevalent image-capture strategies: laser-scanning technologies (e.g. galvanometric mirrors) and image sensors (e.g. CCD and CMOS). Laser-scanning speed is fundamentally limited by the mechanical inertia of the mirrors, whereas the image-capture rate of CCD/CMOS sensors is fundamentally limited by the required image sensitivity. Notably, this speed-versus-sensitivity trade-off of the image sensor explains why the throughput of flow cytometry has to be scaled down from 100,000 cells/sec to 1,000 cells/sec when imaging capability is incorporated.
To address these challenges, we adopt two related techniques that give imaging flow cytometry an unprecedented combination of imaging resolution and speed. Sharing the common concept of all-optical laser scanning by ultrafast spatiotemporal encoding of laser pulses, these techniques, time-stretch imaging and free-space angular-chirp-enhanced delay (FACED) imaging, enable ultrahigh-throughput single-cell imaging with multiple image contrasts (e.g. quantitative phase and fluorescence imaging) at line-scan rates beyond tens of MHz (an imaging throughput of up to ~100,000 cells/sec). Moreover, they enable quantification of intrinsic biophysical markers of individual cells, a largely unexploited class of single-cell signatures known to correlate with the overwhelmingly investigated biochemical markers. All in all, these ultrafast single-cell imaging platforms could open new possibilities for deep machine learning of complex biological processes from image data of this enormous scale (from molecular signatures to biophysical phenotypes), especially for unveiling unknown heterogeneity between single cells and for detecting (and even quantifying) rare aberrant cells.
Automated Fluorescence Lifetime Imaging Assays in 2-D and 3-D Cell Culture
Sunil Kumar, Imperial College London
We present an open-source, automated multiwell-plate fluorescence lifetime imaging (FLIM) high content analysis platform to assay protein interactions and read out genetically expressed biosensors utilizing Förster resonance energy transfer (FRET). This instrument provides automated acquisition of optically sectioned fluorescence lifetime images utilizing a Nipkow spinning-disc scanner and wide-field time-gated imaging, and we have demonstrated its application to 2-D and 3-D cell cultures.
In 2-D cell culture we demonstrate the potential to assay protein-protein interactions in fixed or live cells, measuring the KD of interactions between the MST1 kinase and members of the RASSF protein family, which are important for apoptosis via the Hippo signaling pathway. This assay utilizes GFP- and mCherry-labelled proteins and combines quantitative fluorescence intensity measurements with FLIM data globally fitted to a double-exponential decay profile. Recognizing that this model does not correspond to the underlying behavior of fluorescent proteins undergoing FRET, because their slow rotational correlation time contradicts the assumption of FRET averaged over rapidly varying fluorophore dipole orientations, we have developed a new tool to analyse FLIM FRET data based on the assumption of static, randomly oriented fluorophore dipoles, which can be used to correct conventional FLIM FRET data.
To translate our FLIM FRET HCA platform to 3-D cell culture, we have applied the automated optically sectioning multiwell plate FLIM microscope to tumour spheroids expressing FRET biosensors. In particular, we have applied FLIM to cells expressing T2AMPKAR (a version of the AMPKAR FRET biosensor modified to replace CFP with mTq2FP) which can be used to map the activation of AMP activated protein kinase (AMPK), a key regulator of cellular energy homeostasis. We first validated this FRET biosensor in HEK293T cells by obtaining the dose response curve for known activators of AMPK in an automated 2-D FLIM FRET assay and then expressed T2AMPKAR in tumour spheroids and showed that we obtain equivalent dose response curves using the same multiwell plate optically sectioned FLIM platform. We have also studied the FLIM FRET readout of T2AMPKAR with multiphoton microscopy.
The software for automated FLIM data acquisition is written in Micromanager and our global analysis software tool, FLIMfit, is written in MATLAB. Both are openly available and can be accessed at http://www.imperial.ac.uk/photonics/research/biophotonics/instruments--software/openflim-hca/
High throughput 2D and 3D cell and whole-organism screenings in nanoliter format on Droplet-Microarray platform
Anna Popova, Institute of Toxicology and Genetics (ITG), Karlsruhe Institute of Technology
Small molecule high-throughput screens are essential to the fields of drug discovery and toxicology: hundreds of millions of compounds are screened every year against molecules (biochemical screens), cells, 3D cellular systems and even whole organisms. Routine screens in academia and the pharmaceutical industry are performed in microtiter plates. The main drawbacks of microplates for large experiments are, first, relatively high volumes and therefore high reagent and cell consumption, and, second, the requirement for pipetting robotics. For these reasons, not every biological laboratory can afford high-throughput experiments. Another essential drawback is the incompatibility of large screens with rare but physiologically relevant cells, such as patient-derived primary and stem cells, owing to the restricted amount of cell material.
We have developed a technology that allows screening of cells in 2D and 3D environments, and of whole organisms, in a miniaturized array format. The Droplet-Microarray technology is based on patterns of hydrophilic spots separated from each other by superhydrophobic, water-repellent regions. The difference in wettability between spots and borders generates the effect of discontinuous dewetting and enables spontaneous formation, without pipetting, of arrays of separated droplets of nanoliter to microliter volumes trapping live cells and even small animals.
In the past years we have developed all the necessary protocols for culturing cells in 2D and 3D environments, for parallel addition of compounds and reagents to individual droplets, and for performing various phenotypic assays with microscopy-based read-outs. Here I will present our latest developments and results on compound screenings of patient-derived leukemia cells, tumor spheroids and embryoid bodies.
Droplet-Microarray is a universal platform compatible with various biological assays, including compound screenings and transfection-based assays, on different cell types (adherent and suspension cells, stem cells, and primary cells), 3D spheroids, hydrogels and embryos. We believe that this technology will open new opportunities for high-throughput screenings that have not been affordable or possible with other technologies until now.
1-D High-Throughput Screening Assays for Primary Human T Cells
Steve Wang, Boston University
The parallel microfluidic cytometer (PMC) is an imaging flow cytometer that operates on statistical analysis of low-pixel-count, one-dimensional (1-D) line scans. It is efficient in data collection and operates on suspension cells. Our 1-D instrument leverages both the high throughput of traditional flow cytometry and the high spatial content of 2-D imaging cytometers. In this talk, we present a supervised, automated pipeline for the PMC that minimizes operator intervention by incorporating automated multivariate logistic regression for data scoring. The approach quantifies biomarker localization in activated T cells as a single descriptive ‘activity score’ readout. Reducing complex phenotypes to a simple readout has many advantages for drug screening and characterization. We test the self-tuning statistical algorithms on primary human T cells in flow with various drug-response assays, readily achieving an average Z’ of 0.55 and an SSMD of 13. The PMC is volume-efficient, needing only 4 µL of sample per well; anywhere from 3,000 to 9,000 independent sample tests can be processed from a single 15 mL blood donation. The parallel nature of our laser-scanning system enables high well throughput and is extremely scalable. We conclude that the new technology will support primary-cell protein localization assays and “on-the-fly” data scoring at a sample throughput of more than 100,000 wells per day, which is, in principle, consistent with large-scale primary pharmaceutical screens. We demonstrate that 1-D imaging provides many advantages for rapid development of primary T cell assays in flow.
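For reference, the Z'-factor and SSMD statistics quoted above are standard HTS assay-quality metrics computed from positive- and negative-control wells. A minimal sketch (the control values below are made up for illustration, not the authors' data):

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

def ssmd(pos, neg):
    """Strictly standardized mean difference between the two control groups."""
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    return (mp - mn) / (sp ** 2 + sn ** 2) ** 0.5

# Hypothetical control-well signals
pos = [100, 102, 98, 101, 99]
neg = [10, 11, 9, 10, 10]
zp, s = z_prime(pos, neg), ssmd(pos, neg)
```

By convention, a Z' above 0.5 indicates an assay window wide enough for screening, which is why the reported 0.55 supports the authors' throughput claims.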
Session Chair: Haian Fu, Emory University
High-throughput screening with a genetically engineered 3D organoids system
Haian Fu, Emory University
The development of organ-like 3D “organoids” with defined genomic backgrounds aims to recapitulate the 3D architecture of tumors in an in vitro environment for cancer biology studies and therapeutic development. Here, we report the development of an organoid HTS culturing system for compound screening. We have adapted a single air-liquid interface (ALI) chamber system, pioneered by the Kuo team, for efficient culturing of organoids. The ALI system combines accurate multilineage differentiation with the physiology of an in vivo system and allows vigorous expansion of primary organoids for in vitro manipulation and modeling. Using genetically engineered KRasG12D mouse pancreatic and colon organoids as a model system, we miniaturized the 3D culturing system for the growth of organoids from ALI to a 384-well format. The growth of organoids was monitored by automated imaging, which was implemented for the screening of anticancer agents. This study led to the discovery of compounds with genomic selectivity for therapeutic validation. Such a robust assay platform enables large-scale library screening to accelerate 3D culture-based synthetic lethal screening and drug discovery.
Development of a 3D-High Throughput Assay to Identify Compounds that Block the Growth of Patient Derived Glioma Stem Cells
Victor Quereda, The Scripps Research Institute
Glioblastoma (GBM) is the most aggressive primary brain cancer, with a recurrence rate of nearly 100% and a 5-year survival rate of less than 5%. Recent studies have shown that GBMs contain a small population of glioma stem cells (GSCs) thought to be major contributors to chemotherapy resistance and responsible for relapsed disease. Consequently, identifying compounds that modulate GSC proliferation may dramatically improve treatment response. While high throughput screening (HTS) assays for drug discovery have traditionally used 2D cancer cell models, these monolayer cultures are not representative of tumor complexity. To increase translational relevance, three-dimensional (3D) cell culture models have recently received more recognition. Furthermore, patient-derived GSCs can be grown as neurospheres and can functionally recapitulate the heterogeneity of the original tumor in vivo. Using patient-derived GSC-enriched cultures, we have developed a 1536-well spheroid-based cytotoxicity assay. In a pilot screen we tested ~3,400 drugs comprising most Food and Drug Administration (FDA)-approved drugs. This automation-friendly assay yielded an average S/B of 181.3 ± 1.81 and a Z’ of 0.77 ± 0.02, demonstrating a robust assay. Importantly, several compounds were identified as potential anti-GBM drugs in this pilot screen, demonstrating the applicability of this assay for large-scale HTS. These studies may provide a basis for expedited repositioning of these drugs into a GBM clinical study, given their well-characterized pharmacology and safety profiles in humans.
Primary Cell 3D Pancreatic Cancer Organoid Models for Phenotypic High-throughput Therapeutic Screening
Shurong Hou, The Scripps Research Institute Molecular Screening Center
Pancreatic cancer remains a leading cause of cancer-associated death, with a median survival of ~6 months and a 5-year survival rate of less than 8%. The tumor microenvironment promotes tumor initiation and progression and is associated with cancer metastasis and drug resistance. Traditional high throughput screening (HTS) assays for drug discovery use lab-adapted 2D monolayer cancer cell models, which inadequately recapitulate the physiologic context of cancer. Primary cell 3D culture models have recently received renewed recognition, not only because they better mimic the complexity of in vivo tumors but also because they are now cost-effective and efficient. Here we describe phenotypically relevant 3D cell culture in ultra-low-attachment, high-density 384- and 1,536-well plates using a magnetic force-based bioprinting technology. We validated HTS-amenable 2D and 3D spheroid/organoid-based cytotoxicity assays using 4 pancreatic cancer-associated cell lines against 5 known anti-cancer agents, and then screened ~3,400 drugs from the Approved Drug and National Cancer Institute (NCI) collections. Assay quality was notable, with Z’ averaging >0.8 across all assays and cell lines. As anticipated, results from the 3D screen differed significantly from those of the parallel screen performed on 2D cell monolayers. Collectively, these data indicate that a complex 3D cell culture can be adapted for quantitative HTS and may improve the disease relevance of assays used for therapeutic screening. Further analysis provides a basis for expedited translation into clinical study, given the drugs’ well-known pharmacology in humans.
RNA sequencing paired with a novel liver 3D cellular model: a breakthrough technology for high throughput drug-drug interaction screening
Noushin Dianat, ESPCI Paris
Drug-drug interaction (DDI) through induction of drug-metabolizing enzyme activity accelerates the elimination of concomitantly administered drugs, reducing or abolishing their therapeutic efficacy. This can be a major issue: for instance, CYP3A4 enzyme induction can lead to termination of a pre-development candidate or a lead candidate in clinical development. The capability to accurately predict the DDI risk of each drug candidate is therefore of great interest to the pharmaceutical industry. However, unlike other critical ADME factors, there is no existing high throughput screening (HTS) assay for cytochrome P450 (CYP) induction with which to screen compound libraries or large series. Moreover, the 2D-cultured human hepatocytes traditionally used in such tests do not reflect the in vivo physiology and metabolic activity of the human liver: loss of epithelial morphology after a few days in culture, a dramatic decrease in the expression and activity of phase I and II metabolizing enzymes, and low throughput are among the shortcomings limiting their use in DDI HTS. To circumvent these shortcomings, we have developed a novel technology, called "BioPearl", to fabricate miniaturized 3D spheroids from primary human hepatocytes for high throughput DDI screening. Using this technology, miniaturized core-shell capsules (350 µm in diameter) composed of a thin layer of alginate and a liquid core of cells are generated at a rate of 1,500 capsules per second. The "HepatoPearls" generated in this way display in vivo-mimicking characteristics, including: a) epithelial morphology with formation of a bile canaliculi network, b) a lifespan of more than 45 days, c) high and stable metabolic activity of phase I and II metabolising enzymes, d) albumin secretion, e) urea synthesis and f) CYP inducibility over 6 weeks.
In parallel, we have established a fully automated method combining row-column-plate RNA molecular barcoding with NGS RNA-Seq, allowing high throughput targeted quantification of CYP3A4 (one of the main liver drug-metabolizing enzymes) gene expression in HepatoPearls. With this approach, gene expression can be measured in over 2,000 samples in a single NGS run by simultaneous sequencing of reverse-transcribed, barcoded and amplified RNAs of the gene of interest and of a selected housekeeping gene serving as a stable expression reference. Sequences are then processed through an in-house bioinformatics script that generates CYP3A4-based DDI risk scores for the tested drugs. To validate the whole workflow, HepatoPearls were treated with a panel of 24 compounds (each at 4 concentrations, in triplicate) and the DDI risk through CYP3A4 induction was evaluated by the NGS RNA-Seq approach. Our results proved reproducible and more predictive of the FDA in vivo classification than the 2D model. In conclusion, we have established for the first time a robust and integrated workflow allowing high throughput DDI screening based on a 3D human liver cellular model coupled with a gene expression readout.
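In simplified form, a CYP3A4 induction score of the kind described above can be expressed as a housekeeping-normalized fold change over vehicle control. The function name, the read counts and the 2-fold flag below are illustrative assumptions, not the authors' actual scoring scheme:

```python
def fold_induction(treated_target, treated_hk, vehicle_target, vehicle_hk):
    """Housekeeping-normalized fold induction of the target gene vs. vehicle.

    Dividing each target-gene count by the housekeeping-gene count from the
    same sample corrects for differences in sample amount and sequencing depth.
    """
    return (treated_target / treated_hk) / (vehicle_target / vehicle_hk)

# Hypothetical NGS read counts: CYP3A4 vs. a housekeeping gene,
# for a compound-treated sample and a vehicle-control sample.
fold = fold_induction(treated_target=8000, treated_hk=1000,
                      vehicle_target=2000, vehicle_hk=1000)
flagged = fold >= 2.0  # a >= 2-fold induction is a common in vitro DDI flag
```

Here `fold` works out to 4.0, so this hypothetical compound would be flagged as a CYP3A4 inducer; a real risk score would also incorporate the concentration-response across the 4 tested concentrations.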
Session Chair: Shane Horman, GNF
Challenges of Quantitative High Throughput Confocal Microscopy of 3D Spheroids
Jeffrey Morgan, Brown University
Multicellular 3D spheroids provide far more biological complexity than standard 2D monolayer cell culture. There is growing excitement about the use of spheroids in phenotypic drug discovery and in the development of more predictive models of toxicity. However, unlike thin 2D monolayers, which are easily imaged, spheroids pose significant challenges to quantitative confocal imaging. This talk will address these challenges as they pertain to quantitative high throughput confocal microscopy.
A 3D High-Content Screening assay as in vitro model to study polycystic kidney disease
Hester Bange, Leiden University
Autosomal dominant polycystic kidney disease (ADPKD) is caused by mutations in either the Pkd1 or Pkd2 gene. The most important characteristic of this disease is the formation of cysts in the kidney, which reduces renal function and leads to end-stage renal disease. Although it is known that ADPKD is caused by mutations in the Pkd1 or Pkd2 gene, it is not yet understood why these mutations lead to cyst formation. Since cysts cannot form in conventional in vitro 2D cell culture, current research on ADPKD relies heavily on the use of animal models. The lack of proper in vitro models makes the study of this disease all the more challenging.
To address this, we developed a 3D high-content in vitro screening assay usable for mechanistic studies as well as target discovery in ADPKD. This culture system uses kidney collecting duct Pkd1 KO cells, which spontaneously form small cysts when cultured in our 3D hydrogel. In the presence of the test compounds, the cAMP inducer forskolin is added to stimulate cyst swelling. To examine the effect of the compounds on the swelling, cysts are fixed, stained and imaged. The 3D image stacks are analyzed with our Ominer™ image analysis software, capable of measuring many phenotypic characteristics, including cyst size, nucleus shape and cyst wall thickness. This also enables us to identify compounds that are effective without influencing cell viability, and to discard compounds with undesired therapeutic profiles. These methods are optimized for lab automation, capable of testing large compound libraries in a single experiment.
To follow up on previously presented work (Booij et al, SLAS Discovery, 2017), we screened a collection of 2320 natural products and bioactive compounds. Multiple hit compounds were identified and validated in vitro. Based on the phenotypic profile, we then selected several of these hit compounds for in vivo validation in mouse models. One of these compounds proved effective in reducing cyst progression and collagen deposition in a dose-dependent manner.
In these experiments, we show that this 3D in vitro screening model can be used to select compounds that have the desired phenotypic profile, which was validated in vivo. These results prove the applicability and reliability of this model in Drug Discovery for ADPKD.
Alzheimer’s Disease Modeling and High-throughput Drug Screening using Homogeneous Arrays of Human Neurospheroids
Mehdi Jorfi, Massachusetts General Hospital and Harvard Medical School
Neurospheroids serve as a widely accepted in vitro platform for disease modeling and drug screening. However, current approaches to recreating neurodegenerative diseases in a dish using neurospheroids rely on mixtures of spheroids that are heterogeneous in size, which limits their applications in basic mechanistic studies and drug screening. Here, we show the in vitro culture of uniformly sized, stem-cell-derived human neurospheroids in large arrays, where they can be monitored for months and closely recapitulate key hallmarks of familial Alzheimer’s disease, including pathogenic accumulation of amyloid-β (Aβ) and phosphorylated tau. The three-dimensional (3D) microarray system generates uniformly sized neurospheroids, with less than 1% variability in diameter, in a 96-well array with 1,536 microwells. This performance is key to measuring with unprecedented precision the efficacy and side effects of Aβ-modulating drugs in large-scale arrays. We also observed accumulation of amyloid-β and pathogenic phosphorylated tau species after 7-8 weeks of differentiation in our 3D neurospheroid model of Alzheimer’s disease, but not in the control 3D spheroids; this accumulation of amyloid-β was blocked by β-secretase inhibitor treatment. To further extend the capability of our array platform and accelerate drug screening of human neurospheroids, we leveraged microfabrication and 3D printing techniques to develop the 96-well array with 1,536 microwells. The array comprises five main components: (i) a 3D-designed and -printed 96-well frame, (ii) a high-quality glass substrate with transmittance over 92% and high optical clarity at fluorescence wavelengths, (iii) a microfabricated microarray with 1,536 microwells, (iv) a self-adhesive 96-well silicone superstructure that adheres to the microarray, and (v) a lid.
Using this array, we generated uniformly sized neurospheroids and treated them with various compounds, including a γ-secretase inhibitor, a β-secretase inhibitor, a γ-secretase modulator, imidazenil and methotrexate, at different concentrations. We have also confirmed that our microarrays can be used for differentiating human iPSC-derived neurospheroids and modeling disease phenotypes. The advantages of this microarray platform include, but are not limited to, facile microarray construction, ease of culture, high-throughput sampling, the low amount of reagents required to establish the neurospheroids, cost effectiveness, and compatibility with a wide range of commercially available automated handling machines. This robust in vitro platform could serve as a valuable next-generation tool for sophisticated 3D models of complex neurodegenerative diseases such as Alzheimer’s disease, expediting central nervous system drug discovery.
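The sub-1% size variability claimed above corresponds to the coefficient of variation (CV) of spheroid diameters measured across microwells. A minimal sketch, with made-up diameter measurements rather than the authors' data:

```python
import statistics

def diameter_cv_percent(diameters_um):
    """Coefficient of variation (%) of measured spheroid diameters."""
    return 100 * statistics.stdev(diameters_um) / statistics.mean(diameters_um)

# Hypothetical diameters (in µm) segmented from images of individual microwells
diameters = [200.0, 201.0, 199.0, 200.5, 199.5]
cv = diameter_cv_percent(diameters)  # well under 1% for this tight sample
```

A low CV is what makes dose-response comparisons across wells meaningful, since spheroid size itself strongly affects drug penetration and readout.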
SLAS2018 Innovation Award Finalist: An Ultra High-Throughput 3D Assay Platform for Evaluating T-cell-Mediated Tumor Killing
Shane Horman, GNF
3-dimensional cellular assay platforms are increasingly recognized as robust surrogates for mimicking in vivo disease pathology. In particular, the multicellular spheroid model has been widely utilized in exploratory drug discovery campaigns. However, these complex 3D cell models have previously been restricted to low- or medium-throughput formats owing to the technical logistics of forming spheroids in a 1536-well microtiter plate. We have developed a novel microphysiological 3D assay that quantitates T-cell-mediated killing of 3D colorectal cancer tumor spheroids using a new 1536-well spheroid plate. This assay incorporates CD3-stimulated primary patient T-cells in culture with colorectal cancer tumor spheroids and enables parallel assessment of spheroid size and viability as well as T-cell penetration into the 3D spheroid structure. Using this assay platform, we screened a library of annotated compounds for effects on spheroid viability and discovered several small-molecule candidates that synergize with CD3 stimulation and enhance T-cell-mediated tumor spheroid killing. This phenotypic 3D cell model represents a robust organotypic ultra-HTS platform that can greatly enhance immuno-oncology drug discovery programs.
Session Chair: Sam Michael, NIH/ NCATS
Integrating Environmentally Friendly Tactics into a High-Throughput Screening Setting
Carleen Klumpp-Thomas, NIH/NCATS
In everyday life, many practices are in place to recycle, minimize waste, use cleaner energy and reuse materials to reduce the impact on our planet’s ecosystem, with millions of individuals around the world choosing to conserve with these principles in mind. However, this conscientiousness to conserve is rarely on the radar in high-throughput screening and science in general, where the term ‘consumable’ is ubiquitous and pipette tips, petri dishes, microplates, solvents and an extensive list of other materials are disposed of after a single use every day, with most of this waste needing to be handled as chemical or biohazardous waste, further increasing the negative environmental impact. This mentality is changing. With the availability of new technologies, and with experimental data proving their effectiveness, NCATS has been able to implement and adopt several methods across many aspects of its high-throughput screening processes that are friendlier to the environment than the traditional equivalents. In many cases these eco-conscious practices yield higher-quality, cleaner data and can even eliminate the need to repeat automated assays by catching detrimental issues in real time. The focus here will be on the integration of equipment onto peripheral devices of the robotic screening platforms, on processes, and on supporting modular operations, with the overall goal of conservation. Not only will the use of these concepts be demonstrated, but their successful adoption will be shown with supporting data. Over the last 7 years NCATS has not only been mindful but has continuously advanced and developed tactics to minimize waste without sacrificing high-quality data.
This ultimately shows that science, and high-throughput screening specifically, can evolve to incorporate environmentally friendly techniques while continuously advancing the field.
Beyond Small Molecules: Translational Biology Drives Automation Evolution
Jonathan Lippy, Bristol Myers Squibb
Laboratory automation and liquid handler capabilities have evolved since the mid 1990’s to keep pace with the ever changing scientific landscape. Early application of automation was focused on small molecules and high-throughput screening. Since that time, demands from the scientific community have brought the development of newer technologies and enhanced requirements of automation and liquid handler capabilities. No longer does the “one size fits all” automation strategy work for early and late stage drug discovery. Demands to move towards high-throughput screening with biologics, millamolecules and antibody drug conjugates places new requirements on the automation capabilities. The need for more flexible, adaptable and dynamic automation with smaller footprints and enhanced capabilities has become the norm. Here we describe a transformational approach to evolve from in-house to commercial automation and enable multi-modality capabilities from Hits-to-Leads. Over the past 15 years, the marketplace has grown significantly and in-house solutions have become obsolete. The movement towards non-typical reagents such as, whole blood, primary cells and human matrices has driven our requirement to establish flexible automation. Furthermore, complex assay formats such as high content imaging, flow cytometry and kinetic readouts have pushed demands beyond traditional single mode biochemical and reporter based readouts. We have implemented a fit-for-purpose approach that provides high-fidelity integrated automation to drive HTS while providing connectivity with Lead Optimization through modular flexible systems. We have delivered a fleet based and standardized solution to drive usability, reduce footprint and minimize downtime. Additionally, we have connected bioassay processes with compound informatics to drive closed loop screening capability and deliver screening process efficiencies. 
This holistic approach has provided state-of-the-art capability to keep pace with the ever-changing demands of the scientific and technology landscape.
Lab Automation Drones for Mobile Manipulation in High Throughput Systems
Dongbin Kim, University of Nevada, Las Vegas
Lab automation employs a wide range of robots to accelerate sample handling. In high throughput screening (HTS), for example, manipulators and transfer lines rapidly move micro-plates among numerous test stations; the net result is that a typical HTS system can handle over 500,000 samples a week. In the age of big data, higher throughput means faster pharmaceutical development and hence quicker patent registrations and earlier market penetration. HTS systems are often custom-tailored to maximize throughput with many high-precision 6-DOF robot manipulators. Such robots employ parallel jaw grippers to gently and precisely position and orient micro-plates.
However, once configured, they are not easily changed. This matters because, as new tests emerge, older HTS systems cannot easily perform them. The National Institutes of Health (NIH) in the United States is looking at the potential of lab automation drones to add flexibility to existing HTS systems. The notion has merit; aerial manipulation is an active research area. High degree-of-freedom (DOF) robots with dexterous arms have been applied to transformative applications such as material handling, disaster response, and personal assistance, and micro-plates are relatively easy to robotically lift and orient. Issues like ground effect, limited battery life, and obstacle avoidance are relevant to lab automation but remain open research topics. The critical gap for a lab automation drone appears to be the lack of suitable aerial manipulation arms and grippers.
Recently, several configurations, including single-DOF aerial grasping and non-redundant and fully redundant articulated aerial manipulation, have been explored to create manipulation systems. But all the arms in aerial manipulation are serial; a motor in each joint results in a heavy arm. To the authors' best knowledge, the authors' lab was the first to introduce a parallel-mechanism arm for aerial manipulation. Previous work demonstrated its higher precision and lower torque impact on the drone's stability versus serial manipulators. In this work, the authors present the design of a 6-DOF parallel-mechanism arm with a sensorized parallel jaw gripper. The test-and-evaluation approach and results are given.
An end-to-end automated solution for provisioning compounds from a large liquid library for target and hit identification efforts
Keith Miller, Pfizer WR&D
Pfizer, like many large pharmaceutical companies, holds millions of compounds in its collection. This presentation will detail the migration from a single-use tube technology requiring a fit-for-purpose building to a standard-lab-footprint automated system utilizing multi-use containers and acoustic liquid dispensing. The resulting solution is seamlessly integrated with a commercially available Enterprise compound-to-assay requesting tool: any member of the global organization can order compounds for plating to any number of assays, with plate shipment to any location. The solution, called the Hit ID Provisioning System (HIPS), incorporates rule-based automation behavior in producing final deliverables of assay-ready plates, ensuring plate quality while minimizing stock consumption. A novel compound binning algorithm compensates for the limitations of current acoustic liquid handling logistics. Key considerations around implementing new technology platforms will be reviewed, examining how HIPS was rolled out to give Pfizer researchers and collaborators access to the compound collection without interruption while improving plate quality and saving resources.
Session Chair: Jason Matzen, GNF Systems
SLAS2018 Innovation Award Finalist: High-throughput single-cell DNA sequencing of AML tumors with droplet microfluidics
Dennis Eastburn, Mission Bio, Inc.
Single cell analysis tools are crucial to understand the role that rare or heterogeneous cancer cells play in tumor progression. To enable the characterization of genetic variation within cancer cell populations, we developed a novel approach that barcodes amplified genomic DNA of individual cells confined to microfluidic droplets. The barcodes are used to reassemble the genetic profiles of individual cells from next-generation sequencing data. A key feature of our approach is the "two-step" microfluidic workflow. The workflow first encapsulates individual cells in droplets, lyses the cells and prepares the genomic DNA for amplification with proteases. Following this lysate preparation step, the proteases are inactivated, and droplets containing the genomes of individual cells are paired with molecular barcodes and PCR reagents. We demonstrate that this two-step approach outperforms single-step workflows, enabling efficient DNA amplification on thousands of individual cells per run with high coverage uniformity and low allelic dropout of targeted genomic loci.
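As an illustration of the reassembly idea, the sketch below groups sequencing reads by their droplet barcode to rebuild per-cell profiles. The read format and all names are hypothetical stand-ins, not Mission Bio's actual pipeline:

```python
from collections import defaultdict

def group_reads_by_barcode(reads):
    """Group (barcode, amplicon, sequence) read tuples into per-cell profiles.

    Each droplet barcode identifies one cell, so collecting reads that share
    a barcode reconstructs that cell's targeted genomic profile.
    """
    cells = defaultdict(lambda: defaultdict(list))
    for barcode, amplicon, sequence in reads:
        cells[barcode][amplicon].append(sequence)
    return cells

# Hypothetical reads from two droplets (cells) over two targeted loci
reads = [
    ("AACG", "TP53_ex5", "ACGT"),
    ("AACG", "IDH1_R132", "GGTA"),
    ("TTCA", "TP53_ex5", "ACGA"),
]
cells = group_reads_by_barcode(reads)  # two cells reassembled from three reads
```

In a real pipeline the grouped reads per cell would then be aligned and variant-called to assign SNP/indel-defined clones; the grouping step itself is the part the barcodes make possible.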
To apply our single-cell sequencing technology to human tumor samples, we developed a targeted panel to partially sequence 23 genes frequently mutated in acute myeloid leukemia (AML) including TP53, DNMT3A, FLT3, NPM1, NRAS, IDH1 and IDH2. Using this panel, we were able to sensitively assay SNP and indel defined clones within AML samples collected longitudinally at the time of diagnosis, remission and relapse. Our single-cell data indicates that clonal populations inferred from VAFs obtained from bulk sequencing data may not fully resolve the heterogeneity within tumors; moreover, the single-cell nature of our approach enabled the unambiguous colocalization of multiple mutations within subclones not possible with bulk measurements. Collectively, our results show a greater degree of heterogeneity in AML tumor samples than is commonly appreciated with traditional sequencing paradigms and they demonstrate the value of single-cell analysis for AML.
An Approach to Neglected Disease through Automation, Collaboration and High Value Chemical Libraries.
Mitchell Hull, California Institute for Biomedical Research
Traditionally, high throughput screens set large compound libraries against single targets, usually represented by simple biochemical, reporter gene or viability assays. The chemical libraries were often derived by combinatorial chemistry methods and generally had unknown value or bioactivity. To maximize this approach, programs frequently focused on target families that could be screened in the same format. This methodology generated large amounts of data very quickly, but it was not always proportionately successful. This approach has waned, and there is a shifting trend toward disease-based programs, which emphasize target diversity and multiple biologically relevant assays screened against smaller chemical libraries. Traditional screening is still relevant and continues at Calibr; however, we have embraced the latter approach, specifically in our mission to address neglected disease. To improve the chemical matter discovered from smaller screens, we created a library of high-value pharmacophores with known or suggested bioactivities. Implementing the most relevant bioassays for neglected diseases can be challenging in a standard lab: not only do they require high biosafety levels, they may require atypical facilities and skill sets such as insectariums and continuous culturing of whole organisms. To expand assay diversity, we reached out to other organizations and academic labs to collaborate on screens that would supplement our own programs. Through these relationships, we expanded the assay diversity of our programs, and collaborators gained access to our library and to our capabilities, assistance and expertise. We will describe the challenges, solutions and successes of this approach, specifically in providing assay support, compound management and data curation to these collaborators.
SLAS2018 Innovation Award Finalist: Combinatorial Drug Screening, High-Throughput Flow Cytometry, and Agile Integration: a Modern Platform for Personalized Treatment Discovery for Cancer Patients
Transon Nguyen, Notable Labs
Our mission at Notable Labs is to identify actionable, personalized treatments for cancer patients. To help us achieve this goal, we have developed a platform that combines combinatorial testing, drug repurposing, and several high-throughput technologies to automate our phenotypic screens on primary patient samples. Our current focus is in acute myeloid leukemia (AML) and other hematological malignancies, although the platform and assay can be extended to other indications.
Our automation platform includes our custom-made laboratory information management system (LIMS) working in tandem with our robotic workcell, which handles all screening and assay operation. The architecture of this system is designed to allow for separation between conceptualization and execution; scientists can plan their screens and experiments through the LIMS, then walk to the workcell and start an automated run that executes their plan, with minimal preparation.
The core assay in this platform is a high-throughput flow cytometry assay that generates a wealth of phenotypic data on the primary patient samples that are run through the system. Over the course of the assay, the robotic scheduler serves as a middleman for information flow between the LIMS and the workcell instruments themselves. The LIMS sends relevant data about the screen to the scheduler, which in turn acts on the data and directly controls the instruments to run the assay.
After completion of an assay, raw data flows back from the workcell instruments to the scheduling software, which consolidates the data and uploads it to our cloud-based LIMS. From there, a number of in-house software tools are used to streamline our flow cytometry data analysis, allowing scientists to analyze complex, multi-dimensional flow data across thousands of wells.
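The LIMS-to-scheduler-to-instrument flow described above can be sketched in miniature. All class, instrument and field names here are hypothetical stand-ins, not Notable Labs' actual software:

```python
class Scheduler:
    """Toy middleman between a LIMS and workcell instruments (illustrative only).

    The LIMS hands the scheduler a screen plan; the scheduler dispatches each
    step to the named instrument and consolidates results for upload back
    to the LIMS.
    """
    def __init__(self, instruments):
        self.instruments = instruments  # name -> callable instrument driver

    def run_screen(self, plan):
        results = {}
        for step in plan["steps"]:
            instrument = self.instruments[step["instrument"]]
            results[step["name"]] = instrument(step["params"])
        return {"screen_id": plan["screen_id"], "results": results}

# Hypothetical instrument drivers as plain callables
instruments = {
    "dispenser": lambda p: f"dispensed {p['volume_nl']} nL",
    "cytometer": lambda p: f"acquired {p['events']} events",
}
plan = {
    "screen_id": "S-001",
    "steps": [
        {"name": "dose", "instrument": "dispenser", "params": {"volume_nl": 25}},
        {"name": "read", "instrument": "cytometer", "params": {"events": 10000}},
    ],
}
upload = Scheduler(instruments).run_screen(plan)  # would be posted back to the LIMS
```

The design point this illustrates is the separation between conceptualization (the plan, authored in the LIMS) and execution (the scheduler driving instruments), so scientists never script instruments directly.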
The platform has been validated across a number of patients. Our data have led to actionable treatment options for relapsed and refractory patients that have, in some instances, resulted in complete remission in AML patients. The platform and architecture we have developed bring together our collective hardware, software, and biological knowledge, and demonstrate the predictive power of our data-driven approach to personalized medicine for cancer patients.
Co-development of Instruments and Assays to Optimize Biologically Relevant Screening/Profiling Systems
Jason Matzen, GNF Systems
GNF develops and manufactures high throughput drug discovery platforms for both small molecules and biologics. The High Throughput Screening (HTS) and Automated Cellular Profiling (ACP) systems feature high capacity, excellent performance and reliability. The highly integrated systems are complemented by smaller modular systems that allow for rapid reconfiguration and ultimate flexibility. More recently, these systems and devices have been challenged by more complex 3D, physical stimulation, highly multiplexed gene expression/NGS readouts and other phenotypic assays coupled with complex cell sources such as primary and co-cultures. This presentation will highlight several case studies where the co-development of both instruments and assays resulted in unique and powerful solutions.
Session Chair: Louis Scampavia, Scripps Florida; Department of Molecular Therapeutics
DIY integration of a Hamamatsu FDSS to a High-Throughput Screening System: a problem-solving and design-for-manufacture exercise, and a supporting case for the value of in-house prototyping ability
Eric Wallgren, National Institutes of Health - National Center For Advancing Translational Sciences
A perception exists in the life sciences field that the creation, implementation, integration, modification and maintenance of instrumentation are tasks exclusively to be outsourced to dedicated vendors. At the National Center for Advancing Translational Sciences (NCATS) we believe that readily available solutions can be quick and cost-effective, but if the science or the scientist dictates a new tool that doesn’t yet exist, the ability to quickly design and produce real, usable instruments can tremendously accelerate progress.
This value is proven by our in-house integration of a Hamamatsu FDSS7000EX Functional Drug Screening System to an existing single-arm High Throughput Robotic Platform, and further integration of an Ion Field Tip Charger plasma pin tool cleaning system to that FDSS.
This project demonstrates that the process of problem solving is of enormous importance to the outcome. Good design involves engineering, but not exclusively so; time spent at the very beginning to “consider what bears consideration” is always time well spent, and it’s often the least expensive time billed to the project.
“A designer is an emerging synthesis of artist, inventor, mechanic, objective economist and evolutionary strategist” – R. Buckminster Fuller
This quote encapsulates the concept that problem solving for the life sciences is an open-ended, inquisitive process in which diverse disciplines spanning engineering and sociology must be given equal consideration.
Open development from user to vendor and back again: how everybody wins
Neil Benn, Ziath
The 'vendor' and 'user' communities are becoming increasingly intertwined, with the user community taking advantage of modern prototyping and manufacturing technologies such as 3D printing, micro-controllers and laser cutting. In addition, the vendor community often uses these same technologies to produce its products. The result is a degree of overlap such as we have never seen before.
The presenter has previously worked in the instrument user community at major pharmaceutical companies, large biotech, startup biotech and academia. During this period, a close collaborative relationship between user and supplier led to improved performance and increased reliability of the equipment purchased. The same person now runs an instrument company that sells to end users, and we now see the other side of the coin: how to support equipment in the field as a manufacturer. Concepts such as printing your own spare parts and even flat-pack-style delivery will be explored, along with their practical realities: releasing the designs of a product's internal parts could leave a company's designs open to reuse by a competitor, and users vary in their willingness to carry out repairs themselves.
This presentation will discuss how this ‘overlap’ can be leveraged to produce better products, better interaction and better results for all parties. This presentation will explore opportunities to further these aims and bring the supplier and user of everyday laboratory equipment together.
In-house software and processes to support High Content Screening of Primary Neurons
Pierre Baillargeon, Scripps Florida
The integration of High Content Screening (HCS) devices onto High Throughput Screening (HTS) platforms to support neuroscience research presents unique challenges for drug discovery teams. In particular, the informatics aspect of HCS applied to neuroscience is an area where advances in software automation can result in substantial throughput and efficiency gains for researchers. In addition to the data processing and storage requirements of HCS above traditional HTS readers, neuroscience assays present a number of unique challenges for the HTS environment. These challenges include ensuring data integrity from acquisition through analysis & QC, porting data between multiple distinct HCS platforms and providing end-user analytic tools for ongoing intermediate assay results.
This presentation will focus on the development and implementation of novel in-house software utilities used at Scripps Florida to support the Synaptogenesis neuroscience drug discovery project. This project utilizes multiple HCS platforms, has an ongoing non-traditional HTS timeline and requires on-demand access to the full HCS data stream, from raw source images to final endpoint results. The biology of the Synaptogenesis project is currently amenable to 384-well format, while the Scripps Florida uHTS platform is optimized for 1536-well screening. Supporting an HTS campaign where the compound collection resides in a different plate density than the assay plates required the development of custom robotic and informatics procedures. The Synaptogenesis endpoint calculation requires measurements at DIV 12 and DIV 14, where each assay plate is represented by 3,072 images (384 wells at 4 images per well for each of two separate reads) which must be associated with relevant metadata (plate barcode, well row, well column and well quadrant) for downstream tracking. Previous neuroscience assays in this format were not amenable to robotic screening and were limited in throughput. To date, we have successfully screened a large portion (greater than 40,000 compounds) of the Scripps Drug Discovery library, in quadruplicate, iteratively over a 9-month period.
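The metadata bookkeeping scales exactly as stated: 384 wells x 4 images x 2 reads = 3,072 image records per plate. A minimal sketch of that enumeration, with illustrative function and field names (not the actual Scripps software):

```python
from itertools import product
import string

def image_metadata(plate_barcode, days=(12, 14), rows=16, cols=24, fields=4):
    """Enumerate metadata for every image of a 384-well plate read at two DIVs.

    A 384-well plate is 16 rows (A-P) by 24 columns; each well is imaged in
    4 fields (standing in for the well quadrant) at each of two reads.
    """
    records = []
    row_labels = string.ascii_uppercase[:rows]  # 'A'..'P'
    for div, row, col, field in product(
            days, row_labels, range(1, cols + 1), range(1, fields + 1)):
        records.append({
            "plate": plate_barcode,
            "div": div,
            "well": f"{row}{col:02d}",
            "field": field,
        })
    return records

records = image_metadata("SYN-0001")  # 2 x 16 x 24 x 4 = 3,072 records
```

Generating the full record set up front lets every acquired image be joined to its plate, well and read by position, which is the association the abstract describes for downstream tracking.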
To meet these challenges, custom software tools have been developed that enable scientists to manage what would otherwise be an overwhelming amount of data. These tools include a web-based portal that allows end users to easily review HCS neuroscience data as it comes off the HTS platform and to quickly drill down from endpoint data back to the images acquired by the reader. The advantages of developing in-house informatics over commercial products and the impact of these tools on the ongoing neuroscience research at Scripps Florida are presented.
Routine lab automation with culture dishes made easy - How the PetriJet platform technology helps make drinking water analysis faster and better
Felix Lenk, TU Dresden INT
Digitalization, automation and miniaturization are changing the way we live and work, and they also affect daily work in laboratories. The disruptive development of new technologies such as open-source automation technology, the Internet of Things (IoT) and 3D printing offers endless possibilities for in-house engineering of new laboratory devices that are compact, adaptable and smart.
At the SmartLab systems department of the Technische Universitaet Dresden, Germany, approaches for the laboratory of the future have been developed and implemented. These include the PetriJet platform technology, which was developed to automate all processes associated with culture dishes in environments such as routine laboratories for drinking water or blood samples, as well as culture development for the next generation of antibiotics.
Technically, the device is an x-y robot consisting of two linear axes, able to transport culture dishes of variable sizes and shapes from A to B using a 3D-printed gripper system that can also remove the lid of a culture dish. At the core of the programming is self-learning control software that does not need any teaching, usually the most time-consuming part of setting up a typical robot. With the presented solution, an experiment conducted on samples is planned only once and executed for all culture dishes in the machine once the right processing stations, e.g. sample imaging, are installed. It is not necessary to specify locations for culture dish piles; treated dishes are allocated dynamically, and user interactions are directed by LED lighting. The system can process more than 1,200 culture dishes in an 8-hour shift and is equipped with a storage unit for these dishes. Several further processing stations, e.g. for sample plating or drug discovery, are under development.
The first PetriJet platform, designed and built in the SmartLab systems lab in Dresden, Germany, now operates in a routine laboratory for drinking water analysis in Hamburg, Germany, where it visually inspects a four-digit number of drinking water samples for contamination with Legionella bacteria every day.
The system's integrated image analysis software counts colonies on direct samples as well as on filters, sorts the samples and is linked to the company's LIMS, speeding up the inspection process by a factor of four and making the data available to customers just in time, even after a night shift. Inspection quality and throughput have increased significantly, and stored proof images remain available even weeks after sample treatment, enabling a completely new approach to data mining and infection tracking.
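Colony counting of this kind typically reduces to labeling connected regions in a thresholded image. A minimal pure-Python sketch of that step, offered only as an illustration of the principle (not the PetriJet's actual software):

```python
def count_colonies(mask):
    """Count connected foreground regions in a binary image (4-connectivity).

    A crude stand-in for the colony-counting step: after thresholding the
    plate photo, each connected blob of foreground pixels is one colony.
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    colonies = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                colonies += 1
                stack = [(r, c)]  # flood-fill the whole blob
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return colonies

# Hypothetical thresholded plate image containing two separate colonies
plate = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 1],
]
```

Production systems would add thresholding, noise filtering and size gating before labeling, but the connected-component count is the quantity reported to the LIMS.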
Track Chairs: Daniel Sipes, GNF and Rob Howes, AstraZeneca
Session Chair: Andy Zaayenga, SmarterLab
Accelerating clinical and translational research for biomarker discovery through advanced, standardized cell isolation methodologies
Rohit Gupta, Stanford University
Specimens are not meant to live in a freezer. Their sole purpose in life is to produce data. Biorepositories are critical to accelerating clinical and translational research technologies and discoveries. Human subject research depends on the availability of standardized biorepository methods for collection, storage, processing, and distribution of biological specimens alongside associated patient metadata. Stanford Medicine's growth across the Bay Area has created an opportunity for us to connect participants to bench-side research in ways never before possible. Our biobank has an emphasis on more advanced sample processing geared at downstream, functional analysis using viable cell suspensions. Work is often performed in tandem with specialized assay groups, such as the Human Immune Monitoring Center (directed by Dr. Holden Maecker), to leverage cutting-edge technologies such as CyTOF, single-cell RNA sequencing, flow cytometry, and immunoassay. Many of these assays require specimen types from humans that have been processed using very specific techniques and methodologies to prevent the introduction of artifacts. In particular, the standardization of cell isolation procedures is critical to the success of translational research; by increasing throughput and minimizing 'hands-on' time, biomarker discovery applications can be accelerated and their results reproduced.
At Stanford Medicine, the lab of Dr. Irving Weissman has coordinated the build-out of a unique biorepository dedicated to the collection and advanced, standardized processing of tissue and tumors into viable single-cell suspensions. Over the course of the last two years, tumors with matched normal tissue have been collected alongside archives of clinical, pathology and surgical notes. In November 2016, Sydney Gordon, a graduate student in Weissman's lab, discovered novel increased expression of PD-1 on tumor macrophages in mouse models of colon cancer. Sydney was hoping to translate her findings into humans and did so by taking advantage of Stanford's biorepository. Within a matter of two months, Sydney was able to repeat her discovery on human samples that had been preserved for functional analysis. She validated her findings of increased PD-1 expression on human colon cancer tumors, opening the door to multiple discussions around new drug therapeutics in cancer immunotherapy. Key to her success was the standardization and care taken to procure and process the solid tissues into single-cell suspensions with minimal impact to the cell surface antigens and to cryopreserve the cells viably. Automating parts of this pipeline presents an opportunity to greatly improve throughput, while also standardizing the methodology away from technician variability.
Automated biobanking in R&D at Regeneron Pharmaceuticals
Rostislav Chernomorsky, Regeneron Pharmaceuticals
Regeneron Pharmaceuticals is a leading science and technology company delivering life-transforming medicines for serious diseases. Founded by physician-scientists nearly 30 years ago, our science-driven approach has resulted in five FDA-approved medicines and numerous product candidates in a range of diseases, including asthma, pain, cancer, and infectious diseases. In addition to our medicines, our innovations include the VelociSuite® technologies, world-class manufacturing operations, one of the largest human genetics sequencing efforts in the world, and rapid response technologies being used for global good.
At Regeneron, we strive to use the most innovative and cutting-edge technologies to expedite and assure success in the drug discovery process. Large-scale biobanking, combined with proprietary automated solutions developed at Regeneron, allow us to manage and process large numbers of samples, ranging from tissues to nucleic acids and proteins. Accurate, highly-efficient sample storage allows researchers at Regeneron to store and extract several thousand unique samples per week for research purposes; done manually and without the aid of an automated biobank, this step would consume precious hours of human resources, lead to sample tracking errors, and result in unpredictable downstream problems in data analysis.
Due to the nature of research and discovery, and the associated need to store a wide range of tissues, we employ various cold storage solutions. As the number of programs increases, so do the complexity of sample variety and the need to track every sample. To complement the large number of manual freezers, we began to strategically implement automated biobanks at different locations. One of the main drivers for biobanking is to gain efficiency in sample handling while maintaining full integrity of each sample and its whereabouts.
Our range of automated biobanks is as diverse as the samples we store in them. We store nucleic acids, various tissues, cells and proteins under appropriate storage conditions. All biobanks are capable of automatic sample handling: receiving jobs from the LIMS, barcode scanning during sample introduction, sample picking and data transmission back to the respective LIMS. A fully automated biobank saves a significant amount of time compared to manual sample processing.
Currently we use automated cold storage solutions ranging from -20°C to -150°C from various companies to match the respective tissue storage conditions. By carefully evaluating the needs of each group and the capability of each individual biobank, we were able to match the biobank hardware to the respective group, enabling different departments to be more efficient.
Genomics-driven Drug Discovery at the Regeneron Genetics Center
John Overton, Regeneron Genetics Center
The Regeneron Genetics Center (RGC) was officially launched in early 2014 with a goal of advancing basic science around the world by providing valuable genetic information to researchers, physicians, and patients to ultimately help identify novel targets for Regeneron drug development. The RGC has used innovative sample biobanking, proprietary automated technology, advanced DNA sequencing, and state-of-the-art cloud computing to - in just over three years of sequencing - build one of the world’s most comprehensive genetics databases. The database currently contains the genetic information of well over 200,000 patient volunteers, many of which are paired with detailed de-identified medical records, and continues to grow rapidly.
Automated sample biobanking and sample preparation have allowed us to manage and process massive numbers of DNA samples with near-perfect fidelity and achieve incredibly high levels of DNA sequencing efficiency. Well-engineered sample preparation platforms have allowed us to scale production more than 10-fold over the past three years without adding any additional infrastructure or headcount. While quickly expanding our efforts and improving our processes, we have raised our data quality standards above traditional expectations while significantly cutting data production costs.
Building on Regeneron’s strengths in mouse genetics and genetics-driven drug discovery, the large amount of information that is generated at the RGC allows us to elucidate the genetic factors that cause or influence a vast range of human diseases and ultimately will make drug development faster and more precise. We are using our unique database and capabilities to build better-informed clinical trials, to validate our existing programs, and to identify new drug targets and therapeutic indications; several successful applications of this already exist and will be presented. Based on genetic information we have advanced programs into human studies, started programs in early stage clinical development, and we have identified numerous new drug candidates for possible development programs.
Bridging the Materials Management Gap: Leveraging Amgen’s Compound Management Infrastructure to Include Biologics
Christina Glazier, Amgen Inc
Amgen's Research Materials Management group was originally built to manage the research small molecule collection. The software, automation, and processes put in place allowed us to grow the collection 10-fold while simultaneously reducing staffing requirements. This presentation explores utilizing small molecule management "tricks of the trade" to enable efficient biologics sample tracking, including barcoding strategies, temperature requirements, and standardization of consumables and attributes within sample types. As a result of this initiative, the distribution network of hundreds of freezers and materials across Amgen is now searchable globally.
Session Chair: Daniel Sipes, Genomics Institute of the Novartis Research Foundation
Development of an Automated High Throughput CHO Stable Pool Platform for Generating Large Protein Collections
Paul Anderson, GNF Systems
Recombinant protein expression and purification is a central process in biomedical research, and Chinese hamster ovary (CHO) cells are a primary workhorse for protein production from mammalian cells. GNF has developed a robust suite of software and automated systems to support high throughput CHO (HT-CHO) stable pool establishment, archiving of cell banks and protein purification. Pools are established in 96-well plates, maintained until they are ready for scale-up, and then expanded into an AutoFlask™. Once cells reach the desired density, cell bank archives are created and one or more batch production AutoFlasks™ are inoculated, depending on the amount of protein requested. As an example, a single 50 mL culture expressing a human IgG1 antibody typically yields 10 milligrams of protein. Innovative solutions, ranging from a new software Dashboard to manage projects and execute processes, to a recently developed non-invasive Flask Density Reader, to an upgraded harvest and purification system compatible with magnetic beads, will be presented. This platform enables cost-effective, facile production of proteins at quantities and quality useful for early stage drug discovery tasks such as screening, protein engineering and even in vivo studies.
Receptor Mediated Delivery of Antisense Oligonucleotides to Pancreatic β-cells
Johan Meuller, AstraZeneca
Therapies reversing loss of β-cell mass, a primary defect in type 2 diabetes (T2D), are an unmet medical need. Antisense oligonucleotides (ASOs) represent a promising drug modality for β-cell regenerative treatments, since they can be used to knock down genetic targets linked to β-cell loss in type 2 diabetes. However, pancreatic β-cells are resistant to uptake of ASOs from systemic circulation.
Conjugation of ASO to a high affinity ligand for a receptor that has the ability to internalise, and has enriched expression on the target cell, was investigated as a way to increase the specific uptake of ASO into the desired target cell. Here, we successfully demonstrate this concept using a GLP1 receptor (GLP1R) peptide agonist to specifically deliver ASOs to pancreatic β-cells.
In this study, ASOs complementary to MALAT1 or FOXO1 were conjugated to a range of GLP1 peptide analogues, and the effects on the different processes leading up to productive uptake, i.e. gene knockdown, were studied. To allow efficient uptake, the peptide-ASO conjugates must retain high affinity for the receptor while still promoting internalisation of the cargo. Furthermore, the ASO must escape the endosomal compartments in order to be functional.
Different peptides, with and without various linkers and ASOs, were screened for receptor binding and G-protein-coupled activation using cAMP and dynamic mass redistribution assays in GLP1R-expressing HEK293 cells. The ability to further engage in β-arrestin-dependent signalling was addressed using β-arrestin recruitment and receptor internalization assays in recombinant cells. Receptor-dependent internalization of fluorescently labelled peptide and peptide-ASO conjugate was studied by confocal imaging in overexpressing cells. Enhanced productive uptake was demonstrated in HEK293 GLP1R cells by measuring gene knockdown of MALAT1 or FOXO1. Following treatment with the GLP1 conjugates, a 20- to 60-fold reduction in IC50 was achieved compared to the respective unconjugated parent ASOs. Consistent with these results, treatment of primary rodent islet cells with the ASO conjugates significantly increased gene knockdown compared to the parent ASOs.
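The potency gains reported above can be summarized as simple IC50 fold reductions. A minimal sketch with hypothetical numbers (the abstract reports a 20- to 60-fold shift but gives no absolute IC50 values, so the figures below are illustrative only):

```python
def fold_reduction(ic50_parent_nm, ic50_conjugate_nm):
    """Fold reduction in IC50 (potency gain) of a conjugate versus its parent ASO."""
    return ic50_parent_nm / ic50_conjugate_nm

# Hypothetical illustrative values (not reported in the abstract):
parent_ic50 = 500.0     # nM, unconjugated parent ASO
conjugate_ic50 = 12.5   # nM, GLP1 peptide-ASO conjugate
print(fold_reduction(parent_ic50, conjugate_ic50))  # 40-fold, within the reported 20-60x window
```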
In vivo, in mice, single subcutaneous injections of GLP1-ASOs dose-dependently increased ASO exposure and reduced target gene expression in islets, whereas unconjugated ASO had no effect. Furthermore, no effect on gene expression was observed in liver even at doses reaching >70% knockdown in islets, suggesting that GLP1 conjugation confers enriched uptake in islets compared to peripheral tissues.
To the best of our knowledge, this study is the first demonstration that GLP1R, a class B G-protein coupled receptor, can be used to enhance the productive uptake of ASO cargo and therefore be used for the delivery of ASOs to pancreatic β-cells.
Identification of new negative regulators of ciliogenesis in breast cancer cells through high-throughput siRNA screening
Marion Failler, NYU Perlmutter Cancer Institute
Breast cancer is a major cause of death in women worldwide. The basal subtypes, also recognized as triple negative breast cancers (TNBC), are the most aggressive type and account for the highest mortality rate in patients. Currently, there are no FDA-approved targeted therapies for TNBC, and innovative approaches are necessary to develop new therapeutic options. The primary cilium is a membrane-bound, cell-surface projection assembled from the centrosome and present as a single copy on the majority of cells in the human body, serving as a cellular 'antenna' in the recognition and transduction of extracellular stimuli, such as growth factors. This organelle forms during cellular quiescence and disassembles when cells enter the cell cycle and proliferate. Interestingly, primary cilia are frequently lost in malignant tumors, such as breast tumors. Thus, primary cilia may play a repressive role in regulating cell proliferation and could suppress breast cancer development.
In order to identify negative regulators of ciliogenesis that could represent targets for new drugs, we performed a high content screen using an arrayed library containing pooled siRNAs targeting 23,000 human genes in triplicate on Hs578T cells, a basal B breast cancer cell line that forms cilia at low frequency. Detecting cilia by automated immunofluorescence staining and imaging, we identified 350 candidate genes (~1-2%) whose knockdown increased the number of ciliated cells. Candidate genes were retested in secondary screens in additional cell lines to distinguish the genes involved in cilia formation common to all cell lines from those specific to particular (sub)types of (breast) cancer.
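Hit selection in an arrayed image-based screen of this kind is typically done by scoring each gene's effect relative to the screen-wide distribution. A minimal robust z-score sketch under assumed thresholds (the abstract does not describe the actual analysis pipeline; gene names and values below are hypothetical):

```python
import statistics

def robust_z_scores(pct_ciliated):
    """Robust z-score per gene: (value - median) / (1.4826 * MAD),
    where MAD is the median absolute deviation across the screen."""
    values = list(pct_ciliated.values())
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    scale = 1.4826 * mad  # makes the MAD comparable to a standard deviation
    return {gene: (v - med) / scale for gene, v in pct_ciliated.items()}

def call_hits(scores, threshold=3.0):
    """Genes whose knockdown increases the fraction of ciliated cells."""
    return [gene for gene, z in scores.items() if z >= threshold]

# Hypothetical per-gene % ciliated cells (median of triplicate wells):
screen = {"GENE_A": 5.0, "GENE_B": 5.2, "GENE_C": 4.8,
          "GENE_D": 5.1, "GENE_E": 4.9, "GENE_X": 30.0}
print(call_hits(robust_z_scores(screen)))  # ['GENE_X']
```

A robust (median/MAD-based) score is a common choice here because a genuine hit list (~1-2% of genes) should not distort the baseline the way a mean/stdev score would.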
There is overwhelming evidence that in vitro three-dimensional tumor cell cultures more accurately reflect the complex in vivo microenvironment than simple two-dimensional cell monolayers. In order to test the candidate genes from the 2D cell culture experiments in a tertiary screen and assess their effect on tumor growth, migration and invasion, we grew Hs578T cells in ultra-low attachment (ULA) 96-well round-bottomed plates, where tumor cell suspensions formed a three-dimensional structure within 24 h. Three-dimensional spheroid assays are considered valid models for recapitulating features of tumors and, combined with new technologies for automated imaging and analysis, will contribute to a better understanding of ciliogenesis and breast cancer and constitute an important step in anticancer drug research.
Identification of Novel GPCR Peptide Agonists using Autocrine-based High Throughput Screening of Large Combinatorial Peptide Libraries
Patricia McDonald, Scripps Florida
Compared to small molecule drug discovery efforts, peptides have always been noted for their selectivity, potency, and rapid optimization. However, short-lived in vivo activity and lack of oral bioavailability have hindered their advancement. Recent advances in peptide modifications that improve stability, together with new drug delivery systems, are expected to overcome these difficulties, such that peptide therapeutics are experiencing an exciting revival. The application of peptides as therapeutic agents in targeted drug delivery, and as diagnostic and prognostic tools in various disease states, is also on the rise. Systematic screening of peptide libraries is a well-established approach that typically uses affinity-based technologies such as phage display, a system for large-scale selection of proteins, peptides, and antibodies based on their binding affinity and specificity. A major advantage of phage display is the extensive diversity of variant proteins that can be represented in a phage library, providing a means of rapidly screening large numbers of proteins against potential binding partners. However, it does not provide any information regarding a binder's intrinsic activity.
Herein we describe the development of an autocrine cell-based functional assay for the selection of G-protein coupled receptor (GPCR) agonists from large intracellular combinatorial peptide libraries. In our system, one of ~10^8 different peptides and the GPCR of interest are co-localized in the plasma membrane of each cell. When the co-localized peptide activates the neighboring receptor, a fluorescent signal is generated such that each cell becomes a reporter unto itself. The system was validated by selection of highly potent agonists for the Glucagon-Like peptide-1 receptor (GLP-1R). These agonists were further validated using a battery of orthogonal cell-based functional assays. Interestingly, unlike the receptor's natural ligand GLP-1 or the GLP-1 analogue Exendin-4, which promote receptor coupling to both G-protein and arrestin signal transduction pathways, one peptide (P5) exhibits ligand bias: it promotes G-protein signaling but elicits a greatly attenuated arrestin response. We evaluated the physiological consequences of P5 G-protein signaling bias as compared to Exendin-4 in DIO mice: P5 displayed a decreased insulinotropic effect, yet significantly improved glucose tolerance. Furthermore, chronic treatment with P5 more effectively corrected hyperglycemia and improved hemoglobin A1c levels. High throughput autocrine-based functional screening enabled the discovery of a potent and selective GLP-1R G-protein biased agonist that may provide a novel therapeutic approach to T2DM. Thus, our system offers a powerful approach to identify rare peptide ligands with unique signaling properties to interrogate receptor function in normal and disease states. Such studies pave the way to identifying new ligands for numerous GPCRs and other cell surface receptors that may be useful for de-convolution of signal transduction pathways and discovery of new mechanisms of receptor activation.
Session Chair: Rob Howes, AstraZeneca
Generating highly potent and efficacious antibodies to the ligand-gated ion channel P2X4 for the treatment of neuropathic pain
Wendy Williams, MedImmune, UK
Ion channels are involved in numerous biological processes and have been implicated in many different disease states. Drug discovery approaches have focussed on small molecule compounds to modulate the activity of ion channels; however, owing to the high homology between family members and the difficulty of achieving selective compounds, the resulting off-target toxicities have limited their use. Antibodies are renowned for their high affinity and highly specific interactions with their targets and have been of increasing interest to the ion channel drug discovery field. However, ion channels are challenging targets for antibody therapeutics, mainly due to their complex membrane-integrated structures and multiple conformational states. This makes them hard to express and purify, which has limited the ability to utilise ion channels in antibody isolation campaigns. P2X4 is a ligand-gated ion channel proposed to be involved in the onset and maintenance of neuropathic pain. Here we describe the isolation and characterisation of a panel of P2X4-specific modulatory antibodies using phage display and hybridoma technologies, together with automated patch clamp electrophysiology and calcium influx assays. Furthermore, we demonstrate in vivo efficacy of our antibodies in a relevant model of neuropathic pain. This is the first example, to date, of antibodies that can modulate the P2X4 ion channel directly with high potency and specificity. This highlights the exciting potential therapeutic opportunity of an antibody for modulating this type of ion channel.
Next Generation Antibody Discovery - Efficient mining of the B cell repertoire
Daniel Lightwood, UCB Pharma
Single B cell isolation methodologies have emerged as an important technology in the quest to identify high quality functional monoclonal antibodies to a range of therapeutic targets, including those that are considered extremely challenging. Here we describe the use of a number of cutting-edge antibody discovery technologies to efficiently interrogate the B cell repertoire of immunised animals and humans to identify rare antibodies with desirable characteristics. We employ a high-throughput automated B cell culture screening platform to mine out the memory B cell repertoire and a novel fluorescence-based proximity secretion assay (“fluorescent foci” system) to sample the plasma cell repertoire directly in niches such as the bone marrow. More recently we have been developing a droplet microfluidic platform to enhance our ability to sample both the memory and plasma cell pool. We will describe these technologies and how we have applied them to several therapeutically relevant targets.
Accelerating Discovery: Development of a High-throughput Mammalian Expression Platform as Part of a Fully Integrated Biologics Workflow
Melissa Crisp, Eli Lilly and Company
The success of antibody discovery relies heavily on the ability to interrogate large, diverse antibody pools. However, the number of clones that can be screened for function is often limited by the capacity to express large numbers of proteins at concentrations adequate for downstream cell-based assays. Bottlenecks in expression typically diminish success rates and often require project teams to re-initiate antibody discovery campaigns, resulting in delays in timelines.
At the Lilly Biotechnology Center in San Diego, we have established a workflow utilizing both integrated automation systems, as well as stand-alone equipment, to overcome the limitations of throughput in the discovery of protein therapeutics. This “Islands of Automation” approach was strategically designed to allow for maximal flexibility in the incorporation of future antibody discovery technologies, as well as for cost efficiencies. Furthermore, this approach allows for intermittent quality control checks as well as parsing of plates either directly to project teams or to the next workstation.
At its centerpiece is a high-throughput mammalian expression platform capable of performing ~4000 small-scale transient transfections per day. This system, built in collaboration with Wako Automation, evolved from a semi-automated process, and this conversion required careful consideration of equipment selection, consumables cost, custom engineering and process redundancy to ensure minimal downtime. We have also streamlined this workflow to accommodate multiple plate types via a simple barcoding system and dynamic allocation of plates to cytomat shakers, effectively increasing efficiency, reducing FTE time and lowering reagent costs.
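Barcode-driven dynamic allocation of the kind described above can be sketched as a simple dispatcher that scans a plate barcode, looks up its plate type, and assigns the next free incubator-shaker slot. All names, barcode prefixes, and slot identifiers below are hypothetical, not part of the actual platform:

```python
from collections import deque

# Hypothetical plate-type prefixes embedded in the barcode
PLATE_TYPES = {"96DW": "96-deep-well", "24DW": "24-deep-well"}

class ShakerPool:
    """Dynamically allocate free shaker slots to incoming barcoded plates."""

    def __init__(self, slot_ids):
        self.free = deque(slot_ids)   # FIFO pool of available slots
        self.assigned = {}            # barcode -> (slot, plate type)

    def check_in(self, barcode):
        """Assign the next free slot; plate type is decoded from the barcode."""
        if not self.free:
            raise RuntimeError("no free slots: plate must wait")
        plate_type = PLATE_TYPES.get(barcode[:4], "unknown")
        slot = self.free.popleft()
        self.assigned[barcode] = (slot, plate_type)
        return slot

    def check_out(self, barcode):
        """Release a plate's slot back into the free pool."""
        slot, _ = self.assigned.pop(barcode)
        self.free.append(slot)
        return slot
```

In practice such routing logic would live inside the scheduling software (e.g. linked to the LIMS mentioned below), but the lookup-and-dispatch pattern is the core of it.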
The expression platform requires communication with both upstream and downstream automation systems for seamless data tracking, accomplished through linkages with Beckman Coulter’s DART (Data Acquisition and Reporting Tool) and our in-house LIMS. In addition, evaluation of the overall workflow across the integrated automation systems identified key bottlenecks that were addressed through purchases and enablement of key stand-alone instruments.
High-throughput imaging and selection of viable clones in line with single cell sorting improves viability of clones for downstream analysis
Steven Wiltgen, Molecular Devices
An FDA requirement for biologics production is to provide evidence that the host cell line being employed is derived from a single, parental cell (i.e. monoclonal). Conventional techniques for isolating single cells such as limiting dilution and flow cytometry-based methods are significantly limited by process inefficiencies, including low plating densities and low viabilities. These inefficiencies, in turn, reduce the capability to screen for high-producing clones, thereby increasing timelines and costs at early stages of antibody discovery and cell line development processes. Microfluidics-based methods hold promise for improving upon such inefficiencies due to their ability to sort single cells in a low stress environment. Here we present the optimization of a microfluidics-based method—the single cell printing method—for imaging and screening of clones prior to single cell sorting, which significantly improved cell viability compared to other cell sorting techniques.
Track Chairs: Benjamin Haley, Genentech and Prashant Mali, UC San Diego
Session Chair: Benjamin Haley, Genentech, Inc
Mining novel CRISPR systems for new genome engineering tools
Patrick Hsu, Salk Institute for Biological Studies
CRISPR systems exist broadly throughout prokaryotic life and constitute an incredible diversity of adaptive immunity mechanisms. Here we present a framework to computationally mine and experimentally characterize novel CRISPR systems for useful bioengineering tools.
Something old, something new: Improving genome editing efficiency over CRISPR with a new generation of TALE nucleases
Jason Potter, Thermo Fisher Scientific
Genome editing has become easier with the advent of CRISPR-Cas9. However, the CRISPR system has the drawback of requiring a sequence motif (PAM) in order to bind and cleave genomic DNA. Attempts to overcome this limitation have been made by developing a suite of orthogonal Cas9s through directed engineering or through isolation of Cas9 variants with novel PAMs. A more universal approach can be achieved by using Transcription Activator-Like Effector Nucleases (TALENs). In the shadow of the CRISPR revolution, TALENs have been engineered to remove their binding-site requirement for a 5' T, thereby removing any specific sequence requirement. In parallel, improved editing has been achieved through delivery of TALEN mRNA via electroporation, and we have developed a high throughput assembly method using pre-made RVD libraries that allows production of TALEN mRNA within a day. We demonstrate that when using TALEN mRNA we can achieve high cleavage efficiency in a variety of cells. Taking advantage of the ability to design TALENs to target any sequence, and the observation that the success of SNP editing is highly influenced by the proximity of the cut to the desired edit site, we also demonstrate that TALENs can achieve superior HDR editing efficiency compared to Cas9, since they can be positioned at the SNP site regardless of the sequence. This is especially relevant when editing genomic regions with a low abundance of PAMs.
Automating gene editing for deciphering cancer pathways using microfluidics
Hugo Sinha, Concordia University
In recent years, we have witnessed a breakthrough in genome engineering technology, attributed to the gene-editing technique CRISPR-Cas9 (often simply called CRISPR), which works like a pair of scissors to cut, insert or reorder specific genetic fragments, creating changes in the biological cell to understand gene function. CRISPR is full of promise and has already been used in a variety of applications, such as helping to create mosquitoes that do not transmit malaria (Hammond et al. 2016), eradicating pathogen genomes from infected species (Ebina et al. 2013, Hu et al. 2014), and more recently testing and battling cancer (Sanchez-Rivera and Jacks 2015, Shi et al. 2015, Platt et al. 2014). Despite the advent of this technology, however, new treatments for cancer remain scarce. Progress in this area has been hindered primarily by the lack of automation tools for manipulating, editing, and analyzing large genomes without bias, which has limited our understanding of the genes and biological processes involved in cancer. Here, I will describe how we have developed a new automated microfluidic tool that targets a specific set of genes in lung cancer cells (specifically H1299 cells) and determines which genes are modulators of cancer progression. This new gene-editing tool powered by droplet-based microfluidics is being used to generate multiple perturbations within cells, while the readouts depend on cell population measurements. Such a technology has emerged as a versatile liquid handling platform for automating biology (Shih et al. 2013, Ng et al. 2015) and screening-based applications (Dressler, Casadevall, and deMello 2017). In my presentation, I will describe our system to automate gene-editing processes specific to the CRISPR-Cas9 editing workflow, namely cell culturing, lipid-mediated transfection, and cellular analysis.
Next, I will show results from optimizing our gene-editing platform to assess the impact of variations in several parameters on the efficacy of cell transfection and gene targeting using Cas9. Finally, we will demonstrate the broad applicability of the device, showing results from a knockout loss-of-function screen tackling several oncogenes. Overall, this study aims to demonstrate that our genome-editing-on-a-chip approach will greatly speed up validation of loss-of-function screens, including genome-wide arrayed or pooled screens, at relatively low cost, with minute amounts of material and without the need for enrichment analysis based on next-generation sequencing profiles, as required by pooled screens. We believe that this new method will further enhance our understanding of mechanisms related to cancers, which we hope may lead to novel therapy options for those suffering from this disease.
Enhanced, streamlined approaches to facilitate CRISPR/Cas9-mediated gene activation.
Benjamin Haley, Genentech, Inc
CRISPR/Cas9 is conventionally used as a loss-of-function tool within the context of cell-based genetic screens. However, modification of Cas9 can convert it from a targeted DNA damaging enzyme into a transcriptional activator, thus enabling its use for gain-of-function studies. We have developed a suite of vectors that streamline the process of generating a stable, effective CRISPR/Cas9 activator cell line. Here, we will describe the comparative performance of different Cas9-activator combinations expressed from these vectors, and we will discuss best practices and potential pitfalls when engineering cells to be competent for CRISPR-mediated gene activation.
Session Chair: Kristen Brennand, ISMMS
Modeling the contribution of common variants to schizophrenia risk.
Kristen Brennand, ISMMS
Schizophrenia (SZ) is a debilitating psychiatric disorder for which the complex genetic mechanisms underlying the disease state remain unclear. Whereas highly penetrant variants have proven well-suited to human induced pluripotent stem cell (hiPSC)-based models, the power of hiPSC-based studies to resolve the much smaller effects of common variants within the size of cohorts that can be realistically assembled remains uncertain. We identified microRNA-9 as having significantly downregulated levels and activity in a subset of SZ hiPSC-derived neural progenitor cells (NPCs), a finding that was corroborated by a larger replication cohort and further validated by an independent gene-set enrichment analysis of the largest SZ genome-wide association study (GWAS) to date. Overall, this demonstrated a remarkable convergence of independent hiPSC- and genetics-based discovery approaches. In developing this larger case/control SZ cohort of hiPSC-derived NPCs and neurons, we identified a variety of sources of variation, but by reducing the stochastic effects of the differentiation process, we observed a significant concordance with two large post mortem datasets. We predict a growing convergence between hiPSC and post mortem studies as both approaches expand to larger cohort sizes. Meanwhile, we have been integrating CRISPR-mediated gene editing, activation and repression technologies with our hiPSC-based neural platform, in order to develop a scalable system for testing the effects of manipulating the growing number of SZ-associated variants and genes in NPCs, neurons and astrocytes. Altogether, our objective is to understand the cell-type specific contributions of SZ risk variants to disease predisposition.
From Patients to Neurons and Back Again
Lee Rubin, Dept of Stem Cell and Regenerative Biology, Harvard University; Harvard Stem Cell Institute
The foundational idea for the use of patient-specific induced pluripotent stem cells (iPSCs) in disease modeling was that this approach would result in a significant improvement in the efficiency of drug discovery and, eventually, in more success in the clinic. Whether this is true depends heavily on the predictive power of this human cell-centric approach, compared to that using cell lines of the types more widely employed in industry. This is a reasonable time to assess the amount of progress that has been made thus far.
Most of the work in my laboratory has focused on studying neurodegenerative and neuropsychiatric diseases using human neurons and other cells derived from iPSCs. Our work, as well as that from numerous other labs, has shown that some, but not all, aspects of these disorders can be reproduced in rather simple tissue culture environments. Moreover, recent studies have demonstrated that iPSC-based research can provide new information concerning previously unrecognized disease features, as well as generate data supporting the use of specific therapeutics on defined patient populations. On the other hand, it seems increasingly likely that more complex in vitro systems, particularly those in which neurons differentiate in 3-dimensional conditions, will be required to model other disease-associated features. These include changes that either require significant amounts of time to occur or are associated with modified neural circuit behavior. I will present several examples of work showing the advantages and limitations of iPSC-derived neuronal work.
Addressing the Scalability of Human iPSC-derived Neurons for HTS Implementation and Phenotypic Screening
BanuPriya Sridharan, The Scripps Research Institute Molecular Screening Center, Scripps Florida
Traditional high throughput screening (HTS) assays for neuronal targets employ non-primary, non-human neuronal cells owing to the scale necessary for HTS as well as the unreliable and economically demanding nature of primary neurons. The discovery of new drugs for neuropsychiatric disorders has further been hampered by lack of access to disease-relevant human primary neurons and appropriate disease models. Human induced pluripotent stem cell (hiPSC) technology can address some of these obstacles by allowing the generation of human neurons through (1) embryoid body (EB) formation, (2) cultivation on stromal feeder cells, and (3) lineage-specific differentiation factors. The first two techniques (1 & 2) are slow, variable, and not yet scalable for HTS applications. Straightforward methods to reproducibly differentiate hiPSCs into functional cortical induced neurons (iN) in less than two weeks by forced expression of a single transcription factor (3) have been demonstrated but never taken to HTS scale. We have successfully recapitulated this technique and leveraged CRISPR technology to define the path to a plate-compatible format amenable to large-scale HTS implementation. The resulting iN cells exhibit appropriate genetic and fluorescent markers that give confidence of bona fide neuronal differentiation. In the near term, we intend to test the preliminary iN cells for their ability to post-mitotically increase synaptogenesis following treatment with LOPAC test compounds via staining for synaptophysin. Ultimately, we will determine reliability and reproducibility over time with industrial-scale robotics.
Furthermore, we also intend to leverage CRISPR technology to create a library of disease-relevant phenotypes from hiPSC-derived cellular models, which will provide more opportunities for biologists to study epigenetic mechanisms and scale up screening initiatives with the Scripps Research Institute Molecular Screening Center (SRIMSC).
SLAS2018 Innovation Award Finalist: Optical tools for single-cell manipulations and sequencing
Santiago Costantino, University of Montreal
Classical examination of tissue and cellular samples heavily relies on microscopy platforms, where molecular probes and a myriad of contrast agents are routinely used to investigate the molecular biology of cells. Nevertheless, a versatile, efficient and non-invasive approach to tag individual cells chosen upon observation is still lacking.
Here we describe cell labelling via photobleaching (CLaP), a method that enables instant, specific tagging of individual cells based on a phenotypic classification. This technique uses laser irradiation to crosslink biotin on the plasma membrane of living cells, which is then detected with fluorescent streptavidin conjugates. Furthermore, the very same instrument used to image cells can tag them based on their morphological characteristics, dynamic behavior and localization within the sample at a given time, or any visible feature that distinguishes particular cells from the ensemble. The incorporated mark is stable, non-toxic, retained for several days, and transferred by cell division but not to adjacent cells in culture.
We combined CLaP with microfluidics-based single-cell capture followed by PCR assays and transcriptome-wide next-generation sequencing. We computed a number of quality control metrics to verify that CLaP does not interfere with protocols of sample preparation for transcriptomic experiments. To the best of our knowledge, CLaP is the first simple technology that allows correlating spatial and molecular information visible under a microscope when cells are individually sequenced.
Session Chair: Prashant Mali, Department of Bioengineering, University of California San Diego
Elimination of the "essential" Warburg effect in mammalian cells through a multiplex genome engineering strategy
Nathan Lewis, University of California, San Diego
Innovative genome editing technologies are accumulating and are rapidly elucidating the genetic basis of many cell functions. However, many disease phenotypes and physiological processes are governed by complex networks involving many gene products. Thus, numerous phenotypes need to be unraveled using multiplex genome editing techniques and in silico analysis. Furthermore, these approaches must leverage legacy knowledge to identify the genes involved and understand the cell response. As an example, we unraveled the core cellular network controlling the Warburg effect, also known as aerobic glycolysis. This effect involves the tendency of rapidly proliferating cells to consume excess glucose and secrete copious amounts of lactic acid, despite having adequate oxygen for more efficient oxidative phosphorylation. This response is a hallmark of cancer, immune cell expansion, and other processes with rapidly proliferating cells. However, efforts to test the purpose of Warburg metabolism have been stymied by the difficulty of generating cell lines that are unable to produce lactic acid, since the genes involved have proven to be essential or synthetic lethal in proliferating cells. We discovered a genetic circuit involving multiple genes that control lactic acid secretion and, using multiplex genome editing with CRISPR, we successfully eliminated lactic acid secretion and enabled the deletion of multiple "essential" genes. Surprisingly, the cells show improved metabolic and growth phenotypes, despite the elimination of this fundamental metabolic activity. To understand how immortalized mammalian cells can cope without this seemingly essential metabolic process, we conducted a comprehensive analysis of these cell lines using time-course RNA-Seq, metabolomics, and a genome-scale metabolic network model we have developed for Chinese hamster ovary cells (1).
Thus, through a multiplex metabolic engineering effort and comprehensive systems biology analysis, we have been able to engineer out a hallmark phenotype of proliferating cells and begin to understand how a cell can survive without a seemingly essential process.
1. Hefzi, H. et al. A Consensus Genome-scale Reconstruction of Chinese Hamster Ovary Cell Metabolism. Cell Syst. 3, 434–443.e8 (2016).
A high-throughput arrayed CRISPR/Cas9 functional genomics approach to study NF-kB signaling
Patrick O'Shea, AstraZeneca
CRISPR/Cas9 is increasingly being used as a tool to prosecute functional genomic screens. Array-based CRISPR/Cas9 screens offer the ability to interrogate more diverse phenotypes than pool-based screens, but their execution at scale in difficult-to-transfect cells brings challenges. We undertook a kinome-scale (~800 gene) array-based CRISPR/Cas9 screen to investigate mediators of TNFα-mediated NF-κB activation. We used an ME180 cell line stably expressing Cas9 and a beta-lactamase reporter of NF-κB activation, alongside a purpose-developed lentiviral sgRNA library. Hits were validated by confirmation of DNA insertions/deletions and by screening orthogonal reagents. Screening data quality was within acceptable limits (Z' > 0.6) and genes associated with canonical NF-κB signalling were identified. Our data provide unique insights into approaches and tools to explore novel biology with array-based gene editing in cellular assays. Our results demonstrate the potential for genome-scale screens at high specificity using CRISPR/Cas9 across a wide variety of cell backgrounds and phenotypes.
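The Z' value quoted above is the standard screening-window coefficient computed from positive and negative control wells. A minimal sketch with hypothetical reporter values (not data from this screen):

```python
import statistics

def z_prime(pos_controls, neg_controls):
    """Z' screening-window coefficient (Zhang et al., 1999):
    1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are generally considered an excellent assay window."""
    mu_p = statistics.mean(pos_controls)
    mu_n = statistics.mean(neg_controls)
    sd_p = statistics.stdev(pos_controls)
    sd_n = statistics.stdev(neg_controls)
    return 1.0 - 3.0 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Hypothetical control wells (arbitrary beta-lactamase reporter units):
tnf_stimulated = [100.0, 102.0, 98.0, 100.0]  # positive controls
unstimulated = [10.0, 11.0, 9.0, 10.0]        # negative controls
print(round(z_prime(tnf_stimulated, unstimulated), 3))
```

Because Z' penalizes both control variability and a narrow dynamic range, holding every assay plate above 0.6 is a meaningfully strict quality bar for an arrayed screen.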
Combining CRISPR/Cas9 screening with custom engineered reporter cell lines to identify genes required for tubulin formation
Mark Gerber, MilliporeSigma
Phenotypic high-throughput / high content screens have become popular tools for elucidating molecular and genetic pathways in biological systems. Phenomics, or high-dimensional biology, incorporates screening methods that enable many parameters to be tested in concert under similar or identical conditions, providing a potential wealth of information about a specific biological process. Here we describe the use of a gene-edited reporter cell line, U2OS LMNB1-TUBA1B-ACTB (Sigma-Aldrich CLL1218), to phenotypically detect genes responsible for tubulin formation. CLL1218 was transduced with CAS9 Blasticidin Lentiviral Particles (Sigma-Aldrich LVCAS9BST) and selected. Following selection, the pool was cloned, and derived clones were then screened for CRISPR/Cas9 activity using a known active gRNA. Preferred clones were expanded and banked for use in semi-automated high-throughput CRISPR library screens to identify modulators of tubulin expression, formation, and distribution. Proof-of-concept was demonstrated using a set of CRISPR guides specific for vimentin. Creation of a vimentin knockout in the CLL1218-Cas9 reporter line alters cell morphology in a way that can be visually detected on a variety of imaging platforms, including high-content instruments. This case study describes an effective methodology combining multi-pronged gene editing with phenotypic screening to enrich our knowledge of gene and molecular interactions in complex biological systems. Further, with an expanded array of reporter cell lines at the researcher's disposal, this type of strategy can be adapted to dissect many other relevant pathways and phenotypes.
Therapeutic Strategies Via CRISPR-Cas: New Approaches and New Challenges
Prashant Mali, Department of Bioengineering, University of California San Diego
The recent advent of RNA-guided effectors derived from clustered regularly interspaced short palindromic repeats (CRISPR)–CRISPR-associated (Cas) systems has dramatically transformed our ability to engineer the genomes of diverse organisms. As unique factors capable of co-localizing RNA, DNA, and protein, tools and techniques based on these systems are paving the way for unprecedented control over cellular organization, regulation, and behavior. Here I will describe some of our ongoing efforts towards engineering this system to enable therapeutic applications.
Track Chairs: Jonathan O'Connell, Forma Therapeutics and Gwenn Hansen, Nurix
Session Chair: Gwenn Hansen, Nurix
High Throughput Binder Confirmation (HTBC): A new non-combinatorial synthesis platform created to enhance and accelerate hit ID.
Joseph Franklin, Xiaopeng Bai, Lijun Fan, Kenneth Lind, Heather O’Keefe, Eric Shi, Jennifer Summerfield, Jerry Yap & Jeffrey Messer, NCE Molecular Discovery - GSK
Encoded Library Technology (ELT) is a hit identification platform that uses ultra-large collections of chemically diverse DNA-encoded small molecule libraries selected for affinity against a therapeutically relevant target. Recent advances in ELT libraries, library pooling strategies, selections and DNA sequencing have vastly increased the number of actionable chemotypes produced for a given selection campaign. In practice, only a small fraction of these chemotypes are synthesized as discrete molecules without the encoding DNA using traditional organic synthesis. To address this bottleneck, we developed an automated microscale parallel synthesis platform that uses double-stranded DNA with a cleavable linker as a chemical handle. This High Throughput Binder Confirmation (HTBC) platform uses the original DNA-Encoded Library (DEL) chemistry and recapitulates the products, side-products and intermediates produced in the original library synthesis. The resulting compounds are cleaved from the DNA support and screened as small molecule mixtures by affinity selection mass spectrometry. The platform is capable of assessing target engagement for hundreds of compounds per month and is used at GSK to prioritize synthesis decisions for traditional-scale organic synthesis.
DNA-encoded library screening on a GPCR: identification of agonists and antagonists of protease-activated receptor 2 (PAR2) with novel and diverse mechanisms of action.
Niek Dekker1, AstraZeneca, Innovative Medicines Biotech Unit, Gothenburg, Mölndal SE-431 83, Sweden
Protease-activated receptor-2 (PAR2) is irreversibly activated by proteolytic cleavage of the N-terminus, which unmasks a tethered peptide ligand that binds and activates the transmembrane receptor domain, eliciting a cellular cascade in response to inflammatory signals and other stimuli. PAR2 is implicated in a wide range of inflammatory and other diseases, including cancer. Activation of PAR2 on sensory neurons leads to hyperphosphorylation of TRP channels, resulting in pain and hyperalgesia. The discovery of small molecule antagonists to PAR2 has proven challenging. DNA-encoded library (DEL) screening on purified PAR2 delivered both antagonists and agonists, exemplified by antagonists AZ3451 (SLIGRL PAR2 IP-one IC50 = 23 nM) and AZ8838 (SLIGRL PAR2 IP-one IC50 = 1500 nM) and agonist AZ2429 (EC50 of 53 nM in IP-one). Crystal structures of antagonists bound to the GPCR revealed that AZ8838 binds in a fully occluded pocket near the extracellular surface. Functional and binding studies reveal that AZ8838 exhibits slow binding kinetics, an attractive feature for a PAR2 antagonist competing against a tethered ligand. Antagonist AZ3451 binds to a remote allosteric site outside the helical bundle. We propose that antagonist binding prevents structural rearrangements required for receptor activation and signalling. AZ3451 and AZ8838 were tested in a rat model of PAR2-induced oedema using 2fLIGRL-NH2 (350 µg/paw in 100 µL) and trypsin (20 µg/paw in 100 µL). At a 10 mg/kg dose, both compounds exhibited reduction of paw swelling in both in vivo models. These results confirm that at least two allosteric sites exist on the PAR2 receptor and can be blocked, resulting in reversal of in vitro and in vivo PAR2-mediated signaling. DEL screening on purified PAR2 combined with crystallography provided a basis for the development of selective PAR2 antagonists for a range of therapeutic indications.
Co-authors: Dean G. Brown1, Giles A. Brown2, Robert K.Y. Cheng2, Matt Clark3, Miles S. Congreve2, Robert Cooke2, John Cuozzo3, Andrew S. Doré2, Christoph Dumelin3, Karl Edman1, Rink-Jan Lohman4, Yuhong Jiang4, David P. Fairlie4, Cedric Fiez-Vandal2, Stefan Geschwinder1, Christoph Grebner1, Marie-Aude Guie3, Nils-Olov Hermansson1, Ali Jazayeri2, Patrik Johansson1, Anthony Keefe3, Rudi Prihandoko2, Mathieu Rappas2, Oliver Schlenker2, Eric Sigel3, Arjan Snijder1, Holly Souter3, Linda Sundström1, Benjamin Tehan2, Barry Teobald2, Peter Thornton1, Dawn Troast3, Giselle Wiggin2, Ying Zhang3, Andrei Zhukov2 and Fiona H. Marshall2.
1AstraZeneca, Innovative Medicines Biotech Unit, Gothenburg, Mölndal SE-431 83, Sweden
2Heptares Therapeutics Ltd, Biopark, Broadwater Road, Welwyn Garden City, Hertfordshire, AL7 3AX, UK.
3X-Chem Inc., 100 Beaver St. Waltham MA 02453
4Institute for Molecular Bioscience, The University of Queensland, Brisbane, Qld 4072, Australia
Protein quality and assay development for successful DNA-encoded library screening
Allison Olszewski, X-Chem
Identification of active hits from DNA-encoded library screens is driven by high quality protein targets. Because the effective concentration of individual DNA-encoded library molecules in the screen is very low, the immobilized protein target concentration must exceed the dissociation constant to drive protein-library molecule binding. The protein target should also adopt a consistent conformation and be free of aggregates so that the resulting data are associated with a single protein form. Ensuring that the immobilized protein target maintains biochemical or biophysical activity over the course of selection biases the outcome toward functionally active hits. In this presentation, case studies of protein target assessment enabling DNA-encoded library screening success will be shown.
DNA Encoded Library Selection Method to Rank Order Primary Hits by Affinity
Jeff Messer, GlaxoSmithKline
DNA-encoded libraries are now routinely employed as part of reductionist lead generation campaigns in pharma. The large number of compounds contained in many of these libraries (>1 billion), combined with modest hit rates (0.1%), often results in thousands of potential hits. The compounds are generated as large combinatorial mixtures and “selected” for affinity to the target of interest. As a result, the first step in hit triage is to resynthesize the compounds of interest without the DNA tag and confirm that the observed affinity for the target translates into the desired functional activity. Here we present experimental protocols and informatics methods that can estimate the affinity of the hits in the DNA-encoded library mixture, thus enabling the incorporation of a ligand efficiency estimate into the decision-making process for compound resynthesis.
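One common way to fold an affinity estimate into resynthesis triage, as the abstract suggests, is a ligand efficiency calculation that normalizes potency by heavy-atom count; a generic sketch (the hit names, Kd estimates, and atom counts below are hypothetical, and the 1.37 kcal/mol conversion per pKd unit assumes 298 K):

```python
import math

def ligand_efficiency(kd_molar, heavy_atoms):
    """LE ≈ -RT*ln(Kd) / N_heavy ≈ 1.37 * pKd / N_heavy (kcal/mol per heavy atom)."""
    p_kd = -math.log10(kd_molar)
    return 1.37 * p_kd / heavy_atoms

# Hypothetical DEL hits: (estimated Kd in M, heavy-atom count)
hits = {"hit_A": (5e-8, 32), "hit_B": (2e-6, 18)}

# Rank hits by ligand efficiency, highest first
ranked = sorted(hits, key=lambda h: ligand_efficiency(*hits[h]), reverse=True)
print(ranked)  # → ['hit_B', 'hit_A'] — the weaker but smaller hit is more ligand-efficient
```

Ranking on efficiency rather than raw potency can promote smaller hits with more optimization headroom, which is the kind of prioritization decision described above.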
Session Chair: Stephen Johnson, Bristol Myers Squibb
Compound Collection Evolution: Augmentation and design informatics
Stephen Johnson, Bristol Myers Squibb
The composition of the screening collection plays an important role in the identification of initial starting compounds for medicinal chemistry campaigns. We will describe our approach to the selection, analysis, and augmentation of general and targeted screening collections for the purpose of hit identification. This includes a balanced view of representation, diversity, and physico-chemical properties, with a strong focus on chemical filters, to provide tractable initial hits for early lead identification. The development of specialty collections, such as a fragment screening deck, using this balanced approach will also be discussed. Finally, cheminformatics tools to simplify the analysis of clusters of compounds for the follow-up of initial hits will be presented.
SLAS2018 Innovation Award Finalist: Pharos – A Torch to Use in Your Journey In the Dark Genome
Rajarshi Guha, NIH
It is well known that a relatively small set of protein targets receives the bulk of research attention and thus funding. However, there are potential (druggable) opportunities among the remaining under-studied and un-studied proteins. To address this, the NIH initiated the "Illuminating the Druggable Genome" program to characterize the dark regions of the druggable genome. As part of this program, a Knowledge Management Center (KMC) was created to aggregate and integrate heterogeneous data sources and data types, creating a centralized location for information about all protein targets identified as part of the druggable genome. Since then, the KMC has expanded to consider the entire human proteome. In this presentation, we describe Pharos, the user interface to the KMC knowledgebase. We provide an overview of the data sources and types made available via Pharos, then describe the architecture of the system and its integration with the KMC and external resources. In particular, we highlight the rich search facilities that enable a user to drill down to relevant subsets of data but also support the notion of "serendipitous search". Given the heterogeneous set of data types available for individual targets, it is useful to quantify how much and what types of data are available for a target. We describe the development of knowledge profiles and a Knowledge Availability Score (KAS), both derived from the Harmonizome, a resource that has characterized data availability across different data sources and types in a uniform manner. We then highlight how the KAS is concordant with knowledge trends characterized by traditional metrics such as publications and grants. We discuss the use of the KAS in the Pharos interface and an example of prioritizing understudied targets by computing the similarity of their knowledge-availability profiles with those of well-studied targets.
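The profile-similarity prioritization described above can be illustrated with a simple cosine similarity over per-data-type availability vectors; a sketch with invented scores (the real KAS is derived from the Harmonizome and is more involved than this):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two knowledge-availability vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical per-data-type availability scores (e.g. expression, PPI,
# phenotype, ligand data) for one well-studied target and two dark targets
well_studied = [0.9, 0.8, 0.7, 0.9]
dark_targets = {"T1": [0.8, 0.7, 0.6, 0.8], "T2": [0.1, 0.9, 0.0, 0.1]}

# Prioritize the dark target whose knowledge profile best matches the
# well-studied target's profile
best = max(dark_targets, key=lambda t: cosine_similarity(well_studied, dark_targets[t]))
print(best)  # → T1
```

Here T1's profile is a scaled-down version of the well-studied target's, so it scores highest despite its lower absolute knowledge availability.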
Evolution of Small Molecule Screening Libraries
Alex Aronov, Vertex Pharmaceuticals Inc.
The past decade has seen an evolution in our approaches to both the content of the small molecule screening collections and the ways to optimally use the established libraries. The talk will address a number of facets of screening deck assembly and utilization: size and organization of the HTS screening deck; dynamic hit forecasting; identification and removal of frequent hitters; expanding HTS deck with new chemical matter; successful application of fragment libraries; and assembly and stewardship of chemogenomic libraries.
Large scale profiling in human primary-cell based phenotypic assays identifies novel outcome pathways for drug efficacy in cardiovascular disease
Ellen Berg, DiscoverX Corporation
We have previously identified an in vitro signature, characterized by increased cell surface levels of serum amyloid A (SAA) in a human primary cell-based coronary artery smooth muscle cell model of vascular inflammation (BioMAP CASM3C system), shared by certain compound classes associated with cardiovascular toxicity. Data mining a large reference database containing more than 4,500 test agents (drugs, experimental chemicals, etc.) profiled in this assay identified certain mechanisms as associated with this signature: MEK inhibitors, HDAC inhibitors, GR/MR agonists, IL-6 pathway agonists, and modulators of SIRT1. Since SAA is a clinical biomarker associated with risk of cardiovascular disease in humans, these results suggested that these mechanisms might contribute to cardiotoxicity by directly promoting vascular dysfunction through SAA within vascular tissues. To extend these studies, we mined the reference database to identify agents that decrease levels of SAA in the BioMAP CASM3C system without causing overt cytotoxicity. Notable agents found to decrease the cell surface level of SAA relative to vehicle control include GLP-1, an endogenous peptide developed as a drug for the treatment of diabetes; roflumilast, a PDE IV inhibitor used for the treatment of chronic obstructive pulmonary disorder; the BCR-Abl inhibitor and oncology drug imatinib; and a mimetic of ApoA-1, the major lipoprotein of HDL. These agents have been shown to have cardiovascular protective effects in clinical or in vivo studies (some within their class). The results here suggest a potential mechanism for this cardiovascular benefit through regulation of SAA, possibly by interfering with the involvement of SAA in the recruitment and activation of monocytes in the vascular wall. These findings support the value of a large chemical biology database of reference drugs profiled through primary human cell-based phenotypic assays.
This database has been mined to reveal several novel associations with adverse events and identified potential mechanisms of toxicity, and here we show how this database can be used to generate new hypotheses for drug efficacy. Collectively these data support a disease and adverse outcome pathway for cardiovascular disease involving the regulation of SAA.
Session Chair: Pete Rahl, Fulcrum Therapeutics
MAP4K4 mediates human cardiac muscle cell death: Human pluripotent stem cell-derived cardiomyocytes for target validation and drug development
Kathryn Chapman, Domainex Ltd
Cardiac muscle cell death due to acute ischemic damage (myocardial infarction, “heart attack”) remains the single most common cause of death and disability worldwide, with 7 million new cases per year. Heart disease is projected to increase as the population ages, and its socio-economic burden to rise for the foreseeable future. Current therapies restore blood flow (reperfusion) or decrease the heart’s workload, improving myocyte survival only through extrinsic effects. Although directly suppressing cardiomyocyte death is logical, no existing clinical counter-measures target the relevant intracellular pathways. Progress has been hampered by a lack of validation in pre-clinical human models.
Professor Michael Schneider has demonstrated that Mitogen-activated Protein Kinase Kinase Kinase Kinase-4 (MAP4K4) is activated in failing human hearts and relevant rodent models. Using human induced pluripotent stem cell-derived cardiomyocytes (hiPSC-CM), he has demonstrated that death induced by oxidative stress requires MAP4K4. Specifically, MAP4K4 shRNA confers protection to hiPSC-CMs.
In collaboration with the drug discovery services company Domainex, a novel, highly specific MAP4K4 small-molecule inhibitor - DMX-5804 - has been developed, and its ability to enhance cell survival, mitochondrial function, and calcium cycling in hiPSC-CMs has been demonstrated. Furthermore, DMX-5804 provides proof of principle that drug discovery guided by hiPSC-CM assays can predict efficacy in vivo, since this compound reduces the volume damaged by ischemia-reperfusion injury in mice by nearly 70%.
These experiments implicate MAP4K4 as a well-posed target for suppression of human cardiac cell death, and highlight the utility of hiPSC-CMs for drug development in common acquired cardiac disorders.
Chemical and genomic identification of globin regulators that induce fetal hemoglobin reactivation
Pete Rahl, Fulcrum Therapeutics
Red blood cell disorders like Sickle Cell Disease (SCD) and beta-thalassemias are caused by alterations within the gene for the hemoglobin beta (HBbeta) subunit. A fetal ortholog of HBbeta, hemoglobin gamma (HBgamma), can reverse disease-related pathophysiology in these disorders by also forming complexes with the required hemoglobin alpha subunit. Because beta-like globin expression is developmentally regulated, with a reduction in the fetal ortholog (gamma) occurring shortly after birth concomitantly with an increase in the adult ortholog (beta), it has been postulated that maintaining expression of the anti-sickling gamma ortholog may be of therapeutic benefit in children and adults. Previously, inhibitors of the chromatin-modifying enzyme G9a/GLP (G9a-i) have been shown to upregulate HBgamma expression relative to HBbeta expression, and G9a/GLP has therefore been proposed as a reasonable molecular target for maintenance of the anti-sickling HBgamma ortholog. However, we have uncovered limitations to G9a-i as a therapeutic strategy to reactivate HBgamma and therefore set out to identify novel modulators and targets of HBgamma expression using both chemical probe and CRISPR-based genetic screening strategies. We identified multiple druggable components of lipid metabolism, nuclear receptor pathways and transcription/chromatin regulatory pathways that modulate HBgamma mRNA using our automated, cell-based chemical genetic screening platform. Through characterization of these regulators, we have demonstrated that CRISPR targeting of different protein domains of components of the globin regulatory network can have profoundly different effects on globin gene expression patterns. More specifically, modulation of key domains of chromatin writers, readers and erasers results in markedly different globin expression profiles that inform small molecule discovery against these novel targets.
Additionally, we are utilizing a newly developed in vitro SCD cellular model to investigate how these globin gene regulators impact SCD pathophysiology.
Phenotypic Drug Discovery using patient-derived iPSCs in Pompe disease, a rare disorder
Yohei Nishi, Center for iPS Cell Research and Application (CiRA), Kyoto University
Recent advances in the development of induced pluripotent stem cell (iPSC) technology, gene editing tools, imaging assay technologies, etc. have led to increased interest in phenotypic drug discovery (PDD) approaches. In particular, human iPSCs can generate a human ’disease in a dish’. iPSCs also have many advantages, such as human origin, easy accessibility, scalability, the ability to produce the desired cell types, and the possibility of developing personalized medicines from patient iPSCs. Nevertheless, few pharmaceutical companies have adopted drug discovery using iPSC technology, because the technology still has many problems, including cost, reproducibility, batch-to-batch variation and differentiation. The Center for iPS Cell Research and Application (CiRA), Kyoto University is currently working on solving these problems. In this presentation, we highlight our pilot study of PDD using iPSCs: the Pompe disease project. Pompe disease is a rare inherited metabolic disease of lysosomal glycogen accumulation due to dysfunction of acid alpha-glucosidase (GAA). The current therapy is enzyme replacement with the missing lysosomal enzyme, but it does not fully restore skeletal muscle function; this residual deficit has recently been attributed to dysfunction of autophagy and lysosomes. We therefore developed an HTS-compatible screening system focused on autophagy. To prepare patient-relevant myocytes, we established a robust and scalable differentiation system by forced expression of MyoD (a master regulator of muscle differentiation). Using this system, we screened around 5,000 compounds (FDA-approved drugs and bioactive compounds) and found several candidates. We then evaluated them in other functional assays and predicted the mechanism of action (MoA) of candidate compounds by identifying drug-specific gene expression signatures with RNA-seq. The compounds we discovered could be applied to drug repurposing and lead to clinical translation.
In conclusion, we believe that PDD using iPSC technology is a powerful approach to identify novel MoAs and therapies, especially for drug discovery of rare diseases.
Identification of Subtype-selective Vulnerabilities and Biomarkers in Non-small Cell Lung Cancer
Bruce Posner, UT Southwestern Medical Ctr
Lung cancer is the leading cause of cancer-related deaths, with more than 1.3 million people dying of this disease each year. Drs. John Minna and Adi Gazdar have collected over 100 non-small cell lung cancer (NSCLC) tumor samples and created corresponding cell lines that they have characterized by molecular and biological methods. From these studies and others, we postulated that lung cancer can be characterized as a collection of diseases, each distinguished by unique biomarkers and chemical vulnerabilities. Using a chemistry-first approach, we carried out high-throughput phenotypic screens of 12 representative lung cancer cell lines from our NSCLC panel against the UT Southwestern chemical library (~230,000 compounds). In follow-up studies, we identified ~180 small molecule toxins representing over 30 chemical series that are toxic to subsets of the extended lung cancer cell panel (~105 cell lines) but not to immortalized human bronchial epithelial cells (HBECs). We have developed and used bioinformatics tools (machine learning and a novel variation of the Kolmogorov-Smirnov statistic) in conjunction with orthogonal molecular data sets (e.g. somatic mutations, gene expression, protein expression, metabolic flux, etc.) to identify biomarker hypotheses that have provided insights into mechanism of action and possible targets of these cancer subtype-selective compounds. Subsequent biochemical, cell-based, and in vivo studies on selected compounds have identified several novel target/biomarker relationships that may have therapeutic impact.
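The two-sample Kolmogorov-Smirnov statistic mentioned above measures the maximum gap between two empirical distribution functions; a self-contained sketch with invented IC50 values (the authors describe a novel variation of the statistic, whereas this is only the classical form):

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the two empirical CDFs."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sample, x):
        # Fraction of observations <= x
        return sum(v <= x for v in sample) / len(sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

# Hypothetical IC50s (uM) for one compound in biomarker-positive
# vs biomarker-negative cell lines
positive = [0.1, 0.3, 0.2, 0.5]    # sensitive lines
negative = [8.0, 12.0, 9.5, 15.0]  # resistant lines
print(ks_statistic(positive, negative))  # → 1.0 (fully separated samples)
```

A statistic near 1 flags a candidate biomarker whose presence cleanly stratifies sensitive from resistant lines, which is the kind of hypothesis generation described in the abstract.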
Track Chairs: Margaret DiFilippo, Dotmatics and Amy Kallmerten, Merck
Session Chair: Nicole Glazer, Merck
Advanced Analytics and Visualisation for Biology Big Data
Robert Brown, Dotmatics Inc
Biology will soon overtake astrophysics as the major producer of data within the scientific domain. Not only are these data large, they are also complex, with many different forms that need to be represented together intuitively so that a user can discover knowledge. Biological data therefore represent a visualization challenge because of their complexity, size and diversity; additionally, the analytic algorithms need to be designed with "thinking outside the box" while remaining grounded in concepts the user can understand.
This talk will present Vortex [Dotmatics Ltd], a program that incorporates multiple biological visualizations, handles multiple data types together, and allows interactive manipulation of genome-scale data on a standard laptop. The program additionally integrates biology with a multitude of data types, including advanced chemical analytics. The talk will also show how multiple data-reduction analytical techniques have been integrated to enable users both to visualize information and to discover knowledge within biology. This covers the concepts of hypothesis testing and discovery mining, and how these are integrated within graphical representations that are simple to understand yet easy to manipulate and work with.
Transforming Science: How the “Smart Lab” is the Key to Unlocking Results
Sridhar Iyengar, Elemental Machines
The “Internet of Things” (or IoT) is transforming virtually every workplace, and science-based industries are no exception. In fact, technology is poised to radically transform science-based discovery – as soon as it is harnessed correctly. While highly complex, these industries are among the last to truly benefit from using IoT to enhance the automation efforts that are key to high-throughput technologies and related fields.
With an influx of next-generation Science 2.0 companies, this is all changing. These companies, many based on IoT technologies specifically applied to the world of science, are bringing the rigor and automation that prevail in manufacturing to earlier stages of scientific product research and discovery, from pharma to synthetic biology and much more.
By automatically gathering new data streams for scientists – filling data “gaps” in their work – and automating the execution of protocols to eliminate human variability, the implications include faster time to results, faster time to market, and better understanding of scientific processes.
Simply put, harnessing the technologies that have provided the building blocks for the IoT and applying them to scientific processes has widespread benefits. It helps teams identify unknown factors that may be contributing to outcomes – essentially the proverbial “needle in the haystack" – and accelerates discovery.
Automation will also transform roles within science-based departments – and unlike some industries, will not eliminate jobs at the scale some predict. Instead, technology will streamline processes so employees can be more efficient and strategic. Teams no longer need to use staff to manually record, enter and model data. Nor do teams need to be in the lab 24x7 to monitor their experiments. Instead staff can use their time for more value-added work to further optimize team performance.
This talk will explore how IoT enhances automation efforts and fundamentally reshapes science-based work environments and the implications for people in all roles – from research to operations management.
Developing and Implementing a Scientific Data Strategy for Pharma
Nicole Glazer, Merck
The discovery research paradigm requires integration of a broad range of human biology data and knowledge in order to generate and explore diverse hypotheses. Scientists often spend a significant amount of their time and resources in analytics and informatics projects trying to find, access, understand, curate and integrate data. While scientific information is generally managed effectively for its primary use, it often lacks the accessibility and context that facilitates secondary use and cross-functional integration on-demand. As a result, much of the research informatics efforts across the pharmaceutical industry are focused on creating single point solutions to these challenges within a particular problem space or functional area. As the use of predictive modeling, analytics and machine learning increases to address the challenges of declining R&D productivity and increasing pressures for demonstrating product value, a cohesive scientific data strategy and scalable approaches are required to handle the ever increasing variety of data types, data sources, data models and analytics patterns. It also calls for a reevaluation of data access rules, accountability, and data stewardship culture to realize business strategic goals while managing risk.
Automating Screening Cascades: Linking Compound Data to Compound Logistics
Travis Mathewson, Pfizer Inc
Plate-based assays are often part of a series of assays designed to narrow down a list of compounds and/or gain knowledge regarding selectivity, on/off-target effects, ADME mechanisms, or pharmacological profile. This presentation will demonstrate the concept of Automated Screening Cascades (ASCs), which integrate the decision-making algorithm with compound logistics. ASCs automatically identify and submit compounds to the next assay in the cascade, ensuring that no compounds or data are missed. Data processing tools can be built to let users ‘subscribe’ Discovery Projects to a collection of logically linked assays, such that any compound made for that Project is automatically selected, submitted, and processed for screening with minimal end-user input. By automating the compound selection process, compounds move through the defined screening funnel efficiently and effectively, minimizing cycle times and speeding up decision making.
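The cascade logic described above can be sketched as a small decision routine; everything here is hypothetical (assay names, pass criteria, and the dict-based result store), since a production ASC would query the corporate assay database rather than an in-memory dict:

```python
# Hypothetical cascade: ordered (assay, pass-criterion) pairs
CASCADE = [("primary_inhibition_pct", lambda v: v >= 50),
           ("selectivity_ratio", lambda v: v >= 10),
           ("microsomal_t_half_min", lambda v: v >= 30)]

def next_assay(results):
    """Return the next assay a compound should be submitted to,
    or None if it has failed out of (or completed) the cascade."""
    for assay, passes in CASCADE:
        if assay not in results:
            return assay          # not yet tested: submit here
        if not passes(results[assay]):
            return None           # failed a gate: stop the cascade
    return None                   # passed every assay

print(next_assay({"primary_inhibition_pct": 72}))  # → selectivity_ratio
```

A scheduler running this over every new compound result would automatically generate the next round of submission requests, which is the "no compounds or data are missed" guarantee the abstract describes.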
Session Chair: Yohann Potier, Novartis Institute for Biomedical Research
Answers to questions not yet asked: Informatics strategy applied to scientific questions
Yohann Potier, Novartis Institute for Biomedical Research
The amount of scientific data generated in current life science research is too large to be analyzed without informatics tools. While automation and new statistical algorithms provide useful tools for the analysis of large amounts of data, one also needs to ask the appropriate scientific questions to pursue discovery.
This presentation will describe how we used different technologies to address specific needs across various scientific domains in a sustainable way. Topics such as software architecture, data integration, and visualization will be considered.
Cloud-based qPCR analysis software for rapid-throughput screening of antisense oligonucleotides
Donald Milton, Ionis Pharmaceuticals
One distinct advantage the antisense oligonucleotide (ASO) platform has over other therapeutic approaches is the ability to rapidly screen for safe and effective ASO compounds against new molecular targets. While RNA screening assays often utilize sequencing technologies like RNA-Seq, the gold standard today for rapid and cost-effective gene expression quantification remains the quantitative polymerase chain reaction (qPCR). Advances in automation, miniaturization and microfluidics have enabled researchers to develop high-throughput qPCR assays that generate thousands of data points per day, creating an increased burden on downstream data management, computation and analysis.
Here we present a software tool to complement these high-throughput methods by simplifying the computation and visualization of screening results for standard-curve experiments. Our qPCR pipeline uses modern cloud technology provided by Amazon Web Services to permit users to dynamically generate workflows for a vast array of plate-based qPCR assays. Our visualization tools make use of the HELM notation (Hierarchical Editing Language for Macromolecules) to visualize both the genomic targets and the distinct chemistries employed in SAR screens.
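For readers unfamiliar with standard-curve quantification, the core computation is a linear fit of Ct against log10 input quantity, from which amplification efficiency and unknown quantities follow; a minimal sketch with an illustrative dilution series (not Ionis data):

```python
import math

def fit_standard_curve(quantities, cts):
    """Least-squares fit of Ct against log10(quantity) for a dilution series.
    Returns (slope, intercept, amplification efficiency); a perfectly
    efficient reaction gives efficiency 1.0 (slope ~ -3.32)."""
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / sum(
        (x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Interpolate an unknown sample's starting quantity from its Ct."""
    return 10 ** ((ct - intercept) / slope)

# Illustrative 10-fold dilution series of a standard
slope, intercept, eff = fit_standard_curve(
    [1e4, 1e3, 1e2, 1e1], [20.1, 23.4, 26.8, 30.1])
print(round(eff, 2))  # → 0.99
```

A pipeline like the one described would run this fit per target per plate and flag curves whose efficiency falls outside an acceptable window before quantifying unknowns via `quantify`.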
Supervising the Unsupervised: Maximizing Biological Impact in Cellular Imaging
Finnian Firth, GlaxoSmithKline
The exciting challenge of imaging data is the sheer number of options for recognizing and retrieving meaningful content; while some turn to the ever-growing algorithmic tool-shed of machine learning, others utilize a priori knowledge of the biology at hand to arrive at the answer. With a balance between these two approaches paramount, we implemented a hybrid workflow to re-analyse compound data in a phenotypic COPD screen. Allowing biological subject matter expertise to guide data-driven decisions, and vice versa, we used a combination of knowledge-based, supervised, and unsupervised methods to de-convolute patient-derived macrophages into patient-specific subpopulations. At this level of granularity, we could discern previously masked effects of compounds on healthy and diseased cells, both in their physical properties and population makeup. These differences proved to be key to understanding the underlying phenotypic changes. Avoiding “black box” algorithms, instead favouring those which could be interrogated by biological and data scientists alike, led to faster and more relevant analysis cycles, and helped cement a “marriage” between statistical significance and biological relevance. Here, we discuss the analytical methodologies invoked to achieve this.
Applying 'NewSQL' technologies to scientific data to enable self-guided data discovery and analysis
Daniel Weaver, PerkinElmer
Scientific data organization and analysis remains a significant impediment to drug discovery, particularly in late-stage animal studies, despite years of effort and ongoing “data lake” projects. Recent shifts to more heavily employ outsourced research have further fragmented data standards, increased reliance on ad hoc reports, and yielded single-use data. We have developed a novel approach to data curation and aggregation that enables scientists to self-serve scientific data regardless of its originating source and feed those data into self-guided, open-ended analysis. Our approach relies on NoSQL database technologies to connect to structured existing data sources (like internally developed data lakes) or ad hoc sources like folders of Excel spreadsheets. All results, regardless of source, are indexed into a common data shape that drives performance and ensures a consistent user experience. Discovered results are presented through RESTful web services or a “NewSQL” front-end. Over the past year, we have refined this approach through a collaborative program with a large drug discovery company. In this presentation, we will describe the motivation for our approach, show the results, and provide metrics for how much this novel approach speeds data discovery and utilization.
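The "common data shape" idea can be illustrated with a toy normalizer that maps source-specific field names onto one schema; the field names, records, and mapping below are invented for illustration, not the product's actual schema:

```python
# Hypothetical common shape every indexed record is coerced into
COMMON_FIELDS = ("source", "study_id", "subject", "measure", "value")

def normalize(record, source, field_map):
    """Map a source-specific record into the common data shape.
    field_map maps common field names to the source's field names."""
    shaped = {common: record.get(original) for common, original in field_map.items()}
    shaped["source"] = source
    # Emit every common field, defaulting missing ones to None
    return {k: shaped.get(k) for k in COMMON_FIELDS}

# A row as it might arrive from a data-lake export
lake_row = {"StudyID": "S-01", "Animal": "rat-7", "Endpoint": "bw_g", "Result": 312}
shaped = normalize(lake_row, "data_lake",
                   {"study_id": "StudyID", "subject": "Animal",
                    "measure": "Endpoint", "value": "Result"})
print(shaped["measure"])  # → bw_g
```

Because a spreadsheet row with different column names would pass through the same `normalize` call with its own `field_map`, downstream search and analysis see one consistent schema regardless of origin, which is the design point the abstract makes.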
Session Chair: Clare Tudge, GSK
Preparing Early Drug Discovery for the Machine Learning Revolution
Scott Harrison, Merck
The mass collection of data and its mining with machine learning algorithms is driving a revolution within the industrialized world, enabling data-forward organizations to improve operational efficiency and facilitate decision-making. Operationalizing data science and machine learning pipelines requires well-behaved, curated datasets on which to train models, and ideally, a streamlined workflow that handles data collection, standardization, model building and publication. In early drug discovery, the process of iterative drug design known as the “design, make, test” cycle, whereby chemists design and evaluate drug candidates, synthesize selected designs, and test them in biological assays, is becoming increasingly complex as the technologies employed in each step continue to advance. Successful management of these complex workflows is often difficult to accomplish with available commercial software packages, which often necessitates the construction of novel applications capable of integrating with legacy systems. While building highly integrated software tools is a distinct challenge, the reward is an opportunity to gather information which could be used to further improve the efficiency of drug discovery. This talk will highlight Merck’s approach to building tools that enable collaborative drug design and high-throughput chemical synthesis while gathering information that we anticipate will transform the way scientists discover new medicines.
Combined experimental and computational HTS approaches to target NSD3-mediated protein-protein interactions for therapeutic discovery
Andrey Ivanov, Emory University
Lung cancer is the leading cancer killer of both men and women in the US, and five-year survival for lung cancer patients remains as low as 15% or less. Recent advances in cancer genomics, coupled with the expanded landscape of the oncogenic protein-protein interaction network, have enhanced our understanding of cancer biology, offering new hope for novel therapeutic strategies that exploit lung cancer vulnerabilities. To address this urgent medical need, we utilized a highly robust high-throughput protein-protein interaction (PPI) screening platform and established a lung cancer-associated PPI network, termed OncoPPi. OncoPPi links cancer driver genes, both oncogenes and tumor suppressors, and allows identification of new tumor dependencies to inform novel strategies for therapeutic intervention. As one example, OncoPPi revealed a direct interaction between the MYC oncogene and the NSD3 protein, which plays a critical role in the regulation of chromatin remodeling through a direct association with BRD4. This interaction suggests a novel BRD4 regulatory mechanism and a potential target for perturbing the NSD3-mediated oncogenic pathway. Thus, inhibition of the interactions of NSD3 with MYC and BRD4 by small molecules would provide new tools to investigate NSD3-dependent tumorigenesis and would facilitate lung cancer drug development. To achieve this goal, we have developed a time-resolved fluorescence resonance energy transfer (TR-FRET) assay to monitor the interaction of NSD3S and MYC in a miniaturized 1536-well ultra-high-throughput screening (uHTS) format. Furthermore, to identify compound scaffolds required for the efficient disruption of NSD3 PPIs, we have developed and applied an integrated computational workflow that combines classical cheminformatics approaches with large-scale cross-validation virtual screening methods.
Based on the promising data from our pilot screening, we have launched a large-scale screening campaign to discover potent and selective inhibitors of NSD3 interactions. Together, the OncoPPi network serves as a powerful resource for uncovering new cancer vulnerabilities centered on oncogenic protein-protein interactions. Our complementary experimental and computational high-throughput approaches provide a robust workflow for discovering novel inhibitors of challenging PPIs to facilitate anti-cancer drug development.
A Low-Cost Open-Source Cloud-based Liquid Handling Robotic Platform for Performing Remote Real-Time Collaborative Experiments
Kasper Stoy, IT University of Copenhagen
We have developed a robotic system capable of performing routine liquid-handling experiments as well as artificial chemical life experiments. Our platform consists of an actuation layer on top, an experimental layer in the middle, and a sensing layer at the bottom. The actuation layer comprises the robot head and the modules mounted on it. The modules, e.g. pipette modules, an OCT scanner, an extruder, and a pH probe, are designed to perform actions on experiments. The head holds the modules and moves in the horizontal plane. The experimental layer holds the reaction vessels. The sensing layer consists of a camera below the experimental layer that monitors the experiment, collects data from it, and provides feedback allowing the robot to interact with the experiment.
To develop an open-source, multi-platform user interface for remote real-time control of our robotic system, we decouple the user software for programming experiments from the robot control software. To this end, we use integrated controller hardware, namely a Raspberry Pi 3 single-board computer, instead of a dedicated computer. The resulting platform eases software management, as installing and managing the software libraries required for feedback-based experiments across different hardware and operating systems was previously difficult. Furthermore, it is affordable owing to the low cost of the Raspberry Pi. This approach also enables us to implement a cloud-based software architecture for our platform.
The cloud-based software architecture for our robotic system provides resource sharing and reusability of experiment protocols, the ability to work on the robotic system collaboratively, and parallelization of experiments across different robotic systems. Sharing resources allows users to benefit from protocol templates provided for common experiments, as well as protocol examples developed by other users. This is especially helpful because our liquid-handling robot can be used for numerous applications by different users, so sample experiment protocols can save the user community a significant amount of time. Collaboration on robotic platforms, i.e. multiple users working on the same experiment simultaneously, provides novel opportunities for researchers. In the user interface, they see changes other users make to the experiment protocol in real time. They can modify the same experiment as a team, or receive notifications on experiment progress. Moreover, users can continue to work on the same experiment on another machine. Finally, parallelizing experiments improves efficiency, especially for artificial chemical life experiments, as several long-lasting experiments can be performed on multiple platforms.
A cloud-based implementation of the user interface of our robotic platform is a paradigm shift from the single-user, single-platform concept to single-user multi-platform, multi-user single-platform, and multi-user multi-platform approaches. The single-user multi-platform paradigm, in which one user controls several robotic systems at the same time and runs the same code on multiple robots, allows for a high degree of parallelism. The multi-user single-platform paradigm, in which several users work on the same robot simultaneously, offers great potential for collaboration on the robotic platform. The multi-user multi-platform approach, in which several users, e.g. a team, work on multiple robots, enhances resource sharing and the reusability of experiment protocols.
A comparison of High Throughput Screening and Virtual Screening Approaches in a Biochemical Coupled-Enzyme Inhibition Assay Program
Margaret Kenney, Icagen, Inc
High throughput screening (HTS) and virtual screening (VS) are distinct strategies employed in lead discovery programs to identify potential drug candidates. In an effort to discover novel inhibitors of a Type 2 diabetes target, we employed a multi-pronged strategy using both experimental (HTS) and theoretical (VS) approaches.
An in-house compound library was screened in a well-characterized biochemical, coupled-enzyme assay that has been miniaturized to 1536-well format and optimized for HTS. Both enzymes are potential targets on the same pathway.
Virtual screening against each of the two enzymes broadens the likelihood of garnering hits. The in silico approach for each target enzyme was unique, given their different crystal structures. The first enzyme, with a particularly large binding pocket, required a sub-pocket strategy; to date there is no known inhibitor for this target. The second target, by contrast, has a known inhibitor and proved less challenging. The virtual screens were performed on the same set of compounds as the HTS, along with a further 6.5 million molecules from the eMolecules database, curated based on availability and favorable structural properties. Molecular dynamics simulations followed by ensemble docking on the Phase-filtered data set and MM-GBSA-based post-processing calculations were carried out. A proprietary feature-matching method was employed to select a small but diverse set of hit compounds.
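The triage pattern described here, filter a large pool on simple properties and then pick a small but diverse subset, can be sketched as below. This is a hypothetical illustration, not the authors' pipeline: the property thresholds, feature vectors, and molecule records are all invented.

```python
# Toy sketch of hit triage: property filtering followed by greedy
# max-min diversity selection. All data and thresholds are illustrative.

def passes_filters(mol):
    """Simple availability/property gate (Lipinski-like thresholds)."""
    return mol["mw"] <= 500 and mol["logp"] <= 5 and mol["available"]

def diverse_pick(mols, k):
    """Greedily pick k molecules maximizing minimum pairwise feature
    distance to those already picked (max-min diversity selection)."""
    if not mols:
        return []
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a["features"], b["features"]))
    picked = [mols[0]]
    while len(picked) < min(k, len(mols)):
        best = max((m for m in mols if m not in picked),
                   key=lambda m: min(dist(m, p) for p in picked))
        picked.append(best)
    return picked

pool = [
    {"id": "m1", "mw": 320, "logp": 2.1, "available": True, "features": (0, 0)},
    {"id": "m2", "mw": 410, "logp": 3.4, "available": True, "features": (0, 1)},
    {"id": "m3", "mw": 650, "logp": 4.0, "available": True, "features": (5, 5)},
    {"id": "m4", "mw": 300, "logp": 1.0, "available": True, "features": (9, 9)},
]
hits = diverse_pick([m for m in pool if passes_filters(m)], k=2)
```

In a real campaign the filter stage would use computed descriptors and docking/MM-GBSA scores, and the diversity step would run on chemical fingerprints rather than toy feature tuples.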
Testing a preliminary set of 253 VS hits in the biochemical assay gave a 24% confirmation rate, demonstrating the effectiveness of the VS approach. Further characterization of the confirmed hits established that half of them act at the first enzyme and half at the second.
A head-to-head comparison of the outcomes of both approaches was performed. Multiple parameters were evaluated, including hit quality and true/false positive and negative rates. Evaluation of these factors, the overlap between the compound sets identified by each approach, and the relative merits of each are discussed. The combination of HTS and VS proved complementary for addressing molecular targets and identifying unique chemical matter, with VS playing a critical role by tapping into the very large chemical space of commercial collections, which are not readily available for HTS and are expensive to acquire.
Track Chairs: Chun-wa Chung, GSK and Peter Hodder, Amgen
Session Chair: Anthony Orth, Genomics Institute of the Novartis Research Foundation
Towards a comprehensive strategy to target identification and mode of action elucidation for bioactive small molecules
Markus Schirle, Novartis Institutes for Biomedical Research
Target identification and the elucidation of mechanism of action (MoA) for bioactive small molecules are key steps in phenotypic and pathway-centric approaches to drug discovery. In recent years, various strategies have been introduced and refined that address these questions from different angles and provide glimpses of different aspects of the often complex physical and functional interactions of a compound when exerting its biological effects in vivo: Affinity-based approaches aim to describe the (protein) interactome of drug candidates, which constitutes the full spectrum of potential efficacy and off-targets. These include quantitative chemoproteomics, such as the combinations of small molecule affinity chromatography or photo-affinity labeling with mass spectrometry-based protein identification and quantitation, as well as large-scale implementations of biophysical approaches in vitro such as size-exclusion chromatography. On the other hand, a variety of functional genetic and genomic strategies have been introduced that include (unbiased or targeted) generation of compound-resistant cells followed by identification of the resistance-conferring genetic changes by next-generation sequencing, as well as the utilization of genome-wide knock-down and deletion approaches including RNAi and CRISPR. In these cases, the generation of target/MoA hypotheses is based on the elucidation of functional relationships between a gene and the compound-induced phenotype. In contrast to the individual protein or gene resolution provided by these former strategies, cellular profiling approaches interrogate the overall cellular response to compound treatment at the level of signaling, gene expression, viability or metabolism. Finally, knowledge-based approaches rely on empirical and computational approaches and a reference collection of compounds with known targets and MoA to make inferences.
Since the various approaches provide orthogonal information and have unique strengths, multipronged strategies are best suited to provide a comprehensive picture of the target/MoA space of a bioactive compound and ultimately enable successful elucidation of the efficacy target and its functional link to the phenotype under investigation. The various classes of target ID platforms will be presented and discussed in the context of real-life applications.
Multi-Modal Assays for Functional and Chemical Genomics
Robert Damoiseaux, UCLA
Traditional single-readout assays can provide deep insight into biological systems, especially when coupled with high-throughput screening methodologies. Multiplexed readout assays, such as two-color assays, are a logical extension that furthers our insight into biological systems, but they have inherent limitations in heterogeneous biological systems such as patient-derived cell lines or primary material. Here, multi-modal readouts have the potential to provide insight into the mode of action of drugs and into biological phenomena alike. In the first part of this talk we will highlight how multi-modal readouts, combined with functional genomics, can provide insight into how interferon receptor signaling regulates the checkpoint ligand PD-L1 in patient-derived melanoma cell lines. In the second part we will demonstrate how chemical genomics can be combined with force phenotyping to yield insights into the differences between traditional readouts, such as calcium flux measurements, which are thought to play a major role in cell contractility, and actual measurements of that contractility based on imaging of live patient cells on soft patterned cell substrates.
An integrated approach to identification of mechanism of action and drug targets
Wei Zheng, National Center for Advancing Translational Sciences, National Institutes of Health, Bethesda, MD, USA
High-quality lead compounds can emerge from phenotypic screening, especially in screens for drug repurposing, but the mechanisms of action and drug targets of these lead compounds are usually unknown, and identifying the target and studying the mechanism of action of a lead compound is challenging. We have recently employed two platform technologies to unlock drug-target engagement: Drug Affinity Responsive Target Stability (DARTS) with proteomics by mass spectrometry, and the Cellular Thermal Shift Assay (CETSA) using target-specific antibodies. Both methods use the native proteins in cells to determine the binding of a drug to its target protein, yielding physiologically relevant binding affinities of the drug for its target. By integrating these methods, we identified the AMPK beta-subunit as a drug target of beta-cyclodextrin, which allosterically activates AMPK and autophagy. We also identified three potential targets of Torin-2, which potently suppresses malarial gametocytes. Additional examples of target identification using these methods will be reviewed and discussed. This integrated approach combining DARTS and CETSA can be broadly used for the identification of new drug targets.
A Quantitative Target Engagement Approach to Profile Compound Affinity and Residence Time Across Enzyme Classes In Live Cells
Matthew Robers, Promega Corporation
Intracellular target selectivity is fundamental to pharmacological mechanism. Although there are currently a number of acellular techniques to quantitatively measure target binding or enzymatic inhibition, no biophysical approach exists that offers quantitative, equilibrium-based analysis of target engagement across enzyme classes in live cells. Here we report the application of an energy transfer technique (NanoBRET) that enables the first quantitative approach to broadly profile target occupancy, compound affinity, and residence time for a variety of target classes including kinases and chromatin-modifying enzymes. The NanoBRET method allows for broad kinome profiling of inhibitor selectivity against nearly 200 kinases, and enables a mechanistic interrogation of the potency offsets observed between cellular and acellular analysis. Compared to published biochemical profiling results, we observed an improved intracellular selectivity profile for certain clinically relevant multi-kinase inhibitors. Due to high levels of intracellular ATP, a number of putative drug targets are unexpectedly disengaged in live cells at a clinically relevant drug dose. The energy transfer technique can also be performed in real time, allowing for measurements of drug residence time. Broad kinase profiling of compound residence time reveals surprising kinetic selectivity mechanisms.
Session Chair: David Israel, GSK
ALIS Affinity Selection in Pharmaceutical Discovery
Peter Dandliker, Merck
Affinity selection mass spectrometry (ASMS) is a general, high-throughput method to select and identify small molecule ligands from complex compound mixtures. Merck has advanced a specific ASMS approach termed ALIS (Automated Ligand Identification System), a two-dimensional LC/MS system in line with high-resolution mass spectrometry, to routinely assess one million compound / target encounters per day. This high throughput capability, while traditionally employed for small molecule hit identification, has recently been adapted to deconvolute molecular targets of phenotypically active compounds of unknown mechanism, in an approach termed Protein Array ALIS (PA-ALIS), and to quantitatively rank order the binding affinity of medicinal chemistry analogs in complex mixtures (Protein Titration or PT-ALIS). The PT-ALIS method, when combined with nanoscale parallel or mixture synthesis permits identification of analogs most likely to exhibit potent functional activity starting from very small quantities of material and without need for compound purification prior to biological assay. An introduction to ALIS and the novel application to medicinal chemistry and target identification will be presented.
Identifying new allosteric sites on PTP1B using fragment-based tether scanning
Zachary Hill, University of California, San Francisco
Due to its role in regulating insulin receptor kinase, protein-tyrosine phosphatase 1B (PTP1B) has been a long sought after drug target for the treatment of diabetes and other metabolic disorders. Unfortunately, due to the high homology between PTP family members and the charged nature of substrate mimics, developing selective and cell-permeable active site inhibitors of PTP1B has proven notoriously difficult. For this reason, there has been great interest in developing compounds that allosterically modulate PTP1B activity. Towards this goal, we have been applying a disulfide-tethering fragment-based approach to identify and characterize new binding and allosteric sites on PTP1B. Here we report our progress to date, and show that this “Tether Scanning” approach has allowed us to identify new binding sites on PTP1B, as well as new disulfide fragments that modulate PTP1B activity.
HIPStA, a High Throughput Alternative to CETSA
Robert Blake, Genentech
The measurement of drug-target interaction in the cellular context is critical to many drug development programs. The Cellular Thermal Shift Assay (CETSA) is an established, broadly applicable method for measuring drug-target interaction. However, CETSA has some major limitations that make it difficult to scale to the throughput typically required for a drug development project: it requires heating samples to different temperatures, along with centrifugation and/or filtration steps that limit throughput. The HSP90 Inhibitor Protein Stability Assay (HIPStA) is a novel method for measuring drug-target interaction. Like CETSA, HIPStA is based on the premise that the binding of a ligand to a target protein can influence that protein’s stability. Instead of using heat to destabilize a protein, HIPStA uses a Heat Shock Protein 90 inhibitor (HSP90i) to cause protein instability. Instead of scanning a range of temperatures to establish a thermal denaturation curve, HIPStA applies a range of HSP90i concentrations to determine an HSP90i-induced denaturation curve, and ultimately measures the ability of a compound to stabilize a protein. We present proof-of-concept data for the HIPStA method using three different classes of drug discovery targets: receptor tyrosine kinases, nuclear hormone receptors and cytoplasmic protein kinases. HIPStA represents a more scalable alternative to CETSA for detecting drug-target interaction in cells.
Free energy perturbation calculations as a computational assay for structure based drug design
Victoria Feher, Schrodinger, Inc.
For roughly 25 years, the implementation of computational methods in drug discovery has promised to deliver new drugs to candidacy faster and at lower cost than more traditional drug discovery methods. This promise is increasingly coming to fruition as computational methods have become more robust and more widely used, and as powerful, inexpensive GPU technology allows these calculations to be completed in a timeframe that is meaningful to structure-based drug discovery teams. Notably, recent advances in free energy perturbation algorithms, such as FEP+, allow physics-based interrogation of ligand-target interactions as a predictive computational assay. A number of examples from our drug discovery collaborations, both retrospective and prospective, will illustrate the implementation and success of this method. Typical discovery and optimization tasks are exemplified, such as selecting the ligands with the best target affinities within a chemical series, selecting the best ligands for target selectivity and solubility, and maintaining potency while moving away from an ADME liability. In each case, a set of ~100 or more virtual ligand ideas was prioritized down to fewer than 10 for synthesis, thereby hastening the discovery process.
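As a reminder of the statistical-mechanical identity underlying such calculations (generic notation, not specific to FEP+), the Zwanzig relation gives the free energy difference between two states 0 and 1 with potential energies $U_0$ and $U_1$:

```latex
\Delta A_{0 \to 1} \;=\; -k_{\mathrm{B}} T \,
\ln \left\langle \exp\!\left[ -\frac{U_{1}(\mathbf{x}) - U_{0}(\mathbf{x})}{k_{\mathrm{B}} T} \right] \right\rangle_{0}
```

where the average is taken over configurations $\mathbf{x}$ sampled from state 0. In practice, the transformation of one ligand into another is broken into many small alchemical $\lambda$ windows so that each individual perturbation is well sampled.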
Session Chair: Rusty Lipford, Amgen
Network-driven drug discovery: Exploring biological and chemical space
Victoria Flores, eTherapeutics
A characteristic of complex systems is that their behaviour emerges due to interactions between the multiple constituent entities, and therefore cannot be predicted by a linear combination of the behaviour of the individual components in isolation. This concept applies to biological systems where cellular functions arise from interactions between proteins, second messengers and metabolites within the cell. As such, cellular processes, including those manifested in different pathologies, can be modelled as a collection of interactions to capture their biological complexity. Interaction networks can be used as mathematical models to represent these complex systems.
Modelling cellular complexity as networks that capture the changes underpinning disease could benefit the discovery of new therapeutic agents, as such agents should be better suited to address biological degeneracy, cellular robustness, and disease phenotypes arising from multi-component molecular changes. In the context of network biology, drug discovery can be seen as the identification of agents that may affect the disease by perturbing the underlying network.
We have implemented and used such an approach in a drug discovery platform. Here, cellular disease mechanisms are modelled as protein interaction networks, and network theory-based algorithms are used to identify protein sets that, upon perturbation, will disrupt the integrity of the disease network. The underlying assumption is that structural changes in the disease network translate into modifications of the disease phenotype.
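As a toy illustration of this idea (not eTherapeutics' actual algorithm), a disease mechanism can be modelled as a small protein interaction graph, and candidate target sets scored by how much their removal fragments the network. The proteins and edges below are hypothetical.

```python
# Toy network-perturbation scoring: a candidate protein set is scored by
# the drop in the largest connected component when that set is removed.

from itertools import combinations

def largest_component(nodes, edges):
    """Size of the largest connected component (simple DFS)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            n = stack.pop()
            size += 1
            for m in adj[n] - seen:
                seen.add(m)
                stack.append(m)
        best = max(best, size)
    return best

def best_perturbation(nodes, edges, set_size=2):
    """Exhaustively score candidate protein sets by network disruption."""
    baseline = largest_component(nodes, edges)
    best_score, best_set = -1, None
    for combo in combinations(nodes, set_size):
        kept_nodes = [n for n in nodes if n not in combo]
        kept_edges = [(a, b) for a, b in edges
                      if a in kept_nodes and b in kept_nodes]
        score = baseline - largest_component(kept_nodes, kept_edges)
        if score > best_score:
            best_score, best_set = score, combo
    return best_score, best_set

# Hypothetical disease network: two modules bridged through C-D.
nodes = list("ABCDEF")
edges = [("A", "B"), ("A", "C"), ("B", "C"),   # module 1
         ("D", "E"), ("D", "F"), ("E", "F"),   # module 2
         ("C", "D")]                           # bridge
disruption, targets = best_perturbation(nodes, edges)
```

In the real platform, this toy exhaustive search would be replaced by network-theory algorithms that scale to genome-wide interaction networks and by more sophisticated integrity measures than component size.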
The rationale for compound selection is that a compound's protein footprint should be close to the protein set identified by network analysis. This compound selection step utilises empirical activity data from bioactivity databases together with activity predictions generated from machine learning models.
The computational process generates lists of compounds potentially enriched in actives for the selected disease mechanism. Those compounds are then tested in a variety of cell-based phenotypic assays representing the disease mechanisms being targeted. Hit compounds, defined as compounds active across the assay panel, are confirmed, assessed for QC and IP, and taken into a medicinal chemistry programme for optimisation using phenotype-driven approaches.
We have applied the platform described above to the discovery of novel drug candidates in diverse, biologically complex diseases. The approach is highly productive and consistently identifies hits that have been progressed into novel, potent and selective drug molecules.
Serine Hydrolase Drug Discovery by Activity-based Protein Profiling
Micah Niphakis, Abide Therapeutics
Serine hydrolases are one of the largest known enzyme classes, comprising over 250 members in humans. These enzymes have diverse physiological roles in metabolism, inflammation, neurotransmission, and blood clotting through the production or degradation of a wide range of bioactive molecules. While some members of this class are well characterized and even the targets of approved drugs, many members of the serine hydrolase family remain uncharacterized with respect to their biochemical and physiological functions, in large part due to the lack of tools to investigate these enzymes in living systems.
Activity-based protein profiling (ABPP) has emerged as a powerful chemoproteomic approach for serine hydrolase inhibitor development. Here, we show how Abide Therapeutics has used ABPP to enable the discovery, optimization and characterization of a diverse collection of inhibitors for members of this class. Furthermore, we demonstrate how this technique provides a unified activity, selectivity and target engagement assay applicable to each stage of serine hydrolase drug discovery, from primary screens to clinical trials in humans.
Targeting the DCN1-UBC12 Protein-Protein Interaction in the Neddylation Activation Complex
Shaomeng Wang, University of Michigan
The Cullin-RING E3 ubiquitin ligases (CRLs) regulate homeostasis of approximately 20% of cellular proteins, and their activation requires neddylation of their cullin subunit. Cullin neddylation is modulated by a scaffolding DCN protein through interactions with both the cullin protein and an E2 enzyme such as UBC12. Here we report the discovery of high-affinity, cell-permeable small-molecule inhibitors of the DCN1-UBC12 interaction. Using these small-molecule inhibitors as chemical probes, we made the surprising discovery that the DCN1-UBC12 protein-protein interaction is much more important for the neddylation of cullin 3 than for other cullin family members. Treatment of cells of different tissue types with these potent DCN1 inhibitors selectively converts cellular cullin 3 into an unneddylated, inactive form with no or minimal effects on other cullin members. Our data firmly establish a previously unrecognized, specific role of the DCN1-UBC12 interaction in the cellular neddylation of cullin 3. Our compounds represent first-in-class selective inhibitors of a specific cullin member and are excellent probe compounds for investigating the role of the cullin 3 ligase in biological processes and human diseases. We will also discuss their potential therapeutic applications.
Targeted Protein Degradation via Redirecting the Action of CRL4 E3 Ligases
Brian Cathers, Celgene
Discovery of the mechanism of action of the multiple myeloma drug lenalidomide opens a new chapter in drug discovery. Distinct cereblon binding molecules evoke different phenotypic responses yet bind the same target. Solution of the ligand bound CRBN complex provides a rationale for distinguishing “gain of function” targeting of key substrates including the transcription factors Aiolos and Ikaros, the protein kinase CK1alpha, or the translation termination factor GSPT1. Is it possible to harness the action of a single E3 ligase and direct its actions toward new and different substrates? Are other ligases able to be co-opted in a similar fashion? The presentation will explain distinctions amongst existing drugs and point to the therapeutic power of harnessing protein homeostatic mechanisms via redirecting the action of E3 ligases.
Track Chairs: Angela Cacace, Fulcrum Therapeutics and Paul Blainey, MIT Department of Biological Engineering and Broad Institute of MIT and Harvard
Session Chair: Paul Blainey, MIT Department of Biological Engineering and Broad Institute of MIT and Harvard
SLAS2018 Innovation Award Finalist: Electrophoretic Cytometry Isolates Cytoskeleton Molecular Complexes of Single Cancer Cells
Julea Vlassakis, UC Berkeley/UCSF Joint Graduate Program in Bioengineering
Changes in molecular interactions underpin disease and drug treatment alike. To regulate actin polymerization and depolymerization, over 100 binding proteins complex with monomeric actin (G-actin, 42 kDa) and filamentous actin (F-actin, up to 100s of monomers) [1,2]. In cancer progression, actin polymerization is disrupted, impacting numerous essential cellular processes, from cell motility to proliferation [3]. Consequently, oncology drugs targeting stabilization of F-actin filaments have been studied [4]. However, actin binding proteins and oncology drugs compete with small-molecule stains for actin binding sites. Thus gold-standard F- and G-actin stains have limited utility in cancer studies of actin polymerization and binding protein complexation [5]. Furthermore, assays (e.g., bulk ultracentrifugation) that physically separate F- and G-actin and dissociate inhibiting drugs require millions of cells [6]. Single-cell resolution of actin polymerization state and binding protein complexes would inform drug development, but is currently unfeasible.
We have developed a micro-scale electrophoretic cytometry assay that preserves chemical interactions to separate and detect molecular complexes in up to 1000s of single cells. As a first demonstration, we fractionate F- and G-actin from single cancer cells in a microwell array patterned in polyacrylamide gel. We use gel lid fluidics [7] to introduce a series of lysis buffers, the first containing non-ionic detergents to preserve interactions, followed by a depolymerization buffer. G-actin is electrophoresed in interaction-stabilizing lysis buffer and immobilized in the gel, while F-actin is size-excluded from the gel. Upon delivery of depolymerization buffer, F-actin is electrophoresed in the opposite direction to the G-actin and immobilized. Antibody detection of the actin species yields quantitation of previously unmeasured heterogeneity in F- and G-actin at the single-cell level.
We will discuss the application of electrophoretic cytometry-based molecular complex separation to evaluate single-cell responses to stimuli that alter actin polymerization state and molecular interactions. Future work will assess the efficacy of drugs targeting actin maintenance, as well as their impact on the binding protein machinery responsible for actin polymerization and depolymerization.
(1) Masai, J.; Ishiwata, S.; Fujime, S. Biophys. Chem. 1986, 25 (3), 253–269.
(2) dos Remedios, C. G.; Chhabra, D.; Kekic, M.; Dedova, I. V; Tsubakihara, M.; Berry, D. A.; Nosworthy, N. J. Physiol. Rev. 2003, 83 (2), 433–473.
(3) Rao, K. M. K.; Cohen, H. J. Mutat. Res. 1991, 256 (2–6), 139–148.
(4) Senderowicz, A. M.; Kaur, G.; Sainz, E.; Laing, C.; Inman, W. D.; Rodríguez, J.; Crews, P.; Malspeis, L.; Grever, M. R.; Sausville, E. A. J. Natl. Cancer Inst. 1995, 87 (1), 46–51.
(5) Bubb, M. R.; Senderowicz, A. M.; Sausville, E. A.; Duncan, K. L.; Korn, E. D. J. Biol. Chem. 1994, 269 (21), 14869–14871.
(6) Heacock, C. S.; Bamburg, J. R. Anal. Biochem. 1983, 135 (1), 22–36.
(7) Yamauchi, K.; Herr, A. Microsystems Nanoeng. 2017, 3, 16079.
Massively Parallel Phenotyping, Culturing, Assaying and Sequencing of Single Cells on an OptoSelect Chip
Hayley Bennett, Berkeley Lights, Inc.
Recent developments in sequencing library preparation enable genomic analysis of thousands of individual cells by next-generation sequencing. However, current technologies require abundant starting material and rarely link functional characterization and phenotypic information with the sequencing results from each individual cell.
Here we present Berkeley Lights’ proprietary OptoSelect™ technology, which enables precise manipulation of single cells using low-intensity visible light. For the first time, individual cells can be isolated, annotated, cultured and assayed on a single nanofluidic chip. Cells of interest can then either be exported for various downstream applications or be directly processed to generate high-quality gene expression and genotyping data via our newly developed bead-based sequencing preparation technology. This approach captures genomic information onto our proprietary barcoded beads, which can be decoded both by our Beacon™ system and by sequencing, providing linkage between phenotypic and genomic information.
To demonstrate the flexibility of our technology, we isolated separate samples of T-cells, memory B-cells, and plasma cells as single cells or groups of cells on chip using light-based dielectrophoresis. A series of assays was performed to further characterize the cells. The T-cells and memory B-cells were also cultured over 2-3 days. To further characterize isolated cells, we captured mRNA, B-cell receptor (BCR) or T-cell receptor (TCR) sequences from each isolated and annotated single T-cell or B-cell clone. Here, we describe a new integrated protocol that precisely barcodes each single cell or clone while retaining its specific phenotypic information, allowing us to generate high-quality complementary genomic data.
With the capability to culture primary cells and perform various real-time functional assays, our system provides an automated and extremely flexible solution to empower research and discovery in various fields including antibody discovery and immuno-oncology.
Single cell spatial genomics by seqFISH
Long Cai, Caltech
Identifying the spatial organization of tissues at cellular resolution from single-cell gene expression profiles is essential to understanding many biological systems. We have developed an in situ 3D multiplexed imaging method to quantify hundreds of genes with single-cell resolution via sequential barcoded fluorescence in situ hybridization (seqFISH) (Lubeck et al., 2014). We used seqFISH to identify unique transcriptional states by quantifying and clustering up to 249 genes in 16,958 cells. By visualizing these clustered cells in situ, we identified regions with distinct compositions of cells in different transcriptional states. Together, these results demonstrate the power of seqFISH in transcriptional profiling of complex tissues. Lastly, I will discuss our work in writing lineages and cell event history into the genome of cells by CRISPR/Cas9 genome editing and reading out the stored information in single cells by seqFISH.
New approaches for single cell genome sequencing and mutation analysis
Paul Blainey, MIT Department of Biological Engineering and Broad Institute of MIT and Harvard
Microfluidics and whole-genome amplification are enabling single-cell genomic analyses. At the same time, these technologies limit single-cell genomic studies by imposing cost and complexity (microfluidics) and degrading data quality (whole-genome amplification). Here I will present two new methods for single-cell genome analysis, one that requires no microfluidics or specialized equipment for direct single-cell genome amplification and another that leverages culture-based amplification rather than biochemical amplification to enable studies of de novo mutations in single cells.
Session Chair: Angela Cacace, FULCRUM Therapeutics
Spatial neuron cell-type mapping in mouse brain by in situ sequencing
Mats Nilsson, Science for Life Laboratory, Stockholm University
Single-cell RNA-seq (scRNA-seq) is a powerful tool to classify cells into molecularly defined cell types. However, information about the absolute frequency of cells and their exact spatial location within the original tissue is lost. The brain is the most complex tissue in mammals with respect to the number of different cell types and the way they are arranged locally and through long-range cell-to-cell connections. Here, we demonstrate that in situ sequencing (Ke et al., Nat. Methods, 2013) can be used to build a spatial cell-type map of hundreds of thousands of cells in sections of mouse brain. We use in situ sequencing to map the activity of 99 marker genes within single cells across sections of mouse brains. The marker genes are selected to identify neurons in cortex and hippocampus as defined by scRNA-seq. In a single experiment on a single standard microscopy slide, we can analyse four coronal brain sections from adult mice. Each section contains around 100,000 cells, and we generate about 3 million reads per section. The read distribution for individual marker genes matches well with the Allen Brain Atlas. To turn the 99 molecular distributions into cell types, we use a probabilistic approach to assign identity to individual cells based on comparison with the profiles of 35 cell types as defined by scRNA-seq. The sensitivity of the approach is demonstrated by our identification of rare Pvalb-expressing cells among pyramidal cells in stratum pyramidale, and Cck-positive cells in stratum radiatum.
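One simple way to picture the probabilistic assignment step (a minimal sketch, not the authors' published model; the gene names, counts, and purity-of-profile numbers below are hypothetical) is a maximum-likelihood match of each cell's marker-gene counts against reference expression profiles:

```python
import math

def assign_cell_type(counts, profiles):
    """Assign a cell to the reference profile with the highest
    multinomial log-likelihood. `counts` maps marker gene -> read
    count for one cell; `profiles` maps cell-type name -> expected
    expression fractions over the same genes (each summing to 1)."""
    best_type, best_ll = None, -math.inf
    for cell_type, probs in profiles.items():
        # log P(counts | profile), dropping the count-only constant term
        ll = sum(n * math.log(probs[g]) for g, n in counts.items())
        if ll > best_ll:
            best_type, best_ll = cell_type, ll
    return best_type

# Illustrative profiles over three marker genes (fractions are made up).
profiles = {
    "pyramidal":   {"Pvalb": 0.05, "Cck": 0.05, "Slc17a7": 0.90},
    "interneuron": {"Pvalb": 0.80, "Cck": 0.15, "Slc17a7": 0.05},
}
cell = {"Pvalb": 12, "Cck": 2, "Slc17a7": 1}  # reads dominated by Pvalb
print(assign_cell_type(cell, profiles))        # -> interneuron
```

In the actual study the same idea is applied over 99 marker genes and 35 scRNA-seq-defined reference profiles per cell.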
Ke, R., Mignardi, M., Pacureanu, A., Svedlund, J., Botling, J., Wahlby, C. & Nilsson, M. In situ sequencing for RNA analysis in preserved tissue and cells Nat. Methods 10, 857-860 (2013).
Pathology from the Molecular Scale on Up
Yury Goltsev, Stanford University
High-parameter single-cell analysis has driven deep understanding of immune processes. Using a next-generation single-cell “mass cytometry” platform, we quantify surface markers and cytokine- or drug-responsive indices of kinase targets in analyses of 45 or more parameters (e.g., 45 antibodies, viability, nucleic acid content, and relative cell size). Similarly, we have developed two advanced technologies that enable deep phenotyping of solid tissue in both fresh-frozen and FFPE formats (50–100 markers).
I will present evidence of deep internal order in immune functionality demonstrating that differentiation and immune activities have evolved with a definable “shape”. Further, specific cellular neighborhoods of immune cells are now definable with unique abilities to affect cellular phenotypes—and these neighborhoods alter in various disease states. These shapes and neighborhoods are altered during immune surveillance and “imprinted” during, and after, pathogen attack, traumatic injury, or auto-immune disease. Hierarchies of functionally defined trans-cellular modules are observed that can be used for mechanistic and clinical insights in cancer and immune therapies.
Spatially Resolved Sequencing in Three-dimensional Cancer Tissue to Construct a High-Definition Genomic Map in Space
Sungsik Kim, Interdisciplinary Program for Bioengineering, Seoul National University
We all live in three-dimensional space, and so do our cells. Each of us on this globe has different characteristics and genetic information, and so does each cancer cell in a tumor. We have very different attributes depending on our countries and cultures, and cancer cells in a tumor likewise show distinct phenotypes according to their genotypes, epigenetic states, and environments. Someone who treats the earth as a point-like, zero-dimensional object overlooks the diversity of its people and will draw misleading conclusions. In the same vein, to fully understand cancer, we need to analyze cancer cells as they are, without extracting and collecting their genetic material in a single tube.
Populations of cancer cells display substantial heterogeneity in their phenotypic traits, which is important in both scientific and clinical respects because it is deeply related to carcinogenesis and clinical outcomes. However, profiling genetic information from tumor cells en masse averages out the variability between individual tumor cells. Therefore, heterogeneous genetic information in tumor cells should be accessed by isolating each single cell, or a minimal number of neighboring cells, from the tumor tissue into separate reactors, keeping its genetic information apart from that of the surrounding population.
In addition to the importance of each cell's identity, it is valuable to analyze the spatial organization of each cancer cell in its original tissue context. The spatial distribution of genetic information can affect clinical interpretation, because cancer cells evolve under the geographic conditions and microenvironment of the tissue and can behave differently depending on their location in the tumor. Thus, a spatially resolved sequencing platform meets the needs of cutting-edge cancer biology, linking histopathology to genomics to enable a synergistic and more precise interpretation of cancer.
Here, we describe a spatially resolved sequencing method for three-dimensional cancer tissue to create a high-definition genomic map in space. To enable this, we have developed Phenotype-based High-throughput Laser Isolation and Sequencing (PHLI-seq) technology to isolate individual cancer cells from consecutive tissue sections using a single laser pulse (~1 isolate/second). The isolated cells then undergo whole-genome amplification and sequencing. The PHLI-seq system is equipped with an infrared nanosecond pulse laser and a discharging layer for cancer cell isolation. We have also developed automation software that hospital pathologists or laboratory researchers can use to analyze cells remotely. We applied PHLI-seq to breast cancer tissue to analyze genome-wide copy number alterations (CNAs) and single nucleotide alterations (SNAs), and mapped each isolated cell's genomic data back to its original location in the tissue. Finally, we constructed a genomic map of a breast cancer in 3D space and visualized it using 3D visualization software. This study provides new insights into cancer cell heterogeneity in relation to the spatial location of cancer cells.
Raman spectroscopy of isogenic breast cancer cells derived from organ-specific metastases reveals distinct biochemical signatures
Chi Zhang, Johns Hopkins University
Objective characterization of the biomolecular divergences of metastatic lesions, which distinguish them from the primary tumor, remains challenging but is crucial for better understanding of organ-specific adaptations that regulate metastatic progression. Using an orthotopic xenograft model, we have isolated isogenic metastatic human breast cancer cells directly from organ explants that show phenotypic differences from the primary tumor cell line. Leveraging label-free Raman spectroscopic measurements on these isogenic metastatic breast cancer cells from the brain, spine, lung and liver, we designed decision algorithms to enable accurate differentiation without requiring staining or human interpretation. The Raman spectroscopy-based decision models show significant diagnostic power in resolving these isogenic cell lines by analyzing the nucleic acid, protein, lipid and metabolite content. The latter differences were validated through metabolomic analyses that revealed tissue of origin distinctions between the cell lines. Our findings provide evidence that metastatic spread generates tissue-specific adaptations at the molecular level within cancer cells, and open the door for use of Raman spectroscopy to define organ-specific smart chemotherapeutic approaches.
Session Chair: Benedict Cross, Horizon Discovery Ltd
Next generation target discovery: systematic application of the CRISPR toolkit
Benedict Cross, Horizon Discovery Ltd
Forward genetic screening with CRISPR–Cas9 has provided a promising new way to interrogate the phenotypic consequences of gene manipulation in high-throughput, unbiased analyses for target ID, target validation, drug MOA analysis and patient stratification. Diseases previously refractory to systematic high-throughput interrogation are now coming into the cross-hairs of powerful new functional genomic solutions. To date, the majority of screens have been conducted using loss-of-function perturbation driven by CRISPR–Cas9-mediated gene knockout. Although powerful, this approach does not allow examination of activating gene function, leaving a salient gap in functional genomic analysis. To add depth to our discovery platforms, we have constructed new platforms using both CRISPRi and CRISPRa transcriptional regulation tools. Both platforms have been adapted to use next-generation, highly optimised whole-genome targeting libraries in order to achieve maximum gene expression modulation. Our validation analysis of these approaches revealed outstanding performance and sensitivity, with greater than ten-fold improvement in detection rates compared to existing tools.
Screening for drug resistance with this dual platform yields unambiguous target discovery, and simultaneous evaluation of both activating and inhibiting perturbations reveals direct and opposing phenotypic effects within complex gene networks. Thus, in contrast to loss-of-function-only analysis, these tools can switch the response of affected cells to either sensitisation or resistance, allowing the discovery of key genes that sit at the centre of the hit nexus. These findings demonstrate the unique power of bi-directional functional genomic screening approaches.
The application of these tools to new therapeutic areas is expected to yield crucial new target ID. A major global research focus is in immuno-oncology and the discovery of new immuno-oncology drug targets, including those that alter the character and frequency of T-cell-mediated anti-tumour responses. Although we and others have been able to develop tools that allow highly efficient gene editing of primary T-cells by CRISPR–Cas9, the application of pooled functional genomic screening to primary T-cells has proved a technological hurdle. We have optimised and substantially adapted our pooled CRISPR screening platform to the particular challenge of primary T-cell biology and we will present an update on this promising new capability.
CROP-seq: Pooled CRISPR screening with single-cell transcriptome readout – a high-throughput method for dissecting gene-regulatory mechanisms
Andre Rendeiro, CeMM Research Center for Molecular Medicine of the Austrian Academy of Sciences
CRISPR-based genetic screens are accelerating biological discovery, but current methods have inherent limitations. Widely used pooled screens work well for mechanisms that affect cell survival and proliferation, and can be extended by sortable marker proteins. However, they are restricted to measuring the distribution of guide RNAs before and after applying a selective challenge, and do not provide any detailed phenotypic information. Since the actual cellular responses are not measured, the interpretation and validation of screening hits is generally work-intensive and prone to false positive results. Arrayed CRISPR screens, in which only one gene is targeted at a time, allow for more comprehensive molecular readouts, but at much lower throughput.
We have recently developed a third and complementary screening paradigm, single-cell CRISPR screens, based on the idea that gRNAs and their induced cellular responses are already compartmentalized within single cells. We combined pooled CRISPR screening with single-cell RNA sequencing into a broadly applicable workflow, directly linking guide RNA expression to transcriptome responses in thousands of individual cells (Datlinger et al. 2017 Nature Methods). Our method for CRISPR droplet sequencing (CROP-seq) enables pooled CRISPR screens for entire gene signatures that can be derived directly from the data. Due to its single-cell resolution, CROP-seq can localize the effect of perturbations in complex tissues and cellular differentiation hierarchies, and can work efficiently on scarce material. Furthermore, CROP-seq is compatible with virtually all current methods for single-cell RNA-seq and established strategies for pooled library cloning.
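A core step of any single-cell CRISPR screen is linking each cell barcode to the guide it expresses. The sketch below is an illustrative, simplified version of that assignment logic (not the published CROP-seq pipeline); the guide names, read counts, and the ≥3-read / 3:1-majority thresholds are hypothetical:

```python
def assign_guides(cell_reads):
    """Assign each cell barcode to the gRNA with the most supporting
    reads, discarding ambiguous cells with no clear majority guide.
    `cell_reads` maps cell barcode -> {gRNA name: read count}."""
    assignments = {}
    for cell, guides in cell_reads.items():
        ranked = sorted(guides.items(), key=lambda kv: kv[1], reverse=True)
        top = ranked[0]
        runner_up = ranked[1] if len(ranked) > 1 else (None, 0)
        # Simple purity filter: enough reads, and a clear 3:1 majority.
        if top[1] >= 3 and top[1] >= 3 * runner_up[1]:
            assignments[cell] = top[0]
    return assignments

reads = {
    "AAAC": {"sgTP53_1": 40, "sgNTC_2": 1},   # clean assignment
    "GGTA": {"sgTP53_1": 5,  "sgMYC_3": 4},   # ambiguous -> dropped
}
print(assign_guides(reads))  # {'AAAC': 'sgTP53_1'}
```

Once assigned, each cell's transcriptome can be grouped by its guide to derive perturbation-specific gene signatures directly from the data.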
Since the original publication, we have continued to develop CROP-seq, with a particular focus on in vivo screens in Cas9 mice. We are exploring combinations with alternative CRISPR functions, such as CRISPR inhibition, activation, and targeted epigenetic modification. I will also provide insights into technical improvements such as higher gRNA detection rates, and demonstrate CROP-seq compatibility with single-cell sequencing platforms capable of further scaling up screens. Given the increasing throughput of single-cell transcriptomics and the advent of single-cell multi-omics technology (reviewed in: Bock et al. 2016 Trends in Biotechnology), CROP-seq has the potential to provide comprehensive characterization of large CRISPR libraries and constitutes a powerful method for dissecting cellular regulation at scale.
CRISPR-Mediated Tagging of Endogenous Proteins with a Luminescent Peptide
Keith Wood, Promega Corporation
The effects of synthetic compounds on signaling pathways are often evaluated using overexpressed genetic reporters. It is now possible with CRISPR/Cas9 to better preserve native biology by appending reporters directly onto the endogenous genes. For this purpose, we introduced the HiBiT peptide (11 amino acids) as a small reporter tag capable of producing bright and quantitative luminescence through complementation (KD = 700 pM) with an 18 kDa subunit derived from NanoLuc (LgBiT). The small size of HiBiT minimally alters protein structure, while the luminescent assay provides sensitive analysis at very low expression levels. Using CRISPR/Cas9, we demonstrated that HiBiT can be rapidly and efficiently integrated into the genome to serve as a quantitative tag for endogenous proteins. Without requiring clonal isolation of the edited cells, we were able to determine changes in abundance of the hypoxia inducible factor 1A (HIF1α) and several of its downstream transcriptional targets in response to various stimuli. In combination with fluorescent antibodies, we further used energy transfer from HiBiT to directly correlate HIF1α levels with the hydroxyproline modification that mediates its degradation. These assay methods allowed dynamics in protein abundance and covalent modifications to be assessed within 24-48 hours of introducing synthetic oligonucleotides together with Cas9 into the cells, thus circumventing the prerequisite for molecular cloning.
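The reported KD of 700 pM implies near-complete complementation at practical LgBiT concentrations. As an illustrative worked example (a sketch of the standard hyperbolic binding relation, not a Promega-published calculation; the 70 nM input is hypothetical):

```python
def fraction_complemented(lgbit_nM, kd_nM=0.7):
    """Equilibrium fraction of HiBiT-tagged protein complemented by
    LgBiT, assuming LgBiT is in large excess (simple hyperbolic binding):
        f = [LgBiT] / (KD + [LgBiT])
    KD = 0.7 nM (700 pM), as stated in the abstract."""
    return lgbit_nM / (kd_nM + lgbit_nM)

# With LgBiT at 100x KD, complementation is essentially complete.
print(round(fraction_complemented(70.0), 3))  # 0.99
```

This is why the luminescent signal tracks tagged-protein abundance quantitatively when excess LgBiT is supplied.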
Precision Enabled: Discovery of Gene Networks and New Drug Targets in Metastatic Melanoma with Single Cell Sequencing
Michael Stadnisky, FlowJo, LLC
Despite the buzz around cloud-facilitated data integration, notification, and tracking, LIMS and electronic laboratory notebooks have failed to deliver for multi-omics. Current solutions are either open source or have the user-friendliness of an electronic medical records system, raising the activation energy and time required to install, to collaborate, and ultimately to produce insight. Here, we describe a workflow-based collaboration and communication approach and its use in coordinating the analysis of single-cell gene expression from 19 melanoma metastases. We show that each metastatic tissue is unique to each patient, but identify a rare subpopulation, present in every tumor, that shares the same gene expression pattern. From this, we identify two new drug targets shared between patients, and uncover how the gene expression pattern of the immune response, particularly that of exhausted CD8+ T cells within each metastatic tissue, could be reversed. We thus show how single-cell phenotyping and gene expression experiments can be coordinated and executed to reveal both tumor and immune-response drug targets and gene expression networks. We extend this work to compare the tumor gene expression patterns with the published literature and drug trials, and show that drug repurposing may be an effective strategy for melanoma.
Track Chairs: Andrew De Mello, ETH Zürich and Sammy Datwani, Labcyte
Session Chair: Sammy Datwani, Labcyte Inc.
Microfluidics and Commercial Success? Experience and examples of the last 16 years.
Mark Gilligan, Blacktrace Holdings Ltd
The presenter has built a group of companies over the last 16 years that exploit microfluidics in a wide range of ways, using microfluidics to make a diverse range of products.
During this time, the Dolomite part of the group was also a leading microfluidics consulting team, tackling challenges in widely varied application areas, from the oil industry through food, pharmaceuticals and diagnostics, to name a few. As a result, the presenter has extremely diverse knowledge and experience of the successes and failures of microfluidics commercialization, as well as an understanding of why those outcomes arose. This, coupled with the presenter having served on the Editorial Board of the RSC journal "Lab on a Chip" since 2012, adds a further insight from the academic perspective.
The presenter will help the audience understand the factors that have shaped these successes and failures, so that future ventures can avoid the pitfalls of the past.
An Acoustic Microfluidic Device for Hematopoietic Stem Cell Enrichment from Whole Blood
Charles Lissandrello, Draper
Emerging cell therapies require efficient methods for purifying target cells prior to subsequent processing. For stem cell-based therapies, large numbers of hematopoietic stem cells (HSCs) must be collected and purified from patient peripheral blood, a challenging task because, even after mobilization, the concentration of HSCs in the collected product is typically less than 1% of all cells. Existing processing techniques, such as density gradient centrifugation followed by magnetic separation, achieve some of these requirements; however, they often provide low yield and are costly, time-consuming, and labor-intensive. Acoustic separation has emerged as a versatile technology for flow-through liquid handling and particle manipulation. The technique relies upon differences in the size, density, and compressibility of blood components to achieve rapid, label-free discrimination between target and off-target cells.
Here, we present a system that continuously separates HSCs from both healthy and diseased whole blood using acoustically-mediated separation in a plastic microfluidic device. We and others have previously demonstrated acoustic separation of bacteria, beads, and liposomes from blood cells, but this is the first report showing enrichment of HSCs directly from patient samples. In addition, because our microchannel is constructed entirely of polystyrene, it is suitable for scale-up to clinically relevant processing rates, with the potential for flow rates approaching 100 ml/hr. Our device consists of a microchannel mounted on a piezoelectric actuator and a temperature-controlled stage. The actuator excites an acoustic standing wave within the fluid cavity, transverse to the flow direction. This standing wave exerts a force on blood cells which drives them toward the centerline of the flow. Larger and denser cells experience a larger force compared to smaller and less dense cells, and are more strongly focused. At the downstream end of the channel, a trifurcating outlet allows for the separation of strongly focused cells (e.g., red blood cells and neutrophils) from those that are weakly focused (e.g., HSCs and lymphocytes). In this work the system is tuned to enable the separation and collection of HSCs.
We achieve enrichment of the HSC population (CD34+ as % total white blood cells) from 9% to 22%, a factor of 2.4x, starting from unpurified whole patient blood. This enrichment was achieved in a single pass through the device with HSC recovery of 40% and reduction of the red blood cell concentration by 62%. These figures can be improved by multiple passes through the system and by device optimization. Our results demonstrate that we are able to efficiently and specifically purify HSCs from whole blood in a continuous flow-through device. In addition, our device is fabricated from low-cost components and is straightforward to operate, giving it the potential for future use in sample purification for cell therapy.
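The enrichment figures above can be checked with a one-line calculation (a trivial illustrative sketch using only the numbers stated in the abstract):

```python
def fold_enrichment(purity_in, purity_out):
    """Fold enrichment of the target population, e.g. CD34+ cells as a
    fraction of all white blood cells before vs. after separation."""
    return purity_out / purity_in

# Figures from the abstract: 9% -> 22% in a single pass.
print(round(fold_enrichment(0.09, 0.22), 1))  # 2.4
```

Multiple passes compound multiplicatively, which is why repeated processing or device optimization can raise both purity and recovery further.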
Small Airway-on-a-Chip: A Novel Microphysiological System to Model Human Lung Inflammation, Accelerate Drug Development and Enable Inhalation Toxico-analysis
Kambez Benam, University of Colorado Denver
Lung diseases are a major global health problem with rising incidence and morbidity; they constitute three of the five leading causes of death worldwide (World Health Organization). Lower respiratory infections, chronic obstructive pulmonary disease (COPD) and lung cancers collectively account for over 8 million deaths annually. Unfortunately, development of new therapeutics and advancement of our understanding of inhalation toxico-pathology have been considerably hindered by the challenges of studying organ-level complexities of the human lung in vitro. Importantly, the clinical relevance of widely used animal models of pulmonary disorders is questionable. Here, we applied a microengineering approach known as ‘organ-on-chip’ to create a human lung ‘Small Airway-on-a-Chip’ that supports full differentiation of a pseudostratified mucociliary bronchiolar epithelium from normal or diseased donors, underlain by a functional microvascular endothelium. Small Airway Chips lined with COPD epithelia recapitulated features of the disease, including selective cytokine hypersecretion, increased neutrophil recruitment, and clinical exacerbation upon exposure to pathogenic stimuli. Using this robust in vitro approach, it was possible to detect synergistic tissue-tissue communication, identify new biomarkers of disease exacerbation, and measure responses to anti-inflammatory compounds that inhibit cytokine-induced recruitment of circulating neutrophils. In addition, by connecting the Small Airway Chip to a custom-designed, modular electromechanical instrument that ‘breathes’ freshly produced whole cigarette smoke in and out of the chip microchannels, we successfully recreated smoke-induced oxidative stress, identified new ciliary micropathologies, and discovered unique COPD-specific molecular signatures (‘Breathing-Smoking Lung-on-a-Chip’). Moreover, this platform revealed subtle ciliary damage triggered by acute exposure to electronic cigarettes.
Therefore, the human Small Airway-on-a-Chip and Breathing-Smoking Lung-on-a-Chip technologies represent new tools to study normal and disease-specific responses of the human lung airway to pathogens and tobacco-related products across molecular, cell, and tissue levels in an organ-relevant context, which can facilitate identification of new clinical biomarkers and potential therapeutic targets.
SLAS2018 Innovation Award Finalist: Microfluidic Platform for Next Gen Life Science Consumables
Paul Hung, COMBiNATi Inc
The growing demand for precision, personalized, predictive and preventive medicine will continue to propel new tools, new devices and new drugs to the market. COMBiNATi is developing a novel microfluidic platform technology to disrupt life science consumables. Conventional labware such as test tubes, petri dishes, microtiter plates and flasks, despite its ease of use, is neither effective nor efficient. Microfluidic technology, despite the early enthusiasm of the early 90s, has suffered slower-than-expected adoption because of its cumbersome chip-to-world interface and challenges in scaled production. COMBiNATi’s core microfluidic technology allows translation of a rapidly prototyped microfluidic design into a commercially ready product using a semi-permeable thermoplastic thin film.
Our first product under development is a simple and affordable digital PCR (dPCR) platform to accelerate the paradigm shift from qPCR to dPCR for precise, targeted nucleic acid quantification. Made from cyclo-olefin polymer (COP), the consumable features an array of interdigitated microfluidic lollipops connected to a serpentine channel, with reagent reservoirs injection-molded together with the microfluidics and the bottom sealed with the semi-permeable thin film. A tubing-free pneumatic interface is used to introduce open-chemistry bulk PCR reagent, backfill the microfluidic lollipops, and isolate each lollipop with air. After 40 cycles of PCR on a flat-block thermal cycler, the device is scanned by either an imaging plate reader or a microarray scanner. An ImageJ applet stencils each lollipop to analyze fluorescence intensities, generates a scatter plot, and allows the user to review images of questionable partitions and reject false positives before applying Poisson statistics with a software-guided threshold. A prototype integrating pneumatic, thermal and imaging control was developed to enable a “one-click” dPCR workflow. Preliminary results have been obtained with a TaqMan rare-allele mutation assay (KRAS), ERBB1 copy number variation analysis, and HIV viral load quantification.
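The Poisson step mentioned above corrects for partitions that receive more than one target copy. A minimal sketch of the standard dPCR calculation (the partition counts and 1 nL partition volume below are hypothetical, not COMBiNATi specifications):

```python
import math

def dpcr_concentration(n_positive, n_total, partition_volume_uL):
    """Estimate target copies per microliter from a digital PCR run.
    The Poisson correction accounts for partitions holding >1 copy:
        lambda = -ln(1 - p), where p = fraction of positive partitions
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_volume_uL  # copies per microliter of sample

# Hypothetical run: 5,000 of 20,000 partitions positive, 1 nL each.
print(round(dpcr_concentration(5000, 20000, 0.001), 1))  # ~287.7
```

Without the correction, simply counting 5,000 positives in 20 µL would underestimate the concentration, since some positive partitions contain multiple copies.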
Beyond dPCR, COMBiNATi is also working on a Synthesized Tissue Array Reactor (STAR) for physiologically relevant in vitro liver cell culture, and a Microbioreactor Array as a high-throughput perfusion model for upstream bioprocessing optimization. We believe our microfluidic platform technology will deliver next-gen life science consumables by solving the two critical commercialization barriers: difficult to use and difficult to scale. We are looking to transform the life science industry, one consumable at a time.
Session Chair: Amar Basu, Wayne State University
The Chromium System for Enabling High Resolution Biology
Ben Hindson, 10x Genomics
Reconstructing individual genomes and understanding their impact on biology remains a significant challenge. While large numbers of genomes and transcriptomes have been sequenced, the resolution of the resulting data remains insufficient for many applications. Traditional reference-based, short-read analysis of genomes provides an incomplete picture of individual genome architecture. Likewise, while traditional transcriptomics has provided many biological insights, higher-resolution data will allow new information to be obtained. We have developed high-throughput solutions that address both areas.
For genomic applications, we partition limiting amounts of high-molecular-weight DNA such that unique barcodes can be added as part of library generation. This approach allows us to couple long-range information with high-throughput, accurate short-read sequencing, generating a data type known as Linked-Reads. Coupling this novel data type with new algorithms allows us to access a greater percentage of the genome as well as identify the full spectrum of variant types. Additionally, Linked-Reads enable de novo assembly with modest amounts of sequencing.
For transcriptomic applications, our microfluidics system partitions single cells and then barcodes their transcriptional content. This high-resolution transcriptional profiling allows discrimination of discrete cell types from complex mixtures, enabling the dissection of complex biological processes at high throughput. This opens up new applications for better discriminating immunological processes as well as understanding the tumor micro-environment.
Inline, Label-free Detection Using the Droplet Frequency Sensor
Amar Basu, Wayne State University
Inline detectors are extensively used in chemical separations and other life sciences workflows to quantify analytes based on fundamental properties. For example, absorbance (UV-VIS) detectors measure the analyte's light-absorbing chromophores, refractive index (RI) detectors measure molecular cross section, and electrochemical (EC) detectors and mass spectrometers (MS) measure the analyte's charge or mass-to-charge ratio. Although hydrophobicity and solubility are important properties of an analyte, to date there are no inline detectors based on these properties. Here, we present the drop frequency sensor (DFS), a novel inline detector which quantifies an analyte based on its adsorption to a liquid interface.
The DFS is based on the surfactant retardation effect, a phenomenon first described by Levich in the 1960s. Levich observed that the velocity of a rising bubble is lower than expected if surfactants are present. Surfactants adsorb to the bubble's interface, and surface flows convect them to the trailing end, where they aggregate into a stagnant cap. The cap has two effects, both of which increase drag: 1) the interface becomes immobile, and 2) the nonuniform surfactant concentration results in a surface tension gradient, which induces a Marangoni force opposing the motion of the bubble. The DFS exploits a similar effect in droplets flowing through a microchannel. Droplets of the analyte are generated by combining the sample stream with a stream of oil in a microfluidic tee junction. If the droplet contains hydrophobic molecules or other surface-active species, the molecules adsorb to the interface and are convected to the trailing end, similar to Levich's experiments. Here, they form a stagnant cap which increases drag on the droplet train, and therefore increases the channel's hydrodynamic resistance. In a pressure-driven system, the increased resistance reduces the flow rate as well as the frequency of drop generation. The droplet frequency is measured with a light scattering detector.
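The sensing principle above reduces to a simple relation: in a pressure-driven system, flow rate scales inversely with hydrodynamic resistance, and drop frequency scales with flow rate. A minimal toy model, with illustrative parameter values that are not the authors' measured quantities:

```python
# Toy model of the droplet frequency sensor (DFS) readout.
# All numeric values are assumed for illustration only.

def drop_frequency(delta_p, resistance, drop_volume):
    """Drop generation frequency in a pressure-driven channel.

    Flow rate Q = deltaP / R (hydraulic analog of Ohm's law); each
    droplet consumes a fixed volume, so frequency f = Q / V_drop.
    """
    flow_rate = delta_p / resistance   # e.g. uL/s
    return flow_rate / drop_volume     # drops per second

# Baseline: clean carrier stream.
f0 = drop_frequency(delta_p=10.0, resistance=2.0, drop_volume=0.05)

# Analyte injection: the stagnant surfactant cap raises the channel's
# effective hydrodynamic resistance by some increment dR.
f1 = drop_frequency(delta_p=10.0, resistance=2.0 + 0.4, drop_volume=0.05)

print(f0, f1)  # frequency falls while analyte is present, then recovers
```

The transient dip in f relative to the baseline f0 is what integrates into the chromatographic peak described below.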
The DFS demonstrates excellent quantitation capability for Bovine Serum Albumin (BSA), a globular protein with known hydrophobic regions. Injection of BSA into the analyte stream temporarily reduces the drop frequency, and the frequency then returns to baseline, generating a chromatographic peak. The peak area increases linearly with the quantity of injected BSA, with a correlation coefficient R2 = 0.997. This process is highly repeatable, which is important for measurement precision. The limit of detection for BSA is 2 ng, and <200 pg for L-galectin, a hydrophobic protein of lower molecular weight. The high signal-to-noise ratio suggests that even lower detection limits are possible. The low detection limits are achieved because the high surface-area-to-volume ratio favors adsorption phenomena.
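The calibration described above (peak area vs. injected mass, least-squares fit, quality reported as R2) can be sketched as follows; the data points are synthetic stand-ins, not the authors' measurements:

```python
# Least-squares calibration of peak area against injected analyte mass,
# with the coefficient of determination R^2 as the figure of merit.
# The (mass, area) pairs below are synthetic, for illustration only.

def linear_fit_r2(x, y):
    """Return (slope, intercept, R^2) for an ordinary least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

mass = [2, 5, 10, 20, 40]            # injected BSA (ng), synthetic
area = [1.1, 2.6, 5.2, 10.1, 20.3]   # frequency-dip peak area (a.u.), synthetic
slope, intercept, r2 = linear_fit_r2(mass, area)
```

A near-unity R2 on such a calibration line is what supports quantitation down to the stated detection limits.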
Next Generation Precision Epigenetic Analyses via Automated Droplet Microfluidics
Ryan Bailey, University of Michigan
Insights into epigenetics and chromatin dynamics have profoundly affected our understanding of biological processes including development, aging, complex diseases and oncogenesis, providing a more comprehensive view than could be ascertained by considering information from genetic or gene expression studies alone. Importantly, a pipeline of new therapeutics is emerging that directly target epigenetic modifications or the enzymes that install these modifications. Therefore, detection of genome-associated protein complexes, as well as histone post-translational modifications and positioning, is crucial for the development of new diagnostic and therapeutic strategies. However, the methods utilized in the research laboratory are not at present readily translatable to clinical applications. We are developing a suite of droplet microfluidic components that can be integrated into a robust and automated platform supporting chromatin immunoprecipitation and nucleosome footprinting. The ability to perform these assays with high reproducibility and minimal sample input requirements is poised to enable the clinical realization of personalized epigenetics.
SLAS2018 Innovation Award Finalist: Label-free prediction of cancer cell invasion by single-cell physical phenotyping
Amy Rowat, UCLA
The physical properties of cells, such as cell deformability, are promising label-free biomarkers for cancer diagnosis and prognosis. Here we determine the physical phenotypes that best distinguish human cancer cell lines, and their relationship to cell invasion. We use the high-throughput, single-cell microfluidic method, quantitative deformability cytometry (q-DC), to measure six physical phenotypes including elastic modulus, cell fluidity, transit time, creep time, cell size, and maximum strain at rates of 10² cells/s. By training a simple k-nearest neighbor machine learning algorithm, we demonstrate that multiparameter analysis of physical phenotypes enhances the accuracy of classifying pancreatic cancer cell lines compared to single parameters alone. We also discover a set of four physical phenotypes that predict invasion; using these four parameters, we generate the physical phenotype model of invasion by training a machine learning algorithm with experimental data from a set of human ovarian cancer (HEYA8) cells that overexpress a panel of tumor suppressor microRNAs. We validate the model using breast and ovarian human cancer cell lines with both genetic and pharmacologic perturbations. Our results reveal that the physical phenotype model correctly predicts the invasion of five cancer cell samples. We also identify a context where our model does not accurately predict invasion, which invites deeper investigation into the role of additional physical phenotypes in cancer cell invasion. Taken together, our results highlight how physical phenotyping of single cells coupled with machine learning provides a complementary biomarker to predict the invasion of cancer cells.
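The multiparameter classification step above can be illustrated with a minimal k-nearest-neighbor sketch. The feature vectors and labels below are synthetic stand-ins for the measured physical phenotypes (two of the six parameters, for brevity), not the authors' data:

```python
# Minimal k-nearest-neighbor classifier in phenotype space.
# Training points and the cell-line labels "A"/"B" are synthetic.
import math
from collections import Counter

def knn_predict(train_x, train_y, query, k=3):
    """Classify `query` by majority vote of its k nearest training
    neighbors under Euclidean distance."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train_x, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Synthetic (elastic modulus, transit time) pairs, standing in for the
# full six-parameter phenotype vectors measured by q-DC.
train_x = [(1.0, 0.20), (1.1, 0.30), (0.9, 0.25),   # cell line A
           (2.0, 0.80), (2.2, 0.70), (1.9, 0.90)]   # cell line B
train_y = ["A", "A", "A", "B", "B", "B"]

print(knn_predict(train_x, train_y, (1.05, 0.28)))  # -> "A"
```

Using several phenotypes jointly, as here, is what lets the classifier separate lines that overlap in any single parameter.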
Session Chair: Olivier Frey, InSphero AG
Current landscape and future opportunities in implementing human microphysiological models in pre-clinical drug development
Jason Ekert, R&D Platform Technology & Science, GlaxoSmithKline
The pharmaceutical industry is still facing great challenges owing to high R&D costs and low overall success rates of clinical compounds during drug development. In phase I clinical trials, the majority of failures are due to safety-related issues, while more than 50% of failures in phase II and III clinical trials are due to a lack of efficacy and a quarter to safety issues, where safety includes failures due to an insufficient therapeutic index. Drug failures in clinical trials are mainly due to the poor translational relevance and clinical predictive power of existing preclinical models, which include human cell-based in vitro and animal models. The drug discovery community has recognized the critical need for new testing approaches to generate more translatable and reliable predictions of drug efficacy and safety in humans. This has driven the recent advancements in cell biology, tissue engineering, biomaterials, and emerging platforms such as microfabrication, microfluidics and bioprinting in the development of innovative in vitro technologies that more closely recapitulate human tissues and organs. These three-dimensional (3D) human in vitro models, such as 3D spheroids/organoids, organs-on-chips, and bioprinted tissues, could provide the basis for preclinical assays with greater translatability and predictive power. They could be applied for greater insight into mechanisms of human disease, mechanisms of toxicity or for early confirmation of new therapy efficacy. I will provide a perspective on the breadth of new opportunities available for the integration of these 3D human in vitro models within drug discovery and the related challenges in adoption. I will introduce key technological background and advantages/limitations of each novel 3D human in vitro model with examples from recent studies or cases.
Furthermore, I will discuss the essential validation process for these 3D human in vitro technologies, the importance of integration of various models, and their translatability to the clinic. I will conclude by examining how 3D in vitro technology will begin to tackle major technical challenges at the critical steps of the conventional and evolving drug discovery process.
SLAS2018 Innovation Award Finalist: Automating multi-tissue microphysiological systems using 3D microtissues
Olivier Frey, InSphero AG
The next step towards more biomimetic in vitro models is the design of multi-organ devices which allow communication between different tissue types. Combining physiologically relevant organ models in perfusion systems bears technological challenges and often leads to complicated culturing setups. Complex systems require trained personnel, reduce reproducibility and make integration into scalable routine processes difficult. The multi-tissue platform presented here builds on the European project "Body-on-a-Chip" and is developed in collaboration with ETH Zurich. Microfluidic channels and chambers were engineered for culturing of microtissue spheroids under physiological flow conditions. The platform has a plate format, is produced completely out of polystyrene, and complies with SBS-standard dimensions. It includes 8 parallel channels, with each channel containing up to 10 microtissue compartments. The compartments have minimal dead volume (<2 uL) and are directly accessible with a robotic pipet tip for microtissue loading and retrieval. Open media reservoirs are located at both ends of each channel. Perfusion flow is generated by tilting the device back and forth on an automated system inside an incubator. Multiple devices can be operated in parallel, increasing the number of conditions and statistical replicates executed simultaneously. The concept allows on-demand interconnection of up to 10 same or different microtissues per channel in a very flexible way. With the broad range of available spheroid-based organ models, a variety of pre-clinical testing applications can be generated using the very same platform. Using the system, we were able, for example, to demonstrate that liver and islet microtissues show significantly higher functionality under flow conditions compared to static culturing. The metabolic function of the liver has been used to activate prodrugs and study their effect on tumors in liver-tumor co-cultures.
Liver-islet interactions are currently being investigated for metabolic disease models and for studying the influence of glucose-stimulated insulin secretion on liver metabolism.
Inflammation-on-a-chip – High-throughput microscale arrays for human neutrophil swarming
Daniel Irimia, Harvard Medical School - Massachusetts General Hospital
Neutrophil swarms protect healthy tissues by sealing off sites of infection. In the absence of swarming, microbial invasion of surrounding tissues can result in severe infections. Recent observations in animal models have shown that swarming requires rapid neutrophil responses and well-choreographed neutrophil migration patterns. However, in animal models, physical access to the molecular signals coordinating neutrophil activities during swarming is limited. Here, we report the development and validation of large microscale arrays of targets for the study of human neutrophils during swarming ex vivo. We characterized the synchronized growth of thousands of swarms at once, towards live-microbe and microbe-like synthetic particles simulating infections. We took advantage of the synchronized swarming in small volumes to analyze in detail the mediators released at different phases of human-neutrophil swarming against various targets. We found that the mediators coordinating human-neutrophil swarming form a complex network, with multiple levels of redundancy, which includes more than 40 signaling proteins: stimulators of neutrophil activity, proteolytic enzymes and enzyme inhibitors, and activators of other immune and non-immune cells (monocytes, lymphocytes, endothelial cells, adipocytes, etc.). We identified only one mediator that limits the growth of neutrophil swarms, LXA4, a lipid which has previously been associated with the restoration of immune homeostasis. We compared the swarming behavior of neutrophils from patients following major trauma and healthy individuals and found various deficiencies that resolve over time. Overall, we report a new platform technology for studying neutrophil swarming, a behavior that is relevant to various disease and physiologic processes, and which could serve as a discovery and validation platform for novel anti-inflammatory and anti-microbial treatments.
SELA-Chip: A microfluidic airway organ-on-a-chip system for studying respiratory health
Edmond Young, University of Toronto
The lung airway tissue environment comprises different cell types, including airway epithelial cells (ECs) and smooth muscle cells (SMCs), which have been shown to play major roles in the pathogenesis of chronic lung diseases (CLDs). Communication between ECs and SMCs is a crucial aspect of CLDs triggered by exacerbating factors such as air pollutants and pathogens. SMC-EC interactions are thus central to elucidating mechanisms in CLDs, and important for revealing new opportunities for CLD therapy. However, our understanding of SMC-EC interactions in lung airways remains limited due to a lack of experimental models that can accurately mimic the human physiology of the lung. Recent advances in organ-on-a-chip lung models have demonstrated excellent physiologic mimicry of alveolar and small airway function. However, these systems have yet to recapitulate airway SMC-EC interactions in microfluidic cell culture, and have relied on biologically inert polyester membranes to separate culture compartments. Furthermore, existing microfluidic devices face challenges in throughput due to the use of PDMS and designs that are not amenable to parallelization.
Our objective was to address these limitations by developing a multi-layer microfluidic organ-on-a-chip device, called SELA-Chip, which uses a biocompatible hydrogel for compartmentalization, as well as an arrayable design to increase experimental throughput, with specific application to studying EC-SMC interactions of the lung airway. The SELA-Chip was fabricated in acrylic using a combination of micromilling and solvent bonding techniques. The chip consisted of 3 vertically stacked microfluidic compartments representing a top epithelial chamber, a middle ECM gel layer, and a bottom reservoir chamber for SMC culture. The key design element that realized this concept was the protruding ledge on each side of the middle microfluidic compartment, which created a unique cross-section that enabled gel suspension. A hydrogel mixture consisting of Type I collagen and Matrigel was used to form the suspended hydrogel layer. Once the gel was suspended, we cultured Calu-3 epithelial cells on the top surface for 21 days, and co-cultured human bronchial SMCs on the bottom surface for an additional 14 days. In separate experiments, we removed the liquid medium above the Calu-3 cells after 14 days, and introduced air-liquid interface (ALI) culture for 31 days. Immunostaining was performed to visualize cell markers including ZO-1 tight junctions, MUC5AC, F-actin, and α-SMA. SMC alignment was analyzed using ImageJ to measure cell orientation angles with respect to the gel cross-section. After 31 days of ALI culture, we observed positive staining of MUC5AC, an indication of differentiated mucin-producing goblet cells. Cocultures of ECs and SMCs showed excellent ZO-1 tight junction formation, and increased SMC alignment from days 4 to 7. Future work will involve introducing airflow, pollutants and airborne pathogens to examine EC-SMC interactions under physiologic conditions.
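The orientation analysis described above, measuring cell angles relative to a reference axis, is commonly summarized with a 2D nematic order parameter. A minimal sketch of such a metric (the angle lists are synthetic, and this specific metric is an assumption, not necessarily the statistic used in the study):

```python
# Illustrative alignment metric: the 2D order parameter
# S = mean(cos(2*theta)), where theta is each cell's orientation angle
# from the reference axis. S approaches 1 for perfectly aligned cells
# and 0 for randomly oriented cells. Angle lists are synthetic.
import math

def order_parameter(angles_deg):
    """2D nematic order parameter for a list of orientation angles."""
    return sum(math.cos(2 * math.radians(a)) for a in angles_deg) / len(angles_deg)

aligned = [2, -5, 4, 1, -3]          # tightly clustered around 0 degrees
random_like = [0, 45, 90, 135, 10]   # spread across orientations

print(order_parameter(aligned))      # close to 1
print(order_parameter(random_like))  # much closer to 0
```

An increase in such a statistic between days 4 and 7 would quantify the growing SMC alignment reported in the co-cultures.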