
High Content Analysis - Day 1


Wednesday, January 13

7:00-8:00 Conference Registration and Morning Coffee

8:00-8:10 Welcoming Remarks from Conference Director

Julia Boguslavsky, Cambridge Healthtech Institute

8:10-8:15 Introduction from Executive Sponsor

Jeff Haskins, Ph.D., Product Line Director, Cellular Imaging & Analysis, Thermo Fisher Scientific

8:15-9:15 Panel Discussion with End-Users and Vendors

Discussion Questions Include:

  • What new applications and image analysis tools are expected to be launched in 2010?
  • What is the progress on imaging standards?
  • How is the need for new probes and protein-protein interaction imaging tools addressed?
  • What are the new application areas undergoing rapid adoption?

9:15-10:30 Coffee Break with Exhibit and Poster Viewing

HCA of Stem Cells

10:30-10:35 Chairperson’s Opening Remarks

10:35-11:00 High-Content Imaging in Oncology Discovery: Identification and Characterization of Novel Targets in Cancer Stem Cells

Jonathan Low, Ph.D., Post-Doctoral Scientist, Cancer Cell Growth and Survival, Lilly Corporate Center

Although the cycling of eukaryotic cells has long been a primary focus for cancer therapeutics, recent advances in imaging and data analysis allow even further definition of cellular events as they occur in individual cells and cellular subpopulations in response to treatment. High-content imaging (HCI) has been an effective tool to elucidate cellular responses to a variety of agents; however, these data were most frequently observed as averages of the entire captured population, unnecessarily decreasing the resolution of each assay. Here we dissect the eukaryotic cellular subpopulations in response to treatment using HCI in conjunction with unsupervised K-means clustering. We first generate distinct phenotypic fingerprints for each major cell cycle and mitotic compartment and use those fingerprints to characterize chemotherapeutic agents. We determine that the cell cycle arrest phenotypes caused by these agents are similar to, though distinct from, those found in untreated cells both in vitro and in vivo, and that these distinctions frequently suggest the mechanism of action. Further, we demonstrate the power of this technique to identify novel targets and detect the differential effects of target knockdown on cancer stem cells through the use of shRNA libraries. High-content data are then integrated with additional discovery tools to link phenotypic changes with cellular pathways. HCI analysis of imaging data, obtained from individual cells under all of these research conditions, grouped into cellular subpopulations, and multiplexed with additional tools, represents a powerful method to discern both cellular events and treatment effects.
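For readers unfamiliar with the approach, subpopulation dissection by unsupervised K-means clustering can be sketched in a few lines. This is a minimal illustration with synthetic stand-in features (DNA content, phospho-marker intensity, nuclear area are assumptions for illustration, not the speaker's actual data):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic per-cell "phenotypic fingerprints": hypothetical features such as
# DNA content, phospho-histone H3 intensity, and nuclear area.
g1_like = rng.normal([2.0, 0.1, 80.0], [0.10, 0.05, 8.0], size=(300, 3))
g2m_like = rng.normal([4.0, 0.9, 110.0], [0.15, 0.10, 10.0], size=(200, 3))
cells = np.vstack([g1_like, g2m_like])

# Standardize features so no single channel dominates the distance metric.
X = StandardScaler().fit_transform(cells)

# Unsupervised K-means assigns each cell to a phenotypic subpopulation.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_

# The fraction of cells in each subpopulation forms a per-treatment fingerprint
# that can be compared across compounds.
fractions = np.bincount(labels) / len(labels)
```

The per-treatment `fractions` vector is the kind of summary that lets cell-cycle arrest phenotypes be compared across agents without averaging away subpopulation structure.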

11:00-11:25 Characteristics and Re-Programming of Breast Cancer Stem Cells

Fredika M. Robertson, Ph.D., Professor, Department of Experimental Therapeutics; Director, Translational Research, The Morgan Welch Inflammatory Breast Cancer Research Program, The University of Texas M.D. Anderson Cancer Center

Very aggressive tumor types contain a high percentage of cells defined as cancer stem cells (CSCs) with characteristics similar to those of embryonic stem cells, including a slow turnover time that can be imaged and quantitated by their retention of the nucleoside analog ethynyl deoxyuridine (EDU). CSCs form 3-dimensional (3D) tumor spheroids that differentially express specific surface markers including stage specific embryonic antigens 1 and 4 (SSEA1/4), CD133, and CD44+/CD24-/low. CSCs have characteristic patterns of gene expression of molecules in signaling pathways that regulate survival, self-renewal, pluripotency, and multi-drug resistance. Agents that either stimulate the normally quiescent CSCs to re-enter the cell cycle or target transcription factors regulating self-renewal and survival can re-program CSCs for their elimination. The effects of exposure to these agents that can re-program activities of CSCs will be discussed in the context of image-based analysis.

Sponsored by
GE Healthcare
11:25-11:40 Illuminating Your Stem Cell Research with the IN Cell Analyzer 2000: A Focus on Cell Colony Analysis
Stephen Minger, Ph.D., Head Research & Development, Cell Technologies, GE Healthcare
Understanding stem cell colony growth, status, and progression through differentiation routes has become a major factor in harnessing the potential of stem cell technologies for the future. High-content analysis of whole-well images from the IN Cell Analyzer 2000 using Investigator image analysis software will be discussed, providing new approaches and valuable insights into these important evaluations.

11:40-11:55 Using Embryonic Stem Cells for Drug Discovery

Amy Sinor, Ph.D., Assay Development Scientist, Department of Stem Cell and Regenerative Biology, Harvard Stem Cell Institute, Harvard University

Many neurodegenerative diseases involve the selective death of specific types of cells. Scientists interested in understanding selective neural degeneration have been hampered by the lack of experimentally convenient in vitro systems. In the last few years, it has become possible to generate large numbers of motor neurons from mouse embryonic stem (ES) cells. This has enabled us to carry out new types of studies directed at identifying therapeutics for two different motor neuron diseases: Spinal Muscular Atrophy (SMA) and Amyotrophic Lateral Sclerosis (ALS). To address this issue, we have carried out high-content screens in ES-derived motor neurons from wildtype, SMN-deficient, and mutant human SOD1 ES cells. All the ES cell lines used carry a transgene in which GFP expression is regulated by the HB9 promoter, a motor neuron-specific marker. This marker allowed us to identify motor neurons in order to determine SMN protein expression and the number of motor neurons. Our goal was to identify small molecule compounds that could increase SMN levels and promote neuronal survival.


Image Analysis

10:30-10:35 Chairperson’s Opening Remarks

10:35-11:00 Quantifying Challenging Phenotypes in Images

Mark-Anthony Bray, Ph.D., Computational Biologist, Imaging Platform, Broad Institute

Many challenging image-based phenotypes have recently become quantifiable due to advances in image analysis and machine learning algorithms. Our recent work in the area has enabled high-content analysis of phenotypes relevant to multiple basic biological processes and clinically relevant diseases. For example, our recent work has enabled screens of phenotypes in physiologically relevant co-culture systems, where cell types with diverse morphologies are present in each sample. The variety of phenotypes that can be accurately quantified using software continues to grow.

11:00-11:25 Quantitative Analysis of High-Content Screens: How Can Machine Intelligence Help?

Peter Horvath, Ph.D., Image Processing Scientist, Light Microscopy Centre, ETH Zurich

Accurate quantitative analysis is essential for high-content screens. We will show cell-based classification with machine learning techniques. A novel semi-supervised learning-based method will be presented that speeds up the learning process by orders of magnitude. Finally, we will present new machine intelligence methods for accurate quality control.

11:25-11:50 Application of Pattern Recognition to Image-Based Small Molecule Screening Data for Phenotypic Analysis

John McLaughlin, Ph.D., Scientist & Manager, Biology, Rigel Pharmaceuticals, Inc.

This presentation will describe an image-based phenotypic screen for Aurora B kinase inhibitors that we have developed, which led to the discovery and subsequent development of a small molecule, R763/AS703569, currently in clinical trials for cancer. This screen is a proliferation-type assay in which cancer cell lines are treated with small molecules for 48 hours, then fixed and stained for the presence of DNA and actin. We create training sets from treatments with control compounds and use them to build support vector machine classifiers that are subsequently used to mine our data for interesting phenotypes in addition to Aurora B. Our large annotated data set with many well-characterized controls has provided an excellent opportunity to validate and improve the predictive capabilities of the classifiers. Various strategies for increasing training set robustness have demonstrated an impressive ability to productively mine screening data collected on a weekly basis over many years. We have found that pattern recognition can significantly enhance and speed attempts to quantify the often overwhelmingly large and complex image data sets produced by image-based screening.
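The control-compound training strategy described above can be sketched as follows. This is a hedged toy example, not Rigel's pipeline: the two features and the well data are synthetic placeholders, and scikit-learn's SVC stands in for whatever SVM implementation was actually used:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Training set: per-cell features (e.g. DNA intensity, actin texture --
# hypothetical stand-ins) from two control treatments.
dmso_cells = rng.normal([1.0, 0.5], 0.1, size=(200, 2))     # vehicle control
aurora_cells = rng.normal([2.0, 1.2], 0.1, size=(200, 2))   # inhibitor control
X = np.vstack([dmso_cells, aurora_cells])
y = np.array([0] * 200 + [1] * 200)  # 0 = normal, 1 = Aurora-inhibited phenotype

# Train an SVM classifier on the control phenotypes.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale")).fit(X, y)

# Mine an unknown well: the fraction of cells scored as the inhibitor phenotype
# becomes the well-level readout used for hit calling.
unknown_well = rng.normal([2.0, 1.2], 0.1, size=(50, 2))
hit_fraction = float(np.mean(clf.predict(unknown_well) == 1))
```

Reusing classifiers trained on well-characterized controls is what lets an archived screening data set be re-mined later for phenotypes beyond the original target.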

11:50-12:15 High-Performance Image Analysis for High-Content Screening

Dadong Wang, Ph.D., Project Leader, Biotech Imaging, CSIRO

Large image datasets and fast turnaround requirements have made efficient High-Content Screening (HCS) a challenging task. With the enormous progress in high-performance computing, computers with multi-core CPUs have become standard and GPUs are being used more widely in data- and compute-intensive environments. This talk will report some of our studies in high-performance image analysis and its applications in HCA, including GPU-based image analysis and multi-core-based batch processing for the quantitative high-content analysis of neurite outgrowth. With multi-core-based batch processing on a quad-core machine, the run time of our neuron body detection algorithm has been reduced to 38% of the original, and that of neurite analysis to 46%. The results show that high-performance image analysis can significantly increase the throughput of HCS and improve the workflow in laboratories.
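The quoted percentages translate directly into speedups, and the batch idea itself is simple to sketch. The snippet below is illustrative only: `analyze_image` is a hypothetical stand-in for a real per-image routine, and a thread pool is shown for brevity (genuine multi-core gains in CPython would need process-level parallelism or GPU offload, as the talk describes):

```python
from concurrent.futures import ThreadPoolExecutor

# Run time reduced "to 38% of the original" is a ~2.6x speedup:
body_speedup = 1 / 0.38      # neuron body detection, ~2.63x
neurite_speedup = 1 / 0.46   # neurite analysis, ~2.17x

def analyze_image(image_id):
    # Placeholder for a per-image analysis routine (e.g. neurite tracing).
    return image_id * 2

# Batch-process a plate's worth of images across parallel workers;
# map() preserves the input order of results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(analyze_image, range(8)))
```

Because each image is analyzed independently, the workload is embarrassingly parallel and the achievable speedup is bounded mainly by core count and I/O.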

Flow Cytometry

10:30-10:35 Chairperson’s Opening Remarks

10:35-11:00 Implications of High-Throughput Flow Cytometry on Drug Discovery

J. Paul Robinson, Ph.D., SVM, Professor, Cytomics & Deputy Director, Bindley Bioscience Center, Cytomics & Imaging, Purdue University

The time has come for high-content tools such as flow cytometry to also move into the high-throughput domain. This requires both hardware and software changes. It is not easy to change the basic operational rationale of a technology that has a 40-year history of operating under the same conditions. However, that change is happening. One of the major drivers is the radical change in the analytical tools becoming available. This presentation will outline these recent tool-sets that will transform the field of flow cytometry.

11:00-11:25 Phospho Flow Cytometry in Drug Discovery: From Screening to Clinical Trials

Peter Krutzik, Ph.D., Senior Scientist, Baxter Lab in Genetics Pharmacology, Microbiology & Immunology, Stanford University

Flow cytometry is a powerful tool for analyzing 10 or more parameters at the single cell level. Recently, the use of phospho-specific antibodies has allowed us to measure intracellular signaling events in addition to classical surface markers. This enables us to analyze kinase signaling cascades in complex primary cell populations such as human peripheral blood. Using phospho flow, we performed a small molecule drug screen in primary cells, in both 96- and 384-well format, to search for inhibitors of immunological pathways. The screen yielded pathway-specific and novel cell type-specific inhibitors of cytokine-induced Jak-Stat signaling. The method was used both in vitro and in vivo to confirm drug activity. To improve sample throughput, we employed Fluorescent Cell Barcoding (FCB), a multiplexing method that enables combination of samples prior to antibody staining. We will also discuss preliminary work in automating the phospho flow method for large scale screening projects.

11:25-11:50 High-Content High-Throughput Flow Cytometry for Small Molecule Discovery

Eric Prossnitz, Ph.D., Professor, Cell Biology and Physiology, University of New Mexico

The University of New Mexico Center for Molecular Discovery continues to innovate in the application of the HyperCyt flow cytometry platform for high-content, high-throughput small molecule discovery. The platform is evolving toward 1536-well plates and direct sample delivery. Recently, we have demonstrated HTS applications with primary cells and yeast multiplex model systems for TOR pathway analysis, as well as innovative molecular assays for intracellular trafficking pathways. The flow cytometry platform is well suited to fill a unique niche in small molecule identification for cell and molecular assays in suspension, especially complex cell suspensions of primary cells, hematopoietic stem cells, and leukemic cells.

11:50-12:15 SERS Cytometry for High-Content Analysis: More Parameters for Less

John P. Nolan, Ph.D., Professor, La Jolla Bioengineering Institute

Fluorescence methods dominate the field of cytometry, providing sensitive and quantitative measurements of molecules, cells, and other particles. In flow cytometry especially, multiple light sources, filters and detectors enable as many as 20 different fluorescence probes to be detected and measured simultaneously and rapidly on individual cells. However, this requires the use of multiple lasers and fluorophores with emission spectra that fill the optical spectrum from the UV to the near IR, and significant increases in this number are unlikely with existing light sources, fluorophores, and detectors. To make more efficient use of this spectral range, we have developed instruments and probes that take advantage of surface-enhanced Raman scattering (SERS). SERS occurs at the surface of metal nanoparticles and offers sensitivity comparable to fluorescence, but with much more efficient use of the optical spectrum, offering the potential for hundreds of tags to be resolved with a single laser line and less than 100 nm of spectral space. We use nanoparticle probes with distinctive SERS spectra functionalized with antibodies or other targeting molecules to measure multiple targets simultaneously. Raman flow cytometers use spectrographs and array detectors to measure high resolution SERS spectra from hundreds of individual particles per second. Simultaneous Raman and fluorescence flow cytometry provides the best of both worlds, combining fluorescence measurements of functional and antigenic markers with highly multiparametric SERS measurements of antigens or other targets.


Luncheon Technology Showcase:
High-Content Screening

Sponsored by Molecular Devices
12:30-1:00 Improved Speed and Applications Flexibility for High-Content Screening: The MDS Complete Imaging Solution for HCS

Michael Sjaastad, Ph.D., Director, Marketing, Cellular Imaging, MDS Analytical Technologies

Speed and application flexibility allow researchers to process more compounds in high-content screens while maintaining the data quality and content achieved using traditional microscopy. MDS Analytical Technologies offers a Complete Imaging Solution for HCS to seamlessly acquire, analyze and identify compounds for hit selection. Three choices in instrumentation provide a range of image resolution and speed for all HCS applications. Turnkey image analysis modules enable hundreds of specific assays while proprietary parallel processing software now accelerates image analysis many fold. These new capabilities improve workflow and accelerate hit identification. We will present examples of the Complete Imaging Solution for HCS used for confocal imaging campaigns, object-based image screens at 5 minutes per 1536-well plate, and large-organism screening of zebrafish.

Sponsored by Millipore
1:00-1:30 Technology Short Talk
Illuminating Your Pathway to Discovery: High Content Analysis Assays For Efficacy and Toxicity Profiling
Andrew Ball, Ph.D., Senior Scientist, Millipore Corporation 
The quality of High Content Analysis data depends heavily on the quality of the detection reagents employed. At Millipore, we have developed a large portfolio of HCA assays for multiple stages in drug discovery and development, from primary screening through in vitro toxicity assessment. These include assays for hepatotoxicity, neurotoxicity, cell cycle control, cell signaling, cellular stress pathways, kinases and GPCR profiling. We will present data showing the effectiveness of these assays in screening applications, and will describe how Millipore’s approach to HCA offers the end user enhanced detection capabilities and improved productivity.

Sponsored by Thermo Scientific
1:30-2:00 Technology Short Talk
Redistribution assays: Adaptation of Assays Designed for Compound Screening to RNAi Screening Procedures
Yuriy V. Fedorov, Ph.D., Research Scientist III, Thermo Scientific Genomics, Thermo Fisher Scientific
Redistribution® technology is a cell-based assay technology that uses protein translocation as a readout for the activity of cellular signaling pathways. Redistribution assays were initially developed for screening and profiling of small molecule drug candidates; however, these assays are also amenable to use with RNAi reagents for functional genomic screening in mammalian cells. Here we describe an adaptation of the Rad51 Redistribution HCA assay to RNAi-based screening. RAD51 is important for homologous recombination and genetic integrity and represents a likely node for investigating putative intersections between hypoxia-induced cell stress and DNA repair pathways. Using optimized conditions and proper siRNA controls, we screened a Human Protein Kinase siRNA library and identified kinases that may be considered candidate targets in chemotherapeutic sensitization strategies. In conclusion, we demonstrated the utility of RNAi-based screening combined with high-content cell imaging to assess the molecular factors involved in Rad51 regulation and their contribution to tumor biology.



Luncheon Technology Showcase:
High-Content Data Analysis

Sponsored by Thermo Scientific
12:30-12:45 Intelligent Assay Development: Removing the Bottlenecks Using iDEV™ Software

Scott Keefer, M.B.A., Product Manager, Thermo Fisher Scientific

Rapid, robust, in-depth image analysis is the key to researchers’ productivity in high content, yet developing assays for a vast range of biologies, from simple translocations through complex morphological phenotypes to whole organisms such as zebrafish, remains a bottleneck for many researchers. The Thermo Scientific iDEV software combines our BioApplication image analysis power with the latest innovations in software to provide a simple workflow that guides assay development in the most efficient manner possible, even for someone new to image analysis and high content. We will demonstrate this new innovation and show how even the most complex biologies can be quickly and comprehensively analyzed in minutes.

Sponsored by
GE Healthcare
12:45-1:00 Realizing the Potential of your HCA Data
Abhay Kini, Ph.D., Product Manager, Cell Technologies, GE Healthcare
IN Cell Miner HCM is a software environment that provides data management, visualization and integration capabilities. Built on EMC Documentum®, the enterprise content management system, IN Cell Miner HCM provides easy access to HCA data, enables its integration across multiple platforms throughout drug discovery, and facilitates management of HCA data with functional annotations. IN Cell Miner allows the small laboratory or enterprise customer to seamlessly integrate analytic tools with HCS/HCA workflows for easy visualization, analysis and realization of the full potential of their HCA data.

1:00-1:15 Beyond Basic HCS Data Management: Learning, Modeling, and Advanced Data Analysis using Pipeline Pilot

Kurt Scudder, Ph.D., Solution Scientist, Accelrys

Accelrys’ Pipeline Pilot has found a place in the labs of many HCA practitioners, taking advantage of the image analysis, statistics, and plate data handling collections in the product. The toolbox approach allows developers and users to envision a way to analyze or visualize data, then rapidly construct one or more protocols to enable that vision. This approach complements and extends the HCS instrument vendors’ data management and analysis software. Accelrys has facilitated this by building into Pipeline Pilot connectivity to the data management systems of most major HCA vendors, including Cellomics, GE Healthcare, BD Biosciences, PE, Molecular Devices, and Beckman Coulter. This capability can now be combined with the learning and advanced data modeling capabilities in Pipeline Pilot to move beyond simple HCS analysis and data management into more detailed examination of images and extracted data, and exploration of the data for latent patterns or characteristics that can give new insights. All of this can be done while remaining within the Pipeline Pilot environment. Examples of the application of advanced data modeling with images and image objects will be presented.

1:15-1:30 High-Content Hit Selection Based on Single-Cell Data–Leveraging Rich Biological Outcomes with Extreme Efficiency

Stephan Heyse, Ph.D., Head, Genedata Screener, Genedata AG

High-content screening experiments produce rich information on phenotypic changes of individual cells when subjected to treatment with compounds, siRNAs, or other inducers. While the management of the resulting microscope images is the current concern, upcoming challenges are the biologically meaningful representation and quantification of HCS outcomes. This includes distinguishing cell sub-populations of differential response, statistically aggregating them across wells and replicates, normalizing signals and eliminating errors, separating and quantifying phenotypes and effects. Leveraging this information from the complex single-cell data sets, with millions of data points per plate, requires a scalable framework with automated data processing and intelligent management functions, including scientists’ review at any stage of the process. Taking examples from large siRNA and compound screens, we show how such systematic in-depth analysis of high content screens can be accomplished routinely, passing from single-cell data to hit selection in a highly efficient workflow.
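The aggregation and normalization steps listed above (per-well statistics over single-cell data, then control-based normalization) can be sketched with pandas. The column names and the percent-of-control scheme are illustrative assumptions, not Genedata Screener's actual data model:

```python
import pandas as pd

# Toy single-cell readouts: several cells per well, with well roles annotated.
cells = pd.DataFrame({
    "well":   ["A01", "A01", "A02", "A02", "B01", "B01"],
    "role":   ["neg", "neg", "sample", "sample", "pos", "pos"],
    "signal": [10.0, 12.0, 55.0, 65.0, 100.0, 104.0],
})

# Step 1: statistically aggregate single-cell data to a per-well value.
# The median is robust to outlier cells.
wells = cells.groupby(["well", "role"], as_index=False)["signal"].median()

# Step 2: normalize signals against the plate controls (percent effect).
neg = wells.loc[wells["role"] == "neg", "signal"].mean()
pos = wells.loc[wells["role"] == "pos", "signal"].mean()
wells["pct_effect"] = 100 * (wells["signal"] - neg) / (pos - neg)
```

On real screens the same two steps run over millions of cells per plate, which is why the abstract stresses a scalable, automated framework rather than ad hoc scripts.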

1:30-2:00 Technology Short Talk

Sponsored by BD Biosciences 
Characterization and Isolation of Stem Cells Using High-Content Imaging and Flow Cytometry
Jurg Rohrer, Ph.D., Director, Research & Development, BD Biosciences 
Differentiation of pluripotent stem cells often yields inconsistent and heterogeneous cell populations that are problematic for transplantation and for quantitative and comparative analyses. The challenge remains to identify unique combinations of markers to facilitate the isolation of distinct cell populations by fluorescence activated cell sorting (FACS) with the highest degree of purity. To address this challenge, BD has developed the first 96-well plate based lyophilized antibody screening panels for the high-content analysis of cell populations by imaging or flow cytometry. Data generated with the BD Lyoplate™ Screening Panel on several stem cell types using the BD Pathway™ HCA imager and BD™ LSR II HTS cell analyzer will be presented, including a specific example of how a novel combination of unique cell signatures was discovered using the screening panel to analyze the differential expression of cell surface markers on hESC, self-renewing neural stem cells (NSC), and differentiated NSC (glial progenitors and neurons). The characterization of positive and negative selection markers on each cell type enabled the isolation of near-pure populations of hESC-derived neurons and glia using a BD FACSAria™ II cell sorter. Highly pure stem cells and differentiated cells are useful for downstream applications such as transplantation, microarray analysis, disease modeling, and drug discovery, development and screening.



HCA for Compound Screening

2:15-2:40 Speaker to be Announced


2:40-3:10 Phenotypic High-Content Screens Utilizing Multi-Parametric Data Analysis for Novel Lead Identification

Daniela Gabriel, Ph.D., Associate Director, Lead Finding Platform, Novartis Institutes for Biomedical Research

High-content screening (HCS) applications allow the characterization of novel compounds in a cellular environment. Generally, compound-dependent effects are analyzed with regard to target specificity, whereas a great potential of HCS lies in the analysis of cellular phenotypes through generation of multidimensional readouts of cellular effects in response to compound treatment. Multivariate statistics provide a range of data reduction and classification tools not only to identify hits but also to classify the compound’s effect and to consider different responses in subpopulations. Utilizing multivariate analysis of phenotypic profiles enhances the potential of hit discovery in small molecule screening and helps classify hits for target identification. The rationale behind this strategy will be illustrated with screening examples.

3:10-3:35 Combining High-Content and High-Throughput Screening

Tina Garyantes, Ph.D., Global Head, Screening and Assay Sciences, sanofi-aventis

Accurate prediction of human responses to potential drugs is often unreliable and leads to a high attrition rate in development. One of the strategies implemented to increase the chance of isolating the best chemical matter as early as possible during the drug discovery process is to implement cell-based screens, which address targets and compounds in a more physiological context. High-Content Analysis assay formats are examples of such cellular assays, providing accurate and relevant multi-parameter information from a single experiment. The presentation will focus on a few selected examples where High-Content Screening has been scaled up to High-Throughput Screening to facilitate decisions on compound progression, thanks to the wealth of data generated. The added value and lessons learned will be discussed.

Data Analysis and Management

2:15-2:40 Patches and Batches: New Approaches to Analyzing Drug Effects on Subcellular Patterns

Robert F. Murphy, Ph.D., Professor, Departments of Biological Sciences and Biomedical Engineering, Carnegie Mellon University

A critical task that is addressed by high-content screening and analysis is learning the effects of compounds (e.g., drug candidates) on the localization of targets (e.g., proteins). This is usually performed by exhaustive analysis of many compounds on one protein, a process that is repeated for the next drug. I will describe methods for learning the dependency of many proteins on many compounds without such exhaustive analysis.

2:40-3:10 Multivariate Characterization of High-Content Data for Relevant Phenotypic Change in Type II Diabetes Screening

Jonathan Z. Sexton, Ph.D., Assistant Professor, Biomanufacturing Research Institute and Technology Enterprise (BRITE), North Carolina Central University

Current fluorescent probe technology and increased multiplexing in HCS can result in an overwhelming array of cellular data. The characterization of subtle phenotypic changes can be challenging, and the resulting phenotypic endpoints are often non-obvious. Phenotypic change must then be reported back to an information management system for registration of biological data with chemical structures. Here we present an automated method for the discovery of relevant phenotypic change through multivariate analysis of preliminary HCA data in the assay development phase to guide discovery in large-scale screening efforts, including: (1) how to use multivariate techniques to discover meaningful phenotypic change; (2) reduction of primary screening data to facilitate the merging of biological data with chemical information for hit triage and prioritization in data management systems; and (3) tools for developing a high-content screening information pipeline.
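Point (1), discovering meaningful phenotypic change by multivariate reduction, can be illustrated with a principal component analysis sketch. The eight features and the injected shift are synthetic assumptions chosen to mimic a subtle, non-obvious phenotype:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Synthetic multiplexed HCA features for control and treated wells
# (8 hypothetical cellular features per cell).
control = rng.normal(0.0, 1.0, size=(100, 8))
treated = rng.normal(0.0, 1.0, size=(100, 8))
treated[:, 0] += 3.0  # a shift in one feature, buried among seven noisy ones

# Reduce the multivariate profile to two principal components.
X = np.vstack([control, treated])
scores = PCA(n_components=2).fit_transform(X)

# The distance between group centroids in PC space flags the phenotypic change
# without having to inspect each feature individually.
shift = float(np.linalg.norm(scores[:100].mean(axis=0) - scores[100:].mean(axis=0)))
```

Because the treatment-driven variance dominates the first component, the centroid separation surfaces the relevant endpoint automatically, which is the behavior the assay-development step relies on.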

3:10-3:35 Multilayered Analysis of HCS Data: An Integrated Approach to Scientific Insight

Ansuman Bagchi, Ph.D., Director, Applied Computer Science & Mathematics, Merck & Co.

Live-Cell Imaging


2:15-2:40 Quantitative 4D Live-Cell Imaging Reveals Regulation of Kinetochore Alignment within the Metaphase Plate

Jason R. Swedlow, Ph.D., Professor, Quantitative Cell Biology, University of Dundee

A hallmark of mitosis in most eukaryotic cells is the formation of a metaphase plate half-way between the spindle poles, about which chromosomes exhibit oscillatory movements. These movements are accompanied by changes in the distance between sister kinetochores, commonly referred to as ‘breathing’. The relationships between oscillations, breathing, and formation of the metaphase plate, and the molecular components that regulate these processes, are poorly understood. We developed a four-dimensional imaging assay combined with computational image analysis that identifies and tracks sister kinetochores over time, classifies kinetochores as aligned or unaligned, and determines the mitotic phase of the cell. Our assay shows that late prometaphase and metaphase oscillation and breathing speeds are most sensitive to depletion of microtubule depolymerases, while oscillation and breathing periods are most sensitive to perturbations that alter the stiffness of the mechanical linkage between sisters. It also reveals that metaphase plates become thinner as cells progress towards anaphase, due to a progressive reduction in oscillation speed at constant oscillation period.

2:40-3:10 Sponsored by Cyntellect 

New Methods for Detecting Toxicity in Adherent Cells using Label-Free, Brightfield Live Cell Analysis
Fred Koller, Ph.D., President & CEO, Cyntellect, Inc.
Automated, high-throughput microscopy, coupled with sophisticated image analysis software, has been broadly adopted in drug discovery and basic cell biology research. New products for live cell analysis using label-free detection have also recently become available. However, it has remained difficult to produce high quality images from brightfield (or transmitted light) sources, particularly away from the center of a well, or in a high-throughput manner. Cyntellect’s new Celigo™ adherent cell cytometer addresses these unmet needs by utilizing a large-field F-theta scanning lens and galvanometer scanning mirror technologies to rapidly image living cells in their natural state within cell culture flasks and microplates. The Celigo cytometer provides label-free brightfield morphology for cell identification and classification, and provides unique full-well uniform illumination to analyze every cell in every well. The cytometer is designed for throughput and is capable of imaging and processing data in as little as 5-15 minutes per plate across a range of microplate formats. Using the Celigo cytometer, a compound library was screened to identify cytotoxic effects on cell proliferation in adherent cells (human lung carcinoma A549 cells) and non-adherent cells (promyelocytic leukemia HL-60 cells). Following treatment with compounds, cells were analyzed by direct cell counting in microplates and the results were compared to a standard MTT assay for cytotoxicity. IC50 values were comparable for the two methods; however, the Celigo cytometer limit of detection was 10 cells/well (384-well plate) compared with >75 cells/well for the MTT assay. The Celigo cytometer also enables kinetic growth tracking and cell morphological analysis in a wide range of microplates and flasks that cannot be performed using traditional endpoint assays, such as MTT.
Also, the Celigo cytometer can detect and analyze a wide variety of fluorescent cell stains and assays with up to three independent fluorescent channels in addition to the brightfield channel.

3:10-3:35 Live-Cell Imaging of Caspase Activation for High-Content Screening

Hakim Djaballah, Ph.D., Director, HTS Core Facility, Memorial Sloan Kettering Cancer Center

Caspases are central to the execution of programmed cell death, and their activation constitutes the biochemical hallmark of apoptosis. No currently available method allows continuous live-cell monitoring of caspase activation in high-content assays, which require amenability to high-density plate formats, live and continuous monitoring, and a fluorogenic reporter that is non-toxic and neither interferes with nor stimulates induction of apoptosis. My talk will focus on the adaptation of a high-content assay method utilizing the DEVD-NucView™ 488 fluorogenic substrate that meets these criteria; for the first time, we show caspase activation in live cells induced either by chemicals or RNAi. I will present data on the adaptation, optimization, and validation of this substrate as a homogeneous, live-cell reporter for monitoring the real-time kinetics of apoptosis induction in 384-well plates, and discuss the merits of this assay for real-time screening of chemical and RNAi libraries to rapidly identify novel modulators of apoptosis.

3:35-4:45 Refreshment Break with Exhibit and Poster Viewing


4:45-5:10 HCS as a Key Technology in Primary and Secondary Screening

Oliver Poeschke, Ph.D., Senior Scientist, Assay Development, Biomolecular Pharmacology Lead Discovery, Merck Serono

Over the past several years, HCS has evolved in our company from a new technology into a powerful compound-screening platform. This is documented by a number of success stories, two of which will be presented as case studies. I will describe how HCS enables the identification, characterization, and chemical optimization of small molecules, leading to in vivo-active compounds in the early drug discovery phase.

5:10-5:35 High-Content Screening of the NIH MLSMR Library

Susanne Heynen-Genel, Ph.D., Director, High-Content Screening Systems, Conrad Prebys Center for Chemical Genomics, Burnham Institute for Medical Research

The Conrad Prebys Center for Chemical Genomics (CPCCG) at the Burnham Institute for Medical Research is one of four comprehensive screening centers for the NIH Molecular Libraries Probe Production Centers Network (MLPCN). While these comprehensive screening centers cover all aspects of small molecule library screening and probe development, CPCCG also specializes in high-content analysis and screening. Thus, high-content assays against highly diverse targets are developed and screened at CPCCG. This presentation will describe examples of the image-based assays and screens run against the NIH Molecular Libraries Small Molecule Repository (MLSMR) library, including recent GPCR and lipid screens.

5:35-6:00 Automated High-Content Screening for Compounds That Disassemble the Perinucleolar Compartment (PNC)

Steve Titus, Ph.D., Staff Scientist, Biology, NIH Chemical Genomics Center

We have conducted an automated high-content screen of a collection of 140,000 small molecules to look for compounds that disassemble the PNC. The assay was conducted on an InCell 1000 imager in 1536-well format. From the 140,000 compounds screened, 120 hits were identified and ordered, of which 91 confirmed activity. I will discuss the assay optimization, screening, and hit confirmation processes in detail.

4:45-5:10 The Role of Open Source in High-Content Screening Informatics

Karol Kozak, Ph.D., Head, Computation Analysis, HCA/HTS Informatics, LMC-RISC, Institute for Biochemistry

The main strength of high-content screening – the high information content of the delivered data (images) – at the same time poses a considerable challenge on the IT side. The aim of this presentation is to describe an open-source informatics platform for integrating, sharing, and processing HCS data using a workflow-oriented architecture. This talk will focus on a platform that covers the whole process of an HCS screen, from basic compound management through image processing and data handling, up to the analysis of the screens. The platform offers not only standard, routine functions but also integrates state-of-the-art algorithms for machine learning, data mining, standards, and knowledge management. A further specialty of this platform is that we want to actively exploit its open-source nature by building an international development community, through which we can not only arrive at new informatics solutions but also broaden the application areas and accelerate development.

5:10-5:35 The Open Microscopy Environment: Image Informatics for Biological Microscopy and HCAs

Jason R. Swedlow, Ph.D., Professor, Quantitative Cell Biology, University of Dundee

We have developed an open-source software framework, known as the Open Microscopy Environment (OME), to address the need for image data integration and interoperability. OME has three components: an open data model for biological imaging; standardized file formats and software libraries for data file conversion; and software tools for image data management and analysis. The OME Data Model has recently been updated to more fully support fluorescence filter sets, the requirement for unique identifiers (including LSIDs), and screening experiments using multi-well plates. The OME-TIFF file format and the Bio-Formats library provide an easy-to-use set of tools for converting data from proprietary file formats. These resources enable access to data by different processing and visualization applications, and sharing of data between scientific collaborators. The Java-based OMERO platform includes server and client applications that combine an image metadata database, a binary image data repository, and high-performance visualization and analysis. The current release of OMERO includes interfaces for C/C++ and Python to support a wide variety of client applications, as well as support for MATLAB-based applications such as CellProfiler. For computational analysis of images, this standardized interface provides a single mechanism for accessing image data of all types, regardless of the original file format. Moreover, a compute distribution facility is included to support multi-CPU computing installations.

5:35-6:00 Bioassay Ontology and Software Tools to Integrate and Analyze Diverse Data Sets

Stephan C. Schurer, Ph.D., Research Assistant Professor, Pharmacology, Center for Computational Science, University of Miami Miller School of Medicine

The primary goal of the project is the development of a content and software framework to enable systematic mining and analysis of large screening data sets from diverse biological assays, including HCA studies. Currently, bioassay experiments and their endpoints are described primarily as free text. Formalizing this knowledge is required for any computational analysis that involves multiple assay data sets. It is also required to integrate these data with existing bioinformatics resources, such as pathway and interaction databases. A framework that enables researchers to exploit these enormous resources will improve the efficiency of chemical biology research projects and will lead to the discovery of treatments (usually compounds) that may not emerge from individual screening campaigns.

4:45-5:10 Using FLIM to Analyze 2-Way and 3-Way Protein:Protein Interaction and Caspase Activation in Live Cells

David Andrews, Ph.D., Professor, Biochemistry and Biomedical Sciences, Canada Research Chair in Membrane Biogenesis, McMaster University

We have been using fluorescence lifetime imaging (FLIM) to measure protein:protein interactions in live cells by Fluorescence Resonance Energy Transfer (FRET) between fluorescent proteins. Methods to measure caspase activity in live cells using a single molecule consisting of a CFP donor linked via a caspase cleavage site to a YFP acceptor are well established, and we have successfully automated this assay for high throughput using the FLIM Opera. Examining the interactions between different Bcl-2 family proteins that regulate apoptosis is more challenging because the relative concentration of the two proteins varies in different areas of the cell. Because fluorescence lifetime is independent of concentration, we used FLIM-FRET to measure three-way interactions between the anti-apoptotic protein Bcl-XL and the pro-apoptotic proteins Bid, Bad, and Bim. In a single cell it was possible to examine drug-induced FRET between CFP-Bid and YFP-Bcl-XL, as well as FRET between YFP-Bcl-XL and RFP-Bad or RFP-Bim, on mitochondria. We are currently using this approach to examine, in live cells, the molecular mechanism of drugs such as ABT-737 that are designed to disrupt the interactions between Bcl-XL and its binding partners.
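The concentration independence exploited here follows from the standard lifetime-based FRET relation (a textbook formulation, not specific to this talk):

```latex
E = 1 - \frac{\tau_{DA}}{\tau_{D}}
```

where $\tau_{DA}$ is the donor fluorescence lifetime in the presence of the acceptor and $\tau_{D}$ is the donor-only lifetime. Both lifetimes are intensive quantities, so the FRET efficiency $E$ reports an interaction without calibration against local fluorophore concentration, which is what makes lifetime readouts robust when the two partners' expression levels vary across the cell.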

5:10-5:35 Live-Cell Imaging of Specific RNAs Using Fluorescent Probes

Gang Bao, Ph.D., Professor, Robert A. Milton Chair in Biomedical Engineering, College of Engineering Distinguished Professor, Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology and Emory University

With the recent development of novel techniques for imaging RNA in living cells, it is now possible to study the dynamics of RNA expression and regulation. In this presentation I will review the application of fluorescent probes, especially molecular beacons, in live-cell RNA detection, viral infection studies, and the isolation of stem cells. Common challenges faced by fluorescent probes, such as probe design, delivery, and target accessibility, will also be discussed. Continued advances in live-cell RNA imaging are expected to open new and exciting opportunities in a wide range of biological and medical applications.

5:35-6:00 Quantitative Molecular Imaging in Living Cells via FLIM

Mary-Ann Mycek, Ph.D., Associate Professor & Associate Chair, Biomedical Engineering; Faculty Member, Applied Physics Program; Core Member, Comprehensive Cancer Center, University of Michigan

Fluorescence lifetime imaging microscopy (FLIM) employs fluorophore lifetime, rather than fluorescence intensity, for image contrast. Compared to intensity-based methods, lifetime imaging requires less calibration and correction for fluorophore concentration, photobleaching, and other artifacts that affect intensity measurements. FLIM has been employed to probe the microenvironments of endogenous and exogenous fluorophores, including measurements of cellular metabolic co-factors, pH, dissolved gas concentration, and molecular interactions via FRET. Several applications of FLIM for quantitative live-cell imaging will be described, including studies of cellular metabolic pathways, improved FRET detection of oncogene association, microfluidic bioreactor characterization for continuous cell culture, and improved precision for low-light FLIM.

6:00-7:00 Reception with Exhibit and Poster Viewing