Abstract
Microarray technology is a powerful tool for measuring RNA expression for thousands of genes at once. Various studies comparing competing platforms have been published, with mixed results: some find agreement, others do not. As the number of researchers starting to use microarrays and the number of cross-platform meta-analysis studies rapidly increase, appropriate platform assessments become more important. Here we present results from a comparison study that offers important improvements over those previously described in the literature. In particular, we noticed that none of the previously published papers considered differences between labs. For this study, a consortium of ten laboratories from the Washington, DC–Baltimore, USA, area was formed to compare data obtained from three widely used platforms using identical RNA samples. We used appropriate statistical analysis to demonstrate that there are relatively large differences in data obtained in labs using the same platform, but that the results from the best-performing labs agree rather well.
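The abstract's two assessment measures can be illustrated with a small sketch: within-lab precision as the SD of replicate log fold changes per gene, and across-lab agreement as the correlation of replicate-averaged log fold changes between labs. This is a minimal illustration, not the paper's actual analysis; the lab names, gene count, and all values below are synthetic.

```python
import math
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

# Per-lab log2 fold changes (sample A vs. sample B) for the same four
# genes, each measured in duplicate; values are made up for illustration.
lab_lfc = {
    "lab1": [[1.9, 2.1], [0.1, -0.1], [-1.2, -1.0], [0.5, 0.6]],
    "lab2": [[2.3, 1.8], [0.3, 0.0], [-0.8, -1.3], [0.4, 0.8]],
}

# Within-lab precision: SD of replicate log fold changes for each gene.
within = {lab: [statistics.stdev(reps) for reps in lfcs]
          for lab, lfcs in lab_lfc.items()}

# Across-lab agreement: correlation of replicate-averaged log fold changes.
means = {lab: [statistics.fmean(reps) for reps in lfcs]
         for lab, lfcs in lab_lfc.items()}
agreement = pearson(means["lab1"], means["lab2"])
print(within, round(agreement, 3))
```

Averaging replicates before correlating keeps within-lab noise from deflating the across-lab agreement estimate, which is one reason the two quantities are assessed separately.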
Acknowledgements
We thank A. Nones and K. Broman for useful suggestions. The work of R.A.I. is partially funded by the National Institutes of Health Specialized Centers of Clinically Oriented Research (SCCOR) translational research funds (212-2494 and 212-2496). The work of G. Germino and I. Kim was partially funded by NIDDK U24DK58757.
Ethics declarations
Competing interests
The authors declare no competing financial interests.
Supplementary information
Supplementary Fig. 1
Observed log fold change versus RT-PCR log fold change for the four altered genes and 12 selected genes. (PDF 672 kb)
Supplementary Fig. 2
Preprocessing effect. (PDF 1279 kb)
Supplementary Fig. 3
SD (measure of precision) plotted against experience of technician for each of the five Affymetrix labs. (PDF 453 kb)
Supplementary Fig. 4
Precision assessment for the new Affymetrix chip. (PDF 525 kb)
Supplementary Table 1
Within- and across-lab assessment measures comparing preprocessing procedures, as in Supplementary Figure 2. (PDF 52 kb)
Supplementary Table 2
Effect of annotation on across-lab agreement. (PDF 59 kb)
Supplementary Table 3
Across-lab agreement. (PDF 61 kb)
Supplementary Table 4
Assessments for results from lab 5 using the old and new chips, compared with results from the best-performing labs. (PDF 48 kb)
About this article
Cite this article
Irizarry, R., Warren, D., Spencer, F. et al. Multiple-laboratory comparison of microarray platforms. Nat Methods 2, 345–350 (2005). https://doi.org/10.1038/nmeth756