  • A Corrigendum to this article was published on 01 June 2005

Abstract

Microarray technology is a powerful tool for measuring RNA expression for thousands of genes at once. Various studies comparing competing platforms have been published, with mixed results: some find agreement, others do not. As the number of researchers using microarrays and the number of cross-platform meta-analysis studies rapidly increase, appropriate platform assessments become more important. Here we present results from a comparison study that offers important improvements over those previously described in the literature. In particular, none of the previously published studies considers differences between labs. For this study, a consortium of ten laboratories from the Washington, DC–Baltimore, USA, area was formed to compare data obtained on three widely used platforms using identical RNA samples. Using appropriate statistical analysis, we demonstrate that there are relatively large differences in data obtained in labs using the same platform, but that results from the best-performing labs agree rather well.



Acknowledgements

We thank A. Nones and K. Broman for useful suggestions. The work of R.A.I. is partially funded by the National Institutes of Health Specialized Centers of Clinically Oriented Research (SCCOR) translational research funds (212-2494 and 212-2496). The work of G. Germino and I. Kim was partially funded by NIDDK U24DK58757.

Author information

Author notes

    • Michael Wilson

    Present address: Ambion, Inc., Austin, Texas 78744, USA.

Affiliations

  1. Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland 21205, USA.

    • Rafael A Irizarry
  2. Department of Surgery, Johns Hopkins University, Baltimore, Maryland 21205, USA.

    • Daniel Warren
    •  & Yanqin Yang
  3. McKusick-Nathans Institute of Genetic Medicine, Johns Hopkins University School of Medicine, Baltimore, Maryland 21205, USA.

    • Forrest Spencer
  4. JHU NIDDK Gene Profiling Center, Department of Medicine, Johns Hopkins University, Baltimore, Maryland 21205, USA.

    • Irene F Kim
    •  & Gregory Germino
  5. Department of Environmental Health Sciences, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland 21205, USA.

    • Shyam Biswal
    •  & Hannah Lee
  6. The Institute for Genomic Research, 9712 Medical Center Dr., Rockville, Maryland 20878, USA.

    • Bryan C Frank
    •  & John Quackenbush
  7. Department of Pathology, Johns Hopkins University, Baltimore, Maryland 21231, USA.

    • Edward Gabrielson
  8. Division of Pulmonary and Critical Care Medicine, Johns Hopkins University School of Medicine, Mason F. Lord Bldg., Center Tower #665, Baltimore, Maryland 21224, USA.

    • Joe G N Garcia
    •  & Shui Qing Ye
  9. NCI's Microarray Core Facility, Advanced Technology Center, Gaithersburg, Maryland 20877, USA.

    • Joel Geoghegan
    • , Ernest Kawasaki
    •  & David Petersen
  10. Department of Pathology, Johns Hopkins University, School of Medicine, Baltimore, Maryland 21287, USA.

    • Constance Griffin
    •  & Laura Morsberger
  11. Research Center for Genetic Medicine, Children's National Medical Center, George Washington University, Washington, DC 20052, USA.

    • Sara C Hilmer
    •  & Eric Hoffman
  12. W. Harry Feinstone Department of Molecular Microbiology and Immunology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland 21205, USA.

    • Anne E Jedlicka
    •  & Alan Scott
  13. Department of Molecular Biology and Genetics, Johns Hopkins University, Baltimore, Maryland 21205, USA.

    • Francisco Martínez-Murillo
  14. Department of Biostatistics and Computational Biology, Dana-Farber Cancer Institute, 44 Binney Street, Boston, Massachusetts 02115-6084, USA.

    • John Quackenbush
  15. Microarray Research Facility, Research Technologies Branch, DIR, National Institute of Allergy and Infectious Diseases, Bethesda, Maryland 20892, USA.

    • Michael Wilson
  16. Oncology Microarray Facility, Johns Hopkins University, Baltimore, Maryland 21231, USA.

    • Wayne Yu


Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Rafael A Irizarry.

Supplementary information

PDF files

  1. Supplementary Fig. 1

     Observed log fold change versus RT-PCR log fold change for the four altered genes and 12 selected genes.

  2. Supplementary Fig. 2

     Preprocessing effect.

  3. Supplementary Fig. 3

     SD (a measure of precision) plotted against experience of the technician for each of the five Affymetrix labs.

  4. Supplementary Fig. 4

     Precision assessment for the new Affymetrix chip.

  5. Supplementary Table 1

     Within- and across-lab assessment measures comparing preprocessing procedures, as in Supplementary Figure 2.

  6. Supplementary Table 2

     Effect of annotation on across-lab agreement.

  7. Supplementary Table 3

     Across-lab agreement.

  8. Supplementary Table 4

     Assessments for results from lab 5 using the old and new chips, compared to results from the best-performing labs.

  9. Supplementary Methods

About this article

DOI

https://doi.org/10.1038/nmeth756
