Correspondence

Data analysis

Approximation aids handling of big data

Nature volume 515, page 198 (13 November 2014)

We need a radical shift in our approach to data analysis: towards approximation. Our technical capacity is being overtaken by the unprecedented rate at which today's powerful instruments and computers generate data (see H. Esmaeilzadeh et al. Commun. ACM 56, 93–102; 2013).

In my view, it is often more efficient to replace conventional, precise computations with scalable approximations that have tight error bounds, in algorithms and data structures, for example. This should be adequate for interpreting data, particularly in the exploratory phase of an analysis. Researchers may ultimately still want to perform detailed analyses as the trends indicated by these approximations become clearer. And for some applications, approximation will not work at all.
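
To make the idea concrete, here is a minimal sketch, assuming Python and a hypothetical approximate_mean helper (not a method described in this letter): instead of scanning a full data set, an aggregate is estimated from a uniform random sample, and the normal approximation supplies a confidence interval that bounds the error.

import math
import random

def approximate_mean(values, sample_size=10000, z=1.96):
    # Estimate the mean of a large collection from a uniform random sample,
    # returning the estimate and an approximate 95% confidence half-width
    # (the error bound). Illustrative sketch only.
    sample = random.sample(values, min(sample_size, len(values)))
    n = len(sample)
    estimate = sum(sample) / n
    # The unbiased sample variance determines the width of the interval.
    variance = sum((x - estimate) ** 2 for x in sample) / (n - 1)
    half_width = z * math.sqrt(variance / n)
    return estimate, half_width

# Stand-in for a data set too large to aggregate exactly within a time budget.
data = [random.gauss(100.0, 15.0) for _ in range(1000000)]
estimate, bound = approximate_mean(data)
print(f"mean estimate: {estimate:.2f} +/- {bound:.2f} (approx. 95% confidence)")

The cost of such an estimate depends on the sample size rather than on the size of the data set, which is what makes approximations of this kind scale; comparable sketches exist for counts, quantiles and set membership.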

Approximations, however, are already accepted in many realms, such as next-generation DNA sequencing (M. L. Metzker Nature Rev. Genet. 11, 31–46; 2010). It is time to let mathematical ingenuity replace our obsession with precision.

Author information

Affiliations

Thomas Heinis, Imperial College London, UK.

Corresponding author

Correspondence to Thomas Heinis.

About this article

DOI: https://doi.org/10.1038/515198d
