Once, we might have studied patterns and processes that relate to a single gene or gene family, and made inferences from these about the whole genome. High-throughput genomic technologies now allow us to generate genome-scale data sets directly. But even in this brave new world of genomics, basic genetic tools still have their place. As two recent papers discussed in this month's highlights show (page 243), improved versions of such tools will be vital for generating resources for global functional analyses.

In future, improvements in the way we analyse these large data sets will be as important as improvements in the way we generate them. Perhaps the most significant trend at this end of the genetic research 'pipeline' is a fundamental change in statistical practice: as Mark Beaumont and Bruce Rannala discuss on page 251, Bayesian methods are starting to replace traditional approaches for a wide range of purposes. Computational complexity previously constrained their use, but they are now a convenient way to deal with data that are influenced by many interdependent variables, as is frequently the case in genetics.
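
To give a flavour of what such methods involve, here is a minimal, purely illustrative sketch (not taken from Beaumont and Rannala's review; the data values and names are hypothetical): a Metropolis-Hastings sampler, the kind of Markov chain Monte Carlo algorithm that made Bayesian inference computationally practical, estimating an allele frequency from a binomial count under a Beta prior.

```python
import math
import random

# Illustrative only: Bayesian estimation of an allele frequency p from
# k observed copies of an allele in n sampled chromosomes, with a
# Beta(alpha, beta) prior. Sampling from the posterior by Markov chain
# Monte Carlo is what lifted the computational constraint on Bayesian
# methods mentioned above.

def log_posterior(p, k, n, alpha=1.0, beta=1.0):
    """Unnormalised log posterior: binomial likelihood times Beta prior."""
    if not 0.0 < p < 1.0:
        return float("-inf")  # zero posterior density outside (0, 1)
    log_likelihood = k * math.log(p) + (n - k) * math.log(1.0 - p)
    log_prior = (alpha - 1.0) * math.log(p) + (beta - 1.0) * math.log(1.0 - p)
    return log_likelihood + log_prior

def metropolis_sample(k, n, n_steps=20000, step=0.05, seed=1):
    """Random-walk Metropolis-Hastings sampler for the allele frequency."""
    rng = random.Random(seed)
    p = 0.5  # arbitrary starting value
    samples = []
    for _ in range(n_steps):
        proposal = p + rng.gauss(0.0, step)
        log_ratio = log_posterior(proposal, k, n) - log_posterior(p, k, n)
        # Accept the proposal with probability min(1, posterior ratio).
        if rng.random() < math.exp(min(0.0, log_ratio)):
            p = proposal
        samples.append(p)
    return samples[n_steps // 2:]  # discard the first half as burn-in

# Hypothetical data: 37 allele copies among 100 sampled chromosomes.
draws = metropolis_sample(k=37, n=100)
print(f"posterior mean allele frequency ~ {sum(draws) / len(draws):.3f}")
```

This toy model is deliberately simple: with a Beta prior and a binomial likelihood the posterior is available in closed form (here Beta(38, 64), with mean 38/102, or about 0.373), so the sampler's output can be checked directly. The appeal of the Bayesian machinery, as Beaumont and Rannala explain, lies in problems with many interdependent parameters, where no such closed form exists and sampling is the only practical route to the posterior.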

Society, too, must learn to cope with the changes that genomic technologies bring. Richard Sharp and colleagues (page 311) examine how policy makers can keep up with the pace of change.

Despite understandable concerns about their social implications, genomic technologies are overwhelmingly good news for genetics: they allow us to answer questions that were previously beyond our reach. This point is elegantly illustrated in the review by Laurence Hurst and colleagues (page 299), who discuss how whole-genome analyses are overturning our previously naive view of the organization of eukaryotic genomes.