Now is the season for predictions, so here are two. First, historians of the future—if they are still using the Christian calendar—will find it convenient to remember 2000 as the year when humankind determined its own genetic sequence. Second, they will regard this discovery as significant primarily because of its implications for understanding the brain.

Pedants may quibble, but this enormous scientific achievement will coincide fairly precisely with the millennium. The sequence of chromosome 22 was announced in December 1999, and a draft sequence of the entire genome (which will be about 90% complete, and is likely to be accurate to better than 99%) will be available by spring 2000. Celera Genomics, a private company, hopes to assemble a full sequence using a shotgun method by 2001. There will be a justifiable celebration at some point in the next three years or so, when the public consortium declares the entire sequence 'operationally complete' (meaning an error rate of less than 1 in 10,000, minus a few gaps), but the reality is that there will be no clear end point to the project. So 'the millennium' will be good enough.

Although the most immediate practical effects of the genome project will probably be new insights into human disease, it does not seem far-fetched to suggest that the biggest impact in the long term will be in neuroscience. It is, after all, our brains that make us who we are, both as a species and as individuals. The genome project, and the work that will build on it, will provide a new depth of understanding of how our brains are built and how they differ from those of other species. But perhaps even more important—and more challenging for society—will be the new insights it will provide into how individuals differ from each other.

Some of these issues are discussed [1] by Robert Plomin of the Institute of Psychiatry in London, writing in a special Nature supplement that is being distributed with this issue of Nature Neuroscience. Plomin focuses on intelligence, and more particularly on 'general cognitive ability' (defined by factor analysis and often abbreviated to g), which is one of the most heritable behavioral traits. Rather than being determined by a single gene, g is a complex trait that is influenced by many genes, each having a relatively small effect, as well as by the environment. Experts disagree on the magnitude of the genetic contribution to g, but Plomin argues that it is probably around 50%; in other words, that about half of the variance in g can be attributed to genetic variation within the population. Regardless of the exact figure, it seems clear that there are genes that affect intelligence, as well as many other behavioral traits.

Many of these genes are likely to be identified as a result of the genome project. Classical population genetics has indicated that genetic effects on g are largely additive; that is, individual alleles exert effects that are independent of interactions with other alleles. This lack of statistical interaction (which is not to be confused with interactions at the molecular level) means that it should be relatively straightforward to identify the genes involved, provided that enough data are available to achieve the necessary statistical power. This power will most likely come from two emerging technologies, single-nucleotide polymorphisms (SNPs) and DNA microarrays. SNPs are the most common form of human allelic variation, and are currently being identified in very large numbers, with both public and private funding. Some of these polymorphisms will themselves be causes of phenotypic variation, whereas others will be linked to the causal polymorphisms and will therefore be useful as genetic markers for association studies. There are also tremendous efforts being made to develop microarray methods to the point where whole genomes can be scanned rapidly and cheaply for a comprehensive set of SNPs. These technologies are integral to the identification of risk alleles for complex diseases, and they should be equally applicable to other complex traits such as personality and brain function.
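The logic of the two preceding paragraphs — many small additive allelic effects summing to a heritable trait, detectable by comparing trait values across genotypes — can be made concrete with a toy simulation. The sketch below is purely illustrative and makes assumptions not drawn from the article (20 biallelic loci, equal effects, allele frequency 0.5, environmental noise tuned so that roughly half the variance is genetic); it is not a model of g itself.

```python
import random
import statistics

random.seed(0)

# Illustrative assumptions (not from the article):
N_LOCI = 20      # biallelic loci, each with a small additive effect
N_PEOPLE = 5000  # sample size for the mock association study
EFFECT = 1.0     # trait increment per copy of the 'effect' allele

def genotype():
    # Count of effect alleles (0, 1 or 2) at each locus, frequency 0.5.
    return [sum(random.random() < 0.5 for _ in range(2))
            for _ in range(N_LOCI)]

people = [genotype() for _ in range(N_PEOPLE)]

# Purely additive genetic value: effects sum across loci, no interactions.
genetic = [EFFECT * sum(g) for g in people]

# Add environmental noise with the same spread as the genetic values,
# so the heritability (genetic share of trait variance) comes out near 0.5.
env_sd = statistics.stdev(genetic)
phenotype = [g + random.gauss(0, env_sd) for g in genetic]

h2 = statistics.variance(genetic) / statistics.variance(phenotype)
print(f"approximate heritability: {h2:.2f}")

# A crude single-marker association test at locus 0: with additive
# effects, mean trait value rises with the count of effect alleles.
for count in (0, 1, 2):
    vals = [p for p, g in zip(phenotype, people) if g[0] == count]
    print(f"alleles={count}: mean phenotype {statistics.mean(vals):.2f}, "
          f"n={len(vals)}")
```

Because the effects are additive and statistically independent, each locus can be tested on its own, which is why the article notes that identifying the genes should be "relatively straightforward" given enough statistical power: the per-allele effect here is small relative to the noise, and only a large sample makes the group means separate cleanly.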

All this will not happen overnight, but the implications for society seem potentially far-reaching. Plomin, like most thoughtful scientists, is careful to emphasize that science alone should not and cannot determine social policy, which must depend on judgments of values as well as facts. At present, we can only speculate on what society will make of this new knowledge. There will surely be individuals who will be burning with curiosity to know their own genetic makeup and willing to pay for the information, for the same reason that they buy books such as Know Your Own I.Q. Will people also want to find reproductive partners based on genetic data, and should it be legal to provide a service for those who do? Should parents be permitted to have their children genotyped in order to plan their education? One hopes that genetic testing will never become compulsory, but even if the idea is anathema to most western democracies, there is no guarantee that all societies will feel the same way.

A deeper understanding of behavioral genetics may also bring many benefits. It might, for instance, lead to better educational strategies, not just for the most or least gifted, but for everyone. One may also hope that it will help to combat racial prejudice—lay people are often unaware of the fact that the great majority of human genetic variation cuts across racial categories, and that there is little evidence for systematic genetic differences between races, other than in the superficial characteristics by which they are normally recognized. On the other hand, genetic information may give new opportunities for discrimination against individuals. Clearly, great challenges as well as great uncertainties lie ahead. But as Plomin says, "the only thing that seems completely clear is that nothing will be gained by ignoring the issue".