One month after the earthquake and tsunami hit Japan, there is still no clear picture of the ongoing hazard posed by the wrecked nuclear reactors and spent-fuel ponds at the Fukushima nuclear power plant (see page 146), and monitoring of fallout remains patchy. To improve the situation, better data, in more user-friendly forms, and more sophisticated analyses are essential. Compared with the 1979 Three Mile Island accident or the 1986 Chernobyl disaster, much more information is available about this latest nuclear accident, largely thanks to the Internet and online media. Japan's science ministry and other bodies have issued reams of data, including daily environmental radiation measurements: an admirable feat, given that the Japanese authorities are also having to deal with the huge aftermath of the quake and tsunami.

But as Peter Sandman, a risk consultant based in Princeton, New Jersey (http://www.psandman.com/whatsnew.htm), points out, the authorities have failed badly to forewarn the public of a series of events that they must have known were likely to happen. The result has been a string of nasty surprises: radioactive contamination of the sea (see page 145), of foods and of tap water, and this week's upgrading of the accident to level 7, the highest on the International Nuclear and Radiological Event Scale and a rating matched only by Chernobyl. Many people consequently no longer trust the authorities to warn them if the situation is likely to worsen.

The Tokyo Electric Power Company (TEPCO), which runs the Fukushima plant, has also had to retract, on at least four occasions, incorrect findings on the amounts and composition of radionuclides in and around the plant, or on reactor parameters. This has sown uncertainty and public mistrust of the company's monitoring abilities. In its defence, damaged plant instrumentation means that key data on events inside the reactors are sometimes missing or unreliable. Even so, the most complete and credible publicly available analyses of possible reactor-event scenarios have come not from Japan but from outside scientists, nuclear-reactor makers and regulatory authorities.

Similarly, it is pertinent to ask why, so far, the only detailed publicly available forecasts of the direction and concentration of atmospheric radionuclide plumes have come from overseas agencies. The Japanese authorities almost certainly hold data that would allow much higher-resolution forecast maps of Fukushima and the surrounding areas. And although they are releasing data daily on radiation levels in the air, soil and water, these are scattered across multiple individual web pages. That uncoordinated approach was excusable in the early days, but data collection and presentation urgently need to improve. The authorities have also failed to provide vital context on how exposure rates translate (or not) into what matters to people, such as health effects and whether farming remains viable. The recurring refrain that this or that radiation dose is no more than that of an X-ray or a CT scan doesn't cut it, because health effects depend above all on doses accumulated over long periods.
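The arithmetic behind that last point is worth making explicit. The sketch below, in Python, is a hypothetical illustration only: the dose rates are invented, and the CT-scan figure is an order-of-magnitude estimate.

    # Why a sustained dose rate can matter more than any single exposure.
    # All numbers here are assumptions for illustration, not measured values.

    HOURS_PER_YEAR = 24 * 365  # 8,760

    ambient_rate_usv_per_h = 0.5  # assumed sustained dose rate (microsieverts per hour)
    ct_scan_dose_usv = 10_000     # a chest CT is on the order of 10 mSv (10,000 uSv)

    cumulative_year_usv = ambient_rate_usv_per_h * HOURS_PER_YEAR
    print(f"One year at {ambient_rate_usv_per_h} uSv/h: {cumulative_year_usv:,.0f} uSv")
    print(f"One CT scan: {ct_scan_dose_usv:,} uSv")

    # At 0.5 uSv/h the annual total (about 4,380 uSv) is below one CT scan,
    # but at 10 uSv/h it would be about 87,600 uSv, several scans' worth,
    # which is why comparing a spot reading with a single scan misleads.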

Information on fallout distribution made public by the government and TEPCO also lacks basic metadata, such as the latitude and longitude of sampling points or the sampling protocols used, and results are presented as static PDFs from which researchers cannot easily extract the data. As a result, it is next to impossible for academic researchers and others to compile and map the daily reports and gain a better picture of the situation and of changes over time and space. The Japanese authorities, the International Atomic Energy Agency and other bodies with relevant information must present it as dynamic data and high-resolution maps that also show day-to-day variation, total net cumulative soil deposition and where hotspots are, and as models of what's happening overall rather than just spot counts.
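To illustrate, a single machine-readable, metadata-rich record might look something like the following sketch in Python. The field names, coordinates and values are assumptions for the example, not any official schema.

    import json

    # One hypothetical fallout measurement as a GeoJSON feature: the coordinates,
    # timestamp and sampling protocol travel with the reading itself.
    measurement = {
        "type": "Feature",
        "geometry": {
            "type": "Point",
            "coordinates": [140.97, 37.42],  # longitude, latitude (hypothetical point)
        },
        "properties": {
            "station_id": "FKS-0042",                       # assumed station identifier
            "timestamp_utc": "2011-04-12T06:00:00Z",
            "dose_rate_usv_per_h": 2.3,                     # invented reading
            "cs137_deposition_bq_per_m2": 15000,            # invented reading
            "sampling_protocol": "soil-surface-scrape-v1",  # assumed protocol label
        },
    }

    # A record in this form can be loaded directly by mapping tools,
    # unlike numbers locked inside a static PDF.
    print(json.dumps(measurement, indent=2))

Hundreds of such records a day could then be compiled automatically into maps of day-to-day variation and cumulative deposition.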

Data analysis should also not be left to governments alone. Researchers are rightly calling for an independent group to process the data and publish evidence-based risk assessments. They also want the data in machine-readable formats, such as spreadsheets, databases and spatial data formats. This would unleash the diverse creativity of academic researchers, journalists, software geeks and mappers, who are often better equipped and more agile than governments and international agencies when it comes to presenting data online in timely, informative and compelling ways. Converting raw data into high-quality, user-friendly forms is not a luxury; it is essential to building public trust.
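As a sketch of what such formats would enable, the fragment below compiles invented daily deposition figures into per-station cumulative totals; the column layout is an assumption for illustration.

    import csv
    from collections import defaultdict

    # Stand-in for a series of daily machine-readable releases; all values invented.
    daily_reports = [
        "station_id,date,cs137_bq_per_m2_per_day",
        "FKS-0042,2011-04-10,15000",
        "FKS-0042,2011-04-11,17500",
        "FKS-0107,2011-04-10,3200",
        "FKS-0107,2011-04-11,2900",
    ]

    # Sum each station's daily deposition into a net cumulative total.
    cumulative = defaultdict(float)
    for row in csv.DictReader(daily_reports):
        cumulative[row["station_id"]] += float(row["cs137_bq_per_m2_per_day"])

    for station, total in sorted(cumulative.items()):
        print(f"{station}: {total:,.0f} Bq/m2 cumulative deposition")

Joined to station coordinates, such totals could be mapped to show hotspots and change over time rather than isolated spot counts.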