Should journals be taking more action to help ensure that papers describing LEDs or photodetectors with record performance have been correctly characterized and that their performance claims are valid? And if so, which initiatives would be most effective and most welcomed by the community? These are questions that Nature Photonics is currently considering. Here, we present some options and discuss their relative merits. We welcome your feedback and opinions on the matter.


In this issue of Nature Photonics, an international group of scientists who are experts in organic optoelectronics has written a Comment on the topic of LED characterization and best practice (M. Anaya et al. Nat. Photon. https://doi.org/10.1038/s41566-019-0543-y; 2019). Prompted by instances of malpractice, it describes in detail the key device metrics that should be reported (radiance, external quantum efficiency, colour stability and lifetime, for example), the ways in which they can be correctly measured, and the common measurement pitfalls that can be encountered. In a similar vein, a Comment on the topic of photodetectors was published almost exactly a year ago, in December 2018 (ref. 1).

While we, like other scientific journals, routinely rely on robust peer review by experts in the field to help catch problems and raise concerns over suspicious data and incorrect methodology in LED and detector papers, this approach alone is not failsafe. Could the use of device certification, checklists built around best-practice guidelines, or round-robin testing in more than one laboratory help improve the situation?

One possibility, which is already well established in the field of photovoltaics, is the use of device certification. In the photovoltaics community, samples of solar cells with claims of record performance are now routinely sent to an accredited laboratory, such as the National Renewable Energy Laboratory (NREL) or the Newport Corporation in the US, where they are measured on a carefully calibrated system and a certificate of their performance is produced and charted. At Nature Photonics, we strongly encourage authors of record-breaking solar cell papers to validate their claims by certification where feasible.

Should such certification schemes be introduced and embraced by other communities to validate performance claims for other optoelectronic devices, in particular LEDs and photodetectors? Several of the Fraunhofer institutes in Germany, for example, would be well placed to offer such a service, and national standards laboratories such as the National Institute of Standards and Technology (NIST) in the US and the National Physical Laboratory (NPL) in the UK could, in principle, also become involved.

Proponents of certification say that it provides an independent, unequivocal stamp of validity, giving complete confidence in device performance data and ruling out erroneous claims. Certification also enables benchmarking, allowing different technologies and devices to be compared on a like-for-like basis. Critics, however, point to the cost, time and limited accessibility of such services, and to their incompatibility with some new technologies or device designs: devices that are unstable and quickly degrade, for example, or that are fabricated on a size scale the testing laboratory cannot accommodate.

Another way forward could be the introduction of characterization checklists for LEDs and photodetectors based on widely accepted best-practice guidelines. The idea is that authors of relevant papers complete and submit a checklist alongside their paper, describing details of critical measurements and tests. Reviewers then scrutinize the checklist while assessing the paper, and, if the paper is accepted, the checklist is published alongside it. Nature Photonics has already introduced such checklists for solar cells (ref. 2) and claims of lasing (ref. 3), and they have been warmly embraced by the relevant communities. Should similar checklists be drawn up for LEDs and photodetectors? Checklists can usefully enforce transparency and sound methodology in a paper where these might otherwise be absent. However, there is also the risk that they are seen as yet more paperwork or bureaucracy that puts researchers off submitting.

A further option could be the adoption of round-robin testing, in which devices with exceptional performance are sent to several different laboratories for independent verification before publication. However, this approach may be difficult to implement in practice owing to competitive interests, intellectual property or timing.

No decisions have yet been made at Nature Photonics as to whether we will implement any of these measures. At this stage, we simply wish to open a dialogue with the community to discuss these options and explore their benefits, feasibility and practicality. One thing is certain: we care deeply about the integrity of the LED and photodetector papers that we publish, and if measures need to be introduced to improve reporting and validation in these areas, we won't hesitate to act.