Photodetectors sensitive to short-wavelength (1.4–3 μm) infrared light are employed in applications such as long-distance telecommunications. Using silicon for infrared photodetectors would offer low cost and compatibility with existing electronic device technology; however, silicon has an electronic bandgap of 1.12 eV and is therefore transparent to light with wavelengths longer than about 1.1 μm. Sub-bandgap absorption in silicon has been achieved at low temperatures by introducing structural defects that modify its electronic properties, but this approach fails at temperatures relevant for applications. Jonathan Mailoa and Tonio Buonassisi at the Massachusetts Institute of Technology and colleagues in the US and Australia have now demonstrated room-temperature infrared photoresponse in a heavily doped silicon photodiode.
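The 1.1 μm transparency threshold quoted above follows directly from the bandgap via λ = hc/E<sub>g</sub>. A minimal sketch of that conversion (the function name and constants are illustrative, not from the highlighted paper):

```python
# Sketch: converting a semiconductor bandgap to the longest wavelength
# it can absorb in a single band-to-band transition (lambda = h*c / E_g).
H = 6.62607015e-34    # Planck constant, J*s (CODATA exact value)
C = 2.99792458e8      # speed of light in vacuum, m/s
EV = 1.602176634e-19  # one electron volt in joules

def cutoff_wavelength_um(bandgap_ev: float) -> float:
    """Band-to-band absorption cutoff in micrometres for a given bandgap in eV."""
    return H * C / (bandgap_ev * EV) * 1e6

print(cutoff_wavelength_um(1.12))  # silicon, 1.12 eV -> ~1.11 um
```

Light with wavelengths beyond this cutoff carries too little energy per photon to excite an electron across the gap directly, which is why plain silicon cannot detect it.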
By introducing a supersaturated concentration of gold impurities into a 150-nm-thick silicon single crystal, the researchers fabricate a planar photodiode that shows a photoresponse up to wavelengths of 2.2 μm. Photocurrent generation is mediated by mid-gap states introduced by the gold dopants: electrons are excited from the valence band to the conduction band via these dopant states. The detection efficiency is still relatively small, but it could be improved by up to a factor of 100 through increased sub-bandgap light absorption and optimized device design. Furthermore, the use of other impurity elements could enable spectral tuning of the photoresponse.
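A back-of-envelope check (not from the paper) shows why the mid-gap states are essential at 2.2 μm: a photon at that wavelength carries only about half of silicon's 1.12 eV gap, so no single photon can bridge the gap, whereas a state near mid-gap allows two sequential sub-bandgap excitations to complete the valence-to-conduction transition:

```python
# Illustrative arithmetic: photon energy at the device's 2.2 um
# response limit, compared with silicon's 1.12 eV bandgap.
HC_EV_UM = 1.239841984  # h*c expressed in eV*um

def photon_energy_ev(wavelength_um: float) -> float:
    """Photon energy in eV for a given vacuum wavelength in micrometres."""
    return HC_EV_UM / wavelength_um

e_photon = photon_energy_ev(2.2)
print(e_photon)            # ~0.56 eV, roughly half the 1.12 eV gap
print(e_photon < 1.12)     # sub-bandgap, so direct absorption is forbidden
```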