Direct retrieval of Zernike-based pupil functions using integrated diffractive deep neural networks

Retrieving the pupil phase of a beam path is a central problem for optical systems across scales, from telescopes, where the phase information allows for aberration correction, to the imaging of near-transparent biological samples in phase contrast microscopy. Current phase retrieval schemes rely on complex digital algorithms that process data acquired from precise wavefront sensors, reconstructing the optical phase information at great expense of computational resources. Here, we present a compact optical-electronic module based on multi-layered diffractive neural networks printed on imaging sensors, capable of directly retrieving Zernike-based pupil phase distributions from an incident point spread function. We demonstrate this concept numerically and experimentally, showing the direct pupil phase retrieval of superpositions of the first 14 Zernike polynomials. The integrability of the diffractive elements with CMOS sensors shows the potential for the direct extraction of the pupil phase information from a detector module without additional digital post-processing.


Wave analysis
The ID2N2 can be considered a system composed of a series of layers, each consisting of N × N resolvable pixels that act as diffractive neurons able to receive, modulate and transmit a light field. Each diffractive neuron can be considered a source of secondary waves, with amplitude and phase determined by the product of the input field and the transmission coefficient of that neuron. Under coherent illumination, the first layer generates an input represented by an N × N complex-valued vector. In our case, the diffractive layers modulate only the phase of the incident field, so the field after each layer can be written as E_out(x, y) = E(x, y) · a0(x, y) · exp(iΦ(x, y)), with Φ(x, y) = 2πΔZ(x, y)Δn/λ, where E(x, y) is the complex field distribution at the input, a0(x, y) is the amplitude coefficient of each diffractive neuron (in our case constant and equal to 1), ΔZ is the relative height of each pixel, λ is the wavelength, Δn is the difference between the refractive index of the ID2N2 material and air, and Φ is the phase value of each ID2N2 pixel.
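As a numerical sketch, one phase-only layer followed by free-space propagation can be modelled as below. The angular-spectrum propagator, pixel pitch, layer spacing and N are illustrative assumptions, not values taken from the text:

```python
import numpy as np

def layer_transmission(phase):
    """Phase-only transmission t(x, y) = a0 * exp(i*Phi) with a0 = 1."""
    return np.exp(1j * phase)

def angular_spectrum_propagate(field, dx, wavelength, distance):
    """Propagate a complex field over `distance` (angular-spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = 1.0 / wavelength**2 - FX**2 - FY**2
    # Evanescent components (kz_sq < 0) are discarded
    H = np.where(kz_sq > 0,
                 np.exp(2j * np.pi * distance * np.sqrt(np.maximum(kz_sq, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

N = 256                                      # N x N diffractive neurons (assumed)
rng = np.random.default_rng(0)
phase = rng.uniform(0, 2 * np.pi, (N, N))    # phase values Phi of one layer
field_in = np.ones((N, N), dtype=complex)    # coherent plane-wave illumination
field_out = angular_spectrum_propagate(field_in * layer_transmission(phase),
                                       dx=1e-6, wavelength=758e-9,
                                       distance=50e-6)
```

Chaining several such transmission-plus-propagation steps reproduces the multi-layer structure described above.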
The ID2N2 can work in reflection or in transmission mode, and both the phase and the amplitude of each diffractive neuron can in principle be adjusted, providing complex wave modulation. In this work we consider only the case of a coherent transmissive ID2N2 with phase-only modulation, and we consider the absorption of the photoresist negligible.

TensorFlow-based design and training
The implementation of the presented ID2N2s is a two-step process:
- First, we train the diffractive network in silico, for which we assume the nonlinear activation function to be a ReLU function with optimized parameters (see Figure S1b for the optimization).
- The computer-designed diffractive neural network is then printed to experimentally validate the functionality of the diffractive elements. When optically characterising the functionality of the DNN, the ReLU function is approximated by the nonlinear photoelectric conversion of the CCD camera 3 . In this way, the presented ID2N2 directly produces an intensity map that represents the original pupil phase, and therefore no post-detection computation is required.
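The parameterised activation assumed during training can be sketched as follows; the threshold `t` and `gain` names and values are hypothetical placeholders, not parameters reported in the paper:

```python
import numpy as np

def detector_relu(intensity, t=0.05, gain=1.0):
    """Parameterised ReLU standing in for the camera's photoelectric
    response: zero below the threshold t, linear above it.
    (t and gain are illustrative placeholder parameters.)"""
    return gain * np.maximum(0.0, intensity - t)

I = np.linspace(0.0, 1.0, 5)
out = detector_relu(I)   # intensities below t are clipped to zero
```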
For the in-silico training and numerical testing, we built the training and testing datasets by generating random pupil phase functions from superpositions of the Zernike polynomials Z1 to Z14 (OSA/ANSI indices) (Supplementary Figure 2a), as these cover the strongest aberrations typically seen in microscopy 4 .
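The random pupil-phase generation above can be sketched as follows; the grid size, coefficient range and random seed are illustrative assumptions, and the polynomials are used without the OSA normalization prefactor:

```python
import numpy as np
from math import factorial

def radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho), for m >= 0 and n - m even."""
    R = np.zeros_like(rho, dtype=float)
    for k in range((n - m) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k) * factorial((n + m) // 2 - k)
                * factorial((n - m) // 2 - k)))
        R += c * rho ** (n - 2 * k)
    return R

def zernike(j, rho, phi):
    """Zernike polynomial for OSA/ANSI single index j (unnormalized)."""
    # Invert j = (n*(n + 2) + m) / 2 to recover (n, m)
    n = int(np.ceil((-3 + np.sqrt(9 + 8 * j)) / 2))
    m = 2 * j - n * (n + 2)
    if m >= 0:
        return radial(n, m, rho) * np.cos(m * phi)
    return radial(n, -m, rho) * np.sin(-m * phi)

# One random pupil phase as a superposition of Z1..Z14
N = 128                                   # grid size (assumed)
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
rho, phi = np.hypot(X, Y), np.arctan2(Y, X)
mask = rho <= 1.0                         # unit-disk pupil support

rng = np.random.default_rng(42)
coeffs = rng.uniform(-1, 1, 14)           # random Zernike coefficients (assumed range)
pupil_phase = sum(c * zernike(j, rho, phi)
                  for j, c in zip(range(1, 15), coeffs)) * mask
```

Repeating the sampling with fresh coefficient vectors yields the training and testing sets.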
The Zernike polynomials are defined as Z_n^m(ρ, φ) = R_n^m(ρ) cos(mφ) for the even polynomials and Z_n^−m(ρ, φ) = R_n^m(ρ) sin(mφ) for the odd ones, where m and n are nonnegative integers with n ≥ m ≥ 0 (m = 0 only for even Zernike polynomials), φ is the azimuthal angle, ρ is the radial distance (0 ≤ ρ ≤ 1), and the radial polynomials R_n^m(ρ) are: R_n^m(ρ) = Σ_{k=0}^{(n−m)/2} (−1)^k (n−k)! / [k! ((n+m)/2 − k)! ((n−m)/2 − k)!] ρ^(n−2k).
Figure: typical response curves of a light-sensitive detector module. Image adapted from 7 .
During the training, we studied the relation between the brightness level of the output image and the nonlinear response of the detector; during the detection step, this operating point can be tuned by adjusting the camera exposure parameters, thereby changing the NEE point.
Given the low impact of the parameter t on this specific task, for this qualitative experimental demonstration we fixed the exposure and gain conditions without optimizing t.
Vectorial two-photon nanolithography
The 3D-printed DN2 were obtained by converting the calculated phase value of each pixel (Φ) into a relative height map ΔZ = λΦ / (2πΔn), where Δn is the refractive-index difference between the photoresist and air and λ is the wavelength; in our case Δn = 0.5 and λ = 758 nm. The 3D model of the DN2 was generated with a Matlab script that produces a point-by-point coordinate system.
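The phase-to-height conversion above can be checked with a short sketch (a Python equivalent of the conversion; the function name is ours):

```python
import numpy as np

wavelength = 758e-9   # m, from the main text
delta_n = 0.5         # refractive-index difference photoresist/air

def phase_to_height(phi):
    """Relative pixel height Delta_Z = lambda * Phi / (2 * pi * Delta_n), in metres."""
    return wavelength * phi / (2 * np.pi * delta_n)

# A full 2*pi phase shift corresponds to lambda / delta_n of material,
# i.e. about 1.52 um of printed height for these values
h_max = phase_to_height(2 * np.pi)
```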
The DN2 were fabricated using a custom-made two-photon nanolithography (TPN) system 8 . Commercial TPN systems usually fabricate 3D objects by photopolymerizing consecutive layers of material until the object is entirely manufactured; the minimum axial step (the separation along the z-axis between two consecutive layers) available in commercial systems, such as Nanoscribe, is 100 nm 10 . In this work, we used our custom-made TPN system and employed a simplified vectorial printing approach to print the diffractive elements of the DN2. We define the starting and the ending point of each rod that constitutes an artificial neuron, which is printed as a single rod, allowing for nanometric control of its length, enabled by the nanometric