

Automating insect monitoring using unsupervised near-infrared sensors


Insect monitoring is critical to improve our understanding and ability to preserve and restore biodiversity, sustainably produce crops, and reduce vectors of human and livestock disease. Conventional monitoring methods of trapping and identification are time-consuming and thus expensive. Automation would significantly improve the state of the art. Here, we present a network of distributed wireless sensors that moves the field towards automation by recording backscattered near-infrared modulation signatures from insects. The instrument is a compact sensor based on dual-wavelength infrared light emitting diodes and is capable of unsupervised, autonomous long-term insect monitoring over weather and seasons. The sensor records the backscattered light at kHz rates from each insect transiting the measurement volume. Insect observations are automatically extracted and transmitted with environmental metadata over cellular connection to a cloud-based database. The recorded features include wingbeat harmonics, melanisation and flight direction. To validate the sensor’s capabilities, we tested the correlation between daily insect counts from an oilseed rape field measured with six yellow water traps and six sensors during a 4-week period. A comparison of the methods found a Spearman’s rank correlation coefficient of 0.61 and a p-value = 0.0065, with the sensors recording approximately 19 times more insect observations and demonstrating a larger temporal dynamic than conventional yellow water trap monitoring.


Introduction

Insecta is the most speciose class of terrestrial fauna1 and the majority of the world’s biodiversity is composed of this class2. In epidemiological and agricultural ecosystems, insects serve as both beneficial organisms3,4,5 and economic pests6,7. Data on insects can support biodiversity conservation8,9, human health protection10 and increased food production11.

Insects are monitored via established sampling methods including trapping, sweep netting, and portable aspiration12,13,14. These methods are imperfect, resulting in biases towards size15,16,17 and life stage18. Additionally, conventional methods may be time-consuming, costly, and prone to human error, such as person-to-person variation in sampling execution19,20,21. New methods, like insect anesthetization sampling22, are being implemented to minimize these biases. Regardless of sampling method, insect identification is time consuming and requires specialized training.

In order to reduce the cost of insect monitoring and identification, automation of insect trapping23,24,25,26,27 and identification27,28,29,30,31 has been developed. While these methods could greatly improve monitoring via traps, they are unsuitable for monitoring a general insect population since trap designs and baits are generally biased in regard to species32,33.

Automation of insect monitoring without traps could reduce the species bias of conventional methods and human error, thus greatly improving the state of the art. Insect identification has been automated as early as 1973 using wingbeat frequency34,35,36, and today remote insect sensing includes acoustic detection37, radar observations38,39,40 and lidar41,42,43. Acoustic methods work best with a solid medium26,44, though acoustic monitoring of free-flying insects has been demonstrated45,46,47. While radar technologies have a much larger monitoring range16,40,48,49,50, they are unsuitable for monitoring small insects, or insects around vegetation such as a crop canopy. Optical methods were adopted early to overcome many of these limitations51,52,53. Today, lidar can be used to record a large number of observations in a long transect54,55,56,57,58 and distinguish between species groups by wingbeat frequency (WBF)55,59. However, lidar equipment requires a trained operator and constant supervision due to eye safety restrictions.

Here we present an autonomous near-infrared sensor for monitoring of flying insects in the field. The sensor aims to minimize human biases, be usable by non-technical personnel, and be capable of unsupervised long-term monitoring. Compared to existing entomological lidars, it has a smaller measurement volume but is eye safe and weatherproof.

Instrument design

The sensor is weatherproof, compact, and intended for field use by non-technicians. Like entomological lidar instrumentation, an air volume is illuminated, and light backscattered from insects entering the measurement volume is recorded by a high-speed photodetector. In addition, the instrument is equipped with a satellite navigation device, a camera for situational photos, and an environmental sensor monitoring temperature, humidity, and light intensity. An internal Global System for Mobile Communications (GSM) modem allows for communication and data transfer. The sensor can be powered by any 12 V power supply, including utility power, batteries, or solar power, and has a maximum power consumption of 30 W during monitoring. A photo of the sensor is shown in Fig. 1 and an internal block diagram is described in Fig. 2.

Figure 1

Situational photo of the sensor. As insects fly into the measurement volume, the backscattered light is recorded by the receiver. Insect observations are automatically extracted and transmitted along with environmental data, location, and situational photos, to the cloud via a GSM connection. Using a solar panel and battery, the sensor is capable of unsupervised, long-term monitoring in remote locations.

Figure 2

General measurement principle. Light is emitted and collimated from the LED board at 808 nm and 980 nm and modulated at different carrier frequencies. The backscattered light from an insect entering the measurement volume is collected by a lens and focused onto a QPD. The four QPD quadrants are independently amplified by a TIA and sampled. The digital data streams are sent to the FPGA, where 8 digital lock-in amplifiers individually amplify each wavelength in the digital signal processing (DSP) unit. The resulting 8-channel data stream is analyzed by the MCU, which extracts events from the data stream. The events can then be stored locally or sent via the GSM modem to a cloud database. Created using PowerPoint 365.


The emitter module consists of a rectangular array of LEDs emitting two spectral bands at 808 nm and 980 nm with total output powers of 1.6 W and 1.7 W, respectively. The two wavelengths are modulated as square waves at 118.8 kHz and 79.2 kHz, respectively. The LEDs are mounted in a checkerboard pattern to achieve a homogeneous beam profile. The total area of the checkerboard, and thus the beam size at the source, is 82 cm2. The light emitted from each diode is partially collimated by an asymmetrical lens and expands with 20° and 4° diverging angles (\({\theta }_{E}\)). The full width half maximum (FWHM) of the emitted light is 26 nm for the 808 nm band and 47 nm for the 980 nm band.
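As a rough aid to intuition, the beam cross-section at a given range follows from the source size and the divergence angles. A minimal sketch, assuming the quoted angles are full divergence angles and using a hypothetical 10 cm source width (the exact dimensions of the 82 cm2 checkerboard are not stated here):

```python
import math

def beam_width(source_width_m: float, divergence_deg: float, range_m: float) -> float:
    """Full beam width at a given range for a partially collimated source,
    assuming the quoted divergence is the full angle of expansion."""
    return source_width_m + 2.0 * range_m * math.tan(math.radians(divergence_deg) / 2.0)

# Hypothetical 10 cm source: beam width at 1 m in the 4 deg and 20 deg directions.
narrow = beam_width(0.10, 4.0, 1.0)   # ~0.17 m
wide = beam_width(0.10, 20.0, 1.0)    # ~0.45 m
```

At 1 m, the 4° direction widens the hypothetical 10 cm aperture by only about 7 cm, which is why the beam stays compact over the instrument's working range.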


The backscattered light from insects entering the overlap between the beam and the receiver’s field of view (FoV) is collected by a near infrared coated aspheric lens (60 mm focal length, ø 76.2 mm aperture) onto a silicon quadrant photodiode (QPD) with a total area of 1 cm2. The receiver is focused at 1 m and has a 4° divergence angle (\({\theta }_{R}\)). Quadrant detection of insects allows for basic range and size estimation60,61 and can differentiate ascending and descending insects, as well as migrating insects with tailwind or host- or scent-seeking insects with headwind.
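The quadrant-based direction estimate can be illustrated with peak-timing differences between QPD segments: an insect crossing the volume peaks earlier in the quadrants it passes first. This is an illustrative sketch, not the instrument's algorithm; the quadrant ordering (top-left, top-right, bottom-left, bottom-right) and sign conventions are assumptions:

```python
import numpy as np

def transit_direction(quadrants, fs):
    """Coarse transit direction from the time at which the signal peaks in
    each QPD quadrant. quadrants: array of shape (4, n) ordered
    top-left, top-right, bottom-left, bottom-right (assumed layout).
    Returns (dx, dy) in seconds: positive dx = left-to-right transit,
    positive dy = ascending (top quadrants peak later)."""
    t_peak = np.argmax(quadrants, axis=1) / fs
    left = (t_peak[0] + t_peak[2]) / 2.0
    right = (t_peak[1] + t_peak[3]) / 2.0
    top = (t_peak[0] + t_peak[1]) / 2.0
    bottom = (t_peak[2] + t_peak[3]) / 2.0
    return right - left, top - bottom  # sign gives direction, magnitude ~ transit speed
```

A real implementation would also need to account for the inverted optical image and the insect's range, but the sign of the timing differences already separates the headwind/tailwind cases described above.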

Signal processing

Each quadrant of the QPD is amplified by a dedicated trans-impedance amplifier (TIA) with a bandwidth of 10 Hz–1 MHz and a gain of 0.75 V/µA around 100 kHz. The amplified signals are sampled by four analogue-to-digital converters (ADCs) with 14-bit output at a rate of 6 MHz. The digital data streams are sent into a field-programmable gate array (FPGA) where eight digital lock-in amplifiers are implemented in VHDL (Very High-Speed Integrated Circuit Hardware Description Language). This allows the two spectral bands to be recorded independently on each quadrant, resulting in an 8-channel data stream. The data is then filtered by a low-pass filter with a cut-off at 5 kHz and downsampled to a 20 kHz, 16-bit data stream before it is sent to a microcontroller unit (MCU) for event extraction and further processing (Fig. 3). Since insects generally have wing beat frequencies below 1 kHz, a 5 kHz cutoff allows us to resolve a minimum of five harmonics in the frequency spectra. The increase in bit depth is possible due to the oversampling of the unfiltered signal.
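The lock-in stage can be sketched in software: each sampled signal is mixed with quadrature references at its carrier frequency and low-pass filtered, leaving the slowly varying backscatter envelope for that wavelength. This is a simplified floating-point sketch with sinusoidal references and a moving-average filter, not the fixed-point VHDL implementation:

```python
import numpy as np

def lock_in_demodulate(x, fs, f_carrier, lp_len=64):
    """Minimal digital lock-in: mix the sampled signal x with quadrature
    references at f_carrier, low-pass both products with a moving average,
    and return the recovered envelope (magnitude). A hardware version
    would typically use square-wave references and CIC/FIR decimation."""
    t = np.arange(len(x)) / fs
    i = x * np.cos(2 * np.pi * f_carrier * t)
    q = x * np.sin(2 * np.pi * f_carrier * t)
    kernel = np.ones(lp_len) / lp_len
    i_lp = np.convolve(i, kernel, mode="same")
    q_lp = np.convolve(q, kernel, mode="same")
    return 2.0 * np.hypot(i_lp, q_lp)  # factor 2 restores the envelope amplitude
```

Because the two carriers are well separated (79.2 kHz and 118.8 kHz), demodulating the same ADC stream twice, once per carrier, separates the two spectral bands on each quadrant.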

Figure 3

Frequency diagram. The wide beam yields long insect transit times, and the corresponding frequency resolution is high enough to accurately capture most species. The frequency response curve (red) is flat in the wingbeat frequency region and the effect of the LP filter at 5 kHz is indicated. The 5 kHz bandwidth allows a minimum of 4 harmonic overtones to be recorded even for mosquitoes with very high wingbeat frequencies.

Measurement volume

The measurement volume is defined by the overlap between the beam and the FoV. Its size and shape can be adjusted by changing the angle (\({\theta }_{S}\)) between the emitter and receiver.

The beam, FoV and the measurement volume have been mapped by a custom-built 3-axis robot covering a volume of 2 m × 1.5 m × 1.5 m. The robot is equipped with a photodetector, an illumination source, and a sphere-dropping mechanism. The photodetector and illumination source are used to map the emitted beam and FoV, respectively, while the sphere-dropping mechanism allows us to verify the signal intensity from a standard object at any point in the volume. Using these methods, the signal response from an arbitrary target can be estimated. The volumes were measured at 20 planes along the Z axis, from 30 to 1655 mm, each plane consisting of 56 × 56 measurement points in a 12 mm grid. The calculated signals were then compared to actual measurement values by dropping black and white spheres. The white spheres were assumed to be 100% reflective and the black spheres had a 5% reflectivity.

The measurement volume properties for targets with various optical cross sections (OCS) at different angles are shown in Table 1. The size of the measurement volume is dependent on the minimum acceptable sensitivity, which is related to the noise in the instrument. In the following results, the edge of the volume is defined as the limit where the signal to noise ratio (SNR) is larger than 10 for typical noise levels in a field installation. The signal to noise ratio is defined as the maximum value of the recorded signal divided by the peak-to-peak noise. The volumes for a 10 mm2 target are shown in Fig. 4.
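The edge criterion can be stated compactly; a small sketch of the definitions above (peak signal over peak-to-peak baseline noise, with the volume edge at SNR = 10):

```python
import numpy as np

def snr(signal, noise):
    """SNR as defined in the text: the maximum of the recorded signal
    divided by the peak-to-peak amplitude of the quiescent noise."""
    return float(np.max(signal) / (np.max(noise) - np.min(noise)))

def inside_measurement_volume(signal, noise, threshold=10.0):
    """A point counts as inside the measurement volume if a target there
    produces a signal with SNR above the threshold."""
    return snr(signal, noise) > threshold
```
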

Table 1 Measurement volume parameters at different angles for different target OCS. The target OCS values correspond roughly to a small midge, a small beetle, and a honeybee.
Figure 4

Measured FoV, beam, and measurement volume for the three angles. Each volume is mapped at 20 planes along the Z axis and each plane consists of 56 × 56 measurement points with 12 mm spacing. For the FoV and beam, all measurement points below 2% of the maximum value are excluded. For the measurement volume all points with a SNR < 10 for a 10 mm2 target are excluded. A low angle yields a longer and larger, but less sensitive, measurement volume. The FoV is identical in all configurations.

Data processing

Automated event extraction

The sensor records intervals of 10 min (4 quadrants, 2 spectral bands, 16 bit and 20 kHz sample rate after demux of carrier frequency) and automatically extracts insect observations from each recording. The event extraction is inspired by earlier work but modified to reduce computational load42,43,59,62. The event extraction algorithm was developed during prior experiments in various conditions. In simple terms, it aims to quantify the noise level and then multiply it by a signal-to-noise factor to yield a threshold. All events that exceed this threshold are then extracted.

In the chosen implementation, the signal in each channel was downsampled to 2 kHz and a rolling median boxcar filter with a width of 2 s and 50% overlap was used to estimate the quasi-static baselines (the baselines can change with environmental conditions, static objects in the beam, etc.). A 2 s window width makes the median estimation insensitive to insect observations, which have an average transit time of ca 100 ms. The standard deviation of the baseline was measured with an identical filter applied to all datapoints below the median. The selection of values below the median reduces the influence of rare events, such as insects, on the noise level estimation.

The interpolated median signals were subtracted from the full-resolution data, and we employed a Boolean condition for insect detection when the time series exceeds ten times the estimated standard deviation. A high threshold factor rejected weak observations which could yield unreliable results in the downstream feature extraction. The Boolean time series were eroded by 500 µs and dilated by 30 ms. The erosion rejects short spikes, outliers, and insect signals too short to be interpreted, while the dilation includes the flanks of insect observations. The logical OR function was applied across all QPD quadrants and spectral channels. Extracted observations are transmitted to a cloud database, along with metadata such as baseline and noise level, via GSM connection, or stored locally until a connection is available. An example of the event extraction process is shown in Fig. 5, and the insect event is shown in greater detail in Fig. 6.
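The extraction pipeline can be condensed into a single-channel sketch. This is a simplified stand-in for the on-device implementation (windowed medians with 50% overlap rather than a true rolling filter, and convolution-based erosion/dilation), with the threshold factor and morphology windows taken from the text:

```python
import numpy as np

def extract_events(x, fs, k_snr=10.0, win_s=2.0, erode_s=500e-6, dilate_s=30e-3):
    """Return a boolean mask marking insect events in a single channel:
    estimate a quasi-static baseline with overlapping windowed medians,
    estimate noise from the samples below the median, threshold at
    k_snr standard deviations, then erode and dilate the detection mask."""
    n = len(x)
    win = max(1, int(win_s * fs))
    centers, medians, sigmas = [], [], []
    for start in range(0, n, win // 2):           # 50% window overlap
        seg = x[start:start + win]
        m = np.median(seg)
        below = seg[seg <= m]                     # noise from sub-median samples
        centers.append(start + len(seg) / 2.0)
        medians.append(m)
        sigmas.append(np.std(below - m))
    baseline = np.interp(np.arange(n), centers, medians)
    sigma = np.interp(np.arange(n), centers, sigmas)
    mask = (x - baseline) > k_snr * sigma
    # Morphological erosion (rejects short spikes) then dilation (keeps flanks).
    ek = max(1, int(erode_s * fs))
    dk = max(1, int(dilate_s * fs))
    mask = np.convolve(mask, np.ones(ek), mode="same") >= ek
    mask = np.convolve(mask, np.ones(dk), mode="same") > 0
    return mask
```

In the full instrument this runs per quadrant and per spectral band, and the eight masks are combined with a logical OR before the events are cut out and transmitted.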

Figure 5

An example of the event extraction process in a single channel for visibility. (a) The data in the 808 nm band of a single QPD segment after the rolling median has been removed. The part of the signal above the event threshold is marked in grey, and the final insect event after erosion and dilation of the binary map is marked in green. (b) Intensity distribution of the data.

Figure 6

Insect event example. (a) The 808 nm signal for a single insect event in one of the QPD segments. The insect wingbeats appear as undulating spikes. The minimum envelope of the signal is interpreted as the insect body contribution to the signal. (b) The Welch spectral density of the event. The fundamental wingbeat frequency and harmonics are seen in the event signal. This event has a fundamental wingbeat frequency of 160 Hz and an average body-to-wing ratio of 0.4.

Each insect observation, along with its associated timestamp and device identifier, is automatically uploaded to the cloud via one-way AMQP (Advanced Message Queuing Protocol), with unique connections for each device. Virtual computing is then used to further process, analyze, and securely store data for further use and aggregation.

Feature extraction/data interpretation

The QPD segments collect backscattered light from different sections of the measurement volume. For a single object passing through the measurement volume, the signal strength within each QPD-quadrant is related to the object’s OCS as well as its position. As the OCS varies with each wingbeat, the wingbeat frequency can be resolved. Many methods have been used to extract the wingbeat frequency from insect observations62,63,64 and most are based on identifying the fundamental frequency in the frequency domain, as shown in Fig. 6b.
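A minimal version of this estimate locates the strongest spectral peak within a plausible wingbeat band. The paper's figures use a Welch spectral density; a plain windowed periodogram is used here for brevity, and the band limits are illustrative:

```python
import numpy as np

def fundamental_wbf(x, fs, fmin=20.0, fmax=1200.0):
    """Estimate the fundamental wingbeat frequency of an insect event as
    the strongest periodogram peak between fmin and fmax (illustrative
    band limits; a Welch estimate averages several windowed segments)."""
    x = np.asarray(x, dtype=float) - np.mean(x)     # drop the body/baseline offset
    spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)
    return float(freqs[band][np.argmax(spec[band])])
```

Note that for strongly non-sinusoidal wing signals the largest peak can be a harmonic rather than the fundamental, which is why practical methods add harmonic-pattern checks on top of the raw peak search.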

In addition to the wingbeat frequency, the body and wing contribution can be measured from each time signal which allows calculation of additional features such as body-to-wing ratio. Additional features can be calculated by comparing the relative intensity of the body and wing signals in the two spectral bands. These bands differentially index melanin absorption65,66,67 and may yield some sensitivity to wing interference patterns66,68,69, although not enough to uniquely determine wing membrane thickness. Together these features can be used to quantify the morphology of different insect groups and allow remote classification of insects according to order, family, genus or species32,64,69,70.
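The body/wing decomposition can be sketched with a rolling minimum: over roughly one wingbeat period, the minimum envelope tracks the body (plus baseline) contribution, and the remainder is attributed to the wings. The ratio definition below (mean body signal over peak wing signal) is an illustrative assumption, not the paper's exact formula:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def body_wing_split(x, fs, wbf_hz):
    """Split an event into body and wing contributions: the rolling minimum
    over roughly one wingbeat period approximates the minimum envelope
    (body), and what remains is attributed to the wings."""
    w = max(2, int(fs / wbf_hz))                      # ~one wingbeat period in samples
    pad = np.pad(x, (w // 2, w - w // 2 - 1), mode="edge")
    body = sliding_window_view(pad, w).min(axis=1)
    wing = x - body
    return body, wing

def body_to_wing_ratio(x, fs, wbf_hz):
    """Illustrative ratio definition: mean body level over peak wing signal."""
    body, wing = body_wing_split(x, fs, wbf_hz)
    return float(np.mean(body) / np.max(wing))
```

Computing this ratio independently in the 808 nm and 980 nm channels then gives the melanisation-sensitive spectral features described above.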

Field validation


The sensor was field-tested against a conventional insect monitoring method, yellow water traps (22 cm diameter)33,71, in an organic oilseed rape (Brassica napus L.) field in the vicinity of Sorø, Denmark (55° 29′ 04.3″ N 11° 29′ 34.6″ E). During a four-week period (22 April–22 May 2020), insects were monitored with six sensors and six yellow water traps. The water traps were filled with water and soap, immediately drowning any insects landing in the trap. Sensors and traps were placed in a grid pattern consisting of four linear transects 30 m from, and perpendicular to, the field’s southernmost edge. This is illustrated in Fig. 7. Each transect consisted of three monitoring points (either sensors or traps) with 45 m spacing and a separation of 22.5 m between transects. The first and third transects consisted of sensors and the second and fourth of yellow water traps. During the field study presented in this work, \({\theta }_{S}\) was set to 20° in order to maximize the signal strength of small targets at close range.

Figure 7

Layout of sensors and traps on the field. Sensors and traps were placed in a grid pattern ca 30 m from the field edges. The four north–south transects are separated by ca 22 m and consist of either sensors or water traps, spaced by 45 m. Image data from Google Earth 2021, Aerodata International Survey Mapdata 2021.

Fundamentally the two methods observe different insect behaviors. While the sensor looks at insects flying above the crop canopy, the yellow water traps look at insects that occur within it. Further confounding the comparison, yellow is attractive to some insects33. Therefore, some proportion of insects will be attracted to the yellow water traps, resulting in overrepresentation of some species72,73. However, water traps constitute the standard practice for pest monitoring in oilseed rape for many species.

Data analysis

The water traps were emptied daily, except for Sundays and 7 additional days (3 sample days in late April and 4 days in mid-May) when we were unable to empty the traps. Sensor data was recorded continuously. All insects in the traps were collected, but to allow for a more direct comparison of methods, non-flying insects and thrips found in the water traps were excluded from further analysis.

The sensor data was aggregated according to the collection time of the water traps. Insects trapped during Sundays were added to the following day’s count, and the number of collected insects was normalized by the number of trapping days. One day, April 30th, was excluded due to instrument malfunction. The average number of recorded insect observations per sensor per day and per hour was calculated. The calculated numbers were normalized by sensor uptime, which was on average 90% throughout the measurement period. Observations during heavy rainfall and observations without any distinguishable wingbeat frequency, ca 1% of the observations, were automatically removed from the data using a classification algorithm.
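The two normalizations reduce to simple rate corrections; a small sketch (function names are ours, not from the paper):

```python
def trap_rate(insects, days_since_emptied):
    """Attribute insects accumulated over several days (e.g. over a Sunday)
    evenly to the trapping days, yielding an insects-per-day rate."""
    return insects / days_since_emptied

def sensor_rate(observations, uptime_fraction):
    """Scale a day's raw observation count up to a full day of monitoring,
    compensating for sensor downtime."""
    if not 0.0 < uptime_fraction <= 1.0:
        raise ValueError("uptime_fraction must be in (0, 1]")
    return observations / uptime_fraction
```

For example, a trap emptied after two days with 126 insects contributes 63 insects per day, and a sensor with 90% uptime and 900 raw observations is counted as 1000 observations per full day.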


The insect activity recorded by the sensors and the traps is shown in Fig. 8. Insect counts from sensors and traps cannot be directly equated due to differences in measurement subject (insect flights vs insect landings) and non-homogeneous insect distribution; however, they serve to visualize similarities in gross changes in insect activity over the sample period. The results demonstrate a significant correlation between the sensor and trap results, specifically a Spearman’s rank correlation coefficient of 0.61 and a p-value of 0.0065 (ref. 74). Over the course of the season, an average of 1122 ± 242 (SE) insect observations per day were collected per sensor (excluding downtime), compared to an average of 63 ± 6 (SE) insects caught per water trap per day over the same period.
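For reference, Spearman's rank correlation is the Pearson correlation of the ranks; scipy.stats.spearmanr is the usual tool (and also supplies the p-value), but a minimal coefficient-only sketch is:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation coefficient: the Pearson correlation of
    the two rank sequences, with ties assigned their average rank."""
    def ranks(v):
        v = np.asarray(v, dtype=float)
        order = np.argsort(v)
        r = np.empty(len(v))
        r[order] = np.arange(1, len(v) + 1)
        for val in np.unique(v):          # average the ranks of tied values
            tie = v == val
            r[tie] = r[tie].mean()
        return r
    rx, ry = ranks(x), ranks(y)
    return float(np.corrcoef(rx, ry)[0, 1])
```

Rank correlation is the appropriate choice here because the sensor and trap counts live on very different scales (~19:1) and only their monotonic co-variation is meaningful.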

Figure 8

Sensor-trap comparison. (a) Average insect counts across sensors per day. Error bars indicate the standard deviation between the sensors. (b) Average insect count across yellow traps per day. Error bars indicate the standard deviation between the traps. (c) Sensor vs trap counts during days where both sensor and trap data were available. The red line is the linear least-squares fit (LSTQ) with a Spearman correlation coefficient of 0.61.


Discussion

Here we present a sensor for automated, unsupervised field monitoring of insect flight activity. The sensor illuminates an air volume and records the backscattered light from insects that fly through the measurement volume. Discrete insect observations are automatically extracted from the continuous raw data flow and transmitted over a cellular connection to a database in the cloud. Field validation showed that the number of recorded insect observations correlates with the number of individual insects trapped by a conventional insect monitoring method. Furthermore, the sensor recorded an order of magnitude more insects than the conventional method over the same period.

The automation of insect monitoring has the potential to reduce monitoring bias, cost, and human labor, resulting in an increased ability to collect large quantities of biodiversity, public health, and economically relevant insect data. Additionally, the observations from the sensors were available in real time, whereas emptying and counting insects from traps required a significant amount of labor. While this work was limited to comparing total insect counts from the traps, it is possible for a skilled expert to identify trapped insects to the sub-species level. This is an area where the traps currently have a strong advantage over this sensor and similar instrumentation. Developing and evaluating species-specific insect classification algorithms is therefore a major focus. Significant work is still needed prior to field implementation to test possible use cases and limitations of this system.

One of the most striking differences between the monitoring methods is the day-to-day variability in the number of data points collected (Fig. 8). While the yellow traps caught a similar number of insects each day, the difference between low and high flight-activity days was more visible in the sensor data. Early analysis of the trap and sensor data indicates that the peak recorded during May 7–11 is due to a pollen beetle (Brassicogethes aeneus) activity spike. This will be the subject of further studies.

Another marked difference between the sensor and the water traps is the number of data points collected over the same collection period. Each sensor recorded ~ 19 times more insect observations than insects were collected in a water trap. While in general the correlation between the two values is considered more relevant than the absolute numbers, one advantage of a much higher observation count is the ability to obtain statistically sound data aggregated at very high temporal resolution. In this work, the data was aggregated to match the collection times of the traps, but it could easily be aggregated down to hourly activity. The higher temporal resolution and continuous monitoring during unsociable hours allow for the comparatively easy and low-labor collection of data on insect circadian rhythms, as well as direct weather interactions.

We hypothesize that the sensors observe different insect behaviors compared to conventional monitoring methods, since only airborne (flying or jumping) insects are recorded. Therefore, we did not expect a perfect correlation between the sensors and the conventional methods. Sweep netting is likely the most similar monitoring method, since it also catches insects in flight above the crop. However, sweep netting also collects insects resting on plants, provides only a point measurement in time, and is typically performed along a transect rather than at a fixed point in the field19. Additionally, each sampling method is biased towards different insects, influencing the catch15,17.

Trapping methods, such as the water traps used in this study, monitor insects landing, walking, or jumping to a specific point and do not record insects in flight. Also, each trapping method is biased towards different insects, with the trap color influencing the trap catch33. It would therefore be beneficial to include multiple trap types in the ground truthing in future work. Additionally, both the sensors and the conventional ground truthing methods assume that the recorded insect activity in one specific point in the field is representative of the insect activity in the near surroundings.

The sensor is also most likely biased towards reporting certain species groups, although we do not yet fully understand in what manner. Primarily, it is only capable of recording airborne insects and is unsuitable for monitoring during rain. Insect vision is focused towards the visual or ultraviolet spectrum and is not capable of resolving infrared light, and we therefore believe the emitted beam has very little influence on insect behavior75. However, in a homogeneous landscape such as an agricultural field, any foreign object placed above the canopy could serve as an attractant to insects. Finally, the size of the measurement volume varies with the OCS of the insects, and larger insects will therefore be over-represented. To provide a complete picture of the insect population, this should be considered. Along with species-specific observations, this is an area where we expect significant progress.

Automated insect monitoring has the potential to facilitate pest prevention, public health studies and biodiversity monitoring. Compared to alternative methods, such as automated traps, the sensor described in this paper comes at a higher cost per unit and with higher power requirements. Compared to previously described entomological lidars, we record fewer observations, but with a longer transit time and higher sampling frequency. We believe the advantage of entomological lidars, such as the sensor described in this paper, is the ability to potentially monitor and discriminate between multiple species using a single instrument.

In further work we will explore the possibilities of unsupervised long-term monitoring of insect activity and species recognition.


Conclusion

In this work, we have introduced an unsupervised automated sensor for insect monitoring. The measurement principle is similar to that of entomological lidar setups but is optimized for near-field measurements. This simplifies the installation process and increases the robustness of the sensor, allowing it to be operated by non-technical personnel and enabling long-term unsupervised monitoring.

The sensor automatically extracts insect events from the raw data and transmits these via a built-in modem for further processing. From the recorded observations, features such as the wingbeat frequency, body-to-wing ratio, and melanisation factor are computed and used to classify insects, potentially down to the species level. During a 4-week deployment in an oilseed rape field, the detected flight activity was shown to be correlated with a conventional monitoring method.

The capabilities, standardization, and scalability of this sensor-based method have the potential to improve the state of the art in insect monitoring. To date, 119 similar units have been deployed in the field, and in 2021 the cloud database encompassed > 18 million insect observations. The sensor can be used to explore areas such as biodiversity assessment, insecticide resistance, and long-term monitoring of remote areas, facilitating research studies currently difficult or impossible to conduct with conventional methods.


References

  1. Stork, N. E. How many species of insects and other terrestrial arthropods are there on Earth? (2017).

  2. Scudder, G. Insect Biodiversity: Science and Society (Wiley-Blackwell, 2009).

  3. Lami, F., Boscutti, F., Masin, R., Sigura, M. & Marini, L. Seed predation intensity and stability in agro-ecosystems: Role of predator diversity and soil disturbance. Agric. Ecosyst. Environ. 288, 106720 (2020).

  4. Gallai, N., Salles, J. M., Settele, J. & Vaissière, B. E. Economic valuation of the vulnerability of world agriculture confronted with pollinator decline. Ecol. Econ. 68, 810–821 (2009).

  5. Consoli, F. L., Parra, J. R. P. & Zucchi, R. A. Egg Parasitoids in Agroecosystems with Emphasis on Trichogramma (Springer Science, 2010).

  6. Sánchez-Guillén, R. A., Córdoba-Aguilar, A., Hansson, B., Ott, J. & Wellenreuther, M. Evolutionary consequences of climate-induced range shifts in insects. Biol. Rev. 91, 1050–1064 (2016).

  7. Zalucki, M. P. et al. Estimating the economic cost of one of the world’s major insect pests, Plutella xylostella (Lepidoptera: Plutellidae): Just how long is a piece of string? J. Econ. Entomol. 105, 1115–1129 (2012).

  8. Dornelas, M. & Daskalova, G. N. Nuanced changes in insect abundance. Science 368, 368–369 (2020).

  9. Didham, R. K. et al. Interpreting insect declines: Seven challenges and a way forward. Insect Conserv. Divers. 13, 103–114 (2020).

  10. Greenwood, B. M., Bojang, K. & Whitty, C. J. M. Malaria. Lancet 365, 98 (2005).

  11. Dangles, O. & Casas, J. Ecosystem services provided by insects for achieving sustainable development goals. Ecosyst. Serv. 35, 109–115 (2019).

  12. Burkholder, W. E. & Ma, M. Pheromones for monitoring and control of stored-product insects. Annu. Rev. Entomol. 30, 257–272 (1985).

  13. Morris, R. F. Sampling insect populations. Annu. Rev. Entomol. 5, 243–264 (1960).

  14. Strickland, A. H. Sampling crop pests and their hosts. Annu. Rev. Entomol. 6, 201–220 (1961).

  15. Bannerman, J. A., Costamagna, A. C., McCornack, B. P. & Ragsdale, D. W. Comparison of relative bias, precision, and efficiency of sampling methods for natural enemies of soybean aphid (Hemiptera: Aphididae). J. Econ. Entomol. 108, 1381–1397 (2015).

  16. Osborne, J. L. et al. Harmonic radar: A new technique for investigating bumblebee and honey bee foraging flight. VII Int. Symp. Pollinat. 437, 159–164 (1996).

  17. Zink, A. G. & Rosenheim, J. A. State-dependent sampling bias in insects: Implications for monitoring western tarnished plant bugs. Entomol. Exp. Appl. 113, 117–123 (2004).

  18. Rancourt, B., Vincent, C. & De Oliveira, A. D. Circadian activity of Lygus lineolaris (Hemiptera: Miridae) and effectiveness of sampling techniques in strawberry fields. J. Econ. Entomol. 93, 1160–1166 (2000).

  19. Binns, M. R. & Nyrop, J. P. Sampling insect populations for the purpose of IPM decision making. Annu. Rev. Entomol. 37, 427–453 (1992).

  20. Portman, Z. M., Bruninga-Socolar, B. & Cariveau, D. P. The state of bee monitoring in the United States: A call to refocus away from bowl traps and towards more effective methods. Ann. Entomol. Soc. Am. 113, 337–342 (2020).

  21. Montgomery, G. A., Belitz, M. W., Guralnick, R. P. & Tingley, M. W. Standards and best practices for monitoring and benchmarking insects. Front. Ecol. Evol. 8, 579193 (2021).

  22. Bick, E., Dryden, D. M., Nguyen, H. D. & Kim, H. A novel CO2-based insect sampling device and associated field method evaluated in a strawberry agroecosystem. J. Econ. Entomol. 113, 1037–1042 (2020).

  23. Wen, C. & Guyer, D. Image-based orchard insect automated identification and classification method. Comput. Electron. Agric. 89, 110–115 (2012).

  24. Chen, Y., Why, A., Batista, G., Mafra-Neto, A. & Keogh, E. Flying insect classification with inexpensive sensors. J. Insect Behav. 27, 657–677 (2014).

  25. Potamitis, I. & Rigakis, I. Novel noise-robust optoacoustic sensors to identify insects through wingbeats. IEEE Sens. J. 15, 4621–4631 (2015).

  26. Eliopoulos, P. A., Potamitis, I., Kontodimas, D. C. & Givropoulou, E. G. Detection of adult beetles inside the stored wheat mass based on their acoustic emissions. J. Econ. Entomol. 108, 2808–2814 (2015).

  27. Ärje, J. et al. Automatic image-based identification and biomass estimation of invertebrates. Methods Ecol. Evol. 11, 922–931 (2020).

  28. Hobbs, S. E. & Hodges, G. An optical method for automatic classification and recording of a suction trap catch. Bull. Entomol. Res. 83, 47–51 (1993).

  29. O’Neill, M. A., Gauld, I. D., Gaston, K. J. & Weeks, P. Daisy: An automated invertebrate identification system using holistic vision techniques. In Proceedings of the Inaugural Meeting BioNET-INTERNATIONAL Group for Computer-Aided Taxonomy (BIGCAT) 13–22 (1997).

  30. Chesmore, E. D. Methodologies for automating the identification of species. In First BioNet-International Work. Gr. Autom. Taxon. 3–12 (2000).

  31. Martineau, M. et al. A survey on image-based insect classification. Pattern Recognit. 65, 273–284 (2017).

  32. Silva, D. F., De Souza, V. M. A., Batista, G. E. A. P. A. & Ellis, D. P. W. Applying machine learning and audio analysis techniques to insect recognition in intelligent traps. In Proceedings—2013 12th International Conference on Machine Learning and Applications, ICMLA 2013 (2013).

  33. Capinera, J. L. & Walmsley, M. R. Visual responses of some sugarbeet insects to sticky traps and water pan traps of various colors. J. Econ. Entomol., 71(6), 926–927 (1978).

    Google Scholar 

  34. Moore, A., Miller, J. R., Tabashnik, B. E. & Gage, S. H. Automated identification of flying insects by analysis of wingbeat frequencies. J. Econ. Entomol. 79, 1703–1706 (1986).

    Google Scholar 

  35. Riley, J. R. Angular and temporal variations in the radar cross-sections of insects. Proc. Inst. Electr. Eng. (IET) 120, 1229–1232 (1973).

    Google Scholar 

  36. Reed, S. C., Williams, C. M. & Chadwick, L. E. Frequency of wing-beat as a character for separating species races and geographic varieties of Drosophila. Genetics 27, 349 (1942).

    CAS  PubMed  PubMed Central  Google Scholar 

  37. Mankin, R. W., Hagstrum, D. W., Smith, M. T., Roda, A. L. & Kairo, M. T. K. Perspective and promise: a century of insect acoustic detection and monitoring. Am. Entomol. 57(1), 30–44 (2011).

    Google Scholar 

  38. Drake, V. A. & Reynolds, D. R. Radar Entomology: Observing Insect Flight and Migration (Cabi, 2012).

    Google Scholar 

  39. Long, T. et al. Entomological radar overview: System and signal processing. IEEE Aerosp. Electron. Syst. Mag. 35, 20–32 (2020).

    Google Scholar 

  40. Drake, V. A., Hatty, S., Symons, C. & Wang, H. Insect monitoring radar: Maximizing performance and utility. Remote Sens. 12, 596 (2020).

    ADS  Google Scholar 

  41. Brydegaard, M. & Jansson, S. Advances in entomological laser radar. IET Int. Radar Conf. (2018).

    Article  Google Scholar 

  42. Jansson, S. Entomological Lidar: Target Characterization and Field Applications (Department of Physics, Lund University, 2020).

    Google Scholar 

  43. Malmqvist, E. From Fauna to Flames: Remote Sensing with Scheimpflug-Lidar (Department of Physics, Lund University, 2019).

    Google Scholar 

  44. Mankin, R. W., Hagstrum, D. W., Smith, M. T., Roda, A. L. & Kairo, M. T. K. Perspective and promise: A century of insect acoustic detection and monitoring. Am. Entomol. 57, 30–44 (2011).

    Google Scholar 

  45. Miller-Struttmann, N. E., Heise, D., Schul, J., Geib, J. C. & Galen, C. Flight of the bumble bee: Buzzes predict pollination services. PLoS ONE 12, 1–14 (2017).

    Google Scholar 

  46. Li, Y. et al. Mosquito detection with low-cost smartphones: Data acquisition for malaria research. arXiv:1711.06346 [stat.ML] (2017).

  47. Mukundarajan, H., Hol, F. J. H., Castillo, E. A., Newby, C. & Prakash, M. Using mobile phones as acoustic sensors for high-throughput mosquito surveillance. Elife 6, 1–26 (2017).

    Google Scholar 

  48. Osborne, J. L. et al. A landscape-scale study of bumble bee foraging range and constancy, using harmonic radar. J. Appl. Ecol. 36, 519–533 (1999).

    Google Scholar 

  49. Smith, A. D., Riley, J. R. & Gregory, R. D. A method for routine monitoring of the aerial migration of insects by using a vertical-looking radar. Philos. Trans. R. Soc. London. Ser. B Biol. Sci. 340, 393–404 (1993).

    Google Scholar 

  50. Chapman, J. W., Smith, A. D., Woiwod, I. P., Reynolds, D. R. & Riley, J. R. Development of vertical-looking radar technology for monitoring insect migration. Comput. Electron. Agric. 35(2–3), 95–110 (2002).

    Google Scholar 

  51. Schaefer, G. W. & Bent, G. A. An infra-red remote sensing system for the active detection and automatic determination of insect flight trajectories (IRADIT). Bull. Entomol. Res. 74, 261–278 (1984).

    Google Scholar 

  52. Farmery, M. J. Optical studies of insect flight at low altitude. (Doctoral dissertation, University of York, 1981).

  53. Farmery, M. J. The effect of air temperature on wingbeat frequency of naturally flying armyworm moth (Spodoptera exempta). Entomol. Exp. Appl. 32, 193–194 (1982).

    Google Scholar 

  54. Malmqvist, E. & Brydegaard, M. Applications of KHZ-CW lidar in ecological entomology. EPJ Web Conf. 119, 25016. (2016).

    Article  Google Scholar 

  55. Brydegaard, M. et al. Lidar reveals activity anomaly of malaria vectors during pan-African eclipse. Sci. Adv. 6, eaay5487 (2020).

    PubMed  PubMed Central  ADS  Google Scholar 

  56. Malmqvist, E. et al. The bat–bird–bug battle: Daily flight activity of insects and their predators over a rice field revealed by high-resolution Scheimpflug Lidar. Roy. Soc. Open Sci. 5(4), 172303 (2018).

    ADS  Google Scholar 

  57. Fristrup, K. M., Shaw, J. A. & Tauc, M. J. Development of a wing-beat-modulation scanning lidar system for insect studies. Lidar Remote Sens. Environ. Monit. 2017, 15. (2017).

    Article  Google Scholar 

  58. Hoffman, D. S., Nehrir, A. R., Repasky, K. S., Shaw, J. A. & Carlsten, J. L. Range-resolved optical detection of honeybees by use of wing-beat modulation of scattered light for locating land mines. Appl. Opt. 46, 3007–3012 (2007).

    PubMed  ADS  Google Scholar 

  59. Jansson, S., Malmqvist, E. & Mlacha, Y. Real-time dispersal of malaria vectors in rural Africa monitored with lidar. Plos one. 16(3), e0247803 (2021).

    CAS  PubMed  PubMed Central  Google Scholar 

  60. Jansson, S. & Brydegaard, M. Passive kHz lidar for the quantification of insect activity and dispersal. Anim. Biotelemet. 6, 6 (2018).

    Google Scholar 

  61. Jansson, S. P. & Sørensen, M. B. An optical remote sensing system for detection of aerial and aquatic fauna. U.S. Patent Application No. 16/346,322 (2019).

  62. Malmqvist, E., Jansson, S., Török, S. & Brydegaard, M. Effective parameterization of laser radar observations of atmospheric fauna. IEEE J. Sel. Top. Quant. Electron. 22, 1 (2015).

    Google Scholar 

  63. Drake, V. A., Wang, H. K. & Harman, I. T. Insect Monitoring Radar: Remote and network operation. Comput. Electron. Agric. 35, 77–94 (2002).

    Google Scholar 

  64. Kirkeby, C. et al. Advances in automatic identification of flying insects using optical sensors and machine learning. Sci. Rep. 11, 1555 (2021).

    CAS  PubMed  PubMed Central  ADS  Google Scholar 

  65. Jacques, S. L. Erratum: Optical properties of biological tissues: A review (Physics in Medicine and Biology (2013) 58). Phys. Med. Biol. 58, 5007–5008 (2013).

    Google Scholar 

  66. Li, M. et al. Bark beetles as lidar targets and prospects of photonic surveillance. J. Biophoton. (2020).

    Article  Google Scholar 

  67. Brydegaard, M. Advantages of shortwave infrared LIDAR entomology. in Laser Applications to Chemical, Security and Environmental Analysis LW2D-6 (Optical Society of America, 2014).

    Google Scholar 

  68. Brydegaard, M., Jansson, S., Schulz, M. & Runemark, A. Can the narrow red bands of dragonflies be used to perceive wing interference patterns? Ecol. Evol. 8(11), 5369–5384 (2018).

    PubMed  PubMed Central  Google Scholar 

  69. Gebru, A. et al. Multiband modulation spectroscopy for the determination of sex and species of mosquitoes in flight. J. Biophotonics 11(8), e201800014 (2018).

    PubMed  Google Scholar 

  70. Potamitis, I. Classifying insects on the fly. Ecol. Inform. 21, 40–49 (2014).

    Google Scholar 

  71. Heathcote, G. D. The comparison of yellow cylindrical, flat and water traps, and of Johnson suction traps, for sampling aphids. Ann. Appl. Biol. 45, 133–139 (1957).

    Google Scholar 

  72. Vaishampayan, S. M., Kogan, M., Waldbauer, G. P. & Woolley, J. Spectral specific responses in the visual behavior of the greenhouse whitefly, Trialeurodes vaporariorum (Homoptera: Aleyrodidae). Entomol. Exp. Appl. 18, 344–356 (1975).

    Google Scholar 

  73. Mound, L. A. Studies on the olfaction and colour sensitivity of Bemisia tabaci (Genn.) (Homoptera, Aleyrodidae). Entomol. Exp. Appl. 5, 99–104 (1962).

    Google Scholar 

  74. Virtanen, P. et al. SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 17, 261–272 (2020).

    CAS  PubMed  PubMed Central  Google Scholar 

  75. Van Der Kooi, C. J., Stavenga, D. G., Arikawa, K., Belušič, G. & Kelber, A. Evolution of insect color vision: From spectral sensitivity to visual ecology. Annu. Rev. Entomol. 66, 435–461 (2021).

    PubMed  Google Scholar 

Download references


The authors want to thank Jakob Dyhr for kindly making his organic oilseed rape field in Sorø, Denmark, available for the field experiments. Thanks to Lene Sigsgaard, Samuel Jansson, Don R Reynolds and Sam Cook for helpful discussions and comments.


This work was supported by Innovation Fund Denmark under Grant nos. 9078-00183B and 9066-00051B (9065-00239B, 9066-00051A), the Danish Environmental Protection Agency under Grant no. MST-667-00253 and Norsk Elektro Optikk AS.

Author information

Authors and Affiliations



K.R. wrote the first draft, produced figures, and conducted data analysis. K.R., E.B., and L.S. developed paper outline and structure. E.B. contributed to the introduction, field validation, discussion, and conclusion. L.S. contributed to the data processing section and discussion. K.P. and L.M. contributed to the instrument software development. A.S., R.L., and F.E. contributed to the instrument development and instrument characterization section. M.S. contributed with editing and contributed to figures. S.H., B.B., C.G., and J.L. collected and counted insects during the field trials. T.N. led the development of the instrumentation. J.L. took over development leadership in 2020.

Corresponding author

Correspondence to Klas Rydhmer.

Ethics declarations

Competing interests

All authors are or were (partly) affiliated with FaunaPhotonics, the company that developed the sensor described in this study.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit

Reprints and Permissions

About this article

Verify currency and authenticity via CrossMark

Cite this article

Rydhmer, K., Bick, E., Still, L. et al. Automating insect monitoring using unsupervised near-infrared sensors. Sci Rep 12, 2603 (2022).

Download citation

  • Received:

  • Accepted:

  • Published:

  • DOI:


By submitting a comment you agree to abide by our Terms and Community Guidelines. If you find something abusive or that does not comply with our terms or guidelines please flag it as inappropriate.


Quick links

Nature Briefing

Sign up for the Nature Briefing newsletter — what matters in science, free to your inbox daily.

Get the most important science stories of the day, free in your inbox. Sign up for Nature Briefing