Description
The incidence of pollen allergies is steadily increasing, now affecting 20-30% of the global population (Alska et al., 2025). Several factors contribute to this trend. First, there is an increase in sensitization to pollen, for which the “hygiene hypothesis” is considered a major cause (Platts-Mills, 2015). Furthermore, urbanisation entails increased exposure to air pollutants that have been linked to greater severity of allergic reactions (Zhao et al., 2016). Additionally, climate change seems to play a crucial role: with increasing carbon dioxide concentrations and warmer temperatures, plants can grow larger and flower for longer, leading to greater pollen production (Ariano et al., 2010; Ziska et al., 2019).
Therefore, monitoring pollen is, and will remain, crucial to deepen our understanding of its distribution and epidemiological implications. Currently, most pollen monitoring and counting is performed manually on samples from Hirst-type volumetric traps (Hirst, 1952), making it both expensive and time-consuming. This precludes real-time monitoring and dense spatial sampling, since each observation station must be serviced by trained personnel, at best once a day. Recently, there have been attempts to automate pollen sampling and identification in both Germany (Oteros et al., 2015) and Switzerland (Sauvageat et al., 2020). However, such devices are still costly and can reliably identify only a few pollen taxa.
Our work focuses on designing microfluidic devices for automated pollen monitoring, with the overarching goal of real-time automated classification and counting of pollen grains. The first step towards this ambitious objective is to develop a microfluidic platform able to acquire sharp images of pollen grains from different perspectives in a continuous flow.
We have employed 2.5D hydrodynamic focusing to capture high-resolution images of pollen grains flowing in our device by confining the sample at the bottom of the channel (Patel et al., 2023). Confined in this way, the pollen grains are also subjected to a steep gradient of shear forces, which induces rotation within the camera’s field of view (Kleiber et al., 2020). We can therefore obtain several images of the same pollen grain from different angles. This will allow us to build a reliable training dataset for a neural network trained to automatically classify pollen from microscopic images.
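The magnitude of this shear-induced rotation can be estimated from standard microfluidics relations: in pressure-driven flow through a shallow rectangular channel, the wall shear rate is roughly 6Q/(wh²), and a rigid sphere in simple shear rotates at about half the local shear rate. A minimal sketch with hypothetical channel dimensions and flow rate (not the actual device parameters):

```python
import math

def wall_shear_rate(Q, w, h):
    """Approximate wall shear rate (1/s) for pressure-driven flow in a
    shallow rectangular channel: gamma ~ 6*Q/(w*h^2)."""
    return 6.0 * Q / (w * h**2)

# Hypothetical operating parameters, not the actual device values:
Q = 1e-9 / 60.0   # flow rate: 1 uL/min in m^3/s
w = 500e-6        # channel width, m
h = 50e-6         # channel height, m

gamma = wall_shear_rate(Q, w, h)       # ~80 1/s near the bottom wall
omega = gamma / 2.0                    # rotation rate of a rigid sphere, rad/s
revs_per_s = omega / (2.0 * math.pi)   # ~6 revolutions per second
print(f"shear rate ~ {gamma:.0f} 1/s, rotation ~ {revs_per_s:.1f} rev/s")
```

Even at these modest flow rates, a grain completes several rotations per second, so successive camera frames capture it from noticeably different angles.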
Concurrently, we are exploring how acoustic forces can help achieve focusing and multi-angular imaging of pollen. As a proof of concept, we employed a bulk acoustic wave device comprising a silicon-glass chip and a piezoelectric element actuated at two frequencies (Jonai and Akiyama, 2023), thus achieving two-dimensional focusing of the pollen grains.
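In bulk acoustic wave devices, one-dimensional focusing is typically obtained by driving the channel at its half-wavelength resonance, f = c/(2L), which places a pressure node at the cavity centre; actuating the width and height resonances at two frequencies then yields two-dimensional focusing. A sketch with the approximate speed of sound in water and hypothetical channel dimensions (not the actual chip geometry):

```python
def half_wave_resonance(c, L):
    """Resonance frequency (Hz) that places a single pressure node at the
    centre of a fluid cavity of size L: f = c / (2*L)."""
    return c / (2.0 * L)

c_water = 1480.0   # speed of sound in water, m/s (approximate)
width = 375e-6     # hypothetical channel width, m
height = 150e-6    # hypothetical channel height, m

f_x = half_wave_resonance(c_water, width)   # ~1.97 MHz, horizontal focusing
f_y = half_wave_resonance(c_water, height)  # ~4.93 MHz, vertical focusing
print(f"drive frequencies: {f_x/1e6:.2f} MHz and {f_y/1e6:.2f} MHz")
```

Because the two resonances are well separated in frequency, a single piezoelectric element can be driven with a superposition of both signals to focus the grains in width and height simultaneously.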
This work is the first step towards automated, real-time pollen monitoring. We envision an inexpensive microfluidic platform that can be deployed at any location of interest, providing images of airborne pollen grains at any given time.