Guided Event Filtering: Synergy between Intensity Images and Neuromorphic Events for High Performance Imaging. IEEE Transactions on Pattern Analysis and Machine Intelligence, published 2021-09-20. DOI: 10.1109/TPAMI.2021.3113344. Peiqi Duan, Zihao Wang, Boxin Shi, Oliver Cossairt, Tiejun Huang, Aggelos Katsaggelos.
Many visual and robotics tasks in real-world scenarios rely on robust handling of high-speed motion and high dynamic range (HDR) with effectively high spatial resolution and low noise. Such stringent requirements, however, cannot be satisfied by a single imager or modality; they call for complementary sensors. In this paper, we explore the synergy between traditional frame-based sensors, which offer high spatial resolution and low sensor noise, and emerging event-based sensors, which offer high speed and high dynamic range. We introduce a novel computational framework, termed Guided Event Filtering (GEF), that processes the two input streams and outputs super-resolved yet noise-reduced events. GEF first associates the events with the reference image via a flow model. It then performs joint filtering that inherits the mutual structure of both inputs, with a switching mechanism for event self-guided filtering. Lastly, GEF redistributes the filtered events in the space-time volume while preserving the statistical characteristics of the original events. We demonstrate the benefits of GEF by applying the resulting high-quality events to existing event-based algorithms across diverse application categories, including high-speed corner tracking, depth estimation, high-frame-rate video synthesis, and super-resolution/HDR/color image restoration.
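To make the joint-filtering idea concrete, the sketch below applies a classic guided filter (He et al.) to an event count image, using the intensity frame as the guidance signal so that the filtered event map inherits the frame's edge structure. This is a minimal illustration of guidance-based filtering, not the paper's actual GEF implementation; the flow-based association, the self-guidance switch, and the event redistribution steps are omitted, and all function names here are hypothetical.

```python
import numpy as np

def box_filter(x, r):
    """Mean filter over a (2r+1) x (2r+1) window, with edge replication."""
    k = 2 * r + 1
    pad = np.pad(x, r, mode="edge")
    out = np.zeros(x.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (k * k)

def guided_filter(guide, src, r=2, eps=1e-2):
    """Classic guided filter: smooth `src` while transferring the local
    linear structure (edges) of the guidance image `guide`."""
    I = guide.astype(np.float64)
    p = src.astype(np.float64)
    mean_I, mean_p = box_filter(I, r), box_filter(p, r)
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    var_I = box_filter(I * I, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)   # per-window linear coefficient
    b = mean_p - a * mean_I      # per-window offset
    return box_filter(a, r) * I + box_filter(b, r)

def events_to_count_image(events, shape):
    """Accumulate (x, y, polarity) events into a signed count image
    (a simplified stand-in for a space-time event representation)."""
    count = np.zeros(shape, dtype=np.float64)
    for x, y, pol in events:
        count[y, x] += 1.0 if pol > 0 else -1.0
    return count
```

Usage: build the count image from the event stream, then call `guided_filter(frame, count_image)`; with a larger `eps` the output is smoother (stronger noise suppression), while a smaller `eps` preserves more of the guide's edge detail.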