|
|
--- |
|
|
license: mit |
|
|
--- |
|
|
|
|
|
**Spiking Neural Network as Adaptive Event Stream Slicer (NeurIPS'24)** |
|
|
|
|
|
*Abstract:* |
|
|
Event-based cameras are attracting significant interest as they provide rich edge information, high dynamic range, and high temporal resolution. Many state-of-the-art event-based algorithms rely on splitting the events into fixed groups, resulting in the omission of crucial temporal information, particularly when dealing with diverse motion scenarios (e.g., high/low speed). In this work, we propose SpikeSlicer, a novel plug-and-play event processing method capable of splitting event streams adaptively. SpikeSlicer utilizes a low-energy spiking neural network (SNN) to trigger event slicing. To guide the SNN to fire spikes at optimal time steps, we propose the Spiking Position-aware Loss (SPA-Loss) to modulate the neuron's state. Additionally, we develop a Feedback-Update training strategy that refines the slicing decisions using feedback from the downstream artificial neural network (ANN). Extensive experiments demonstrate that our method yields significant performance improvements in event-based object tracking and recognition. Notably, SpikeSlicer provides a brand-new SNN-ANN cooperation paradigm, where the SNN acts as an efficient, low-energy data processor that assists the ANN in improving downstream performance, injecting new perspectives and opening potential avenues of exploration.
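To make the core idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation; see the linked code repository for the real one) of adaptive event-stream slicing driven by a single leaky integrate-and-fire (LIF) neuron: the neuron integrates per-bin event counts and a slice is closed whenever it fires. All names and parameters below are illustrative assumptions.

```python
# Hypothetical sketch of SNN-triggered adaptive event slicing.
# A single LIF neuron integrates the event count of each time bin;
# when its membrane potential crosses the threshold it "fires",
# which closes the current slice. Dense bursts therefore yield
# short slices, sparse periods yield long ones.

def slice_events(timestamps, bin_size=10.0, tau=0.9, threshold=5.0):
    """Group sorted event timestamps into adaptive slices.

    timestamps: sorted event times (arbitrary units).
    bin_size:   temporal width of one integration step.
    tau:        membrane leak factor per step (0 < tau < 1).
    threshold:  firing threshold on the membrane potential.
    """
    if not timestamps:
        return []
    slices, current = [], []
    v = 0.0                     # membrane potential
    t_bin = timestamps[0]       # left edge of the current time bin
    i = 0
    while i < len(timestamps):
        # Collect all events falling into the current time bin.
        count = 0
        while i < len(timestamps) and timestamps[i] < t_bin + bin_size:
            current.append(timestamps[i])
            count += 1
            i += 1
        # LIF update: leak, then integrate the event count as input current.
        v = tau * v + count
        if v >= threshold:      # spike -> close the slice, reset potential
            slices.append(current)
            current = []
            v = 0.0
        t_bin += bin_size
    if current:                 # flush the trailing partial slice
        slices.append(current)
    return slices
```

In this toy model, a burst of six events inside one bin immediately crosses the threshold and closes a slice, while isolated events leak away and accumulate into a single long slice, which is the adaptive behavior the paper's SNN slicer provides.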
|
|
|
|
|
<p align="left"> |
|
|
<img src="spikeslicer.gif" alt="Logo" width="80%"> |
|
|
</p> |
|
|
|
|
|
More details can be found at [paper](https://arxiv.org/pdf/2410.02249) and [code](https://github.com/AndyCao1125/SpikeSlicer). |
|
|
|
|
|
We provide example test logs and weights in this Hugging Face hub:
|
|
* Logs: `TransT_baseline_test_log.zip`, `TransT_SpikeSlicer_Small_test_log.zip`, `TransT_SpikeSlicer_Base_test_log.zip`
* Weights: `TransT_baseline_pth`, `TransT_SpikeSlicer_Small_pth`, `TransT_SpikeSlicer_Base_pth`