Created at 6am, Apr 5
Ms-RAG · Artificial Intelligence
A Methodology to Study the Impact of Spiking Neural Network Parameters considering Event-Based Automotive Data
Contract ID: 2uSyO-N-J6VFF3zNH-d5xOY9vN5grxi3BDp3Yuw35hE
File Type: PDF
Entry Count: 54
Embed. Model: jina_embeddings_v2_base_en
Index Type: hnsw

Iqra Bano, Rachmad Vidya Wicaksana Putra, Alberto Marchisio, Muhammad Shafique

Abstract—Autonomous Driving (AD) systems are considered the future of human mobility and transportation. Solving computer vision tasks such as image classification and object detection/segmentation with high accuracy and low power/energy consumption is essential to realizing AD systems in real life. These requirements can potentially be satisfied by Spiking Neural Networks (SNNs). However, state-of-the-art work on SNN-based AD systems still focuses on proposing network models that achieve high accuracy, and has not systematically studied the roles of SNN parameters when learning event-based automotive data. Therefore, we still lack an understanding of how to effectively develop SNN models for AD systems. Toward this, we propose a novel methodology to systematically study and analyze the impact of SNN parameters considering event-based automotive data, then leverage this analysis to enhance SNN developments. To do this, we first explore different settings of the SNN parameters that directly affect the learning mechanism (i.e., batch size, learning rate, neuron threshold potential, and weight decay), then analyze the accuracy results. Afterward, we propose techniques that jointly improve SNN accuracy and reduce training time. Experimental results show that our methodology improves SNN models for AD systems over the state-of-the-art, as it achieves higher accuracy (i.e., 86%) on the NCARS dataset, and it can also achieve iso-accuracy (i.e., ∼85% with a standard deviation of less than 0.5%) while speeding up training time by 1.9x. In this manner, our research provides a set of guidelines for SNN parameter enhancements, thereby enabling the practical development of SNN-based AD systems.

For instance, if we investigate the impact of B = 10, then we perform experiments with the following setting: B = 10, lr = 1e-3, and Vth = 0.4. Note that the experimental results and the related discussion for this exploration are provided in Sections V-A through V-D.
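The one-factor-at-a-time exploration described above can be sketched in plain Python. The default setting (B = 40, lr = 1e-3, Vth = 0.4, wdecay = 0) follows the reference configuration quoted in this entry; the candidate value lists and the helper function itself are illustrative assumptions, not taken from the paper:

```python
# One-factor-at-a-time exploration: vary a single SNN parameter while
# keeping the others fixed at the reference default setting.
DEFAULTS = {"B": 40, "lr": 1e-3, "Vth": 0.4, "wdecay": 0.0}

# Candidate values per parameter (illustrative ranges, not from the paper).
CANDIDATES = {
    "B": [10, 20, 40, 80],
    "lr": [1e-2, 1e-3, 1e-4],
    "Vth": [0.2, 0.4, 0.8],
    "wdecay": [0.0, 1e-4, 1e-3],
}

def exploration_configs():
    """Yield one experiment configuration per candidate value,
    with all other parameters held at their defaults."""
    for param, values in CANDIDATES.items():
        for value in values:
            cfg = dict(DEFAULTS)
            cfg[param] = value
            yield cfg

configs = list(exploration_configs())
```

Each emitted configuration (e.g., `{"B": 10, "lr": 1e-3, "Vth": 0.4, "wdecay": 0.0}` when investigating B = 10) would then be handed to the training pipeline, so accuracy differences can be attributed to the single varied parameter.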
id: 0162df302c67af9daae03bae70578891 - page: 4
C. Parameter Enhancements for Improving the SNN Learning Quality

This step aims at improving the SNN learning quality, i.e., increasing the accuracy and/or reducing the training time. To do this, we first analyze the experimental results from the previous exploration step to identify the effective value for each parameter. The effective parameter value is typically the one that leads to high accuracy under a relatively short training time. Therefore, our strategy is to perform parameter enhancements by tuning the selected parameters to follow the effective values indicated by the experimental results. Note that the experimental results and the related discussion for the SNN parameter enhancements are provided in Sections V-E and V-F.
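The selection rule above ("high accuracy under a relatively short training time") can be made concrete with a small helper. The tolerance-based tie-breaking below is one plausible interpretation, not the paper's stated procedure; the function name and the 0.5-point tolerance are assumptions for illustration:

```python
def effective_value(results, acc_tolerance=0.5):
    """Pick the parameter value whose accuracy is within `acc_tolerance`
    percentage points of the best observed accuracy, preferring the
    shortest training time among those candidates.

    `results`: list of (value, accuracy_percent, train_time_seconds) tuples.
    """
    best_acc = max(acc for _, acc, _ in results)
    candidates = [r for r in results if r[1] >= best_acc - acc_tolerance]
    # Among near-best accuracies, choose the fastest-to-train value.
    return min(candidates, key=lambda r: r[2])[0]
```

For example, given results `[(10, 84.0, 100), (20, 85.9, 140), (40, 86.0, 200)]`, the value 20 would be selected: its accuracy is within 0.5 points of the best (86.0) while training faster than the batch size that reached it.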
id: 853b448fbc3159b75bfec234e7d61616 - page: 4
IV. EVALUATION METHODOLOGY

Fig. 7 illustrates the experimental setup for evaluating our methodology. Here, we employ a Python-based implementation that runs on an Nvidia RTX 6000 Ada GPU machine, and the generated outputs are the accuracy (i.e., training and test) and the log of experiments (e.g., number of epochs and loss scores). The Python-based implementation is built using the numpy, torch, and torchvision library packages. We employ the SNN architecture described in Table I with the STBP learning rule, the NCARS dataset as the workload, and 200 epochs for the training phase. We consider the state-of-the-art work, i.e., CarSNN with the following default setting: B = 40, lr = 1e-3, Vth = 0.4, and wdecay = 0, as the reference comparison partner.

[Fig. 7 diagram: the SNN architecture, learning rule, parameter settings, and NCARS dataset feed a PyTorch-based training/testing implementation running on GPU, producing accuracy (.txt) and experiment-log (.txt) outputs.]
id: f52b997c3cac779f32d658c08b2c2830 - page: 4
Fig. 7. The experimental setup in our evaluation.

V. EXPERIMENTAL RESULTS AND DISCUSSION

A. Impact of the Batch Size
id: e91a8c8e72c0130cb0a6cbe4b3df2114 - page: 5
How to Retrieve?
# Search

curl -X POST "https://search.dria.co/hnsw/search" \
-H "x-api-key: <YOUR_API_KEY>" \
-H "Content-Type: application/json" \
-d '{"rerank": true, "top_n": 10, "contract_id": "2uSyO-N-J6VFF3zNH-d5xOY9vN5grxi3BDp3Yuw35hE", "query": "What is alexanDRIA library?"}'
# Query

curl -X POST "https://search.dria.co/hnsw/query" \
-H "x-api-key: <YOUR_API_KEY>" \
-H "Content-Type: application/json" \
-d '{"vector": [0.123, 0.5236], "top_n": 10, "contract_id": "2uSyO-N-J6VFF3zNH-d5xOY9vN5grxi3BDp3Yuw35hE", "level": 2}'
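The search call above can also be issued from Python using only the standard library. This sketch mirrors the curl example's endpoint, headers, and JSON payload exactly; the function name and the guarded send at the bottom are illustrative additions (a real call requires a valid API key and network access):

```python
import json
import urllib.request

API_URL = "https://search.dria.co/hnsw/search"
CONTRACT_ID = "2uSyO-N-J6VFF3zNH-d5xOY9vN5grxi3BDp3Yuw35hE"

def build_search_request(query, api_key, top_n=10, rerank=True):
    """Build the same POST request as the curl search example above."""
    payload = {
        "rerank": rerank,
        "top_n": top_n,
        "contract_id": CONTRACT_ID,
        "query": query,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Sending the request (uncomment with a real key):
# with urllib.request.urlopen(build_search_request("spiking neural networks", "<YOUR_API_KEY>")) as resp:
#     results = json.loads(resp.read())
```

The `/hnsw/query` endpoint follows the same pattern, with a `"vector"` list and `"level"` field in the payload instead of `"query"` and `"rerank"`, as shown in the curl example.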