
Title: Optimizing Hardware Resources for Low-Power Binary Neural Networks Using Approximate Bitwise Operation
Authors: Dongchan Lee; Youngmin Kim
DOI: https://doi.org/10.5573/IEIESPC.2025.14.6.825
Pages: 825-836
ISSN: 2287-5255
Keywords: Neural network; Binary neural network; FPGA; Approximate; Accumulator
Abstract: Artificial neural networks have recently been widely used in image classification, object detection, and character recognition. However, the amount of training and computation required to achieve high accuracy has increased rapidly, and as a result, a bottleneck phenomenon has intensified. To solve this problem, research is being conducted on approaches such as reducing model weights and optimizing computations. Binary neural networks are receiving significant attention for field-programmable gate array (FPGA)-based designs owing to their high computational efficiency and low power consumption. In this paper, we propose a binary neural network with an FPGA-based low-power accumulator. Based on the hardware resource consumption of each layer, operations are optimized by targeting the more hardware-intensive layers. In addition, we propose a new method of operating the accumulator that adds the existing operation results during the learning process in the neural network. As a result, a binary neural network using the optimized accumulator reduces power consumption by up to 55% compared with the previous network, and other hardware usage is also reduced by 27%. Nevertheless, the delay time remains constant, and the accuracy remains at 90%.
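For background on why binary neural networks map so efficiently to FPGA logic, the sketch below illustrates the standard XNOR-popcount dot product that BNN accumulators sum over. This is general BNN background only, not the paper's specific approximate-accumulator design; the function name and bit-packing scheme are illustrative assumptions.

```python
# Illustrative sketch: the XNOR-popcount dot product commonly used in
# binary neural networks. Vectors over {-1, +1} are packed into bitmasks
# (bit i set when element i is +1), so a multiply-accumulate reduces to
# one XNOR plus a popcount -- the operation a BNN accumulator totals up.

def binary_dot(activations, weights):
    """Dot product of two {-1, +1} vectors via XNOR and popcount.

    For n elements: dot = matches - mismatches = 2 * popcount(XNOR) - n.
    """
    n = len(activations)
    a = sum(1 << i for i, v in enumerate(activations) if v == 1)
    w = sum(1 << i for i, v in enumerate(weights) if v == 1)
    mask = (1 << n) - 1
    xnor = ~(a ^ w) & mask           # bit set wherever the signs agree
    matches = bin(xnor).count("1")   # popcount of agreeing positions
    return 2 * matches - n           # accumulate: matches minus mismatches

# Example: [-1, +1, +1, -1] . [+1, +1, -1, -1] = -1 + 1 - 1 + 1 = 0
print(binary_dot([-1, 1, 1, -1], [1, 1, -1, -1]))  # -> 0
```

On an FPGA this maps to LUT-based XNOR gates feeding a popcount adder tree, which is why optimizing the accumulator (as the paper proposes) dominates the power and resource budget.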