Volume 20 No 13 (2022)
DEEP NEURAL NETWORK APPROACH FOR VEHICLE IDENTIFICATION IN INTELLIGENT TRANSPORTATION SYSTEMS
K. M. N. Syed Ali Fathima1*, Dr. K. Merriliance
Abstract
Road accidents can be reduced by detecting and tracking vehicles. Vehicle detection locates the objects in a frame and reports both their positions and their classes. Since researchers have shown region-based object detection to be effective, we propose a region-based deep learning network for vehicle detection; both Faster R-CNN and Grid R-CNN employ region-based learning, and the proposed method can detect multiple vehicles in a scene. Vehicle classification is a primary task of an intelligent transportation system (ITS). This study proposes classifying vehicles by their viewpoint using a Histogram of Oriented Gradients (HOG) descriptor with an SVM classifier, evaluated on numerous samples. The scheme can assist law enforcement both in protecting vehicles from theft and in identifying them in recorded video surveillance. Although the SVM is accurate, manually reviewing large volumes of surveillance video and images is slow and fatiguing, and staffing such review is expensive. A CNN is therefore compared against SVM, Decision Tree, Random Forest, and related classifiers for vehicle-view classification; the proposed method improves viewpoint classification, distinguishing front, back, and side views of a vehicle, with the CNN classifying the vehicle images. According to the evaluation, DeepVehicleNet can function effectively in practical transportation and autonomous-driving systems. Vehicle re-identification also benefits ITS applications, so this study proposes the SCRM Network to increase the accuracy of vehicle re-identification. The Scale-Integrated Feature Mapping framework of this network extracts salient channel features. The proposed work performs better than earlier approaches.
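The HOG + SVM viewpoint-classification idea summarized above can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the images are synthetic stand-ins (brightness gradients in different directions substituting for real front/back/side appearance differences), and the feature and classifier settings are assumed defaults from scikit-image and scikit-learn rather than the authors' configuration.

```python
# Sketch: classify vehicle "viewpoints" with HOG features and a linear SVM.
# The three synthetic classes mimic front/back/side views via gradient
# direction; real use would load labeled vehicle images instead.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def hog_features(img):
    # 64x64 grayscale image -> HOG descriptor vector
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def make_image(view):
    # Synthetic stand-in image: noise plus a directional brightness ramp.
    base = rng.random((64, 64)) * 0.1
    ramp = np.linspace(0.0, 1.0, 64)
    if view == 0:          # "front": horizontal gradient
        base += ramp[None, :]
    elif view == 1:        # "back": vertical gradient
        base += ramp[:, None]
    else:                  # "side": diagonal gradient
        base += (ramp[None, :] + ramp[:, None]) / 2.0
    return base

# Build a small labeled set: 30 images per viewpoint class.
y = np.array([i % 3 for i in range(90)])
X = np.array([hog_features(make_image(v)) for v in y])

clf = LinearSVC().fit(X, y)          # linear SVM on HOG descriptors
acc = (clf.predict(X) == y).mean()   # training accuracy on the synthetic set
```

Because the synthetic classes differ only in gradient orientation, which is exactly what HOG encodes, a linear SVM separates them easily; real vehicle views require far more training data and careful normalization.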
Keywords
Grid RCNN, DeepVehicleNet, SCRM, CBCL, KITTI, VERI, VRIC
Copyright
Copyright © Neuroquantology
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Articles published in the Neuroquantology are available under Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant IJECSE right of first publication under CC BY-NC-ND 4.0. Users have the right to read, download, copy, distribute, print, search, or link to the full texts of articles in this journal, and to use them for any other lawful purpose.