
Please use this identifier to cite or link to this item: http://10.10.120.238:8080/xmlui/handle/123456789/423
Full metadata record
DC Field                   Value                                              Language
dc.contributor.author      Bhat P.G.                                          en_US
dc.contributor.author      Subudhi B.N.                                       en_US
dc.contributor.author      Veerakumar T.                                      en_US
dc.contributor.author      Laxmi V.                                           en_US
dc.contributor.author      Gaur M.S.                                          en_US
dc.date.accessioned        2023-11-30T08:32:29Z                               -
dc.date.available          2023-11-30T08:32:29Z                               -
dc.date.issued             2020                                               -
dc.identifier.issn         1530437X                                           -
dc.identifier.other        EID(2-s2.0-85079603056)                            -
dc.identifier.uri          https://dx.doi.org/10.1109/JSEN.2019.2954331       -
dc.identifier.uri          http://localhost:8080/xmlui/handle/123456789/423   -
dc.description.abstract    In this article, a particle-filter-based tracking algorithm is proposed to track a target in video with vivid and complex environments. The target is represented in feature space by both a color distribution and KAZE features. The color distribution is selected for its robustness to the target's scale variation and partial occlusion; KAZE features are chosen for their ability to represent the target structure and for their superior performance in feature matching. Fusing these two features leads to more effective tracking than other features under challenging conditions, owing to their better representational abilities. The trajectory of the target is established using the particle filter algorithm, based on the similarity between the features extracted from the target and those of the probable candidates in consecutive frames. For the color distribution model, the Bhattacharyya coefficient is used as the similarity metric, whereas the Nearest Neighbor Distance Ratio is used for matching corresponding feature points in the KAZE algorithm. The particle filter update model is based on kinematic motion equations, and the particle weights are governed by an equation fusing both the color and KAZE features. Centre Location Error, Average Tracking Accuracy, and Tracking Success Rate are the performance metrics considered in the evaluation; the overlap success plot and precision plot are also considered. On the basis of these metrics and the visual results obtained under different environmental conditions (outdoor, occluding, and underwater), the proposed tracking scheme performs significantly better than contemporary feature-based iterative object-tracking methods and even a few learning-based algorithms. © 2001-2012 IEEE.    en_US
dc.language.iso            en                                                 en_US
dc.publisher               Institute of Electrical and Electronics Engineers Inc.   en_US
dc.source                  IEEE Sensors Journal                               en_US
dc.subject                 color                                              en_US
dc.subject                 fusion                                             en_US
dc.subject                 KAZE                                               en_US
dc.subject                 particle filter                                    en_US
dc.subject                 Visual tracking                                    en_US
dc.title                   Multi-Feature Fusion in Particle Filter Framework for Visual Tracking   en_US
dc.type                    Journal Article                                    en_US
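The abstract describes the core weighting step: each particle's weight is driven by a similarity score that fuses a color-histogram term (scored with the Bhattacharyya coefficient) with a KAZE feature-matching term. A minimal Python sketch of that step, assuming normalized color histograms and a simple convex combination for the fusion (the paper's exact fusion equation is not given in the abstract, and the helper names here are hypothetical):

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Quantize an RGB image patch into a normalized joint color histogram."""
    hist, _ = np.histogramdd(patch.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    hist = hist.flatten()
    return hist / hist.sum()

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

def particle_weights(color_sims, kaze_sims, alpha=0.5):
    """Hypothetical fusion: convex combination of the color similarity and the
    KAZE-matching similarity per particle, normalized to sum to one."""
    w = alpha * np.asarray(color_sims) + (1.0 - alpha) * np.asarray(kaze_sims)
    return w / w.sum()
```

A candidate patch identical to the target yields a Bhattacharyya coefficient of 1.0, and `particle_weights` then concentrates mass on the particles whose fused similarity is highest; the KAZE similarity would in practice come from a Nearest Neighbor Distance Ratio matching step, which is not sketched here.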
Appears in Collections: Journal Article

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.