Title: Depth-Sensitive Mean-Shift Method for Head Tracking
Authors: Zhang, Ning; Yang, Yang; Liu, Yun-Xia
Corresponding Author: Yang, Yang
Affiliations: [Zhang, Ning; Yang, Yang] Shandong Univ, Sch Informat Sci & Engn, Jinan 250100, Peoples R China.; [Liu, Yun-Xia] Shandong Univ, Sch Control Sci & En …
Conference: 12th International Conference on Intelligent Computing (ICIC)
Conference Dates: AUG 02-05, 2016
Source: INTELLIGENT COMPUTING METHODOLOGIES, ICIC 2016, PT III
Year: 2016
Volume: 9773
Pages: 753-764
DOI: 10.1007/978-3-319-42297-8_70
Keywords: Depth image; Mean-shift; Tracking; Kinect2.0
Abstract: Target tracking is one of the most fundamental applications in computer vision and has attracted wide attention in recent years. To the best of our knowledge, most research to date has focused on tracking in 2D images, using methods such as Tracking-Learning-Detection (TLD), the particle filter, and the Mean-shift algorithm. With advances in sensor technology and falling costs, 3D information acquired by laser scanners, Kinect sensors, and similar devices can now be used for target tracking. As a new form of data, depth information not only provides the spatial position of the target but also protects privacy and is insensitive to illumination changes. In this paper, a depth-sensitive Mean-shift tracking method is proposed that uses depth information to estimate the range of a person's movement, effectively improving tracking efficiency and accuracy. Moreover, it adjusts the kernel bandwidth to match the target size according to the distance between the target and the depth camera. In the designed system, the Kinect 2.0 sensor not only captures the depth data used for tracking but can also be rotated flexibly by a steering gear during tracking. Experimental results show that these improvements make the Mean-shift algorithm more robust and accurate under illumination changes during tracking, and that real-time tracking is achieved.
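The two ideas the abstract names, standard Mean-shift mode seeking plus a kernel bandwidth that shrinks as the target moves farther from the depth camera, can be illustrated with a minimal sketch. This is not the authors' implementation: the flat-kernel Mean-shift below and the inverse-depth bandwidth rule (with illustrative constants `h0` and `ref_depth_mm`) are generic assumptions, not details taken from the paper.

```python
import numpy as np

def adaptive_bandwidth(depth_mm, h0=40.0, ref_depth_mm=2000.0):
    """Scale the kernel bandwidth inversely with target depth: a target
    twice as far away appears roughly half as large in the image.
    h0 (pixels) and ref_depth_mm are illustrative values, not from the paper."""
    return h0 * ref_depth_mm / max(depth_mm, 1.0)

def mean_shift(points, start, bandwidth, max_iter=50, tol=1e-3):
    """Plain Mean-shift mode seeking with a flat kernel: repeatedly move the
    window center to the mean of the points falling inside the window."""
    center = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        dists = np.linalg.norm(points - center, axis=1)
        in_window = points[dists < bandwidth]
        if len(in_window) == 0:  # window drifted off the data
            break
        new_center = in_window.mean(axis=0)
        if np.linalg.norm(new_center - center) < tol:
            return new_center
        center = new_center
    return center

# Usage: track a synthetic point cluster; the window size comes from depth.
rng = np.random.default_rng(0)
target_pixels = rng.normal([100.0, 120.0], 3.0, size=(200, 2))
h = adaptive_bandwidth(depth_mm=2000.0)          # reference depth -> h0
center = mean_shift(target_pixels, [90.0, 110.0], h)
```

Here the window converges to the cluster center near (100, 120), and halving the distance to the camera doubles the bandwidth, which is the size-adaptation behavior the abstract describes.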
Indexed In: CPCI-S; EI; SCOPUS
Resource Type: Conference Paper; Journal Article
Link: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84978803752&doi=10.1007%2f978-3-319-42297-8_70&partnerID=40&md5=e28bf5665d1568f37f6101fe35bd4807