Fisher Motion Descriptor for Multiview Gait Recognition
Author
Castro, F.M.
Muñoz Salinas, Rafael
Guil, N.
Marín-Jiménez, M.J.
Date
2017
Subject
Gait recognition
Multiple viewpoints
Motion
Dense trajectories
Fisher vectors
Abstract
The goal of this paper is to identify individuals by analyzing their gait. Instead of using binary silhouettes
as input data (as done in many previous works), we propose and evaluate the use of motion descriptors based
on densely sampled short-term trajectories. We take advantage of state-of-the-art people detectors to define
custom spatial configurations of the descriptors around the target person, obtaining a rich representation of
the gait motion. The local motion features (described by the Divergence-Curl-Shear descriptor [1]) extracted
on the different spatial areas of the person are combined into a single high-level gait descriptor by using
the Fisher Vector encoding [2]. The proposed approach, coined Pyramidal Fisher Motion, is experimentally
validated on the `CASIA' dataset [3] (parts B and C), the `TUM GAID' dataset [4], the `CMU MoBo' dataset [5] and the
recent `AVA Multiview Gait' dataset [6]. The results show that this new approach achieves state-of-the-art
results in the problem of gait recognition, enabling the recognition of walking people from diverse viewpoints in
single- and multiple-camera setups, wearing different clothes, carrying bags, walking at diverse speeds and
not limited to straight walking paths.
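The aggregation step named in the abstract, Fisher Vector encoding of local motion descriptors under a diagonal-covariance GMM, can be sketched as below. This is a minimal illustration of the standard FV gradient formulas, not the authors' implementation; the descriptor dimensionality, number of Gaussians, and the omission of power/L2 normalization are simplifying assumptions.

```python
import numpy as np

def fisher_vector(descriptors, means, covs, priors):
    """Encode N local descriptors (N, D) into one Fisher Vector of
    length 2*K*D, using a K-component diagonal-covariance GMM.
    Sketch only: practical pipelines also apply power- and
    L2-normalization to the resulting vector."""
    N, D = descriptors.shape
    K = means.shape[0]
    # Soft-assignment posteriors gamma (N, K) under the GMM.
    diff = descriptors[:, None, :] - means[None, :, :]          # (N, K, D)
    log_p = (-0.5 * np.sum(diff**2 / covs[None], axis=2)
             - 0.5 * np.sum(np.log(2 * np.pi * covs), axis=1)
             + np.log(priors))
    log_p -= log_p.max(axis=1, keepdims=True)                   # stabilize
    gamma = np.exp(log_p)
    gamma /= gamma.sum(axis=1, keepdims=True)
    # Normalized gradients w.r.t. the GMM means and variances.
    sigma = np.sqrt(covs)                                       # (K, D)
    u = (np.einsum('nk,nkd->kd', gamma, diff / sigma[None])
         / (N * np.sqrt(priors)[:, None]))
    v = (np.einsum('nk,nkd->kd', gamma, diff**2 / covs[None] - 1.0)
         / (N * np.sqrt(2 * priors)[:, None]))
    return np.concatenate([u.ravel(), v.ravel()])               # (2*K*D,)

# Toy usage with hypothetical sizes (D=4 descriptors, K=3 Gaussians);
# in practice the GMM is fitted on training descriptors beforehand.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
means = rng.standard_normal((3, 4))
covs = np.abs(rng.standard_normal((3, 4))) + 0.5
priors = np.array([0.5, 0.3, 0.2])
fv = fisher_vector(X, means, covs, priors)
```

The resulting vector concatenates, per Gaussian, the mean- and variance-gradient statistics, so descriptors from the different spatial areas can each be encoded and then stacked into the final pyramidal representation.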