Computer vision based recognition of behavior phenotypes of laying hens
Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan www.asabe.org
Citation: Paper number 054002, 2005 ASAE Annual Meeting. (doi: 10.13031/2013.19471) ©2005
Authors: T. Leroy, E. Vranken, E. Struelens, B. Sonck, D. Berckmans
Keywords: Computer vision, image processing, poultry housing, behavior, classification
Automated surveillance of individual animal behavior, by means of low-cost cameras and
computer vision techniques, can generate continuous data that provide an objective
measure of behavior without disturbing the animals.
The purpose of this study was to develop an automatic computer vision technique
capable of continuously measuring the trajectory and rotation behavior of an individual laying
hen and of recognizing six behavior phenotypes (standing, sitting, sleeping, grooming,
scratching, pecking).
For this purpose, a model-based algorithm was developed, based on the fact that behavior can
be described as a time series of successive postures. The quantification of the hen's
posture consists of its position, orientation, and a set of shape parameters obtained by
fitting a point distribution model to the hen's outline. Applying this algorithm to successive images in
a video sequence yields a sequence of posture parameterizations that represents the hen's
behavior within that sequence. The time series of positions and orientations directly
quantifies the hen's trajectory and rotation behavior.
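As a rough illustration of this posture parameterization, the sketch below fits a point distribution model to a single outline, returning position, orientation, and shape parameters. The model inputs (a mean shape and PCA modes from aligned training outlines) and the alignment details are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def fit_pdm(outline, mean_shape, modes):
    """Return (position, orientation, shape_params) for one hen outline.

    outline    : (N, 2) array of landmark points along the hen's contour
    mean_shape : (2N,) mean of the aligned training outlines
    modes      : (2N, k) principal modes of shape variation (PCA eigenvectors)
    """
    # Position: centroid of the outline.
    position = outline.mean(axis=0)
    centered = outline - position

    # Orientation: angle of the dominant axis of the centered points.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    orientation = np.arctan2(vt[0, 1], vt[0, 0])

    # Rotate the outline into the model frame before projecting onto the modes.
    c, s = np.cos(-orientation), np.sin(-orientation)
    aligned = centered @ np.array([[c, -s], [s, c]]).T

    # Shape parameters: projection of the residual onto the PDM modes.
    shape_params = modes.T @ (aligned.ravel() - mean_shape)
    return position, orientation, shape_params
```

Applied frame by frame, this yields the time series of positions, orientations, and shape parameters described above.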
From the time series of shape parameters, the behavior is classified as one of the six
behavior phenotypes for which the system is trained. Training was done by modelling a
template for each behavior phenotype from training video sequences containing that
phenotype, labelled by a trained ethologist. For the classification, a template matching
technique was used: the behavior in a new video fragment is classified as the behavior type
whose template gives the best match.
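A minimal sketch of such a nearest-template classifier over the shape-parameter time series is given below. The templates dictionary (behavior name mapped to a reference series) and the simple resample-and-compare distance are illustrative assumptions, not the paper's exact matching procedure.

```python
import numpy as np

BEHAVIORS = ["standing", "sitting", "sleeping", "grooming", "scratching", "pecking"]

def resample(series, length):
    """Linearly resample a (T, k) series to (length, k) frames."""
    t_old = np.linspace(0.0, 1.0, len(series))
    t_new = np.linspace(0.0, 1.0, length)
    return np.stack([np.interp(t_new, t_old, series[:, j])
                     for j in range(series.shape[1])], axis=1)

def classify(fragment, templates, length=50):
    """Return the behavior whose template best matches the fragment.

    fragment  : (T, k) shape-parameter series for one video fragment
    templates : dict mapping behavior name -> (T_b, k) template series
    """
    frag = resample(np.asarray(fragment), length)
    distances = {name: np.linalg.norm(frag - resample(np.asarray(tpl), length))
                 for name, tpl in templates.items()}
    return min(distances, key=distances.get)
```

Resampling to a common length is one simple way to compare fragments of different duration; elastic alignment methods could be substituted without changing the overall nearest-template scheme.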
The system was tested on a set of over 14,000 video fragments of a single hen in a cage, each
fragment containing one of the six behavior types. Classification rates ranged from 70% to
96%, except for pecking (21%), which suffered from unreliable tracking of the chicken's head. The best
results were obtained for sleeping (96%) and standing (90%).