Non-destructive Assessment of White Striping in Broiler Breast Meat Using Structured-Illumination Reflectance Imaging with Deep Learning

Published by the American Society of Agricultural and Biological Engineers, St. Joseph, Michigan

Citation:  Journal of the ASABE. 66(6): 1437-1447. (doi: 10.13031/ja.15667) ©2023
Authors:   Ebenezer Olaniyi, Yuzhen Lu, Anuraj Theradiyil Sukumaran, Tessa Jarvis, Clinton Rowe
Keywords:   Deep learning, Imaging, Meat quality, Poultry, Structured illumination, White striping.


Broiler breast meat with white striping (WS) was imaged under sinusoidally modulated structured illumination.

Amplitude component (AC) images resolved WS characteristics better than direct component (DC) images did.

Models built on deep features classified meat samples into 2 and 3 classes according to WS severity.

The AC images consistently outperformed the DC images and achieved the best accuracy of 96.6% in 2-class modeling.

Abstract. Visual inspection is the prevailing practice in the industry for assessing white striping (WS) in broiler breast meat. However, this approach is subjective, laborious, and prone to error. Several studies have utilized imaging technology under uniform illumination; however, detecting defects such as WS remains challenging. This study investigated the efficacy of the emerging structured-illumination reflectance imaging (SIRI) technique combined with deep learning (DL) for WS assessment of broiler meat. Broiler fillets with varying degrees of WS were imaged using a custom-assembled monochromatic SIRI system (0.05-0.40 cycles/mm). The acquired SIRI pattern images were demodulated into direct component (DC) and amplitude component (AC) images at each spatial frequency. Pre-trained DL models, including two Very Deep Convolutional Networks (VGG16 and VGG19) and two Densely Connected Convolutional Networks (DenseNet121 and DenseNet201), were evaluated separately as fine-tuned end-to-end classifiers and as feature extractors to differentiate the meat samples. Fine-tuned VGG16 achieved the best 2-class and 3-class classification accuracies of 94.5% and 74.6%, respectively, based on the AC images at 0.30 cycles/mm, improving over the accuracies of the corresponding DC images by 10.4% and 8.9%, respectively. Fine-tuned VGG19 and DenseNet201 also achieved substantial improvements of AC over DC images, by 12% or higher. Linear discriminant analysis, in conjunction with principal component analysis for dimension reduction, yielded better accuracies of 96.6% in 2-class and 83.4% in 3-class classification, based on the deep features extracted by DenseNet121 from the AC images at 0.40 cycles/mm and 0.30 cycles/mm, respectively, representing improvements of 5.0% and 2.9% over the features of the corresponding DC images. The SIRI technique combined with DL is effective for differentiating between normal and WS-affected broiler meat.
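The abstract describes demodulating SIRI pattern images into DC and AC components. As a minimal sketch of how such demodulation is commonly done, assuming the standard three-phase-shift scheme (phase offsets of 0, 2π/3, and 4π/3; the paper's exact phase-shifting protocol is not stated here):

```python
import numpy as np

def demodulate_three_phase(i1, i2, i3):
    """Demodulate three phase-shifted sinusoidal pattern images into
    a direct component (DC, uniform-illumination equivalent) and an
    amplitude component (AC, depth-resolved contrast) image."""
    i1, i2, i3 = (np.asarray(a, dtype=float) for a in (i1, i2, i3))
    # DC: the average of the three patterns cancels the sinusoid.
    dc = (i1 + i2 + i3) / 3.0
    # AC: pairwise differences recover the modulation amplitude.
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i1 - i3) ** 2
    )
    return dc, ac
```

For a synthetic pattern I_k = A + B·cos(φ + 2πk/3), the routine returns DC = A and AC = B at every pixel, which is a quick sanity check of the formulas.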
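The best results came from linear discriminant analysis on PCA-reduced deep features. A minimal sketch of that PCA + LDA pipeline using scikit-learn, with synthetic features standing in for the DenseNet121 deep features (the feature dimensionality, component count, and class labels below are illustrative assumptions, not values from the paper):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# Hypothetical stand-in for deep features extracted from AC images;
# three classes mimic the 3-class WS-severity setting.
rng = np.random.default_rng(42)
n_per_class, n_features = 40, 1024
X = np.vstack([
    rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
    for c in range(3)
])
y = np.repeat([0, 1, 2], n_per_class)  # 0 = normal, 1 = moderate WS, 2 = severe WS

# PCA reduces the high-dimensional features before LDA classifies them.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X, y)
```

In practice the features would come from the penultimate layer of the pretrained CNN, and accuracy would be estimated on held-out samples rather than the training set.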
