Machine learning model better detects woody breast in chicken
Combining hyperspectral imaging with machine learning raised the accuracy of woody breast detection in chicken meat to 95%.
While “woody breast” can mean a chewier chicken sandwich for consumers, for the industry it can mean up to $200 million annual yield loss.
Work done by the Arkansas Agricultural Experiment Station is making woody breast easier to detect in chicken meat, with detection accurate up to 95% of the time.
The development could help improve quality assurance and customer confidence in one of Arkansas’ most economically important agricultural products.

What allows researchers to see inside the meat is a combination of a hyperspectral camera, which examines the meat through various energy wavelengths, and machine learning to interpret what the camera sees.
“We’ve been able to improve accuracy of detection of woody breast by utilizing machine learning to analyze complex data from images with a hyperspectral camera,” said Dongyi Wang, an assistant professor in the biological and agricultural engineering department for the experiment station, the research arm of the University of Arkansas System Division of Agriculture.
“The next step will be trying to integrate the system online and make this beneficial for stakeholders,” Wang said, noting that this specific application of image analysis had not been done before.
Loss in premium meat
Woody breast meat is harder and chewier than normal chicken breast, but it is still safe to eat, according to Casey Owens, professor of poultry processing and products for the Arkansas experiment station and a co-author of the study. When detected by processors, either by people or computer-assisted imaging technology, she said the meat is diverted from whole-breast packaging for further processing into products including chicken nuggets and patties.
The loss in premium as a whole-muscle product accounts for yield loss as high as $200 million in Arkansas and over $1 billion in direct and indirect costs annually across the U.S. poultry industry, Owens added. Up to 20% of chicken breast meat can have the defect, which is more common in larger birds of 8-9 lb. versus 6-7 lb. birds.
Hyperspectral imaging
Hyperspectral imaging is a rapid, non-invasive way to capture detailed data about objects and their composition. This data can be used to classify food products according to food quality, consumer preferences and other product requirements.
But hyperspectral images come with tons of data. That’s where machine learning comes in.
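To see why, consider the size of a single hyperspectral data cube, which stores a full spectrum for every pixel. The dimensions below are illustrative assumptions, not the camera used in the study:

```python
# Illustrative hyperspectral cube: height x width x spectral bands.
H, W, BANDS = 512, 512, 224     # hypothetical sensor geometry
BYTES_PER_VALUE = 2             # 16-bit readings, a common case

values = H * W * BANDS
megabytes = values * BYTES_PER_VALUE / 1e6
print(values, f"{megabytes:.0f} MB")  # 58720256 values, ~117 MB per image
```

Even at modest spatial resolution, each image holds tens of millions of values, which is why hand-crafted analysis gives way to machine learning.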
Chaitanya Pallerla, a food science graduate student who has been working on the project for the past two years with Wang as his adviser, said the new machine learning model is called NAS-WD. When correlated with known data about the “woodiness” of chicken breasts, the model allows for deeper and wider analysis of hyperspectral images to identify the defect.
“In hyperspectral imaging, there are common machine learning models being used, but we were able to develop a new model that could be well suited for correlating more than two variables,” Pallerla said. “We kind of took two different models, made a few changes and put them together to detect patterns better and correlate the hyperspectral data with hardness of the chicken meat.”
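The paper’s actual NAS-WD architecture is not reproduced here, but the general wide-deep, multi-task idea Pallerla describes, a linear “wide” path fused with a nonlinear “deep” path feeding both a classification head and a hardness-regression head, can be sketched. A minimal forward-pass illustration in NumPy; all layer sizes, weights, and names are assumptions, not the authors’ implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

N_BANDS = 224   # hypothetical number of spectral bands per pixel spectrum
HIDDEN = 64     # hypothetical hidden width for the deep branch
N_CLASSES = 3   # three woody-breast severity levels, as in the study

# Deep branch: a small MLP that can learn nonlinear spectral patterns.
W1 = rng.normal(0, 0.1, (N_BANDS, HIDDEN))
W2 = rng.normal(0, 0.1, (HIDDEN, HIDDEN))
# Wide branch: a direct linear shortcut from the raw spectrum.
W_wide = rng.normal(0, 0.1, (N_BANDS, HIDDEN))

# Two task heads share the fused representation (multi-task learning):
W_cls = rng.normal(0, 0.1, (HIDDEN, N_CLASSES))  # severity classification
W_reg = rng.normal(0, 0.1, (HIDDEN, 1))          # hardness regression

def forward(x):
    """x: (batch, N_BANDS) spectra -> (class logits, hardness estimate)."""
    deep = np.maximum(0, np.maximum(0, x @ W1) @ W2)  # ReLU MLP path
    wide = x @ W_wide                                 # linear path
    fused = deep + wide                               # combine both views
    return fused @ W_cls, fused @ W_reg

spectra = rng.normal(0, 1, (5, N_BANDS))  # five fake pixel spectra
logits, hardness = forward(spectra)
print(logits.shape, hardness.shape)  # (5, 3) (5, 1)
```

Training such a model jointly on both targets lets the correlation between spectral patterns and measured hardness inform the severity classification, which is the kind of multi-variable correlation Pallerla describes.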
The results of their research were published in the journal Artificial Intelligence in Agriculture under the title “Neural network architecture search enabled wide-deep learning (NAS-WD) for spatially heterogenous property aware chicken woody breast classification and hardness regression.”
The results showed that NAS-WD can classify three woody breast defect levels with an overall accuracy of 95%, outperforming traditional models like the Support Vector Machine and Multi-Layer Perceptron, which offered accuracy of 80% and about 73%, respectively.
Wang said the study provides an example of how to use new algorithms to mine data and dig into key information. The form of hyperspectral imaging used in the research is called “push broom,” which takes an image of several objects once every 40 seconds. In comparison, a more common industry method called “snapshot” takes an image of individual objects as fast as every 30 milliseconds. The snapshots have a lower resolution than the push broom method, but software upgrades may one day provide higher resolution for the snapshot images, Pallerla said.
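The throughput gap between the two imaging modes can be made concrete with a back-of-the-envelope calculation using only the frame times quoted above; the number of fillets per push-broom frame is an illustrative assumption:

```python
# Frame times quoted in the article.
PUSH_BROOM_S = 40.0   # seconds per image (several objects per frame)
SNAPSHOT_S = 0.030    # 30 milliseconds per image (one object per frame)

# Hypothetical: assume one push-broom frame covers 10 fillets at once.
FILLETS_PER_PUSH_BROOM_FRAME = 10

push_broom_rate = FILLETS_PER_PUSH_BROOM_FRAME / PUSH_BROOM_S  # fillets/s
snapshot_rate = 1 / SNAPSHOT_S                                 # fillets/s

print(f"push broom: {push_broom_rate:.2f} fillets/s")  # 0.25
print(f"snapshot:   {snapshot_rate:.1f} fillets/s")    # 33.3
```

Even under a generous assumption for the push-broom method, the snapshot mode is roughly two orders of magnitude faster, which is why it better suits processing-line speeds despite its lower resolution.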
Wang said his team is working on deploying this technology in a real-time system.
The study was supported in part by the Agriculture and Food Research Initiative (project award numbers 2023-70442-39232 and 2024-67022-42882) from the U.S. Department of Agriculture’s National Institute of Food and Agriculture.
The University of Arkansas System Division of Agriculture’s mission is to strengthen agriculture, communities and families by connecting trusted research to the adoption of best practices.