Use of 3D imaging to predict retail yield on beef subprimals

Bibliographic Details
Main Author: Penados B., Esteban A.
Other Authors: Acosta, Adela
Format: Thesis
Language: English
Published: Zamorano: Escuela Agrícola Panamericana, 2025
Subjects:
Online Access: https://hdl.handle.net/11036/7820
Description
Summary: The classification of bovine carcasses in the United States meat industry has been fundamental to estimating animal yields. However, the traditional equations used to calculate these yields tend to overestimate and are limited in determining subprimal yields. To address these limitations, this study explored three-dimensional (3D) imaging with LiDAR sensors as a nondestructive way to develop yield prediction models for retail cuts. Three-dimensional images of the subprimals were captured with the Polycam application using the iPad LiDAR sensor and processed in MeshLab software to extract 25 independent variables, such as the area, perimeter, height, and length of the retail cuts. These variables were used in multiple linear regression models to predict subprimal weight, the number of steaks, and their individual weight and variation. The developed models reached an R² of up to 0.90 for predicting strip loin weight and 0.70 for rib roll weight. No accurate equations were obtained for trim weight because of its variation in fat content. The adoption of 3D imaging technology in the meat industry has significant potential to streamline evaluation processes: it generates real-time information, improving decision-making accuracy while minimizing errors and human involvement. Implementing these technologies could greatly enhance operational efficiency and planning.
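
To illustrate the modeling step described in the summary, the sketch below fits a multiple linear regression that predicts subprimal weight from a few mesh-derived measurements. This is not the thesis's actual code or data: the feature names (area, perimeter, height, length) are taken from the variables named in the abstract, the numeric values are hypothetical, and the study itself used 25 predictors.

```python
# Minimal sketch of the regression step, assuming hypothetical mesh measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical measurements for a handful of subprimals
# (columns: surface area cm^2, perimeter cm, height cm, length cm).
X = np.array([
    [1850.0, 120.5, 7.2, 38.0],
    [1710.0, 115.0, 6.8, 36.5],
    [1990.0, 126.3, 7.9, 40.2],
    [1620.0, 110.8, 6.5, 35.0],
    [1905.0, 123.0, 7.5, 39.1],
])
y = np.array([5.9, 5.4, 6.6, 5.1, 6.2])  # observed subprimal weights, kg

model = LinearRegression().fit(X, y)      # ordinary least squares fit
print("coefficients:", model.coef_)       # one slope per measurement
print("intercept:", model.intercept_)
print("R^2 on training data:", r2_score(y, model.predict(X)))

# Predict the weight of a new subprimal from its 3D-scan measurements.
new_cut = np.array([[1800.0, 119.0, 7.0, 37.5]])
print("predicted weight (kg):", model.predict(new_cut)[0])
```

In practice, the same pattern would be repeated for each response variable reported in the study (subprimal weight, steak count, individual steak weight and variation), with model accuracy judged by R² as in the results above.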