
Journal of Chinese Agricultural Mechanization ›› 2023, Vol. 44 ›› Issue (1): 172-177. DOI: 10.13733/j.jcam.issn.2095-5553.2023.01.024

• Agricultural Informatization Engineering •

Image segmentation of grape clusters based on improved red-green difference and Otsu

Zhou Wenjing1, Zhao Kang2, Ma Xiaoxiao1, Tian Zhifang1

  1. School of Information Science and Engineering, Xinjiang Institute of Science and Technology, Korla, Xinjiang 841000, China;
    2. College of Mechanical and Electrical Engineering, Shihezi University, Shihezi, Xinjiang 832003, China
  • Online: 2023-01-15   Published: 2023-01-18
  • Supported by:
    Science and Technology Research Program of Bayingolin Mongol Autonomous Prefecture (202201); Doctoral Postgraduate Research Innovation Program of Xinjiang Uygur Autonomous Region (XJ2020G080); College Student Innovation and Entrepreneurship Training Program (S202113561015)

Image segmentation of grape clusters based on improved red-green difference and Otsu

Zhou Wenjing, Zhao Kang, Ma Xiaoxiao, Tian Zhifang

  • Online: 2023-01-15   Published: 2023-01-18

Abstract: To address the low accuracy of traditional image segmentation methods when segmenting grape cluster images in complex field environments, an in-field grape cluster image segmentation method based on an improved red-green difference and the Otsu algorithm is proposed. The RGB color space, which is close to human vision, is selected; the histograms of the R and G feature maps are extracted and analyzed; on that basis the feature maps are point-multiplied and an Otsu operation is applied, followed by morphological processing, to segment grape cluster images in the field environment. The results are compared with those of the grayscale map, the (R-G) feature map, and the (R-G)/(R+G) feature map, each segmented with the maximum between-class variance threshold method (Otsu). The experiments show that the red-green difference point-multiplication Otsu segmentation method performs best, with an accuracy of 92.37% and an F1-score of 90.13%. Fifty images were tested, with a highest per-image accuracy of 97%, a lowest of 79%, and an average accuracy of 88.75%. The proposed method achieves relatively complete segmentation of grape clusters and provides a research basis for grape cluster recognition and localization.

Keywords: red-green difference, Otsu, grape cluster, image segmentation, feature map
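The processing pipeline described in the abstract above (R and G channel extraction, point-multiplication of the difference feature map, Otsu thresholding, and morphological post-processing) can be sketched with OpenCV and NumPy as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the page does not give the exact form of the improved red-green difference feature, so the element-wise product of the (R-G) map with the R channel is an assumed stand-in, and the input/output file names are hypothetical.

```python
# Hedged sketch of the abstract's pipeline: improved red-green difference
# feature map -> Otsu thresholding -> morphological post-processing.
# The exact feature formulation below is an assumption, not the paper's.
import cv2
import numpy as np


def segment_grape_cluster(bgr_image: np.ndarray) -> np.ndarray:
    """Return a binary mask of (assumed red) grape cluster pixels."""
    b, g, r = cv2.split(bgr_image.astype(np.float32))

    # Red-green difference emphasizes red fruit against green leaves.
    diff = np.clip(r - g, 0, None)

    # Assumed "point-multiplication" step: element-wise product of the
    # difference map with the R channel, rescaled to 8-bit for Otsu.
    feature = cv2.normalize(diff * r, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # Otsu picks the threshold that maximizes between-class variance.
    _, mask = cv2.threshold(feature, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Morphological opening/closing removes speckles and fills small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask


if __name__ == "__main__":
    image = cv2.imread("grape_cluster.jpg")   # hypothetical input path
    if image is not None:
        cv2.imwrite("grape_mask.png", segment_grape_cluster(image))
```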

Abstract: To address the low accuracy of traditional image segmentation methods when segmenting grape cluster images in complex field environments, an in-field grape cluster image segmentation method based on an improved red-green difference and the Otsu algorithm is proposed. The RGB color space, which is close to human vision, is selected; the histograms of the R and G feature maps are extracted and analyzed; the feature maps are then point-multiplied and an Otsu operation is performed to segment red grape cluster images in the field environment. The results are compared with those of the grayscale map, the (R-G) feature map, and the (R-G)/(R+G) feature map, each segmented with the maximum between-class variance threshold method (Otsu). The experiments show that the red-green difference point-multiplication Otsu segmentation method gives the best result, with an accuracy of 92.37% and an F1-score of 90.13%, and that morphological processing of the segmentation result achieves a more complete segmentation of the grape clusters. Fifty images were tested, of which the highest per-image accuracy was 97%, the lowest was 79%, and the average accuracy was 88.75%. The method proposed in this paper can provide a research basis for the recognition and positioning of grape clusters.

Key words: redgreen difference; Otsu, grape clusters, image segmentation, feature map

CLC number: