

Journal of Chinese Agricultural Mechanization, 2022, Vol. 43, Issue (2): 163-170. DOI: 10.13733/j.jcam.issn.20955553.2022.02.023


Research on weed classification method based on mobile lightweight model

Chen Qi1, Chen Cifa1, Deng Xiangwu2, Yuan Danfei1

  1. College of Computer and Information Technology, China Three Gorges University, Yichang, Hubei 443000, China;
  2. College of Electronic Information Engineering, Guangdong University of Petrochemical Technology, Maoming, Guangdong 525000, China

  • Online: 2022-02-15 Published: 2022-03-02
  • Funding:
    Talent Introduction and Doctoral Start-up Project of Guangdong University of Petrochemical Technology (2019rc044); National Youth Science Foundation Project (31801258)


Abstract: Research on weed classification has mostly focused on server-side models, which suffer from large size, heavy consumption of computing resources, slow inference, and limited accuracy. Taking the ResNet50 and MobileNetV3_large models as a basis, data augmentation and transfer learning are used for rapid training, class weights are set for a weighted Softmax loss function during training, and the high-accuracy server-side model is then used to guide and optimize the mobile model, yielding a lightweight model. The experimental results show that, compared with the mobile model MobileNetV3_large, the lightweight model in this paper improves recognition accuracy by 1.2% with little change in model size; compared with the server-side model ResNet50, accuracy is improved by 0.78%, average inference time per image is reduced by 7.8%, and model size is reduced by 80%. This study can provide a theoretical basis and technical support for the implementation and application of precision herbicide spraying for weeds.
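
A minimal sketch of the training objective described above, assuming a PyTorch implementation: a class-weighted Softmax cross-entropy on the hard labels combined with temperature-scaled knowledge distillation from the server-side ResNet50 teacher to the MobileNetV3_large student. This is not the authors' code; NUM_CLASSES, class_weights, T, and alpha are illustrative placeholders, and in practice the teacher would first be fine-tuned on the weed dataset before its soft targets are used.

```python
# Sketch only: weighted Softmax loss + knowledge distillation, as described in the abstract.
import torch
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 10                          # assumption: number of weed classes in the dataset
class_weights = torch.ones(NUM_CLASSES)   # assumption: set per class frequency to counter imbalance
T = 4.0                                   # distillation temperature (illustrative)
alpha = 0.7                               # mixing weight of the distillation term (illustrative)

# Teacher: high-accuracy server-side model, kept frozen during distillation.
# In the paper's setting it would already be fine-tuned on the weed images.
teacher = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
teacher.fc = torch.nn.Linear(teacher.fc.in_features, NUM_CLASSES)
teacher.eval()

# Student: mobile lightweight model, initialized from ImageNet weights (transfer learning).
student = models.mobilenet_v3_large(weights=models.MobileNet_V3_Large_Weights.IMAGENET1K_V1)
student.classifier[-1] = torch.nn.Linear(student.classifier[-1].in_features, NUM_CLASSES)

def distillation_loss(images: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Weighted cross-entropy on hard labels + KL divergence to the teacher's soft targets."""
    student_logits = student(images)
    with torch.no_grad():
        teacher_logits = teacher(images)
    # Class-weighted Softmax cross-entropy (the "weighted Softmax loss").
    hard_loss = F.cross_entropy(student_logits, labels, weight=class_weights)
    # Soft targets at temperature T; the T*T factor keeps gradient magnitudes
    # comparable across temperatures (standard distillation practice).
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Weighting the Softmax cross-entropy counters class imbalance among weed species, while the KL term transfers the teacher's soft class similarities to the lighter student, which is the model deployed on the mobile device.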

Key words: artificial intelligence, weed identification, lightweight convolution, loss function, knowledge distillation
