Xinjiang Agricultural Sciences, 2018, Vol. 55, Issue (3): 548-555. DOI: 10.6048/j.issn.1001-4330.2018.03.018


Research on Area Information Extraction of Cotton Field Based on UAV Visible Light Remote Sensing

LI Lu-man1, GUO Peng1, ZHANG Guo-shun2, ZHOU Qian1, WU Suo-zhi1

  1. College of Science, Shihezi University, Shihezi, Xinjiang 832003, China;
  2. College of Information Science & Technology, Shihezi University, Shihezi, Xinjiang 832000, China
  • Online: 2018-03-20  Published: 2018-06-28
  • Corresponding author: GUO Peng (1981- ), male, from Lixin, Anhui; Ph.D., senior experimentalist and master's supervisor; research interest: applications of remote sensing technology. E-mail: gp163@163.com
  • About the first author: LI Lu-man (1994- ), female, from Cangzhou, Hebei; research interest: applications of remote sensing technology. E-mail: 13126709892@163.com
  • Funding: International Science and Technology Cooperation Program of China (2015DFA11660); National Undergraduate Innovation and Entrepreneurship Training Program (201710759064); Student Research Training Program (SRP2017212)


Abstract: 【Objective】To address the relative backwardness of traditional methods for extracting cotton planting information over large areas, this study applies an object-oriented image analysis method to extract cotton planting information from visible light imagery acquired in a UAV remote sensing experiment, with the aim of providing a new approach to large-area farmland information extraction and improving the speed and accuracy of classification.【Method】A Gemini MyFlyDream MTD fixed-wing UAV carrying a Canon EF-M 18-55 camera was used to acquire visible light imagery of the 135th Regiment of the Eighth Division, Xinjiang Production and Construction Corps. On the eCognition software platform, cotton planting information in the study area was extracted with the object-oriented method.【Result】The cotton planting area extracted by visual interpretation was 0.35 km², and that extracted by the object-oriented approach was 0.33 km², corresponding to a classification accuracy of 94.29% and an error of 5.71%; the method can therefore effectively extract cotton planting information in the study area.【Conclusion】Compared with the traditional pixel-based classification method, the object-oriented classification method achieves higher extraction accuracy and results much closer to those of visual interpretation.
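The accuracy and error figures follow directly from the two area estimates. The sketch below (illustrative Python, not the authors' code; variable names are made up) shows the arithmetic, assuming accuracy is defined as the ratio of the object-oriented area to the visual-interpretation reference area:

# Consistency check of the reported figures (illustrative only).
visual_area = 0.35      # km², cotton area from visual interpretation (reference)
extracted_area = 0.33   # km², cotton area from object-oriented extraction

accuracy = extracted_area / visual_area   # 0.33 / 0.35 ≈ 0.9429
error = 1.0 - accuracy                    # ≈ 0.0571
print(f"accuracy = {accuracy:.2%}, error = {error:.2%}")   # 94.29%, 5.71%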


Key words: eCognition; object-oriented; UAV visible light remote sensing; cotton; planting information
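The object-oriented workflow in the paper is built in eCognition, which is proprietary; the following is a minimal open-source analogue (Python with scikit-image) intended only to illustrate the two steps the abstract describes: segment the image into objects, then classify each object from its spectral features. The file name, segmentation parameters, greenness threshold, and ground sampling distance are hypothetical, and SLIC superpixels stand in for eCognition's multiresolution segmentation.

# Minimal open-source analogue of object-based cotton extraction from UAV RGB imagery.
# Not the authors' eCognition rule set; all parameters below are hypothetical.
import numpy as np
from skimage import io, segmentation

img = io.imread("uav_rgb_orthomosaic.tif")[:, :, :3]   # hypothetical file name

# 1. Segmentation: group pixels into homogeneous objects (SLIC superpixels here,
#    standing in for eCognition's multiresolution segmentation).
objects = segmentation.slic(img, n_segments=2000, compactness=10, start_label=1)

# 2. Per-object classification: mean excess-green index (2G - R - B) as a simple
#    visible-band vegetation feature; objects above the threshold are labelled cotton.
cotton_mask = np.zeros(objects.shape, dtype=bool)
for label in np.unique(objects):
    region = objects == label
    r, g, b = img[region].mean(axis=0)
    if 2 * g - r - b > 20:                              # illustrative threshold
        cotton_mask[region] = True

# 3. Area: cotton pixel count times the squared ground sampling distance (metres).
gsd = 0.05                                              # hypothetical GSD, m/pixel
area_km2 = cotton_mask.sum() * gsd ** 2 / 1e6
print(f"estimated cotton area: {area_km2:.3f} km²")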




