
Please use this identifier to cite or link to this item: https://elib.bsu.by/handle/123456789/274193
Title: Object detection for unmanned aerial vehicle camera via convolutional neural networks
Authors: Saetchnikov, Ivan V.
Tcherniavskaia, Elina A.
Skakun, Victor V.
Subject: BSU Electronic Library::NATURAL AND EXACT SCIENCES
Issue date: Jun-2021
Citation: IEEE Journal on Miniaturization for Air and Space Systems
Abstract: Object tracking and image segmentation have recently gained particular significance in satellite and aerial imagery. The latest achievements in this field are closely related to the application of deep-learning algorithms and, in particular, convolutional neural networks (CNNs). Given a sufficient amount of training data, CNNs outperform classical methods based on Viola-Jones detectors or support vector machines. However, applying CNNs to object detection in aerial images faces several general issues that cause classification errors. The first is related to the limited camera shooting angle and spatial resolution. The second arises from the restricted dataset for specific classes of objects that rarely appear in the captured data. This article presents a comparative study of the effectiveness of different deep neural networks for detecting objects with similar patterns in images, given a limited amount of pretraining data. It has been revealed that the YOLO ver. 3 network achieves better accuracy and faster analysis than the region convolutional neural network (R-CNN), Fast R-CNN, Faster R-CNN, and SSD architectures. This has been demonstrated on the "Stanford dataset," "DOTA v-1.5," and "xView 2018 Detection" datasets. The following accuracy metrics have been obtained for the YOLO ver. 3 network: 89.12 mAP (Stanford dataset), 80.20 mAP (DOTA v-1.5), and 78.29 mAP (xView 2018) on testing; and 85.51 mAP (Stanford dataset), 79.28 mAP (DOTA v-1.5), and 79.92 mAP (xView 2018) on validation, with an analysis speed of 26.82 frames/s.
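The mAP figures reported in the abstract rest on intersection-over-union (IoU) matching between predicted and ground-truth boxes, followed by a per-class average-precision (AP) computation over the ranked detections. A minimal Python sketch of that standard calculation (box format `(x1, y1, x2, y2)` and a 0.5 IoU threshold are assumptions for illustration; this is not the authors' evaluation code):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def average_precision(detections, gt_boxes, iou_thr=0.5):
    """AP for one class on one image set.

    detections: list of (confidence, box); each ground-truth box
    may be matched by at most one detection (highest-confidence first).
    """
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    matched = set()
    hits = []  # 1 = true positive, 0 = false positive, in rank order
    for _, box in detections:
        best_iou, best_i = 0.0, -1
        for i, gt in enumerate(gt_boxes):
            ov = iou(box, gt)
            if i not in matched and ov > best_iou:
                best_iou, best_i = ov, i
        if best_iou >= iou_thr:
            matched.add(best_i)
            hits.append(1)
        else:
            hits.append(0)
    # Area under the precision-recall curve (all-points interpolation).
    ap, tp_cum, prev_recall = 0.0, 0, 0.0
    for k, h in enumerate(hits):
        tp_cum += h
        recall = tp_cum / len(gt_boxes)
        precision = tp_cum / (k + 1)
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap
```

mAP is then the mean of these per-class AP values; the paper's 89.12 mAP on the Stanford dataset corresponds to an average of 0.8912 across its object classes.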
URI: https://elib.bsu.by/handle/123456789/274193
ISSN: 2576-3164
DOI: 10.1109/JMASS.2020.3040976
License: info:eu-repo/semantics/openAccess
Appears in collections: Article archive

Full text:
File: 2_5404845016323461724.pdf (378.2 kB, Adobe PDF)



All items in the Electronic Library are protected by copyright, with all rights reserved.