Research Article

Object Detection with Deep Learning using Embedded System in Unmanned Aerial Vehicles

Year 2022, Issue: 34, 292-298, 31.03.2022
https://doi.org/10.31590/ejosat.1081713

Abstract

Unmanned aerial vehicles (UAVs), also referred to as aerial robots, have recently been used extensively to collect data and imagery in civilian and military domains such as disaster management, traffic-density monitoring, and border security. Because the camera viewing angle varies, real-time object detection on images taken at high altitude from a UAV is challenging. This study aims to detect objects in images captured at different angles and under different conditions by a camera connected to the UAV over a CSI (Camera Serial Interface) module, using the Convolutional Neural Network-based SSD MobileNet model. The images obtained from the camera on the UAV were processed and classified on an NVIDIA Jetson Nano embedded computer. One of the main problems in real-time target detection is classifying moving objects captured at low resolution due to variable weather conditions and lighting. A solution is sought by matching closely related features between the learned images in the deep learning library and the images received from the camera. In this study, an algorithm implemented on the embedded system pre-processed the acquired images, and after feature comparison against the deep learning library, the outputs were evaluated using object bounding boxes and mAP (mean Average Precision). As a result, on real-time visual data acquired from the UAV, a final precision of 95.5% and a mAP of 69.45% were obtained for human detection, and a final precision of 83.4% and a mAP of 64.5% for vehicle detection.
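The mAP figures reported above aggregate per-class average precision (AP). As an illustrative sketch only (not the authors' evaluation code), AP for one class can be computed from confidence-ranked detections as the area under the precision-recall curve; the function name and the toy data below are hypothetical:

```python
import numpy as np

def average_precision(scores, is_true_positive, num_ground_truth):
    """AP for one class: area under the precision-recall curve.

    scores            -- detection confidences
    is_true_positive  -- 1 if the detection matched a ground-truth box, else 0
    num_ground_truth  -- total ground-truth objects of this class
    """
    order = np.argsort(scores)[::-1]                     # rank by confidence, descending
    tp = np.asarray(is_true_positive, dtype=float)[order]
    cum_tp = np.cumsum(tp)
    precision = cum_tp / np.arange(1, len(tp) + 1)       # precision after each detection
    recall = cum_tp / num_ground_truth                   # recall after each detection
    # Step-wise integration of precision over recall (uninterpolated variant).
    ap, prev_recall = 0.0, 0.0
    for p, r in zip(precision, recall):
        ap += p * (r - prev_recall)
        prev_recall = r
    return ap
```

mAP is then simply the mean of this quantity over the detected classes (here, person and vehicle). Benchmark protocols differ in details such as precision interpolation and IoU matching thresholds, so this sketch is indicative rather than definitive.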




Details

Primary Language Turkish
Subjects Engineering
Section Articles
Authors

Ziya Saygılı 0000-0003-3282-0946

Güzin Özmen 0000-0003-3007-5807

Early View Date 30 January 2022
Publication Date 31 March 2022
Published in Issue Year 2022, Issue: 34

Cite

APA Saygılı, Z., & Özmen, G. (2022). İnsansız Hava Araçlarında Gömülü Sistem Üzerinden Derin Öğrenme ile Nesne Tespiti. Avrupa Bilim Ve Teknoloji Dergisi(34), 292-298. https://doi.org/10.31590/ejosat.1081713