Applying deep learning to automatically detect fly-tips in satellite imagery

Vadim Danelian, Andrei Kliuev

Perm National Research Polytechnic University, Perm, Russia

Cite: Danelian V., Kliuev A. Applying deep learning to automatically detect fly-tips in satellite imagery. JDS, 6(2), 26-34 (2024). https://doi.org/10.33847/2686-8296.6.2_3

Abstract. This research is dedicated to developing neural networks for detecting fly-tips (illegal waste dumps) in satellite imagery. The problem is pressing for Russia, where about 70 million tons of solid waste are generated annually, a significant share of which ends up in fly-tips. Deep learning methods were applied to two tasks: binary classification of images by the presence of a dump, and detection of its location. Unique datasets were collected to train the models, comprising more than 29,000 images for classification and 500 images for detection. VGG16 and VGG19 proved to be the best classification models, with an F1 score of 0.91. The Faster R-CNN architecture was used for detection, achieving an average precision (AP) of 89%. The results demonstrate the high effectiveness of deep learning in automating fly-tip monitoring, which helps improve waste management control and environmental conditions overall.
Keywords: deep learning, fly-tip, satellite imagery, waste management, CNN. 
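The classification setup summarized above can be illustrated with a short sketch. The snippet below is a minimal illustration and not the authors' exact pipeline: it assumes a Keras VGG16 backbone pretrained on ImageNet with a small binary head ("fly-tip" vs. "no fly-tip"); the tile size, head layout, and training settings are assumptions made for illustration only.

    # Minimal sketch (assumed setup, not the authors' code): VGG16 transfer learning
    # for binary classification of satellite tiles as containing a fly-tip or not.
    import tensorflow as tf
    from tensorflow.keras import layers, models
    from tensorflow.keras.applications import VGG16

    IMG_SIZE = (224, 224)  # hypothetical tile size; the paper's tile size is not stated here

    base = VGG16(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
    base.trainable = False  # freeze the pretrained convolutional features

    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # probability that the tile contains a fly-tip
    ])

    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(train_dataset, validation_data=val_dataset, epochs=10)

For the detection task, the abstract names Faster R-CNN. The sketch below uses torchvision's reference implementation with a ResNet-50 FPN backbone, which is an assumption rather than a detail taken from the paper; it only shows how the detector is configured for two classes (background and fly-tip).

    # Minimal sketch (assumed backbone and framework): Faster R-CNN with a
    # two-class head for localizing fly-tips in satellite image tiles.
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)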

Published online 24.12.2024