DOAJ Open Access 2026

Neural network assistant for automated diagnostics and first aid support for eye and skin injuries in emergency situations

Gleb Yu. Shamsudinov, Veniamin V. Morozov, Georgiy S. Shirokov, Vyacheslav Yu. Yarovoy, Anna K. Mikhaylova

Abstract

Purpose. To develop and validate a deep neural network for computer vision that, under conditions of limited time and resources, automatically classifies pathological conditions of the eyes and skin resulting from man-made and natural emergencies, as well as animal and insect bites, and proposes a first aid algorithm.

Methods. Design of a deep neural network architecture, training of the model on an expanded dataset of images of pathological eye and skin conditions, and validation of the results using standard computer vision metrics.

Findings. The developed deep neural network demonstrates high accuracy in classifying eye and skin pathologies, including animal and insect bites, on the expanded dataset. A new approach to automating first aid in emergency situations is presented, which reduces diagnostic time and increases accuracy across varied conditions, including the Ministry of Emergency Situations' work to eliminate biological and social risks in disaster zones.

Application field of research. The results can be used to integrate the model into first aid systems for emergencies and into mobile applications and devices for rescuers, as well as to solve other problems in the field of life safety.
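The abstract states that the model was validated with standard computer vision metrics but does not specify which ones or how they were computed. As an illustrative sketch only (not the authors' evaluation code), the usual classification metrics for such a task — overall accuracy plus per-class precision and recall — can be computed from predicted labels like this; the injury class names below are hypothetical:

```python
from collections import Counter

def classification_metrics(y_true, y_pred):
    """Compute overall accuracy and per-class precision/recall
    from parallel lists of true and predicted class labels."""
    assert len(y_true) == len(y_pred) and y_true
    classes = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    correct = 0
    for t, p in zip(y_true, y_pred):
        if t == p:
            correct += 1
            tp[t] += 1
        else:
            fp[p] += 1  # predicted class p, but it was wrong
            fn[t] += 1  # true class t was missed
    per_class = {}
    for c in classes:
        prec = tp[c] / (tp[c] + fp[c]) if (tp[c] + fp[c]) else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if (tp[c] + fn[c]) else 0.0
        per_class[c] = {"precision": prec, "recall": rec}
    return correct / len(y_true), per_class

# Hypothetical labels for three injury classes
truth = ["burn", "bite", "burn", "abrasion", "bite"]
pred  = ["burn", "bite", "bite", "abrasion", "bite"]
acc, stats = classification_metrics(truth, pred)
print(round(acc, 2))  # → 0.8
```

In practice a library implementation (e.g. the equivalents in scikit-learn) would be used; the hand-rolled version above only makes explicit what "standard metrics" means for a multi-class classifier of this kind.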


Citation Format

Shamsudinov, G.Y., Morozov, V.V., Shirokov, G.S., Yarovoy, V.Y., & Mikhaylova, A.K. (2026). Neural network assistant for automated diagnostics and first aid support for eye and skin injuries in emergency situations. https://doi.org/10.33408/2519-237X.2026.10-1.107

Journal Information
Publication Year
2026
Source Database
DOAJ
DOI
10.33408/2519-237X.2026.10-1.107
Access
Open Access ✓