Exploiting Deep Visual Geometry Group Architecture for Fall Detection in the Elderly People
Abstract
Over the last couple of decades, human fall detection has gained considerable popularity, especially for the elderly. Elderly people need more attention than others in their homes, hospitals, and care centers. Various solutions have been proposed to deal with this problem, yet many aspects of it remain unresolved. The current study proposed an approach for human fall detection based on the Visual Geometry Group (VGG) architecture of deep learning. The presented approach was compared with state-of-the-art approaches, including ResNet-50 and ResNet-101, on the MCF and URFD datasets, outperforming them with an accuracy of 98%. The proposed approach also outperformed these deep architectures in terms of performance efficiency.
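The abstract does not include implementation details, so the following is only a minimal sketch of how a VGG-based fall/no-fall classifier might be assembled with transfer learning in Keras. The 224×224 input size, the frozen ImageNet-pretrained VGG16 base, and the small dense head are illustrative assumptions, not the authors' published configuration.

```python
# Hypothetical sketch: VGG16 transfer learning for binary fall detection.
# Assumptions: 224x224 RGB frames, frozen convolutional base, small dense head.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_fall_detector(input_shape=(224, 224, 3)):
    # VGG16 convolutional base pretrained on ImageNet, without the dense top.
    base = tf.keras.applications.VGG16(
        weights="imagenet", include_top=False, input_shape=input_shape
    )
    base.trainable = False  # freeze pretrained features for initial training

    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # fall (1) vs. no-fall (0)
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model

if __name__ == "__main__":
    model = build_fall_detector()
    model.summary()
    # Training would use frames extracted from fall-detection video datasets
    # such as MCF or URFD, split into fall / no-fall classes.
```

In this kind of setup, the frozen VGG base acts as a generic feature extractor while only the small classification head is trained, which is a common way to adapt VGG-style networks to comparatively small video-frame datasets.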
Copyright (c) 2023 Hina Bashir, Kanwal Majeed, Sumaira Zafar, Ghulam Zohra, Syed Farooq Ali, Aadil Zia Khan
This work is licensed under a Creative Commons Attribution 4.0 International License.
UMT-AIR follows an open-access publishing policy, and the full text of all published articles is available free of charge immediately upon publication of an issue. The journal's contents are published and distributed under the terms of the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. Thus, work submitted to the journal is implied to be the original, unpublished work of the authors (neither published previously nor accepted/under consideration for publication elsewhere). Upon acceptance of a manuscript for publication, the corresponding author, on behalf of all co-authors, will sign and submit a completed Copyright and Author Consent Form.