The Deep Learning based Smart Navigational Stick for Blind People

  • Muhammad Sulaman, IT Administrator, Balochistan Think Tank Network (BTTN), Quetta, Pakistan
  • Sibghat ullah Bazai, Department of Computer Engineering, BUITEMS, Quetta, Pakistan
  • Muhammad Akram, Department of Software Engineering, BUITEMS, Quetta, Pakistan
  • Muhammad Akram Khan, Department of Computer Engineering, BUITEMS, Quetta, Pakistan
Keywords: artificial intelligence, deep learning, Dlib library, face recognition, MobileNet-SSD, OpenCV, object recognition, Python

Abstract


Blind and visually impaired people find it difficult to detect obstacles and recognize the people in their path, which makes it dangerous for them to walk, work, or move through crowded places. They must remain cautious at all times to avoid solid obstacles, and they typically rely on various aid devices to reach their destinations and accomplish daily tasks. A conventional cane is of limited use because it can neither detect barriers at a distance nor recognize people's faces; visually impaired individuals also cannot distinguish the type of object in front of them, gauge its size, or judge its distance. Several assistive devices have been proposed by individuals and researchers, but most lack a technological component, and this gap needs to be addressed by adding artificial intelligence (AI). The proposed prototype aims to help blind and visually impaired individuals perform everyday tasks more easily and live with the same confidence as sighted people. Therefore, this study employs the deep learning MobileNet Single Shot MultiBox Detector (SSD) algorithm for object recognition and the Dlib library for face recognition, with the solution implemented in Python using OpenCV. Additionally, ultrasonic sensors are used for distance measurement, which is a great help for visually impaired users. These components are integrated to work effectively and efficiently. The recognition results are conveyed through headphones, which notify the user whenever a face or an object is recognized. Overall, the proposed solution can be a significant aid for blind and visually impaired individuals. To test and validate the accuracy of the smart navigational stick, several experiments were conducted on a range of objects and faces; the results show that the proposed navigational system is adequate and valid for visually impaired people.
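For illustration, the following is a minimal Python sketch of the sensing-and-announcement loop described above: a pre-trained Caffe MobileNet-SSD model is run through OpenCV's DNN module, an HC-SR04-style ultrasonic sensor is read through RPi.GPIO, and detections are spoken through the headphones via a text-to-speech call. The model file names, GPIO pin numbers, confidence threshold, and the use of the espeak engine are illustrative assumptions rather than the authors' exact configuration; the Dlib face-recognition branch would plug into the same loop in a similar way.

# Hypothetical sketch of the detection-and-announce loop; file names, pins,
# and thresholds are illustrative assumptions, not the authors' exact setup.
import time
import subprocess

import cv2
import numpy as np
import RPi.GPIO as GPIO

# Class labels the public Caffe MobileNet-SSD release was trained on.
CLASSES = ["background", "aeroplane", "bicycle", "bird", "boat", "bottle", "bus",
           "car", "cat", "chair", "cow", "diningtable", "dog", "horse", "motorbike",
           "person", "pottedplant", "sheep", "sofa", "train", "tvmonitor"]

TRIG, ECHO = 23, 24          # assumed HC-SR04 wiring on the Raspberry Pi
GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def measure_distance_cm():
    """Time one ultrasonic echo and convert it to centimetres."""
    GPIO.output(TRIG, True)
    time.sleep(0.00001)       # 10 microsecond trigger pulse
    GPIO.output(TRIG, False)
    start = stop = time.time()
    while GPIO.input(ECHO) == 0:   # wait for the echo pulse to start
        start = time.time()
    while GPIO.input(ECHO) == 1:   # wait for the echo pulse to end
        stop = time.time()
    return (stop - start) * 34300 / 2   # speed of sound ~343 m/s, out and back

def speak(text):
    """Announce a message through the headphones using the espeak TTS engine."""
    subprocess.run(["espeak", text])

# Pre-trained MobileNet-SSD weights loaded through the OpenCV DNN module
# (assumed file names for the public Caffe release).
net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()            # shape (1, 1, N, 7)
    for i in range(detections.shape[2]):
        confidence = detections[0, 0, i, 2]
        if confidence > 0.5:              # assumed confidence threshold
            label = CLASSES[int(detections[0, 0, i, 1])]
            distance = measure_distance_cm()
            speak(f"{label} ahead, about {int(distance)} centimetres")

cap.release()
GPIO.cleanup()

In this sketch the camera frame rate, rather than the ultrasonic sensor, paces the loop; on a Raspberry Pi a lower input resolution or a frame-skip would typically be needed to keep the audio feedback close to real time.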


References

World Health Organization, “Blindness and vision impairment,” World Health Organization. https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed Dec. 12, 2022).

B. Hassan, R. Ahmed, B. Li, A. Noor, and Z. ul Hassan, “A comprehensive study capturing vision loss burden in Pakistan (1990-2025): Findings from the Global Burden of Disease (GBD) 2017 study,” PLOS ONE, vol. 14, no. 5, pp. 1–19, May 2019, doi: https://doi.org/10.1371/journal.pone.0216492

The World Bank, "Population, total – Pakistan Data", World Bank. Available: https://data.worldbank.org/indicator/SP.POP.TOTL?locations=PK&name-desc=true (accessed Dec. 12, 2022).

V. Kunta, C. Tuniki, and U. Sairam, "Multi-functional blind stick for visually impaired people," in 5th Int. Conf. Commun. Electron. Syst., June 10–12, 2020, pp. 895–899, doi: https://doi.org/10.1109/ICCES48766.2020.9137870

A. Elsonbaty, “Smart blind stick design and implementation,” Int. J. Eng. Adv. Technol., vol. 10, pp. 17–20, June 2021, doi: https://doi.org/10.35940/ijeat.D2535.0610521

S. Grover, A. Hassan, K. Yashaswi, and N. Shinde, “Smart blind stick,” Int. J. Electro. Commun. Eng., vol. 7, pp. 19–23, May 2020, doi: https://doi.org/10.14445/23488549/IJECE-V7I5P104

H. Q. Nguyen, A. H. L. Duong, M. D. Vu, T. Q. Dinh, and H. T. Ngo, “Smart blind stick for visually impaired people,” in 8th Int. Conf. Develop. Biomed. Eng. Vietnam, Cham, 2022, pp. 145–165. doi: https://doi.org/10.1007/978-3-030-75506-5_12

M. P. Agrawal and A. R. Gupta, "Smart stick for the blind and visually impaired people," in 2nd Int. Conf. Inven. Commun. Comput. Technol., 2018, pp. 542–545.

R. F. Olanrewaju, M. L. A. M. Radzi, and M. Rehab, "iWalk: Intelligent walking stick for visually impaired subjects," in IEEE 4th Int. Conf. Smart Instru. Measur. Appli., pp. 1–4, 2017.

K. Manikanta, T. Phani, and A. Pravin, "Implementation and design of smart blind stick for obstacle detection and navigation system," Int. J. Eng. Sci. Comput., vol. 8, no. 8, pp. 18785–18790, 2018.

S. Halim, F. Handafiah, R. Aprilliyani, and A. Udhiarto, “Electronic white cane with GPS radar-based concept as blind mobility enhancement without distance limitation,” AIP Conf. Proc., vol. 1933, pp. 040024-1–040024-7, Feb. 2018, doi: https://doi.org/10.1063/1.5023994

N. R. P. J. Johnson, "Smart walking stick for blind," Int. J. Eng. Sci. Inv. Res. Develop., vol. 3, no. 4, 2017.

M. Bansal, S. Malik, M. Kumar, and N. Meena, “Arduino based smart walking cane for visually impaired people,” in 4th Int. Conf. Inv. Syst. Cont., Jan. 2020, pp. 462–465. doi: https://doi.org/10.1109/ICISC47916.2020.9171209

A. S. Manaf, E. Joseph, S. P. M, and A. Ahmed, “Effective fast response smart stick for blind people,” Int. J. Eng. Res. Technol., vol. 7, no. 8, June 2019, doi: https://doi.org/10.17577/IJERTCONV7IS08057

I. A. Satam, M. N. Al-Hamadani, and A. H. Ahmed, “Design and implement a smart blind stick,” J. Adv. Res. Dynam. Control Syst., vol. 11, no. 8, pp. 42–47, 2019.

L. Chen, J. Su, M. Chen, W. Chang, C. Yang, and C. Sie, "An implementation of an intelligent assistance system for visually impaired/blind people," in 2019 IEEE Int. Conf. Consum. Elec., Las Vegas, NV, USA, pp. 1–2, 2019.

T. Nadu, “Arduino based walking stick for visually impaired,” Int. Res. J. Eng. Technol., vol. 5, no. 3, Mar. 2018.

R. Dhanuja, F. Farhana, and G. Savitha, “Smart blind stick using Arduino,” vol. 5, no. 3, pp. 2553–2555, Mar. 2018.

N. Dey, A. Paul, P. Ghosh, C. Mukherjee, R. De, and S. Dey, “Ultrasonic sensor based smart blind stick,” in Int. Conf. Curr. Trend. Converg. Technol., Coimbatore, 2018, pp. 1–4, doi: https://doi.org/10.1109/ICCTCT.2018.8551067

M. Shanmugam, J. Victor, M. Gupta, and K. Saravanakumar, “Smart stick for blind people,” in National Conf. Emerg. Trends Info. Technol., Christ University, Bengaluru, Mar. 2017.

M. Singhal, “Object detection using SSD MobilenetV2 using TensorFlow API: Can detect any single class from COCO dataset,” Medium. https://medium.com/@techmayank2000/object-detection-using-ssd-mobilenetv2-using-tensorflow-api-can-detect-any-single-class-from-31a31bbd0691 (accessed Dec. 18, 2022).

A. Kumar, Z. J. Zhang, and H. Lyu, “Object detection in real time based on improved single shot multi-box detector algorithm,” J. Wireless Commun. Network., vol. 2020, no. 1, Art. no. 204, Oct. 2020, doi: https://doi.org/10.1186/s13638-020-01826-x

A. Younis, L. Shixin, S. Jn, and Z. Hai, “Real-time object detection using pre-trained deep learning models MobileNet-SSD,” in Proc. 6th Int. Conf. Comput. Data Eng., Mar. 2020, pp. 44–48.

R. Kavitha, P. Subha, R. Srinivasan, and M. Kavitha, “Implementing opencv and dlib open-source library for detection of driver’s fatigue,” in Innov. Data Commun. Technol. Appl., Singapore, 2022, pp. 353–367. doi: https://doi.org/10.1007/978-981-16-7167-8_26.

A. Ponnusamy, “CNN based face detector from dlib,” Medium. https://towardsdatascience.com/cnn-based-face-detector-from-dlib-c3696195e01c (accessed Dec. 19, 2022).

S. R. Boyapally, “Facial recognition and attendance system using dlib and face_recognition libraries,” https://papers.ssrn.com/abstract=3804334 (accessed Dec. 27, 2022).

T.-Y. Lin et al., “Microsoft COCO: Common objects in context,” in Computer Vision – ECCV 2014. Cham, 2014, pp. 740–755.

A. Piltch, “How to set up a raspberry pi for the first time,” Tom’s Hardware. https://www.tomshardware.com/how-to/set-up-raspberry-pi (accessed Feb. 27, 2023).

V. Gr, “How to use raspberry pi2 with a laptop display using VNC server,” Instructables. https://www.instructables.com/How-to-Use-Raspberry-Pi2-With-a-Laptop-Display-Usi/ (accessed Feb. 27, 2023).

Q-engineering, “Install tensorflow 2.1.0 on raspberry Pi 4 - Q-engineering,” Q-engineering. https://qengineering.eu/install-tensorflow-2.1.0-on-raspberry-pi-4.html (accessed Feb. 27, 2023).

S. Leet, “Jupyter notebook on raspberry Pi,” Instructables. https://www.instructables.com/Jupyter-Notebook-on-Raspberry-Pi/ (accessed Feb. 27, 2023).

A. Singh and MACFOS, “Installing opencv using cmake in raspberry Pi,” Robu. https://robu.in/installing-opencv-using-cmake-in-raspberry-pi/ (accessed Feb. 27, 2023).

V. Phutak, R. Kamble, S. Gore, M. Alave, and R. R. Kulkarni, “Text to speech conversion using Raspberry Pi,” vol. 4, no. 2, pp. 91–293, Feb. 2019.

Y. Li, H. Huang, Q. Xie, L. Yao, and Q. Chen, “Research on a surface defect detection algorithm based on MobileNet-SSD,” Appl. Sci., vol. 8, no. 9, Art. no. 9, Sep. 2018, doi: https://doi.org/10.3390/app8091678

H. Ali, M. Khursheed, S. K. Fatima, S. M. Shuja, and S. Noor, "Object recognition for dental instruments using SSD-MobileNet," in Int. Conf. Info. Sci. Commun. Technol., 2019, pp. 1–6, doi: https://doi.org/10.1109/CISCT.2019.8777441

S. Hossain, “Smart blind stick using ultrasound distance measurement sensor system,” Bachelor thesis, Dep. Elec. Elec. Eng., Daffodil Int. Univ., Dhaka, Bangladesh, 2020.

M. Aamir et al., “Spatiotemporal change of air-quality patterns in hubei province—a pre-to Post-Covid-19 analysis using path analysis and regression,” Atmosphere, vol. 12, no. 10, Art. no. 1338, 2021, doi: https://doi.org/10.3390/atmos12101338

U. A. Bhatti, Z. Zeeshan, M. M. Nizamani, S. Bazai, Z. Yu, and L. Yuan, “Assessing the change of ambient air quality patterns in Jiangsu Province of China pre-to post-COVID-19,” Chemosphere, vol. 288, Art. no. 132569, Feb. 2022, doi: https://doi.org/10.1016/j.chemosphere.2021.132569

R. U. Bazai, S. Ullah, U. A. Bhatti, S. A. A. Shah, and H. Ahmad, “Utilizing blockchain technology to enhance smart home security and privacy,” in Int. Conf. Info. Technol. Appl., pp. 76–86, Singapore, 2022.

Published
2022-12-25
How to Cite
Sulaman, M., Sibghat ullah Bazai, Muhammad Akram, & Muhammad Akram Khan. (2022). The Deep Learning based Smart Navigational Stick for Blind People. UMT Artificial Intelligence Review, 2(2). https://doi.org/10.32350/umtair.22.05
Section
Articles