Teaching & Learning

Social Service-related Projects

The College is keen on fostering strategic partnerships with the social services sector to create a more cohesive and caring society. Over the years, our faculty members and students have developed research projects with NGOs in Hong Kong by applying available technology to enhance the delivery of social services. Below are some of the highlighted projects:

 

Project Title:
Wheelchair Battery Health Condition Monitoring Modules

Professor Henry Chung Shu Hung
Department of Electronic Engineering

Partnered NGO:
Direction Association for the Handicapped

Project Description:
This project aims to develop a diagnostic module for wheelchairs. The module informs users of the real-time state of charge and state of health of the batteries, and estimates the remaining distance the batteries can support the wheelchair to travel.
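The range estimate such a module reports can be sketched as a simple energy calculation. The following is an illustrative sketch only, not the project's actual firmware; all parameter names and figures are hypothetical.

```python
def remaining_range_km(state_of_charge: float,
                       state_of_health: float,
                       rated_capacity_wh: float,
                       consumption_wh_per_km: float) -> float:
    """Estimate the distance (km) the wheelchair can still travel.

    state_of_charge       -- fraction of usable charge left (0.0-1.0)
    state_of_health       -- fraction of rated capacity the aged
                             battery can still hold (0.0-1.0)
    rated_capacity_wh     -- nameplate capacity in watt-hours
    consumption_wh_per_km -- average energy drawn per kilometre
    """
    usable_energy_wh = state_of_charge * state_of_health * rated_capacity_wh
    return usable_energy_wh / consumption_wh_per_km

# Example: half-charged battery at 80% health, 480 Wh rated,
# drawing 24 Wh per km -> 0.5 * 0.8 * 480 / 24 = 8.0 km
print(remaining_range_km(0.5, 0.8, 480, 24))  # 8.0
```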

 

Project Title:
Somewhere in Time Museum

Dr Charlie Xue Qiuli
Department of Architecture and Civil Engineering

Partnered NGO:
The Rotary Club of Hong Kong (Kowloon East)

Project Description:
The Rotary Club of Hong Kong (Kowloon East) is planning to build a small museum in Sai Kung featuring a collection of antiquities. Led by Dr Charlie Xue, a team of architecture students from the City University of Hong Kong was involved in the design of the museum. The team carried out a preliminary site analysis and survey, and submitted a report to the Rotary Club in July 2018. All work was conducted on a voluntary basis in support of the Rotary Club's community services. The project is ongoing and involves government departments such as the Lands Department and the Buildings Department. Through this project, our students are able to apply their skills and knowledge outside the classroom.

 

Project Title:
Developing an Aid to Help the Visually Impaired to Navigate at MTR Platform

Supervisor: Dr Kelvin Yuen Shiu Yin
Student: Ng Ho Ming

Department of Electronic Engineering

Partnered NGO:
Hong Kong Blind Union

Project Description:
The MTR is one of the major means of transport for visually impaired people (VIP) in Hong Kong. However, many VIP find navigating MTR stations challenging because of the stations' irregular and complicated layouts, especially in stations they seldom visit or have never visited before. Such difficulties can lengthen the VIP's travelling time and cause inconvenience during their journeys. To help VIP navigate MTR platforms, an aid utilizing iBeacon and the iPhone was developed to provide useful information such as the user's current location, the location of and distance to escalators, and the exits each escalator leads to. With the VoiceOver function of iOS switched on, the user can tap different parts of the screen to hear the corresponding information read aloud. This proof-of-concept aid has been field-tested on an MTR platform and was found to work well.
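The core location logic described above can be illustrated in a few lines: each iBeacon on the platform is tied to a stored description, and the app announces the description for the beacon with the strongest signal. This is a hypothetical sketch, not the project's iOS code; the beacon IDs and messages are invented.

```python
# RSSI is reported in dBm; values closer to zero mean a stronger signal.
PLATFORM_INFO = {
    "beacon-12": "Near escalator to Exit A, 5 metres ahead on the left.",
    "beacon-13": "Middle of platform; lift to concourse 10 metres right.",
}

def nearest_beacon(readings):
    """Return the ID of the beacon with the strongest RSSI reading."""
    return max(readings, key=readings.get)

def announce(readings):
    """Look up the spoken message for the user's nearest beacon."""
    beacon = nearest_beacon(readings)
    return PLATFORM_INFO.get(beacon, "Location information unavailable.")

# -62 dBm is stronger than -80 dBm, so beacon-12 is nearest.
print(announce({"beacon-12": -62, "beacon-13": -80}))
```

On the real device, the message string would be handed to VoiceOver for speech output rather than printed.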

 

Project Title:
Two Chess Games Developed for the Visually Impaired

Supervisor: Dr Kelvin Yuen Shiu Yin
Student: Lam Ming Leong

Department of Electronic Engineering

Partnered NGOs:
Hong Kong Blind Union
The Hong Kong Federation of the Blind

Project Description:
At present, visually impaired people (VIP) play chess games using special boards and chess pieces. This project explored innovative ways to present chess board positions so as to make the experience even more enjoyable for VIP. Two accessible iOS chess games, "Tic-Tac-Toe" and "Dou Shou Qi", were developed. With the help of VoiceOver, VIP can easily follow what is happening on the board and play using user-friendly controls.
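The accessible presentation hinges on turning each board cell into a short phrase that VoiceOver can read aloud when the player touches it. The sketch below shows the idea for a Tic-Tac-Toe board; the phrasing and cell layout are invented for illustration and are not taken from the actual apps.

```python
def describe_cell(board, row, col):
    """Describe one Tic-Tac-Toe cell as a short spoken phrase."""
    piece = board[row][col]
    name = {"X": "cross", "O": "nought", " ": "empty"}[piece]
    return f"Row {row + 1}, column {col + 1}: {name}"

board = [["X", " ", "O"],
         [" ", "X", " "],
         [" ", " ", "O"]]

# Touching the top-right cell would trigger this announcement:
print(describe_cell(board, 0, 2))  # "Row 1, column 3: nought"
```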

 

Project Title:
Virtual Sports System for the Visually Impaired

Dr Leanne Chan Lai Hang
Department of Electronic Engineering

Partnered NGO:
Hong Kong Blind Union

Project Description:
This project aimed to design a virtual sports system that provides the visually impaired with a safe and effective exercise platform and increases their exercise compliance. The simulated tennis game was written in MATLAB and comprises a Kinect system and four loudspeakers. Five game zones were designed in which users could earn points: after hearing a tennis serving sound generated by MATLAB through one of the loudspeakers, the user had to reach the corresponding zone, and points were awarded for stepping into the correct zone. The Kinect detected the user's hand movements while swinging a racket; its colour and depth sensors identified the user's shirt colour, and the user's location was analysed using the MATLAB Image Acquisition and Image Processing toolboxes. Serving sounds were routed in randomized order to the loudspeakers via sound cards, each addressed by its own speaker ID through the Data Acquisition toolbox. Hand movement detection used the depth sensor to track the Body Posture Joint Indices of the right hand, made accessible as metadata on the data stream.
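Abstracting away the Kinect and MATLAB details, the scoring rule reduces to checking whether the tracked player position falls inside the announced zone. The sketch below is a simplified illustration; the zone boundaries are invented, not the project's actual court dimensions.

```python
# Each zone maps to an (x_min, x_max) band on the floor, in metres.
ZONES = {1: (0.0, 0.4), 2: (0.4, 0.8), 3: (0.8, 1.2),
         4: (1.2, 1.6), 5: (1.6, 2.0)}

def score_attempt(target_zone, player_x):
    """Return 1 point if the player reached the announced zone, else 0."""
    x_min, x_max = ZONES[target_zone]
    return 1 if x_min <= player_x < x_max else 0

print(score_attempt(3, 0.95))  # 1 (player is inside zone 3)
print(score_attempt(3, 1.30))  # 0 (player ended up in zone 4)
```

In the real system the `player_x` value would come from the Kinect's tracking of the user, and the target zone from the loudspeaker that played the serving sound.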

 

Project Title:
Scene Recognition Based Navigation System for the Blind

Dr Leanne Chan Lai Hang
Department of Electronic Engineering

Partnered NGO:
Hong Kong Blind Union

Project Description:
About 17,000 people in Hong Kong (2.4% of the total population) are visually impaired. Apart from guide dogs and white canes, indoor navigation typically relies on high-cost add-ons such as sensors, RFID and Wi-Fi. Convolutional Neural Networks (CNNs) have achieved impressive performance in object detection and scene recognition, yet they are rarely applied to indoor navigation for the visually impaired. This project proposes a CNN-based indoor scene recognition system to improve mobility for the visually impaired.

A dataset of 6,100 images was collected from the Hong Kong Blind Union and online image datasets. A CNN model based on AlexNet was optimized on this dataset and deployed in an Android app. The resulting model achieves 85% accuracy on the test dataset. Furthermore, the model's behaviour was studied with the Class Activation Mapping technique, showing how the model classifies scenes based on discriminative image regions and spatial features. This study demonstrates that CNNs are a promising solution for recognizing indoor scenes, which could be applied to give visually impaired people real-time navigation assistance. Expanding the training dataset and combining the system with wearable devices could further improve performance.
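The final step of any such classifier is turning the network's raw output scores into a spoken scene label. The sketch below illustrates that step only, using a standard softmax over hypothetical scene names and scores; the real system uses an optimized AlexNet deployed in an Android app.

```python
import math

# Hypothetical scene classes; the project's actual label set may differ.
SCENES = ["corridor", "staircase", "lift lobby", "entrance"]

def softmax(logits):
    """Convert raw network scores to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict(logits):
    """Return the most likely scene label and its probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return SCENES[best], probs[best]

label, confidence = predict([1.2, 4.8, 0.3, 2.1])
print(label)  # "staircase" -- the highest-scoring class
```

In a navigation app, the predicted label would then be read out to the user in real time.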

 

Project Title:
CityU Apps Lab – Posture Check App

Supervisor: Dr Ray Cheung Chak Chung
Students: Kwok Tsz Wing & Debarun Dhar

Department of Electronic Engineering

Partnered NGO:
The Chiropractic Doctors' Association of Hong Kong

Project Description:
This project enhances the CityU Apps Lab's Posture Check mobile app by proposing a pipeline for the automatic detection of posture keypoints in full-body front-view and side-view images of users. In the proposed method, keypoint estimation is formulated as a regression problem, which is solved using a deep learning approach.

The first half of this project concerns the development of a lightweight Convolutional Neural Network (CNN) architecture for human pose estimation on the FashionPose dataset; the model achieves results comparable with the current state of the art. In the latter half, a new dataset of 900 images annotated with posture keypoints is used to adapt the pre-trained CNN to the new domain of posture keypoint estimation using transfer learning techniques. Two approaches to transfer learning are explored and evaluated. Finally, quantitative and qualitative results from the completed pipeline are presented: it achieves a high detection rate while maintaining fast prediction speed and a comparatively small memory footprint.
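A regression network of this kind typically outputs keypoint coordinates normalised to [0, 1], which must be mapped back to pixel positions in the original photo before they can be drawn or measured. The sketch below shows that common post-processing step; the function name, keypoints and image size are invented for illustration.

```python
def to_pixels(keypoints, width, height):
    """Map normalised (x, y) keypoints to integer pixel coordinates."""
    return [(round(x * width), round(y * height)) for x, y in keypoints]

# Two example keypoints (say, left and right shoulder) predicted for a
# 720 x 1280 pixel front-view photo.
normalised = [(0.25, 0.30), (0.75, 0.30)]
print(to_pixels(normalised, 720, 1280))  # [(180, 384), (540, 384)]
```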

More about the project: Posture Check App

A mobile app to help check the spine