In the US, Alzheimer’s disease (AD) is the only disease in the top six for which the affected population is growing, and it is the single most expensive disease. One reason is that nursing home costs average over $6,750/month and often continue for 5-20 years. There is therefore a significant need for technology that helps patients live safely at home for extended periods. Current systems focus on keeping patients from harming themselves (by leaving the stove on, etc.), but do not consider how the data they collect can be used to monitor disease progression. Progression data informs the physician’s recommendations and may eventually enable context-aware patient feedback (e.g., when the patient is disoriented). Collaborating with the Memory and Aging Center at UCSF, five MEng students developed a system that simultaneously keeps the patient safe and monitors the disease state by leveraging recent advances in wearable computing, the Internet of Things, and machine learning.
Wearable Computing and Internet of Things
Wearable computing and the Internet of Things were identified as key components for this project because they allow the team to track a patient’s activities over time. Inexpensive SensorTags placed around the house collect sensor readings and relay them via Bluetooth to a smartwatch worn by the patient. These readings can then be interpreted as events taking place around the house: increased humidity in the bathroom indicates that the shower is in use, and increased kitchen temperature points to use of the stove. Team members Junjie Ng, Chong Wee Tan, and Yanrong Li designed and developed the software that maintains a robust connection between the SensorTags and the smartwatch and relays sensor data to the cloud service for further processing by the machine learning algorithms.
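The event-interpretation step described above can be sketched as simple threshold rules over sensor readings. This is an illustrative sketch only; the sensor names and threshold values below are assumptions, not the team’s actual implementation.

```python
# Hypothetical sketch: map raw SensorTag-style readings to discrete
# household events using simple thresholds. Sensor names and threshold
# values are illustrative assumptions.

def detect_events(readings):
    """Map a dict of sensor readings to a list of event labels."""
    events = []
    if readings.get("bathroom_humidity", 0) > 80:    # % relative humidity
        events.append("shower_in_use")
    if readings.get("kitchen_temperature", 0) > 30:  # degrees Celsius
        events.append("stove_in_use")
    if readings.get("fridge_door_open", False):      # contact sensor
        events.append("fridge_opened")
    return events

# Example: a humid bathroom and a warm kitchen
print(detect_events({"bathroom_humidity": 85, "kitchen_temperature": 32}))
# ['shower_in_use', 'stove_in_use']
```

In practice such rules would be tuned per house, since baseline humidity and temperature vary, but the output is the same: a small vocabulary of discrete events suitable as input to the learning stage.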
Team members Ludovic Thea and Marie Douriez developed and implemented the system’s machine learning algorithms. They chose to recognize events first in order to simplify the inputs to the machine-learning algorithms. By combining several events (e.g., stove used, fridge opened, cupboard opened) with the patient’s location (e.g., kitchen), their goal was to recognize activities (e.g., cooking). This was done by first providing the algorithm with a set of event data and the corresponding “ground truth” (e.g., cooking, sleeping, showering) collected over time. After learning the relationship between events and activities, the algorithms could then determine the activities taking place from event data alone.
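The activity-recognition step can be sketched as a standard supervised-learning setup: events and location are encoded as features, labeled examples serve as ground truth, and a classifier predicts the activity for new event data. The sketch below uses a 1-nearest-neighbour classifier purely for illustration; the feature encoding, labels, and choice of classifier are assumptions, not the algorithms the team actually used.

```python
# Hypothetical sketch of activity recognition from events + location.
# A 1-nearest-neighbour classifier (Hamming distance over binary
# features) stands in for the team's actual machine-learning methods.

LOCATIONS = ["kitchen", "bathroom", "bedroom"]
EVENTS = ["stove_used", "fridge_opened", "shower_in_use"]

def encode(location, events):
    """One-hot location plus binary event flags."""
    return ([int(location == loc) for loc in LOCATIONS]
            + [int(ev in events) for ev in EVENTS])

# Labeled (features, activity) pairs: the "ground truth" stage
training = [
    (encode("kitchen", {"stove_used", "fridge_opened"}), "cooking"),
    (encode("bathroom", {"shower_in_use"}), "showering"),
    (encode("bedroom", set()), "sleeping"),
]

def predict(location, events):
    """Classify new event data by its nearest training example."""
    x = encode(location, events)
    hamming = lambda a: sum(ai != xi for ai, xi in zip(a, x))
    return min(training, key=lambda pair: hamming(pair[0]))[1]

print(predict("kitchen", {"stove_used"}))  # cooking
```

A real deployment would use far richer features (time of day, event durations, sequences of events) and a model trained on weeks of labeled data, but the pipeline shape is the same: encode, train on ground truth, then predict from event data alone.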
The first prototype of the system was fielded in Ludovic’s house, which allowed the team to develop and refine the initial set of machine learning algorithms. Development will continue, and the system will be fielded in additional houses to verify its suitability for deployment in the homes of actual Alzheimer’s disease patients, again in collaboration with UCSF’s Memory and Aging Center under the Care Ecosystem initiative. The team also intends to submit a paper based on the knowledge gained during the project. It is our hope that the activity information contributed by our system will eventually reduce avoidable emergency room visits and hospitalizations, and delay entry into a nursing home.