Researchers at Washington State University (WSU) have demonstrated a robot that they claim could help elderly people with dementia live independently in their own homes.
The so-called Robot Activity Support System (RAS) uses sensors embedded in the home to determine where its residents are, what they are doing and when they need assistance with daily activities.
The system navigates through rooms and around obstacles to find people, provides video instructions on how to do simple tasks and can even lead its owner to objects like their medication or a snack in the kitchen.
Currently, an estimated 50 per cent of adults over the age of 85 need assistance with everyday activities such as preparing meals and taking medication. The annual cost for this assistance in the US alone is nearly $2 trillion.
With the number of adults over 85 expected to triple by 2050, the WSU team, which is led by Professors Diane Cook and Maureen Schmitter-Edgecombe, hopes that technologies like this will alleviate some of the financial strain on the healthcare system by making it easier for older adults to live alone.
“Upwards of 90 per cent of older adults prefer to age in place as opposed to moving into a nursing home,” said Cook. “We want to make it so that instead of bringing in a caregiver or sending these people to a nursing home, we can use technology to help them live independently on their own.”
RAS is the first robot to be incorporated into WSU’s experimental smart home environment. The researchers recently published a study in the journal Cognitive Systems Research that demonstrates how RAS could make life easier for older adults struggling to live independently.
During the study, the group recruited 26 undergraduate and graduate students to complete three activities in a smart home with RAS as an assistant.
The activities were getting ready to walk the dog, taking medication with food and water, and watering household plants.
When the smart home sensors detected that a person had failed to initiate one of the tasks, or was struggling with it, RAS received a message to help.
The robot then used its mapping and navigation camera, sensors and software to find the person and offer assistance.
Through a tablet interface, the person could then request a video of the next step in the activity, a video of the entire activity, or ask the robot to lead them to the objects needed to complete it, such as the dog’s lead or a snack from the kitchen.
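The detect-and-assist loop described above can be sketched in a few lines of code. This is purely an illustrative assumption about how such a system might be structured, not WSU’s actual software; all class, function and option names here are hypothetical.

```python
# Hypothetical sketch of the RAS assistance flow: smart-home sensors flag a
# stalled activity, the robot is messaged, and the person picks one of three
# help options on a tablet. Names are illustrative, not WSU's real API.
from dataclasses import dataclass
from enum import Enum, auto


class Assist(Enum):
    NEXT_STEP_VIDEO = auto()   # show a video of the next step only
    FULL_VIDEO = auto()        # show a video of the whole activity
    LEAD_TO_OBJECTS = auto()   # guide the person to the needed objects


@dataclass
class ActivityState:
    name: str      # e.g. "take medication"
    step: int      # last step the sensors observed as completed
    stalled: bool  # set when sensors see no progress on the activity


def needs_help(state: ActivityState) -> bool:
    """Smart-home side: decide whether to message the robot."""
    return state.stalled


def respond(choice: Assist, state: ActivityState) -> str:
    """Robot side: act on the option the person chose on the tablet."""
    if choice is Assist.NEXT_STEP_VIDEO:
        return f"play video for step {state.step + 1} of '{state.name}'"
    if choice is Assist.FULL_VIDEO:
        return f"play full video of '{state.name}'"
    return f"lead person to objects for '{state.name}'"


state = ActivityState(name="take medication", step=1, stalled=True)
if needs_help(state):
    print(respond(Assist.NEXT_STEP_VIDEO, state))
```

The key design idea, as reported, is the split of responsibilities: the home's sensor network decides *when* help is needed, while the robot handles navigation and *how* to help, based on the person's choice.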
The next step in the research will be to test RAS’s performance with a group of older adults, to get a better idea of which prompts, video reminders and other robot behaviours they prefer.