Last year, as the beginning of July drew closer, my wife and I focused our energies on getting the house ready for our son, Josh, to return home after spending five months in rehabilitation at Mary Free Bed. We switched bedrooms so that he and his little brother would be on the ground floor and we would be on the top floor, made arrangements for the driveway to be paved, and modified the bathroom so that Josh would be able to safely use the toilet and the shower.
Surgery in March to remove a large tumor from Josh’s brain had left him paralyzed on the right side and unable to speak. During his rehabilitation, he regained many of those abilities. When he checked out of Mary Free Bed in July, he walked off the floor with the aid of a cane and could speak sentences about five words long.
One of the challenges I knew we would face at home was being able to quickly attend to Josh’s needs. We went home knowing that he would be starting six rounds of chemotherapy right around August. That meant nausea. Even on an ordinary day, I knew there would be times when he would need our assistance, but if my wife or I were upstairs or on the other side of the house, it wouldn’t be easy to hear him.
The other half of the challenge was that it was difficult for him to find the right words to express himself. That’s fine in casual conversation, but in an emergency, time is of the essence. For example, “bucket” became the word he used when he felt like he needed to throw up. It was just something that started while we were at Mary Free Bed.
Hospital rooms are typically equipped with devices that allow patients to call nurses when they need something. Josh and I used his many times, although he wasn’t always comfortable trying to make the requests himself because of his aphasia. I knew that we needed a system similar to the nurse call intercom, but one that could be set up to help Josh communicate his needs.
Our Google Home speakers seemed like a logical starting point. We had just gotten enough of them to position one in each room, and I had used Google’s app to create several automations. You could, for example, program a Google Home to turn on the lights and play a specific podcast if you told it “I’m home.”
You could also pre-program an announcement which would go out on all of the Google speakers when triggered. And, since several of our rooms were equipped with Google Homes, that announcement would be heard just about everywhere. So, we already had our own version of a P.A. system.
I set up the triggers so that they would be unique to Josh’s medical needs, but rare enough that we wouldn’t set them off by accident. I borrowed the hospital’s color-coded system, i.e., “Code Red” or “Code Blue.” Code Red, for example, triggered an announcement stating that Josh felt ill. I believe the exact phrase was, “I am about to get sick.”
I ended up with three or four automations, but I knew that it could be easy to forget all of their meanings. So, I set up an additional automation which would serve as a type of verbal menu. The trigger was “help,” and the response from the smart speaker included a list of the different options.
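The basic logic of the codes plus the “help” menu can be sketched in plain Python. Only “Code Red” and its “I am about to get sick” message come from my actual setup; the other codes and phrasings below are invented stand-ins, and the real automations lived in the Google Home app rather than in code.

```python
# Hypothetical sketch of the trigger-to-announcement system described above.
# Only "code red" and its message reflect the real setup; the rest is illustrative.

CODES = {
    "code red": "Josh says: I am about to get sick.",
    "code blue": "Josh says: I need help getting to the bathroom.",      # invented example
    "code yellow": "Josh says: I need someone to come to my room.",      # invented example
}

def respond(trigger: str) -> str:
    """Return the announcement to broadcast for a spoken trigger phrase."""
    phrase = trigger.strip().lower()
    if phrase == "help":
        # The verbal menu: read back every code so nobody has to memorize them.
        menu = ", ".join(f"say {code} for: {msg}" for code, msg in CODES.items())
        return "Emergency medical system activated. " + menu
    return CODES.get(phrase, "Unrecognized code. Say 'help' for the menu.")

print(respond("Code Red"))
print(respond("help"))
```

The point of keeping a single “help” trigger is exactly what the menu automation did for us: one easy word unlocks the full list, so the rarer code phrases don’t have to be remembered under stress.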
I had been binge-watching “Star Trek: Voyager,” and was inspired by the show’s EMH to write the script for this automation as if Google were an Emergency Medical System. The word “help” would trigger a preliminary announcement to everybody in the house that the “system has been activated.” This was enough to cause alarm, even if you were sound asleep or involved in something else. Choosing the different codes would help to give context: I feel sick, I need to use the bathroom, etc.
I recall the system only being used once as intended, on the first night of Josh’s chemotherapy treatments. I tried to encourage the rest of my family to use it, but it never gained much traction. A year later, things are quieter, and Josh has an easier time communicating his needs, so we don’t rely on it as often. He has also been using text messages, or just the intercom feature on the Google Home in his room, to get our attention.
I have been thinking, though, about all of the coronavirus patients who are quarantining themselves at home, and the people who are taking care of them. It seems like this kind of home emergency medical system, or HEMS, might be needed there as well. Especially if someone needs help, but isn’t able to quickly describe what it is that they need.
I also think about the way Josh was in the hospital, and his challenges with using the nurse intercom. How does one explain over an audio system what it is that they need, when they aren’t able to find the words? An automated system might have helped to bridge that gap.
I also purchased a pair of Alexa speakers a few weeks before Josh came home because I had seen Alexa in action and was impressed with its intercom feature. You could basically connect an intercom call between one room and another, and then carry on a conversation. Google Homes only allowed you to send short bursts of words from one to another. My goal was to enhance our communication with Alexa. We could also use the Alexa as a “baby monitor,” so that we were alerted if Josh got up or said that he needed something.
I think that this experimentation could be carried a lot farther with Alexa’s ability to set up “skills.” I can imagine setting up Alexa so that it asked the person intuitive questions, and then took actions based on the responses.
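To sketch what I mean, here is a toy yes/no dialogue tree in Python. The questions and announcements are made up for illustration, and a real version would live inside an Alexa skill handler rather than a standalone script like this.

```python
# A hedged sketch of the question-driven flow imagined above: the assistant asks
# simple yes/no questions and narrows down what the person needs, then announces it.
# All question text and announcements are invented examples, not a real skill.

DIALOGUE = {
    # node: (question to ask, {answer: next node})
    "start": ("Do you feel sick?", {"yes": "sick", "no": "mobility"}),
    "mobility": ("Do you need help moving?", {"yes": "announce_mobility", "no": "announce_check"}),
}

ACTIONS = {
    # terminal nodes: the announcement to broadcast
    "sick": "Announce: bring the bucket, Josh feels sick.",
    "announce_mobility": "Announce: Josh needs help getting up.",
    "announce_check": "Announce: please check on Josh.",
}

def run(answers):
    """Walk the dialogue tree with a sequence of yes/no answers."""
    node = "start"
    for answer in answers:
        question, branches = DIALOGUE[node]
        node = branches[answer]
        if node in ACTIONS:
            return ACTIONS[node]
    return None

print(run(["yes"]))
print(run(["no", "yes"]))
```

The appeal of this shape is that the person in need only ever answers yes or no, which sidesteps the word-finding problem that aphasia creates.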
And, my Alexas are able to send my phone an alert if they “hear” the sound of breaking glass. I wonder if they could be set up to react to other sounds, such as someone getting sick, or to key phrases.
Having tried both smart speaker systems, if I were to start over again I am sure that I would choose to invest only in a couple of Alexas. I think that their intercom capabilities are far beyond what the Google Home can do. And, as I said, I think that Alexa has the potential to become a truly voice-activated medical assistive device.