Older Adults’ Perception of Smart Home Technologies

Percy Campbell et al.’s article, “User Perception of Smart Home Surveillance Among Adults Aged 50 Years and Older: Scoping Review,” is a genuinely interesting piece of work on older adults’ perceptions of Smart Home Technologies (SHTs). The authors reviewed existing studies on this topic to derive a series of aggregated insights that clarify the state of the literature and suggest how policy makers could begin to address the issues older adults associate with SHTs.

Some key themes/issues that arose from the studies included:

  • Privacy: different SHTs were perceived differently. Crucially, privacy concerns were often highly contextual and varied by region, which makes it challenging to generalize from a single study about specific privacy interests to a global population.
  • Collection of Data — Why and How: participants were generally unclear about what data was being collected or for what purpose. This lack of data literacy may undermine ongoing, meaningful consent to collection.
  • Benefits and Risks: data breaches/hacks, malfunction, affordability, and user trust were all identified as challenges or risks. Even so, participants across studies generally found these technologies offered considerable benefits, most significantly a perceived enhancement of their physical safety.
  • Safety Perceptions: all types of SHTs were seen as useful for safety purposes, especially in the event of an accident or emergency. Adults aged 50+ may prefer SHTs with safety-enhancing features.

Given these themes of privacy, safety, and the rest, and how regulatory systems are sometimes outpaced by advances in technology, the authors propose a data justice framework to regulate or govern SHTs. This entails:

  • Visibility: there are benefits to being ‘seen’ by SHTs, but privacy protections must also apply so that individuals can selectively remove themselves from the view of commercial and other parties.
  • Digital engagement/disengagement: individuals should be supported in making autonomous decisions about how engaged with, or in control of, these systems they are. They should also be able to disengage entirely, or permit only certain SHTs to monitor or affect them.
  • Right to challenge: individuals should be able to challenge decisions made about them by SHTs. This is particularly important in the face of AI systems that may have ageist biases built into them.

While I still think regulatory systems can be effective in this space — if only regulators are appropriately resourced and empowered! — I take the broader point that regulatory approaches should also include ‘data justice’ components. At the same time, I think most contemporary or recently updated Western privacy and human rights legislation already includes these precepts. There is also a real danger in asserting the need for a new, more liberal/individualistic approach to what are collective action problems — problems that regulators, generally, are better equipped to address than individuals are.
