Foiling hackers in the smart home era


Monday, 15 April, 2024


Australian researchers have come up with an innovative camera design that obscures images beyond human recognition, thus preserving the privacy and security of those using smart home devices and Internet of Things technology.

Labour-saving gadgets like robotic vacuum cleaners, smart fridges and baby monitors are increasingly being embraced as part of everyday life. Known as sighted systems, these smart devices use vision to negotiate their surroundings, taking videos and images of our lives in the process.

Devices like autonomous vacuum cleaners form part of the Internet of Things: real-world objects that connect to the internet. While convenient, these devices are at risk of being hacked by bad actors or compromised through human error, leaving their images and videos vulnerable to theft by third parties.

In a bid to restore privacy, researchers at the Australian Centre for Robotics at the University of Sydney and the QUT Centre for Robotics (QCR) at Queensland University of Technology developed a new approach to camera design in which visual information is processed and scrambled before it is digitised, obscuring it to the point of anonymity.

Acting as a ‘fingerprint’, the distorted images can still be used by robots to complete their tasks but do not provide a comprehensive visual representation that compromises privacy.

“Smart devices are changing the way we work and live our lives, but they shouldn’t compromise our privacy and become surveillance tools,” said Adam Taras, who completed the research as part of his Honours thesis.

“When we think of ‘vision’ we think of it like a photograph, whereas many of these devices don’t require the same type of visual access to a scene as humans do. They have a very narrow scope in terms of what they need to measure to complete a task, using other visual signals, such as colour and pattern recognition,” he said.
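As a loose illustration of that idea (not the team's actual optical design, which performs the scrambling in hardware before digitisation), the Python sketch below stands in for the scrambling step with a fixed pixel permutation: the spatial layout of the scene is destroyed for a human viewer, yet a narrow task-level signal such as a coarse colour histogram is preserved exactly.

```python
# Illustrative sketch only: a toy software stand-in for optical/analog
# scrambling, not the researchers' actual sensor. Requires numpy.
import numpy as np

rng = np.random.default_rng(seed=42)

def scramble(image: np.ndarray, key: np.ndarray) -> np.ndarray:
    """Apply a fixed pixel permutation (the 'key') to destroy spatial layout.

    In the real system this kind of mixing would happen in the optics and
    analog electronics before digitisation; here it is simulated in software.
    """
    flat = image.reshape(-1, image.shape[-1])
    return flat[key].reshape(image.shape)

def colour_signature(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """A narrow, task-level signal: a coarse per-channel colour histogram."""
    hist = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
            for c in range(image.shape[-1])]
    return np.concatenate(hist)

# A synthetic 'scene' with obvious structure: a coloured square on a dark background.
scene = np.zeros((64, 64, 3), dtype=np.uint8)
scene[20:44, 20:44] = [200, 50, 50]

key = rng.permutation(64 * 64)       # fixed at 'manufacture' time
obscured = scramble(scene, key)      # unrecognisable to a human viewer

# The task-level statistic is identical before and after scrambling,
# so a simple colour- or pattern-based task can still be performed.
assert np.array_equal(colour_signature(scene), colour_signature(obscured))
```

In this sketch the permutation plays the role of the fixed 'fingerprint': it is set once and never handled by the device's software, so the scrambled output is all a downstream task, or an attacker, ever sees.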

A key point of difference

In a crucial step, the researchers were able to confine the visual processing that normally happens inside the camera’s computer to within the optics and analog electronics of the camera, which exist beyond the reach of attackers.

“This is the key distinguishing point from prior work which obfuscated the images inside the camera’s computer — leaving the images open to attack,” said Dr Don Dansereau, Taras’s supervisor at the Australian Centre for Robotics and Digital Sciences Initiative. “We go one level beyond to the electronics themselves, enabling a greater level of protection.”
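The distinction can be pictured as a question of where the scrambling sits relative to the attack surface. The hypothetical sketch below (the class names, the 256-measurement mixing and the noise-based 'obfuscation' are illustrative assumptions, not the published design) contrasts the two pipelines: in the software-obfuscation pipeline the raw frame exists in digital memory before it is obscured, so a compromised computer can read it, whereas in the analog-front-end pipeline only the already-scrambled measurement ever crosses into software.

```python
# Conceptual sketch of the two threat models; all names and numbers here are
# hypothetical stand-ins for the authors' hardware design. Requires numpy.
import numpy as np

rng = np.random.default_rng(0)

# Fixed, lossy 'analog' mixing: stands in for optics/analog electronics that
# are set at manufacture time and cannot be read out by software.
MIXING = rng.standard_normal((256, 64 * 64))

def analog_scramble(scene: np.ndarray) -> np.ndarray:
    """Collapse a 64x64 scene into 256 mixed measurements before digitisation."""
    return MIXING @ scene.reshape(-1)

class SoftwareObfuscationCamera:
    """Prior approach: the raw frame is digitised first, then obfuscated in software."""
    def capture(self, scene: np.ndarray) -> np.ndarray:
        raw = scene.reshape(-1).copy()
        self.attack_surface = raw            # a compromised computer sees the raw frame
        return raw + rng.normal(scale=0.1, size=raw.shape)

class AnalogFrontEndCamera:
    """Sketched approach: scrambling happens before digitisation, beyond software's reach."""
    def capture(self, scene: np.ndarray) -> np.ndarray:
        measurement = analog_scramble(scene)
        self.attack_surface = measurement    # only scrambled data ever reaches the computer
        return measurement

scene = rng.random((64, 64))
for cam in (SoftwareObfuscationCamera(), AnalogFrontEndCamera()):
    cam.capture(scene)
    print(f"{type(cam).__name__}: attacker-visible values = {cam.attack_surface.size}")
```

Run as-is, the first camera exposes the full 4096-pixel frame to anything running on the device, while the second exposes only 256 scrambled measurements, which is the architectural point Dansereau describes.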

The researchers tried to hack their own approach but were unable to reconstruct the images in any recognisable format. They have opened up this task to the research community at large, challenging others to hack their method.

“If these images were to be accessed by a third party, they would not be able to make much of them, and privacy would be preserved,” Taras said.

Towards future security

Dansereau said privacy was becoming an increasing concern as more devices come with built-in cameras, and as new technologies such as parcel drones, which fly into residential areas to make deliveries, edge closer to everyday use.

“You wouldn’t want images taken inside your home by your robot vacuum cleaner leaked on the dark web, nor would you want a delivery drone to map out your backyard. It is too risky to allow services linked to the web to capture and hold onto this information,” he said.

The team’s approach could also be used to make devices that work in places where privacy and security are a particular concern, such as warehouses, hospitals, factories, schools and airports.

As their next step, the researchers hope to build physical camera prototypes to demonstrate the approach in practice.

“Current robotic vision technology tends to ignore the legitimate privacy concerns of end users. This is a short-sighted strategy that slows down or even prevents the adoption of robotics in many applications of societal and economic importance. Our new sensor design takes privacy very seriously, and I hope to see it taken up by industry and used in many applications,” said Professor Niko Suenderhauf, Deputy Director of the QCR, who advised on the project.

Professor Peter Corke, Distinguished Professor Emeritus and Adjunct Professor at the QCR, who also advised on the project, said, “Cameras are the robot equivalent of a person’s eyes, invaluable for understanding the world, knowing what is what and where it is. What we don’t want is the pictures from those cameras to leave the robot’s body, to inadvertently reveal private or intimate details about people or things in the robot’s environment.”

The research, ‘Inherently privacy-preserving vision for trustworthy autonomous systems: Needs and solutions’, has been published in the Journal of Responsible Technology.

Top image credit: iStock.com/Andrey Zhuravlev
