Research suggests almost 700 million smartwatch and wearable units will ship to consumers over the next few years. Wearables represent an exciting new frontier for developers – and a potential new cyber security risk. Smartwatches record a surprisingly large amount of data, and that data often isn’t very secure.
What data do smartwatches collect?
Smartwatches are stuffed full of sensors to monitor your body and the world around you. A typical smartwatch might include any of the following:
- Light detection
- Heart rate monitor
Through SDKs like Apple’s ResearchKit, or through firmware such as Fitbit’s, apps can be created that allow a wearable to monitor and collect this very personal physical data. This data collection is benign and useful – but it encompasses some very personal parts of an individual’s life, such as health, daily activities, and even sleeping patterns. So is it secure?
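To make that concrete, here is a rough sketch of how an iOS app might request read access to this kind of data through HealthKit, the framework ResearchKit builds on. The type identifiers are Apple’s; the surrounding code is illustrative only, and since HealthKit runs only on Apple devices it is a sketch rather than a drop-in implementation:

```swift
import HealthKit

let healthStore = HKHealthStore()

// The kinds of personal data discussed above that a wearable app
// might want to read.
let typesToRead: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .heartRate)!,
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!,
]

// The user sees a one-time permission sheet; once access is granted,
// the app can query samples freely – and from that point on, the
// data's security depends entirely on how the app stores and
// transmits it.
healthStore.requestAuthorization(toShare: nil, read: typesToRead) { granted, error in
    if granted {
        print("Authorized to read heart rate, step, and sleep data")
    }
}
```

Note how low the barrier is: a single permission prompt stands between an app and a continuous record of your heart rate, movement, and sleep.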
Where is the data stored and how can hackers access it?
Smart wearables almost always link up to another ‘host’ device, and that device is almost always a mobile phone. Data from wearables is stored and analysed on that host device, and is in turn vulnerable to the myriad attacks that can be mounted against mobile devices. Potential attacks include:
Direct USB Connection: Physically linking your wearable to a USB port, either after theft or via a fake charging station. Think it’s unlikely? So-called ‘juice jacking’ is more common than you might think.
WiFi, Bluetooth and Near Field Communication: Wearables are made possible by wireless networks, whether Bluetooth, WiFi, or NFC. This makes them especially vulnerable to the myriad wireless attacks that can be executed against them – even something as simple as rooting a device over WiFi with SSH.
Malware and Web-based Attacks: Mobile devices remain highly vulnerable to malware and web-based attacks such as StageFright.
Why is this data a security risk?
You might be thinking, “What do I care if some hacker knows how much I walk during the day?” But access to this data has some pretty scary implications. Our medical records are sealed tight for a reason – do you really want a hacker to be able to intuit the state of your health from your heart rate and exercise? What if they then sell that data to your medical insurer?
Social engineering is one of the most widely used tools of anyone seeking access to a secure system or area. Knowing how a person slept, where they work out, when their heart rate was elevated – even what sort of mood they might be in – all makes it that much easier for a hacker to manipulate human weakness.
Even if we’re not a potential gateway into a highly secured organization, this data could hypothetically be used by dodgy advertisers and products to target us when we’re at our most vulnerable. For example, ‘freemium’ games often have highly sophisticated models for when to push their paid content and turn us into ‘whales’ who keep buying their product. Access to elements of our biometrics would only make this that much easier.
What does this mean?
As our lives integrate more and more with information technology, our data moves further and further outside our own control. Wearables mean recording some of our most intimate details – and putting that data at risk in turn. Even when we work to keep it secure, it only takes one momentary lapse to expose it to anyone who has ever been interested in seeing it. Information security is only going to become more vital to all of us.
This blog is based on the presentation delivered by Sam Phelps at the Security BSides London 2016.