10 things you need to know about the security risks of wearables

How-To
Mar 29, 2017 | 9 mins
Consumer Electronics | Cybercrime | Mobile Security

Fitness trackers may not present a huge security risk, but any connected device can be hacked. Here’s what you need to know to minimize those security and privacy threats.


The risk from corporate use of activity trackers and other wearables is low, some experts say — especially compared with all the other security and privacy risks CISOs, CIOs and IT folks must worry about.

That said, as with any connected device, there is risk potential. For example, recent research suggests that devices such as Fitbits can be hacked (when the hacker is within close proximity). By focusing on accelerometers and other motion sensors, researchers at the University of Michigan and the University of South Carolina found that it’s possible to, among other things, use sound waves at different frequencies to add thousands of steps to a Fitbit. (Scroll down to read Fitbit’s response to the research results.)

Here’s what you should know about the security and privacy risks of wearables, and the best practices for minimizing those risks.

1. Wearable security is a legitimate concern

With all the security concerns that enterprise IT teams already have on their minds, do they also need to worry about wearables?

Yes, says Jeff Pollard, a principal analyst focused on security and risk at Forrester Research. For example, some fitness trackers can provide geolocation data “minute by minute to the cloud,” sharing employee as well as company locations. At the same time, “enterprise employees and consumers are opting in to data aggregation and analytics at a daunting scale,” he explains.

“Though IoT devices and wearables don’t necessarily create new security vulnerabilities, they reintroduce a lot of old ones,” says Steve Manzuik, director of security research for Duo Security, a cloud-based trusted access provider. Such devices are “like the wild West of easy hacking targets that many experienced with mainstream computing back in the 90s,” he says.

As with typical consumer IoT devices, wearables “in most cases don’t ship with built-in security and so they’re vulnerable to being compromised,” says Vinay Anand, vice president of ClearPass Security at Aruba Networks, an enterprise wireless LAN provider.

“From an enterprise IT standpoint, this could be particularly worrisome because of the channel wearables maintain with smartphones that adversaries could exploit,” Anand says. “As the wearables are usually connected to a variety of cloud apps and, depending on an organization’s BYOD policy, the corporate network, this can be a launch point for an attack. This means that malware and other forms of attacks can use that path to compromise the phone and then other resources inside the network. The attacker would have access to legitimate enterprise credentials that would lead to loss of, or the ransom of, sensitive data.”

2. In the scheme of things, wearable security may not be a huge concern

To put things in perspective, the security and privacy risk associated with wearables is “quite low, but it escalates with the type of device,” says Chet Wisniewski, principal research scientist for security software developer Sophos.

“Pure biometric activity trackers like pedometers and heart rate monitors may leak information over Bluetooth but it’s reasonably difficult to capture, and it’s of little value to attackers,” Wisniewski says. “As you move up to things like smartwatches the risk increases, but mostly due to trust and theft, not so much interception. A found smartwatch within a few meters of the paired smartphone could be used to steal emails and contacts. This risk may increase with some of the newer smartwatches that have an LTE connection, as they can operate away from the paired device.”

In general, “personal connected devices that primarily operate via close-proximity protocols, like Bluetooth Low Energy, and piggy-back onto mobile devices, such as smart phones, are generally less directly accessible for abuse, vs. IoT devices that are actively connected to the internet via Ethernet or Wi-Fi,” adds Michael McNeil, global head of product device security for Philips Healthcare, which provides clinical healthcare systems and consulting.

Many non-activity tracking IoT devices run on commodity hardware with firmware that’s often not ‘purpose built’ and thus could expose extra services, such as SSH or Telnet remote administration or complex web application back ends, McNeil says. “Personal fitness devices are often very restrictive due to size and computing capabilities, with more specific engineering involved that provides less direct attack surface,” he says. “So, much of the risk is usually with the security of the services that store and transmit this personal data to-and-from the mobile application or other means of data transfer/functionality. Management of these risks should take into consideration these specific parameters of the IoT devices and their possible attack surfaces.”
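For teams that want a rough sense of that attack surface, a quick probe can show whether an IoT hub or companion device exposes services such as Telnet, SSH or a web back end. Below is a minimal Python sketch; the target address and port list are placeholders, and it should only be pointed at hardware you own or are authorized to test.

```python
# Minimal sketch: probe an IoT/wearable hub for commonly exposed admin services.
# The target address is a placeholder; scan only devices you are authorized to test.
import socket

COMMON_SERVICES = {22: "SSH", 23: "Telnet", 80: "HTTP", 443: "HTTPS", 8080: "HTTP-alt"}

def probe(host, timeout=1.0):
    """Return a dict of open ports -> service names on the given host."""
    open_ports = {}
    for port, name in COMMON_SERVICES.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the TCP connect succeeded
                open_ports[port] = name
    return open_ports

if __name__ == "__main__":
    print(probe("192.168.1.50"))  # placeholder device address
```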

In addition, Fitbit, the leading wearable maker for corporate wellness, has much to lose if it doesn’t take security seriously.

According to IDC, Fitbit is still the top maker of activity trackers, though it’s lost some market share. The company also has a corporate division, Group Health, which offers wellness programs to customers such as Adobe, McKesson and BP. And Fitbit CEO James Park has said recently that growing its Group Health business “is critical to the growth of the company.”

To help safeguard against hacks and to protect data, Fitbit devices receive firmware updates that address security (and functionality) as needed and include built-in encryption when syncing data to the cloud, says Marc Bown, Fitbit’s senior security engineer.

Other security steps Fitbit takes include the following:

  • Partnering with a customer’s IT and/or security team to “proactively address any questions or concerns” regarding the security of employee fitness and health data, says Amy McDonough, vice president and general manager of Fitbit Group Health.
  • Offering an invite-only bug bounty program to augment the research and testing that Fitbit’s security response team conducts.
  • Posting explanations of tracker firmware updates. Since spring 2016, Fitbit has also labeled client software updates that contain security fixes with a “Critical/Important/Moderate/Low” rating to provide “guidance for interpreting those ratings similar to best practices from Google, Microsoft, and others,” according to a Fitbit blog post on security.
  • Developing best practices around the activity tracking data employers obtain from employees who participate in Fitbit wellness programs.

Fitbit says the recent hack conducted by researchers, manipulating its tracker accelerometers via sound waves, “is not a compromise of Fitbit user data and users should not be concerned that any data has been accessed or disclosed.” Fitbit, in an official statement, added that “we carefully design security measures for new products, continuously monitor for new threats, and rapidly respond to identified issues.”

Wearable security best practices

3. It’s important to anonymize data

Companies that collect but don’t carefully anonymize health-related data have effectively acquired what’s known as electronic Protected Health Information (ePHI), “which puts you squarely in the HIPAA world,” warns Eric Hodge, director of consulting at CyberScout, a data risk management and identity protection firm. And then, you must “worry about complying with all kinds of HIPAA requirements just as a hospital would,” he says. Plus, you’re exposed to the same fines, which lately have been between $150,000 and $6 million, if you don’t comply with HIPAA requirements. As a precaution, be sure to dissociate information about health and fitness from the individual, he adds.
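One concrete piece of that dissociation is pseudonymizing identifiers before fitness records leave the wellness system. The sketch below is illustrative only: the field names and key handling are assumptions, and keyed hashing by itself does not make a data set HIPAA de-identified.

```python
# Illustrative sketch: replace an employee identifier with a keyed pseudonym so
# fitness data is no longer directly tied to a person. Not a complete HIPAA
# de-identification scheme; field names and the key are placeholders.
import hashlib
import hmac

SECRET_KEY = b"store-and-rotate-this-outside-the-dataset"  # placeholder key

def pseudonymize(employee_id):
    """Derive a stable pseudonym from an employee ID using an HMAC."""
    return hmac.new(SECRET_KEY, employee_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"employee_id": "E12345", "daily_steps": 8421, "resting_hr": 61}
deidentified = {
    "participant": pseudonymize(record["employee_id"]),
    "daily_steps": record["daily_steps"],
    "resting_hr": record["resting_hr"],
}
print(deidentified)  # the identity mapping lives only in a separately secured store
```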

4. Segregate wearables on a different network

IT should treat wearables like any other computing device on their network, Manzuik says. “When possible, consider segregating IoT devices to their own network and don’t connect them directly to the internet.”

Because some IoT devices have “a history of poor security,” organizations should keep these devices on a dedicated network that doesn’t provide any access to internal resources, such as a guest Wi-Fi network, adds Matias Woloski, CTO and co-founder of Auth0, a universal identity platform.
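As a rough illustration of that policy, the sketch below checks whether devices pulled from, say, DHCP leases or a NAC export have landed on a dedicated IoT/guest segment rather than the corporate one. The subnets and device list are made-up placeholders.

```python
# Illustrative sketch: flag wearables/IoT devices that ended up on the corporate
# subnet instead of the dedicated guest/IoT network. Addresses are placeholders.
import ipaddress

IOT_SUBNET = ipaddress.ip_network("10.20.0.0/16")   # dedicated, internet-only segment
CORP_SUBNET = ipaddress.ip_network("10.10.0.0/16")  # internal resources

devices = [
    {"name": "fitness-tracker-hub", "ip": "10.20.4.17"},
    {"name": "smartwatch-sync-bridge", "ip": "10.10.8.92"},  # misplaced device
]

for d in devices:
    addr = ipaddress.ip_address(d["ip"])
    if addr in CORP_SUBNET:
        print(f"ALERT: {d['name']} ({d['ip']}) is on the corporate network")
    elif addr not in IOT_SUBNET:
        print(f"WARN: {d['name']} ({d['ip']}) is on an unexpected segment")
```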

5. Do your due diligence

Is the IoT company HIPAA-compliant? Does it adhere to standards? How does it manage credentials and identity? Is there an easy revocation strategy in case a device is lost or stolen? These are a few questions CISOs should ask wearable/group health platform providers, says Woloski.

Corporate fitness and wellness programs are typically tied to third-party software platforms that request permission to access the data generated by trackers or other devices, Woloski adds. CISOs should look for wearable providers that expose their API using authorization protocols such as OAuth 2, so that users can stay in control and revoke access whenever they want, he says.
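The sketch below illustrates that OAuth 2 pattern: exchanging an authorization code for a scoped access token and revoking it when access should end. All endpoints, client credentials and field names here are hypothetical, not any particular vendor’s API.

```python
# Illustrative OAuth 2 sketch: obtain a scoped token for a user's tracker data and
# revoke it later (e.g., when an employee leaves the wellness program or loses a
# device). URLs, client ID/secret and field names are hypothetical placeholders.
import requests

TOKEN_URL = "https://wearable.example.com/oauth2/token"
REVOKE_URL = "https://wearable.example.com/oauth2/revoke"
CLIENT_ID, CLIENT_SECRET = "wellness-app", "store-me-in-a-secrets-manager"

def exchange_code(auth_code, redirect_uri):
    """Trade an authorization code for a limited-scope access token."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": auth_code,
        "redirect_uri": redirect_uri,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }, timeout=10)
    resp.raise_for_status()
    return resp.json()  # typically contains access_token, refresh_token, scope

def revoke(token):
    """Revoke a token so the platform loses access to the user's data."""
    resp = requests.post(REVOKE_URL, data={"token": token},
                         auth=(CLIENT_ID, CLIENT_SECRET), timeout=10)
    resp.raise_for_status()
```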

6. Educate users

It’s important to educate users about the type of data wearables collect, where it goes, and how it might be used, notes Pollard. “It might seem like the data I share with a [wearable] app stays on my smartphone or wearable. In reality, it goes to the cloud and might be shared with a number of third parties. Less sophisticated users may never know that happens, or that they could opt out of it when or if given the choice.”

7. Limit access to employee fitness and wellness data

To run a successful wellness program or fitness challenge, an enterprise needs opt-in data from participating employees, such as how many steps they’ve taken. But you should restrict wellness program data access to those who need it to run the program, advises McDonough.
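In practice that restriction can be as simple as role-based checks in the wellness platform. The following is a minimal, illustrative sketch; the roles and permissions are assumptions, and a real deployment would back them with the company’s identity provider rather than an in-code dictionary.

```python
# Illustrative role-based access sketch: only wellness-program administrators can
# see per-employee step data; managers see aggregates only. Roles are made up.
ROLE_PERMISSIONS = {
    "wellness_admin": {"read_steps", "read_challenge_totals"},
    "manager": {"read_challenge_totals"},  # aggregates only, no per-person data
    "employee": set(),
}

def can_access(role, permission):
    """Return True if the role is allowed the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert can_access("wellness_admin", "read_steps")
assert not can_access("manager", "read_steps")  # managers cannot see individual data
```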

8. Get a clear picture of everything connecting to the enterprise network

“Understanding the full inventory of assets connecting to your enterprise network is critical,” says Anand. “You can’t protect what you don’t realize is on your network, so a process to profile and set policies for all devices that wearables would connect to on your network is an important first step.”

Wearables “should be treated as potential threats like any other computing device,” notes McNeil. “Keep an inventory of them, utilize mobile device management to understand which employees are using related mobile applications on their phone, and ensure that communications used by these devices and companion software are observed to leverage proper encryption over the network.”
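A modest starting point is reconciling what is actually observed on the network against a maintained inventory. The sketch below is illustrative; the MAC addresses and data sources (for example, DHCP leases or an MDM export) are placeholders.

```python
# Illustrative sketch: compare devices seen on the network against a known
# inventory and flag anything unrecognized for profiling and policy assignment.
known_inventory = {
    "a4:c1:38:11:22:33": "fitbit-sync-dongle (facilities)",
    "f0:99:b6:44:55:66": "corp-laptop-0142",
}

observed_on_network = ["a4:c1:38:11:22:33", "d8:3a:dd:77:88:99"]  # placeholder data

for mac in observed_on_network:
    if mac not in known_inventory:
        print(f"Unknown device {mac}: profile it and apply a wearables/IoT policy")
```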

9. Require multi-factor authentication

CISOs should require employees to use multi-factor authentication on their smartphones “as an added layer of protection,” Anand says. He also advises organizations to “use behavioral analytics to identify abnormal patterns of IT access and usage. At the first sign of suspicious behavior associated with a user’s smartphone that is a known participant of a wellness program, IT can act to mitigate any potential damage.”
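A very simplified version of that behavioral check might flag access events that fall outside a user’s normal hours or come from a previously unseen location, as in the sketch below. The log format, hours and locations are invented for illustration.

```python
# Illustrative sketch: flag access events from a wellness-program participant's
# phone that are off-hours or from a new location. Thresholds and data are made up.
from datetime import datetime

USUAL_HOURS = range(7, 20)  # 07:00-19:59 local time
known_locations = {"alice": {"Austin", "Dallas"}}

def is_suspicious(user, timestamp, location):
    """Return True if the event is outside usual hours or from an unseen city."""
    hour = datetime.fromisoformat(timestamp).hour
    off_hours = hour not in USUAL_HOURS
    new_location = location not in known_locations.get(user, set())
    return off_hours or new_location

print(is_suspicious("alice", "2017-03-29T03:12:00", "Austin"))     # True: 3 a.m. access
print(is_suspicious("alice", "2017-03-29T10:05:00", "Reykjavik"))  # True: new location
```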

10. Prepare for security and privacy risks, especially in the short term

We’re “a long way” from “IoT anti-malware solutions,” notes Pollard. Wearables use a variety of third-party components, operating systems and software—there’s no dominant operating system, such as Microsoft Windows, to standardize on or build upon, he explains.

So the road will likely be rocky in the near term. Long term, the security situation will improve, Manzuik says. But it could take a few high-profile vulnerabilities or hacks to get us there.