How to Protect Wearable Devices Against Cyberattacks

IEEE cybersecurity experts share ways to avoid software design flaws

8 March 2016

Wearable fitness trackers flew off store shelves last year. Sales were up more than 170 percent over the previous year, with more than 78 million devices sold, according to the research firm IDC. As with other popular devices, hackers are finding ways to steal users’ information. According to BuzzFeed, in a string of incidents in December, hackers used leaked e-mail addresses and passwords from third-party websites to log in to Fitbit accounts.

Some of the cybersecurity experts working to protect the information that wearables gather are participating in the IEEE Center for Secure Design, which focuses on identifying and preventing software design flaws. It’s part of the IEEE Cybersecurity Initiative.

In February the center released “WearFit: Security Design Analysis of a Wearable Fitness Tracker,” which describes how security flaws in a wearable device can be avoided. WearFit, a fictitious wearable created for the project, was modeled on the architecture and components of existing devices, each of which presents a potential opportunity for hackers.

“For security professionals, we highlight the importance of building security in from the design of the software all the way through the development and testing, until it is eventually brought to market,” Jacob West said in a news release. West, a founding member of the center, cowrote the report. He is chief architect for security products at NetSuite, a company in San Mateo, Calif., that produces cloud business management software.

The analysis took the Center for Secure Design’s list of the 10 most common software security design flaws, along with its recommendations for preventing them, and outlined how to apply each one to the wearable. Here are a few of the measures incorporated into WearFit.

ESTABLISHING TRUST

WearFit’s system was designed so that any mobile device running the app could relay data from the tracker to the server. Therefore, additional security constraints were built in to ensure that users can’t easily see or tamper with others’ activity data. Those measures include making the default data settings on the website private. A simple dashboard lets users see what data they’ve shared and with whom, and sharing relationships can be revoked or modified at any time.
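Although the report doesn’t publish code, the default-private model it describes might look something like this minimal Python sketch (all names here are hypothetical):

    from dataclasses import dataclass, field

    @dataclass
    class ActivitySharing:
        """Tracks who may view a user's activity data; private by default."""
        owner: str
        grants: set = field(default_factory=set)  # user IDs granted access

        def share_with(self, user_id: str) -> None:
            self.grants.add(user_id)

        def revoke(self, user_id: str) -> None:
            self.grants.discard(user_id)

        def can_view(self, requester: str) -> bool:
            # Only the owner and explicitly granted users may see the data.
            return requester == self.owner or requester in self.grants

    sharing = ActivitySharing(owner="alice")
    print(sharing.can_view("bob"))   # False: private by default
    sharing.share_with("bob")
    print(sharing.can_view("bob"))   # True, until Alice revokes the grant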

Like real trackers, WearFit uses open-source and commercial libraries, frameworks, and other APIs, and it approves only known components. The maker also runs security reviews and static analysis to look for vulnerabilities.

Those constraints shape the security architecture, so the tracker uses a pairing process to establish trust with the mobile device: both devices display a visual representation of the same token so the user can verify a match. Once the match is confirmed, each device stores the other’s identity.
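The report doesn’t specify the token mechanics, but a numeric-comparison scheme like the following Python sketch captures the idea: both sides derive the same short code from values exchanged during pairing, and the user confirms that the two displays match.

    import hashlib
    import secrets

    def verification_code(tracker_nonce: bytes, phone_nonce: bytes) -> str:
        """Derive a short, human-comparable code from the exchanged nonces."""
        digest = hashlib.sha256(tracker_nonce + phone_nonce).digest()
        return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"

    # Each device contributes a random nonce during pairing (transport assumed).
    tracker_nonce = secrets.token_bytes(16)
    phone_nonce = secrets.token_bytes(16)

    # Both screens render the same code; the user visually verifies the match,
    # after which each device stores the other's identity for future sessions.
    print("Tracker displays:", verification_code(tracker_nonce, phone_nonce))
    print("Phone displays:  ", verification_code(tracker_nonce, phone_nonce))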

AUTHENTICATION

WearFit’s secure design prevents an unknown entity from gaining access without first authenticating. Users must register a new account through the website or sign in through Facebook or Google. Lost passwords must be reset; they can’t be retrieved in plain text, which minimizes the chance of hackers obtaining them. Five failed authentication attempts lock the user’s account, and a password reset is required to unlock it. For the password to be reset, the e-mail address the user provides must match the one on the account.
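In code, the lockout and reset rules might be sketched like this (hypothetical names; the report describes the behavior, not an implementation):

    from dataclasses import dataclass

    MAX_FAILED_ATTEMPTS = 5  # per the WearFit design

    @dataclass
    class Account:
        email: str
        failed_attempts: int = 0
        locked: bool = False

        def record_failed_login(self) -> None:
            self.failed_attempts += 1
            if self.failed_attempts >= MAX_FAILED_ATTEMPTS:
                self.locked = True  # only a password reset unlocks the account

        def request_password_reset(self, provided_email: str) -> bool:
            # The address supplied must match the one on the account before
            # a reset (and the unlock that comes with it) is permitted.
            if provided_email.lower() != self.email.lower():
                return False
            self.failed_attempts = 0
            self.locked = False
            return True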

Even after users have been authenticated, they have to be authorized. When the device is first paired with a user’s account, the authorization process grants permission for the device to transfer fitness activity data to the mobile app and the website.

Before data can be accessed or modified, the application verifies that the user’s session is active and that the person has permission to perform the given action. If a discrepancy is detected, the request is denied and the session is terminated.
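A minimal sketch of that check, assuming a simple in-memory session store (all names hypothetical):

    SESSIONS = {
        "token-123": {"user": "alice", "active": True,
                      "permissions": {"read_activity", "upload_activity"}},
    }

    def handle_request(token: str, action: str) -> str:
        session = SESSIONS.get(token)
        # The session must exist and still be active.
        if session is None or not session["active"]:
            raise PermissionError("no active session; request denied")
        # The action must be one the user was authorized to perform.
        if action not in session["permissions"]:
            session["active"] = False  # discrepancy: terminate the session
            raise PermissionError("not permitted; session terminated")
        return f"{session['user']} performed {action}"

    print(handle_request("token-123", "upload_activity"))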

Strictly separating data and control instructions, and never processing control instructions from untrusted sources, is key to maintaining security, experts say. That’s why input from the user does not influence the program’s control settings. Authenticated users can perform actions from only a fixed set of allowed operations.
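One common way to enforce that separation, sketched below in Python with hypothetical operations, is a fixed dispatch table: the user’s input can only select an entry from the table and is otherwise treated purely as data.

    def show_steps(payload: dict) -> str:
        return f"steps today: {payload.get('steps', 0)}"

    def show_heart_rate(payload: dict) -> str:
        return f"heart rate: {payload.get('bpm', 0)} bpm"

    # The fixed set of allowed operations; nothing outside it can ever run.
    OPERATIONS = {
        "show_steps": show_steps,
        "show_heart_rate": show_heart_rate,
    }

    def dispatch(op_name: str, payload: dict) -> str:
        handler = OPERATIONS.get(op_name)
        if handler is None:
            raise ValueError(f"unknown operation {op_name!r}; rejected")
        return handler(payload)  # input shapes the data, never the control flow

    print(dispatch("show_steps", {"steps": 4212}))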

The WearFit system assumes that mobile devices, client applications, and websites might have been compromised or replaced by an imposter. Therefore, each of the wearable’s components runs a validation strategy that verifies assumptions about syntax, semantics, and expected values of all data before operating on them.
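Such a validation step might look like the following sketch, with plausibility ranges chosen purely for illustration:

    def validate_activity_sample(sample: dict) -> dict:
        """Check syntax, semantics, and expected values before using the data."""
        # Syntax: required fields must be present with the expected types.
        for key in ("steps", "heart_rate"):
            if not isinstance(sample.get(key), int):
                raise ValueError(f"{key} must be an integer")
        # Semantics / expected values: reject implausible readings.
        if not 0 <= sample["steps"] <= 200_000:
            raise ValueError("steps outside plausible daily range")
        if not 20 <= sample["heart_rate"] <= 250:
            raise ValueError("heart rate outside plausible range")
        return sample

    print(validate_activity_sample({"steps": 4212, "heart_rate": 71}))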

UP TO CODE

Cryptography plays many roles in the tracker’s system. It helps protect communications between the tracker, the mobile device, and the Web server from unauthorized reads and modifications. Because the system requires end-to-end data integrity and confidentiality from the wearable to the website, the tracker uses an immutable asymmetric private key (for which the Web server knows the public key).
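The report doesn’t name a signature algorithm, but with Ed25519 (via the third-party cryptography package) the tracker-to-server integrity check might be sketched as:

    # pip install cryptography
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # On a real device the private key is immutable and factory-provisioned;
    # it's generated here only so the example runs.
    device_key = Ed25519PrivateKey.generate()
    server_public_key = device_key.public_key()  # the server knows this key

    reading = b'{"steps": 4212, "heart_rate": 71}'
    signature = device_key.sign(reading)  # the tracker signs each reading

    try:
        server_public_key.verify(signature, reading)
        print("verified: the data came from the tracker, unmodified")
    except InvalidSignature:
        print("rejected: the data was forged or tampered with")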

In addition, the WearFit system is designed with algorithmic agility so that if cryptographic weaknesses in the algorithms are discovered, the system seamlessly transitions to new ones. The system’s services are cryptographically protected to lessen the potential that others will access or modify customer data. Firmware updates to the device are cryptographically signed and authenticated using a private key that only the tracker’s company knows.
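Algorithmic agility is often achieved by having each signed firmware image name the algorithm it was signed with, so a weakened algorithm can be retired from a registry. The sketch below uses an HMAC stand-in purely to stay self-contained; a real device would instead verify an asymmetric signature against a vendor public key burned into the tracker.

    import hashlib
    import hmac

    VENDOR_KEY = b"demo-key"  # placeholder for the vendor's key material

    def verify_v1(image: bytes, signature: bytes) -> bool:
        expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

    # Registry keyed by algorithm ID; retiring a weak algorithm means
    # removing its entry and shipping a successor under a new ID.
    VERIFIERS = {"v1-hmac-sha256": verify_v1}

    def check_firmware_update(header: dict, image: bytes) -> bool:
        verify = VERIFIERS.get(header.get("alg"))
        if verify is None:
            return False  # unknown or retired algorithm: refuse the update
        return verify(image, header.get("signature", b""))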

“We hope this report reflects the Center for Secure Design’s mission, which is to shift the industry’s focus from exclusively finding and fixing bugs in software to a more balanced approach that also looks at design flaws,” West said. “We can avoid many bugs and vulnerabilities just by how carefully we build a system.”

Each of the techniques is described in more detail in the free downloadable document.
