Uncharted waters for wearable technologies: data privacy, security and social acceptability

"The applications of digital technology to health and wellbeing, and to safety, have some great potential benefits, but are also taking us into some uncharted waters in terms of privacy, security and social acceptability. There are ways around these issues, which need to be underpinned by clarity and transparency, and consideration of the right incentives and frameworks." – David Hardman, Senior Research Analyst

A PricewaterhouseCoopers (PwC) survey of 2,000 UK employees (2016) produced the following findings:

  • 65% of people think that technology has a real role to play in their health and wellbeing.
  • 61% of employees are keen for their employer to take an active role in their health and wellbeing.
  • 45% of people believe that their employer does play an active role.
  • 38% of people do not trust their employer to use the data they collect to benefit the employee.
  • 25% of the people who did not trust their employer would be willing to share their data if they were given an incentive, such as increased pay or flexible working hours.

Thus, whilst a considerable proportion of people have a positive view of technology relating to health and wellbeing (backed up by the strong sales of consumer wearables), there is also a substantial level of mistrust about how employers might use data.

It is possible that an employer in a safety-critical industry could have terms of service that require employees to use certain wearable devices. Where these confer clear benefits on employees, there may be minimal resistance to overcome. Who, these days, would object to maintenance workers wearing hard hats and high-vis jackets? But what about a situation in which an employer, trying to minimise the risks from worker fatigue, wanted employees to wear sleep-tracking devices outside working hours? Some people, at least, might consider this an unwarranted infringement on their personal life.

Bearing in mind such concerns, employers may need to be proactive in winning the support of their staff by openly discussing the health and safety benefits that these devices can bring. For example, body cameras worn by the British Transport Police can help to identify violent or abusive people and thus help to protect staff. Additionally, trade union health and safety representatives may be able to help persuade their members of the benefits that wearables might bring, and they can also articulate workers' concerns to employers. For example, when an employer provides a device to employees with the assurance that GPS location data will not be tracked, will this assurance still hold if there is a safety incident? This is the kind of issue that needs to be anticipated and incorporated into an agreed framework.

Such frameworks may be most useful where safety is concerned. In the case of health and wellbeing it may be that voluntary arrangements are more suitable. For example, in sedentary occupations an employer might offer free wearables to staff, together with financial incentives to achieve certain physical activity goals. But even here it is possible to envisage tensions that might arise. How voluntary is voluntary? If a substantial majority of staff do sign up to such a scheme, then it is possible that – either explicitly or implicitly – the remaining minority might feel under pressure to do likewise. Perhaps they might worry about being viewed as a member of 'the awkward squad', with all that that might imply in terms of career progression.

We also need to consider the potential for unauthorised access to data. Many wearable technologies work by sending information from the measurement device to another location. In the case of consumer wearables, data are typically sent from the device to the user's smartphone and to a cloud server. If used in an organisational setting, anonymised and aggregated data might also go to a company 'dashboard'. The mere fact that data flow from one location to another constitutes a potential privacy risk. For example, most fitness trackers communicate with a master device (e.g. a smartphone) using Bluetooth Low Energy (BLE). Smartphones minimise their own energy consumption by frequently disconnecting from the fitness tracker, and the tracker, in turn, frequently transmits advertisement packets to announce its presence to the master device. Third parties can "sniff" these packets and make accurate inferences from the data collected. Researchers have found that fitness trackers often do not randomise their BLE address, and even when they do, an eavesdropper can use the volume of traffic to infer the user's activity with a high degree of accuracy. It is thus possible for an employer to use sniffers to track and monitor the activities of employees within the workplace (e.g. walking or sitting).
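
To make this exposure concrete, the sketch below passively listens for BLE advertisement packets and logs each device's address, name and signal strength. The use of the Python bleak library, the ten-second listening window and the logged fields are illustrative assumptions; any off-the-shelf BLE scanner would surface much the same information.

# Minimal sketch of passive BLE advertisement scanning, assuming the Python
# "bleak" library (pip install bleak); any BLE sniffer reveals similar data.
import asyncio
from bleak import BleakScanner

async def sniff(duration: float = 10.0) -> None:
    seen = {}

    def on_advertisement(device, adv_data):
        # Each advertisement exposes the device address, an optional local name
        # and a signal strength; logged over time, this is enough to notice a
        # tracker's presence and roughly where its wearer is on a site.
        seen[device.address] = (adv_data.local_name, adv_data.rssi)

    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(duration)   # listen passively for a fixed window
    await scanner.stop()

    for address, (name, rssi) in seen.items():
        print(f"{address}  name={name!r}  rssi={rssi} dBm")

if __name__ == "__main__":
    asyncio.run(sniff())

Correlating the volume and timing of such packets over time is what allows an eavesdropper to infer activity, such as walking versus sitting, even where addresses are randomised.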

A report by Symantec includes corporate misuse as one of the risks from wearable technologies, along with identity theft, profiling, location of user or stalking, and embarrassment and extortion. In addition, once a user's data are stored on a cloud server he or she should be able to expect that the host service is not using that data in inappropriate ways and that the data are secure. As an industry we must make sure that such risks are well-managed, either by adopting safe-by-design principles or by putting the necessary mitigations in place.
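
As a simple illustration of one such mitigation, the sketch below encrypts a record on the device side before it is uploaded, so that a cloud host only ever holds opaque ciphertext. The Python cryptography package, the field names and the in-code key are assumptions made purely for illustration.

# Illustrative sketch: encrypt wearable records client-side before upload.
# Assumes the Python "cryptography" package; real key management (rotation,
# storage in a key-management service) is out of scope here.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # hypothetical: would live in a key store, not in code
cipher = Fernet(key)

record = {"device_id": "tracker-42", "heart_rate": 68, "steps": 5120}

# The cloud service stores only this opaque token.
ciphertext = cipher.encrypt(json.dumps(record).encode())

# Only a holder of the key can recover the original record.
recovered = json.loads(cipher.decrypt(ciphertext))
print(recovered == record)   # True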

This investment in security is worth making because there are potentially huge advantages to be gained from wearables, as explored in earlier articles in this series.

The Economist recently noted that the application of machine learning techniques to medical "big data" accumulated from large numbers of wearables users could lead to advances in the diagnosis and treatment of many conditions. To this end, Apple is working to improve the measurement accuracy of its smartwatch and iPhone health applications, while other tech companies are setting up their own insurance schemes or creating the software to analyse health data (for example, Google has been working with the NHS). 

Employees are more likely to look favourably upon wearable technologies if the benefits of these are clearly apparent and if there is trust between the employer and employees. It might be that positive attitudes could be developed through the implementation of small-scale schemes before anything more ambitious is attempted. Perhaps trials with volunteers could be run, with the understanding that successful outcomes may lead to a wider roll-out.

Consistent with the principles of GDPR, companies need to ensure that their storage and use of data are clearly explained, and staff should be consulted when developing these policies. GDPR-compliant policies and procedures will give people greater control over the way that their data are used, but will not by themselves prevent breaches of privacy or illegal use of data. In addition, it is possible for data to become corrupted over time. Policies should therefore include provisions for their own review and revision in the light of any problems that arise.
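
As one concrete way of putting such a policy into practice, the sketch below pseudonymises employee identifiers with a keyed hash before daily totals are aggregated for a dashboard. The record fields, the choice of HMAC-SHA256 and the key handling are illustrative assumptions rather than anything prescribed by GDPR.

# Illustrative sketch: pseudonymise employee identifiers with a keyed hash
# before daily step counts are aggregated for a company dashboard.
# Field names and key handling are assumptions for illustration only.
import hashlib
import hmac
from collections import defaultdict

SECRET_KEY = b"store-and-rotate-this-separately"   # hypothetical key management

def pseudonymise(employee_id: str) -> str:
    # Replace a direct identifier with a keyed hash (a pseudonym).
    return hmac.new(SECRET_KEY, employee_id.encode(), hashlib.sha256).hexdigest()

def aggregate_steps(records):
    # Total steps per pseudonym, so the dashboard never sees raw identifiers.
    totals = defaultdict(int)
    for record in records:
        totals[pseudonymise(record["employee_id"])] += record["steps"]
    return dict(totals)

if __name__ == "__main__":
    sample = [
        {"employee_id": "E1001", "steps": 4200},
        {"employee_id": "E1001", "steps": 3100},
        {"employee_id": "E1002", "steps": 8800},
    ]
    print(aggregate_steps(sample))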

In short, breaches of privacy and security are a fact of modern life; the point is to minimise the frequency of their occurrence and potential impact, while ensuring that employees can appreciate the benefits of the data they provide.   
