We have a problem. Well, more of a puzzle. Like much of Europe, the UK is gradually emerging from the lockdown of the last few months – this is great for business, collective sanity and our social lives. But opening up brings risks. If a second wave of COVID-19 is inevitable, and many scientists think it is, how should we avoid the mistakes of our first run?
Imposing another nationwide lockdown like the one this spring risks economic ruin for an already ailing UK economy. But with a vaccine still a long way off, ‘keeping calm and carrying on’ would be even more disastrous.
One solution you’ve probably heard a lot about in the last few months is contact tracing. Or, more specifically, the new NHS COVID-19 app. Some have boldly declared the technology, coupled with testing, the answer to a return to normality. Meanwhile, others have raised serious cybersecurity and data privacy concerns.
So, how does contact tracing work? Are privacy activists and cybersecurity experts right to be worried about it? And, are your privacy and cybersecurity really in peril?
How does contact tracing work?
There are many different ways apps like this could work, but for simplicity, let’s stick with how the NHS app works.
The app is incredibly simple. It uses Bluetooth to ‘ping’ any other phones (with the app downloaded) in your vicinity. The app then stores a record of anyone you’ve been in close contact with over a relevant time frame – for example, the 2-14 days it typically takes for symptoms to appear in someone who has come into contact with the virus.
If anyone receives a COVID-19 diagnosis, the app notifies everyone recorded within the infection range. It then sends a message asking users to self-isolate.
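To make the idea concrete, here is a minimal sketch of that on-device contact log. This is purely illustrative – the class and method names are invented for this article, and the real NHS app uses rotating anonymous Bluetooth identifiers and risk scoring far beyond this:

```python
import time

RETENTION_SECONDS = 14 * 24 * 3600  # keep contacts for the 14-day incubation window


class ContactLog:
    """Illustrative on-device log of anonymous Bluetooth 'pings' (not the real NHS code)."""

    def __init__(self):
        self._contacts = []  # list of (anonymous_id, seen_at) tuples

    def record_ping(self, anonymous_id, seen_at=None):
        """Store an anonymous identifier broadcast by a nearby phone."""
        self._contacts.append((anonymous_id, seen_at if seen_at is not None else time.time()))

    def prune(self, now=None):
        """Drop contacts older than the retention window."""
        now = now if now is not None else time.time()
        self._contacts = [(cid, t) for cid, t in self._contacts
                          if now - t <= RETENTION_SECONDS]

    def recent_contacts(self, now=None):
        """Identifiers to notify if this user reports a positive test."""
        self.prune(now)
        return {cid for cid, _ in self._contacts}
```

The key point is that everything here lives on the phone: the log only holds anonymous identifiers, and old entries are discarded once they fall outside the infection window.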
What are the privacy concerns?
At this point, you may be wondering what the problem is. The app seems intuitive, it has the crucial benefit of simplicity, and it’s easy to scale (after all, 79% of us own a smartphone).
Most experts broadly agree that such a system is needed and a good idea. Where opinion differs is over the best way to design an app to deliver it.
This argument centres on whether we should be building centralised or decentralised apps to tackle contact tracing. A centralised app means that in the event a user flags a positive test result, the data from their phone is sent to a centralised database run by a healthcare body or the government. This central database then unlocks the identities of the infected person and anyone they’ve been near.
In a decentralised model, this same process is repeated on the phone itself, meaning the government or healthcare body never receives any identifying information about app users. Instead, any data they collect is depersonalised, for example, the number of people infected and their geographic spread.
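The difference between the two models can be sketched in a few lines. In this hypothetical decentralised flow (function names are invented for illustration, not taken from any real app), the server only ever sees anonymous IDs that infected users choose to upload, and the matching happens on each phone:

```python
def publish_infected_ids(server_store, uploaded_ids):
    """Health-authority server: append anonymous IDs from confirmed cases.
    It never receives contact logs, names, or locations."""
    server_store.update(uploaded_ids)
    return server_store


def check_exposure_locally(device_contact_log, server_store):
    """On-device matching: download the published list and intersect it with
    the phone's own contact log. Only a yes/no result is produced locally."""
    matches = set(device_contact_log) & server_store
    return len(matches) > 0
```

In the centralised model, by contrast, the intersection would be computed on the server, which therefore has to hold every user’s contact log – exactly the design choice the privacy debate turns on.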
Privacy and security campaigners worry about the centralised model because it’s open to ‘scope creep’. Or, to put it another way, just because the technology is being used for benign purposes now, doesn’t mean it couldn’t be applied for mass surveillance in the future.
The UK had planned to use a centralised model. However, partly due to these concerns, and partly because Apple and Google declared they wouldn’t support the centralised approach on their phones, it’s now switched to a decentralised model.
What about security?
The other big concern about any contact tracing app stems from whether its data is completely safe from cyber attacks. A recent report from two academics specialising in cybersecurity reveals that contact tracing apps may have some unforeseen vulnerabilities.
We won’t delve too far into the technical reasons behind the findings. In essence, most of the models for apps we’ve seen from governments so far transmit encrypted and unencrypted data side-by-side. Security experts fear that this could mean would-be hackers have an ‘in’ to identify individual users and steal their data.
Are your cybersecurity and privacy really at risk?
We’ve outlined some of the security and privacy concerns about contact tracing apps, but how at risk is anyone who uses one?
Privacy – Had the UK government pushed ahead with its plan to use a centralised model, this would have been a very different article. However, the move to a decentralised approach has mitigated most privacy concerns.
A decentralised app won’t share any personal information about you. It won’t share your geographic location with any third party. And, from an inter-user standpoint, the design shouldn’t allow anyone to work out who in their recent contacts has become symptomatic.
Security – This issue is a little thornier. The questions raised by the report we mentioned earlier haven’t gone away, but at this stage, they remain theoretical problems rather than something users are reporting. What’s more, GCHQ’s National Cyber Security Centre (NCSC) is aware of the report’s findings and is working to address them.
Contact tracing apps aren’t perfect, but it’s a balancing act. As with any state-run technology, they face questions about privacy and security. On the other hand, the risks to privacy are small and security is only likely to improve as the technology does. More importantly, contact tracing has enormous potential to help us get back to something more like the pre-COVID world. So perhaps the real question is: can we afford not to use it?
Looking to improve your cybersecurity but not sure where to begin? Start by getting certified in Cyber Essentials, the UK government scheme that covers all the fundamentals of cyber hygiene.