Apple and Google have attempted to explain how their contact-tracing technology will work, but experts are still questioning whether it will be effective in reducing infections.
The technology will enable people to find out if they have been in close proximity to someone later diagnosed with COVID-19, based on Bluetooth signals exchanged between their smartphones.
Being able to warn people early about potential transmission of the coronavirus is crucial to tackling the pandemic because people are believed to be at their most contagious before symptoms develop.
But experts have raised concerns about the rare collaboration between the two companies, whose mobile operating systems run on 99% of the world's smartphones.
The companies say their system will use Bluetooth to broadcast signals to nearby smartphones, and each smartphone will keep a log of all the other devices it comes near.
This log will protect the identities of those smartphones by randomising each device's identifier and changing it every 15 minutes.
If users choose to share this information with public health authorities, they can then be alerted if any of those other devices belonged to someone who was diagnosed with COVID-19.
Over the coming months the companies say the contact-tracing feature will be added to the underlying iOS and Android operating systems. Any positive matches would prompt users to download their relevant public health app.
Accessing this data, which will only happen if users choose to share it, will be limited to public health authorities as recognised by the national governments where the companies operate.
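The rotating-identifier scheme described above can be sketched in code. This is a simplified illustration under stated assumptions, not the companies' actual protocol: the real system derives identifiers cryptographically from keys held on the device, while this sketch simply uses fresh random identifiers. The `Device` class and its methods are hypothetical names for illustration only.

```python
import os
import time

ROTATION_SECONDS = 15 * 60  # identifiers change every 15 minutes


class Device:
    """Simplified sketch of the rotating-identifier model: broadcast a
    random identifier, rotate it every 15 minutes, log identifiers heard
    from nearby phones, and match logs against identifiers shared by
    diagnosed users."""

    def __init__(self):
        self.current_id = os.urandom(16)        # random 128-bit identifier
        self.last_rotation = time.time()
        self.seen_ids = set()                   # identifiers heard from nearby devices
        self.broadcast_ids = [self.current_id]  # identifiers this device has used

    def maybe_rotate(self, now=None):
        # Change the broadcast identifier every 15 minutes so observers
        # cannot link repeated sightings of the same phone over time.
        now = time.time() if now is None else now
        if now - self.last_rotation >= ROTATION_SECONDS:
            self.current_id = os.urandom(16)
            self.broadcast_ids.append(self.current_id)
            self.last_rotation = now

    def hear(self, identifier):
        # Record an identifier received over Bluetooth from a nearby phone.
        self.seen_ids.add(identifier)

    def check_exposure(self, diagnosed_ids):
        # If a diagnosed user consents, the identifiers their phone broadcast
        # are published; other phones compare them against their own logs.
        return bool(self.seen_ids & set(diagnosed_ids))
```

In this model no phone ever learns another phone's identity, only its rotating identifiers, and matching happens locally on each device rather than on a central server.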
But there are significant issues with this model.
Security expert Professor Ross Anderson, of the University of Cambridge, warned: “If the app is voluntary, nobody has an incentive to use it, except tinkerers and people who religiously comply with whatever the government asks.”
And if the public health authority apps were compulsory, then “the incentive to cheat will be extreme” as human nature will drive people to adapt to the constraints imposed on them by the system, Prof Anderson added.
The privileges a compulsory app could confer – for example, if it offered immunity certification – might also introduce dangerous incentives, such as some people trying to contract the virus on purpose.
Another significant issue would be people falsely claiming to have contracted the virus in order to send false alerts to others.
Professor Anderson wrote: “Anyone who’s worked on abuse will instantly realise that a voluntary app operated by anonymous actors is wide open to trolling.
“The performance art people will tie a phone to a dog and let it run around the park; the Russians will use the app to run service-denial attacks and spread panic; and little Johnny will self-report symptoms to get the whole school sent home.”
Not to mention the “prank” aspect of being able to light up everyone you’ve been near’s devices with “you’ve been exposed to covid” (without them knowing you’re the culprit) at any time, without some kind of pretty heavy manual ID/result verification at the moment of reporting.
— Moxie Marlinspike (@moxie) April 10, 2020
Security expert Moxie Marlinspike, who created Signal Messenger, warned on Twitter that there were numerous ways in which malicious actors could troll the system and the people whose devices were near their own.
To tackle this, the companies say they are working with public health authorities to provide the validation for any diagnoses.
But for many people getting a diagnosis simply is not possible.
Sky News has revealed the national coronavirus testing centre is only conducting 1,500 tests a day – significantly short of the 100,000 the government was aiming for.
With such a limited number of cases being flagged up by the official testing system, Apple and Google say that public health authorities may need to use another method to identify cases. It is not clear what that method would be.