COVID-19 contact tracing: Getting it done — and making it work
April 23, 2020 / Current Affairs, Open Letter / By OPTF
On April 16, I published an open letter detailing my thoughts on contact tracing apps in Australia. In it, I explain why I think the plan is surprisingly well thought through and, in my opinion, sufficiently protects the privacy of the public.
It seems further explanation is required. Currently, the government is doing an atrocious job of convincing Australians to support what’s actually a very compelling product that could genuinely make a huge difference if enough people use it. I’m going to do my best to explain how the app works, why it’s actually relatively harmless, and lastly, what I recommend the government do to make this a success.
Part 1: Is it okay?
There’s so much misleading information about COVIDSafe doing the rounds. It seems like people are imagining what it is based on supposition rather than actually looking at it — so let me explain what COVIDSafe does in simple terms.
How it works
COVIDSafe uses the BlueTrace protocol. This is great — it’s open-source and, in my opinion, pretty low risk when it comes to user privacy.
When you register with COVIDSafe, all you do is enter some basic information for the health system — your name, age range, postcode, and a number you can be contacted on. This is required so the health system can contact you and identify risk factors associated with your age and local area. Nothing else. Then you can go about your life with the app running in the background.
COVIDSafe uses Bluetooth to communicate with other phones around you — not GPS or any other system. Using Bluetooth, your phone publishes an anonymous key to other devices in your proximity once you’ve been near them for long enough to be considered at risk of exposure (15 minutes, according to global health experts). When it does this, it’ll also collect the other person’s key. Identifying keys are rotated every 2 hours, according to the privacy policy, so you don’t have the same identifier linking back to your device for very long. The app does all this in the background, and keeps a record of all of the keys you’ve come across in the last 21 days. That’s really it. You have no way of knowing who these keys belong to. Nobody else knows your keys belong to you. You don’t automatically upload any keys you collect to the government. Location data isn’t collected or used. At all.
Let’s say you then develop flu-like symptoms — you get a test, and you have coronavirus. Then (with your express consent) the encrypted keys your phone has collected in the last 21 days are uploaded to the health system, which has the ability to decrypt them. Health services can then use the phone numbers provided at registration to get in touch with the people behind those keys and let them know they’ve potentially been exposed.
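To make that concrete, here’s a rough Python sketch of the kind of contact log described above. It’s an illustration based on the public description of BlueTrace, not the COVIDSafe source: every name in it is made up, and in the real protocol the rotating temporary IDs are issued in encrypted form by the health authority’s server (which is what lets the health system decrypt them later) rather than generated on the phone.

```python
import secrets
import time
from dataclasses import dataclass, field

# Illustrative constants taken from the description above, not from the app itself.
ROTATION_SECONDS = 2 * 60 * 60          # temp IDs rotate roughly every 2 hours
EXPOSURE_SECONDS = 15 * 60              # ~15 minutes of proximity counts as close contact
RETENTION_SECONDS = 21 * 24 * 60 * 60   # encounters older than 21 days are discarded

@dataclass
class Encounter:
    their_temp_id: bytes   # the opaque key the other phone broadcast to us
    first_seen: float
    last_seen: float

@dataclass
class ContactLog:
    my_temp_id: bytes = field(default_factory=lambda: secrets.token_bytes(16))
    rotated_at: float = field(default_factory=time.time)
    encounters: dict = field(default_factory=dict)  # their_temp_id -> Encounter

    def current_id(self) -> bytes:
        """The ID we broadcast over Bluetooth, rotated every ~2 hours."""
        if time.time() - self.rotated_at > ROTATION_SECONDS:
            self.my_temp_id = secrets.token_bytes(16)
            self.rotated_at = time.time()
        return self.my_temp_id

    def record_sighting(self, their_temp_id: bytes) -> None:
        """Called whenever Bluetooth sees another phone's temp ID nearby."""
        now = time.time()
        enc = self.encounters.get(their_temp_id)
        if enc is None:
            self.encounters[their_temp_id] = Encounter(their_temp_id, now, now)
        else:
            enc.last_seen = now

    def prune(self) -> None:
        """Drop anything older than 21 days, so the log never grows beyond that."""
        cutoff = time.time() - RETENTION_SECONDS
        self.encounters = {k: e for k, e in self.encounters.items()
                           if e.last_seen >= cutoff}

    def close_contacts(self) -> list:
        """Encounters lasting 15+ minutes, i.e. the ones worth reporting."""
        return [e for e in self.encounters.values()
                if e.last_seen - e.first_seen >= EXPOSURE_SECONDS]

def upload_if_consented(log: ContactLog, user_consented: bool) -> list:
    """Only return keys for upload if the user explicitly consents
    (in the real app, this step only appears after a positive test)."""
    if not user_consented:
        return []
    return [e.their_temp_id for e in log.close_contacts()]
```

The properties that matter are all visible here: the IDs are opaque, the log lives on the phone, it self-prunes after 21 days, and nothing is uploaded without an explicit yes.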
That’s how it works. Pretty clever. Pretty simple. Pretty not scary so far. Let’s dig a little deeper.
Is COVIDSafe a privacy risk?
In my opinion, some of the concerns with this approach aren’t worth all that much discussion. Yes, there is potential that the central servers could be hacked (but your phone number has probably been leaked in dozens of places already, and your health records are online now too, so there’s not much additional risk). Yes, there is potential for ‘surveillance beacons’ to be set up by the health system so they can work out who stood next to a fixed position for more than 15 minutes — but this is ridiculous; if the government wants to know where you are, it can just ask your phone company. Realistically, you’re taking a risk every time you download an app. Just owning a phone is a privacy risk. In the age of Google Maps, Siri, Facebook Pixel, Fitbit, and metadata retention, I really don’t think COVIDSafe even deserves a mention in the 2020 Top 500 list of apps and services that seriously expose users to privacy risks. Most of us are running much bigger risks in the background 24/7 without even thinking about it.
iOS and Android’s built-in permissions systems should prevent the worst kinds of surveillance anyway. You can prevent COVIDSafe from accessing the phone’s built-in location functionality at the operating system level — this way you guarantee that it can’t access your location. Simple as that. At the moment, TraceTogether and COVIDSafe request location permissions because of a Google policy that I’ll explain shortly — but the app never uses location data.
But what about Google and Apple?
Google and Apple have been working on contact tracing too, but there’s a lot of confusion and misinformation floating around about what their solution is and whether it’s ‘better’ than COVIDSafe.
It’s not a matter of Google and Apple’s solution being better or worse at all — all apps have to interface with the phone’s operating system to work. If an app wants to use Bluetooth, for example, it has to send a request through the phone’s Bluetooth APIs (Application Programming Interfaces). These APIs are really useful, as they make it easy for app developers to leverage the hardware and functionality of the device without having to know anything about that specific device. So an app developer can use the same Bluetooth API whether you have the latest Samsung flagship phone, or a cheap and cheerful Chinese smartphone from 2017.
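As a loose analogy, here’s what that abstraction looks like in practice. This snippet uses bleak, a cross-platform Bluetooth LE library for ordinary desktop Python rather than a phone API, but the idea is the same: the developer makes one call, and the library deals with whichever Bluetooth hardware happens to be underneath.

```python
import asyncio
from bleak import BleakScanner  # cross-platform Bluetooth Low Energy library

async def main():
    # The same call works regardless of the Bluetooth chipset in the machine.
    # The library, like a phone OS's Bluetooth API, hides the hardware details.
    devices = await BleakScanner.discover(timeout=5.0)
    for device in devices:
        print(device.address, device.name)

asyncio.run(main())
```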
However, Google and Apple restrict what apps can do with these APIs, particularly on iOS. This is to prevent inexperienced, incompetent, or malicious app developers from draining the battery, using too much data in the background, or doing other undesirable things.
The catch is that this kind of Bluetooth contact tracing requires the phone’s Bluetooth to be always on in the background, actively communicating with all of the devices around it. Typically, iOS and Android don’t want apps doing that, so this would normally be restricted. So Google and Apple aren’t working on a contact tracing app per se — what they’re doing is creating new APIs that will allow contact tracing apps to use normally-inaccessible Bluetooth functionality. They’re also building a contact tracing handshake system into their operating systems, so governments building tracing apps don’t have to — the APIs will do all the hard work for them.
However, TraceTogether and BlueTrace jumped the gun — they’re trying to do contact tracing over Bluetooth before these new APIs exist. It turns out that even though iOS and Android make it pretty hard to build apps with this kind of decentralised, device-to-device functionality, there are some ways around it. This is something we’ve had first-hand experience with when building Session, our decentralised secure messaging app, which you should definitely look at if you like privacy. One of the less desirable things TraceTogether and COVIDSafe do is request location permission on Android. Google forces developers to do this because of a policy designed to warn users that, since Bluetooth broadcasts a device’s MAC address, Bluetooth access could in theory be used to reveal your location to hypothetical ‘beacons’ in the area. In practice this is a very low-risk threat, but Google still requires that users be warned regardless.
Some restrictions, though, have proven impossible to overcome, particularly on iOS. Right now, TraceTogether/COVIDSafe only works when the app is running in the foreground, so people have to keep reopening the app periodically (to prevent iOS from stopping it automatically) if they want to participate in contact tracing. Needless to say, this is bad. That being said, it’s all we’ve got while we wait for these fancy APIs.
So when the government says ‘We don’t need Apple and Google’, they’re not completely wrong. COVIDSafe can function without using these fancy new APIs, and it’s a good thing it can, because Google and Apple won’t be releasing the APIs until mid-May. But BlueTrace’s reliability is a joke, especially on iOS, so I’d wager that once the shiny new APIs hit the streets, every contact tracing app (including COVIDSafe) will use them. But props to BlueTrace for forging ahead anyway. At least you tried.
Even with what Google and Apple are building, individual nations will still have to build and launch their own apps that interface with their own health systems, servers, databases, and other infrastructure. It’ll be easier, but they’ll still have to commit the time and resources to doing it. They’ll just have some slick Silicon Valley APIs to help them out.
Does this mean Australia jumped the gun by taking BlueTrace’s code and building its own version weeks ahead of the Google/Apple release? I don’t think so. It’s bold, but I’d do it too if it means Australians can get out of their homes a week or two earlier. It’ll take several weeks of strong messaging to get the app installed by a significant percentage of the population, by which time Google and Apple’s APIs might be released, so I think it makes sense to push it now even if it won’t be that reliable out of the gate.
Part 2: Making it a success
Alright, so that’s the background information on COVIDSafe — now you know what it’s all about. Maybe you’re surprised, as I was, that it actually isn’t too bad in terms of privacy. However, in order for it to be a success, lots of Australians have to download the app.
This means:
The government needs to properly communicate its plans and implementation to the public.
The government must stop violating our trust. Everything gets easier once it does.
The tech community needs to put its weight behind this (good) use of technology in the public discourse — but only once we’re happy with it.
I’ll go through these points in greater detail. I don’t know if anyone from the government will ever read these recommendations, but hopefully some of you will be able to help in influencing the conversation.
We shouldn’t forget that the whole point of contact tracing is to slow the spread of a virus that has a mortality rate as high as 5% if it’s allowed to overwhelm the health system. If we do contact tracing right, it could literally save lives and help us return to normal life. That’s no exaggeration. Everyone wins. Yes, it may turn out that this app doesn’t end up working very well, but even if it doesn’t, at least we can say we tried.
Scepticism is healthy, but we need to be reasonable. When presented with a good solution that adequately addresses criticisms, it’s only rational to accept it — regardless of politics or ideology. People’s lives and livelihoods are on the line, and we can’t let our personal politics cloud our judgement. Being outright dismissive of this plan because you think Stuart Robert or Scott Morrison are incompetent (which you’d have a lot of reason to believe) isn’t helping anyone.
So let’s look at the final hurdles to this being a success:
1. Trust
This whole thing is going to be a major uphill battle because the government has a shocking track record of violating our trust when it comes to digital rights, digital privacy, and really just anything involving technology. 2018’s TOLA (the Assistance and Access Act), 2015’s Telecommunications (Interception and Access) Amendment (Data Retention) Act, Centrelink’s Online Compliance Intervention (robodebt) debacle — the list of tech shambles goes on and on. It’s incidents like these that lead us to believe that the government is not to be trusted with technology. So when the government now says ‘it’s not a surveillance app’, the public hears the boy who cried wolf once more. People are having a hard time believing it, because we’ve been conditioned to expect the worst for years now.
As a result, the tech community has been very critical of the app. Not because it’s a bad idea at its core — but because we half expect the government to throw in a crucial detail at the last minute that’ll undermine what we’re advocating for.
I have taken a leap of faith by staking out a pretty strong position on this, even though I’m still waiting on some of the details and don’t know what the final version will look like. This goes against my instincts, but I think it’s important to move on this sooner rather than later so I can do my part in putting an end to this pandemic.
I do so because we have direct quotes about this app like “[It] won’t tell us where, because that’s irrelevant, or what you’re doing” and “We don’t care where you are or what you’re doing.” And we can hold them to that. Even if it’s closed source (I’ll get to that).
So if that really is the intention, then we need to make sure the government is doing everything they can to build the trust needed to convince the average joe to install the app on their phone.
First and foremost, they’ve got to make the damn thing open-source. It’s going to be a lot harder to get the people who know what they’re talking about — tech people — on board if they don’t. As I’ve already said, computers seem to catch on fire any time a government official walks into the room. Let us, the tech community, take a look at the source code. We can verify that it does what it says it does, and that there are no glaring issues in it that will lead to bad outcomes like hacks and whatnot. But more importantly, you need us on board. Why?
When this app launches, every grandparent will ring their grandkids to ask them what they think of the app. Everyone will text their ‘techy’ friend to ask for their 2 cents. And every news broadcast in the country will have a security expert talking about what they think, too. Their answers are crucial to the adoption of the app nationwide. The people who really care about technology in this country will make a judgement call in the coming days and weeks, and if they decide they do not like this app, it’ll go nowhere. Making it open-source is the best possible thing you can do to get the support of this nation’s highest-ranking nerds. Their opinions will influence the rest of the nation.
Personally, I have already received calls and texts asking me for my opinion on the app from a couple of not-as-nerdy friends. I expect a deluge to come in the following weeks.
We need to do everything we can to make it clear that open-source is the only option. It’s not that hard. This app needs to be open-source. With reproducible builds. Don’t let some jaded public servant tell you it’s too hard, or that some arbitrary policy prevents it. That’s just lazy. We’ve got to apply pressure.
2. Communication
Empty threats, paper-thin details, and rubbish messaging have already damaged this project. Good old Scotty from Marketing has tripped at the first hurdle.
The government really needs to shore up its messaging and explain how the app actually works. I explained it above in 3 paragraphs. 273 words. That’s about 2-3 minutes of screen time. Sure, not everyone will follow it exactly, but the media would then have a crystal clear understanding of what the app does, and the narrative could change from ‘Scotty wants to know who you’re hanging out with… spooky spooky’ to ‘how do we get Australians to use it?’. I’ve written most of this article based on information I’ve been able to piece together from occasional quotes and analysis by smart people on the internet.
Another huge blunder was to suggest that the government “wouldn’t rule out making the app mandatory.” Firstly, how the f**k can you possibly enforce that? Secondly, if you make it mandatory, you lose all credibility and trust. This app is really easy to sell to people — don’t suck the good out of it by telling people they won’t get a choice. They’ll feel violated. And rightly so. Thankfully there has been some pretty strong backpedalling on this, which I expect will continue.
There have also been other issues which instigated waves of panic due to poor communication, such as the decision to use AWS hosting for COVIDSafe. AWS provides cloud infrastructure, which in this case means that the keys which rotate every 2 hours will be stored on Amazon-owned servers. Using AWS does introduce some hypothetical risks. The most serious concern is that Australian data could be obtained by US law enforcement even though the Amazon servers holding the data are physically located in Australia, not the US. But the data which will be stored on AWS isn’t that sensitive in the grand scheme of things — if it was, I would have much bigger problems with the app at a fundamental level.
I don’t think there is anything surprising here. If I had 2 weeks to ship out an app to service 10+ million Australians, I’d use AWS too. Unfortunately, the reality is that AWS is second to none when it comes to quick and secure cloud provisioning. It’s one less complexity to worry about when rapidly executing a nationwide technology rollout. This wasn’t communicated at all, and it makes selling the app a lot harder.
So, with all this in mind, how do you sell COVIDSafe? Real easy.
‘This app will let you know if you’ve been in close contact with an active case of COVID-19. It’ll keep you, your family, and your colleagues safer. It costs nothing and your privacy is protected. You don’t have to trust the government, because it’s backed by (insert respectable tech people here). Get the app now.’
3. Accountability
Obviously, the government needs to be held accountable by the Australian public if something goes wrong, if they fail to deliver on the finer details, or if they don’t actually respect our privacy the way they say they will.
The get-out-of-jail-free card for them is to make it open-source. But let’s say they don’t do that. What can we do then to validate their claims?
Even without access to a single line of the source code, we can analyse the app’s behaviour from the outside to test the following claims:
The app doesn’t care where you are or what you’re doing
The app doesn’t upload tracing data automatically to central servers
By using some clever tools, software engineers and cybersecurity folks will be able to determine a lot of things about the app, including but not limited to:
If the app is using GPS or other location services
If the app is communicating over the internet or cellular network with anyone
Where those connections go, how frequent they are, and possibly what they contain
How the app uses Bluetooth and what it is saying over it
Through this, we’ll be able to test those claims — even if the app is closed-source. If we see that, immediately after a handshake has been conducted with another device, there’s a spurt of internet activity going to some unknown government server, we can quite rightly accuse the government of lying to us.
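Here’s a minimal sketch of that sort of passive check, in Python using scapy. It assumes the phone’s traffic is routed through a machine you control (for example, a laptop acting as a Wi-Fi hotspot), and the interface name and phone IP below are placeholders. In practice you’d pair this with something like mitmproxy to look at what the connections actually contain.

```python
# Watch where the phone's traffic goes while the app is running.
# Requires root, scapy installed, and the phone's traffic passing through this machine.
from scapy.all import sniff, IP

PHONE_IP = "192.168.12.34"   # placeholder: the handset's address on your hotspot
IFACE = "wlan0"              # placeholder: the interface the phone's traffic crosses

def log_destination(pkt):
    # Print every destination the phone talks to, and how many bytes went out.
    if pkt.haslayer(IP) and pkt[IP].src == PHONE_IP:
        print(pkt[IP].dst, len(pkt))

sniff(iface=IFACE, filter=f"src host {PHONE_IP}", prn=log_destination, store=False)
```

If the only traffic you see is the occasional registration or health-authority check-in, that lines up with the claims. A burst of uploads to some unknown server immediately after every Bluetooth handshake would not.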
And if the app requests location access upon install, the government will have some very big questions to answer. I really hope they spare themselves all of this embarrassment and stay true to their word. Or they could make their lives easier and just make the thing open source. I honestly believe their intentions are good this time round. I just hope that whoever is working on this app knows what needs to be done.
4. Backing
Last but not least, this app needs backing. Backing from Australian technology experts, backing from nerds, backing from health officials, and backing from the digital rights community.
The wider community needs us to help the government get it right, and needs us to tell all of our family and friends to install this app and to tell everyone they know to do it as well. If it goes well, it might actually be useful for a range of other infectious diseases, which is a conversation I look forward to having if we don’t make a dog’s breakfast of it the first time round. It is in everyone’s interest to make this thing work, and get it right.
As I’ve said before, I’ll be opting in. I’ll be backing this. But there’s still room for them to f**k this up, and if they do, I’ll be watching. I hope you will be too.
UPDATE, 26th April: This article has been edited to reflect additional information that has come to light after the official launch of COVIDSafe.
PS. I’d love to contribute to media conversations on this topic, and anything else relating to digital privacy! Reach out to me on Twitter: @SimonAHarman