11 Apr 2020 : Google/Apple's “privacy-safe contact tracing”, a summary
As I discussed yesterday, Google and Apple recently announced a joint privacy-preserving contact tracing API aimed at helping people find out whether they'd been in contact with someone who subsequently tested positive for COVID-19.

We've already relinquished so many rights in the fight against COVID-19 that it's important privacy isn't another one, not least because the benefit of contact tracing increases with the number of people who use it, and a system that violates privacy will rightly put people off.

So I'm generally positive about the specification. It seems to be a fair attempt to provide both privacy and functionality. Not only that, it provides a benchmark for privacy that it would be easy for governments to fall short of if the spec weren't already available. Essentially, any government that now provides less privacy than this is either incompetent or has ulterior motives.

But what does the spec actually say? Apple and Google have provided a decent high-level summary in the form of a slide deck, from which the image below is taken. They've also published a (non-final) technical specification. However, for me the summary is too high-level (it explains what the system does, but not how it works) and the technical specs are too low-level (there's too much detail to get a quick understanding). So this is my attempt at a middle ground.
 
A high-level overview of the approach

There are three parts to the system. There's the OS part, which is what the specification covers; there's an app provided by your regional health authority; and there's a server run by your regional health authority (or more likely, a company the health authority subcontracted to). They all act together to provide the contact tracing service.
 
  1. Each day the user's device generates a random secret $k$, which stays on the user's device for the time being.
  2. The device then broadcasts BLE beacons containing $h = H(k, c)$ where $H$ is a one-way hash function and $c$ is a counter. Since $k$ can't be derived from $h$, and since no pair of beacons $h_1, h_2$ can be associated with one another, the beacons can't in theory be used for tracking. This assumes that the BLE subsystem provides a level of tracking-protection, for example through MAC randomisation. Such protections don't always work, but at least in theory the contact-tracing feature doesn't make it any worse.
  3. The device also listens for any beacons sent out by other users and stores any it captures locally in a list $b_1, b_2, \ldots$.
  4. If a user tests positive for COVID-19 they are asked to notify the regional health authority through the app. This involves the app uploading their secret $k$ for the day to a central database run by the regional health authority (or their subcontractor). From what I can tell, neither Apple nor Google need to be involved in the running of this part of the system, or to have direct access to the database. Note that only $k$ is uploaded. Neither the individual beacons $h_1, h_2, \ldots$ sent, nor the beacons $b_1, b_2, \ldots$ received, need to be uploaded. This keeps data quantities down.
  5. Each day the user's phone also downloads a list $k_1, k_2, \ldots, k_m$ of secrets associated with people who tested positive. This is the list collated each day in the central database. These keys were randomly generated on the phones of the people who uploaded them, and so are pseudonymous.
  6. The user's phone then goes through the list and checks whether any of the $k_i$ is associated with someone they interacted with. It does this by re-calculating the beacons derived from that secret, $H(k_i, 1), H(k_i, 2), \ldots, H(k_i, n)$ (where $n$ is the number of beacon intervals in a day), and comparing each against every beacon it collected on the same day.
  7. If there's a match $H(k_i, j) = b_l$, then the user is alerted that they likely interacted with someone who has subsequently tested positive. Because the phone also now knows the counter $j$ used to generate the match, it can also provide a time for when the interaction occurred. (A code sketch of this flow follows the list.)
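
To make those steps concrete, here's a minimal sketch of the simplified scheme in Python. This is not the real Apple/Google key schedule, which defines its own key-derivation functions; HMAC-SHA-256 stands in for the one-way hash $H$, and the key sizes and number of beacons per day are illustrative assumptions of mine.

```python
# Sketch of the simplified scheme above, NOT the real Apple/Google key
# schedule. HMAC-SHA-256 stands in for the one-way hash H; key sizes and
# the number of beacons per day are illustrative assumptions.

import hashlib
import hmac
import os

BEACONS_PER_DAY = 144  # assumption: one beacon value per ~10-minute interval

def generate_daily_secret() -> bytes:
    """Step 1: a fresh random secret k, kept on the device."""
    return os.urandom(16)

def beacon(k: bytes, c: int) -> bytes:
    """Step 2: h = H(k, c), from which k can't be recovered."""
    return hmac.new(k, c.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def match(positive_keys: list[bytes], received: set[bytes]) -> list[tuple[int, int]]:
    """Steps 5-7: re-derive every beacon for each downloaded secret k_i and
    compare against the beacons captured locally. A hit also reveals the
    counter j, i.e. roughly when the contact happened."""
    hits = []
    for i, k_i in enumerate(positive_keys):
        for j in range(BEACONS_PER_DAY):
            if beacon(k_i, j) in received:
                hits.append((i, j))
    return hits

# Example: Alice captures one of Bob's beacons; Bob later tests positive
# and uploads his daily secret.
bob_k = generate_daily_secret()
alice_received = {beacon(bob_k, 42)}
print(match([bob_k], alice_received))  # -> [(0, 42)]
```

The point to notice is that the matching happens entirely on the device: only $k$ ever leaves the phone, and only when its owner chooses to report a positive test.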

This is a significant simplification of the protocol, but hopefully gives an idea of how it works. This is also my interpretation based on reading the specs, so liable to error. By all means criticise my summary, but please don't use this summary to criticise the original specification. If you want to do that, you should read the full specs.

Because of the way the specification is split between the OS and the app, the BLE beacons can be transmitted and received without the user having to install any app. It's only when the user tests positive and wants to notify their regional health authority, or when a user wants to be notified that they may have interacted with someone who tested positive, that they need to install the app. This is a nice feature as it means there's still a benefit even if users don't immediately install the app.

One of the big areas for privacy concern will be the behaviour of the apps provided by the regional health authorities. These have the ability to undermine the anonymity of the system, for example by uploading personal details alongside $k$, or by tracking IP addresses as the upload takes place. I think these are valid concerns, especially given that governments are notorious data-hoarders, and that the system itself is unlikely to be built or run by a health authority directly. It would be a tragic missed opportunity if apps undermine the privacy of the system in this way, but unfortunately it may be difficult to tell unless the source code of the apps themselves is made available.
 
10 Apr 2020 : Initial observations on the joint Google/Apple “privacy-safe contact tracing” specification

Apple and Google today announced a joint protocol to support contact tracing using BLE. You can read their respective posts about it on the Apple Newsroom and Google blog.

The posts offer some context, but the real meat can be found in a series of specification documents. The specs provide enough information about how the system will work to allow a decent understanding, albeit with some caveats.

With so much potential for misuse, and given that mistrust could lead to some people choosing not to use the system, it's great that Google and Apple are apparently taking privacy and interoperability so seriously. But I'm a natural sceptic, so whenever a company claims to be taking privacy seriously, I like to apply a few tests.
 
  1. Are the specs and implementation details (ideally source code) freely and openly available?
  2. Is interoperability with other software and devices supported?
  3. Based on the information available, is there a more privacy-preserving approach that the company could have gone with, but chose not to?

The answers to these appear to be "yes" (though not the source code), "mostly" and "no". It's quite unusual, even for companies like Apple that make bold claims about privacy, to satisfy any one of these, let alone more than one, so this is genuinely very encouraging. Based on the specs released so far, it seems that this has been a good-faith attempt to achieve both protection and privacy.

The catch is that the API defined by the specs provides only half of a full implementation. Apple and Google are providing an API for generating and capturing BLE beacons. They don't say what should happen to those beacons once they've been captured. Presumably this is because they expect this part of the system to be implemented by a third-party, most likely a regional public health authority (or, even more likely, a company that a health authority has subcontracted to).

Again, this makes sense, since different regions may want to implement their own client and server software to do this. In fact, by delegating this part of the system, Google and Apple strengthen their claim that they're acting in good faith. They're essentially encouraging public health authorities and their subcontractors to live up to the same privacy standards.
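
To put the split into code terms, the OS-level surface might look something like the hypothetical interface below. None of these names come from the framework specification; the point is simply where the boundary sits. The OS owns key generation, broadcasting and scanning, and only hands material across to the health-authority app at the two points where its part of the system takes over.

```python
# Hypothetical shape of the OS-side API, to illustrate the split described
# above; the names are invented and do not come from the framework spec.

from abc import ABC, abstractmethod

class ContactTracingOS(ABC):
    """The part Apple and Google specify: key generation and BLE broadcast/scan."""

    @abstractmethod
    def start(self) -> None:
        """Begin broadcasting this device's beacons and collecting others'."""

    @abstractmethod
    def get_daily_secrets(self) -> list[bytes]:
        """Hand the device's own daily secrets to the health-authority app,
        so they can be uploaded after a positive test (step 4 above)."""

    @abstractmethod
    def check_exposure(self, positive_keys: list[bytes]) -> bool:
        """Given the downloaded list of positive secrets, do the on-device
        matching against captured beacons (steps 5-7 above)."""
```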

Apart from the privacy issues, my other main interest is in having the same system work on operating systems other than iOS and Android. My specific interest is Sailfish OS, but there are other smartphone operating systems that people use, and locking their users out of something like this would be a terrible result, both for those operating systems and for all users.

Delegation of the server and app portions to health authorities unfortunately makes it highly unlikely that alternative operating systems will be able to hook into the system. For this to happen, the health authority servers would also need to provide a public API. Google and Apple leave this part completely open, and the likelihood that health authorities will provide an API is unfortunately very slim.
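
Such an API wouldn't need to be complicated. The sketch below is entirely hypothetical: the spec defines no server interface, and the endpoint names, payloads and URL are invented for illustration. But it shows the minimal surface an alternative-OS client would need: somewhere to submit the daily secrets of a user who tests positive, and somewhere to fetch the day's collated list.

```python
# Hypothetical only: neither the Apple/Google spec nor any health authority
# defines this interface. Endpoints, payloads and the URL are invented to
# illustrate what an open, OS-agnostic server API could look like.

import requests

BASE_URL = "https://health-authority.example/api/v1"  # placeholder, not a real service

def upload_positive_keys(daily_secrets: list[str]) -> None:
    """Submit the (hex-encoded) daily secrets of a user who tested positive."""
    response = requests.post(f"{BASE_URL}/positive-keys",
                             json={"keys": daily_secrets}, timeout=10)
    response.raise_for_status()

def download_positive_keys(date: str) -> list[str]:
    """Fetch the day's collated list of secrets so the device can do the
    beacon matching locally."""
    response = requests.get(f"{BASE_URL}/positive-keys",
                            params={"date": date}, timeout=10)
    response.raise_for_status()
    return response.json()["keys"]
```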

I'd urge any organisation planning to develop the client software and servers for a fully working system to prove me wrong. Otherwise users of alternative operating systems, like me, could be left unable to access the benefits of the system. This not only reduces its utility for those users to nil, it also reduces the effectiveness of the system for everyone, whichever operating system they use, because it increases the false negative rate.

There's one other aspect of the specification that intrigues me. In the overview slide deck it states that "Alice’s phone periodically downloads the broadcast beacon keys of everyone who has tested positive for COVID-19 in her region." (my emphasis). This implies some form of region-locking that's not covered by the spec. Presumably this is because the servers will be run by regional health authorities and so the user will install an app that applies to their particular region. There are many reasons why this is a good idea, not least because otherwise the amount of data a user would have to download to their device each day would be prohibitive. But there is a downside too. It essentially means that users travelling across regions won't be protected. If they interact with someone from a different region who tests positive, this interaction won't be flagged up by the system.

The spec is still very new and no doubt more details will emerge over the coming days and weeks. I'll be interested to see how it pans out, and also interested to see whether this can be implemented on devices like my Sailfish OS phone.
 
Reference to region-locking, taken from the overview slide deck