
Transcript: Re-Opening the Nation: Privacy, Surveillance and Digital Tools for Contact Tracing

[00:00:09] Hello and welcome to Reopening the Nation, a Hastings Center conversation about the next steps forward in the COVID-19 pandemic. We're so pleased to have with us today Ryan Calo, Ed Felten and Mildred Solomon. We're hoping for strong audience participation, so please send us questions by typing them into the Q&A box at the bottom of your screen. The discussants will hopefully be stopping once or twice during the session in order to address audience questions. If your question doesn't get answered, or if you'd like to continue the conversation, I'd encourage you to check out Twitter using the hashtag #ethicsforreopening. This webinar is being recorded, and it will be available on the Hastings Center's Web site later today. I encourage you to share it with family, friends and anyone else who you think may find it useful. Now I'd like to introduce Mildred Solomon, president of the Hastings Center, who will be leading today's discussion.

[00:02:53] Hello, everybody. It's my pleasure to welcome you to this hour-long webinar in the Hastings conversation series Reopening the Nation. Our first webinar featured Zeke Emanuel from the University of Pennsylvania and Danielle Allen from the Safra Center at Harvard University, and it was on the values that should guide reopening. You can still view it by visiting our Web site or searching for it on YouTube. More than five hundred of you are registered for today's event. You're a mix of bioethicists and other scholars from different disciplines, policymakers, and the general public. And that's precisely the mix that the Hastings Center has been aiming for. We want to create spaces where the public can hear and also interact with experts, and where experts can be in dialog with and responsive to members of the public. We're planning a series on ethical issues raised by the pandemic, so please watch our Web site and your email. But we also have a non-pandemic event occurring on Thursday, June 4th, where I'm going to have the pleasure of talking with Alondra Nelson, who's the president of the Social Science Research Council and one of the most prominent sociologists in the country. She's based at the Institute for Advanced Study in Princeton. Professor Nelson is a sociologist of science and has very thoughtful insights about the ethical issues that relate to 21st-century genomics. If you want a break from pandemic ethics, please mark June 4th on your calendars. Today, we're focusing on contact tracing. It's one of the most critical tools we have if the nation is going to open up safely. The idea behind contact tracing is to identify the people who've been in close proximity to an infected person, so the exposed people can isolate long enough to ensure that they won't be passing the virus on while they're incubating it. Contact tracing is part of a three-part package.
You have to have diagnostic testing available to identify those who are positive. You have to have contact tracing in place to identify those who've been exposed to infected people. And you need those contacts to be willing to self-isolate. If we had had testing and contact tracing at scale, we wouldn't have needed so big and general a lockdown. Ethically speaking, you always want to use the least restrictive means to contain a pandemic. Without testing and contact tracing, we have had to use one of the most restrictive means. But if we have them in place, we could move to much more targeted isolation and reopen safely. To bring contact tracing to the scale we need, we'll have to hire and train thousands of people who can play this role. And that's a good thing for our economy as well as for our public health. With so many people out of work, we could actually create a kind of jobs program for contact tracing. Massachusetts has realized this, and they're in the process of hiring over a thousand people whom they're going to train to do contact tracing over the phone. Contact tracing has traditionally been done human-to-human, but since we're such a large country and we have so far to go in rebuilding our public health infrastructure, increasingly there are calls to use digital tools as a supplement to human-to-human contact tracing. There's a lot of interest in these tools, and they may bring some very important benefits, but there are also questions about whether they will work, and, undoubtedly, they're going to raise a number of ethical issues. Depending on how those get resolved, these tools could have a very lasting impact on the nature of our society. So contact tracing shouldn't be controversial. It's not controversial. It's been a tried and true method of restraining pandemics. But digital apps are. So to help us understand what's at stake, we've invited two distinguished experts.
Ed Felten is a professor of computer science and public affairs at Princeton University, where he founded the Center for Information Technology Policy. From 2011 to 2014, he was the chief technologist at the Federal Trade Commission, and in 2015 he joined the Obama administration as deputy U.S. chief technology officer. He's an expert on many things, including computer security and privacy. Ryan Calo is a professor at the University of Washington School of Law. He co-founded and co-directs the University of Washington Tech Policy Lab, which is an interdisciplinary research institute that spans the School of Law, the Information School, and the School of Computer Science and Engineering. He also recently co-founded the Center for an Informed Public, which is dedicated to abating purposeful misinformation in our country. He's testified three times before the U.S. Senate on the use of technology to combat COVID-19. So with that, I'm going to turn to our experts.

[00:08:15] And I want to start with you, Ryan, to help us understand what the expectations and the claims are that are being made about these digital tools. How might they help us?

[00:08:30] Sure. So, first of all, thank you for convening us. This is a really critically important issue at a very important time, and I'm glad to be a part of the conversation. I think you see a range of claims, and I think that's part of the issue. You see everything from a very humble claim, that under specific circumstances these digital technologies, these apps, will help manual contact tracing and make it more efficient and really be a great supplement to the efforts you're describing, all the way to a range of things like: you can download an app, and you can walk around the world and be safe. And because of that range of claims, it's hard for the public to understand what to expect, and it's hard for policymakers to incorporate these digital systems into their pandemic responses, because there is this variety and there is this disconnect. So, again, a range of possibilities, everything from we're just trying to help those important manual contact tracing efforts, all the way to, you know, you'll be safe if you just have this app on you.

[00:09:53] That’s a broad range. Ed, can you fill in a few examples on that continuum of the purposes to which these would be put?

[00:10:01] Sure. As Ryan said, there are a lot of different claims that are made, but I think we can zero in on the claims that are more reasonable and more likely achievable. And here what we're looking at, basically, is a capability that is part of a broad spectrum of approaches as we deal with this issue. We're likely, for a long time, to be in a mode where we're trying to manage the disease. We haven't eliminated or eradicated it, nor are we just completely reopened.

[00:10:35] And during that period, the name of the game is making sure that the limited number of tests we have are used in the best possible way, for those people who are at highest risk, and that decisions about when to self-quarantine, where to go, and how to manage our contacts with people are made in a way that is as well informed, and as based on reasonable risk evaluation, as possible. And so I think if you look at these apps as a way of providing more information to refine our notion of risk, and to help make those decisions somewhat better, along with all the other sources of information such as manual contact tracing, then I think that's something the apps can reasonably hope to achieve, if they're designed and used in a way that's really thoughtful.

[00:11:26] So, Ed, could you walk us through how they would work? And I believe we have an illustration of that, a diagram.

[00:11:36] Yes.

[00:11:37] If we could put that up. Great, OK.

[00:11:44] So this shows a kind of two-by-two grid of what I consider the two main decisions that need to be made in figuring out how to do contact tracing or exposure notification. Along one dimension, I have the two different technical approaches. So people have an app on their smartphone, and the app is supposed to provide information about who's been near whom and who might have been exposed. And there are two technical approaches to figuring out who's been near each other: direct proximity detection, or location history matching. With direct proximity detection, the idea is that your phone would chirp out some kind of beacon signal. The technology, like Bluetooth, is a short-range radio; it can operate within some small number of feet. And then other people's phones would detect those, and you would know you were near someone because you had heard the beacon that their phone had sent. That's direct proximity detection. The other approach is location history matching. What that means is everyone's phones keep records, and various companies keep records, of exactly where we've been, based on GPS and other location data. Which means you could then take people's location histories and match them against each other. And if you see that two people have been in the same place at the same time, you know there was a possible exposure. So those are the two technologies. And then on the other dimension are the two goals that we might have for a system which tries to detect exposure events. The first way we could do it is to just try to notify the exposed individuals. So if I've been near someone who was likely contagious, and the system discovers that after the fact, it would notify just me, and then I could take whatever action I deemed appropriate in response to that.
The other approach is to notify public officials, so that this feeds into a broader contact tracing infrastructure run by public officials. And along these dimensions, there are tradeoffs.

[00:14:02] All four combinations of these have been tried in different places in the world, but the one that seems to be the leading candidate, and the one discussed the most, is the one which is based on the Apple and Google technology. And that's the one in the upper-left box: notifying exposed individuals, and doing direct proximity detection.
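The direct-proximity, notify-the-individual quadrant Felten describes can be sketched in a few lines. This is a deliberately simplified, hypothetical model, not the actual Apple/Google protocol: the real system derives rotating identifiers cryptographically and routes key publication through health authorities, whereas here random tokens and a plain list stand in for both.

```python
import secrets

# Hypothetical, simplified model of direct proximity detection.
# Each phone broadcasts identifiers, remembers identifiers it hears,
# and later matches them locally against keys published by people
# who reported a positive test. No location data is involved.

class Phone:
    def __init__(self):
        self.daily_keys = []   # identifiers we broadcast, kept on-device
        self.heard = set()     # identifiers heard from nearby phones

    def new_daily_key(self):
        key = secrets.token_hex(16)
        self.daily_keys.append(key)
        return key

    def hear(self, identifier):
        # Called whenever Bluetooth picks up another phone's beacon.
        self.heard.add(identifier)

    def check_exposure(self, published_keys):
        # Matching happens on the device, against keys published by
        # users who reported a positive test.
        return any(k in self.heard for k in published_keys)

alice, bob = Phone(), Phone()
alice.hear(bob.new_daily_key())              # Alice's phone hears Bob's beacon
print(alice.check_exposure(bob.daily_keys))  # True: Bob later publishes his keys
```

The notable design property, as discussed in the conversation, is that the match is computed on the exposed person's own device; neither the platform nor the government needs a central record of who met whom.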

[00:14:24] Ryan, can you tell us a little bit more about what Apple and Google are doing?

[00:14:31] For sure. And I think this slide is taken from Google's own documentation of what it's doing. So it's important to note that Apple and Google are not purporting to do contact tracing. As a matter of fact, they originally picked that term and backed away from it. What they now say they are doing is supporting apps that do exposure notification. And so they are very much in that set of boxes on the left, which is to say they're trying to show that you've been near somebody else who has subsequently reported being positive for COVID-19. But they're not building an app. They are building a framework. They're building an API, a way for others to build apps on top of it. And the capability that they've actually built is relatively limited. What it allows you to do, on either a Google system or an Apple system, which is almost every system, is run a Bluetooth-based proximity detection app that allows you to determine if you have come into contact of a certain kind with another individual who then later reports that they have COVID-19. What's really interesting about the graphic that I want to walk you through is that there are a lot of assumptions embedded in it; it's only an example, and it's not enforced by the system. So, for example, Alice and Bob, whom you can see in the top left-hand corner. And by the way, Ed, please just jump right in if I get this wrong. Law professor here, not computer scientist. But my understanding is as follows. If you look at the top left corner, you see that Alice and Bob meet each other for the first time and have a 10-minute conversation. A 10-minute conversation is a specific amount of time. What if it were a five-minute conversation? What if it were a 20-minute conversation? Well, Apple and Google might say, hey, you set the threshold anywhere you want, right?
We're just telling you that we have this capability. Then later on, Bob is positively diagnosed with COVID-19 and enters the test result in an app from a public health authority.

[00:17:00] Is that exactly how it has to work? Well, that's what they prefer. The preferred idea is that you go to a health care provider and you get a positive diagnosis for COVID-19.

[00:17:09] It's verified by somebody who's a health care professional, and only then does it wind up in the system. Does it absolutely have to happen that way? I don't think so. I mean, they want it to. Most people want it to. But we're not totally sure. In other words, another version of this could be that Bob is having a bad day and decides that he's just going to tell everybody that he has COVID-19 when, in fact, he doesn't. Then you have a system that notifies others, without revealing your location, their location, or their identity. It just tells you: hey, you've been in proximity with somebody who is now saying that they have COVID-19. And as Ed alluded to earlier, that can be really useful information if it's driving whether you want to get a test. It might be not so useful information

[00:17:55] if you think it means that you're safe, or if you are trying to make a decision whether to quarantine. And so what the architecture supports is for you to build an app to notify people when they've been exposed, for a particular purpose. But the Apple-Google framework, I think it's fair to say, kicks down the road to the app developer some of the important questions that you'd have to answer to better architect such a system. And so I'm going to pause there and make sure that I'm not screwing this up from a technical perspective.

[00:18:31] No, I think from a technical perspective that's right on.

[00:18:37] I think one of the important aspects of this is that Google and Apple have said that they will limit who can put apps into their app stores that use this functionality, and exactly what those limits will be is, I think, a really important part of the architecture of the system, and we can dig into that some more. That's a story that will evolve over time. But I think one of the important factors is that they say they will only allow apps that are designated or approved by an appropriate government public health authority, either at the national level or, if a national government delegates to states and localities, by the state or local health authority. So whatever it is that the app will do, if the companies follow through on what they've said about their approval process, it at least will be something that has been approved by an appropriate public health authority.

[00:19:37] Do you think that public health agencies have the ability to build these kinds of apps?

[00:19:44] To build them themselves, probably not. To contract for building them, that's certainly something they can do. One of the big open questions in this space, in my mind, is who will actually build the apps. And it's not that there is no one who's trying to do that; if anything, there are a great many different groups and organizations and companies that say they're trying to build these apps. The real question, though, because public health authorities will be gatekeepers for these apps getting into use, is: how will that three-way, or four-way, interaction go, between the app developer, the public health authority, Google and Apple, and then, of course, the end users? And what kind of governance or principles will be followed as that dynamic evolves? We don't really know the answers to that at all at this point.

[00:20:43] Ryan, you mentioned that it was important to Apple or Google, or both of them, that this not be called contact tracing, but rather exposure notification. What was in their minds in making that distinction, and how is it substantively different from contact tracing?

[00:21:03] You know, I don't know what is in their minds. I have to say, I wrote a piece for Brookings that people can check out, in which I and two coauthors, one of whom is a technologist and the other an epidemiologist, wrote up some of our concerns about these apps. And in that piece, we speculated a little bit about what was motivating Apple and Google. Here's what we said, and I should say I have no independent knowledge; this is anecdotal. The idea being that there was a lot of pressure in society, including from the current White House, for Apple and Google to do something, as big, innovative, important companies who have access to all this information and attention. And Apple and Google created this architecture as a way to do something helpful, but not to be the actual provider of the ultimate application. And so what they did is they created this architecture that apparently allows only public-health-authority-approved apps, although it is an evolving situation; I'm not sure that was always a condition of it. Indeed, when it first rolled out, a lot of us looked at it and said, hmm, they seem not even to be taking on the responsibility of vetting the apps that go into the system; they're rather building infrastructure that enforces privacy and security in a particular way. But I'm glad to see the evolution.

[00:22:33] Now, why is it not contact tracing? Contact tracing, as you noted, and you summarized it extremely well, is when you have, usually, a human being who goes out into the world and figures out who you've had contact with. This is exposure notification: you have been exposed to something, and you're now being notified of that exposure. And again, by design, it doesn't put any health information or any location information into the hands of Apple or Google that they did not have before, nor does the basic API, anyway, put that into the hands of the government either. But the problem with it, as I see it, is that it doesn't answer some pretty basic questions that we sort of, you know, fingers-crossed-emoji, hope that the policymakers and health authorities will answer well. One of them, for example, is: what's the threshold for notification? I mean, if you're just passed by somebody jogging, is that enough? And, you know, maybe. But maybe not. You've got to set the threshold for that. How far away do you have to be? And this is, incidentally, after you solve some of the deep technical challenges around making sure that low-energy Bluetooth, which in theory can range very far, is policed so that you have good faith in the idea that you are only detecting exposures for people that are physically close to you. Still, you have these threshold questions of how close and how long. And those questions are not technical questions; they're questions of epidemiology and virology, which you hope get answered accurately. The other thing is, absent a mechanism, you do worry that some apps will botch this and allow people to self-report COVID-19 status, and thereby cause a lot of problems.
And so for me, it’s like I don’t know exactly what their motivation was.

[00:24:40] I don't really have empirical support for this, but it feels to me, and I've heard anecdotally, that it had to do with being seen to do something helpful, but without solving some of those hard questions about tradeoffs between privacy and health, and also tradeoffs between setting thresholds and avoiding crying-wolf scenarios. That's my editorial. It sounds like the situation is evolving. I'm glad they went from contact tracing to exposure notification; I think that's a big step. One of the big recommendations that my colleagues and I put forward is precisely that the limitations of these technologies should be very well understood by the public and by the public health officials who are using them, so that they don't believe it's a panacea when it's not, and so they really understand what the limitations of the technology are. And I don't think anybody responsible disagrees with that. You should have a good mental model of the technology that you're using.
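The "how close and how long" questions Calo raises end up as parameters an app developer must choose. A minimal sketch of such a threshold policy follows; the numbers are purely illustrative assumptions, not recommendations, and real apps work from imperfect Bluetooth signal-strength estimates rather than true distance.

```python
# Illustrative exposure-threshold policy. The cutoffs are assumptions
# for the sake of example; choosing them is a question of epidemiology
# and virology, not engineering.

MIN_DURATION_MINUTES = 10   # e.g., the ten-minute conversation in the slide
MAX_DISTANCE_METERS = 2.0   # estimated (imperfectly) from Bluetooth signal strength

def counts_as_exposure(duration_minutes: float, estimated_distance_m: float) -> bool:
    """Decide whether a detected contact should trigger a notification."""
    return (duration_minutes >= MIN_DURATION_MINUTES
            and estimated_distance_m <= MAX_DISTANCE_METERS)

print(counts_as_exposure(10, 1.5))   # a ten-minute conversation nearby -> True
print(counts_as_exposure(0.5, 1.0))  # a passing jogger -> False
```

Setting the cutoffs too loose produces the crying-wolf notifications discussed below; too strict, and real exposures are missed.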

[00:25:43] Right. Am I clear that it's dependent upon the individual to post their COVID results in order for the system to know that they are positive? Is that true?

[00:26:00] I mean, you can see from the original grid that I described that there are some applications where the notice doesn't go to the individual at all; it goes to the health authorities, and they use it to follow up with contact tracing.

[00:26:13] And that's a different model, right? This is a model wherein somebody has to enter into the system that there's been an exposure. One idea would be that you yourself, the end user, upload that you have had an exposure. Another would be that you go to a health authority who's been designated, and only they, much like only your doctor can write a prescription to your pharmacy, would be allowed to enter it into the system. And that latter scenario actually addresses one of the deep concerns that I have.

[00:26:46] Yeah, that's a very important difference. So, Ed, Ryan's raised a lot of feasibility questions. Is it really going to work? And he's touched on an ethics question about privacy. What are your thoughts about feasibility and the ethics questions here? Both.

[00:27:02] Well, I think there are a bunch of open scientific questions about feasibility. Some of them have to do with questions like: how accurately can you use these Bluetooth low energy signals to estimate how far apart people were, and for how long? How far apart is the more difficult one, technically. People who know about this say it's at least plausible that you could get information about how close people were to a degree of accuracy that's useful. And it doesn't need to be perfect, because manual contact tracing and other risk estimation measures are not perfect either. You just want something that lets you make a better estimate of the likelihood that a person has been exposed to the disease, or has the disease. So I think the weight of evidence is probably on the side of it being able to estimate distance in a way that is good enough to be useful. On the other questions, I think issues around privacy and autonomy for end users are really critical. And this gets to what I think is most likely the reason that Apple and Google have ended up where they are on this. This is not based on inside information, but it's my best read on what has happened. Technically speaking, what the companies are doing is making an exception to the normal privacy rules on their systems. Your phone has Bluetooth technology enabled on it. And the usual rule is that Bluetooth beaconing, your phone beaconing out to a different device, is something that an app can only make happen if you have that app actually open on your screen, so that it's the currently displayed app, and your phone is not locked. And that's for your own privacy, so that you can't have an app on the phone which is constantly beaconing out information, like your identity, to everyone who's nearby. That's the usual rule.
And that rule would make this kind of proximity or exposure detection app not possible, because you would have to keep your phone open to that app and unlocked all the time, which is not going to happen realistically. And so the companies have said: we will make an exception and allow an app to use Bluetooth low energy beaconing even when that app is not active on an unlocked phone, provided that it fits within certain limits. The companies have considered this idea of a sort of silent Bluetooth beacon to be a big privacy vulnerability, which is why they've so tightly controlled it. And now, in light of the suggestions that opening up that technology and turning off that privacy control would be useful in this pandemic situation, they've said, OK, we're going to loosen it, but only under limited conditions. And that's where you get these conditions, and they've been building out those principles and restrictions over time. So I think when Ryan and his colleagues wrote their article, the companies had said very little about what those restrictions would be. Since then, they've said they will allow only apps that are approved by an appropriate public health agency, that the apps are only allowed to give certain information to the end user and only the end user, and certain other limitations on what the apps can do; otherwise, they will be kicked out of the app store. So to my mind, this is the companies trying to figure out how to use this technology in a way that's potentially useful in detecting COVID exposure, without opening the door to the broader surveillance infrastructure that would almost certainly follow if they just completely took the controls off their Bluetooth beacon technology. So that's a long and complicated answer, but I think it puts some perspective on what the companies are trying to do.

[00:31:12] That's very helpful. We're going to come back and try to articulate a cluster of ethics concerns here. But I want to open the floor to the folks who have been observing this conversation. Isabelle, you've been looking at the questions that are coming in. Can you summarize or present a question or two?

[00:31:33] Yes, there's a lot of interest in these privacy and surveillance questions. On Facebook, where we were advertising this event, it certainly got a lot of attention, with people feeling like this was an invasion of privacy and would lead to horrible surveillance. I'm wondering if you can address those concerns, and you have done so a little bit already, but also talk about how, after the pandemic is over, we might revert back to a less privacy-infringing structure.

[00:32:04] I know that Ryan has studied what the public thinks about privacy, so maybe we'll give Ryan the first stab at this, but come back to you, Ed, right after that.

[00:32:15] Yeah, I mean, part of the problem with doing research during a pandemic is that science is not so comfortable in the fast lane, right? So what we did is we have been taking survey samples over a long period of time about people's attitudes toward digital contact tracing, automated contact tracing apps, essentially. And we released the first tranche of those, which are from April first and third respectively, with an N of about 200.

[00:32:51] That is to say, about two hundred subjects. And what we found in terms of people's privacy attitudes was already interesting, but predated the Apple-Google announcement. So it's complicated. One thing, for example, that we found was that there was a substantial group of people who wouldn't want to download these things even if they had perfect privacy. And that number dropped precipitously when you started to introduce the possibility that there could be privacy harms involved, or security issues involved. It really had a deleterious effect on participation. The other finding was that people tended to trust things more that they already were using.

[00:33:41] And so, for example, you saw things like relative comfort with having your location figured out based on triangulation, because your cell phone has to deliver calls; you're basically using cell-site data to triangulate your location. Not that that would be granular enough to do contact tracing, but that was something with which they expressed comfort. They expressed greater comfort with names they already knew, with apps they already knew, like Google Maps, that are already doing something with your location history, and so on. And then, when you started to get the government involved, a lot of participants got concerned. So how does that map onto the current system? It maps on, in a sense, because a lot of people are going to be using this Apple-Google infrastructure. Maybe people will have some comfort with that, because they are aware of these companies and they know that they track location for other reasons. But the apps are going to have to be working with government, or they get kicked out of the store, so maybe that will make people concerned. And whatever you think about it, I ultimately come down on the other side of the issue from Ed. Ed said that he thinks, on balance, acknowledging all the concerns, it's going to be useful. My own take is that, on balance, it's going to be counterproductive. But I understand that reasonable minds differ. I will say that everybody who looks at this space says you need to have a lot of participation for this to be even remotely useful. If only a couple of people are running an app, it just doesn't do much at all. And the privacy concerns might be one of the hurdles.
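The participation point can be made concrete with back-of-the-envelope arithmetic: a contact is only detectable if both parties run the app, so with an adoption rate p, roughly p squared of contacts are covered (ignoring correlations in who adopts). A quick sketch:

```python
# Why participation matters: both people in a contact must run the app,
# so coverage of contacts scales roughly with the square of adoption.
for adoption in (0.10, 0.20, 0.40, 0.60):
    covered = adoption ** 2
    print(f"{adoption:.0%} adoption -> ~{covered:.0%} of contacts detectable")
```

Even 40 percent adoption, optimistic by most accounts, covers only around one in six contacts under this rough model.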

[00:35:09] I will inject one last thing, and this is not from our study. (The study was done with our colleagues in computer science; Lucy Simko was the lead author, with Tadayoshi Kohno and Franziska Roesner. You can look it up; it's a preprint.) The other thing is, I worry a lot about any number of technologies that will be used today to address the pandemic, only for us to acclimate to them, and then they will be used in the future to do other things. One version of this says: appalling pandemic, we address it, everyone claps, we're in great shape. And then, all of a sudden, Apple and Google are sitting on this infrastructure that allows you to figure out whether you've been near somebody, and the temptation becomes, I think, very significant to try to commercialize that platform. And so behind the scenes I have been advocating, and many people have, I imagine: let's put some safeguards in place so that these systems do not outlive their usefulness, and so that this data doesn't get redeployed in ways that we call secondary use, or mission creep. Putting safeguards in place is critical, both to the health of privacy and the nation, and also to giving people enough comfort that they'll participate in these systems.

[00:36:42] So, Ed, you think that safeguards are possible? To go back to Isabel's question, could you give

[00:36:47] an extended elaboration of what could be done to make sure this was a short-term use and not permanently a part of the landscape?

[00:36:57] So there are a bunch of things that can happen. One of the most important is to put an explicit sunset onto these mechanisms, onto these apps, so that on some predetermined date they will shut off, won't exist anymore, will deactivate themselves, unless there is an explicit action to extend that date, and then to get the companies and public health officials to commit not to game that system. So, an explicit sunset: that's one thing that is important. Another thing that's important is to get the people who are building these systems, authorizing them and so on, on the record, on the legally accountable record, making their promises explicit. When companies make definite assertions to consumers and users about privacy, those are legally enforceable under the Federal Trade Commission Act and other laws. And so having very explicit promises that are testable later also provides really important guidance. And then the final piece is around transparency and oversight: making sure that it's clear what is happening, and making sure that there are entities in a position to follow up and ask hard questions. I have an op-ed with a colleague, Adam Klein, from the Privacy and Civil Liberties Oversight Board, that ran in Politico, that talks about how we can apply the lessons of 9/11. 9/11 was similar, in that in the aftermath of a major shock to the public, we had a lot of new capabilities put in place that posed really serious risks with respect to privacy. And we have the track record we have in dealing with those risks, some aspects, I think, very successful, others not as successful. But learning and applying those lessons is really important. And the importance of sunsets on these provisions is, I think, a really important starting point that we point to there.
And I'll post a pointer to that op-ed in the chat.
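Felten's explicit-sunset idea can be sketched in a few lines. This is a hypothetical illustration only; the date, function names, and extension mechanism are invented for the example, not drawn from any real app:

```python
# Hypothetical sketch of an explicit sunset: the app refuses to run past a
# predetermined date unless an explicit, recorded extension has been granted.
from datetime import date

SUNSET_DATE = date(2021, 6, 30)  # illustrative; fixed when the app ships

def app_is_active(today, extensions=()):
    """Return True only if today is before the sunset date, or before an
    explicitly granted extension date (each extension is a new date)."""
    effective_sunset = max((SUNSET_DATE, *extensions))
    return today < effective_sunset

# Before the sunset, the app runs; after it, it deactivates itself.
assert app_is_active(date(2021, 1, 1))
assert not app_is_active(date(2021, 7, 1))
# An explicit extension is the only way to keep it alive past the sunset.
assert app_is_active(date(2021, 7, 1), extensions=(date(2021, 12, 31),))
```

The point of the design, as Felten frames it, is that continuation requires an affirmative act rather than inertia; doing nothing means the capability disappears.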

[00:39:10] Great, thank you, Ryan.

[00:39:12] Sorry to interrupt, I didn't mean to. So first of all, I think that Ed's op-ed in Politico is a must-read. It's really, really important, and it's a starting place for this. And I think that if you have the capability to do it well, then things like sunset and oversight work; the Privacy and Civil Liberties Oversight Board is actually a part of the oversight of some of the powers that resulted from 9/11. But I do think, and you alluded to this, that we don't have the best track record in letting those kinds of powers lapse. I mean, just recently we saw yet another renewal, and arguably an expansion, of a set of powers that make many Americans feel very uncomfortable. Right. There's a great quote from Justice Jackson, in his concurrence in Youngstown Sheet & Tube, which says that the problem with emergency powers is that they tend to kindle emergencies. People like to have this ability, and so it's sticky. It's really, really sticky. So I absolutely agree that having very clear, explicit rules is critical. I agree 100 percent with that op-ed. But I think in practice, gosh, sometimes we just don't have the political appetite to really constrain ourselves once we have adopted a new power.

[00:40:59] Thank you very much. Isabel, another question from the floor?

[00:41:04] Yeah. Thank you for talking about some of the regulatory structures that might be at play. There's a broader interest in what it looks like to create a regulatory framework for something that's so time-sensitive. And one of the main concerns that's been brought up is the potential access that law enforcement and immigration enforcement might have to these apps. It seems that some of our commenters are really worried that this might affect adoption, and it might affect people's attitudes toward these apps. So what framework can you imagine would be possible to put in place, or is that kind of thing always going to be a vulnerability?

[00:41:46] I'll go first this time. There are already some frameworks that are circulating. I and a bunch of other people weighed in positively about an effort that was in the Senate and in the House. It's called the Public Health Emergency Privacy Act, or something like that; I think Senator Warren spearheaded, led that effort, but it's a bunch of different members and senators. And it does a lot of the things that I'd mentioned. It is very clear that the data can only be used for the purposes for which it is gathered, and that you have to have other kinds of important safeguards in place. You know, it's fast legislation, but it's been pretty heavily vetted.

[00:42:41] I don't know what its political trajectory will be. But there are bills out there, and this is happening also at the state level. And the truth is that privacy, as a field, is relatively sophisticated at this point. We've had decades of dealing with this. There are folks with expertise who have a good sense of what the mitigation strategies for privacy are, and those strategies have been identified. And if there's the political will, I'm actually pretty confident that some of those issues will be resolved. Also, I don't see the Google-Apple infrastructure, which is the most likely to catch on for all the reasons we've been talking about, as particularly privacy-invasive myself. Right. It takes great pains to make sure that there isn't a centralized log somewhere of your location and your health status. I think that's also precisely why it will be of very limited utility. And so my big concern is not that we don't have mitigation strategies.

[00:43:45] It's that, inevitably, when you're talking about addressing a public health emergency using information as sensitive as where people are, their health status, their health history, their contacts, and so on, there are going to be privacy tradeoffs, maybe not in theory, but in practice. And so we need to do our best to mitigate them. But I'm confident that if we have the political will, we do have the regulatory toolkit.
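The decentralized design Calo alludes to, where there is no centralized log of who was near whom, can be illustrated with a toy sketch. This is a simplification with invented key sizes and rotation intervals, not the actual Google-Apple Exposure Notification specification:

```python
# Toy sketch of decentralized exposure notification: phones broadcast
# short-lived identifiers derived from a secret daily key, and all the
# matching happens on-device. Parameters here are illustrative only.
import hmac, hashlib, os

def daily_key():
    # Secret key generated on the phone; it is shared only if the
    # owner tests positive and chooses to publish it.
    return os.urandom(16)

def rolling_id(key, interval):
    """Derive the rotating identifier broadcast over Bluetooth for one
    time interval; it reveals nothing without the daily key."""
    return hmac.new(key, str(interval).encode(), hashlib.sha256).digest()[:16]

# Alice's phone broadcasts rolling IDs; Bob's phone records what it hears.
alice_key = daily_key()
heard_by_bob = {rolling_id(alice_key, i) for i in range(3)}

# If Alice tests positive, she publishes only her daily key. Bob's phone
# rederives her rolling IDs locally and checks for overlap; no central
# server ever sees who was near whom.
rederived = {rolling_id(alice_key, i) for i in range(3)}
assert heard_by_bob & rederived  # Bob learns of the exposure, locally
```

This also makes concrete why Calo expects limited utility: because the system deliberately never centralizes location or health status, public health authorities learn nothing from it directly; everything depends on individuals acting on their local notifications.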

[00:44:11] Isabel, is there another question from the group?

 

[00:44:15] So the last main question that I think is coming up is this concern about exacerbating digital divides. Only so many Americans have smartphones capable of using this technology, or the Internet connection or data plans that might be required. Can you talk about ways that we might mitigate the threat of exacerbating inequalities, and how this might be made a more equitable technology?

[00:44:42] Sure, if I could address that. So about 80 percent of Americans have smartphones, and for the 20 percent who do not, this app would not be available to them. That's just a fact. Unless we do more to address the digital divide, that will remain the case. And that means two things. It means, first of all, that we have less coverage in general from this technology, that a substantial fraction of the potential COVID exposure events that may occur will be undetected, and they'll be disproportionately undetected in disadvantaged communities. So one of the things that I think we clearly should be doing about that is to make sure that the other resources designed to help protect people against COVID are deployed in a way that provides backstop protection for those people who don't get the protection of that particular technology. And so that means probably more attention from other sorts of resources needs to go into disadvantaged communities, which is really already the case. This is hardly the only dimension in which some communities are better served by our public health infrastructure than others, but unfortunately it is another one, and until smartphone ownership expands, that's not going to change. One of the good-news things about this is that it does not require a fancy data plan if someone does have a smartphone. The amount and timing of the data connectivity that it needs for this functionality is relatively low. So it's really a question of who has a smartphone and who doesn't.

[00:46:39] I just wanted to interject that I think that's a very rosy picture, in a sense, because you have to remember that only 80 percent of people have smartphones. Right.

[00:46:51] And then that's assuming that everybody decides to go with the Google-Apple infrastructure, for example, or that a substantial number go with the same app, or at least apps that talk to one another, even in theory. Right. And then you have to have adoption. And remember that of the people we surveyed, and these are from all over the world and varied demographically, some 20-odd percent of them were just unwilling to adopt this thing, no matter what. Right. So you've got 80 percent of people.

[00:47:20] Then all of a sudden, 20 percent of those are, just off the bat, not doing it, and then only a subset of the rest will. And, as Ed said, the people who don't have the technology skew under-resourced, and the people who don't use the technology skew older and more vulnerable to this disease. Right. So it is concerning to me. And indeed, we don't have a lot of empirical evidence one way or the other, but there have been these sorts of systems running in different places. And in the places they've been running, even when they've been heavily adopted, like in Iceland, where there's 40 percent adoption, they still seem to be of pretty limited utility. And in other places they've really not worked; the people running them are saying, yikes, this wasn't what I thought. Right. So I think, at a minimum, we should watch as the evidence comes in. I've yet to see a very strong case that lots and lots of people would adopt something and then it would be super useful, at least in the exposure notification scenario, versus aiding other kinds of contact tracing. So that's an area I'm worried about, too.

[00:48:37] Yeah. Now, I agree that adoption is really important, but it's worth noting that the Google-Apple technology is not yet released, and so there are no examples of apps that use it out there right now. It's still a question mark what those apps will be and which ones public health officials will choose.

[00:48:54] All of the exposure notification or contact tracing type apps that are out there right now, that people have experience with, are using a different technology and are subject to different limitations. And I do agree that many of those have been problematic. Some of the early experiments, I think, have not been successful and might have been done differently with the hindsight we have now.

[00:49:19] Even though it's pretty early in the game as far as this goes, it seems to me that the things we'd want to build in to protect privacy in a way undermine some of the promise of effectiveness. It's so dependent upon somebody deciding to put in their own test result, rather than having the medical records supply the reality of what those test results were. That, for example, is a feature we might build in to protect privacy, but that might undermine effectiveness.

[00:49:49] Is that a reasonable inference?

[00:49:53] Certainly there's a tradeoff here, I think, between how privacy-protective something is and people's willingness to, A, adopt it and, B, cooperate with it, both of which you need for it to be effective. There are safeguards you can build in, for example, if you're worried about how test results get reported. At least some proposals say that people would voluntarily upload their own test results, but in order for a test result to be recognized by the system, it would need to include some kind of code that came from a public health official. So, for example, if a person was in touch with a public health contact tracing official, they might get a code, or might be given some capability, to voluntarily upload their app data. So there are things you can do to try to increase the accuracy of the results that are uploaded, while at the same time making the uploading of a person's test results be entirely at their discretion. That's a world you can build.
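The safeguard Felten describes, where a test-result upload is recognized only when it carries a code issued by a public health official, could look roughly like this. The HMAC scheme, the secret, and all the names here are illustrative assumptions, not any real system's design:

```python
# Hypothetical sketch: a contact-tracing official issues a verification
# code for a confirmed case; the system accepts an upload only when the
# code checks out, while uploading remains entirely voluntary.
import hmac, hashlib

AUTHORITY_SECRET = b"held-only-by-the-health-department"  # illustrative

def issue_code(case_id):
    """A public health official issues this code to a confirmed case."""
    return hmac.new(AUTHORITY_SECRET, case_id.encode(),
                    hashlib.sha256).hexdigest()

def accept_upload(case_id, code):
    """Recognize a voluntarily uploaded test result only if it carries
    a valid code from the public health authority."""
    expected = issue_code(case_id)
    return hmac.compare_digest(expected, code)

assert accept_upload("case-123", issue_code("case-123"))
assert not accept_upload("case-123", "made-up-code")
```

The design choice this illustrates is the tradeoff in the passage above: the code prevents false self-reports from polluting the system, but the individual still decides whether to upload at all.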

[00:50:59] So we've talked a lot about privacy, both in terms of how we design these apps to maximize privacy,

[00:51:08] and also some of the regulatory or legislative things that we can put in place, like federal regulations against commercialization, for example, which I believe is a recommendation from the Safra Center and others, and maybe is in this Public Health Emergency Privacy Act. I'm not sure, but that would strike me as a key feature of such legislation. So we've spent a lot of time on privacy. What other ethics questions do you think are raised, or is that the main one?

[00:51:40] I'll go first this time. So I want to back out a little bit, and this is truly backing out to a broader level. One of the things that really concerns me is that, if you think of where we are politically in the United States at the moment, there is an ongoing conversation about whether, notwithstanding a mortal risk to certain populations, and indeed somewhat to everybody, we ought to reopen anyway, because it's awful for us all to be away from each other.

[00:52:20] And believe me, it is. I've been in total quarantine for months with my family. I miss my friends, I miss my professional contacts, and so on. And it's devastating for the economy. That's all true. Right. But I worry about the way in which technology plays into this picture. I worry that technology is going to be a way that people can convince themselves that, in fact, it is safe and that the tradeoffs are not what we thought. Right. And if these systems are so modest that all they can do is say, hey, you got a ping, an exposure notification, which probably means it's worth getting tested, and let's hope that there's widespread and rapid testing, then sure, it's not a bad idea in that scenario. Right. But the idea that it'd be a panacea? I mean, if you look at the names of some of these apps that have come out, they have names like CovidSafe or SafePaths, and then they have descriptions.

[00:53:23] And I've looked and read through some of the descriptions of the actual apps that are available in the app stores, and they say, you know, "your path to safety starts here" and the like. Right. And so I'm very concerned about us misleading individuals into thinking that they could be safe because they download an app. And I'm very concerned about the way in which technology can play into a structural political conversation about whether it's okay for us to reopen America. Right. So I just want to put that on the table. I think it's something to be concerned about ethically.

[00:54:01] Yeah. I also worry it might undermine trust in the public health infrastructure, because if people think that they can just deal with their app rather than with their county public health department that is trying to carry out human contact tracing, it could in some ways further diminish and erode trust in the government public health services that we're going to really need for real, robust contact tracing.

[00:54:32] Building on both of those comments, I think at the end of the day, this kind of app, an approach that is aiming to give information to the individual, is fundamentally giving people advice: advice based partly on what the app has measured, but also advice that comes, ideally, from the public health authorities that have authorized a particular app. And what is that advice? Is it good advice? Is it pro-social advice? Is it well balanced? Is it effective? All of the questions you would ask about advice given to the public hold here. I think the fact that apps built on the Google-Apple infrastructure will have to be approved by public health officials means that public health officials will be in a position to control, or at least very strongly influence, what that advice is. And one would hope that in a lot of cases the advice includes calling a public health official and bringing oneself into the public health system, and that would, of course, be voluntary. But what the advice is, is one of the key questions. All of these concerns about why the advice might not be good, might not be effective, might be politically motivated rather than for the best interests of the individual and society, those are all very real. But to me, that's the big question: are these things giving you good advice?

[00:56:01] I have a follow-up question to that, but I just want to make a brief announcement. Some people have been using the raise-your-hand function in the webinar. If you have a question, could you put it into the question-and-answer function instead? Raising your hand is not something we're able to respond to, but if you put your question in the Q&A, Isabel will see it. We have a few minutes left, and I will ask Isabel to come back with a question or two. I don't want the people who are using the raise-hand function to be frustrated; you have a function where you can put your question. So, my follow-up question from what Ed and Ryan just said: you're saying these apps are already being developed, and they're making some pretty unvetted, maybe extreme, claims. But we're also hearing that Apple and Google are saying that these need to be at least authorized by public health agencies. So what is standing between the development of these apps and their immediate commercialization, jumping over the public health agency vetting? Why wouldn't this just become commercialized, like we've seen with many other technological innovations that just go to market if there's a demand for them?

[00:57:19] I mean, look, at the end of the day, there are enormous advantages to building an app on top of Google and Apple, which probably means that apps developed on those platforms are going to have an enormous, let's say, market advantage, right? It's sort of strange to talk about markets when we're talking about public health authorities, but they're going to crowd everybody else out. That said, absent these platforms policing their app stores very, very closely, there are going to be many, many apps that address COVID-19. I mean, think about it for a moment: people are literally selling cure-alls on television and on the Internet, you know what I mean? Times like these, unfortunately, bring out the best in people and they bring out the worst in people. In Washington State, many people have been hit with a fraud where someone has claimed unemployment benefits on their behalf, and they're doing that because there are millions of unemployment claims going in, and so they hope that they will get lost in the shuffle. People do bad things, right? And then people try to make money, which is, you know, a little bit different. So I think we will see many different kinds of things.

[00:58:44] The hope is that there is a dominant public health app that runs on that system, maybe even eventually interacts with others, and that enough people download that one and ignore everything else, right?

[00:59:03] One that has been vetted by public health authorities.

[00:59:06] But again, that's a relatively recent development. And the other thing to remember from the research, and I'm not trying to draw too big a conclusion from this, is that when the government gets involved, that actually makes a subset of people kind of move back a little bit. They say things like, OK, if it were my cell phone company, that's fine, because they know where I am; they have to, to deliver my calls. If it were Apple or Google Maps, that's fine, because I use that already and they know my location. But I don't want the government to do it. And then Google and Apple come in and say it's got to be the government, you know, and then that interferes with adoption, which then reduces the utility of it.

[00:59:44] Wouldn't it be nice to think that this could be a moment where we could grow more trust in government and collective decision making, rather than less?

[00:59:53] Local contact tracing is done by counties and states, where the opportunity for trust building is greater. So hopefully we don't squander that opportunity. We're almost out of time, and I promised one more question. Isabel, we have just two minutes left; is there a quick question that's been added?

[01:00:13] So the final question, maybe, directs us back to this question about public health agencies. Mildred asked at the beginning what it would look like for our public health agencies to vet these apps, and this is a question that's been asked a few more times. Where should public health agencies look for guidance about which apps are appropriate?

[01:00:37] Are we stumped? Well, yes.

[01:00:41] Well, no, I'll try this. This is one of the most important questions. Right. Public health organizations are very skilled and trusted for certain purposes, but technology procurement is not their core competence, number one. And number two, they are very busy right now. And so what we need to have, and what we really don't have right now, is some kind of trusted authority, or some sort of expert public consensus, about what is good and what's appropriate and how to proceed. I think public health officials are well positioned to make decisions about what to urge people to do when certain events happen. But the question of which apps are constructed in a way that is safe, protects the public's privacy, is protective in the right way, and so on, I don't think they're in a position to answer. And right now, I don't see a public entity that's in a position to do that. This is one of the roles that we would hope government would play, which, you know, our governments at this point seem to be having a hard time doing.

[01:01:54] Unfortunately, that is the note on which we have to end. I want to remind people, or I don't think I've told you yet, that the next webinar is going to be a continuation of this, looking at digital tools for risk certification, very different from contact tracing, including something that you may have read about called immunity passports or immunity certification. We will be letting you know the date for that, but it will be coming up fairly soon. I want to thank Professors Calo and Felten for a fantastic discussion and for all your expertise. Thank you so much for sharing it with all of us. It's really a pleasure for the Hastings Center to be able to host this and to bring your knowledge and your insights to the audience. Thank you very much.

[01:02:42] Thank you. Bye, everybody.

[01:02:47] A recording of this will be available later today on the Hastings Center’s Web site. And if your question didn’t get answered or if you have more comments, please continue the conversation on Twitter using the hashtag ethics for reopening.

[01:02:59] Thank you.

 

The Hastings Center has never shied away from the toughest ethical challenges faced by society.
