Seattle Considers Controversial Surveillance Technologies with Flawed Approval Process
In a recent interview on the Hacks & Wonks podcast guest hosted by Dr. Shannon Cheng, Amy Sundberg and BJ Last from Solidarity Budget discussed Seattle's rushed approval process for intrusive surveillance technologies.
The City of Seattle is currently reviewing three surveillance technologies for potential implementation: ShotSpotter gunshot detection, CCTV cameras, and Real-Time Crime Center (RTCC) software. These technologies are being proposed as part of a $1.5 million pilot program to address gun crime. As BJ Last stated in the interview, "What it's supposedly for, it absolutely doesn’t work and does a whole host of harm in the meantime."
ShotSpotter consists of acoustic sensors placed around the city to detect gunfire and alert police to possible shootings. However, studies have found ShotSpotter false positive rates as high as 90-97%, flooding 911 systems with false alerts. As Sundberg recounted, attorney Jonathan Manes of the MacArthur Justice Center says this technology "vastly increases the number of police deployments in response to supposed gunfire - these false alerts."
There is an upcoming public hearing on February 27th at 6pm at the Bitter Lake Community Center to gather additional public input. As Sundberg argued, "I really, really encourage people to go, to give public comment, to support your community members who are in this fight with you."
Alternatives like community investments in youth jobs programs, gun violence interrupters, firearm lock boxes, and guaranteed basic income have proven to effectively reduce firearm deaths. These surveillance technologies encroach on civil liberties with no evidence they improve public safety. Concerned Seattle residents should contact their councilmembers, speak at upcoming public hearings, or submit comments against the flawed approval process and adoption of these technologies.
About the Guests
Amy Sundberg
Amy Sundberg is the publisher of Notes from the Emerald City, a weekly newsletter on Seattle politics and policy with a particular focus on public safety, police accountability, and the criminal legal system. She also writes about public safety for The Urbanist. She organizes with Seattle Solidarity Budget and People Power Washington. In addition, she writes science fiction and fantasy, with a new novel, TO TRAVEL THE STARS, a retelling of Pride and Prejudice set in space, available now. She is particularly fond of Seattle’s parks, where she can often be found walking her little dog.
BJ Last
BJ Last is a business analyst, and former small business owner, with two decades of budgeting experience across a wide range of industries. He organizes with the Solidarity Budget and Ballard Mutual Aid.
Resources
Public Comment Period Opening for the Technology Assisted Crime Prevention Pilot Technologies | City of Seattle Information Technology
STOP Surveillance City - Solidarity Budget Call to Action
Stop Surveillance City Sign-On Letter | Solidarity Budget
“Harrell Plans Hasty Rollout of Massive Surveillance Expansion” by Amy Sundberg from The Urbanist
“Seattle’s New Policing Panopticon” by Puget Sound Prisoner Support for Puget Sound Anarchists
The Surveillance Ordinance | City of Seattle
“Mayor Johnson to end ShotSpotter deal after summer, making good on key campaign promise” by Tom Schuba and Fran Spielman from The Chicago Sun-Times
Seattle Police Department 2023 Year-End Overview | Presentation to Seattle Public Safety Committee - February 13, 2024
Dangerous Surveillance #1 - Closed-Circuit Television Cameras (CCTV) | Solidarity Budget
Dangerous Surveillance #2 - Acoustic Gunshot Location System (AGLS), aka ShotSpotter | Solidarity Budget
Dangerous Surveillance #3 - Real-Time Crime Center (RTCC) | Solidarity Budget
“Cook County, Ill., officials say ICE using data brokers to purchase protected information” by Lindsay McKenzie from StateScoop
@DivestSPD on Twitter/X: SPD sociopath Micah Smith #7714 involuntarily committed people to score a date w/ an ambulance driver
“OPA Documents Show Current SPD Officer Misused Internal Police Data to Try to Get a Date, ‘Caused Anxiety and Concern’” by Carolyn Bick from South Seattle Emerald
Rainier Beach Action Coalition
King County Regional Office of Gun Violence Prevention
“Richmond is offering an important lesson on public safety at a critical time” by Justin Phillips from San Francisco Chronicle
“Want to reduce violence? Invest in place.” by Hanna Love from The Brookings Institution
Podcast Transcript
[00:00:00] Crystal Fincher: Welcome to Hacks & Wonks. I'm Crystal Fincher, and I'm a political consultant and your host. On this show, we talk with policy wonks and political hacks to gather insight into local politics and policy in Washington state through the lens of those doing the work with behind-the-scenes perspectives on what's happening, why it's happening, and what you can do about it. Be sure to subscribe to the podcast to get the full versions of our Friday week-in-review show and our Tuesday topical show delivered to your podcast feed. If you like us, the most helpful thing you can do is leave a review wherever you listen to Hacks & Wonks. Full transcripts and resources referenced in the show are always available at officialhacksandwonks.com and in our episode notes.
[00:00:52] Shannon Cheng: Hello, everybody. This is Shannon Cheng, producer of Hacks & Wonks. I am going to be your special guest host again today, and I'm super excited to be welcoming back to the show Amy Sundberg and BJ Last from Solidarity Budget. Some of you may recall that we did a show back in November about the Seattle City budget process. And we talked at that time about a proposed crime prevention pilot program that included technology such as ShotSpotter and CCTV. Well, today we're sort of doing this as an emergency show because we're trying to follow up on what's happening with the City's process in acquiring and implementing these technologies. So I just really wanted to have these experts back on to fill us in on what's going on and why it's important. So starting off, what is happening? What are these surveillance technologies that are being considered by the City?
[00:01:41] Amy Sundberg: Good to be back. We're happy to be here talking about this. Yeah, so there are three different technologies that are currently being discussed and reviewed. The first one is Acoustic Gunshot Location Systems, or AGLS - or colloquially known as ShotSpotter. So I would say as we continue to have this conversation, you should consider those phrases interchangeably. I might say AGLS, I might say ShotSpotter, but it's the same technology in either case. The second one is CCTV, and the third one is a Real-Time Crime Center software.
[00:02:13] Shannon Cheng: When we talked about budget back in November, I feel like there were only two at the time. And now we're talking about three - is that true?
[00:02:19] BJ Last: Yes, that has come in. They're claiming magically that it's all going to work under the same dollar amount. Back when we talked, it was just the AGLS, the Acoustic Gunshot Location System, and the closed circuit television cameras, the CCTV. So now it's the Real-Time Crime Center, the RTCC, which is largely just a massive compiler of data that goes and pulls in tech from ShotSpotter, from AGLS microphones, from City-owned CCTV cameras, from privately-owned CCTV cameras, and a bunch of AI algorithms - a real quick overview of what that one is. But yeah, we're now up to three techs as a suite.
[00:02:57] Amy Sundberg: I should say, too, that the RTCC software also will integrate the license plate readers, which we just saw a massive expansion of at the end of last year.
[00:03:05] Shannon Cheng: Right. Just to remind everybody where we were at the end of 2023 - during that budget process, funding for this surveillance technology was allocated, and I believe it was $1.8 million total. And of that, $1.5 million was supposed to be for a pilot project for this Acoustic Gunshot Location System plus the CCTV - and there was no Real-Time Crime Center at the time. And then the other $300,000 was for this expansion of Automatic License Plate Readers that Amy just mentioned. So where are we now with these three surveillance technologies?
[00:03:46] Amy Sundberg: Well, we are in the middle of a convoluted process that BJ and I and others have been spending a lot of time trying to understand and to help other people understand. So it's called a Surveillance Impact Review, which all surveillance technologies that are going to be used in the City of Seattle now have to go through this review process because of an ordinance that was passed.
[00:04:09] BJ Last: And do you want to give a shout out to who was the primary sponsor of this ordinance? It is our current mayor, Bruce Harrell - just a fun one to know, given with how this process is unfolding.
[00:04:21] Amy Sundberg: I actually didn't know that, and that is kind of ironic - so thank you for sharing. So this process has to be done for any technology that is deemed to be surveillance technology, which all three of these technologies have been deemed. And it is a review process that has many steps. We have the draft reports available now, which I believe were filled out by SPD and maybe also the executive's office. And right now we're in the stage where we are able to give public comment. So there has to be at least one public hearing for this report - they are having two public hearings. One of them already happened, and the other one is upcoming on February 27th at 6 p.m. at Bitter Lake Community Center and online, of course.
[00:05:14] BJ Last: And I will say this process is being exceptionally, I'd say, rushed and short. So they started taking public comment on February 5th. They stop taking public comment on February 29th. So y'all can do the math - that's well less than 30 days that people actually get to go and provide feedback on this. And as Amy mentioned, there will be a grand total of two public hearings on this. So we're looking at literally less than a hearing per technology being done - three technologies, but only two total hearings. And as a comparison of how this works - Dayton, Ohio, an area I think a lot of people in Seattle would probably look down on as red-state, flyover country - when they were looking at adopting just one of these technologies, they had 13 public hearings versus nominally progressive Seattle doing its grand total of two for three technologies.
[00:06:05] Shannon Cheng: Okay, so at the end of last year, the City allocated the money for these technologies. Now they're going through this process. As you said, it's this Surveillance Ordinance - so that took effect in November of 2018. It was designed to provide greater transparency when deciding whether the City was going to adopt any technology that is surveillance, as Amy said. And just to be clear, this is not just restricted to the Seattle Police Department wanting to implement surveillance technologies. When I was looking back at some of the past technologies that had to go through this process, SDOT had to do this for some cameras they had for traffic detection to help keep traffic moving smoothly. So this is just - whenever we're implementing something that is going to be observing, it's so that the public and the city council can understand - what are the impacts and are there any concerns that we need to know about before we just roll all this stuff out onto our streets. So that's where we're at. And in the past, I noticed it took them maybe 6-7 months to go through this process. But as you're describing it, BJ, it sounds like it could be less than a month that they're trying to do everything right now.
[00:07:16] BJ Last: Correct. They're trying to limit all the public input to less than one month just to go push it through. You did a great job summarizing the Surveillance Ordinance, Shannon. It really was designed so the people of Seattle get to meaningfully - A) find out what surveillance they're potentially going to be impacted by, and B) get a chance to evaluate it so that we don't end up - Oh wow, there's this new surveillance because five people fell for a sales pitch. That the people of the city actually get a chance to research the thing, find out what they were dealing with - and that's really hard to say is happening when you're trying to do three different technologies in less than 30 days.
[00:07:50] Shannon Cheng: Yeah, that's a lot of information. I admit I've been having trouble wrapping my head around everything. So it sounds like we're already past the point of one of the public meetings having happened. We're recording this show Thursday, February 15th. The first one happened on Monday, February 12th. So what was that public meeting like? Did they provide in-depth information about the impacts that these technologies might have? And how did people react?
[00:08:17] Amy Sundberg: No, I wouldn't say that. About half of the meeting was a presentation about the technologies, but it was more about why they're going to adopt them - what they think will be helpful about the technologies. They didn't really go over any of the negative impacts that we are so worried about. And then there was a chance for public comment. I would say there were about 15 or 16 people who wanted to give comment at this first meeting, which - people didn't have a lot of advance notice. And like you said, it is three different technologies - some of which people are hearing about for the very first time - and they are technical. It does take some time to learn even what they are and how they work and why we should maybe be worried. So 15 or 16, given that, I feel like was higher than anticipated. And what I heard over and over again is people saying - This is too rushed. We need to slow down. We have concerns. We are against this surveillance technology. And also this is too fast, and this process is not serving the people of Seattle well. I would say there was maybe one comment that wasn't that. It was very uniform, in terms of people being very concerned about this. And it was at noon on a Monday, so people are taking their lunch break or time off in the middle of a workday - that's how worried they are, right? I am happy that the second public meeting is in the evening to give a different demographic of folks the chance to come out and give comment. But I still think two one-hour sessions is not sufficient.
I will also say that there are other worrisome things about this process. For example, there is a Surveillance Advisory Working Group. And how they plug into this process is once everything else is kind of done, they are supposed to review these reports. And then they complete a civil liberties and privacy assessment, which for a surveillance technology, you can see how crucial that would be. And right now, that group has one confirmed sitting member. So they can't meet quorum, right? And I know that there are some other folks that are lined up, but they do need to be confirmed in the committee first. And again, this is being very rushed - the mayor's office gets to appoint some and then the council gets to appoint some - the timing of it all makes me feel uncomfortable, to be frank. That this is going to be rushed right before these three technologies are going to be discussed - who is being chosen and why? I don't know the answers to that, but these are questions that we're going to have to ask as those appointees come on board. And then they're going to be brand new, and right away have to do this review. Again, a very rushed process.
And then perhaps my - all these things are very concerning, but one of my biggest concerns is the Racial Equity Toolkit component of this process. So all of these Surveillance Impact Reports have to have a Racial Equity Toolkit as part of the process. And it's been very unclear as to how - is the Racial Equity Toolkit a concurrent process? Is it a separate process? What is the timeline? What kind of outreach is going to happen? How are they reaching out to the impacted groups? Are they making sure to do so in a way that is best for those groups and to do it in a variety of different ways, et cetera, et cetera? There's a lot of open questions that I have not been able to get answers to thus far. I've been hearing that possibly these public hearings that we're having for the Surveillance Impact Report might be kind of rolled into the Racial Equity Toolkit, which seems inappropriate to me, frankly, for technologies that have such potential for grave misuse and negative impact. As well, we do not yet know exactly where this technology is going to be deployed. We've been told several locations - Aurora Avenue North, Belltown, and the Downtown commercial core - that's what we were told last year. Then a couple weeks ago, they added Chinatown International District - apparently at the last minute, and they don't know where. They've said that it's probably not going to be all of these places, but wherever they're going to deploy this technology, they need to do - in my opinion - a separate RET, Racial Equity Toolkit, because each neighborhood is going to have different dynamics, different demographics, different things going on, different groups that need to be consulted. And I haven't heard about any individual outreach. So it doesn't mean it hasn't happened, but I have been actively asking and I have not been able to find anything out that this is actually happening. As well, you're going to want to look at reports, studies for racial impacts, potentially. 
Again, I'm not seeing those being cited in the draft report. So it seems like a very slapdash, non-serious job that is being done. And the communications that have been sent out to the public don't seem to come from an administration that's serious about equity and social justice. And I'm very concerned, frankly, that I am even having trouble getting my questions answered.
[00:13:38] BJ Last: Also, that's a great point on the four different areas that are up for consideration, because there are four areas - again, two public comment periods. Last one that's open is up in Bitter Lake - that is not exactly close to Chinatown International District, that is not close to Belltown, that's not really close to Downtown core. So three of the four areas that could potentially get this will have never even had a public hearing in their area. Fortunately, people can join that remotely, but that's also not even an option for everyone. So they've said this might go out in one of four areas. They're not even trying to do outreach in each of those areas, which is - as Amy said, seems like a problem, and that's something they're not really taking seriously. Same with when they wrote up the Surveillance Impact Reports - there's a section of what studies have they looked at for each technology. And for two of the reports, those are entirely blank. And for one of them, for CCTV, they referenced one study that actually found this has no impact on violent crime. So this seems very slapdash, just trying to push it through, not trying to get the community involved.
[00:14:41] Amy Sundberg: We also really expected to see them talking to other cities. None of these technologies are particularly new. A lot of cities have used these technologies, have deployed them in various combinations. I will say also, it is not new to put all three of these technologies together in one place. Chicago, for example, does it - they've been doing it for a while now. And we're seeing a lot of cities backtracking - having had a contract for ShotSpotter or similar technology, and then discontinuing that contract. And just this week, we got the news that Chicago is going to be discontinuing their very large contract for ShotSpotter by fall at the latest. And it seems that it would make sense for a city who is considering deploying this technology to talk to other cities about the experience that they have had, especially if it seems like maybe they've had kind of a negative or mixed experience.
[00:15:37] Shannon Cheng: So what I'm hearing from the two of you is that we're on the brink of potentially acquiring or implementing these technologies, which we have some concerns about, that the product of this Surveillance Impact Report process is to provide the city council a holistic view of what these technologies are meant to do, whether they work, what kind of drawbacks they might have. And unfortunately, it sounds like the process that they're going through, there's just a lot of things wrong with it - the speed at which it's going, the incompleteness of their filling out the draft report, the not making sure that the last group who is going to review the report before it goes in its final form to city council even has people on it. It just makes you wonder - it's not like they didn't know this was coming. I remember when we spoke last November - BJ, you pointed out they had been trying to get ShotSpotter since the year before. They had an entire year. Now they've had two years to start planning, filling out this report, getting all these ducks in a row. And it just seems like we're now here at the last minute and there's some kind of false sense of urgency being put on the city council - who is also brand new to all of this as well - to just accept things that are going to have ramifications for everybody who lives, works, or plays in Seattle for many, many years to come. So I feel worried listening to the two of you talk.
So that's just the process. What about the technologies themselves? When we hear the word "surveillance," my concerns are my privacy rights - when I just go about my daily life, I don't necessarily want to feel like I am being monitored and all the details of that are being kept somewhere. When people feel like they are being surveilled, there can be a chilling effect on just how they behave - whether that's in public, or where they go, or who they associate with, or what they say. We're trying to live, theoretically, in a vibrant community with diversity in it. And I think that surveillance does have this effect that homogenizes - when people try to play to the camera and make sure that they're not going to get singled out for whatever that is being looked for. And then there's a lot of discrimination when it comes to surveillance - just the way that it's implemented - it's just got issues where the system's just never perfect at understanding what it's seeing. And so unfortunately, biases trickle through. So just generally, that's why surveillance is bad. And so that's why it's really important and why there's supposedly this process where before we undertake letting more of it into our lives, we want to understand what are the issues with it.
So here we are - we're in the City of Seattle, we're thinking about implementing these three technologies. Again, that would be the Acoustic Gunshot Location System, the CCTV cameras, and the Real-Time Crime Center. What problem does the City claim that we're trying to solve with these technologies? And does it seem like that they will?
[00:18:53] BJ Last: So the claim is that this is specifically for gun crime - which is always the claim that these technologies and a lot of other surveillance technologies use as an excuse - because that is a very real and very, very serious problem. And the thing is, they know it absolutely doesn't work - their technologies don't actually work to reduce that. And that's why you see what their pitch is keeps changing - from, Oh, this is going to prevent or reduce crime, to, Okay, this will help gather evidence for after crime has occurred, to, Maybe this will help the community know to improve the emotional health of kids, to, Maybe this will get people to medical treatment faster. It's just sort of as studies come out showing one doesn't work, they just keep moving the goalposts and moving the pitch. That's why even the technology suite keeps changing. From it's just, Oh you need CCTV - that's gonna solve it - make us a crime-free world, to, Oh, you need Acoustic Gunshot Location, AGLS. Oh no, you need the two of them combined. Oh no, you need the two of them plus RTCC, the Real-Time Crime Center, and all of its algorithms. It just keeps going because it absolutely does not work on this.
And this is actually even really reflected in how the City has kept trying to pitch these things. This right now is called the crime prevention pilot - emphasis on the word "prevention." So when they tried to get it back in the 2023 budget, an actual quote from Mayor Bruce Harrell - "Cities across the country have used this as an evidence gathering tool, not a violence prevention tool." So 2023, they're - Nope, no prevention. 2024 budget, they're back to calling it prevention. They're just constantly trying to change what it is. So nominally, it is for gun violence, but we've seen time and again that it does not work for that. Studies that you look at - like Chicago, they found that it missed hundreds of gunshots in a single year, while at the same time having an incredibly high false positive rate, with 9 out of 10 alerts producing no evidence that any gun crime occurred. CCTV - again, the study that the City mentioned found that it has absolutely no impact on violent crime rates or clearance. So what it's supposedly for, it absolutely doesn't work and does a whole host of harm in the meantime.
[00:21:02] Amy Sundberg: Another way that it's being pitched is to deal with SPD's unprecedented staffing shortages - that's a quote from the report. So conveniently this week, we just had the new numbers released for crime in Seattle in 2023. In terms of staffing for SPD - in 2023, they lost 36 more officers than they were able to hire in the year. So they're a net negative 36 - so it went down - they have less staffing now than they did before. And yet in 2023, they had a 9% reduction in overall crime and a 6% reduction in violent crime. Now, I don't want to be gaming these statistics - what is very serious is that there was a 23% increase in homicide. And obviously, we don't want to see that. But the question is, does staffing actually impact these numbers? Is that the thing that does it? And so in that case, does alleviating this staffing issue with these techs - is that going to have any impact on the numbers? And the studies, in general, say no - with CCTV, it would maybe have an impact on car theft or maybe some types of property crime. But property crime actually went down 10% in 2023 already. The numbers don't really line up either in terms of this unprecedented staffing and needing this technology. And at a certain point, I think you have to do a cost-benefit analysis of what do you expect to potentially gain from adopting a technology versus what are the harms that might happen. And so far, this conversation has been shifting the goalposts a lot on what we hope to gain and ignoring all of the potential harm that could be caused - harm that has already been documented in other cities. And I feel like that's a really unfortunate way for this conversation to be framed.
[00:22:53] BJ Last: And before getting into some of the harms, I want to - you mentioned, Amy, that they're using the - what they have been trying to claim since 2019 is a massive police staffing shortage. That is just a completely nonsensical argument for these. Acoustic Gunshot Location Systems - it's a false call generating machine. I mentioned Chicago found a 90% false positive rate. Atlanta found a 97% false positive rate. That's one of the reasons why both of those cities have stopped using Acoustic Gunshot Location Systems. Other cities have as well, with police coming out and saying - This is a massive strain on our resources, because we're constantly getting these alerts that are coming through as, Oh, it's a shots fired incident. We're dispatching cops and they get there and they're like - there's absolutely nothing around. So the claim that this somehow would help for staffing levels is absolutely absurd, when again - AGLS just generates false positives, that's what it does.
[00:23:45] Amy Sundberg: Another thing that they're saying is that this would help get more justice for victims and victims' families of gun violence - and that also doesn't seem to be the case. There was a new review that just came out in the last couple of weeks by Cook County state attorney's office in Illinois that found that - they're using ShotSpotter. They found it has, "a minimal effect on prosecuting gun violence cases." And, "ShotSpotter is not making a significant impact on shooting incidents, with only 1% of shooting incidences ending in a ShotSpotter arrest." And then they also said - Also, it's really expensive. - so that's a thing, too. And then I spoke to an expert at the MacArthur Justice Center - attorney named Jonathan Manes - and he says that ShotSpotter doesn't make police more efficient or relieve staffing shortages. He says - Actually, it's the opposite. It vastly increases the number of police deployments in response to supposed gunfire - these false alerts that BJ was talking about - but with no corresponding increase in gun violence arrests or other interventions. And then he went on to tell me that it actually increases response times to 911 calls as a result of flooding the system.
[00:24:56] BJ Last: And it isn't just Acoustic Gunshot Location Systems that don't work on this. Again, with CCTV as well - there was a study from Dallas looking into this, and it found it didn't have any impact on clearance rates for violent crime. There was no benefit from actually going and putting out a bunch of CCTV cameras. And this actually corresponds with a lot of the studies done in London that have also shown the same thing - when they put cameras out through the city, they don't see that. The British Home Office looked into 14 different CCTV schemes and found that they didn't reduce crime or make people feel any safer. So it's not just acoustic gunshot location, but even CCTV doesn't work, which I feel like for some people - it feels almost counterintuitive on that because we see so many crime dramas and all of - Oh, cameras solve everything - often with someone saying the word "enhance" multiple times and you get perfect evidence that never would have existed otherwise. And that's just not borne out by reality, they just do not do that.
[00:25:54] Amy Sundberg: I also just wanted to mention - this is called a pilot project, so it is not necessarily going to have a huge deployment right from the start. But the reason it's still really important to have this public conversation now, as opposed to later, is that this Surveillance Impact Review is happening now. This is our chance to discuss it. And once it passes this review, it won't go through another review if they decide they massively want to expand. So this opens the door to any future expansion that the City might decide that they want to do. And we've seen a recent example with the license plate readers, which did go through a surveillance review process in the past. They had it deployed on only a few SPD patrol vehicles, and now they're going to be on every single patrol vehicle that SPD owns. And that took very little effort. It received very little coverage in the media. So this is our one opportunity to most effectively push back against the broader use of these technologies, even though right now it's just being discussed as a pilot.
[00:26:59] Shannon Cheng: So during budget season, as we discussed before, they only talked about those first two - the Acoustic Gunshot Location System and the CCTV - but now they're adding on this Real-Time Crime Center. This is the one that I feel the least familiar with, but it also sounds potentially very insidious. And now they're trying to sell this as a package of these three together, claiming that - maybe these individually don't work that well on their own, but somehow magically, if we combine them together, it's going to completely be a Transformer robot or something and be able to save the world. So my understanding with this Real-Time Crime Center - and this ties into this expansion of Automatic License Plate Readers you were just talking about, Amy - is that it's just trying to basically aggregate a bunch of data from different sources that the police department has and then give this one view or something to some observer to call the shots about what's happening or what's not happening. What really worried me when I was reading about it is that it takes in these sources that maybe the City has deployed around, but it also offers this opportunity for private cameras to be incorporated. So people can opt-in to let their own - whether they have a Ring doorbell camera, that type of thing, or just a security camera at their business or their home - and they can allow, basically, law enforcement to have access to that without their neighbors necessarily knowing or people coming into their store. And that doesn't go through a process on its own at all and wouldn't be subject to maybe public disclosure requests to know where the location of those cameras were or where they're being pointed. So what more can you tell me about RTCC? Because I just - I'm worried.
[00:28:56] Amy Sundberg: I think you should be worried. Yeah, it is worrisome. And the more I read about it, the more worried I become. You always hope in these situations that you start out being worried and then those worries are ameliorated through gaining more knowledge. But in this case, it is the opposite. I think the ability to plug in all these private cameras into the system is a big issue. The amount of data that is going to be collected - I don't think that can be overstated - it's a massive amount of data, because it's taking in all the data from all these other surveillance technologies, both the already existing ones like license plate readers and these potential new ones. And then all of these private cameras, which can keep expanding over time without oversight because they're privately owned cameras. So the public doesn't really get to weigh in on those private cameras. They can be pointed anywhere - you are correct. And the City has no control over where the private cameras are pointed. But that data still is then brought back to the software and collated and run through algorithms and available for people to have access to. So that is definitely worrisome.
[00:30:03] BJ Last: Yeah, the fact that the City doesn't control where the cameras go - since they now allow the private ones in there - is a huge thing. You may think - Hey, the City wouldn't point a camera at, oh, say, the parking lot of Planned Parenthood or a healthcare facility, because Seattle wants to be a sanctuary city for people seeking abortion healthcare or people seeking transgender healthcare. Hey, a private individual can. Denny Blaine Beach - we just had that, where someone tried giving the City $550k to put in a playground there to effectively drive out the community at a queer beach - to disband it. Hey, they wouldn't have to give the City $550k, they could just point a camera there. So any place where you were like - Oh, well, the City wouldn't do that, they wouldn't target any groups - guess what? Any private individual can go and point a camera wherever they want, and now that's getting fed in. And that is now data that does not need a warrant to be accessed. And so any marginalized group that Seattle is trying to be a sanctuary city for is completely at risk from this. So just all of that is now in play as these private cameras roll out.
And beyond private cameras, RTCCs, the Real-Time Crime Centers, are also another Software as a Service, like the Acoustic Gunshot Location. And part of that is they openly brag about how they are constantly rolling out new algorithms as part of your subscription package. For one, that really seems like it violates the Surveillance Ordinance, because those new algorithms aren't going up for public review as part of that process - so that review can't happen. And then what even are the algorithms they're rolling out? Some of the things these companies are trying to do include detecting whether or not someone has a gun on them by using cameras and looking at the way they walk - which, unsurprisingly, is as incredibly inaccurate as it sounds just from me trying to describe it. So that's now potentially part of the RTCC. SPD could now roll up because - Hey, the camera algorithm thought you had a sort of funny walk, so guess what? The cops are now getting called as if you're someone carrying a gun. That's so absurd it doesn't sound like it should be real, but that is actually what this is.
[00:32:11] Amy Sundberg: I have a couple of other concerns as well - going back to the privately-owned cameras for just a moment. Because they're privately owned, it makes things more complicated and confusing in terms of the restrictions that normally govern the police. So, for example, they wouldn't necessarily have to get a warrant for footage that they normally would be required to get a warrant for. And there are settings that the private camera owners can adjust, but it's confusing. I don't think your layperson is necessarily going to know what they're opting into. I've spent the last two weeks immersing myself in information about this, and I still find aspects of it confusing. And your average person doesn't have two weeks to do that, you know? So it kind of disrupts the current checks and balances we have around surveillance and police power, which I find very concerning.
And then in terms of undermining Seattle's status as a sanctuary city, one of the things that is key to understand about this software is that the privacy of the data is not guaranteed. Once it's in that Real-Time Crime Center software, there's a lot of interagency exchange. So SPD might originally get the data. And then it could be exchanged with another law enforcement department somewhere else. And they could exchange it with another law enforcement department somewhere else. And then it could end up with ICE [Immigration and Customs Enforcement], as one example. I asked some experts - because we do have an ordinance here in Seattle that requires that when ICE makes a request of SPD, it be referred to the Mayor's Office Legal Counsel. I was like - Well, would that help? But probably not, because of what I just stated - the data can pass from agency to agency to agency. If it's some fourth agency that's giving it to ICE - it's not SPD, so there's no chance to have that interruption there. As well, there are documented cases where a police officer will just give the data to ICE - whatever policies are in place, they'll just kind of conveniently ignore them and hand over the data. So the idea really is that once this data is being collected and collated, it is very difficult, if not impossible, to firewall it, protect it, and make sure it stays in a limited space at all. And that has implications, as we've said, for undocumented people. It has implications for people who are seeking abortions - especially from other states where abortion is no longer legal. But we might eventually live in a world where abortion is no longer legal here in Washington state, and then it would apply to anybody seeking an abortion. It applies to all sorts of cases where privacy is really crucial - and not because anyone is committing gun violence.
[00:35:08] Shannon Cheng: Yeah, that point about who does get access to the data that's collected. It's one thing to have all these things collecting the data, but if it isn't well-protected or there isn't a good system to limit or manage who has access to it, that's very concerning. And as you said, it impacts vulnerable communities first, but ultimately it impacts all of us. When marginalized communities feel like they're being targeted, they tend to retreat into the shadows and the margins - and that just is not good for anybody. Right after the Muslim Ban, we worked on trying to make sure that local law enforcement wasn't cooperating with federal immigration enforcement. And one of the arguments was that if undocumented folks can't trust local law enforcement to not turn them in for deportation - if they're a witness to a crime or something like that - they're not going to want to engage and help the community solve these ills. They're just going to go into hiding. And that's just bad for all of us in general. So it's really worrying. And then also, in addition to these unknown other people who have access to the data, Seattle Police Department officers themselves, in theory, might have access to that data. And we have some documented cases, even recently, where they have abused their access to data. Is that correct?
[00:36:25] BJ Last: Yeah, yeah - absolutely. That is correct. We have had cases of SPD officers abusing access to data. One of the most famous ones was an officer effectively stalking an ambulance driver, an EMS person, and even having people involuntarily committed just to get to see that EMS person. By the way, that officer is still on the force. So, you know, that says something about how well our accountability system supposedly works.
[00:36:50] Shannon Cheng: Wait, what? Because they wanted a date with the EMS person or something?
[00:36:55] BJ Last: Because they wanted a date with them - that they were going and doing that.
[00:36:59] Shannon Cheng: Wow.
[00:37:01] Amy Sundberg: I would also just chime in and say we're talking about these really harmful impacts to our most vulnerable residents, our most marginalized residents. And I would say that is true across all three of these technologies, and it's documented. Just in terms of ShotSpotter - it increases pat downs and frisks, and increases policing in the more marginalized communities, which tends to be where the microphone arrays are located in a city. And with CCTV, it's been shown that people of color are more likely to be surveilled than other folks, so there is a disparate impact. So this is a throughline across all three of these technologies in terms of some of my gravest concerns - because again, these are not new technologies, so we've already seen how they've operated in the real world.
[00:37:52] BJ Last: Yeah, and just to go on that, a couple of real concrete examples on each of these technologies causing massive amounts of harm and abuse. In Washington, D.C., there was a case of a very high-ranking police officer - I believe he was a lieutenant, offhand - blackmailing gay men using CCTV footage. In the UK, there was a case of a CCTV operator who got fired because he kept pointing cameras into a lady's apartment - I'm sorry, a flat, because it was in the UK. Very real risks of harm. Acoustic Gunshot Location - we know Adam Toledo, a 13-year-old who was chased and shot while unarmed by the Chicago Police Department because they were responding to a ShotSpotter alert. Just last month, in January, in Chicago - cops responding to what was listed as a ShotSpotter alert opened fire on an unarmed man they saw because one of them heard a loud noise when he stepped out of a car. Also out of Chicago - we have seen police officers literally run over gunshot victims because they were responding to ShotSpotter alerts. These are all examples of very real harm that these technologies have caused.
[00:38:57] Amy Sundberg: In addition, once we start talking about algorithms - which is what a lot of these technologies use - the algorithms tend to have racial bias baked into them because they're trained on datasets, and their datasets are informed by the racial bias that created them. So you end up in this loop where people are - Oh, well, the algorithms will solve racial bias. No, that is not true - because the data they're trained on has racial bias in it. So you see it instead perpetuated and potentially strengthened.
[00:39:27] Shannon Cheng: Yeah, garbage in, garbage out. In my past life, I hung out with a lot of people who were very technology-focused, and I can see this - Oh, we'll just add all these things together and it's going to work. The problem is that they're trying to model the real world based on very concrete assumptions about cause and effect, when we know the real world is actually very nuanced and requires a lot of context to interpret. And the problem with these surveillance technologies is you're getting a very narrow view of different aspects of the world. So, for example, with the Acoustic Gunshot Locator, you're just getting random sounds. And then okay, maybe now you're trying to match them up with a video feed to try to figure it out. And now you're adding in this algorithm that's going to compile it all together. But the thing is, we're talking about real people's lives at stake - people they're basically experimenting on. This is a testbed for unproven models with real-world consequences, and we're talking about the actual people who live in our city - if the technology makes a mistake, somebody gets run over or somebody gets shot. Because we've seen that there's this worldview law enforcement has where they see a lot of things as a threat, or they just feel like there's a lot of danger out there when that may not be the case. There's a difference between being uncomfortable and being unsafe. And I don't know that these surveillance technologies are really going to help with distinguishing between being uncomfortable and being unsafe. In some ways, surveillance technology allows them to abstract away from the real situation - when you look at things through the camera, you're like - Oh, well, it's a fancy technological solution, so it's got to be right. But you can't just assume that what the camera sees is the truth.
[00:41:19] BJ Last: Yeah, and you talked about how these are unknown, haven't been studied - well, guess what? Stuff that actually reduces violent crime has been studied - this isn't something we don't know - there are very real solutions here, which is the much cooler thing. And I'm really happy that we're now transitioning into this, but most of them largely boil down to: actually invest in community. Instead of giving the money to a tech company somewhere, invest in the actual communities themselves. There are some examples of that - the Rainier Beach Action Coalition's program of youth violence interrupters, which are people in the community who are out there activating neighborhood street corners, has been shown to reduce violence by 33%. In terms of what actually investing in community looks like - for that $1.5 million, they could give 168 young people jobs for two years. So invest in community - what Rainier Beach Action Coalition does is proven. You can invest in community, give 168 people jobs, and you reduce violence. Or give the money to a couple of big tech companies - that's just one of the choices in front of us.
[00:42:20] Amy Sundberg: We also have this work done in King County through Public Health and the new Office of Gun Violence Prevention. And I sat in on their meeting, giving their briefing to the new council. For example, they give out free firearm lock boxes. And basically it means you have a safe place to store your gun - because a lot of times kids get guns because they're just lying around in a closet or a drawer or whatever. But if you have them locked up, then the kid can't get to the gun and suddenly everybody's safer. So they hand those out for free, which is very effective. They also hosted a gun buyback where people could go and get gift cards. And apparently it was so well attended last year that they ran out of gift cards before the end of the event. So there actually is an appetite in this community for these sorts of programs. It's more a question, I think, of funding than anything else. Instead, we're going to throw $1.5 million away on this technology that we're pretty sure isn't going to work, when we have these things that the community wants and that we know will help. And that office also coordinates with the Peacekeepers Collective and their gun violence prevention programs as well. So there is a lot happening on a local level.
And then as well, there's Guaranteed Basic Income, which I always have to give a shout-out to. But the reason I want to shout it out, and one of the reasons I'm so excited about it, is because it has been shown in studies to reduce firearm violence specifically. And also addresses inequality - and what we know, again, from other studies, is that inequality predicts homicide rates better than any other variable. So the more unequal your society is, or your city is, the more likely homicide rates are to go up. So if you address that and give people their basic needs - give them what they need - then that number tends to go back down. And maybe not the sexiest idea ever, but it works. And that's what's important. We've seen a violence interruption program in Richmond, California - which I love to pieces because it's been going on for a long time - it has hugely positive results for that community. And it actually combines the idea of a basic income with other services like mentoring for young people that live in Richmond, California. And like I said, they saw a huge reduction in violence. So you can get creative in terms of how you combine these different elements, but all of them have studies backing them up that show that they're effective in the real world.
[00:44:55] BJ Last: Yeah, and that's a phenomenal point, Amy - it's not even just community investments that specifically target gun violence, like the Cure Violence model or gun violence interruption programs. Like you mentioned GBI; restoring vacant land - so pretty much making things into little parks, putting out grass and a few trees - has been shown to reduce violent crime, including gun crime. Upping the number of nonprofits in the community, mental health treatment options - even things like that, which aren't specifically directed at gun violence and don't say in their mission statement - Hey, we're directly addressing this. With these community investments, as Amy said, you reduce inequality and you reduce crime, because that is the biggest thing connecting them. So doing that - reducing inequality, investing in community - will actually reduce crime and cut down on gun violence. Whereas giving money again to these three tech companies doesn't do that.
[00:45:48] Amy Sundberg: I also am really excited about the idea of creative placemaking, as a creative artist myself. That, again, has been shown to reduce gun violence - it's putting up art installations and cool, funky, creative plays and concerts. Basically, we have this opportunity to invest in making Seattle a more fun and vibrant and exciting place to be. And that will also reduce gun violence. It's one of these win-wins, right? Same with some of these violence prevention programs - you're investing in community and you get the reduction in gun violence at the same time - it's another win-win. As opposed to the surveillance tech, which isn't going to be effective and has all of these different harms, so it's more of a lose-lose. And when you get to pick between a win-win and a lose-lose, the fact that we're having this big debate and wanting to go with the lose-lose is a little bit baffling.
[00:46:49] Shannon Cheng: And the lose-lose is super expensive - we're talking about $1.5 million now. But my understanding is these companies are for-profit companies. So they obviously have business models which range from subscription services, to just trying to expand their footprint of deployment, to selling the databases of information they're collecting from us to other parties over whom we may not have any control. It boggles the mind.
[00:47:16] BJ Last: It is massively expensive. For just one of these technologies, Acoustic Gunshot Location, Chicago has spent over $50 million over six years. And again, that's just one of these technologies - Seattle wants three. And not to say - Oh, we should be penny-pinching to try to reduce gun violence by going with investments like restoring vacant land, placemaking, Cure Violence models. We shouldn't be doing them just because they're cheaper - but they work, and the money goes a lot further in terms of the number of investments you can make. And all of these investments are ones that actually do - yeah, make your city cool. They make it a better place, like Amy said, with the creative placemaking and restoring vacant land, they cut down on violence, and you can do a heck of a lot more of it than you can if you go for this surveillance tech.
[00:48:06] Amy Sundberg: While actually involving community - the people that live here - and giving them the resources and giving them more agency.
[00:48:13] Shannon Cheng: Yeah, wow. Well, here on Hacks & Wonks, we interviewed a lot of the City Council candidates - many of whom are seated now - and I remember hearing a lot from them about really needing to audit the budget and making sure that the money being spent is being used effectively. So I hope they hear this - pick the win-wins, not the lose-loses.
So we're partway through this messy process, which seems like it's being rushed. For our listeners who have listened to this and they have concerns, what can they do about it?
[00:48:42] Amy Sundberg: They can do so much. Now is the time. There is a lot that can be done right now. And I really encourage people to get involved in whatever way feels best for them, because there are several options. I'd say the top option is to attend that second public hearing, which again is on Tuesday, February 27th at 6 p.m. - there's both a virtual option and an in-person option at the Bitter Lake Community Center. So I really, really encourage people to go, to give public comment, to support your community members who are in this fight with you. There are also forms online for each of the three technologies, which you can fill out - and you do have to fill out the form three times, which I understand is not ideal, but I think, again, it is part of trying to make this process less accessible to community. So if you can stomach it, I say - let's show them that it's not working by filling out those forms. You can call and email your councilmembers, because they're ultimately the ones that get the final say - they're going to have the final vote on whether or not these surveillance technologies are deployed. Start talking to them now - it is definitely not too early. Whatever you can do - if they're going to be talking in your community, if they're having a town hall, go talk to them there - the more, the better, frankly. You can write a letter to the editor at The Seattle Times. Those aren't op-eds - they're much shorter and easier to do. I encourage you to do that. And Solidarity Budget has put together a letter objecting both to the use of these technologies in our communities and to this rushed and sloppy process, which you can sign on to. We'll put a link in the show notes for that. You can sign on as a group or an organization, or you can also sign on as an individual. And I really encourage you to do that, because it shows that we as a community are standing together.
[00:50:38] BJ Last: And follow Solidarity Budget - we will have more updates as this goes. If there are any more educational items that come up or additional ways to give input, we will definitely be sending that out through those channels. As Amy said, there's that hearing coming up on the 27th - you can do public comment. Or you can do comment forms online anytime until the 29th. And talk to your friends about this. This has not been something that has been widely covered - which, by the way, thank you so much, Shannon and Hacks & Wonks, for covering this, because it really hasn't gotten much coverage in local media that there are these three big surveillance techs coming. So there's a chance your friends, co-workers, whoever else you chat with doesn't even know about this. So let them know as well.
[00:51:21] Amy Sundberg: I really think that increasing surveillance to this level - this does represent a massive expansion of surveillance in Seattle, and I really don't want to understate that at all - it's a huge expansion. And I really think it's deserving of a really robust public conversation about what we want for our city and what direction we want our city to go into. And I don't want to get into national politics, but you have to think about the national political climate and the ramifications that are coming down the road, too. When you're thinking about increasing surveillance to this level - not only what is that going to enable us to do in June or July when it's first implemented, but what is it going to mean in the future? What is it going to mean next year and in future years, in terms of where your data is going to be, what the laws are going to be, et cetera, et cetera. This is something we should all be talking about, as far as I'm concerned - all the time - we should be talking about this.
[00:52:18] Shannon Cheng: Well, thank you so much. We will definitely include all the links to all the information and the resources in the show notes. This show will be airing on February 20th, so you have a week before that final public hearing on the 27th to get your comments in, to figure out how to attend, to tell all your friends to get out there. So thank you so much, Amy and BJ - it's been so great to have you back on again. Bye!
[00:52:43] Amy Sundberg: Thanks.
[00:52:44] BJ Last: Thank you.
[00:52:45] Crystal Fincher: Thank you for listening to Hacks & Wonks, which is produced by Shannon Cheng. You can follow Hacks & Wonks on Twitter @HacksWonks. You can catch Hacks & Wonks on every podcast service and app - just type "Hacks and Wonks" into the search bar. Be sure to subscribe to get the full versions of our Friday week-in-review shows and our Tuesday topical show delivered to your podcast feed. If you like us, leave a review wherever you listen. You can also get a full transcript of this episode and links to the resources referenced in the show at officialhacksandwonks.com and in the podcast episode notes.
Thanks for tuning in - talk to you next time.