Lawfare Daily: Chris Hoofnagle on the Theory, History, and Future of Cybersecurity (2024)

Chris Hoofnagle, Visiting Senior Research Fellow at King’s College and Professor of Law in Residence at the UC Berkeley School of Law, joins Kevin Frazier, Assistant Professor at St. Thomas University College of Law and a Tarbell Fellow at Lawfare, and Eugenia Lostri, Lawfare's Fellow in Technology Policy and Law, to discuss ALL things cybersecurity—its theory, history, and future. Much of their conversation turns on themes expressed in Hoofnagle’s textbook, “Cybersecurity in Context,” which he co-authored with Golden G. Richard III. The trio also explore related concepts such as the need for an interdisciplinary approach to teaching and studying cybersecurity.

Click the button below to view a transcript of this podcast. Please note that the transcript was auto-generated and may contain errors.

Transcript

[Introduction]

Chris Hoofnagle: Law enforcement, the military, the intelligence community, private sector actors all look at security differently. They have different incentives, different goals, different interpretations of what security is. And part of what's made regulating security so difficult is that these goals are sometimes in deep conflict.

Kevin Frazier: It's the Lawfare Podcast. I'm Kevin Frazier, assistant professor at St. Thomas University College of Law and a Tarbell fellow at Lawfare, joined by my colleague, Eugenia Lostri, Lawfare’s Fellow in Technology Policy and Law, and our guest, Chris Hoofnagle, visiting senior research fellow at King's College and Professor of Law in Residence at the UC Berkeley School of Law.

Chris Hoofnagle: This is a rich field that involves so many different issues, from standards to forensics to international relations. So part of what we're doing is getting the student to slow down and to appreciate just how big the picture is.

Kevin Frazier: Today we're talking about “Cybersecurity in Context,” a new textbook coauthored by Chris and Golden Richard. The book explores the theory, history, and future of cybersecurity.

[Main Podcast]

Eugenia Lostri: Chris, I mean honestly, your textbook is pretty impressive. I think the first thing I said to Kevin is that, oh my god, this is really a history of everything cyber, right? So that's, that's really incredible. But I imagine that a big challenge of writing a history of everything is ensuring that it remains relevant, that you're not having to constantly update any time that there is a big change in technology, or a small change in technology, or in policy, since we're finally seeing a lot of action on that front. So I was curious as I was reading it, how did you go about future-proofing your textbook?

Chris Hoofnagle: That's absolutely right. That's a central concern in writing a textbook. And so what my coauthor, Golden Richard, and I did was take a framework approach. The book is filled with theory, with high-level framing questions one can ask. We try not to answer questions, but rather to equip students with the right questions to ask, many of which are based around cost-benefit analysis and really thinking critically about what we're trying to do with security and whether our interventions work, how they might fail, how technology might change. So you've identified a central concern for us, one that we've struggled throughout the textbook to address.

Eugenia Lostri: Yeah, no, I can imagine. And I have to say the idea of the tradeoffs is so central, right? And it's not, sometimes not sufficiently discussed when we're talking about cybersecurity. We all want to accomplish 100 percent of what's in our respective silos. So having that as a theme, I would imagine, is going to be a great contribution. Something that struck me, though, is that you chose to have these images from the Iliad and the Odyssey, which is not necessarily what I think about when I'm thinking about technology. So, I'm curious if you can tell us a little bit more about why you chose that, what it represents, and how it's supposed to help students as they're going through the textbook.

Chris Hoofnagle: Golden and I are both classicists, and we think that there are lessons from the Iliad and Odyssey that are relevant to today that we found that many students aren't familiar with. And the primary lesson is that you have to use your head to defeat your adversaries and not your brawn. You know, all of our popular media today presents heroes and soldiers as these kind of mega warriors rather than as people who use their head and use techniques that are as old as history. You can go back and read Caesar, Julius Caesar, and his use of trickery and disinformation to win battles. The Odyssey is filled with disinformation and clever tricks that allow societies to win without using violence. And I think this is a lesson we really need to understand: that most people are going to, well, the smart people in the world are going to try to engage in adversarial conduct using their head rather than their brawn.

Kevin Frazier: And when you mention use of a framework approach, using your brain, for example, to dive into, let's say, a cost-benefit analysis, one thing that came to mind for me was that we're seeing some challenges to the use of CBAs, as the shorthand is, in other contexts. Where, for example, in an antitrust framework, we've been seeing more and more people challenge this kind of over-reliance, perhaps, on quantification of the good or bad of a certain policy.

Was there a sort of back and forth with you and Golden on whether that was the best normative framework to use, or what was it about a CBA that you thought leaning into it in a cybersecurity context made the most sense for students, given where we are, given other policy conversations going on?

Chris Hoofnagle: I've been a big critic of cost-benefit analysis throughout my career. I've identified it as a kind of one-eyed analysis, but let me tell you, my sometimes colleague, Peter Schuck, convinced me that cost-benefit analysis is the way to think about regulation. And the way to think about cost-benefit analysis is to broaden one's lens to think about some other values.

So one of the questions we challenge students to think about is whether a security tradeoff involves a privilege, a right, or an interest. And if we are trading interests or trading privileges, that's a very different issue than whether we're trading a right. Another question we ask in our framework is whether the security measure makes opportunities for guile. So can, if you implement this security measure, does it make it possible for decision makers to take your money? Or to deny wrongdoing? So, security has to be seen in this larger framework where assaults to our civil liberties are part of the costs considered, and whether or not accountability is actually in the system.

And if there's not accountability, if there's no way for the data subject, the individual, to hold a decision maker to account, that is a cost that has to be in the framework.

Kevin Frazier: And I think this is particularly exciting because of my own experience being a student of yours in an interdisciplinary cybersecurity course. Having a framework that allows for the incorporation of diverse disciplines and encourages students to reach out to that public policy student, or reach out to maybe even that philosophy student, and get a new understanding of what should be included in that CBA is really fascinating. So would you say that this textbook really is a push, even a subtle push, or maybe an explicit push, to make cybersecurity more interdisciplinary?

Chris Hoofnagle: Absolutely. As you know, Kevin, from being in my classes, my courses at the University of California are deeply multidisciplinary. It's one of the reasons why Golden and I wrote this textbook. We struggle to teach our respective students important cyber literature from other disciplines. The classic example is trying to teach law students Thomas Rid's “Cyber War Will Not Take Place.” And that's an amazing article, and it has some subtleties and disciplinary assumptions that are just outside law students’ toolboxes. Now you got it because of your background, but most of--

Kevin Frazier: Who knows if I really got it. You're being too kind. For all the listeners out there, I was probably back there raising my hand all the time: what the heck is going on here? But I appreciate the kind words, professor.

Chris Hoofnagle: Well, they're well deserved.

And so what we're trying to do, I think what the textbook does, is it synthesizes the economic literature, the psychological literature, the political science/IR/security studies literature so that students who don't have a background in these different disciplines can make sense of it. And a lot of that is about the disciplinary assumptions that are unstated in political science and that actually conflict with lawyers’ ideas about what's just in the world. And lawyers are very focused on individual, individual rights. And so the disciplinary assumption that a political scientist or security studies scholar might have, it's just a different level. And it's just not obvious to people studying in the field.

Eugenia Lostri: I just, I want to jump in here as someone who comes from a background that had nothing to do with cybersecurity or technology, here another lawyer with an interest in international law. The way that I read this section of your textbook was, there's so many different ways in which you can contribute to the discussion, even when you don't have this technical background, even though you should probably learn a little bit, like your textbook is doing, just learning a little bit of everything to understand the context.

But I'm wondering, because you do work with students every day, you see this firsthand, and we know that there is a cyber workforce deficit. We need a push towards bringing more students interested in the field in any one way, whether it's policy, whether it's law, whether it's economics, psychology, or the actual technical stuff. So, I'm wondering, in these cross-disciplinary classes that you have, do you see what the challenges or the hurdles in the way for these students are? Or actually, do we just need to wait a bit longer until all of them graduate, and then the cyber workforce problem will be solved?

Chris Hoofnagle: That's a great question. The problem that students have is that first step, believing in themselves and getting that first job. And what's so interesting about this is that America and other nations need millions and millions of people to work in cyber. And the upside for students is that even the entry-level job, so, information security analyst, is a $100,000-a-year job. So, one in cybersecurity can do well. You can have a great middle-class job. The problem is getting that first job. And so what we do in the textbook is we integrate labs to increase students' facility with computers. So, it's written so that even students who have no programming experience can do the labs and learn more about their computers, but also to demystify things a bit.

What's going on in a lot of, let's say, SOCs is not terribly complicated. And a college student who has good critical thinking skills, good communication skills, can absolutely do that work. But currently cybersecurity is cloaked in all this mystery and intrigue. So what we want to do is demystify it and say, hey, there's a place in cybersecurity for you. And it doesn't matter if you're, if you're a music major or some other major in, in the social sciences; what really matters is whether you can think and whether you can communicate.

Kevin Frazier: Speaking of that deficit of cybersecurity workers, you spend a lot of time in the textbook pointing out that the military was the original cyber actor. And I think for a lot of those jobs you mentioned, the $100,000 job or some sort of private sector job, I think there's a particular deficit in the public sector, finding folks who will be in those day-to-day roles, helping the government itself respond to these crises. Thinking about some of these broader efforts we're seeing now in the AI context, for example, the Department of Homeland Security created an AI Corps to try to bring more tech talent into the government. Would you call on DHS and similar entities to say, hey, we need a similar Cyber Corps? Or can we expand efforts like that to say, this is our sort of Peace Corps for cybersecurity? Or what can we do to get more public sector expertise on this front?

Chris Hoofnagle: The public sector has a particular problem. They're training people from zero, taking them from zero to 60, and then they lose them. So the military and many other agencies are training new people, and those very people go out and get jobs at consulting firms where they sell their services back to the public sector. So, it's great for those individuals, but it's very costly for the taxpayer. So, we absolutely need ways to integrate new learners, so that that first job is in the public sector, and, no one wants to hear this, but we have to make pay higher. We have to pay more, and we have to make these jobs more flexible. There's a lot of people who don't want, for whatever reason, they don't want to be in the Washington, D.C. area. They want to live elsewhere in the world. And for whatever reason, they don't want to live under the burden of a security clearance. And these are some of the factors that make it hard for the public sector. So why should I live with the burden of a security clearance when I could get a job at a Bay Area company that's not going to bother me about whether I'm friends with people from certain nations or what my weekend recreational activities are? These are some of the challenges we need to overcome.

Eugenia Lostri: So you started your question, Kevin, exactly the same way that I was going to start my question, which is about the military as the original cyber actor. And I was very curious if you could tell us a little bit more about how you see the fact that it started in the military, how it shaped the history of cybersecurity so far, the way that we understand it, and whether you're seeing a shift now when we consider this more in the commercial space, the private space. Has it changed things? Or are we definitely still burdened by the original sins of how everything developed?

Chris Hoofnagle: It might be teaching at Berkeley that shades my lens on this, but my experience in academia is that many in the professoriate are dismissive of the military. They don't understand how complex it is. They don't understand how big and how awesome and how thoughtful people are in the military, and I really got this fully into focus in studying Willis Ware's archive at the University of Minnesota.

Willis Ware did all sorts of consulting and study for the military, and some of his reports are now declassified. And what you learn from these reports is that in the late 1960s, the military had already figured out a lot of things that we are still struggling to figure out in the corporate sector. A prime example is security by design: in a report that Willis Ware wrote in 1970, one of the first recommendations is that computer systems have to have security as a design factor. That is security by design. In the 1970s, the military was doing red teaming. They called it tiger teaming. They also figured out that multi-tenant environments are inherently insecure.

They figured out all these things that we seemed to forget and relearn in the 2000s and 2010s. And so I think we should be humble, humbled by this. And I also think that we ought to be looking at the military and the intelligence community more carefully for leadership in understanding security problems and then understanding what to do about them.

Kevin Frazier: And I know Eugenia is going to have a lot to say. I do want to make one plug just generally while we're talking about military opportunities, as someone who was two weeks away from bootcamp, joining the Air Force JAG, but then was medically disqualified. That's a whole other podcast, but for all those law students listening right now: look at JAG programs. If you want to get involved on any of these issues and get real meaningful experience right out of law school, call a recruiter. It'll be worth it, and you'll see some really interesting opportunities. And then send me an email, and we'll talk further. But with that, Eugenia, please.

Eugenia Lostri: That's great. Thank you, Kevin. I just, there's so many threads in what you just said that I want to pull. Okay, let's start with the first one. You mentioned security by design, and you may know, or listeners may know, that we at Lawfare have this ongoing project looking into security by design, the incentives and disincentives for it. You have an entire section addressing this. And I do appreciate the plug that, yes, we've known what some of the technical solutions to this problem are for a very long time. We compiled this attempted literature review, looking at all the different ways in which different actors have been thinking about security by design. And it just becomes very surprising why these things are just not basic, why they're not in every single product. And I do appreciate the Biden-Harris administration's push for security by design. I think that's great.

But basically, the way that you talk about it, and it's a very common way to talk about it, is about the incentive to be the first to market and how that detracts from security. It makes sense: security creates friction, it means revisiting, means having to do things again, and if you're not the first to market, you're probably going to lose the race. So if security is not required for minimum viability of the product, do you think this is exclusively an economic problem? Or these other categories that we've been talking about, the technical side, the people side, the psychological side, do those influence it? Maybe they don't influence it just as much as the economics of the market. How do you see that, now that you've done all this work, how do you see that affecting security by design?

Chris Hoofnagle: Let me just start with some humility. I'm a startup lawyer. I'm basically a corporate lawyer now. And my understanding of cybersecurity comes from my lens. And that lens is not universally true. It's just the lens I have and what I see. Most of my work is in the startup space and the venture space. Small companies tend not to have a chief security officer or security team. They tend to have someone who's very good at security, but not a comprehensive way of looking at things. And they understand the game is about being acquired or going public. And these economic factors are overshadowing the legal factors.

So speaking as a lawyer, I have to say they have to do all the things and they have to comply with GDPR and so on. But speaking as an expert in the field, I know that the economics of becoming very affluent, of selling one's company, outweighs security. And that might be okay. One way to think about it is, we're going to have this innovation, and oftentimes the acquiring entity cleans up the security problems. But it leaves a lot of users in a lurch. And you can think about some of the social media players out there where companies got big very quickly and then had breaches that were catastrophic for their users' privacy. That's going to be the price we pay for that innovation.

Eugenia Lostri: Since you brought up social media, let me tie that back to the military aspect of this. Personally, I've always been a little bit skeptical of pairing psychological operations with cybersecurity. It just seems like they're sufficiently distinct and they have sufficiently different histories to be considered separately, but I know that's not the case for everyone. A lot of discussion has bucketed psyops as part of a cybersecurity problem, which has been super interesting to just read about.

But you discuss these psyops, and I'm just quoting: “the prospects that cyberattacks might cause loss of crisis control is a powerful psychological factor for the military.” So are you understanding psychological operations as in, you may be convinced that your cybersecurity defenses are not good enough? Or is this just that psychological operations by themselves are going to have this demoralizing effect on the military?

Chris Hoofnagle: So the latter, and I think you're absolutely right. I think your framing is absolutely right. I would draw a line. One is about electronic warfare, which is separate, but then overlaps with cyber. Another is psychological operations that do not need to be on the internet at all. They could be entirely in person and so on, which again, overlaps with cyber.

So I think what's powerful about cyber operations, however, is this psychological effect that an adversary's systems might just not be functioning, and it might be because of the incompetence of the adversary. A great example is Olympic Games, Stuxnet, with the centrifuges; that attack seemed to create a powerful effect on the adversary. And I think all adversaries to powerful nations have to think about whether their systems are malfunctioning because a powerful cyber actor like China or the United States is inside them. And so I think there's going to be this fundamental concern about control and systems going forward.

Kevin Frazier: So bringing up some adversaries now, I think that moves us nicely into a kind of realpolitik conversation. I think there are very few cybersecurity professionals and scholars who would say they are satisfied with cybersecurity regulation as it stands right now. We have more or less a 50-state framework for privacy and a lot of cybersecurity measures. And yet you and your coauthor point out that internet policy disputes, to use the words of David Clark, are often better defined as tussles. And I love this phrasing of tussles. And you all say that you refer to them as tussles because they're perhaps harder fought than other policy debates.

And so I wonder if you think that we need something else to help students not only take these frameworks, but then learn how to apply them in a kind of realpolitik setting. Because we've seen on the Hill, in these state legislative sessions, it's really difficult to pass meaningful cybersecurity protections and new bills. So what's your response to how we take these great ideas and get them implemented in practice, on the Hill or in Sacramento or pick your capital?

Chris Hoofnagle: I think the most powerful argument we can possibly make is sourced in the security-as-a-science literature. If one can show that their approaches actually make people's lives better, systems more secure, more resilient, I think that's the way to go. And so I'll just throw a curveball: I'm not sure at all that security breach notification helps anymore. And I think it's something that we ought to rethink. I also think that lawyers have made giving notification the terminal goal of a security breach rather than security. The whole point of security breach notification is to create incentives for security. So I think there ought to be a rethink around those measures.

The realpolitik issue, the way we address the realpolitik, is to talk about cybersecurity through different actors’ lenses and to explain that law enforcement, the military, the intelligence community, private sector actors all look at security differently. They have different incentives, different goals, different interpretations of what security is. And part of what's made regulating security so difficult is that these goals are sometimes in deep conflict.

Kevin Frazier: And I think what's particularly important about the textbook as well is that you all push students to think creatively about policy solutions. As you just pointed out, lawyers tend to be, sorry to the rest of the legal profession, fairly path-dependent, right? If we can squeeze something into an individual due process mindset, or say we can check a box and just persist with this outdated notification law, even if we know no consumers respond to it, we're like, you know what, that's all in a good day's work for a lawyer. Let's just keep on with the status quo.

Eugenia Lostri: It's good to be self-aware.

Kevin Frazier: Yes. So, would you say, Chris, that inspiring that sort of policy creativity is one of the goals of the textbook?

Chris Hoofnagle: Golden Richard and I want to broaden what people consider to be cybersecurity. So many people out there think it's incident response. It's security breach. No, this is a rich field that involves so many different issues, from standards to forensics to international relations. So, part of what we're doing is getting the student to slow down and to appreciate just how big the picture is. But then once one sees how big the picture is, you realize that cybersecurity might become a form of universal regulation and kind of an excuse to regulate everything under the sun. So, we have to both see the big picture, but also be able to decompose its elements into pieces that make sense and that are regulatable.

Kevin Frazier: And I think, too, you all stress accountability, which is an underappreciated part of any regulatory regime. If bad actors can just continue to engage in bad behavior, or let's not even say bad actors, just accidental actors who engage in a practice they didn't intend to, without accountability, nothing really changes. Would you identify that as one of the current biggest flaws with our approach to cybersecurity?

Chris Hoofnagle: So, accountability is basically not present in some areas of cybersecurity. And a great example is cybercrime, which is an area where people can make lots of money and, more or less, they're unlikely to ever receive any type of investigation or accountability. And then when we pan out a bit and say, okay, we can point our fingers at cyber criminals all we want, I think we also need to think broadly about entities like critical infrastructure providers, where the accountability really has to be around resilience. For instance, I don't think users are, as a user of my electrical utility, I don't really care about their security. What I care about is whether or not the electricity is on. And so marketplace incentives are really important, and where you have monopoly and where you have no choice, you might not have resilience and accountability. And so let me give a recent example.

A recent example is the car dealerships all across America recently suffered a major ransomware attack. I had both my cars serviced during that ransomware attack. Both of my car companies were operating, two different companies; they were operating, they were able to take appointments, they were able to work on my car. And I think that's because there's competition in the private sector. And it doesn't surprise me that we see total shutdowns when there is no competition, when, let's say, a pipeline receives a ransomware attack. And the answer is to shut off service. So I see accountability in a lot of different ways, everything from ensuring legal accountability through the criminal law, but also through the market and how the market can shape incentives for resilient operation.

Eugenia Lostri: So I have a question about accountability that is a little bit tangential to the question of cybersecurity. And actually, Kevin, this is also a question for you, because it has to do with AI, which again we cannot go through a podcast without talking about. But when we think about the accountability of machine learning, of artificial intelligence solutions that are being deployed, and the problem of this black box decision making, how are you thinking about accountability in that context?

Where should that lie? Should it be in the process? And maybe this is my lawyer brain thinking we should have a process in order to respond to this and make sure that there's accountability. Is it in the technical side? Is it on whoever is deploying it? Is it on the human in the loop? How should we be thinking about it? Because there's so many important decisions that affect many individuals that are just being decided through who knows what process.

Kevin Frazier: I can jump in there a little bit and just say I think that this is one of the biggest issues with the use of AI, especially in a defensive context or an offensive context, whether it's a military actor or just a private actor, deploying these solutions where you're not quite sure who's responsible for what. And this was just amplified by the podcast I did with Ashley Deeks and Mark Klamberg a few weeks back, where we're going to see AI systems interacting with one another. And when we get to that point, I don't know what you call black box squared. Maybe you just call it a black hole. I'm not sure. If I did just coin a new phrase, TM. But I think that's a really scary possibility that we have to reckon with if we're going to ensure this notion of accountability. So my answer is a non-answer, which is to say I don't know how we hold folks accountable.

Eugenia Lostri: Such a lawyer.

Kevin Frazier: Such a lawyer. And I'm so skeptical, and I’d actually love your take on this, Chris, given your attention to psychology as well. I'm very skeptical of human-in-the-loop requirements satisfying a lot of these accountability concerns, because we've seen things like automation bias, which is just, if you are engaging with an AI and it tells you to do X, what are you going to do? X. It tells you to do Y, what are you going to do? Y. It's not rocket science, it's just human nature, and that's fine, but I don't think we should ignore how our brains work, how our society works, how our organizations work, when we think about those accountability decisions. So that's my one cent of Kevin wisdom.

Chris Hoofnagle: I think it makes sense to look at historical examples, and the best historic example is the Fair Credit Reporting Act, which absolutely regulates AI. Credit grantors have been using machine learning in the form of multiple logistic regression for decades now. And your credit score is based on a somewhat secret, it's a half-secret, set of factors that are based on regression analysis. And the way we deal with that is we give individuals the ability to challenge the ultimate decision. They're not allowed to look in the black box. We know what's in the black box. It's things like payment history and so on.

Now, automation bias is definitely present in credit granting now. If you go to your local phone store and try to buy a new iPhone and the credit score comes back, it's not going to give, it's not even going to give a number. It's going to give a thumbs up or a thumbs down. And that salesperson has no choice. Yes, you get the phone, or no, you don't. And that's it. And so that's an example where we've totally locked into the automation of the system. We've totally taken away the authority of people to challenge it. Now the customer can go and say, I want to be reassessed, or the information that my decision was based upon was inaccurate, and get it redone. But I think there's a tremendous amount of academic study that can be done on credit reporting and the effects of this automation and automated analysis.
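[Editor's note: the mechanism described here, a logistic regression score over a handful of factors collapsed into a thumbs-up-or-down decision at the point of sale, can be sketched in a few lines of code. This is a hypothetical illustration only: the factor names, weights, and threshold are invented and do not reflect any real credit bureau's model.]

```python
import math

# Hypothetical, hand-picked weights for illustration only; real scoring
# models are proprietary and use many more factors.
WEIGHTS = {"payment_history": 2.0, "utilization": -1.5, "account_age_years": 0.3}
BIAS = -1.0

def score(applicant):
    """Approval probability: a logistic (sigmoid) over weighted factors."""
    z = BIAS + sum(w * applicant[k] for k, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def decide(applicant, threshold=0.5):
    """All the salesperson ever sees: approve or deny, never the number."""
    return "approve" if score(applicant) >= threshold else "deny"

# Two hypothetical applicants.
good = {"payment_history": 1.0, "utilization": 0.2, "account_age_years": 10}
bad = {"payment_history": 0.0, "utilization": 0.9, "account_age_years": 1}
```

The point of the sketch is the information loss at `decide`: the continuous score, and the factors behind it, are hidden behind a binary output, which is exactly the automation lock-in described above.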

I think it's also important, and what I see lacking in the field is paying attention to the “compared to what.” So you might not like credit reporting, but let me tell you, there was something before credit reporting that was worse, and it involved sitting down in an office and convincing a person that you could pay your bills without data. So as bad as the machine learning approach is, it might be better than the alternative. And that kind of “compared to what” analysis is missing in a lot of the critique of machine learning out there.

Kevin Frazier: You're going to get me just talking on for way too long, but the “compared to what” legal scholarship field is just too blank, especially with respect to issues like AVs. I'm driving in Miami right now. I would much rather have a whole set of AVs on the road than humans who are, let me tell you, just the most unpredictable actors ever. So please, we need more “compared to what” analysis out there. Here's a call for papers for all those scholars out there.

Chris, I do want to also just dive into maybe other critiques you may have of the way cybersecurity is traditionally taught. I think that we've seen a huge spike in the number of schools that are offering cybersecurity programs, which is great. It's in law course offerings across the board now. But you point out there are often issues with this monolithic view of cybersecurity, for example. We've talked pretty extensively about your emphasis on thinking through all of the actors involved in cybersecurity. Are there any other common issues you would just want to highlight, or maybe best practices you really think other scholars should be paying more attention to?

Chris Hoofnagle: I do think that cybersecurity has to be taught with a technical emphasis. And it's difficult to do this. This is one reason why Golden Richard and I wrote this textbook. Golden is a computer scientist at Louisiana State University and has long been affiliated with the NSA's Center for Academic Excellence program. So much of what we do is give a companion set of labs to the normal doctrinal instruction to show students that computers are not mysterious objects, that you can become more sophisticated with them, you can learn what they're doing. And in fact, a lot of cybersecurity skills are shrouded in mystery; once they are unshrouded, you realize these are things I can do. And I don't need to be a computer scientist to do it. I just need to be someone who can make sense of information. So having studied statistics and having some basic programming skills are some of the areas that I think are important to teach. And so what we're trying to do is bridge the gap between the doctrine and these technical skills. And it's rare to find people who have both.

Kevin Frazier: Well, I think we will go ahead and leave it there. Thank you again for joining us, Chris.

Chris Hoofnagle: It's my pleasure. Thank you so much for having me.

Kevin Frazier: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters. Please rate and review us wherever you get your podcasts.

Look out for our other podcasts, including Rational Security, Chatter, Allies, and The Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. Check out our written work at lawfaremedia.org. The podcast is edited by Jen Patja, and your audio engineer this episode was Noam Osband of Goat Rodeo. Our theme song is from Alibi Music. As always, thank you for listening.