
tv   Key Capitol Hill Hearings  CSPAN  November 12, 2014 11:00pm-1:01am EST

11:00 pm
board members will have five minutes of questions, and then we will open it up to questions from the audience. as with the previous panels, when i start to ask questions, some of our staff members will stand up in the back holding cards; you can get yourself a card, write down a question, and the staff will pass it up here. so we will just go down the row and start with professor cate. i'm not going to go at length into the biographies because i think they are all available to you. professor cate is a professor at the indiana university school of law. he has been on a number of previous boards and commissions on privacy, so professor cate, we will start with you. >> thank you very much. this is the time, i think, to say i'm color blind, so i will have no idea what cards are held up. perhaps you will wave them in a definitive way and i will pay attention. so let me just -- first of all, i'm sorry not to be here this morning, but the last panel is
11:01 pm
absolutely superb, and it is a privilege to be here. i really want to applaud the board for taking up this, i think, really difficult but fundamental issue: what is privacy, and how in practice might we go about protecting it within the private and public sectors. i want to offer observations as opposed to any specific recommendations or conclusions. this touched some of the last panel. i think the fips are frankly not tremendously useful. i'm not suggesting abandoning them, which is a big change for me -- ten years ago i wrote a chapter called the death of fips, but i've gained a little bit of perspective since then. i think we use them almost as if we can roll out these eight principles -- or however many, depending on which list of fips you use -- and that will get us somewhere. and far too frequently, both in the private and in the public sector, they really don't get us anywhere. what we end up doing is, just as we were talking about in the last
11:02 pm
panel, looking for substitutes for the fips: we can't have consent, so what could we have instead -- rather than asking what is the purpose to be served in the first place. maybe consent is no longer relevant as a tool to achieve that purpose; rather, what are we trying to do here, and really the question you've been asking all day: what are we trying to protect, what do we think protecting privacy really means. i say this about the fips in part because i'm not sure that they have ever worked terribly well, certainly in an environment where they are largely ignored, and i'm not sure that they work well in a world of massive data, whether we call it big data or just high volume data. the notion of a fips-like approach comes with a focus on the individual, when the broader issues are frankly societal -- the impact on civil liberties not of one person but of everybody. i don't know that the fips help focus on that, and frankly the
11:03 pm
fips have led to some silly results. i would just mention i've been surprised, for example, by the department of homeland security privacy impact assessment on border searches of electronic devices, which focuses a lot on notice as privacy protection. at the point that your device has been seized from you and its contents copied, it is difficult to think that notice is meaningful protection. it may be necessary, but protection it is not. second point: one of the things we are seeing emerging in the debate in the private sector -- and we see this especially in europe, in the context of discussing the general data protection regulation -- is a greater focus on risk assessment and risk management. i don't use this term because it is the jargon of the day, but because risk management is an incredibly valuable tool that, in the privacy field, we are far behind on. we don't yet have a clear idea what it means. part of the reason is we don't
11:04 pm
know what risk we're guarding against. we are very unclear on what are the harms, what are the impacts, what are the negative effects we think we are balancing, if you will, against the positive outcomes of the use of data. one reason i think the risk management approach offers value in both the public and private sector is that it makes us stop and say: what is it we are trying to accomplish, what are the positive benefits and negative impacts -- not measured in terms of fips, but measured in terms of actual impact on individuals, on society, or on the economy. whether you use risk management or, if you hate the term, some other framing, in either case -- third point -- i think there's a lot of reason to focus more attention on use of data. this has been a real weakness of the u.s. legal system. those of you who have suffered through law school know that the fourth amendment has almost nothing to say about use of data whatsoever. you can have illegally seized data that the court acknowledges
11:05 pm
is illegally seized, and there is no disincentive around its use; only the collection is what fourth amendment supreme court jurisprudence has focused on. for this reason i think we really would do better to think more about reasonable and effective limits on use, and i think that's what the public most commonly cares about. one of the practical reasons is that there is almost always a legitimate reason to collect the data -- some employment reason or security reason, some private sector reason. verizon had a reason to collect the data; the question was who could access it and how it could be used. but our legal system has focused enormous attention on collection, and once the data are in the government's storehouse, the data are more or less out of control. i think that is a critical area to focus on as well.
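The use-based limits described above -- checking a declared purpose at the moment of access, rather than restricting only collection -- can be sketched in a few lines of Python. The dataset names and purposes below are purely illustrative, not drawn from any real program.

```python
# A minimal sketch of a use limitation check: every access must declare a
# purpose, and that purpose must be on the approved list for the data set.
# Dataset and purpose names are hypothetical.

ALLOWED_USES = {
    "call_records": {"counterterrorism"},
    "travel_records": {"border_screening", "counterterrorism"},
}

def may_use(dataset: str, declared_purpose: str) -> bool:
    """Grant access only when the declared purpose is approved for the dataset."""
    return declared_purpose in ALLOWED_USES.get(dataset, set())

# The check happens at use time, so even lawfully collected data
# cannot be silently repurposed.
print(may_use("travel_records", "border_screening"))  # True
print(may_use("call_records", "marketing"))           # False
```

A gate like this can also log each declared purpose, which connects use limitation to the auditing mechanisms discussed later in the hearing.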
11:06 pm
fourth, as i mentioned, i think the fourth amendment, while a critical legal limit -- and i certainly concur -- that's yellow, right? for the rest of you, you will know i just got a yellow card. i think the fourth amendment is a critical legal limit and we must of course observe it, but it is not a very useful guide for telling you what to do in the future, for a positive analysis of privacy issues, and i think we should again be careful about that. too often in our rhetoric we say "it's permitted under the fourth amendment," as if that tells us anything other than that it is not illegal under the fourth amendment; it doesn't tell us anything about the ethics or desirability of doing it. and fifth, i would just say, in almost all of these areas -- and i understand in national security this is particularly hard -- i think redress is something we need to continue to focus on. we see many uses of data in the government setting and in the private sector which are done without regard to redress -- with just a sort of, well, if it
11:07 pm
affects a person inaccurately every now and then, what does it really matter? we will deny boarding to people on airplanes or provide extra security screening for the wrong people. this is not an efficient use of government resources, and it is not a good way to think about privacy. i think we should be very clear in those rare exceptions where we say there might be no redress available here for the individual, in which case we have to provide it through other means -- inspectors general or other ways of approaching it. but at all times we should think about redress, not just because of the rights of the individual, but because of the interest in ensuring that the system works as advertised and as it should. thank you very much. >> our next panelist is harley geiger of the center for democracy and technology, focused on civil liberties, computer crime and cybersecurity. thank you for being here.
11:08 pm
>> members of the privacy and civil liberties oversight board, thank you for inviting me to speak at your meeting today, thank you for your excellent work ensuring protection for privacy and civil liberties in terrorism programs, and congratulations on having one of the best acronyms in town. when it comes to evaluating privacy protection, the center for democracy and technology believes that the fair information practice principles are a very important framework for both government and the private sector. now, you can add other privacy frameworks on top of that. we do not disagree with professor cate that societal impact and protections focused on the purpose and use of data collection are also useful, but we view the fips as an indispensable framework for evaluating data collection practices. the individual principles, as you know, are overlapping and mutually dependent on one another. it is a framework.
11:09 pm
not a smorgasbord from which you can pick and choose. there is obviously some discussion in the private sector about doing away with the data collection limitation, or data minimization, principle of the fips, seeing as how we are in an age of big data. in the time you have given me, i want to address this head on in the context of government surveillance. first, cdt believes that there still should be collection limitations on private sector data collection, and that the data minimization principle of the fips should apply to the private sector. second, the government should not take its cues entirely from the private sector when it comes to national security surveillance. commercial data collection is fundamentally different from national security surveillance, and therefore even if the private sector were to collect data in some other manner in an alternate universe, the government should not follow suit.
11:10 pm
national security programs are not as transparent or responsive, and are not likely to be. major companies, in addition, allow or are required to allow individuals to see the information collected about them. more and more services are differentiating themselves on the basis of strong privacy protection, and of course individuals can choose not to participate in a commercial service as a means of limiting direct data collection about them. but data collection for national security purposes does not permit any meaningful choice. this is not to laud private sector data collection practices, because cdt does view them as generally insufficient protection of privacy.
11:11 pm
because of the differences i just briefly listed, and for other reasons, even if the private sector fails to robustly apply the fips, government agencies should not follow suit. if anything, because of the differences, government should strive for a more strict and consistent application of the fips than that of private sector data collection. so i have a small set of broad recommendations to make. first, the government should place greater emphasis on applying the fips at the front end of data collection. back-end minimization procedures alone are not sufficient; the front end is also critical, because trust is breached at the point of collection. once the government collects information, nonstatutory internal restraints on access and use can fall away like sand castles on a beach. we saw this happen with the 702 loophole. so surveillance should be restricted at the front end by narrowly limiting collection to what is directly needed to accomplish a specific purpose. data should then be retained only as long as necessary to
11:12 pm
fulfill that purpose, and the data should be destroyed unless a determination is made that the data are needed to accomplish the specified purpose. the specified purpose of data collection itself should be subject to meaningful restriction -- for example, limiting the scope of what is "relevant" under section 215, or the definition of foreign intelligence under executive order 12333. so the goal overall should be to move from mass data collection to targeted data collection of both u.s. and non-u.s. persons. a fair reading of the statute does not seem to grant them
11:13 pm
this authority. so releasing opinions in full or, when necessary, summaries of opinions would substantially boost transparency. we should not be a nation of secret laws. third, the government should report on the scope of its requests for data under national security authorities, and the government should authorize the private sector to make similar reports. information is power, and privacy is control of information. an entity possessing information about an individual has power over that individual. large scale government collection of information about individuals threatens the relationship between citizens and the state, because it upsets the balance of power that supposedly exists in a democratic society. therefore, cdt urges the board to recommend that the government recommit to robust application of the fair information practice
11:14 pm
principles, as well as other considerations, regardless of what the private sector does -- with much more targeted data collection and greater transparency. thank you. >> thank you. our next panelist is john grant. mr. grant is a civil liberties engineer at palantir, and he previously served on the staff of the senate homeland security committee, where among other things he oversaw the department of homeland security. thank you for being here. >> thank you for the invitation to speak today. as i never tire of telling people, i was a staffer on the committee when this board was created, so i take a parent's pride in the board, and i'm sure it is every parent's dream to one day testify in front of their children. i will spare everybody the commercial; suffice it to say, palantir builds a data platform that works with data, starting in the law enforcement and intelligence space and extending to deployments around the world in a variety of contexts, in
11:15 pm
the financial sector and elsewhere. our technology isn't successful if, in the course of achieving an organization's mission, it cannot be deployed in a way that protects privacy. that is something the founders of the company instilled from day one, and that is why my job exists as a civil liberties engineer. one thing i learned -- and this is different from the hill, certainly -- is that when you walk into a room and say to engineers, i'm worried about this thing you're building, it creates a privacy problem, the response is: oh, okay, how do i fix it? which is not often what you get when you raise these things in other places. so it is our job as a civil liberties engineering team to come up with suggestions for how to fix it. i'm a lawyer, as you may have guessed, so i do not necessarily possess a lot of technical skill; the main role for us is to translate between the lawyers and the engineers. so what i want to focus on today
11:16 pm
a little bit is some of the technology at a high level, and then i have some suggestions for moving forward that i think are fairly low hanging fruit. just briefly, to provide context: as i said, we do data management and data analytics. we're not dealing with the collection of data; this gets more to professor cate's point about the use of data. we have two high level categories of technology that deal with managing or protecting privacy in the use of data: access controls and oversight mechanisms. i want to start by pointing out -- and this is something to keep in mind -- that just as technology has expanded the power of surveillance and the amount of data collected, it has also significantly expanded the level of privacy protection that is available at the agencies. imagine 50 years ago: if there was an fbi file, it was probably pieces of paper in a redweld sitting on a desk somewhere, or maybe locked in a
11:17 pm
desk drawer -- hopefully locked -- or maybe in a dusty basement archive or something like that. there was probably limited tracking of the file beyond a log book, and anyone accessing the file could see whatever was in the redweld. you could just rifle through it and see everything, even if it wasn't directly relevant to what you needed. auditing would be nonexistent: you couldn't see who added information to the file or who deleted information from the file. and deletion was, hopefully, a burn bag or a shredder -- probably just crumpling it up in the trash, or a black magic marker redacting a few points of information. today we can do a lot more data management and oversight, and management at a granular level. that's the point of access controls: you can now build access controls to manage data very precisely, on a data point by data point basis. you can do it in a more nuanced
11:18 pm
way: you don't have to choose between access or no access. you can make the access controls dynamic, and the many options you have to configure the access controls give you a near infinite variety of ways to manage data -- who can see the data and what they can do with the data. the other category is oversight mechanisms. here you think a lot about audit logging, and also about using technological means -- electronic workflows -- to control exactly how data flows around an organization, who can see data and what kind of analysis they can do with it, or hard wiring an approval chain for use of data and things like that. these can be very detailed: the hard wired approval process can be very complex and involve multiple actors. and then the auditing of how
11:19 pm
data is used can be incredibly granular and incredibly detailed. and i want to get to the point: just these two capabilities are a significant improvement over what existed before and can get us a long way, and they exist today. now, i'm obligated to say that palantir does this best, but this is not exclusive to palantir; these capabilities can be deployed and used in a lot of different contexts. so what is the problem today? why aren't these capabilities being used more? a couple of things. one, an issue of legal and technical awareness: lawyers don't know technology and engineers don't know law, and you need people who know both of these things to be able to make the decisions about how to use these technologies and how to incorporate them into programs. two, lack of resources: you need people who can actually
11:20 pm
manage the data. you talked about this in an earlier panel: alex joel has a very small staff; erica has a very small staff. they need resources and infrastructure to do this. oversight is hard: how do you use an audit log, how do you use it effectively, how do you configure access controls, especially when you are dealing with massive amounts of data. the last one is death by anecdote. the debate -- the cost benefit analysis -- tends to be the national security community saying, one time we caught this bad guy using this information, and the civil liberties community saying, one time this unjust thing happened to a person because of this program. you can't just make this argument on anecdote; you have to look at data, and then you can find out much more specifically how these programs are working and how effective they are.
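The suggestion above -- treating oversight itself as a data problem rather than an exchange of anecdotes -- lends itself to a small sketch: scan the audit log in aggregate instead of spot-checking it. The log format, field names, and numbers below are entirely hypothetical.

```python
# A minimal sketch of audit-log analysis: count activity per analyst and
# measure how old the records people actually touch are, which can inform
# retention policy. The log entries here are made up for illustration.
from collections import Counter

audit_log = [
    {"analyst": "a1", "dataset": "call_records", "record_age_days": 200},
    {"analyst": "a1", "dataset": "call_records", "record_age_days": 900},
    {"analyst": "a2", "dataset": "travel_records", "record_age_days": 30},
    {"analyst": "a2", "dataset": "call_records", "record_age_days": 45},
]

def accesses_per_analyst(log):
    """Count queries per analyst -- a first cut at spotting outliers."""
    return Counter(entry["analyst"] for entry in log)

def oldest_record_used(log, dataset):
    """How stale is the oldest record anyone actually touched?
    If this is far below the retention limit, the limit may be too long."""
    ages = [e["record_age_days"] for e in log if e["dataset"] == dataset]
    return max(ages) if ages else 0

print(accesses_per_analyst(audit_log))                 # Counter({'a1': 2, 'a2': 2})
print(oldest_record_used(audit_log, "call_records"))   # 900
```

The same aggregate view supports the retention question raised later in the panel: if no one queries records older than some age, the policy can be shortened to match actual use.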
11:21 pm
so i suggested some of the solutions in listing the problems. education: palantir sponsors scholarships to make sure lawyers can learn technology and engineers can learn law. it should be a requirement for engineers to have ethics training: they will build technology that will hit the streets, and it is going to be months or years before the law catches up, so shouldn't engineers see how what they are building is affecting privacy and think about these things. infrastructure: if this is an important value for us as a society, then we need to invest in the infrastructure to support it. concrete guidance: we actually need to go beyond just "systems should have use limitation." we need to tell people, how are you going to do that? i can dig into that more if people have questions. but really specific guidance, rather than just, you need
11:22 pm
to have notice and consent, you should think about use limitation, and things like that. and last, everything in the world can be datafied these days, including these programs and how effective they are. we can do analysis, start analyzing the data and figure out: is this effective, is this not effective, is this having negative effects, is this creating bias in the analysis. thanks very much. >> thank you. our next panelist is chris inglis, venture partner at paladin capital group. >> i spend most of my time teaching at the naval academy. i, like the other panelists, am grateful you established this venue for what i think is an important dialogue. i would like to make four quick points, then get to question and answer. first and foremost, i absolutely agree with the premise that the framers of the constitution did not intend for security and privacy to be in mortal combat, and we try to
11:23 pm
figure out how we achieve both. it may very well be that we cannot trade one for the other. i think that's right, but we have to work harder to achieve both, and i think technology and practice from the private sector can be helpful there. two, i agree that government is different -- not simply in the powers and tools it might bring to bear on a citizenry or others, and which therefore should be constrained, but because the government alone has the requirement to meet the standards of the first, fourth and tenth amendments of the constitution. from my nsa experience, the most significant of those is the tenth, which essentially says: unless you have the authority to do something, you should not. the back door searches and 215 -- whether from the statute or nsa interpretation, both were specifically permitted under court approved procedures, and the interpretations of the law went through three branches of government.
11:24 pm
i think that's right and proper. that doesn't necessarily justify them -- it may be bad policy at the end of the day -- but rule of law has to pertain to how the government gets things done. point three: i would say i largely agree -- i wholly agree -- with what john had to say, that technology and law are at odds with each other because they are pursued as independent efforts in any particular solution. i would add a third element, which is operational practice. what typically plays out in any one of these systems is that you are trying to affect technology, law, and the operational practice of those who make sense of the technology, and the result is that they do not change at the same rate; they change at very different rates. keeping them reconciled or synchronized from moment to moment is really hard. therefore one-time mechanisms are not likely to satisfy the need. what you need are threads, or systemic solutions, that you pull through, and you
11:25 pm
take both art and science in the process to try to figure out how to make a solution here. i wholly agree with john that education is absolutely essential. at nsa, when we found ourselves with compliance incidents -- in which no one intentionally made a mistake -- we had to sit down and figure out: how do you find a horizontal joint between all who were trying to achieve something slightly different but were ultimately invested in the same problem. the last point i would make is that i do believe there is a role for big data, sometimes called mass collection. but the principles should be the same as for surgical collection, which are necessity and proportionality. the government should be able to justify on what basis this is necessary, such that it could then argue not for an encroachment upon civil liberties or privacy but how
11:26 pm
do we work harder to achieve sustainment, and it should only collect in proportion to that need. all those comments aside, i would say that the private sector probably has a lot of experience in this regard that the government can take advantage of. my own sense is that the government collects far less information than is perceived by the public, and certainly far less information than the private sector does. i don't excuse the government for that -- they should be held to account -- but they can bring in technologies that might well scale quite well for the government's purposes, because it would have to scale them down as opposed to scaling them up. i'm open to any questions you might have. >> thank you. just a reminder to the audience that there are staffers in the back with cards; if you would like to direct a written question to the panelists, hold up your hand, find one of them, and write down your question. and for the benefit of the audience and the cameras, for
11:27 pm
the panelists, when you're answering a question, if you wouldn't mind moving the mic back and forth -- i'm sorry, we don't have as many mics as we probably should. i would like to start by asking about oversight, and mr. grant, i would like to direct this question to you first. both in your oral statement and in the written statement that you submitted to us, you talked about a wide range of mechanisms -- paper trails and electronic workflows and things like that. frankly, when i read the written statement, it seems like an overwhelming array of different ways to engage in oversight. i think for a couple of reasons you need to choose your oversight mechanisms: one, the agency will have limited resources to dedicate, and two, as i mentioned in a previous panel, there may come a point of diminishing returns on oversight. you need to leave the agency to do its job and not have it answering to oversight mechanisms all day long.
11:28 pm
so have you given some thought to what constitutes an effective oversight mechanism? how do you rank different mechanisms in terms of their effectiveness? >> yeah. i think we should actually think about oversight as a big data problem, and apply the same thinking to it that we would apply to analyzing intelligence, or to analyzing huge amounts of transactional data for marketing. it's a similar issue: you have a huge amount of data -- there are massive amounts of audit logs, for example, in an organization like the nsa -- and that's a lot of information, but you can use technology and analytic tools to make sense of that information and derive the insights that you're looking for. so part of the issue is, a, you need to do it. we see this all the time, and i know other
11:29 pm
organizations see this as well: everybody checks the box for audit logs. we've got audit logs, and we will go through an enormous number of hoops to make sure the system is logging exactly the information that it is supposed to. we get far fewer requests to actually look at the audit logs once the auditing mechanisms are turned on. there aren't many laws, as far as i can tell, that tell anyone they have to look at audit logs. it is like the seinfeld joke about renting a car: anyone can take the reservation, but the important part is holding the reservation -- here, actually using the information. so to me, that's how you make oversight more effective: you use these techniques. and that's another thing -- the oversight people and the information security people should be as good as your analysts; you need good people doing the analysis and conducting the oversight. so to get to your last question, which is most effective: i think it is using that auditing data, using the big data that you've
11:30 pm
got, and having a team of people that can proactively comb through it. not only are you looking for people doing something wrong, but you can also ask questions such as: does the data retention policy make sense? you can look at the data and say, it turns out we keep this data set for five years, but no one uses anything older than three years, so let's change the policy to match the actual use of the data. >> okay. i would especially like to get your thoughts from your time in government: what did you view as an effective oversight mechanism? >> first and foremost, if there is an authority granted, there is a burden that's imposed -- they come hand in glove -- and that's not a one-time thing; there cannot be a repurposing somewhere later once you have gotten past that threshold. the events might be collection, processing of data, analysis of data, dissemination of that data, with a burden imposed at every step
11:31 pm
according to the authorities that were granted for the acquisition of that data in the first place. what we found, to achieve that: data is aggregated and synthesized -- take the iconic analytic effort; it doesn't simply use data from one source, it uses data from many sources. if there are different expectations attached to each source, it is hard to keep it straight in your head as to what you're going to do about that. so the focus has to be: how do you bind the attributes for a particular data element at the moment that it comes into being. >> could you pull the mic a little closer? >> at the moment you collect a piece of data, how do you bind attributes to that data: what is the authority under which that data was collected, what are the burdens and constraints that come along with it, what are the prescriptions, if any, that come with it. and that
11:32 pm
should be bound to that data through its life -- through its life of collection, processing, analysis and dissemination. now, at some point there is a second order use of that data, where someone reads a broad swath of material, synthesizes it in their head, and constructs a document across an air gap. that gets hard. but at least in that primary use, if you have a systemic view from start to finish, you make the auditor's or compliance oversight's job much, much easier. you therefore, in your system, in your technology, essentially impose a constraint or check every time something exercises a privilege against that data -- whether at collection, analysis, processing or dissemination. that makes the auditor's job much easier and frankly has a nice deterrent effect, because everyone knows that at every moment they are held to account. but in my experience in government, it is not so much the deterrent as the very rule-laden environment.
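The idea described above -- attributes bound to a data element when it comes into being, and re-checked every time a privilege is exercised against it -- might be sketched like this. The authority names and permitted operations are hypothetical, chosen only to illustrate the mechanism.

```python
# A sketch of binding attributes to data at collection: each record carries
# its collection authority and permitted operations for life, and every
# privileged operation re-checks them before proceeding.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class BoundRecord:
    value: str
    authority: str                       # legal basis at the moment of collection
    permitted_ops: frozenset = field(default_factory=frozenset)

def exercise(record: BoundRecord, op: str) -> str:
    """Check the bound attributes each time a privilege is exercised."""
    if op not in record.permitted_ops:
        raise PermissionError(f"{op} not permitted under {record.authority}")
    return f"{op} ok under {record.authority}"

rec = BoundRecord("example", "order-123", frozenset({"analyze", "retain"}))
print(exercise(rec, "analyze"))   # analyze ok under order-123
# exercise(rec, "disseminate") would raise PermissionError
```

Because the record is frozen and the check runs at every operation, the constraints travel with the data through collection, processing, analysis and dissemination, which is what makes the auditor's job tractable.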
11:33 pm
a typical counterterrorism analyst at nsa would often deal with hundreds of constraints on the data sets that are available to them, because various orders of the court, interpretations of the court, and sharing arrangements with various other nations all come along with their independent assessments of how the data can or should be used. so the bottom line is that the technology can help us by essentially doing an atomic bind -- meaning it is organic to the data itself what its provenance is -- and that should never be lost through the history of that system. >> thank you. i would like to turn to the fips, and mr. geiger, i was happy that you recognized those, and professor cate as well, so i would like to direct this question first to the two of you. mr. geiger, i notice that in the written statement that you sent us, you talked about the fips, but you didn't really talk about the individual participation fip -- and when i talk about the fips, i'm referring primarily to the dhs version. you said in your oral statement
11:34 pm
just now that the fips are not a smorgasbord, they are a framework -- you can't just pick and choose between them. if you have to employ the individual participation fip, how can that work in a surveillance context? >> that's the toughest fip to apply in this context. one way to do it, which is not viable and not good policy, is for individuals to bring suit for violations of law. but my -- i think more reasoned -- answer is: if the individual participation fip is lacking in the national security context, then the rest of the framework has to work overtime to compensate, and that includes data minimization, which is why i emphasized limits on data collection and transparency, as well as the rest of the framework. i absolutely recognize the challenges of individual participation, but this is one area, again, where government is different from the private sector, and i think that difference should express itself
11:35 pm
in particular in the data minimization principle. >> professor cate, do you have thoughts on that? i would ask also -- there has been a lot written and said in public recently about how the consent and individual notice fips really don't work well in the private sector, because nobody really understands what they are consenting to; they have to consent to get the service, and it is a meaningless exercise. do you have thoughts on that, and on whether individual participation can work in this process? >> thank you. i do have thoughts, especially as one of the people who have written some of that. i think the challenge of the fips is that they often lead us in the wrong direction, and i think this is a real challenge -- i'm not in any way trying to make it sound easier, or make it sound like there is a simple answer here. but for example, if we think of
11:36 pm
the fips, the classic 1980 fips, we are talking about consent, use limitation to the purpose specified, and then we add things like data minimization and individual participation, and frankly almost all of these seem challenged in a modern data environment, private sector or public sector. in other words, how does that really work? you know, there are 60 people in the room. they all have cell phones, recording devices, video, audio. i don't have a statement from any of them. i don't know about my individual participation rights. i suspect they would look down on me wanting to interview each of them about it. the issue is an important one, which is how to protect privacy, but shifting the burden to the individual, which is what the fips in large part have the effect of doing, is a very difficult way to approach that. i think it is an important way to approach it in the public sector environment, but it also may lead to completely wrong results.
11:37 pm
in other words, one of the surprising things to me, and i can't believe i'm saying this in a place that's being recorded, about section 215 is that nsa collected all this data and did so little with it. it was astonishing. so you would like to say, when people talk about atomically binding limits on what you can do with the data, that for something new we might do with the data that might have a major effect on national security, we would have a process for some sort of risk analysis. what's the benefit? what's the risk? what are the processes in place to protect it? now let's do that thing. and data has real value. it does in the national security environment and the private environment. i think we need to think about approaches here that aren't binding everyone to some mythical transaction that took place, in which in the fips world we say the individual agreed to this even though i can't think of a case in which the individual actually agreed to it or it was meaningful
11:38 pm
consent. and in the national security world we just overlook that. well, we think it was important, without again doing a clear and well-documented type of risk assessment, using clearly articulated benefits and harms. >> so it does sometimes lead programs in the wrong direction. it is a useful framework for evaluating privacy protection, but in the application of the fips to what you are actually doing with the program, you may pass muster under your privacy impact assessment, but the way the program is implemented on the ground may not be privacy protective. so i don't think that the fips are a silver bullet, but the principles themselves i think are very useful for the evaluation of a program. second, there's been a long-standing controversy about notice and consent being inadequate. but that is why i said at the outset that the fips is a framework.
11:39 pm
each principle is dependent on the others. this came up clearly in the health context. people don't know what they are consenting to when they receive a notice from their doctor. they don't know what the privacy notice says or means or what hipaa does, which is why there have to be a lot of additional privacy protections in place to actually meaningfully protect that individual's privacy. then lastly, the fips are not the only framework. i think it is a useful, indispensable framework, but there are other frameworks that can be applied and should be applied to data collection at large. >> this is the subject of the first panel and not this panel, but i want to ask anyway, and i apologize if i'm springing this on you. i want you to say what is privacy. you, i assume, have spent time
11:40 pm
thinking about how to protect privacy and civil liberties. what does that mean? what interest were you trying to protect? >> i would say, and i don't think this has changed over time, the fundamental question always comes back to two things. one, with respect to the perspective of the individual, is there a reasonable expectation of privacy for, fill in the blank, whatever that information might be. that's the stuff of great legal debate, but operators think about that as well, particularly operators inside the government, because they are constrained by the fourth amendment to think about what that is. but the second way to think about the issue of privacy is, what might you learn if you take these discrete data sets and combine them in a way that might then give you some insight into things that were not self-evident from any one of the discrete data sets.
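the point just made, that combining discrete data sets can reveal things not self-evident from any one of them, can be sketched with a toy join; the data sets, names, and inference rule below are invented purely for illustration:

```python
# two individually innocuous data sets about the same (fictional) person
phone_location = {"alice": "clinic-district"}      # where a phone was observed
call_records = {"alice": ["oncology-hotline"]}     # numbers the phone dialed

def combined_inference(person):
    """Neither data set alone says much; joined, they suggest something
    sensitive that was not evident from either discrete set."""
    location = phone_location.get(person)
    calls = call_records.get(person, [])
    if location == "clinic-district" and "oncology-hotline" in calls:
        return "possible health condition inferred"
    return "no inference"

print(combined_inference("alice"))   # possible health condition inferred
print(combined_inference("bob"))     # no inference
```

the same join logic is why the thresholds on aggregation discussed here can matter even when each data set, taken alone, would clear a privacy review.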
11:41 pm
and you have to think about aggregation, synthesis, downstream use. again, you might have thresholds that you have to think your way through, and you have to go beyond that particular point in time. i would tell you that at the national security agency, ethos is as important as compliance rules, fips mechanisms and things of that sort. science will lead you astray. science alone cannot help you, so essentially, navigate the challenge: the question of how do you achieve both security and privacy in a world where they are massively converged in a place called the internet. >> professor kate, do you have a thought on the nature of privacy? >> i was hoping you would run out of time before you got to me on this. this is an area where i think public versus private sector is an important distinction, and i think it has to be kept clearly in mind. in the private sector i think of privacy mainly in terms of
11:42 pm
harms or impacts on individuals or groups of individuals, so whether that is the way we think about it in the fair credit reporting act, like a higher price for credit or denying someone a benefit, or whether it is some other way in which we think about an individual being manipulated or charged a higher price. in the public sector i think that is also true, but i think there is something more in the public sector, which is that privacy, i think from the very beginning of the constitutional debate, was seen as something about the balance of power between individuals and their government, between the citizenry and the government. and there is something quite striking there. i completely agree with harley about the more the government knows about individuals, the greater the risk that that information will be used in a way that alters that balance of power, that makes the government more
11:43 pm
powerful and makes the individual less powerful. and you know, a widely observed but ironic twist as we've gotten into the 21st century is that there is less transparency to the citizen about the government, and more transparency about the citizen to the government. and that is a clear alteration in that relationship, that power relationship or that oversight relationship. and so in that sense, that's why again, focusing on collection or use may not be so significant a matter. but at the end of the day, it is use that matters. it is knowing how the government can use this information in a way that might affect me, as opposed to, is the information out there, to which the answer now always seems to be yes. >> mr. grant? >> i don't necessarily have an answer, but i think i have sort of a framework for thinking about it. i think about it in terms of social media and how
11:44 pm
younger people are viewing privacy. if you ask, most engineers appear to be about 14. and we had a discussion internally about whether we should look at linkedin, facebook, and look at it as part of the ways to detect phishing and things of this sort. they vigorously objected. and they say, you tweeted that, which means people are going to read that. it is a tool for communication to the world. and they still felt, yeah, it is publicly available, anybody can google it, but they still have an objection to the government collecting it or the government reading it, or their employer reading it, things like that. i don't know what that means in terms of coming up with a final
11:45 pm
definition of privacy, but it suggests that there is a different view of it, and that even with public information, there is still privacy inherent in public information somehow. like i said, i think talking through those attitudes towards social media and understanding them could help us figure out what is the newer conception of privacy in this technological age. >> do you have something to say, mr. geiger? >> sure. i said most of it in my opening remarks. i view it as an individual's ability to control information about herself, but then also the control that the entity holding the information can exercise over individuals. and i think that it is very important not to just look at privacy harms or privacy interests, but also the extent to which a loss of privacy can enable control over an individual or their decisions in the context of today's technology.
11:46 pm
i think it is important to look out over the next couple of decades and see what is coming down the pike, and there are very pervasive, very privacy-intrusive technologies that we will see in our homes or on ourselves in our lifetimes, and certainly in our children's lifetimes. the laws haven't kept pace. without a change in the law, and again i reiterate this, internal protections on use and access, while important, are not sufficient, because they can change. they have changed. when we talk about protecting privacy, i think we should look, as i said, to what we are protecting several generations down the line. >> professor kate, we've been talking about that throughout, and focusing on how the private sector might have solutions that the government might learn from. private companies are obviously doing something to control use of the information they collect.
11:47 pm
they have to. they have a privacy policy that says what they will do with your information, and they have to comply with it. are there mechanisms that they can use for enforcing limitations that are effective, that the government might learn from? mr. grant, do you have a view on that? >> so we see this a lot. our customers hold data, and honestly, they use the same basic mechanisms i described in my testimony, and often have the same basic weaknesses. do they have the infrastructure to manage access control? a lot of them do not, and it costs money and takes time. are they conducting oversight of the data? probably, some more than others, and again because of limited resources they are probably still not doing it at the level that you would hope.
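the access-control and oversight infrastructure being discussed might, in minimal form, look like the sketch below; the roles, resources, and log format are assumptions for illustration, not any particular company's product:

```python
import time

# role-based access control plus an audit trail: the two mechanisms the
# panel says companies often lack the resources to run well
ACL = {"hr-records": {"hr"}, "security-logs": {"security", "audit"}}
audit_log = []

def access(user, role, resource):
    """Check a role against the ACL and record every attempt for oversight."""
    allowed = role in ACL.get(resource, set())
    audit_log.append((time.time(), user, resource, allowed))  # denials logged too
    return allowed

print(access("dana", "security", "security-logs"))   # True
print(access("dana", "security", "hr-records"))      # False
print(len(audit_log))                                # 2
```

the point of logging denials as well as grants is that oversight later needs the full record of attempts, not just the successes.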
11:48 pm
and one thing i notice is that for a lot of them, even in europe, where you have more commercial privacy law and more commercial privacy compliance requirements, a lot of times it's a best guess. for example, one thing we have been running into recently is looking into cybersecurity and information security data exfiltration risk in the private sector. these giant companies are trying to deal with privacy laws that are all over the map. they are asking questions like, if a german employee sends an e-mail to a u.s. employee, what privacy rules apply to the content of that e-mail? in germany, you have to tell people, i'm going to monitor your e-mail. in the united states, they can basically do what they want.
11:49 pm
there are differences in what the privacy laws are trying to do, but i think they are facing a lot of the same problems, related to scale and related to a lack of understanding of what the rules should be, as the government. >> so there is probably a lot of great technology out there that can be used, but any technology can fall into the wrong hands without the right process. the process might be to consider, first and foremost, before you acquire any capability, within the government or within the private sector, the proportionality question: is this necessary, and have i done this only to the degree it is necessary? and what we are trying to achieve is not simply the balance of privacy and security but
11:50 pm
transparency, because you won't often believe they have achieved the balance of the first two without it. that drives credibility in the government: the need to essentially acquire explicit authorization that comes with constraints, constraints that are bound to that authorization, and some measure of accountability for those constraints. and the process elements that are then essentially implemented to pull that off, i think, should have the aspect of continuous compliance, not discrete compliance but continuous compliance. you think about it all the time: first, middle and last. a stretched analogy is part of the problem of cybersecurity: you think of it as a bolt-on, something added later, when we should operate systems continuously with that foremost in mind as a primary attribute. then there is an external component and an internal component: you have to hold people accountable internally to the system.
11:51 pm
you can wind up with mismatched expectations, or the system might in fact go rogue. and then three, there has to be, at various phase points, required reporting, which is important because that is the synthesis and retrospective that says, how do we aggregate our experience? do we need to invest time and energy in the process itself? absent that, you find that you're the frog in the beaker and it is just getting a degree hotter moment by moment, and all of a sudden you're the boiled frog. you didn't realize, until you step back and take a hard look, that you got off course a little bit. >> thank you. i think my time is up. we will go to mr. dempsey and go on down the row. >> thank you. thank you, members of the panel, for giving us your time today.
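the continuous (not discrete) compliance with phase-point reporting described above might be sketched as a monitor that checks every access as it happens and aggregates a periodic report; the purposes and report shape here are assumptions for illustration:

```python
from collections import Counter

class ComplianceMonitor:
    """Checks each access at the moment it occurs (continuous compliance)
    and aggregates a periodic report so drift is seen before the frog boils."""
    def __init__(self, allowed_purposes):
        self.allowed = set(allowed_purposes)
        self.events = []

    def check(self, user, purpose):
        ok = purpose in self.allowed
        self.events.append((user, purpose, ok))  # every access, not a sample
        return ok

    def report(self):
        # the phase-point retrospective: aggregate experience, not one-offs
        return dict(Counter("allowed" if ok else "denied"
                            for _, _, ok in self.events))

monitor = ComplianceMonitor({"counterterrorism"})
monitor.check("analyst1", "counterterrorism")
monitor.check("analyst2", "marketing")
print(monitor.report())   # {'allowed': 1, 'denied': 1}
```

the design choice is that compliance is evaluated inline with each access rather than bolted on afterwards, which is the distinction the testimony draws.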
11:52 pm
>> in a way, building off of something that chris said, or at least what i heard, you are saying that we need the technology controls, we need to build the technology in a way that implements the controls, but at the same time you need the policies that surround it, the legal rules, et cetera. john, my first question is to you. you talked a lot about the potential of the technology in terms of tagging information and audit controls and permission controls, but just to state the obvious, that's no substitute for legal rules and policies. >> absolutely not. we try to say, even when we talk about privacy and capabilities, if you think you're buying a switch that you can flick that protects privacy, it is not going to happen. it is not possible. you have
11:53 pm
to respond dynamically to changing situations. you have to be able to make human-driven, nuanced decisions about data and how it is used appropriately. that is just not something that machines can do, and you can't find a terrorist button. you need a human at the top of the analysis chain, so you can't just say, don't worry about it, we've got privacy covered. so what the goal should be for technologists is, what kinds of tools do policy makers need, and the oversight boards and civil liberties protection officers, what do they need, and what makes their job easier or possible, especially when you are dealing with data at scale. and you know, an easy example is there's a lot of work, a lot of research going into improving
11:54 pm
access control interfaces. when you're dealing with terabytes of information in the cybersecurity space, how can you create technological shortcuts to allow a human to make the decisions about how to manage that data? and that is how you do it: you think about how do you support the policy, not how do you replace the policy. >> let me go to fred kate. fred, totally accepting your point about the limitations of the fips, and totally accepting your point about the importance of focusing on risk and focusing on use, you're not saying that collection is irrelevant. obviously the fourth amendment is in some way a collection limitation, and, you know, in the commercial context, that company that had the flashlight app that was collecting data -- nobody even got to the
11:55 pm
harms analysis; the collection was inappropriate in and of itself. >> right. >> you are absolutely right, and i agree completely. in other words, i'm not suggesting collection is irrelevant. we've made collection the end of the story, so once you cross the -- you know, like a spillway at a dam, once you're over the collection limit, then anything else goes. >> the ironic thing is that at nsa, as chris inglis said, their view is they never thought of it that way: that you have your collection authorization, which is critical, your retention, your use, your dissemination, your retention limit, that each one of those -- >> if i can just respond to that, i think there's something of a mismatch here, and i'm not in any way doubting either what nsa is doing or what chris is saying.
11:56 pm
but one of the astonishing things, for example, when i read the section 215 report that came out from the nsa civil liberties office, a well-written report, it was full of all of the limits on what they were doing, and the incredible, what can only be described as bureaucracy. and what struck the american people is, how was the authorization obtained in the first place? we had a law that said relevant to a specific investigation, and 99 out of 100 people thought it might be focused on specific individuals. apparently the 1 out of 100 that didn't was a fisa judge, who had other members along with him, members of congress. so i think one of the critical issues when thinking about going forward is, if this were the private sector, there would have been
11:57 pm
immediate -- and you know that a policy that says we will collect information for limited purposes, but means we will collect everything, would get a customer reaction. what can we create that will mimic that in the classified environment? maybe that's you, literally: having a body outside of the agency but focused on privacy and civil liberties that says, we understand the challenge but we think you have got the wrong end of the stick. but i think it has been the overreliance on the fourth amendment that creates this problem. as you well know, the fisc just dismissed it by saying third party doctrine, no problem at all, let's go ahead. someone should have said, wait, you are talking about collecting data on everybody, and that would have focused the discussion in a way that all of the technological controls and all of the bureaucratic controls, now well documented in the agency, somehow never did.
11:58 pm
>> that's very helpful. i don't want to further rehash 215 and the history of 215. and anyhow, i have a red card, so i guess that's the end. thank you. >> so let me just follow up quickly on that point. maybe what we need to do is supplement the fips with the omg standard, which is, you know, in private practice, i could have a client and i say, everything you propose to do is perfectly legal, but are you nuts? how do we embed that: stepping back and saying, okay, the lawyers have technically signed off, everyone has technically signed off, but this is a crazy thing to be doing. >> one positive step is adding someone like rebecca richards and an office to support her within the agency. i think that's one way, so you have people not just thinking about the law but people who say, i understand legal clearance is taken care of, but i still have the oh, my god response. are you allowed to refer to god at a hearing --
11:59 pm
>> free speech. you can say what you want. >> i'm nervous about that. so the other point: there are rules, not necessarily identical, outside of the agency. that is where i would say, though this may reflect my naivete, we would not have secret law, so that if a law is interpreted to mean the opposite of what it appears to say, someone would feel the need to signal that, as opposed to going out of their way to say, no, it doesn't mean what you think it means, it means only what we think it means. so we would build in avenues for transparency about the law, so that at least we all knew what the rules were going into it. and i think that's a huge problem, when the law itself is effectively classified because of the way in which the interpretive process works. >> i'm sure jorge posada can
12:00 am
just jump in on that. engineers and technologists think of things as, does it work or not work, not because they care about the civil liberties. they live in the world they create.