
tv   Politics Public Policy Today  CSPAN  November 21, 2014 3:00pm-5:01pm EST

3:00 pm
whole decades to see what is coming down the pike. there are pervasive and privacy-intrusive technologies that i think we will see in our homes and maybe even in ourselves, in our lifetimes and certainly our children's lifetimes. the internal protections, while important, are not sufficient. they can change. they have changed. we talk about protecting privacy, but as i said, think about what we are protecting several generations down the line. >> to the panel again, we have been talking about that throughout. focusing on how the private sector might have solutions that the government might learn from, private companies are obviously doing something to control the information they
3:01 pm
collect. they have privacy policies on what they will do with the information. are there mechanisms that the private sector uses for enforcing those limitations that are particularly effective, that the government might learn from? do you have a view on that? >> we see this a lot in terms of the data that our customers have, trying to help them implement. actually they use the same mechanisms, and often they have the same weaknesses. a lot of them do not. it costs money and takes time. are they conducting oversight of the data? probably more so than some people think, and possibly more than government, because of limited resources.
3:02 pm
they are still not at the level we would hope. one thing is that a lot of them, even in europe where we have more commercial privacy laws that better define the requirements, a lot of it is best guess. for example, one issue we have been running into now, looking at cyber security in giant multinational companies trying to deal with employee privacy laws: they are asking questions like, if a german employee sends an e-mail, what is the company in germany allowed to monitor? in the u.s. you can monitor whomever you want, with a few exceptions. they don't know what the answer is. they make their best guess. i think there are lessons in terms
3:03 pm
of what privacy policy is trying to do, but they are facing a lot of similar problems, related to scale and related to lack of understanding of what the rules should be, as the government. >> anyone else have a thought on that question? >> there is probably a lot of technology out there that can be used. any technology can fall short of your expectations if you don't use it in the right process. therefore we ought to give the process within which that technology might be used as much time and attention as the technology itself. in defining that process, it might be useful to consider, first and foremost, before you acquire any capability within the private sector, that you think your way through the considerations. is this necessary? have i constrained this to the degree that is necessary? transparency is the third leg. absent transparency, you find yourself in a place where people
3:04 pm
don't believe you have achieved the balance of the first two. that comes with constraints, some measure of accountability for the constraints, and the elements that are essentially implemented to pull that off. it's not various phases; you think about it all the time, first, middle, and last. a stretched analogy: that is part of the problem with cyber security in so many environments, when you think about it that way. the second is external accountability. they have to hold the people
3:05 pm
accountable, and that's the mechanism that catches it if it goes rogue. that forces a retrospective that shows whether what we have is working as it should. we need to invest time and energy in that itself. absent that, you are just getting a degree hotter moment by moment, and you haven't realized it because you never take a hard look at it; you got off course a little bit at a time. >> we will start with mr. dempsey and go down the line. thank you for giving us your time today.
3:06 pm
it's something that chris inglis said, or what i heard. you are saying that we need the technology controls; we need to build the technology in a way that implements the controls, and at the same time we need the policies that go with it. you need the rules, etc. my first question is to you. you talked a lot about the potential of the technology in terms of the permission controls, but to state the obvious, that's not a substitute for the legal rules. >> absolutely not. >> we try to say, when we talk about privacy, if you think you are buying a switch that will protect privacy, it's not going to happen. it's not possible.
3:07 pm
you have to make judgments about situations. you have to be able to make decisions about data and about how it's used, and whether it's being used appropriately. you can't build a find-the-terrorist button and just push it. you need a human at the top of the chain. i distrust any technology that says, don't worry about it, we have privacy covered. what the goal should be for technology is: what tools do the policymakers need, and what makes their job easier or possible?
3:08 pm
when you are dealing with terabytes of information, how can you create shortcuts to make the decisions to manage the data? how do you support, and not replace, the policy? >> totally accepting your point about focusing on risk and use, you are not saying that collection is irrelevant, and the fourth amendment is a collection limitation. that company that had the flashlight app was collecting location data -- nobody even got to the
3:09 pm
analysis, and that was inappropriate in and of itself. >> you are right and i agree completely. i am not suggesting collection is irrelevant, but it's not the end of the story, as if once you cross the spillway in a dam, once you are over the limit, anything else goes. you have your collection authorization, and your retention and use and dissemination, and each one of those matters.
3:10 pm
i'm not doubting what chris is saying, but when i read the report that came out from the nsa office, a well-written report, it was full of all the limits on what they were doing, the incredible bureaucracy around that. it sort of ignored the factors that struck most americans: how was the authorization obtained in the first place? the law said relevant to an investigation, and 99 out of 100 people thought that meant it would be focused on specific individuals. apparently that included members of congress. if this were the private sector, we
3:11 pm
would have immediate feedback. there would be customer reaction. what do we create that will mimic that in the classified intelligence environment? that is, having someone outside of the agency, focused on privacy and civil liberties, who understands the challenge but can say, we think you have the wrong end of the stick; this is overly focused on the fourth amendment; you are talking about collecting data on everybody. that would focus the discussion in a way that all of the controls and all of the bureaucratic controls never
3:12 pm
do. >> that's helpful and i don't want to rehash 215. i will follow quickly on that point. i would implement the omg standard: everything you propose to do is legal, but are you nuts? the lawyers have signed off, but this is a crazy thing to be doing. >> one positive step is adding someone like that, an office within the agency, rather than just saying, understand, it's taken care of.
3:13 pm
i feel very nervous about that. it's another way in which you will have some of those similar roles inside and outside of the agency. i would say, although this may reflect my bias, that while we may need to have secret operations, we should not have secret law. if a law is secretly interpreted to mean something close to the opposite of what people think it means -- it means what only you think it means -- that's a problem. we should build in avenues for transparency about the law. that's a huge problem with the way the interpretive process has worked. >> we can jump in. how will you get that in the
3:14 pm
private sector? back to my point about education. often they just want to make things more efficient. it's not because they don't care; they end up living in a world they create, and they don't realize this raises an issue. improve education across the board, throughout the nsa and the private sector, things like that. technology is a place where the lowly engineer can be as powerful as the ceo. they say, i'm not going to build this, and that's it. there is an interesting power there.
3:15 pm
they were minimized, and the information that was collected is being deleted. that's the back end. up front is how much information is coming in, if limiting collection is not practical. how do we do that better? >> turning to the proposals: a special advocate would help, a special advocate for the fisa court.
3:16 pm
the determination had to be made of what was needed. i think it's done in agencies where they keep the information and then make a determination that they don't need it. that's different from what causes information to languish. that should be flipped. in terms of front-end limitations, i think it can be feasible. it depends on the purpose. if the purpose is to collect everything, that sets off alarms from a privacy standpoint. if the purpose is narrower, as generally speaking it should be, there should be data collection limitations. depending on the actual means of collection, sometimes it may be unavoidable that you collect more than you need, but then you should be flushing the information that you don't need.
3:17 pm
>> any reactions? >> yes. on both parts. on the question of 215, we don't want to rehash whether it's good or bad policy, but three branches participated in its sustainment over years' time, more than three dozen judges. from an nsa perspective, charged to effect the will of the government -- which, amongst 350 million people, we determine every two years -- i don't know how you would make a significant change in terms of how the government comes to conclusions. it's an extremely valuable addition, but we will always find ourselves in a place where stakeholders stand in the shoes of those we serve. it's problematic on a couple of counts. first and foremost, if you try to minimize it, you paradoxically begin to focus on things you
3:18 pm
shouldn't. the strange truth in the world is that there are two ends of every communication, and sometimes more. if you begin with that information without reasonable suspicion or probable cause, you begin to encroach upon someone's expectation of privacy without a reason to do so. they said, do not focus on that. when you do encounter someone who deserves further protection, you must take account of it. built in are time limitations on how long you can hold that data, how much data, for what purpose, and reasons, such as whether it contributes to a report, that you may keep it for longer.
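the retention discipline described here -- hold collected data only for a limited window unless it has been affirmatively justified as contributing to a report -- can be sketched in code. this is an illustrative sketch only; the field names, the 180-day window, and the `in_report` flag are assumptions for illustration, not any agency's actual rules.

```python
from datetime import datetime, timedelta

# Illustrative retention rule: records are dropped once they exceed a
# fixed retention window, unless affirmatively marked as contributing
# to a finished report. All names and the window length are invented.

RETENTION_WINDOW = timedelta(days=180)

def purge_expired(records, now):
    """Return only the records still eligible for retention."""
    kept = []
    for rec in records:
        age = now - rec["collected_at"]
        if age <= RETENTION_WINDOW or rec.get("in_report", False):
            kept.append(rec)
    return kept

records = [
    {"id": 1, "collected_at": datetime(2014, 1, 1), "in_report": False},
    {"id": 2, "collected_at": datetime(2014, 1, 1), "in_report": True},
    {"id": 3, "collected_at": datetime(2014, 11, 1), "in_report": False},
]
kept = purge_expired(records, now=datetime(2014, 11, 21))
print([r["id"] for r in kept])  # [2, 3] -- record 1 aged out; 2 was justified, 3 is recent
```

the design point is that the default is deletion, and keeping data longer requires an explicit, auditable justification rather than the reverse.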
3:19 pm
there are time limitations, and they are prescribed by those that grant the authority. >> following up on a phrase used a number of times, there is a reasonable expectation of privacy. to the extent that it evolves over time, which i think it does, how does one ascertain the expectation of privacy? is it based on a "washington post" poll that americans are uncomfortable about communications surveillance, while people still use their phones and engage in the world? if we are going to look to reasonable expectations of privacy as a touchstone, how should we ascertain what it is? we will start with you. you indicated that the nsa did
3:20 pm
look to the expectation of privacy as one of the guidelines. >> first and foremost, with law. technology changes over time do not give us a free pass to say that because the law allowed us to use the old technology, the new technology, which is more intrusive, can continue unabated. there is the practice of law, the appeals, and the department of justice representation, and under executive order 12333 there is an expectation of privacy that runs through. you can think about the expectations; there is some aspect of privacy.
3:21 pm
they have to inform them of what the provisions are that underlie the authorities. there was interesting dialogue earlier about the 215 program and the bureaucracy. we thought of that as the court prescribing the use of that database: counterterrorism was the application of it. it's not proliferation or weapons of mass destruction. that would have been an encroachment on privacy that was not meritorious up front. that narrow focus alone was justified by the necessity, and we had to avoid the creep beyond that, because of the expectations of the citizens looking back at us.
3:22 pm
>> you also used the phrase; any thoughts on this? >> to my mind it gets back to analyzing data. i think it's reasonable to expect the government won't look at data that is not useful, information that has not proven effective. i'm fine with you looking at the data; just tell me why you are looking at it. they are interested in protecting our security, and we are interested in protecting our national
3:23 pm
security. do they have interests beyond that? they look to the private sector. everybody wants data. we get customers who say, i want to understand twitter. half the time -- if you want to understand whether a lot of people like justin bieber, twitter is great. if you want to understand a more complex, nuanced question, maybe we should think about something else. the government should do the same. holding data that is not useful, that's where you strain that reasonable expectation of privacy. it's reasonable to expect the government not to hold data that is not useful.
3:24 pm
>> we are wrestling with this all the time. it's a terrible thing, unreasonable searches and seizures, and the framework has allowed for unreasonable searches and seizures; it's a great example of that, the reasonable expectation of privacy framework. i know that's a mysterious opinion, but the court seems to be moving, inching along in a direction it has been heading for several decades, toward protecting a record of what you do outside your house.
3:25 pm
there is a strong argument that that is okay, but it is wrong to be viewing all of this stuff. it is one framework, but neither it nor the fipps is going to be a silver bullet; they won't provide you with all the answers. >> if i have time for an additional question -- i am still seeing yellow -- on moving up the analysis to the front end and requiring agents or analysts to make an assessment of whether or not information is relevant or necessary to retain, rather than potentially deferring that: the implication is that the approach
3:26 pm
would require agents or analysts to put eyes on more communications than would otherwise be reviewed. what is your answer to that concern? >> you could require them to look over every piece of data they collected, and then you have not a data retention problem but a review problem. if the population is small, that's less of a problem. if the records are not connected to a crime or terrorism, it becomes more of one. merely looking at data they know is connected to other parts of their work is different. i don't think there is a hard and fast rule; it will depend on what they are looking for.
3:27 pm
on the back end, it's not the whole answer. it has to be part of the framework. >> there is a crucial part here. you think it's important to get at collection up front rather than waiting a while and coming after it, and i'm interested in what you think the role of the courts is. in our systems of criminal justice, even under the fourth amendment, the courts have the final analysis. the question is two-part.
3:28 pm
at what stage on the collection limitations do you think the internal audits and techniques are not enough, such that you need some kind of an outside look at it? as a former judge, i ask the question: do you really think that the limited role the court has been allowed to play, in terms of the secrecy, even with our recommendation or other people's suggestions about adding judges to that court, is enough? not only might they come out in different ways, they were frustrated in terms of the technology.
3:29 pm
given all of the complexity of the technology that we have talked about, is a set of judges that come in from their regular work for a week at a time, and then go back again, the best outside, independent look at surveillance that we need, or is there some better way to get the notion of an independent and neutral arbiter? it's a big question. go at it starting with the professor, in any way you want. >> thank you very much. >> i think it is essential. it needs to be an independent
3:30 pm
role. it involves the court, and we want an independent, neutral, and detached court. even engineers have difficulty keeping up. i think we have seen ways of dealing with it. one is court-appointed experts; another, as we saw in the supreme court this summer, is briefings that explained the technology, which the court relied on. we shouldn't overlook that courts have powers to require parties to explain the technologies in clear and understandable language, and to not accept their filings, or not rule
3:31 pm
on the filings, until they do. i could say more, but let me share the microphone. >> the courts play a crucial role in the oversight, and i think the court is constrained by a lot of limitations. we would welcome minimization procedures and on-the-ground controls on privacy. they could be an ally. on technical experts, one of the
3:32 pm
problems we are seeing is that there are some forces in the court, perhaps formerly of the court, who would like to see restrictions placed on those parties, so that it is the court that decides about their ability to help the court. >> don't you think that in some cases even the constitutional analysis is dependent on the technology at issue? judge bates thought that way in one of the cases that was put out. do you think the advocate will fill that role? >> i don't know enough to make a
3:33 pm
determination. it can be complicated, and i would imagine most do not have that sort of training, but i know courts have powers to require the parties to explain this as clearly as possible. you are right: technology has a direct bearing on the rights that are implicated. >> it's critical to have a translator role, someone to help try to explain the technology. one of the challenges of having a debate is that members are naturally going to be
3:34 pm
uncomfortable taking a stand when they are not sure what the considerations are. the critical question is not just what helps you understand things; it starts to have a real effect, and it can take ten or 15 years for the court to settle. should there be legal limitations, and should the government figure out a way to talk through technologies, and what's the framework for making that decision?
3:35 pm
it's not just an understanding of the technology. the role of the adversary and technology expert has great merit; it adds to the ability to understand the technology. but we have to temper expectations, not because the government wouldn't want to perform, but because a presentation to the court could be exhaustive about technology at some moment in time, and that may not be where the technology is going to go. nobody knows where it's going to go, and the use of a technology, even if the technology itself doesn't change, changes in and of itself; people make different uses of it. forecasting that is, i wouldn't say futile, but hard. >> that said, if you don't like the
3:36 pm
steps, what alternative do you suggest? >> so first of all, to be clear, i'm not saying that. but one thing we haven't talked about is the value of the use of data for national security or whatever the purpose is. one advantage of that analysis is that it focuses on both sides of the equation and drives toward specificity. that is one way to focus attention on it. as we identify those harmful negative impacts, whatever we want to call them, we look for tools to minimize them. data might be stolen, and then we talk
3:37 pm
about security. a great advantage is that it makes clear where we should focus our attention: the process within the agency. >> thanks to the speakers and all the panelists as well as audience members. we had a good discussion, and we heard from academics and advocates and industry. that's a lot. we covered a broad range of topics -- collection and use and oversight -- all in one day. you have given us a lot to chew on as we move forward to
3:38 pm
balance national security with privacy and civil liberties. any comments before we conclude? we encourage anyone with comments -- the board members, the panelists, members of the audience, or others -- to submit written comments. we are accepting comments until the end of the year, and the transcript of this proceeding will be posted on the website. with that, i move to adjourn the hearing. it's now 4:15. thank you very much.
3:39 pm
>> thank you, mr. chairman, and good morning to members of the audience. good morning to the second panel. the title is privacy impacts and the impact of technology. i have no opening statement of my own, so we can go straight to the opening statements by the witnesses. we will go down the row, which happens to be alphabetical order.
3:40 pm
please keep opening remarks to seven minutes. in the front row, renee will be holding up a yellow card for the two-minute warning and a red card for time's up. thereafter we will have rounds of questioning, with the possibility of questions submitted by members of the audience. staff members throughout the audience have index cards, so if during the course of the panel a question occurs to you, raise your hand and someone will bring you a three-by-five card. our first speaker is annie anton. she is a professor in and chair of the school of interactive
3:41 pm
computing. she has a ph.d. in computer science and is one of the country's leading experts on issues at the intersection of technology and policy. please. >> let's try that again. thank you for the opportunity to testify. we are in an ever-changing world where terrorists and criminals are getting smarter and more sophisticated. their techniques are surpassing our ability to protect our nation. providing strong protections for privacy and civil liberties is a counterterrorism weapon. today i focus primarily on three technology considerations. first, strong encryption is an essential technology for fighting terrorism.
3:42 pm
second, deidentification is a reasonable approach, and third, improved threat modeling is critical for counterterrorism. systems must be resilient to attacks by terrorists and criminals. we cannot accept security standards or processes that weaken the nation's cyber security posture and make it easier for adversaries to infiltrate systems for various purposes. the latest apple and google phones build in encryption by default. both are configured such that the companies cannot decrypt the information for anyone, including law enforcement. these measures have been sharply criticized by the director of the fbi and the attorney general. as a technologist, i can assert that applying security practices
3:43 pm
such as encryption by default will yield systems that better withstand attacks as well as limit access to authorized users. that is good for law enforcement and our nation's overall security. sophisticated terrorists and criminals will simply forgo weakened products for less convenient alternatives. technology and policy scholars are actively debating the merits of deidentification techniques. this is critical because privacy rules only apply to identifiable data. technology scholars emphasize there is no way to mathematically prove that a data set cannot be reidentified.
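the deidentification debate above turns on techniques like dropping direct identifiers and generalizing quasi-identifiers. a minimal sketch, with invented records and field names, of how generalization and a k-anonymity check might look:

```python
# Illustrative deidentification sketch: drop direct identifiers (name),
# generalize quasi-identifiers (age -> decade band, zip -> 3-digit prefix),
# then measure the smallest group sharing a quasi-identifier combination.
# That minimum group size is the k in k-anonymity. Records are invented.

from collections import Counter

def deidentify(record):
    return {
        "age_band": f"{(record['age'] // 10) * 10}s",  # 34 -> "30s"
        "zip3": record["zip"][:3],                      # keep zip prefix only
    }

def min_group_size(records):
    """Smallest equivalence class over the released quasi-identifiers."""
    counts = Counter(tuple(sorted(r.items())) for r in records)
    return min(counts.values())

raw = [
    {"name": "alice", "age": 34, "zip": "30332"},
    {"name": "bob",   "age": 37, "zip": "30318"},
    {"name": "carol", "age": 52, "zip": "30309"},
    {"name": "dave",  "age": 58, "zip": "30301"},
]
released = [deidentify(r) for r in raw]
print(min_group_size(released))  # 2: every released record matches at least 2 people
```

this illustrates the point in the testimony: a k of 2 makes reidentification harder for most people most of the time, but it is a practical protection, not a mathematical proof that no record can ever be reidentified.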
3:44 pm
in contrast, policy scholars note that deidentification provides protection for most of the people most of the time; they are pretty good. that's the idea. there are some cases where it is critical to protect a person's identity. for victims of domestic abuse, we need to make sure their location is protected and can't be reidentified. but in many settings, if we apply effective, but not perfect, deidentification procedures, overall protection may be increased and data may be more useful. in such cases, we should not let the perfect be the enemy of the good. the board might consider that this is
3:45 pm
critical to counterterrorism. third, we must improve threat modeling. first, we must develop privacy-oriented threat models; most threat modeling techniques have been developed in security contexts with little privacy consideration. the latter is crucial given the rise of big data analytics in the internet space. second, as a nation we do not want insiders leaking secrets to foreign journalists to become a common way for decisions and debates to surface. insiders with access to sensitive information must be accounted for in the threat model, whether they are providing useful information or imposing real costs in the form of public
3:46 pm
recriminations and erosion of trust. a good threat model makes this tractable for any organization. in closing, as a technologist and privacy scholar: we should encourage strong encryption, use practical deidentification technologies now rather than wait for theoretically perfect solutions, and expand threat models to cover privacy as well as security. finally, you mentioned the importance of review, and the board not having technologists who can appreciate all you are doing; having them would be helpful, and i would like to see the board move forward with technologists involved in the decision making. i would like to thank the privacy and civil liberties oversight board for its commitment to finding ways
3:47 pm
for the government to protect privacy and also for meeting security needs. >> let me thank you for your testimony. our next witness, in the second row as well, leads the center on privacy, technology and the law at georgetown university law school and previously served as chief counsel to the senate judiciary subcommittee on privacy, technology and the law. >> we have a problem with privacy. it's a problem for the
3:48 pm
government and for industry. government and industry have developed data tools that let them analyze data sets that were previously too large or too messy. they let them process that data faster, and in short they create value that is driving both government and industry to collect as much information as possible and retain it. this is in tension with the fipps. and right now in industry, there is an effort to redefine
3:49 pm
privacy. it used to be about collecting only what you need. now you collect as much as possible and protect privacy through proper use after collection. i'm here to encourage you to resist this. collection still matters; it impacts the core right to privacy. second, this was discussed in the first panel, but there is a misconception that the fipps are no longer useful for privacy. i talk about the fact that they remain a benchmark by which to measure the privacy impact of policies. i will add, given a previous discussion, that literally since their inception in 1973, the committee that wrote the report dedicated a section to this question, talking about how of course not all of them can apply, but clearly some of them must, because the risk is
3:50 pm
too high. third, in my testimony i talk about the fact that we need to remember that privacy is not just about the taking of data; it is about the sharing of it as well. and fourth and finally, i think that americans do expect a degree of privacy in public. now, given my limited time here, i want to focus my oral testimony on just that first point, collection. i think it's the most important. after the snowden disclosures on the telephone records program last summer, the ic's first line of argument was that, you know, we may collect a lot of this information but we only look at a tiny part of it. the problem is that this is not how people think about privacy. if a police officer knocked on your door and said "hey, i want you to give me a list of every person you've spoken with in the last week," then said "don't worry, we're probably never going to look at this stuff," would that reassure you? i think that most people would say no, and i think that this highlights the fact that the
3:51 pm
forcible collection of sensitive data in and of itself invades what this board has called "the core concept of information privacy," and that's "the ability of individuals to control information about themselves." it's not just a concept; as you know, it implicates first amendment and fourth amendment interests. i elaborate on that in my written testimony, but to my mind, the single biggest reason to resist a privacy model that primarily relies on post-collection use restrictions is the disparate impact it might have on vulnerable communities. in a use restriction model you collect everything and protect privacy by banning harmful uses of data after it's been collected. the problem is that there's basically what i'll call a moral lag in the way we treat data. what i mean by that is that we as a society are often very slow to realize that a particular use of data is harmful, especially
3:52 pm
when it involves data of racial and ethnic minorities, lgbt people and others who have lacked political power. in fact, the two most prominent examples of this moral lag involve the department of defense, or formerly the department of war. during world war ii, japanese americans volunteered detailed information about themselves and their families in the census. they volunteered that information under a statutory promise from the federal government that that data would remain confidential. this was a use restriction. what happened? as you know, in 1942 congress waived the confidentiality provisions, and the department of war used detailed census data to monitor and relocate japanese americans to internment camps. after world war ii, a similar story unfolded for gay and lesbian service members. they were prohibited from serving openly, so many turned to military chaplains, psychologists, physicians. yet routinely, and even after "don't ask, don't tell," the military used that confidentially collected data to out and dishonorably discharge lgbt service members.
3:53 pm
now, today, with the benefit of hindsight, we recognize that these events were discrimination. but at the time, the picture was less clear for a lot of people, and that took a long time to change. the census bureau only acknowledged the full extent of wartime sharing of census data in 2007, and congress only repealed the ban on openly serving gay and lesbian service members in 2011. that was three years ago. so let me be clear, my point is not to cast aspersions on the department of defense. rather, my point is that all of us as a society are consistently slow to recognize what's a harmful use of data when it comes to vulnerable communities. it often takes us decades to figure that out. far too often, today's discrimination was yesterday's national security measure. what this means for our data and what this means for privacy is that we cannot solely rely on use restrictions. what this means is that
3:54 pm
collection matters, and that the simplest and most powerful way to protect privacy is to limit data collection, particularly for the government. i urge you to continue to protect that core right. thank you. >> thank you very much. our next witness is mike hintze, chief privacy counsel at microsoft, where he's been for 16 and a half years, really at the epicenter of the evolution of technology and privacy. mike? >> thank you for the opportunity to speak with you today in this important discussion. i come to this discussion from the perspective of advising on and managing privacy and related issues in the private sector. i've done that for nearly two decades, first as an associate here at a d.c. law firm and, as you mentioned, for the last 16-plus years at microsoft. at microsoft, we approach the issue of privacy from a core belief that privacy is an essential value, both to us and to our customers. we have a strong commitment to privacy because we recognize that customer trust is critical to the adoption of online and cloud
3:55 pm
services. our customers -- from individual consumers to large enterprises -- will not use our products and services unless they trust them, unless they trust that their private data will remain private. we seek to build that trust with our customers by adhering to a robust set of policies and standards. these policies and standards guide how we do business and design our products and services in a way that protects customer privacy. they are based on the fair information practices, which we agree remain relevant today, including transparency about the data we collect and how we use it, minimization with regard to the data collected and how long it's retained, choice about collection and use of data, strong security to ensure that the data is protected, and accountability to ensure that we are living up to our commitments. these standards are not just a rule book we created and hope that our employees follow. instead, we built them into the processes we use to operate our business.
for example, they're built into the tools that are used in our software development life cycle, and there are checkpoints that prevent a product or service from shipping without a privacy sign-off. in sum, we've taken what's often referred to as a privacy by design approach to how we operate the company and how we develop and run our services. and this approach is supported by a mature privacy program that includes dedicated personnel with privacy expertise who sit in both centralized roles and are embedded throughout the business. the program includes incident management, response and escalation processes. further, we have developed and deployed comprehensive role-based training for engineers, sales and marketing personnel, as well as those in hr, customer service and other roles that touch and handle personal data. and our program includes
executive-level accountability for privacy compliance. but that investment in privacy and the trust we've worked to build is undermined if customers believe that the government can freely access their data. concern about government access to data collected by the private sector can foster a lack of trust in most private sector services, and when those concerns are focused on access to data by the u.s. government, that lack of trust becomes focused on u.s. companies. that's why we've been vocal about the need for surveillance reform in the united states. there have been positive steps in this regard in the last year, but there's more that needs to be done. we've laid out several things the u.s. government should do to help restore the trust that's been damaged by last year's revelations. first, bulk data collection programs should end.
we have been clear that we have not received any bulk orders -- any orders for bulk data collection -- but we strongly feel that surveillance should be focused on specific targets rather than bulk collection of data related to ordinary people's activities and communications. the recommendations of this board on the section 215 program are encouraging, as are the comments of the president, and we urge the administration to end the existing program and we urge congress to enact prohibitions on any such orders in the future. second, we should do more to increase transparency. transparency is a key element of any program for protecting privacy. it facilitates accountability and enables public debate around policies and programs. here, too, we've seen positive developments. in particular, the government has declassified more information about its surveillance programs and the workings of the fisa court. additionally, we and other companies filed lawsuits last year against the u.s. government arguing that we have a legal and
constitutional right to disclose more detailed information about the demands we've received under u.s. national security laws. earlier this year, we came to an agreement with the government enabling us to publish some aggregated data about the fisa orders and national security letters we've received. it was a good step that helped foster better understanding of the type and volume of such orders that service providers receive, though we believe there can be and should be more detailed reporting permitted. third, we support reforms of how the fisa court operates. in order to foster greater confidence that surveillance programs and government access to data are appropriately balanced against privacy and other individual rights, surveillance activities must be subject to judicial oversight. we need a continued increase in the transparency of the fisa court's proceedings and rulings, but effective judicial review requires a true adversarial process where more than one side is heard. we urge congress to act on fisa reform. fourth, government should provide assurances that it will not attempt to hack into data centers and cables. in the year since the "washington post" reported on alleged hacking by the nsa of cables running between data
centers of some of our competitors, there's not yet been any public commitment by the government that it will not seek to obtain data by hacking into internet companies. we believe the constitution requires that the government seek information from american companies within the rule of law and through authorized legal process, and we've taken steps to prevent such attempts by increasing and strengthening our use of encryption across our networks and services. nevertheless, we and others in the industry will press for clear government assurances. fifth, although recent revelations have focused on the u.s. government and many of the subsequent debates have focused on the privacy rights of u.s. persons, we must recognize that this is a global issue. as we seek to sell our products and services to customers around the world, discussions that focus exclusively on the rights of u.s. persons are not enough. many people around the world do view privacy as a fundamental
human right, and they have a very real concern about whether and how governments can access their data. in that regard, we appreciate the steps that president obama announced in january, which acknowledged the need to address protections for non-u.s. citizens. along those lines, in the law enforcement context, we've challenged a federal court warrant in the u.s. courts seeking customer e-mail content that's held in our data center in ireland. further, we've called for governments to come together to create a new international legal framework that allows for new streamlined processes for cross-border data access that can supplement existing rules. none of this should be taken to suggest that we don't value and appreciate the absolutely critical work that our law enforcement and security agencies do every day to keep us all safe. in fact, we work closely with the u.s. and other governments to help fight cyber crime and other threats. we want to ensure those agencies have the tools and information that they need to protect us from terrorism and other threats to our safety and security, but
there needs to be a balance between safety and the personal freedoms that people around the world, especially law-abiding citizens and institutions, enjoy. this balance is rarely an easy one. as chief justice roberts recognized in the case of "riley v. california," privacy comes at a cost. but the court's unanimous decision makes clear that privacy is an inherent value that must be protected. while there's not always a perfect analogy between protecting privacy in the private sector and in the national security context, we deal with questions of striking the right balance between privacy and other needs in each of these contexts. as technology evolves, we need to continually re-evaluate that balance, and many of the principles that have proved useful in striking and maintaining that balance, the fair information principles, continue to be relevant today. >> mike, can you wrap up? >> i'm wrapping it up. sorry. >> super, thanks. >> we'll come back to some of those issues in the questions. our final member of this panel is the chief security architect at nvidia, the company that designs and builds high performance computer systems. he's a cryptographer and computer scientist. welcome and please proceed. >> thanks for the opportunity to testify today. i appreciate it. i'm here as a technologist, not as a lawyer, and in silicon valley we say the "i'm not a
lawyer" rule applies. our concern is about building systems that are buildable and creating rules that are enforceable, so i wish to provide some technology background to the panel and to the conversation. from our perspective, security is to systems what harmony is to music. providing security as the foundation for establishing rules of privacy is our model. we build systems that are enabled and able to enforce rules, and that is the context of security as we see it. security is one of the intersections between technology and civil liberty, and we deal with issues such as trust and active adversaries in a system. this is how we build and design our systems. our world used to be simpler, and sometimes i bring samples of that simple world. you all remember this as a mobile phone. this is from the time when phones were actually doing just that: they were phones. and some of these devices were statements of class. you remember this, right? this was a phone. this was a mobile phone.
i worked at this company. one of my favorites in the collection -- this used to send and receive text messages. some of these -- oh, yeah. this was your personal digital assistant. i have some other -- oh, yeah. [ laughter ] palm. there used to be a company by that name; it was one of the darlings of the valley. oh, yeah, of course, this was also a very important device that everyone carried. this is from the time when the world was very simple and we built systems that did very basic things. and, per thomas friedman -- and i
quote here: "when i sat down to write 'the world is flat,' facebook didn't exist, twitter was still a sound, the cloud was still in the sky, 4g was a parking place, linkedin was a prison, applications were what you sent to college and skype was a typo." then on june 29, 2007, the iphone was introduced. the world changed. the world for us technologists changed, and probably for everybody else in the room, non-technologists and technologists alike, it also changed, and we are dealing with devices
that are not as simple as what we used to carry. so that's part of the problem. from my perspective, i'm interested in the ramifications of the changes in this technology for the subject we are talking about. it's only seven and a half years. it's only seven and a half years ago. so i don't believe there is any other event in history that in this short amount of time has swept through everything and tried to change everything, even the foundations of our society. in the old, pre-2007 world we said things like "you cannot
enumerate all the attacks." in cryptography that is a known statement, meaning you cannot define a secure state of a system. it was difficult back then with those devices. it has just become worse. we don't know anything about our future, but a couple of things i can guarantee, a couple of things i can guarantee right here: things will only get faster. we're going to build things that are faster. they're going to become smaller, a lot smaller, they're going to become cheaper, and these devices are going to become a lot more abundant. for some of them, we no longer care about building devices that are usable for a long period of time; it's a lot more economical to build devices that are basically throwaway devices.
that's a concept that we are following. and they're becoming more connected. everything is becoming more connected. you have heard of things such as iot, the internet of things or, as i call it, the thingsternet. everything is becoming very talkative. all of these devices are very chatty. they talk a lot. so you all have phones, smart phones, in your pockets. from the time that i started, which was about five minutes ago, until now, each one of those devices, without you even touching them, has transmitted, sent and received about half a megabyte of data. this abundance of information
that is generated without you interacting is having a lot of ramifications for what we are doing. we heard a lot about how data is only accumulating; it's not going away. we are generating more data than we can manage or fathom. a hundred hours of video is uploaded to youtube every single minute. every single minute. and youtube is not the only recipient of this kind of service; other companies also have these services. so we are building systems to manage and compartmentalize and define and create and work with this data, and this data, as we have heard in the two panels, is not going away. it is not disappearing. and in the new world, maintaining security is even harder. so as a citizen very carefully
following what is happening with this esteemed board as to the ramifications of the decisions we are making and whether they are enforceable: can we build systems that enforce these rules? because right now, being a security professional and creating doable, enforceable security is as unpopular as being an atheist in jerusalem. no one likes you. so i'm hoping that we can come up with a system that is also buildable. lastly, i close my remarks, and i'm looking forward to the questions. one more thing that i can guarantee is the attacks are going to increase. and they're going to become simpler and easier to mount. by one measure, the number of attacks in 2013 was three billion, affecting only private information, at an average of $27.30 per
attack, or about $100 billion in total cost. this data is from 2013. none of the target, home depot or linkedin breaches -- none of those attacks are included here. so with that i close my remarks and look forward to answering questions. thank you. >> thank you. we'll now go through a round of questioning, and board members as well will be subject to the time limits here. i think i have 20 minutes, and then each board member will have five minutes, and then there's still the possibility of questions from members of the audience. i wanted to build my first question off of the point that i think hadi was making at the end there, which is that there seems to be this inexorable trend
towards more sophisticated devices collecting, generating, sharing, emitting autonomously, automatically disclosing more and more information. and i think i'll go to professor anton first and maybe come back to hadi with this. but looking at that phenomenon and the seeming inexorability of it, the seeming inevitability of it -- first, on the technology design side, what do you see as any potential at all for limiting that growth, controlling the flow of that information? you talked to some extent about the possibility of technology protecting privacy. how does that square with this tremendous ongoing growth of information? >> thank you. so as was
mentioned in the earlier panel, systems are getting more and more complex, which makes compliance more and more difficult as well. i really hope we don't limit growth and limit the ingenuity of new technologies that might have great applications in the future and solve wonderful, really important problems. by the same token, there's work that's been done, especially work being done at georgia tech, on how we design the internet of things, or internet of devices, such that we are taking privacy and security into consideration given all of the outputs, all of the possible inputs. engineers simply need better tools and heuristics for how to do that. privacy by design is thinking about these things early on and not thinking about them after the fact.
in terms of controlling information, i think what we want is to secure the flow of information but not limit the flow of information. these are all things researchers are working on in universities and research labs and industry as well. >> i've written myself about the potential for privacy enhancing technology and the value of privacy by design, but at the same time i just don't see it happening.
or -- let me put it this way. while i see it happening, and i take mike's point that microsoft has incorporated privacy by design as a corporate concept, there are these other hugely dominant trends that almost seem to be overwhelming. >> so within the context of counterterrorism, i think there are a lot of policies and laws that are in place. when i mentioned earlier that i'd like to see more technologists in the room, it's not just to study it after the fact but actually to be involved in forming the policy, because a lot
of times the policy and the law are written in such a way that they can be avoided. and so what i'd like to see is more technologists involved in the discussion up front, really informing the decisions about laws that are going to be passed, about policies we're going to adopt, because we could write them in a way that makes it easier to comply with the law. when -- >> do you have an example in mind? >> excuse me? >> do you have an example in mind? >> so i work a lot in hipaa, for instance. we have the new changes with
meaningful use. i had one ph.d. student trying to predict what the change is going to be, because when they finally make that decision we'll have very little time to implement that change in systems to make sure we're compliant. and had we had more technologists involved in that process, we'd be able to more quickly adapt our systems, and we'd have a better community of practice, if you will, about how to satisfy those laws and have systems that make sure that only the right people are having access to the right information at the right time, and in compliance with the law.
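The idea of "the right people, the right information, the right time" can be sketched as a small purpose-based access check. The roles, record types and purposes below are invented for illustration; a real HIPAA implementation would be driven by the regulation itself, not this toy table:

```python
# Minimal sketch of purpose-based access control: access is granted
# only when the requester's role is permitted for the record type AND
# the stated purpose is on the allowed list. All names are hypothetical.
PERMITTED = {
    # (role, record_type) -> set of allowed purposes
    ("physician", "clinical_note"): {"treatment"},
    ("billing_clerk", "claim"): {"payment"},
    ("researcher", "clinical_note"): {"research_deidentified"},
}

def may_access(role: str, record_type: str, purpose: str) -> bool:
    """Return True only for an allowed (role, record, purpose) triple."""
    return purpose in PERMITTED.get((role, record_type), set())
```

The value of encoding the rule this way is exactly the point being made: when the legal requirement changes, the table changes, and the system's behavior can be adapted and audited quickly.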
>> and certainly you would agree that we need both better, clearer laws as well as more mindful technology? >> absolutely. >> not that one or the other will solve this problem? >> absolutely. we need both. >> i want to go to alvaro, but there was one point in your written testimony which you didn't mention, and i want you to talk about it now because i think it's very, very important. a lot of our constitutional law of privacy is based upon the concept of a reasonable expectation of privacy. and there's a lot of worry and legitimate concern that with these changes in technology our expectations of privacy diminish. you talked about the fact that in fact with changes in technology our expectations of privacy may actually be growing. >> the point here is that the katz test cuts both ways. usually when the court talks about katz and society, they say everyone's becoming inured to this idea, they're surrendering to the ubiquitous collection of their data, but i think it's helping people learn what privacy is, and the best examples of this are location technology and facial recognition technology. previously, people had no occasion to develop an opinion on whether or not they expected the sum total of their movements to be
compiled in a profile. suddenly, it's becoming radically cheaper to conduct that surveillance. so i think that, in the same way that you only realize what you had when you start losing it, for the first time a reasonable expectation of privacy is crystallizing in people's minds. so i would say, you know what? maybe when i go to the grocery store or i drive down the street or go to work, i expect my colleagues at work to see me, the people i know at the store to see me, my neighbors to see me. but i don't expect anyone to know i'm at all those places at all times no matter where i go. so i think the technology can expand our expectation of privacy. >> and mike, over the past 15 or 16 years that you've been at microsoft, do you think it's fair to say that your customers have become less interested and less concerned about privacy, or do they expect more of microsoft and other companies when it comes to privacy? >> i think they expect more.
i think i agree that expectations of privacy in some ways have increased. they've certainly changed as technology evolves. people learn about it, they adapt. there's certainly data sharing going on that people wouldn't have contemplated or accepted a number of years ago, but that doesn't mean people don't care about privacy anymore. it's very clear to us that our customers care about privacy now more than ever. and you see that in the amount of resources and attention and focus that we've put on privacy. it really is one of the top legal issues we're dealing with. it's one of the top customer issues we're dealing with. we hear every day from customers who have questions about how their data is being treated, how it's being protected, how it's used. people's expectations of
privacy are not fading away. >> just to put a nail in the coffin here, i think the government argues, and there's obviously supreme court precedent to support it, that a person surrenders his privacy rights when he discloses information to a third party such as microsoft in the course of using microsoft products or services. but it seems to me from what you're saying that microsoft does not believe that its customers have surrendered their privacy rights when they've used a microsoft product or service whereby microsoft has acquired information. microsoft does not believe that that information has zero privacy interest, does it? >> absolutely not. on the contrary. to the extent the third party doctrine ever made any sense, it doesn't make sense today. people increasingly are putting all of the information they used to keep in their homes in file cabinets on
line and in cloud services, and as recent court decisions have recognized, particularly in "riley," there's even more data created, capturing the most private, intimate details of people's lives, that's in cloud services in the hands of third parties, more so than was ever in people's homes, and the expectations of privacy around that data are quite profound. >> and that's true in your view both of content, so to speak, and noncontent or metadata or transactional data. there's sensitivity there in both categories. >> absolutely. i don't like the term metadata because it encompasses too much. we should talk about what we're talking about, and there is a broad range of data that's collected or created or inferred through the use of online services, and some of it's fairly benign. we put the metadata label on things like
the amount of storage you're using in an online storage service, or the average file size, and even that has privacy implications, and we embrace the ideas of transparency and consent and all of the fipps around that kind of data, too. but as you go up the scale, with content maybe being at the end as the most private, the stuff that people have the highest expectation of privacy in, other things, like data about who you're communicating with, are right up there, right up against content in terms of what they can reveal about people's relationships, associations, thoughts, beliefs, et cetera. and there are very important privacy implications about that as well. >> you mentioned the transborder issues and the fact that people around the world recognize privacy as an interest and in many cases as a human right. where do we stand? what are you aware of?
what do you know about -- is there any progress being made, multilaterally or bilaterally, in terms of developing standards for transborder surveillance and transborder government access? anything in the works there that we should be aware of? >> not that i'm aware of specifically. you know, there are certainly more discussions happening in recent years than there have been in the past among a number of constituencies and interested parties on privacy around the globe. the chairman and i were recently at an international data protection conference where these issues were loudly and vigorously discussed and debated, and so that dialogue is happening. but in terms of actual progress towards making headway in developing an international framework for this stuff, there's certainly a lot
more work to be done. >> i would just ask you, and i would ask others as well, as members of the audience and additional panelists: if and when you do become aware of things that are making progress, please let us know. obviously we remain interested in that, in the transborder question. we've talked about privacy by design. in your experience, do technologists give adequate consideration to privacy as they design products, and what more could be done to encourage or promote privacy by design? >> in technology, we build things that are reasonably well defined, so i recognize in the previous panel there was a discussion that you don't necessarily need to define privacy to be able to enforce it.
on the technology side, if we are able to build a model that represents a need, then we are very good at building it. i think part of the problem is mapping a very human, very societal concept such as privacy into the devices that we build and the services that we build and use. sometimes it's simpler, sometimes it's not. to answer your question, i see a great deal of attention, a great deal of interest in the notion of privacy, privacy by design, secure by design, trustworthy by
design, especially in the field that we are dealing with. our model for the security of a device, when we release it and it goes into the field, is a mutually distrusting system. so you don't really know. it's one thing -- let me take a step back. it's one thing to build a server that resides in someone's data center, where you have full control over the actual device and you control the flow of information, the software that is there and how it's used. it's another thing to build a device and leave it in the hands of the users, guessing
what they want to do. and then it's one thing to have a notion of privacy as we do and build a system based on that. it's another thing when you take a look at this -- should i call it a generation gap? there's this company called snapchat, and they had promised that whatever picture you take will disappear. anyone who works in technology knows that things like this are not possible. you could take a picture of that
device. we call it job security. then, when they realized that this is not really possible, they announced it, and they're under the oversight of the government for about 20 years, i think, to make sure that they do things right, and i know they're paying a lot of attention to make sure they get things right. but then you look at the users. i think the stat was released last week or the week before: they asked college students, and more than 50% of college students said, yeah, we will still use snapchat. they're aware. they understand. i don't know how to reconcile that. there's a new generation that has -- i don't know whether it's more or less, but certainly a different expectation and definition of privacy. and there is a vagueness about what that means in terms of a system that could be built. once those are, you know, in a reasonable state, we are really good at building systems that satisfy those rules, hence my opening remarks as to our model in industry and technology: we understand the rules. they're very good at, you know, creating those rules, and we build devices and services that enforce those rules, but it has to be buildable and it has to be enforceable. the attention is certainly there. >> the first premise is the rules have to be clear, and if they're not clear then you don't know what to build to? >> semi-clear will do. we used to live in a world before 2007 where everything had to be really, really well defined. that world no longer exists. we have a new generation of hackers that do not abide by the rules. therefore we have to create systems that are almost right.
we are seeing it in programming languages, in the design of systems, seeing it in self-correcting systems. sometimes, somehow, somewhat accurate will do. >> annie, did you want to respond to that in the minute remaining? >> sure. this reminds me a little bit of what i was talking about with practical encryption and anonymization. and so i think there are times and certain applications where that kind of risk is fine, and there are other instances where it's not fine. and that's where guidance from pclob can be helpful, in terms of trying to figure out when is it that we can have pretty good rules and when do we have to have very tight, accurate, 100% certainty kinds of rules. >> okay. thank you. >> at this point, other members of the board will pose questions
under the five-minute rule, and we'll go in sort of reverse order down the line here, starting with rachel brand. >> thank you, jim. and thanks to all of you for being here. that's really a good segue to the first question i was planning on asking. i was interested in what you were saying about the domestic violence context -- you want it to be perfect; in other contexts, good enough will do. can you explain what you mean by that? what's an example of a de-identification method that might be good enough but perhaps not perfect? i'm not a technologist, as you know, so if you could help me out, that would be great. >> all right. so there are certain cases, studies that have been done -- for instance, when netflix put their data out online and then
researchers went and looked at the internet movie database to see if they could re-identify people. they had resources. it was readily available information, and in this context i don't think anyone was personally hurt by it, but there might be cases where that kind of re-identification could be extremely damaging. we talked earlier about aggregation of databases and how the ability to link different kinds of information across different kinds of databases could actually be detrimental. it can also help us find the bad guy, though. that's the tension, right? so when is it okay and when is it not okay? and are there instances -- for instance, for netflix or something that's available online that's just not very important, like where you went to school -- where it may not be really necessary to worry about where you had dinner, for instance? but in the context of a group that is actively trying to mount
a terrorist attack, that's really important. >> so i guess that makes sense in terms of when it's important and when it is not important. how do you do it? for example, how do you do the perfect domestic violence context? >> i think it's very difficult. i think we have technology that's pretty good but not perfect. and so the idea is: do you keep the data unencrypted and easily accessible, because it's not very important? or do you actually encrypt it and then use reasonable, practical anonymization on top of that? it just depends. and i think this is one of those cases where technologists would welcome guidance in helping us figure out what the risk profiles are.
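The Netflix/IMDb re-identification just described is, at its core, a linkage attack: join a "de-identified" release against a public dataset on shared quasi-identifiers. The records below are invented, and the actual study used far noisier, approximate matching, but the joining step is the essence of why removing names alone is not perfect:

```python
# Sketch of a linkage attack: match pseudonymous released records to
# named public records on quasi-identifiers (title, date). All data
# here is fabricated for illustration.
released = [  # names stripped before release
    {"user": "u17", "title": "Movie A", "date": "2005-03-01"},
    {"user": "u17", "title": "Movie B", "date": "2005-03-04"},
    {"user": "u42", "title": "Movie C", "date": "2005-03-01"},
]
public = [  # reviews posted publicly under a real name
    {"name": "J. Doe", "title": "Movie A", "date": "2005-03-01"},
    {"name": "J. Doe", "title": "Movie B", "date": "2005-03-04"},
]

def reidentify(released, public):
    """Map each pseudonymous user to the names whose public activity matches."""
    matches = {}
    for r in released:
        for p in public:
            if (r["title"], r["date"]) == (p["title"], p["date"]):
                matches.setdefault(r["user"], set()).add(p["name"])
    return matches
```

This is why the answer above hedges: "pretty good but not perfect" anonymization can still leak identity once an adversary has an auxiliary dataset to join against.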
technologists don't always have access to what the risks are. >> for mr. bedoya. you said something along the lines of: in the national security context, some of the fipps must apply, even when they all can't. >> sure. the first is a historical point. when the hew report was issued -- i was just reading its pages 74-75 -- the committee said, okay, we set out these standards. clearly all of them can't apply to all intelligence records, but some of them must apply, because the risk is too high if we don't have some protections. to put that more concretely, obviously the difficult ones are individual participation and transparency. i think there are ways to address these, at least on an aggregate level, that would be really powerful. so, you know, i think in the 702 context the board has -- and to take a step back.
i think it is shocking that a year and a half after the disclosures the american public doesn't have even a rough sense of how many of them have had their data collected. people think it's everyone, but then you have news reports saying 30% of calls are actually recorded. so in the 702 context the board recommended various measures to identify the scope. in all my time in the senate, i never saw anything that would lead me to believe that it would actually be impossible for the nsa to produce an estimate, based on statistical sampling, of the number of u.s. persons collected in 702 data. there are a number of things you could do to quantify scope; one of them could be releasing the number of queries done at the 12333 level. i think there are ways to address this. >> anybody else have a thought? >> i do, in terms of transparency. this is another way in which, for instance, fisc technologists could be helpful. when you have -- if hadi
4:36 pm
whispers in mike's ear, "i spoke with jim about the panel," by the time it gets to jim it will be "i spoke with him about wearing flannel." when you get lawyers from the nsa and the fisc talking about technology and you don't have a technologist there to ask questions or make suggestions -- well, have you thought about including this kind of metric, or instrumenting this software in certain ways? -- we could actually improve the ability to have more transparency and more oversight in technology by bringing everyone into the room for those discussions. >> thank you. >> i'm going to try to get a question in for each panelist. i would appreciate brief responses. annie, you said that encryption is good for counterterrorism. i would like to understand more. i understand mandating a back door weakens protections, but it seems that terrorists can now hide their communications, which
4:37 pm
seems to be detrimental to counterterrorism. >> it's a better world when everyone can hide their information. there was a case in greece where, because of a back door built into the phone system for wiretaps, someone was able to actually listen to the conversations of the prime minister. that's what happens when you don't have encryption and security by default. to think that the terrorists aren't going to do the same thing, i think, is naive. >> alvaro, you talked about the expectation of privacy, and if i heard you correctly -- tell me if i'm wrong -- you're suggesting that we talk not about what people expect their privacy to be, because i can put up a sign saying i'm conducting video surveillance and destroy that,
4:38 pm
but their expectation of what privacy should be? >> i'm not saying that. that's a separate, wonderful, powerful argument. what i'm saying is that technology is making us realize that we do expect privacy in scenarios that didn't exist 10 or 15 years ago. so i think technology can expand your notion of privacy, but i also think the fourth amendment doesn't protect me and you, it protects us as a society, and it sets a baseline for the relationship between a government and its citizens that also needs to be protected. >> on the fourth amendment, mike, you talked about the balance between government requests and your customers' privacy. do you think the government should have a warrant every time it accesses your customers' records, particularly if they're american customers? >> yeah. certainly in the law enforcement context we've advocated for reform that would in effect require a warrant for access to any content regardless of its age, to precise location information, and to other sensitive data. you know, i'm not sure we could
4:39 pm
go so far as to say that a warrant is required in every single case for every single data type. we certainly need to update the rules so that there is appropriate judicial review of surveillance programs and of the specific requests that we get for data. >> in terms of the third-party doctrine, would you then essentially not have it be an absolute exception to the fourth amendment, but rework it to provide some protection -- not necessarily full warrant protection? >> yeah. the laws that we deal with in the law enforcement context provide a sliding scale, in effect -- some reasonable oversight and protection, something below a warrant and probable cause. and we've taken the position that that's appropriate for some types of subscriber data, et cetera. >> thanks. heidi, you talked about -- i want to put this in the context
4:40 pm
of how much information should be collected. you talked about enforceable rules for collection, but you also said that collection is going to be faster and cheaper, we're all going to be more connected, and attacks will increase -- and that even compliance with rules may be more difficult. professor felten talked about potential abuse of information and increased possibilities of breach. how would you strike the balance between collection rules and, essentially, use rules? >> that's a very good question, a very difficult one. on the technology side of the house, i don't know if we really know where the balance is. we take a look at the attacks, we look at the system, we look at the capabilities, we look at the mere fact that all of these attacks and exploits are becoming so advanced. to give you one concrete example: i used to need to be physically near the things that you touched to be able to lift your fingerprint, then have access to your phone, and then use that fingerprint to mount an attack.
4:41 pm
with the resolution of the cameras that we have these days -- a very high resolution camera -- i just need to have your picture that was taken somewhere in china to be able to zoom in, zoom in, zoom in and lift your fingerprint and mount an attack. how do you reflect things like this? should we build systems that, whenever there's a fingerprint in an image, smudge it so we don't expose it? there are things like that, encompassing all those cases, that should be buildable. what i'm trying to get across is that coming up with the rules that define those capabilities -- the things that should or shouldn't be done -- is a very complex problem. >> thank you. >> so, thank you, guys, for
4:42 pm
another excellent panel. my first question -- and this goes back to what i had said on a previous panel -- is that i view our job to be translating these ideas, these concepts, these concerns into practical recommendations. so, starting with you: what have you found effective as a privacy officer to ensure that your very large, complicated work force, dealing with emerging issues, takes privacy seriously, that your rules are enforced, and that from beginning to end privacy is a part of your culture? this is free advice to the new privacy officer over at nsa. >> well, becky, as i alluded to in my opening remarks, you know, one, there's no silver bullet. you need to take a number of approaches. and we've taken a number of approaches to drive awareness
4:43 pm
and sensitivity around privacy throughout our work force, through a number of steps, including mandatory training that's required for all employees and covers a range of ethical and compliance issues, and deeper role-based training that's specific to software engineers, to sales and marketing people, to the different roles that people play in a company that impact customer privacy. we have -- as i mentioned, we've not just sort of told people what the rules are and then crossed our fingers and hoped they abide by them. we have put in checkpoints in the way that we have developed our internal systems: the way we develop software and get it out the door, it has to go through certain checkpoints and reviews to ensure that privacy issues aren't missed or overlooked. there's a number of things that we've done along those lines to make sure that people are aware and have the tools available to them to do privacy right. but then there's also different
4:44 pm
checks along the way to ensure that mistakes don't get made. nothing is perfect, of course, but we try to take a multi-faceted, multi-layered approach to make sure we catch those things. >> let me follow up on this with a somewhat specific but hypothetical example. have you found training to be effective enough in the absence of pairing it with mechanisms and processes? that was a horrible question, so i'm just going to start over again and say: the 702 program has certain legal requirements. in the private sector, would you train to those legal requirements, or would you also have, for example, when an
4:45 pm
analyst is sitting there attempting to target or select or whatever they're going to do, at each stage of the screen or the process or however they're doing it, the rules reflected in the computer system that they're attempting to use? >> we do both. to the extent that you can use technology to enforce policy, that's always super effective, because you get past -- or you reduce -- the potential for human error. but that's not always possible. you can't completely prevent mistakes, oversights, or intentional bad acts. so you need to do more than that. you have to build the awareness so that the inadvertent stuff is reduced. you have to build in the technology tools to prevent that
4:46 pm
from happening. then you need some level of checks to make sure that everything went right -- so that if you have somebody trying to circumvent a policy for whatever reason, there's some way to catch that before it creates a negative impact. >> and so i think i have time for one other quick question. in the section 215 program, one of the features was that, in fact, not all of the call detail record went to the government. names were not provided to the government, nor subscriber information -- simply numbers to numbers. would that be an example of de-identification and anonymization? that was my only question. >> i have a couple of very brief questions, which i think you can answer very quickly. that way i'll get them all in. okay. i'll begin with -- you talked about how it would be good for us, and we already do have technologists on board. based upon your knowledge here, does the government have
4:47 pm
technologists who worry at all about privacy? i know they have technologists, obviously, but as a result of your observations and studying the field, is this something that they consult the technologists about -- hey, we need this kind of information for national security, but how do we get it, or as much of it as we can, while balancing things off? does any of that kind of thing go on inside the government with technologists? >> right. so having worked a lot with the government, i know that they consult technologists a great deal on security, on privacy, on compliance issues, and on how to engineer software that takes all of that into consideration. i think if we look at the past five or six years, you'll see that the nsa
4:48 pm
was really, really focused on compliance. i think the results of the reports and the oversight have shown that they've done a really good job with that. when there's been an issue, they've dealt with it. i think someone mentioned the new cpo at nsa. what we'll see different now is that not only is complying with the law going to be something that's factored into all of the software that's developed and all of the tools, techniques, and procedures, but also now: just because it complies with the law, should we really be doing it? and what's the extra step we're going to take to really consider privacy at the outset? >> so you sound reasonably satisfied that they're taking it seriously and doing the best they can? >> i absolutely do. i wish -- i actually feel very
4:49 pm
comforted by the fact that the government has a ton of oversight and a ton of laws to comply with. and i personally am much more worried about the large amount of collection that's taking place in industry that people don't really understand. >> all right. so i can get on to my next -- mr. bedoya, you talked about how important it was to limit collection to what was necessary or purposeful, et cetera. but in light of so many of the experts on both panels talking about an almost inevitable momentum of collection, collection, collection, where would you look -- what part of the government or what mechanism would you look to -- to try and limit the collection, or get that kind of impediment or balance in place? >> certainly. so i think folks have been saying that it's inevitable that industry is going to collect all this data. i don't think folks have been saying it's inevitable that government will collect it. taking that as a given, i think the question is about reconstructing the firewall
4:50 pm
between government and industry with respect to data collection. and so i would be surprised if anyone on this panel or the previous panels thinks that it's inevitable the government will collect all this data. one other quick point, on your previous question: i believe that the congressional committees that conduct oversight on fisa and on foreign intelligence -- certainly the senate judiciary committee -- lack technologists. >> we talked a little bit about that in our first report, on fisa reform. okay. mr. hintze, you talked earlier -- you said one of your principles was that there shouldn't be any bulk data collection. now, terminology varies all over the place; it would help me if i knew what you meant by bulk collection. at a gathering i was at, they talked about the great importance of public health data, especially for when epidemics come along or that
4:51 pm
sort of stuff. so wouldn't some of that come under your ban against all bulk data collection? >> i was talking specifically about government surveillance programs. >> okay. i just wanted to clarify that. and what do you mean by -- give us an example of what you call bulk data. there has been a debate whether this program or that program falls under bulk data. >> certainly. i had in mind the 215 program in particular, where the government goes -- >> it's not targeted. >> yes, it's not targeted. correct. >> okay. i think that's all i have right now. >> we may be able to go back to board members for additional questions. i would like to continue with this panel up until the top of the hour.
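the targeted-versus-bulk distinction the exchange above turns on can be made concrete with a toy sketch. the record layout and the selector below are invented for illustration and are not drawn from any actual program: the point is only that a targeted collector applies the selector before anything is acquired, while a bulk collector acquires everything and defers filtering to query time.

```python
# Invented call-detail records for illustration only.
records = [
    {"number": "202-555-0101", "to": "202-555-0150"},
    {"number": "202-555-0199", "to": "202-555-0150"},
    {"number": "202-555-0123", "to": "202-555-0188"},
]

def targeted_collect(recs, selector):
    """Collect only records matching a specific selector, chosen up front."""
    return [r for r in recs if selector in (r["number"], r["to"])]

def bulk_collect(recs):
    """Collect everything; any narrowing happens only at query time."""
    return list(recs)

print(len(targeted_collect(records, "202-555-0150")))  # 2
print(len(bulk_collect(records)))                      # 3
```

the privacy difference is where the filter sits: in the targeted case, records unrelated to the selector never enter the system at all.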
4:52 pm
we have one question from the audience, which i will read, and we welcome others if people want to post questions. in 2005, the national academy of sciences studied whether pattern-based data mining can anticipate who is likely to be a future terrorist. it concluded that this wasn't feasible. the question is: pattern-based data mining in the terrorism context -- is it feasible today, and will it be feasible ten years from now? would anybody like to address that? heidi? >> i don't know specifically about terrorism, mindful of what ed mentioned, that we have limited data. but there is a program that has been running in las vegas and in the lapd. we may not necessarily be able to identify
4:53 pm
specific criminals, but our predictive modeling systems have been at work, and they're able to make a reasonably good prediction about where criminal activity is more likely. it is not precisely the question that you're asking, but i can assure you that it is just getting better. i can assure you that any service provider with the amount of data that we are generating -- and more and more of it is being generated -- is honing and fine-tuning and polishing their models. whether it's going to be applicable to anti-terrorism methods, i don't know. i think all of these models are heavily data-driven, so one would need a lot of data. but as to the point that these predictive models are able to predict things that may relate indirectly to terrorism or criminal activity, the systems are suggesting that we are going that way.
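the place-based predictive modeling described above can be sketched, in a deliberately toy form, as counting past incidents per grid cell and forecasting the historically busiest cells. the coordinates and grid size below are invented for illustration; real systems use far richer features and models.

```python
from collections import Counter

# Invented incident coordinates (lat, lon) for illustration only.
incidents = [
    (34.05, -118.25), (34.05, -118.25), (34.05, -118.26),
    (34.10, -118.30), (34.05, -118.25), (34.10, -118.30),
]

def to_cell(lat, lon, size=0.01):
    """Snap a coordinate onto a grid of `size`-degree cells,
    returned as integer cell indices."""
    return (round(lat / size), round(lon / size))

# Count historical incidents per cell.
counts = Counter(to_cell(lat, lon) for lat, lon in incidents)

def predict_hot_spots(n=2):
    """Forecast next period's likely cells as the historically busiest ones."""
    return [cell for cell, _ in counts.most_common(n)]
```

a real deployment would weight recency and guard against exactly the feedback loop raised a moment later in the panel: hot spots attract patrols, patrols generate reports, and the model confirms itself.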
4:54 pm
>> other thoughts on that question? there's a system in chicago that the chicago police department has deployed, which has been both touted and criticized. it does neighborhood- or block-level predictions as to criminal activity, as well as, as i understand it, individual-level identification of people who may be either victims of crimes or perpetrators of crimes. again, both touted and highly criticized. any thoughts or comments? >> one quick one: the risk of creating a feedback loop. to draw an overdrawn example, if you police corner x because you thought it was really dangerous, you then see every crime that occurs on corner x. that's the main one from my
4:55 pm
perspective. >> so this is certainly not my area of expertise; however, predicting is different from being able to reconstruct after the fact. so can we use these things, when something has happened, to go back and find whether we missed certain people that are still involved? yes, i do believe that's the case. in terms of prediction, i think we have a ways to go. by the same token, if i get a crime report with all the crime in my area, i can predict on a weekly basis where there's going to be crime in my neighborhood. so we're getting there. >> but i mean, at some level that's just compstat all over again -- the systems that have been available to police for decades. >> sure. >> one question, and i'll go down
4:56 pm
the row again. i'll pose a question, and i think we can just go down the row with additional board members if they have additional questions. i had said, in talking to each of the panelists, that i didn't want this to be a panel about going dark and the implications of encryption, but several of you have alluded to encryption and its significance here. and i would ask any of you who would like to, to comment on the following. there is a growing trend towards more and more devices, cheaper and cheaper wearables, and the internet of things -- more and more data collection occurring. there's also, it seems, a trend towards more encryption by default, whether it's at the device level or, as mike was
4:57 pm
referring to, in terms of the encryption of data flowing between data centers. so it seems to me like we have two things going on at once, which is not unusual. somebody referred to the modern era -- the era of the internet of things and big data and ubiquitous data flows -- as the golden age of surveillance. it seems to me that both trends will always be there: more and more information available both to the private sector and possibly to the government, and increasing pervasiveness, or at least increasing diffusion, of encryption.
4:58 pm
comments on that premise, first of all -- the premise of my question, am i right? and secondly, where does that leave the government? and would you agree with my assumption that there will still be huge amounts of information available both to the private sector for its purposes as well as to the government? i guess let's go right down the row. professor? >> so, i believe that there will still be a lot of data that's available to the government. when i say that i really support encryption by default, i also really think that our country -- we were the code breakers, and that was really critical in world war ii. and i think that instead of just kind of taking the lazy approach and saying, oh, leave us a back
4:59 pm
door, we should just get better at cracking the codes, because they're getting smarter and we need to get smarter, too. i leave it to the lawyers what the legality is of when you can actually apply that or break into a system, but being satisfied with just having a back door means that we're not advancing our state of the craft and our tradecraft here in this country, and we'll be left behind as a result. >> i'll actually pass. >> your thoughts on these two trends that seem to be occurring simultaneously? >> yeah. we're certainly seeing an expanded use of encryption -- encryption between customers and the service provider, encryption between data centers, encryption on devices, et cetera -- and that's being driven by customer demand. customers are concerned about the security of their data, and they're not just concerned about
5:00 pm
the security of their data vis-a-vis hackers and bad guys; they're increasingly concerned about their data vis-a-vis the government. that's driving customer demand for these security features, and companies will continue to invest in that. does that mean that there will be no data available? i don't think so. the nature of many cloud services requires service provider access to the data. you can't run an effective e-mail system without being able to filter the content for spam and malware. and so there will be a point in the communication chain where data is available. and if it's available to a service provider, it's available to a government through lawful demands. >> yeah. any thoughts on this, and then i'll yield? >> first off, i want to agree with dr. anton's points, we

