tv Today in Washington CSPAN April 8, 2011 6:00am-9:00am EDT
>> now, a house science subcommittee hearing on evaluating tsa's passenger screening behavior detection program. under the program, known as spot, security officers are trained to use facial expressions and body language to identify security threats.
the program was implemented in 2003 and cost more than $200 million to fund last year.
a 2010 report by the government accountability office found problems with the program's effectiveness. republican paul broun of georgia chairs the two-hour hearing. >> evaluating tsa's s.p.o.t. program. you'll find in front of you packets containing the witness panel's written testimony, biographies, and truth-in-testimony disclosures. before we get started, as this is the first meeting of the oversight subcommittee for the 112th congress, i'd like to ask the subcommittee's indulgence to introduce myself. it's an honor and a pleasure for me to chair the subcommittee for this congress, and it is a position that i do not take lightly.
i want all members of the subcommittee to know that my door is always open, that i will endeavor to serve all members fairly and impartially, and that i will work in the best interest of congress and all americans to ensure that the
agencies and programs under our jurisdiction are worthy of the public's support. i recognize myself for five minutes for an opening statement. today the subcommittee meets to evaluate tsa's s.p.o.t. program, which was developed in the wake of september 11, 2001, and was deployed on a limited basis in a select number of airports in 2003. in 2007, tsa created new behavioral detection officer positions whose goal was to use behavioral indicators to identify persons who may pose a potential security risk to aviation. this goal expanded in recent years to include the identification of any criminal activity. tsa currently employs about 3,000 behavior detection officers in about 160 airports at a cost of over $200 million a year. the president's fiscal year 12
budget request asked for an increase of 9.5%, an additional 175 bdos. over the next five years the s.p.o.t. program will cost roughly $1.2 billion. outside of a few brief exchanges, congress has not fully engaged with this program. that isn't to say congress wasn't paying attention. gao completed a report on the s.p.o.t. program last may. in that report gao identified several problems with the program, most notably that it was deployed without being scientifically validated. this is a common theme that this committee is increasingly forced to deal with: expensive programs are rolled out without conducting the necessary analysis. this has become a trend throughout the federal government, but particularly at the department of homeland security.
this committee has a long history with the development and acquisition of the advanced spectroscopic -- as a southerner it's hard to say -- portal program, explosive trace detection portal machines, and the cargo advanced automated radiography system, all of which ran into problems because they were rolled out before they were ready. dhs either fails to properly test and evaluate the technology, does not conduct a proper risk analysis, or neglects to conduct a cost-benefit analysis. a crucial aspect that is oftentimes taken for granted by dhs is the nexus between those developing the technology and those who are actually using it. in the case of s.p.o.t., it seems as though the operators got out ahead of the developers. typically what we see is the
opposite: the scientists and engineers developing capabilities that do not properly fit into an operational environment. unfortunately, this is an issue that the committee is unable to address today because of tsa's refusal to attend. the goal of this hearing is to shed light on the processes by which dhs created the s.p.o.t. program, to better understand the state of the science that forms the foundation of the program, to examine the methodologies by which dhs is evaluating the program, and to identify any opportunities to improve how behavioral sciences are utilized in the security context. the goal is not to throw out the proverbial baby with the bathwater, but rather to ensure that the science being used is not oversold or undersold. s.p.o.t. is the first behavioral science program to stick its neck out for evaluation. this review is an opportunity to look at how behavioral sciences
can be used appropriately across the security enterprise and to understand their limitations and strengths. to its credit, dhs s&t is conducting a validation of the program for tsa. this report was due earlier this year, in february, and then at the end of march; it is now expected shortly, and we hope to get it shortly. while this is a good first step, i am eager to hear how independent this evaluation is, and i look forward to understanding the review's methodology, its assumptions, and what level of input and access dhs s&t had in its design, formulation, and findings. as gao stated in its recent duplication report, quote, dhs's response to gao's report did not describe how the review currently planned is designed to determine whether the study's methodology is sufficiently comprehensive to validate the s.p.o.t. program, unquote. you all understand that, i hope
you do. it's bureaucratese. but anyway, the use of behavioral sciences in the security setting is not just another layer of security. there are clear opportunity costs that have to be paid. for every bdo, there is one screener who is not looking at an x-ray of baggage, one intelligence analyst not employed, one air marshal not in the sky. i realize this isn't a one-for-one substitute, but clearly there are trade-offs that have to be made in a very difficult fiscal environment. also, i would be remiss if i did not address the clear privacy issues that this technology and other dhs technologies present. privacy, along with the serious constitutional questions i have, only compounds the complexity of the issue. while the focus of the hearing today is the science behind the
program, i don't want these other important issues to be forgotten. now, the chair recognizes ms. edwards for an opening statement. ms. edwards. >> thank you, mr. chairman, and congratulations to you as you convene the first of what i hope are many oversight hearings to make sure that we are paying attention to the kind of oversight we need to engage in on the science and technology committee on behalf of the taxpayers. i'd like to say that i too am disappointed that tsa is not here today and wasn't able to provide a witness. i think they lost an important opportunity to inform the congress and the public why they believe the s.p.o.t. program is worthy of our support, and i hope they will cooperate with this committee and the congress in the future. and i hope it's not terribly distracting as we get to the witnesses; i don't want any one of them to be mistaken for tsa. i know it's confusing. let me just say in the opening i
think each one of us has had an experience of instinctively sensing that something about a situation or a person is wrong or worrying. police officers, immigration officers, and transportation security officers have those instinctive feelings all the time. however, it's an open question whether instinctive reactions are reliable as warnings of mal-intent. we also do not know whether a person can be trained to actually sort through their instinctive reactions, choosing to intervene when faced with a potential threat and to resist reactions based on racial profiling. what the transportation security administration has tried to do is develop behavioral training for officers so that they can quickly and accurately assess and screen passengers. can hunches be harnessed in identifying potential threats to air safety? that's the key question that underlies today's hearing, and i hope we'll be able to delve deeply into those questions. after richard reid's failed shoe
bombing, some in the aviation security community concluded we were spending too much time and money on trying to stop the bomb and not enough to stop the bomber. screening of passengers by observation techniques, or s.p.o.t., was initiated by tsa as a way to get some officers' eyes off the scanning screen and onto the passengers. those credited with helping to develop the s.p.o.t. program, some of whom are testifying before us today, went on to train the behavior detection officers on demeanor. an ongoing concern, however, with the bdos, and with law enforcement as well, is that they not engage in racial profiling. if bdos focus on a passenger's ethnic, religious, or racial identity, they are thereby violating the law. terrorists have come in all colors, shapes and sizes, and if security personnel were fixated on profiling, an approach to find the next mohamed atta might miss the next john
walker lindh, timothy mcveigh, or richard reid. the s.p.o.t. program tries to identify specific behaviors that will naturally emerge due to elevated levels of anxiety or stress. the hypothesis is that terrorists would display those cues when attempting to enter a secure facility such as an airport. but behavioral scientists do not agree on these nonverbal cues, and they don't agree on whether terrorists will exhibit them. because it's impossible to get a group of terrorists to participate in a double-blind experiment, it's hard to validate the theory. dhs points to the program's success in identifying people who violated the law and were caught, but no one can be certain criminals and terrorists behave in a similar fashion. tsa relies on nonverbal cues to help sort through the more than 1 million passengers who fly in the u.s. each day. nonverbal cues provide a filtering method to allow officers to determine whom they should engage in discussion, looking for
verbal signs of deception. there is more agreement among social scientists that verbal interactions with individuals can actually help in detecting deception. we had hoped that the dhs-funded validation report on the s.p.o.t. program would be available for this hearing today. that report shows that s.p.o.t.-trained behavior detection officers are much more likely to identify what tsa deems as, quote, high-risk passengers as against a purely random sample of passengers. we look forward to the report's completion and its findings, but without it we're missing an important initial assessment of the program's performance. over the past 10 years since the 9/11 terrorist attacks, congress has allocated billions of dollars to the department of homeland security for the development of tools and technologies to keep our air travel secure. too often that investment has been wasted, and too often we have relied on technology that is not adequately tested before it is deployed. it's not based on adequate
scientific evidence of effectiveness, and almost inevitably the technology has proven costly to acquire, deploy, and service. so i look forward to today's hearing and to asking questions about the more than $200 million a year that we are spending, to make sure that we carefully evaluate s.p.o.t.'s operational merits. and with that, i yield back. >> thank you, ms. edwards. if there are members who wish to submit additional opening statements, those statements will be added to the record at this point. at this time i'd like to introduce our panel of witnesses. mr. stephen lord is responsible for directing gao's numerous engagements on aviation and surface transportation issues. before his appointment to the senior executive service in 2007, mr. lord led gao's work on a number of key international security, finance, and trade issues. mr. lord has received numerous
gao awards for meritorious service and outstanding achievement in teamwork. congratulations. mr. larry willis is program director for suspicious behavior detection within the human factors division of the homeland security advanced research projects agency, science and technology directorate, at the department of homeland security. your business card must be a big one with all of that. detective lieutenant peter didomenica -- i hope i am pronouncing that right; my own name often gets mispronounced, so i'm very cognizant of people's pronunciations. detective lieutenant peter didomenica is employed by the boston university police, where he commands the police detective division. prior to this he served as a massachusetts state police officer as well as the director of security policy at boston logan international airport
where he developed innovative antiterrorism programs. dr. paul ekman is professor emeritus of psychology at ucsf and is currently president of the paul ekman group. he has authored or edited 15 books. you have been busy. and he has consulted with federal and local law enforcement and national security organizations. the american psychological association identified dr. ekman as one of the 100 most influential psychologists of the 20th century. quite an honor. "time" magazine selected him as one of the 100 most influential people of 2009. he is also the scientific adviser to the dramatic television series on fox tv, "lie to me," which was inspired by his research. hope you're getting rich with all that. love the market system. it's great.
dr. maria hartwig is an associate professor in the department of psychology at john jay college of criminal justice. she has published research on deception in a number of scientific journals and serves on the editorial board of law and human behavior. in 2008, dr. hartwig received an early career award from the european association of psychology and law for her contributions to psychological research. congratulations. dr. philip rubin is the chief executive officer and a senior scientist at haskins laboratories, a private nonprofit research institute affiliated with yale university and the university of connecticut. in 2010, dr. rubin received the apa's meritorious research service commendation. dr. rubin is the chair of the national academies board on behavioral, cognitive, and sensory sciences and was formerly the chair of the national research council committee on field evaluation of behavioral and cognitive
sciences-based methods and tools for intelligence and counterintelligence, and a member of the nrc committee on developing metrics for the department of homeland security science and technology research. noticeably absent from the witness table is the transportation security administration. tsa was invited to the initial hearing on march 13; it was postponed. they were invited to this hearing several weeks ago. in response to these invitations, dhs has refused to send a tsa representative. at another committee hearing just yesterday, the department of homeland security refused to have a witness sit on a panel with other witnesses. dhs has staked out a claim that i think is intolerable.
it is unconscionable that tsa will not send a representative here today to this important hearing on this program, a program that is slated to spend $1.2 billion of the taxpayers' money, to talk to us about it, and i find that totally reprehensible. in a letter to this committee, dhs sought to define the subcommittee's interest, presumably quoting from rule 10 of the house of representatives that delineates jurisdiction. in this letter they state, quote, given the subcommittee's interest in scientific research, development, demonstration, and projects, larry willis, program manager, will represent dhs at the aforementioned hearing, unquote. finally, it is highly presumptuous that dhs thinks it
knows our jurisdiction better than we do. it shows their arrogance, and i find it appalling, considering this committee was formed in 1958 and played an active role in creating the department of homeland security. while dhs surprisingly cites our black-letter jurisdiction under rule 10 correctly, they must have stopped reading there. in rule 11, the committee on science, space, and technology is tasked with the responsibility to, quote, review and study on a continuing basis laws, programs and government activities relating to nonmilitary research and development, end quote. unless tsa and dhs argue that science and research play no role in the development of the s.p.o.t. program, i see a compelling reason for their attendance here today.
the nexus between science and operations is vitally important to understanding how programs were developed, why there are problems, and how they can improve. if tsa and dhs are, in fact, making a claim that science and research play no role in the formation of the program whatsoever, then this program should be shut down immediately for lacking any scientific basis and being little more than snake oil. if dhs does not value this committee's role in overseeing the agency, and if tsa does not value s&t's scientific advice, there are a number of legislative options that this committee could employ to change that impression. i'll also note that dhs has sent officials before from customs and border protection and the coast guard.
i find it odd that in this instance, tsa would not want to talk about this program. it makes me wonder what they are trying to hide. when dhs is asking for a 9.5% increase in the fiscal year 2012 budget request for s.p.o.t., you'd think they could justify that increase to us here in congress. let me be clear: the administration does not tell congress how to run its hearings. we will likely return to this issue once again after the validation report is delivered. at that point, we may seek tsa's input once again. if that is decided, this committee may seek more aggressive measures to compel tsa's attendance, including the issuance of a subpoena.
this committee has not needed to issue a subpoena in almost two decades because it's been successful in reaching accommodations with republican and democratic administrations. i'm hopeful that tsa will determine that they have a viable contribution to make to this topic in the future so that we do not find it necessary to go down that road. now, as our witnesses should know, testimony is limited to five minutes each. if you would all please try to hold it to the five minutes. if you go over a few seconds, that would be okay, but if you just go on and on then i may have to tap the gavel, so, you know, please wrap up quickly. your written testimony will be included in the record of the hearing. it is the practice of the subcommittee on investigations and oversight to receive testimony under oath. do any of you have any objections to taking an oath?
any of you? okay, let the record reflect all witnesses are willing to take an oath. they all show that by shaking their heads from side to side, indicating no. you also may be represented by counsel. do any of you have counsel here with you today? no. okay, let the record reflect that none of the witnesses have counsel. now, if you'd please stand and raise your right hand. [witnesses were sworn in] >> let the record reflect that all witnesses participating have taken an oath. thank you. y'all may sit down. i now recognize our first witness, mr. stephen lord, director of homeland security and justice issues, government accountability office.
mr. lord. >> chairman broun, ranking member edwards and other members of the committee, thank you for inviting me here today to discuss tsa's behavior detection program, also known as s.p.o.t. today i would like to discuss two issues: first, dhs's ongoing efforts to validate the program, and second, tsa's efforts to make better use of the information collected through this program. this is an important issue, as the department is currently seeking $254 million in fiscal year 2012 funds, including 350 additional behavior detection officer positions. as we reported in may 2010, tsa deployed s.p.o.t. to 161 airports across the nation before completing ongoing validation efforts. thus, it is still unclear whether behavior and appearance indicators can be used to reliably identify individuals who may pose a threat to the
u.s. aviation system. according to tsa, the program was deployed before these efforts were completed to help address potential security threats. to help ensure the program is based on sound science, our report recommended that tsa and dhs convene an independent panel of experts to review the methodology and results of the ongoing validation effort mentioned in your opening comment. the good news is dhs agreed with this recommendation. however, as other panel members will note in their statements today, a scientific consensus does not yet exist on whether behavior detection principles can be reliably used for counterterrorism purposes in an airport environment. it's also important to note that the current dhs validation effort will not answer several important questions. for example, how long can behavior detection officers observe passengers without
becoming fatigued? what is the optimal number of officers needed to ensure adequate coverage? to what extent are the behavior and appearance indicators the right mix of indicators? should the list of indicators be larger, or should the list be smaller? also, while mr. willis will report that s.p.o.t. is nine times more effective than random screening in identifying so-called high-risk individuals, the results of this analysis have yet to be shared with gao or independently reviewed. our report also highlighted some difficulties that tsa faced in capturing and analyzing the rich information it was collecting at airports. we recommended that tsa better collect and analyze s.p.o.t. information to help connect the dots on passengers who may pose a threat to the u.s. aviation system. for example, we recommended that tsa clarify its guidance to bdos for inputting information into the database used to track
suspicious activities. we also recommended that tsa expand access to this database across all s.p.o.t. airports. the good news is tsa agreed with our recommendations and has revised its procedures accordingly. tsa also expanded access to this database to all s.p.o.t. airports as of march of this year. our 2010 report also recommended that tsa make better use of information collected through airport video systems. we noted that 16 individuals who were later charged with or pleaded guilty to terrorism-related offenses transited through eight s.p.o.t. airports on 23 different occasions. thus, we recommended that tsa examine the feasibility of using airport video systems to refine the number of behaviors currently assessed, and also to use this information to help refine the program going forward. we believe such recordings could
help identify behaviors that may be common among terrorists, or demonstrate that terrorists do not generally display any identifying behaviors. and again, tsa agreed with our recommendation and is now exploring ways to better use these video recordings. in closing, behavior and appearance monitoring might be able to play a useful role in security efforts. however, it is still an open question whether these techniques can be successfully applied on a large scale in an airport environment. and while i am encouraged that dhs has taken steps to validate the program, i'm still surprised that the department is seeking additional funding for the program before this issue is fully addressed. hopefully today's hearing will help clarify tsa's future plans for the program. chairman broun, ranking member edwards, and other members of the committee, this concludes my statement, and i look forward to your questions.
>> thank you, mr. lord. i now recognize our next witness, dr. paul ekman, professor emeritus -- i skipped over someone, and i apologize. i now recognize mr. willis, our next witness. mr. larry willis is program manager, homeland security advanced research projects agency, science and technology directorate, at the department of homeland security. mr. willis, you have five minutes. thank you, sir. >> good afternoon, chairman broun, ranking member edwards, and distinguished members of the subcommittee. i'm honored to appear before you today on behalf of the department of homeland security science and technology directorate to discuss our evaluation of the transportation security administration's screening of passengers by observation techniques, or s.p.o.t., referral report, which is a checklist of predefined behaviors and indicators used by tsa to identify potential high-risk
travelers. for the purposes of s&t's studies, high-risk travelers are defined as those passengers in possession of serious prohibited or illegal items, or individuals engaged in conduct leading to arrest. for background purposes, the s.p.o.t. validation effort began in 2007 as a result of the component-led s&t people screening process that identified and prioritized capability gaps for dhs operational customers. as an active participant in this process, tsa identified the s.p.o.t. referral report and its associated indicators as a candidate for a validation study. the s.p.o.t. referral report contains a discrete list of indicators which have been designated by tsa as sensitive security information, or ssi. tsa's behavior detection officers, or bdos, are trained to identify these indicators and use them to make screening decisions, such as referral for additional screening at the tsa
checkpoint. it is important to note that behavior screening is not limited to aviation security and is conducted formally or informally by dhs agencies, the department of defense, the intelligence community, and law enforcement worldwide. the s.p.o.t. validation research is a rigorous evaluation of tsa's s.p.o.t. referral report that supports our better understanding of the threat and the screening accuracy of the existing indicators, and advances the science of behavioral screening.
the base rate study included 71,589 referrals from 43 airports. to make direct comparisons between the base rate database and operational referrals, a second data set was created for the 23,264 operational s.p.o.t. referrals collected during the same time and at the same locations as the base rate study. together, these two datasets allowed air, the american institutes for research, to assess the extent to which the s.p.o.t. referral report's behavioral indicators were leading to correct screening decisions. a number of findings emerged from the analysis of the report, including the following that i would like to share with you. one, operational s.p.o.t. identifies high-risk travelers at a higher rate than random screening. the study indicates a high-risk traveler is nine times more likely to be identified using operational s.p.o.t. versus
random screening. moreover, to achieve this outcome, bdos were able to engage 50,000 fewer travelers using operational s.p.o.t. than with random selection. the second result is that the population base rate for s.p.o.t. indicators is low. among those selected for random screening in the base rate study, the most frequently observed indicator was displayed in only 2.8% of randomly selected travelers; all other indicators were observed in less than 2% of travelers selected during the base rate study. in conclusion, these results indicate the s.p.o.t. program is more accurate than random screening in identifying high-risk travelers using the metric that we employed. our validation process, which included an independent comprehensive review of the s.p.o.t. referral report, is a key example of how s&t works to enhance the effectiveness of
operational activities. chairman broun, thank you for this opportunity to discuss research to validate the screening of passengers by observation techniques referral report, and i am happy to answer any questions you may have. >> thank you for keeping your remarks under five minutes. sometimes that is not done; in fact, most times. our next witness is peter didomenica of the boston university police. you have five minutes. >> good morning, chairman broun and ranking member and members of the committee. thank you for the opportunity to address you regarding the future of the tsa s.p.o.t. program. since originally developing the program, i have trained 3,000 police, intelligence, and security officials in 100 federal, state, and local agencies, and i have also been a lecturer for
the fbi, the secret service, the u.s. army night vision labs, the defense department's criminal investigation task force, and the national science foundation. i represent only myself, none of the organizations i have been employed by. on december 22nd, 2001, while assigned to logan international airport, i was part of a large team of public safety officials who responded to the airfield for american airlines flight 63, diverted to boston from a flight from paris to miami. on board was a passenger named richard reid, who had attempted to detonate an improvised explosive device that, if successful, would have killed all 197 passengers and crew members aboard. as i stood only a few feet away from him while he was securely in custody, it hit me then that this man was the real thing. the threat of another terrorist attack by al qaeda was ongoing, and we needed to do much more to properly screen passengers than merely focusing on weapons
detection. thus began my development of the behavior assessment screening system, or bass. i began to explore the scientific literature in an effort to quantify the human capacity to detect dangerous people. my research included many disciplines, including physiology, psychology, and neuroscience, and specific research into suicide bombers, and specific behaviors were selected that were reported in the scientific literature and consistent with law enforcement experience. the program went on to be delivered to numerous agencies, including the washington d.c. metro transit police, amtrak police, and atlanta police officers assigned to the world's busiest airport. in 2006, two trainers and i spent two weeks in london, where we set up a british version of the bass program for the british transport police in response to the july 7th, 2005, terrorist attacks on the london underground. during the course of training police officers around the
nation, instructors discovered terrorist ties. in 2004, when conducting training with the new jersey transit police, i observed three males exhibiting suspicious behavior using bass techniques. one subject was in the united states on a religious visa and was being escorted to an amtrak train for a week-long trip with no luggage. it was later confirmed the subject was listed on the terror watch list. i also intercepted an inspector on a covert test at logan airport in late 2003 with concealed weapons. while the tsa s.p.o.t. program is identifying high-risk passengers, its effectiveness is limited, because proper resolution of highly suspicious people discovered by the tsa requires a law enforcement response by police officers trained in the same behavior detection and interview skills. in a properly designed program, the most dangerous people are removed from the critical infrastructure or arrested by bass-trained police officers. i do not believe the tsa s.p.o.t. training program alone is enough. the airport police, in my opinion, need to be trained in the same techniques and skill sets, which would engender confidence in the program and their ability to detect terrorist behavior and prevent devastating attacks. another issue with the s.p.o.t. program is that the tsa has created too high expectations for what it is able to achieve. the original s.p.o.t. program was designed not primarily for the apprehension of suspects but as a means to deny access to critical infrastructure to high-risk persons involved in terrorism or other dangerous activity. it was the last and best chance to prevent a tragedy when other methods like intelligence and traditional physical screening failed. catching a terrorist in a public place without prior intelligence is extremely difficult. by way of example, if we use the known number of terror suspects who boarded domestic commercial flights and the approximately four billion passengers at commercial airports from 2004 to 2009, the base rate of terrorist
passengers is one in 173 million. the expectation that the s.p.o.t. program will result in the arrest of all terrorists attempting to board a domestic flight in the united states is unrealistic and threatens continued support. if it is seen as part of a multilayered approach, with the primary goal of preventing terrorist access to critical infrastructure in conjunction with properly trained law enforcement, the program sets reasonable and attainable goals and should have the support of this congress. thank you for this opportunity to address the program, and i'm prepared to answer any questions you may have. >> you did not exceed your 5 minutes either. congratulations. thank you for being here. our next witness, professor emeritus from the university of california, san francisco. you have five minutes for your
testimony. >> thank you. i really appreciate this opportunity to testify on this very important issue. i have been working with tsa and s.p.o.t. for eight years based on 40 years of research on how demeanor, facial expression, gesture, gaze and posture, can help in identifying lies and harmful intent. my research examined four different kinds of lies. lies to conceal a strong emotion felt at that moment. lies claiming to hold a social political opinion the exact opposite of your truly strongly held opinions. lies denying that you have taken money that isn't yours, and lies in which members of extremist political groups attempt to block an opposing political
group from receiving money. our research focuses on real world lies that matter to society in which each person decides for him or herself whether to lie or tell the truth. just as we do in the real world. no scientist comes out of a closet and tells you that you are supposed to lie here and you are supposed to tell the truth, except in experiments published in journals. the person who tells the truth knows that if he or she is mistakenly judged to be lying they will receive the same punishment as the liar who is caught. this makes the truthful person apprehensive and harder to distinguish from the liar just as it is in the real world. and the punishment threat is as severe and highly credible to those who participate in the
research as we could make it and still get it passed by the university. i should mention i worked in a medical school. i would never have gotten it past berkeley but in the medical school what i do is considered trivial. unlike any other research team we have performed the most precise comprehensive measurements of face, gesture, voice, speech and gaze, and those measurements have yielded between 80% and 90% identification of who is lying and who's telling the truth. the clues we have found are not specific to what the lie is about. as long as the stakes are very high, especially with the threat of punishment, the behavioral clues will be the same. it is this finding that suggests there will be no clues specific to the terrorist hiding harmful intent that differ from the money smuggler or the drug smuggler or
the wanted felon. in my written testimony i raised 3 key questions. what is the basis for the checklist? i explain why our findings on different kinds of lies provide a solid basis for reviewing what is on the s.p.o.t. checklist. what is the evidence of the effectiveness of s.p.o.t.? dr. willis has covered that and i won't repeat it. i am eager to see the report that you are eager to see. question 3, can s.p.o.t. be improved? a dangerous question to ask a scientist. we always think more research is necessary. but is it a wise investment compared to other things that the government can invest in regarding airport security? that is your decision, not mine. in my testimony i have outlined a couple types of research that could be useful if you decide you want to do more research but
we do not need to do more research now to feel confident in this layer of security provided to the american people. in my written testimony i attempted to answer questions that have been raised by critics of s.p.o.t.. why wasn't s.p.o.t. based on how terrorists actually behave? why is s.p.o.t. catching smugglers and not terrorists? are people with middle eastern names or middle eastern appearance more likely to be identified by s.p.o.t.? i would be glad to respond to questions; i provide brief answers to each of these in my written testimony. my thanks to the committee and staff of the committee for the opportunity to talk to you and to the men and women in the tsa who make flying safer than
it would be without their dedicated efforts. >> appreciate your testimony. now recognize our next witness, dr. hartwig of john jay college of criminal justice. your testimony for five minutes. >> an honor to be here. thank you for allowing me the opportunity. the s.p.o.t. program is based on the idea that credibility assessments can be made on the basis of observing facial cues and nonverbal cues that indicate stress, fear or deception, and i have been asked to address scientific support for this. first of all there are more than thirty years of research showing that people are quite poor at detecting deception on the basis of observing behavior. in a recent meta-analysis, a statistical overview of the research, people obtain a hit rate of 54% and you should keep in mind that 50% is obtained by chance alone. why are people poor at detecting deception on the basis of observation? one answer is that there are very few nonverbal demeanor-based cues to deception and these tend to be weak. so simply put there may not be much to observe, and contrary to what laypeople and presumed lie experts such as law enforcement believe, liars don't display more signs of stress, fear and arousal. critics of this research very often say that these findings are due to the nature of laboratory experiments that most research relies on. the claim is that when the stakes are sufficiently high these cues to deception will
appear. research has addressed this concern by studying high-stakes lies like those told by people suspected of serious crimes like murder and rape, and these studies don't show any evidence that cues of stress and anxiety appear as stakes increase. let me turn to the issue of detecting deception from facial cues of emotion. this is based on the idea that liars experience the emotion of fear and that observing facial cues can help you detect lies. i don't have time to go into details about this but -- in brief it invites false alarms. it may miss travelers with hostile intentions who don't experience these emotions or who successfully conceal them. it may generate false alarms for
travelers who don't have hostile intentions but experience these feelings for other reasons. most people are quite surprised to hear that there's very little evidence on the issue of these microexpressions. brief displays of an underlying emotion that are revealed automatically. i am aware of only one study published in the peer-reviewed literature, conducted by steve porter in the journal psychological science, which compared falsified and genuine displays of emotion. they found no complete microexpression in the 697 facial expressions they analyzed. they found 14 partial microexpressions in the lower or upper half of the face, but these microexpressions occurred with similar frequency in true and falsified expressions.
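[editor's note: the figures cited in this testimony are easy to sanity-check. a minimal sketch of the arithmetic; the 54% hit rate, the 50% chance baseline, and the 14-of-697 microexpression count come from the testimony itself, and nothing else is assumed:]

```python
# Sanity-check of figures cited in the testimony above.
# 54% hit rate and 50% chance baseline: Hartwig's meta-analysis summary.
# 14 partial microexpressions in 697 expressions: the Porter study.

hit_rate = 0.54           # average human deception-detection accuracy
chance = 0.50             # coin-flip baseline
lift = hit_rate - chance  # how far above chance observers perform

partial_micro = 14        # partial microexpressions found
expressions = 697         # facial expressions analyzed
micro_rate = partial_micro / expressions

print(f"accuracy above chance: {lift:.0%}")       # 4%
print(f"microexpression rate:  {micro_rate:.1%}") # 2.0%
```

[the point of the arithmetic: a 4-point edge over a coin flip, and a cue that appears in about 2% of expressions regardless of truthfulness, is the quantitative basis of the witness's skepticism.]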
so this study shows microexpressions occur very rarely and to the extent they do occur, they occur in genuine displays as well. given the occurrence of microexpressions in true expressions, their usefulness in airline settings is questionable. the authors also state that current training relying on their identification may be misleading. i would like to address a point of view expressed by dr. ekman on the s.p.o.t. program. he said in a published letter that he no longer publishes all the details of his work in the peer-reviewed literature because those papers are closely followed by scientists in countries such as syria, iran and china. i object to a deliberate strategy not to publish research for three reasons. first, the argument that the enemy, whoever they are, or potential criminals will be aware of the results of research applies to all research, so if we took this argument seriously we should not publish any lie detection research because it may ultimately help the enemy. second, it is my understanding of the theory of microexpressions that these are involuntary displays, and if that is the case i fail to see how knowledge about these behaviors or research on these behaviors could help a person defeat detection. most importantly, these claims about microexpressions as cues to deception, and about the s.p.o.t. program, are questions that should be addressed with data subjected to scientific peer review, and given the amount of resources that have already been spent on this program i think such validation is necessary.
in summary my view is the s.p.o.t. program is out of step with scientific research. it relies on an outdated view of deception and there is very little support for it in the peer-reviewed literature. if i had more time i would say a few words about what may be a more productive approach to credibility assessment but i believe i am out of time. >> thank you, dr. hartwig. if you want to add some suggestions we will enter those into the record and entertain any suggestions you may have and hopefully get those from you. i would like to recognize our next witness, dr. philip rubin, executive officer of haskins laboratories. you have five minutes. >> chairman broun, ranking member edwards and members of the subcommittee, thank you for the opportunity to speak to you today. i am philip rubin.
i currently serve in a number of roles inside and outside government that might be relevant to today's hearing. in addition to activities previously mentioned by chairman broun, i am a member of the technical advisory committee formed to provide critical input related to the analysis and methodology used in the s.p.o.t. program. i was invited because of my research in the behavioral and cognitive sciences related to laboratory studies and the evaluation of various tools, techniques and technologies used in the detection of deception. my written testimony provides historical background on selected activities in the behavioral sciences related to security and mentions a variety of documents and reports, some of which i have here, including many produced by the national academies' national research council, like consensus reports and other documents. my written testimony focuses on two workshops i was involved in, on field evaluation and
on intelligence and counterintelligence and threatening communication and behavior. i am not able to describe these in detail and refer you to my written testimony. regarding field evaluation, various obstacles must be overcome if field evaluations of techniques derived from the behavioral sciences are to become more common and accepted. the most basic obstacle is a lack of appreciation among many for the value of objective field evaluations and for how inadequate informal lessons-learned approaches can be as field evaluation. a number of people throughout the process of developing this summary spoke about the pressure to use new devices and techniques once they become available because lives are at
stake. the sense of urgency can lead to pressure to use available tools before they are evaluated and can even lead to ignoring the results of evaluation if they disagree with the user's conviction that the tools are useful. as indicated earlier i am a member of the technical advisory committee for s.p.o.t.. the technical advisory committee's role is extremely limited, focusing on determining whether or not the research program successfully accomplished the goal of evaluating whether s.p.o.t. can identify high-risk travelers attempting to defeat the process. the problem is the committee cannot evaluate the s.p.o.t. program or the indicators in the program. to evaluate consistency across measurements, training issues and scientific foundations, and to evaluate the program as a whole, a broader mandate would be needed. to summarize my written testimony, let me mention a few points as highlights. these are recommendations on how to move forward. create a reliable research base examining the many issues related to security and the detection of deception. peer review where and when possible, shining a light on methodology, being as open as possible, is necessary to know whether these technologies are performing in a known and reliable manner. incorporate knowledge of the complexity, regularities and idiosyncrasies of human behavior. understand the interplay and difference between emotion, stress and other factors. make sure we are not distracted
or misled. pay serious attention to the ethical issues and issues related to human subject research, including in relevant emerging areas with privacy concerns and the ethical implications of the deployment of agents and devices. eliminate conflicts of interest when possible, including financial interests in development. understand how urgency, organizational structure and institutional barriers can shape program development and assessment, and support the importance of independent evaluation of new and controversial projects and issues, with appropriate scientific, technical and statistical expertise. thank you. >> i want to express my appreciation for your being here.
you had some recent challenges and i appreciate you being here. i want to thank all the panel for your testimony. committee rules limit questions to five minutes. the chair will open the round of questions and the chair recognizes himself for five minutes. when can we expect the s.p.o.t. validation report? >> the report was delivered to me last night and is being submitted through the review and release distribution process. i am not sure what that timeline is and when it will be disseminated -- i will get that information. >> what additional steps have to be taken? >> i don't know what the process
entails. and following my participation here -- >> any problems with preliminary results? >> i don't know what the policy is on that but i am happy to provide whatever is consistent with release. >> i understand the results are still preliminary. there appears to be a discrepancy in s.p.o.t.'s success rate. you state, quote, the study indicates a high risk traveler is nine times more likely to be identified using operational s.p.o.t. vs. random screening. i understand when you met with staff from the subcommittee on march 3rd you said the s.p.o.t. program was 50 times more effective than random screening. one of our other witnesses makes
a similar claim in his testimony, saying, quote, malfeasants are identified 50 times as often by s.p.o.t. as by random screening. please explain the discrepancy. >> there shouldn't be a discrepancy. we use four metrics to evaluate s.p.o.t.. the first was the possession of illegal or prohibited items. the second was possession of fraudulent documents. the third was law-enforcement arrests and the fourth was a combination thereof. the arrest metric has the higher number you referred to in your question. the possession of prohibited items and fraudulent documents is approximately 4-1/2 times and if one combines all of them it is
nine times. >> of those that were identified and arrested, how many of those were convicted? >> i have no idea. our effort stops at a disposition recorded as being arrested or not and that is the information that is available through the s.p.o.t. database. it doesn't go beyond that. >> do you have any data about false positives? on the people identified at 50 times or nine times? >> associated with arrests? >> with arrests, yes. with arrests and with prosecution, the ultimate prosecution, etc. >> we have information available
on that. for example if one looks at the false positive index, which is for every person you correctly classify as a high risk traveler, what is the number of travelers you misclassify. we have that information for any of the four metrics we discussed. for example, on the combined outcome, for every person you correctly identify using operational s.p.o.t., 86 are misidentified.
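[editor's note: the numbers cited at different points in this hearing can be combined in a back-of-the-envelope sketch. the 1-in-173-million terrorist base rate, the 9x improvement over random screening, and the 1-correct-to-86-misidentified false positive index all come from the testimony; combining them this way is an illustration by the editor, not a claim made by any witness:]

```python
# Back-of-the-envelope arithmetic on figures cited in the hearing.
base_rate = 1 / 173_000_000   # terrorist base rate per passenger (DiDomenica)
enrichment = 9                # s.p.o.t. vs. random screening (Willis)
fp_index = 86                 # misidentified per correct hit, combined metric

# Precision on the combined criminal-outcome metric: 1 hit per 86 misses.
precision = 1 / (1 + fp_index)
print(f"precision on combined outcomes: {precision:.2%}")  # 1.15%

# For terrorism specifically, even a 9x-enriched pool stays tiny.
posterior = base_rate * enrichment
print(f"P(terrorist | flagged), assuming 9x enrichment: {posterior:.1e}")
```

[the sketch shows why the witnesses frame s.p.o.t. as a deterrence layer rather than an arrest tool: even granting the nine-fold enrichment, the probability that any flagged passenger is a terrorist remains on the order of five in a hundred million.]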
>> that's the reason tsa should be here, and it underscores the point that they're not here. >> i could talk, sir, about why we use metrics that deal more with criminal activity than terrorism if that would be helpful. >> you've got a few seconds, go ahead. >> my time's out. >> the reason we use those metrics we just listed, sir, was because they were available to us in the data in sufficient numbers to analyze even though they themselves are extremely rare, and data directly dealing with terrorism is unavailable and, thus, can't be used as a metric. >> okay. my time's up. ms. edwards. >> thank you, mr. chairman. and as i mentioned earlier, i am disappointed that tsa isn't here because i think there are a number of questions that
actually go to things like training protocols and other aspects of the spot program that they would have, you know, really useful information to share, and so i look forward to working with the chairman and the committee. the decision about who needs to appear before this committee is not really a decision for the administration; congress determines under its constitutional authority who appears before the committees and what the jurisdiction is. so i do share that concern. i want to go to this question, though, of profiling -- >> would the gentlelady yield? >> yes. >> appreciate your comment. you took up almost a minute with that, and i'd like to give you an extra minute on top of that. [laughter] i don't want to charge you. >> i appreciate that, mr. chair. >> i'll give you an extra minute. if y'all would start the clock again, please. >> thank you. thank you again, mr. chairman.
i have a question, really, that goes to this issue of profiling. i mean, as an african-american woman who sometimes, because i have short hair and i get cold, wears a scarf on my head, i've had the experience of being pulled over and questioned, and it hasn't happened once or twice, it's happened multiple times. and, you know, i don't want to make any speculation about that, but it does raise the question of who's identifying me and how and what i'm sending off. i'm also reminded in dr. hartwig's testimony that, you know, i remember when i broke a lamp, and i tried to glue it together, and my mother walked in and she said, what did you do? and i proceeded to tell her a lie. but i suspect part of the reason that she knew i was lying is because she knew me and because she'd had experience with me, and she'd read both my verbal and nonverbal cues many times over which gave her a better
indication of when i was doing truth telling and when i wasn't. we don't have that experience in our airports, and so i have a question for lieutenant didomenica, and that is whether it's possible to train officers of all kinds not to engage in profiling. and i've done police training, law enforcement training as well, and i think it's tough to train out culture. culture in the sense of a police culture and a law enforcement culture where you have to train against type when it comes to these issues. and so i'm curious, lieutenant didomenica, if you can share with us whether it's possible to train officers not to engage in profiling. >> i believe it is so, and i've been training in bias policing and racial profiling for over a decade now. principally with the state police. i designed statewide programs for the massachusetts police community on racial profiling and bias policing, and it is
possible to make people aware of their own unconscious bias and tendency to want to make snap decisions about people based on very superficial things. we all have this hardware, it's a survival instinct, and when we look at somebody, we're automatically making an opinion about them. a lot of it has to do with our background and cultural influences, and a lot of those are negative. this part of your brain is about survival, and it wants to understand what's going on very quickly, and it actually gets a jump on your conscious awareness. right away we made a decision about each other before we were even consciously aware of who we were and what we are, and that's going on all the time. and this is the source of bias. now, knowing that i can't stop my feelings about someone based on how they look, that initial survival reaction about whether the person might be dangerous or not, i can still take a few seconds, maybe minutes, to think about, you know, what's going on, what do i know objectively, and maybe even do some race
transposition. if this person was another race, how would i feel about the decision? it takes self-awareness, it takes training, it takes being willing to change and monitor yourself, but it can be done. one of the foundations of the behavior assessment training i've done and what i initially gave the tsa is you have to address bias and racial profiling. in fact, to me, it was an antidote to racial profiling. >> i'd love to hear it, but i just have a minute and a half left, and i appreciate your answer. i wanted to get to dr. ekman because i have to tell you, you've been unnerving the entire time i've been in here, and i wonder if you have something to share with us on this issue of whether you can train against those kinds of what could be negative instincts in one context, but train them to be positive factors in recognizing behavior? >> yes. and thanks for the opportunity to respond to that. i wanted to quickly put in that
we did research years ago that showed that the better you knew someone, the worse you were at identifying when they lied to you. because you're biased. if they're your friend, your spouse, etc., you don't want to discover that. strangers do better than close people. but the issue is monitoring, building into the spot program some monitoring to discover the actual incidence of racial profiling. and my bet is that some people show a lot more of it than others. not everybody can learn everything. not everybody can unlearn everything. what we want as bdos are the people who have the flexibility of mind to benefit from that training. and if they're not benefiting, how can we find out? it's not rocket science. it's having unannounced observers checking on who it is
they pay attention to. and finding out whether there are some people who are repeatedly showing racial profiling. and you either reeducate, or you reassign them to a different job. >> thank you, dr. ekman, and thanks for your indulgence, mr. chairman. >> you and i will always be friends, and i will always give you some variances on the time, so i'm not going to be worried about that at all. dr. benishek, you're up next for your questions. go ahead, sir. >> [inaudible] >> thank you, mr. chairman. thanks to the panel as well for being here. it's our job here to try to spend the money of the taxpayer in the most efficacious way, and
listening to the testimony here, it's really difficult for me to determine whether this spot process is accurate or not, but i'd like to address mr. didomenica about the process a little bit more. from your comments today, it seems as if there's some doubt. i mean, even after the bdo sees some kind of behavior, what is the process after that? if there's someone there, it sounds as if you have some doubt as to the next screening step. are those people not trained in the same thing? i hate to see somebody getting missed. so i'd like to know about the exact process from the moment that the person gets taken out of the queue. is that effective? are we doing any good? are we missing people? i mean, this is the kind of thing that i think you brought up in your testimony. >> i think it's effective, and i
also think we're missing people, but i think that could be improved. the process, actually, starts with an observation that may indicate a person that's high risk that maybe should not get on that airplane or get onto that train or into that government building, whatever the critical infrastructure is. and based on the evaluation, the spot scoring, which i really can't go into because that's sensitive information, there are two levels, and one is more screening, and one is a law enforcement response. so for the people deemed to be the most high risk, the protocol is to invite or call a law enforcement officer to do a follow-up interview. now, the follow-up interview is the opportunity to address the false positives because a lot of people that exhibit the behaviors that may indicate possible terrorist or criminal intent are just people who are upset or distracted or late for work or going to a funeral. whatever it is, there may be a lot of people that just get on the radar, and this interview --
which really only takes a couple of minutes to do -- is an opportunity to resolve that. it's also an opportunity to determine if you got the real thing, that this person is high risk. so that's another skill. that's the interview skill which is another part of this process. and so -- >> are those people skilled enough, in your opinion? >> we save those people -- >> the people at the secondary screening. are there enough of those people? >> i think the responsibility ultimately falls on police officers when there's a high-risk person. i think they're capable. every day they're making decisions around this country whether to arrest somebody, not to arrest somebody, deny people their freedoms. so i don't think it's too much to ask them to make a decision, is this person a high-risk person, and do we need to slow down the process to figure out what's going on? i think they're capable of doing it. we're doing it whether this program gets funded or not. cops are making these decisions every day, but i would like to see them get more training and
support, and this program has that potential. >> all right, thank you. i don't know where we are on the time, but i'll yield back the remainder of my time, if any. >> thank you, doctor. i just want to say your questioning just shows further why tsa should be here so that we could answer those questions, because if they were, then you could direct it to the tsa individuals, and it'd be very instructive to the whole committee, democrats and republicans alike, and help us to go forward. the next person on the agenda is my friend, mr. mcnerney. >> thank you -- >> you're recognized for five minutes. >> thank you. i appreciate you calling this hearing. it's interesting. i have watched "lie to me" on occasion, and i find it compelling but not too scientific, in my opinion. but it's good to examine this issue and see how much utility there can be from it and how much money should be expended to
define that utility. dr. hartwig, i think i heard you say, and you can correct me if i'm wrong, that you failed to see how knowledge of the indicators could be useful. >> well, i think that is, again, an empirical question. there isn't enough research on, well, there's a lot of research on demeanor cues, but as far as i know there's no study that tests whether knowledge about, for example, microexpressions helps people not display them. but that would be a second step. it would be a good first step to establish that these expressions occur reliably. so countermeasures come second. >> okay, thank you, dr. hartwig. i was going to follow up with you, dr. ekman, to basically say would you agree that knowledge
of those indicators would also be useful to potential wrongdoers? >> we don't know. i mean, you're basically asking the question in polygraph terms, could you develop countermeasures? >> right, right. >> a proposal i put in to the government to find out, i mean, i have reason to believe that the chinese know the answer because they were sending me questions that you would want to prepare on if you were going to do a training study to see whether you could inhibit people from showing not just microexpressions, but there are dozens of items on that checklist. our government has not decided that it's worth finding out whether you can beat the system. other governments are finding out and may be selecting people who can and training them so they can. we just don't know. we know about the polygraph, we know countermeasures are quite successful. we know about some verbal means. we know they're quite
successful. if i could have a moment more, sir, you've heard some complete contradictions between dr. hartwig and myself. i think if you look carefully at the literature, you would find that it comes out supporting me, but how can you know? what i think you need to do when you get a disagreement among scientists is establish an advisory panel of experts who have no vested interest and no connections, to hear from the people who disagree and look at the literature and resolve it, because you're really being given advice in this testimony that's 180 degrees opposite in terms of whether there is a scientific basis for what's being done. but you could argue, and i don't know whether dr. willis would agree, that if this
validity study holds up to scientific scrutiny, to everyone who's looked at it in this committee, if it's as successful as the report says, you've got to be doing something right to get that kind of success. so maybe it's just of scientific interest to find out. >> thank you, dr. ekman. mr. lord is chomping at the bit here. >> i'd like to respond to dr. ekman's point. in fact, that was the key recommendation of our may 2010 report, to have an independent panel review the results of this current validation effort. we think it's very important for a panel to be established that has no ties to the current program, that's not an advocate of the current program, to help weigh in on this very issue. i think it's very interesting that the panel shows a lack of consensus, which was the basic point i made in my earlier statement. there's no scientific -- >> well, a subject like this
you'd expect to be a broad range of disagreements. has the panel, like what you're recommending, been suggested in one of the budgets or lined out somewhere, or is something -- >> yeah. dhs agreed to establish an independent panel to review the methodology as well as to review the final results, but as mr. willis indicated, the final results of this latest validation effort have only recently been submitted, i believe he said as of last night. >> i think i've run out of time, so i'm going to yield back. >> mr. hultgren, five minutes. >> thank you. i thank you all for being here. i share the frustration with some of the others that tsa is not here today. i'm a new member here in congress along with quite a few others and so have been traveling much more in the last three months than i've ever traveled in my life. just on monday i had my first experience of the full treatment by tsa out of o'hare, and it was
interesting. i didn't realize it involved turning your head and coughing, but i now realize that's what it is. [laughter] it's important to have these discussions to protect our liberty and freedom while at the same time making sure we have security. so i do thank you for your role. i think what i'm learning is we've got a lot more work to do and a lot more discussion that needs to take place. i just have a couple questions. dr. rubin, if i could address my questions to you, if that'd be all right. much has been made about the science and research behind the ability of an individual, in this case a bdo, to detect emotion, to see the intent of another individual based on a combination of verbal and nonverbal and microfacial expressions. i wondered, speaking broadly and keeping it as simple as you can for us laymen, could you just tell us the state of the science as it relates to the detection of emotion, deceit and intent by behavioral cues? >> yes.
in general, i guess i would agree with dr. ekman in the sense that we are at the point where there's two things going on. most of the studies, if you look at something like voice stress analysis and look at the metaanalysis done by susan brandon coming out of the defense department, what you basically see in most of these studies is that it's no different than chance. and agreeing with both dr. hartwig and dr. ekman, there's a lot of controversy here, and there's very little real science and validation. and it's not just about the field evaluation when you can't do it. again, there has been a committee established on the s.p.o.t. program regarding the report. i'm on that committee. and we have not been asked to do any scientific validation overall of the program, just to look at one particular thing; are the results different than chance? so i'm agreeing here that's what
really -- that what's really needed on these issues before we continue to invest more money is to really establish without, you know, putting any information at risk and stuff like that, establish a baseline about what's double, what's -- doable, what's known and what's not. this is the classic issue of do you test first and then field it? or field it and test? in this particular instance considering the investment and the intrusion on people's privacy, i think it's absolutely time to be testing, validating and scientifically exploring these things now before we continue to do significant investment. i'm not saying we shouldn't continue the program, i think it's got its importance, but right now we need to establish on some of the known kind of things that we're doing without giving anything away is there good science behind it? otherwise we're throwing money down the drain. >> i think kind of following up on that one of the concerns that operators have is that behavioral science is not dismissed because there are
issues dealing with the validation of specific cues. can you speak for a moment on the importance of behavioral science in a counterterrorism context? and then what its limitations are, what its strengths are as far as our work for counterterrorism? >> okay. we're changing the topic a little bit because we're moving to counterterrorism. i think that the behavioral work in counterterrorism is broad. i think it's extremely important. again, when we get to counterterrorism, we're broadening the argument out because you get to analysts. there's been an excellent report by a committee chaired by baruch fischhoff. we touched on some of this, and for a number of the panelists, you're starting to get involved in behavioral issues of attitude, of biases, you know, stemming from the original intelligence work of richards heuer. there's a lot that we know. the issue becomes structural and
organizational. given what we know -- two things: what do we know and what don't we know? with the stuff that we do know, how do we make sure it's being most effectively used by the intelligence community and by whoever else needs to use it? on those issues where we're not entirely clear, where things are uncertain or controversial, how can we move ahead? and then there are emerging technologies we're going to start seeing used -- we see some of them in things like x-ray, neuroimaging, remote imaging. that's where i was speaking of the seduction of technology. i support that stuff greatly, but we need to make sure on stuff that's new and emerging that we also get a handle on it. so i think the behavioral stuff is growing rapidly and extremely important, but i think that we're taking no comprehensive approach and, essentially, deploying it in the field before
it's being appropriately evaluated. >> i see my time is up. i do want to thank you all for being here. i do think this is the start of a discussion we need to continue. i also would ask for any advice on any microfacial expressions i might have so i don't have to go through that examination again. that would be helpful. thank you. [laughter] >> thank you, mr. hultgren. i ask unanimous consent that the gentleman from florida, mr. mica, be allowed to sit on the dais with the committee and participate in the hearing. hearing no objection, so ordered. you're recognized for five minutes. >> well, thank you. first of all, thank you, mr. chairman and ranking member edwards and other members of the panel. i have a great interest in the subject that you have before you, as you may know. i was involved in the creation of tsa when i chaired the
aviation subcommittee in 2001 and for some six years after that, and watched its evolution. first, i might say that i'm absolutely distraught that your subcommittee would be denied by tsa the opportunity for them to be here and possibly learn something, um, or participate. i don't want you to feel like they're just ignoring you. they've ignored our committee and others, so they have a history of this, and i will work with you and others. in fact, i think we need to convene a panel of chairs of various committees and somehow rein this agency in. it has an important mission.
i'm just stunned, again, that they would not have someone here at least to hear from the excellent panel of witnesses you've had today, particularly when they come and ask for more money. let me just tell you my involvement with the spot program. again, as chair of the committee that created it, i followed tsa in its successes and failures, and we've deployed a lot of expensive technology out there and, unfortunately, the technology does not do a very good job, and the personnel failure/performance rate is just off the charts. if you haven't had a classified briefing on the latest technology -- both the back scatter and the millimeter wave -- i urge you to do that. gao reviewed it in december of last year, and then the patdown, which was sort of their back-up, new procedure which they put in place at the end of last year -- i had that reviewed by gao in january. but that failure rate is totally unacceptable. the way we got started on spot is i found the technology lacking, and reports of performance both by screeners and the equipment they used as leaving us vulnerable, particularly after the chechen bombers. they didn't work, and they promised me they would work, but they didn't work. so we need something in place looking at the israeli model, but you can't really adopt the israeli model because they have a much smaller amount of traffic. we have two-thirds to
three-quarters of all the passenger traffic in the world, and that's part of america, you know? you get on a plane, you go where you want. people just have a magic carpet through aviation in this country. so that's how we started this. i've observed their operations, and i can't evaluate 'em. we had gao evaluate them, and you have some representatives here to tell you that the failure rate is unacceptable. it's almost a total failure. if it wasn't for the money and personnel, it maybe wouldn't matter, but they've got 3,300 spot officers, i believe, in the program, and they've got a quarter of a billion dollars in expenditures and they're asking for more. what i heard today is, again, it
doesn't work. i didn't actually get to hear all the suggestions -- i had to leave before i heard all of it -- but some of the suggestions, like taking the time to do a verbal interview, would improve it. but maybe finding some way to get us to a number where we could have some exchange. ms. edwards made some excellent points in her opening comments, too, that we've got to have some way to improve this, and that unless there is some verbal exchange, i think that with a standoff observation we're wasting time, money and resources. so i don't have a specific recommendation for the replacement. i do know what's in place does not work, but i can't tell you how much i appreciate your
subcommittee taking time to review this matter and try to seek a better approach, better science and better application of something that's so important. because we are at risk. these people are determined to take us out. i just came from another meeting with folks that developed both the back scatter and millimeter wave, which are two technologies we're using. the scary thing there is we had witnesses in one of the other hearings who said that both of those technologies will not be able to detect either body cavity or surgical implants. and we already see that these guys are always going one step ahead of whatever we put in place. so we've got a failed system, and we're spending a lot of money on it. it's supposed to provide us with a backup, but the information we
have and the review of the performance shows that it's not doing that, and it needs to be replaced or dramatically revised if it's going to be effective in keeping us from this next set of threats. so those are my comments. i would ask that if you have suggestions, we do have an faa bill in which we can include some positive suggestions. we couldn't do that on the house side because of jurisdiction, but we can do it in conference, and the door's already been opened by the senate. and i would love to hear recommendations from you and from those who participated today on how we can do it better. so thank you for allowing me to participate. >> well, thank you, chairman mica. i appreciate your being here and appreciate your comments. i can speak for ms. edwards; we both are very concerned about national security.
we both are concerned about civil liberties. we're both concerned that we make sure that the flying public is safe, and i appreciate her input. and i hope that you'll find some way that maybe we'll have those subjects that we can -- [laughter] put in a study so that maybe some kind of behavioral science can be developed to try to identify these folks. we'll go to our next round of questioning. so i'll recognize myself for five minutes for questioning. even if spot is more than nine times more effective than random, we still are talking about very low base rates. lieutenant didomenica states in his testimony that the base rate for terrorism is .0000006 -- i hope i didn't get too many zeros and did not leave out one.
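to put the base rate the chair reads into perspective, here is a quick bayes' rule sketch of why screening at such rates is dominated by false alarms. the 99 percent sensitivity and specificity figures are illustrative assumptions, not numbers from the testimony; only the base rate comes from the record:

```python
# base-rate effect: even a highly accurate behavioral screen
# produces almost entirely false alarms at this base rate.
# the 99% sensitivity/specificity figures are assumed for
# illustration; only the base rate comes from the testimony.
base_rate = 0.0000006   # p(threat) for a given passenger, as read into the record
sensitivity = 0.99      # assumed: p(flagged | threat)
specificity = 0.99      # assumed: p(not flagged | no threat)

p_flag = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
ppv = sensitivity * base_rate / p_flag  # p(threat | flagged), by bayes' rule
print(f"p(threat | flagged) = {ppv:.6%}")  # well under one in 10,000 flags
```

this is the standard base-rate argument: at a base rate this low, almost every flag is a false positive no matter how good the indicator list is, which is why the witnesses turn to proxy measures.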
can any panelist help put that into perspective? anybody? mr. lord? >> sure. as that statistic implies, acts of terrorism are very rare events. obviously, that makes it very difficult to test the efficacy of the program and to develop, as we recommended in our report, performance metrics to allow you to better judge whether the program works as designed. but we don't think that should deter you from trying to craft what we would call proxy measures, other measures that help you get at this at least indirectly. we made that very important recommendation, and tsa and dhs agreed to try to, you know, develop these indicators. one step we think they could take that would make this exercise a lot more useful relates to the very long list of behaviors they use; the
exact number and characteristics are considered sensitive security information, but we posed the question: how do you know this is the right number? and they also assign point scores to each of these behaviors. again, the details are sensitive security information, but that would be one way, we think, to make the program more useful in identifying potential acts of terrorism: validate the point system, scrub the list of behaviors, cull the list and try to come up with something that's more related to an eventual arrest or a hostile act. and there are ways to do that statistically. >> thank you, mr. lord. anybody else? oh, mr. willis, yes. >> thank you, mr. chairman. so, um, first off, proxy measures are a standard part of research, especially in the area of terrorism, because, again, there are no direct measures in
sufficient quantities, typically, to use for terrorism. criminal activity is often used as a proxy measure; it's an accepted practice. when one is looking for acts of terrorism in a lot of transit areas, you're looking at somebody who's coming in trying to use some false identification, or you're looking for somebody who's smuggling. and both of these things are represented in higher numbers, even though they're still low base rate numbers, in criminal activity. so that's why that's typically used by other organizations as a proxy measure. so i wanted to make sure that we were comfortable that we had given forethought to that and used what is a best practice for proxy measures, sir. >> doctor? >> there are a number of organizations. i work with airport security in
england. i've seen the videos of the bombers' group before they bombed. i worked in israel where, of course, they do a lot of security. but even within our own government, among the different parts of dod that deal with counterterrorism and the attempts to identify terrorists in field military situations, there's no sharing of information. there's a lot of information out there that hasn't been brought together. it's sensitive, but it needs to be brought together, and then with that database, take a look at what's on the spot list. i haven't seen what's on the spot list for four years. so i don't know how it's changed, and i don't know how it's been informed by research findings from our group and other groups and from observations by special forces,
by our counterintelligence, by nypd counterintelligence. there's a lot of information in this country in separate little pockets that hasn't been brought together. >> thank you. my time for questions has expired. i now recognize ms. edwards, the ranking member, for five minutes. >> thank you, mr. chairman. i want to go to a question that was raised by mr. mica's comments when he was here, and i just wanted to be clear that from the perspective of gao and the report and analysis that you've done, mr. lord, we don't yet know if the spot program is, quote-unquote, a fiasco. isn't that correct? >> that's absolutely correct. those were his words. that's not in our vocabulary. thank you. >> and just to be clear again, what metrics would you use to determine its success or failure as an
operational program? >> well, one metric, besides scrubbing the current list of behavioral indicators they're honing in on as well as the associated point scores: we recommended, since we've identified several instances of terrorists transiting through the u.s. system, studying the videotapes of their movement. are they, in fact, exhibiting signs of stress? or, as some literature suggests, do they not typically emote much because they believe they're going on to a more blissful state? so it's unclear to us at this juncture whether there would be discernible signs of stress or fear. but there's videotape evidence that would allow you to get at that, and we think that would be invaluable in fine-tuning the program. >> yeah. i think i highlighted that in your testimony because there are a number of examples that we have, and i wonder, mr. willis, has dhs made an attempt to pull together not just video evidence here in the united states, but
with our international partners to do some kind of an assessment, stacked up against the screening techniques that have been identified, to see whether we're on target? it's an awful lot of money to spend without, you know, putting it up against real-time data. >> thank you. again, i represent dhs science and technology, not the operational community. from a -- >> this is a science question. >> yes. from a science and technology perspective, we are attempting to locate video of terrorist threats in other countries as well as within the u.s., and it is very difficult to try to get access to that information, or to successfully get access to that video. and so if -- >> well, part of the reason that we pulled dhs together is because it's, you know, a collection of all of our, you
know, sort of security, um, and investigative interests under one house to work with our international partners. so it's a bit staggering to me to know that you've not had the capacity in, now, a decade to look at video and use it to make an analysis about whether the techniques that you seem to be employing would be successful. i mean, that seems to me kind of a basic scientific question; dhs should be in a position, with our partners internationally and here in the united states, to get that video and, you know, conduct some real scientific analysis of it. so i would urge dhs to consider that. i want to go to dr. hartwig for a minute, because in your testimony you indicated that there are some other recommendations that you might make, and i wonder if you could
just describe those to us very briefly, because i don't think you had an opportunity earlier in your testimony. >> right. i, um, i think it's roughly captured by what mr. mica said before he left: that it's important to engage a person in conversation to elicit cues to deception. overall, the research shows that statements carry some cues to deception, and also there's an emerging wave of new research that focuses on how to elicit cues to deception, because there's such an abundance of research showing that people don't just automatically leak. so my basic answer is that some form of questioning protocol, some kind of brief interview protocol that's based on the scientific research on how to elicit cues to deception, how to ask questions so that the
liars and truth tellers respond differently, i think that would be a worthwhile enterprise. >> so you're not really saying -- and this is a yes or no -- scrap the program, but you are saying that there are areas where we need to significantly improve the techniques that we're using to take us down a track of being able to identify potential terrorists. >> yes. i think if effort were spent on the questioning part of the program, that would put it much more in line with the scientific research. >> thank you. thank you, mr. chairman. >> thank you, ms. edwards. we've been joined by the congresswoman from florida, ms. adams. you're recognized for five minutes. >> thank you, mr. chair. mr. willis, earlier you said that there had been 71,000 referrals, and you made a distinction about the behavior leading to arrest. how many of those were arrested? >> of the 71,000?
>> yes. >> um, that's the random selection method. >> correct. >> of that 71,000 referred in the random selection, nine arrests were made. >> nine? >> yes. >> and then the other method? >> using spot? 23,000 and a little bit were referred and 151 were arrested. >> and the types of arrests? >> um, i don't have the nature of the arrests in the data that we looked at. >> so it could have been belligerence or anything of that nature. >> some of them were for prohibited items that were on them at the time, others could have been through outstanding warrants or something of that nature, ma'am. >> do you think that i have an appearance or would i be a target for spot? i mean, every time i go through the airport, i get pulled aside and searched. and the reason i ask that is because, you know, being a past law enforcement officer and
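the referral and arrest figures quoted in this exchange can be turned into rates with a few lines of arithmetic; a minimal sketch (the 23,000 spot referral count is approximate, as mr. willis says, and the arrests are of any kind, not terrorism):

```python
# arrest rates implied by the figures quoted in testimony.
# the 23,000 spot referral count is approximate, as stated,
# and the arrest counts cover any offense, not terrorism.
random_referrals, random_arrests = 71_000, 9
spot_referrals, spot_arrests = 23_000, 151

random_rate = random_arrests / random_referrals  # share of random referrals arrested
spot_rate = spot_arrests / spot_referrals        # share of spot referrals arrested

print(f"random: {random_rate:.4%} of referrals led to arrest")
print(f"spot:   {spot_rate:.4%} of referrals led to arrest")
print(f"spot referrals led to arrest roughly {spot_rate / random_rate:.0f}x as often")
```

note that this ratio speaks only to arrests generally; as the witnesses say, the nature of the arrests is not in the data.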
trained, i have some concerns about the way you identify and pull people aside. dr. hartwig, you said you thought the program would work if more tools were available. would it be better to use a validated system as opposed to one that's untested and unvalidated? >> well, first of all, i didn't say that the program would work. i was talking about where i think more emphasis should be put. >> so even with the more emphasis, do you believe that it would work? >> um, i don't know. i think we would need a properly-conducted study to find that out. and i think it would be important to go beyond examining the arrest rates and to look at what are the actual behaviors that are displayed by these people who are arrested and to compare those behaviors with those that are in this list of cues. i don't know what those
cues are because the list is not available. and to look at: are the spot criteria actual indicators? so i think we definitely need to know whether it works or not. >> mr. didomenica, you're a law enforcement officer, i'm a past law enforcement officer. do you believe that the tsa employees have enough training and the skill sets, based on the training they're receiving, to, you know, provide this type of screening? at this level? >> i think with a proper follow-up by trained law enforcement they do. but we need the proper follow-up by the police officers to figure out what's going on, because this is just like an alarm: it's like going through the magnetometer and it beeps.
someone comes over and pats you down. the police follow-up is like the patdown: all right, why did this beep? if you have that level of follow-up by trained law enforcement, i'm comfortable with the training they receive. but without that level of follow-up, i'm not comfortable. >> so would it be your opinion that there needs to be more training? >> yes. >> i yield back. >> thank you, ms. adams. mr. willis, i've got another question for you. does tsa plan to use r&d to improve the spot program, or does it believe the program cannot be improved upon? >> we have -- [laughter] we do have some ongoing research with them, and if i may say, this is one of the beginning research elements that we have with tsa, sir. and, in fact, it started in
2007, prior to gao's interest. its focus is specific; it is not to evaluate absolutely everything going on with spot. that's a huge task which we are not tasked or resourced to do. this is looking at the indicators, the checklist itself, the existing checklist. the first question that needs to be asked from a scientific perspective is: does the checklist, as it's currently put together and as it's currently deployed, accomplish its mission? you would like to be able to compare that against random and against something else that's out there that's shown to be valid. the fact is there isn't another behavioral-based screening program out there, employed by any other group we're aware of either in the u.s. or abroad, that has been statistically validated. so we've not been able to do that, so we compared this against random, which is a
scientific basis. >> so tsa is doing research? >> we are doing research that supports tsa. >> ms. edwards, do you have another question? >> i do, thank you, mr. chairman. i just want to follow up with you, mr. willis, because i'm confused. my understanding is that you shared with our staff that there is a pool of video available of suicide bombers and the like that could be used for study, and, i mean, i would expect that if tsa were operating in the right kind of way, that would also be used for training. and so i'm a little confused by your answer, and i just want to be clear. do we have video both from ourselves and, perhaps, from our international partners that we could use to assess the techniques that have been
developed and the questions, the assessment questions that have been developed, so that we can make sure we have a program that is working as effectively as we know it can work? >> we don't presently have a sufficient number of videos to conduct scientific analysis on. we are attempting to work with our partners in the u.s. and internationally, but being a research organization, we do not have the ability to compel national organizations to provide us with that video. what we are doing is attempting to continue to collect that as best we can, as well as to conduct other kinds of supporting work, such as interviews of direct eyewitnesses to suicide bombings and international subject matter experts in the area, to go beyond what the current validation study was, which is of
the existing indicators, to try to help establish from a scientific perspective what is being used operationally abroad and, in fact, what is being witnessed by, again, eyewitnesses and subject matter experts, so that we may be able to then bring that information back and test it. >> is s&t doing that or tsa? >> that's s&t research. >> okay. so i guess for our doctors, hartwig and ekman, it would be useful, wouldn't it, to have a pool, a real data pool, to be able to assess that and develop a research protocol that enabled us to stack our assessment tools against that? my question, though, for mr. willis is: what agency do you think would be the responsible one to get this pool together? is it dhs? is it tsa?
mr. lord? >> i don't know what the right organization for that -- >> in our report we made 11 recommendations. one of the recommendations was to use and study available video recordings to help refine the spot program. in their formal agency comments, the department indicated they agreed, and they were taking steps to do that. so i think the department's already on record as saying they agreed, it's a good idea, we're going to do it. so, i mean, they've bought into this idea. to the extent they've actually implemented it, we'll have to follow up and see the extent to which they've addressed it. but just to clarify, dhs has bought into this idea. they've already agreed to do it. >> thank you. and then finally, mr. lord, since you already have the microphone: dhs hasn't done a cost benefit analysis on the program or a risk assessment, and it's my understanding that they don't do a great job, actually -- and i apologize for the critique -- of either conducting cost
benefit analyses or risk assessments for many of the programs. how do we know if we even need the program? >> well, typically as part of our analysis we would look at the cost benefit analysis or the risk assessment to study, number one, how they decided where to deploy the program -- you need a risk assessment, we would assume, to show where you needed to deploy it. it's at 161 airports, so our question was, how did you establish this number? did you have a risk assessment? and the answer was no. they're in the process of ramping up the program now. every year, you know, funding has increased. we assume that would be justified by a cost benefit analysis. they don't have one yet, although to their credit they've agreed to complete both a risk assessment and a cost benefit analysis. but traditionally we would expect to find those early, at program inception, not four or five years after you deployed a program. >> well,
thank you all for your testimony and, mr. chairman, i would just say for the record it'd be good to get a cost benefit analysis and risk assessment before we spend another, you know, $20 million on the program. thank you very much. >> i agree with ms. edwards. ms. adams, you're recognized. >> thank you, mr. chair. the program, mr. willis, has been ongoing since 2007, is that what i heard? >> the validation research study has been ongoing since 2007. >> the validation research study since 2007, and i heard you say there was no system out there that you could use that was validated or available, is that correct? >> we're unaware of any behavioral-based screening program in use that has been rigorously validated, yes. >> what about israel's program? >> um, we have not located any
study that rigorously tests that. >> did they study it? >> we're not provided any information -- >> did you ask? >> yes. >> and they have said they would not provide it? >> um, we've not been -- they didn't say they wouldn't provide it. >> okay. so it's maybe the way you asked for it? maybe? um, i'm trying to determine: since '07 you've been doing a study, we don't have anything validated, and you can't give us a cost benefit analysis. we're four years out, and when you say there are no other programs out there -- there are some out there, i believe. mr. didomenica, are there programs out there? >> there are similar programs -- oh, excuse me. there are similar programs for behavior assessment, principally for law enforcement. i've been teaching bass, and there's a dhs program -- it's
approved by dhs -- called patriot. i have another training course called hide, but these programs give maybe a few days of training, and then people go off and do their thing. there's no follow up on how successful it is. i think people are getting good ideas and techniques, but it's not done in a way that can be measured and followed up on, and i think that needs to be done. >> and these programs are all from dhs also? >> there's one that's approved -- it's approved for funding -- but they're not dhs programs. >> okay. so they're funded, but they're trained and then they're kind of sent out, and there's no true follow up, is that what you're saying? >> yeah. there's no collection of data about successes or failures or effectiveness. it's like a lot of law enforcement training -- you're probably aware of this -- you go for a class, you sit there for a week, you get a certificate, you walk out the
door, and that's the end of it. unfortunately, that just falls in line with a lot of the training that's done. i think for this program, you know, for what's at stake, we need to be better at how we follow up on this. >> i know with my certificate, we had to go back for training every so often or else we lost our certificate, so i can relate to having to keep your training and your skills honed. i appreciate that. no more questions, mr. chair. >> thank you, ms. adams. i want to thank the witnesses for being here today. i appreciate y'all's testimony, and i appreciate the members and all the questions that we've had. this is a very interesting topic. i'm, again, very disappointed that tsa has refused to come, because there are a lot of questions that i know ms. edwards and i both would have liked to have asked tsa if they had graced us with their presence, and hopefully, we don't have to go down the road of requiring them to be here in
the future. but we will look into that, and they will be here at some point, i hope, voluntarily. i hope you'll pass that along to the folks that are in the position to make that decision. members of the subcommittee may have additional questions for the witnesses, and we ask that y'all respond to those in writing. the record will remain open for two weeks for additional comment by members. the witnesses are excused, and the hearing is now adjourned. [inaudible conversations]