

AD-A105 056
UNCLASSIFIED

AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH SCHOOL--ETC F/G 5/10
THE SOURCE SELECTION DECISION PROCESS IN AERONAUTICAL SYSTEMS D--ETC(U)
JUN 81  C V BARCLAY, J E NIDO

AFIT-LSSR 12-81  NL













































The contents of the document are technically accurate, and
no sensitive items, detrimental ideas, or deleterious
information are contained therein. Furthermore, the views
expressed in the document are those of the author(s) and do
not necessarily reflect the views of the School of Systems
and Logistics, the Air University, the Air Training Command,
the United States Air Force, or the Department of Defense.




AFIT Control Number LSSR 12-81 


AFIT RESEARCH ASSESSMENT 

The purpose of this questionnaire is to determine the potential for current
and future applications of AFIT thesis research. Please return completed
questionnaires to: AFIT/LSH, Wright-Patterson AFB, Ohio 45433.

1. Did this research contribute to a current Air Force project? 

a. Yes b. No 

2. Do you believe this research topic is significant enough that it would 
have been researched (or contracted) by your organization or another agency 
if AFIT had not researched it? 

a. Yes b. No 

3. The benefits of AFIT research can often be expressed by the equivalent 
value that your agency received by virtue of AFIT performing the research. 
Can you estimate what this research would have cost if it had been 
accomplished under contract or if it had been done in-house in terms of 
manpower and/or dollars?

a. Man-years _ $ (Contract). 

b. Man-years _ $ ______ (In-house). 

4. Often it is not possible to attach equivalent dollar values to research, 
although the results of the research may, in fact, be important. Whether 

or not you were able to establish an equivalent value for this research 
(3 above), what is your estimate of its significance? 

a. Highly Significant    b. Significant    c. Slightly Significant    d. Of No Significance

5. Comments: 


Name and Grade 


Position 


Organization 


Location 








FOLD DOWN ON OUTSIDE - SEAL WITH TAPE



AFIT/DAA 

Wright-Patterson AFB OH 45433



































SECURITY CLASSIFICATION OF THIS PAGE (When Data Entered)

REPORT DOCUMENTATION PAGE

READ INSTRUCTIONS
BEFORE COMPLETING FORM

1. REPORT NUMBER: AFIT-LSSR 12-81

3. RECIPIENT'S CATALOG NUMBER:

4. TITLE (and Subtitle):
THE SOURCE SELECTION DECISION PROCESS
IN AERONAUTICAL SYSTEMS DIVISION

5. TYPE OF REPORT & PERIOD COVERED: Master's Thesis

6. PERFORMING ORG. REPORT NUMBER:

7. AUTHOR(S):
Colin V. Barclay, Australian DOD
Jose E. Nido, Captain, USAF

8. CONTRACT OR GRANT NUMBER(S):

9. PERFORMING ORGANIZATION NAME AND ADDRESS:
School of Systems and Logistics
Air Force Institute of Technology, WPAFB OH

11. CONTROLLING OFFICE NAME AND ADDRESS:
Department of Communication and Humanities
AFIT/LSH, WPAFB OH 45433

12. REPORT DATE: June 1981

14. MONITORING AGENCY NAME & ADDRESS (if different from Controlling Office):

15. SECURITY CLASS. (of this report): UNCLASSIFIED

15a. DECLASSIFICATION/DOWNGRADING SCHEDULE:

16. DISTRIBUTION STATEMENT (of this Report):
Approved for public release; distribution unlimited

17. DISTRIBUTION STATEMENT (of the abstract entered in Block 20, if different from Report):

18. SUPPLEMENTARY NOTES:
[Approved-for-public-release stamp, Air Force Institute of Technology, Wright-Patterson AFB, OH 45433; remainder illegible]

19. KEY WORDS (Continue on reverse side if necessary and identify by block number):
Source Selection
Proposal Evaluation
Procurement
Multi-Attribute Decision Making
Systems Acquisition

20. ABSTRACT (Continue on reverse side if necessary and identify by block number):
Thesis Chairman: Jack L. McChesney, Lieutenant Colonel, USAF

DD FORM 1473, 1 JAN 73 — EDITION OF 1 NOV 65 IS OBSOLETE

SECURITY CLASSIFICATION OF THIS PAGE (When Data Entered)























This research is concerned with identifying a model of the source
selection process as used in Aeronautical Systems Division, Air
Force Systems Command (ASD), and evaluating the strengths and
weaknesses of the process in relation to stated Department of
Defense and Air Force objectives. Information was gathered from
a review of past source selection cases and a series of interviews
with ASD source selection personnel. A computer model was
constructed to simulate the effects of the decision-forming
techniques observed on the possible outcomes of source selections.
A resulting descriptive model provides a basis for better
understanding of the quality of decision information provided
by the process and forms a framework for improving the source
selection process.














THE SOURCE SELECTION DECISION PROCESS 


IN AERONAUTICAL SYSTEMS DIVISION 


A Thesis 

Presented to the Faculty of the School of Systems and Logistics 
of the Air Force Institute of Technology 
Air University 

In Partial Fulfillment of the Requirements for the 
Degree of Master of Science in Logistics Management 


By 

Colin V. Barclay, BTech, GradDipAdmin    Jose E. Nido, BS

Australian DOD                           Captain, USAF


June 1981 


Approved for public release; 
distribution unlimited 






This thesis, written by

Mr. Colin V. Barclay

and

Captain Jose E. Nido

has been accepted by the undersigned on behalf of the 
Faculty of the School of Systems and Logistics in partial 
fulfillment of the requirements for the degree of 

MASTER OF SCIENCE IN LOGISTICS MANAGEMENT

DATE: 17 June 1981





COMMITTEE CHAIRMAN 


ii 


ACKNOWLEDGMENTS 


We wish to thank Mr. James Schaeffer of ASD/PM
for the benefit of his knowledge of source selection,
and for his assistance in obtaining access to the source
selection personnel and records necessary to be able to
conduct this research. Thanks also are due to Mr. James
Helmig and the staff of the ASD Source Selection Center
for their cooperation and patience in making their
records available to us. We are also deeply grateful
to the many busy people in ASD source selection activities
who gave generously of their time and accumulated
knowledge.

Finally, our thesis advisor, Lt Col Jack McChesney,
was an inspiration through his insight into the problems
we faced, his helpful criticisms, and his encouragement.
For this we thank him.


iii



TABLE OF CONTENTS 


Page 

ACKNOWLEDGMENTS ................................. iii

LIST OF TABLES .................................. vii

LIST OF FIGURES ................................. viii

CHAPTER

I. SOURCE SELECTION ............................. 1

The Source Selection Decision Problem ........... 1

Source Selection Decision-Making ................ 1

Decision-Making Problems ........................ 3

The Research Need ............................... 4

Literature Review ............................... 4

Policy and Procedural Background ................ 4

Purpose of Source Selection Procedures .......... 10

Theoretical Background .......................... 11

Some Recent Propositions ........................ 16

Practical Considerations ........................ 17

II. RESEARCH APPROACH ........................... 20

Research Objectives ............................. 20

Scope of Research ............................... 20

Research Question ............................... 21

Research Methodology ............................ 21

Discussion ...................................... 21

Research Hypothesis ............................. 26

iv



















CHAPTER                                         Page

III. ANALYSIS AND MODELING ...................... 29

Review and Analysis of Cases .................... 29

Results of Regression Analysis .................. 30

Modeling the Value-Building Process ............. 32

Computer Model .................................. 33

Assumptions of the Model ........................ 34

Parameters in the Model ......................... 35

Goodness Level .................................. 35

Weighting Coefficients (Bi) ..................... 37

Introduction of Bi to the Model ................. 38

The Computer Program ............................ 39

Analysis of Output of Computer Model ............ 41

Sample Size ..................................... 43

ANOVA Test Procedure ............................ 44

ONEWAY ANOVA Results ............................ 45

Difference Between Color-Scored
and Numerically-Scored Results (t-Test) ......... 49

Multiple ANOVA (MANOVA) ......................... 50

Summary of Results of Analyses
of Model Output ................................. 52

IV. INTERVIEWS WITH SOURCE SELECTION
PRACTITIONERS ................................... 55

Effectiveness and Efficiency of the
Process ......................................... 56

The Scoring Process ............................. 60

Contractor Inquiries and
Deficiency Reports .............................. 63

v

























Improvements Suggested by
Interviewees .................................... 65

Summary ......................................... 69

V. CONCLUSIONS AND RECOMMENDATIONS .............. 71

Source Selection Methodology .................... 71

Maturity of Concept ............................. 75

Weights of Attributes ........................... 76

Source Selection Resources ...................... 77

Management Style of the SSA ..................... 77

Choice of Scoring Method ........................ 79

Procedural Aspects of Source Selection .......... 81

Effectiveness and Efficiency .................... 82

Use of Scoring Techniques ....................... 83

Contractor Inquiries ............................ 84

Problems of Source Selection .................... 85

Recommendations ................................. 86

APPENDICES ...................................... 89

A. LISTING OF COMPUTER PROGRAM .................. 90

B. SUMMARY TABLES OF COMPUTER OUTPUT ............ 94

C. OUTPUTS OF MANOVA TESTS ...................... 105

D. GUIDE TO INTERVIEWS WITH SOURCE
SELECTION PRACTITIONERS ......................... 114

SELECTED BIBLIOGRAPHY ........................... 117

A. REFERENCES CITED ............................. 118

B. RELATED SOURCES .............................. 121



























LIST OF TABLES

TABLE                                           Page

I. GOODNESS LEVELS OF SIMULATED PROPOSALS ...... 36

II. POSSIBLE SETS OF VALUES OF WEIGHTING
COEFFICIENTS .................................... 38

III. RESULTS OF ONEWAY ANOVA TESTS .............. 47

IV. TABLE SHOWING HOMOGENEOUS SUBSETS
OF TREATMENTS ................................... 48

V. T-TEST OF RESULTS USING NUMBER AND
COLOR SCORE METHODS ............................. 51










LIST OF FIGURES

Figure                                          Page

1. Source Selection Value Hierarchy ............. 23

viii






CHAPTER I 


SOURCE SELECTION 

The Source Selection Decision Problem 

Air Force system acquisition projects involve
contracts at stages throughout the project life for the
procurement of services and equipment toward fulfilling the
system acquisition objectives. Typically, each procurement
involves the solicitation of offers followed by an
evaluation process which produces information from which a
choice is made to determine the contract award.

Source Selection Decision-Making 

The evaluation leading to the award decision
(source selection) is the composite product of the results
of independent expert assessments of a variety of aspects
of the offers under consideration. While these aspects
may cover a wide range of criteria, they can be conveniently
grouped into five major areas: (1) technical, (2) operational,
(3) logistics, (4) management, and (5) cost. Each
area may in turn be broken out into more specific items
which themselves may shred out into more discrete segments
called "factors" (17:pp.3-2,3-3). The regulatory documents
which are reviewed in this study give broad guidance on










the use of subjective and objective methods of integrating
expert assessments of the separate aspects of offers into
consolidated evaluation reports to provide a basis for the
source selection decision. The evaluation should provide
a balanced appraisal of all significant factors with a
high level of quality and consistency to facilitate an
objective, impartial, equitable, and economic comparative
analysis of competing offers.

Determining an integrated evaluation of an offer
from factor assessments involves two processes: (1) scoring
and (2) weighting.

(1) Scoring is the allocation of a comparative
"value" to each factor being assessed. The
value may be expressed by allocating a
numerical score, by color coding, by ranking,
by narrative, or by a combination of these
methods.

(2) Weighting is the process of giving to the
score value of each factor a coefficient
which reflects the relative importance of
the factor in the final evaluation. In
practice, weighting may be done by objectively
allocating numerical coefficients or by
subjective comparisons.
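As a concrete illustration of how scoring and weighting combine, the sketch below computes a weighted-sum evaluation for one offer. The five factor areas follow the grouping listed above, but the scores, weights, and 0-to-10 scale are invented for the example; they are not taken from AFR 70-15 or any actual source selection.

```python
# Hypothetical sketch of combining factor scores with weighting
# coefficients. All scores (0-10) and weights are invented.

def weighted_evaluation(scores, weights):
    """Return the weighted sum of factor scores.

    scores  -- dict mapping factor name to its comparative score
    weights -- dict mapping factor name to its relative-importance
               coefficient (assumed here to sum to 1.0)
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[f] * scores[f] for f in scores)

offer_a = {"technical": 8, "operational": 7, "logistics": 6,
           "management": 7, "cost": 9}
weights = {"technical": 0.30, "operational": 0.20, "logistics": 0.15,
           "management": 0.15, "cost": 0.20}

print(weighted_evaluation(offer_a, weights))  # 7.55
```

The same structure underlies both numerical and color-coded schemes; in the latter, the "scores" and "weights" are implicit in the evaluators' subjective comparisons rather than explicit coefficients.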


2 





As the foregoing suggests, a number of empirical
models have been developed as a guide in making evaluations
for source selection decision-making. Because of the necessarily
large number of participants in factor assessment
and the variety of integrating techniques available, it is
difficult to justify that current procedures provide the
required objectives of quality and consistency in Air Force
source selections.

Decision-Making Problems

A recent study (10:49) of source selection procedures
in Aeronautical Systems Division (ASD) showed that
different groups of source selection evaluators rated the
same proposals differently. An examination (8:119) of
numerical scoring and weighting schemes concluded that
small relative changes in item weights and item scores
can overturn the order of evaluations when differences between
scores are small fractions of the scores. Awareness
of shortcomings in numerical rating schemes has led to a
"strong trend toward rating proposal elements using a
combination narrative and color coding system [7:pp.9,10]."
While this trend avoids the specific criticisms of numerical
rating schemes, there is no hard evidence to show
that more subjective methods than numerical weighting come
closer to obtaining consistent and accurate source selection
decisions.


3 






The Research Need 


There remains a need for a source selection evaluation
procedure which is capable of giving demonstrably
consistent results with different assessment groups.
Source selection from among complex competing offers is a
multi-attribute decision-making situation.

Some recent academic discussions of possible
applications of multi-attribute decision theory to military
logistics problems are discussed in the literature
review. It is believed that theory developments in this
field offer a base from which to examine an actual source
selection process to determine a source selection decision
maker's visibility of the assessment factors in relation
to the source selection criteria. A sufficiently rigorous
examination may provide new insights into source
selection that will enable the development of more effective
management of the process.


Literature Review


Policy and Procedural Background 

Department of Defense Directive (DODD) 4105.62 
(20:2) provides source selection policy and procedures for 
the acquisition of major defense systems, and states three 
primary objectives to be met as a result of the source 
selection process. 


4 







The prime objectives of the process are to
(a) select the source whose proposal has the highest
degree of realism and credibility and whose performance
is expected to best meet Government objectives
at an affordable cost; (b) assure impartial, equitable,
and comprehensive evaluation of competitors' proposals
and related capabilities; and (c) maximize efficiency
and minimize complexity of solicitation, evaluation
and the selection decision.

The major systems acquisition process is a complex
one. It consists of a sequence of specified phases of
program activity and decision events directed to the
achievement of program objectives in the acquisition of
defense systems. Each major weapon system acquisition
program has its unique features, and therefore, no two
programs are identical. If one were to compare various
programs, a number of differences would immediately surface
to include differences in time, cost, technology, management,
and contracting approach. Despite the differences,
however, the basic acquisition process is common to all
programs. As such, all programs are driven through the
process toward a common goal of obtaining for the Government
"the most advantageous contract--prices, quality and
other factors considered [18:p.1-302.2]."

DODD 4105.62 (20:2) provides guidance to achieve
this objective:

Each DOD Component shall develop, and consistently
apply, procedures which create the environment for an
impartial, balanced and realistic appraisal of all
proposals submitted.


5 






Air Force Regulation (AFR) 70-15, Source Selection
Policy and Procedures (17:p.1-1), establishes policy,
assigns authority and responsibilities, and prescribes
implementing procedures for source selection. It also
states the main objective of the source selection process:

The prime objective of proposal evaluation and
source selection is to assure impartial, equitable,
and comprehensive evaluation of competitive proposals
and to assure selection of that source whose proposal,
as submitted, offers optimum satisfaction of the
Government's objectives including cost, schedule, and
performance.

A typical source selection process is composed of
a structured organization which consists of a Source
Selection Authority (SSA), Source Selection Advisory
Council (SSAC), and a Source Selection Evaluation Board
(SSEB) (17:p.1-4). The source selection process itself is
initiated as a result of the submission and approval of a
Source Selection Plan. This plan, the key planning document
for the conduct of the source selection process, is
normally prepared by the project officer charged with
effecting the procurement of the system (17:p.2-1). The
Source Selection Plan usually includes, among other things,
"basic evaluation criteria to provide a basis for the more
detailed shredout by the SSAC and SSEB for use in the
solicitation," a description of the SSEB evaluation and
rating methodology and the SSAC analysis technique, and a
schedule of events, identifying and listing the source
6 




selection activities within a time framework (17:pp.2-1,
2-2).

The Contract Definitization Group (CDG) is a part
of the SSEB organization. Its role is to negotiate definitive
contracts with all offerors determined to be in
the competitive range. The CDG manages all communications
with the offerors and is advised by a Cost Panel. The
primary purpose of the Cost Panel is to provide an evaluation
of the most probable cost to the Government of each
offeror's proposal (1:16). While technical and cost
evaluations by different evaluators are held simultaneously,
they are kept apart to prevent the technical evaluation
from being biased by cost considerations.

After proposals are received, the evaluation period
commences with the SSEB examining and conducting:

. . . an in-depth review of the relative merits
of each proposal against the requirements in the
solicitation document and the evaluation criteria
established by the SSAC. The evaluation function
must be thoroughly conducted, objective, fair, and
economical [17:p.1-6].

A summary report of findings by the SSEB is then
prepared and submitted to the SSAC. This SSEB Evaluation
Report is basically a summary of the results obtained
after evaluating each proposal against the standard criteria
set forth by the SSAC (17:p.1-6).






AFR 70-15 (17:pp.1-5,1-6) establishes the SSAC's
duties and responsibilities. These include, among others:

(1) Establish the evaluation criteria, using the
general guidance set forth in the approved Source
Selection Plan.

(2) Establish the relative importance of the
evaluation criteria in a form for use in the solicitation
document.

(3) Establish the evaluation criteria weights
for SSAC use when numerical scoring techniques are
employed.

(4) Review the findings of the SSEB and, when
numerical scoring has been used, apply the established
weights to the evaluation results.

(5) Prepare the SSAC Analysis Report (comparative
analysis) based on the SSEB Evaluation Report.

Basically, this part of the process consists of a
review of the SSEB Evaluation Report by the SSAC, after
which an evaluation of proposals is again conducted against
the SSAC criteria. A Source Selection Advisory Council
Analysis Report is then submitted to the SSA. This comparative
analysis report consists of a "proposal versus
proposal" evaluation that should help the SSA make an
objective selection decision (17:pp.1-1,1-2,1-5,1-6).

The SSA is ultimately responsible for the proper
conduct of the proposal evaluation and source selection
process. Therefore, he should strive for a source selection
process that will provide him with the information
necessary to make the most objective selection decision

8 







possible:

The SSA must be presented sufficient in-depth
information on each of the competing offerors and
their proposals to make an objective selection
decision. The SSAC Analysis Report and oral briefing
should be presented to the SSA in a manner which
accomplishes this objective. The SSAC presents
findings and analyses but does not make recommendations
to the SSA unless specifically requested
[17:p.1-2].

In the final analysis, the degree of success that
the SSA will attain in making an objective decision will
depend on the extent to which a logical, consistent, and
systematic approach is established. AFR 70-15 (17:p.1-3)
provides guidance for the establishment of evaluation
criteria and rating systems to be used in evaluating
offerors' proposals:

The specific evaluation criteria must be included
in the solicitation document and enumerated in terms
of relative order of importance of those significant
factors which will form the general basis for proposal
evaluation and selection/contract award . . . The
rating system shall be structured to enable the SSA
to identify the significant differences, strengths,
weaknesses, and risks associated with each proposal
and subsequent definitized contract . . . The rating
system may be entirely narrative, or may employ
numerical scoring and weights or a descriptive color
code in conjunction with narrative assessments. The
important task in either rating system is the integrated
assessment of all aspects of the evaluation,
analysis, and negotiation process.

AFR 70-15 (17:p.3-4) relies on the evaluator's own
judgment while performing an evaluation:

How an evaluator approaches the task of evaluation
is up to his own judgment based on his experience. The
method by which it is accomplished is dependent on what
he feels best suits the particular circumstances . . .
It is, however, important that all evaluators be consistent
in their approach to evaluation. Failure to
do so will result in distortion of the true value of
the proposals.


Purpose of Source Selection Procedures 

The Logistics Management Institute briefed the
Defense Blue Ribbon Panel on the subject of defense procurement
policy and weapon systems acquisition in August
1969:

Formal procedures were established for selecting
contractors for major development or large production
efforts. These procedures required evaluation of
proposals according to pre-established, point grading
criteria and a review of the documented results of
the grading system. The objective was to reduce the
influence of subjective judgments in the selection of
contractors and to encourage objective evaluation of
all proposals by responsible offerors [9:21].

The essential decision-making process in source
selection involves weighing and judging complex issues
arising from the assessment of the many factors which make
up competing offers. The issues are evaluated by separate
expert groups with different perceptions of the ultimate
acquisition. Weighting of issues is subject to the biases
of the weighters. Overall policies may be overwhelmed by
the goals of the organizational subsystems involved in the
process. The source selection decision-maker requires
information which:


10 





(1) relates to the acquisition policy and
objectives;

(2) is free from bias;

(3) is equitably weighted;

(4) can withstand scrutiny and be repeatable
with different assessors.

Finally, the decision-maker requires the information
in a form which is digestible and which will assist
him to exercise judgment in the fullest possible knowledge
of the choices available.

Theoretical Background 

A preliminary survey of general literature in 
decision-making suggests that there are research findings 
which might be applied to the experience of existing 
empirical source selection models to develop an improved 
understanding of the source selection process. 

Simon (16:272) proposed the concept of bounded
rationality as a feature of management decision-making.
He reasoned that decision-makers in complex situations
"satisficed" the choices available to them. They used only
that part of the available information which they perceived
to enable them to make a satisfactory rather than an optimal
decision. The cautions and reservations expressed
in current Air Force and DOD source selection regulations
11 







confirm an awareness of the difficulties of source selection
decision-making. Because of this inherent complexity,
"satisficing" continues to play a significant part in
source selection decision-making as a practical necessity.

These views have support in recent research which
utilized multiple linear regression techniques to examine
source selection in an Air Force procurement division.
Milligan (10:vi) attempted to determine whether or not the
evaluation criteria contributed significantly to the rating
a proposal received and how program managers and supervisors
make source selection decisions. He found that
people do not always use all the information available to
them in making source selection decisions:

Thesis results suggest that source selection
decisions are not similar across organizations within
the AF division. Furthermore, subjects did not
utilize all the information available to them in making
decisions. People often chose to utilize only a part
of the available information in arriving at a decision.

Dawes (3:180-188) demonstrated the "bootstrap"
effect of making policy a conscious element of the decision
model. When policy was defined or "captured" in the model,
decisions became more aligned to policy. A similar effect
in Air Force source selection decision-making was suggested
by Milligan when he showed that source selections were more
consistent among experienced source selection staff when
given a policy in the decision task statement than when

12 





they were given the task with no formal policy.

While trying to improve the proposal evaluation
phase of the source selection process, Dycus (5:256)
conducted an Evaluator Preference Survey in which he
attempted to measure the attitudes and evaluative judgment
of a quasi-sample of 33 experienced DOD technical proposal
evaluators. Although he found that the evaluators had a
favourable attitude toward proposal evaluations, survey
responses indicated a need for improvement of the evaluative
procedures:

. . . survey data indicated considerable room
for the government to improve proposal evaluation
mechanics. Most evaluators indicated they reinterpreted
scored evaluation criteria. There was only
moderate judgment that scored evaluation criteria
and rating scales were "good" and "fair".

Dycus recommended that experimental research be
conducted in order to improve the proposal evaluation
aspect of source selection. He further suggested that such
research would improve evaluation rating scales, evaluation
criteria for scoring, determine preferred evaluation mechanics,
and improve scoring discrimination:

End product of proposed experimental research
would be a proposal evaluation guide that defines a
preferred rating scale, and directs the evaluators
in how to make their evaluation scorings. Such a
guide would improve the quality and discrimination of
proposal evaluation scores, and attest to the practical
value of applied procurement research [5:256].


13 






The primary goal of source selection is to arrive
at an objective selection decision. However, several
problems exist which limit the ability to accomplish this
goal. The work of Beard (2:iv) in his study, "The Application
of Multi-Attribute Utility Measurement (MAUM) to the
Weapon Systems Source Selection Process", identifies five
problem areas that presently limit the ability to fully
accomplish an objective evaluation:

These problems are: current weapon systems
development is multidimensional and does not allow
for evaluation on a single dimension - an array of
attributes must be evaluated; performance evaluation
is in many cases a subjective attribute and judgment
can be influenced by biased viewpoints; the current
color coded evaluation procedure provides results
that can be washed out and are arrived at wholistically;
the current numerical evaluation procedure provides
results that can be very close, may tend to level
out results or obscure the more important issues;
and costs.

MAUM is a ten-step procedural approach developed
from multi-attribute utility theory by Dr. Ward Edwards
to objectively address important decisions when selecting
among various alternatives having multiple attributes
(technical, logistic, and operations evaluation factors)
(2:8). It provides a framework for scoring and weighting
attributes in such a way as to ensure significant discrimination
between the scores allocated to substantially
different proposals. Using this approach, Beard (2:43)
concluded that the objectivity required in source selection
decisions can be attained:


14 







MAUM's proceduralized methodology greatly reduces
the influence individual bias can have in evaluation
results. The use of value curves and the philosophy
of "operationally defining" evaluation factors will
result in much more objective evaluations. MAUM's
procedures preclude inconsistent application of
evaluation standards over time.

Basically, Beard argued that since the present
source selection evaluation process considers various
proposals having multiple attributes (evaluation items and
factors), MAUM's ability to evaluate decisions having more
than one attribute, aspect, criterion, or dimension helps
eliminate the problems presently encountered in the process
(2:42).

There has been concern regarding the numerical
scoring and weighting system used to evaluate offerors'
proposals, specifically, the sensitivity of total scores
to small variations in the choices of item weights and in
item scores. In a paper presented to the Sixth Annual
Procurement Research Symposium, Lee was concerned with the
possibility of these variations causing "offeror A to have
a greater total score than offeror B in one case, while
making B's score exceed A's in another (8:123)." Lee
concluded that:

The order of numerical scores of proposals can
be overturned by small relative changes in item
weights and item scores whenever differences between
scores are small fractions of the scores, even when
item weights meet all the requirements of AFR 70-15
and AFLC Supplement 1 to that regulation.
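The rank-reversal effect Lee describes is easy to reproduce arithmetically. The sketch below uses two invented offers scored on two items; because the difference between their totals is a small fraction of the totals, a 0.10 shift in one item weight reverses which offer scores higher. All numbers are hypothetical and chosen only to illustrate the point.

```python
# Two hypothetical offers scored on two items (0-100 scale).
# All values are invented to illustrate weight sensitivity.
offer_a = {"item1": 90, "item2": 70}
offer_b = {"item1": 70, "item2": 90}

def total(offer, w1):
    # The two item weights sum to 1; w1 is the weight of item1.
    return w1 * offer["item1"] + (1 - w1) * offer["item2"]

# With item1 weighted 0.55, offer A leads (81.0 vs 79.0);
# shifting the weight to 0.45 reverses the order (79.0 vs 81.0).
for w1 in (0.55, 0.45):
    a, b = total(offer_a, w1), total(offer_b, w1)
    leader = "A" if a > b else "B"
    print(f"w1={w1}: A={a:.1f}  B={b:.1f}  leader={leader}")
```

The reversal occurs precisely because the 2-point gap between totals is small relative to the totals themselves, which is the condition Lee identifies.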


15 




Some Recent Propositions 

Vald (21:12) has described DOD material acquisition
decision-making as a value-building process and
proposed the use of a theory of analytic hierarchies to
clarify the multi-criteria choice situation involved. A
key element of the theory is the use of two-dimensional
comparison matrices to refine expert estimates of value
scores in terms of the value structure of the organization.
A claimed advantage of the scheme was that it is relatively
simple to construct and administer in a complex organization.
He concluded that the process drives toward
consensus and provides a truly wholistic approach to
decisions.

A proposed application of multi-criteria decision
theory to a specific Air Force acquisition planning
decision-making scenario by DeWispelare, Sage, and White
(4:p.1-15) identified two major theoretical application
techniques: Multiple Objective Optimization Techniques
(MOOT) and Multiple Attribute Utility Theory (MAUT). It
was suggested that both MOOT and MAUT are mental constructs
to approaching multiple criteria decision situations and
that there are practically no fundamental differences
between their analytical structures. However, while MOOT
may more quickly identify non-dominant solution sets,
decision-maker preference (weighting) emerges more






efficiently through MAUT. DeWispelare, Sage, and White developed a methodology combining the organizationally desirable features of MOOT and MAUT. The methodology has been tested in the Air Force and has been demonstrated to be an acceptable and desirable approach to improving the efficiency of decision-making. The research offers encouragement about the practicality of applying multi-criteria decision theory to source selection scoring and weighting. 

Practical Considerations 

The literature review to this stage suggests that the major problems in achieving effective source selection evaluations include consistency, equitable weighting of factors, and policy visibility. Recent research has focused on methodology for improving the quality of estimates of value scores and attribute weightings. 

In general, source selection practitioners are averse to the use of mathematical models of subjective judgment which use numerical scoring (for example, (6:89)). There is a feeling that numerical scoring methods inhibit the freedom of the decision-maker to make, and justify, subjective decisions. During recent years, there has been a strong trend toward using a combined narrative and color-coding system in rating proposal elements. When 







used, the following is an example of how the color-coding system may be applied (1:10): 

Blue - Exceeds specified performance or capability and excess is useful, high probability of success, no significant weaknesses. 

Green - Average, meets most objectives, good 
probability of success, deficiencies 
can be corrected. 

Yellow - Weak, low probability of success, significant deficiencies, but correctable. 

Red - Key element fails to meet intent of 
Request for Proposal (RFP). 

The source selection evaluation process tends to "wash" the evaluation of proposals toward an acceptable standard. This effect appears to be due in part to the conservativeness of evaluators at the lower levels. These evaluators appear to be reluctant to rate a proposal as unacceptable and so eliminate it from the competition. Evaluators at these levels seem to avoid this kind of decision, deferring the determination to the higher levels. The cumulative effect reduces the visibility, at the higher levels of the process, of the overall worth of the different proposals when compared against standards. 

It is believed that detailed study of the value 
building processes in actual source selections is necessary 
before significant conclusions can be made about the 






practical application of multi-attribute decision-making 
theory to Improving source selection. 






CHAPTER II 


RESEARCH APPROACH 

Research Objectives 

The objective of this research was to examine the value building processes in an actual source selection case and to establish their correspondence with the theoretical constructs of multi-attribute decision theory. 

Scope of Research 

Formal source selection evaluation procedures are mandatory only for new development programs requiring $100 million or more in RDT&E funds or projected to require more than $500 million in production funds, or for other programs specifically designated (19:2). However, the objectives of source selection remain the same in all procurements regardless of dollar value, and responsible officers are required to demonstrate a systematic and consistent approach to the source selection decision. The problems of offer evaluation and decision-making are similar for all procurements and vary only in size and scope. 

This research was directed to a detailed study of a source selection process undertaken within the standardized formal procedural framework used within the 









Aeronautical Systems Division (ASD). However, it is considered that this study will provide a basis for the study of more general cases of government source selection. 

The research included a series of interviews with source selection practitioners and administrators to identify perceptions of the strengths and weaknesses of the empirical source selection models in current use. 

Research Question 

The question addressed in the study introduced by this paper may be summarized in the following way: 

Can a detailed study of an actual source 
selection process establish a relationship 
with theoretical multi-attribute decision- 
making models which will provide means for 
improving the management of source selection? 

Research Methodology 


Discussion 

The process of source selection may be pictured conveniently as a value hierarchy in the manner described by Wald (21:14). In source selection, the levels of the hierarchy are typically: 


source 

area 

item 

factor 

sub-factor 





The values of each component which contribute to the decision rise progressively through the hierarchy, successively being refined, until they reach the source decision level. At each level, a series of weighted combinations of individual component scores takes place to arrive at a new set of scores to enter the weighting process at the next higher level (Figure 1). 

Researchers have modeled this kind of combina¬ 
tional process in many applications as a linear multi- 
attribute utility model (22:122-124). The model is 
expressed in the form: 

Y = B1x1 + B2x2 + B3x3 + . . . + Bnxn 

where Y is the value (score) outcome of the process, xn is the value of the nth component, and Bn is the weighting coefficient of the nth component. 

The linear model is consistent with the procedures outlined in AFR 70-15 (17:p.2-6) and the source selection policy-capturing research of Milligan (10:12). 
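As a concrete illustration, the weighted-sum calculation can be sketched in a few lines of code; the weights and component scores below are hypothetical, not taken from any of the cases studied.

```python
# Minimal sketch of the linear multi-attribute utility model
# Y = B1*x1 + B2*x2 + ... + Bn*xn, with illustrative data only.

def composite_score(weights, scores):
    """Weighted sum of component scores; weights are the Bi, scores the xi."""
    if len(weights) != len(scores):
        raise ValueError("each component needs exactly one weight")
    return sum(b * x for b, x in zip(weights, scores))

# Hypothetical cell: five components weighted 0.4/0.2/0.2/0.1/0.1.
B = [0.4, 0.2, 0.2, 0.1, 0.1]
proposal_a = [8, 6, 7, 9, 5]
proposal_b = [6, 9, 8, 7, 6]

print(composite_score(B, proposal_a))  # 7.2
print(composite_score(B, proposal_b))  # 7.1
```

Holding the weights fixed across proposals, as the text requires, makes the two composite scores directly comparable.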

The underlying assumptions of the linear model are that the components are (22:123): 

(1) independent - to avoid double counting, and 

(2) unidimensional - the scores should be realistically seen as adding to the decision dimension, and 





FIGURE 1 - Source Selection Value Hierarchy 









(3) compensating - high scores on some components will compensate for low scores on others, and 

(4) relevant - the components should be relevant over all contexts, and 

(5) exhaustive - all appropriate components should be included, but 

(6) determinant - the components should be important to the selection. 

When comparing a number of competing multi-attribute options (proposals), as in source selection, meaningful comparisons are made when the coefficients Bi remain constant for each calculation of Y. 

This kind of model implies a straightforward way of combining objective component scores through a value hierarchy to evaluate competing proposals. The combined scores become absolute comparative values at the source level which should provide a clear basis for the source selection decision. 

However, numerical scoring systems have been almost universally rejected in ASD as a suitable means of source selection for anything but the simplest procurements. 

The major objections to numerical scoring arising out of practical experience are: 





(1) Variations between offers are "averaged out" so that major deviations from standard become obscured in the final score. 

(2) The allocation of objective scores narrows the options of the decision-maker, constraining him to the highest numerical score. 

These objections are supported by a stated reluctance of evaluation personnel to use the full range of numerical scoring scales and by the findings of Lee (8:123) that when total scores are close, small variations in component scores can overturn the result. The latter gives concern that the former can lead to a scoring result that is not optimal. 

ASD prefers the use of more subjective color coding 
scoring systems which are believed to give the combiner at 
each level in the value hierarchy greater flexibility in 
the subconscious weights that he applies to subordinate 
component color scores when allocating his own color score 
to the whole of the group of components within his res¬ 
ponsibility. 

In most important acquisitions, the potential 
contractors are experienced and competent in government 
contracting. They have developed an intimate knowledge of 
the government's requirements during prior negotiations 
and understand the source selection process. Their 






proposals are therefore constructed to closely conform 
to the expected criteria. The outcome is that all propo¬ 
sals tend to meet the main acquisition requirements and 
that differences between them are small. Differences 
between proposals tend not to be evident in the compara¬ 
tive color score allocated at higher levels in the value 
hierarchy. The Source Selection Advisory Council is often 
obliged to look below the highest hierarchical levels to 
detect differences which may become justifiable bases on 
which to make accept/reject decisions. It appears in practice that at each level of combination of scores there is an effect which "washes out" the visibility of significant factors which may be a basis for acceptance or rejection when considered with the whole. 

Research Hypothesis 

Even with the use by ASD of color-coded scores in 
the value hierarchy, the concept of a linear combinatorial 
model remains valid if meaningful numerical equivalent 
scores may be given to the color code. However, because 
the color-coding technique of scoring does not lend itself 
to a priori allocation of objective attribute weights 
(other than a simple ranking of order of importance), it 
is possible to bias the model during application. The bias 
effect may be represented simply by extending the model by 









a constant term such that: 

Y = B0 + B1x1 + B2x2 + . . . + Bnxn 

AFR 70-15 suggests a suitable scale of numbers in the range 0-10 to equate to color scorings. This scale was adopted as a basis to score color-code ratings used in source selection: 

blue - 10 
green - 6 
yellow - 2.5 
red - 0 
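The color-to-number mapping and the weighted combination can be sketched together; the cell ratings and weights below are hypothetical illustrations, not data from the cases studied.

```python
# Numerical equivalents adopted for the four color ratings (0-10 scale).
COLOR_SCORE = {"blue": 10.0, "green": 6.0, "yellow": 2.5, "red": 0.0}

def score_ratings(weights, colors):
    """Composite numerical score for a cell rated in colors (illustrative)."""
    return sum(b * COLOR_SCORE[c] for b, c in zip(weights, colors))

# Hypothetical cell: two dominant components rated green, lesser ones mixed.
print(score_ratings([0.3, 0.3, 0.2, 0.1, 0.1],
                    ["green", "green", "blue", "yellow", "red"]))  # 5.85
```

The mapping is what lets a color-scored hierarchy be treated, after the fact, as a numerical linear model.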

The value-building processes in actual source selection cases were examined to see if a fit could be established between the actual processes and the extended model. A survey of recent ASD source selections showed that suitable cases could be identified with sufficient proposals and components to be able to conduct a multiple regression analysis of the allocated cell value against the component values for each cell of the decision hierarchy. 

Analysis was conducted by the multiple regression 
procedures in the Statistical Package for the Social 
Sciences (SPSS), (12:328), available on the Cyber CDC 
6600 Computer. The basic test hypothesis was: 

Ho : B0 = 0 

H1 : B0 ≠ 0 
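The regression step can be illustrated with a short sketch. This is not the thesis's SPSS run; it fits the same form of equation by ordinary least squares on synthetic data in which an intercept B0 is built in and then recovered.

```python
# Illustrative only: fitting Y = B0 + B1*x1 + ... + Bn*xn by ordinary least
# squares, as the SPSS regression runs did for each value-building cell.
# The data here are synthetic, not taken from the ASD cases.
import numpy as np

rng = np.random.default_rng(0)
n_proposals, n_components = 9, 3
X = rng.uniform(0.0, 10.0, size=(n_proposals, n_components))
true_B = np.array([0.5, 0.3, 0.2])
Y = -2.0 + X @ true_B + rng.normal(0.0, 0.2, n_proposals)  # B0 = -2 built in

A = np.column_stack([np.ones(n_proposals), X])  # prepend intercept column
coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)
B0 = coeffs[0]
print(B0)  # estimated intercept, near the built-in -2
```

A materially non-zero fitted B0, as found in most of the twelve cells, is the signature of the adjustment effect the hypothesis tests for.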

Having established the possible nature of the value- 
building equation when color-scoring was used as the 







discriminant, a computer model was constructed to simulate 
a series of value-building situations to examine the rela¬ 
tive performance of color-scoring and numerical scoring 
methods as value discriminants. 

Interviews were held with experienced source selection practitioners in ASD to gain a fuller appreciation of the empirical source selection process and to validate the model assumptions. 

The outcome enabled some conclusions to be made 
about the way the source selection process functions and 
its relationship to the DOD objectives. 




CHAPTER III 


ANALYSIS AND MODELING 

Review and Analysis of Cases 

A preliminary survey of source selection cases completed in ASD over the period 1975 to 1980 was made to identify cases with sufficient historical data of the value building process to facilitate detailed analysis of value building hierarchies. 

Three cases were identified as being suitable for analysis, and permission was granted by the Commander, ASD to examine the records in detail. The selected cases involved teams of 41, 43, and 70 evaluators and 4, 5, and 9 proposals respectively. All cases used combinations of color and narrative scoring techniques. 

Within each case, an attempt was made to isolate individual value building cells which met criteria for multiple regression analysis. 

The criteria sought for analysis were: 

(1) The higher-level composite "value" given to a value building cell was expressed in comparative terms relative to the expression of values of the component parts or attributes (i.e., 





colors, or by generic descriptive groupings such as "Exceeds Standards", "Meets Standard", "Fails to Meet Standard", "Unacceptable"). 

(2) There was a sufficient number of proposals in relation to the number of independent components so that the multiple regression synthesis of the relationship was valid, i.e., the number of attributes (variables) was less than the number of proposals (sample size) (12:329). 

Twelve value building cells were identified which met the 
criteria. 


Results of Regression Analysis 


The twelve value cells examined had from three to thirteen components. However, where all proposals scored the same for a component, that component was eliminated from the equation as non-discriminating. The SPSS multiple regression technique was then applied to evaluate the relationship of implicit value: 

Y = B0 + B1x1 + B2x2 + . . . + Bnxn 

Further components were eliminated in the regression analysis because of multi-collinearity or because they were below a 0.01 inclusion level. As a result, all value cells reduced to five or fewer significant components in 










the analysis. B0 was found to be equal to zero with 95% confidence in only two of the twelve relationships so derived. Values of B0 identified with 95% confidence were two in the range 0 to 7.7 and eight in the range -10.3 to 0. 

The Bi values reflect the evaluator's perception of the relative importance of the components of the decision cell. The structure of the source selection process requires that weightings be determined in advance of and separately from the component evaluations (17:p.3-7). It is difficult, if not impossible, for the evaluator to express absolute weights in numerical terms when using color or narrative scoring. The observed practice is only to rank components in order of relative importance to each other at the outset, and it is the judgment of the evaluator which determines the implicit relative weight actually accorded to each component at the time of final determination of the composite score. 

The value of B0 may be perceived as a measure of the cell evaluator's adjustment of the weighted composite score against a subjective benchmark. It reflects a subjective readjustment of the value of the competing proposals in an attempt to portray a relationship between them and the perceived standard. 

The composite scorer for the cell, therefore, undergoes a complex process of mental weighting and re-evaluation of the component score data in arriving at a value. When using the color code/narrative approach of ASD, he is constrained to express the judgmental outcome by one of four discrete "values" (red, yellow, green or blue), shaded as necessary by narrative support. 

Modeling the Value-Building Process 

Dr. Lee has shown (8:119) that when numerical scoring schemes are used, the order of numerical scores of the whole is sensitive to small relative changes in component weights and scores whenever differences in scores are small. 

This part of the research was concerned with how 
the sensitivity of the model was affected when a four- 
increment scale of scoring was used instead of a relatively 
continuous numerical scoring scale; and to see what effect 
the introduction of the value adjustment had on the 
discriminating power of the model. 

In considering the discrimination between different proposals when color-scoring is used, it was evident that a difference becomes significant when component scores are near the "border-line" of an incremental range on the scoring scale. Because of the discontinuous nature of the discriminating effect of score differences, it was decided that the problem could be most conveniently 











examined by means of a computer simulation. A computer 
model was therefore constructed to find out the effective¬ 
ness of the scoring system used by ASD as a means of 
discriminating between offers of various differences, and 
to compare the performance of color scoring with numerical 
scoring. 

Computer Model 

The model was constructed to simulate a value building cell of five components, the whole value of which is represented by Y where: 

Y = B0 + B1x1 + B2x2 + B3x3 + B4x4 + B5x5 

The values, xi, of the components of the cell were randomly generated for five value cells, representative of the situation of evaluating five competing proposals. The data were generated to represent five sets of proposals of differing degrees of "goodness" so that the discriminating properties of the model might be observed. For each set of five lots of simulated data, the five item values (Y) were calculated and the highest scored proposal was determined. Multiple sets of data were tested over a range of values of B0 and Bi and goodness levels to determine the frequency of selection of the "best" proposal for each set of independent variables. The model also replicated the process using the raw numerical scores of xi instead of the four-increment color scoring scale. 


Assumptions of the Model 

Examination and analysis of the case histories suggested that the following were reasonable assumptions on which to base a model synthesis: 

(1) Each component item of the value cell is independent, i.e., no multi-collinearity exists. 

(2) The value attributed to each proposal in the whole may be conceptualized on a scale of 1 to 10, and the limit of perception of objective difference between values of proposals so conceptualized is 2 per cent. If the objective difference is less than 2 per cent, then the "best" bid will be selected on subjective factors. 

(3) Goodness has consistency. A proposal for which the evaluation is "good" in an item may be expected to perform at a "good" level on the average across all factors that make up the components of the item evaluation. 

(4) Evaluators tend to judge components against the standard on a continuum before allocating discrete color or descriptive scores. 






Parameters in the Model 

Goodness Level 

In order to be able to examine the discrimination 
of the model, it was necessary to simulate data represent¬ 
ing the evaluated component scores of proposals of differ¬ 
ing quality or "goodness". 

Assumption 3 states that the values attributed to the components of a "good" proposal in a value cell will cluster about a value higher than the value about which bids of lesser goodness will cluster. To simulate this concept, "goodness" levels were modeled on a scale of 1 to 10. The designated "goodness" level was set as the mode of a continuous triangular frequency distribution. A computer-generated, uniformly distributed pseudo-random number was then put against the cumulative distribution curve of the triangular distribution to derive a "goodness" number. The numbers (AX) so derived were then reduced to a scale of color-equivalent incremental values (X) as listed in AFR 70-15 (17:p.3-6) as follows: 

If (AX.LT.1.25) then X=0 
If (AX.GE.1.25.and.AX.LT.4.25) then X=2.5 
If (AX.GE.4.25.and.AX.LT.8) then X=6 
If (AX.GE.8) then X=10 
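The generation of a color-equivalent component score can be sketched as follows. The sketch uses Python's built-in triangular sampler in place of the thesis's inverse-CDF lookup; the seed and goodness level shown are illustrative.

```python
# Sketch of goodness-score generation: draw a component value from a
# triangular distribution on [0, 10] whose mode is the designated goodness
# level, then collapse it to the AFR 70-15 color-equivalent increments.
import random

def color_equivalent(ax):
    """Map a continuous 0-10 judgment to the four-step color scale."""
    if ax < 1.25:
        return 0.0    # red
    if ax < 4.25:
        return 2.5    # yellow
    if ax < 8.0:
        return 6.0    # green
    return 10.0       # blue

def component_score(goodness, rng):
    """One simulated component value at the given goodness level (the mode)."""
    ax = rng.triangular(0.0, 10.0, goodness)  # low, high, mode
    return color_equivalent(ax)

rng = random.Random(1)
cell = [component_score(8, rng) for _ in range(5)]  # one 5-component cell
print(cell)
```

The thresholding step is where the four-increment scale discards fine distinctions, which is the behavior the simulation is designed to expose.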







Sets of five proposals were simulated, with each proposal in the set being of a designated "goodness" level. Each proposal consisted of a value cell with five components. 

In each set of five proposals, the "best" proposal was put at a goodness level of 10 and the goodness levels of the remaining proposals were put at 10 per cent decrements. In each successive goodness set, the interval between the "best" and the "second best" proposal was increased by 10 per cent. The resulting goodness levels of the proposals in the sets are shown in Table I. 

TABLE I 

GOODNESS LEVELS OF SIMULATED PROPOSALS 

Proposal             Proposal Number 
  Set             1    2    3    4    5 

   1             10   10    9    8    7 
   2             10    9    8    7    6 
   3             10    8    7    6    5 
   4             10    7    6    5    4 
   5             10    6    5    4    3 




Proposal sets were not extended beyond set number 5 because: 

(1) it was judged that a 10:6 quality ratio was representative of the largest gap between proposals which would merit formal source selection procedures, and 

(2) the difference between the "best" and "second best" proposal could no longer be regarded as "small". 

Weighting Coefficients (Bi) 

When total value is determined by the expression 

Y = B1x1 + B2x2 + B3x3 + . . . + Bnxn . . . (1) 

and Y and xi are both scored on the same value scale, then: 

 n 
 Σ Bi = 1 . . . (2) 
i=1 

Typically (13:33), weighting coefficients used in source selection are put at values which are multiples of 0.1. Within these guidelines, there are seven possible sets of values for Bi when five terms are included in the total value expression (Table II). 










TABLE II 

POSSIBLE SETS OF VALUES OF WEIGHTING COEFFICIENTS 

Set Number     Values of Bi 

    1          0.6, 0.1, 0.1, 0.1, 0.1 
    2          0.5, 0.2, 0.1, 0.1, 0.1 
    3          0.4, 0.3, 0.1, 0.1, 0.1 
    4          0.4, 0.2, 0.2, 0.1, 0.1 
    5          0.3, 0.3, 0.2, 0.1, 0.1 
    6          0.3, 0.2, 0.2, 0.2, 0.1 
    7          0.2, 0.2, 0.2, 0.2, 0.2 

All seven possible sets of Bi values were included in the computer model. 
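The claim that exactly seven such sets exist can be checked by brute-force enumeration; the sketch below counts all non-increasing five-part weightings in multiples of 0.1 that sum to 1.

```python
# Cross-check of Table II: enumerate all non-increasing 5-tuples of weights
# that are multiples of 0.1, are at least 0.1 each, and sum to 1.0.
from itertools import combinations_with_replacement

sets_of_B = []
for tenths in combinations_with_replacement(range(1, 7), 5):
    if sum(tenths) == 10:
        # store largest weight first, matching the table's convention
        sets_of_B.append(tuple(t / 10 for t in sorted(tenths, reverse=True)))

print(len(sets_of_B))  # 7
```

Working in tenths avoids floating-point comparisons while enumerating; the conversion to decimals happens only when a valid set is stored.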

Introduction of B0 to the Model 

The analysis of twelve value-building cells from three source selection cases revealed B0 values ranging from -10.3 to +7.7. The number of cells examined is a small sample compared with the many source selection cases. Given the small sample size, it was not possible to draw significant conclusions about the real limits of the range and frequency of occurrence of B0 values when color-scoring or narrative-scoring systems of value expression were used. However, it was sufficient for this research to observe the possibility of occurrence of significant 










B0 values. When a real value of B0 was introduced into the value equation and all Bi values sum to 1, as expressed in equation (2), it was necessary to modify equation (1) to retain the same scoring scales for Y and xi, so that: 

Y = B0 + (1 - B0/S)(B1x1 + B2x2 + B3x3 + . . . + Bnxn) . . . (3) 

where S is the scoring scale for Y and xi. 

Since the concept of the model was that the values of the components (xi) were additive toward the value of the whole, and as B0 approached S the value of Y approached B0, the maximum practical limit of B0 was S. 

For the purposes of the model, three values were chosen for the adjustment parameter: 

B0 = +7 

B0 = 0 

B0 = -7 

as a basis to observe the effects of inclusion of B0 in the value building equation. 
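Equation (3) can be sketched directly; the weights and scores below are illustrative.

```python
# Sketch of equation (3): the adjustment B0 plus a rescaled weighted sum,
# keeping Y on the same 0-10 scale (S = 10) as the component scores.
S = 10.0

def adjusted_value(b0, weights, scores, scale=S):
    """Y = B0 + (1 - B0/S) * sum(Bi * xi); weights must sum to 1."""
    weighted_sum = sum(b * x for b, x in zip(weights, scores))
    return b0 + (1.0 - b0 / scale) * weighted_sum

B = [0.2] * 5
xs = [10.0] * 5  # a proposal at the top of the scale

# Y stays at 10 for each chosen adjustment value.
for b0 in (-7.0, 0.0, 7.0):
    print(adjusted_value(b0, B, xs))
```

A useful property of the rescaling is visible in the example: a proposal with every component at the scale maximum receives Y = 10 regardless of B0, so the adjustment compresses or stretches the lower part of the scale rather than shifting the maximum.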

The Computer Program 

The computer program to simulate the operation of the model is listed at Appendices A1-A3. 

The program was arranged to give 80 simulations of data for each goodness set, providing 2000 simulated data points. The data were processed to find the value or score for each bid/goodness set combination and to select 







the highest scoring bid for each simulation. The proportion 
of each bid selected over the 80 simulations was calculated. 
The outputs which the program provided were: 

(1) A frequency table, for each goodness set, of the per cent each proposal was selected in the first run of simulations for the five goodness sets, three values of B0, and seven sets of Bi coefficients. 

(2) A histogram for each frequency table. 

(3) A summary table of the frequency of selection of the "best" proposal (bid number 1) for each run of simulations against goodness set, B0 value, and Bi coefficient set. 

The program was arranged to run the 80 simulations 
five times, each time from a new random number base. The 
five runs were repeated using absolute numerical scores 
for discrimination instead of incremental color scoring to 
provide a basis for comparison between the two scoring 
methods. 
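A simplified re-creation of the comparison (not the original program) can be sketched as follows. It omits the thesis's 2 per cent perception rule, so ties under color scoring default to the first-listed proposal; the goodness set and weights used are one combination from Tables I and II.

```python
# Simplified re-creation of the scoring-method comparison: for one goodness
# set, count how often the "best" proposal (goodness 10) scores highest under
# numerical scoring versus four-increment color scoring.
import random

def to_color(ax):
    """Collapse a continuous 0-10 score to the color-equivalent increments."""
    if ax < 1.25:
        return 0.0
    if ax < 4.25:
        return 2.5
    if ax < 8.0:
        return 6.0
    return 10.0

def selection_frequency(goodness_set, weights, trials, seed=0):
    rng = random.Random(seed)
    wins = {"numeric": 0, "color": 0}
    for _ in range(trials):
        raw = [[rng.triangular(0, 10, g) for _ in weights] for g in goodness_set]
        numeric = [sum(b * x for b, x in zip(weights, xs)) for xs in raw]
        color = [sum(b * to_color(x) for b, x in zip(weights, xs)) for xs in raw]
        if numeric.index(max(numeric)) == 0:
            wins["numeric"] += 1
        if color.index(max(color)) == 0:
            wins["color"] += 1
    return {k: v / trials for k, v in wins.items()}

# Goodness set 3 of Table I with equal weights (set 7 of Table II).
freqs = selection_frequency([10, 8, 7, 6, 5], [0.2] * 5, trials=2000)
print(freqs)
```

Because the tie-breaking here favors the best proposal, the sketch if anything understates the loss of discrimination that the thesis attributes to color scoring.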

The ten summary tables are presented at Appendices B1-B10, which show, for the same proposal data: 

(1) Frequency of selection of proposal number 1 against goodness and B0 and Bi sets for color scoring for five runs. 









(2) Frequency of selection of proposal number 1 against goodness and B0 and Bi sets for numerical scoring. 

Analysis of Output of Computer Model 

The computer model experiment was intended to study the effects on the number of times the "best" proposal was selected by successively varying four factors: weighting coefficients (Bi), goodness (LG), adjustment parameter (B0), and scoring method. 

The four factors or treatments were varied in the 
model over different levels as listed: 

coefficients - seven levels 

goodness - five levels 

adjustment - three levels 

scoring method - two levels 

The concern was to determine if any of the treat¬ 
ments significantly changed the mean frequency of selection 
of the "best" proposal. 

A suitable statistical technique for determining the significance of any observed change over a number of observations of different levels of treatments is Analysis of Variance (ANOVA), (11:526). 








The assumptions of ANOVA are that: 

(1) The probability distributions of the dependent variables are normal. 

(2) The variance of the dependent variable is constant. 

(3) The samples are independent. 

Regarding the assumption of normality, the variable of interest in the experiment was the number of times the "best" proposal was selected, i.e., the result of n Bernoulli trials of which the outcome was either "selected" or "not selected". The distribution of such a series of events has the binomial probability: 

f(x) = C(n,x) p^x (1-p)^(n-x) 

where n = sample size 

x = the number of events of interest in n 

and p = probability of occurrence of an event of interest 

(11:137) 

However, when the sample size n is reasonably large (n>30), the binomial probability distribution can be approximated by a normal probability distribution (11:216), and it has been shown (14:61) that a moderate departure from normality has little effect on the test of significance of ANOVA. 







A preliminary scanning of the computer model output suggested that constancy of variance was a reasonable assumption. It was decided to proceed to ANOVA on that basis and use the Cochran's "C" procedure provided with the SPSS program to test the assumption after the event (12:430). 

The sample data of the computer model were statistically independent to the extent of the independence of the pseudo-random number generator. The condition of independence was regarded as satisfied for the purposes of ANOVA for all treatments except the treatment "method" (color or numerical scoring). For simulation of "method", the treatments were successively applied to the same sets of basic data. ANOVA was, therefore, chosen as the means by which to examine the treatment effects of "weighting coefficients", "goodness", and "adjustment parameter". 

As the treatment "method" involved only two levels of treatment (color or numbers), it was appropriate to apply the t-test for population mean differences between matched samples to study the effect of "method" (11:320). 

Sample Size 

It was desired to have a sample size such that the ANOVA would provide information about the discrimination of the computer model at a 90 per cent confidence level with 10 per cent accuracy of estimation of the frequency of selection of the "best" proposal. 

For a binomial probability distribution, the sample size can be estimated by: 

n = Z(α/2)^2 p(1-p) / d^2     (15:191) 

where Z(α/2) is the two-tailed normal statistic for the desired confidence level, p is the probability of selection (taken as 0.5 for the most conservative estimate), and d is the difference between the true probability of selection and the estimate. 

For the experimental requirements, n was calculated to be 68; 5 computer runs of 80 simulations were selected to provide an adequate data base for evaluation of results. 
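The n = 68 figure can be reproduced under the stated assumptions (90 per cent confidence, d = 0.1, and p = 0.5 as the conservative worst case):

```python
# Reproducing the sample-size figure: n = Z(a/2)^2 * p * (1-p) / d^2,
# assuming p = 0.5 (the most conservative choice), a 90 per cent confidence
# level (Z = 1.645), and d = 0.1.
import math

z = 1.645  # two-tailed normal statistic for 90 per cent confidence
p = 0.5    # worst-case probability of selection
d = 0.1    # allowed error of estimation

n = math.ceil(z**2 * p * (1 - p) / d**2)
print(n)  # 68
```

The 5 runs of 80 simulations (400 observations per treatment combination) comfortably exceed this minimum.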

ANOVA Test Procedure 

The computer model outputs were first tested to determine the significance of the different treatments when applied separately. SPSS procedure ONEWAY was employed. There are two steps involved in using this technique: 

(1) Test the hypotheses 

Ho: There is no difference in the mean proportion of proposals number 1 selected between different levels of the treatment being studied. 


H1: There is a difference in the mean proportion of proposals number 1 selected between different levels of the treatment being studied. 

The decision rule for the test is: 

if F* < F(0.9; r-1, nT-r), conclude Ho; otherwise conclude H1 (11:535), 

where F* = treatment mean square / error mean square 

and r = the number of treatment levels 
nT = total number of observations. 

The value of F* is provided as part of the SPSS output. 

(2) If the test shows a difference between means, analyze the ranges within which the differences lie. Duncan's multiple range test provided in the SPSS package (12:427) is suitable for this purpose. 

A multiple ANOVA analysis was then conducted of the significant treatments to examine interaction effects over the range of treatments when color scoring was used in the model. 


ONEWAY ANOVA Results 

Eight data sets were selected for ONEWAY analysis to obtain a feel for the separate treatment effects on the mean proportion of proposals number one selected. The results are presented in Table III. 

The results in Table III show that, for the parameter sets tested, the treatment "weighting coefficients" was not significant at the 0.1 level in determining the frequency with which the "best" proposal was selected. Both "goodness set" and "adjustment parameter" (B0) were significant treatments which affected the outcome of the selection. 

The results of the Duncan's multiple range tests are shown in Table IV. 

The results show that when the numerical scoring process was applied, the mean frequency of selection of the "best" proposal was significantly different for each goodness set of proposals. The frequency of selection of the "best" proposal increased as the quality difference between the proposals increased. 

A similar result was shown when color scoring was 
used, except that the frequency of selection of the "best" 
proposal in each goodness set was consistently less than 
when number scoring was used and that the frequency of 
selection of the "best" proposal was not significantly 
different when the difference between the "best" and "next 
best" proposals was large. 











TABLE IV 

TABLE SHOWING HOMOGENEOUS SUBSETS OF TREATMENTS 

Test No.   Treatment Level:    1     2     3     4     5     6     7 

   1       Mean:             52.4  53.6  52.2  53.8  54.4  54.6  55.0 
   2       Mean:             45.0  47.6  47.4  49.6  49.8  52.8  54.2 
   3       Mean:             35.0  43.8  53.8  64.8  69.8 
   4       Mean:             31.6  42.8  49.6  64.0  66.2 
   5       Mean:             52.0  42.8  41.0 
   6       Mean:             58.6  49.6  48.0 
   7       Mean:             69.4  64.0  62.0 
   8       Mean:             71.4  66.2  65.0 

(Homogeneous subsets, indicated by underscoring in the original table, are not reproduced.) 



With regard to the ONEWAY analysis of the treatment "adjustment parameter" (B0), included when color scoring was used, treatment level 1 (B0 = +7) was found to cause a significantly different result in the selection of the "best" proposal at all goodness levels. There was no significant difference between the effects of treatment levels 2 and 3 (B0 = 0 and -7 respectively). 

The values obtained for P in the Cochran's C test 
show that in all cases the assumption of homogeneity of 
variances was met at the 0.1 level, justifying the valid¬ 
ity of the ANOVA approach. 
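The Cochran's C statistic itself is simple: it is the largest group variance divided by the sum of the group variances, compared against a tabulated critical value for the given number and size of groups. A sketch with made-up samples:

```python
# Cochran's C test statistic sketch (hypothetical samples):
# C = max(s_i^2) / sum(s_i^2). Homogeneity of variances is rejected
# when C exceeds the tabulated critical value for k groups of size n.

def sample_variance(xs):
    """Unbiased sample variance of a list of observations."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cochrans_c(groups):
    """Ratio of the largest group variance to the total of variances."""
    variances = [sample_variance(g) for g in groups]
    return max(variances) / sum(variances)

groups = [[52, 54, 50, 53], [45, 48, 46, 47], [70, 69, 72, 71]]
c = cochrans_c(groups)   # compare against the tabulated critical value
```

C always lies between 1/k and 1; values near 1/k indicate nearly equal variances, the condition the thesis verifies before applying ANOVA.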

Difference Between Color-Scored and Numerically-Scored Results (t-Test)

The concept of value building by using numerical
component scores and weights does not include the adjustment
factor, B0. It was therefore appropriate for the
purpose of this test to compare numerical scores with color
scores only at the B0 = 0 level.
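The value-building idea described above can be sketched as a weighted sum of component scores, with the adjustment factor B0 entering only the color-scoring case. All names and values here are illustrative, not taken from the thesis model:

```python
# Sketch of value building: the aggregate score is a weighted sum of
# component scores; color scoring may add an adjustment factor B0,
# numerical scoring does not (equivalently, B0 = 0). Illustrative only.

def aggregate_score(scores, weights, b0=0.0):
    """Weighted sum of component scores plus an adjustment factor B0."""
    assert len(scores) == len(weights)
    return sum(w * s for w, s in zip(weights, scores)) + b0

scores = [80.0, 70.0, 90.0]     # component evaluations (hypothetical)
weights = [0.5, 0.3, 0.2]       # relative importance, summing to 1
numerical = aggregate_score(scores, weights)         # B0 excluded
color = aggregate_score(scores, weights, b0=7.0)     # B0 = +7 biases upward
```

A positive B0 shifts the aggregate upward and a negative B0 washes out the component evaluations, which is the effect the simulation attributes to the adjustment parameter.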

The purpose of the t-test was to determine if the
frequency of selection of the "best" proposal was significantly
greater at the 0.1 level when number scores were
used than when color scores were used.

If the mean score by numbers is Mn and the mean
score by colors is Mc, then the test hypothesis is:

H1: Mn > Mc

and the decision procedure using SPSS output (12:271) is:

If the one-tailed probability is larger than α,
do not reject H0.

The t-test was conducted over the range of goodness
sets 2 to 5 at the B0 = 0 level and α = .1. The results
are presented at Table V, in which H0 is concluded for
goodness sets 2 and 4 and H1 for goodness sets 3 and 5.
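The decision procedure can be sketched with a pooled two-sample t statistic. The selection frequencies below are hypothetical; in practice the one-tailed probability reported by SPSS (or the t value against a tabulated critical value) drives the reject/do-not-reject decision.

```python
# One-tailed two-sample t-test sketch (hypothetical data). H1: Mn > Mc,
# i.e. numerical scoring selects the "best" proposal more often than
# color scoring. Reject H0 when t exceeds the tabulated critical value.
import math

def pooled_t(a, b):
    """Pooled-variance two-sample t statistic for mean(a) - mean(b)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

number_scores = [54, 55, 53, 56, 54]   # selection frequencies, numbers
color_scores = [48, 49, 47, 50, 48]    # selection frequencies, colors
t = pooled_t(number_scores, color_scores)   # positive t favors H1
```

At α = .1 with 8 degrees of freedom the one-tailed critical value is roughly 1.4, so a t this large would reject H0 in favor of H1.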

Multiple ANOVA (MANOVA)

The ONEWAY ANOVA test results showed the effects of
treatments "goodness" and "adjustment parameter" (B0) to
be significant at the 0.1 level for the fixed parameter
values tested. Treatment "weighting coefficients" (b set)
was found to be ineffective at the 0.1 level of confidence
for numerical scoring and B0 = 0 and goodness set number 3.
However, when the parameter "colors" was included in the
test for significance of treatment "weighting coefficients",
the value of F* (1.845) was close to the value of F (2.00).
It was considered advisable to include "weighting
coefficients" as a treatment in the MANOVA in case it became
significant at the extremes of range of treatments or when
applied in conjunction with other treatments.





TABLE V

[t-test results; table body not legible in the scanned original]


The SPSS ANOVA sub-program is designed to handle 


MANOVA for factorial experimental designs. 

Since the ONEWAY test for treatment "goodness set"
yielded very large values of F*, it was likely that the
effect of varying "goodness set" would overwhelm the
effects of treatments "adjustment parameter" and "coefficient
set" for goodness sets 2 through 4. A symmetrical
factorial design was chosen with three levels each of:
"adjustment parameter"; B0 = +7, 0, -7
and "coefficient set"; set No. 1, set No. 4 and
set No. 7.

The multiple classification analysis (MCA) option
of the SPSS ANOVA program was used to provide an indication
of the magnitude of the effect of each treatment. The
outputs are presented at Appendices C1-C8.
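The "magnitude of effect" figures quoted below (e.g., a treatment "explaining 0.25 of the selection preference") can be approximated by eta-squared, the share of total variation attributable to a treatment. This is a rough stand-in for the SPSS MCA output, with invented data:

```python
# Rough sketch of a treatment's explanatory share as eta-squared:
# eta^2 = SS_between / SS_total, computed over outcomes grouped by one
# treatment's levels. This only approximates the SPSS MCA statistics.

def eta_squared(groups):
    """Fraction of total variance explained by group membership."""
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_total = sum((x - grand) ** 2 for g in groups for x in g)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2
                     for g in groups)
    return ss_between / ss_total

# Hypothetical selection frequencies grouped by B0 level (+7, 0, -7):
by_b0_level = [[60, 62, 61], [50, 49, 51], [48, 47, 49]]
share = eta_squared(by_b0_level)   # fraction of variance explained by B0
```

A share near 1 means the treatment dominates the outcome; a small share, as the thesis reports at goodness set 2, means most variation is left unexplained.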

Summary of Results of Analyses of Model Output 

The ONEWAY test results show that the treatments
"goodness set" and "adjustment parameter" are significant
at the 0.1 level in determining the probability of selection
of proposal number 1. "Coefficient set" was not a
significant treatment for either number or color scoring
at goodness set number 3 and adjustment parameter 2
(B0 = 0).

Further analysis of the treatments "coefficient
set" and "adjustment parameter" taken conjointly in a
two-way MANOVA for goodness sets 2 through 4 when numerical
scoring was used, show different joint effects of treatments
"coefficient set" and "adjustment factor" as the
level of goodness set is increased, i.e., as the difference
in quality between the "best" proposal and the "second
best" proposal increases.

When the difference between proposals is small,
"adjustment factor" (B0) is the significant external
treatment. Large positive values of B0 increase the frequency
of selection of the "best" proposal. At goodness
set 2 (10% difference between proposals), B0 explained
0.25 of the selection preference, whereas the coefficient
set explained only .04 of the selection preference.
However, both treatments accounted for a relatively small
part of the selection and a large variance of outcomes
was predicted.

At goodness set 3 (20% difference between proposals),
coefficient set and adjustment parameter each
explained about .16 of the selection preference with
still a relatively large variance of outcomes.

At goodness set 4 (30% difference between proposals),
coefficient set became the dominant reason for
selection preference, explaining 0.50 of the outcome
while adjustment parameter explained 0.12 of the outcome.

At goodness set 5 (40% difference between proposals),
coefficient set was even more dominant, explaining
0.36 while adjustment factor still explained 0.12 of
the outcome. The variance of selection due to unexplained
factors of the model was reduced as the gap
between "best" and "next best" proposal increased.

The direction of effects of the treatments was
also worthy of note. Large positive values of adjustment
factor (B0) increased the probability of selection of
the "best" proposal. Negative values of B0 reduced
the probability of selection. Coefficient sets with
small differences in component weights forced selection
toward the "best" proposal while larger differences in
component weights resulted in greater variance of
selection of proposals.

The results of t-tests for the effect of treatment
"method" (color or numerical scoring) were less
conclusive. At goodness sets 3 and 5, the discriminating
power of numerical scoring, as modeled, was significantly
greater than the power of color scoring at
the 0.1 level. There was no significant difference
between the two scoring methods at goodness sets 2 and 4.




CHAPTER IV 


INTERVIEWS WITH SOURCE SELECTION PRACTITIONERS 

To further examine the underlying nature of the
source selection decision-making process, structured
interviews were conducted with source selection practitioners
and administrators in an attempt to identify
their perceptions of the models in the field. The
Aeronautical Systems Division's Directorate of Contracting
and Manufacturing, originally established as point
of contact in this research effort, provided a listing of
selected ASD personnel who were at the time, or had
recently been, engaged in different aspects of source
selection. Thirty-one personnel were interviewed. All
had been involved in at least one of the many different
functions of source selection, including acquisition
policy and procedure management, SSA, SSA advisor, SSAC
member, SSEB Chairman, item captain, Acquisition Logistics
Division (ALD) representative, program manager, principal
contracting officer (PCO), and general contracting,
pricing, buying and manufacturing participants.

An interview guide was prepared to ensure consistency
of approach in the research. A copy of the guide is
attached at Appendix D. The interviews were designed to
try to obtain an overall view of the source selection
decision-making process from a participant's viewpoint
and to assist in identifying the reality of the process
as it is applied in practice against the theoretical
models of multi-attribute decision-making. Discussions
were centered around the following areas: effectiveness
and efficiency of existing source selection decision
procedures, relative merits of numerical and color-coding
schemes of scoring the results of evaluations, influence
of Contractor Inquiries (CIs) and Deficiency Reports (DRs)
on the decision process, and ways of improving the source
selection decision process.

Effectiveness and Efficiency of the Process 

There was little agreement on what the ultimate 
objective of source selection should be, although the 
majority of the personnel interviewed agreed that existing 
source selection decision procedures assured the effective¬ 
ness of the process in attaining its perceived objective. 
Responses included the following: 

"a mechanism to appear as objective as possible 
in selecting a source while protecting against protests 
and complaints" 

"to get best contractor at best price" 

"to de-select other offers to be able to withstand 
protests" 











"to get technically best contractor" 

"to select best source for the government, all
factors considered"

"to give you a good insight before you commit 
yourself" 

"to select best supplier at best price, if price 
is one consideration" 

"to be fair in selecting a source able to perform" 

"to get the best capability in meeting the needs 
of the Air Force in accordance with the requirements 
of the solicitation". 

Statements such as these clearly show a lack of
agreement and possibly misunderstanding among personnel
interviewed regarding the purpose/objective of the source
selection process. A clear understanding of the ultimate
objective of the process by its participants is essential
to ensure effective evaluation of proposals and results
which meet the ultimate objective of the source selection
process.

Far greater agreement was found among those interviewed
when asked about the effectiveness of the process in
achieving the perceived overall objective. A large majority
agreed that the process is usually effective in meeting
its stated objectives, and that the right contractor is
selected in almost all cases. Some concern was expressed,
though, regarding normative political override. Source
selection decisions are sometimes made on political
considerations without adequately quantifying the risk of
program failure.

A number of factors seem to hamper the effectiveness
of the existing source selection process. Among
factors cited was the effect that funding constraints
have on the source selection decision. During the last
decade, budgeting has been a major external influence on
the process, creating "a temptation to make the low offer
appear to meet the requirements" through extensive use
of CIs and DRs.

The massive amount of data with which evaluators
are confronted when evaluating proposals was seen to be a
major factor in preventing a truly effective process. It
was said that evaluators usually find it difficult to
filter out the data in order to identify and be able to
assess the key issues. It appears that source selection
evaluations are being made with an excessive amount of
data--far more than that which is needed--obscuring the
important issues and preventing decision-makers from
effectively evaluating them.

Most of the personnel interviewed expressed concern
about the inefficiency of the source selection process.
Many said that they considered the process to be grossly
inefficient, due mainly to the large number of people
involved in the evaluation stages, the excessive amount of
time taken up by evaluations, and the large amounts of
data encountered in proposals. The large and detailed
RFPs sent out to industry seem partly to be the cause of
much of this inefficiency. The RFPs force offerors to
generate large amounts of data in support of their proposals
and make evaluation a time-consuming, extremely
complex process which requires many evaluators in order
to sort out the data.

More than half of the respondents said that the
source selection process involves far too many people.
Lack of expertise and evaluating experience was cited by
some as contributing to the inefficiency of the process.
It also appears that the government spends a disproportionately
large amount of resources in obtaining a small
system in relation to that which it spends in acquiring
a major system. The need to streamline the process was
emphasized. Some suggested that a small group of 10 to
15 qualified evaluators could reach a decision as acceptable
as that made by a large number of evaluators.

Some concern was expressed regarding the amount
of resources spent in areas which did not influence the
final decision. Much emphasis is placed on certain areas
of proposals, e.g., management. The effort evaluators
put into these areas seems unwarranted when the output of
such evaluations fails to have an impact on the decision
process. It was observed that there is a trend toward
increasing the number of management evaluation items.

While some of the perceived inefficiency attributed
to the process may be caused by the need to document
everything in order to have a sound defense against
potential protests, such a fear of protests appears to
be unfounded. Less than 4 per cent of contracts awarded by
ASD result in protests, with the majority of the protests
being shown to be without foundation.

In summary, it appears that a number of factors
cause many people to be involved in source selection.
However, the process seems to have worked effectively,
and the desired results have been achieved as well in
those cases where strong management has insisted on a
reduced number of evaluators.

The Scoring Process 

AFR 70-15 provides broad guidance on source selection
decision procedures. It discusses the use of both
numerical and color-coding schemes of scoring the results
of evaluations, supported by narrative statements. ASD
regulations encourage the use of color-coded and narrative
assessments, and numerical scoring has not been formally
used in ASD since June 1972. In an attempt to identify
the strengths and weaknesses of both the numerical and
color-coding techniques, personnel interviewed were asked
to comment on the relative merits of each approach.

While about one-half of those interviewed expressed
their preference for the use of colors, one-third indicated
that both methods were equally effective in assessing
proposals, with a few personnel showing a preference for
the numerical scoring technique. The preference for the
color-coding approach seemed to be based on the concept
of providing an integrated assessment which would highlight
the strengths, weaknesses, and risks of each proposal and
allow the SSA greater latitude to exercise judgment.

Under the numerical scoring system, the SSA felt
constrained to accept the numerical results, and a decision
to select a source other than the one with the
highest scoring proposal was difficult to justify. Comments
were also expressed that source selection is partly
a qualitative judgment process which is sometimes hard to
quantify and creates difficulty in arriving at an agreed
number, whereas agreement is much more easily reached
using color scores. Areas such as past performance and
management are sometimes difficult to weigh and score with
numbers giving an unwarranted degree of precision, while
color-coding provides a clearer overall picture to the
decision-maker.








Individuals who expressed the view that both
approaches were equally effective and would serve to
accomplish essentially the same purpose indicated that
the important thing is to conduct a balanced evaluation
which ensures key areas are identified appropriately and
evaluated properly.

It was frequently stated that in the more objective
areas, e.g., technical, evaluators made initial scores
on a numerical scale. They then converted these, using
cut-off values, to color scores to fit in with the source
selection plan.

Those who preferred the numerical approach said
that numbers provided a quicker reaction to, and identification
of, slight differences between similar proposals.
The numerical scoring technique appears to yield a more
discrete and finer identification of differences at the
attribute level; something color-coding fails to do. It
forces the attribute evaluator to commit himself to a
firm decision. In addition, numerical scoring allows the
weighting of issues to be precisely identified in advance
of scoring according to their relative importance as
established in the source selection plan. Conversely,
they said that color-coding introduces a degree of uncertainty
and encourages political maneuvering.






In discussing numerical scoring, a variety of
perceptions of the "cut-off" level of discrimination of
numerical scores, one to the other, and when compared to
a standard, was discovered. Some respondents said that
an absolute difference between scores was a sufficient
basis on which to make a decision. Most who gave an
opinion said that a difference of 1 to 2 per cent between
scores was significant. Less than that, other (subjective)
considerations would come into the decision. About half
the respondents felt they could not give an opinion, and
one experienced officer said that if numerical scores
were used in systems source selections, he would not
consider score differences of less than 10 per cent to be
significant. Respondents frequently said that in many
cases ASD was concerned with buying concepts which did not
lend themselves to highly objective scoring.

When evaluating proposals at the SSEB level,
proposals should be compared against standards established
in the solicitation document. A tendency to compare
proposals with each other at this level, rather than
against standards, as required by regulations (17),
was expressed by some of those interviewed.

Contractor Inquiries and Deficiency Reports

A considerable amount of effort is spent by source
selection personnel in the preparation of CIs and DRs as
the means of communicating with offerors, to provide for
clarification of certain aspects of proposals, and to
identify specific parts of proposals which fail to meet
the government's minimum requirements. This procedure
allows the offerors to correct deficiencies found by
evaluators. Almost every one of the personnel interviewed
agreed that although the CI/DR process is time-consuming
and usually prolongs the evaluations, it is essential to
obtaining a satisfactory contractual arrangement and is
significant in influencing the decision process.

Responses indicated a frequent excess of CIs.
This was partly due to the failure of RFPs to be definitive
in some areas. The excess was also attributed to the
reluctance of evaluators to make a subjective judgment,
and attempts to obtain a defensible, documented position.
It was suggested by some respondents that more direct
talks with offerors would help to reduce the number of CIs
originated and eliminate much of the paperwork created
during the process. It was observed that, in those cases
where ASD had used the four-step solicitation process
(20:4), there was a large reduction in the use of CIs.

DRs were considered to be far more critical in influencing
the decision process, since these documents allow evaluators
to determine how well final offers meet the government's
requirements. The most important ones are usually
highlighted under "strengths and weaknesses" in evaluation
reports to the higher levels of decision-making.

Although AFR 70-15 requires that proposals only
be scored as originally submitted to encourage the best
initial proposals, it was found that, in practice, proposals
were often rescored. A review of three source
selection case histories in ASD, together with responses
obtained during the interviews, indicated that proposals
are frequently rescored after the CI/DR process is completed.
It appears that further clarification and
guidance regarding rescoring of proposals may be required
to ensure that a fair and consistent approach is used.

Improvements Suggested by Interviewees

It was agreed that the existing source selection
process is usually effective in selecting the proposal(s)
which best meets the government's cost, schedule and performance
requirements, considering that what is being
evaluated usually is an offeror's future performance of
something which is essentially innovative. However, a
great majority of the personnel interviewed saw much
room for improvement of the process. The discussion that
follows concentrates on those areas suggested to have the
greatest potential for improvement of the overall process.









A need to integrate the source selection activity
with that of preparing RFPs was expressed. A great part
of the source selection plan and process is determined by
the way the RFP was written. It was said that closer
coordination between source selection personnel and those
responsible for the preparation of RFPs would help ensure
that more definitive and concise requirements go out to
industry. This contact would result in more compact and
precise proposals which would serve to reduce the tremendous
amounts of data with which evaluators are presently
being confronted, would allow the significant aspects and
key issues to surface sooner, and would provide for a
more effective and efficient evaluation. Some respondents
suggested that the size of proposals should be controlled
by defining in the RFP the number of pages of submission
allowed.

Further streamlining of the process was suggested
to help make it more efficient. A group of well-qualified
and experienced personnel with broad knowledge,
complemented with competent technical advisors, would
result in a reduced number of evaluators and a shorter
time required to assess proposals. It was said that the
source selection experience of evaluators must be improved
and more specific guidance provided for first-time
evaluators. The lack of a viable training program in
source selection procedures for evaluators with no
previous experience in source selection makes it a
difficult task for those personnel who have to learn the
procedures while on the job. This shortcoming results
in much unproductive time and decreased efficiency.

An awareness that many of the problems which
surface during the performance of a contract are related
to the contractor's data and cost tracking systems has
directed an increased emphasis on the management area of
proposals during evaluations. Respondents indicated that
although a considerable amount of effort is spent in this
area, it seldom influences the decision process. An
improved approach for assessing the management area of
proposals in a more realistic way was felt to be necessary,
with increased emphasis being placed on a prospective
contractor's past performance.

It was felt that there was a need to develop a
better way of linking the cost and technical evaluations
together in order to obtain a realistic cost-benefit
analysis. Other suggestions in this area dealt with the
need to bring together the assessments of the Cost Panel
and the Contract Definitization Group at some point during
the process to provide a better overall picture when
considering tradeoffs between cost and technical
requirements.







Increased use of the abbreviated procedures for
source selection was advocated. In the abbreviated
procedure a Source Selection Evaluation Committee (SSEC)
assumes the responsibilities of the SSAC and SSEB. This
resulted in a more efficient process. It seemed evident
to some interviewees that frequently the SSAC failed to
apply the judgment required in a comparative analysis of
proposals, and merely served as a means of filtering the
SSEB evaluation results to the SSA. It was also felt
that some of the more formal requirements for source
selections on lower-dollar acquisitions could be eliminated,
improving the efficiency of the process without impacting
on its effectiveness.

A need to rescore proposals after the DR process
is completed was thought by many to be an essential procedure
to ensure an optimum decision. Scoring proposals
as originally submitted and as corrected seemed to be the
only way to conduct a realistic appraisal.

During the course of the research, it became evident
that some source selections departed significantly
from the guidance provided in regulations; a fact which
was felt by some people interviewed to cause some of the
inefficiency attributed to the process. This was thought
to be partly due to the lack of recent and current guidance
in the field. AFR 70-15, the primary document for
establishing policy and procedures for the conduct of
source selections in the Air Force, is now five years
old, outdated, and has been under revision for over a year.

It was hoped that, when the new issue of AFR 70-15
was published, it would provide more specific guidance for
the conduct of source selections.

Some concern was expressed that major contractors
have developed an ability to submit high scoring bids which
makes it difficult to assess proposals which, on paper,
appear to be fairly similar. Evaluation then becomes a
task of determining whether the offeror is able to do what
he says he can do, rather than making an objective technical
decision. As this seems to be the case during many
formal source selections, it becomes critical to provide
the SSA with objective information on which to make a
rational decision which will reduce the risk of cost
overruns and program slippages.

Summary 

The interviews provided a good insight into the source
selection process as it is presently applied, and identified
a number of difficulties perceived by source selection
participants.

Even though respondents agreed that the process
was effective in achieving its perceived objective, there
was little agreement as to what that objective should be.
Concern was expressed regarding the inefficiency of the
process. This inefficiency was attributed to the large
number of people involved in source selections, the
excessive amount of time taken up by evaluations, and
the massive amount of data with which evaluators are
confronted.

Views regarding the techniques used for scoring
proposals provided a wide range of opinions of the
relative merits of each approach. Preference for numerical
or color scoring methods was divided.

Although it was evident that the CI and DR
processes are time-consuming, they were considered to
be essential and very significant in influencing the
decision process and in making a satisfactory contract.

Interviewees agreed that there was room for
improvement of the process. Their responses suggest some
approaches for accomplishing that objective.

Source selections in ASD cover a wide range of
acquisitions of varying degrees of complexity and maturity
of concept. However, there is some evidence that the
process is not always applied with sufficient judgment
and that departures from policy and procedures occur.






CHAPTER V 


CONCLUSIONS AND RECOMMENDATIONS 

This study was directed toward identifying the
process of source selection as practised in ASD. The
methodology of source selection was simulated through a
computer model. A perspective of the process was developed
through a review of the procedural guidance and a
series of interviews with ASD source selection personnel.
This chapter summarizes the findings of the study and
compares them with some theoretical concepts to develop
a descriptive evaluation of the ASD source selection
process. Finally, recommendations are made which may
contribute to the improvement of the management of source
selection.


Source Selection Methodology 

The analysis of source selection cases in which
color or narrative scoring methods were used demonstrated
the possibility of evaluators incorporating an adjustment
parameter (B0) into the value building process when
aggregating a group of lower-level attribute scores.

The effect of introducing negative values of B0
into the simulation model was to reduce the discrimination
of the process in selecting the "best" proposal in terms of
the evaluation criteria. Positive values of B0 biased the
scores in favor of the "best" proposal. The effect was
greatest when the difference between proposals was small.
More cases of negative values of B0 were observed than
positive values, suggesting a "wash-out" of the component
evaluations in those cases.

As might be expected from the work of Dr. Lee, the
model confirmed that when the difference between the modal
goodness or quality of the components of the "best" proposal
and the "second best" proposal was large, the most
significant internal parameter which affected the selection
was the weight applied to each component (coefficient
set). When the weighting difference was large, the proposal
with the "best" modal quality was less likely to be
selected than when weighting differences were small.

The relative effectiveness of numerical scoring
and color scoring as discriminators was substantially
dependent on the nature of the relative difference in the
quality of proposals being compared. For some differences
in quality of proposals, color-scoring provided significantly
less preference for the "best" proposal than
numerical scoring provided. The inconsistency of discrimination
provided by color scoring is explained by the
"broad banding" of the four-increment color score scale.









Evaluations of two proposals which fall on different
sides of the boundary between two color bands will be
discriminated by color scoring. However, if the evaluations
of two proposals (which theoretically may differ
by as much as a full band width) fall within the same
color band, the color scoring system will not differentiate
between them.
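The banding effect can be sketched directly: mapping numerical scores into four color bands discriminates two proposals only when their scores fall in different bands. The even 25-point split of a 0-100 scale below is an assumption for illustration, not the ASD scheme.

```python
# Sketch of "broad banding": a four-increment color scale cannot
# distinguish scores that land in the same band. The even 25-point
# band boundaries below are an assumption for illustration.

BANDS = [(75, "blue"), (50, "green"), (25, "yellow"), (0, "red")]

def color_of(score):
    """Map a 0-100 numerical score onto the four-color scale."""
    for floor, color in BANDS:
        if score >= floor:
            return color
    return "red"

# Two proposals differing by 20 points may share a color...
same_band = color_of(72) == color_of(52)     # both map to "green"
# ...while a 2-point difference across a boundary is discriminated.
diff_band = color_of(76) != color_of(74)     # "blue" vs "green"
```

This is the inconsistency noted above: the size of the numerical difference matters less than whether it happens to straddle a band boundary.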

The positive features of numerical scoring when
compared with color-scoring are:

(1) Absolute weights may be allocated to
attributes before evaluation and scoring.

(2) The inclusion of adjustment parameters (B0)
which can wash out or bias final scores
is precluded.

(3) Small differences in evaluations are
recognized in the scores allocated to
attributes and are discriminators of the
outcome.

The disadvantages of numerical scoring are:

(1) A degree of precision of evaluation is
implied which is not always realistic,
particularly when dealing with the conceptual
attributes of proposals.

(2) Numerical scores imply a sense of absoluteness
which inhibits the exercise of
qualitative judgment by the SSA.

(3) Evaluators tend to be reluctant to use the
full range of scores, clustering results into
a narrow band, so reducing the discriminating
power of the process.

(4) It is sometimes difficult to obtain agreement
on relative weights.

(5) Extreme responses are not highlighted (e.g.,
non-conformance).

In comparison, color scoring offers the following
advantages:

(1) A convenient and powerful means by which a
comparative overview of the quality of competing
proposals may be visualized is provided.

(2) Subjective values of attributes may be scored
with high levels of agreement.

(3) Extreme responses are highlighted.

(4) The SSA is provided with considerable scope
for qualitative judgment.

The disadvantages of color-scoring are:

(1) Significant differences in objective evaluations
of attributes may not be recognized in
the scoring process as discriminating factors.

(2) Attributes cannot be objectively weighted to
highlight comparative importance.







(3) The process permits the washing out or
biasing of evaluation results by the introduction
of an adjustment parameter (B_i).
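The wash-out effect noted at the start of this comparison can be sketched in a few lines. The four-increment band edges below are illustrative assumptions, not values taken from ASD practice.

```python
def color_band(score):
    """Map a 0-10 numerical evaluation onto an assumed four-increment color scale."""
    if score >= 8.5:
        return "blue"    # exceptional
    if score >= 6.0:
        return "green"   # acceptable
    if score >= 3.5:
        return "yellow"  # marginal
    return "red"         # unacceptable

# Two proposals whose numerical evaluations differ by almost a full band
# nevertheless receive the same color, so the presentation cannot
# distinguish them.
proposal_a, proposal_b = 6.1, 8.4
print(color_band(proposal_a), color_band(proposal_b))  # green green
```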

Both methods of scoring have unique advantages
and disadvantages. Whether one method or the other is
appropriate depends on the nature and structure of
the particular source selection involved. It is concluded
from this study that the choice of the appropriate method
of scoring is influenced by:

(1) The maturity of the concept being considered.

(2) The relative importance (weights) of the
key attributes of the decision.

(3) The resources available to the source
selection activity.

(4) The management style of the SSA.

Maturity of Concept

The maturity of the concept being considered
strongly controls the level at which a proposal may be
evaluated. When the concept is novel and the proposal is,
in effect, a projection of what might be done based on
broad assumptions, then the evaluation can only be realistically
scored at a qualitative level. Evaluating human
skills such as expectations of management or innovative
capabilities is also highly conceptual and only able to





be satisfactorily expressed in qualitative terms. Conversely,
when standard and predictable techniques and
practices of mature concepts are being evaluated,
quantitative scoring of evaluations can be done with
confidence and precision. A single source selection may
involve a mix of novel and mature concepts. For example,
a technical area may encompass a variety of well-developed
concepts, whereas the corresponding logistics area may be
one in which the implications of the systemic application
of the technology are entirely novel.

Weights of Attributes

In some source selections, the weight of the
decision may rest heavily on a particular attribute. In
others, weights of attributes may be about equal. Even
at lower levels of evaluation, such as the factor level,
it may be necessary to weight the sub-factors to prevent
the important attributes from being swamped by the many
trivial attributes. Numerical scoring methods allow the
use of definitive weights when needed. Color scoring is
weak in its ability to reflect weightings but has the
power to highlight component deficiencies when weighting
is not important.
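The swamping effect described above can be sketched as follows. The attribute names, scores, and weights are invented for illustration only.

```python
# Offeror A excels on the one decisive attribute; offeror B is merely
# adequate there but strong on three trivial ones.
scores_a = {"range": 9.0, "paint": 5.0, "manuals": 5.0, "packaging": 5.0}
scores_b = {"range": 4.0, "paint": 8.0, "manuals": 8.0, "packaging": 8.0}
weights  = {"range": 0.7, "paint": 0.1, "manuals": 0.1, "packaging": 0.1}

def unweighted(scores):
    """Simple average: every attribute counts equally."""
    return sum(scores.values()) / len(scores)

def weighted(scores):
    """Weighted sum: the decisive attribute dominates."""
    return sum(weights[k] * v for k, v in scores.items())

print(unweighted(scores_a), unweighted(scores_b))  # 6.0 7.0 -- B ahead
print(weighted(scores_a), weighted(scores_b))      # A ahead once weighted
```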





Source Selection Resources

The major resources available to a source selection
activity are personnel, time, and money. Personnel may be
limited in numbers or specific skills. Time available
may limit the depth of evaluation. Money resources may
determine the extent of investigation of proposed solutions
or restrict the amount of outside assistance that can be
brought to bear on the source selection. All of these
resource constraints may reduce both the effort that can
be put into evaluating the attributes of each proposal,
and the precision with which the attributes may be scored.
As the potential for precision of evaluation is reduced,
color scoring becomes a more suitable technique than
numerical scoring.

Management Style of the SSA

Simon has written that management and decision-making
may be viewed as synonymous (16:1). The management
style of the SSA is an important consideration in selecting
the source selection structure. The structure should
provide the SSA with the kind of information he needs to
be able to make an effective decision within his own frame
of reference. Keen and Morton (7:62) classify decision
style into five main groups:










. rational - based on analytical definition of all
the variables to obtain the best decision.

. satisficing - based on effective use of available
information to obtain an acceptable decision.

. procedural - based on following through standard
organizational procedures toward a single
outcome.

. political - based on the use of personalized
bargaining between organizational units to
seek an acceptable decision.

. individual - based on the decision-maker's own
individual assessment of the information
available to him.

This grouping of decision-making styles suggests
that different decision-makers will seek different kinds of
information on which to act. Rational and satisficing
decision-makers are likely to feel more comfortable with
numerically-scored information, whenever it may be practically
applied. The procedural decision-maker is unlikely to
strongly favor either numerical or color scoring, so long
as he is satisfied that a correct procedure has been
followed. Political and individual decision-makers are
more likely to be attracted to color or narrative scoring
techniques as being compatible with their own styles of
management.








Choice of Scoring Method

The wide range of factors bearing on the effectiveness
of a particular source selection process suggests
that there is no one best technique for scoring proposals.
The requirement that "a qualitative rating scale will be
used in lieu of weighted scoring (1:9)" unnecessarily
inhibits ASD source selection personnel from exercising
the flexibility to choose the process best suited to each
source selection situation.

Within a source selection case, different areas
may merit different scoring processes according to the
criteria discussed above. The color scoring system does
not offer sufficient range to be able to satisfactorily
show important differences in many areas of technical
evaluation. In other areas, such as management, color
scoring may be an appropriate tool when needed to indicate
the outcome of largely subjective judgments. There is no
overriding reason why numerical and color scoring should
not be separately used in different parts of the same
source selection. If done, it would present the SSA with
an overview of both the objective and subjective aspects
of the total evaluation. Alternatively, if it is the
preference of the SSA, scores could be feasibly converted
to an "all color" or "all number" presentation at the
area level.
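The "all number" roll-up mentioned above can be sketched in a few lines. The band midpoints and area names are illustrative assumptions, not values from ASD procedure.

```python
# Assumed midpoints for a four-increment color scale on a 0-10 range.
band_midpoint = {"blue": 9.25, "green": 7.25, "yellow": 4.75, "red": 1.75}

area_scores = {
    "technical":  8.2,        # numerically scored area
    "management": "green",    # color-scored areas
    "logistics":  "yellow",
}

# Convert every area to a number so the SSA sees a single presentation.
all_number = {
    area: band_midpoint[s] if isinstance(s, str) else s
    for area, s in area_scores.items()
}
print(all_number)
```

The reverse conversion (numbers to colors) is a simple banding of each numeric score, at the cost of the discrimination discussed earlier.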









When values can be determined with higher precision
than afforded by a four-increment color range, numerical
scoring offers greater power of discrimination of the
merits of proposals than that offered by color scoring.
This advantage should not be foregone in those groups of
attributes for which numerical scoring is appropriate.
Other perceived problems of numerical scoring (clustering,
agreement on weights, and extremes not highlighted) may
be overcome by appropriate techniques. The techniques
described by Beard (MAUM) and Wald (comparison matrices)
provide practical and convenient ways of making objective
and effective weight and score allocations with reliability
and repeatability. Extreme attribute scores (mainly non-conformance)
may be simply highlighted in the evaluation
presentation or treated with an exclusion rule which
eliminates the proposal from further analytical consideration.
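A minimal sketch of deriving weights from a pairwise comparison matrix follows. The attributes, the judgments, and the simple win-counting normalization are illustrative assumptions; they stand in for, and are not taken from, the specific procedures attributed to Beard or Wald.

```python
# m[i][j] = 1 records a judgment that attribute i is more important than j.
attributes = ["performance", "cost", "schedule"]
comparisons = [
    [0, 1, 1],   # performance judged more important than cost and schedule
    [0, 0, 1],   # cost judged more important than schedule
    [0, 0, 0],
]

# Each attribute's weight is its share of pairwise "wins" (+1 so that no
# attribute is weighted at exactly zero).
wins = [sum(row) + 1 for row in comparisons]
total = sum(wins)
weights = {a: w / total for a, w in zip(attributes, wins)}
print(weights)  # performance 0.5, cost ~0.33, schedule ~0.17
```

Because each evaluator records only one judgment per pair, such matrices tend to produce the reliable, repeatable weight allocations the text describes.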

Color scoring is a suitable technique for representing
evaluations of subjective and highly conceptual attributes.
It recognizes the imprecision inherent in such areas, yet
presents a good overall comparative picture of proposals.
Extremes are highlighted. Where precision of evaluation
is possible, color scoring tends to wash out significant
differences. It presents problems in allocating relative
weightings when weights are significant to the decision,








and allows a wide variety of outcomes because of the scope
for implicit adjustment factors in the process. The disadvantages
of color scoring can be minimized by management
vigilance and skill in application, together with careful
consideration of the suitability of areas to the application
of the technique.

Procedural Aspects of Source Selection

The prime objective of the source selection process
is to obtain an impartial, equitable, and comprehensive
evaluation of competitive proposals which will result in
the selection of a source which will offer optimum satisfaction
of the government's requirements, to include cost,
schedule, and performance. The wide range of
conflicting responses obtained from interviewees regarding
the ultimate objective of the process tends to indicate
that personnel involved in source selection fail to
approach the process with a common objective. This lack
of agreement impacts on the quality of the final decision
and reduces the overall effectiveness of the process. The
need to understand and work toward a common objective in
source selection cannot be overemphasized. It is essential
in order to make a selection based on that objective.


































Effectiveness and Efficiency

During the evaluation of proposals, source selection
personnel are confronted with vast amounts of
data, a large part of which is not needed to make an
effective decision in an efficient manner. This excess
detracts from the decision-maker's main tasks. Providing
source selection personnel with excessive amounts
of data inhibits them from being able to effectively
identify and assess the small amount of really important
information needed to reach a decision which will result
in satisfaction of the government's objectives.

The study of source selection cases during this
research found examples in which areas, items, or factors
were broken down into many attributes for evaluation.
Often there was high multicollinearity between some
attributes, indicating that they did not contribute to the
decision. Some respondents to interviews expressed concern
at the proliferation of sub-division of evaluation.
Evaluation of management was cited as a particular area
of proliferation. There appears to be a tendency, if an
area is recognized as critical to the decision, to expand
the sub-headings under which it is evaluated. There is
the danger in this approach that proliferation of parts
merely leads to an averaging of scores and obscures what














is important. The discrimination of the process is
improved by keeping the number of attributes small and by
applying differential weights to them according to their
importance. Helman and Taylor (6:90) suggest that only
three items (planning, organizing, and controlling) should
be evaluated in the management area and that each item
be broken into no more than four factors.
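The multicollinearity screen described above can be sketched with a plain Pearson correlation over invented scores; a sub-factor pair that correlates almost perfectly across offerors adds nothing to the decision and is a candidate for consolidation.

```python
def corr(x, y):
    """Pearson correlation of two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Invented scores of four offerors on three management sub-factors.
planning    = [7.0, 5.0, 8.0, 6.0]
organizing  = [7.1, 5.2, 7.9, 6.1]   # tracks "planning" almost exactly
controlling = [4.0, 8.0, 5.0, 7.0]

print(corr(planning, organizing) > 0.95)   # True: redundant pair
print(corr(planning, controlling) > 0.95)  # False: independent information
```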

A need to develop a better way to consider cost
and performance tradeoffs is suggested from interview
responses. The present philosophy of source selection is
to associate a cost with a technical proposal, identify
acceptable proposals within a competitive cost range,
and then obtain best and final offers (BAFOs). Although
in theory the budget should not be a constraint, and the
contract should be awarded to the offeror who best satisfies
the government's requirements, in practice, budget
restrictions sometimes prevent the selection of the best
offeror. A tendency to emphasize cost limitation at the
expense of technical feasibility may not produce the best
decision.

Use of Scoring Techniques

Personnel involved in source selection held a wide
range of opinions on the effectiveness and appropriateness
of methods of scoring proposals: numbers, colors, or









narrative. There was little objective understanding of 
the implications of choosing one method over another, and 
choice was largely a matter of subjective preference 
based on experience. The model experimentation conducted 
during this research suggests that there are circumstances 
in which the method of scoring proposals significantly 
affects the outcome. 

Contractor Inquiries

There was broad concern over the concentrated
use of CIs to clarify and justify the work of evaluators
beyond what was required for contract definitization.
Much of the use of CIs was felt by many interviewees to
be a device to protect the organization from future protests
by unsuccessful offerors. Proliferation of CIs
tends to extend evaluation time and can lead to technical
leveling and raising low-cost proposals to more favorable
evaluation levels. Cases were seen where initially poorly-rated
proposals were rescored to acceptable levels as the
result of CI actions. It was observed that the introduction
of the four-step solicitation process (20:1)
contributed to a large reduction in the use of CIs.

Clearly, a balance is required between sufficient
use of CIs to provide adequate contract definitization and
an inefficient excess of CIs. Many personnel felt that a









proper balance was not being achieved.


Problems of Source Selection

The major problems confronting the conduct of
source selection lie in meeting effectively and efficiently
the objectives of impartiality, equitability, and comprehensiveness.
The descriptive model of source selection
developed in this research shows it to be a highly complex
process. The goals of the process are not always clearly
perceived. The effects which the techniques used have on
the interaction within the process are not widely understood
by personnel involved in the activity. Procedural
guidance tends to be fragmented and is not clear on the
suitability and applicability of the techniques available
to evaluators and decision authorities. The transitory
nature of source selection teams precludes the development
of depth of experience in many personnel key to the process.

A logical, consistent, and disciplined approach,
tailored to requirements, is necessary to provide a complete
and objective analysis with minimum resources. The
process should efficiently communicate to the SSA a clear,
complete, relevant, and objective analysis which will
provide a reliable basis for the source selection decision.









Recommendations

This study points to some possible ways in which
source selection may be made more effective and efficient.
Many of the problems experienced in making efficient and
effective source selections lie in the limited experience
and understanding of the process by many of the personnel
involved. Working-level expertise in source selection is
limited because of the relatively short time many participants
spend in the process. However, their actions impact
on decisions involving very large expenditures of money and
long-term operational commitments by the Air Force.

A significant contribution to improvement of the
operation of the process would be to introduce short training
programs for personnel entering source selections. The
training should be directed toward developing:

(1) A common understanding of the Air Force
objectives of source selection.

(2) A knowledge of the procedural framework
of source selection.

(3) An appreciation of the scoring and weighting
techniques available, their relative advantages
and disadvantages, and their scope
of application.

The training program should emphasize the principle 
of essentiality in source selection. A source selection 









should be concerned with what is essential to the decision.
It should focus on collecting and evaluating essential
data. Efficient source selection plans should restrict
the use of evaluation items and limit the factors to a few
significant headings which will facilitate meaningful
discrimination between proposals. The literature review,
findings, and discussions in this study provide insights
into the source selection process upon which a suitable
training program could be built.

In parallel with training development, a review of
source selection procedures is advisable. The following
changes are recommended:

(1) Avoid directed use of specific scoring or
weighting techniques.

(2) Encourage source selection planning tailored
to the specific characteristics of the
acquisition.

(3) Facilitate integration of RFP development
with source selection planning.

(4) Provide guidance on relating cost evaluation
to technical evaluation.

The better understanding of procedures and policy,
and the objectivity that can flow from such measures, should
result in smaller, more purposeful source selection teams,








and more powerful decision support mechanisms for the SSA.

The model developed here is an attempt to describe
the complexities of the source selection decision process
in ASD. It is not complete, and does not purport to be
so. However, it is hoped that it will provide a useful
basis from which to improve understanding of the process.
There is scope for much further work, some of which is
suggested in the preceding pages.









APPENDIX A

LISTING OF COMPUTER PROGRAM 




[FORTRAN source listing of the source selection scoring simulation program; illegible in this copy.]

APPENDIX B 

SUMMARY TABLES OF COMPUTER OUTPUT 


[Summary tables: results of 80 simulation runs showing the percent of time each proposal ranked highest, by coefficient set and goodness set; illegible in this copy.]
APPENDIX C 

OUTPUTS OF MANOVA TESTS 







i 



106 















APPENDIX C2 




[SPSS MANOVA output; the test statistics on this page are illegible in the scan.]





APPENDIX C4



109 


5 CASES WERE PROCESSED.

0 CASES ( 0 PCT) MISSING.





APPENDIX C5 





[SPSS regression output; the tabulated values on this page are illegible in the scan.]


110 


MULTIPLE R SQUARED .277

MULTIPLE R .526
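The two summary figures above are related by a simple identity: Multiple R is the positive square root of Multiple R Squared. A minimal consistency check on the OCR-reconstructed printout values (the .526 figure is read from the scan; the computation is standard arithmetic, not part of the original thesis):

```python
import math

# Multiple R as read from the SPSS printout (OCR-reconstructed value).
multiple_r = 0.526

# Multiple R Squared is the square of the multiple correlation coefficient.
multiple_r_squared = multiple_r ** 2

# Rounded to three places, this reproduces the printed R Squared figure.
print(round(multiple_r_squared, 3))  # 0.277

# The inverse check: the square root recovers Multiple R.
print(round(math.sqrt(multiple_r_squared), 3))  # 0.526
```

Since 0.526 squared is 0.2767, the garbled R Squared figure in the scan is consistent with .277.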






APPENDIX C6 



[SPSS regression output; the tabulated values on this page are illegible in the scan.]




APPENDIX C7




[SPSS regression output; the tabulated values, including the Multiple R and Multiple R Squared figures, are illegible in the scan.]












APPENDIX D 

GUIDE TO INTERVIEWS WITH SOURCE SELECTION PRACTITIONERS 


114 





GUIDE TO INTERVIEWS WITH SOURCE SELECTION PRACTITIONERS 


Introduction 

Outline the topic and scope of the study to the 
interviewee.

Obtain details of the interviewee's background and
experience in source selection.

Specific Points of the Discussion 

The purpose of this section of the interview is to
obtain the interviewee's perceptions of:

(1) The ultimate objective of the source selection
process and the effectiveness and efficiency
of the process toward achieving the objective.

(2) The comparison between numerical and color
scoring methods and comments on their relative
merits. The significant level of discrimination
in numerical scoring systems.

(3) The significance of Contractor Inquiries (CIs)
and Deficiency Reports (DRs) to the decision 
process. Is the effort commensurate with the 
usefulness of CIs and DRs? 

(4) Changes that could be made to improve the
source selection process.


115 










Closing Discussion 

Invite additional comment on the source selection
process that might aid the research.


116 








