
An Analysis of Navy Recruiting

Goal Allocation Models

Yevgeniya K. Pinelis • Edward J. Schmitz


with
Zachary T. Miller • Erin M. Rebhan

CRM D0026005.A2/Final
December 2011
Approved for distribution: December 2011

David Rodney, Director


Fleet and Operational Manpower Team
Resource Analysis Division

This document represents the best opinion of CNA at the time of issue.
It does not necessarily represent the opinion of the Department of the Navy.

Approved for Public Release; Distribution Unlimited. Specific authority: N00014-11-D-0323.


Copies of this document can be obtained through the Defense Technical Information Center at www.dtic.mil
or by contacting the CNA Document Control and Distribution Section at 703-824-2123.

Copyright © 2011 CNA


This work was created in the performance of Federal Government Contract Number N00014-11-D-0323. Any copyright in
this work is subject to the Government's Unlimited Rights license as defined in DFARS 252.227-7013 and/or DFARS
252.227-7014. The reproduction of this work for commercial purposes is strictly prohibited. Nongovernmental users may
copy and distribute this document in any medium, either commercially or noncommercially, provided that this copyright
notice is reproduced in all copies. Nongovernmental users may not use technical measures to obstruct or control the read-
ing or further copying of the copies they make or distribute. Nongovernmental users may not accept compensation of any
manner in exchange for copies. All other rights reserved.
Contents
Executive summary ............................................................................ 1
Background and tasking .............................................................. 1
Approach ...................................................................................... 2
Findings and recommendations .................................................. 2
Enlisted active.......................................................................... 2
Enlisted reserve ....................................................................... 2
Officer active ........................................................................... 3
Officer reserve......................................................................... 3
Medical .................................................................................... 3

Introduction ....................................................................................... 5
Background and tasking .............................................................. 5
Issues ............................................................................................. 6
Approach ...................................................................................... 7
Organization of this report .......................................................... 8

Enlisted active component (AC)......................................................... 11


Introduction ................................................................................. 11
Market definition .......................................................................... 12
Geography .............................................................................. 12
Eligibility ................................................................................. 12
Size .......................................................................................... 13
Enlisted AC goaling—the enlisted goaling model ..................... 13
Description ............................................................................ 13
Performance ........................................................................... 14
New goaling issues ........................................................................ 15
Unit of analysis ........................................................................ 15
Demographic diversity .............................................. 16
Lessons from other Services ........................................................ 17
Addressing the issues: A zip-code level goaling model with
diversity data .............................................................................. 19
Estimation technique—the Zero-inflated Poisson Model .... 19
Model inputs—the any-contracts model ...................................... 20
Model inputs—the count model .............................................. 22
Model results ........................................................................... 23
Model diagnostics ................................................................... 27
Recommendations ........................................................................ 28

Enlisted reserve component (RC)....................................... 31


Introduction .................................................................................. 31
Enlisted RC goaling—the current models.................................... 31
Lessons from other Services ......................................................... 32
U.S. Army ................................................................................. 32
U.S. Marine Corps ................................................................... 34
Recommendations ........................................................................ 36

Officer active component (AC)........................................................... 39


Introduction .................................................................................. 39
Current officer AC goaling method ............................................. 40
Description .............................................................................. 40
Performance ............................................................................ 41
New goaling issues ........................................................................ 43
Demographic diversity ............................................................. 43
Lessons from other Services ......................................................... 43
U.S. Army ................................................................................. 44
U.S. Air Force .......................................................................... 44
U.S. Marine Corps ................................................................... 45
Addressing goaling issues in the officer AC ................................ 47
Suggestions for a way forward ...................................................... 52
Ideas and suggestions ............................................................. 52
Diversity ................................................................................... 53

Officer reserve component (RC)........................................................ 55


Introduction .................................................................................. 55
Officer reserve PS .......................................................................... 55
Current goaling model ........................................................... 55
Addressing officer reserve PS goaling issues ......................... 55
Lessons from other Services ......................................................... 56
Ideas and suggestions ............................................................. 57
Officer reserve NPS ....................................................................... 58
Current goaling model ........................................................... 58
Lessons from other Services ......................................................... 58

Ideas and suggestions ............................................................. 59

Medical officer recruiting .................................................................. 61


A description of goaling issues in medical officer recruiting ..... 62
U.S. Navy ................................................................................. 62
Lessons from other Services ........................................................ 64
Potential ways forward ................................................................. 66

Future work ........................................................................................ 69

Appendix A: Recent goal allocation concerns and ROY winner interviews ......... 71
Recruiters’ geographical range .................................................... 73
Technology ................................................................................... 73
Past production ............................................................................ 74
Medical officer recruiting ............................................................ 75

Appendix B: Enlisted active component (AC) zip-code-level model results ......... 77
A-cell recruits ................................................................................ 77
Black recruits ................................................ 79
Black A-cell recruits ...................................................................... 80
Hispanic recruits .......................................................................... 81
Hispanic A-cell recruits ................................................................ 82
Female recruits ............................................................................. 83
Female A-cell recruits ................................................................... 84

Appendix C: Examples of data sources for medical officer goaling ......... 87

Appendix D: Review of incentives literature .................................... 93


Background .................................................................................. 93
Recruiter incentives ..................................................................... 94

References .......................................................................................... 97

Bibliography ....................................................................................... 101

List of tables ....................................................................................... 103

Executive summary
Background and tasking
The Navy continually aims to have the right combination of per-
sonnel to meet its dynamic needs. Achieving the desired overall
force composition requires accessing the right mix of recruits—
enlisted and officer, active and reserve. To do this, the Navy must
have details about the available recruitable population, including
where specific types of people are located.

The Navy desires to improve the way it allocates recruiting goals.


The Navy Recruiting Command (NRC) uses econometric models to
guide how it geographically allocates goals for recruiting enlisted
personnel, and employs a less rigorous method for officers. These
models consider a variety of factors, but they have limitations. There
has not been a review of the enlisted model since the late 1990s,
and the current model does not consider all the components of the
recruiting market that the Navy may want to examine. The Navy’s
method for allocating recruiting goals for officers has been devel-
oped in even less detail, and has not been evaluated recently. Also,
the perception among recruiters is that there might be too much
emphasis on past production when setting goals, especially in offi-
cer recruiting models. A broader issue is that recruiting goals are
tied to Navy Recruiting Districts (NRDs), which are quite large;
because precise allocation of manpower and advertising funds was
not an original objective of the goaling model, the model cannot
support such allocation. In the future, more precise goaling models
can facilitate NRC’s ability to restructure recruiting.

In light of these issues, the Commander of NRC (CNRC) asked
CNA to suggest how to improve the existing models and methods.
To address this tasking, we offer suggestions for advancing the
methodology underlying their current ways of setting goals.

Approach
We began by talking to the winners of the Recruiter of the Year
(ROY) awards to understand what they think is lacking in the cur-
rent goaling process. Next, we reviewed the literature on recruiting
practices and modeling methods. Then, after reviewing how the
other services allocate their recruiting missions, we examined exist-
ing databases to see what data are available as model inputs. Finally,
we created a model for active enlisted personnel and identified what
data are necessary to create a similar model for the reserve enlisted
and active and reserve officers. Using information we collected
about officer data, we developed recommendations on how to im-
prove the current approach to officer recruit goaling.

Findings and recommendations


Our study findings, results, and recommendations relate to the five
personnel types under review: active enlisted, reserve enlisted, active
officer, reserve officer, and officers in the medical field.

Enlisted active

The current NRC model for goaling enlisted recruits for the active
component (AC) was rigorously developed. It uses much of the
publicly available data, but these data lack detail and specificity, par-
ticularly with regard to location. The goals are distributed to the
NRDs, which are fairly large. To provide NRC with the capability to
account for location in goaling, we develop a model for forecasting
the number of recruits of different types from each zip code. We
verify our model’s predictive capability and recommend adopting
the zip-code level model to make better use of available market
data. This more highly detailed goaling method will be helpful as
the NRC considers reorganizing and redistributing its goals both
geographically and demographically.

Enlisted reserve

As for reserve enlisted, NRC is using available data, but the data
have some of the same issues. For non-prior-service personnel, we
recommend using our active enlisted model with some minor
changes to data inputs.

Officer active

We learned of significant data problems on the officer side. Current
accession data do not have enough detail about where potential Of-
ficer Candidate School attendees live and from which college or
university the officer was recruited. We recommend that NRC con-
tinue its effort to fill its data gaps, as well as support efforts to create
an all-service officer database. An easy modification to the current
method would be to use more publicly available market data, such
as propensity-to-enlist and medical data from the Centers for Dis-
ease Control and Prevention. Once the relevant data are in place,
the existing officer goaling method can be evaluated quantitatively
for its performance before resources are spent on improving it.

Officer reserve

Data about the prior-service and non-prior-service populations re-
veal similar issues to those of the active officer population. On the
non-prior-service side, we recommend that NRC add the market
considerations we recommended for the AC officer goaling, and
continue its efforts on accurately recording home-of-record and col-
lege data. We also recommend that NRC focus on collecting and
updating accurate, timely, and specific data about officers who leave
the active duty force, so they may be contacted later for the reserves.

Medical

The small number of officers in the medical field makes it difficult
to apply statistical techniques to the goaling process. The current
process is based, in large part, on past production, which can dis-
courage recruiters from producing as many contracts as they can. If
NRC is to keep the current medical goaling model, we propose im-
provements through incentives and additional market information
sources, such as data on medical students. However, we also rec-
ommend that NRC consider alternative ways of doing recruiter goal-
ing, including using incentives and competition to motivate
recruiters. One of our suggestions is to consider the “fantasy draft”
model for goaling medical recruiters, as is done in the Air Force.

Introduction
Background and tasking
Navy Recruiting Command (NRC) is responsible for recruiting four
main types of personnel, each of which constitutes a separate re-
cruiting market: officers and enlisted for the Active Component
(AC) and officers and enlisted for the Reserve Component (RC). In
addition, NRC has responsibility for recruiting all the Navy’s medi-
cal officers, an important submarket of AC and RC officers. An ef-
fective goaling process is a key factor in ensuring that all the Navy’s
accession goals are met and that recruiting resources are used effi-
ciently. In particular, the geographic allocation of the Navy’s re-
cruiting goals has a major impact on resource productivity, as well as
on the quantity, quality, and demographic mix of recruits. The goal-
ing processes for the five types of personnel listed above have not,
however, been updated or critically examined for some time. This is
significant because, over the last decade, several important changes
have occurred, including the following:

 NRC has increased its emphasis on meeting detailed goals,
especially for demographic diversity.

 Data about key market characteristics are, increasingly,
publicly available via the internet.

 Budget constraints have tightened, highlighting the need for
effective use of recruiting resources.

In light of these changes, the Commander, Navy Recruiting Com-
mand (CNRC) asked CNA to examine NRC’s goaling processes with
an eye toward making maximum use of available market informa-
tion and efficient use of recruiting resources.

Issues
In general, the purpose of a goaling process is to distribute a total
service accession goal to recruiters in the field in a way that maxi-
mizes the probability that the aggregate mission is met. Depending
on its design, a goaling model may also be used to signal changes in
the recruiting environment that call for changes in the overall level
of recruiting resources or changes in their distribution across the
country.

An important consideration in the allocation process is equity: all re-
cruiters should have the same opportunity to succeed. To equitably
distribute recruiting goals across geographic markets, a goaling
model must control for the underlying productivity of a particular
area. For example, a full model might take into account the size of
the target population; economic and labor market variables that cap-
ture alternative employment opportunities; indicators of the relevant
population’s propensity to join the military; and measures of Navy
and other military recruiting resources applied to the area. In prac-
tice, some of these variables are related to each other in the way that
they affect an area’s productivity, so data may show that only a subset
of these variables is required for an adequate goaling model. Equity
concerns are not entirely about fairness; they are also related to pro-
ductivity. Goals that are perceived to be too difficult for the market
can harm recruiter morale and, therefore, productivity.

Navy Recruiting Command (NRC) uses econometric models to
geographically allocate recruiting goals for AC and RC enlisted per-
sonnel, and employs less formal methods to allocate goals for AC,
RC, and medical officers. All the models consider some important
factors, but they have limitations. Although the enlisted goaling
model is statistically rigorous, it may not include all the components
of the recruiting market that the Navy may want to examine. The
methods for allocating recruiting goals for officers are not statisti-
cally rigorous, largely due to quality and quantity of available data,
and may place too much emphasis on past production, while not
taking advantage of available market data. An issue that affects all
five goal allocation models is that they are defined at the Navy
Recruiting District (NRD) level, which prevents the use of poten-
tially valuable market information and the precise geographic allo-
cation of resources, including recruiters, advertising, and stations.

To inform improvements to NRC’s current goaling methods, this
study addresses the following questions for each of the five person-
nel types and their associated markets:

 What is the most effective goaling level or unit of analysis?

 What market, demographic, and resource factors should be
included in each model?

 What method should be used to allocate recruiting goals?

Approach
This study was done in two phases. We began with exploratory ac-
tivities; then, based on results from these activities, we moved to
model assessment and development.

The exploratory phase of the study focused on understanding the
goaling methods currently used for each market by the Navy, as well
as the other services, and on identifying market-specific issues asso-
ciated with each approach. Information on the Navy came from
relevant literature as well as from discussions with CNRC personnel
at various levels; selected 2010 Recruiters of the Year (ROY), whose
input we summarize in appendix A; and
participants in the 2011 Officer Goaling Conference. Information
on other services was provided to us by the staff of the U.S. Army
Recruiting Command (USAREC), Marine Corps Recruiting Com-
mand (MCRC), and the Air Force Recruiting Service (AFRS). We
also examined existing Navy and market data to explore the market-
specific potential for statistical modeling and to inform model in-
puts and structure. We reviewed the following data sources:

 Navy data

— Enlisted Master File (EMF)

— Officer Master File (OMF)

— Personalized Recruiting for Immediate and Delayed
Enlistment (PRIDE)

— Reserve Component Common Personnel Data System
(RCCPDS)

 Market data

— The U.S. Census

– General population statistics

– Woods & Poole and Qualified Military Available (QMA)
subsets

— Public Use Microdata Sample (PUMS), which is part of
the American Community Survey (ACS)

— Integrated Postsecondary Educational Data System
(IPEDS)

— National Center for Education Statistics (NCES)

— National Center for Veterans Analysis and Statistics
(NCVAS)

— Centers for Disease Control and Prevention (CDC)

— Association of American Medical Colleges (AAMC)

— American Association of Colleges of Osteopathic Medicine
(AACOM)

Based on the results of these exploratory activities, for the enlisted
AC market, we developed a zip-code level goaling model that takes
into account key market factors, including information related to
gender and race/ethnicity. We then evaluated the model against
real data from outside the estimating sample to assess its predictive
accuracy. For the other four markets, data issues and resource con-
straints dictated that we take a more qualitative approach. For these
markets, we identified appropriate modeling techniques and the
data required to support such models. We also considered non-
modeling approaches to improve the goaling process.

Organization of this report
The paper is divided into five main sections: one for each of the
four market segments (enlisted AC, enlisted RC, officer AC, and
officer RC), and a final section on recruiting medical professionals.
Within each section, we describe the Navy’s current goaling method
and identify the issues we believe need to be addressed. This setup is
followed by
a brief summary of the other Services’ goaling methods and lessons
learned from them. With all the necessary information in hand, we
then analyze the goaling issues and make recommendations.

Enlisted active component (AC)
Introduction
The enlisted AC mission is by far the Navy’s largest recruiting mis-
sion and the goaling model for enlisted AC personnel is the Navy’s
most sophisticated. It uses a combination of statistical methods to
forecast high-quality male contract production. The model includes
historical production, recruiting resources, and economic and
population factors, as well as seasonality, pay, and other variables.
The Enlisted Goaling Model (EGM) is used to provide goals for
each of the two Regions (East and West). It is also configured
to provide estimates of the recruiting potential for each of the 26
NRDs. The Regions may use the goaling model recommendations
for their NRDs, but they are free to modify these allocations. Each
NRD further redistributes its goals down to recruiting stations and
individual recruiters, using whatever approach they deem most rele-
vant.

The EGM has remained largely unchanged for at least two decades,
over which time the recruiting environment has changed substan-
tially. Two changes have particular relevance for enlisted AC goal-
ing. First, an increased emphasis on demographic diversity suggests
a need to include new factors in the model. Second, tighter budget
constraints, and the concomitant need to use the goaling model to
support efficient resource allocation, suggest a need to model em-
ploying a smaller unit of analysis.

Therefore, in this section, we develop a new enlisted goaling model
that uses the zip code as its level of analysis and that allows for more
refined modeling of contracts for the demographic sub-groups of
interest. We begin by defining the enlisted AC market to provide
context for the modeling discussion and to elucidate how the
goaling and modeling concerns for enlisted AC personnel differ
from the goaling and modeling concerns of the other four types of
personnel addressed in this report.

Market definition
Geography
For the Navy, enlisted AC recruiting markets are defined
geographically by recruiting stations and the areas surrounding
them: together, the recruiters assigned to each station are
responsible for covering a surrounding geographic territory that is
roughly defined by zip codes [1]. The stations are nested within the
26 NRDs, and NRDs aggregate to the two Regions.

Eligibility
Within geographic areas, eligibility requirements for enlisted
personnel further define the enlisted AC market in terms of
education level and age. The primary target population for enlisted
AC recruiting is high school students and high school graduates
ages 17 to 22. This is the primary market because members of this
age group are both “at the stage of life that career decisions are
natural” and at the “optimum training age” [1]. A secondary target
market consists of men in the 22- to 29-year-old age group, with or
without a high school diploma [1]. Although the secondary market
officially includes non-high-school-degree graduates (NHSDGs), the
Navy only enlists a limited number of these each year because this
group has been shown to have high first-term attrition relative to
high-school-degree graduates (HSDGs)² [2].

² The DOD restricts NHSDGs to 10 percent of total accessions, and the
Navy currently places a 5 percent cap on NHSDG accessions [30].

The enlisted AC market is further defined in terms of recruit quality
based on scores on the Armed Forces Qualifying Test (AFQT).
Recruits who attain a percentile of 50 or above and have a high
school diploma are called A-cell recruits, those with the same AFQT
scores but without a high school diploma are B-cells, and those who
score below 50 but have a high school diploma are considered Cu-
cells [3]. These cells are shown in figure 1. In general, recruiters
seek to enlist A- and Cu-cell recruits; the Navy does not enlist
anyone with an AFQT score below 35 and does not enlist NHSDGs
with AFQT scores below 50 [1].

Figure 1. Recruit Quality Cell Matrix

AFQT percentile    HSDG    NHSDG
50 – 99             A       B
31 – 50             Cu      D

Finally, additional eligibility requirements related to health status,
citizenship, and criminal behavior further narrow the market.
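To make the cell definitions concrete, the following minimal sketch maps an AFQT percentile and diploma status to the quality cells in figure 1. It is illustrative only; the enlistment floors noted in the comment reflect the limits cited above.

    def quality_cell(afqt_percentile: int, hsdg: bool) -> str:
        """Map an AFQT percentile and diploma status to a recruit quality cell
        (figure 1). Note the Navy does not enlist below AFQT 35, or below
        AFQT 50 for NHSDGs, per the eligibility rules described above."""
        if afqt_percentile >= 50:
            return "A" if hsdg else "B"
        if afqt_percentile >= 31:
            return "Cu" if hsdg else "D"
        return "ineligible"

    assert quality_cell(62, hsdg=True) == "A"
    assert quality_cell(40, hsdg=True) == "Cu"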

Size
Between FY2007 and FY2011, the average size of the enlisted AC
recruiting mission was just under 36,000 accessions.³ With a mission
this large, it is feasible to use a statistical goaling model to allocate
the recruiting goals at a low level of geographic detail.

³ These data came from the Facts and Statistics tab on the CNRC website:
http://www.cnrc.navy.mil/PAO/facts_stats.htm.

Enlisted AC goaling—the enlisted goaling model


Description
The model used for enlisted AC goaling is known as the enlisted
goaling model (EGM). It is a sophisticated econometric model that
is designed to determine the supply of eligible recruits and to
allocate the recruiting mission to each NRD. To account for the
possibility of a strong relationship between recruiting in a particular
quarter and recruiting in past quarters, the model is estimated in an
autoregressive form. The dependent variable in the EGM is the
number of net new contracts from each NRD in a given quarter.
The explanatory variables are:

 A series of variables to capture the sizes of the male A- and
Cu-cell populations in each NRD

 The number of production recruiters in each NRD⁵

 The seasonally unadjusted national employment rate

 The ratio of military pay to civilian youth earnings

 The Youth Attitude Tracking Study (YATS) propensity to
enlist

 The combined amount of money spent by the Army and
Navy on advertising

 The number of A-school seats available

 The total veteran population in each NRD and

 Controls for season, government shutdown in 1995, and
individual NRD effects

⁵ The number of recruiters is determined using a different model. Each
year, planning staff at NRC determine the desired number of recruiters
using a constrained optimization model that calculates the cost-minimizing
number of recruiters for a given beginning-of-year contract objective or
the maximum number of contracts for a given number of recruiters. The
model includes as parameters the programmed levels of other recruiting
resources (e.g., advertising and enlistment incentives), forecasts of the
national unemployment rate and of military pay relative to civilian pay, as
well as the supply response to increases in the number of recruiters, which
comes from the EGM [29].

However, because the model was developed in the late 1980s and
early 1990s, the supply of “eligible recruits” refers to the supply of
male recruits only. Also, the model includes no information about
race and ethnicity.

Performance
The EGM was last evaluated in the late 1990s by CNA, which
reported its results in An Econometric Analysis of the Enlisted Goaling
Model by Goldhaber [4]. In general, Goldhaber found that the
model was reasonably accurate at predicting the number of A-cell
male enlistments on the NRD level, but that different models
should be used for subgroups of the population, such as workforce
recruits vs. high school seniors.

Goldhaber also explored adding new variables to the model, such as
average tuition in four-year public and private colleges and
universities (by state), average wages of those with at least an
associate degree and those with at least a bachelor’s degree, and
average unemployment rates for those with some college but less
than a bachelor’s, and those with at least a bachelor’s degree.
However, Goldhaber concluded that the modeling complications
introduced by these additional variables were too large compared
with the increase in model precision. As a result, with the one
exception identified below, he recommended keeping the existing
mix of variables for A-cell recruits [4].

Goldhaber recommended taking the advertising variables out of the
model because of their endogeneity to the outcome: the Navy
spends more money on advertising where and when it is less
successful in its recruiting efforts, so in the model there was a
negative correlation between advertising and recruit production,
which is counterintuitive. Because assessing the effect of advertising
is important, he recommended estimating it in a separate study and
connecting it to changes in propensity to enlist rather than to the
number of contracts. [4]

New goaling issues


Although the EGM has performed well over the years, CNRC has
raised two concerns regarding its overall design.

Unit of analysis
Based on our initial tasking and on additional conversations with
NRC staff, CNRC’s primary concern regarding the EGM is the unit
of analysis. Because NRDs are large—spanning multiple labor
markets and even multiple states—modeling at the NRD level does
not allow the Navy to precisely allocate the recruiting goal based on
market-specific conditions and needs. While recruiting goals are
eventually distributed down to the station level, this is done without
the help of a model or rigorous methodology. A goaling model with
more geographical detail could make both goal and resource
allocation more structured and potentially more efficient.

Demographic diversity
The Navy has become increasingly focused on demographic
diversity, with the goal of growing a force that is representative of
the nation in terms of race, ethnicity, and gender. And, compared
to the other Services, the Navy has been relatively successful at
recruiting racial/ethnic minorities and women into its enlisted
ranks. For example, the Military Leadership Diversity Commission
(MLDC) reported that, for 2007 and 2008, the Navy was the only
Service whose enlisted AC accessions were not disproportionately
white relative to the eligible recruiting pool. And, only the Air Force
had a higher share of women among its accessions for the same
years: 23 percent for Air Force accessions compared to just below 19
percent for Navy accessions. The female accession shares for the
Army and the Marine Corps were 16 and 7 percent, respectively. [5]

To ensure that its accessions continue to be representative, the Navy
sets diversity targets for each NRD. Since the EGM cannot be used
to generate these targets, they are created using a combination of
past production and the demographic mix of the NRD population.
And, like the overall goal, diversity targets are distributed to the
station level without the use of a formal model. Having more
geographically precise information on which geographic markets
have larger shares of women and minorities who are in the right age
group and meet the Navy’s eligibility requirements would help
achieve the representation goal.

In addition, from talking with the recruiters of the year, we learned
that they rarely go out to specifically recruit minority candidates.
Because recruiters rely on their NRC regional assignments, it is all
the more important to learn in detail where minority recruits are
located and to continue placing recruiters in areas with racially and
ethnically diverse populations.

Lessons from other Services⁶
Before addressing these issues, we looked to the other Services for
ideas. Our review of the other Services’ goaling methods for the
enlisted AC market showed that none has a model that is as detailed or
statistically rigorous as the EGM. Instead, all three other Services
allocate their national goals to lower geographic levels using measures
of market size or past production, or combinations of the two.

The Army sets goals at the station level (there are 1,400 Army
recruiting stations across the country) using a weighted average of
just two factors. The first is a measure of historical past production⁷
that captures the overall Department of Defense (DOD) production
of high-quality contracts. The second is a projection of the qualified
military available (QMA)⁸ population between 17 and 29 years of
age. The weights the Army assigns to these two factors can change
over time. In the past, the Army has assigned a 90 percent weight to
past production and a 10 percent weight to the QMA population.
Recently, however, the weights changed to 60 percent and 40
percent, respectively. The Air Force allocates national recruiting
goals at the group⁹ level based solely on a 5-year average of past
production, with the most recent three years weighted more than
the other two.¹⁰

⁶ We are grateful to Mr. Mike Nelson at USAREC, Col T.J. Kenney at AFRS,
and Captain Joseph Wydeven at MCRC for providing CNA with the
following information on goaling in their respective services.

⁷ The past production measure is a weighted average of the last four years
of production, with the weights declining from 40 percent for the most
recent year to 30, 20, and 10 percent for the next three years.

⁸ QMA is the total 17- to 24-year-old population, excluding the
institutionalized, those in military service, unauthorized immigrants, and
NHSDGs not enrolled in high school or an equivalency program.

⁹ The Air Force Recruiting Service (AFRS) is organized into 3 groups, 24
squadrons, and 1,215 recruiting offices.

¹⁰ In the past, the model included manning and population data in
addition to past production. In recent years, however, evaluations of the
model have shown the contribution of manning and population to be small
in comparison with past production; as a result, the Air Force has been
using the simplified model.

Finally, the Marine Corps sets goals at the zip-code level using two
population measures and a past production measure. It begins with
estimates of the 17- to 24-year-old civilian non-institutionalized
population (CNIP) provided by an outside agency called Woods
and Poole (W&P). Then, it determines what percentage of enlist-
ees should score 50 or above on the AFQT using five years’ worth
of results from ASVAB (Armed Services Vocational Aptitude Bat-
tery) test takers (regardless of whether the test taker subsequently
enlisted). The population estimates are then combined with five
years’ worth of zip-code level DOD production data. Based on this
information, the Marine Corps determines what each station’s mis-
sion share should be. Furthermore, this population distribution is
used to determine the allocation of recruiting resources. If there is
a shift in population, recruiters are moved between the regions,
but the total remains the same.
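For illustration only, the sketch below shows the kind of mission-share computation that this blending of population and past production implies. The weights, column names, and numbers are hypothetical stand-ins, not the Marine Corps’ actual procedure.

    import pandas as pd

    def mission_shares(df, w_pop=0.5, w_prod=0.5):
        """Blend a population share with a past-production share by zip code.
        Columns 'cnip_17_24' and 'dod_contracts_5yr' are hypothetical names
        for the W&P population estimate and five years of DOD production."""
        pop_share = df["cnip_17_24"] / df["cnip_17_24"].sum()
        prod_share = df["dod_contracts_5yr"] / df["dod_contracts_5yr"].sum()
        return w_pop * pop_share + w_prod * prod_share

    zips = pd.DataFrame({
        "zip": ["22030", "60601", "73301"],
        "cnip_17_24": [5200, 14800, 9100],
        "dod_contracts_5yr": [40, 95, 55],
    })
    zips["share"] = mission_shares(zips)
    zips["goal"] = (1000 * zips["share"]).round()  # notional total mission of 1,000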

We drew two conclusions from our review of the other Services’
methods. Our first conclusion was to minimize using a measure of
past production in our model. This conclusion was based on evi-
dence that methods that rely too much on past production can
provide the wrong incentive for recruiters, who may perceive that
producing a lot of contracts this year will only make their jobs that
much harder next year. In their 2006 evaluation of the Army’s
goaling method, Dertouzos and Garber emphasize how heavily the
latest past-production numbers weigh on setting fair goals and cau-
tion against their potential negative effect on recruiter motivation [6].
In our conversations with 2010 Recruiters of the Year, we heard
the same message with regard to the Navy’s goaling method for of-
ficers; as such, this issue will be addressed again in the AC and RC
officer sections. Our second conclusion was to model at the zip
code level. The zip-code level model used by the Marine Corps ap-
pears to provide a good level of detail to use for resource
allocation.

Addressing the issues: A zip-code level goaling model with
diversity data
To address the issues raised by CNRC, we created a zip-code-level
model. Zip-code data are the most granular and detailed information
available about potential recruits. Including such data is beneficial
for several reasons. First, tracking specific population information by
zip code will allow the Navy to be more responsive to changing
demographic needs and to target specific subpopulations as neces-
sary. Like the Marine Corps, the Navy could begin allocating re-
sources and recruiters to those areas that are most likely to produce
particular types of contracts. Second, understanding which areas are
likely to be most and least productive could lead to the closing or
consolidation of recruiting stations. Similarly, the Navy could place
recruiting stations strategically to minimize the distance between
them and promising populations.

Research suggests that this final point on distance and travel time
between recruiting stations and potential recruits merits additional
attention. In 1992, Bohn and Schmitz looked at the effect that dis-
tance has on enlistment rates. They hypothesized that an increase in
distance would have a negative effect on the rate of recruitment and
found this to be true. [7] A later study by the same authors used a
different modeling technique but reached a similar conclusion: The
greater the distance from the recruiting station to a recruit’s loca-
tion, the more production rates fell. [8] Evaluating production on a
zip-code level can inform the NRC about where most of the recrui-
table population resides. This, in turn, can inform station place-
ment in an effort to decrease the distance and travel time between
the station and as many potential recruits as possible.

Estimation technique—the Zero-inflated Poisson Model


In this subsection, we describe our model for recruiting on the zip-
code level. Because each zip code produces a nonnegative whole num-
ber of contracts, an ordinary least squares model would not be ap-
propriate because it places no restrictions on the sign of the
outcome and is primarily designed for continuous outcomes.
Rather, a count model using the Poisson distribution is appropriate.
In practice, however, many zip codes are expected to produce no
contracts at all, so we use a zero-inflated Poisson (ZIP) model. The
ZIP model provides a way of modeling the excess zeros¹¹ in addition
to the counts of recruits we expect from each zip code.

¹¹ We consider these zeros “excess” because they would not be expected
under an ordinary Poisson model. A formal statistical procedure called
the Vuong test will be used to verify that the zero-inflated model is
appropriate for the data.

To accommodate the extra zeros, the modeling process is divided
into two stages. In the first, we estimate whether the zip code is ex-
pected to produce any contracts at all. In the second, we estimate
how many contracts the zip code will produce conditional on this
number being greater than zero [9].
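In standard notation, the two stages combine into one probability model for the contract count Y_i of zip code i. Here pi_i is the first-stage probability that the zip code produces no contracts, lambda_i is the second-stage Poisson mean, and z_i and x_i denote the two stages’ input variables. This is the textbook form of the ZIP likelihood, written out for reference:

    P(Y_i = 0) = \pi_i + (1 - \pi_i)\, e^{-\lambda_i}
    P(Y_i = k) = (1 - \pi_i)\, \frac{e^{-\lambda_i}\, \lambda_i^{k}}{k!}, \qquad k = 1, 2, \ldots
    \operatorname{logit}(\pi_i) = z_i'\gamma, \qquad \log(\lambda_i) = x_i'\beta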

Recent studies have shown that the variables we use as predictors
are important for explaining enlistments (see [10]). These vari-
ables are primarily socioeconomic in nature (e.g., population, edu-
cation, and crime data). Therefore, multicollinearity is a potential
modeling concern.¹² Although this may not affect the model’s pre-
dictive ability, the estimated coefficients in the model can become
unstable and sensitive to model specification. The coefficients can
take signs that are counterintuitive, but, because the purpose of this
model is prediction rather than coefficient estimation, we consider
multicollinearity a secondary concern and caution against interpret-
ing the coefficients in isolation.

¹² This phenomenon occurs when multiple predictors in the model are
related to each other and therefore explain the outcome in related ways.

Our model is different from the EGM in three ways: the estimation
technique, the unit of analysis, and the input independent variables.
Because we are using a zip-code-level zero-inflated Poisson model,
we draw on literature that has considered similar modeling tech-
niques for its components (see [10] in addition to the EGM).

Model inputs—the any-contracts model


The first step in the modeling process is estimating the likelihood
that a zip code will yield any contracts at all. The inputs to this
model are listed below. Notably missing from this list are economic
inputs such as the unemployment rate. During our model selection
process over several years of data, the unemployment rate was not
shown to have a significant relationship with total enlistments, so we
did not include it in the model presented below, but it does come
up as significant in other models included in appendix B. Addition-
ally, we use market awareness measures as a substitute for propensity
to enlist and advertising.

Distance to nearest college or university

We included data from IPEDS (Integrated Postsecondary Educa-
tional Data System) on the distance “as the crow flies” from the zip-
code centroid to the nearest centroid of a zip code that contains a
degree-granting college or university. We also included the square
of this distance and a dummy variable for situations in which there
is a college or university in the zip code under consideration. We
hypothesize that the nearby presence of a college or university
might affect a youth’s future goals, steering him or her away from
the military and toward higher education.

Size of the nearest college or university

IPEDS also provides the size of the nearest college or university. The
variable is divided into ordinal groups of under 1,000, 1,000–4,999,
5,000–9,999, 10,000–19,999, and 20,000 and over. We recoded these
as one through five, respectively, for our analysis. We hypothesize
that there might be a relationship between the size of the nearest
college or university and the academic atmosphere around the zip
code, and thus, the zip code’s likelihood of producing enlistees.

Interaction of size and distance

We also include the interaction of size and distance of the nearest
college or university in our model. This will help determine whether
the effect of distance to nearest college on recruit production varies
with the size of that college.

Multiple schools flag

We flag zip codes that have multiple colleges or universities in them.

Historically Black College or University (HBCU)

We flag zip codes that contain an HBCU. We hypothesize that this
may be particularly important for our minority models, but we left it
in all models to check for effects.
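As a minimal sketch, the first-stage inputs just described can be assembled as follows. All column names are hypothetical, and the IPEDS-derived fields are assumed to be already merged onto one row per zip code.

    import pandas as pd

    def build_any_contracts_features(df: pd.DataFrame) -> pd.DataFrame:
        """Assemble the first-stage (any-contracts) inputs described above.
        Input column names are hypothetical stand-ins."""
        out = pd.DataFrame(index=df.index)
        out["dist_college"] = df["dist_college_miles"]
        out["dist_college_sq"] = df["dist_college_miles"] ** 2
        out["college_in_zip"] = (df["n_colleges_in_zip"] > 0).astype(int)
        out["college_size"] = df["college_size_group"]          # ordinal 1-5
        out["size_x_dist"] = out["college_size"] * out["dist_college"]
        out["multiple_schools"] = (df["n_colleges_in_zip"] > 1).astype(int)
        out["hbcu"] = df["has_hbcu"].astype(int)
        return out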

Model inputs—the count model

In the second stage “count model,” we model the predicted number
of enlisted AC contracts for each fiscal year. Below, we list the ex-
planatory variables.

Distance to the responsible Navy Recruiting Station

Each zip code is assigned as an area of responsibility to a recruiting
station. We got the list of these assignments from CNRC and com-
puted the distance “as the crow flies” between the two zip code cen-
troids: the zip code in question and the station’s zip code. We also
added the square of this variable to the model because it has been
shown to be significant [10].
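The “as the crow flies” distance between two zip-code centroids can be computed with the haversine formula, as in this sketch; the centroid coordinates are assumed to come from a zip-code gazetteer, and the example coordinates are arbitrary.

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle ("as the crow flies") distance in statute miles."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 3959.0 * asin(sqrt(a))  # mean Earth radius of ~3,959 miles

    # Distance from a zip-code centroid to its responsible station's centroid
    d = haversine_miles(38.846, -77.306, 38.803, -77.072)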

Demographic data

We included population data in our model. We alternated between
several specifications, using the QMA dataset and the W&P data, but
ultimately, the detail provided by W&P provided a more accurate
model for prediction of total inventory, so we present our findings
based on that dataset. We included population counts, by zip code,
of 17- to 19-year-olds and 20- to 21-year-olds, by race and education
group. We included the racial/ethnic categories of black, Hispanic,
and white, as well as the following education categories: currently in
college, currently in years 1 through 3 of high school, currently a
high school senior, and high school graduate. For some subpopula-
tions of the total inventory, QMA data were better predictors, and in
some cases we had to use a combination of the QMA and W&P data.
These data came from the Recruit Market Information System
(RMIS), which was provided to us by the staff at Joint Advertising
Market Research & Studies (JAMRS).

Navy Awareness Index

CNRC provided us with the Navy Awareness Index (AI). This variable
measures each location’s awareness of the Navy by summarizing in-
formation on Navy leads. More specifically, AI estimates the likeli-
hood that consumers recognize the Navy’s “product” (i.e.,
employment/career opportunities available through naval service).
These data were measured by the Designated Market Area (DMA),
and the mapping between DMAs and zip codes was provided by
CNRC.

Recruiters

We include the number of recruiters from each service who are re-
cruiting in each zip code. Because a recruiter is often responsible
for more than one zip code, this number is sometimes fractional.
These data are from RMIS.

Crime data

Because crime data have been shown to be an important variable for
predicting enlistments [10], we include data on property and vio-
lent crime, which we obtained from the Census Bureau. The data
are on the state level, so all zip codes in the state get the same num-
ber for these variables. They measure the number of property and
violent crimes per 100,000 people in the state.

Veteran population

We obtained veteran population statistics from the National Center
for Veterans Analysis and Statistics (NCVAS). The data are by
county and age group (17–44, 45–64, 65–84, and 85 and older).
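To show how such a two-part model can be estimated in practice, the sketch below fits a zero-inflated Poisson with the statsmodels library on synthetic data. The data-generating step merely stands in for the real zip-code inputs described above; this is an illustration, not our estimation code.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(0)
    n = 500                               # synthetic "zip codes"
    count_X = rng.normal(size=(n, 3))     # stand-ins for count-stage inputs
    infl_X = rng.normal(size=(n, 2))      # stand-ins for any-contracts inputs

    # Draw a synthetic outcome from a ZIP process, for demonstration only
    lam = np.exp(0.5 + count_X @ np.array([0.3, -0.2, 0.1]))
    pi = 1 / (1 + np.exp(-(-0.5 + infl_X @ np.array([0.8, -0.6]))))
    y = np.where(rng.random(n) < pi, 0, rng.poisson(lam))

    X = sm.add_constant(count_X)          # count (Poisson) stage
    Z = sm.add_constant(infl_X)           # zero-inflation (logit) stage
    fit = ZeroInflatedPoisson(y, X, exog_infl=Z, inflation="logit").fit(
        method="bfgs", maxiter=500, disp=False)
    pred = fit.predict(X, exog_infl=Z)    # expected contracts per "zip code"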

Model results
We ran our model for 2006 to 2010, with each model’s inputs pre-
dicting the following year’s recruiting results. Previous models have
been run using within-year data [10], but because our interest is

specifically in forecasting enlistments, we created a model that mir-
rors NRC’s work in this way.¹³

¹³ The Navy’s WEBSTEAM database, which is used by recruiters to identify
and analyze markets, could easily incorporate predictions from our model.

The resulting model coefficients differed slightly every year, but we
present only the most recent results in this paper, i.e., results using
2009 data to predict 2010 recruits by zip code. Table 1 shows the re-
sults for the any-contracts model. As hypothesized, the greater the
distance to the nearest college, the higher the probability of a zip
code producing recruits. The squared distance was also significant,
but the coefficient looks practically unimportant due to its size. The
largest negative effect on the probability of a productive zip code
was the existence of multiple colleges or universities in the zip code.
We interpret this as follows: the presence of multiple schools likely
takes students directly away from military services, and it might also
indirectly create an academic atmosphere encouraging college,
rather than the military, after high school. We also estimate a nega-
tive relationship between the size of the school and the probability
of a productive zip code.

Table 1. Results from the Any Contracts Model: Coefficients for
modeling productive zip codes

Variable                                  Coefficient   Std. error   p-value
Distance to closest college/university         0.1638       0.0117    0.0000
Distance squared                              -0.0021       0.0002    0.0000
College or university in zip                  -0.3313       0.1877    0.0770
Size of the college/university                -0.1455       0.0415    0.0000
Multiple school flag                          -1.8673       0.9131    0.0410
Constant                                      -1.8077       0.1811    0.0000

Table 2 presents the results from the counts model. Of particular in-
terest are the variables with p-values less than 0.05, which we deemed
statistically significant, meaning that they correlate well with our re-
sponse variable. For ease of reading, we have removed the variables
that were highly insignificant. Similar models
for subpopulations of the total inventory by race/ethnicity,
A-cell, and gender are presented in tables 7 through 13 in appendix B.

Table 2. Results from the Counts Model: Coefficients for modeling inventory on a zip-code
level
Variable Coefficient Standard Error p-value
Distance to responsible Navy Recruiting Station -0.0141 0.0005 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
W & P 17-19 currently in college Hispanic 0.0103 0.0057 0.0680
W & P 17-19 currently in college white -0.0152 0.0020 0.0000
W & P 17-19 currently in HS, year 1-3 black 0.0030 0.0012 0.0120
W & P 17-19 currently in HS, year 1-3 Hispanic 0.0141 0.0025 0.0000
W & P 17-19 currently in HS, year 1-3 white 0.0025 0.0013 0.0650
W & P 17-19 HSDG black 0.0011 0.0003 0.0000
W & P 17-19 HSDG Hispanic 0.0025 0.0004 0.0000
W & P 17-19 HSDG white 0.0003 0.0001 0.0040
W & P 17-19 senior in HS Hispanic -0.0113 0.0014 0.0000
W & P 17-19 senior in HS white 0.0017 0.0006 0.0040
W & P 20-21 college grad White 0.0075 0.0015 0.0000
W & P 20-21 currently in HS, year 1-3 black -0.0203 0.0060 0.0010
W & P 20-21 currently in HS, year 1-3 white -0.0292 0.0060 0.0000
W & P 20-21 HSDG black -0.0012 0.0003 0.0000
W & P 20-21 HSDG Hispanic -0.0015 0.0004 0.0000
W & P 20-21 HSDG white 0.0006 0.0001 0.0000
W & P 20-21 senior in HS black 0.0088 0.0023 0.0000
W & P 20-21 senior in HS Hispanic 0.0193 0.0029 0.0000
W & P 20-21 senior in HS white 0.0080 0.0016 0.0000
Navy Awareness Index in 2009 0.5996 0.0512 0.0000
USAF recruiters in 2009 0.0855 0.0182 0.0000
USMC recruiters 0.0325 0.0095 0.0010
USN recruiters 0.0677 0.0040 0.0000
Violent crime 0.0002 0.0001 0.0050
Property crime 0.0003 0.0000 0.0000
Veteran population 17-44 0.3459 0.0274 0.0000
Veteran population 65-84 -0.2061 0.0321 0.0000
Veteran population 85 and up 0.5167 0.0973 0.0000
Constant -1.6927 0.0637 0.0000

Distance to the nearest recruiting station, as well as its square, were
statistically significant in predicting total recruits in a zip code. As
expected, the distance to the nearest recruiting station has a nega-
tive relationship with enlistments. The farther the station is from a
given zip code, the less Navy presence is in that zip code, and the
more effort is required for the recruiter and the potential recruit to
make the contact. The square of the distance is also statistically sig-
nificant, but its coefficient is nearly zero and thus has almost no
practical impact on the predicted count of recruits.

Demographics were also significantly correlated with enlistments.
Of the many W&P variables included in the model, the largest posi-
tive coefficient was obtained for the number of 17- to 19-year-old
Hispanics currently in the first three years of high school. Some co-
efficients for the population data are actually negative. This could
be because the relationship with enlistments is actually negative
(with college degrees, for example), or it could be the result of the
inevitable multicollinearity of all the population variables. The coef-
ficients seem to suggest that it is easier to recruit in areas with many
Hispanic high school students. Spending time interpreting the coef-
ficients in this case, however, is not efficient because it is likely that
collinearity has made the estimates unstable. The purpose of this
model is to predict the final number of enlistees expected from
each zip code; the model coefficients can be used to do so despite
the caution with respect to their interpretation. Model results
meant to identify high-producing diversity markets are presented in
appendix B.

The largest positive coefficient in the count portion of the model is
for the Navy AI. This means that the AI, which measures the ratio of
the number of responses from all advertising sources (leads) in a
DMA to the average number of leads across DMAs, is very strongly
correlated with enlistments. This speaks to the effectiveness of Navy
advertising.

The coefficients on the numbers of recruiters from other services
are all positive. This is an indication that military presence in gen-
eral contributes to the recruiting environment and overshadows
whatever competition there is between services for recruits.

The data on veterans are also significant. We note that the presence
of very young and very old veterans, who likely participated in the
Iraq War and World War II, respectively, has a positive relationship
with enlistments. The presence of those in the age group between
65 and 84 is negatively related, likely because of the controversial
war in Vietnam. The group of 45- to 64-year-olds is not significant,
possibly because both younger and older veterans are in that group.
On average, then, there is no effect. Unfortunately, these are the
only breakdowns provided by the dataset so we cannot investigate
this further.

Model diagnostics

After performing a test to confirm that the zero-inflated model was
appropriate,¹⁴ we evaluated our model based on its predictive abili-
ties. First, we computed the predicted number of enlistments for
every zip code in our analysis. Then, we instituted the following
checks. We computed the average difference between actual 2010
contracts from the zip codes and the predictions the model made.
We expect the average difference between actual and predicted val-
ues to be close to zero; a value far from zero would indicate
that our model is biased. In this model, the average difference
between actual and predicted numbers of enlistments was 0.076,
close enough to zero to suggest that, on average, the model is
unbiased.

Our next step was to compute the mean absolute deviation (MAD).
In the analysis described above, when we averaged the differences
between predicted and actual enlistments, negative and positive dif-
ferences canceled each other out, giving an optimistic estimate of
how accurate our model can be. We now know that it’s right on av-
erage, but we want to know by how many people we are wrong on
average. To do this, we first rounded the model predictions to the
nearest integer (the observed value is a count, but a model prediction
is a conditional mean, the average number of events given the
predictors, so it need not be an integer). We did so because this is
how the NRC would use
the model: NRC would round the predictions to see how many
whole people it expects from each zip code. Then, we computed the
difference between actual enlistments and the predictions, but we
took the absolute value of the difference before averaging so that all
differences contribute to our MAD. We estimate the MAD to be
about 0.943. That is to say, our model is off by just less than one
person, on average, per zip code.

Finally, we checked whether we were correctly predicting
nonproductive zip codes. Of all the zip codes that really produced zero re-
cruits, our model correctly identified just over 55 percent, which is
not great performance—only marginally better than guessing. This
is a part of the model that could be improved in future research.
Because this clearly contributed to our overall error rate, we recom-
puted our MAD for only those zip codes that produced recruits. In
that case, our model was only off by about 0.533, as opposed to
0.943. So it is fair to say that the performance of this model would
greatly improve (the prediction errors would be cut in half) if we
could correctly identify the nonproductive zip codes.
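
The following is a small sketch, in Python with stand-in numbers, of the
three checks just described (bias, MAD on rounded predictions, and the
zero-detection rate); the arrays are hypothetical placeholders for
per-zip-code actuals and predictions.

    import numpy as np

    actual = np.array([0, 0, 1, 3, 0, 2, 5, 0, 1, 0])   # hypothetical contracts
    pred   = np.array([0.2, 0.8, 1.1, 2.4, 0.1, 2.9, 4.2, 0.6, 0.7, 0.3])

    bias = np.mean(actual - pred)                 # near zero => unbiased on average

    rounded = np.round(pred)                      # whole people, as NRC would use them
    mad = np.mean(np.abs(actual - rounded))       # mean absolute deviation

    true_zero = actual == 0                       # zips that produced no recruits
    zero_hit_rate = np.mean(rounded[true_zero] == 0)

    mad_productive = np.mean(np.abs(actual[~true_zero] - rounded[~true_zero]))

    print(f"bias={bias:.3f}  MAD={mad:.3f}  "
          f"zero detection={zero_hit_rate:.0%}  MAD (productive)={mad_productive:.3f}")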

Recommendations
We developed this zip-code level model to help NRC distribute goal
in more geographical detail. After verifying the model’s predictive
capability in the previous section, we recommend adopting the zip-
code level model to make better use of available market data. The
improvement in the detail of this goaling method should prove
helpful as the NRC considers reorganizing, consolidating, or closing
stations, and redistributing its goals both geographically and de-
mographically. Although the coefficients of the models should not
be interpreted in isolation, the models are valid for predictions, and
their forecasts can be used to determine the proportions and feasi-
bility of recruiting goals at each station.

Specifically, we recommend the use of this detailed model in tandem
with the EGM: use the EGM to come up with a total NRD goal, and
then use the zip-code model to allocate the EGM goal to stations in
proportion to the model predictions. On one hand, this approach
allows NRC to maintain its EGM, which includes variables it deems
important for enlistment and which was built on its substantial
subject matter expertise. On the other hand, the combination of the
two models will allow a more detailed adjustment for the recruiting
markets and thus may improve the efficiency of resource use and
equity among recruiter assignments.
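
A minimal sketch of this two-step allocation follows; the station names and
prediction values are hypothetical, and rounding the resulting shares to
whole contracts is left out.

    def allocate_nrd_goal(nrd_goal: float, station_predictions: dict) -> dict:
        """Split an NRD goal across stations in proportion to model predictions."""
        total = sum(station_predictions.values())
        return {station: nrd_goal * p / total
                for station, p in station_predictions.items()}

    # Example: an EGM goal of 120 contracts spread over three stations
    print(allocate_nrd_goal(120, {"NRS Alpha": 18.2,
                                  "NRS Bravo": 9.1,
                                  "NRS Charlie": 3.4}))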

Furthermore, we suggest evaluating travel time, in addition to
distance, to the responsible Navy Recruiting Station as a variable
that might affect contract production in a zip code. Travel time may
be a more effective predictor because distance does not account for
whether a given area is rural or urban, and therefore harder to reach
or more prone to traffic congestion.
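
For reference, the straight-line distance in the current model can be
computed from zip-code-centroid and station coordinates with the haversine
formula, as in the sketch below; estimating travel time instead would
require a road-network or routing data source. The coordinates shown are
illustrative.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2):
        """Great-circle distance in miles between two (lat, lon) points."""
        r = 3958.8  # Earth radius in miles
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Example: a zip-code centroid to a recruiting station
    d = haversine_miles(38.88, -77.11, 38.80, -77.05)
    print(f"{d:.1f} miles")  # straight-line only; ignores roads and traffic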

Enlisted reserve component (RC)
Introduction
There are two elements to RC recruiting: prior-service (PS) and
non-prior-service (NPS) recruiting. Recruiting for enlisted NPS re-
serves is similar to the AC because the target populations are the
same. In fact, recruiters can now recruit for both components, al-
though there are some major differences between the two. For in-
stance, because a much smaller number of people enlist in the
reserves, modeling and geographical allocation of resources are
more complex from a statistical perspective. In addition, because
reservists usually live at home and train on base at regular intervals,
recruiting for the RC depends on the location of drilling units, as
well as vacancies and authorizations—three restrictions that do not
come into play in modeling the AC. Because of the similarities in
target population for NPS recruits and AC recruits, the goaling
model is currently similar between the two, as are the concerns
identified in the previous section.

For PS recruiting, because enlistees come in with specific skill sets,
vacancies and authorizations at a specific drilling unit are particu-
larly important. An ongoing issue with PS recruiting is the ability to
locate and contact sailors after they leave active duty.

Enlisted RC goaling – the current models


In the enlisted RC, as in the AC, each service uses different methods
to define recruiting goals and to allocate resources. After describing
the current Navy models, we discuss related issues in the Army and
the Marine Corps (similar information for the Air Force was not
available to us at this time). In each case, we look for practices that
may help the Navy refine its current RC goaling methodology.

Non-prior Service (NPS)

For NPS recruiting, NRC uses the New Accession Training goal
model, which is based on the same methodology as the goaling
model in the enlisted AC. (Please see the previous section for de-
tails.) The only difference is that populations that go into the reserve
model are constrained to residing within 50 miles of Navy Opera-
tional Support Centers (NOSCs), where the recruits undergo train-
ing. This is a logical restriction, given that recruits have to travel
regularly to train. The model predicts enlistments of A-cell men on
the NRD level and assumes that AC production recruiters recruit NPS
enlistees. The model has not been evaluated against production.

Prior Service (PS)

RC prior-service goals are constructed using a weighted
combination of one-third reserve recruiters, one-third historical reserve PS
past production, and one-third USN NAVET (U.S. Navy Veteran)
losses for the past five fiscal years. Although the weights are cur-
rently the same for each of the three components, they have moved
around in the past. It makes sense that QMA is not part of this cal-
culation, since general population distribution has little to do with
where Navy veterans usually end up. One issue we identified in the
PS goaling method is that it does not explicitly take into account au-
thorizations and vacancies at the units for which it goals. Another is-
sue is that, like the NPS model, its ability to accurately predict
production has not been evaluated.

Lessons from other Services

(We are grateful to Mr. Mike Nelson at USAREC, Colonel T.J. Kenney at
AFRS, and Captain Joseph Wydeven at MCRC for providing CNA with the
following information on goaling in their respective services.)

U.S. Army

Non-prior Service (NPS)

The first step in the Army goaling process is to allocate all recruiters
geographically, regardless of their component (i.e., active, Army
National Guard, or Army reserve). This distribution is based on an
equal weighting of QMA and DOD past production. Subsequently,
recruiters are allocated by component. Active Guard Reserve (AGR)
recruiters are allocated separately from the rest of the recruiting
force, based on the following four weighted factors: 30 percent past
production, 20 percent vacancies, 40 percent TPU (troop program
unit, or drilling unit) authorizations, and 10 percent QMA. The Army,
however, is currently considering a model that relies completely on
distributed authorizations and TPU structure as a fixed basis for lo-
cating AGR recruiters.

The second step in the goaling process is distributing the mission,
which is also done with a model. The Army Reserve (AR) model for
distributing the enlistment contract mission is also based on four
factors: past DOD-wide production, QMA, vacancies, and authoriza-
tions for the TPUs in the geographical area. As for all services, va-
cancies are of particular concern in Army reserve recruiting:
recruiting is dependent on specific vacancies in local Army reserve
TPUs. These vacancies have to be available, be within 50 miles of
the applicant’s residence, and have open training seats.

More precisely, the goal allocation model for recruiting NPS enlist-
ees is structured as follows:

 10 percent AR Past Production (PP): a measure of market
potential, based on a weighted average of the last four years, with
more recent years weighted more heavily (for current year yt, the
weights are yt-1 = 40 percent, yt-2 = 30 percent, yt-3 = 20 percent,
and yt-4 = 10 percent)

 20 percent QMA: projected 17- to 29-year-old population

 45 percent vacancies (limited by training seats)

 25 percent authorizations

This year, current vacancies and authorizations are modified by G-2
(Deputy Chief of Staff) to reflect the changes due to ARFORGEN
(Army Force Generation) and mobilization sourcing that will occur
on October 1st. The AR weights vacancies heavily because they are
currently over endstrength and require a precision mission.
Authorizations are weighted much lower than vacancies because at-
trition is currently very low. The weight given to past production is
low because past vacancy availability is likely to be different from fu-
ture vacancy availability. And, finally, because QMA is considered to
be a good measure of the NPS market, it remains in the model.
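
As a sketch of how such a weighted allocation works, the following Python
fragment blends the four normalized factors using the NPS weights above
(10/20/45/25) and the 40/30/20/10 recency weighting of past production;
the factor values for the two illustrative areas are invented.

    import numpy as np

    def recency_weighted_pp(history):  # history = [y_{t-1}, y_{t-2}, y_{t-3}, y_{t-4}]
        return np.dot([0.40, 0.30, 0.20, 0.10], history)

    def goal_shares(factors, weights):
        """Blend normalized factor columns into one allocation share per area."""
        blended = sum(w * factors[name] / factors[name].sum()
                      for name, w in weights.items())
        return blended / blended.sum()

    factors = {
        "past_production": np.array([recency_weighted_pp([50, 44, 40, 38]),
                                     recency_weighted_pp([20, 25, 22, 30])]),
        "qma":             np.array([120_000, 80_000]),
        "vacancies":       np.array([400, 150]),
        "authorizations":  np.array([900, 500]),
    }
    nps_weights = {"past_production": 0.10, "qma": 0.20,
                   "vacancies": 0.45, "authorizations": 0.25}
    print(goal_shares(factors, nps_weights))  # fraction of the mission per area

The same function could be reused for the PS model below by swapping in
its 35/5/40/20 weights.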

Prior Service (PS)

The PS enlisted model is built in the following way (the same units
are filtered out in the PS model as in the NPS model):

 35 percent AR PS past production (PP): a measure of market
potential based on the last four years, weighted 40 percent, 30
percent, 20 percent, and 10 percent

 5 percent QMA: projected 17- to 29-year-old population

 40 percent vacancies (limited by training seats): where the
accessions must be made

 20 percent authorizations: measure of vacancy growth due to
attrition

Past production is weighted more heavily in the PS model because,
historically, PS vacancies have always been open to accessions re-
gardless of the unit ARFORGEN cycle (i.e., regardless of current bil-
let availability and the unit’s capacity to train). QMA has little
weight because it does not measure the PS market accurately
enough. Vacancies do not need to be limited by training seats and
are weighted heavily because the AR is currently over endstrength.

One of the struggles with PS recruiting is accurately recording and
maintaining the records of veterans’ addresses. The DMDC provides
a list of individuals leaving the service to USAREC, and the Army
has contractors contact credit rating agencies to update that address
information.

U.S. Marine Corps

The Marine Corps recruits enlisted reservists based on their prox-
imity to 1 of about 500 SMCR units, detachments, and individual
mobilization augmentee (IMA) billet locations in the continental
United States, Hawaii, and Puerto Rico. A 2001 CNA study by
Dolfini-Reed looks at how recruitable populations support the cur-
rent Selected Marine Corps Reserve (SMCR) force laydown [11].
The motivation for this work is that certain geographical areas have
difficulty supporting the manning requirements of some reserve
units. PS IMA billets have paygrade and skill requirements, particu-
larly because there is no training funding for IMA positions. This
makes recruiting for these billets more challenging than on the ac-
tive duty side, since location and skill matches limit who can be-
come a reservist. Furthermore, the travel policy states that a reservist
must live within 100 miles of the unit, or within three hours’ travel
time, whichever is less. In this sense, the USMC’s ability to man a re-
serve unit depends on its geographic location.

Dolfini-Reed presents three main problems that follow from the
current process of recruiting Marine reservists. First, if a unit is in
either a remote or a highly populated area, it could face significant
issues with recruiting. Rural locations have fewer people from which
to draw, and urban areas may have more people with security clear-
ance and driver’s license restrictions. In addition, urban areas often
do not have the training space for some military occupations. Sec-
ond, high population areas may have more diverse and skilled peo-
ple, but those same people may want to have a military occupation
that is substantially different from their civilian job (i.e., a civilian
computer programmer wanting to do artillery). Finally, it is chal-
lenging to move the location of reserve units because Congress of-
ten selects their locations for political rather than demographic or
military reasons.

The study makes two primary recommendations. First, the Marine
Corps should consider moving some reserve units from the north-
east and north central regions of the United States to areas in the
west and south where there are larger markets. Second, the Marine
Corps should think about consolidating some units that are consis-
tently undermanned or short on certain skill sets. One example is
the intelligence units in Washington, D.C. These units are typically
undermanned even though most intelligence professionals live in
that area. It could be useful to combine these units.

Next, we consider the allocation of recruiters rather than reserve
units. CNA researchers Malone and Hattiangadi consider different
ways that the Marine Corps could allocate PS recruiters across dis-
tricts [12]. In the same document, CNA also offers two other meth-
ods for deciding where to place recruiters based on additional
variables. One of these variables is the “ease” of actually filling bil-
lets, which is defined as the number of leads divided by the number
of billets. A second variable that could be used in conjunction with
the first is the vacancy rate, which is defined as the number of leads
divided by the fill rate.

By accounting for both variables, CNA's approach identifies districts
that could need more PS recruiters if either leads or vacancies in-
creased in a particular area. More recruiters would be useful if leads
grow, because the added recruiters could help secure the extra
placements. Malone and Hattiangadi acknowledge that the current
method is acceptable, but also suggest that the service consider us-
ing CNA’s expanded approach.

Recommendations
In considering the goaling methods laid out in this section, as well
as the enlisted model described in the previous section of this pa-
per, we provide some general recommendations for how the Navy
might improve its RC goaling process.

Because NPS recruiting is similar to AC enlisted recruiting in terms
of the target population, and because AC recruiters currently per-
form this function in the Navy, it makes sense to use the zip-code-
level model we suggest for the AC as a starting point for NPS re-
cruiting. However, the population data that go into the model
should be narrowed to within 50 miles of the NOSCs, as is done in
the current Navy model. Following the Army model, we recommend
that vacancy and authorization data be added.

Although we do not evaluate such a model in this paper, we
recommend that an evaluation similar to the one we used for the AC
model be performed on this detailed NPS recruiting model. Then,
provided that the model performs well, or based on an improved
model, CNRC could address the issues discussed in the Marine
Corps section, such as merging and moving reserve units.

For PS recruiting, because the available population is defined quite
differently, we agree with NRC’s decision to leave QMA out of its
model and to incorporate market information through NAVET
losses. It seems sensible, however, to evaluate adding vacancies and
authorizations to the model, as they are in the Army’s model. We also
recommend that the model’s predictive capabilities be evaluated in
the manner we discussed in the section on the enlisted AC. The
weights of different model components can be adjusted accordingly.

Officer active component (AC)
Introduction
Compared with enlisted goaling, creating an officer goaling model
presents several additional challenges. First, the number of officer
accessions is smaller than the number of enlisted accessions, so sta-
tistical modeling is less reliable. Second, the geographic units of
analysis are more difficult to characterize. Typically, the services
consider colleges and universities their units of analysis rather than
zip codes or states. However, the Census and similar data sources,
which are readily available in fine geographic detail, may not apply,
since potential recruits may attend college far from home. When of-
ficers are recruited, their home of record is recorded inconsistently:
sometimes the services record the permanent residence and some-
times the college address. In addition, all-service officer data are not
as readily available as their enlisted counterparts. Instead, each ser-
vice has to use its own past production data.

In addition, diversity is a different issue for officer recruiting than
for enlisted. First, the target population of college graduates pre-
sents a different diversity picture from high school graduates, with
fewer eligible candidates, so it is harder to recruit a racially and
ethnically diverse force. MLDC reports that each service had several
problems with representation of race/ethnicity groups in various of-
ficer commissioning sources. The report noted the underrepresen-
tation of (a) Hispanics and non-Hispanic Asians in Navy officer
accessions; (b) Hispanics, non-Hispanic Asians, and non-Hispanic
others in Army officer accessions; and (c) Hispanics, non-Hispanic
blacks, and non-Hispanic Asians in Air Force accessions. Similarly,
women were underrepresented despite making up more than 50
percent of the recruiting pool [13].

As a result of data challenges, which we detail below, building a
statistical model is, at present, not feasible. Therefore, in this section,
we examine current methodology for officer recruiting, considering
the issues described above and discussing ways to improve goaling
methods. Given the noted data constraints, our analysis in this sec-
tion is necessarily qualitative—we rely on information gathered dur-
ing the Officer Goaling Conference, on discussions with recruiters
and recruiting leadership from the Navy and other services, and on
relevant literature. We also report on how the issue of officer diver-
sity in recruiting is approached in each service, and we make sug-
gestions for the Navy.

Current officer AC goaling method


The primary market for active duty officers is the college market,
which is made up of people who are currently enrolled in four-year
colleges and universities. If prospective applicants are not in school,
they are part of the work force market, which consists of employed
and unemployed non-affiliated civilians and Navy veterans. The
NAVET market is a primary market for reserve officer recruiting
(see http://www.cnrc.navy.mil/publications/Directives/1131%202E_CHATER%206_CH1.pdf).

Description

The current Navy AC officer goaling method aggregates school-level
data to the NRD level, placing equal weight on measures of market
(college and workforce), on manning, and on historical production;
each is weighted at one-third.

The market measures consist of college degree data from the Na-
tional Center for Education Statistics (NCES). From this dataset, the
Navy focuses solely on Science, Technology, Engineering, and
Mathematics (STEM) degrees. The data are analyzed by the number
of degrees by degree type, race/ethnicity of the degree recipient,
gender of the degree recipient, and school quality. NRC excludes
nonresident aliens from these data and uses this as a proxy for citi-
zenship since no better citizenship data are available. CNRC maps
college majors to specific Navy officer designators, and goals can be
computed at this very detailed level. Overall, the Navy recruits into
67 broad officer categories, with a total of over 260 detailed goals.

Performance
Although NRC has not formally quantitatively evaluated the officer
goaling method, they are looking to improve their current model.
We first discuss the performance of the goaling method in general.
Then, we describe the problems associated with the Navy data. The
lack of data defines the limitations on our analysis.

Methodological issues

As noted earlier, NRC’s current goal allocation methodology relies


on market information, the number of recruiters, and past produc-
tion. We address each of these components below.

Market. Currently, NRC focuses on STEM degree graduates in their
market analysis. Their method is quite detailed and includes a
mapping from college degrees to officer designators. Educational
data are used to estimate the numbers of college graduates with
relevant degrees, and thus provide a proxy for officer goals. Al-
though very carefully thought out, this method does not take into
account propensity to enlist by geographic area.

Recruiters. As referenced above in the Enlisted AC section, the
number of recruiters has a clear impact on how many contracts one
might expect to obtain from an area. In the Enlisted AC model, the
marginal effect of a recruiter, or the number of contracts that each
recruiter is expected to produce, is estimated using a model. Here,
however, that number is implied by the weight placed on the num-
ber of recruiters, which is not necessarily estimated correctly, and
has not been empirically verified.

Past production. Currently, one-third of the weight of the goal is
attributed to past production. Five years' worth of past production is
included in the methodology because it is often seen as a reliable
measure of the market and propensity to enlist. However, there are
three main problems with measuring the market in this way. First,
past production provides a backward incentive system for recruiters
[6]. ROY awardees told us that, if a recruiter does an exceptional
job one year and recruits above and beyond her goal, she is tasked
with a higher number the following year. If she underrecruits, her
job next year will be easier. Past production is perceived as punish-
ing those who work harder and rewarding those who work less
hard. Second, in the case of officer recruiting, when the numbers
are small (e.g., one recruiter might recruit three or four people for
a given community), there are large chance variations in recruiting
outcomes. It is possible for someone to recruit four people for a
community one year, and nine people the next, without large
changes in the surrounding environment. The part of past
production that is due to chance alone can then affect future goals
and set up an expectation of eight people a year, which may not be
a reasonable expectation. Changes based on past production can be
very irregular and unsystematic. Third, relying on past production
means that, to some extent, the goaling expectations are always
lagging behind the changing recruiting environment. (A caveat to
the first point: because the model smoothes over the last five years
of past production, the most recent year should not have a strong
effect on the change in goal. However, the model predicts on the
NRD level, and a district CO can use the most recent past
production numbers to alter a particular recruiter's goal. Thus,
although this large effect of recent production is not explicitly part
of the goaling model, it is part of the process, and it is perceived
negatively by recruiters.)

Data issues

Deficiencies in past data collection, and current struggles with
producing a reliable IT system that facilitates data integrity, make the
task of officer goaling especially difficult. Navy officer goaling suf-
fers from small-sample problems, making statistical modeling less
reliable. In addition, the Navy accession data that we obtained from
the Officer Master Files were missing nearly one-third of college and
university data for Officer Candidate School accessions over the last
six years. Of the remaining schools, many university names were
misspelled, so we could not trust the accuracy of the existing counts
of recruits from these schools. In addition, campus information was
missing for many universities that belong to larger university sys-
tems, such as the University of California, so we were unable to tell
which city produced the officer candidate. Finally, DOD-wide officer
accession data are not currently available. Obtaining DOD-wide
records will greatly increase the sample sizes for subsequent analysis,
making it more robust.

New goaling issues


Demographic diversity

Diversity presents different issues for officer recruiting than for
enlisted. Because of the differences in the demographics in target
population, it is harder to recruit a racially and ethnically diverse
force. There are many benchmarks to choose from when it comes to
racial and ethnic diversity. The Army, for example, uses the general
population composition; others have used the composition of college
graduates or the composition of employees in management
positions.
the predictions of the 2037 population composition. During the Of-
ficer Goaling Conference, recruiters expressed concern about the
fact that diversity is goaled differently from overall goals. There was
a popular proposition that would move the process from goaling
applications to goaling a combination of applications and selections
within legal constraints, or at least incentivizing selections while
goaling applications only. Another suggestion, and the one that is
likely to be pursued by the Navy in the near future, is goaling all of-
ficers based on applications.

Lessons from other Services


In this section, we look to the other services to determine how they
goal for officers and for ways to address the issues described above.

U.S. Army

In the Army, enlisted recruiters recruit for Officer Candidate
School (OCS), West Point, and Reserve Officers' Training Corps
(ROTC). Only medical officers and chaplains are recruited by offi-
cers. Recruiting for special programs in the Army officer corps is
usually handled by enlisted recruiters in grades E6 and above.

The USAREC portion of the officer mission has decreased from
1,000 in FY 2011 to 600 in FY 2012. When this already small mission
is allocated to 250 recruiting companies, the apportioned recruiting
goals get very small, making a rigorous statistical model not feasible.
Instead, this distribution is based on population statistics, such as
the number of people attending college and similar metrics. This
goal then gets passed from the recruiting company to individual re-
cruiting stations and recruiters. This distribution, which uses col-
leges and universities as its analysis units, is based on past
production as the main contributor. Recently, the Army has started
focusing on recruiting from STEM degrees, making their task more
complex.

In the Army, diversity is set as a target on a station level, but there
are neither rewards for overexecuting the diversity goal nor penal-
ties for not meeting the target.

U.S. Air Force

The Air Force Recruiting Service (AFRS) recruits general duty offi-
cers who do not come from ROTC or the Air Force Academy. These
usually represent between 10 and 20 percent of the general duty of-
ficers who go to Officer Training School (OTS). Senior enlisted
personnel with past recruiting experience recruit all officers in the
Air Force, with the exception of chaplains and lawyers, who are re-
cruited by volunteer recruiters in their ratings.

The Air Force officer recruiting goals are assigned based on pro-
pensity to enlist (as measured by past production) and manning.
This suggests that market factors are taken into account, albeit indi-
rectly. The Air Force places importance on recruit quality: all can-
didates are required to take the Air Force Officer Qualifying Test
(AFOQT). This test, like the Graduate Record Examination (GRE),
covers academics, analytical thinking, and mathematics. It contains
five composites: pilot, navigator, academic, verbal, and quantitative.
The test measures aptitudes, and scores are later used to select can-
didates for commissioning programs, such as OTS or Air Force Re-
serve Officer Training Corps (AFROTC). It is also used for selection
into specific training programs, such as pilot and navigator training.
In addition to the AFOQT, an officer applicant needs at least a 3.0
grade-point average (GPA) to be competitive.

In the Air Force, officer diversity is handled in a similar fashion to
enlisted diversity. Recruiters earn points for recruiting women, His-
panics, and African-American applicants. There are additional
points for applicants who apply for flying jobs, or to become electri-
cal engineers, computer engineers, and meteorologists. Competi-
tions based on these points within and between squadrons
incentivize recruiters to earn these points. A top squadron can’t win
overall unless its diversity target has been met. Squadrons are en-
couraged to have internal reward systems that motivate recruiters
and keep them oriented toward general Air Force needs.

U.S. Marine Corps

Each year, CNA produces the Qualified Candidate Population
(QCP) report for the Marine Corps [14]. This report pulls data on
schools from the Integrated Postsecondary Education Data System
(IPEDS), which has enrollment and graduation rate information, as
well as SAT scores. CNA merges this dataset with Barron’s data on
school quality. The Barron’s score is based on the percentage of ap-
plicants admitted, as well as on the average incoming SAT scores
and high school GPA among those admitted. Top-tier colleges take
students who ranked in the top 20 percent of their high school
classes, earned GPAs between A and B+, and received SAT scores in
reading and math of 655 to 800. These schools admit fewer than a
third of applicants. After the schools are ranked on a combination
of quantity and competitiveness using Barron’s scores, the total mis-
sion is allocated to recruiting districts in proportion to the numbers
associated with these schools.

The Marine Corps uses the QCP report to determine where Officer
Selection Officers (OSOs) should focus their attention for recruit-
ing on college campuses [15]. Given that there are only about 70
OSOs at 60 or so Officer Selection Stations (OSSs), the Marine
Corps must identify where the QCP is concentrated, and assign
OSOs accordingly, in order to optimize the use of their recruiting
resources. The market of potential Marine Corps officer candidates
includes people who are test-score qualified and either have a
bachelor’s degree or are in the process of earning one. This makes
schools a good area to focus on when deciding where to assign
recruiters.

A recent study by CNA analyzes the composition of potential Marine
Corps officer candidates, with a focus on the size, location, and ra-
cial/ethnic attributes of the population. The study also provides a
list of the schools with the highest concentration of QCP, as well as
maps showing the distribution of certain kinds of college graduates
by county. The paper concludes that the QCP tends to be in large,
competitive, mostly public educational institutions [14]. Qualified
black and Hispanic men are highly concentrated in large, produc-
tive schools—a fact that should inform diversity-based recruiting ef-
forts. These results align with an earlier CNA study that examined
whether the QCP can predict which schools are most likely to pro-
duce many officer accessions [15]. While that study did not focus on
diversity within the QCP, it claims that the QCP estimates are
“strong, positive predictors of which schools ‘produce’ large num-
ber of Marine Corps officers.”

Finally, the QCP paper [14] also presents an analysis that augments
the traditional QCP data with propensity to enlist and medical data
collected by the Centers for Disease Control and Prevention (CDC).
This analysis results in more precise estimates of the QCP popula-
tion, and generates a refined list of the schools with the most officer
candidates. This kind of additional market information could help
the Navy better evaluate its officer recruiting market.

Although officer racial and ethnic diversity is a concern for the Ma-
rine Corps, recruiting at Historically Black Colleges and Universities
(HBCUs) hasn’t been very successful. In addition, these colleges do
not usually show up high on the ranked QCP list. Rather, there is a
separate list of QCP schools for minority officers [14]. In the Ma-
rine Corps, each OSO must submit a certain number of diversity
applications based on feasibility, as computed by QCP estimates.
Anecdotally, however, because only applications are currently
goaled, recruiters tend to submit candidates that they know will not
get in simply to accomplish that goal.

The Navy appears to have the most detailed approach to recruiting
officers for active duty. The fact that the Navy recruits into over 260
officer categories makes it difficult to appropriate methods from
other services, which have less detailed goals and to which recruits
appear more uniform. Additionally, the Navy, more so than
other services, focuses on recruiting STEM graduates, which de-
creases the number of potential candidates and eliminates certain
schools from consideration altogether.

Additionally, we did not find that other services had a significantly
different approach to recruiting a demographically diverse officer
force.

Addressing goaling issues in the officer AC


We found the Navy’s method for recruiting AC officers to be more
detailed than its equivalents in the other services. Because the Navy,
more so than the other services, focuses on recruiting STEM gradu-
ates, CNRC maintains and regularly updates a dataset with numbers
of technical college degrees from most schools, and a mapping
from those degrees to officer designators. In this section, we at-
tempt to create a list of QCP schools the way the Marine Corps ana-
lyzes them to check whether this methodology would be helpful to
the Navy.

In table 3, we present a rough estimate of what the QCP schools
would look like for the Navy before the adjustments for propensity
and medical data. (The Navy's minimum SAT score requirement is
1050, whereas it is 1000 for the USMC, so the list is slightly
different for the Navy. In addition, we did not use Barron's data to
identify how competitive the schools are; we used the default
measurement provided by IPEDS.) To construct this table, we used
data from easily accessible online sources. We used the IPEDS
dataset to obtain the list of schools, their competitive attributes,
and enrollment data.

Table 3. Estimated list of QCP Schools


Institution name                              % admitted  % Black  % Hispanic  Full-time undergraduate  Black   Hispanic
                                                                               enrollment (2009)
Arizona State University                          90         5        16             45,597             2,280    7,296
Ohio State University–Main Campus                 65         7         3             37,864             2,650    1,136
Pennsylvania State University–Main Campus         51         4         4             37,485             1,499    1,499
Texas A&M University                              67         3        14             35,400             1,062    4,956
The University of Texas at Austin                 45         5        18             35,364             1,768    6,366
University of Central Florida                     47         9        15             34,197             3,078    5,130
Michigan State University                         72         8         3             33,429             2,674    1,003
University of Florida                             43        10        15             31,316             3,132    4,697
Indiana University–Bloomington                    73         5         3             31,061             1,553      932
University of Illinois at Urbana–Champaign        65         7         7             30,639             2,145    2,145
Purdue University–Main Campus                     73         3         3             30,334               910      910
University of Minnesota–Twin Cities               50         5         2             28,539             1,427      571
University of Washington–Seattle Campus           61         3         6             28,094               843    1,686
Brigham Young University                          69         0         4             28,048                 0    1,122
Florida State University                          61        10        13             27,705             2,771    3,602
Rutgers University–New Brunswick                  61         8        10             27,588             2,207    2,759
University of Wisconsin–Madison                   59         3         4             27,386               822    1,095
University of Arizona                             78         3        18             27,103               813    4,879
University of California–Los Angeles              23         4        15             25,772             1,031    3,866
University of Michigan–Ann Arbor                  50         6         4             25,342             1,521    1,014
University of Colorado at Boulder                 84         2         6             24,916               498    1,495

In the first column of the table, we give the percentage of applicants
admitted. The next two columns indicate the proportions of minority
enrollment. The fourth column is the total number of full-time
undergraduate students, and the two columns that follow are the
numbers of black and Hispanic students. These numbers are ob-
tained by multiplying the respective proportions in columns 2 and 3
by the total enrollment number. Table 4 is the QCP table created
for the Marine Corps, with adjustments made for medical data and
propensity to enlist. There are many similarities between the quick
data pull and the original QCP school list, indicating that this is an
inexpensive way to get at roughly correct data. Note that this list
does not focus on any specific kind of degree, but rather on the to-
tal number of students. As we will see later, this method is less useful
when only STEM degrees are of interest.
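
The arithmetic behind the quick data pull is simple: multiply each school's
minority enrollment percentage by its total full-time enrollment. A minimal
sketch with two of the table's rows follows.

    import pandas as pd

    ipeds = pd.DataFrame({
        "institution": ["Arizona State University", "Texas A&M University"],
        "pct_admitted": [90, 67],
        "pct_black": [5, 3],
        "pct_hispanic": [16, 14],
        "ft_undergrad": [45_597, 35_400],
    })
    # Estimated minority headcounts, as in columns 5 and 6 of table 3
    ipeds["black_n"] = ipeds["pct_black"] / 100 * ipeds["ft_undergrad"]
    ipeds["hispanic_n"] = ipeds["pct_hispanic"] / 100 * ipeds["ft_undergrad"]

    # Rank by total enrollment (no STEM or propensity adjustment)
    print(ipeds.sort_values("ft_undergrad", ascending=False).round())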

Table 4. QCP for the Marine Corps

When we looked at the officer data for the Navy and compared the
non-missing school entries with top QCP schools that we identified
earlier, there was little correlation between what schools came out
as top QCP schools and where most Navy officers came from. This is
likely because the QCP analysis that CNA performs for the Marine

Corps does not make adjustments for STEM degrees. We present
our comparison in table 5. The third column indicates rank based
on total QCP population. The rightmost column indicates where
the QCP schools rank when sorted by Navy inventory.

Table 5. IPEDS and Navy schools

Institution name                              Full-time undergraduate  IPEDS QCP  Navy
                                              enrollment               rank       rank
Arizona State University                             45,597                1        27
Ohio State University                                37,864                2        27
Pennsylvania State University                        37,485                3        15
Texas A&M University                                 35,400                4         6
The University of Texas at Austin                    35,364                5         3
University of Central Florida                        34,197                6       541
Michigan State University                            33,429                7        32
University of Florida                                31,316                8         4
Indiana University–Bloomington                       31,061                9        36
University of Illinois at Urbana–Champaign           30,639               10        19
Purdue University–Main Campus                        30,334               11        60
University of Minnesota                              28,539               12        27
University of Washington–Seattle                     28,094               13        16
Brigham Young University                             28,048               14        24
Florida State University                             27,705               15        11
Rutgers University–New Brunswick                     27,588               16       114
University of Wisconsin–Madison                      27,386               17        12
University of Arizona                                27,103               18        67
University of California, Los Angeles                25,772               19       102
University of Michigan–Ann Arbor                     25,342               20        13
University of Colorado at Boulder                    24,916               21        24
University of California–Berkeley                    24,797               22         2
University of Georgia                                24,669               23        38
University of Maryland                               24,617               24         5
Temple University                                    24,114               25        36

As a contrast, in table 6, we provide the schools that most Navy
officers came from and their IPEDS ranks based on total QCP (some
ranks are tied). Several schools are missing an IPEDS rank because
they are not part of the IPEDS system. Again, it is easy to see that
there isn’t a strong correlation between high-potential QCP schools

50
and those colleges and universities where the Navy has successfully
recruited in the past. Thus, the Marine Corps approach is not likely
to work for Navy goaling.
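
The comparison can be quantified with a rank correlation. The following
sketch computes Spearman's rho for the first ten schools in table 5; a
coefficient near zero would be consistent with the conclusion that the
unadjusted QCP list does not track Navy production.

    from scipy.stats import spearmanr

    ipeds_qcp_rank = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    navy_rank = [27, 27, 15, 6, 3, 541, 32, 4, 36, 19]  # from table 5

    rho, p_value = spearmanr(ipeds_qcp_rank, navy_rank)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")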

Table 6. All inventory based on Navy rank: Navy most productive
schools (2005–2009)

Institution name                  Total recruited  Navy  IPEDS
                                  inventory        rank  rank
Embry-Riddle                            283          1    313
University of California                170          2      2
University of Texas                     163          3      5
University of Florida                   179          4      8
University of Maryland                  127          5     24
Texas A&M                               178          6      4
University of North Carolina            139          7     65
Southern Illinois University            101          8    146
State University of New York            184          9     42
University of Phoenix                    94         10
Florida State University                 58         11     15
University of Wisconsin                 105         12     17
University of Michigan                  127         13     20
Thomas Edison College                    69         13
Pennsylvania State University           216         15      3
University of Washington                145         16     13
St. Leo University                       77         16    174
University of South Florida              78         18     29
University of Illinois                  117         19     10
University of Pittsburgh                 41         19     71
Old Dominion University                  90         21
Louisiana State University               36         21     36
University of Tennessee                  34         21     45
University of Colorado                  174         24     21
Brigham Young University                 32         24     14

Provided that the Navy plans to maintain its focus on recruiting
STEM graduates, we conclude that general data pulls used in the
Marine Corps do not provide enough detail on college graduates to
be useful. That said, Malone et al. recommend two extra data
sources to incorporate into the Marine Corps' QCP computations:
estimates of propensity to enlist provided by JAMRS, and health-
related data from the Centers for Disease Control and Prevention
(CDC) [14].
These additional market characteristics could help isolate the most
likely candidates for the Navy from within college populations as
well.

Suggestions for a way forward


Ideas and suggestions

In considering the issues the Navy faces, in combination with our
review of relevant literature and of the goaling methodology used in
the other services, we suggest the following ways forward in evaluat-
ing and improving the Navy AC officer mission allocation process:

 We suggest that NRC continue its efforts to collect better data
on its officer recruits, particularly as it pertains to their home
of record and college/university. Current data are often erro-
neous or missing, making rigorous analysis impossible. We
also suggest that the NRC continue its efforts to have DOD
develop an all-service officer recruiting database. Obtaining
DOD records will greatly increase the sample sizes for subse-
quent analysis, making it more robust. Once the reliable data
are in place, NRC will have enough information to perform a
thorough market analysis and use these market data to move
away from using past production in its goaling methodology.

 Although the system currently used by the NRC is fairly
complete, a potential flaw is that it places an incorrect weight on
the number of recruiters in the model. Once the data are in
place, we could study how the market responds to variations
in recruiter numbers or how the number of recruiters should
be changed to obtain the allocated goal.

 As was done in the recent CNA study for the Marine Corps
[14], we suggest that Joint Advertising Marketing Research &
Studies (JAMRS) estimates of youth propensity to enlist be in-
cluded in the NRC model. As we showed in table 4, in the case
of USMC, including this information changed the school order
on the ranked list. Including propensity may also substitute for
the current model’s reliance on past production.

 Also based on the Malone et al. study, we suggest that NRC es-
timates be adjusted for data from the CDC [14]. The data
available online include diabetes, obesity, and activity infor-
mation on a county level. Including this information will have
an effect on the order of the ranked schools.

 Once the data issues mentioned above are resolved, rigorous
analysis of the model will be possible. We think that evaluat-
ing the current model and its shortcomings is an important
step in the process of improving it. We recommend using
measures similar to what we used to evaluate our zip-code-
level model in the earlier section, such as looking at the mean
absolute deviation of the model predictions and making sure
the model is not biased. To perform such an evaluation, how-
ever, it is important to collect quality data.

Diversity

In this final subsection, we provide suggestions based on the
diversity issues we identified with NRC staff and recruiters at the Officer
Goaling Conference, and within the literature (primarily the MLDC
report [13]), that could be helpful in thinking through improve-
ments in diversity goaling. In-depth analysis of these issues was out-
side the scope of this project, so we are unable to make
recommendations, but it is important to pursue these issues in fu-
ture work.

 A useful idea discussed at the Officer Goaling Conference
(mentioned in the Navy section on diversity) was that for minority
applicants, NRDs could be ranked based on selection rates. This
way, rankings take selections and attainments into consideration;
applications are goaled, but quality is incentivized.

 Another idea we garnered from the Officer Goaling
Conference was that diversity recruiting performance evaluations
could be based on whether soft application targets are met
and on whether selection percentages matched application
percentages by NRD. For example, if in a particular NRD, the
nonminority selection rate was the same as the selection rate
for a particular minority category, then it follows that the
NRD’s minority applicants were as competitive as their other

53
applicants. The recruiting region could be recognized for
achieving that balance. If the minority selection rate is higher
than that for nonminorities, the NRD could be recognized for
achieving outstanding quality in minority applications. Both
this suggestion and the previous one would fare well with the
Navy’s recent proposal to goal applications for all officers.

 Per the recommendation in the MLDC report, the Navy could
evaluate the effectiveness of current spending on minority
marketing and recruiting initiatives to check whether those
resources are optimally allocated.

 The Navy could examine untapped recruiting markets. This
includes exploring recruiting at two-year colleges and strate-
gically locating ROTC host units. The MLDC report cites Na-
tional Center for Education Statistics (NCES) 2008 data and
posits that close to 50 percent of all students in college at-
tended two-year colleges, and this percentage is slightly
higher for blacks and Hispanics. In addition, a report by
Kraus from 2004 states that, in the enlisted Navy, those with
two-year college degrees have higher test scores and higher
continuation and reenlistment rates than those with only high
school degrees [16]. The report also says that 17 percent of all
students attending a two-year college transfer to a four-year
college. The same is true for 19 percent of Hispanics and 8
percent of blacks. MLDC suggests that these students could be
targeted for ROTC.

 Furthermore, MLDC recommends an evaluation of ROTC
locations relative to diverse qualified population locations.

Officer reserve component (RC)
Introduction
As on the enlisted side, there are two types of recruits who enter the
officer reserves: those with prior service (PS) and those without
(NPS). Those with prior service enter the reserves with a certain
skill set and training and are usually looking for a particular job.
They are usually recruited within a short amount of time after com-
pleting an active duty obligation. In contrast, NPS officers who
don't come in through a Navy program, such as ROTC, enter the
reserves in a similar way to active duty officers, in the sense that
they are recruited out of the general college-educated population
and without prior Navy training.
NPS and PS officers, so we address the two groups separately in this
section.

Officer reserve PS
Current goaling model

In the Navy, the RC officer goal model is a weighted combination of
the market (20 percent), recruiters (40 percent), and historical
production over the last five years (selects, enlistments, and acces-
sions—40 percent). The IRR market is identified from the eligible
Navy officer losses provided by the Chief of Naval Personnel (CNP).
The market is composed of officers who have completed MSO, offi-
cers who have not yet completed MSO, and officers serving under
approved Ready Reserve Agreements.

Addressing Officer Reserve PS goaling issues

One issue we identified with the RC officer goal model is that its
heavy reliance on past production makes recruiters distrust it. How-
ever, one minor benefit of using the last five years of production is
that one year of increased production has a relatively small impact
on the predicted mission. Hence, this system should not be as de-
motivating as one with a heavier reliance on most recent numbers.
Nevertheless, the five-year window for evaluating production
creates a risk that CNRC will lag behind market changes by several
years.

The IRR list, which is given to recruiters, has records for those who
left the military in the last three years. It is not clear, however, if this
is the optimal time span. Although we could not locate existing
studies on how long someone should stay on that list, at the Officer
Goaling Conference it was discussed that three years is too long and
that, after a year or two, there is little chance of someone from the
IRR list joining the reserves.

In addition, anecdotally, and based on our conversations with NRC
staff, it’s unclear what home of record (HOR) means in the IRR.
Our understanding is that it is frequently unreliable because a lot of
people put down Florida as HOR for tax reasons. Sometimes, offi-
cers give addresses they intend to move to but never actually end up
moving there. Many give their current home base address even
though they plan to move at the end of their obligation.

Lessons from other Services


The services rely on officers separating from the AC for the majority
of the reserve officers that they recruit. These officers have both the
training and experience to fill reserve jobs.

In the Air Force, roughly 90 percent of RC officers come in with
prior service. In the upcoming fiscal year, the Air Force Reserves
plan to recruit a little over 300 PS officers, excluding health profes-
sionals. RC officer recruiting is very much tied to the 45 wings used
for drilling. In fact, the vacancies and needs of the training units are
what dictate the goal distribution, and recruiters are sent to those
areas in accordance with the allotted goal. Like the Navy, the Air
Force keeps the Individual Ready Reserve (IRR) list. The IRR list
gets populated during the exit interview that each active duty officer
must have with a recruiter, and the Air Force sends out quarterly
mailings to that address. The Air Force experiences problems with
the addresses on the IRR list; they are frequently inaccurate because
officers leaving the Air Force either don’t go to the location they
planned or they move.

To help mitigate similar issues, the Army has a special division re-
sponsible for management of the IRR list, under the Chief of Army
Reserves.

The Marine Corps Recruiting Command has a system in which
enlisted personnel and officers can recruit PS officers. Those assigned to this
duty reach out to officers within two years of the end of their mili-
tary service obligation (MSO) and update their records for MCRC.
This way, they have more accurate data on the officers than when
they originally filled out their paperwork.

Ideas and suggestions


We have identified some issues that should be addressed to improve
RC goaling for officers with prior service. Our priorities for potential
improvements follow:

 Increase emphasis on data recording so that the Career
Transition Office (CTO) and IRR lists have reliable HOR informa-
tion. Location is especially critical when recruiting people into
the reserves. This may involve following up with officers who
are coming up on the end of their minimum service require-
ment (MSR) and updating records. Some of the weight that
the current method places on the IRR could be relocated to
the CTO.

 Reconsider the number of years used in past production. This
recommendation is based on our participation in the officer
goaling conference, where we heard that recruiters thought go-
ing back five years was not valuable. A first step to do this would
be to see how the model predictions change using various time
period lengths and evaluate (using methods similar to those we
presented in the section on enlisted recruiting) which model
predicts the outcome the best. There will be inevitable en-
dogeneity in this type of a test since, intuitively, the recruiting
goal can affect the recruiter’s behavior. However, recruiters
have told us that, while they take the goal into account, it is not
the main driver of their work, so there might be some validity
to a test like this.

 Identify the new maximum amount of time someone stays on the
IRR list. This number can be studied. A first step would be to look at
people who get recruited into the IRR and see what the average
time is between when they left active duty and affiliated with a
SELRES unit. During the Officer Goaling Conference, we heard
that people drop off after about two years.
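
A first-cut version of that analysis might look like the sketch below:
compute, for officers who did affiliate, the months between active duty
loss and SELRES affiliation, and examine the distribution. The records
shown are invented placeholders.

    import pandas as pd

    irr = pd.DataFrame({
        "active_duty_loss_date":   pd.to_datetime(["2008-06-30", "2009-01-31", "2009-10-31"]),
        "selres_affiliation_date": pd.to_datetime(["2009-02-15", "2009-09-01", "2012-03-20"]),
    })
    irr["months_to_affiliate"] = (
        irr["selres_affiliation_date"] - irr["active_duty_loss_date"]
    ).dt.days / 30.4

    print(irr["months_to_affiliate"].describe())
    # If most affiliations fall within ~24 months, a two-year IRR window
    # would be supported.
    print("share within 24 months:", (irr["months_to_affiliate"] <= 24).mean())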

Officer reserve NPS


Current goaling model

In the Navy, RC non-Medical officer Direct Commission Officer
(DCO) goals are weighted combinations of recruiters (50 percent,
same recruiters used for RC NAVET Model) and historical produc-
tion (50 percent), which is a 5-year production average. Aside from
placing a large weight on historical production and having the
number of recruiters in the computation, which we discussed in the
previous section, what stands out about this method is that there is
no current market information incorporated in the calculation.

Lessons from other Services


Military recruiting does not exert much effort on recruiting NPS of-
ficers into the reserves. The services focus on recruiting NPS into
the active forces, and PS officers into the reserves.

The Marine Corps has, however, recruited NPS officers. Because
NPS RC officers are similar to AC officers, the Marine Corps has a
similar model in place for reserve NPS officer recruiting.

The Army and the Air Force recruit very few NPS officers into the
RC, so their recruiting and goaling methods are not systematic. In
fact, the Air Force does not goal PS and NPS separately because it
has a strong preference for PS officers. Those interested in becom-
ing NPS officers in the Air Force have to interview for the position,
and typically there are more leads than billets, so the Air Force does
not aggressively recruit NPS officers.

Ideas and suggestions

To improve the current RC NPS officer goaling model, we have
three suggestions. The most important actions should be decreasing
reliance on past production and putting increased weight on re-
cruiting market information. This can be done in two ways. First,
the NRC already has a fairly detailed model for college degree re-
cipients. This information could serve as a starting point for this
model, and it would not come at a large cost because the Navy al-
ready uses this model.

Second, we suggest incorporating information on the workforce
that is publicly available in the Census. Specifically, occupation data
are available by state in the Public Use Microdata Sample (PUMS),
which is part of the American Community Survey (ACS). We suggest
that the NRC create a mapping between Navy officer designators
and Census occupation definitions, and use a combination of those
selected with a measure of propensity to join (this could be based
on past production) to help geographically allocate the recruiting
goal. Similar to the enlisted case, it will make sense to limit this
population information to a radius of 50 to 100 miles within a train-
ing unit.
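
A sketch of the mapping step is below. The input files, field names, and the haversine screen are all assumptions; an actual implementation would need sub-state geography (for example, PUMA centroids) before a 50- to 100-mile screen against training units could be applied.

```python
import pandas as pd
from math import radians, sin, cos, asin, sqrt

# Assumed inputs: PUMS occupation counts and a hand-built crosswalk from
# Census occupation codes to Navy officer designators.
pums = pd.read_csv("pums_occupations.csv")        # area, occ_code, count
crosswalk = pd.read_csv("occ_to_designator.csv")  # occ_code, designator

def miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 3959 * 2 * asin(sqrt(a))

# Aggregate the mapped workforce by area and designator; a radius screen
# would keep only areas within miles(...) <= 100 of some training unit.
market = pums.merge(crosswalk, on="occ_code")
print(market.groupby(["area", "designator"])["count"].sum().head())
```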

Finally, since the reserves are geographically tied to training units so
that the recruits can drill regularly, we advise that training unit re-
quirements, vacancies, and authorizations be incorporated in this
calculation.

Medical officer recruiting
Background
Across services, medical recruiting is the most difficult type of re-
cruiting. Doctors frequently have to give up a convenient schedule,
a large salary, and sometimes a private practice to join the military.
They also spend more time in school in pursuit of their degrees
than most other recruits.

Christensen et al. note that recruiting practices in the civilian sector
are similar to those used by the services; for example, often these
potential employers offer scholarships and loan repayment pro-
grams. The study also found that some important factors in the civil-
ian sector were salary, training, and job satisfaction [17]. For
recruiting and marketing to medical professionals, this means inclu-
sion of pay, affiliation with academic medical centers, and training
opportunities to offset some of the opportunity costs we described
above.

The study’s authors posit that increasing accessions can only hap-
pen as the result of services providing additional resources to en-
hance and expand current programs, as well as to research the
practices of other services [17].

Christensen et al. also discuss a concern regarding the negative ef-
fect that the changing gender mix in medical schools can have on
physician recruiting. As the proportion of women in medical
schools increases, it has a negative effect on medical recruiting be-
cause women historically have had a much lower propensity to join
the military than men. The authors state that the female proportion
of new medical students grew from 7 percent in 1965 to almost 50
percent in 2005. In addition, Riche and Kraus point out that the
foreign-trained and non-citizen share of the medical community is
growing, which may also have a negative effect on recruiting medi-
cal personnel into the military [18].

A description of goaling issues in medical officer recruiting


The Navy, Army, and Air Force all encounter the foregoing difficul-
ties in recruiting and goaling doctors and health professionals. The
Marine Corps does not recruit doctors. We summarize the various
services’ methodologies below.

U.S. Navy

The Navy recruits the following categories of medical personnel:

• Medical Corps (MC) direct appointments (DA)

• MC students

• Dental Corps (DC) direct appointments

• DC students

• Medical Service Corps (MSC) direct appointments

• MSC students

• Nurse Corps (NC) direct appointments

• NC students

The majority of medical personnel are accessed via various scholar-
ship programs, such as the Health Professions Scholarship Program
(HPSP). For example, in FY 2011 the Navy was goaled with 16 phy-
sicians from the workforce and 289 to be acquired through the
HPSP and similar programs.

The components of the current medical goaling model are recruit-
ers, historical production, and a market factor. In the AC model,
each factor receives a weight of one-third. For MC, DC, MSC, NC,
and DA, the NRC uses a combination of professional college degree
market and workforce data. In the RC Direct Commission Officer
model, market (based only on professional workforce data) gets 20
percent of the weight, designated reserve recruiters get 40 percent,
and historical production over the last five years is weighted at 40
percent.

The workforce data are from PUMS, specifically the U.S. Census
ACS, which gives the geographic locations, by state, for the medical
workforce age 18 to 40. The main potential issue with these data is
that they are organized on a state level, which is fairly large and
does not always correspond with the NRD assignments. Still, this is
the only breakdown available in the ACS. The other two compo-
nents of the model are recruiters and historical five-year production,
each weighted at one-third.
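
As a worked illustration of these two weight sets (only the weights come from the text; the component shares below are invented):

```python
# An NRD's fraction of the national goal under the AC and RC DCO weightings.
def goal_share(market, recruiters, production, weights):
    w_market, w_recruiters, w_production = weights
    return (w_market * market + w_recruiters * recruiters
            + w_production * production)

# Illustrative component shares for one NRD: 10% of the measured market,
# 12% of the recruiters, 8% of historical production.
print("AC:", goal_share(0.10, 0.12, 0.08, (1/3, 1/3, 1/3)))
print("RC DCO:", goal_share(0.10, 0.12, 0.08, (0.20, 0.40, 0.40)))
```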

We describe a few data sources below that could provide ready-to-
use data on matriculation and graduation of medical students by
medical school. Medical students constitute a large part of the re-
cruiting market, so incorporating this information should help in-
form goal distribution.

The Association of American Medical Colleges provides several use-
ful datasets on its website. These give enrollment and graduation
numbers over time for different medical schools. We present one of
those datasets in table 14 in appendix C. The association’s website
(www.aamc.org) presents various breakdowns of these data, includ-
ing race and ethnicity, Medical College Admission Test scores, GPA,
and others. This can be a first step in identifying the market for
medical recruiting.

In addition, the American Association of Colleges of Osteopathic Medicine provides tables of its enrollees and graduates, with several breakdowns, which would make a useful addition to the medical
recruiting model. Such tables as table 15 in appendix C can be ob-
tained on its website (www.aacom.org).

When recruiting DCOs for the RC, the Navy usually looks for
graduate students, as well as individuals in residency programs, Cer-
tified Registered Nurse Anesthetist (CRNA) programs, and trade
organizations. The goal is 20 percent market (as measured by the
medical workforce), 40 percent designated reserve recruiters, and
40 percent historical production. NAVETs are usually recruited di-
rectly from active duty or from the IRR lists.

A complicating issue in medical recruiting is that the goals are rela-
tively small. In some medical subspecialties, the number of doctors
required is much smaller than the number of NRDs. When these
goals are assigned to regions, the assignment is frequently perceived as unfair because doctors are so difficult to recruit. Past production also ac-
counts for a sizable proportion of the goal computation. Because
goal allocation is based partially on past production, and small
numbers can often depend on chance rather than on the market,
recruiters do not trust this model. In fact, the model’s reliance on
past production was one of the main complaints from the recruiters
we interviewed at the ROY awards.

Perhaps the key issue with respect to setting goals is how goals affect
the motivation and productivity of recruiters. We heard from several
recruiters and from leadership that competition, incentives, and re-
wards might be the best way to allocate Navy medical recruiting
goals. They believe that the interest and motivation generated by a
competitive environment will overshadow the unfairness that comes
from assigning goals in small numbers.

Lessons from other Services23


The Army is organized with medical recruiting prominently placed
in USAREC. Medical recruiting is organized with five special medi-
cal recruiting battalions under the command of the Medical Re-
cruiting Brigade. The Army relies on senior enlisted recruiters to
perform virtually all of the medical recruiting.

For health practitioners, the Army mostly focuses on the relevant
school population. The main component of the mission allocation
model for medical recruiting is past production and graduation
numbers from various medical schools. Collecting information on graduating classes from medical school would be a useful addition to the way the Navy currently goals medical officers.

23. We are grateful to Mr. Mike Nelson at USAREC, Colonel T.J. Kenney at AFRS, and Captain Joseph Wydeven at MCRC for providing CNA with the following information on goaling in their respective services.

The Air Force, like the Army, relies on senior enlisted recruiters to
recruit medical personnel. For medical goals, the Air Force uses both applications and board success, and it monitors board results to make sure the quality of applicants is competitive. The AFRS is
the major recruiter for medical professions. It recruits 90 percent of
all health care professionals (medical doctors, nurses, Medical Ser-
vice Corps, etc.). One of the factors considered in medical goals is
feasibility of recruiting numbers given recruiting resources. Goals
come from officer endstrength projections of requirements, which
are determined independently from recruiting resources. Hence,
goals may not be feasible. For example, two years ago the AFRS was
tasked with recruiting a large number of fully qualified MDs, but
had resources allowing for only a few dozen medical recruiters. This
year they had a mission of 25 fully qualified MDs, which was achiev-
able. Next year it will be 55, which is also reasonable. The Air
Force’s workforce medical doctor goals are by specialty, as are the
Navy’s.

Until recently, the Air Force allocated medical goals based on past production, without much emphasis on particu-
lar medical subspecialties. For example, it recruited surgeons, in
general, as opposed to surgeons with particular specializations.
Starting in FY 2011, however, the requirements became more de-
tailed. To distribute these newly specific goals, for FY 2012, the allo-
cation of many of the more challenging medical goals (fully
qualified MDs) will happen using a “fantasy draft” model. Each
group gets to select one of the goals until all the goals have been al-
located. The group gets the entire nation to recruit against these
goals. For example, if a recruiter is looking for an oral surgeon, he
or she can go to conventions and visit places across the country to
find the appropriate specialty. If a different group finds an oral sur-
geon, it has to refer that person to the group tasked with recruiting
that specialty.

Potential ways forward
In reviewing the Navy process and considering the other services, we
identified two problem areas in the current goaling process: one is
market related, and the other is process related. Thus, we present
two sets of considerations for a way forward. The first assumes that
the current goaling process will continue to be used, but aims to
improve it by adding additional market data. The second explores
the idea of completely restructuring the medical goaling process by
introducing competition and incentives as tools to motivate recruit-
ers. In appendix D, we present an overview of existing literature on
recruiter incentives, which can apply to both sets of considerations.

Making medical officer recruiting more market driven. We identi-
fied some data sources that will help better define the market or the
potential recruitable population. NRC could include these data
sources in its goaling calculations. NRC should continue to use
the medical workforce data from PUMS, but we believe that the ad-
ditional sources will improve the precision of the calculations.

Restructuring the medical officer goaling process. As we mentioned
above, one consequence of goal distribution is its effect on recruiter
motivation. As we heard from recruiters, and also found in literature,
competition is one of the most effective recruiter motivators. With
that in mind, it makes sense to think about an entirely different mis-
sion allocation system, one based on self-selection and ambition.

As discussed earlier, the Air Force has already implemented such a
system. We think that the Air Force approach of assigning many of
the smaller goals, such as workforce physicians, through an auction
system has merit. These goals are simply too small to be allocated
through any sort of statistical goaling model. By letting the areas
and/or districts self-select, CNRC will be eliminating the problem of
placing too much emphasis on past production as well. Also, by al-
lowing areas to choose their specific goals, this approach promotes
direct responsibility and autonomy for the recruiters.

A medical “fantasy draft” goaling approach could be considered and


evaluated as an alternative by CNRC. Under this model, each region
or NRD would be assigned a random number. Then, according to
that order, each participant would choose a medical job category to
recruit and would be solely responsible for recruiting into that cate-
gory regardless of the region of the country. A competition could be
introduced based on completing the recruiting task. Alternatively,
the medical professions could be ranked based on difficulty to re-
cruit, and an auction draft could be implemented in which the par-
ticipants could trade their medical recruiting missions. NRC could
wait to see whether this model works successfully for the Air Force
or, alternately, conduct a study to assess this approach’s potential
for success.
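
As a thought experiment, the draft mechanics could look like the sketch below, where each region's "choice" is simplified to taking the top remaining category from a shared ranked list; the region names and categories are invented for illustration.

```python
import random

# Illustrative "fantasy draft": regions draw a random order, then take turns
# claiming whole medical goal categories to recruit nationwide.
regions = ["Region East", "Region Central", "Region West"]
categories = ["oral surgeon", "anesthesiologist", "CRNA", "radiologist",
              "general surgeon", "psychiatrist"]

random.shuffle(regions)                      # random draft order
remaining = list(categories)
picks = {r: [] for r in regions}
turn = 0
while remaining:
    region = regions[turn % len(regions)]
    picks[region].append(remaining.pop(0))   # stand-in for the region's choice
    turn += 1

for region, goals in picks.items():
    print(region, goals)
```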

Future work
In this section, we briefly describe three potential extensions of the
analysis developed in this study.

1. In an immediate follow-on of our work, we could adapt the
modeling approach used for the enlisted AC population to
the enlisted reserve population; however, we would have to
adjust for the geographic dependency on training units.
(The available population would have to be restricted to
within a 50-mile radius of a training unit.)

2. Likewise, CNA could apply this modeling method to the
officer populations (both active and reserve), potentially
generating estimates at the university level. (In the officer
case, units of analysis are primarily universities, not zip
codes.) The success of this effort, however, will depend on
data availability. We summarize the data issues in the
following bullet points:

• The active officer (NPS) population needs more accurate
home-of-record (HOR) and university data. Currently, the
university field in the Officer Master File (OMF) is missing
for almost a third of OCS recruits. For many others, the
university name is misspelled or campus information is miss-
ing. This makes it difficult to validate the existing officer
goaling method and to test new ideas. Given that current
NRC goaling methods take university degrees into account,
it is important to keep accurate records on universities for
assessment of existing calculations and for future forecast-
ing capabilities.

• There is also uncertainty associated with HOR data in the
OMF. For some officers, the data reflect their college ad-
dress, and for others, their permanent residence. In addi-
tion, it is not recorded whether the officer was recruited
while at college or at his or her permanent address. This
uncertainty prevents improvements to the current goaling
model because it limits our knowledge of where the officers
are actually being recruited.

• There is a clear need for a DOD-wide officer accession da-
tabase. Such a database would provide the necessary addi-
tional information that would allow for more robust
modeling of the officer corps.

• The reserve officer (PS) population needs more data on the
location of officers who leave active duty. We heard from
NRC and recruiters that data on PS officers are frequently
unreliable. Although all of the services experience this issue
to some extent, some services have concrete efforts in place
to update the databases used for PS recruiting, and it seems
that a special effort is required from NRC to collect and
maintain accurate and up-to-date data on officers who leave
active duty. Currently, lack of such data is the main obstacle
to PS recruiting.

3. For officers with medical skills, CNA could help restructure
the current goaling process by reviewing different ways to
use incentives and competition to motivate recruiters.
Medical officers are the hardest to recruit, and the process
does not lend itself to statistical modeling. It is worth
considering an involved qualitative analysis of potential
medical goaling alternatives.

Appendix A: Recent goal allocation concerns
and ROY winner interviews
Every year, the Navy presents 13 to 15 top enlisted and officer re-
cruiters with the Recruiter of the Year, or ROY, award. At the be-
ginning of the project, we had an opportunity to talk with some of
last year’s winners about the current recruiting environment. Those
discussions helped inform this project. Although it was not feasible to
study all topics identified during those discussions, we describe the
most salient here for reference.

Diversity recruiting
Recruiters who won awards for their diversity numbers mentioned
that they are more successful in inner-city areas, and that it helps if
the recruiter is of the same gender and racial/ethnic background as
the potential recruits. They mentioned having to travel quite a bit to
achieve their numbers, and they talked about the use of social me-
dia tools to attract recruits. They also said they don’t specifically re-
cruit to the recommended diversity targets: they do their job every
day and diversity takes care of itself. This emphasizes their reliance
on CNRC’s ability to geographically allocate diversity goals. Recruit-
ers mentioned a recent emphasis on quality rather than quantity
and a push for female, Hispanic, and African American A-cell
recruits.

Merging missions
With more extensive use of the reserves in wars in Iraq and Afghani-
stan, recruiting into the reserve component (RC) has become more
critical. Also, in a cost-constrained environment, the Navy is merg-
ing missions in its four major markets (i.e., enlisted, officer, AC, and
RC), enabling reassignment of recruiters across missions, and, in some instances, allowing a single recruiter to cover all four. This could intro-
duce cost savings by closing some of the recruiting stations.

In addition, the Navy has slowly been merging enlisted and officer
missions. In our interviews, recruiters did not express concerns
about merging active and reserve missions; in fact, they explained
that this is already being done to some degree. However, merging
enlisted and officer missions seemed to generate more concern. On
the positive side, the recruiters thought that merging these missions
would enable officer candidates to come into any office and apply
without having to go to officer recruiting. Because parts of the re-
cruitable population are similar for enlisted and officer missions,
recruiters would also be able to cross-reference high schools and
colleges. In fact, the recruiters told us that many candidates are not
sure which of the two careers they prefer when they first consider
joining the Navy.

Enlisted recruiters we interviewed thought that a qualified recruiter
can recruit enlisted and officer candidates, AC, and RC, provided
sufficient training and information. However, officer recruiters had
more concerns. Officer recruiting is often viewed as more challeng-
ing by the recruiting community, and there is a larger startup cost
of getting to know the recruiting and processing system, as well as
the available market. In addition, they felt that many officer candi-
dates have a strong preference for talking with recruiters who have
officer experience; these candidates require a special mentality and
approach. For example, officer candidates are likely to be less avail-
able to stop by a recruiting station. Instead, a recruiter often needs
to travel to them. Some recruiters worried that, given time con-
straints on recruiter training, it was not possible to inform a re-
cruiter sufficiently on both enlisted and officer processes. One
recruiter suggested that an enlisted recruiter who works to recruit
officers could do most of the work, but an officer could come in to
give a presentation on a regular basis.

Finally, recruiters brought up an additional concern: the time re-
quired for processing. Officer recruiters are given individual quar-
terly goals, and enlisted recruiters are given station monthly goals.
They also noted that it can take anywhere from 3 to 6 months for
the officer application to go through the NRC system, whereas an
enlisted application takes only a few days. This will make the proc-
esses difficult to merge.

Recruiters’ geographical range


One of the main objectives of CNRC mission allocation is to assign
goals equitably so that all recruiters get the same opportunity to
succeed in their jobs. Goals are allocated geographically based on
models in the database called Standardized Territorial Evaluation
and Analysis for Management (STEAM), which has station-level data
and an all-service accession dataset for enlisted recruiters.

Recruiters currently use STEAM to learn about their market and de-
termine how to allocate their time. For additional information, they
reach out to recruiters who have worked in the same area, and they
investigate their assigned neighborhoods for socioeconomic factors,
such as income and political affiliation.

When we asked recruiters about their geographical range, we got a
variety of answers. Several recruiters mentioned that they usually
travel within 20 miles of their recruiting station and sometimes up
to 50 miles. Many said that they would drive 1 to 1.5 hours to meet
with potential recruits, and a few said they would be willing to drive
to another state for a meeting with a promising and qualified can-
didate. Officer recruiters mentioned that they find it particularly
challenging to establish their market. Because quality of officer ap-
plicants affects their probability of selection into the Navy, they
placed great emphasis on the equitable assignment of recruiters to
areas close to colleges and universities of different quality.

Technology
The technology of recruiting has changed markedly over the last 20
years, with new marketing technologies, social networking, and
greater recruiter mobility. Of the many recent technological ad-
vances, those making it easier to keep in touch have had the most
impact on Navy recruiting.

Recruiters discussed several tools that have enhanced their produc-
tivity, as well as technologies they see as necessary for improving
their workflow. Several recruiters mentioned that they use Facebook
to communicate with potential recruits and those in the Delayed
Entry Program (DEP). Many find it easier to contact recruits via
Facebook than by phone, and they use the “Friends of Friends” fea-
ture in Facebook to find additional candidates and get references.
This technique allows them to identify candidates who are mem-
bers of swim or other athletic teams. Some also use Facebook to
check that potential recruits adhere to the Navy’s height and weight
standards, and they use this information to determine qualified
candidates or to provide mentoring. Recruiters are also finding that
texting rather than calling young recruits results in more frequent
interaction.

Because social networking sites and text messaging have become the
leading ways that young people communicate, an inability to text or go
on Facebook hurts recruiters. Regular contact with those in the De-
layed Entry Program and mentees is important, and both social
networking and texting help recruiters stay in touch. Most recruiters
expressed dissatisfaction with the amount of paperwork and with
the number of computer- and phone-related issues they encounter.
Because of frequent travel, they mentioned the need for portable
computers and printers.

Past production
Several recruiters told us that they rely on CNRC to give them
achievable goals, and they direct their time toward recruiting in
general rather than toward the specific number provided by CNRC.24 Currently, past production (i.e., the number of recruits
brought on in a district in the previous several years) makes up a
portion of the model that forecasts enlistments for each area. This is
not surprising because past production is a good measure of the re-
cruiters’ market, and recruiters agree that without past production
informing some of the goaling model, the missions would become
stagnant.

24. CNRC actually only goals Regions. NRD COs then goal recruiters.

The problem with this measure, however, is that it does not neces-
sarily incentivize recruiters in the right way. One’s hard work one
year may result in a harder task the following year, whereas poor
production could be “rewarded” with a smaller, more accomplish-
able goal. One recruiter told us that he was goaled one year with
four officers in a particular community, but delivered nine, so the
following year, he was goaled with nine, which was quite hard to ac-
complish. Recruiting models use more than one year of past pro-
duction (typically three to five), so this sharp increase in required
numbers is likely a result of the goaling process (leadership over-
sight) rather than of the goaling model alone. Nevertheless, re-
cruiters suggested that goals should be raised in a marginal manner
and based on the nation’s need, rather than on past production.

Medical officer recruiting


Our conversations with recruiters and leadership identified medical
officer recruiting as the most difficult of the recruiting tasks. Geo-
graphically allocating recruiting goals is particularly complex for the
medical field because every year the number of recruits needed for
medical programs is small (sometimes smaller than the number of
NRDs), and there is an inherent unfairness when these numbers get
distributed among NRDs and then further assigned to particular re-
cruiting stations. When these small numbers get further subdivided
by diversity requirements, the perceived unfairness is even greater.

Recruiters were unhappy with the way past production affected
these requirements because they felt that, with numbers this small,
past production was highly affected by chance alone and not by the
size of the recruitable population. We heard from several recruiters
that, if some form of competition were incorporated in medical of-
ficer goaling, the process would be more motivational and per-
ceived as engaging and challenging rather than unfair.

Appendix B: Enlisted active component (AC)
zip-code-level model results
This appendix presents model estimates for recruit production at the zip-code level for various subpopulations of the total inventory. Several models were unstable with respect to
model specifications, and some were impossible to estimate. We
present, for each subpopulation, the best estimable model.

A-cell recruits
Table 7 presents our model results for predicting A-cell recruits by
zip code.

Table 7. Model results for predicting A-cell recruits by zip code


Variable Coefficient Standard Error p-value
Distance to NRS -0.0151 0.0006 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
W & P 17-19 currently in college black 0.0100 0.0065 0.1240
W & P 17-19 currently in college Hispanic 0.0081 0.0065 0.2120
W & P 17-19 currently in college white -0.0151 0.0022 0.0000
W & P 17-19 currently in HS, year 1-3 black 0.0055 0.0014 0.0000
W & P 17-19 currently in HS, year 1-3 Hispanic 0.0141 0.0028 0.0000
W & P 17-19 currently in HS, year 1-3 white 0.0016 0.0015 0.2660
W & P 17-19 HSDG black 0.0008 0.0003 0.0110
W & P 17-19 HSDG Hispanic 0.0025 0.0004 0.0000
W & P 17-19 HSDG white 0.0003 0.0001 0.0210
W & P 17-19 senior in HS, black -0.0019 0.0009 0.0280
W & P 17-19 senior in HS, Hispanic -0.0121 0.0016 0.0000
W & P 17-19 senior in HS, white 0.0024 0.0007 0.0000
W & P 20-21 college grad black 0.0143 0.0068 0.0360
W & P 20-21 college grad Hispanic 0.0029 0.0042 0.4950
W & P 20-21 college grad White 0.0084 0.0016 0.0000
W & P 20-21 currently in HS, year 1-3 black -0.0074 0.0067 0.2710
W & P 20-21 currently in HS, year 1-3 Hispanic -0.0113 0.0059 0.0550
W & P 20-21 currently in HS, year 1-3 white -0.0298 0.0067 0.0000
W & P 20-21 HSDG black -0.0010 0.0003 0.0030
W & P 20-21 HSDG Hispanic -0.0014 0.0004 0.0010
W & P 20-21 HSDG white 0.0005 0.0001 0.0000
W & P 20-21 senior in HS, black 0.0041 0.0026 0.1120
W & P 20-21 senior in HS, Hispanic 0.0243 0.0033 0.0000
W & P 20-21 senior in HS, white 0.0073 0.0018 0.0000
Navy Awareness Index in 2009 0.6900 0.0571 0.0000
USAF recruiters in 2009 0.0743 0.0204 0.0000
USA recruiters 0.0071 0.0033 0.0350
USMC recruiters 0.0311 0.0105 0.0030
USN recruiters 0.0699 0.0044 0.0000
Violent crime 0.0002 0.0001 0.0090
Property crime 0.0002 0.0000 0.0000
Veteran population 17-44 0.3288 0.0306 0.0000
Veteran population 45-64 0.0091 0.0314 0.7720
Veteran population 65-84 -0.1408 0.0336 0.0000
Veteran population 85 and up 0.4308 0.1027 0.0000
Constant -1.9588 0.0706 0.0000
------------- ----------- ----------- --------
Non-zero model
Distance to closest college / university 0.1724 0.0135 0.0000
Distance squared -0.0023 0.0002 0.0000
College or university in zip -0.2252 0.2142 0.2930
Size of the college / university -0.1578 0.0478 0.0010
Size by distance interaction -0.0008 0.0025 0.7390
Multiple school flag -2.7610 2.7492 0.3150
Historically black college or university in zip -0.0158 0.1943 0.9350
Constant -1.9235 0.2084 0.0000

Black recruits
This model was unstable and not robust to model specifications. In
many specifications, our software was unable to estimate the model.
We present one of the successful specifications in table 8. For brev-
ity, we omit the statistically insignificant results.

Table 8. Model results for predicting the number of black recruits by zip code
Variable Coefficient Standard Error p-value
Distance to NRS -0.0118 0.0018 0.0000
State unemployment rate 0.0296 0.0117 0.0120
QMA black 17-19 0.0078 0.0021 0.0000
QMA white 17-19 -0.0079 0.0012 0.0000
W & P 20 no high school black -0.0039 0.0014 0.0060
W & P 20 GED black 0.0671 0.0238 0.0050
W & P 20 GED white 0.0319 0.0107 0.0030
W & P 20 in HS, years 1-3 black 0.0869 0.0313 0.0060
W & P 20 in HS, years 1-3 white 0.0786 0.0401 0.0500
W & P 20 HSDG -0.0102 0.0024 0.0000
W & P 20 Native American no HS 0.0041 0.0014 0.0030
W & P 20 HS senior black -0.0263 0.0134 0.0490
W & P 20 HS senior white 0.0645 0.0145 0.0000
W & P 21 college grad black -0.0929 0.0310 0.0030
W & P 21 no HS black 0.0048 0.0013 0.0000
W & P 21 GED white -0.0326 0.0126 0.0100
W & P 21 in HS years 1-3 black -0.0822 0.0297 0.0060
W & P 21 HSDG Hispanic -0.0100 0.0043 0.0200
W & P 21 no HS black -0.0056 0.0026 0.0340
W & P 21 AA degree white 0.0194 0.0096 0.0430
W & P 21 senior in HS black 0.0385 0.0138 0.0050
W & P 21 senior in HS Hispanic 0.0696 0.0230 0.0030
W & P 21 senior in HS white -0.0466 0.0133 0.0000
W & P 22 college grad black -0.0066 0.0026 0.0120
W & P 22 HSDG black 0.0056 0.0023 0.0140
W & P 22 HSDG Hispanic -0.0096 0.0052 0.0630
W & P 22 HSDG white 0.0052 0.0021 0.0150
W & P 17-19 GED black -0.0232 0.0107 0.0300
W & P 17-19 GED Hispanic -0.0458 0.0261 0.0800
W & P 17-19 HS years 1-3 black 0.0139 0.0084 0.0980
W & P 17-19 years 1-3 Hispanic 0.0028 0.0010 0.0050
W & P 17-19 years 1-3 white 0.0077 0.0019 0.0000
W & P 17-19 in college white 0.0018 0.0005 0.0010
W & P 17-19 senior in HS Hispanic -0.0103 0.0047 0.0300
W & P 23-24 college grad black 0.0062 0.0012 0.0000
W & P 23-24 AA degree black -0.0056 0.0030 0.0620
W & P 23-24 AA degree Hispanic 0.0094 0.0055 0.0900
W & P 23-24 senior in HS black 0.0541 0.0194 0.0050
W & P 23-24 senior in HS Hispanic -0.0999 0.0279 0.0000
Navy Awareness index 0.8029 0.1613 0.0000
USAF recruiters 0.1965 0.0472 0.0000
USA recruiters -0.0140 0.0084 0.0950
USMC recruiters 0.0872 0.0293 0.0030
USN recruiters 0.0135 0.0112 0.2300
Violent crime -0.0005 0.0002 0.0070
Property crime 0.0004 0.0000 0.0000
Veteran population 45-64 0.5852 0.0857 0.0000
Veteran population 65-84 -1.0567 0.1151 0.0000
Veteran population 85 and up 1.1917 0.3386 0.0000
Constant -3.6337 0.2289 0.0000
------------- ----------- ----------- --------
Non-zero model
College or university in zip -3.2560 0.4215 0.0000
Constant 1.5061 0.5197 0.0040

Black A-cell recruits

Table 9 presents our model results for predicting the number of
black A-cell recruits by zip code.

Table 9. Model results for predicting the number of black A-cell recruits by zip code
Variable Coefficient Standard Error p-value
Distance to NRS -0.0183 0.0023 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
W & P 17-19 currently in college black 0.0300 0.0142 0.0350
W & P 17-19 currently in college Hispanic -0.0556 0.0213 0.0090
W & P 17-19 currently in college white -0.0285 0.0089 0.0010
W & P 17-19 currently in HS, year 1-3 black -0.0049 0.0026 0.0570
W & P 17-19 currently in HS, year 1-3 Hispanic 0.0152 0.0096 0.1130
W & P 17-19 currently in HS, year 1-3 white 0.0114 0.0050 0.0220
W & P 17-19 HSDG black 0.0006 0.0006 0.2830
W & P 17-19 HSDG Hispanic 0.0027 0.0015 0.0670
W & P 17-19 HSDG white -0.0013 0.0004 0.0040
W & P 17-19 senior in HS, black 0.0072 0.0016 0.0000
W & P 17-19 senior in HS, Hispanic -0.0161 0.0052 0.0020
W & P 17-19 senior in HS, white -0.0035 0.0023 0.1250
W & P 20-21 college grad black 0.0056 0.0140 0.6880
W & P 20-21 college grad Hispanic 0.0284 0.0089 0.0010
W & P 20-21 college grad white 0.0031 0.0060 0.5990
W & P 20-21 currently in HS, year 1-3 black -0.0596 0.0187 0.0010
W & P 20-21 currently in HS, year 1-3 Hispanic -0.0076 0.0179 0.6710
W & P 20-21 currently in HS, year 1-3 white 0.0081 0.0218 0.7090
W & P 20-21 HSDG black -0.0008 0.0007 0.2030
W & P 20-21 HSDG Hispanic -0.0017 0.0015 0.2500
W & P 20-21 HSDG white 0.0020 0.0004 0.0000
W & P 20-21 senior in HS, black 0.0231 0.0068 0.0010
W & P 20-21 senior in HS, Hispanic 0.0318 0.0100 0.0010
W & P 20-21 senior in HS, white 0.0052 0.0060 0.3840
Navy Awareness Index in 2009 1.4917 0.1763 0.0000
USAF recruiters in 2009 0.2188 0.0540 0.0000
USA recruiters 0.0080 0.0091 0.3830
USMC recruiters 0.0217 0.0311 0.4860
USN recruiters 0.0268 0.0124 0.0300
Violent crime 0.0002 0.0002 0.2020
Property crime 0.0004 0.0000 0.0000
Veteran population 17-44 0.3167 0.0866 0.0000
Veteran population 45-64 0.4756 0.0967 0.0000
Veteran population 65-84 -1.0550 0.1266 0.0000
Veteran population 85 and up 1.0796 0.3744 0.0040
Constant -4.9429 0.2238 0.0000
------------- ----------- ----------- --------
Non-zero model
Distance to closest college / university 0.1712 0.0309 0.0000
Distance squared -0.0016 0.0005 0.0010
College or university in zip -0.1029 0.2970 0.7290
Size of the college / university -0.0248 0.0786 0.7520
Size by distance interaction -0.0038 0.0058 0.5120
Multiple school flag -1.9223 1.9512 0.3250
Historically black college or university in zip -2.2667 0.7687 0.0030
Constant -1.4263 0.3712 0.0000

Hispanic recruits
Table 10 presents our model results for predicting the number of
Hispanic recruits by zip code.

Table 10. Model results for predicting the number of Hispanic recruits by zip code
Variable Coefficient Standard Error p-value
Distance to NRS -0.0313 0.0015 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
QMA 17-19 I-IIIA black 0.0022 0.0003 0.0000
QMA 17-19 I-IIIA Hispanic 0.0056 0.0002 0.0000
QMA 17-19 I-IIIA white 0.0006 0.0000 0.0000
Navy Awareness Index in 2009 1.7594 0.1071 0.0000
USAF recruiters in 2009 0.1194 0.0362 0.0010
USA recruiters 0.0317 0.0052 0.0000
USMC recruiters 0.1885 0.0162 0.0000
USN recruiters 0.0764 0.0072 0.0000
Violent crime 0.0010 0.0001 0.0000
Property crime 0.0003 0.0000 0.0000
Veteran population 17-44 0.4855 0.0674 0.0000
Veteran population 45-64 -0.3404 0.0712 0.0000
Veteran population 65-84 -0.1335 0.0770 0.0830
Veteran population 85 and up 0.9779 0.2252 0.0000
Constant -4.0439 0.1407 0.0000
------------- ----------- ----------- --------
Non-zero model
Distance to closest college / university 0.2194 0.0189 0.0000
Distance squared -0.0036 0.0004 0.0000
College or university in zip 0.4802 0.1201 0.0000
Size of the college / university -0.0059 0.0353 0.8670
Size by distance interaction -0.0106 0.0031 0.0010
Multiple school flag -0.1532 0.1733 0.3770
Historically black college or university in zip 0.7875 0.1904 0.0000
Constant -1.1273 0.1785 0.0000

Hispanic A-cell recruits

Table 11 presents our model results for predicting the number of
Hispanic A-cell recruits by zip code.

Table 11. Model results for predicting the number of Hispanic A-cell recruits by zip code
Variable Coefficient Standard Error p-value
Distance to NRS -0.0338 0.0017 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
QMA 17-19 I-IIIA black 0.0021 0.0003 0.0000
QMA 17-19 I-IIIA Hispanic 0.0054 0.0002 0.0000
QMA 17-19 I-IIIA white 0.0007 0.0000 0.0000
Navy Awareness Index in 2009 1.8643 0.1187 0.0000
USAF recruiters in 2009 0.1421 0.0402 0.0000
USA recruiters 0.0366 0.0059 0.0000
USMC recruiters 0.1959 0.0180 0.0000
USN recruiters 0.0747 0.0079 0.0000
Violent crime 0.0009 0.0001 0.0000
Property crime 0.0003 0.0000 0.0000
Veteran population 17-44 0.4212 0.0738 0.0000
Veteran population 45-64 -0.2957 0.0771 0.0000
Veteran population 65-84 -0.1115 0.0809 0.1680
Veteran population 85 and up 0.9464 0.2329 0.0000
Constant -4.3508 0.1543 0.0000
------------- ----------- ----------- --------
Non-zero model
Distance to closest college / university 0.2254 0.0213 0.0000
Distance squared -0.0036 0.0004 0.0000
College or university in zip 0.4886 0.1349 0.0000
Size of the college / university 0.0324 0.0394 0.4100
Size by distance interaction -0.0123 0.0036 0.0010
Multiple school flag -0.0299 0.1898 0.8750
Historically black college or university in zip 0.7357 0.2103 0.0000
Constant -1.3642 0.2047 0.0000

Female recruits

Table 12 presents our model results for predicting the number of
female recruits by zip code.

Table 12. Model results for predicting the number of female recruits by zip code
Variable Coefficient Standard Error p-value
Distance to NRS -0.0234 0.0012 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
QMA 17-19 I-IIIA black 0.0062 0.0003 0.0000
QMA 17-19 I-IIIA Hispanic 0.0043 0.0003 0.0000
QMA 17-19 I-IIIA white 0.0006 0.0000 0.0000
Navy Awareness Index in 2009 0.3339 0.1113 0.0030
USAF recruiters in 2009 0.1806 0.0390 0.0000
USA recruiters 0.0139 0.0063 0.0270
USMC recruiters 0.1435 0.0199 0.0000
USN recruiters 0.0545 0.0082 0.0000
Violent crime 0.0003 0.0001 0.0340
Property crime 0.0003 0.0000 0.0000
Veteran population 17-44 0.2730 0.0547 0.0000
Veteran population 45-64 0.2248 0.0584 0.0000
Veteran population 65-84 -0.2158 0.0617 0.0000
Veteran population 85 and up 0.5422 0.1876 0.0040
Constant -2.5881 0.1348 0.0000
------------- ----------- ----------- --------
Non-zero model
Distance to closest college / university 0.1672 0.0183 0.0000
Distance squared -0.0023 0.0003 0.0000
College or university in zip -0.0297 0.1694 0.8610
Size of the college / university 0.0338 0.0458 0.4610
Size by distance interaction -0.0053 0.0032 0.0980
Multiple school flag -0.1497 0.2894 0.6050
Historically black college or university in zip -0.0822 0.2257 0.7160
Constant -1.6590 0.2308 0.0000

Female A-cell recruits


Table 13 presents our model results for predicting the number of
female A-cell recruits by zip code.

Table 13. Model results for predicting the number of female A-cell recruits by zip code
Variable Coefficient Standard Error p-value
Distance to NRS -0.0262 0.0014 0.0000
Distance to NRS squared 0.0000 0.0000 0.0000
QMA 17-19 I-IIIA black 0.0062 0.0003 0.0000
QMA 17-19 I-IIIA Hispanic 0.0042 0.0003 0.0000
QMA 17-19 I-IIIA white 0.0007 0.0000 0.0000
Navy Awareness Index in 2009 0.5327 0.1245 0.0000
USAF recruiters in 2009 0.1747 0.0441 0.0000
USA recruiters 0.0156 0.0071 0.0290
USMC recruiters 0.1556 0.0222 0.0000
USN recruiters 0.0570 0.0091 0.0000
Violent crime 0.0004 0.0001 0.0060
Property crime 0.0002 0.0000 0.0000
Veteran population 17-44 0.2659 0.0604 0.0000
Veteran population 45-64 0.2334 0.0645 0.0000
Veteran population 65-84 -0.1608 0.0675 0.0170
Veteran population 85 and up 0.5576 0.2024 0.0060
Constant -3.0119 0.1519 0.0000
------------- ----------- ----------- --------
Non-zero model
Distance to closest college / university 0.1819 0.0229 0.0000
Distance squared -0.0025 0.0004 0.0000
College or university in zip 0.0856 0.1901 0.6520
Size of the college / university 0.0656 0.0527 0.2130
Size by distance interaction -0.0079 0.0038 0.0400
Multiple school flag 0.1247 0.2783 0.6540
Historically black college or university in zip 0.0312 0.2533 0.9020
Constant -1.8709 0.2796 0.0000

Appendix C: Examples of data sources for
medical officer goaling
In table 14, we present the total number of graduates by medical
school and gender for 2008 through 2010. Table 15 displays the
number of graduates in osteopathic medicine.

Table 14. Total graduates by U.S. medical school and gender


State Medical School | 2008: Women Men All | 2009: Women Men All | 2010: Women Men All
AL Alabama 61 99 160 71 90 161 59 105 164
South Alabama 29 36 65 34 29 63 33 34 67
AR Arkansas 61 79 140 62 78 140 62 86 148
AZ Arizona 52 52 104 66 57 123 56 52 108
CA Loma Linda 80 86 166 76 84 160 62 96 158
Southern Cal-Tech 88 80 168 85 89 174 81 79 160
Stanford 52 41 93 39 39 78 39 60 99
UC Davis 51 42 93 43 42 85 53 36 89
UC Irvine 47 44 91 34 43 77 49 55 104
UC San Diego 54 72 126 52 58 110 54 65 119
UC San Francisco 76 73 149 82 63 145 81 76 157
UCLA Drew 13 11 24 20 9 29 12 5 17
UCLA-Geffen 74 75 149 77 47 124 72 75 147
CO Colorado 57 75 132 70 63 133 73 73 146
CT Connecticut 45 35 80 52 25 77 47 28 75
Yale 59 36 95 52 45 97 54 56 110
DC George Washington 92 63 155 87 77 164 102 76 178
Georgetown 98 82 180 87 102 189 98 95 193
Howard 52 47 99 58 50 108 48 49 97
FL Florida 58 56 114 70 54 124 67 63 130
Florida State 28 29 57 47 27 74 65 29 94
Miami-Miller 73 77 150 86 85 171 68 87 155
South Florida 66 49 115 60 53 113 65 55 120
GA Emory 56 57 113 51 58 109 65 61 126
MC Georgia 65 104 169 82 98 180 89 90 179
Mercer 32 21 53 27 36 63 27 33 60
Morehouse 31 20 51 34 22 56 28 16 44
HI Hawaii-Burns 37 21 58 35 25 60 28 30 58
IA Iowa-Carver 58 78 136 66 78 144 69 62 131
IL Chicago Med-Franklin 85 98 183 89 93 182 78 108 186
Chicago-Pritzker 59 39 98 51 62 113 55 60 115
Illinois 153 155 308 143 157 300 160 170 330
Loyola-Stritch 74 65 139 64 68 132 69 63 132
Northwestern-Feinberg 75 91 166 75 92 167 77 78 155
Rush 77 48 125 60 62 122 73 64 137
Southern Illinois 37 36 73 35 30 65 36 33 69
IN Indiana 118 142 260 120 147 267 116 148 264
KS Kansas 69 94 163 79 76 155 79 89 168
KY Kentucky 36 59 95 46 49 95 41 55 96
Louisville 57 89 146 52 84 136 54 84 138
LA LSU New Orleans 76 79 155 72 98 170 64 101 165
LSU Shreveport 44 53 97 49 61 110 51 61 112
Tulane 65 90 155 60 72 132 66 98 164
MA Boston 81 71 152 90 64 154 86 67 153
Harvard 88 86 174 85 90 175 78 75 153
Massachusetts 53 49 102 50 49 99 61 40 101
Tufts 78 96 174 78 91 169 70 99 169
MD Johns Hopkins 57 44 101 52 73 125 58 60 118
Maryland 85 61 146 76 64 140 98 62 160
Uniformed Services-Hebert 49 108 157 45 113 158 40 123 163
MI Michigan 74 95 169 73 88 161 85 80 165
Michigan State 51 47 98 52 38 90 65 52 117
Wayne State 115 131 246 117 135 252 144 129 273
MN Mayo 17 19 36 20 19 39 22 10 32
Minnesota 110 98 208 94 110 204 103 103 206
MO Missouri Columbia 44 45 89 46 39 85 46 54 100
Missouri Kansas City 58 27 85 51 38 89 50 40 90
St Louis 71 79 150 61 90 151 65 98 163
Washington Univ. St Louis 58 61 119 57 57 114 52 63 115
MS Mississippi 48 50 98 29 65 94 55 60 115
NC Duke 37 53 90 53 54 107 49 56 105
East Carolina-Brody 36 36 72 32 35 67 33 34 67
North Carolina 78 87 165 78 79 157 65 75 140
Wake Forest 49 55 104 49 57 106 50 65 115
ND North Dakota 29 31 60 29 30 59 28 28 56
NE Creighton 55 66 121 62 67 129 59 58 117
Nebraska 52 63 115 50 65 115 51 69 120
NH Dartmouth 33 29 62 35 28 63 40 46 86
NJ UMDNJ New Jersey 70 76 146 84 78 162 78 90 168
UMDNJ-RW Johnson 85 69 154 80 70 150 84 67 151
NM New Mexico 43 27 70 34 35 69 42 35 77
NV Nevada 27 22 49 25 26 51 27 28 55
NY Albany 77 56 133 87 51 138 67 60 127
Buffalo 66 66 132 88 61 149 58 71 129
Columbia 59 75 134 75 83 158 77 91 168
Cornell-Weill 45 51 96 52 40 92 45 53 98
Einstein 88 86 174 97 82 179 98 80 178
Mount Sinai 63 55 118 63 58 121 68 48 116
New York Medical 96 96 192 101 82 183 104 88 192
New York University 77 77 154 81 83 164 85 90 175
Rochester 57 34 91 47 42 89 44 50 94
SUNY Downstate 107 94 201 93 106 199 80 93 173
SUNY Upstate 75 68 143 70 84 154 80 70 150
Stony Brook 62 42 104 57 51 108 49 64 113
OH Case Western 65 70 135 75 93 168 71 89 160
Cincinnati 67 89 156 65 78 143 66 96 162
Northeastern Ohio 58 43 101 56 64 120 58 52 110
Ohio State 73 129 202 66 130 196 86 115 201
Toledo 68 72 140 57 78 135 56 85 141
Wright State-Boonshoft 52 37 89 56 42 98 56 34 90
OK Oklahoma 60 79 139 52 96 148 60 89 149
OR Oregon 64 39 103 59 62 121 66 61 127
PA Drexel 127 117 244 126 107 233 131 129 260
Jefferson 111 104 215 128 129 257 129 115 244
Penn State 61 66 127 67 66 133 71 77 148
Pennsylvania 70 71 141 72 79 151 84 72 156
Pittsburgh 67 71 138 73 74 147 56 75 131
Temple 85 96 181 80 83 163 74 87 161
PR Caribe 25 25 50 31 27 58 33 28 61
Ponce 38 35 73 32 34 66 29 29 58
Puerto Rico 53 47 100 42 50 92 47 47 94
San Juan Bautista 41 11 52 47 29 76 33 26 59
RI Brown-Alpert 42 28 70 54 36 90 45 52 97
SC MU South Carolina 64 72 136 60 70 130 66 72 138
South Carolina 38 38 76 35 33 68 30 48 78
SD South Dakota-Sanford 22 31 53 22 25 47 21 25 46
TN East Tennessee-Quillen 26 30 56 35 27 62 28 34 62
Meharry 51 44 95 35 34 69 59 26 85
Tennessee 56 96 152 49 89 138 55 90 145
Vanderbilt 36 58 94 46 57 103 51 67 118
TX Baylor 81 98 179 74 83 157 71 81 152
Texas A&M 35 41 76 39 38 77 41 34 75
Texas Tech 57 79 136 61 66 127 64 79 143
UT Galveston 103 94 197 103 95 198 108 113 221
UT HSC San Antonio 128 74 202 108 91 199 122 89 211
UT Houston 86 102 188 84 105 189 94 128 222
UT Southwestern 94 127 221 113 120 233 87 117 204
UT Utah 38 60 98 40 59 99 35 64 99
VA Eastern Virginia 55 54 109 57 49 106 43 56 99
Virginia 57 73 130 64 77 141 69 72 141
Virginia Commonwealth 86 95 181 82 92 174 95 92 187
VT Vermont 57 24 81 57 48 105 61 47 108
WA U Washington 95 74 169 89 91 180 89 80 169
WI MC Wisconsin 87 99 186 89 112 201 99 95 194
Wisconsin 64 71 135 99 75 174 70 71 141
WV Marshall-Edwards 15 27 42 21 31 52 23 38 61
West Virginia 38 50 88 38 60 98 37 66 103

Table 15. Graduates in osteopathic medicine
Academic Year College Total Graduates Male Female % Female
2009-10 ATSU-KCOM 165 101 64 39%
2009-10 AZCOM/MWU 137 99 38 28%
2009-10 CCOM/MWU 171 85 86 50%
2009-10 DMU-COM 207 125 82 40%
2009-10 GA-PCOM 66 29 37 56%
2009-10 KCUMB-COM 239 119 120 50%
2009-10 LECOM 215 120 95 44%
2009-10 LECOM Bradenton 142 69 73 51%
2009-10 MSUCOM 187 89 98 52%
2009-10 NSU-COM 215 107 108 50%
2009-10 NYCOM/NYIT 269 118 151 56%
2009-10 OSU-COM 82 43 39 48%
2009-10 OU-COM 110 55 55 50%
2009-10 PCOM 235 103 132 56%
2009-10 PCSOM 66 33 33 50%
2009-10 TUCOM-CA 129 68 61 47%
2009-10 TUNCOM 120 75 45 38%
2009-10 UMDNJ-SOM 100 46 54 54%
2009-10 UNE-COM 112 56 56 50%
2009-10 UNTHSC/TCOM 151 83 68 45%
2009-10 VCOM 149 63 86 58%
2009-10 Western U/COMP 203 104 99 49%
2009-10 WVSOM 161 72 89 55%
2008-09 ATSU-KCOM 166 104 62 37%
2008-09 AZCOM/MWU 149 83 66 44%
2008-09 CCOM/MWU 163 68 95 58%
2008-09 DMU-COM 197 95 102 52%
2008-09 GA-PCOM 73 35 38 52%
2008-09 KCUMB-COM 234 114 120 51%
2008-09 LECOM 217 119 98 45%
2008-09 LECOM Bradenton 159 82 77 48%
2008-09 MSUCOM 198 87 111 56%
2008-09 NSU-COM 218 105 113 52%
2008-09 NYCOM/NYIT 290 138 152 52%
2008-09 OSU-COM 82 41 41 50%
2008-09 OU-COM 103 44 59 57%
2008-09 PCOM 269 126 143 53%
2008-09 PCSOM 74 42 32 43%
2008-09 TUCOM-CA 133 51 82 62%
2008-09 TUNCOM 91 47 44 48%
2008-09 UMDNJ-SOM 92 35 57 62%
2008-09 UNE-COM 116 54 62 53%
2008-09 UNTHSC/TCOM 128 62 66 52%
2008-09 VCOM 139 73 66 47%
2008-09 Western U/COMP 196 98 98 50%
2008-09 WVSOM 101 60 41 41%

Appendix D: Review of incentives literature
Our sponsor asked us to briefly review literature on incentives; we
include our review in this appendix. For additional reading, we also
suggest the following references: Cooke [19], Cooke [20],
Samuelson et al. [21], and Jehn and Shughart [22].

Background
After the draft ended in 1973, each of the services created incentive
programs to motivate recruiters. While there are differences among
these programs, there are also many commonalities. For example,
all the services have Recruiter of the Year, Quarter, and Month
awards, as well as awards at different levels of command [23].

In addition, the services often use other kinds of incentives, such as


plaques or promotions. These are usually in place for a set period of
time and are designed to focus on service-specific needs. The needs
themselves change and can include requirements for enlistees to
have certain racial/ethnic or professional attributes. For the most
part, these incentives are similar across the services, mainly as a re-
sult of DOD regulations regarding recruiter awards [23].

Various studies have looked at the effectiveness of certain incentives


and at other factors that contribute to recruiter productivity. Com-
mon themes in the literature are that recruiting is a high-stress job
and that fear of failure is a strong motivator for recruiters. This is
particularly relevant for the most difficult types of recruiting, such
as medical recruiting.

Recruiter incentives
In a 2001 study, Emerson analyzed factors that influenced Navy re-
cruiter motivations to meet recruiting objectives. Data for the study
were collected through an online survey given to the enlisted re-
cruiting force. The survey was meant to identify recruiters’ attitudes
about certain incentives. Sabbaticals and financial awards ranked
highest on the list of extrinsic tangible incentives, but data analysis
indicated that recruiters rank intangible incentives even higher.
The top two intangible incentives were: (1) wanting to avoid letting
down their station and (2) feeling good for meeting the mission
[23].

The number one motivator of this incentive type is understandable.


According to Loving, failure can lead to poor performance evalua-
tions, which can make a recruiter less competitive for promotion
[24]. It can also mean extra training and supervision during an al-
ready challenging assignment. Removing a recruiter from his or her
position for low productivity, which may seem like a sensible course
of action, can actually reflect poorly on the commanding officer of
the recruiting station. Doing this is an acknowledgment that the
commanding officer was unable to train and lead a subordinate,
and it places strain on the remaining recruiters because the station
must still meet quotas, but with fewer personnel until a replacement
arrives.

Loving also discovered that a good command climate has the largest
impact on recruiter motivation [24]. Two other studies reveal that
how leaders choose to motivate, set goals for, and discipline recruit-
ers can affect recruiter performance (see [23 and 25]). Incentive
preferences differed by paygrade, volunteer status, and membership
in the Career Recruiting Force.

In 1993, Barfield examined recruiter productivity by:

• Geographic location

• Racial and ethnic background of the recruiter and recruit

• Incentive program

The study found that recruiters are more productive when recruit-
ing people who are like themselves. For example, women were bet-
ter at recruiting women [26]. This finding has potential policy
implications, particularly when services must target populations with
specific attributes. It may be beneficial to increase the number of
recruiters with a certain profile (e.g., male Hispanics) if, and when,
a service aims to recruit more male Hispanics. Similarly, one could
infer that some involvement of a surgeon might be helpful in re-
cruiting another surgeon. Clearly, this is difficult, in practice, when
it comes to recruiting for critical jobs in the medical field. One solu-
tion might be to have nonmedical recruiters put forth most of the
effort needed to recruit for critical medical jobs, but have some rep-
resentatives of those fields come in and give seminars on their jobs
with some regularity.

This analysis also concluded that two initiatives—the Recruiter Ad-


vancement Through Excellence (RATE) program and the Recruiter
Meritorious Advancement Program (RMAP)—succeeded in empha-
sizing recruit quality instead of simply quantity. Both programs seem
to be better than the Freeman Plan, which the Navy implemented in
1979 to award recruiters for enlisting quality people rather than fo-
cusing on quantity. Despite the perceived success of RATE and
RMAP, some in the Navy felt that there was room for improvement
regarding recruiter incentives [26]. This would indicate that a similar
program that focuses on how to recruit medical professionals could
address some of the difficulties in medical recruiting.

Garcia and Sharma judged the performance of the Navy Recruiter


Incentive Program of October 1992 to April 1993 for Expansion of
Female Representation among recruits [27]. The quality incentive
system gave recruiters two more points for every female contract.
Recruiters earning the most points received consideration for med-
als, certificates, and advancement. A policy comparison was made
using recruiting data collected from the Navy Recruiting Districts
for 1992 and 1993. Although this study looked only at female re-
cruitment efforts, in the past, the Navy has offered similar point
awards for recruiting other target populations, such as those quali-
fied to work in the nuclear field. The study concluded that giving
recruiters extra points for enlisting members of a target group is a
good strategy.

Similar to the target-specific incentives discussed in Garcia and


Sharma, the Marine Corps sometimes has seasonal recruiting drives
with rewards for enlisting certain types of people (see [24 and 27]).
The winter months can be a very difficult time for recruiters, so one
program incentivizes them to sign up high school graduates who
can attend recruit training within 30 to 60 days. This kind of cam-
paign is usually highly publicized, with recruiter performance dis-
seminated daily, to spur competition among Marines. One year
there was a baseball theme to the program, and awards included
baseballs and bats. More substantial rewards, such as meritorious
promotions, went to the most productive recruiters. This is further
evidence that spurring competition and using awards could make a
difference in medical recruiting as well.

Recent research has questioned the efficacy of traditional
measures of performance, which can act as incentives for re-
cruiters. Dertouzos and Garber argue that performance
measurements should reward recruiter skill and effort since
recruiters who have more skill secure more contracts for a
given level of effort when market quality is constant [28]. The
issue is that most metrics do not allow the separation of skill
and effort, so it is difficult to infer how much production oc-
curs as a result of either factor. Better performance measures
would isolate skill and effort and account for the following:

• Adjust for such things as the quality of local markets based on demographics or economic conditions.

• Account for differences in enlistment rates over time within local areas or regions.

• Assess differences in the difficulty of recruiting certain populations, such as high school seniors with high test scores.

These studies have shown that recruiter incentives can affect re-
cruiter performance, but they have produced few results that can be
used to guide a service in how to construct a set of incentives. There
have not been any controlled experiments on recruiter incentives, al-
though large-scale experiments have been used in the past to test
educational benefits, bonuses, and advertising. If the Navy chooses to
experiment with alternative ways of setting recruiting targets, as in
the case, for example, of medical recruits, it would be an opportune
time to set up an experiment that could produce some insights into
how such an approach affects recruiter performance.

References

[1] Nonresident Training Course. Navy Counselor 1 & C (Recruiter). NAVEDTRA 14172. Jan. 1995.

[2] Lauren Malone, Neil Carey, Yevgeniya Pinelis, and David Gregory. Waivered Recruits: An Evaluation of Their Performance and Attrition Risk. CNA Research Memorandum D0023955.A2/Final. Jan. 2011.

[3] Christopher D. Bownds. "Updating the Navy's Recruit Quality Matrix: An Analysis of Educational Credentials and the Success of First-Term Sailors." Master's thesis, Naval Postgraduate School, Mar. 2004.

[4] Daniel Goldhaber. An Econometric Analysis of the Enlisted Goaling Model. CNA Research Memorandum 2799002000. Jun. 1999.

[5] Military Leadership Diversity Commission (MLDC). Decision Paper #1: Outreach and Recruiting. Feb. 2011.

[6] James N. Dertouzos and Steven Garber. Human Resource Management and Army Recruiting: Analyses of Policy Options. RAND Corporation Report MG-433-A. 2006.

[7] Don Bohn and Edward Schmitz. A Zip Code Based Production Function. United States Navy Recruiting Command. Dec. 1992.

[8] Don Bohn and Edward Schmitz. Estimating Enlistment at the Local Level. United States Navy Recruiting Command. Apr. 1995.

[9] J. S. Long. Regression Models for Categorical and Limited Dependent Variables. Thousand Oaks, CA: Sage Publications, 1997.

[10] Jennifer Gibson et al. Zip Code Valuation Study Technical Report: Predicting Navy Accessions. Defense Human Resource Activity, JAMRS Report No. 2009-16. Dec. 2009.

[11] Michelle Dolfini-Reed et al. Demographic Dynamics of the Reserve Force Laydown. CNA Research Memorandum D0025181.A2/Final. Jul. 2011.

[12] Lauren Malone and Anita Hattiangadi. Review of Methods for the District Allocation of Prior Service Recruiters. CNA Memorandum D0019870.A1/Final. Feb. 9, 2009.

[13] Military Leadership Diversity Commission (MLDC). From Representation to Inclusion: Diversity Leadership for the 21st Century Military. 2011.

[14] Lauren Malone, Laura Kelley, and Adam Clemens. The Qualified Candidate Population: Obtaining Highly Qualified, Diverse Accessions for the USMC Officer Corps. CNA Research Memorandum D0025113.A2/Final. Aug. 2011.

[15] Jennie Wenger and Laura Kelley. Marine Corps Officer Recruiting: Which Schools Did Officers Attend? CNA Research Memorandum D0014599.A2/Final. Sep. 2006.

[16] Amanda Kraus et al. College Recruits in the Enlisted Navy: Navy Outcomes and Civilian Opportunities. CNA Research Memorandum D0010405.A2/Final. Dec. 2004.

[17] Eric W. Christensen et al. Recruiting and Manning Issues for Health Care Professionals. CNA Research Memorandum D0016562.A2/Final. Sep. 2007.

[18] Martha Farnsworth Riche and Amanda Kraus. Demographic Representation in the Bureau of Medicine and Surgery: Civilian Trends and Comparisons. CNA Research Memorandum D0017355.A2/Final. Apr. 2008.

[19] Timothy W. Cooke. Individual Incentives in Navy Recruiting. CNA Research Memorandum 86-289. Dec. 1986.

[20] Timothy W. Cooke. Navy Recruiting Initiatives: FY 1989. CNA Information Memorandum 103. Nov. 1990.

[21] Dana L. Samuelson et al. Productivity Effects of Changes in the Size of the Enlisted Recruiter Force. CNA Research Memorandum D0013975.A2. May 2001.

[22] Christopher Jehn and William F. Shughart. Recruiters, Quotas, and the Number of Enlistments. Center for Naval Analyses CNS 1073. Dec. 1976.

[23] Ellen H. Emerson. "Navy Recruiter Incentives and Motivation: A Survey of Enlisted Recruiters." Master's thesis, Naval Postgraduate School, Mar. 2001.

[24] Major J. B. Loving. "Monetary Incentives for Marine Recruiters." Master's thesis, United States Marine Corps Command and Staff College, 2001.

[25] Anne E. Aunins et al. Navy Recruiter Survey: Content Analysis of Free Response Data. Navy Personnel Research and Development Center. Mar. 1990.

[26] Lisa C. Barfield. "An Analysis of Enlisted Navy Recruiter Productivity." Master's thesis, Naval Postgraduate School, Sep. 1993.

[27] Federico Garcia and Ravi Sharma. Recruit Incentives, Recruiter Incentives, and Navy Female Enlistment. CNA Research Memorandum 95-35. May 1995.

[28] James N. Dertouzos and Steven Garber. Performance Evaluation and Army Recruiting. RAND Corporation Report MG-562-A. 2008.

[29] Dana L. Samuelson et al. Productivity Effects of Changes in the Size of the Enlisted Recruiter Force. CNA Research Memorandum D0013975.A2/Final. May 2006.

[30] Dana L. Brookshire, Anita U. Hattiangadi, and Catherine M. Hiatt. Emerging Issues in USMC Recruiting: Assessing the Success of Cat. IV Recruits in the Marine Corps. CNA Annotated Brief D0014741.A1/Final. Aug. 2006.
List of tables

Table 1. Results from the Any Contracts Model: Coefficients for modeling productive zip codes .......... 24

Table 2. Results from the Counts Model: Coefficients for modeling inventory on a zip-code level .......... 25

Table 3. Estimated list of QCP Schools .......... 48

Table 4. QCP for the Marine Corps .......... 49

Table 5. IPEDS and Navy schools .......... 50

Table 6. All inventory based on Navy rank: Navy most productive schools (2005–2009) .......... 51

Table 7. Model results for predicting A-cell recruits by zip code .......... 77

Table 8. Model results for predicting the number of black recruits by zip code .......... 79

Table 9. Model results for predicting the number of black A-cell recruits by zip code .......... 80

Table 10. Model results for predicting the number of Hispanic recruits by zip code .......... 82

Table 11. Model results for predicting the number of Hispanic A-cell recruits by zip code .......... 83

Table 12. Model results for predicting the number of female recruits by zip code .......... 84

Table 13. Model results for predicting the number of female A-cell recruits by zip code .......... 85

Table 14. Total graduates by U.S. medical school and gender .......... 87

Table 15. Graduates in osteopathic medicine .......... 91

CRM D0026005.A2/Final

4825 Mark Center Drive, Alexandria, VA 22311-1850 703-824-2000 www.cna.org
