Publication | Closed Access
Choosing Among Alternative Nonexperimental Methods for Estimating the Impact of Social Programs: The Case of Manpower Training
Citations: 565 | References: 8 | Year: 1989
Keywords: Labor Market Participation, Education, Program Impact, Human Resource Management, Policy Analysis, Social Work, Program Evaluation, Nonexperimental Estimators, Social Policy Research, Manpower Training, Statistics, Alternative Nonexperimental Methods, Alternative Nonexperimental Estimators, Public Policy, Economics, Employment, Selection Bias, Social Programs, Social Impact, Labor Market Outcome, Sociology, Business, Quantitative Social Science Research, Labor Market Impact, Social Policy, Unemployment
Recent studies of manpower training programs show that different nonexperimental estimators yield widely varying impact estimates, prompting calls for experimental evaluation and highlighting a lack of systematic guidance for selecting among estimators. This paper investigates whether simple specification tests can guide the choice of a suitable nonexperimental estimator for manpower training programs. Reanalysis of the National Supported Work data demonstrates that a simple testing procedure narrows the set of nonexperimental estimators to those consistent with experimental estimates of program impact.
Abstract: The recent literature on evaluating manpower training programs demonstrates that alternative nonexperimental estimators of the same program produce an array of estimates of program impact. These findings have led to the call for experiments to be used to perform credible program evaluations. Missing in all of the recent pessimistic analyses of nonexperimental methods is any systematic discussion of how to choose among competing estimators. This article explores the value of simple specification tests in selecting an appropriate nonexperimental estimator. A reanalysis of the National Supported Work Demonstration data previously analyzed by proponents of social experiments reveals that a simple testing procedure eliminates the range of nonexperimental estimators at variance with the experimental estimates of program impact.
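The specification tests the abstract refers to include the idea of a pre-program (placebo) test: a training program cannot affect earnings measured before anyone enrolled, so if an estimator attributes a significant "effect" to the program in pre-program data, its identifying assumptions are rejected. Below is a minimal, hypothetical sketch of that logic using a simple OLS placebo regression; the function name, data, and the plain regression setup are illustrative assumptions, not the paper's actual estimators.

```python
import numpy as np

def placebo_test(pre_outcome, treated, covariates=None):
    """Pre-program specification test (illustrative sketch).

    Regress a PRE-program outcome on the treatment dummy (plus any
    covariates the candidate estimator conditions on). Because the
    program cannot have caused pre-program outcomes, a significantly
    nonzero coefficient on `treated` signals selection bias, and the
    estimator relying on these controls should be rejected.
    """
    n = len(pre_outcome)
    cols = [np.ones(n), treated]
    if covariates is not None:
        cols.append(covariates)
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, pre_outcome, rcond=None)
    resid = pre_outcome - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])          # residual variance
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])  # SE of treated coef
    return beta[1], beta[1] / se  # placebo "effect" and its t-statistic

# Simulated example: a comparison group that is systematically better off
# before the program (selection on pre-program earnings).
rng = np.random.default_rng(0)
treated = np.repeat([1.0, 0.0], 250)
pre_earnings = rng.normal(0.0, 1.0, 500) - 1.0 * treated
effect, t_stat = placebo_test(pre_earnings, treated)
# |t_stat| well above 2: the naive comparison rejects itself on pre-program data.
```

In the simulated example the placebo regression finds a large spurious "effect" on pre-program earnings, so this comparison-group estimator would be screened out, which is the kind of elimination the abstract describes.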