Publication | Closed Access
Development of Automated Scoring Algorithms for Complex Performance Assessments: A Comparison of Two Approaches
72 Citations · 17 References · Year: 1997
Engineering · Generalizability Theory · Diagnosis · Education · Software Engineering · Performance Measurement · Performance Measurement Systems · Program Evaluation · Data Science · Medical Expert System · Performance Assessment · Systems Engineering · Applied Measurement · Biostatistics · Automated Assessment · Performance Metric · Reliability · Expert Systems · Expert Raters · Computer Simulation Test · Educational Testing · Decision Support Systems · Computer Science · Clinical Decision Support · Automated Scoring Algorithms · Educational Measurement · Complex Performance Assessments · Evaluation Measure · Software Testing · Performance Measure · Performance Assessments · Educational Assessment
Performance assessments are typically scored by having experts rate individual performances. The cost of using expert raters may be a serious limitation in many large‐scale testing programs, and the use of raters may also introduce an additional source of error into the assessment. These limitations have motivated the development of automated scoring systems for performance assessments. Preliminary research has shown these systems to have application across a variety of tasks, ranging from simple mathematics to architectural problem solving. This study extends research on automated scoring by comparing alternative automated systems for scoring a computer simulation test of physicians' patient management skills: one system uses regression‐derived weights for components of the performance; the other uses complex rules to map performances into score levels. The procedures are evaluated by comparing the resulting scores to expert ratings of the same performances.
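The two approaches in the abstract can be sketched in miniature. The component names, weights, intercept, and rule thresholds below are purely illustrative assumptions, not values from the study: the point is only the structural contrast between a regression-weighted composite (a continuous score from a weighted sum of performance components) and a rule-based mapping (explicit conditions that assign a discrete score level).

```python
# Illustrative sketch of the two automated scoring approaches.
# All component names, weights, and rule thresholds are hypothetical.

def regression_score(components, weights, intercept=0.0):
    """Continuous score: intercept plus a weighted sum of components."""
    return intercept + sum(weights[k] * components[k] for k in weights)

def rule_based_score(components):
    """Discrete score level assigned by explicit, ordered rules."""
    if components["risky_actions"] > 2:
        return 1  # unacceptable: too many risky actions
    if components["essential_actions"] >= 4 and components["risky_actions"] == 0:
        return 4  # excellent: thorough and safe management
    if components["essential_actions"] >= 2:
        return 3  # satisfactory
    return 2  # marginal

# A single simulated performance, summarized as component counts.
performance = {"essential_actions": 3, "risky_actions": 1, "neutral_actions": 2}
weights = {"essential_actions": 0.8, "risky_actions": -1.2, "neutral_actions": 0.0}

print(regression_score(performance, weights, intercept=1.5))
print(rule_based_score(performance))
```

In the study itself the regression weights were derived empirically and the rules were far more complex; both systems were then validated by comparing their output to expert ratings of the same performances.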