Publication | Open Access
Comparison of PBL assessment rubrics
2009 · 10 Citations · 8 References
Keywords: Engineering, Project Management, Education, Statistics, Reliability, Team-based Project Work, Test Development, Design, Learning Analytics, PBL Assessment Rubrics, Problem-based Learning, Student Assessment, Team Submissions, Comprehensive Marking Rubric, Evaluation Measure, Project-based Learning, Higher Education Assessment, Educational Assessment, Survey Methodology
This paper investigates the development and use of a comprehensive marking rubric for assessing team-based project work in a core first-year problem-based learning (PBL) course. Students work in teams of up to eight to solve open-ended engineering problems and submit their solutions in a project report. The marking rubric is designed to assess all required aspects of the submissions, including technical components and reflective aspects in which teams reflect on their progress and problems to date and plan for future improvement. Team submissions may be assessed by any one of eleven different markers. Analysis of marking from earlier course offerings shows that this assessment was neither consistent between markers nor gave constructive feedback to the students. As a consequence, the marking rubric was redesigned and evaluated against earlier marking schemes for consistency between markers and repeatability. Results indicate that the new rubric is consistent and repeatable across markers, with the exception of one criterion which needs further development.
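The paper does not specify the statistic used to evaluate consistency between markers, but a standard way to quantify agreement between two markers scoring the same submissions is Cohen's kappa, which corrects raw agreement for chance. The sketch below is purely illustrative: the marker names and scores are hypothetical, not data from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two markers' categorical rubric scores."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of submissions scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each marker's score distribution.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical rubric bands (1-5) from two markers on ten team reports.
marker_1 = [4, 3, 5, 2, 4, 3, 3, 5, 2, 4]
marker_2 = [4, 3, 4, 2, 4, 3, 2, 5, 2, 4]
print(round(cohens_kappa(marker_1, marker_2), 3))  # → 0.73
```

A kappa near 1 indicates strong agreement between markers; values near 0 indicate agreement no better than chance. With eleven markers, a multi-rater generalisation such as Fleiss' kappa or an intraclass correlation coefficient would be the usual extension.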