Midlands State University Library

Comparison of automated scoring methods for a computerized performance assessment of clinical judgment / created by Polina Harik, Peter Baldwin, Brian Clauser

By: Harik, Polina | Baldwin, Peter | Clauser, Brian
Material type: Text
Publication details: Philadelphia : Sage, 2013
Content type:
  • text
Media type:
  • unmediated
Carrier type:
  • volume
Summary: Growing reliance on complex constructed response items has generated considerable interest in automated scoring solutions. Many of these solutions are described in the literature; however, relatively few studies have been published that compare automated scoring strategies. Here, comparisons are made among five strategies for machine-scoring examinee performances of computer-based case simulations, a complex item format used to assess physicians’ patient-management skills as part of the Step 3 United States Medical Licensing Examination. These strategies utilize expert judgments to obtain various (a) case-specific or (b) generic scoring algorithms. The various compromises between efficiency, validity, and reliability that characterize each scoring approach are described and compared.
Holdings
Item type: Journal Article
Current library: Main Library - Special Collections
Call number: BF39 APP
Vol info: Vol. 37, No. 8, pages 587-597
Barcode: SP17345
Status: Not for loan
Notes: For in-house use only
