Abstract: Assessment of educational outcomes through purchased tests is commonplace in the evaluation of individual student ability and of educational programs. Focusing on the assessment of writing performance in a longitudinal study of first-time, full-time students (n = 598), this research describes the design, use, and assessment of an open-source alternative to purchased tests. Augmenting usability testing, the research design relies on a framework of inter-reader agreement, inter-reader reliability, and coefficients of determination. The open-source, web-based portfolio assessment system yielded rates of agreement, reliability, and determination superior to those of the traditional paper-based portfolio assessment method. In addition, the system appears to be ideally suited to assessing ePortfolios created to showcase student ability in digital environments: agreement ranged from 82% to 98%; reliability coefficients ranged from .472 (p < .01) to .827 (p < .01); and the coefficient of determination was R² = .78, F(4, 74) = 65.75 (p < .01). This novel application of an open-source platform for outcomes assessment provides the foundation for a sound validity argument, the eradication of human error, and complete system transparency and flexibility. Future research directions point to the need for the design and assessment of an open-source system able to capture the socio-cognitive environment in which student learning occurs.
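The abstract reports three families of statistics: percent inter-reader agreement, a reliability coefficient with a significance test, and a coefficient of determination with an F test on 4 and 74 degrees of freedom. The sketch below illustrates how such quantities are conventionally computed; the sample data, the choice of Pearson's r as the reliability coefficient, and the regression setup are illustrative assumptions, not the authors' procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores from two independent readers on a 6-point rubric.
# These values are illustrative, not the study's data.
reader_a = np.array([4, 5, 3, 4, 6, 2, 5, 4, 3, 5])
reader_b = np.array([4, 5, 4, 4, 6, 3, 5, 4, 3, 4])

# Inter-reader agreement: percent of portfolios scored identically,
# plus the common "adjacent agreement" (within one scale point) convention.
exact_agreement = np.mean(reader_a == reader_b) * 100
adjacent_agreement = np.mean(np.abs(reader_a - reader_b) <= 1) * 100

# Inter-reader reliability: correlation between the two readers' scores
# (Pearson's r is assumed here; the study may have used another coefficient).
r, p_r = stats.pearsonr(reader_a, reader_b)

# Coefficient of determination: R^2 from regressing an outcome (e.g., a course
# grade) on several portfolio sub-scores, with an overall F test. Using 79
# cases and 4 predictors reproduces the F(4, 74) degrees of freedom reported.
rng = np.random.default_rng(0)
X = rng.normal(size=(79, 4))
y = X @ np.array([0.8, 0.5, 0.3, 0.2]) + rng.normal(scale=0.5, size=79)

X1 = np.column_stack([np.ones(len(y)), X])      # add intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)   # ordinary least squares fit
resid = y - X1 @ beta
ss_res = np.sum(resid ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

k = X.shape[1]                                  # number of predictors
df1, df2 = k, len(y) - k - 1
f_stat = (r2 / df1) / ((1 - r2) / df2)
p_f = stats.f.sf(f_stat, df1, df2)

print(f"exact agreement: {exact_agreement:.0f}%  adjacent: {adjacent_agreement:.0f}%")
print(f"reliability: r = {r:.3f} (p = {p_r:.3g})")
print(f"determination: R^2 = {r2:.2f}, F({df1}, {df2}) = {f_stat:.2f} (p = {p_f:.3g})")
```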