Evaluation of best practices in the design of online evidence-based practice instructional modules. Academic Article

abstract

  • OBJECTIVES: This study assessed the extent to which freely available online modules for teaching critical thinking and evidence-based practice (EBP) in the health sciences follow established best practices.

  • METHODS: In phase I, an evaluation rubric was created after a review of the literature; individual rubric questions were assigned point values and grouped into sections, and the sections were weighted. In phase II, Internet platforms were searched to locate online EBP modules, which were screened against predetermined inclusion criteria. Phase III comprised a first evaluation, in which two authors assessed each module, followed by a second evaluation of the top-scoring modules by five representatives from different health sciences units.

  • RESULTS: The rubric's 28 questions were grouped into 4 sections: content, design, interactivity, and usability. Of 170 online modules retrieved, 91 were closely screened, 42 entered the first evaluation, and 8 entered the second. Modules in the first evaluation earned, on average, 59% of available points; modules in the second earned an average of 68%. Both evaluations showed a moderate level of inter-rater reliability.

  • CONCLUSIONS: The rubric was effective and reliable in evaluating the modules. Most modules followed best practices for content and usability but not for design and interactivity.

  • IMPLICATIONS: By systematically collecting and evaluating instructional modules, the authors identified many elements potentially useful for module creation and, by reviewing the limitations of the evaluated modules, were able to anticipate and plan ways to overcome potential issues in module design.

altmetric score

  • 4.45

author list (cited authors)

  • Foster, M. J., Shurtz, S., & Pepper, C.

citation count

  • 15

publication date

  • January 2014