
HOW ARE WEB PAGES EVALUATED?

Here you can review the process that SciGuide Developers use to evaluate Internet URL resources. Evaluating Internet resources required rigorous training to ensure that all SciGuide Developers shared a common understanding of the evaluation rubrics and applied them reliably. There are eight rubrics in all, each created to evaluate a specific educational component of the web content. Those rubrics are listed below (a brief illustrative sketch follows the list):

  • Authority: Information presented is reliable, valid, and authoritative.
  • Design (overall): In addition to being educational, the materials are visually appealing and/or entertaining.
  • Interactivity: Interactive features, such as dynamic feedback based on user manipulation, are provided and add value to the site. Content provides for the manipulation of data sets and simulations.
  • Communication/Collaboration: Content allows communication and data exchange between content providers and students, as well as among distributed students via the Internet.
  • Scientific Inquiry: The extent to which the content, presentation method, and learner activity facilitate inquiry-based learning, supported with real-world examples.
  • How Scientists Work/Nature of Scientific Inquiry: How students learn about what scientists do in the process of inquiry.
  • Quality of Writing: Instructional and explanatory text is well written.
  • Resource Integration: The Web page is easily implemented, adds value to pre-existing resources, and is articulated to the Standards.
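As a schematic only, one reviewer's evaluation of a single web resource against these eight rubrics could be recorded in a structure like the one sketched below. The field names, the 1-5 scale, and the record layout are illustrative assumptions, not the actual SciGuide scoring system.

```python
from dataclasses import dataclass

# Hypothetical record of one reviewer's rubric scores for a single web resource.
# Field names mirror the eight rubrics above; the 1-5 integer scale is assumed.
@dataclass
class RubricEvaluation:
    url: str
    reviewer: str
    authority: int
    design: int
    interactivity: int
    communication_collaboration: int
    scientific_inquiry: int
    how_scientists_work: int
    quality_of_writing: int
    resource_integration: int
    comments: str = ""  # reviewer's rationale, visible to the end user
```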

After an initial evaluation by a SciGuide Developer, each web resource in a SciGuide receives a second and, in most cases, a third review by a SciGuide expert. These multiple reviews not only add rigor to the evaluation process but, through the visible comments each reviewer provides, also give the end user the rationale behind each individual review.

In order to come to a common understanding of the evaluation rubrics, SciGuide Developers were first asked to score a sample web page. This web page evaluation process initiated several lines of discussion about how the rubrics should be applied and about the need for further reflection on and evaluation of the process. Through repeated exposure, definition, discussion, and practice, the SciGuide Developers achieved the high level of reliability required by the project and by their own high standards.

During the early NSF-funded portion of the project, Webwatchers SciGuide Developers were led through a series of online survey tools, and the application of the rubrics was monitored by Horizon Research. The application of the rubrics was calibrated by having participants apply them individually to a single pre-evaluated web page. After all parties had evaluated the page, the frequency distribution of the ratings was reviewed along with the reviewers' comments. The frequency distribution was then presented to the large group, and the discussion that followed ensured a "tighter" application of the rubric in question. At this early stage of the process, minor editing of the rubric also occurred to help clarify areas of concern.
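For a sense of what that calibration step amounts to in practice, the sketch below tallies a frequency distribution of reviewer scores for one rubric on a single pre-evaluated page. The rubric name, the 1-5 scale, and the scores themselves are invented for illustration; only the idea of comparing a distribution of independent ratings comes from the process described above.

```python
from collections import Counter

# Hypothetical scores that several reviewers assigned to one rubric
# (say, "Scientific Inquiry") for a single pre-evaluated web page.
scores = [4, 4, 3, 5, 4, 3, 4, 2, 4, 5]

# Frequency distribution of ratings, as would be shown back to the group.
distribution = Counter(scores)
for rating in sorted(distribution):
    print(f"rating {rating}: {distribution[rating]} reviewer(s)")

# A wide spread signals that the rubric needs further discussion or wording edits;
# a tight cluster suggests the group is applying it consistently.
spread = max(scores) - min(scores)
print(f"spread between highest and lowest rating: {spread}")
```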

As this initial rubric application training came to a close, we realized that, in evaluating the online materials, we could never anticipate the myriad ways in which a page might be augmented to support science instruction. We could only evaluate a URL as it explicitly addressed an area covered by the evaluation rubrics, as delineated by the creators or designers of that particular page. For example, if the creators of a URL resource discussed ways the page could be used as a component of a larger inquiry and linked to that activity, then the front page itself would not be listed; instead, the page that provided the activity would be evaluated.

Icon Filters

Icon Filters allow teachers to selectively view Webwatcher resources sorted by specific types of information, such as hands-on investigations, lesson ideas, misconceptions, history of science, and so on. For this to be a useful feature for teachers, all SciGuide Developers are trained to apply the icons consistently.
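As a purely illustrative sketch, filtering a set of resources down to those tagged with a chosen icon could look like the following. The record fields, URLs, and icon labels are assumptions made for the example, not the actual Webwatcher data model.

```python
# Hypothetical resource records tagged with icon labels; the labels echo the
# example categories above (hands-on investigation, misconceptions, ...).
resources = [
    {"url": "http://example.org/pendulum-lab", "icons": {"hands-on investigation", "inquiry"}},
    {"url": "http://example.org/gravity-misconceptions", "icons": {"misconceptions"}},
    {"url": "http://example.org/galileo-biography", "icons": {"history of science"}},
]

def filter_by_icon(items, icon):
    """Return only the resources tagged with the requested icon."""
    return [item for item in items if icon in item["icons"]]

# A teacher choosing the "misconceptions" icon would see only matching resources.
for resource in filter_by_icon(resources, "misconceptions"):
    print(resource["url"])
```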

In addition, when evaluating an Internet resource for a specific icon (inquiry, for example), the question became: should the URL resource be evaluated as potentially supporting a facet of inquiry if augmented in some way by the classroom teacher, or does the activity support a complete inquiry or investigation without augmentation by the end user? As with the entire rubric alignment process, it was decided that the information needed to be explicitly provided on the web page for it to be labeled or identified with an icon.