Evaluation

The Engineering Pathway team aims to develop a thriving community of educators and learners actively engaged in the full lifecycle of engineering education resources, from creation to curation, dissemination, and use. Evaluating the effectiveness of Engineering Pathway is therefore a priority for its users and sponsors.

Prof. Nancy Van House of the University of California, Berkeley, an expert in digital libraries and their evaluation, continues to serve as our development and implementation consultant. We continually look for ways to improve the site and make changes daily, but we have also conducted, and continue to conduct, a number of evaluation activities that systematically assess the library's performance in several key areas.

In addition to drawing on previous studies of the effectiveness of the NEEDS digital library to inform the design of Engineering Pathway, we maintain a number of ongoing activities that provide greater insight into the user experience on Engineering Pathway.

Teaching and Learning

The Engineering Pathway digital library is designed to support improved teaching and learning by promoting its use within a larger educational context that focuses on learning goals and outcomes. Engineering Pathway has three levels of review: (1) library review, (2) comments and expert reviews, and (3) the Premier Award for Excellence in Engineering Education Courseware. The library review ensures that each catalog record contains enough information about a resource for users to quickly decide whether it might be relevant to their needs. The catalog record includes learning resource type, audience, and ABET student and learning outcome attributes, and many K-12 records include information on their relevance to state standards. Comments and reviews provide suggestions for use or applications at other universities. Where available, links are provided to relevant scholarly publications that have evaluated the resource in educational settings or to award recognition. As of March 31, 2009, over 60% of our records have at least one comment or review, and over 3,500 resources link to each other (e.g., a courseware module may link to a scholarly publication reporting the educational assessment of that courseware).
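To make the catalog-record structure described above concrete, here is a minimal sketch in Python of the kind of metadata a library-reviewed record carries. The class name, field names, and sample values (including the ABET outcome text and URLs) are illustrative assumptions, not the actual Engineering Pathway schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogRecord:
    """Illustrative sketch of a library-reviewed catalog record (field names assumed)."""
    title: str
    url: str
    learning_resource_type: str          # e.g., "courseware module", "lesson plan"
    audience: List[str]                  # e.g., ["undergraduate", "K-12"]
    abet_outcomes: List[str]             # ABET student and learning outcome attributes
    state_standards: List[str] = field(default_factory=list)   # K-12 records only
    comments_and_reviews: List[str] = field(default_factory=list)
    linked_resources: List[str] = field(default_factory=list)  # e.g., assessment publications

# Example: a hypothetical courseware module linked to a publication that assessed it
record = CatalogRecord(
    title="Statics Tutor",
    url="http://example.org/statics-tutor",
    learning_resource_type="courseware module",
    audience=["undergraduate"],
    abet_outcomes=["(a) apply knowledge of mathematics, science, and engineering"],
    linked_resources=["http://example.org/assessment-paper"],
)
```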

The Premier Award is the top level of evaluation, and its criteria are divided into three categories: instructional design, software design, and engineering content. Each category is described by a set of components and sub-components. The overall learning experience of using the software, as well as the materials in the submission packet, should demonstrate that the submission meets (and ideally exceeds) the criteria by addressing each component and sub-component. For more information, please see the Premier Award Criteria page.

The Engineering Pathway offers workshops on the use of digital libraries for improved teaching and learning. For example, see slides 4-8 on selecting learning materials and designing learning activities from the workshop "How to Improve Teaching and Learning: Selecting, Implementing and Evaluating Digital Resources in the Engineering Pathway" (PowerPoint).

Example: slide 4 from "How to Improve Teaching and Learning: Selecting, Implementing and Evaluating Digital Resources in the Engineering Pathway."

Performance and User Metrics

To perform user evaluations, we draw upon prior studies and upon the methods and instruments employed by NEEDS, which provide a useful benchmark against which we can measure progress toward current objectives. To better understand user needs, specific quantitative goals will be developed through discussions with partner communities. We are also expanding our performance-monitoring applications to automatically report access and performance measures for the system's infrastructure.

Our current infrastructure is instrumented, in accordance with our privacy policies, to track a number of key usage metrics across our collections (downloads, queries, federated search queries, 'successful' recommendations, etc.). These usage metrics allow us to monitor variations in performance and activity on Engineering Pathway in response to, for example, changing monthly themes (see our live summary of monthly usage statistics from NEEDS/SMETE/EP).
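As a rough illustration of how event logs can be rolled up into monthly usage counts of this kind, the following Python sketch tallies downloads, queries, and federated search queries per month. The tab-separated log format shown here is an assumption for illustration only, not the actual Engineering Pathway log schema.

```python
# Minimal sketch: aggregating event logs into monthly usage metrics.
# Assumed log format: "ISO timestamp<TAB>event<TAB>resource id" (illustrative only).
from collections import Counter
from datetime import datetime

def monthly_metrics(log_lines):
    """Count key events (downloads, queries, federated searches) per month."""
    counts = Counter()
    for line in log_lines:
        timestamp, event, _resource_id = line.rstrip("\n").split("\t")
        month = datetime.fromisoformat(timestamp).strftime("%Y-%m")
        counts[(month, event)] += 1
    return counts

sample_log = [
    "2009-03-02T10:15:00\tdownload\tres-101",
    "2009-03-02T10:16:30\tquery\t-",
    "2009-03-05T14:02:11\tfederated_search\t-",
]
for (month, event), n in sorted(monthly_metrics(sample_log).items()):
    print(month, event, n)
```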

Quantitative and Qualitative Evaluation

We are using qualitative evaluation techniques to understand the quality of the users' experience and how resources are used. We currently mine server log data to better understand how much time users spend with resources and how those resources are accessed. In addition to these in-depth metrics, a number of ongoing activities enable us to cross-reference these results.
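One common way to estimate time spent with a resource from server logs is to measure the gap between consecutive requests within the same session, capping long gaps so idle sessions do not inflate the estimate. The sketch below shows that general technique; the session identifiers, event format, and 30-minute cap are assumptions for illustration, not Engineering Pathway's actual log-mining method.

```python
# Sketch: estimating per-resource dwell time from session-ordered log events.
from collections import defaultdict

MAX_GAP_SECONDS = 30 * 60  # ignore gaps longer than 30 minutes (assumed cutoff)

def estimate_dwell_times(events):
    """events: iterable of (session_id, unix_timestamp, resource_id), in any order."""
    by_session = defaultdict(list)
    for session_id, ts, resource_id in events:
        by_session[session_id].append((ts, resource_id))

    dwell = defaultdict(float)  # resource_id -> total estimated seconds
    for requests in by_session.values():
        requests.sort()  # order each session's requests by timestamp
        for (ts, resource_id), (next_ts, _) in zip(requests, requests[1:]):
            gap = next_ts - ts
            if gap <= MAX_GAP_SECONDS:
                dwell[resource_id] += gap
    return dict(dwell)

# Example: the second gap (over 30 minutes) is treated as an idle period and dropped.
print(estimate_dwell_times([
    ("s1", 0, "res-101"), ("s1", 120, "res-102"), ("s1", 5000, "res-101"),
]))
```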

Survey

Engineering Pathway conducted a survey of the site's visitors in late 2006, focusing on demographics, usability, and the usefulness of the digital library's features. The results helped update the development team's understanding of current user needs and focus efforts on the features that visitors indicated were important to them.

Heuristic Evaluation

We are also performing heuristic evaluations based on general web-design heuristics and on heuristics more specific to the design of digital repositories. Usability heuristics are general principles for user interface design that help improve a site's user-centered design and the experience of its visitors. For an introduction to some of the most widely used heuristics for web site evaluation, see Nielsen's 10 usability heuristics and Bruce Tognazzini's first principles of interaction design.

Usability Studies

We are performing a number of task-based usability studies with select users from Engineering Pathway's target user groups. These studies are organized around specific tasks, such as searching, registering, cataloging, and commenting on resources. We have also performed a number of lead-user interviews to identify emerging directions and needs for the Engineering Pathway site. Under the direction of Professor Nancy Van House (Information School, University of California at Berkeley), we have conducted two user needs and usability studies of the Engineering Pathway site (Higher Education and K-12) and have greatly benefited from the feedback.

For an explanation of each usability assessment method we recommend the usability methods table at usabilitynet.org.

Advisory Committee

John Prados, Co-Chair, served as president of ABET (1991-92), as senior education associate in the NSF Engineering Directorate (1994-97), and as founding editor-in-chief of the Journal of Engineering Education. He also served as vice president of the University of Tennessee, where he received a Ph.D. in chemical engineering.

Lee Dirks, Co-Chair, is a senior manager at Microsoft Research, where he is responsible for directing the company's efforts in Scholarly Communications. Lee manages research programs related to open access to research data, interoperability of archives and repositories, and the preservation of digital information.

Jack Lohmann, Vice Provost at Georgia Tech

Norm Fortenberry, Director, Center for the Advancement of Scholarship on Engineering Education (CASEE)

Lillian Wu, Program Executive, IBM University Relations and Innovation

Tony Hey, Technical Computing Initiative, Microsoft

Read some of our past publications on evaluation:
Reitsma, Rene F., Marshall, Byron, and Zarske, Malinda S. (2009). "Aspects of 'Relevance' in the Alignment of Curriculum with Educational Standards," Information Processing and Management (IPM). (submitted)
Marshall, Byron, Reitsma, Rene F., and Zarske, Malinda S. (2009). "Differentiating Search and Evaluation Behavior in K-12 Digital Libraries," Joint Conference on Digital Libraries (JCDL), Austin, TX, June 2009. (submitted)
Tront, Joe and Agogino, Alice M. (2009). "How to Improve Teaching and Learning: Selecting, Implementing and Evaluating Digital Resources in the Engineering Pathway," ASEE Workshop, ASEE Annual Meeting, Austin, TX, June 15, 2009. Presentation (PowerPoint).
Agogino, Alice (2008). "Introduction to Evaluating, Selecting and Using Digital Learning Resources," Workshop for the Indo-US Collaboration for Engineering Education, Mysore, India, July 18, 2008.
Teng, X., Muramatsu, B., Zhang, J.W., Tront, J.G., McMartin, F., and Agogino, A.M. (2004). "Implementation of Quality Evaluation for Web-based Courses and Digital Learning Resources," Proceedings of the 3rd International Conference on Web-based Learning, Aug. 8-11, 2004, Tsinghua University, Beijing, China. http://www.icwl2004.org/. Archival version published in Lecture Notes in Computer Science, Eds. Wenyin Liu, Yuanchun Shi, Qing Li, Springer-Verlag GmbH, ISBN 3-540-22542-0, vol. 3143, p. 379, 2004.
Teng, X., Tront, J.G., Muramatsu, B., and Agogino, A. (2005). "Best Practices in the Design, Development and Use of Courseware in Engineering Education," Proceedings of the 2005 Frontiers in Education Conference, October 19-22, 2005, Indianapolis, IN.
Shuang, S., Dong, A., and Agogino, A.M. (2002). "Modeling Engineering Information Needs," Journal of Computing and Information Science in Engineering, vol. 2, no. 3, Sept. 2002, pp. 199-207.
Dong, A. and Agogino, A.M. (2001). "Design principles for the information architecture of a SMET education digital library," Proceedings of the 2001 ACM/IEEE Joint Conference on Digital Libraries, June 2001, Roanoke, VA.
Puzniak, J. and McMartin, F. (2000). "Building a digital learning community for faculty on the Internet," Proceedings of the 2000 American Society for Engineering Education Annual Conference, June 18-21, 2000, St. Louis, MO.
McMartin, F. (2000). NEEDS User Study: Observations at the ASEE, June 2000.
McMartin, F. Preliminary findings from "Science, Mathematics, Engineering, and Technology Education User Study" focus groups.