2013 – 2014


Evaluation-Specific Methodology

Date: April 29th, 2014
Presenter(s): Dr. Michael Scriven - Distinguished Professor, Claremont Graduate University and Co-Director, Claremont Evaluation Center
Abstract: Professional, scientifically acceptable evaluation requires a wide range of competencies from the social science methodology toolkit, but that's not enough. Unlike the old-time social scientists, you've got to get to an evaluative conclusion—that's what it means to say you're doing evaluation—and to do that you apparently have to have some evaluative premises. Where do those come from, and how can you validate them against attack by people who don't like your conclusion? The way most evaluators—and the texts—do this is by relying on common agreement or intuitions about what value claims are correct, but of course that won't work when there's deep disagreement, e.g., about abortion, suicide hotlines, creationism in the science curriculum, 'natural' medicine, healthcare for the poor, torture, spying, and war. Evaluation-specific methodology covers the selection and verification/refutation of all value claims we encounter in professional evaluation; how to integrate them with data claims (and data syntheses) by inferences from and to them; and how to represent the integrated result in an evaluation report. Important sub-topics include: rubrics, needs assessment, the measurement of values, crowd-sourced evaluation, and the special case of ethical value claims. (Not to mention a completely new philosophy of science.) LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

Racial Equity Lens for Culturally Sensitive Evaluation Practices

Date: April 16th, 2014
Presenter(s): Mr. Willard Walker - Senior Policy Consultant, Public Policy Associates; Dr. Paul Elam - Project Manager, Public Policy Associates; and Dr. Christopher Dunbar - Professor, MSU
Abstract: We believe that this work will play a critical role in efforts to identify particular aspects of diversity, inclusion, and equity, and their relevance to an evaluation process. We have intentionally embedded these concepts to emphasize the importance of racial and cultural proficiency throughout analysis and evaluation processes. In this way, a researcher is moved away from simply assessing outcomes to recognizing the historical events that play a part in maintaining the adverse conditions that exist in underserved communities. We do our work through a lens that acknowledges white privilege and structural racism as fundamental forces in understanding how conditions came to be as they are today. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

CANCELED: The Detroit Sexual Assault Kit (SAK) Action Research Project: Developmental Evaluation in Practice

Date: April 9th, 2014
Presenter(s): Dr. Rebecca Campbell—Professor of Ecological-Community Psychology, MSU
Abstract: THIS EVENT WILL BE RESCHEDULED FOR FALL 2014. In 2009, 11,000+ sexual assault kits (SAKs) were discovered in a Detroit Police Department property storage facility, most of which had never been forensically tested and/or investigated by the police. In 2011, a multi-stakeholder group convened to develop long-term response strategies, including protocols for notifying victims whose kits had been part of the backlog. In this presentation, I will describe the process by which we used developmental evaluation theory to create a mixed-methods evaluation of this initiative. I will also summarize the numerous challenges (psychological, ethical, and legal) we have faced in attempting to locate survivors so many years later to evaluate the efficacy of the protocols developed by the multidisciplinary team. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

Assessing Cultural Competence in Evaluation

Date: April 2nd, 2014
Presenter(s): Dr. Jody Brylinsky—Professor of Health, Physical Education, and Recreation, WMU
Abstract: Cultural competence is the ability to function effectively in diverse cultures. This presentation will attempt to create an awareness of how cultural competence affects evaluation in often implicit ways. Participants will be asked to reflect on how their particular culture may be biasing their evaluation practices. An understanding of one's personal level of cultural bias is important because it affects: (1) academic and interpersonal skills, (2) understanding and appreciation of cultural differences and similarities, (3) willingness to draw on community-based values, traditions, and customs, and (4) valuing diversity both between and within groups. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

Guidelines for Developing Evaluation Recommendations

Date: March 26th, 2014
Presenter(s): Dr. Lori Wingate—Assistant Director, The Evaluation Center, WMU
Abstract: Michael Scriven has said “an evaluation without recommendations is like a fish without a bicycle.” Nonetheless, most evaluation clients expect an evaluator to produce recommendations for program improvement. In this session, I will present guidelines for evaluation recommendations, covering their development, delivery, and follow-up. Most of the session will be discussion—participants are invited to share their trials, tribulations, successes, and lessons learned with regard to providing recommendations based on evaluation findings. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

ASSESS: Web-based Assessment Instrument Selection for Engineering Education

Date: March 19th, 2014
Presenter(s): Dr. Denny Davis—Emeritus Professor, Washington State University
Abstract: Engineering educators and scholars of student learning are challenged to improve and document student achievements for program accreditation, grading, and institutional accountability. The difficulty of finding appropriate assessment instruments is exacerbated when engineering educators are untrained in educational assessment and when evaluation professionals are unfamiliar with the specialized knowledge and professional skills of engineering. The Appraisal System for Superior Engineering Education Evaluation-instrument Sharing and Scholarship (ASSESS) addresses these challenges by providing educators and evaluation professionals with a web-based catalog of information about engineering education assessment instruments. ASSESS enables searches and comparisons to facilitate selection of instruments based on the constructs assessed, ABET (formerly Accreditation Board for Engineering and Technology) criteria, and technical and administrative characteristics. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.
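
To make the idea of attribute-based instrument selection concrete, here is a minimal, purely hypothetical sketch of the kind of filtering a catalog like ASSESS supports. The record fields, example instruments, and ABET outcome labels below are illustrative assumptions, not the actual ASSESS data model.

```python
# Hypothetical sketch of attribute-based filtering of assessment-instrument records.
from dataclasses import dataclass, field


@dataclass
class Instrument:
    name: str
    constructs: set = field(default_factory=set)     # e.g., {"teamwork", "design"}
    abet_outcomes: set = field(default_factory=set)  # ABET student outcomes addressed
    admin_time_minutes: int = 0                      # administrative burden


# Made-up catalog entries for illustration only
catalog = [
    Instrument("Team Process Survey", {"teamwork"}, {"5"}, 15),
    Instrument("Design Review Rubric", {"design", "communication"}, {"2", "3"}, 30),
]


def find_instruments(catalog, construct=None, abet_outcome=None, max_minutes=None):
    """Return names of instruments matching the requested construct,
    ABET outcome, and administration-time budget."""
    hits = []
    for inst in catalog:
        if construct and construct not in inst.constructs:
            continue
        if abet_outcome and abet_outcome not in inst.abet_outcomes:
            continue
        if max_minutes is not None and inst.admin_time_minutes > max_minutes:
            continue
        hits.append(inst.name)
    return hits


print(find_instruments(catalog, construct="design", max_minutes=45))
```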

Improving the Design of Cluster Randomized Trials

Date: March 12th, 2014
Presenter(s): Carl Westine—IDPE Student, WMU
Abstract: Evaluators and researchers rely on estimates of parameter values, including effect sizes and variances (unconditional or conditional), to appropriately power cluster randomized trial designs. Individual disciplines, including education, increasingly test interventions using more complex hierarchical linear model structures. To improve the design of these studies, researchers have emphasized the development of precise parameter value estimates through meta-analyses and empirical research. In this presentation, I summarize recent research on empirically estimating design parameters, with an emphasis on intraclass correlations and the percentage of variance explained by covariates (R²). I then demonstrate how these parameter values are becoming increasingly accessible through software. Using Optimal Design Plus, I show how evaluators and researchers can utilize these parameter value estimates to improve the design of cluster randomized trials. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.
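
For readers unfamiliar with how these parameters enter a power calculation, the sketch below works through the standard two-level formulas of the kind implemented in tools such as Optimal Design Plus. The effect size, ICC, R², and cluster counts in the example are illustrative assumptions, not values from the presentation.

```python
# Minimal sketch: power for a balanced two-level cluster randomized trial
# with treatment assigned at the cluster level. Parameter values are illustrative.
from scipy.stats import t, nct


def crt_power(delta, rho, J, n, r2_between=0.0, r2_within=0.0,
              n_cluster_covariates=0, alpha=0.05):
    """Approximate two-sided power.

    delta: standardized effect size (total SD units)
    rho:   intraclass correlation (ICC)
    J:     total number of clusters (split evenly across arms)
    n:     individuals per cluster
    r2_between / r2_within: proportion of level-2 / level-1 variance
                            explained by covariates
    """
    # Variance of the estimated treatment effect in the standardized metric
    var_effect = 4.0 * (rho * (1 - r2_between)
                        + (1 - rho) * (1 - r2_within) / n) / J
    ncp = delta / var_effect ** 0.5            # noncentrality parameter
    df = J - n_cluster_covariates - 2          # cluster-level degrees of freedom
    t_crit = t.ppf(1 - alpha / 2, df)
    return (1 - nct.cdf(t_crit, df, ncp)) + nct.cdf(-t_crit, df, ncp)


# Example: a cluster-level covariate explaining 50% of between-cluster variance
# noticeably raises power for the same number of clusters.
print(round(crt_power(delta=0.25, rho=0.20, J=40, n=25), 3))
print(round(crt_power(delta=0.25, rho=0.20, J=40, n=25,
                      r2_between=0.50, n_cluster_covariates=1), 3))
```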

“Expectations to Change” (E2C): A Participatory Method for Facilitating Stakeholder Engagement with Evaluation Findings

Date: February 26th, 2014
Presenter(s): Dr. Adrienne Adams—Assistant Professor of Ecological-Community Psychology, Michigan State University
Abstract: Program evaluators strive to conduct evaluations that are useful to stakeholders. To achieve this goal, it is critical to engage stakeholders in meaningful ways throughout the evaluation. The “Expectations to Change” (E2C) process is an interactive, workshop-based method for engaging stakeholders with their evaluation findings as a means of promoting evaluation use and building evaluation capacity. In the E2C process, stakeholders are guided through establishing standards, comparing actual results to those standards to identify areas for improvement, and then generating recommendations and concrete action steps to implement desired changes. In this presentation, I will describe the process, share findings from an evaluation of its effectiveness, and discuss its general utility in evaluation practice. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

The Process of Evaluating Grant Applications at the National Institutes of Health (NIH)

Date: February 19th, 2014
Presenter(s): Dr. Stephen Magura—Director of The Evaluation Center, WMU
Abstract: The National Institutes of Health (NIH) grant application evaluation process is considered a model for the field. The presentation will describe this process and discuss recent changes that were made in the rating scheme in response to perceived shortcomings. The effect of these changes on the evaluation of grants will be discussed and critiqued. (The presenter is a permanent member of an NIH grant review committee and has been reviewing NIH grants for 25 years.) LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

International Large-Scale Assessments: TIMSS and PIRLS – A Guide to Understanding Learning Outcomes Locally and Globally

Date: February 13th, 2014
Presenter(s): Dr. Hans Wagemaker—Executive Director of the International Association for the Evaluation of Educational Achievement (IEA)
Abstract: The last three decades have witnessed considerable growth in, and development of, international large-scale assessments. Despite the wealth of data collected through the Progress in International Reading Literacy Study (PIRLS) and Trends in International Mathematics and Science Study (TIMSS) research programs, much of the media attention continues to focus on international rankings. This presentation will provide an overview of the International Association for the Evaluation of Educational Achievement's (IEA) TIMSS and PIRLS with a view to providing participants with an understanding of the purpose, practices, and challenges associated with international large-scale assessments. Using examples from the US and other countries, it will address some of the common concerns expressed about the validity of these assessments and their use in understanding learning outcomes locally and globally. LOCATION: Bernhard Center 204 TIME: NOON- 1:30 P.M.

Completing the Evaluation Café Remodel: Format, Guidelines, and Feedback

Date: February 5th, 2014
Presenter(s): Kelly Robertson—Evaluation Café Coordinator and Senior Research Associate, The Evaluation Center, WMU
Abstract: The Evaluation Café has a new logo, food, and day of the week, and now different format options and guidelines. Please join us to help complete the Eval Café remodel by learning about and providing your feedback on the new format options, guidelines, and information submission process. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.

Integrating Traditional Evaluation with Agent-Based Simulation of Complex Behavior

Date: January 29th, 2014
Presenter(s): Dr. Jonathan Morell—Director of Evaluation, Fulcrum Corporation; Dr. Van Parunak—Senior Scientist, SoarTech
Abstract: We will make the case for continually iterating between traditional evaluation methods and agent-based simulation over the course of an evaluation's life cycle. First we will discuss the value of any kind of simulation as an evaluation tool. We will then discuss the nature of complex systems and show why agent-based simulation provides information that would not otherwise be available. We will conclude with a demonstration of a scenario in which traditional evaluation methods and an executable agent-based model inform each other in an evaluation of best-practice adoption. The result of using both approaches will be an enhanced ability to anticipate unexpected results of program behavior and a deeper understanding of the implicit assumptions that guide program development and program execution. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M.
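
As a concrete illustration of what an agent-based model of best-practice adoption can look like, here is a minimal, self-contained sketch. The adoption rule, peer structure, and all parameter values are hypothetical assumptions for illustration; they are not the presenters' model.

```python
# Hypothetical agent-based sketch: staff adopt a best practice based on a small
# baseline rate (e.g., training) plus pressure from peers they consult each period.
import random

random.seed(42)

N_AGENTS = 200        # staff members in the program
PEERS = 6             # peers each agent consults per period
BASE_RATE = 0.02      # chance of adopting on one's own
PEER_WEIGHT = 0.30    # added adoption pressure per share of adopting peers
ROUNDS = 20           # evaluation periods (e.g., months)

adopted = [False] * N_AGENTS
adopted[:5] = [True] * 5   # a few early adopters seeded by the program

for period in range(1, ROUNDS + 1):
    new_state = adopted[:]
    for i in range(N_AGENTS):
        if adopted[i]:
            continue
        peers = random.sample(range(N_AGENTS), PEERS)          # random peer contacts
        peer_share = sum(adopted[j] for j in peers) / PEERS     # fraction who adopted
        if random.random() < BASE_RATE + PEER_WEIGHT * peer_share:
            new_state[i] = True
    adopted = new_state
    print(f"period {period:2d}: {sum(adopted):3d} adopters")
```

Running the sketch shows an S-shaped adoption curve emerging from purely local interactions, the kind of pattern that a simple linear extrapolation from early evaluation data would miss.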

Three-Paper Dissertation: Guidelines and Panel Discussion

Date: January 22nd, 2014
Presenter(s): Dr. Chris Coryn - Director of the IDPE, WMU; Dr. Kurt Wilson - Former IDPE Student, WMU; and Carl Westine - Current IDPE Student, WMU
Abstract: This session will cover guidelines for the new IDPE three-paper dissertation format and strategies for choosing the dissertation format that is right for you. Additionally, the first two IDPE students to use the three-paper dissertation format will talk about their experiences and lessons learned. LOCATION: 4410 Ellsworth Hall TIME: NOON - 1 P.M.

Evaluation Café and Workshop with Dr. Beverly Parsons

Date: December 5th, 2013
Presenter(s): Dr. Beverly Parsons - Executive Director, InSites, and President-Elect of AEA
Abstract: In the Evaluation Café, Dr. Beverly Parsons provides a framework for developing a systems-oriented evaluation. She provides a handout showing a step-by-step process for positioning a systems-oriented evaluation to fit a particular context. She illustrates how to zoom in and out between a large-scale evaluation framework and a specific evaluation using a systems orientation. She has used this framework in the education, social services, and health fields and is currently working on its connection to environmental sustainability. LOCATION: 4410 Ellsworth Hall TIME: NOON - 1:30 P.M.

During the workshop, Dr. Parsons provides practical systems-oriented evaluation tools and processes organized around four phases of evaluation: evaluation design, data collection, making meaning from data, and shaping practice. The focus within the design phase is on the connection of a specific evaluation to a larger systems-change or sustainability purpose. (This is a transition from the topic of the Evaluation Café.) The data collection discussion focuses on systems-oriented questions that draw on multiple systems theories. The conversation about making meaning from data brings in visualization of data interpretation and consideration of system dynamics. The shaping-practice discussion focuses on using evaluation results and processes within communities of practice that are designed to address systems change and sustained movement in a desired direction. LOCATION: 4410 Ellsworth Hall TIME: 1:45 P.M. - 3:30 P.M.

Walking the Line: Reflections on Evaluation Practice Across Contexts

Date: November 26th, 2013
Presenter(s): Dr. Juna Snow - Senior Evaluation Specialist, The Research Group, University of California, Berkeley
Abstract: Dr. Snow will share her professional experiences since completing her doctoral degree at the University of Illinois at Urbana-Champaign in 2005. "Walking the line" refers to balancing the worlds of academia and consulting, with their inherent tensions and ill-aligned purposes. Dr. Snow's academic specialty is educational evaluation, with content areas of science and technology in education, while her evaluation practice has proven to be transdisciplinary, taking her across contexts as varied as railroad switching operations, food and wine hospitality, and sleep health hygiene. Dr. Snow intends this to be a dialogic presentation and welcomes questions from attendees about her professional experiences and lessons learned as an evaluation practitioner who has worked within the academic, governmental, and independent-consulting sectors. LOCATION: 4410 Ellsworth Hall TIME: NOON- 1:00 P.M. FOOD: Cottage Inn

Winning Evaluation Grants and Contracts: Tips and Tricks for New Investigators

Date: November 6th, 2013
Presenter(s): Dr. Daniela Schroeter
Abstract: This informal presentation will provide guidance on how to pursue grants and contracts in evaluation. Topics covered will include identifying opportunities, writing and re-writing different aspects of proposals, and developing relationships and networks. Experienced principal investigators are encouraged to participate to stimulate discussion and share their experiences with different funding agencies from a range of perspectives. Students, junior staff, and other new investigators are invited to submit questions to the café in advance or to raise questions during the presentation. LOCATION: 4410 Ellsworth Hall TIME: NOON - 1:00 P.M. FOOD: Campus Beet

Pizza Evaluation

Date: October 30th, 2013
Presenter(s): Alex Manga - IDPE student
Abstract: This Evaluation Café will focus on criteria used in commercial applications to evaluate pizza quality and product positioning. A brief history of pizza origins will be discussed, as well as an in-depth look at today's industry standards for pizza quality across multiple dimensions. A few of the questions that will be addressed in this Evaluation Café include: What is good pizza sauce? What is the difference between franchise and local pizza sauces? How important is cheese? How do pizza connoisseurs evaluate pizza? What standards do they use? How can I become a pizza connoisseur? LOCATION: To be determined TIME: NOON - 1:00 P.M. FOOD: Pizza from various restaurants including DaVinci's in Sturgis, MI (https://www.facebook.com/pages/Davincis-Sturgis-Michigan/188678151144333)

CANCELED: AEA Evaluation 2013 Practice Session

Date: October 9th, 2013
Presenter(s): Evaluation Center staff and/or IDPE students
Abstract: In anticipation of the upcoming American Evaluation Association annual conference, we will feature four preregistered presenters who will screen their conference sessions for our Evaluation Café audience. Each presenter is limited to 15 minutes, which includes content and question-and-answer time. This session is perfect for attendees who cannot attend the annual conference or who wish to free up their conference schedules by catching WMU-related sessions ahead of time. Attendees will be asked to complete brief feedback forms for each presenter. LOCATION: 4450 Sangren Hall TIME: NOON - 1:00 P.M. FOOD: Cottage Inn

Meta-Analysis as a Method of Multi-Site Evaluation: An Example from International Development

Date: September 25th, 2013
Presenter(s): Dr. Chris Coryn—Director of IDPE, WMU; Kristin A. Hobson—IDPE Student, WMU; Robert McCowen—IDPE Student, WMU
Abstract: Although interest in multi-site evaluation has grown, the nascent body of literature on the topic comes largely from writings about place-, group-, and cluster-randomized controlled trials. Guidance on how to execute uncontrolled multi-site evaluations is scarce and usually treats sites as homogeneous. In this Evaluation Café, the presenters will discuss and demonstrate how meta-analysis techniques were used to evaluate the effects of Heifer International projects in Albania, Nepal, and Uganda, with an emphasis on nutritional outcomes for project participants. LOCATION: 4450 Sangren Hall TIME: NOON - 1PM FOOD: Cottage Inn
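
As background on the general technique named here, the sketch below works through a standard inverse-variance random-effects meta-analysis (DerSimonian-Laird) over per-site effect estimates. The site labels, effect sizes, and standard errors are made-up placeholders, not Heifer International results.

```python
# Minimal sketch: pooling per-site effect estimates with a random-effects model.
import math

# (site, standardized mean difference, standard error) -- hypothetical values
sites = [("Site A", 0.30, 0.12), ("Site B", 0.18, 0.10), ("Site C", 0.45, 0.15)]

effects = [d for _, d, _ in sites]
variances = [se ** 2 for _, _, se in sites]

# Fixed-effect weights, pooled estimate, and heterogeneity statistic Q
w = [1 / v for v in variances]
fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)
q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, effects))

# DerSimonian-Laird estimate of between-site variance tau^2
k = len(sites)
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (k - 1)) / c)

# Random-effects pooled estimate and 95% confidence interval
w_star = [1 / (v + tau2) for v in variances]
pooled = sum(wi * di for wi, di in zip(w_star, effects)) / sum(w_star)
se_pooled = math.sqrt(1 / sum(w_star))
print(f"pooled effect = {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * se_pooled:.3f} to {pooled + 1.96 * se_pooled:.3f}, "
      f"tau^2 = {tau2:.3f})")
```

The random-effects weighting allows the per-site effects to differ rather than treating sites as homogeneous, which is the point of using meta-analysis for multi-site evaluation.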

