2012 – 2013

Conversation with Craig and Karen

Date: July 17th, 2013
Presenter(s): Dr. Craig Russon—Senior Evaluation Officer, International Labour Organization, Geneva, Switzerland and Karen Russon—President, Evaluation Capacity Development Group (ECDG), Ferney-Voltaire, France
Abstract: Please note the change of location! LOCATION: 4405 Ellsworth Hall TIME: 11AM - NOON

The Key to Dissertating: Defining Your Contribution

Date: June 12th, 2013
Presenter(s): Dr. Amy Gullickson—Senior Lecturer, Centre for Program Evaluation, Melbourne Graduate School of Education
Abstract: Dissertations are built around contributions. However, this term is often not clearly defined or explained, and you end up relying solely on your committee members to tell you whether your proposed topic or finished work is "a contribution." No matter where you are in the process of your PhD, understanding what a contribution is (and how your dissertation will make one) can help you communicate effectively with others about your work AND keep you on track and focused. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Navigating Foundation Evaluation–An Insider’s View

Date: April 9th, 2013
Presenter(s): Dr. Kimberly James—Evaluation Officer, W.K. Kellogg Foundation
Abstract: The purpose, function and structure of evaluation within the larger, private U.S. foundations have undergone significant change over the past few years. This session highlights these changes and current “pressure points” several larger foundations face. What role can evaluators play? Moreover, what skills are really useful for those interested in navigating the evaluation terrain of larger, private foundations? Hear frank advice, "hard lessons learned" and questions answered about evaluation from one insider’s evaluation journey. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Setting Criteria and Standards for Ball Throwing Athletes

Date: April 4th, 2013
Presenter(s): Alex Manga—IDPE Student, WMU
Abstract: NOTE CHANGE OF DATE! This presentation will: 1) identify and discuss in detail the evaluation methods, criteria, and standards used to evaluate prospective professional baseball players, particularly pitchers and catchers, and how those evaluation processes were established and continue to evolve; 2) discuss the evolving criteria and standards used for systematic evaluation of pitchers and catchers, the multiple levels of standards used to determine prospects' eligibility for contracts, and how MLB uses summative evaluations to determine offer levels; and 3) discuss the methods and measurements used to determine whether prospects (evaluands) have what it takes to be professional baseball players, and how these measurements translate into the financial magnitude of the offers made. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Health Care Use and Outcomes Among Ex-Offenders Living with HIV: Evaluating Michigan’s Community Re-entry Program for Prisoners Living with HIV

Date: March 28th, 2013
Presenter(s): Dr. Robin Miller—Professor, Ecological-Community Psychology, Michigan State University
Abstract: Leaving prison often results in loss of care and poor health among ex-offenders living with HIV. To improve linkage to care, many states have discharge planning programs, few of which have been evaluated. Using a mixed-method approach, we examined linkage to and maintenance in care 3-5 years after release among 190 ex-offenders who had used Michigan’s statewide community re-entry program. A majority of ex-offenders had followed through on their discharge plans; however, long-term maintenance in care was low and poor health common 36 months post-release. Mortality was 17%. In addition to presenting findings, challenges conducting the evaluation will be highlighted. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Putting Evaluation to Use: A Workshop-based Method for Engaging Stakeholders in the Process of Interpreting and Using Evaluation Data

Date: March 13th, 2013
Presenter(s): Dr. Adrienne E. Adams—Assistant Professor, Ecological Community Psychology, Michigan State University
Abstract: EVENT CANCELED! Many evaluators strive to design and conduct evaluations in ways that promote the use of evaluation findings. To achieve this goal, it is critical to engage key stakeholders in meaningful ways throughout the evaluation process. “Putting Evaluation to Use” is a workshop-based method for engaging stakeholders in the process of interpreting and using evaluation data. This method provides intended users with an opportunity to reflect on their work, celebrate successes, identify areas for improvement, and plan for change. In this presentation, I will describe the method, provide examples of materials, and demonstrate key activities. I will also share some of the benefits and challenges of using this method based on my experience with non-profit human service organizations. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Evaluating Large-Scale Research Funding Policies and Procedures: Lessons from an Evaluation of the Swiss National Science Foundation

Date: February 27th, 2013
Presenter(s): Dr. Chris Coryn—Director of IDPE, WMU; Dr. Brooks Applegate—Professor, Evaluation, Measurement and Research, WMU; Dr. Daniela Schroeter—Director of Research, The Evaluation Center, WMU; Krystin Martens—IDPE Student, WMU; Robert McCowen—IDPE Student, WMU
Abstract: Internationally, a wide variety of policies and procedures have been used for funding research by national grant-making foundations and similar organizations. Simultaneously, demands for improved grant-making and accountability have increased. These demands, driven by a multitude of factors (e.g., increasingly scarce resources, increased competition, pressures to improve performance), have placed a great burden on grant-making foundations not only to continuously improve their overall effectiveness, but also to account for their activities and expenditures. In this Evaluation Café, the presenters will discuss some of the challenges encountered, and solutions to those challenges, in evaluating the research funding policies and procedures of the Swiss National Science Foundation. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Institutional Analysis of Evaluation Policies in Quebec, Canada

Date: February 20th, 2013
Presenter(s): Dr. Ghislain Arbour—The Evaluation Center Visiting Scholar from the National School of Public Administration, Quebec, Canada
Abstract: In 2000, the legislative assembly of the Province of Quebec, in Canada, adopted a results-based management system through its Public Administration Act (L.R.Q., 2000, c.8). Since then, the evaluation function has been the object of several administrative developments intended to make it a useful information input in the newly organized government management cycle. Among other initiatives, many public bodies adopted internal policies to specify the priorities, meaning, scope, and responsibilities associated with the evaluation function. This presentation presents an analysis of the evaluation policies adopted by several departments, through the lens of an institutional theoretical framework inspired by the work of Elinor Ostrom. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Evaluation Questions: The Foundation for Meaningful and Useful Program Evaluation

Date: February 13th, 2013
Presenter(s): Dr. Lori Wingate—Assistant Director, The Evaluation Center, WMU; Dr. Daniela Schroeter—Director of Research, The Evaluation Center, WMU
Abstract: In this Evaluation Café, we will discuss the importance of evaluation questions and their influence on evaluation design, stakeholder engagement, and use of results. Lori Wingate and Daniela Schroeter will share highlights from their day-long workshop on this topic, addressing issues such as how to: align evaluation questions with program purpose and information needs; articulate questions that are evaluative, unambiguous, important, and answerable; use logic models to ensure alignment between program design, evaluation questions, and indicators; assess and improve preexisting evaluation questions. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Social Return on Investment: Metrics, Methods, and Market Solutions

Date: February 7th, 2013
Presenter(s): Dr. John Gargani—President, Gargani + Company
Abstract: I provide an overview of social return on investment (SROI), acting as both champion and critic. SROI may be unfamiliar to many evaluators. It is one of several metrics—borrowed from business, economics, and finance—that quantifies the monetary value of program impacts. Investors are using it to develop new “impact investment” strategies. Policymakers are using it to experiment with public-private funding mechanisms, such as social impact bonds. Should evaluators be using it to demonstrate program effectiveness? That’s the question we will explore together. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Project Management for Evaluators

Date: January 30th, 2013
Presenter(s): Steven Dibble—Associate Director—P3MO, Stryker, and IDPE Student, WMU
Abstract: Project management is the discipline of organizing work to improve a team's efficiency and effectiveness. To realize these benefits, good project managers master a set of technical, interpersonal, and contextual skills that are applicable to most industries, including evaluation. This Evaluation Café presentation will define the project management discipline, showcase key concepts that will positively impact your evaluation practice, and recommend resources to develop your skills further. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Student Assessment in 21st Century Education

Date: January 23rd, 2013
Presenter(s): Dr. Arlen Gullickson—Professor Emeritus, The Evaluation Center, WMU
Abstract: This presentation looks at the two primary types of student assessment in use today: summative and formative. It contrasts the two types in terms of their characteristics and applications and discusses issues surrounding their use. The argument is made that summative assessment receives much greater attention and that financial resources are squandered on summative assessment practices. A case for giving greater attention to formative assessment is presented. Suggestions are made for when and how formative assessment can be used effectively in classrooms, along with key attributes that must be greatly improved if formative assessment is to reach its potential for improving student learning. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Evaluation the Jedi Way: Using Star Wars for Evaluation Training

Date: January 15th, 2013
Presenter(s): Dr. Lori Wingate—Assistant Director, The Evaluation Center, WMU
Abstract: Learn evaluation, you will. When an evaluator provides evaluation training to an audience that works outside his or her realm of expertise, it can be a challenge to develop a good case for applying the workshop's concepts or tools. A good case is brief, so it can be read and understood in a short period of time, yet complex enough to require critical thinking. It should be relevant to participants' background knowledge and work contexts, but not so much that they get caught up in the details or realism of the situation. A good case for evaluation training demonstrates how evaluation principles can be applied across disciplines and content areas. In this session, we'll look at an evaluation case based on Star Wars (Episode I) developed for a workshop on data interpretation for public health evaluators. What are its strengths and weaknesses as a vehicle for learning about evaluation in general and data interpretation specifically? For what other evaluation topics could it be used? What other popular culture sources could be mined for raw materials for evaluation cases? LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Universal Design for Evaluation

Date: January 8th, 2013
Presenter(s): Dr. Jennifer Sullivan Sulewski—Institute for Community Inclusion, University of Massachusetts Boston & Dr. June Gothberg—Senior Research Associate, Office of the Vice President of Research, WMU
Abstract: Vulnerable populations including persons with disabilities, the homeless, chronically ill, veterans, economically disadvantaged, low-literate, elderly, and culturally different are frequently involved in or affected by evaluations, regardless of the specific topic under investigation. Designing evaluation tools to include persons with disabilities and other vulnerable populations is critical to ensure these populations are fairly represented and included in the evaluation process. Universally designed evaluation plans, materials, and data collection tools provide for more inclusive, sound and valid evaluation practice. This presentation will describe the concept of universal design for evaluation (UDE) and discuss a UDE checklist developed by the presenters. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Meta-evaluation as Capacity Building for Evaluation Stakeholders

Date: December 5th, 2012
Presenter(s): Dr. Robin Lin Miller—Professor, Ecological-Community Psychology, MSU
Abstract: The U.S. President’s Emergency Plan for AIDS Relief (PEPFAR) Caribbean Regional Program is a technical assistance program aimed at developing a sustained response to HIV in the Caribbean region. PEPFAR Caribbean represents a partnership effort of twelve Caribbean governments, two regional partners, and five U.S. government agencies (USAID, HRSA, CDC, Defense, Peace Corps). The aim of the partnership is to reduce HIV/AIDS incidence and prevalence in the Caribbean region, build the capacity of national governments to develop and maintain sustainable, comprehensive, and effective national HIV/AIDS programs, and strengthen the effectiveness of regional coordinating agencies and non-governmental organizations to provide quality, cost-effective goods and services to bolster national HIV/AIDS programs. A key goal area for the region is improving capacity to collect and use strategic information. I will describe how the PEPFAR Caribbean Regional Program improved the strategic information capacity of its U.S. government PEPFAR program managers through a meta-evaluation process that incorporated developmental elements. Specifically, I will present how the U.S. government PEPFAR management team used a meta-evaluator to increase its capacity to commission and use evaluation and to ensure the quality of its mid-term evaluation. Specific steps in our meta-evaluation process will be described, including how the meta-evaluator used review of prior evaluations and current activities to help the team assess readiness to commission a mid-term evaluation and prepare to select and work with a mid-term evaluation team. LOCATION: ELLSWORTH 4410 TIME: NOON-1PM

The ABCs of Goal-Free Evaluation

Date: November 27th, 2012
Presenter(s): Dr. Brandon W. Youker—Assistant Professor, School of Social Work, GVSU & Lyza Ingraham—Graduate Assistant, School of Social Work, GVSU
Abstract: Goal-free evaluation (GFE) is an evaluation model in which the evaluator is deliberately kept from the stated goals and objectives of the program; this is accomplished by appointing a screener to keep goal-related information from the goal-free evaluator. Although GFE has been around for roughly four decades, goal-based evaluation (GBE) continues to dominate evaluation practice, and the literature on GFE remains sparse and highly theoretical. This presentation will introduce social service workers to GFE, provide a brief history of the model, discuss some of the theoretical arguments for and against it, and articulate actual principles and operations for conducting a GFE. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

National Secondary Transition Technical Assistance Center’s Model for Building Capacity for Evaluation

Date: November 6th, 2012
Presenter(s): Jennifer Coyle—Research Associate, Office of the Vice President of Research, WMU & Dr. June Gothberg—Senior Research Associate, Office of the Vice President of Research, WMU
Abstract: The National Secondary Transition Technical Assistance Center (NSTTAC) is a federally funded technical assistance and dissemination center. One of our charges is to build capacity of state and local departments of education to improve transition education and services. An ongoing process of strategic planning, implementation, and evaluation is the basis for our NSTTAC continuous improvement model. In this session we will provide an overview of our model and participants will leave with a copy of the NSTTAC Evaluation Toolkit, 2nd edition. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Failure of Intervention or Failure of Evaluation: A Meta-evaluation of the National Youth Anti-Drug Media Campaign Evaluation

Date: October 30th, 2012
Presenter(s): Dr. Stephen Magura—Director, The Evaluation Center, WMU
Abstract: The National Youth Anti-Drug Media Campaign was conducted during 1998-2004 and evaluated through a national, four-wave panel study of adolescents. The evaluation’s results were unexpected and controversial, finding both no effects overall and a possibly harmful effect, namely inducing initiation of marijuana use. A meta-evaluation by the U.S. General Accounting Office (GAO) supported the original evaluation’s major conclusions, but the Campaign’s sponsor, the Office of National Drug Control Policy (ONDCP), contested both the original evaluation’s findings and the GAO’s assessment of them. This study presents an alternative meta-evaluation of the original evaluation, concluding that the Campaign probably was ineffective, but without sufficient evidence of harmful effects. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Why is it so? An Analysis of Evaluator–Stakeholder Communication Based on Communication Theories

Date: October 9th, 2012
Presenter(s): Maran Subramain—Coordinator for International Student Activities, WMU
Abstract: Evaluators and their clients often have different backgrounds, tasks, and training. At the same time, they interact as equal partners to jointly identify areas for problem solving, evaluation, and research. To date, evaluator–stakeholder communication is an understudied area in the field of evaluation, with very few studies examining evaluation practice through a communication lens. This presentation explores the nature of evaluator–stakeholder communication through selected interpersonal communication theories. In particular, the presenter will inform the practice of evaluation in terms of how evaluators interact and communicate with their stakeholders and set the stage for future research on evaluator–stakeholder communication. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

The Kirkpatrick Model for Evaluation: An Overview and Discussion

Date: October 2nd, 2012
Presenter(s): Dr. Lori Wingate—Assistant Director, The Evaluation Center, WMU
Abstract: The Kirkpatrick Model for training evaluation focuses on assessing a program’s quality and effectiveness in terms of participants’ satisfaction (Level 1), learning (Level 2), application of new skills or content (Level 3), and the resulting impacts (Level 4). Although developed specifically for evaluating corporate training programs, the model is adaptable to other settings and types of programs. We’ll discuss the pros and cons of the approach and its relevance to other contexts, such as K-16 education, health, and human services. LOCATION: 4410 ELLSWORTH HALL TIME: NOON-1PM

Welcoming Michigan: A Statewide Grassroots Immigrant Integration Program

Date: September 25th, 2012
Presenter(s): Susan Reed, Supervising Attorney, Michigan Immigrant Rights Center and Program Supervisor, Welcoming Michigan and Lillie Wolff, West Michigan Communities Coordinator, Welcoming Michigan
Abstract: Welcoming Michigan is a statewide grassroots immigrant integration initiative based at the Michigan Immigrant Rights Center. Welcoming Michigan is an affiliate of Welcoming America. Our current geographic reach includes the Southeast Michigan communities of Sterling Heights, Hamtramck, and SW Detroit's Chadsey Condon neighborhood; and Van Buren County in Southwest Michigan. The program's model is a three-pronged approach to immigrant integration: strategic communications, leadership development, and public engagement. These efforts aim to spread positive messages and images of Michigan's immigrants and engage local residents in identifying opportunities to bring U.S.-born and foreign-born community members together to build trust and mutual respect. Welcoming Michigan seeks guidance from the WMU Evaluation Center regarding the program's development of evaluation methods to measure the efficacy of the Welcoming Michigan model in changing people's hearts and minds about Michigan's immigrants. LOCATION: ELLSWORTH 4410 TIME: NOON-1PM

Some Elements of Context and Their Influence on Evaluation Practice

Date: September 20th, 2012
Presenter(s): Jody Fitzpatrick—Associate Professor, University of Colorado Denver and President-Elect of the American Evaluation Association
Abstract: Many elements of context influence evaluation, but some primary ones include the setting of the evaluand (education, social welfare, mental health, environment) and the discipline or training of the evaluator and the key stakeholders. Both of these contextual issues influence the culture and values concerning evaluation. I (Fitzpatrick) will talk about the diversity in evaluation in the United States and other countries today and how these elements, setting and discipline, influence approaches and practice in evaluation. LOCATION: PRESIDENT'S DINING ROOM, BERNHARD CENTER (MAIN FLOOR) NOON-1PM.

Conceptual Revolutions About Evaluation: Past and Future

Date: September 11th, 2012
Presenter(s): Dr. Michael Scriven—Professor of Psychology, Claremont Graduate University
Abstract: The usual trajectory of conceptual revolutions in science is that each replaces a prior view (the previous paradigm) with a radically different and more accurate one based on breakthrough research. The situation in evaluation was different: the first great paradigm shift—the doctrine of value-free science—involved rejecting all evaluative claims from the domain of scientific knowledge. The second shift returned them to respectability and provided the license for the current massive developments in ten sub-fields of professional evaluation, e.g., program, product, and personnel evaluation. But the most exciting changes are still ahead, including upgrades to the status of (i) alpha discipline, (ii) applied science paradigm, and (iii) the big kahuna.

Evaluative Considerations Regarding Quebec’s Freedom of Information Act (FOIA)

Date: September 4th, 2012
Presenter(s): Ghislain Arbour—Evaluation Center Visiting Scholar from the National School of Public Administration, Quebec, Canada
Abstract: This presentation examines Quebec’s Freedom of Information Act (FOIA) from the perspective of an evaluator. This research originated from a dissertation in public administration within which the following items have been studied: 1) the political needs for an FOIA, 2) the characteristics of Quebec’s FOIA and 3) the implementation of the law. The research is now considered in terms of the different challenges an evaluation of the law may encounter throughout the evaluation process.

