Spring 2015 and Fall 2014 Evaluation Café Events


Please join us for our Spring 2015 Evaluation Café Speaker Series (Schedule)

Cafés are typically held at The Evaluation Center, located in 4410 Ellsworth Hall, from noon-1pm on Wednesdays.

Formidable Challenges and Promising Opportunities for the Transdiscipline of Evaluation with Dr. Stewart Donaldson—President of AEA, Director of the Claremont Evaluation Center, and Dean at Claremont Graduate University

Tuesday, January 13, 2015
1pm-3pm
4410 Ellsworth Hall

The demand for rigorous and influential evaluations is now pervasive worldwide. Evaluators are being commissioned to address a wide range of educational, health, community, organizational, public policy, and international development challenges. A global grassroots movement to strengthen evaluation capacity worldwide, EvalPartners, has identified approximately 170 national Voluntary Organizations of Professional Evaluators (VOPEs) consisting of more than 43,000 members. Recent research also shows that universities are providing more evaluation degrees, certificates, courses, and professional development opportunities than ever before. As we move into 2015—The International Year of Evaluation—the transdiscipline of evaluation is faced with a wide range of new challenges and opportunities. A subset of these challenges and opportunities will be explored in some depth with an eye toward how to continually improve evaluation theory and practice in the years ahead.

Slides

Bike Evaluation with Tim Krone—Owner, Pedal Bicycles

Wednesday, February 4, 2015
Noon-1pm
4410 Ellsworth Hall

Conducting an On-site Observation: Tips, tricks, and best practices for observers with Dr. Kristin Everett—Senior Research Associate, SAMPI, Western Michigan University

Wednesday, February 11, 2015
Noon-1pm
4410 Ellsworth Hall

Observational research, usually in the form of a site visit, is a component of many project evaluation plans. Collecting high-quality, relevant data in a dynamic situation—while at the same time remaining objective, alert, and in the background—can be a challenging endeavor. This presentation will discuss ways to help evaluators prepare for and conduct an observation. The presenter will share insights and best practices for planning and conducting observational research.

The Detroit Sexual Assault Kit (SAK) Action Research Project: Developmental Evaluation in Practice with Dr. Rebecca Campbell—Professor of Ecological-Community Psychology, Michigan State University and Recipient of AEA’s 2013 Outstanding Evaluation Award

Wednesday, February 18, 2015
Noon-1pm
4410 Ellsworth Hall

In 2009, 11,000+ sexual assault kits (SAKs) were discovered in a Detroit Police Department property storage facility, most of which had never been forensically tested and/or investigated by the police.  In 2011, a multi-stakeholder group convened to develop long-term response strategies, including protocols for notifying victims whose kits had been part of the backlog.  In this presentation, I will describe the process by which we used developmental evaluation theory to create a mixed-methods evaluation of this initiative.  This presentation will summarize the numerous challenges (psychological, ethical, and legal) we have faced attempting to locate survivors so many years later to evaluate the efficacy of the protocols developed by the multidisciplinary team.

Achieving Excellence in Engineering Through Testing and Evaluation with Dr. Joe Stufflebeam—Engineering Manager at TRAX International, LLC, White Sands Missile Range, NM

Monday, February 23, 2015
Noon-1pm
4410 Ellsworth Hall

Engineering is the process of utilizing math and science when conceptualizing, designing, building, and using structures, machines, systems, and processes. To increase the efficiency and effectiveness of engineering efforts, the process of evaluation can be integrated to provide necessary feedback to appropriate stakeholders. A case study using the Space Shuttle and the challenges of long-range imaging is presented to show the utility of timely, integrated assessment and feedback. Finally, a look into the world of Department of Defense Testing and Evaluation will be presented to show the rich diversity of challenging engineering opportunities that are available to prospective engineers.

Making Evaluation Useful with Dr. Wendy L. Tackett—President, iEval; Adjunct Professor, Evaluation, Measurement, and Research, WMU; and Senior Evaluation Specialist, The Research Group, Lawrence Hall of Science, University of California, Berkeley

Wednesday, March 4, 2015
Noon-1pm
4410 Ellsworth Hall

Have you shared your evaluation process with clients only to have their eyes glaze over? Have you provided lengthy literature reviews to provide credibility to your evaluation approach only to have the client never read them? Have you made data-supported recommendations to clients only to have them ignored? Have you shared an evaluation report with a client only to have it filed away on a shelf or in a computer folder? If any of these situations sounds familiar to you, come ready to learn some ideas for what you can do to improve your evaluation’s use!

Project Management: Keeping Yourself and Your Project Organized with Emma Perk—Project Manager, The Evaluation Center, WMU

Wednesday, March 18, 2015
Noon-1pm
4410 Ellsworth Hall

Keeping yourself and your project organized is key to staying on schedule and productive. While it sounds simple and easy to do, it's something we all wish we did better. In this Evaluation Café, I will present practical ideas and techniques to help keep you and your project on task. Topics will include team management and communication, electronic organization, time management, and management tools and software.

Research on Evaluation at WMU – An Overview of Current Investigations and Their Relevance to Evaluation Theory, Methods, and Practices with Dr. Chris L. S. Coryn—Associate Professor and Director of the Interdisciplinary Ph.D. in Evaluation (IDPE), WMU; Lyssa N. Wilson—IDPE Student, WMU; Gregory D. Greenman II—IDPE Student, WMU; and Satoshi Ozeki—IDPE Student, WMU

Wednesday, March 25, 2015
Noon-1pm
4410 Ellsworth Hall

In this Evaluation Café the presenters will discuss several ongoing investigations into evaluation theories, methods, and practices taking place at WMU (many of which involve collaborations with Claremont Graduate University [CGU] and the University of California at Los Angeles [UCLA]) and the importance of conducting research on evaluation as a means for moving the field forward. Attendees will be encouraged to provide feedback and suggestions for improving the investigations.

Visualizing Data with Text Analysis Tools with Kate Langan—Assistant Professor, Humanities Librarian, WMU

Thursday, April 2, 2015
Noon-1pm
4410 Ellsworth Hall

As Humanities Librarian at WMU, Ms. Langan engages in digital humanities research and training—the use of technology in literary analysis. Text analysis tools are widely available and are used to deepen one's understanding of text-based information. These tools are also appropriate for research in the social sciences. Mining text-based data and thoughtfully visualizing the results can enliven one's research, whether a presentation or publication. This session will review text analysis tools and different techniques for visual representation of text-based data.

Group Facilitation: Tools, Skills, and Techniques with Dr. Claire Bode—Public Policy Education Specialist, MSU Extension

Wednesday, April 8, 2015
Noon-1pm
4410 Ellsworth Hall

Focus groups, interviews, and community meetings are useful processes when conducting a needs assessment or evaluating the impact of a community program. This session will highlight some tried-and-true facilitation tools and techniques for engaging (non-academic) groups that can balance participation, allow for divergent thinking, and arrive at goals. Online tools, use of flip charts and sticky notes, and verbal techniques will be reviewed.

Brainstorming Session: Funding Opportunities for the Joint Committee on Standards for Educational Evaluation with Dr. Daniela Schroeter—Director of Research, The Evaluation Center, WMU

Wednesday, April 15, 2015
Noon-1pm
4410 Ellsworth Hall

Evaluation of the Federal Railroad Administration's Confidential Close Call Reporting System (C3RS) with Dr. Jonathan A. Morell—Director of Evaluation, Fulcrum Corporation; Melinda Davey—Senior Analyst, Organizational Solutions Group, Jacobs Advanced Consulting Services; and Dr. Joyce Ranney—Senior Program Evaluator and Senior Program Manager, Railroad Safety Culture, Volpe Center

Wednesday, April 22, 2015
Noon-1pm
4410 Ellsworth Hall

The Federal Railroad Administration has been pilot testing C3RS, an innovative program in which close calls in railroad operations can be reported and addressed. C3RS has been running since 2007 and comprises four major railroads: the Union Pacific and the Canadian Pacific (freight); and Amtrak and New Jersey Transit (passenger). C3RS has been subject to thorough evaluation because it is a high risk, high reward program that could have a major impact on safety in the domestic railroad industry. Large amounts of qualitative and quantitative data are being collected from each participating railroad, thus allowing rich comparisons across railroads, within railroads, and over time. The purpose of the evaluation is to provide formative and summative data, as well as data on factors that explain sustainability over time. This presentation will explain C3RS and the evaluation design in more detail, and then provide a small set of the more interesting findings.

Tools and Resources for Using Geographic Information Systems (GIS) in Evaluation with Corey Smith—Doctoral Associate, WMU

Wednesday, December 3, 2014
Noon-1pm
4410 Ellsworth Hall

Demonstration—Geographic Information Systems (GIS) allow users to represent data geographically. In evaluation, we often have data that have some connection or association with a geographic place. GIS provides a tool to explore whether these connections are meaningful, or reveal trends or patterns that we may otherwise miss. In my Evaluation Café I will walk through a number of GIS-related resources, demonstrate a few of them, and comment on the utility and application of GIS in evaluation.

Slides

Using Multi-Attribute Utility Theory in Evaluation: Challenges and Opportunities with Dr. Daniela Schroeter—Director of Research, The Evaluation Center, WMU

Wednesday, November 12, 2014
Noon-1pm
4410 Ellsworth Hall

This presentation will illustrate the use of multi-attribute utility theory as part of a multiple, mixed-methods evaluation. Methodological considerations and stakeholder engagement will be at the center of the discussion. Evaluation Café participants will be introduced to tools used as part of the evaluation and are encouraged to critically reflect on the opportunities and limitations of using multi-attribute utility theory as a means to operationalize the logic of evaluation.

Slides

Project Management – Tools and Techniques with Dr. Daniel Gaymer—Faculty Specialist, Educational Leadership, Research, and Technology, WMU

Wednesday, November 5, 2014
Noon-1pm
4410 Ellsworth Hall

This workshop covers the fundamentals of project management. Participants will learn the basic concepts of project management and be introduced to project management tools, with an emphasis on the role of project leaders with respect to personal and professional effectiveness, task management, communication strategies, and project team leadership. Learners will leave with tools, techniques, and documents that will prepare them to manage projects effectively.

Handout

Evaluation and Social Justice with Dr. Jerry Johnson—Associate Professor, School of Social Work, and Research Director, Johnson Center Community Research Institute, Grand Valley State University

Wednesday, October 29, 2014
Noon-1pm
4410 Ellsworth Hall

Doing Evaluation by Combining “Traditional” Knowledge with “Complex” Understanding of how the World Works with Dr. Jonathan (Jonny) A. Morell—Director of Evaluation, Fulcrum Corporation and Editor, Evaluation and Program Planning

Wednesday, October 22, 2014
Noon-1pm
4410 Ellsworth Hall

Paper—Research on "complexity" reveals many notions of how the world works that are at odds with "traditional" social science knowledge and evaluation program theories. "Strange attractor" is a particularly attention-grabbing example, but many others are out there. It is easy to abandon "traditional" theories and just explain programs in terms of complexity. It is also easy to ignore complex behavior. Neither approach makes sense. When we design evaluations we need to pay attention to both "complex" and "traditional" knowledge, and draw from each as wisdom and expertise dictate. This presentation will provide examples of what such combinations would look like.

Slides

Video

NSF Investment in Advancing Evaluation Theory, Methods, and Practice: Outcomes, Trends, and Implications with Dr. Lori Wingate, Corey Smith, Jason Burkhardt, and Emma Perk—The Evaluation Center, WMU

Wednesday, October 8, 2014
Noon-1pm
4410 Ellsworth Hall

Poster—In this session, we will present the results of a systematic review of all grants made by the National Science Foundation's Directorate for Education and Human Resources that focused on advancing STEM education evaluation theory, practice, and methods in the decade spanning 2004 through 2013. We reviewed all abstracts in the NSF awards database containing the word "evaluation" for this time frame. Of these 2,990 awards, about 90 focused specifically on advancing STEM education evaluation by developing evaluation instruments, providing evaluation training and/or resource materials, developing evaluation frameworks, and/or holding conferences focused on evaluation. Based on our review of the publications, websites, project outcome reports, and other documentation from these grants, we developed an inventory of the grant outcomes. In addition to providing a high-level view of the results of NSF's investment in advancing evaluation, our analysis reveals trends and gaps in what NSF has funded and the results of this funding.

Digital Qualitative: Leveraging Technology for Deeper Insight with Dr. Robert W. Kahle—Kahle Research Solutions Inc.

Wednesday, September 24, 2014
Noon-1pm
4410 Ellsworth Hall

Demonstration—Technological advancements provide a platform for a wide variety of new qualitative research techniques. Sometimes called Qualitative 2.0, New Qualitative or Digital Qualitative, these new techniques are powerful when appropriately applied and carefully implemented. This informal presentation will provide an inventory of qualitative research techniques, from the traditional focus group and depth interview to the newest technology-enabled mobile methods. Presentation learning objectives: 1. Understand the range of new qualitative techniques currently available; 2. Identify when and why to use specific qualitative techniques; 3. Understand the advantages and disadvantages of these new qualitative methods; and 4. Learn about hybrid designs.

Handout

Slides

Harnessing Complexity: Analysis Methodology and Ethical Frame for Using Video Data in Evaluations with Dr. Kurt Wilson—President, Effect X 

Wednesday, September 17, 2014
Noon-1pm
4410 Ellsworth Hall

Paper—Many evaluations are conducted in contexts of complexity; the specific intervention being evaluated is but one of many interrelated factors influencing the desired outcome. Video data, especially when directly generated by program participants, can provide both exceptionally rich qualitative data and contextually relevant feedback within complex systems. Despite these unique strengths and opportunities, video data is underutilized in the field of evaluation. This presentation is based on a three-paper dissertation and addresses two barriers to the use of video data: the high time and expense of analysis, and ethical concerns.

Slides

Useful and Free Evaluation Websites with Dr. Wendy L. Tackett—President, iEval 

Wednesday, September 10, 2014
Noon-1pm
4410 Ellsworth Hall

Demonstration—We will explore over 40 websites that provide free resources for you to use in your evaluation work, including learning about evaluation, tools and instruments, data visualization, data sources, blogs, and some fun surprises. While some of the websites will be shared just for reference, we'll do some hands-on work in several of them to demonstrate their ease of use and potential usefulness.

Slides
