Our Vision


  1. Providing evaluation, research, and capacity-building services to a broad array of organizations
  2. Conducting research on evaluation
  3. Engaging in academic leadership
  4. Administering the Interdisciplinary Ph.D. in Evaluation

Daniela C. Schroeter, Ph.D.


Director of Research
The Evaluation Center
Western Michigan University
Email: daniela.schroeter@wmich.edu
Phone: (269) 387-5895
Vita (PDF)

My evaluation and research interests are interdisciplinary at heart and center on evaluation theory, methodology, practice, and capacity building. My primary interests in grants and contracts work relate to program evaluation in the science, technology, and engineering sectors; adult learning via innovative learning environments; organizational theory; and sustainability evaluation. I have a special interest in teaching evaluation theory, methodology, and practice.

Current Projects

National Science Foundation (Award Number: DRL-1228809)

  • PEEPs for PD: Identification of Project Effectiveness Principles for Professional Development in Elementary Science Teaching (2012-2014)
  • Co-PIs: Chris Coryn, Lori Wingate, and Juna Snow

Centers for Disease Control and Prevention University

  • Workshops in Data Interpretation, Evaluation Questions, and Translating Findings (2012-2013)
  • Co-Project Director with Dr. Lori Wingate

AXIOM Sales Force Development

  • Training Impact Evaluations of Axiom Sales Force Development (2012-2014)

International Labour Organisation

  • Evaluation Appraisals for ILO Technical Cooperation Projects 2009-2011 (Spring 2013)
  • Co-PI: Kelly Robertson

Augustana College (NSF DBI Subaward: 0954829)

  • External Evaluation of the Augustana College NSF RCN-UBE: Microbial Genome Annotation Network Project (2011-2014)

Kalamazoo College (DOE FIPSE Subaward: P116B100047)

  • External Evaluation of the Kalamazoo College “Improving Student Retention and Connection through Innovative Advising Practices” grant (2011-2014)
  • Co-PI: Kelly Robertson

NIH NIDA (Award Number: 1R21DA032151-01)

  • Systematic Review of Evidence-Based Program Repositories (2011-2013)
  • PI: Stephen Magura

Selected Publications

Coryn, C. L. S., Schröter, D. C., Cullen, A., Semen, L., & McLaughlin, J. (2012). Assessing implementation fidelity of a national nutrition education program: A case study of Share Our Strength’s Operation Frontline. Journal of MultiDisciplinary Evaluation, 8(19), 15-25.

Schröter, D. C., Coryn, C. L. S., Cullen, A., Robertson, K. N., & Alyami, M. (2011). Using concept mapping for planning and evaluation of a statewide energy efficiency initiative. Energy Efficiency, 5(3), 365-375.

Coryn, C. L. S., Noakes, L. A., Westine, C. D., & Schröter, D. C. (2011). A systematic review of theory-driven evaluation practice from 1990 to 2009. American Journal of Evaluation, 32(3).

Schröter, D. C., & Alyami, M. (2011). Evaluation of distance education and e-learning. [Review of Evaluation of Distance Education and E-Learning]. American Journal of Evaluation.

Schröter, D. C. (2010). Outcome-based program development and evaluation. [Review of Outcome-based program development and evaluation]. Canadian Journal of Program Evaluation, 23(2), 262-264.

Coryn, C. L. S., Schröter, D. C., & Hanssen, C. E. (2009). A modification of the Success Case Method for strengthening causal inferences from single‐group studies. American Journal of Evaluation, 30(1), 80-92.

Dewey, J. D., Montrosse, B. E., Schröter, D. C., Sullins, C. D., & Mattox, J. R. (2008). Evaluator competencies: What’s taught versus what’s sought. American Journal of Evaluation, 29(3), 268-287.

Schröter, D. C. (2008). The logic and methodology of sustainability evaluation: A checklist approach. (pp. 217-236). In M. Ant, A. Hammer, & O. Löwenbein (Eds.), Nachhaltiger Mehrwert von Evaluation [Sustained value-added via evaluation]. Bielefeld, Germany: Bertelsmann.

Schröter, D. C., Coryn, C. L. S., & Montrosse, B. (2008). Peer review of submissions to the annual American Evaluation Association conference by the Graduate Student & New Evaluators Topical Interest Group. Journal of MultiDisciplinary Evaluation, 5(9), 25-40.

Recent Presentations

The Complexity of Training and Development Evaluation [Schroeter, D. C., Martens, K. S., & Robertson, K. N.]. Paper presented at the annual meeting of the American Evaluation Association, Minneapolis, MN, October 27, 2012.

Identifying Project Evaluation Effectiveness Principles in Professional Development Evaluation [Discussants: Snow, J., Hartry, A., & Schroeter, D. C.]. Think tank facilitated at the annual meeting of the American Evaluation Association, Minneapolis, MN, October 26, 2012.

Critical Review of Evidence-Based Program Repositories for Behavioral Health Treatment [Robertson, K. N., Schroeter, D. C., Burkhardt, J., Magura, S., & Coryn, C.]. Paper presented at the annual meeting of the American Evaluation Association, Minneapolis, MN, October 25, 2012.

Courses

Cost Analysis in Evaluation. IDPE course (Spring 2010, 2009)

Ongoing supervision of:

  • Evaluation Field Experiences
  • Independent Research
  • Dissertation Readings

Service

Membership

Reviewer

Associate Editor

Advisor on Doctoral Program and Dissertation Committees

  • Select doctoral students in the IDPE (2009–)

(grant reviewer, 2010)

Daniela C. Schroeter, Ph.D.
Director of Research

daniela.schroeter@wmich.edu
Phone: 269-387-5916

Michigan Saves: Evaluability Assessment and Process Evaluation


Sponsor: Public Sector Consultants, Inc.

 


Project staff

Daniela Schroeter, Principal Investigator
Anne Cullen, Co-Principal Investigator
Chris Coryn, Methodologist
Kelly Robertson, Project Manager
Gil Peach, Consultant

Description

Michigan Saves is a recently funded, innovative statewide energy efficiency and distributed renewable energy financing program implemented by Public Sector Consultants Inc. in partnership with Delta Institute. The Evaluation Center serves as the program’s external evaluator, tasked with conducting an evaluability assessment and process evaluation.

 

Upcoming

After completing the process evaluation of the Cherryland pilot project, we are now planning the statewide process evaluation of residential programming.

Manuscripts in Preparation

Schroeter, D. C., Coryn, C. L. S., Cullen, A., Robertson, K. N., & Alyami, M. Using concept mapping for planning and evaluation of a state-wide energy efficiency initiative. [Target journal: Energy Efficiency].

 

Presentations

Michigan Saves: Evaluability Assessment & Process Evaluation of an Innovative Energy Program. Presentation given at Western Michigan University’s Evaluation Center Evaluation Café Series. [with K. N. Robertson, A. Cullen, & M. Alyami]. Kalamazoo, MI. March 29, 2010.

 

Reports and Briefing Papers

Michigan Saves! Cherryland Pilot Project Evaluation Report (October 2010).

Evaluability Assessment of Michigan Saves! (May, 2010).

Michigan Saves Cherryland Evaluation Plan (March, 2010).

Michigan Saves Management Plan: Status of March, 2010.


NRCCTE’s Effectiveness and Performance on GPRA Measures


Evaluation Contract

March 2009 – February 2013

Sponsor

The National Research Center for Career and Technical Education, University of Louisville

 

Project staff

Daniela Schroeter, Principal Investigator
Richard Zinser, Co-Principal Investigator
Chris Coryn, Methodologist
Kelly Robertson, Project Manager

 

Description

This external evaluation is intended to assess the National Research Center for Career and Technical Education’s (NRCCTE) effectiveness and performance on the Government Performance and Results Act (GPRA) measures. Panel studies are used to assess the relevance of NRCCTE research and the quality of its products, and survey studies are used to assess the usefulness of NRCCTE’s technical assistance and the quality of its professional development. The findings are reported to the Department of Education’s Office of Vocational and Adult Education. In addition to the GPRA assessment, The Evaluation Center has been asked to conduct an outcome evaluation of the Technical Assistance Academy, one of the major NRCCTE projects.

 

Activities 2010-2011

Manuscripts in Preparation

Coryn, C. L. S., Schroeter, D. C., Zinser, R. W., & Robertson, K. N. Evaluating for accountability against Government Performance and Results Act criteria of research relevance and product quality: A case example from a national research center. [Target journal: Research Evaluation]

 

Reports and Briefing Papers

Management plan for assessing NRCCTE’s performance against GPRA metrics in 2010-2011 (May, 2010)

Synthesis of the Evaluation Methodology and Results of NRCCTE’s Effectiveness and Performance on GPRA Measures, Years 1 & 2 (April, 2010)

Evaluation of NRCCTE’s Effectiveness and Performance on GPRA Measures: Final Year 2 (2008-2009) Report (February, 2010)

Evaluation of NRCCTE’s Effectiveness and Performance on GPRA Measures: Final Year 1 (2007-2008) Report (July, 2009)

Findings from the questionnaire: Briefing Paper 3 (July, 2009)

Feedback for revising the panel studies: Briefing Paper 2 (July, 2009)

Findings from the panel study: Briefing Paper 1 (July, 2009)

Management plan for assessing NRCCTE’s performance against GPRA metrics (May, 2009)

 

International Labour Organization


This project revised ILO’s evaluation appraisal tool, specifically reflecting the ILO Declaration on Social Justice for a Fair Globalization; appraised ILO’s 2008 evaluation reports (roughly 40 reports); surveyed evaluators regarding compliance with evaluation processes and results; and reported findings back to ILO. The final report included both findings from each evaluation and synthesized findings across the evaluations.

 

Center for Student Opportunity


The purpose of the Center for Student Opportunity project is to develop a data collection plan and sampling framework, develop data collection instruments, implement a needs assessment, and report findings in aggregate.

 

Fulcrum


This project provided evaluation services for the Federal Railroad Administration’s International Conference on Fatigue Management and support to the evaluation track of the conference.

 

Operation Frontline


Operation Frontline, a nutrition education program that connects families with food by teaching them how to prepare healthy, tasty meals on a limited budget, was implemented at multiple sites across large geographic areas over a period of 15 years.

The Evaluation Center was in the beginning stages of conducting a process evaluation to measure the level of program implementation and developed a monitoring plan for Share Our Strength, the project funder. The evaluation combined utilization-focused and consumer-oriented evaluation models and approaches. Data were gathered through interviews, observations, and surveys.

 

German Logic Model


Logic Model Theory and Practice: A review of manuals, handouts, other written materials, and computer programs