2004 – 2005


October 5, 2004
Title: The Rhetoric and Reality of Evaluation in Portugal

Presenter: Dr. Irene Figueiredo – Instituto Politécnico do Porto, Portugal

Abstract: In Portugal, as in most countries, evaluating the effectiveness of organizations, programs, projects and staff has become a central issue on the political agenda. A significant body of legislation has been proposed concerning public administration and the assessment of civil servants (including the educational sector, which is 90 percent state owned). However, implementation of such policies is facing difficulties. The underdeveloped use of evaluation in Portugal is due to particular cultural/contextual and technical reasons. This session will briefly present some of those reasons and propose improvements to the situation. To succeed, Portugal must identify its strengths and develop and demonstrate excellence. Currently, OECD indicators show the country has a long way to go. Training institutions, particularly at the higher education level, are facing a strong demand for information and training in evaluation theory and practice. To better cope with this demand, Portugal needs to establish sound research and development programs based on national and international cooperation with recognized institutions. The session will present the state of evaluation theory and practice in Portugal and discuss approaches needed to foster a sustainable course of action.

December 14, 2004

View Claudius Ceccon’s political cartoons
Title: Centre for the Creation of People’s Image: An Idea in Progress

Presenter: Claudius Ceccon – CEO, Centre for the Creation of People’s Image (CECIP), Rio de Janeiro, Brazil

Abstract: This presentation will deal with the cultural, social, historical, and political background that led to the creation of CECIP, a Brazilian NGO that uses communication and education to foster human development. Mr. Ceccon will briefly discuss the concept of educommunication and the process of “conscientization,” or awareness raising, a neologism coined by Brazilian educator Paulo Freire. He will present CECIP’s Lines of Action and give concrete examples of current projects and the methodology that supports the organization’s way of working. He will discuss the use of humour as a key pedagogical element. Mr. Ceccon will describe a current case, the “Guaranteeing the Future Project” in Angola, and explain why CECIP was invited to work in that country and how CECIP sees this partnership. Some considerations on the present situation and perspectives for the near future will also be discussed.

January 6, 2005
Title: Curriculum Evaluation: Criteria Development for the Evaluation of the Greek Curriculum in Primary Education

Presenter: Vangelis Krikas – Doctoral Candidate in Curriculum Evaluation, University of Crete, Greece

Abstract: Although educational research has grown over the past decade, curriculum evaluation studies in Greece have received relatively little attention. The main aim of my doctoral research is to develop general and specific criteria for evaluating the intended/written curriculum for Greek primary education. These criteria will be based on: (a) a critical examination of the New Cross-Thematic Curriculum from the Greek Ministry of Education (2003); (b) the contemporary and future nature of the school curriculum; (c) existing evaluation models and criteria/standards; and (d) the views of curriculum and evaluation experts, educators, students, and parents.

January 12, 2005
Title: Evaluating Research: A Structured Set of Questions That Address Validity and Credibility of Research

Presenter: Dr. Gary Miron – Chief of Staff, The Evaluation Center

Abstract: The presentation will review a set of questions that can be asked when evaluating the validity and credibility of reported research. These questions address the design, scope, and completeness of the study. Additional questions address what should be found in the technical report, as well as the source and sponsor of the research. The Hechinger Institute on Education and the Media at Columbia University asked Dr. Miron to prepare a lecture for newspaper editors and supervisors on how they judge and cover educational research dealing with student achievement. This session will highlight some of the issues addressed in his presentation for the Hechinger Institute. A short amount of time at the beginning of the event will be set aside to demonstrate a searchable literature database developed by Center staff and used by Dr. Miron and others working with him.


January 18, 2005

View the slides
Title: Developing Guidelines for Diversity Education

Presenter: Dr. Carolyn Sullins – Senior Research Associate, The Evaluation Center

Abstract: This session is for anyone who is invested in the future of diversity education, whether as developers, facilitators, participants, or parents of participants. Such guidelines are important because, if developed and implemented well, diversity education programs can break down longstanding barriers and promote respect and appreciation in a variety of educational, corporate, and community settings. On the other hand, past experience shows that poorly designed or executed diversity programs can instead fuel resentment or conflict. These issues are especially salient in today’s polarized socio-political climate. The session’s goal is for participants to review Dr. Sullins’ drafted guidelines and present their own ideas for improving them. The ultimate goal is to create a set of broadly accepted guidelines for developing and implementing constructive diversity education programs.

This event was part of WMU’s MLK Day Celebration.


February 4, 2005
Title: HSIRB Guidelines for Research and Evaluation

Presenter: Ms. Victoria Janson – Research Compliance Coordinator, Office of the Vice President for Research

Abstract: Ms. Janson will discuss the University’s Human Subjects Institutional Review Board policies for research and evaluation involving human subjects.

February 23, 2005

View the slides
Title: The Tale of a Mentored Internship Program in Evaluation

Presenters: Dr. Arlen Gullickson – Director, The Evaluation Center; Ms. Joan Farland – Project Specialist, The Evaluation Center

Abstract: The Evaluation Center has offered an evaluation internship opportunity through the National Science Foundation-supported MTS (Materials development, Training, and Support services) grant. Many lessons have been learned from this innovative program, which was in effect from 1997 to 2004. The MTS internship will be described, and implications for its continuation and for The Evaluation Center will be explored and discussed.

March 11, 2005
Guests: Dawn Wood and Gina Thomas – Co-directors, Girls in the Wild program at the Adventure Center at Pretty Lake

Abstract: Ms. Wood and Ms. Thomas are seeking input on how to evaluate the Girls in the Wild program. Girls in the Wild is a wilderness-based rite-of-passage program that seeks to educate adolescent girls about reproductive health, self-defense, body image, and healthy peer relationships. At this time they do not have funding for evaluation, but may acquire it. In the meantime they would like guidance on how to set up an evaluation of the program.

March 18, 2005

View the slides
Title: Evaluation of a Formal Language Test for School-Age Children

Presenters: Dr. Nickola Wolf Nelson – Professor and Director of the WMU Interdisciplinary Health Studies Ph.D. program; Barbara Johnson and Michele Anderson – Doctoral Associates, WMU Interdisciplinary Health Studies Ph.D. program

Abstract: The Test of Integrated Curriculum-Related Language Skills (TICLS) is a new test that has been through several rounds of field testing in preparation for final standardization. We are currently engaged in another set of pilot studies on some of the subtests and parent, teacher, and student versions of a “Classroom Questionnaire.” We would welcome input as we analyze these data and prepare to resubmit a revised Small Business Technology Transfer grant proposal to the NIH for funding of the final standardization activities.


March 29, 2005

View the slides
Title: Hosted Survey™: An Introduction

Presenter: Daniela C. Schroeter – Evaluation Ph.D. Student

Abstract: Web-based tools have become a leading means of developing, designing, and implementing surveys in evaluation, and a wide and fast-growing range of such tools is readily available on the Internet. Hosted Survey™ is one of these tools; it was chosen for The Evaluation Center’s evaluation of the sustainability of the National Science Foundation’s Advanced Technological Education (ATE) program because it includes design options not commonly available in other widely used packages such as Zoomerang™ or Survey Monkey™. In this Evaluation Café, attendees will be introduced to the tool’s primary features and to how it will be used to collect data in the ATE program evaluation. A draft survey will be presented as an example, and attendees will be asked to provide feedback on it.

April 8, 2005

View the slides
Title: Putting Evaluation to Work for Foundations

Presenter: Dr. Jana Kay Slater – Senior Research Scientist, Public Health Institute, Oakland, CA

Abstract: What are the major challenges to a productive partnership between evaluation and philanthropy? What solutions to these challenges are suggested by leaders in these fields? The David and Lucile Packard Foundation recently funded the Evaluation and Philanthropy Project, which aimed to explore these and other questions. Nationally prominent leaders from the philanthropic and evaluation communities worked together over two years to identify issues and propose potential solutions for the unique problems of using evaluation in foundations. Among the themes that emerged from this intellectual groundwork, two stood out clearly. First, evaluation can provide a wide range of benefits to foundations, although the unique organizational characteristics of foundations must be considered in order for evaluation to be of optimal use. Second, evaluators face special challenges in accommodating the ways that foundations are structured, and must be informed about and sensitive to those challenges.

Jana Kay Slater is an experienced evaluator and social science researcher. She was the principal investigator of the Evaluation and Philanthropy Project and is the co-editor of Evaluation and Foundations: Contexts and Practices for Effective Philanthropy (Jossey-Bass, 2004). Drawing on the experiences, debates, and recommendations that grew out of the project, she will present information related to the above themes and will address how to make foundation evaluations successful and useful.
