examples of impact evaluation questions: Impact Evaluation in Practice, Second Edition Paul J. Gertler, Sebastian Martinez, Patrick Premand, Laura B. Rawlings, Christel M. J. Vermeersch, 2016-09-12 The second edition of the Impact Evaluation in Practice handbook is a comprehensive and accessible introduction to impact evaluation for policy makers and development practitioners. First published in 2011, it has been used widely across the development and academic communities. The book incorporates real-world examples to present practical guidelines for designing and implementing impact evaluations. Readers will gain an understanding of impact evaluations and the best ways to use them to design evidence-based policies and programs. The updated version covers the newest techniques for evaluating programs and includes state-of-the-art implementation advice, as well as an expanded set of examples and case studies that draw on recent development challenges. It also includes new material on research ethics and partnerships to conduct impact evaluation. The handbook is divided into four sections: Part One discusses what to evaluate and why; Part Two presents the main impact evaluation methods; Part Three addresses how to manage impact evaluations; Part Four reviews impact evaluation sampling and data collection. Case studies illustrate different applications of impact evaluations. The book links to complementary instructional material available online, including an applied case as well as questions and answers. The updated second edition will be a valuable resource for the international development community, universities, and policy makers looking to build better evidence around what works in development. |
examples of impact evaluation questions: Impact Evaluation of Development Interventions Howard White, David A. Raitzer, 2017-12-01 Impact evaluation is an empirical approach to estimating the causal effects of interventions, in terms of both magnitude and statistical significance. Expanded use of impact evaluation techniques is critical for rigorously deriving knowledge from development operations and for making development investments and policies more evidence-based and effective. To support wider use of impact evaluation approaches, this book introduces core concepts, methods, and considerations for planning, designing, managing, and implementing impact evaluation, supplemented by examples. The topics covered range from impact evaluation purposes to basic principles, specific methodologies, and guidance on field implementation. It offers material for a range of audiences, from those who are interested in understanding evidence on what works in development, to those who will contribute to expanding the evidence base as applied researchers. |
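The definition above, estimating causal effects in terms of both magnitude and statistical significance, can be illustrated with a minimal sketch that is not taken from the book: for a randomized intervention, the simplest estimator is the difference in mean outcomes between the treatment and comparison groups, with a t-test for significance. The variable names and simulated data below are hypothetical.

```python
# Minimal sketch with simulated (hypothetical) data: estimate the average
# treatment effect of a randomized intervention as a difference in means,
# and report both the magnitude and the statistical significance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n = 500
treated = rng.integers(0, 2, size=n)                 # 1 = received the intervention
outcome = 1.0 + 0.4 * treated + rng.normal(0, 1, n)  # true effect is 0.4

ate = outcome[treated == 1].mean() - outcome[treated == 0].mean()
t_stat, p_value = stats.ttest_ind(outcome[treated == 1], outcome[treated == 0])

print(f"Estimated impact (magnitude): {ate:.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")        # statistical significance
```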
examples of impact evaluation questions: The Goldilocks Challenge Mary Kay Gugerty, Dean Karlan, 2018-04-02 The social sector provides services to a wide range of people throughout the world with the aim of creating social value. While doing good is great, doing it well is even better. These organizations, whether nonprofit, for-profit, or public, increasingly need to demonstrate that their efforts are making a positive impact on the world, especially as competition for funding and other scarce resources increases. This heightened focus on impact is positive: learning whether we are making a difference enhances our ability to address pressing social problems effectively and is critical to wise stewardship of resources. Yet demonstrating efficacy remains a big hurdle for most organizations. The Goldilocks Challenge provides a parsimonious framework for measuring the strategies and impact of social sector organizations. A good data strategy starts with a sound theory of change that helps organizations decide what elements they should monitor and measure. With a theory of change providing solid underpinning, the Goldilocks framework then puts forward four key principles, the CART principles: Credible data that are high quality and analyzed appropriately; Actionable data that will actually influence future decisions; Responsible data that create more benefits than costs; and Transportable data that build knowledge that can be used in the future and by others. Mary Kay Gugerty and Dean Karlan combine their extensive experience working with nonprofits, for-profits, and government with their understanding of measuring effectiveness in this insightful guide to thinking about and implementing evidence-based change. This book is an invaluable asset for nonprofit, social enterprise, and government leaders, managers, and funders (including anyone considering making a charitable contribution to a nonprofit) to ensure that these organizations get it just right by knowing what data to collect, how to collect it, how to analyze it, and how to draw implications from the analysis. Everyone who wants to make positive change should focus on the top priority: using data to learn, innovate, and improve program implementation over time. Gugerty and Karlan show how. |
examples of impact evaluation questions: Program Evaluation John M Owen, Patricia J. Rogers, 1999-04-05 Using an original framework, this practical introduction to evaluation shows how to identify appropriate forms and approaches, involve stakeholders in the planning process and disseminate the evaluation findings. |
examples of impact evaluation questions: The Road to Results Linda G. Morra-Imas, Linda G. Morra, Ray C. Rist, 2009 'The Road to Results: Designing and Conducting Effective Development Evaluations' presents concepts and procedures for evaluation in a development context. It provides procedures and examples on how to set up a monitoring and evaluation system, how to conduct participatory evaluations and do social mapping, and how to construct a rigorous quasi-experimental design to answer an impact question. The text begins with the context of development evaluation and how it arrived where it is today. It then discusses current issues driving development evaluation, such as the Millennium Development Goals and the move from simple project evaluations to the broader understandings of complex evaluations. The topics of implementing 'Results-based Measurement and Evaluation' and constructing a 'Theory of Change' are emphasized throughout the text. Next, the authors take the reader down 'the road to results,' presenting procedures for evaluating projects, programs, and policies by using a 'Design Matrix' to help map the process. This road includes: determining the overall approach, formulating questions, selecting designs, developing data collection instruments, choosing a sampling strategy, and planning data analysis for qualitative, quantitative, and mixed method evaluations. The book also includes discussions on conducting complex evaluations, how to manage evaluations, how to present results, and ethical behavior--including principles, standards, and guidelines. The final chapter discusses the future of development evaluation. This comprehensive text is an essential tool for those involved in development evaluation. |
examples of impact evaluation questions: A Review of Recent Developments in Impact Evaluation Asian Development Bank, 2011-02-01 Impact evaluation aims to answer whether and to what extent a development intervention has delivered its intended effects, thus enabling evidence-based policy making. The desire for more hard evidence of the effectiveness of development interventions has fueled a growing interest in rigorous impact evaluation in the international development community. This report discusses the fundamental challenge of impact evaluation, which is to credibly attribute the impact, if any, to the intervention concerned. It then discusses the merits and limitations of various impact evaluation methods. It also presents a survey of recent applications of impact evaluation, focusing on the typical evaluation problems looked at, methods used, and key findings. The report includes six case studies and outlines practical steps in implementing an impact evaluation. |
examples of impact evaluation questions: Principles-Focused Evaluation Michael Quinn Patton, 2017-09-28 How can programs and organizations ensure they are adhering to core principles--and assess whether doing so is yielding desired results? From evaluation pioneer Michael Quinn Patton, this book introduces the principles-focused evaluation (P-FE) approach and demonstrates its relevance and application in a range of settings. Patton explains why principles matter for program development and evaluation and how they can serve as a rudder to navigate the uncertainties, turbulence, and emergent challenges of complex dynamic environments. In-depth exemplars illustrate how the unique GUIDE framework is used to determine whether principles provide meaningful guidance (G) and are useful (U), inspiring (I), developmentally adaptable (D), and evaluable (E). User-friendly features include rubrics, a P-FE checklist, firsthand reflections and examples from experienced P-FE practitioners, sidebars and summary tables, and end-of-chapter application exercises. |
examples of impact evaluation questions: Effective Chemistry Communication in Informal Environments National Academies of Sciences, Engineering, and Medicine, Division of Behavioral and Social Sciences and Education, Board on Science Education, Division on Earth and Life Studies, Board on Chemical Sciences and Technology, Committee on Communicating Chemistry in Informal Settings, 2016-09-19 Chemistry plays a critical role in daily life, impacting areas such as medicine and health, consumer products, energy production, the ecosystem, and many other areas. Communicating about chemistry in informal environments has the potential to raise public interest and understanding of chemistry around the world. However, the chemistry community lacks a cohesive, evidence-based guide for designing effective communication activities. This report is organized into two sections. Part A: The Evidence Base for Enhanced Communication summarizes evidence from communications, informal learning, and chemistry education on effective practices to communicate with and engage publics outside of the classroom; presents a framework for the design of chemistry communication activities; and identifies key areas for future research. Part B: Communicating Chemistry: A Framework for Sharing Science is a practical guide intended for any chemists to use in the design, implementation, and evaluation of their public communication efforts. |
examples of impact evaluation questions: Evaluation matters Katrin Dziekan, Veronique Riedel, Stephanie Müller, Michael Abraham, Stefanie Kettner, Stephan Daubi, 2013 Based on the authors' rich experiences, this book demonstrates that evaluation of measures aimed at more sustainable mobility is a useful task that anyone can learn. By integrating theory and practice, it offers richly illustrated case examples and cartoons to provide hands-on advice. It offers a framework for thinking about evaluation of mobility-related measures and outlines the necessary steps for good evaluation practice. Key features: richly illustrated with cartoons and real-world measure examples; a step-by-step, hands-on guide for practitioners. |
examples of impact evaluation questions: Handbook on Impact Evaluation Shahidur R. Khandker, Gayatri B. Koolwal, Hussain A. Samad, 2009-10-13 Public programs are designed to reach certain goals and beneficiaries. Methods to understand whether such programs actually work, as well as the level and nature of impacts on intended beneficiaries, are main themes of this book. |
examples of impact evaluation questions: The Practice of Evaluation Ryan P. Kilmer, James R. Cook, 2020-09-18 The Practice of Evaluation: Partnership Approaches for Community Change provides foundational content on evaluation concepts, approaches, and methods, with an emphasis on the use of evaluation and partnership approaches to effect change. Real examples in every chapter illustrate key ideas and concepts in action on topics such as organizational development, capacity building, program improvement, and advocacy. Editors Ryan P. Kilmer and James R. Cook, and the chapter authors, highlight pragmatic approaches to evaluation that balance the needs of stakeholders in an ethical way, to provide useful, usable, and actionable guidance for program improvement. Included with this title: The password-protected Instructor Resource Site (formerly known as SAGE Edge) offers access to all text-specific resources, including a test bank and editable, chapter-specific PowerPoint® slides. |
examples of impact evaluation questions: Evaluation Fundamentals: Insights into the Outcomes, Effectiveness, and Quality of Health Programs Arlene Fink, 2005 Arlene Fink outlines the basic concepts and vocabulary necessary for program evaluation and illustrates how to review the quality of evaluation research so as to make informed decisions about methods and outcomes. |
examples of impact evaluation questions: A practical guide for ex-ante impact evaluation in fisheries and aquaculture Crissman, C.C., Abernethy, K., Delaporte, A., Timmers, B., |
examples of impact evaluation questions: The Logic Model Guidebook Lisa Wyatt Knowlton, Cynthia C. Phillips, 2012-08-24 The Logic Model Guidebook offers clear, step-by-step support for creating logic models and the modeling process in a range of contexts. Lisa Wyatt Knowlton and Cynthia C. Phillips describe the structures, processes, and language of logic models as a robust tool to improve the design, development, and implementation of program and organization change efforts. The text is enhanced by numerous visual learning guides (sample models, checklists, exercises, worksheets) and many new case examples. The authors provide students, practitioners, and beginning researchers with practical support to develop and improve models that reflect knowledge, practice, and beliefs. The Guidebook offers a range of new applied examples. The text includes logic models for evaluation, discusses archetypes, and explores display and meaning. In an important contribution to programs and organizations, it emphasizes quality by raising issues like plausibility, feasibility, and strategic choices in model creation. |
examples of impact evaluation questions: Development Research in Practice Kristoffer Bjärkefur, Luíza Cardoso de Andrade, Benjamin Daniels, Maria Ruth Jones, 2021-07-16 Development Research in Practice leads the reader through a complete empirical research project, providing links to continuously updated resources on the DIME Wiki as well as illustrative examples from the Demand for Safe Spaces study. The handbook is intended to train users of development data how to handle data effectively, efficiently, and ethically. “In the DIME Analytics Data Handbook, the DIME team has produced an extraordinary public good: a detailed, comprehensive, yet easy-to-read manual for how to manage a data-oriented research project from beginning to end. It offers everything from big-picture guidance on the determinants of high-quality empirical research, to specific practical guidance on how to implement specific workflows—and includes computer code! I think it will prove durably useful to a broad range of researchers in international development and beyond, and I learned new practices that I plan on adopting in my own research group.” —Marshall Burke, Associate Professor, Department of Earth System Science, and Deputy Director, Center on Food Security and the Environment, Stanford University “Data are the essential ingredient in any research or evaluation project, yet there has been too little attention to standardized practices to ensure high-quality data collection, handling, documentation, and exchange. Development Research in Practice: The DIME Analytics Data Handbook seeks to fill that gap with practical guidance and tools, grounded in ethics and efficiency, for data management at every stage in a research project. This excellent resource sets a new standard for the field and is an essential reference for all empirical researchers.” —Ruth E. Levine, PhD, CEO, IDinsight “Development Research in Practice: The DIME Analytics Data Handbook is an important resource and a must-read for all development economists, empirical social scientists, and public policy analysts. Based on decades of pioneering work at the World Bank on data collection, measurement, and analysis, the handbook provides valuable tools to allow research teams to more efficiently and transparently manage their work flows—yielding more credible analytical conclusions as a result.” —Edward Miguel, Oxfam Professor in Environmental and Resource Economics and Faculty Director of the Center for Effective Global Action, University of California, Berkeley “The DIME Analytics Data Handbook is a must-read for any data-driven researcher looking to create credible research outcomes and policy advice. By meticulously describing detailed steps, from project planning via ethical and responsible code and data practices to the publication of research papers and associated replication packages, the DIME handbook makes the complexities of transparent and credible research easier.” —Lars Vilhuber, Data Editor, American Economic Association, and Executive Director, Labor Dynamics Institute, Cornell University |
examples of impact evaluation questions: Doing Real Research Eric Jensen, Charles Laurie, 2016-03-17 Challenging the formality and idealized settings of conventional methods teaching and opting instead for a real-world approach to social research, this book offers frank, practical advice designed to empower students and researchers alike. Theoretically robust and with exhaustive coverage of key methodologies and methods, the title establishes the cornerstones of social research. Examples reflect research conducted inside and outside formal university settings and range from the extremes of war-torn countries to the complexities of school classrooms. Supported by a wealth of learning features and tools, the textbook and website include video top tips, podcasts, full-text journal articles, interviews with researchers conducting field research, links to external websites and blogs, student exercises, and real-world case studies. |
examples of impact evaluation questions: Developmental Evaluation Exemplars Michael Quinn Patton, Kate McKegg, Nan Wehipeihana, 2015-11-16 Responding to evaluator and instructor demand, this book presents a diverse set of high-quality developmental evaluation (DE) case studies. Twelve insightful exemplars illustrate how DE is used to evaluate innovative initiatives in complex, dynamic environments, including a range of fields and international settings. Written by leading practitioners, chapters offer a rare window into what it takes to do DE, what roles must be fulfilled, and what results can be expected. Each case opens with an incisive introduction by the editors. The book also addresses frequently asked questions about DE, synthesizes key themes and lessons learned from the exemplars, and identifies eight essential principles of DE. See also Michael Quinn Patton's Developmental Evaluation, the authoritative presentation of DE. |
examples of impact evaluation questions: Small-Scale Evaluation Colin Robson, 2000-02-11 How can evaluation be used most effectively, and what are the strengths and weaknesses of the various methods? Colin Robson provides guidance in a clear and uncluttered way. The issue of collaboration is examined step-by-step; stakeholder models are compared with techniques such as participatory evaluation and practitioner-centred action research; ethical and political considerations are placed in context; and the best ways of communicating findings are discussed. Each chapter is illustrated with helpful exercises to show the practical application of the issues covered, making this an invaluable introduction for anyone new to evaluation. |
examples of impact evaluation questions: How Change Happens Duncan Green, 2016 DLP, Developmental Leadership Program; Australian Aid; Oxfam. |
examples of impact evaluation questions: Economic Evaluation of Sustainable Development Vinod Thomas, Namrata Chindarkar, 2019-04-16 This book is open access under a CC BY 4.0 license. This book presents methods to evaluate sustainable development using economic tools. The focus on sustainable development takes the reader beyond economic growth to encompass inclusion, environmental stewardship and good governance. Sustainable Development Goals (SDGs) provide a framework for outcomes. In illustrating the SDGs, the book employs three evaluation approaches: impact evaluation, cost-benefit analysis and objectives-based evaluation. The innovation lies in connecting evaluation tools with economics. Inclusion, environmental care and good governance, thought of as “wicked problems”, are given centre stage. The book uses case studies to show the application of evaluation tools. It offers guidance to evaluation practitioners, students of development and policymakers. The basic message is that evaluation comes to life when its links with socio-economic, environmental, and governance policies are capitalized on. |
examples of impact evaluation questions: Evaluation Peter Henry Rossi, Howard E. Freeman, Sonia Rosenbaum, 1982-02 |
examples of impact evaluation questions: Community Impact Assessment, 1996 This guide was written as a quick primer for transportation professionals and analysts who assess the impacts of proposed transportation actions on communities. It outlines the community impact assessment process, highlights critical areas that must be examined, identifies basic tools and information sources, and stimulates the thought process related to individual projects. In the past, the consequences of transportation investments on communities have often been ignored or introduced near the end of a planning process, reducing them to reactive considerations at best. The goals of this primer are to increase awareness of the effects of transportation actions on the human environment and to emphasize that community impacts deserve serious attention in project planning and development, attention comparable to that given the natural environment. Finally, this guide is intended to provide some tips for facilitating public involvement in the decision-making process. |
examples of impact evaluation questions: A Practical Guide to Program Evaluation Planning Marc A. Zimmerman, Debra J. Holden, 2009 This book guides evaluators in planning a comprehensive, yet practical, program evaluation—from start to design—within any context, in an accessible manner. |
examples of impact evaluation questions: Developmental Evaluation Michael Quinn Patton, 2010-06-14 Developmental evaluation (DE) offers a powerful approach to monitoring and supporting social innovations by working in partnership with program decision makers. In this book, eminent authority Michael Quinn Patton shows how to conduct evaluations within a DE framework. Patton draws on insights about complex dynamic systems, uncertainty, nonlinearity, and emergence. He illustrates how DE can be used for a range of purposes: ongoing program development, adapting effective principles of practice to local contexts, generating innovations and taking them to scale, and facilitating rapid response in crisis situations. Students and practicing evaluators will appreciate the book's extensive case examples and stories, cartoons, clear writing style, closer look sidebars, and summary tables. Provided is essential guidance for making evaluations useful, practical, and credible in support of social change. |
examples of impact evaluation questions: Realistic Evaluation Ray Pawson, Nick Tilley, 1997-06-23 |
examples of impact evaluation questions: Evaluating AIDS Prevention Programs National Research Council, Division of Behavioral and Social Sciences and Education, Commission on Behavioral and Social Sciences and Education, Committee on AIDS Research and the Behavioral, Social, and Statistical Sciences, Panel on the Evaluation of AIDS Interventions, 1991-02-01 With insightful discussion of program evaluation and the efforts of the Centers for Disease Control, this book presents a set of clear-cut recommendations to help ensure that the substantial resources devoted to the fight against AIDS will be used most effectively. This expanded edition of Evaluating AIDS Prevention Programs covers evaluation strategies and outcome measurements, including a realistic review of the factors that make evaluation of AIDS programs particularly difficult. Randomized field experiments are examined, focusing on the use of alternative treatments rather than placebo controls. The book also reviews nonexperimental techniques, including a critical examination of evaluation methods that are observational rather than experimental, a necessity when randomized experiments are infeasible. |
examples of impact evaluation questions: Outcome-Based Evaluation Robert L. Schalock, 2005-12-17 Outcome-based evaluation continues to play a central role in the larger field of policy analysis and speaks to the needs and interests of administrators, students, policymakers, funders, consumers, and educators. In a thoroughgoing revision of the first edition of this classic text and reference, published by Plenum in 1995, the author broadens the coverage from his previous emphasis on developmental disabilities to include other areas of human and social service delivery such as education, health, mental health, aging, substance abuse, and corrections. |
examples of impact evaluation questions: Causal Inference Scott Cunningham, 2021-01-26 An accessible, contemporary introduction to the methods for determining cause and effect in the Social Sciences “Causation versus correlation has been the basis of arguments—economic and otherwise—since the beginning of time. Causal Inference: The Mixtape uses legit real-world examples that I found genuinely thought-provoking. It’s rare that a book prompts readers to expand their outlook; this one did for me.”—Marvin Young (Young MC) Causal inference encompasses the tools that allow social scientists to determine what causes what. In a messy world, causal inference is what helps establish the causes and effects of the actions being studied—for example, the impact (or lack thereof) of increases in the minimum wage on employment, the effects of early childhood education on incarceration later in life, or the influence on economic growth of introducing malaria nets in developing regions. Scott Cunningham introduces students and practitioners to the methods necessary to arrive at meaningful answers to the questions of causation, using a range of modeling techniques and coding instructions for both the R and the Stata programming languages. |
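The minimum wage question mentioned in this description is the kind usually tackled with a difference-in-differences design, one of the strategies the book walks through (with R and Stata code; the sketch below is a hypothetical Python version on simulated data, not code from the book).

```python
# Hypothetical difference-in-differences sketch (the book itself supplies R and
# Stata code): compare the before/after change in average employment for a
# region exposed to a minimum wage increase against the change in a comparison
# region, so that common trends cancel out of the estimate.
import numpy as np

rng = np.random.default_rng(1)

def group_mean(n, base, trend=0.0, effect=0.0):
    """Average simulated outcome for one group in one period."""
    return (base + trend + effect + rng.normal(0, 1, n)).mean()

treated_pre  = group_mean(400, base=10.0)
treated_post = group_mean(400, base=10.0, trend=0.5, effect=-0.8)  # policy effect
control_pre  = group_mean(400, base=12.0)
control_post = group_mean(400, base=12.0, trend=0.5)

did = (treated_post - treated_pre) - (control_post - control_pre)
print(f"Difference-in-differences estimate: {did:.3f}")  # close to -0.8
```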
examples of impact evaluation questions: Evaluating the Impact of Development Projects on Poverty Judy L. Baker, 2000 Despite the billions of dollars spent on development assistance each year, there is still very little known about the actual impact of projects on the poor. There is broad evidence on the benefits of economic growth, investments in human capital, and the provision of safety nets for the poor. But for a specific program or project in a given country, is the intervention producing the intended benefits and what was the overall impact on the population? Could the program or project be better designed to achieve the intended outcomes? Are resources being spent efficiently? These are the types of questions that can only be answered through an impact evaluation, an approach which measures the outcomes of a program intervention in isolation of other possible factors. This handbook seeks to provide project managers and policy analysts with the tools needed for evaluating project impact. It is aimed at readers with a general knowledge of statistics. For some of the more in-depth statistical methods discussed, the reader is referred to the technical literature on the topic. Chapter 1 presents an overview of concepts and methods. Chapter 2 discusses key steps and related issues to consider in implementation. Chapter 3 illustrates various analytical techniques through a case study. Chapter 4 includes a discussion of lessons learned from a rich set of good practice evaluations of poverty projects which have been reviewed for this handbook. |
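When randomization is not possible, one common way to measure outcomes "in isolation of other possible factors," as the handbook puts it, is to adjust for observed characteristics of beneficiaries and non-beneficiaries. The sketch below is a hypothetical illustration of regression adjustment on simulated data, not an excerpt from the handbook; all names are made up.

```python
# Hypothetical regression-adjustment sketch on simulated data: regress the
# outcome on a program-participation indicator plus an observed household
# characteristic, so the participation coefficient reflects the program's
# impact net of that measured factor.
import numpy as np

rng = np.random.default_rng(2)
n = 1000

household_income = rng.normal(0, 1, n)                        # observed confounder
participates = (household_income + rng.normal(0, 1, n) > 0).astype(float)
outcome = 2.0 + 0.5 * participates + 0.8 * household_income + rng.normal(0, 1, n)

# Design matrix: intercept, participation dummy, covariate.
X = np.column_stack([np.ones(n), participates, household_income])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

naive = outcome[participates == 1].mean() - outcome[participates == 0].mean()
print(f"Naive difference in means: {naive:.3f}")              # biased upward
print(f"Adjusted impact estimate:  {coef[1]:.3f}")            # close to 0.5
```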
examples of impact evaluation questions: Design Considerations for Evaluating the Impact of PEPFAR Institute of Medicine, Board on Global Health, 2008-10-05 Design Considerations for Evaluating the Impact of PEPFAR is the summary of a 2-day workshop on methodological, policy, and practical design considerations for a future evaluation of human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) interventions carried out under the President's Emergency Plan for AIDS Relief (PEPFAR), which was convened by the Institute of Medicine (IOM) on April 30 and May 1, 2007. Participants at the workshop included staff of the U.S. Congress; PEPFAR officials and implementers; major multilateral organizations such as The Global Fund to Fight AIDS, Malaria, and Tuberculosis (The Global Fund), the Joint United Nations Programme on HIV/AIDS (UNAIDS), and the World Bank; representatives from international nongovernmental organizations; experienced evaluation experts; and representatives of partner countries, particularly the PEPFAR focus countries. The workshop represented a final element of the work of the congressionally mandated IOM Committee for the Evaluation of PEPFAR Implementation, which published a report of its findings in 2007 evaluating the first 2 years of implementation, but could not address longer term impact evaluation questions. |
examples of impact evaluation questions: The 'most Significant Change' (MSC) Technique Rick Davies, Jess Dart, 2007 |
examples of impact evaluation questions: Utilization-Focused Evaluation Michael Quinn Patton, 1986 The second edition of Patton's classic text retains the practical advice, based on empirical observation and evaluation theory, of the original. It shows how to conduct an evaluation, from beginning to end, in a way that will be useful -- and actually used. Patton believes that evaluation epitomizes the challenges of producing and using information in the information age. His latest book includes new stories, new examples, new research findings, and more of Patton's evaluation humour. He adds to the original book's insights and analyses of the changes in evaluation during the past decade, including: the emergence of evaluation as a field of professional practice; articulation of standards for evaluation; a methodological synthesis of the qualitative versus quantitative debate; the tremendous growth of 'in-house' evaluations; and the cross-cultural development of evaluation as a profession. This edition also incorporates the considerable research done on utilization during the last ten years. Patton integrates diverse findings into a coherent framework which includes: articulation of utilization-focused evaluation premises; examination of the stakeholder assumption; and clarification of the meaning of utilization. --Publisher description. |
examples of impact evaluation questions: Evaluating Professional Development Thomas R. Guskey, 2000 Explains how to better evaluate professional development in order to ensure that it increases student learning, providing questions for accurate measurement of professional development and showing how to demonstrate results and accountability. |
examples of impact evaluation questions: Program Evaluation and Performance Measurement James C. McDavid, Irene Huse, Laura R. L. Hawthorn, 2012-10-25 Program Evaluation and Performance Measurement: An Introduction to Practice, Second Edition offers an accessible, practical introduction to program evaluation and performance measurement for public and non-profit organizations, and has been extensively updated since the first edition. Using examples, it covers topics in a detailed fashion, making it a useful guide for students as well as practitioners who are participating in program evaluations or constructing and implementing performance measurement systems. Authors James C. McDavid, Irene Huse, and Laura R. L. Hawthorn guide readers through conducting quantitative and qualitative program evaluations, needs assessments, cost-benefit and cost-effectiveness analyses, as well as constructing, implementing and using performance measurement systems. The importance of professional judgment is highlighted throughout the book as an intrinsic feature of evaluation practice. |
examples of impact evaluation questions: Attributing Development Impact James Copestake, Marlies Morsink, Fiona Remnant, 2019 Attributing Development Impact brings together responses using an innovative impact evaluation approach called the Qualitative Impact Protocol (QuIP). This is a transparent, flexible and relatively simple set of guidelines for collecting, analysing and sharing feedback from intended beneficiaries about significant drivers of change in their lives. |
examples of impact evaluation questions: Evaluating Programs to Increase Student Achievement Martin H. Jason, 2008-03-27 This updated edition on evaluating the effectiveness of school programs provides an expanded needs-assessment section, additional methods for data analysis, and tools for communicating program results. |
examples of impact evaluation questions: User-friendly Handbook for Mixed Method Evaluations Joy A. Frechtling, Laure Metzger Sharp, 1997 In the evaluation of the process and effectiveness of projects funded by the NSF's Directorate for Education, experienced evaluators have found that most often the best results are achieved through the use of mixed method evaluations combining quantitative and qualitative techniques. Aimed at users who need practical rather than technically sophisticated advice about evaluation methodology, this handbook includes an in-depth discussion of the collection and analysis of qualitative data and examines how qualitative techniques can be combined effectively with quantitative measures. Bibliography. Glossary. Worksheets. |
examples of impact evaluation questions: The Practice of Health Program Evaluation David Grembowski, 2015-09-16 Reflecting the latest developments in the field, the Second Edition provides readers with effective methods for evaluating health programs, policies, and health care systems, offering expert guidance for collaborating with stakeholders involved in the process. Author David Grembowski explores evaluation as a three-act play: Act I shows evaluators how to work with decision makers and other groups to identify the questions they want answered; Act II covers selecting appropriate evaluation designs and methods to answer the questions and reveal insights about the program’s impacts, cost-effectiveness, and implementation; and Act III discusses making use of the findings. Packed with relevant examples and detailed explanations, the book offers a step-by-step approach that fully prepares readers to apply research methods in the practice of health program evaluation. |