Assignment: Designing a Plan for Outcome Evaluation
Social workers can apply knowledge and skills learned from conducting one type of evaluation to others. Moreover, evaluations themselves can inform and complement each other throughout the life of a program. This week, you apply all that you have learned about program evaluation throughout this course to aid you in program evaluation.
To prepare for this Assignment, review “Basic Guide to Program Evaluation (Including Outcomes Evaluation)” from this week’s resources, especially the sections titled “Outcomes-Based Evaluation” and “Contents of an Evaluation Plan,” as well as Plummer, S.-B., Makris, S., & Brocksen, S. (Eds.). (2014b). Social work case studies: Concentration year. Retrieved from http://www.vitalsource.com. Then select a program that you would like to evaluate. You should build on work that you have done in previous assignments, but be sure to self-cite any written work that you have already submitted. Complete as many areas of the “Contents of an Evaluation Plan” as possible, leaving out items that assume you have already collected and analyzed the data.
By Day 7
Submit a 4- to 5-page paper that outlines a plan for a program evaluation focused on outcomes. Be specific and elaborate. Include the following information:
· The purpose of the evaluation, including specific questions to be answered
· The outcomes to be evaluated
· The indicators or instruments to be used to measure those outcomes, including the strengths and limitations of those measures
· A rationale for selecting among the six group research designs
· The methods for collecting, organizing, and analyzing data
Resource 1
McNamara, C. (2006a). Contents of an evaluation plan. In Basic guide to program evaluation (including outcomes evaluation). Retrieved from http://managementhelp.org/evaluation/program-evaluation-guide.htm#anchor1586742
Contents of an Evaluation Plan
Develop an evaluation plan to ensure your program evaluations
are carried out efficiently in the future. Note that bankers or
funders may want or benefit from a copy of this plan.
Ensure your evaluation plan is documented so you can regularly
and efficiently carry out your evaluation activities. Record enough
information in the plan so that someone outside of the organization
can understand what you’re evaluating and how. Consider the following
format for your report:
1. Title Page (name of the organization that is being evaluated, or
that has a product/service/program being evaluated; date)
2. Table of Contents
3. Executive Summary (one-page, concise overview of findings and
recommendations)
4. Purpose of the Report (what type of evaluation(s) was conducted,
what decisions are being aided by the findings of the evaluation,
who is making the decision, etc.)
5. Background About Organization and Product/Service/Program that
is being evaluated
a) Organization Description/History
b) Product/Service/Program Description (that is being evaluated)
i) Problem Statement (in the case of nonprofits, description of
the community need that is being met by the product/service/program)
ii) Overall Goal(s) of Product/Service/Program
iii) Outcomes (or client/customer impacts) and Performance Measures
(that can be measured as indicators toward the outcomes)
iv) Activities/Technologies of the Product/Service/Program (general
description of how the product/service/program is developed and
delivered)
v) Staffing (description of the number of personnel and roles
in the organization that are relevant to developing and delivering
the product/service/program)
6. Overall Evaluation Goals (e.g., what questions are being answered
by the evaluation)
7. Methodology
a) Types of data/information that were collected
b) How data/information were collected (what instruments were
used, etc.)
c) How data/information were analyzed
d) Limitations of the evaluation (e.g., cautions about findings/conclusions
and how to use the findings/conclusions, etc.)
8. Interpretations and Conclusions (from analysis of the data/information)
9. Recommendations (regarding the decisions that must be made
about the product/service/program)
Appendices: content of the appendices depends on the goals of
the evaluation report, e.g.:
a) Instruments used to collect data/information
b) Data, e.g., in tabular format, etc.
c) Testimonials, comments made by users of the product/service/program
d) Case studies of users of the product/service/program
e) Any related literature
Pitfalls to Avoid
1. Don’t balk at evaluation because it seems far too “scientific.”
It’s not. Usually the first 20% of effort will generate the first
80% of the plan, and this is far better than nothing.
2. There is no “perfect” evaluation design. Don’t worry
about the plan being perfect. It’s far more important to do something,
than to wait until every last detail has been tested.
3. Work hard to include some interviews in your evaluation methods.
Questionnaires don’t capture “the story,” and the story
is usually the most powerful depiction of the benefits of your
services.
4. Don’t interview just the successes. You’ll learn a great deal
about the program by understanding its failures, dropouts, etc.
5. Don’t throw away evaluation results once a report has been
generated. Results don’t take up much room, and they can provide
precious information later when trying to understand changes in
the program.
Resource 2
Plummer, S.-B., Makris, S., & Brocksen, S. (Eds.). (2014b). Social work case studies: Concentration year. Baltimore, MD: Laureate International Universities Publishing. [Vital Source e-reader].
Read the following section:
“Social Work Research: Planning a Program Evaluation”
Social Work Research: Planning a Program Evaluation
Joan is a social worker who is currently enrolled in a social work PhD program. She is planning to conduct her dissertation research project with a large nonprofit child welfare organization where she has worked as a site coordinator for many years. She has already approached the agency director with her interest, and the leadership team of the agency stated that they would like to collaborate on the research project.
The child welfare organization at the center of the planned study has seven regional centers that operate fairly independently. The primary focus of work is on foster care; that is, recruiting and training foster parents and running a regular foster care program with an emphasis on family foster care. The agency has a residential program as well, but it will not participate in the study. Each of the regional centers serves about 45–50 foster parents and approximately 100 foster children. On average, five to six new foster families are recruited at each center on a quarterly basis. This number has been consistent over the past 2 years.
Recently it was decided that a new training program for incoming foster parents would be used by the organization. The primary goals of this new training program include reducing foster placement disruptions, improving the quality of services delivered, and increasing child well-being through better trained and skilled foster families. Each of the regional centers will participate and implement the new training program. Three of the sites will start the program immediately, while the other four centers will not start until 12 months from now. The new training program consists of six separate 3-hour training sessions that are typically conducted in a biweekly format. It is a fairly proceduralized training program; that is, a very detailed set of manuals and training materials exists. All trainings will be conducted by the same two instructors. The current training program that it will replace differs considerably in its focus, but it also uses a 6-week, 3-hour format. It will be used by those sites not immediately participating until the new program is implemented.
Joan has done a thorough review of the foster care literature and has found that there has been no research on the training program to date, even though it is being used by a growing number of agencies. She also found that there are some standardized instruments that she could use for her study. In addition, she would need to create a set of Likert-type scales for the study. She will be able to use a group design because all seven regional centers are interested in participating and they are starting the training at different times.
Resource 3
McNamara, C. (2006b). Reasons for priority on implementing outcomes-based evaluation. In Basic guide to outcomes-based evaluation for nonprofit organizations with very limited resources. Retrieved from http://managementhelp.org/evaluation/outcomes-evaluation-guide.htm#anchor30249