
Outcome-Based Evaluation

What is Outcome-Based Evaluation (OBE)?

Outcome-based evaluation (OBE) focuses on the difference a program, service or activity makes in the lives of its participants. An outcome is a desired change in an end user's knowledge, skills, behaviors, attitudes, condition or status.

OBE focuses on two key questions:

  1. How has your program, service or activity made a difference?
  2. How are the lives of your program, service or activity participants better as a result?

OBE is a systematic way to assess the extent to which a program has achieved its intended results. OBE identifies observations and measurements (indicators) that can credibly demonstrate changes or desirable conditions in your target audience. It is easier than one might think, and it is not a research study.

As of 2013, the Oregon State Library is moving to OBE for LSTA and Ready to Read grant projects.


OBE and Ready to Read Grants

Help is available for using OBE in Ready to Read grants: contact Katie Anderson at 503-378-2528 or katie.anderson@state.or.us. Using OBE in Ready to Read grants is easier than you'd think. Here are some simple tips:

  • Choose only one to three outcomes on your grant application. Your program may address many of the provided outcomes, but focus on the top ones to measure and work with.
  • Look at your program from the participant's point of view.
  • Evaluation methods can be simple and easy to do: observation, interviews, brief talks with kids, raised hands, surveys, and so on.

Sample Ready to Read Grant Applications with OBE:

Early Literacy Sample

Summer Reading Program Sample

OBE in Ready to Read Grant Applications Coaching

 


Starting with OBE

Outcome-based evaluation starts with identifying a problem in the community. Imagine the change one would like to see in the community that addresses that problem. How would the program participants change in a way that demonstrates the change in the community? Think about how one would know if that change is happening: Is there a behavior to observe? A question people could answer? A skill they could demonstrate? Indicators are how one measures the change in knowledge, skills, attitudes, behaviors, and so on. Plan activities so that one can measure or observe those new skills, behaviors, or knowledge. For instance, one might design a program to increase reading comprehension by including an activity where kids tell other kids about books they recently read, then observe how well they do that. In longer projects one would want to be able to measure progress toward project goals so that the project can make any needed course corrections.

The Institute of Museum and Library Services has a very nice Frequently Asked Outcome-based Evaluation Questions document.


Outcomes vs. Outputs

In managing a project or service, one would want both outputs and outcomes; the two are not mutually exclusive. Outputs are quantitative, while outcomes are qualitative. A brief sketch contrasting the two follows the lists below.

Outputs:

  • Answers "How many?" (extent)
  • Are measures of volume (e.g. number of services provided) or evidence of service delivery (e.g. number of attendees)
  • Are the results of inputs (resources) and activities, programs or services
  • Are from the staff perspective
  • Are objectively quantified

Outcomes:

  • Answers "So what?" (effectiveness)
  • Are measures of impact or benefit to the end user, usually reported in amount of change in skills, knowledge, attitude, behavior or condition
  • Are also the results of inputs (resources) and activities, programs or services
  • Are from the participant's perspective
  • Are often quantified by participants' or others' perceptions (e.g. surveys)
  • Are best used in conjunction with output measurements
  • Present assumptions of cause and effect, not concrete scientific evidence
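To make the distinction concrete, here is a minimal Python sketch using invented data for a hypothetical summer reading program; the attendance figures and parent survey responses are made up for illustration, not drawn from any real program.

```python
# Contrast an output (volume, staff perspective) with an outcome
# (change in participants, participant perspective).
# All data below are invented for a hypothetical summer reading program.

# Output: evidence of service delivery.
attendance_per_session = [22, 25, 19, 30, 27, 24, 21, 26]
print(f"Output: {len(attendance_per_session)} sessions held, "
      f"{sum(attendance_per_session)} total attendees")

# Outcome: change reported from the participants' side, here via a
# hypothetical parent survey ("Did your child read as well at the end
# of summer as at the beginning?").
parent_responses = ["yes", "yes", "no", "yes", "yes",
                    "unsure", "yes", "no", "yes", "yes"]
maintained = sum(1 for r in parent_responses if r == "yes")
percent = 100 * maintained / len(parent_responses)
print(f"Outcome: {maintained} of {len(parent_responses)} children "
      f"({percent:.0f}%) maintained reading skills, per parent survey")
```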

 

 


Sample Outcome Statements

Remember that outcomes indicate changes in knowledge, skills, behaviors, attitudes, condition, or status.

Some sample outcome statements:

Samples from the 2013 Oregon OBE training, based on The True Story of the Three Little Pigs:

  • Children believe A. Wolf, not the pigs
  • A. Wolf is released from the PEN

Samples from typical types of library services or projects:

  • Library staff will provide faster, more accurate and more complete answers to reference questions
  • ESL program participants' English literacy will improve
  • Students will demonstrate the ability to find an article on their topic in a Gale database
  • Children will maintain their reading skills over the summer
  • Academic libraries will partner with one another to create new digital information resources

Logic Model

A handy tool in planning to use OBE is the logic model, a visual way to link your project's resources and activities to its anticipated results. Some of the typical elements in a logic model are: key influencers, outcomes, assumptions, inputs, outputs, indicators, data sources, data intervals, goals and targets.
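As a rough illustration only, and not an official worksheet format, the elements of a logic model can also be written down as a simple structured record. The Python sketch below uses hypothetical values for a summer reading project; the field names simply mirror the typical elements listed above.

```python
# One hypothetical row of a logic model, captured as a plain dictionary.
# Field names mirror the typical logic model elements; values are invented.
logic_model_row = {
    "outcome": "Children maintain their reading skills over the summer",
    "inputs": ["summer reading staff", "book collection", "reading logs"],
    "activities": ["weekly story times", "reading-log check-ins"],
    "outputs": ["number of sessions held", "number of logs turned in"],
    "indicator": ("# and % of children whose parents report the child read "
                  "at least 20 minutes a day during the summer"),
    "data_source": "brief parent survey when reading logs are turned in",
    "data_interval": "weeks 4 and 8 of the program",
    "target": "70% of participating children",
}

for element, value in logic_model_row.items():
    print(f"{element}: {value}")
```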

In the 2013 Oregon OBE training, the outcome being measured was to change people's attitudes so they would cheer for the wolf in the tale of the three little pigs. The sample shows how that outcome can be broken down and measured.

Sample of logic model for The True Story of the Three Little Pigs by Jon Scieszka

Blank logic model worksheet (MS Word)


Indicators

Indicators are the measurable characteristics or actions that show or imply that an outcome was achieved. Indicators are concrete, well-defined and observable, and usually countable.

For a digitization project, an indicator may be that the site attracts users back to it, which could be measured by a pop-up survey or perhaps with Google Analytics. For a summer reading program, the outcome may be that children do not lose reading ability. One indicator may be time spent reading, measured by a survey of parents; another may be reading comprehension, observed in a one-minute interview with children about books they enjoyed as they turn in a reading log.

A good pattern for an indicator statement is a format like this:

Number and/or percent of a target audience who report/demonstrate/exhibit an attitude/skill/knowledge/behavior/status in a specified quantity in a specified timeframe or circumstance.

From The True Story of the Three Little Pigs example:

# and % of children who report they feel sorry for the wolf on the EALRs (Early Assessments of Lobo Reading Sympathizers)

# and % of pigs who file new complaints against wolves
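To show how the "# and %" pattern turns into an actual figure, here is a minimal Python sketch that tallies hypothetical survey responses against the wolf-sympathy indicator above; the responses and wording are invented for illustration.

```python
# Tally hypothetical survey responses into a "# and %" indicator statement.
responses = [
    {"child": "A", "feels_sorry_for_wolf": True},
    {"child": "B", "feels_sorry_for_wolf": False},
    {"child": "C", "feels_sorry_for_wolf": True},
    {"child": "D", "feels_sorry_for_wolf": True},
]

count = sum(1 for r in responses if r["feels_sorry_for_wolf"])
total = len(responses)
percent = 100 * count / total

print(f"{count} of {total} children ({percent:.0f}%) report they feel "
      f"sorry for the wolf after the program")
```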

 


Data Sources

Indicators are measured by a variety of data sources. With longer projects, one would want to specify intervals at which to measure progress. The point of OBE is to learn from the target audience what works for them. Evaluation is most valuable when it helps make programs, services or activities more effective.

Quantitative data:

Sources include circulation records, head counts, the number of people answering a survey, the number of library cards issued, and the number of books read.

Qualitative data:

Sources include comments by library users, public officials, or students, and staff observations of program participants.

With OBE, one would capture some quantitative data, but focus on qualitative data to see if a program was effective.

Some possible data sources would be:

  • Testimonials
  • Photographs, Slides & Videos
  • Expert Opinion
  • Focus Groups
  • Observation
  • Individual Interviews
  • Surveys
  • Group Interviews

In the 2013 Oregon OBE training example, some data sources that would fit indicators would be:

  • Pig construction company records
  • Number of times fresh pork recipes are accessed at allrecipes.com
  • Police reports
  • Pigs' self-reports

Pros and cons of different data sources


Tutorials and Online Courses

There are many resources that can help explain OBE.

For a quick overview, see the OBE training materials presented in Oregon in 2013.

The Institute of Museum and Library Services funded a good online OBE course:

Shaping Outcomes: Making a Difference in Libraries and Museums

Telling the Story of Your Library’s Impact – a webinar by Julie Pepera, Gale/Cengage, presented May 22, 2014.

This one-hour webinar helps you demonstrate your library's value and impact by teaching you to gather the right kind of library stories. You can use these stories when reporting on the library to the news media, when asking local and state governments for additional funding, and when applying for grants.

Useful Documents:

United Way of America. Excerpts from Measuring Program Outcomes: A Practical Approach. United Way of America, 1996. Available at http://www.madisoncommunityfoundation.org/document.doc?id=324. (Accessed August 19, 2014)

W.K. Kellogg Foundation. W.K. Kellogg Foundation Logic Model Development Guide, 2004. Available as a PDF at http://www.wkkf.org/knowledge-center/resources/2006/02/wk-kellogg-foundation-logic-model-development-guide.aspx. (Accessed August 19, 2014)

Useful Websites:

Institute of Museum and Library Services

The Institute of Museum and Library Services has a variety of useful resources (Grant Applicants -> Outcome Based Evaluation). Under "Basics" there is the PDF "Perspectives on Outcome Based Evaluations for Libraries and Museums". (Accessed August 19, 2014)

Useful Videos:

Series of three short videos by United Way-Peel Region (Canada), with Dr. Andrew Taylor. (Accessed August 19, 2014)

These three videos are made for non-profits that have little technical expertise in evaluation but an interest in improving their capacity to measure outcomes. The videos don't include a lot of technical detail; they are intended as conversation starters that serve to demystify the evaluation process. The first video explains program outcomes. The second provides suggestions on how to ask good evaluation questions. The third clarifies the ways in which the right indicators can be used to build stronger evaluation plans.

Tutorial 1

Tutorial 2

Tutorial 3
