"A profession without reflective practitioners who are willing to learn about relevant research is a blinkered profession - one that's disconnected from best practices and best thinking, and one which, by default, often resorts to advocacy rather than evidence to survive." Ross Todd (2008).
She's a great library media specialist.
His library is always filled with enthusiastic students.
Her students always create great projects.
These are wonderful accolades about teacher librarians, but how do we know for a fact that these wonderful media specialists are effective? How do we know that the library media program impacts student performance? How can we demonstrate our worth to the school board in facts, not just anecdotes? We need evidence.
Evidence-based practice is the systematic process of documenting how a teacher librarian makes a difference in student learning.
Read Ross Todd's article (Apr. 2008): The Evidence-Based Manifesto (Access requires login). School Library Journal; 54(4).
A summary of "Where's the Evidence? Understanding the Impact of School Libraries," the 2007 Leadership Summit hosted by the periodical School Library Journal.
Is this realistic for library media specialists?
When classroom teachers and teacher librarians collaborate on activities to impact student learning, it makes sense to collect data about the effectiveness of those activities. Traditional student assessments such as test scores, rubrics, and checklists are the place to begin. However, many other types of evidence can be collected through the library media center, such as circulation statistics and learning logs.
Read Kramer, Pamela and Diekman, Linda (Feb. 2010). Evidence = Assessment = Advocacy (Access requires login). Teacher Librarian; 37(3), 27-30. The authors discuss working with teachers to gather data in the school, analyzing that evidence, and using the results to advocate for the school library program.
What are evidence-based strategies?
David Loertscher suggests three evidence-based strategies (Loertscher, 2003):
Ongoing Data Collection. Involves using various information-gathering techniques to monitor progress regularly. Data may be collected daily, weekly, monthly, or at other regular intervals. Techniques may include real-time tracking (e.g., hits at a website), periodic gathering (e.g., checking a blog), or project collecting (e.g., at the end of a project).
Also read a recent report of initial data gathered from Surveys of Indiana School Library Media Specialists 2003-2004 (Nov. 2004) (MS Word document will download to desktop). This study was an effort of the Association for Indiana Media Educators (AIME) in collaboration with the Indiana University School of Library and Information Science and the IU Public Opinion Laboratory at Indianapolis on the IUPUI campus.
Read Show Me the Evidence! Using Data in Support of Library Media Programs (May/June 2005) by M. J. Langhorne in Knowledge Quest; 33(5), 35-7. (Access requires login, download PDF document)
Evidence-based Practices. Involves measuring effectiveness as the event occurs such as counting the use of proper citations in student PowerPoint presentations a few days after a mini-lesson on citing sources.
Action Research Projects. Involves conducting a formal study addressing specific questions about program effectiveness.
Project Achievement: Brief Guide and Handouts (2003-05) (PDF document), a national initiative to collect and present evidence that links library media programs to student achievement. Although designed for a California workshop, these ideas could easily be applied to any program.
Read Building Evidence-based Practice Through Action Research by Violet H. Harada.
What is Project Achievement?
David Loertscher spearheaded a national initiative called Project Achievement focusing on the collection and presentation of evidence that connects library media programs to student achievement.
He recommends the use of Ripple Effect Measures:
"significant measures that are most likely to produce results in achievement and indicate maximum teacher collaboration and organization effectiveness. Because you have these data, a ripple effect occurs, like throwing a pebble in a pool, triggering many other organizational practices and policies." David Loertscher (2003)
Loertscher's four measurement areas include:
- reading
- collaboration
- information literacy
- technology
Evidence is collected from three perspectives, creating triangulation; that is, evidence is gathered from multiple vantage points (Loertscher & Woolls, 2003):
- learner level - student gains (e.g., achievement test scores, rubrics, portfolios, attitude scales, checklists, reflections)
- teaching unit level - lessons and learning (e.g., checklists, collaboration rubrics, evaluation forms, timelines, log sheets)
- organization level - library output (e.g., center statistics, hardware and software data)
Read Overwhelming Evidence by E. Oatman (Jan. 2006). School Library Journal; 52(1), 56-59 (Access requires login). The article presents information on a student research project and evidence collected at Gill Saint Bernard's School in Gladstone, NJ. The roles of teachers and librarians in the program are emphasized.
Two types of evidence are recommended (Loertscher, 2003):
- direct measures - "those so close to actual learning that confidence in an impact could be inferred"
- indirect measures - "provide evidence that actions set the stage for, provide an environment for, give support to, enable, help, give encouragement to, mark progress toward, make change in direct measures over time the probable stimulus"
Read National Project Achievement Brief: Guide and Handouts (2003) (pdf document). This document explains the initiative and provides the materials needed to participate in the collection of evidence.
Read about David V. Loertscher.
Loertscher is well known for his commitment to library media program evaluation. His book We Boost Achievement! Evidence-Based Practice for School Library Media Specialists contains useful information and resources on this topic.
Also read a brief article by David V. Loertscher and B. Woolls (Jun 2003): A True Assessment of Your Program’s Value. School Library Journal; 49(6), 3. The authors recommend that you base the success of your library media program on three sources of information.
"What is evidence-based practice and how can it work for you? First coined by the medical profession in the early 1990s, evidence-based practice can now be applied to what you do to show how and why your services are important to student learning. Start by familiarizing yourself with existing research on how school libraries can optimize learning. Then make sure you systematically focus on gathering meaningful evidence on the impact of your instructional role on student achievement."
Todd, Ross (April 2003). How to Prove You Boost Student Achievement. School Library Journal.
Loertscher and others have written about using direct and indirect measures as ways to collect evidence. If those terms are fuzzy or unknown to you, here is a brief explanation:
There are many ways to collect evidence, and assessments can be categorized as being either direct or indirect measures.
Direct measures are used most often. A direct measure assesses the thing you actually want to measure: a sample of actual student work, including exams, projects, demonstrations, performances, and other completed work. The strength of direct measurement is that you are capturing a sample of what was done, which can be very strong evidence. A possible weakness is that not everything can be demonstrated in a direct way, such as values, perceptions, feelings, and attitudes.
In contrast, an indirect measure assesses one thing by measuring something else. At first that may sound strange, but an indirect measure is based upon a report of perceived work. These reports can come from varied perspectives, such as students, teachers, staff members, and others. Indirect measures provide added information (information that cannot be directly measured) about what happened, what was completed, and what is valued by different constituencies.
Indirect measures are not as strong as direct measures, because one has to make assumptions about what exactly this type of self-reporting means. For example, if someone reports that a student has performed a specific task, how do we know that their assessment is accurate? The strength of indirect measurement is that it can assess certain implicit qualities such as values, feelings, perceptions, and attitudes from a variety of perspectives. The weakness of this approach is that in the absence of direct evidence, assumptions must be made about how well perceptions match the reality of actual achievement.
For added information, visit Direct Vs. Indirect Assessment Methods from Skidmore College.
Your principal has announced that each department must begin collecting evidence of the effectiveness of its program. You've been given David Loertscher's handout as a recommended place to start with the library media program. Read Project Achievement: Brief Guide and Handouts (PDF document, 2003).
You have two options:
Select one of the four areas described by Loertscher: reading, collaboration, information literacy, or technology. Identify aspects of the guidelines or samples that you think would be most worthwhile. Would you select an emphasis or make your project more general? Would you use the entire school or focus on a particular grade level or subject area? What direct and/or indirect measures would you use? Why?
Discuss how you would go about convincing your teachers to participate in this type of evidence-gathering activity. Your arguments should be aimed at the average overworked and underpaid teacher who may not be as enthusiastic as you are about this project.
Develop a new measurement area or a subset of one of the basic four (e.g., primary sources, current events). Use the Collecting the Data project for ideas, and apply ideas from the four areas provided as examples. Create a Library Media Center Program Ripple Effect Measures page for your topic, including LMC Agenda goals, Curriculum Agenda goals, pebbles to measure, justification, "demonstrate through research and practice that...", and a report.
Discuss why you think this approach would be effective with this topic.
Loertscher, David V. & Todd, Ross J. We Boost Achievement! Evidence-Based Practice for School Library Media Specialists. Salt Lake City, UT: Hi Willow Research & Publishing, 2003.
Kenney, Brian (Apr. 2006). Ross to the Rescue! School Library Journal; 52(4), 44-47.
An interview with Ross Todd of Rutgers University, director of research for the Center for International Scholarship in School Libraries. Discusses challenges facing school libraries and lessons raised by studies on the relationship between school libraries and student learning.
Logan, Deb (Jun 2010). Students + Evidence = Impact. DeBLOGan
Resources from Deb Logan's presentation at AASL in Denver.
Lance, Keith Curry (2001). Proof of the Power: Recent Research on the Impact of School Library Media Programs on the Academic Achievement of U.S. Public School Students. ERIC Digest.
Ripple Effect (2006). WikEd.
Todd, Ross (Apr. 2003). Irrefutable Evidence. School Library Journal; 49(4), 52.
The author suggests ways to boost student achievement.