Saturday, May 2, 2015

Blog #12: Challenges, Issues, Concerns, and Successes

I'm plugging along this week with my Pilot Study Report.  I'm currently writing up the summary of findings from the multiple choice questions about demographics.  I guess what I'm struggling most with is finding the balance between reporting my findings and analyzing them.  In such a small study, it is easy to report the data, but hard to say for sure why the data is what it is.  I am able, based on the responses, to provide a snapshot of a "typical" collaborator, but without comparative data from teachers who have not collaborated, it is difficult to say whether the traits I've identified are really what are making a difference.  Still, it is interesting to see what factors these teachers have in common and which don't seem to matter.  It's a good place to start.

Next up, I'll be coding the short essay responses, looking for patterns in how teachers responded to the questions about how they view the media specialist's roles, benefits and barriers to collaboration, and factors that would encourage more collaboration.  I think I'll do this by hand. NVivo seems a bit complicated, and I just don't know that the effort to use it is worth it in such a small study. Instead, I've printed the responses out and am marking the text for categories and sub-categories.  I'll see how that goes before I make a final decision about NVivo.
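Since I'm doing this by hand rather than in NVivo, the process is basically: read each response, match it against my working list of categories, and tag it. A minimal sketch of that tagging idea, using made-up responses and a hypothetical codebook (none of these are my actual survey answers or final category names):

```python
# Invented short-essay responses standing in for the real survey data.
responses = [
    "I just don't have time to plan lessons with the media specialist.",
    "She helped my students with research skills and finding sources.",
    "Our schedules never line up, so planning time is the real barrier.",
]

# Hypothetical codebook: category -> keywords that suggest it.
codebook = {
    "barrier:time": ["time", "schedule"],
    "role:instruction": ["research skills", "sources", "information literacy"],
}

def code_response(text, codebook):
    """Return the category codes whose keywords appear in the text."""
    lowered = text.lower()
    return [code for code, keywords in codebook.items()
            if any(kw in lowered for kw in keywords)]

for r in responses:
    print(code_response(r, codebook), "-", r[:40])
```

Of course, real hand-coding is more interpretive than keyword matching, which is exactly why I want to mark up the printouts myself before deciding whether software is worth it.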

Friday, April 24, 2015

Blog #11: Information Behavior in Professional Contexts

For my research project, I am looking at the information seeking behaviors of teachers.  Case categorizes school teachers as practitioners under the umbrella of social scientists and claims they make "little use of research findings" (p. 295).  From my experience, unfortunately, this is very true.

Outside of coursework for master's degrees, most practicing teachers seem to spend very little time looking at research in education or their subject area journals and even less time contributing to them.  There are exceptions, of course: I adapted one of my most successful and long-standing lessons from an article that I just happened to see in the English Journal.  Certainly it's not for lack of interest, but like most things, the barrier is time.  Between lesson planning, delivering, and assessing; clerical work; communicating with parents; dealing with special cases; monitoring the hallway, playground, and lunchroom, etc., etc.; teachers simply do not have much extra time to devote to reading or conducting real research.  And since there is no financial advantage or professional expectation to do so, it becomes a very low-priority task.

There is a clear and large divide between theoretical research and practice in education. So, how then do practicing teachers find out what they need to know? Professional learning communities provide the bulk of the information. Teacher-written websites, teacher-created district curriculum guides, and grade level or subject area PLC meetings can help teachers with lesson plans, teaching strategies, and rubrics. Taking time off to attend statewide or national conferences can also help spark new ideas.  Mostly, though, teachers have to rely on day-to-day creative planning, monitoring and adjusting in the classroom, and a healthy dose of self-reflection to develop a sound practice.

In terms of my progress this week, a couple more teachers completed my survey and I ended up with 21 responses, almost 50%.  I have created graphs and tables based on the quantitative responses via Google Forms, but have not yet officially started coding the longer essay responses.
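Google Forms builds the graphs and tables for me, but the underlying tally is simple enough to sketch. Here's a minimal example with invented answers to a hypothetical demographic question, plus the response-rate arithmetic (21 of the 44 surveys I sent out):

```python
from collections import Counter

# Invented responses to a hypothetical "years teaching" question.
answers = ["0-5 years", "11-20 years", "6-10 years", "11-20 years",
           "21+ years", "6-10 years", "11-20 years"]

counts = Counter(answers)
total = len(answers)
for choice, n in counts.most_common():
    print(f"{choice}: {n} ({n / total:.0%})")

# The actual response rate so far:
sent, returned = 44, 21
print(f"Response rate: {returned / sent:.0%}")  # 48%
```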

Friday, April 17, 2015

Blog #10: Successes

This week, I am mostly feeling success!  I feel good about my survey questions, I was able to enter them into a Google Form and figure out how to send it to the teachers identified by my media specialist, and I am now having fun watching the responses roll in.  As of this writing, I have 18 respondents.   Out of the 44 teachers to whom I sent the survey, that is a 40% response rate.   Pretty good, right?  I did set a deadline of next Wednesday, so I may get a few more trickling in early next week, but most likely by then many of the non-respondents will have forgotten about it.  Even if this is it, though, I feel like it is a good sample size for a pilot study.  I am very grateful to my colleagues for taking the time to help me out.

As I briefly glance over the results, I am feeling glad I asked a combination of multiple-choice and short essay questions.  It is so interesting to read the longer answers.  I am already noticing some preliminary patterns.  The word "time," for instance, appears far more frequently than any other word.  This is not surprising and seems to confirm the studies I cited in my lit review.  At the same time, it is surprising to see the diversity of responses.  How teachers perceive the role of the school media specialist is quite different depending upon the individual.  And answers are all over the board about whether our administration explicitly supports SLMS-teacher collaboration.  I never would have guessed this.
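That "time" observation is really just an informal word count. A rough sketch of the same check, using invented essay snippets rather than my actual survey data (the stopword list is ad hoc, just enough for the example):

```python
import re
from collections import Counter

# Invented snippets standing in for the real short-essay responses.
essays = [
    "Planning time is the biggest barrier for me.",
    "I would collaborate more if we had common planning time.",
    "Time, time, time. There is never enough of it.",
]

# Ad hoc stopwords so function words don't dominate the count.
stopwords = {"the", "is", "of", "for", "we", "i", "it", "a",
             "if", "had", "me", "there", "never"}

words = [w for text in essays
         for w in re.findall(r"[a-z]+", text.lower())
         if w not in stopwords]

print(Counter(words).most_common(3))
```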

Of course, my feelings of success will surely dissipate as I start to try to make real sense out of all of this data.  My next step, after the survey closes next Wednesday, will be to try to input my data in NVivo and start coding.  I know that will be a major learning curve for me, and next week I probably will be writing about challenges, issues, and concerns instead of successes. But for now, I'm feeling good!

Friday, April 10, 2015

Blog #9: Potential Pitfalls

One of the greatest pitfalls of sending a voluntary email questionnaire to teachers is simply the rate of return.  Being a busy teacher myself who doesn't often have time to check my email during my teaching day, let alone take a survey, I know that not all of my subjects, no matter how nicely I ask, will end up completing the survey.  This may result in a thin data set and threaten the credibility of my results.  One way to address this is to think carefully about timing.  I don't want to send the survey, for example, over the weekend or on a Friday afternoon.  Because we have an advisement period on Wednesdays and teachers might have some time during that hour, I plan on sending the survey out next Wednesday morning.  I have also considered offering cookies as a bribe for completing the survey.

It may also be difficult to generalize from the data what factors encourage SLMS-teacher collaboration.   My subjects are complex individuals who are unique in lots of ways (age, gender, length of career, past experiences, attitudes, etc.) while my survey, though reliable and consistent, is short, simple, and has a lot of multiple choice questions.  I may not get the depth of answers that will allow me to truly find out what is going on.  To combat this, I added the option of filling in "other" boxes on many of my questions, and did add a few short answer boxes as well.  I am hoping that the combination of multiple choice and short answer will result in the appropriate depth for a pilot study.

Another pitfall of qualitative studies is the transferability of the results to other situations.  Because I am doing a local study of one high school with unique characteristics, the findings may not be applicable to other high schools even within my own district.  In my discussion of findings, I will need to be sure to include demographic information about my school so others can determine if their situations are similar enough to consider the findings useful for them.

Saturday, March 28, 2015

Blog #8: Challenges, Issues, Concerns, Successes

After I presented my lit review to the class last week, I felt a bit worried that I didn't explicitly frame my topic as a study of "information seeking behaviors," and I started to panic that maybe it wasn't.  But, the more I think about it, that's exactly what it is.   I am trying to determine, in my pilot study, what individual and institutional factors influence teachers at my school to work with the school library media specialist.  The "information" in this case would be the media specialist's knowledge of resources, information literacy skills, technology tools, etc.  The "seeking" aspect of this study would be whether/to what extent/why teachers seek out or agree to collaboration with the SLMS as they design, deliver, and assess lessons.  As I revise my lit review, I am working on incorporating this more explicitly into an introduction to the topic.

I also have been working on my survey this week and have a draft of the questions I will ask.  I am planning to do a combination of open ended and multiple choice questions.  So far, I have about 15.  I think this is a good number, but I am still working on refining them.  Although I feel good about my progress with the questions, I haven't started to input them into a Google Form.  This will be the first time I've created one (besides a few practice exercises in my tech and learning course), so I am a bit concerned about how that will go.  All in all, though, I feel pretty good about my pilot study design.  I think I will wait to actually send out the survey until after class next week.  Of course, I have concerns about the next steps: mostly, how many of the 46 will actually complete my survey and will I be able to effectively analyze and see patterns in the data?  But, one step at a time!

Thursday, March 19, 2015

Blog #7: Challenges, Issues, Concerns, Successes

This week, I have been focusing on my literature review, which I must say I've found to be seriously depressing.  The literature reads like variations on the same theme: it would be great if we (SLMS and teachers) could work together more, but without serious culture changes, it's just not happening very often.

Although the literature shows a direct correlation between this type of collaboration and student achievement, the barriers are many and difficult to overcome.  Administrators and teachers don't often seem to understand the role of the SLMS or have the time to commit to developing a collaborative partnership, media specialists themselves often feel overwhelmed with all they have to do, and schools are increasingly cutting library support staff and even media specialists themselves.  Yet the stakes, especially now with the adoption of Common Core, have never been higher.  Some of the studies I cite do show that with a supportive administration, dedicated time for structured planning, and the right partnership, collaboration can happen and be very effective, but getting all of those elements in place is rare.

In terms of my own study, I am curious to see not whether there are barriers at my own school (I know there are, as the same ones seem to exist in most schools), but rather how the teachers who do collaborate were able/willing to overcome them.  I've decided to survey just the 46 teachers who do collaborate with our SLMS to get a sense of the "type" of teacher who collaborates in our building and the factors that encouraged them to do so.  I'll start to develop questions this week and will probably focus on the following aspects:

  • What they were taught/not taught about collaboration in teacher ed
  • If they had worked with a media specialist in a previous building
  • Their teaching style/personality
  • Their understanding of the roles of an SLMS
  • The degree to which they feel administration supports this collaboration
  • Professional development experience
  • "Marketing" by SLMS or others who collaborated with her in the past


Saturday, March 14, 2015

Blog #6: Decisions and Data Analysis

As I am working on my lit review, several themes are emerging:
a.  Media specialist-teacher collaboration is connected to student achievement.
b.  Media specialists, teachers, and administrators all want this collaboration to happen.
c.  It is often not happening because of several common barriers:  lack of administrative support, time and communication constraints, and the independent nature of teaching culture.

For my pilot study, I would like to explore to what extent these factors are influencing this type of collaboration in my own high school, and if there are others.  To start my project, I gave my media specialist a list of all teaching staff and asked her to highlight those with whom she had "collaborated on a lesson."  Out of 130 staff, she highlighted 46, which is roughly 35%.

At first glance, this percentage seemed low to me.  However, when I conducted a brief interview with the media specialist, she reported that she was happy with this percentage and that she didn't believe she could fit work with many more teachers into her busy days.  I haven't found any concrete data in the literature to compare this percentage to, so I'm not sure what to think.  One article recommended that media specialists spend 50% of their time in the role of instructional partner.  I could ask my media specialist about this.  Others conceded that it is impossible to work with many teachers (particularly in a large high school), and that what is more important is the quality of the co-teaching.  Maybe I don't have to make any judgments about whether media specialist-teacher collaboration should be increased in my building anyway, as I am just focusing on why it is and isn't happening.

The next step for me is making a final decision about my sampling for my survey.  I definitely want to stratify so I don't end up with all non-instructional partners, but I am still debating between the following options:

a.  Survey all 46 instructional partners only.  Focus just on why they did collaborate. This should result in data about how/why partner teachers were able to overcome the typical barriers.

b.  Survey half instructional partners and half non-instructional partners.  Ask questions about why they did or didn't collaborate.  This should result in data about partners and non-partners equally.

c.  Survey 35% instructional partners and 65% non-instructional partners to more accurately represent the entire staff.  This may tip the survey toward focusing on reasons collaboration isn't happening.

I would love any feedback on these sampling options!
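To make option (c) concrete, here is a minimal sketch of a stratified draw with hypothetical name lists (the sample size of 40 is invented for illustration; only the 46/130 split comes from my actual numbers):

```python
import random

# Hypothetical rosters: 46 partners, 84 non-partners (130 total staff).
partners = [f"partner_{i}" for i in range(46)]
non_partners = [f"non_partner_{i}" for i in range(84)]

sample_size = 40                              # hypothetical total sample
n_partners = round(sample_size * 46 / 130)    # ~35% of the sample
n_non = sample_size - n_partners

random.seed(0)  # reproducible draw for this sketch
# random.sample draws without replacement within each stratum.
sample = random.sample(partners, n_partners) + random.sample(non_partners, n_non)
print(len(sample), "teachers:", n_partners, "partners,", n_non, "non-partners")
# prints "40 teachers: 14 partners, 26 non-partners"
```

Option (a) would just be the whole `partners` list with no sampling at all, which is part of its appeal.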

Whichever sampling method I choose, I will likely create my survey on a Google Form.  I have limited knowledge of Google Forms and will have to spend some time getting to know how it all works.  Once the surveys are collected, I will likely have to read Richards and Morse again.  I understand the idea of grouping data into categories and then conceptualizing what it all means, but I have zero experience actually coding data and have never used coding software.  I'm sure I have a big learning curve ahead of me.