February Office Hours: Facilitating Classroom Review of Open Textbooks

Office Hours

Watch the video recording of this Office Hours session, or keep reading for a full transcript. The chat transcript is also available, for those interested in reading the conversation that took place amongst participants and seeing resources shared.


Note: If anyone would prefer to not be associated with their comments in either of these transcripts, please contact Apurva as soon as possible and we will remove any names or other identifying information.

Audio Transcript

Speakers: 

  • Ian McDermott, CUNY
  • Jared Robinson, McGill
  • Andre Mount, SUNY
  • Karen Lauritsen, OTN
  • Zoe Wake Hyde, Rebus Community
  • Leigh Kinch-Pedrosa, Rebus Community

Zoe: Hi, everyone and welcome to Office Hours for this month. We are very pleased to have you with us, as always. And we are diving into another very fun topic, continuing on a little bit from our conversation last month, where we started to think critically about peer review and what that can look like when we think about it differently for open textbooks. And this time we’re going to be talking about a different kind of review, which is classroom review. 

That’s sometimes also referred to as beta testing and there are lots of different facets to it that we’re going to explore today. As always, we are joined by OTN in partnership in running these calls. I should say I’m Zoe from Rebus Community, for those of you who don’t know me. And I will hand over to my lovely co-host, Karen, to introduce our speakers today. 

Karen: Thank you, Zoe. I am Karen from the Open Textbook Network, and as Zoe said, we’re going to talk about classroom review today. And we are joined by three guests, first Ian McDermott who is coordinator of library instruction and associate professor at LaGuardia Community College at the City University of New York, otherwise known as CUNY. Jared Robinson, research associate at McGill University, and former primary researcher for the Open Education Group. 

And also, Andre Mount, who’s associate professor of music theory at the Crane School of Music at the State University of New York, known as SUNY, Potsdam. So, we’re going to hear from each of our three guests for around five minutes. You are welcome to put any notes or thoughts or questions in the chat. But we will hold those until all of our guests have had a moment to talk about their work in classroom review.

And then, we will turn things over to all of you to drive this conversation. And to get things started, I’m going to turn things over to Jared Robinson. 

Jared: Hey, thanks for that Karen, I appreciate it. So, I’m Jared Robinson, and I was first introduced to open when I was doing my dissertation and my doctoral research work back at the Open Education Group at Brigham Young University and my advisor was David Wiley. And so, when we were doing work together, we were thinking about open educational resource quality and specifically textbook quality. 

And we thought about it on four dimensions: we did research that looked at the costs, the outcomes, the use, that is, how we use textbooks, and how students and teachers perceive the textbooks. And I think peer review of open materials really connects with how we use and how we perceive the materials. And when we start to use the materials differently, this can affect outcomes as well, which I think is really, really exciting.

After finishing my PhD, I moved over into state government, where I worked for the Michigan Department of Education. And one of the fun experiences that I had there was to be part of the advisory committee for our state’s #GoOpen work as well. And then, we recently relocated to Montreal, where I work at McGill University and my wife is an assistant professor here at McGill University as well.

One of the interesting experiences that I’ll use to frame how I think about this was in a project management course I took in graduate school from David Wiley, I don’t know if people on the call know David, but he’s a good friend. And he taught project management in an interesting way. And he took a project management open textbook for the construction industry and the project for our class was to convert the construction industry project management open textbook into an instructional design for learning open textbook. 

And so, for us, this put us in a really unique position as learners right from the beginning, where we were expected to become expert enough in our content that we could adapt project management to a completely new context. And that was a really transformative learning experience, one that ended with a product that hopefully we could share out and that would be useful at the end of the day, which we ended up doing.

So, when I think about peer review of open materials, that’s one of the experiences that I think about. And then, a second experience is with the group that I work with at McGill University right now, which is actually in the faculty of dentistry, training dentists. And then, also training dental researchers. And so, Mary Ellen Macdonald, who’s a professor that I work very closely with, Dr. Macdonald with her graduate students engages in group peer review of academic articles.

And one of the things that academics do is peer review the research and academic science of their peers, and that helps to create quality assurance. And it can be pretty intimidating to jump into thinking about quality. And so, she does this as a guided practice with scaffolding, with the opportunity to do it together in dialogue in the cohort. And this ends up being a really meaningful experience. Karen, thanks for posting that book, too.

Anyway, these are just a couple of the different experiences that I think about when I think about that. And I wanted to just use those to introduce myself a little bit and my work. 

Karen: Thanks, Jared. Speaking of scaffolding, we are now going to hear from Ian McDermott, who is coordinator of library instruction, associate professor at LaGuardia Community College, I already said that, but for some reason I just wanted to say your whole title again, Ian. I’m going to turn things over to you now. You are still muted. 

Ian: All right, I think I am unmuted. 

Karen: Yes. 

Ian: Great. Yeah, thanks so much for the invite, happy to be talking to everybody today. I want to start actually, I’ll just make a very quick plug, I just had an article published yesterday in “In the Library with the Lead Pipe”, that’s a critical evaluation of OER efficacy studies, which is adjacent to what we’re talking about today. But it certainly provides some context for how I approach OER and how my colleagues and I approach it from more of a critical pedagogy perspective.

But what I wanted to focus on with my brief remarks is a seminar that I ran with my colleagues, Christopher McHale and Steven Ovadia, at LaGuardia Community College in spring of 2018. We taught students textbook evaluation skills, based on the Association of College and Research Libraries (ACRL) Framework for Information Literacy. And then, what we did is comparative evaluations between commercial and OER textbooks for three gateway STEM classes taught at LaGuardia.

That was a bio class, algebra, and anatomy and physiology. And so, the goal for our seminar with these students wasn’t so much to gather and analyze data as it was to test and develop survey instruments. So, we tested all of these instruments out, and then packaged them together into a toolkit that we’ve now made available as an OER through CUNY Academic Works, which is our institutional repository.

It’s also up on GitHub. So, it’s available for everybody to use, however they see fit. So, we were really trying to get at these different survey and evaluation methods with the students, and just see what might work. So, I’ll just briefly run through some of the survey instruments, and evaluation techniques that we used. We did pre- and post-surveys which were designed mostly to collect data about the participants. 

Also, I just want to very quickly clarify that the students who participated in the seminar were paid. So, this was part of a state grant from New York State. And we wanted the students to be paid as interns, as opposed to it being a for-credit class, just so everyone knows how students were involved. We put out a call, we interviewed them, and then they were selected to participate.

So, the pre- and post-survey was really just to gather data about them, but also address their academic experiences, their study habits and their textbook preferences. And the questions for this survey were primarily adapted from the CUNY zero textbook cost student experience survey, which there’s been at least one article published from data gathered by that CUNY wide survey. 

I think the lead authors are Professor Shawna M. Brandle and Stacy Katz. We also looked at the Community College Survey of Student Engagement and the collegiate student assessment of textbooks. And so, we were using these as inspiration slash template. So, we were sometimes just using the questions directly and other times adapting them for our own particular audience.

Then, we had a mix of quantitative and qualitative textbook assessment surveys. On the quantitative side, we had approximately a 25-question survey that used an eight-point Likert scale, from zero to seven. And we used questions from the Textbook Assessment and Usage Scale, which was developed by Gurung and Martin and then Gurung and Landrum in two different publications.

And we administered these throughout the seminar, and the way we did it is so that in addition to these three gateway classes we basically gave the students a chapter from each type of textbook, commercial and OER. And they were put into groups, so like one group would be working on biology, anatomy and physiology, and algebra. And so, they worked in cohorts. 

So, they were taking these surveys, the qualitative and quantitative, independent of each other in these three-week groupings. And then, what they would do is they would come back, so they would do some remote work, and then they would come back into the seminar for focus groups and in-depth discussions that we ran with them. So, since I’m running out of time, I’ll just say that in addition to the surveys, we also had them write final reviews that were based on the OTN model that you use, mashed up with an Amazon review.

So, ultimately, we were trying to get students to see how they were responding in all of these different survey instruments. But most importantly, how were they reacting to using these different instruments? Did they like doing qualitative versus quantitative? Did they enjoy writing reviews or participating in focus groups? And as you can imagine, that created a lot more data than we could really crunch in a timely manner. 

So, we’ve really been focused on packaging the assessment tools and getting them out there for others to use, while we very slowly go through some of the data and begin to write it up. So, I’m happy to answer more questions about that in the Q&A portion. 

Karen: Great. Thank you, Ian. And now, Andre, we will hand things over to you. 

Andre: Hi. Thanks, Karen, for the introduction. I am I think relatively new to OER, but I’m excited to be a part of this conversation since I’m hoping it’s going to help shape how I can collect some review information from my students. Let me tell you a little bit about my project. I have recently completed writing an OER music theory textbook. To give you a little background information on the project, this started about a decade ago. 

When I was in grad school, I was hired to write a series of texts for some online remedial modules for transfer students in our department. The idea was that students would take a diagnostic exam and then, we would give them topics to review. Eventually, the project sort of fizzled, we had a couple of bad hires with web developers and lost funding. But years later, I realized that we had enough material for a textbook and so started going about organising this, filling in the gaps and just reconfiguring everything. 

So, a couple of years ago I started working with Allison Brown, who is lurking out there right now, who is the digital publishing services manager at Open SUNY Textbooks. And over the last couple of years, we’ve gone through the peer review process, assembled a small team of reviewers to look at everything from the entire book to just the examples and the exercises that are in the textbook, which was a great, super helpful part of the project. 

The next phase was to get it out to some proofreaders, but I was eager to get it into the classroom, to avoid another year of my students having to pay for a textbook. And so, I decided to take it out for a test run, and that’s what I’ve been doing this year. Given that this was concurrent with the proofreading phase of the project, despite having read through the whole thing, 800 or so pages, out loud to myself several times, I was a little wary because I knew there were still going to be some typos in there.

And my concern was running the risk of students encountering these typos in the textbook, maybe losing faith in the textbook, and then maybe even losing faith in me as an instructor by extension of that. But I decided it was worth it to go ahead with this, and so, I had to come up with a strategy for addressing that. And the strategy that I came up with was, rather than letting them just encounter any problems with the book as they were reading it, to call their attention to that up front.

To say from the get-go that they may encounter these things, and during the first couple of days of class, I talked to them a little bit about the OER development process. And tried to sell them on the idea of joining in this noble cause themselves and being a part of the whole thing. And also, incentivising it a little bit by offering some extra credit points on exams if they found typos.

And so, the way it ended up playing out was when students found errors in the textbook, either typos or broken images or audio, they would report it to me. And I would keep a running list going on our LMS, so that they could see where these typos were. I ended up actually changing them in the text but leaving the original problematic part there, just with a red strikethrough so that I’m not making any changes in the middle of a semester. 

And so far, most of the feedback I’ve been gathering has been this kind of stuff, and informal conversations. I wanted to wait until the students had been with the text for a whole year, a whole academic year, before trying to get their impressions, and that’s where I’m at right now. Trying to formulate how, and what questions, I’m going to be asking them in a more formal survey at the end of the semester.

The reason I told you about the origins of the book was because one of my big concerns is that I want it to be a cohesive text, but it started off as a series of disconnected lessons. And I’m hoping that having read through the book and using it in the classroom the way they did, that the students will have a pretty good perspective on whether or not it works as a textbook, as a cohesive whole. 

So, that’s where I’m at right now. And like I said, I’m excited to be a part of this conversation, because this is going to help me shape my survey of the students at the end of the semester. 

Karen: Great, thank you, Andre. Well, we have heard from our three guests, so this is the point when we turn things over to all of you and invite your comments and questions, you’re welcome to unmute or use chat, whichever you prefer. While you think of those questions, Ian, I’m wondering if you have any early findings from that trove of data pointing to how students did experience those different qualitative and quantitative surveys and other tools that you used. Any previews you can give us? Wait, you’re still muted. 

Ian: Thanks, so a few things that I would share, I guess I would share some unexpected findings. And again, this is one group of students, 18 students, so it’s a small sample size but nonetheless, I think it can be illuminating. One thing that we were very surprised to find is how much the students cared about design in the textbooks, down to like font size. For example, we were using, I believe it was the OpenStax bio book, which has been adapted by the University of British Columbia.

And so, just as an interesting test, we used both of those, even though they were very similar, but there were some changes in the Canadian version. Through that adoption process there were some subtle changes to the design that the students were kind of frustrated by. And I think it was the font size, and some pages’ content had been changed, so like half of the page would be empty.

And so, I think that there were maybe some perceptions of why is half of a page wasted? What is this book? I think it just raised some red flags for them that were a little surprising as far as perceptions of the quality are concerned. So, that was one thing that was surprising, and the other was that they probably weren’t nearly as militant as I am about wanting there to be no cost for textbooks.

I think some of them are just so accustomed to paying so much for their textbooks, that their idea of what a fair price is for a textbook was actually $45 up to $75. They often perceive that to be like somewhat fair, I think based on their previous experiences. They weren’t just like, come on in with free or nothing, which was a little surprising to us. But as we got to talk to them more some of it, like I said, was their past experience. 

But some of it I think was they’re very thoughtful, reasonable students, who understand that there is cost in producing textbooks. And that was something we spoke with them a lot about, was the academic labor that goes into this, whether it’s a commercial publisher or a university or a non-profit or philanthropic organisation. That’s why we wanted to pay them for their participation. So, we tried to bring those labor practices in and so they were very attuned to those issues. 

Karen: Thanks, Ian. Two good takeaways so far, about design and cost flexibility. Andre, you mentioned that you’re getting to the stage of asking your students about their experience more holistically with the text. Do you have a starter list of questions, things that you know you want to ask?

Andre: Yeah, I’m starting to devise that, it’s kind of a busy semester, so that’s on the back burner. But like I said, a lot of it I think is just going to have to do with the cohesiveness of the whole thing. But then, there’s some other stuff. Prior to taking it to the classroom, we had one reviewer who was looking specifically at the activities, the little exercises that we have embedded in the text.

And they had a lot of very useful feedback on that, but even so, this was another music theory instructor, and I think it’s going to take the students’ actual experience working through those exercises to really find out how effective they were. I’m interested to see if they did the exercises; they are just self-graded things, and they’re in little dropdown expandable divs on the screen.

And I think it’s really easy to just scroll right past them, despite my encouraging them to do this kind of thing. And so, I’m curious to find out, and hopefully they’ll be honest with me, how many of them actually did the exercises, if they felt like they were relevant to the text. Because I think they’re the only ones that are going to know, they’re the only ones that are going to be able to give me that information. 

So, that’s what most of my concern is. I’m not expecting any major changes suggested by the students. And I think at this point I wouldn’t be ready to make any major changes, I would want to do a public release first, but to begin collecting some of that information. There haven’t been any major alarms so far using the book and so I’m not anticipating anything dramatic. But I am thinking about that possibility and how to organise that data once I get it from the students. 

Karen: And you hinted a bit at your fear of students losing faith if they were to find typos and so you engaged them in that hunt. Have you heard any or seen any evidence of that fear being realised? Or do you think that that was more imagined than real?

Andre: I think my plan worked, because they do seem very enthusiastic about just the idea of OER. And really, really enthusiastic to be a part of it and to contribute to it as well. And so, I haven’t seen any of my worst fears realised, so far anyway. Fingers are still crossed. 

Karen: You know, it’s interesting to note between your story and Ian’s research, students who are more removed from the creation of the OER, who are evaluating it, but not necessarily as a first person student, might be more critical of the design or why does it look like this or what happened during adaptation? Whereas with that relationship in the classroom with the instructor and being invited to contribute to the creation of it, it seems like it’s a different experience. And they’re like, “Hey we all make typos, it’s all right.”

Andre: Yeah, and I think just the distraction of being in the class, their primary concern is doing well in the class. And so, they don’t have that level of removal from it to really be critical in the same way, I would imagine. 

Karen: Right, which gets to your other age-old question of whether it counts, and if it doesn’t count, whether students are actually doing those exercises for their benefit or not, if they’re optional. So, I’m interested to follow up and hear what kind of completion rate those exercises may have had. So, Matt, I see that you turned your camera on, do you want to go ahead and ask the question that you posted in the chat?

Or not, no, okay, you’re in a loud place maybe where they serve breakfast. Okay. Got it. So, Matt’s question in the chat: does anyone have suggestions about students leading their own evaluation of a textbook? I’m evaluating my research methods textbook, so designing focus group questions is part of our learning process, anyway. I’m less clear on how to facilitate that process or work with them on developing a focus group script, especially since it’s my book.

I may not be the most objective to lead those groups, since I’m also grading the students. What kind of thoughts do any of you have for Matt? 

Jared: Yeah, I’ll jump in. I’m not teaching students right now, so I’m speaking from a just brainstorming perspective or conceptual perspective. My background is research methodology and there’s not a right way to present that information. And I think, Matthew, you know really well that there are myriad ways to think about these kinds of things. And so, one of the things that I think is really helpful is that students are capable of leading that work.

But give them concrete things that you want to see, so the ideas that I would think about like instructionally are breaking this textbook up into pieces, and then assigning groups to actually make recommendations to improve each piece. And just expect that each piece could be improved, or that you might not take recommendations. But I think this is important for a couple of reasons, to give students the opportunity to think about how to improve and build on this. 

The first is, when we adopt an open textbook, I think we acknowledge tacitly that the value proposition of the textbook is changing. That information is so widely available, and we can learn from so many different sources, that it’s hard for us to justify asking students to pay $80 or $180 for a textbook. Information isn’t shared that way outside of academic settings very often.

And I think we also acknowledge then, that the value of them sitting in a class and getting that information from us is changing, too. So, in addition to just giving content, we have to be giving skills that are important and that are going to be important to them as well. And the openness of the textbook gives us an opportunity to really take a skill approach and invite them into a professional moment, where they actually exert professional skills and abilities on that. 

So, the kind of proposition that you’re asking about, Matthew, is that exact thing that we can ask students to do. And if I think about 21st-century skills that are really important, evaluating information is at the top of my personal list; misinformation is such a problem. And so, giving students in any discipline, any field, any content the ability to ask, “How is this information presented? Could we do it better? Is there other information that triangulates that?”

That kind of thought pattern is increasingly valuable at the same time that the text itself is not as valuable as it used to be. So, I think that that’s all extremely worthwhile.

Zoe: Yeah, I love that thought around skills that you have, Jared. And that came up for me as soon as you were talking about the students were undertaking a form of peer review within their classroom. So, there was a conversation just recently on Twitter about a lot of grad students who then move into academic fields who aren’t trained in a lot of really key skills, like knowing how to do peer review. 

And I think peer review groups are a really great way to start engaging with that. I think bringing the focus of classroom review into what are the students learning out of it, what is the value of this? Because there’s immediate value to the person writing the textbook, right? You’re going to get a lot out of it, a bunch of feedback, you’re going to make it a stronger resource, that’s going to have a better impact on students who use it in future. 

But that idea of shaping it around very specifically what skills are we wanting to teach them? What should they be practising? What should they be learning as part of this process is super important. 

Karen: So, Leigh has a question for all of us to imagine, and, Leigh, I may need to ask for your clarification. In the chat, you suggested that we consider what would happen if we held up classroom review the way we hold up peer review. And I’m not sure if by hold up you mean for scrutiny, in terms of trying to standardise or examine? I just want to make sure that I’m getting this right.

Leigh: I think I mean [echo]… Sorry, Zoe and I are in the same room. (Laughter) I think what I mean by held up is give credence to and hold as important and vital to the publishing process. 

Karen: I see. Thank you for clarifying. So, how would things look different for us if classroom review were heralded or held up or really lauded as an important part of the textbook production process, how might that shift things?

Andre: I can talk to that a little bit. I think it’s a different part of it. For the peer review that we had done, these are other experts in the subject matter and so, I’m taking their perspective in a certain way. The students, their input on my project anyway is of a different nature. It’s a more I think practical nature, based on their own experiences. And so, for me, for my project, it’s really got to be a combination of the two. And so, in a way they’re both as important as the other. 

Ian: Yeah, and I think that there’s a spectrum of how this could work, too. Maybe you just want feedback from students. Like, one of the outcomes we had initially hoped for through our seminar was to create some kind of plugin, perhaps, that would work with the CUNY Academic Commons, which is like WordPress, open software that people are teaching with, not exclusively, far from it.

But more and more people are teaching with it, and that plugin could be used for evaluating a textbook at the end of the semester, for example, on the kind of OTN review model. But I think on the other hand, if you wanted to use more open pedagogy techniques, it can just become a more involved process where you’re not just getting feedback, but you’re continually revising and creating the textbook with your students over the course of the semester, like that’s the purpose.

It could just be this ongoing iterative process, where the creation of the textbook is part of the class itself, as opposed to traditional feedback, revisions, re-release for the next semester. I’m not saying one’s better than the other, but just that I think it all depends on how you want to approach it, as far as how much we want to hold it up like peer review. Do we want them to be active participants in the creation of the book? Or do we want their feedback so that we can improve upon it ourselves?

Karen: Yeah, and Rebel posted something in the chat related to that peer review comparison. And whether or not you guys are thinking about the trade-offs of blind review, or at what stage there may be a blind review for what particular type of feedback. And she raises the concern that it might be difficult as a student to tell your professor this book is disjointed or it’s not working for me.

Or I’m really struggling with it, if they know that obviously you have power in that relationship with grading and their future. So, I invite your comments on that. 

Andre: I think there’s an element of that when I ask my students for feedback, I tell them that if they think the book sucks, I want to know that it sucks, and I want to know why it sucks. And so, I think that this has been a part of my project at least since deciding to go in the OER direction that it ended up being a much more collaborative thing. When we found our peer reviewers, my initial thought was that we would do blind review, because my understanding at the time was that was the kind of standard for books in my field, anyway. 

And then, in discussions with Allison we decided to not do blind review, and I think that was a really good move, because it made it a more collaborative process, which it hadn’t previously been. That wasn’t really on my radar before, but I think the book benefitted from it a lot. And so, I took that same kind of perspective in approaching my students about this.

And so, whether or not it’s blind for me, anyway, is irrelevant, because I want to know the good, I want to know the bad. And so, I think that element is removed from it, when it’s a more collaborative process like this. 

Jared: Yeah, although it’s worth mentioning, too, that a lot of that’s idiosyncratic to the power dynamic that’s just in a specific classroom, right? Because I’ve been in situations where we were invited to review a professor’s work and that ended up being uncomfortable or hostile, and I don’t even know if the professor knew it. It’s possible for us to conduct this on our own work in a way that’s inviting and open, where people feel confident.

But we should be realistic that this might not always be the case. It’s a really good consideration. You might be able to do things with someone else’s textbook; if you’re bringing in an open textbook that you didn’t create, you might have opportunities to use peer review pedagogically in ways that would be threatening if it were your own work. And being empathetic to the student experience is really important.

Karen: Listening to Jared, I imagined a situation where a colleague implemented another colleague’s textbook in the classroom. And then, provided that feedback, they’re sort of the conduit, they’re one step removed. And then, there’s the potential for awkwardness in that relationship. So, it’s hard to get around it, no matter where you are. But at least the power differential is perhaps less pronounced.

Matt, following up on Leigh’s question which got us started down this road, asks: has anyone ever seen a classroom review index, similar to a peer review index, in an open textbook? I can’t think of any. Anyone out there? Our guests and beyond.

Zoe: I’m going to go see if I can find an example. I think one thing that we’ve spoken a lot about with projects at Rebus is including a peer review statement and, alongside that, a classroom review statement explaining the process it’s been through. I’m trying to think whether many of those are actually released and available. So, I’ll go looking for one. But that’s something that we’ve tried to put into the process that projects we work with go through: thinking about them on the same keel but with very different purposes. 

And I think this might speak a little bit to your question, Jonathan, about the difference between peer review and classroom review. Peer review, particularly for scholarly work (it may be a little different for textbooks), is about finding truth. We think of it as the validation of the information, the expertise, the expert review of it. And then, classroom review is a space that is very dedicated to thinking about okay, how does that information actually work in practice? 

And so, peer review can get to that, too. I think especially in open textbooks, with the kind of collaborative process that Andre’s thinking of, you know that all of those people were also interested in how this was going to impact students. It’s not out of the equation. And I think what classroom review offers is the space to really do that very deliberately. So you’ve been through the, yeah, okay, we know the information in here is correct, it’s as good as we can get it right now. 

How does it actually play out in a classroom setting? And yeah, as I say, it is important for us that we think of those as both being important, because you can have the best information in the world, it can be completely accurate, and if students zone out while reading it and don’t do any of the activities and don’t get anything out of it, then it isn’t actually achieving its goal as a text or as an object. 

So, I think that slightly speaks to what Jonathan asked, but I would open it up to anybody else who wants to talk about those differences between peer review and classroom review and how do you think about it. [Silence]

Karen: All right, I’m going to read Matt’s comment in the chat, because—

Zoe: I think I’m not good at asking questions, because I just talk and then, I’m like, “Somebody else talk about this.” Let’s hand back to Karen. 

Karen: So, just to come back around to this idea of working with students and whether or not they may feel inhibited in offering feedback about a book created by their instructor. One of the ways that Matt has dealt with it is by bringing students onto the research team to lead the evaluation process and co-create the questions. So, the focus group is facilitated by a student rather than the professor, and by getting IRB approval and consent forms that indicate the data won’t be shared with the professor until after the class. 

So, putting some safeguards in place, or making the whole process very transparent, or semi-transparent based on that head shake, has been helpful for Matt. So, Jonathan, I’m going to take a look at your comment here. Another aspect is purely the positive relationship building. For example, I’ve done the bug bounty approach, extra credit points for typos found in my OER during the first semester of deployment, much like Andre talked about. 

And I’ve never had such careful reading of the textbook as when looking for these bugs or typos. A treasure hunt, if you will. I’m thinking of introducing new typos, even in a stable version being reused in class, just to get that effect again. So, maybe we’re stumbling onto a very helpful pedagogical device here, which is the insertion of typos. 

Jared: And maybe a research question too: the stance you take, whether you’re doing a typo review or just trying to improve the text, primes students in how they engage, how they’re motivated to engage, and then, ultimately, what they’re gleaning from that. So, it might be useful to think about different ways to prime and then do cognitive interviews with students. What was your experience when you were reading with typos in mind? 

How did that affect your understanding of the concepts? Was it really helpful? Did it get in the way? How else could I prime this? Because I think that’s one of the benefits of this: if students are reading with purpose, we have the opportunity to give purpose. My background is as an English teacher, I taught high school English for six years before jumping back to the academic side of things. 

And trying to help my students approach a text with purpose was most of the work of being an English teacher. So, I am really interested in that idea, I don’t have any good answers about it, though. 

Ian: One thing that we did in the seminar right in the beginning that we found extremely useful was to basically cover the anatomy of a textbook, because a lot of students may not know all of the component parts of the book or what purpose they serve. What’s a glossary? What’s the index? And I know that there’s a handout, and maybe even some slides from that unit that we taught in the seminar, in our toolkit. 

But we found that to be incredibly useful as a way to start the seminar, because really their analysis skills are really sharp, period. But I think that getting some of that background information helps them understand: this is how we’re looking at these books. This is how they’re traditionally constructed. I just find, working as a librarian at a community college, that we can’t assume any prior knowledge. 

That’s not to discredit students so much as, at least where we are in New York City, they’re coming from such diverse backgrounds and academic experiences in their lives that just doing that kind of fundamental review is very helpful for everybody. It just helps clarify what we’re looking for when it comes to a review or an evaluation of the textbook. And then, one other thing I just wanted to briefly touch on that came up a couple of minutes ago. 

The IRB and getting students involved, I think that’s a really great idea, too. And when we’ve presented at conferences on this toolkit, we’ve typically brought at least one of the students who participated in the seminar with us to talk about the toolkit. And that’s been a really great way to build rapport even more and just get them some lines on their CV early on. One of them wound up getting a really amazing Cooke scholarship for community college students. 

It’s a really great scholarship, and he’s moved on to Haverford. So, it was amazing to see that trajectory. And I know one of my co-collaborators wrote one of his letters of recommendation for that scholarship. So, we really try to forge relationships with them, when possible. 

Karen: Very rewarding to do so, that’s why most of us are here. 

Zoe: I love what you bring up, Ian, as well about meeting students where they are and doing a bit of education before embarking on this. And I think this connects back to what Rebel raised and I think it’s really important that as this work is being undertaken, there’s an understanding that students all bring different experiences and levels of knowledge about these things. 

And I think what Andre did in terms of starting off by being really transparent about the process, I think that’s also about trust building in the classroom. And this is very relevant in conversations about open pedagogy: you start by building trust in the classroom, and there are mechanisms to do so, which can include things like allowing for anonymous feedback or making sure that students know they won’t be punished for not participating. 

Giving them agency in whether or not they participate and how they participate, by providing a lot of options for them to navigate through, and then supporting them to make those decisions with the right information, I think that’s really relevant here, too. And in particular, as has been mentioned, there’s that power dynamic involved. There are a lot of students who come into a classroom like that who won’t have experienced a relationship with an instructor who is trying to be more egalitarian and do things differently. 

Everybody brings their own experiences to this, and it’s important to think about the ways in which we’re deliberate about making sure that students do feel safe and included, and also feel like they won’t be punished for not participating, or for participating and having critique to give. It’s about knowing who you’re meeting in your classroom, and so, teach the students that you have, not the ones you think you have. I’m probably slightly messing up that quote from someone. I think that’s really critical here, too. 

Karen: All right, I’m just going to mention that we’re about 10 minutes before the hour, so if you’ve been holding on to any questions or comments, now is the time to put them out there. Also, I invite any of you in addition to our three guests, if you want to share a particular case study or something that you have tried yourself or supported an author in trying themselves in the classroom, please don’t hesitate to tell us about it. 

Let’s see. Rebel has a couple of comments in the chat for Jonathan. A faculty member at Kansas State had a similar experience with the popularity of finding typos in a textbook. He actually had to limit the extra credit, but students read the book very closely. She also mentions they had one anonymous book complaint form that collected all errors and issues to address after the semester. 

It saved all entries into a spreadsheet, and then she could address those that were most critical first and still track all issues in one place. So, this is a really great tool after publication to share, perhaps in the front matter or back matter of the book, so that you can hear from people out there who are using your book and get that kind of input. Matt mentions that he credited a student as a contributor because, unprompted, the student proofread the entire book, which is a big deal. 

And brought in sheets of paper with all the errors to each class. This person may have found their calling. So, I’m going to pause and see if there’s anyone else who would like to speak up before we start to say our goodbyes. It’s not too late. [Silence] Ian, here’s a question in the chat from Jonathan: is the toolkit mentioned earlier a good place to look for resources that we can use to do surveys to prove to funding agencies that OER are good? I think I got that right. 

Ian: Maybe. I guess it depends on how you want to use the surveys or adapt them to your own use. But I think perhaps some of Jared’s work on the COUP framework is probably a bit more applicable to that kind of proof of existence or existence-justification type work that I think a lot of the literature gets into. So, maybe in ours, but I’d be curious; Jared probably has a clearer sense of that. Sorry to put you on the spot. 

Jared: No, absolutely not. I think student perceptions can be valuable. We found that students’ perceptions, oftentimes, rated open textbooks in different contexts very highly. They were less critical than we expected, even when we knew that there were design issues. And then, sometimes students are really focused on design issues, like you alluded to earlier; they care about those when content was the thing that we were hoping they’d zero in on. 

So, triangulating is really important. In addition to student perceptions, getting multiple faculty to give perceptions is another way to triangulate. And then, we’ve thought about outcomes, and we’ve done some quasi-experimental work, and certainly with a little bit of design and thinking ahead, you have the ability to ask about student learning as well, especially if you’re teaching the same course over multiple years with different textbooks. 

That’s not going to give you the whole answer either, just because the design’s not likely to be super strong. But if you’re using that to triangulate, you’ve got different ways of looking at it, as well. But I think the most exciting thing is how you were able to use it differently, and this peer review part: being able to pick it apart and add to it and change it. If I were trying to make the case to funders, I would be talking about how we’re using it. 

And how openness changes how we use it and peer review would be a big part of that. I think that that’s a really compelling story to tell. 

Ian: Yeah, because I’m just thinking back to some of the things we did in the seminar. Early on, we showed them the most heavily circulated reserve textbooks in our library, and not coincidentally, those textbooks were on average like $120 to $300. And so, we broke it down: well, the cost is one impediment, so those are circulating a lot because perhaps students aren’t buying them. 

Another is that there are basically three publishers, so we really dug into those; almost all of the authors were men. It was just, look at these books: are they reflective of the students here? How expensive are they? And that’s not necessarily proving that OER is good, but it is at least pointing out the problem, addressing the problem, and showing that there can at least be some justification to test this out, to start to introduce them. 

Because we were doing this survey, we weren’t teaching the class, obviously; that’s a really important part. So, we were perhaps getting more comments on things like design and layout because we weren’t really teaching to the content at all. 

Jared: Sure. Yeah, I mean, what I am convinced of after spending time looking at outcomes in particular is that the textbook, no matter how good or bad it is, is a relatively small part of the learning environment and of learning outcomes as well. So, being realistic about that: it’s how we’re going to teach it, it’s how we’re going to engage with students and how students will engage with us. 

How we promote engagement is going to have a much bigger effect than the textbook, no matter what the textbook is. And that’s why I think this is all intriguing: inviting students to engage as experts, as junior experts and not emerging experts, is so powerful, and some of the ways that you can do that are unique to OER, too. 

Karen: Indeed. 

Zoe: And I would just say quickly, Jonathan, too: why don’t we tell them anyway? Funding agencies set their metrics, but if we’re doing this work because we see value in other places, yes, it’s more work, but let’s just tell them all anyway. Let’s start trying to set our own metrics on some of this stuff, too, and keep including it, and make that a part of the case that we’re making, the story we’re telling, every time we’re talking to funding agencies. That might get through a little bit, too. 

Karen: Thanks, Zoe, and thanks to our guests. I am just going to highlight Rebel’s story before we wrap up. She mentioned in the chat that at K-State, they encouraged a year of classroom review before official publication, so this seems to be an emerging trend. They found that the professor author changed the textbook, sometimes quite significantly, up to 50%, based on student feedback after the first semester, to work out the bugs. 

And then, reused it in another semester with an improved version and by the time they had a version ready to go live, it had had these two testing sessions. So, the thank yous are starting to come in, which usually means that we’re about at the hour. So, I would like to thank all of you for joining us for this conversation today about classroom review. And a special thanks to our guests, Jared Robinson, Ian McDermott and Andre Mount. And also, to Zoe and the team at Rebus. 

Zoe: Thank you, Karen. My thanks to everyone as well, and our next Office Hours is coming up in March and that will be on the use of multimedia in OER. And I’ll drop a link in the chat there for anyone who wants to check that out and all the rest of the details will be coming. Thank you everyone, this has been a wonderful conversation as always. Look forward to seeing you again. 

Ian: Thank you. Great talking to all of you. 

Jared: Thanks. 

Chat Transcript

00:21:48 Karen Lauritsen: Here’s the book I believe Jared was just talking about: https://open.umn.edu/opentextbooks/textbooks/project-management-for-instructional-designers

00:23:46 Jonathan Poritz: anyone have a link to that new article?

00:24:00 Karen Lauritsen: Here’s the Textbook Evaluation Toolkit: https://academicworks.cuny.edu/lg_oers/72/

00:24:00 Zoe Wake Hyde: http://www.inthelibrarywiththeleadpipe.org/2020/open-to-what/

00:24:05 Zoe Wake Hyde: I think that’s it ^^

00:25:24 Jonathan Poritz: (thanks, Zoe)

00:26:11 Zoe Wake Hyde: That’s excellent to hear that students were paid – v interesting conversation to be had around student labour related to this topic

00:31:42 Matthew DeCarlo: It’s always fun when students complain about YOUR typos

00:37:58 Matthew DeCarlo: Does anyone have suggestions about students leading their own evaluation of a textbook? I’m evaluating my research methods textbook, so designing focus group questions is part of our learning process anyway…I’m just less clear on how to facilitate that process or work with them on developing a focus group script

00:38:26 Matthew DeCarlo: especially since it’s my book, so I may not be the most objective to lead those groups, since I also grade the students

00:39:44 Leigh Kinch-Pedrosa: Moment for imagination: What would happen if we held up classroom review the way we hold up peer review?

00:41:42 Matthew DeCarlo: I’m in a noisy place

00:43:28 Karen Lauritsen: Here’s the book Matt mentioned: https://open.umn.edu/opentextbooks/textbooks/scientific-inquiry-in-social-work

00:43:59 Matthew DeCarlo: super-secret new book for grad students: https://pressbooks.rampages.us/msw-textbook/

00:44:54 Amanda Larson: I really appreciate that you included when it should be available to adopt, Matt. 

00:46:09 Rebel Cummings-Sauls: Are you worried that this is not a blind review?  It seems that I would have trouble telling my professor his book sucks if I knew they held my career in their hands.

00:50:07 Matthew DeCarlo: Re: Leigh’s question….has anyone ever seen a classroom review index similar to a peer review index in an open textbook?

00:50:30 Jonathan Poritz: So it seems that the difference  between peer review and classroom review is that there are multiple possible uses for classroom review, reasons for doing it, while peer review (at least for scholarly work… different, maybe, for peer review of textbooks) is about finding *truth*… or?

00:52:15 Leigh Kinch-Pedrosa: Our last Office Hours touched on collaborative open peer review: https://about.rebus.community/2020/02/03/january-office-hours-rethinking-peer-review-for-oer/

00:53:16 Matthew DeCarlo: One of the ways I’ve dealt with it is by bringing students onto the research team to lead the evaluation process and co-create the questions. So, the focus group is facilitated by a student, rather than the professor, and getting IRB approval and consent forms that indicate the data won’t be shared with the professor until after the class

00:55:20 Jonathan Poritz: Ah, that’s helpful, Zoe.

00:55:46 Jonathan Poritz: Another aspect is purely the positive relationship-building: e.g., I’ve done the “bug bounty” approach (extra credit pts for typos found in my OER during the first semester of deployment) a few times, and I have never had such careful reading of the textbook as then!  I’m thinking of introducing new typos even in a stable version being reused in a class, just to get that effect again!

00:58:38 Karen Lauritsen: Love it when we get philosophical… we’re all looking for purpose!

01:00:18 Karen Lauritsen: Common textbook elements, to Ian’s comment on introducing student to what makes it a textbook: https://canvas.umn.edu/courses/106630/pages/textbook-elements?module_item_id=1306061

01:01:18 Rebel Cummings-Sauls: Poritz- we had a professor do this at K-State he had to cap the points at ten (especially the first year), but he also reported that this made the students read the book- Algebra no less!- and they still read the book today- he sees them- and there was only one error found over the last two years

01:02:56 Rebel Cummings-Sauls: We had one have an anonymous book complaint form that collected all errors and issues for her to address after the semester- it saved all entries into a spreadsheet and then she could address those that were most critical first and still track all issues in one place

01:03:56 Matthew DeCarlo: I actually credited a student as a contributor because he, unprompted, proofread the entire book

01:04:06 Matthew DeCarlo: And brought in sheets of paper with all the errors each class

01:04:21 Rebel Cummings-Sauls: That is great

01:04:52 Jonathan Poritz: Is the toolkit mentioned earlier a good place to look for resources that we can use to do surveys to “prove” to funding agencies that the OER of which we supported the creation were “good”?

01:06:26 Rebel Cummings-Sauls: At K-State we encouraged a year of class review before official publication- we found that professor changed the textbook from 5-50% based on student feedback after the first semester, worked out the “bugs” and then reused another semester with improved version and then had a quality ready to go live

01:10:26 Jonathan Poritz: Thx, all!!

01:10:41 Julia Remsik Larsen: Thank you all for your insights and for talking with us today!

01:11:12 Rebel Cummings-Sauls: Thank you!

01:11:21 Marilyn: Thank you!

01:11:36 Jared Robinson: Thanks Karen and Zoe!!!

01:11:36 Zoe Wake Hyde: https://www.rebus.community/t/office-hours-multimedia-in-open-textbooks/2586

01:11:40 Matthew DeCarlo: thanks so much, everyone! I’m so happy to be at my first office hours in wayyy too long

01:11:47 Allison Brown: Thanks everyone!

01:11:53 Alexis Clifton: Good conversation!

01:11:54 Amanda Larson: Thanks pals!


Thanks to Mei Lin for preparing the audio transcript and video captions!


Have comments or feedback about these transcripts? Let us know in the Rebus Community platform.
