January Office Hours: Rethinking Peer Review for OER

Office Hours

Watch the video recording of this Office Hours session, or keep reading for a full transcript. The chat transcript is also available, for those interested in reading the conversation that took place amongst participants and seeing resources shared.

Note: If anyone would prefer to not be associated with their comments in either of these transcripts, please contact Apurva as soon as possible and we will remove any names or other identifying information.

Audio Transcript


  • Dan Rudmann 
  • Catherine Ahearn
  • Karen Lauritsen 
  • Zoe Wake Hyde
  • Daniela Saderi 
  • Amy Hofer

Zoe: Hi, everybody. Welcome to another Office Hours. And this is our first Office Hours for 2020, which is always very exciting. I think this is also our third January, so we’ve been going for a while now, which is exciting. We are tackling a very fun subject today, one that we know resonates with a lot of you and that we’re looking forward to picking apart and exploring, thinking about peer review. 

So, I am Zoe, from the Rebus Community and I am joined as always by my lovely co-host Karen, from the Open Textbook Network and our illustrious guests. And then, Karen and I are also stepping into the role of guests this time round, which is quite exciting. A bit of a change for us in terms of format, but this is a subject very close to both of our hearts, so we’re keen to also be contributing. 

As always, the format will be five minutes from each of us, followed by discussion, questions in the chat, you’re welcome to unmute and speak when you want to ask a question as well. And yeah, so for now, I’ll hand over to Karen to introduce our guests. Thank you.

Karen: Thanks very much, Zoe. We’re delighted to partner with the Rebus Community on monthly Office Hours calls. And today, we are going to hear from four of us, as Zoe said. We’re going to start with Dan Rudmann, he is associate director for community outreach of punctum books. And he is sitting in for Eileen Joy today. And then, after Dan we will turn things over to Catherine Ahearn, she is head of content for PubPub and Knowledge Futures Group. 

And then, I’ll say a few words, and then Zoe will say a few words, and then we will open things up and hear from all of you. So, Dan, I’m going to hand things over to you. 

Dan: Thank you so much. My name is Dan Rudmann, I am the associate director at punctum books. Just to give a little bit of background about our press and our position when it comes to peer review, we are an open access press. Not only do we publish our books open access, but we try to embody the concepts of open access in all of our processes and that extends of course to peer review. 

And when I say that, I think what I mean is that we are open to exploring a lot of different modes of peer review and governance for our community. We don’t just have one standardised way in which we enact peer review. But there are a number, so just a very brief rundown. We are a scholar-led press, so all of our directors are PhDs in various fields, including myself. 

And so, when we are doing acquisitions and when we’re editing, you already have the community aspect with the material that we’re working with. We also have a very involved editorial board that we’ve actually just revamped recently. You can see that board on our website, and they take a very active role in acquisitions, in assessing work. And the third thing to mention is that we take our titles on a very case by case basis.

The nature of our press is that we publish academic and what we call para-academic work. So, for example, if we’re publishing a piece of theory fiction, that requires a certain kind of review that is distinct from if we’re publishing a book that’s critical to somebody’s tenure file for a dissertation. And we make sure that we take all of those specific conditions very seriously. Yeah, that’s a brief rundown of our press and our view on peer review and I’m excited to dig into that and learn more about it from you all. Thank you. 

Karen: Thanks, Dan. And now, over to you, Catherine. 

Catherine: Hi, everyone. As mentioned, my name is Catherine Ahearn. I am the head of content at PubPub and the Knowledge Futures Group. And just for some context and for those who don’t know, PubPub is an open source publishing platform for open access content. And it’s also part of the KFG, which is a fairly new not-for-profit consortium of academic, industry, and advocacy organizations that is seeking to build support and really advocate for products to make knowledge open and just more accessible for everyone. 

And I consider myself really lucky in that I often get to work with a really broad range of researchers, university publishers, journal editors and others who are all looking to plug into the existing scholarly comms ecosystem in different ways, including those who really just want to burn it all down and start from scratch, they’re out there. I often find myself gathering bits of perspectives, ideas, experiences from these different groups that really help inform the ways that we’re building PubPub.

They also inform how we can help our users collaborate, iterate, and really communicate their research. And so, peer review is just one use case where at least in my opinion things get really interesting. And in a basic sense, I would say peer review generally is our best attempt at communicating trust. Over time, that simple act of providing a quality check has also been tied with a small number of processes for doing that correctly. 

And I think that what’s become clear to me since I joined PubPub a little over two years ago, is what Dan was saying, when we fail to take into account the subject matter, the goals, and the intended audiences of a publication, when we seek to put it through peer review, we also fail to fully support the work that has been done and we tend to undermine its potential value. 

So, what I mean by that is that there are just so many learning opportunities that open up when you also open up peer review and the publishing process. And so, for example last year, there was a book written by Lauren Klein and Catherine D’Ignazio called Data Feminism. And they put their manuscript on PubPub for open review; the final manuscript, or the final book, will be published by the MIT Press later this year. 

But they asked to have an open peer review process, and they were really great and specific about the kind of help they sought, the questions they still had. And when they published their draft, they received hundreds of comments and annotations from readers. But they also responded to them and were able to extend that conversation in a way that they wouldn’t have been able to do otherwise. 

And this is exciting for me, because it doubles down on the peer in peer review. And by doing that, it opens up the publishing process. It invites communities in at an earlier stage to the benefit and learning of that community, the authors and the improvement of that final piece of work. And I should say, as a platform, we don’t see it as our job to be prescriptive. We’re in the process of building an open source review management system within PubPub. 

We let communities define their terms and their processes however they’d like, but we do think it’s within our mandate to make sure that the redefinition and reconsideration of peer review is something we support and include in that spectrum of possibilities. And so, getting back to my initial definition of peer review as a signal of trust. 

We hope that building out open tools for open processes can help bring about a sense of flexibility, experimentation, maybe even a redefinition of peer review and other processes so that they’re better aligned with the works themselves that they’re meant to improve. And we’ve seen journals, like the Journal of Design and Science which is also on PubPub change their peer review policy on an issue by issue basis, just to better match the content. 

Recently a new journal called Reviews in Digital Humanities launched on PubPub and it not only took on the task of peer reviewing DH projects, but it’s also creating a catalog of them and I think this is great, because in a sense, in just doing that, it’s also communicating a level of trust and quality about projects just through the act of including them in this list. 

This journal and another new journal called the Harvard Data Science Review, just by existing, they’re attempting to define and create trust in community around newer fields. And in a sense peer reviewing whole new swaths of scholarly inquiry, methodology, just by their existence, so expanding what that notion even means is also something that I’m really interested in. 

So, I’ll end there, PubPub we’re trying to work through a bottom up process in trying to support this kind of experimentation and flexibility. And I’m really curious to see what other people’s experiences are and if they have any input to help us do that better. Thanks. 

Karen: Thanks, Catherine. I will echo both what you and Dan have said so far in terms of listening to your members, allowing for flexibility and allowing people to burn things down and see what we build in its stead. So, both of you spoke pretty broadly about different kinds of scholarly publishing and I’m going to zero in a little bit on open textbooks, since I am responsible for managing the Open Textbook Library as well as developing publishing programs for OTN members. 

But before I get really specific about textbooks, or maybe to frame the conversation, I think it’s really important to step back and question where textbooks fit in on the spectrum of scholarly publications. Do we think this is a product of inquiry or argument? Or is it perhaps more a survey of what’s out there? I think that’s a really interesting question for all of us to examine together. 

For a little bit of background, as I think many of you know, books in the Open Textbook Library come from a wide variety of sources. They might come from an independent author, they might come from a library publishing program, they might come from a grant funded program, like OpenStax. So, there are really a lot of diverse models behind the growing number of open textbooks that are in the OTL now; we’re approaching 700. 

And I think the diversity of sources is really important and really valuable for our community, and that keeping flexibility in our publishing model is also really important. So, I’ve heard from people who do faculty workshops that are based around the Open Textbook Library that, gosh, faculty really want to know what kind of peer review went on for these textbooks. What was involved? What kind of rigour was there? 

And so, we looked at the possibility of trying to unearth exactly what may have happened during a publication process for open textbooks that are in the library. And as you can imagine, that’s a huge undertaking that may or may not be possible. But I think there’s another question about systematising peer review, whether by some kind of symbol or explanation, which really might be a burden for more fledgling publishing programs that are barely eking out open textbooks with limited staff, limited funding. 

And then, to say, “Hey, we also want you to do this particular type of peer review so that you can be legit”, can really be difficult. And it doesn’t take into account the fact that openly licensed textbooks of course allow for and invite there to be lots of revision, for students to be involved in reviewing the work and seeing what works for them and what doesn’t, or contributing new content. 

I also think just stepping back and thinking about commercial textbooks, their peer review process is not exactly transparent either. Many people I talk to have said it is part of the marketing of the textbook: you invite your colleagues to review it, they might be blurbed on the book, and then they have more of a stake in that book’s success or are involved in its marketing down the road. 

So, I’m just cautious about favoring larger publishers that have more resources, I’m cautious about holding open textbooks to a higher standard than commercially produced textbooks. And I want to allow for this diversity in models to flourish and so, I think with that I will turn things over to Zoe. 

Zoe: Thank you so much, Karen and to Dan and Catherine as well. This is really, really wonderful to hear about the different work that you’re doing and the approach you take. I’ll speak both to our work at Rebus, but also how I think about these things a bit more generally from the perspective of open publishing. And I think what I come back to often and if you’ve talked to me at conferences, I’ve probably said this to you many times. 

We’re in the process of building a new system of publishing with open publishing. Or at least we have an opportunity to do so. And that opportunity means that we can also do things differently and improve on them in ways that align more with our values, with our goals and what we’re actually trying to achieve with this publishing system. It can be invisible at times. It’s so established and expected that things happen the way they do because they always have. 

But we really have space here to tear it apart a bit, and from our perspective at Rebus what we think about is how we can do things better in terms of making them more inclusive, more responsive to a wider range of needs and more contextual. So, the way I typically put it is: we get to pick up whatever was there before, look at it, decide if we want to keep it, decide if we want to keep bits of it, or if we want to throw it away. 

All of those options are open to us as we make these decisions. So, within the context of peer review, one thing that we’ve built into our process, so what we’ve developed is a collaborative model for open textbook publishing that is both replicable but also very flexible and adaptable. All of that’s available in our guide. So, we’ve gone through the process of doing hands on peer review and then tried to document it in a way that other people can then use it. 

And the critical question that we ask is: what is peer review for? What is peer review for on any given project? Traditional peer review structures have been markers of quality, and Catherine, I think you spoke very well to this idea of quality and trust, which takes that in a really nice direction. But they’ve been a mark of quality for only certain kinds of knowledge and knowing. 

The systems that have emerged have come from specific contexts, dedicated to specific purposes, that don’t necessarily meet the needs of what we’re working on right now and what we want to achieve. So, there’s a lot built into the systems that, if we just replicate them, will continue to only recognise certain kinds of knowledge, certain ways of knowing, as I say; that’s a phrase I come back to a lot. 

And so, I want to challenge that a little bit and there are at least two ways in which I think we can chat about that a bit today. There are many, many more. One is the ways in which we can challenge who is considered a peer, and who is considered an expert. One example of where you might want to reflect on that is a work that is really localised or contextual being reviewed by somebody who has academic knowledge of the field, but no lived experience of it. 

You’re going to get a very different result in terms of what they can contribute to the work itself. The example that often comes up with that is traditional knowledge within indigenous communities. There are things that are known within certain cultures or communities that aren’t citable. So, not being able to necessarily reference it in the way that you may expect out of a traditional scholarly text or educational resource can maybe not fit with the rubric of peer review that’s been provided. 

And we have to open up space there to think about what that might mean, how else we can look at that and think about how to demonstrate the quality of it or its value. And another way that we can think about that is thinking about who peers are and thinking if we can include makers and people with applied knowledge. So, as I was chatting about this with Leigh earlier, we have a friend whose research is in the field of fermentation. 

So, she is currently in Japan, working in a soy sauce factory, and she’s worked in natural sake breweries. We were talking about how, in the scholarly production that comes out of her research, there’s as much space for an expert in making soy sauce to contribute value back into that work, analyse it from the place of an expert, and think about how they can support enriching those materials. 

So, who is a peer and who are the people who are involved in these processes is really critical for us to think about and make more space for. Working as we do very hands on with a lot of projects, we also have to be practical. And going back to that question that we have asked of everybody we’ve worked with on peer review, and that we encourage everybody who’s embarking on it to ask: what do you want to achieve with peer review?

Are you looking for that traditional marker of quality where you can put a peer review statement in the back of your open textbook that says, “This was double blind reviewed”? Maybe you want to undertake a different kind of feedback process, and those aren’t necessarily mutually exclusive. But maybe you want to give space for people to contribute more in-depth feedback and go through more rounds of revision. 

Again, speaking to something similar to what Catherine mentioned, being in conversation with your reviewers can offer a different kind of experience of being an author who is peer reviewed. And those priorities, as you think about them and decide on them, will shape the process. For example, an open textbook author who has been through traditional journal review processes and is coming to open education may have space to think about whether those processes apply, and whether they can do things differently. 

And there’s also from that practical sense the idea that a lot of faculty authors need to work within systems that expect certain things, the tenure and promotion guidelines, the importance of peer reviewed articles and publications. Those are still things that are important and that we need to work with in any kind of systems transformation it isn’t one stops and then the other begins. There’s overlap. 

And we need to be encouraging and supportive of people who are trying to navigate that overlap so they can both achieve what they need to in order to move forward in terms of career progression and also engage differently in what they want to get out of the process of undertaking peer review on their work. And I would argue that also goes a lot for peer reviewers themselves, too. There’s a nice other side to this. 

So, a couple of the ways we encourage people to do that is to work with an expanded group of reviewers and to really think about who should be the expert voices there. And that includes having parallel processes for classroom review and really bringing students into the process in a very dedicated way. And very conveniently our next Office Hours session is going to be on classroom review. 

So, I’m both teeing that up and also, I can dig into that a little more if people are interested. But I wanted to bring them in too as another way of understanding if your purpose is to really enrich and strengthen the content that you’re producing and sharing with people. The kinds of voices that come into it should absolutely include students because they’re ultimately the audience for it. 

You need to appeal to people who are adopting a text, which is where those markers of quality, markers of peer review are important. But they are choosing it in order to apply it in their classrooms, and so I think there’s a case that you too have a responsibility to those students to put as much work into shaping the text for them as you can. And I will wrap up shortly, but I think I’ll echo what’s been said a couple of times. 

That I think we need to get comfortable with being less strict in our frameworks. Focus on our desired outcomes, what we really want to achieve, and undertake all of these processes critically. So, not just reproducing things the same way that they’ve always been done. And even if you are in a position where you need to do that, reflecting on that process. Thinking about why you’ve had to do it, how you would do it differently if you had the opportunity to and then, finding those opportunities. 

And I think too a lot of this comes back to working contextually and understanding the case by case basis and thinking about what’s appropriate in any given time or space or group of people, rather than trying to maintain a strict highly built structure that everybody must conform to. And there is some discomfort in that, I think, when people are very used to working within those systems. 

And that’s also a really fun opportunity, so yeah, let’s get comfortable with a bit of discomfort around these things, and hopefully that’s a nice segue into maybe some questions or if people are reacting to this and however you are, or if you have more to share and build on, would really love to open it up and hear from everybody. Thank you. 

Karen: Thanks, Zoe. So, to echo what she just said, we are turning things over to you for your comments. Feel free to unmute to ask a question or make a comment or put it in the chat. There is one question in chat already and that is for Catherine. Daniela says, “Thanks for presenting PubPub as a platform for open peer review. I’m curious how you find your reviewers and how you define expertise.”

Catherine: Yeah, good question and thanks for prompting me to clarify. So, PubPub is a platform where publishing communities can create their own spaces. So, when I mentioned earlier that we’re not prescriptive, we allow those publishing communities to manage their own spaces however they’d like. So, we don’t actually manage peer review processes for the publishers or research groups or journals that use the platform. 

We just allow them to have the infrastructure to support whatever process they want to use. So, we’re kind of process agnostic and while we will sometimes provide some best practices and suggestions, we leave it up to the groups to decide for themselves and to select their own peer reviewers. And just to clarify, because I also see that the link to the peer review transparency project was shared on the thread. 

They’re a great group that has used PubPub to publish their own work, but they’re not necessarily a PubPub project. They’re aligned in a lot of the things that we are interested in and want to support. And I think to me the clarification between open review and transparent review is important. Because you can have a closed review system and be transparent about how it was done and say, “This was double blind and it went through three different iterations and it took this long.”

And that to me, is a good amount of transparency, but it’s not open review, and so the PRT is working to create badges that will allow people to be transparent about what kind of review process they engaged with and apply them to a piece of work without it necessarily being open review. 

Karen: Thanks, Catherine. And thanks for explaining the link I sent. I was like, “Is this right?” 

Catherine: No, it’s not wrong. (Laughter)

Karen: But feel free to share any links you think would be helpful in the chat. So, Leigh said that she would also like to hear everyone respond to Daniela’s question. So, I’ll just say much like Catherine, we leave it up to our members of the Open Textbook Network how to publish, how to manage peer review. And the same is true in the Open Textbook Library, we do not ask how was this peer reviewed when we evaluate a textbook and consider adding it to the library. 

We have four other criteria, but peer review is not one of them. And then, of course, we do have a more informal peer review process after publication, and you can see those reviews associated with any book record in the Open Textbook Library. But by no means is it a rigorous peer review, it’s more aligned with the spirit of the open textbook workshop, which is hey faculty, this option exists for you, would you consider taking a look? 

Thanks for taking a look, what do you see? What do you think? Could this work for you? So, Dan I’ll maybe hand things to you to answer Daniela’s question. 

Dan: Absolutely. So, as a scholar-led press, we’re fortunate in that we’re very participatory in the networks of our peers. So, we’re able, through our own expertise, to reach out to reviewers who are in particular fields, both within and outside of the institution. We’re very cognisant of the fact that the institution provides a certain amount of expertise, but there are a lot of people with expertise that are outside of the university. 

And then, again, our editorial board assists us, not only in the review process itself, but in recommending people, because our editorial board has such a breadth and depth of expertise. So, we’re able to utilise that to seek out peer reviewers. And I will say that as an open access press, we also see peer review and governance handled a different way by virtue of the fact that our authors, and in our context, people know that this is going to be made available to everyone. 

And so, I think when your audience is a little bit more mysterious, when you’re not writing a text that’s for a small group of insiders in the field, that’s another system of ensuring a certain amount of quality and a certain amount of care taken in your writing. So, yes, these are all of the ways in which we are really actively working to ensure a certain level of authority and quality. 

But we recognise that that’s often a very horizontal structure. There are a lot of different avenues, as Zoe and Catherine have mentioned as well. 

Zoe: I really like that point that by nature of writing an open text you can approach it differently in terms of the actual writing itself, too. That certainly sparked something with me, and I think that one of the ways in which we talk about that was also thinking of downstream users. So, within open education in particular, remixability and adapting texts is a huge part of what we encourage people to set up for when they are creating the text. 

And that can sometimes be something that’s a little new or different, and I’ve just got this idea in my head now. I’m like, “Well, what if somebody reviewed it for that? What if somebody looked at it through the lens of what if I’m going to adapt this text? How is it meeting those kinds of needs as you look forward?” And those adaptations can range from a really hefty rewrite, or a combination of many texts, through to “I just want to localise my examples.”

And so, there are ways in which you can set up to do that, that can be very interesting. So, that’s a fun thought I’m going to put on a Post It note. Thank you, Dan. And actually, then Daniela, I’ll respond to your question as well. We follow much the model that Catherine described, where it’s up to the projects who are working within our community system, whatever you’d like to call it. 

And there’s a lot of self-determination there, what we again try to do is prompt them to think about what their goals are, use that to shape the process and I like that you introduced that clear distinction between open and transparent, Catherine. Because I think there are options to do both of those within the processes we’ve developed. I think probably most projects we’ve worked with have veered towards the transparency rather than the openness. 

Some do use Hypothesis as an open web annotation option to undertake the reviews, so the reviews stay there, and the engagement is there for people to see. I think a lot of others veer towards the actual review phase taking place within, say, a Google Doc, but then taking the time to write a peer review statement with the detailed process that they went through. Who was involved, if it’s not an anonymous review? So, I think that’s a really important distinction to acknowledge. 

Catherine: Another thing that I was thinking of when you were speaking, Zoe, I was in a meeting with our development team not too long ago and we were talking about this peer review system that we’re building out. We were also talking about making it easier for classrooms to use PubPub and have spaces for gated conversation, where teachers could seed questions or prompts for students and they can respond to them and it can have some protection from the public. 

And it was funny, because we were actually talking about what those workflows look like, and I was like, “Well, that’s kind of like another kind of review.” They’re not that different, right? You want to create a space for a group, however you want to define it, to do a thing. And when a student is engaging with a text, they’re giving their opinion and writing their notes and annotations and the functions aren’t that different in terms of what you’re building out for a tool, for technology. 

So, I was excited to see that you guys are going to have another Office Hours on classroom review, because I definitely want to join that and learn more about that. Because it wasn’t until recently that we realised that the actual act, the way that we’re thinking about review and the tools are probably a lot more similar than we would have thought. 

Zoe: Absolutely, yeah, I would love to have you there and responding to that. It is more similar than not in a lot of ways, and it goes to that idea of the positioning of who is the expert. And when you are working with students, I think it’s always really important to have options for some measure of privacy. A lot of this is something I’ve learned from great minds in open pedagogy. 

It’s about self-determination and choice and informed choice about how they participate in these kinds of things. But that the amount of value that can come into a project from having those voices really early on is immeasurable. 

Karen: So, Catherine, we actually had a session two years ago on beta testing open textbooks, which is pretty similar. So, I’m just going to put a link in the chat from that conversation. 

Zoe: Sorry, Karen, it reminded me, I’d like to make a little note. I think actually, at Rebus we’ve been reflecting on that term beta testing, and we’ve now moved a bit more toward using “classroom review” because it’s a bit more accessible for people. I think sometimes beta testing could sound like, well, what is that? But classroom review is a little more immediately graspable. 

So, at least in the way that we approach those processes, there isn’t a lot different. But in the time since we did that session, we’ve changed our language a little bit. 

Karen: So, it’s pretty quiet in the chat and I’m just going to encourage any of you to go ahead and post something in the chat or unmute. I know that there are many library publishers in this call. There are some people who work at university presses in this call. And so, we are really interested in hearing your thoughts as well and having a discussion. So, it doesn’t necessarily need to be a question, either. 

We welcome comments. So, while you get those ready, Zoe, can you say more about why authors would choose open peer review? This is a prompt from Leigh. 

Zoe: Absolutely. And I’m really interested in hearing Catherine and Dan respond to this, too. I think what open peer review does at least in the experience of what we’ve seen with it is that it expands the opportunity. And part of the magic of open and also something that sometimes people take a bit of time to get comfortable with is you don’t know who’s out there, who’s interested in your work. 

And so, I think there’s a measure of just opening up an opportunity for voices that you didn’t know wanted to contribute to contribute, which can be really exciting. And we’ve seen the power of that in a bunch of different ways, and one of my favourite examples isn’t related to peer review, but just the existence of a work in progress being made public, someone responded within half an hour to our announcement, saying, “I want to translate this into Spanish and I want to use it in my work in Chile.”

So, it’s just an excitement factor, but as I said, that can also introduce a bit of concern, because then anybody could respond, and the internet is quite rightly considered a scary place. So, yeah, I don’t think we’re the ones here with the most experience of that, so I’d really like to hand over to Dan or Catherine to respond, too.

Dan: Sure, I can start with a little bit about our approach and the position of the press. I think, Zoe, what you said is absolutely right on. There is a danger in not being open, that we are siloing our work, and we certainly want everyone, whether or not they have different research interests or they’re in a different kind of field, to have that access; the work we’re doing really crosses over in so many different ways.

So, we want that opportunity to be available to people. But also, I should say that as far as open review goes, our authors often see ethical implications: blind review is sometimes used for conversations that are frankly not as productive in producing really wonderful scholarship. But also, we are working within a system that has for a very long time relied on labour that’s not recognised and not remunerated.

And we are really cognisant of that and we don’t have necessarily a solution just yet, but we’re working toward remedying that. That’s a major concern of ours. 

Catherine: Thanks, yeah. I obviously agree with everything that’s been said. And our experience observing different types of open peer review to me, there’s almost a humility in engaging in an open review and in saying, “I’m writing this, and I worked really hard on this monograph. But what am I missing? How can this be better? How can I bring in voices and perspectives that I might be blind to and not realise it?” 

And opening up that door, or many doors, and engaging with ideas that may not have been part of your long reading list or resources list when you were putting together your manuscript to begin with. So, it’s a way of just making sure that you’ve covered all your bases. I’ve also seen that it’s a nice way of creating community and interest in a work very early on.

So, there was a little bit of a worry that by engaging with open review, say for Data Feminism, that people wouldn’t be interested in buying the book later, or that it would be considered old news when it finally came out. But we’re actually finding that the opposite was true, that people then felt like they had a stake in this book. That they read it, they gave feedback, they got excited about it, they started chatting. 

And now, people cannot wait for this final manuscript to come out. And it’s a publishing house’s marketing dream, you have this organic inclusive conversation and energy around a title because you actually invited people in earlier. So, it seems like everyone wins in that way. I’m sure there are lots of other reasons why people engage in open peer review, but that’s been some of the highlights I’ve witnessed just on PubPub. 

Zoe: Yeah, that last one is so, so important in open education. Our entire approach is how can you from day one, from day zero be building the community around your book? Because it makes it stronger content, and because it gives it longevity if you have people invested in this text over time, it’s better for the teams who are at the core of the creation. It’s better for people to know what’s coming down the pipeline, so they can then use it in their classrooms. 

And it’s better for the book to exist long into the future, because people are invested and connected and working on it together in these really exciting ways. So, I love that. 

Catherine: It’s really exciting. And then, people see themselves not just in the process but in the work. It’s more effective as a piece of scholarship, and the process is also so much better geared toward learning.

Zoe: Absolutely. Yeah. I’m also going to drop an example in the chat from Erin Rose Glass who’s done Social Diss. So, her dissertation was really a critical project of open review and what’s grown from that and she speaks to her experience about it really beautifully. And so, I’ve dropped in one link and I will look for a couple of others to share of her experience with that. 

So, that is almost a nice straddle between educational materials and scholarly outputs in the journal and monograph context, since it was an educational pursuit as part of, I think, her PhD.

Karen: So, Jeremy is bringing us back to some hard, cold realities of tenure and promotion and the systems in which we find ourselves. So, he is asking for thoughts on how we get these new great models recognised by university administrations, scholarly societies and stodgy academics. Amy tacked onto that, but Amy I’m still not getting the myth busting. So, if you want to unmute and say more about what you’re trying to get at there, that would be helpful. 

Amy: Sorry. So, now I’m going to be more articulate on the microphone. Well, I think Jeremy may have said it better, but are there things that seem like immoveable objects that you run up against, but that are actually maybe not totally true? And what are some of the words that you use to dispel those myths? Does that help?

Karen: Yeah, I think so, thank you, Amy. 

Dan: I think I can speak to that a little bit. I think a lot of that is tied to a larger wave of skepticism around open access more generally. And this was something that punctum books, which has been around since 2011, encountered much more frequently nearly a decade ago. But I think more recently there has been culture change among academics, as more early career researchers’ voices are being heard; in particular, ECRs are taking part in their scholarly societies and having more active roles in departments.

We do see, I think just as a result of the long grind of doing this work, that there is a lot of proof that open access publishing models and open review have produced meaningful scholarship and meaningful community around scholarship. So, it’s certainly been a very long process of building a broad coalition of people to support us and to support this sort of model.

So, we’re part of a larger consortium of independent academic open access presses, called ScholarLed. And we’re based in the US, but we have partners in Europe and the UK who are engaged in similar work. And there’s been major movement with things like Plan S to move the needle on perceptions of open access more generally. So, I’m very hopeful, and I think we’ve certainly seen a major change in the last few years.

And of course, it’s still an uphill battle, but we certainly see a lot shifting as far as perceptions of open review and open publishing. 

Catherine: Yeah, this is one of my favourite rants, and I wish, given how often I’m so angry about this or how often all of our conversations come back to the question of tenure and promotion practices, I’d be more fluent in talking about it. But I’m not. So much of what drives decisions around how we conduct research and publish is around promotion and you can’t blame people for that. 

It’s career moves, really, at the end of the day. And I was brainstorming ways to fix this about six months ago, and I was like, “Well, why don’t we give universities open scores?” And at first this felt really daunting, but I was reminded that oftentimes, actually I think all of the time, tenure and promotion policies are set department by department, which means that if you really wanted to influence change, you’d have to lobby every department within every university, and it would just be this slog.

But I’ve come to think of that actually as a benefit because different disciplines are much more open to open access and open practices than others are. And I think that it’s nice to think of those as building blocks for increasing engagement and popularity and trust and legitimacy in this way of conducting work. It’s not the only way, but another legitimate way of publishing and reviewing. 

And I like to say that, in my opinion, it’s like the philosophy around getting more people to vote. I’ve said this before. But someone’s more likely to vote if they know, or even just think, that their neighbors are also doing it. And I think that’s actually kind of true with scholars, because many of them are pretty competitive. And it’s a slow cultural change, which always takes longer to take hold than other types of change.

So, I think just by starting with those groups that are more accustomed and more naturally open to open access, we can actually start there and have a good amount of success over time. 

Zoe: I like that, and I want to hear another version of that rant, if you’ve got it in you, when we have more time. On tenure and promotion, I wanted to point out, again from one of our previous Office Hours, that Mark Poepsel is somebody who used his open textbook as a pillar of his application, and he did receive tenure.

And the way that we worked through that with him, we were working quite closely with him at that point, was to learn how to put the process into language that was understood by the people who’d be reading it, who maybe didn’t have an understanding of open practices. And so, I’ve mentioned the peer review statement a few times; that is a really multi-purpose thing that you can include in your open textbook to make clear what has been done.

And I think there are ways in which you can phrase that, and choose to surface certain things or excise certain things, that can be useful when you’re using it as part of T&P, as in that example. And he, too, I think got a letter of support from us, if I’m remembering correctly. So, if there’s someone in a kind of publisher role or a granting role, or somebody with a bit of cachet, who you think can also be there to support you, and this is obviously talking at the faculty level.

Those are the kinds of things that can work. And then, Karen, I’m really interested in going back to what you said initially about comparing it to commercial publisher practices. I’m interested in how those conversations go for you. And, I want to say it was maybe Matt DeCarlo, I can’t quite remember, but somebody tweeted that they’d been asked how someone would know that an open textbook was quality.

And his response was just, “Well, read it.” Which I really enjoyed, and the follow up to that is well, how would you know that a commercial publisher was? And I won’t go on this any further, but it’s something I come back to a lot is that in open education we don’t have the same acquisitions role that you do in traditional presses. So, what you’re speaking to, Dan, there’s not the same kind of acquisitions process that a lot of people do look to as another marker of quality. 

And so, peer review, I think, then gains even more importance as building up that trust as Catherine’s spoken to. And Karen, yeah, what do those conversations look like?

Karen: Well, there are many people on this call who can speak to those conversations first-hand. I know that in our trainings we have a certain Greek chorus of, “It depends”. So, if a faculty member were to say, “What’s in the Open Textbook Library? Are the books any good?” We would say, “It depends. It’s up to you to decide as the subject area expert.” So, there’s no way for us, we’re not a publisher. 

Our role with the Open Textbook Library is really to try and be a one-stop comprehensive place that faculty can go and browse open textbooks that meet the four criteria in a simplified way. We’ve got one search bar. And we just want it to be immediately apparent, here’s what’s on offer. Here’s where it came from. And now, you decide if it could work for you. 

And if the book has been reviewed before by colleagues in that subject area, it’ll appear there with the record and they can see for better or for worse on a star system how that book has been perceived in I think about a dozen different areas, ranging from organisation to thoroughness. And that metric was adapted from BCcampus, or maybe not even adapted. Maybe we just used the exact same one. 

Laurie might be able to say, she’s in this call, too. So, that’s how we approach it. In keeping with the themes of today’s talking points, we really are not here to say, like, “Here’s the right way to do it.” Or, “We can’t wait to replicate the established models.” We’re really just saying, “You as a faculty member are going to evaluate the materials you want to use in your course, and so here’s just one thing that you could choose to evaluate.

Here’s what we think the benefits are, but of course, this is totally your call.” And so, that’s how we approach those conversations. Other questions or comments for us? We haven’t really given much of a pause, so now we can have our awkward pause moment. Daniela. 

Daniela: Yeah, because I already asked a question, but I’m going to follow up on one. So, thank you for organising this. I’m Daniela Saderi, I’m the project director of PREreview. And PREreview is a project that has been trying to bring more diversity to peer review by including more people in the process of peer reviewing pre-prints.

And I’m also very interested in learning more about classroom engagement with peer review of textbooks; that’s actually how PREreview was born. When I was a student, I wanted to bring more awareness about pre-prints and to introduce the review of pre-prints into journal clubs and student classes. And then, we got involved in this bigger, humongous goal of what we are trying to do, and it was also great to hear more about what PubPub is doing.

We definitely need to keep in touch. But my question is: we have opened things up, so basically anyone who has an ORCID iD can do peer review on PREreview and on the Outbreak Science PREreview platform we brought up. A lot of people are like, “How do I know that I should trust this review? How do I know that I should trust these experts, or whoever it is?” And we pushed back a lot on bringing in any badge that can identify reviewers.

We have a pseudonymity system, and that push-back applies even to those who identify themselves with their name, because we actually want to go against this idea that a good review is tied to a person who has 10 years’ experience and a PhD from Harvard, all of that. So, we’re trying to break down those existing markers of expertise, and it is so incredibly hard. And that doesn’t mean that we don’t need to continue doing it. But we’re having more and more conversations with organisations that are trying to raise the voices of African scientists, for instance.

Like the African Science Initiative, and we’re realising more and more that we really need to get at this from an anticolonial point of view, really combatting this idea that many African researchers themselves seem to have, which is that they have been trained not to think of themselves as actually experts. So, if you have any ideas, because to attach trust to something we perhaps need shortcuts, or maybe not.

But what are new metrics that we can think of, if we need them at all, to actually level out this inequity and prevent the system from being recreated in this space of open, like what Zoe said? So, that was a little bit all over the place. I didn’t prepare this question, but I wanted to hear more from everybody, not just from the panel, if anyone has expertise on this topic. Thank you.

Karen: Thanks, Daniela. I will invite Dan or Catherine to comment, since we have but a mere two moments. Two moments equals two minutes left. 

Catherine: Neither of us want to take this, because we don’t have a good answer. I put in the chat that we need more, better impact metrics anyway. But for me, when you’re trying to assess the successful impact of a work, it’s very hard to do that without keeping in mind what the goals of publishing that work were. And I think right now, especially when you just present numbers to a tenure and promotion committee or something, those two things are often separated. 

And so, it’s hard for me to say, “Well, this is the metric that’s most important, or this is how we should do it”, because again, you have to be more in tune with what the work is actually trying to accomplish. But I think even just having a sense that downloads don’t mean reads, and time on page isn’t indicative of reading either. Yeah, I see eye rolls. But there are lots of ways that metrics can be gamed and that they are not connected at all to any kind of success. 

So, I’m sorry, I feel like I’m dodging this question, because I have more to say about what not to do than about solutions. That’s my least favourite type of answer. But right now, on PubPub, we offer two different types of metrics; we collect lots of others, because we just haven’t, as a group, figured out what’s right, what’s most useful. When we ask researchers, “Okay, what are you actually going to do with this number?”

They often can’t really say, everyone’s just used to having what Google spits out at you, because it’s now been considered standard. There’s a line between giving people information, giving your users and authors information, and then also protecting the privacy of your readers. So, all of that is in play. It’s all stuff that we’re weighing, I’m sorry I’m not giving you a better answer, but it’s all stuff that we’re really thinking about. And I’m curious to see what you guys have to say, too.  

Zoe: I want to sneak in with a 15-second thought. I think there’s a piece in there I want to pick up, Daniela, of, like, how do you know if the review you have received is worth paying attention to? Again, things are sparking in my brain; often culture change is a big, big thing. I’m wondering whether there’s some kind of prompt, so that as you’re reading your review, there’s something next to it that says, “Remember that just because there might be grammar mistakes or spelling mistakes, that doesn’t mean that the review isn’t worthwhile.”

Because it may be that English, or whatever language the review is written in, is the reviewer’s second language. Maybe there are other little prompts like that you can offer, because sometimes people need the reminder to think differently about the review that they’ve received, or to think about it in its full context, or full potential context. I’m just throwing that out there. Karen’s trying to wrap this up, so I’m going to stop talking.

Karen: Dan wants to say something, too. All right, Dan. 

Dan: The only thing that I would add to that is I think that what Catherine’s doing, and a lot of what we’ve been talking about here, is really the seeds for that. We need an alternative infrastructure; for so long we’ve been fed a certain set of metrics, or certain indicators of quality. And I think what you have emerging in the landscape are alternative ways to bring people together, and more democratic or more equitable methods of people pointing to each other and saying, “This is great, you should listen to this.”

And only with that infrastructure, I think, do we really have that possibility. So, I think what PubPub is doing is a huge deal and I think what you all are doing at Rebus is amazing. So, thank you.

Zoe: Thanks, Dan. Yeah. It’s just been a really, really wonderful exploration of how we’re all approaching these things in very similar, but also different and contextual, ways. I’ve thoroughly enjoyed this conversation, thank you all.

Catherine: Thank you. 

Karen: I’m going to run with Dan’s metaphor of seeds. As a gardener, I would just like to say that these conversations are the water for those seeds in the landscape. And so, thanks to our guests, Dan Rudmann, Catherine Ahearn, and Zoe Wake Hyde, we’re delighted that you could join us today. And thanks to everyone for being here. And we hope to see you next month when we talk about classroom review. All right, farewell.

Chat Transcript

00:29:43 Karen Lauritsen: More on the board Dan mentioned: https://punctumbooks.com/about/editorial-advisory-board/

00:35:04 Daniela Saderi: Question for Catherine: Thanks for presenting PubPub as a platform for open peer review. I am curious to know how you find your reviewers and how do you define expertise. Thank you.

00:35:11 Karen Lauritsen: More on process Catherine mentioned: https://prt.pubpub.org/

00:40:05 Karen Lauritsen: We touch on some of the different review methods seen in open textbook publishing so far in our open textbook publishing curriculum: https://canvas.umn.edu/courses/106630/pages/considering-peer-review

00:41:33 Leigh Kinch-Pedrosa: Rebus Guide to Publishing Open Textbooks: https://press.rebus.community/the-rebus-guide-to-publishing-open-textbooks/

00:41:57 Daniela Saderi: +1000 to what Zoe is saying!

00:42:13 Daniela Saderi: YES!

00:47:07 Leigh Kinch-Pedrosa: RSVP to Office Hours on Classroom Review 😉 https://www.rebus.community/t/office-hours-facilitating-classroom-review-of-open-textbooks/2422

00:47:18 Daniela Saderi: YEEEESSS!! We need to hear the voices of students and other ECRs… key to reach more diversity in peer review. Thanks, Zoe!

00:48:41 Lauren Ray: Thanks to everyone speaking!  There is so much here that is valuable and that I appreciate hearing discussed as an oer librarian!

00:49:40 Daniela Saderi: Thank you!

00:49:55 Leigh Kinch-Pedrosa: I would love to hear everyone answer Daniela’s question please

00:56:46 Leigh Kinch-Pedrosa: Zoe, can you say more about why authors would choose open peer review?

00:58:30 Karen Lauritsen: We had a session  on beta testing: https://about.rebus.community/2017/10/30/office-hours-recap-and-video-beta-testing-open-textbooks/

01:00:27 Leigh Kinch-Pedrosa: A team working on “beta testing” their open textbook on LGBTQ+ Studies has shared their classroom review feedback forms here: https://www.rebus.community/t/spring-2020-beta-testing/2385 for anyone interested in seeing what this type of review could look like

01:02:36 Zoe Wake Hyde: That’s a hugely important issue, thank you for raising that, Dan!

01:04:49 Jeremy Smith: Maybe this was already mentioned, but what are everyone’s thoughts on how we get these new, great, models recognized by administration, scholarly societies, stodgy academics, etc.?

01:05:23 Zoe Wake Hyde: http://www.erinroseglass.com/socialdiss/

01:05:24 Amy Hofer: I like the concept of understanding what authors hope to accomplish with peer review and then finding processes to match. Peer review sometimes seems like more of a hoop to jump through. I’ve been wondering similar to Jeremy’s comment, is some of the work you do related to mythbusting?

01:05:45 Amy Hofer: What are some of the myths and how you bust them (if that is clearer)?

01:05:57 Catherine Ahearn: Since I keep mentioning it, here’s a link to the open review of Data Feminism: https://bookbook.pubpub.org/data-feminism

01:06:32 Amy Hofer: ^ I found this MS very useful in my own work! Was grateful to find it prepub.

01:06:52 Leigh Kinch-Pedrosa: +1!

01:09:13 Leigh Kinch-Pedrosa: https://scholarled.org/

01:12:43 Leigh Kinch-Pedrosa: Mark Poepsel’s Review Statement: https://press.rebus.community/mscy/back-matter/review-statement/

01:13:49 Catherine Ahearn: Also, better impact metrics!

01:16:36 Leigh Kinch-Pedrosa: so if I’ve written a manuscript what should my first step towards peer review be

01:17:05 Zoe Wake Hyde: Yes! We need better student impact metrics to make the case for the value and impact of OER

01:18:25 Leigh Kinch-Pedrosa: about prereview: https://content.prereview.org/about/

01:21:26 Daniela Saderi: It’s a really hard thing.

01:22:03 Daniela Saderi: Thank you Catherine.

01:23:06 Daniela Saderi: The language issue is a big problem. Thanks Zoe.

01:23:40 Daniela Saderi: If anyone wants to discuss more about this issue with me, please do contact me at daniela@prereview.org. Thanks!

01:24:14 Amy Hofer: Thank you presenters!

01:24:25 April Akins: Thank you so much for this discussion. It was great!

01:24:28 Josie Gray: Really wonderful and thoughtful conversation. Thanks everyone!

01:24:31 Daniela Saderi: Thanks everyone.

01:24:33 Allen Reichert: Thanks!

01:24:35 Mahrya Burnett: This was super interesting. Thanks, all!

01:24:37 Catherine Ahearn: Thank you!

01:24:37 Rebel Cummings-Sauls: TY

01:24:38 Zoe Wake Hyde: Thank you everyone!!

01:24:38 Michael Whitchurch: Thank you!

01:24:39 Leigh Kinch-Pedrosa: Thanks everyone!

01:24:39 Lauren Ray: Thank you! Such a great conversation

01:24:43 Anna Craft (UNCG): Thank you!

01:24:43 Raymond Vasquez: thank you all!

Thanks to Mei Lin for preparing the audio transcript and video captions!

Have comments or feedback about these transcripts? Let us know in the Rebus Community platform.
