What is the best point to check for plagiarism within a journal's editorial workflow? Sarah McCormack, editorial manager at the American Society for Nutrition (ASN), spoke with us about the types of modifications they have tested in their editorial workflow (to check submitted articles) before settling on the most logical and successful point.
Watch or listen to the interview (4:31 minutes):
iThenticate: Thanks for speaking with us today, Sarah. Could I ask you to give us a quick description of your role and responsibilities?
Sarah McCormack (SM): My name is Sarah McCormack; I’m the editorial manager at the American Society for Nutrition (ASN). My role with iThenticate has been to plan how we implement it within our peer-review system. Mostly, I try to make sure that the workflow within our submission system, Benchpress, is working correctly.
iThenticate: What are some of the top line challenges you face with plagiarism? Is it more something you’re trying to address, something you’re trying to prevent? If you could, just talk to me a little bit about the motivation for implementing the iThenticate plagiarism software.
SM: Sure. As far as challenges, I don’t think our journals are different from most of the basic science journals. I think we were looking for this system probably for legal reasons more than anything else. And because it’s always been a goal to publish original material that hasn’t been used elsewhere. But we haven’t had the tools until now to screen for that kind of thing. So it meets a need that has been important to us for a long time.
We found that authors’ reuse of their own material is the most common issue, and the greatest reuse is in the “Methods” section, which there’s some justification for -- sometimes authors need to publish data from the same study more than once, which means that the Methods are pretty similar. The second area has been review-type articles where authors have generally reused substantial portions of their own material, so it’s author reuse of the same material that’s the greatest area where we’ve caught overlap.
I think, though, that we’re glad we’ve begun using it [iThenticate] because it does meet this really important scientific and legal need. In the past, we’ve just had to take people’s word for it, or rely on reviewers or even readers to let us know when there are problems.
iThenticate: ASN was one of the first organizations to implement the idea of featuring the [iThenticate] logo on the journal itself—is that something you wanted to do because you wanted to make it clear from the onset that avoiding plagiarism was a priority?
SM: We tell our authors on the submission page that we use the iThenticate service. We decided to do this because we want authors to know that we do this (before publication). We didn’t want this to come as a surprise that we were checking for plagiarism, and we’ve always asked authors to confirm that they haven’t plagiarized, but sometimes it does happen. And we want to let them know that now we have a tool for checking this kind of thing because we don’t want to get plagiarized papers. So we were hoping it would be a service to the authors to let them know that we’re using this tool to add a degree of credibility to our own journal – that yes, we are checking for this sort of thing.
iThenticate: Is there anything you’ve learned organizationally that makes the process flow better, take less work, or go more easily? What I’m looking for is a little bit of information about how you integrated this into your practices.
SM: Since we’re checking for plagiarism up front, it helps us avoid the time-consuming problems that might have occurred if authors were not aware we were checking, or if a problem surfaced after publication. We don’t know, of course, what would have happened without iThenticate, and we’ve detected a bit less plagiarism than expected. Of course, checking at the beginning also means that we don’t have to spend the time to check later.
In terms of how we’ve actually implemented it, in the beginning we were thinking a lot about cost. We didn’t want to screen a bunch of papers we were just going to reject for other reasons. At first we began screening papers at acceptance, and what we found is that the plagiarism wasn’t always as blatant as you might think. It wasn’t that the whole paper had issues; it might be that one part of it needed to be rewritten. So rather than the paper being a case of simply accept or reject, what we actually needed was to request revisions and the rewriting of a certain section, or perhaps credit being given to the original source. The end of the process didn’t seem like an appropriate place to check, so we progressively moved it closer and closer to the beginning. Papers that get through the first review without being rejected are very often accepted; rejection usually happens on version one of a paper, so we started checking on version two. But then we started to get complaints from authors saying, “You know, you’re asking for these big changes now; I wish you had asked for them on the first version, because now you’re asking for something completely different.” So, since we’ve found that the problems come up most often with review-type papers, we’ve moved our focus onto them a bit more.
So, we’ve begun checking a lot of review papers at the beginning of the process. We’ve also found particular points in our workflow to screen other papers, after a lot of submissions have already been rejected, so that we can minimize the number of unnecessary screenings. It’s been quite a process, and we’ve gotten to know our workflow a lot better the more we think about this. Now we can figure out the best way for us to check and let authors know as early as possible, without adding too much expense for our own organization.
iThenticate: When you introduced the idea of using iThenticate what was the overall response within the organization?
SM: I think everyone was glad to get access to this tool, which met a need we’d had for a while. One of our editors did mention that he was concerned about adding another activity to our already very busy workload. We told him that it wouldn’t take a lot of time, since checking for plagiarism with iThenticate is a pretty simple process, and that it was, of course, a much-needed piece of our editorial process for which iThenticate was the best option.
Since 2011, ASN has participated in CrossCheck, a service offered by CrossRef® and powered by iThenticate, whose more than 300 leading publisher participants permit the use of their published works within the iThenticate database.