This is an interesting question. Plagiarism is considered one of the biggest threats to the integrity of the published literature, and the proportion of journal retractions attributable to plagiarism has increased in recent years. As a result, journals have become more vigilant about possible plagiarism in the manuscripts they receive. Plagiarism detection software has greatly helped reduce duplicate publication and has thus enhanced the quality of the literature. However, complete reliance on such software can hinder the publication of high-quality manuscripts, because in some cases, such as manuscripts with highly technical content and terminology, the software's results can be inaccurate.
This issue was partially addressed in a recent study commissioned by COPE (Committee on Publication Ethics) and conducted by Zhang Y and Jia X, who are affiliated with the Journal of Zhejiang University-Science. The authors conducted a rigorous survey of journal editors from Anglophone and non-Anglophone countries to determine how they used CrossCheck, a plagiarism detection service powered by iThenticate, on the submissions they received. I highlight some of the study's findings here.
Of 219 respondents, 42% said they used CrossCheck as part of their submission process. When asked to what extent they relied on this tool, a majority of the editors (66%) said they relied on a combination of the CrossCheck report and editorial assessment or peer reviewers’ comments, while some (20%) said they relied completely on the report and outright rejected any manuscript with an unacceptably high similarity score. Another subgroup of editors (10%) said they sought the advice of peer reviewers on manuscripts where plagiarism was suspected, and a small proportion (4%) said they asked the manuscript authors for an explanation. On average, the editors thought an overall similarity index of ≥50% might be considered unacceptable and might prompt a “reject” decision.
Further, most of the editors thought that directly copying and pasting content was unacceptable and that such text should be reworded to the extent possible and accompanied by an appropriate citation. However, some editors said they would tolerate a degree of copying and pasting in the Materials and Methods section of an academic paper, because this section typically contains technical terms and a conventional style of description that cannot be reworded drastically.
So, to answer your question: many journal editors seem to understand the limitations of plagiarism detection tools and use them with discretion, in consultation with the authors or peer reviewers. Moreover, journal editors appear to be aware of cases in which rewording content would be very difficult, and they tend to make allowances in such situations. Nevertheless, it is true that some journal editors may over-rely on the software and promptly reject a manuscript with a high similarity index. Surveys such as this one, along with other educational initiatives, can help standardize practices across journals.
I would advise you to self-check your manuscript with a plagiarism detection tool before submission and to be cautious if you find a similarity index of ≥50%. If the main areas of similarity lie in the Materials and Methods section or arise from technical content that genuinely cannot be reworded, you could explain this in your cover letter. In the present case, I would suggest that you go over the parts of your manuscript that show similarity with previously published literature and reword the content to the extent possible. Seek the help of an English-speaking colleague or a professional language editing service, and ensure that all required sources are cited. You could also write to the journal editor, explaining why certain parts were flagged in the plagiarism check and requesting that the decision be reconsidered. In your letter, don’t forget to reiterate the novel aspects of your study and how it adds value to the existing literature.