Technical College Požarevac
Discussion
Started 25th Jul, 2021
Post publication performance indicators
Why is the current research seemingly lacking novelty and having low average citation index? Are there any standardized post publication indicators of scoring the conceptual creativity of a researcher and monitoring the contribution of published research towards solving the existing problem(s)?
Most recent answer
There is a conflict in data analysis between simple outcomes that are readily reviewed, such as single-figure metrics, and the more complex exhibits that describe the underlying activity. The default option, for time-limited research managers and policy makers, is to use the simple metrics but in doing so they may miss essential information that can aid interpretation, explain unexpected results and guide future investment...
For example, we look at the individual and their publications and consider the question: what is excessive self-citation? This is a matter of increasing concern when suspect publications appear to be proliferating and the validity of research publication statistics is under threat...
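To make the notion concrete, a self-citation rate can be computed once the author lists of the citing papers are known. The following is a minimal Python sketch under a simplified data model (one set of author names per incoming citation); the function name and inputs are illustrative, not the API of any bibliometric database:

```python
def self_citation_rate(citing_author_sets, author):
    """Fraction of incoming citations whose citing paper lists `author`
    among its authors. `citing_author_sets` holds one set of author
    names per citation received."""
    if not citing_author_sets:
        return 0.0
    self_cites = sum(1 for authors in citing_author_sets if author in authors)
    return self_cites / len(citing_author_sets)

# Three citations, two from papers co-authored by "A. Author":
# self_citation_rate gives 2/3 here.
rate = self_citation_rate(
    [{"A. Author", "B. Coauthor"}, {"C. Other"}, {"A. Author"}],
    "A. Author",
)
```

What counts as "excessive" is then a policy threshold on this rate, not something the arithmetic itself can decide.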
2 Recommendations
Popular replies (1)
Otto-von-Guericke-Universität Magdeburg
Dear Job Omweno thank you for asking this relevant technical question which is certainly of significant interest to many other RG members as well. Actually, I'm wondering what made you believe that "current research seemingly lacks novelty and has low average citation index"? I assume that this depends on the different areas of research. At least for our discipline, chemistry, my personal impression is totally different. Leading chemical journals which used to be published biweekly now have large issues every week and publish 20,000+ pages every year (cf. e.g. the Journal of the American Chemical Society at https://pubs.acs.org/journal/jacsat). Every day exciting new results are being published in chemistry and its sub-disciplines, and it has become rather difficult to keep up with all the exciting new developments.
As for the second part of your question ("Are there any standardized post publication indicators of scoring the conceptual creativity of a researcher and monitoring the contribution of published research towards solving the existing problem(s)?") I might add that there have always been both exciting and boring research articles. On the average, high-impact journals publish more exciting and innovative research articles, while routine work finds its place in lower IF journals. Unlike most engineers we always had the privilege of being able to do basic research. This means that we mostly did not care about immediate practical applications and even less about "post publication indicators". However, I know of various cases where our work later turned out to be highly useful for a variety of practical applications such as catalysis and materials science. In the end our work can certainly be called highly cited (ca. 11,000 citations on RG) but I never tried anything to actively influence this. It just happened.
Good luck with your research and best wishes, Frank Edelmann
5 Recommendations
All replies (8)
Federal Agency for Cartography and Geodesy
That "the current research seemingly lacking novelty and having low average citation index" cannot be stated in general. Is this your observation in your own field of research? There is a lot of innovative research in many fields.
Concerning "post publication indicators of scoring the conceptual creativity of a researcher": there is no indicator which can measure creativity. Citation measures (h-index etc.) primarily show how often a publication was used by others, but they do not give a direct measure of quality or creativity; see the previous discussions listed below.
Concerning "monitoring the contribution of published research towards solving the existing problem(s)": this would demand something like a "science police" punishing those who are not creative, but science does not work like this. There are mechanisms in the organization of science which select for creative researchers, although these mechanisms do not work ideally.
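For reference, the h-index mentioned above has a precise definition: it is the largest h such that the researcher has h publications with at least h citations each. A minimal, illustrative Python sketch (not tied to any particular citation database):

```python
def h_index(citations):
    """Largest h such that h of the given papers have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

# A researcher whose papers are cited [10, 8, 5, 4, 3] times has h-index 4:
# four papers have at least 4 citations, but not five papers with at least 5.
```

The definition illustrates the point made above: the h-index aggregates usage counts and says nothing about why a work was cited, so it cannot measure quality or creativity directly.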
3 Recommendations
Technical College Požarevac
Dear Wolfgang R. Dick, thanks for your contribution. Besides post-publication performance indicators such as citations, impact factor, CiteScore, etc., it is very important to bridge the gap between research results and the solution to the problem (application).
"The engine through which research activities, transformed into innovation, leads to practical application is clearly Entrepreneurship.
Policies, properly regulated national legal/fiscal frameworks, dedicated infrastructures, interested and equal partners as well as the presence of an efficient capital market, the need for resources to finance basic research and the exploitation of scientific results to develop innovation are the necessary framework to activate and facilitate the process but the crucial ingredient is the entrepreneurial approach of the researcher..."
4 Recommendations
Bangabandhu Sheikh Mujib Medical University
Job Omweno, I think publication in a high-impact-factor journal and an ample number of citations might be regarded as post-publication performance indicators.
1 Recommendation
Citation and indexing scores merely reflect usage of a publication; in most fields, the actual research can best be judged by applying it in practical scenarios and seeing how effectively and efficiently it solves a real-world problem.
Technical College Požarevac
Clarivate has added the Preprint Citation Index™ to the Web of Science™ platform. Researchers can now locate and link to preprints alongside other trusted content in the database, to streamline the research process and help make meaningful connections faster.
In academic publishing, a preprint is a version of a research paper or outcome publicly available in online repositories prior to peer review. Access to preprints in the Web of Science makes it quicker and easier for researchers to include them in their existing research workflows. It enables immediate access to up-to-date, aggregated and searchable preprints from selected repositories linked to author profiles...
5 Recommendations
Similar questions and discussions
Can the authors' rebuttal reverse the editors' decision?
Job Omweno
Most journals use double-blind review (where the reviewers are anonymous to the authors and vice versa), but a few journals publish the reviewers' comments alongside the authors' responses in an open review.
When the authors (through the corresponding author) submit a manuscript to a journal, a technical editor makes an initial perusal of the document and decides whether the manuscript meets the basic criteria for publication, such as referencing style and formatting, and lies within the journal's scope, and hence whether it can move forward at this point. If not, the manuscript is desk-rejected and is never sent for review. Depending on the journal, most manuscripts are rejected at this stage.
If the manuscript meets these criteria, the editor then sends it (I think) to three or four reviewers who have expertise in the field but no conflict of interest with the research. It is assumed that the anonymous reviewers will make independent decisions, which will form the basis for rejecting or not rejecting the manuscript. The reviewers critically read the manuscript and comment on its novelty, significance, technical quality and the possible research impact if published.
To some authors, the reasons for rejecting the manuscript look minor and unrelated to the main idea of the proposition, hence they wonder whether they should submit a rebuttal to the editor's decision (refer to the previous discussion by Mumtaz Ali, May 2020: https://www.researchgate.net/post/rebuttal_to_editors_decision).
To beginners or newbies in competitive research, the rejection of a manuscript can be extremely frustrating and hard to cope with, especially when the reason for not considering the manuscript for publication is minor and largely unsubstantiated. It sometimes evokes negative responses from the author (refer to the previous discussion by Panayiotis Koutentis, November 2013: https://www.researchgate.net/post/When_should_you_challenge_an_editors_decision_to_reject_a_paper), which may affect their future authorship and submissions to the journal.
It is particularly hard to deal with the first rejection when the author has sought out the most suitable journal for their paper, but they will be better placed if they rewrite the manuscript and send it to another prioritized journal. Nevertheless, the authors may be experienced in the field, well informed in the subject of the research, and may have a preference for this particular journal. What if the authors decide to send a rebuttal to the editors' decision? Will the editors care to read a contrary response and provide feedback? Is there any chance that the editors will reconsider and rescind their decision on the manuscript?
Related Publications
This chapter explains the importance of creativity, innovation, and constant learning in establishing a learning culture, and identifies the key role of leaders in creating meaning and establishing that culture.
video: https://www.youtube.com/watch?v=Y_SHbhNHPww