Are the works that cite Beall’s List accounting
journals comparable to those that cite Scopus
journals of similar citation impact?
Forthcoming, Aslib Journal of Information Management
William H. Walters
Mary Alice & Tom O’Malley Library, Manhattan College, Riverdale, New York, USA
william.walters@manhattan.edu
Abstract
Purpose—Evaluates the apparent quality of the 10 Beall’s List accounting journals with
the highest citation rates by investigating whether the works that cite those journals
are comparable to those that cite 11 Scopus journals of similar citation impact.
Design/methodology/approach—Investigates the characteristics of the works that cited
the Beall’s List and Scopus journals, 2015–2020, comparing the two groups of citing
works by publication type (article, book, etc.), extent of self-citation, inclusion in
Beall’s List and Scopus, Open Access (OA) status, publisher type, citation impact, and
country/region of interest.
Findings—These predatory accounting journals tend to be cited in less reputable outlets;
they are especially likely to be cited in Beall’s List journals and especially unlikely to
be cited in Scopus journals. However, other evidence suggests that these journals
occupy a distinctive niche based on characteristics other than citation quality. The
Beall’s List citing works are especially likely to be OA, to be published by universities
and other nonprofits, and to focus on lower-income countries.
Originality/value—Beall’s List accounting journals may be especially useful to scholars
who rely on OA journals, who see their local universities as natural publishing
partners, and who investigate topics of concern to developing countries. An increase
in the number of non-predatory journals that cater to these authors’ needs might help
resolve the apparent problem of unmet demand for journal space.
Keywords: Citations, Google Scholar, Impact, Journal rankings, Predatory journals,
Quality
Introduction
Researchers, practitioners, university committees, and funding agencies often rely on a
journal’s reputation as an indicator of the type and quality of the articles published in the
journal. Although the reputation of a journal does not necessarily correspond to the
quality of any particular article, the careful evaluation of each individual paper is not
always feasible (Adler and Harzing, 2009; Ferrara and Bonaccorsi, 2016; Lozano et al.,
2012; Seglen, 1994, 1997). Journal rankings and perceptions of quality therefore play an
important role in the scholarly communication system.
Journal quality is not a single, straightforward construct, however, and at least six
dimensions of quality can be identified (Walters, 2022):
1. The editors’ and publishers’ motives and intentions
2. Adherence to established norms of peer review such as anonymity, use of expert
reviewers, absence of bias in reviewer selection, adequate time for review, and a
reasonable acceptance rate
3. Adherence to norms of scholarly publishing other than peer review (e.g.,
transparency, long-term preservation of content, professionalism in presentation, and
a web interface that facilitates discovery and access)
4. Scholarly quality (e.g., the subjective assessments of expert evaluators)
5. Impact on subsequent scholarship (e.g., number of times cited, outlets in which the
journal is cited, rate at which citations accrue, multidisciplinary impact, and extent to
which the theories, perspectives, and methods introduced in the journal are
incorporated into later work)
6. Impact on teaching and practice, as shown by citations in textbooks, inclusion of
articles in course syllabi and reading lists, and influence on professional norms and
standards.
The distinctions among these six dimensions of quality are readily apparent when
we consider the most widely known lists of predatory journals—Open Access (OA)
journals that pose as peer-reviewed outlets but accept nearly all submissions in order to
maximize revenue from the article processing charges (APCs) paid by authors, their
institutions, and their funding agencies. For example, Beall’s List defines predatory
journals based on the publisher’s motives (dimension 1) but actually selects publishers
and journals for inclusion based on dimensions 2 and 3, neither of which directly
represents the scholarly quality of the papers published in the journals (Beall, 2012; Beall
et al., 2021; Cobey et al., 2018; Krawczyk and Kulczycki, 2021).
This study gauges the apparent quality of the papers published in Beall’s List
accounting journals by investigating whether the works that cite those journals are
comparable to those that cite Scopus journals of similar citation impact. The analysis is
based on the assumption that the papers cited in reputable journals are themselves
reputable—that the authors who publish in well-regarded journals are in a good position
to judge the quality of the works they cite. In contrast, the papers cited in less reputable
outlets (such as predatory journals) are presumably less likely to be legitimate
contributions to the literature.
The paper first presents contextual information—an overview of the ways in which
the 10 Beall’s List accounting journals with the greatest number of citations per article
(2015–2020) are like or unlike the other Beall’s List accounting journals. The main
analysis then compares the characteristics of the works that cited (a) the 10 Beall’s List
accounting journals with the greatest number of citations per article and (b) 11 Scopus
journals of comparable citation impact. Specifically, it compares the two groups of citing
works with regard to publication type (journal article, book, chapter, conference paper,
thesis, or working paper), extent of self-citation, inclusion in Beall’s List, inclusion in
Scopus, Open Access status, publisher type (commercial publisher, scholarly society,
university, or other nonprofit), citation impact (CiteScore and CiteScore percentile), and
country or region of interest (by World Bank income classification and for 21 particular
countries and regions).
Previous research
Beall’s List
Beall’s List (Beall et al., 2021) is the oldest and best known list of potentially predatory
Open Access (OA) journals. It identifies more than 1300 publishers and nearly 1500
additional journals that authors are encouraged to avoid. Beall’s List was removed from
the web in January 2017, but it has since re-emerged and is now maintained by an
anonymous scholar. The original List is preserved in its final form, while a new section
contains additions and revisions made since then. As of May 2022, the latest
updates were from December 2021.
As noted earlier, Beall’s List does not actually account for publishers’ intentions,
predatory or otherwise, since intentions are difficult to ascertain (Beall, 2012; Cobey et al.,
2018; Krawczyk and Kulczycki, 2021). Likewise, the evaluation process does not consider
the quality or impact of the articles published in each journal. Instead, the compilers of
Beall’s List have relied on a set of subjective criteria that represent departures from the
established norms of scholarly communication. For instance, the 54 warning signs
identified by Beall (2012, 2015a) include unusually rapid peer review, failure to identify
editors and board members, boastful language, misleading claims about index coverage
or citation impact, lack of transparency in editorial operations, absence of a
correction/retraction policy, poor copyediting, assignment of copyright to the publisher
rather than the author despite the journal’s OA status, and the use of spam e-mail to
solicit authors or board members.
There are multiple sources of legitimacy in scholarly publishing, and the journals
identified as predatory by one standard may be regarded as legitimate by other
standards (Siler, 2020). Conversely, some legitimate journals display idiosyncrasies that
might lead potential authors to question their integrity (Eriksson and Helgesson, 2018;
Grudniewicz et al., 2019). For instance, Olivarez et al. (2018) found that 18 of the 81
library and information science journals indexed in Web of Science could be regarded as
predatory based on two or more of Beall’s criteria; eight could be regarded as predatory
based on three or more criteria. Likewise, Shamseer et al. (2017) reported that the
journals on Beall’s List are most distinct from conventional journals in ways unrelated to
the content of the articles themselves—in the quality of their web sites, the intended
audience of those sites (authors rather than readers), their lack of transparency regarding
the review process, and their relatively low article processing fees.
Beall’s List has been criticized on other grounds as well. The problems most often
mentioned include imprecise evaluation criteria, inconsistent application of those criteria,
lack of transparency, bias against publishers in developing countries, infrequent updates,
lack of an appeals policy, and the absence of systematic mechanisms for the re-evaluation
of publishers and journals as conditions change (Berger and Cirasella, 2015; Chen, 2019;
Davis, 2013; Dony et al., 2020; Esposito, 2013). The last of these criticisms is especially
serious, since a publisher currently on Beall’s List may appear there due to the
characteristics of its web site more than a decade ago (Walters, 2022).
Citation impact of predatory journals
Three of the six dimensions of journal quality mentioned in the Introduction are
essentially procedural. Three others—scholarly quality (as assessed by expert reviewers),
impact on subsequent scholarship, and impact on teaching and practice—are more
fundamental because they are outcome-based; they concern the final product rather than
the methods by which the product was created. However, none of the 15 lists of
predatory or non-predatory journals identified by Koerber et al. (2020) and Strinzel et al.
(2019) evaluate journals based on their scholarly quality or impact. Assessments of
quality are especially problematic, since (1) they are potentially unreliable due to the
influence of reviewers’ individual backgrounds and perspectives; (2) the evaluation
process requires considerable effort; and (3) scholarly quality is ultimately a
characteristic of individual articles rather than journals, so the calculation or estimation
of journal-level statistics requires the relatively time-consuming evaluation of individual
articles.
Fortunately, the processes by which authors identify, evaluate, and cite relevant
papers may be regarded as an indirect means of assessing their quality. While Beall
(2015b), Moussa (2021a, 2021c), and others have stated that predatory journals bypass the
conventional peer review process and allow “un-vetted” research to enter the scholarly
record, the absence of serious peer review actually removes just one component of the
vetting process. Research papers are evaluated not just before they are published, but
also afterward, when authors decide whether to cite them. In most cases, evaluating
citation impact—that is, evaluating the cumulative effect of authors’ decisions to cite or
not cite particular papers—is the nearest we can get to a formal, expert assessment of
scholarly merit.
Although predatory journals tend to have citation rates substantially lower than
those of non-predatory journals, particular predatory journals (and particular articles)
may be similar to mid-ranked Scopus titles in their citation impact. This finding can be
seen in multidisciplinary analyses (Bagues et al., 2019; Björk et al., 2020; Moed et al., 2022)
and in studies of fields such as marketing (Moussa, 2021b) and the biomedical sciences
(Nwagwu and Ojemeni, 2015). Overall, these studies support the conclusion that while
most predatory journals have little scientific impact, some do include articles that could
have been placed in much better journals if the authors hadn’t “opted for the fast track
and easier option of a predatory journal” (Björk et al., 2020, pp. 1, 8).
Within the field of accounting, a journal’s adherence to procedural norms
(dimensions 2 and 3) is not necessarily a good predictor of the extent to which the articles
in the journal are used in subsequent scholarly work (dimension 5). At least six of the
journals on Beall’s List are cited at rates that would place them at or above the 25th
percentile among all Scopus accounting journals. Likewise, high-impact articles—cited
up to several hundred times—have appeared even in some of the Beall’s List accounting
journals with the lowest citation rates (Walters, 2022). From one perspective, we might
argue that a higher citation rate for a Beall’s List journal reflects favorably on the journal
by demonstrating that it contributes to the literature despite the factors working against
it—biases against OA journals, biases against authors and publishers in the developing
countries where many such journals originate, poor index coverage, and authors’
understandable reluctance to cite journals that have been publicly labeled as predatory
(Albu et al., 2015; Asai, 2021; Berger and Cirasella, 2015; Frandsen, 2017; Nwagwu and
Ojemeni, 2015; Shen and Björk, 2015; Xia et al., 2015; Yan and Li, 2018). From this first
perspective, a relatively high citation count for an article in a Beall’s List journal reveals
that subsequent authors have identified it as a genuine contribution to the literature
despite the many biases that make them inclined not to cite it.
There is another perspective, however. Several authors have argued that citations to
the papers in Beall’s List journals are downright harmful—that these citations reveal the
extent to which the scholarly literature has been contaminated by low-quality research,
flawed methods, and potentially false results (Anderson, 2019; Akça and Akbulut, 2021;
Frandsen, 2017; Moussa, 2021b, 2021c; Nelson and Huffman, 2015). From this second
perspective, the more highly cited Beall’s List journals are a danger because they have
been successful at claiming legitimacy for papers that may be inaccurate, biased, or
otherwise misleading.
Quality of the articles in predatory journals
Of course the implications of high or low citation rates depend on the actual scholarly
quality of the papers published in the predatory journals. Just two studies have directly
addressed this issue. The most careful analysis compared 25 articles published in Beall’s
List psychology journals with 25 published in Scopus journals of intermediate citation
impact (McCutcheon et al., 2016). Each article was blinded, then reviewed by the
research team on the basis of five criteria. Although the team found more than four times
as many statistical and grammatical errors in the Beall’s List articles, the score
differentials for the other three criteria (literature review, research methods, and overall
contribution to science) were not nearly as pronounced. Overall, McCutcheon et al. were
struck by the wide variations in quality among the Beall’s List articles. Some were of
uniformly low quality while others received high marks in every area. A second study
reviewed 358 articles in Beall’s List nursing journals, reporting that 48% of the papers
were rated poor, 48% average, and just 4% excellent, and that many had numerous errors
in writing or presentation. Nonetheless, only 5% reported findings “potentially harmful
to patients or others” (Oermann et al., 2018, p. 8). That analysis may have been biased,
however, since all the assessors knew in advance that the papers had been published in
predatory journals.
Direct investigations of the scholarly quality of the papers in Beall’s List journals are
likely to provide the best evidence of their favorable or unfavorable impact on the
literature. A less direct but still informative approach is to evaluate whether the works
that cite Beall’s List journals are themselves of high or low quality, published in reputable
or questionable journals, by experienced or inexperienced authors. Three recent studies
suggest that predatory journals tend to be cited by works of modest reputation or impact
but that the differential between predatory and non-predatory journals is not substantial.
Frandsen (2017) compiled information on the first authors of 1295 papers that cited 124
Beall’s List journals in a range of disciplines, reporting that the characteristics of the
citing authors closely resemble those of the cited authors. In particular, both groups tend
to be based in South and Southeast Asia, and both tend to have relatively low publication
and citation counts. Frandsen’s results are not stratified by subject area, however, so it is
possible that Beall’s List authors and citers simply publish disproportionately in fields or
subfields with low publication and citation rates.
Moussa (2021c) evaluated the citations to four predatory marketing journals that
appeared in 68 non-predatory journals listed in the 2018 Guide of the Chartered
Association of Business Schools (CABS). He found that the number of citations to
predatory journals was unrelated to the age or citation impact of the citing journal but
weakly related to the subjective journal ratings assigned by CABS. Specifically, the
journals in the top CABS category (4*) cited the predatory journals at lower rates than
those in the lowest two categories (1 and 2).
Finally, Kulczycki et al. (2021) reviewed 3234 articles published in 65 predatory
social science journals, reporting that just 13% were cited in one or more Web of Science
journals over a seven-year period. This suggests that the Beall’s List journals tend to be
cited in less prestigious outlets—publications other than the Web of Science journals.
The citing journals are not always low-impact publications, however, since the Web of
Science journals that cited more than one article from a predatory journal were especially
likely to have high impact factors.
Methods
Cited journals
The 10 Beall’s List accounting journals selected for this investigation are those with the
highest year-adjusted number of citations per article. As described fully elsewhere
(Walters, 2022), each journal’s year-adjusted citation count indicates the number of times
the articles published from January 2015 through December 2018 were cited within the
works indexed by Google Scholar from the publication date to the search date (July
through September 2020), adjusted to approximate the number of citations that would
have accrued over the 2015–2020 period if all the articles had been published in 2015. In
turn, citations per article is simply the year-adjusted citation count divided by the number
of articles published during the 2015–2018 period.
Three of the 10 accounting journals in the Beall’s List group were also indexed in
Scopus at the time the data were compiled. The comparison group of Scopus journals
therefore includes those three journals as well as the eight other Scopus accounting
journals that most closely matched each of the Beall’s List journals in terms of citations per
article. Book series indexed in Scopus were excluded from consideration, as were
journals that published more than a third of their articles in languages other than
English.
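As an illustration only (the paper does not spell out the matching algorithm, and the journal names and values passed to this function would be placeholders rather than the study data), the nearest-match selection of comparison journals might be sketched as follows:

```python
# Illustrative sketch: pair each Beall's List journal with the Scopus accounting
# journal whose citations-per-article value is closest, using each Scopus
# candidate at most once. This is one plausible reading of the matching
# procedure, not the study's actual code.

def match_comparison_journals(bealls_cpa, scopus_cpa):
    """bealls_cpa, scopus_cpa: dicts mapping journal name -> citations per article."""
    matches = {}
    available = dict(scopus_cpa)                 # remaining Scopus candidates
    for journal, cpa in sorted(bealls_cpa.items(), key=lambda kv: -kv[1]):
        best = min(available, key=lambda name: abs(available[name] - cpa))
        matches[journal] = best
        available.pop(best)                      # do not reuse a matched journal
    return matches
```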
With the overlap between the two groups, the study population (from which the
cited articles were drawn) includes 10 Beall’s List journals and 11 Scopus journals but just
18 journals altogether. As Table 1 shows, the Beall’s List journals and the Scopus journals
are identical in terms of mean citations per article and are very similar in several other
respects.
Sampling of cited works and citing works
Complete population data were compiled for every article (every cited work) published
from 2015 to 2018 in each of the 18 cited journals. However, this study is based mainly
on the characteristics of the works that cited those articles—the citing works. Two-stage
sampling was used to arrive at a list of citing works for which data were compiled. In
the first stage, 25 articles published from January 2015 through December 2018 were
selected at random from each of the 18 journals—450 articles, total.
The second stage was based on a list of the scholarly works (journal articles, books,
book chapters, conference papers, theses, and working papers) that cited those 450
articles during the 2015–2020 period. Individual article searches in Google Scholar were
used to compile the list in August 2021. This manual method avoids the difficulties
associated with the use of automated search tools (Wilder and Walters, 2021, pp. 8–10).
Basic bibliographic information was recorded for each of the 3255 works that cited any of
the 450 articles.
For each of the 18 journals, 100 citing works were selected at random, and those are
the works for which additional information was compiled. (See Data compilation, below.)
For three of the cited journals, however, there were fewer than 100 works that cited the
25 selected articles. The sample therefore includes all 69 works that cited the 25 articles
in Accounting and Finance Research, all 94 works that cited the 25 articles in the
International Journal of Financial Research, and all 87 works that cited the 25 articles in
Risks.
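A minimal sketch of this two-stage design, assuming the list of works citing each article has already been compiled from Google Scholar (the data structures and names here are illustrative, not the study's actual code):

```python
import random

def two_stage_sample(articles_by_journal, citing_works_of, seed=0):
    """articles_by_journal: journal -> list of its 2015-2018 articles.
    citing_works_of: article -> list of works that cited it, 2015-2020."""
    rng = random.Random(seed)
    sample = {}
    for journal, articles in articles_by_journal.items():
        # Stage 1: 25 randomly selected articles per cited journal.
        stage1 = rng.sample(articles, 25)
        # All works that cited those 25 articles.
        pool = [work for art in stage1 for work in citing_works_of[art]]
        # Stage 2: 100 citing works per journal, or the whole pool when fewer
        # than 100 works cited the 25 selected articles (as with three journals).
        sample[journal] = rng.sample(pool, min(100, len(pool)))
    return sample
```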
Data compilation
Publisher information for each of the citing works was compiled in September 2021 from
the journals’ web sites, and each publisher was categorized by OA status (OA or not) and
publisher type (commercial publisher, scholarly society, university, or other
nonprofit/government). Journals published jointly by nonprofit organizations and
commercial publishers were included in the commercial publisher category.
Of the citing works, 1112 are journal articles. Those articles appeared in 769
different journals, which were each checked in October 2021 to see if they appeared in
Beall’s List and/or Scopus. CiteScore information was compiled for the Scopus journals.
Each citing work (e.g., each article or chapter) was also evaluated individually to
determine whether it was associated with a particular country or region. In all but a few
cases, a country or region was identified/assigned if the title of the work included a place
name, if the title of the work was in a language other than English, or if the work was a
thesis submitted to a college or university. However, no country or region was assigned
if the place name referred to an approach or perspective rather than a location (e.g.,
Austrian economics). World Bank (2022) income categories were used to classify each
country. If a work involved two countries in different regions, one with a higher income
than the other, the designation for the lower-income country was used.
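A small sketch of the assignment rule just described, assuming each citing work has already been linked to zero or more countries (the income-group lookup and field names are placeholders; the study used the World Bank 2022 country and lending groups):

```python
# Rank the World Bank income groups from lowest to highest.
INCOME_RANK = {"low": 0, "lower middle": 1, "upper middle": 2, "high": 3}

def classify_work(countries, income_group_of):
    """countries: countries/regions a citing work focuses on (possibly empty).
    income_group_of: dict mapping country -> World Bank income group."""
    if not countries:
        return None                              # no country or region assigned
    groups = [income_group_of[c] for c in countries]
    # When a work involves countries in different income groups, use the
    # designation for the lower-income country.
    return min(groups, key=lambda g: INCOME_RANK[g])
```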
Case weights and significance tests
This study makes use of simple descriptive statistics—counts, percentages, means,
standard deviations, and medians. However, because the sampling method used here
results in an equal number of citing works from nearly every journal, the unweighted
sample data are not representative of the population. Within the population, the number
of citing works is proportional to the citation impact of each cited journal. For instance, a
journal with twice as many citations as another should be represented by twice the
number of citing works.
To ensure that the reported results are representative of the population rather than
the sample, I statistically weighted each citing work by the cited journal's overall citation
impact—its total year-adjusted citation count. Except as mentioned in the table notes,
these case weights were used whenever the works that cited the 10 Beall’s List accounting
journals were compared with the works that cited the 11 Scopus journals. For these same
comparisons, significance tests were undertaken using two-sample t tests for means or
chi-square tests for proportions, as appropriate (MedCalc, 2022a, 2022b).
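A minimal sketch of how such a weighted comparison can be computed (the study used the MedCalc calculators; the field names below, such as "weight" and "in_scopus", are illustrative assumptions rather than the study's actual variables):

```python
import math

def weighted_share(citing_works, flag):
    """Share of citing works with a given attribute, where each work is weighted
    by the total year-adjusted citation count of the journal it cites."""
    total = sum(work["weight"] for work in citing_works)
    hits = sum(work["weight"] for work in citing_works if work[flag])
    return hits / total

def compare_proportions(p1, n1, p2, n2):
    """Two-tailed p value for the difference between two independent proportions
    (equivalent to a chi-square comparison of two proportions)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Usage (illustrative):
#   p1 = weighted_share(bealls_citing_works, "in_scopus")
#   p2 = weighted_share(scopus_citing_works, "in_scopus")
#   p_value = compare_proportions(p1, len(bealls_citing_works),
#                                 p2, len(scopus_citing_works))
```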
Contextual results
These results compare the top 10 Beall’s List accounting journals—those with the most
citations per article—with 10 other Beall’s List accounting journals selected at random
from those that published in all four years of the study period. The journals are shown in
Table 2.
Distinctive characteristics of the top 10 Beall’s List journals
As Table 2 reveals, the top 10 Beall’s List journals are different from the others in at least
four respects. On average, they have higher citation rates, they publish more articles per
year, they have earlier founding dates, and they are more likely to be indexed in Scopus.
These differences are apparent despite the high internal variation within each group. For
instance, the 10 higher-impact journals received from 6.7 to 11.1 citations per article,
published from 26 to 454 articles over the four-year period, and were founded as early as
2001 or as recently as 2015.
Further analysis revealed one additional distinction between the 10 higher-impact
journals and the others. All but three of the top 10 journals are based in the United States
or Canada; the exceptions are the Journal of Accounting and Taxation (Nigeria), the
International Journal of Academic Research in Accounting, Finance and Management Sciences
(Pakistan), and Investment Management and Financial Innovations (Ukraine). In contrast,
the publishers of the other 10 journals are based in eight different nations: the United
States and India (2 journals each), Australia, Austria, Belgium, Jordan, Nigeria, and the
United Kingdom.
Editorial boards of the Beall’s List journals
Although the two groups are not significantly different in the average size of their
editorial boards (Table 2), further investigation revealed an unexpected finding: five of
the top 10 journals (and three of the lower-ranked Beall’s List journals) list no editor-in-
chief or list an editor-in-chief who is a publishing company executive rather than an
individual with a full-time teaching or research appointment. Perhaps surprisingly,
however, the absence of independent editorial positions has not limited the scholarly
impact of those journals, which do not differ significantly from the other top 10 journals in
their citation impact (two-tailed test at the 0.05 level). Every top 10 Beall’s List journal does provide
information on its editorial board members, who seem no different from the board
members of conventional journals in their academic positions and institutional
affiliations. Ruiter-Lopez et al. (2019) have reported a similar finding—that the boards of
many Beall’s List journals appear to be well qualified in terms of their academic
affiliations.
Of the 20 journals shown in Table 2, all but three have editorial boards with broad
geographic representation and no single dominant country or region. The exceptions are
the top-10 International Journal of Academic Research in Accounting, Finance and Management
Sciences (Romania) and two lower-ranked journals, the Journal of Accounting and Financial
Management (Nigeria) and the Journal of Advance Research in Business Management and
Accounting (India).
Other characteristics of the Beall’s List journals
For these 20 journals, the mean article processing charges are generally reasonable (Table
2). This is consistent with earlier reports that the journals on Beall’s List tend to charge
relatively low APCs (Shamseer et al., 2017; Xia, 2015). Eight of the journals charge APCs
of $200 or less, and two charge no APCs at all. Of course this raises the question of how
those two “predatory” journals generate revenue without charging either author fees or
subscription fees.
Although the top 10 journals are more likely than the other Beall’s List journals to be
indexed in Scopus, none of the journals shown in Table 2 are indexed extensively. Just
three of the top 10 journals are indexed in Scopus, and just two are included in the
Directory of Open Access Journals (DOAJ). None of the other 10 journals are included in
either database. Just one of the 20 journals, Investment Management and Financial
Innovations, is indexed by ABI/INFORM, and none are indexed by either EconLit or Social
Sciences Citation Index.
Main results
These results compare the works that cited the top 10 Beall’s List accounting journals
with the works that cited 11 Scopus accounting journals of equivalent citation impact.
The cited journals are shown in Table 1. All the inter-group differences mentioned in this
section are significant at the two-tailed 0.05 level.
Types of citing works
Perhaps not surprisingly for a field such as accounting, the cited works are cited mainly
in journals. This holds true for both the Beall’s List group and the Scopus comparison
group. As Table 3 shows, more than 60% of the citations appear in journals, 23% in
theses or dissertations, 7% in conference papers, and fewer than 5% each in working
papers, book chapters, and books. The Beall’s List journals and the Scopus journals are
cited by the same types of publications, and the one significant difference between the
two groups can be attributed to just a few journals. Three Beall’s List journals—the
Global Journal of Management and Business Research D, the Journal of Accounting and
Auditing, and the Journal of Finance and Accounting—are cited especially often in journals
(74–76% of the time) while three Scopus journals—EC Tax Review, the Journal of Business
Valuation and Economic Loss Analysis, and the Journal of Payments Strategy and Systems—are
cited less often in journals (44–48%) and relatively often in books (4–10%).
Characteristics of the articles that cite the Beall’s List and Scopus journals
One concern is that the Beall’s List accounting journals comprise an independent citation
network—that they are cited mainly by each other. Table 4 reveals that this is not a
problem, however. In fact, the top 10 Beall’s List journals have significantly fewer
journal and publisher self-citations than the Scopus journals.
Nonetheless, the articles in the Beall’s List journals do receive disproportionately
many of their citations from other Beall’s List journals. This finding does not itself settle
the question of whether the citations to the Beall’s List journals are legitimate, however,
since it can be used to support at least three different interpretations:
1. The higher percentage for the Beall’s List journals—23% rather than 17%—reveals the
inferior quality of the works that cite those articles.
2. The modest difference between the two groups, just 6 percentage points, is not a cause
for concern.
3. Both the Beall’s List journals and the Scopus journals are afflicted by a similar
problem; both are routinely cited in journals that are themselves suspect.
Inclusion in Scopus is sometimes regarded as an indicator of journal quality, since
the Scopus selection criteria include regularity of publication, number of articles
published, citation impact, self-citation rate, integrity and transparency of peer review,
adoption of a statement on ethics, consistency between journal content and the stated
aims of the journal, editor’s scholarly/professional standing, diversity in authorship and
board membership, readability, and quality of online presentation (Elsevier, 2022;
Holland et al., 2021). As Table 4 shows, only 24% of the citations to the Beall’s List
accounting journals appear in Scopus journals; for the Scopus comparison group, the
value is 44%. This finding is consistent with the first interpretation mentioned above—
that the Beall’s List journals are disproportionately cited in less prestigious (non-Scopus)
publications.
However, Table 4 also reveals that the Beall’s List journals are especially likely to be
cited in Open Access journals, in journals from nonprofit publishers, and in university-
sponsored journals. It is possible that these three types of journals are underrepresented
in Scopus for reasons unrelated to their quality. That is, the journals of the smaller
nonprofit publishers, such as individual university departments, are perhaps less likely
to be identified and evaluated by Scopus. After all, the Scopus Content Selection and
Advisory Board reviews the works that have been suggested to them by librarians,
publishers, and journal editors (Elsevier, 2020, pp. 6, 8, 16). Certain journals may have
been omitted from Scopus simply because they were never evaluated for possible
inclusion.
Citation rates of the Scopus-indexed citing journals
As noted earlier, the top 10 Beall’s List accounting journals are relatively unlikely to be
cited by the journals indexed in Scopus. However, the Scopus-journal articles that do cite
Beall’s List journals have an average journal citation rate no different from that of the
Scopus-journal articles that cite other Scopus journals. This can be seen in Table 5.
Perhaps surprisingly, the one significant difference between the two groups reveals a
higher citation impact for the Beall’s List citing journals. However, this difference can be
attributed entirely to two Scopus journals that are seldom cited by journals with
CiteScores at or above the 20th percentile—EC Tax Review (22%) and the Journal of
Payments Strategy and Systems (57%).
Countries and regions associated with the works that cite the Beall’s List and
Scopus journals
Previous studies have demonstrated that predatory journals tend to attract both authors
and citers from developing countries (Frandsen, 2017; Nwagwu and Ojemeni, 2015; Shen
and Björk, 2015; Xia et al., 2015). Table 6 supports this assertion. The works that cite
Beall’s List journals are especially likely to focus on countries in the World Bank’s lower
middle income category (i.e., countries with per capita gross national incomes from $1046
to $4095) and especially unlikely to focus on countries in the high income category
($12,696 or higher).
Moreover, 80% of the works that cite the Beall's List journals, but just 66% of those
that cite the Scopus journals, deal with particular countries or regions. Just as the articles
in Beall’s List journals seem especially useful to authors who publish in nonprofit and
university-sponsored journals (Table 4), they may also be especially useful to authors
who write about conditions, issues, and perspectives specific to lower-income countries
(Table 6). In particular, many of the citing articles deal with topics of substantial local
interest but little international interest (e.g., the impact of recent legislation on methods of
estimating the value of equipment in a locally important industry).
Table 7 shows the results for particular countries and regions. Indonesia and
Nigeria, both in the World Bank’s lower middle income group, are far more likely to be
featured in the works that cite the Beall’s List journals than in those that cite the Scopus
journals. (As noted earlier, one of the top 10 Beall’s List journals is based in Nigeria.
Another has relatively many Nigerian faculty on its editorial board.) In contrast, the
works that cite the Scopus journals are more likely to focus on Malaysia, South Africa,
and Europe. Ukraine, the United States, and Turkey are featured relatively often in both
sets of citing works.
Discussion and conclusion
Two key findings of this study support the idea that the top 10 Beall’s List accounting
journals tend to be cited in less reputable outlets than the Scopus accounting journals of
similar citation impact. First, the articles in the Beall’s List group are about 35% more
likely than those in the Scopus group to be cited in Beall’s List journals. Second, those
same articles are especially unlikely to be cited in Scopus journals.
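The figure of about 35% is consistent with the Table 4 shares reported earlier, with 23% of the citations to the Beall’s List group but only 17% of the citations to the Scopus group appearing in Beall’s List journals:

```latex
\[
  \frac{0.23 - 0.17}{0.17} \;\approx\; 0.35
\]
```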
At the same time, other findings suggest that the Beall’s List citing works may be
simply distinctive rather than lower in quality; they are especially likely to appear in
Open Access journals, to be published by nonprofit agencies (universities, in particular),
and to focus on lower-income countries. These findings raise the possibility that the
topics and approaches covered in Beall’s List journals are especially useful to scholars
without access to the full range of subscription journals, to those who see their local
universities as natural publishing partners, and to those who investigate topics of special
concern to developing regions. It is even possible that the relatively low number of
Beall’s List citing journals indexed in Scopus may be linked to factors other than
scholarly quality—that OA journals, small nonprofit publishers, and journals of special
interest to researchers in lower-income countries may be systematically
underrepresented in Scopus. This explanation is conjectural, but it is also consistent with
the methods used to identify and evaluate potential additions to the database (Elsevier,
2020, pp. 6, 8, 16). Finally, the Beall’s List citing works are especially unlikely to engage
in either journal self-citation or publisher self-citation. This is contrary to the notion that
they represent an independent citation network apart from the main body of accounting
literature.
A legitimate niche?
Direct assessments of scholarly quality suggest that the articles in Beall’s List journals are
prone to errors in writing, presentation, and statistical analysis. Most have no
fundamental flaws, however, and at least some contributions are comparable to the
articles that typically appear in mid-level journals (McCutcheon et al., 2016; Oermann et
al., 2018, p. 8). Previous analyses of the works that cite predatory journals show that they
often cater to the needs of authors and readers in developing countries. They are
generally written by less experienced authors and appear in less prestigious outlets,
although particular papers in Beall’s List journals have been cited by well known
scholars in top-ranked publications (Frandsen, 2017; Kulczycki et al., 2021; Moussa,
2021c).
The findings reported here support these earlier conclusions. They also suggest a
possible explanation—that Beall’s List journals tend to be cited in lower-ranking
publications because of their emphasis on developing countries. Authors in developing
countries are especially likely to publish in OA journals, to focus on research topics of
local or national interest, and to conduct applied or practice-oriented research (Asai,
2021; Contreras, 2012; Tennant et al., 2016; Vuong et al., 2020). Moreover, despite strong
pressure to publish in internationally ranked journals, many accounting faculty outside
Western Europe and North America find the journals published by national universities
and professional associations especially appropriate for their work (Albu et al., 2015).
Beall’s List journals, and the works that cite them, may therefore occupy a niche that is
underserved by other accounting journals. The results of this analysis are consistent with
the idea that in lower-income countries, nonprofits such as universities and scholarly
societies sponsor OA journals that draw on (and cite) the kinds of locally relevant articles
that often appear in Beall’s List journals. These citing publications, in turn, are perhaps
less likely than others to be evaluated for inclusion in indexes such as Scopus and Web of
Science.
Although this interpretation explains the demand for OA journals that deal with the
concerns of accounting scholars in developing countries, it does not explain the success
of predatory OA journals. After all, predatory journals are likely to be attractive to
authors under just three conditions:
1. The journal’s predatory status is not known to the author.
2. Although the journal has “predatory” characteristics or has been publicly labeled as
predatory, both authors and evaluators (such as promotion committees) are willing to
concede that it might be a legitimate journal. The distinction between predatory and
legitimate journals is not always clear, and at least one or two publishers once
identified as predatory have since managed to improve their reputations considerably
(Academic Journals, 2022; MDPI, 2019).
3. From the author’s perspective, the potential rewards of publishing in a journal known
to be predatory are perceived to be high relative to (a) the effort required to place a
paper in the journal and (b) the possible stigma associated with such a publication
(Kurt, 2018; Pyne, 2017).
Fortunately, the provision of accurate and current information about journal impact and
other dimensions of quality is likely to be beneficial under all three conditions, since it (a)
increases scholars’ awareness of journals that do not meet generally accepted standards,
(b) allows for the recognition of improvements (or declines) in particular journals’
standards over time, (c) minimizes the rewards associated with publication in low-
quality journals, and (d) encourages the development of legitimate journals that focus on
the needs of distinctive research communities.
One useful strategy for promoting access to accurate, current information is to
improve the reliability of predatory journal lists. Another is to include a broader range of
journals in citation databases and in journal rankings or ratings—not necessarily as an
indicator of quality, but as part of an effort to more readily validate the journals’ impact
and status. Of the major citation databases, only Google Scholar includes more than a
small number of accounting journals at the lower end of the reputation hierarchy. Of the
58 accounting journals on Beall’s List, just three are included in Scopus and none are
included in Web of Science. Likewise, only three are included in the journal ratings of
the Australian Business Deans Council (2019), one in the ratings of the Chartered
Association of Business Schools (2018), and none in Harzing’s (2021) Journal Quality List.
Meeting the demand for journal space
The apparent success of some predatory accounting journals, in terms of articles
published and cited, suggests that authors’ demand for journal space is high relative to
the space currently available in non-predatory journals. For at least some authors, certain
Beall’s List journals may seem like appropriate backup journals for papers that were not
accepted elsewhere. If predatory journals are attractive to authors mainly because of the
low effort required (condition 3, above), then an increase in the number of non-predatory
journals that cater to the needs of authors in developing countries is not likely to help
resolve the problem. However, if predatory journals attract submissions mainly due to
very low rates of acceptance at non-predatory journals, then an increase in the number of
legitimate journals might be expected to improve authors’ publishing opportunities and
perhaps even drive predatory journals out of the market.
Subsequent investigations may help clarify the extent of any unmet demand for
journal space in accounting, both generally and with regard to the kinds of authors who
most often write and cite the papers in Beall’s List journals. Likewise, similar analyses
might be undertaken for other fields and professions. As mentioned in an earlier study,
the role of predatory journals within particular disciplines is likely to vary with factors
such as “the degree of competition for journal space, the level of research productivity
expected of authors, the extent of agreement about methods and standards of quality, the
emphasis on pure or applied research, and the extent to which conventional journals are
receptive to the work of authors working outside the mainstream in terms of topic,
theoretical approach, or geographical emphasis” (Walters, 2022, p. 12).
Further research
Authors demonstrate the importance of earlier papers whenever they choose to cite
them. Not all citations reflect favorably on the cited research, however (Walters, 2017). It
is therefore important to learn more about authors’ motives for citing, how they evaluate
the papers they cite, and how their citation behavior varies based on the reputation of the
journals in which they publish. For instance, one assumption underlying this
investigation is the idea that the works cited in the foremost journals are of generally
higher quality than those cited in less prestigious journals. Nonetheless, the results
(Table 5) show no relationship between the predatory/non-predatory status of the cited
journal and the CiteScore of the citing journal. Likewise, Kulczycki et al. (2021) have
reported that higher-impact social science journals are more likely than others to cite
predatory journals repeatedly. This raises the possibility that the top journals expect
authors to cover the literature more fully while the lower-ranked journals expect them to
cover just the most important works.
Further research might also address a methodological limitation of the current
study. This analysis relies heavily on two binary variables—"included in Beall’s List or
not” and “included in Scopus or not”—even though (a) quality is best measured on a
continuous scale and (b) most of the journals that cite these Beall’s List journals appear in
neither Beall’s List nor Scopus. The use of more sophisticated indicators might be
helpful, even in analyses that focus solely on scholarly impact rather than other
dimensions of quality.
References
Academic Journals (2022), “About us”, available at:
https://academicjournals.org/about_us
Adler, N.J. and Harzing, A.-W. (2009), “When knowledge wins: transcending the sense
and nonsense of academic rankings”, Academy of Management Learning & Education,
Vol. 8 No. 1, pp. 72–95.
Akça, S. and Akbulut, M. (2021), “Are predatory journals contaminating science? An
analysis on the Cabells' Predatory Report”, Journal of Academic Librarianship, Vol. 47
No. 4, article 102366.
Albu, N., Albu, C.N., Bunea, S. and Girbina, M.M. (2015), “Accounting academia in
emerging economies: evolutions and challenges”, International Journal of Accounting
and Information Management, Vol. 23 No. 2, pp. 128–151.
Anderson, R. (2019), “Citation contamination: references to predatory journals in the
legitimate scientific literature”, Scholarly Kitchen, 28 October, available at:
https://scholarlykitchen.sspnet.org/2019/10/28/citation-contamination-references-to-
predatory-journals-in-the-legitimate-scientific-literature/
Asai, S. (2021), “Author choice of journal type based on income level of country”, Journal
of Scholarly Publishing, Vol. 53 No. 1, pp. 24–34.
Australian Business Deans Council (2019), “2019 ABDC journal quality list”, available at:
https://abdc.edu.au/wp-content/uploads/2020/11/abdc_jql_2019_v8.xlsx
Bagues, M., Sylos-Labini, M. and Zinovyeava, N. (2019), “A walk on the wild side:
‘predatory’ journals and information asymmetries in scientific evaluations”, Research
Policy, Vol. 48 No. 2, pp. 462–477.
Beall, J. (2012), “Predatory publishing”, The Scientist, Vol. 26 No. 8, pp. 22–23.
Beall, J. (2015a), “Criteria for determining predatory open-access publishers”, available
at: https://beallslist.net/wp-content/uploads/2019/12/criteria-2015.pdf
Beall, J. (2015b), “Predatory journals and the breakdown of research cultures”, Information
Development, Vol. 31 No. 5, pp. 473–476.
Beall, J., et al. (2021), “Beall’s list of potential predatory journals and publishers”,
available at: https://beallslist.net/
Berger, M. and Cirasella, J. (2015), “Beyond Beall’s List: better understanding predatory
publishers”, College & Research Libraries News, Vol. 76 No. 3, pp. 132–135.
Björk, B.C., Kanto-Karvonen, S. and Harviainen, J.T. (2020), “How frequently are articles
in predatory open access journals cited?”, Publications, Vol. 8 No. 2, article 17.
Chartered Association of Business Schools (2018), “Academic journal guide”, available at:
https://facultystaff.richmond.edu/~tmattson/AJG%202018%20Journal%20Guide.pdf
Chen, X. (2019), “Beall’s List and Cabell’s Blacklist: a comparison of two lists of predatory
OA journals”, Serials Review, Vol. 45 No. 4, pp. 219–226.
Cobey, K.D., Lalu, M.M., Skidmore, B., Ahmadzai, N., Grudniewicz, A. and Moher, D.
(2018), “What is a predatory journal? A scoping review”, F1000Research, Vol. 7, article
1001.
Contreras, J. (2012), “Open access scientific publishing and the developing world”, St
Antony’s International Review, Vol. 8 No. 1, pp. 43–69.
Davis, P. (2013), “Open Access ‘sting’ reveals deception, missed opportunities”, Scholarly
Kitchen, 4 October, available at: https://scholarlykitchen.sspnet.org/2013/10/04/open-
access-sting-reveals-deception-missed-opportunities/
Dony, C., Raskinet, M., Renaville, F., Simon, S. and Thirion, P. (2020), “How reliable and
useful is Cabell’s blacklist? A data-driven analysis”, LIBER Quarterly, Vol. 30 No. 1,
pp. 1–38.
Elsevier (2020), “Scopus content coverage guide”, available at:
https://www.elsevier.com/__data/assets/pdf_file/0007/69451/Scopus_ContentCoverag
e_Guide_WEB.pdf
Elsevier (2022), “Content policy and selection”, available at:
https://www.elsevier.com/solutions/scopus/how-scopus-works/content/content-
policy-and-selection
Eriksson, S. and Helgesson, G. (2018), “Time to stop talking about ‘predatory journals’”,
Learned Publishing, Vol. 31 No. 2, pp. 181–183.
Esposito, J. (2013), “Parting company with Jeffrey Beall”, Scholarly Kitchen, 16 December,
available at: https://scholarlykitchen.sspnet.org/2013/12/16/parting-company-with-
jeffrey-beall/
Ferrara, A. and Bonaccorsi, A. (2016), “How robust is journal rating in humanities and
social sciences? Evidence from a large-scale, multi-method exercise”, Research
Evaluation, Vol. 25 No. 3, pp. 279–291.
Frandsen, T. (2017), “Are predatory journals undermining the credibility of science? A
bibliometric analysis of citers”, Scientometrics, Vol. 113 No. 3, pp. 1513–1528.
Grudniewicz, A., et al. (2019), “Predatory journals: No definition, no defence”, Nature,
Vol. 576 No. 7786, pp. 210–212.
Harzing, A.-W. (2021), “Journal quality list”, 4 July, available at:
https://harzing.com/resources/journal-quality-list
Holland, K., Brimblecombe, P., Meester, W. and Chen, T. (2021), “The importance of
high-quality content: curation and reevaluation in Scopus”, available at:
https://www.elsevier.com/research-intelligence/resource-library/scopus-high-quality-
content
Koerber, A., Starkey, J.C., Ardon-Dryer, K., Cummins, R.G., Eko, L. and Kee, K.F. (2020),
“A qualitative content analysis of watchlists vs safelists: how do they address the
issue of predatory publishing?”, Journal of Academic Librarianship, Vol. 46 No. 6, article
102236.
Krawczyk, F. and Kulczycki, E. (2021), “How is open access accused of being predatory?
The impact of Beall’s Lists of predatory journals on academic publishing”, Journal of
Academic Librarianship, Vol. 47 No. 2, article 102271.
Kulczycki, E., Hołowiecki, M., Taşkın, Z. and Krawczyk, F. (2021), “Citation patterns
between impact-factor and questionable journals”, Scientometrics, Vol. 126 No. 10,
pp. 8541–8560.
Kurt, S. (2018), “Why do authors publish in predatory journals?”, Learned Publishing, Vol.
31 No. 2, pp. 141–147.
Lozano, G.A., Larivière, V. and Gingras, Y. (2012), “The weakening relationship between
the impact factor and papers’ citations in the digital age”, Journal of the American
Society for Information Science and Technology, Vol. 63 No. 11, pp. 2140–2145.
McCutcheon, L., Aruguete, M., McKelvie, S., Jenkins, W., Williams, J.L., McCarley, N.,
Rivardo, M. and Shaughnessy, M.F. (2016), “How questionable are predatory social
science journals?”, North American Journal of Psychology, Vol. 18 No. 3, pp. 427–440.
MDPI (2019), “Response to MDPI Wikipedia article”, available at:
https://www.mdpi.com/about/announcements/1558
MedCalc (2022a), “Comparison of means calculator”, available at:
https://www.medcalc.org/calc/comparison_of_means.php
MedCalc (2022b), “Comparison of proportions calculator”, available at:
https://www.medcalc.org/calc/comparison_of_proportions.php
Moed, H.F., Lopez-Illescas, C., Guerrero-Bote, V.P. and de Moya-Anegon, F. (2022),
“Journals in Beall’s List perform as a group less well than other open access journals
indexed in Scopus but reveal large differences among publishers”, Learned Publishing,
Vol. 35 No. 2, pp. 130–139.
Moussa, S. (2021a), “A ‘Trojan horse’ in the reference lists: citations to a hijacked journal
in SSCI-indexed marketing journals”, Journal of Academic Librarianship, Vol. 47 No. 5,
article 102388.
Moussa, S. (2021b), “Citation contagion: a citation analysis of selected predatory
marketing journals”, Scientometrics, Vol. 126 No. 1, pp. 485–506.
Moussa, S. (2021c), “Contamination by citations: references to predatory journals in the
peer-reviewed marketing literature”, South Asian Journal of Marketing, Vol. 2 No. 1, pp.
5–27.
Nelson, N. and Huffman, J. (2015), “Predatory journals in library databases: how much
should we worry?”, Serials Librarian, Vol. 69 No. 2, pp. 169–192.
Nwagwu, E.W. and Ojemeni, O. (2015), “Penetration of Nigerian predatory biomedical
open access journals, 2007–2012: a bibiliometric [sic] study”, Learned Publishing, Vol.
28 No. 1, pp. 23–34.
Oermann, M.H., Nicoll, L.H., Chinn, P.L., Ashton, K.S., Conklin, J.L., Edie, A.H.,
Amarasekara, S. and Williams, B.L. (2018), “Quality of articles published in predatory
nursing journals”, Nursing Outlook, Vol. 66 No. 1, pp. 4–10.
Olivarez, J.D., Bales, S., Sare, L. and van Duinkerken, W. (2018), “Format aside: Applying
Beall’s criteria to assess the predatory nature of both OA and non-OA library and
information science journals”, College & Research Libraries, Vol. 79 No. 1, pp. 52–67.
Pyne, D. (2017), “The rewards of predatory publications at a small business school”,
Journal of Scholarly Publishing, Vol. 48 No. 3, pp. 137–160.
Ruiter-Lopez, L., Lopez-Leon, S. and Forero, D.A. (2019), “Predatory journals: do not
judge journals by their editorial board members”, Medical Teacher, Vol. 41 No. 6, pp.
691–696.
Seglen, P.O. (1994), “Causal relationship between article citedness and journal impact”,
Journal of the American Society for Information Science, Vol. 45 No. 1, pp. 1–11.
Seglen, P.O. (1997), “Why the impact factor of journals should not be used for evaluating
research”, British Medical Journal, Vol. 314 No. 7079, pp. 498–513.
Shamseer, L., Moher, D., Maduekwe, O., Turner, L., Barbour, V., Burch, R., Clark, J.,
Galipeau, J., Roberts, J. and Shea, B.J. (2017), “Potential predatory and legitimate
biomedical journals: can you tell the difference? A cross-sectional comparison”, BMC
Medicine, Vol. 15, article 28.
Shen, C. and Björk, B.C. (2015), “‘Predatory’ open access: a longitudinal study of article
volumes and market characteristics”, BMC Medicine, Vol. 13, article 230.
Siler, K. (2020), “Demarcating spectrums of predatory publishing: economic and
institutional sources of academic legitimacy”, Journal of the Association for Information
Science and Technology, Vol. 71 No. 11, pp. 1386–1401.
Strinzel, M., Severin, A., Milzow, K. and Egger, M. (2019), “Blacklists and whitelists to
tackle predatory publishing: a cross-sectional comparison and thematic analysis”,
mBio, Vol. 10 No. 3, article e00411-19.
Tennant, J.P., Waldner, F., Jacques, D.C., Masuzzo, P., Collister, L.B. and Hartgerink,
C.H.J. (2016), “The academic, economic and societal impacts of Open Access: an
evidence-based review”, F1000 Research, Vol. 5, article 632.
Vuong, T.-T., Ho, M.-T., Nguyen, M.-H., Nguyen, T.-H.T., Nguyen, T.-D., Nguyen, T.-L.,
Luong, A.-P. and Vuong, Q.-H. (2020), “Adopting open access in the social sciences
and humanities: evidence from a developing nation”, Heliyon, Vol. 6 No. 7, article
e04522.
Walters, W.H. (2017), “Citation-based journal rankings: key questions, metrics, and data
sources”, IEEE Access, Vol. 5, pp. 22036–22053.
Walters, W.H. (2022), “The citation impact of the Open Access accounting journals that
appear on Beall’s List of potentially predatory publishers and journals”, Journal of
Academic Librarianship, Vol. 48 No. 1, article 102484.
Wilder, E.I. and Walters, W.H. (2021), “Using conventional bibliographic databases for
social science research: Web of Science and Scopus are not the only options”, Scholarly
Assessment Reports, Vol. 3 No. 1, article 4.
World Bank (2022), “World Bank country and lending groups”, available at:
https://datahelpdesk.worldbank.org/knowledgebase/articles/906519-world-bank-
country-and-lending-groups
Xia, J. (2015), “Predatory journals and their article publishing charges”, Learned
Publishing, Vol. 28 No. 1, pp. 69–74.
Xia, J., Harmon, J.L., Connolly, K.G., Donnelly, R.M., Anderson, M.R. and Howard, H.A.
(2015), “Who publishes in ‘predatory’ journals?”, Journal of the Association for
Information Science and Technology, Vol. 66 No. 7, pp. 1406–1417.
Yan, E. and Li, K. (2018), “Which domains do open-access journals do best in? A 5-year
longitudinal study”, Journal of the Association for Information Science and Technology,
Vol. 69 No. 6, pp. 844–856.
Table 1. Cited journals included in the study [1]
Columns: journal and publisher; mean citations per article; articles published, 2015–2018; SD of citations per article; median citations per article; maximum citations per article; CiteScore percentile [2].
Beall's List journals:
Jnl. of Acctng. & Taxation (Academic Jnls.) 11.1 49 11.8 7.7 66 28
Intl. J. of Finance & Acctng. (Scientific & Acad. Publ.) 9.1 104 13.5 5.0 81 21
Intl. J. of Acad. Rsrch. in Acctng., Finance & Mgt. Scis. (HRMARS) 8.8 383 14.5 5.1 196 21
Jnl. of Acctng. & Auditing: Rsrch. & Practice (IBIMA Publ.) 8.6 28 11.4 2.0 41 21
Global J. of Mgt. & Business Rsrch. D: Acctng. & Auditing (Global Jnls.) 7.4 53 18.3 2.8 130 18
Acctng. & Finance Rsrch. (Sciedu Press) 6.9 277 14.4 2.8 157 18
Jnl. of Finance & Acctng. (SciEP) 6.7 26 7.8 3.9 26 18
Scopus journals:
South African Jnl. of Acctng. Rsrch. (Taylor & Francis) 11.2 50 13.7 7.2 73 6
Risks (MDPI) 9.1 283 13.9 5.1 169 14
Asian Acad. of Mgt. Jnl. of Acctng. & Finance (Penerbit UKM) 8.9 65 9.4 5.5 50 18
Acad. of Acctng. & Financial Studies Jnl. (Acad. of Acctng. & Fin. Studies) 8.5 306 12.6 5.0 84 23
Jurnal Pengurusan (UKM) 7.7 185 10.7 5.1 95 28
Jnl. of Payments Strategy & Systems (Henry Stewart) 7.2 120 11.0 3.4 90 5
EC Tax Review (Kluwer) 7.0 118 6.5 5.1 36 23
Jnl. of Business Valuation & Econ. Loss Analysis (De Gruyter) 6.8 26 9.4 2.3 32 14
Journals in both Beall's List and Scopus:
Investment Mgt. & Financial Innovations (Business Perspectives) 8.5 454 12.0 5.1 144 23
Accounting (Growing Science) 7.7 69 13.7 3.7 77 31
Intl. Jnl. of Financial Rsrch. (Sciedu Press) 7.6 296 26.9 2.8 439 9
Mean value, all 10 Beall's List journals [3] 8.2 174 14.4 4.1 136 21
Mean value, all 11 Scopus journals [3] 8.2 179 12.7 4.6 117 18
1. The statistics presented in this table are based on the entire population of interest.
2. For journals not included in Scopus, this is the estimated CiteScore percentile from Walters (2022).
3. For this table, each journal was weighted equally.
Table 2. Characteristics of the 10 Beall’s List accounting journals with the most citations per article, and of 10 randomly selected Beall’s List
accounting journals with lower citation rates
Columns: journal; mean citations per article; articles published, 2015–2018; year founded; size of board; APC for U.S. authors ($); included in Scopus?; included in DOAJ?; publisher's number of journals; publisher's number of business journals.
Mean value, top 10 Beall's List accounting journals 8.2* 174* 2010* 41 450 30* 20 55 10
Jnl. of Acctng. & Taxation [1] 11.1 49 2009 13 550 No No 120 5
Intl. J. of Finance & Acctng. (Scientific & Acad. Publ.) 9.1 104 2012 10 360 No No 133 9
Intl. J. of Acad. Rsrch. in Acctng., Finance & Mgt. Scis. [1] 8.8 383 2011 56 75 No No 8 4
Jnl. of Acctng. & Auditing: Rsrch. & Practice [1] 8.6 28 2012 17 295 No No 36 30
Investment Mgt. & Financial Innovations 8.5 454 2004 57 805 Yes Yes 12 10
Accounting (Growing Science) 7.7 69 2015 29 0 Yes Yes 9 5
Intl. Jnl. of Financial Rsrch. 7.6 296 2010 75 500 Yes No 29 8
Global J. of Mgt. & Business Rsrch. D: Acctng. & Auditing [1] 7.4 53 2001 38 1125 No No 54 7
Acctng. & Finance Rsrch. 6.9 277 2012 60 400 No No 29 8
Jnl. of Finance & Acctng. (SciEP) [1] 6.7 26 2013 54 390 No No 123 10
Mean value, 10 other Beall's List accounting journals 2.9 101 2014 31 305 0 0 46 7
Asian Jnl. of Finance & Acctng. [1] 5.6 141 2009 58 200 No No 55 23
European Jnl. of Acctng., Auditing & Finance Rsrch. [1] 5.1 261 2013 130 165 No No 82 9
Applied Finance & Acctng. 4.4 74 2015 27 300 No No 10 3
Jnl. of Acctng. & Financial Mgt. 3.6 100 2015 7 50 No No 32 10
Intl. Jnl. of Acctng. & Econ. Studies 3.1 96 2013 10 100 No No 24 1
Intl. Jnl. of Acctng. Rsrch. (Longdom Publ.) 2.9 70 2013 15 1080 No No 190 5
Australian Acad. of Acctng. & Finance Review 2.1 70 2015 12 0 No No 2 2
ResearchJournali’s Jnl. of Acctng. [1] 1.2 25 2013 13 50 No No 34 14
Jnl. of Adv. Rsrch. in Business Mgt. & Acctng. 0.8 85 2015 28 800 No No 11 1
Specialty Jnl. of Acctng. & Econ. 0.4 83 2015 9 ? No No 21 3
* Significantly different from the value for the Beall’s List accounting journals not in the top 10 (p < 0.05, two-tailed). For this table, each journal was weighted
equally and one-sample tests were used to compare known population values (for the top 10 journals) with sample values (for the other Beall’s List
journals).
1. The journal web site lists no editor-in-chief, or the editor is an executive of the publishing company.
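The asterisks in Table 2 mark values for the top 10 journals that differ significantly from those of the other Beall's List journals, using one-sample tests that treat the top 10 figures as known population values. Purely as an illustration, the sketch below applies a one-sample t-test to the per-journal mean citation rates of the ten lower-cited journals; the choice of a t-test is an assumption, since the table note does not name a specific statistic.

```python
# Illustrative one-sample t-test in the spirit of Table 2's note: compare the
# mean citations per article of the 10 other Beall's List journals (sample)
# against the known mean for the top 10 journals (8.2). The use of a t-test
# is an assumption; the note says only that "one-sample tests" were used.
from scipy import stats

other_journals = [5.6, 5.1, 4.4, 3.6, 3.1, 2.9, 2.1, 1.2, 0.8, 0.4]  # Table 2 values
top10_population_mean = 8.2  # known population value for the top 10 journals

t_stat, p_value = stats.ttest_1samp(other_journals, top10_population_mean)
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.4f}")
```

The same logic extends to the other starred columns, with categorical characteristics (such as Scopus inclusion) handled by a one-sample test of proportions rather than of means.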
Table 3. Types of publications that cited the top 10 Beall's List journals and the comparable Scopus journals: percentage of citing works of each type [1]
Columns: type of citing work; Beall's List journals; Scopus journals.
% Journal articles 64* 60
% Books 1 2
% Book chapters 2 3
% Conference papers 6 8
% Theses 23 23
% Working papers 4 5
Column sum (%) 100 100
n (Citing works) 961 1079
* Significantly different from the value for the Scopus journals (p < 0.05, two-tailed).
1. Each work was weighted by the cited journal's overall citation impact.
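Footnote 1 of Tables 3 through 7 indicates that each citing work is weighted by the cited journal's overall citation impact when the percentages are computed. A minimal sketch of one way such a weighted percentage could be produced is shown below; the weight definition (the cited journal's mean citations per article, with values borrowed from Table 1) and the toy records are assumptions for illustration, not a reproduction of the study's actual weighting scheme.

```python
# Illustrative weighted percentage, in the spirit of footnote 1 of Table 3.
# Assumption: each citing work's weight is the cited journal's mean citations
# per article (values borrowed from Table 1); the study's exact weighting
# scheme is not specified here, so this is only a sketch.
toy_citing_works = [
    # (is_journal_article, cited_journal_mean_citations_per_article)
    (True, 11.1),
    (True, 8.8),
    (False, 6.9),
    (True, 7.4),
    (False, 9.1),
]

total_weight = sum(weight for _, weight in toy_citing_works)
article_weight = sum(weight for is_article, weight in toy_citing_works if is_article)

print(f"Weighted % journal articles: {100 * article_weight / total_weight:.1f}")
```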
Table 4. Characteristics of the articles that cited the top 10 Beall's List journals and the comparable Scopus journals: percentage of citing articles with each characteristic [1]
Columns: characteristic of citing article; Beall's List journals; Scopus journals.
% Journal self-citations [2] 3* 6
% Publisher self-citations [3] 4* 9
% in Beall's List journals 23* 17
% in Scopus journals 24* 44
% in Open Access journals 85* 74
% in Commercially published journals 55* 63
% in Scholarly society journals 6 6
% in University-published journals 34* 28
% in Other nonprofit or government journals 5 3
n (Citing articles) 660 637
* Significantly different from the value for the Scopus journals (p < 0.05, two-tailed).
1. Each article was weighted by the cited journal's overall citation impact.
2. Percentage of citing articles that appeared in the same journal as the cited article.
3. Percentage of citing articles that appeared in a journal with the same publisher as the cited article.
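The asterisks in Tables 3 through 7 flag percentages that differ significantly between the two groups of citing works; the MedCalc comparison-of-proportions calculator cited in the references is one tool for running such tests. As a sketch under stated assumptions, the code below applies a standard two-proportion z-test to the "% in Scopus journals" row of Table 4, approximating counts from the rounded percentages and the reported n values; the published tests were run on weighted data and may use a different variant of the test, so the numbers here are only indicative.

```python
# Illustrative two-proportion z-test for one row of Table 4 ("% in Scopus journals").
# Assumption: counts are approximated from the rounded percentages (24% of 660
# vs. 44% of 637); the published tests used weighted data and the MedCalc
# comparison-of-proportions calculator, so results are only indicative.
from statsmodels.stats.proportion import proportions_ztest

counts = [round(0.24 * 660), round(0.44 * 637)]  # citing articles appearing in Scopus journals
nobs = [660, 637]                                # citing articles in each group

z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {z_stat:.2f}, two-tailed p = {p_value:.4f}")
```

Because the counts are rebuilt from rounded, unweighted percentages, the resulting p-values will not exactly reproduce the published significance levels.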
Table 5. Journal CiteScores of the Scopus-journal articles that cited the top 10 Beall's List journals and the comparable Scopus journals [1]
Columns: statistic; Beall's List journals; Scopus journals.
Mean CiteScore 2.8 2.4
Standard deviation of CiteScore 2.5 2.2
Median CiteScore 1.8 1.7
% with CiteScores at 20th percentile or higher 93* 84
% with CiteScores at 40th percentile or higher 60 50
% with CiteScores at 50th percentile or higher 45 43
% with CiteScores at 60th percentile or higher 41 37
% with CiteScores at 80th percentile or higher 23 22
n (Citing articles) 149 300
* Significantly different from the value for the Scopus journals (p < 0.05, two-tailed).
1. Each article was weighted by the cited journal's overall citation impact. Percentile scores refer to the journal's
place within the distribution of Scopus accounting journals even if the journal is not actually included in the
accounting subject category.
Table 6. World Bank income classifications of the countries associated with the works that cited the top 10 Beall's List journals and the comparable Scopus journals: percentage of citing works associated with countries in each income group [1]
Columns: income classification [2]; Beall's List journals; Scopus journals.
% Low income 1 1
% Lower middle income 45* 24
% Upper middle income 15 17
% High income 14* 20
% Associated with a region but not a country 3* 6
% Not associated with a region or country 22* 33
Column sum (%) 100 100
n (Citing works) 961 1079
* Significantly different from the value for the Scopus journals (p < 0.05, two-tailed).
1. Each work was weighted by the cited journal's overall citation impact.
2. "Low-income economies are defined as those with a GNI per capitaof $1,045 or less in 2020; lower middle-
income economies are those with a GNI per capita between $1,046 and $4,095; upper middle-income
economies are those with a GNI per capita between $4,096 and $12,695; high-income economies are those with
a GNI per capita of $12,696 or more" (World Bank, 2022).
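Footnote 2 of Table 6 quotes the World Bank's 2020 GNI per capita thresholds. The short sketch below simply maps a GNI per capita figure onto those quoted cutoffs; the function name and the example value are illustrative, not part of the study.

```python
# Map GNI per capita (2020, US$) to a World Bank income group using the
# thresholds quoted in footnote 2 of Table 6. Function name and example
# value are illustrative only.
def world_bank_income_group(gni_per_capita: float) -> str:
    if gni_per_capita <= 1045:
        return "Low income"
    elif gni_per_capita <= 4095:
        return "Lower middle income"
    elif gni_per_capita <= 12695:
        return "Upper middle income"
    else:
        return "High income"

print(world_bank_income_group(3800))  # -> "Lower middle income"
```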
Table 7. Countries and regions most often associated with the works that cited the top 10 Beall's List journals and the comparable Scopus journals: percentage of citing works associated with each country or region [1]
Columns: country or region; Beall's List journals (%); Scopus journals (%).
Indonesia 19* 7
Nigeria 16* 9
Malaysia 3* 5
Europe [2] 1* 4
South Africa 0* 3
Ukraine 8 8
United States 4 6
Turkey 5 3
Kenya 3 2
India 3 3
China 2 3
Pakistan 3 2
Russia 1 3
Brazil 1 2
Italy 1 2
Portugal 1 2
Czech Republic 1 2
Asia [2] 1 2
Middle East & North Africa [2] 1 2
Jordan 1 1
Sweden 1 2
Column sum (%) [3] 75 73
n (Citing works) [4] 764 710
* Significantly different from the value for the Scopus journals (p < 0.05, two-tailed).
1. Each work was weighted by the cited journal's overall citation impact.
2. These works discuss the region as a whole, or multiple countries within the region.
3. Includes only the countries and regions shown here—those represented at least 10 times among the works that
cited the Beall's List journals or at least 10 times among the works that cited the Scopus journals (within this
sample).
4. Includes all citing works with country or region designations.