As mentioned before, I am working on a literature review focusing on criteria for assessing the quality of scientific documents. Since I found very little literature in the Web of Science, Scopus and Google Scholar on cognitive construction processes or on content-related parameters that make articles appear to be of high quality, I asked the readers of this blog for support in a recent posting.
One reader, Dr. Werner Dees, sent me some very valuable tips. The complete list of the reviewed literature can be found below as text and downloaded as a BibTeX file. Since the downloaded file ends in .txt, please change the file extension to .bib afterwards.
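On the command line, the rename is a single `mv`; the filename `literature-list.txt` below is a placeholder, so substitute whatever name the file was actually saved under:

```shell
# "literature-list.txt" is a placeholder for the actual downloaded filename.
touch literature-list.txt        # stand-in for the downloaded file
mv literature-list.txt literature-list.bib
```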
The articles I found most useful are Mårtensson et al. (2016) and especially Bucholz (1995), the latter being a recommendation of Dr. Dees.
Bouter, L. M. (2008). Knowledge as Public Property: The Societal Relevance of Scientific Research. Higher Education Management and Policy, (September).
Bruza, P., & Chang, V. (2014). Perceptions of document relevance. Frontiers in Psychology, 5, 1–8. http://doi.org/10.3389/fpsyg.2014.00612
Clyde, L. A. (2004). Evaluating the quality of research publications: A pilot study of school librarianship. Journal of the American Society for Information Science and Technology, 55(13), 1119–1130. http://doi.org/10.1002/asi.20066
Harris, M., Macinko, J., Jimenez, G., Mahfoud, M., & Anderson, C. (2015). Does a research article’s country of origin affect perception of its quality and relevance? A national trial of US public health researchers. BMJ Open, 5(12). http://doi.org/10.1136/bmjopen-2015-008993
Judge, T. A., Cable, D. M., Colbert, A. E., & Rynes, S. L. (2007). What causes a management article to be cited – article, author, or journal? Academy of Management Journal, 50(3), 491–506. http://doi.org/10.5465/AMJ.2007.25525577
Lowe, A., & Locke, J. (2005). Perceptions of journal quality and research paradigm: results of a web-based survey of British accounting academics. Accounting, Organizations and Society, 30(1), 81–98. http://doi.org/10.1016/j.aos.2004.05.002
Mårtensson, P., Fors, U., Wallin, S.-B., Zander, U., & Nilsson, G. H. (2016). Evaluating research: A multidisciplinary approach to assessing research practice and quality. Research Policy, 45(3), 593–603. http://doi.org/10.1016/j.respol.2015.11.009
Nazim Ali, S., Young, H. C., & Ali, N. M. (1996). Determining the quality of publications and research for tenure or promotion decisions. Library Review, 45(1), 39–53. http://doi.org/10.1108/00242539610107749
Saarela, M., Kärkkäinen, T., Lahtonen, T., & Rossi, T. (2016). Expert-based versus citation-based ranking of scholarly and scientific publication channels. Journal of Informetrics, 10(3), 693–718. http://doi.org/10.1016/j.joi.2016.03.004
Working on a literature review, I am looking for publications on the perceived quality of scientific literature. Using the Web of Science, Scopus and Google Scholar as databases, I find very little literature on cognitive construction processes or on content-related parameters that make some articles appear to be of high quality and others inferior.
It is clear to me that quality is a multi-dimensional construct whose dimensions themselves have to be operationalised. Since the study should not be very time-consuming, it would suffice for me to find texts describing these dimensions; the operationalisation of these dimensions, or the definition of indicators to describe them, does not interest me in a first step.
I am also aware that quantitative factors (e.g. citation-based parameters) are used, in the opinion of many people mistakenly, to operationalise quality in science. Nevertheless, I am surprised that apart from this quantitative information, almost only formal criteria (e.g. the structuring of the text) or syntactic ones (e.g. sentence structure, the formulation of questions) are discussed as criteria of qualitative evaluation, whereas genuinely substantive criteria relating to the content of publications are discussed extremely rarely.
After finishing my collection of literature, I will publish my list of relevant articles here. If you would like to give me literature tips, you can do so by commenting, mailing, or by other means. These tips will also be included in the list to be shared here, with reference to the contributor.
As the title says, the sociologist at the University of Arizona deals with outsourcing in publishing. Sallaz draws parallels between the cheap production of overpriced mobile phones in Asian low-wage countries and a similar practice of commercial science publishers: outsourcing copy editing and layout to low-wage countries and selling the cheaply produced articles at top prices. It should be noted that copy editing and layout are the only contributions of commercial publishers to the publication as a product.
Nonetheless, mobile phone manufacturers and publishers alike can sell cheaply produced works at inflated prices, as customers tend to base the price they are willing to pay on the perceived prestige of the product rather than on the real cost of its creation. Sallaz gives this finding a beautiful and catchy name: Foxconning Science.
The altmetrics service Altmetric.com announced yesterday that it now tracks discussions in social media and other not primarily scientific publications about books listed on Amazon. According to Altmetric.com's posting, this new data source produces "huge volumes of attention data – in just a few days Altmetric has found over 145,000 new mentions of 3,000 Amazon-listed books from Twitter, and over 20,000 mentions from other sources such as news, blogs and policy documents. Around 2 million mentions a year that relate directly to an Amazon record for a book are expected."
Since impact measurement for book publications is considered complicated, and since citations are considered of little use for this purpose, Altmetric.com has integrated an important impact source for books. It should be noted, however, that smaller publishers in particular struggle with using Amazon as a sales platform due to its high service fees. Should Altmetric.com gain in importance, this could lead authors to avoid publishing with smaller publishers if these do not use Amazon as a sales platform.
On August 12th, a PhD thesis entitled Drivers and Barriers for Open Access Publishing: From SOAP 2010 to WoS 2016 was published on Zenodo. The author, Sergio Ruiz-Perez, describes the publication in the abstract as follows:
"This PhD thesis follows up on previous studies aiming at finding out what a representative sample of researchers from all over the world and from all disciplines think about OA. We replicated the largest study of this type to date: the Study of Open Access Publishing run in 2010 (SOAP 2010).
We present a descriptive longitudinal study of active researcher’s opinions on open access publishing. We re-analysed a dataset from SOAP 2010 and we contacted authors publishing in scientific journals indexed in international databases (WoS 2016). We analysed the scientific community’s opinions on open access, in particular its evolution in the past 7 years. To do so, we used two different samples:
The SOAP project study (Dallmeier-Tiessen et al., 2011)
An ad-hoc sample obtained from the Web of Science database (WoS 2016) consisting of 15,235 unique responses
This PhD thesis was successfully defended on 24 July 2017 at the Facultad de Comunicacion y Documentacion from the University of Granada (Spain).
Please note that although the first few pages of this document are in Spanish, all the rest is written in English."
These are the bibliographic details of Sergio Ruiz-Perez's thesis:
While I was working on a longer text, I gathered, more or less as a by-product, information from studies and reports on the (first copy) costs of a scientific article. It can only be noted that the available information on these costs (and on related figures such as profit margins) varies widely and is really difficult to compare. The latter confirms my impression that there is a huge lack of transparency in the market for scientific publications. The following table may give some insight into the information I found. It is also available via GitHub (https://github.com/scinoptica/article_costs.git), so please feel free to improve/update the data or add new information.
Sources: British Academy 2007; Houghton et al. 2010; Van Noorden 2013

- First copy costs of an article, including profit margins: PNAS $3,700; Nature $30,000–40,000
- First copy costs of an article, without profit margins: $420–650
- First copy costs per page, including profit margins: closed access on average 20–30%; open access (commercial) on average 15%
- Cost of peer review (not included in first copy costs):
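Cost figures like those above mix currencies, thousands separators (German "3.700" versus English "3,700") and ranges, which is part of what makes them hard to compare. A minimal sketch of normalising such strings into comparable numbers; the `parse_cost` helper is hypothetical and not part of the linked GitHub data:

```python
import re

def parse_cost(value):
    """Parse a cost string such as '$3,700' or '30.000 - 40.000 $'
    (German thousands separator) into a (low, high) tuple of dollars.
    Returns None if the string contains no figures."""
    figures = [int(re.sub(r"[.,]", "", n))
               for n in re.findall(r"\d[\d.,]*", value)]
    if not figures:
        return None
    # A single figure yields a degenerate range; two figures yield low/high.
    return (figures[0], figures[-1])

print(parse_cost("$3,700"))             # -> (3700, 3700)
print(parse_cost("30.000 - 40.000 $"))  # -> (30000, 40000)
```

With figures reduced to plain dollar ranges like this, entries from different reports can at least be placed side by side, even if their underlying cost definitions still differ.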