I don't think of journals as bad or good, junk science vs. good science. I try to look at the articles themselves and ask, "Did the author properly analyze their data?"
In a "respected" journal, I submitted a manuscript in which I used a fractional factorial designed experiment. Based on earlier reviewers' comments, I added a section on designing an experiment and referenced a dozen statistics textbooks. One reviewer said, "This so-called statistician should know that you can't change more than one thing at a time during an experiment." The other reviewer said, "The writer bored me with a page on using designed experiments. Everyone already knows this material." When I wrote to the journal editor, I asked, "How do I reconcile their differences of opinion? One says I can't do what I did. The other says everyone already does it this way." Their response was, "Good luck."
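For readers who haven't met the technique the first reviewer objected to: a fractional factorial design deliberately varies several factors at once, using a carefully chosen subset of the full factorial runs. A minimal sketch (generic Python with hypothetical factors, not the author's actual design) of a 2^(3-1) half fraction:

```python
from itertools import product

# Full 2^3 factorial: every combination of three two-level factors (coded -1/+1).
full = list(product([-1, 1], repeat=3))

# Half fraction 2^(3-1) with generator I = ABC: set C = A*B, so all three
# factors change together across only four runs instead of eight.
half = [(a, b, a * b) for a, b in product([-1, 1], repeat=2)]

for run in half:
    print(run)
```

Every run in the half fraction changes more than one factor relative to its neighbors, yet the main effects remain estimable; that is precisely the point the reviewer missed.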
------------------------------
Andrew Ekstrom
Statistician, Chemist, HPC Abuser;-)
------------------------------
Original Message:
Sent: 11-26-2019 09:30
From: Matthew Robinson
Subject: Good journals, junk journals.
In addition to separating good journals from bad journals, I have problems with reviewers, specifically getting statistically incompetent reviewers fairly frequently. Every couple of months or so, a researcher comes to me needing guidance on a project with nonsensical reviewer comments. For example, one manuscript did not need a Bonferroni correction: different types of statistical tests (t-test, chi-square, etc.) were being done, there were only a few variables, we were not interested in controlling type I error, type II error would be very low, and blah blah blah... Eventually, we just gave in and did a Bonferroni correction to satisfy the inept reviewer and move on with our lives, and like magic the reviewer accepted it. He/she was clearly incompetent in how and when to use multiple comparisons, and we didn't know how to handle this situation or who to report it to.
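For anyone following along, the correction at issue is mechanically simple: with m tests and family-wise error rate alpha, compare each p-value against alpha/m. A minimal sketch with hypothetical p-values (not the manuscript's actual data):

```python
# Bonferroni correction: test each p-value against alpha / m, which bounds
# the family-wise type I error rate at alpha across all m tests.
alpha = 0.05
p_values = [0.004, 0.020, 0.030, 0.500]  # hypothetical p-values
m = len(p_values)

rejected = [p < alpha / m for p in p_values]
print(rejected)  # only p = 0.004 clears the adjusted threshold of 0.0125
```

The cost, as noted above, is power: the adjusted threshold shrinks with every test, inflating type II error, which is why applying it indiscriminately across unrelated tests is poor practice.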
In Europe, they are supposedly starting 'Plan S' open journals. Elsevier has engaged in lobbying and other sketchy practices in recent years, and the University of California and FSU canceled their contracts with Elsevier this summer. And a lot of journals seem to want to 'curate' the 'best' articles, which isn't their role, as citations and number of downloads are supposed to do that.
https://en.wikipedia.org/wiki/Plan_S
https://en.wikipedia.org/wiki/Elsevier#Dissemination_of_research
-Matt
------------------------------
Matthew Robinson
Original Message:
Sent: 11-25-2019 14:00
From: Nayak Polissar
Subject: Good journals, junk journals.
Dear all,
About 2-3 times per week, I get an email invitation to submit an article to some journal. They virtually always seem to be part of a mass mailing, though sometimes they cite one of my co-authored articles.
I have read in Science about junk journals that are, basically, profit-making scams. Science also reported on an experiment in which someone submitted a "junk" article to about 30 different journals (relatively new journals, I think), and most of them accepted it.
Question: how do you tell the good from the bad?
Just before this email, we all received a nice invitation that seemed good! Felicity is in our group, and she gave a nice link for us to have a look at the journal, provided important information and details, named others who are involved in the effort, etc. But what about the other invitations that pop up in our inboxes? How do we evaluate those?
Are some of you also getting these frequent invitations? I would be interested to hear how you evaluate them.
Thanks and best wishes,
Nayak
------------------------------
Nayak Polissar
Principal Statistician
The Mountain-Whisper-Light Statistics
------------------------------