This reminds me a bit of cheating in speed-running (and related runs).
The common anti-cheat solution in speedrunning (live streaming) is prevalent enough that offline runs can be accused of cheating just b/c they're done offline. The equivalent in academia would be a default assumption of academic fraud when data isn't public. That comparison is based on the prior default being – for academia: no public data, just conclusions; and for speedrunning: proof provided, but easily faked.
A conclusion like "speedrunning (and related activities) has more intellectual integrity than academia" doesn't feel unreasonable there.
Note: I think default assumptions like that are bad for ~speedrunning. But since academia should have higher standards, it's okay there (or at least ppl should be rightly skeptical). And reasonable exceptions can be made where appropriate, tho 3rd-party reviews (like software audits, where conclusions are published) can help with such cases.
Also, the original paper (about dishonesty) was influential enough that I knew about it – and have for a while – though I don't remember where I heard of it. I did study some 2nd-year psychology (at uni) around that time, so I might know about it b/c of that.
The first few minutes cover some massive conflicts of interest and lies in a prestigious (I think) journal. The author of an article claiming the lab-leak hypothesis was a conspiracy theory was actually involved in funding relevant research in Wuhan. Also, the journal had received funding from the CCP.
Based on the title and thumbnail, I first figured the bad scholarship would be in the video rather than critiqued by it. After clicking through, I see it's on a decent channel, though the pandering title and thumbnail are still misleading.
One thing I just realized: a lot of papers are only accessible through preprints. Like, if a paper is accepted into a journal and ends up behind a paywall but a preprint is available, won't ppl just use the preprint (university journal portals aside)?
The problem with that is that even if an error in the preprint was fixed before journal publication, the error can still propagate via the (more accessible) preprint. So journals both restrict access to the (potentially) higher-quality doc and don't prioritize taking errors seriously.
This is happening increasingly with their lectures - one of my close friends did a great operating systems course with amazing lectures and videos → stoled.jpg :(
And there’s a huge legal issue as well that’s brewing up because of this :\
This post criticizes the idea that Roman soldiers were paid in salt or received an allowance for buying salt and that’s where “salary” comes from.
Haven’t scrutinized it super closely, but this part criticizing Wikipedia jumped out at me:
The trouble with citing Pliny as a source for the myth is of course that Pliny doesn’t say anything of the kind. The problem is exacerbated by Wikipedia, which bald-facedly re-writes Pliny, and has been quoted very widely:
the Roman historian Pliny the Elder, who stated as an aside in his Natural History’s discussion of sea water, that ‘[I]n Rome…the soldier’s pay was originally salt and the word salary derives from it…’.
This is a mistranslation, just to be clear. And this wording doesn’t even appear in the linked source. And Pliny isn’t writing about sea water, but about salt itself. None of that has stopped this fake quotation being repeated in countless books and websites.
Note, 18 Jan.: this error, and the other Wikipedia excerpt quoted above, have since been corrected. However, some other parts of the articles are still inaccurate: see below.
It is a calmer Stewart [on his 2021 TV show] than during his famous diatribe on Crossfire in 2004, during which he tore into his rightwing blowhard interviewers Tucker Carlson and Paul Begala
Crossfire was a show about the left and right fighting – which was one of Stewart’s main criticisms of it. It had two hosts so that both sides could be represented; Paul Begala was the leftist. Stewart was making a non-tribalist criticism of the left–right fighting and of how the media encourages it. But The Guardian misrepresents that as Stewart making a tribalist attack on two rightwing blowhards, rather than on one leftist and one rightist. Even if they weren’t liars (or misinformation dispensers or whatever), The Guardian would still be part of the problem Stewart was criticizing.
Also, I’m not well-versed enough in this stuff to be the one catching The Guardian out. I don’t think I’ve ever watched an episode of Crossfire, and I had no idea who Paul Begala is – I just remembered the basic concepts of Crossfire, and of Stewart’s criticism, well enough to be suspicious and look it up. I also thought Stewart was pretty calm on Crossfire. And “diatribe” makes it sound like Stewart gave one mean speech, when actually he interactively asked some questions and made a few separate short points.
Moreover, the ‘‘faulty MCDA’’ model reviewed was not even created to guide shared decision making but rather to illustrate approaches to manage uncertainty within the MCDA framework.
The term “faulty MCDA” does not appear in the paper Dolan is replying to. This is a misquote.
To check, I downloaded and then searched the version of the paper provided on the author’s website. Then I re-did the OCR myself with ABBYY FineReader and searched again. My main search term was “fau”, which appears once. The term “MCDA” appears on 20 of 30 pages in both the original and my new OCR, and both OCRs generally appear to have worked well.
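The per-page check described above can be sketched in a few lines, assuming the OCR output has already been split into one plain-text string per page (the function name and the toy 30-page data below are mine, for illustration only – not the actual paper):

```python
# Count which pages of an OCR'd paper contain a search term.
# `pages` is a hypothetical stand-in for real per-page OCR output.

def pages_containing(pages, term):
    """Return the 1-based page numbers whose text contains `term`."""
    return [i for i, text in enumerate(pages, start=1) if term in text]

# Toy stand-in for 30 pages of OCR output; "MCDA" appears on 20 of them,
# mirroring the 20-of-30 result reported above.
pages = ["... MCDA framework ..." if i % 3 else "... plain text ..."
         for i in range(30)]

print(len(pages_containing(pages, "MCDA")))  # → 20
```

A substring search like this is crude (it would also match “fau” inside “faulty”), which is exactly why a short stem like “fau” is a useful search term: it catches the quoted word even if the OCR or inflection differs slightly.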
I also noticed some very short misquotes by David Deutsch. I think when quotes are very short, e.g. 1-3 words long, people are more likely to go by memory instead of checking the words in the text they’re allegedly quoting.
Hundreds of papers promoted its use, though all relied on a single double-blind, placebo-controlled trial as support for its efficacy. Mentioned almost nowhere was that this trial did not show any superiority over placebo.