"Instead of arguing from expertise, you should use your expertise to make better arguments." I will be quoting this (and will try not to be obnoxious when doing so). Thanks Adam!
I have nothing substantive to say other than that I continue to be flabbergasted by how good you are as a writer -- you keep writing these pieces that compel me to share them with friends who don't even care one whit about the scientific establishment (I'm in industry), and they end up reliably amazed and entertained.
I think I was also compelled to comment because a ~decade ago I found myself at a fork in life reminiscent of your description that starts with "Then, couple years ago, I looked around and realized that I didn’t actually admire most of the people I was trying to be. ...", and ended up leaving for industry instead of continuing on to academia. In my case, I was doing a physics degree in my boyhood quest to become a theorist of some sort. (I think it also helped my decision that I wasn't good enough to get into R1 grad programs.) I think earning well and working on not-too-uninteresting data analytics problems have helped, but I've always wistfully wondered about what could've been; your essays have helped me un-tint that rose-tinted counterfactual.
As one who did get into and through an R1 grad program but also ended up in industry, I am eternally grateful to have had the good fortune to escape what I now consider the hell of academic life. So I hope my testimony further un-tints your counterfactual.
(BTW, I had a great time in grad school. But the thought of being a research professor at a university today fills me with fear and loathing.)
My thoughts and actions resemble yours: got the PhD, didn't want to live in that world, so first started my own company then joined another as it grew. It all went exceedingly well ... until we were bought out by a private equity company and they brought a new set of measures.
Standing up from my seat cheering loudly
“I, too, would like to beat the charlatans and the terrorists, which is why I want to do better than, “Don’t trust those guys—they lack the proper accreditation!” If that’s all you got, people shouldn’t trust you. Instead of arguing from expertise, you should use your expertise to make better arguments.”
While I love this article and the direction it is going, there is a major issue: if we don't trust academic institutions or titles or journalists, what do we trust? It is just not possible for the common man (even a very smart one) to independently evaluate every writer/scientist/inventor to see if they should trust what they have created.
We have to have _something_ to quickly let us know what to trust, perhaps like the Cochrane Library for medical info.
This is a great question and a good answer will probably require a whole post. Here's my short version:
It takes a lot of work to figure out who you can trust, and there's really no way of speeding that up. You could look at their credentials, for instance, but then, how do you know whether those credentials are trustworthy? You could see what other people say about them, but then, how do you know whether to trust those people? And so on, infinitely.
One option is paranoia: trust no one! But this, uh, limits your options quite a bit.
The other option is to make your best guess and realize that's all you're doing. This is, I think, where the most pernicious errors lie––feeling certain that you know the truth, when all you've really done is guess.
> It takes a lot of work to figure out who you can trust, and there's really no way of speeding that up. You could look at their credentials, for instance, but then, how do you know whether those credentials are trustworthy? You could see what other people say about them, but then, how do you know whether to trust those people? And so on, infinitely.
Here's a trick: realize that credentials do not guarantee trustworthiness - all one needs to prove this is to find literally one single instance where a credentialed expert demonstrated untrustworthiness....and luckily, we have many thousands of instances.
> One option is paranoia: trust no one!
Another option is not framing epistemic strictness as a mental illness.
> But this, uh, limits your options quite a bit.
Not always.
> The other option is to make your best guess and realize that's all you're doing.
It's by far the most popular, but it is not the only other option.
See: https://en.wikipedia.org/wiki/False_dilemma
> This is, I think, where the most pernicious errors lie––feeling certain that you know the truth, when all you've really done is guess.
Do you see the irony here?
Why do we need something to trust? Do the laws of physics constrain us in some known way?
Yes, the laws of physics do constrain us. The main constraint is time, because it takes time and effort to evaluate any claim.
I'm an advocate of evaluating scientific claims for oneself, but time does not allow one to do a proper job of this for every claim that's out there. Prioritization is key. Questions to ask:
(1) How personally important to my life is this claim?
(2) How politically controversial is the field? (Claims in less politicized fields tend to be more trustworthy.)
Which laws of physics in particular make it necessary to trust, why am I not constrained by these laws, and how did you determine that all humans are constrained in the way you believe (missing me, at least, in the process)?
Let's presume you're not claiming to be free from the laws of physics. But plausibly like Descartes and the young St. Augustine, you claim you can avoid trusting other institutions, people, or knowledge.
St. Augustine reports why he came to believe this is impossible. And it's pretty clear Descartes did not actually doubt everything - but doubted enough wrong things to make some key contributions.
The charitable reading is that you are making a hyperbolic claim - not that you literally don't rely on any institutions -- requiring you to spend inordinate amounts of time testing your own electrical cords -- but that like Descartes you want to minimize this because you've seen "with how little wisdom the world is governed."
If it was admirable for Descartes, then also for you. But there is a tradeoff: Civilization is largely an expanded circle of trust, reducing the friction of transactions and enabling more to be done.
Less trust, more friction. But a frictionless scam is still a bad transaction.
I don't think you can manage "zero trust" any better than Descartes, but a targeted "trust, but verify" seems wise.
> Let's presume you're not claiming to be free from the laws of physics.
I am only claiming to not be subject to laws of physics (that you will not reveal despite me directly asking) that you claim constrain me. But sure, close enough.
> But plausibly like Descartes and the young St. Augustine, you claim you can avoid trusting other institutions, people, or knowledge.
Remove the "like Descartes and the young St. Augustine" part and I agree.
Possibly relevant: https://en.wikipedia.org/wiki/Framing_effect_(psychology)
> St. Augustine reports why he came to believe this is impossible. And it's pretty clear Descartes did not actually doubt everything - but doubted enough wrong things to make some key contributions.
Sure - why should I care what these old farts claim?
> The charitable reading...
The mind that generates the simulation tends to view it with rose colored glasses.
> ...is that you are making a hyperbolic claim...
Is it *me* who is making the hyperbolic claim here, in fact (as opposed to in individual subjective *experience*)?
Also, you should use "a", not "the". Seemingly small mistake, but one that I propose often has outsized consequences, especially when combined with hundreds of other similar seemingly small mistakes that can occur during human cognition (or: that *exist within* "reality").
> ...not that you literally don't rely on any institutions...
The point of contention is not "reliance" it is "trust".
> ...requiring you to spend inordinate amounts of time testing your own electrical cords...
Have "you" presumed a specific level of risk tolerance on my behalf? How could you possibly know any such thing with any level of accuracy?
> ...but that like Descartes you want to minimize this because you've seen "with how little wisdom the world is governed."
Maybe. Also maybe: give me 365 days with the nuclear briefcase thingamajig and see if we make it out of this gong show alive.
> If it was admirable for Descartes, then also for you. But there is a tradeoff: Civilization is largely an expanded circle of trust, reducing the friction of transactions and enabling more to be done.
More importantly imho: "civilization" is *simultaneously* an ever expanding circle of delusion. If you disagree with this bold claim, I am happy to defend it.
> Less trust, more friction. But a frictionless scam is still a bad transaction.
Sounds like a neat world, can I visit it?
> I don't think you can manage "zero trust" any better than Descartes, but a targeted "trust, but verify" seems wise.
I don't think you have any means of even remotely estimating my capabilities - rather, I suspect you are running on ("thinking" according to) culturally conditioned heuristics and a misunderstanding of many of the big ideas you have learned.
But hey....it's just a theory, I could be wrong.
Now, let's return to my question regarding your claims of fact, that you shrewdly sidestepped:
Which laws of physics in particular make it necessary to trust, why am I not constrained by these laws, and how did you determine that all humans are constrained in the way you believe (missing me, at least, in the process)?
EDIT:
Rather than edit my text to hide my error: I just realized that now it is I who am experiencing delusion: you and DH are different people, yet I've pinned his crimes on you, in a smug, self-righteous manner no less....OH THE HYPOCRISY!!
I agree. There are many claims that are self-impeaching, and others that are self-authenticating. In other words they are consistent with our own systems of logic, wisdom and experience. To be sure we humans on average suck at theoretical physics because it is so different. So I'll listen to Lawrence Krauss explain some aspect of theoretical physics. But I don't need to hear his Trump derangement prattle any more than I would have accepted relationship advice from Richard Feynman. We don't always need an anointed class of experts to tell us what to believe.
> In other words they are consistent with our own systems of logic, wisdom and experience.
Which no one really practices, or even tries to with any level of seriousness. Humans seem to generally be able to only think/operate in either the abstract realm or concrete realm at a single point in time, at least when contemplating certain topics. (This is an interesting theory that could be studied...it would be a grind for sure, but I think it would yield fruit.)
> To be sure we humans on average suck at theoretical physics because it is so different.
That is only one reason we suck at it....consider how many variables are involved prior to this event!
> So I'll listen to Lawrence Krauss explain some aspect of theoretical physics. But I don't need to hear his Trump derangement prattle any more than I would have accepted relationship advice from Richard Feynman. We don't always need an anointed class of experts to tell us what to believe.
If you do not expose yourself to how delusional the brightest minds on the planet are, you might form the belief that they are bright in an absolute sense, and that is EXTREMELY dangerous. As an example: have you ever noticed science *hasn't even tried* (or discussed trying) to develop a comprehensive plan for addressing climate change (despite allegedly having a Theory Of "Everything")? And this is only one of humanity's major problems!
This is one of the funnest substack threads I've been in for ages, kudos to our host for the great prompt!!!
They do, in fact: computability. You are a tiny little mote enmeshed in more information than you can possibly correlate for internal consistency in the span of the relevant decision (or your lifetime, or the lifetime of the universe). It's a vicious combinatorial problem. And so you trust that your mom was right that you shouldn't touch the stove, and you trust that the programming language you're using adds correctly, and that the pharmacist (and the manufacturer) put penicillin in your pills instead of rat poison. Being able to delegate and modularize trust at scale is in a very real way what makes civilization happen. So mostly we trust until the gears grind and we have cause to look deeper.
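The combinatorial point above can be made concrete with a toy sketch. (This is an editorial illustration, not from the comment itself; the one-check-per-second rate and 80-year span are arbitrary assumptions chosen only to show the shape of the problem.)

```python
# Toy model of the combinatorial problem: cross-checking every pair of
# n claims for mutual consistency takes n*(n-1)/2 comparisons, so the
# number of claims you can fully cross-check grows only as roughly the
# square root of your checking budget.

def pairwise_checks(n: int) -> int:
    """Comparisons needed to cross-check n claims pairwise."""
    return n * (n - 1) // 2

def max_cross_checkable(budget: int) -> int:
    """Largest n whose full pairwise cross-check fits in `budget` comparisons."""
    n = 0
    while pairwise_checks(n + 1) <= budget:
        n += 1
    return n

# One consistency check per second, around the clock, for 80 years:
budget = 80 * 365 * 24 * 60 * 60   # ~2.5 billion checks in a lifetime
print(pairwise_checks(1_000))      # 499500 -- half a million checks for just 1,000 claims
print(max_cross_checkable(budget)) # ~71,000 claims, total, ever
```

Under these (generous) assumptions, a lifetime of non-stop checking covers on the order of tens of thousands of claims, while an adult carries vastly more beliefs than that, which is the sense in which delegated trust is forced rather than optional.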
While it may be true that there are eventual hard physics constraints on human cognition, that's not what constrains us now. We have massive excess time and biological compute available; the problem is more that we leave it idle, or use it for foolishness like describing and arguing over our respective delusions. I don't deny that it's fun and seems like we're doing something useful, but I'm very suspicious that humans might be essentially driving in the ditch beside a lovely highway we cannot see.
I don't think the takeaway is necessarily that all credentials are total shit forever. Certifications keep getting reinvented because they do something useful. But it's also been true that certifications keep getting tossed on the heap because any thoughtful person actively engaged in any field of inquiry realizes their limitations, and sometimes those loom larger in certain moments and from certain angles. Every stamp of approval is ultimately an experiment (and perhaps a time-limited one, given the inexorable march of Goodhart's Law) in trying to draw some binary inference about a continuous character, and lots of times they don't look very interesting. Right now, I think there's lots of cause to believe that the infrastructure of scientific publishing (selective journals with vast fees on both sides of the transaction, impact factors, fussiness about shared authorship, pre-publication peer review) simply doesn't actually do what it says on the label, and most people working in science act accordingly everywhere except their funding arrangements: reading, citing, and discussing pre-prints being the biggest example. Post-publication review is turning out to be much bigger than pre-review, which was of course always the case.
And of course it's a sliding scale- if you notice that an article got published in 'Bob's Predatory Journal of Uniformly Politically Biased Results', sure, you're justified in drawing some inference from that. But at the end of the day worshipping institutional pedigrees is exactly what science should be trying to avoid whenever it can.
Exactly! We trust doctors because there's med school, the AMA, board certification, continuing training, etc. For academia, there need to be reliable, peer-reviewed journals that serve as a proxy for trust. Even this system has flaws but doing away with this system will lead to chaos. Also don't forget that wonderful science is done in academia, including stuff that gets developed into drugs/treatments later used by doctors.
To the extent these credentials help create trust at all, which is quite limited, it’s because they signal a doctor is a member of a community that possesses certain undeniable and useful skills, like setting broken bones and managing bacterial infections with antibiotics. That is, the AMA is a craft guild, and I trust a doctor, if I do, in the same sense that I trust a union carpenter. But science is not, primarily, a craft. People generally don’t go to a scientist when they have a specific problem to be reliably solved. The comparison is inapposite.
“usually polite but rarely honorable” exactly describes the people I interacted with in academia. Thank you. They were grasping, self-protective, dishonest, uncreative and shallow, for the most part. And there weren’t that many exceptions.
I sure do love reading your stuff, Adam! I can't wait to share this with friends. I'm a part of a network trying to figure out how to establish trust in pain research (https://entrust-pe.org/), and through it read an editorial from 1994 by D.G. Altman on the Scandal of Poor Medical Research (https://www.bmj.com/content/308/6924/283). It could have been written today, with even more damning examples/evidence.
I come to the world of research from a lived experience/patient partner perspective (I have yet to find a good term for this, if you have one please do share!), and my lack of accreditation is often used against me. I never wanted to be in this role of activist/advocate for better pain research and care, as with most of us in these roles I came to it through shitty experiences. And I've had a lot of shitty experiences within academia in this 'outsider' role. It's exhausting to have to continually prove one's worth as a human and fight to be heard.
Thankfully there's lots of good folks within (and without) academia who do want change and they make the fight easier. But there's lots of folks who need to 'tug harder on your tether and pull yourself closer to reality, because you’re embarrassing yourself.' The Scandal paper starts with the line 'We need less research, better research, and research done for the right reasons' and I think about that all the damn time. I wonder how many folks doing research have ever asked themselves what the right reasons are? And once identified, asked themselves if research is being done for the right reasons?
The problem in the modern world with this approach is that the answer is often complicated, or inconvenient in the short term and only obviously right in the medium or long term.
Which in the absence of respect for expertise leaves a lot of people vulnerable to going 'oh, but I don't want to wear a mask' or 'oh, but I like cheap plane rides' or 'chemo makes me feel rubbish and the placebo snake oil makes me feel really good'...
It's easier to sell the wrong answers because they can be as simple, compelling, convenient and cheap as necessary to sell them, whereas the truth is much more constrained.
This seems right: "For centuries, the smartest people on earth had one aspiration: produce some minor improvements on ancient scholars like Aristotle and Galen. Accordingly, that’s all they ever did. To invent a new world, humans first had to stop being so impressed with the old world." But it states a *necessary* condition on improvement, not a *sufficient* condition. What else was required? And: do we have that "what else" right now?
The decline of trust in institutions is the inevitable result of the institutions abusing the trust they were given. Covid sledgehammer policies are one reason why Zeus might at times be a better choice.
I'm here to invoke the law of equal and opposite advice. This article is very applicable to anyone who reads Experimental History. That kind of person can learn to see the fuzzy boundaries of expertise while still being smart enough to know 1) the basic facts and 2) the limits of their knowledge.
But I think this piece ignores the fact that the distrust of expertise is not all organic. Billions of dollars are spent every year to sow doubt among the populace. It's easy. Very easy to manipulate human psychology. My country has been running a drive to vaccinate girls against HPV. An honourable goal that needs a bit of trust in experts. But parents are refusing to give consent because some bad faith politicians have convinced them that the girls are being sterilized. A lil blind faith couldn't hurt them even if they 'don't know what's in the vaccine'.
It's unfortunate, because it means that people are often hesitant to take up medical interventions that are good for them, which sounds like what's happening in this case. But if you want people to trust you, you have to earn it, especially if you've broken that trust in the past.
True. But the field of medicine is a worldwide decentralized system. It most definitely will have such instances. We should be careful not to throw the baby out with the bath water though. And learn to see when discord and mistrust are being seeded from without. Because that is very different from people seeing the broken system themselves.
Not intentionally sterilized but injected with a seriously flawed vaccine that causes injuries. There is always a laudable goal. No one disputes the goal. It is the corrupt pharma system that games trials in the name of profits. You don't have to agree but to be so thoroughly dismissive of the prudence shown by parents is bothersome.
The way I see it, decline in major institutions is not a function of their claims to truth per se, but a reaction to an institution that violates, dehumanizes, stigmatizes and dismisses the range of human experience in order to extract profit and reproduce itself. Modern medicine is excellent at treating illness of identifiable material origin and treatment (like broken bones), but it is notoriously poor at responding to softer issues like substance abuse or mental illness, or even nutrition for that matter. In my experience working in a large hospital system, disengagement from the medical establishment is a rational response to the failures of that system rather than getting duped by some crank. Trust is not earned by experts forming watertight arguments, but about increasing people's felt sense of safety. This applies to academia too.
Also, I think it is an error to assume that "looking at goat entrails or consulting the stars or whatever" is essentially meaningless pseudoscience. Ritual practice carries a wide range of meaning and intention, few of which are intended to replace those of science. I see them operating sympathetically more than antithetically.
The Emergency Room is an interesting example. Apparently, there are big staffing problems in Emergency Departments at hospitals across the country. Most of the "physicians and physician extenders" (physicians, nurse practitioners, and others) are treated as independent contractors. In any case, it costs lots of money to staff an emergency department, we all know the costs are outrageous, and the hospitals are apparently still losing money. All of that to make this prediction: People will stop going to the Emergency Room to have a broken bone fixed. They'll go to a clinic that doesn't take insurance and is staffed by a nurse practitioner--of course until some law is passed that prohibits such a business in the interest of "patient safety."
I always enjoy your posts; they are engaging and thought-provoking, and I often agree with your critiques of higher ed. I've also lived in the academic world a long time - my credentials were stamped on a diploma in Y2K. That said, I think you've made a very broad generalization here, one that seems to rest on an assumption that all institutions of higher ed are the same. But they aren't. Really it's "the Ivies"/elites that are in the news with stories about injustice, inappropriate conduct, and questionable practices more often than not, and it's this news that fuels much of the public's mistrust in science / higher ed. However the elites are not actually representative of the masses. I came across another blogger recently who reminded readers that while egregious things happen at places like Harvard, the vast majority of profs and students work and study at places like UC-Stanislaus. The demographics of folks who work in the elite schools compared to folks who work in state colleges and other comprehensive universities are quite different. The entitlements, the entrenchments, these are concentrated in the elites, and more diluted elsewhere. Petty, cruel, non-curious faculty members are probably sprinkled throughout the system -- higher ed is a "safe place" for folks who might not make it outside the tower -- but I do not think they make up the majority. I've gotten my fair share of snubs at conferences over the years when someone looks at my name tag and finds the status it conveys wanting, but I also always find there are many lovely, well meaning, and curious academics out there too who want nothing more than to work towards improving the science in all the right ways.
What we need to clamor for is respect for open source publishing and other public-facing scholarship that invites the public in rather than shuts them out, and we need to move beyond the 19th and 20th century entrenchments that are holding our science back. We need to call out "famous" folks who are actually terrible scientists and/or terrible humans. But all this can happen without a complete system failure. I do not think we should be advocating for a crash and rebuild scenario. In such a scenario the re-set would be tragic: The elites that have overflowing coffers will rebuild. The rest though - the institutions with small endowments and limited funds? -- they will just be gone.
And that's the heart of the problem, as I see it. We have "two worlds" in academia: We have the elite legacy stemming from the olden days where the aristocracy needed something to do so they became scholars. But then changes to societies happened and regular folk got involved in higher ed too. The classist issues in higher ed today perpetuate and we don't talk about that nearly enough.
"over the people who form their worldviews by looking at goat entrails or consulting the stars or whatever"
ahh -- you reference the climate scientists that calculate warming to the third significant digit, while many of their inputs are not known to the first significant digit
You had two really good mentors, then met a bunch of shitty people. Your conclusions seem quite sweeping, as if you trust your sample to be representative - not just of a field or discipline, but the whole of academia (if not any form of institution). Why?
Out of curiosity, what do you see as the incentive for scientists to continue to publish in traditional journals? Lots of gatekeeping institutions derive their strength from coercive power – if you think the country is worse with FDA regulation than without it, you can't just decide to opt out of getting their approval. But if you think the public will afford you the same amount of legitimacy either way, there's nothing legally stopping you from submitting your paper to arXiv instead of Nature, right?
Probably a lot of scientists *don't* believe their work will earn as much respect that way, but what's stopping them from trying it once? Is there a lurking coordination problem?
uh oh I think this points to a fundamental underlying disagreement about human nature — serves me right for subscribing to a substack about psychology even though I don't believe in psychology just because it's well-written and interesting
> But if you think the public will afford you the same amount of legitimacy either way, there's nothing legally stopping you from submitting your paper to arXiv instead of Nature, right?
Most researchers aren't concerned about the opinion of the public. They're concerned about the opinion of tenure and grant committees.
"Instead of arguing from expertise, you should use your expertise to make better arguments." I will be quoting this (and will try not to be obnoxious when doing so). Thanks Adam!
I have nothing substantive to say other than that I continue to be flabbergasted by how good you are as a writer -- you keep writing these pieces that compel me to share them with friends who don't even care one whit about the scientific establishment (I'm in industry), and they end up reliably amazed and entertained.
I think I was also compelled to comment because a ~decade ago I found myself at a fork in life reminiscent of your description that starts with "Then, couple years ago, I looked around and realized that I didn’t actually admire most of the people I was trying to be. ...", and ended up leaving for industry instead of continuing on to academia. In my case, I was doing a physics degree in my boyhood quest to become a theorist of some sort. (I think it also helped for my decision that I wasn't good enough to get into R1 grad programs.) I think earning well and working on not-too-uninteresting data analytics problems have helped, but I've always wistfully wondered about what could've been; your essays have helped me un-tint that rose-tinted counterfactual.
As one who did get into and through an R1 grad program but also ended up in industry, I am eternally grateful to have had the good fortune to escape what I now consider the hell of academic life. So I hope my testimony further un-tints your counterfactual.
(BTW, I had a great time in grad school. But the thought of being a research professor at a university today fills me with fear and loathing.)
My thoughts and actions resemble yours: got the PhD, didn't want to live in that world, so first started my own company then joined another as it grew. It all went exceedingly well ... until we were bought out by a private equity company and they brought a new set of measures.
Standing up from my seat cheering loudly
“I, too, would like to beat the charlatans and the terrorists, which is why I want to do better than, “Don’t trust those guys—they lack the proper accreditation!” If that’s all you got, people shouldn’t trust you. Instead of arguing from expertise, you should use your expertise to make better arguments.”
While I love this article and the direction it is going, there is a major issue: if we don't trust academic institutions or titles or journalists, what do we trust? It is just not possible for the common man (even a very smart one) to independently evaluate every writer/scientist/inventor to see if they should trust what they have created.
We have to have _something_ to quickly let us know what to trust, perhaps like the Cochrane Library for medical info.
This is a great question and a good answer will probably require a whole post. Here's my short version:
It takes a lot of work to figure out who you can trust, and there's really no way of speeding that up. You could look at their credentials, for instance, but then, how do you know whether those credentials are trustworthy? You could see what other people say about them, but then, how do you know whether to trust those people? And so on, infinitely.
One option is paranoia: trust no one! But this, uh, limits your options quite a bit.
The other option is to make your best guess and realize that's all you're doing. This is, I think, where the most pernicious errors lie––feeling certain that you know the truth, when all you've really done is guess.
> It takes a lot of work to figure out who you can trust, and there's really no way of speeding that up. You could look at their credentials, for instance, but then, how do you know whether those credentials are trustworthy? You could see what other people say about them, but then, how do you know whether to trust those people? And so on, infinitely.
Here's a trick: realize that credentials do not guarantee trustworthiness - all one needs to prove this is to find literally one single instance where a credentialed expert demonstrated untrustworthiness....and luckily, we have many thousands of instances.
> One option is paranoia: trust no one!
Another option is not framing epistemic strictness as a mental illness.
> But this, uh, limits your options quite a bit.
Not always.
> The other option is to make your best guess and realize that's all you're doing.
It's by far the most popular, but it is not the only other option.
See: https://en.wikipedia.org/wiki/False_dilemma
> This is, I think, where the most pernicious errors lie––feeling certain that you know the truth, when all you've really done is guess.
Do you see the irony here?
Why do we need something to trust? Do the laws of physics constrain us in some known way?
Yes, the laws of physics do constrain us. The main constraint is time, because it takes time and effort to evaluate any claim.
I'm an advocate of evaluating scientific claims for oneself, but time does not allow one to do a proper job of this for every claim that's out there. Prioritization is key. Questions to ask:
(1) How personally important to my life is this claim?
(2) How politically controversial is the field? (Claims in less politicized fields tend to be more trustworthy.)
Which laws of physics in particular make it necessary to trust, why am I not constrained by these laws, and how did you determine that all humans are constrained in the way you believe (missing me, at least, in the process)?
Let's presume you're not claiming to be free from the laws of physics. But plausibly like Descartes and the young St. Augustine, you claim you can avoid trusting other institutions, people, or knowledge.
St. Augustine reports why he came to believe this is impossible. And it's pretty clear Descartes did not actually doubt everything - but doubted enough wrong things to make some key contributions.
The charitable reading is that you are making a hyperbolic claim: not that you literally don't rely on any institutions (requiring you to spend inordinate amounts of time testing your own electrical cords), but that like Descartes you want to minimize this because you've seen "with how little wisdom the world is governed."
If it was admirable for Descartes, then it is also for you. But there is a tradeoff: civilization is largely an expanded circle of trust, reducing the friction of transactions and enabling more to be done.
Less trust, more friction. But a frictionless scam is still a bad transaction.
I don't think you can manage "zero trust" any better than Descartes, but a targeted "trust, but verify" seems wise.
> Let's presume you're not claiming to be free from the laws of physics.
I am only claiming to not be subject to laws of physics (that you will not reveal despite me directly asking) that you claim constrain me. But sure, close enough.
> But plausibly like Descartes and the young St. Augustine, you claim you can avoid trusting other institutions, people, or knowledge.
Remove the "like Descartes and the young St. Augustine" part and I agree.
Possibly relevant: https://en.wikipedia.org/wiki/Framing_effect_(psychology)
> St. Augustine reports why he came to believe this is impossible. And it's pretty clear Descartes did not actually doubt everything - but doubted enough wrong things to make some key contributions.
Sure - why should I care what these old farts claim?
> The charitable reading...
The mind that generates the simulation tends to view it with rose-colored glasses.
> ...is that you are making a hyperbolic claim...
Is it *me* who is making the hyperbolic claim here, in fact (as opposed to in individual subjective *experience*)?
Also, you should use "a", not "the". Seemingly small mistake, but one that I propose often has outsized consequences, especially when combined with hundreds of other similar seemingly small mistakes that can occur during human cognition (or: that *exist within* "reality").
> ...not that you literally don't rely on any institutions...
The point of contention is not "reliance" it is "trust".
Possibly relevant: https://en.wikipedia.org/wiki/Semiotics
> ...requiring you to spend inordinate amounts of time testing your own electrical cords...
Have "you" presumed a specific level of risk tolerance on my behalf? How could you possibly know any such thing with any level of accuracy?
> ...but that like Descartes you want to minimize this because you've seen "with how little wisdom the world is governed."
Maybe. Also maybe: give me 365 days with the nuclear briefcase thingamajig and see if we make it out of this gong show alive.
> If it was admirable for Descartes, then it is also for you. But there is a tradeoff: civilization is largely an expanded circle of trust, reducing the friction of transactions and enabling more to be done.
More importantly imho: "civilization" is *simultaneously* an ever-expanding circle of delusion. If you disagree with this bold claim, I am happy to defend it.
> Less trust, more friction. But a frictionless scam is still a bad transaction.
Sounds like a neat world, can I visit it?
> I don't think you can manage "zero trust" any better than Descartes, but a targeted "trust, but verify" seems wise.
I don't think you have any means of even remotely estimating my capabilities - rather, I suspect you are running on ("thinking" according to) culturally conditioned heuristics and a misunderstanding of many of the big ideas you have learned.
But hey....it's just a theory, I could be wrong.
Now, let's return to my question regarding your claims of fact, that you shrewdly sidestepped:
Which laws of physics in particular make it necessary to trust, why am I not constrained by these laws, and how did you determine that all humans are constrained in the way you believe (missing me, at least, in the process)?
EDIT:
Rather than edit my text to hide my error: I just realized that now it is I who am experiencing delusion: you and DH are different people, yet I've pinned his crimes on you, in a smug, self-righteous manner no less....OH THE HYPOCRISY!!
Came across this reply again. Really gotta respect that last paragraph and the decision behind it.
I agree. There are many claims that are self-impeaching, and others that are self-authenticating. In other words they are consistent with our own systems of logic, wisdom and experience. To be sure we humans on average suck at theoretical physics because it is so different. So I'll listen to Lawrence Krauss explain some aspect of theoretical physics. But I don't need to hear his Trump derangement prattle any more than I would have accepted relationship advice from Richard Feynman. We don't always need an anointed class of experts to tell us what to believe.
> In other words they are consistent with our own systems of logic, wisdom and experience.
Which no one really practices, or even tries to with any level of seriousness. Humans seem to generally be able to only think/operate in either the abstract realm or concrete realm at a single point in time, at least when contemplating certain topics. (This is an interesting theory that could be studied...it would be a grind for sure, but I think it would yield fruit.)
> To be sure we humans on average suck at theoretical physics because it is so different.
That is only one reason we suck at it....consider how many variables are involved prior to this event!
> So I'll listen to Lawrence Krauss explain some aspect of theoretical physics. But I don't need to hear his Trump derangement prattle any more than I would have accepted relationship advice from Richard Feynman. We don't always need an anointed class of experts to tell us what to believe.
If you do not expose yourself to how delusional the brightest minds on the planet are, you might form the belief that they are bright in an absolute sense, and that is EXTREMELY dangerous. As an example: have you ever noticed science *hasn't even tried* (or discussed trying) to develop a comprehensive plan for addressing climate change (despite allegedly having a Theory Of "Everything")? And this is only one of humanity's major problems!
This is one of the funnest substack threads I've been in for ages, kudos to our host for the great prompt!!!
They do, in fact: computability. You are a tiny little mote enmeshed in more information than you can possibly correlate for internal consistency in the span of the relevant decision (or your lifetime, or the lifetime of the universe). It's a vicious combinatorial problem. And so you trust that your mom was right when you shouldn't touch the stove, and you trust that the programming language you're using adds correctly, and that the pharmacist (and the manufacturer) put penicillin in your pills instead of rat poison. Being able to delegate and modularize trust at scale is in a very real way what makes civilization happen. So mostly we trust until the gears grind and we have cause to look deeper.
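The combinatorial point can be made concrete with a toy calculation (my own illustration, not from the comment; `pairwise_checks` is a hypothetical helper): even the minimal task of cross-checking every claim you hold against every other claim requires C(n, 2) comparisons, which grows quadratically and swamps any available time budget long before n gets large.

```python
import math

def pairwise_checks(n_claims: int) -> int:
    # Number of pairwise consistency checks needed to cross-check
    # every claim against every other claim: C(n, 2) = n*(n-1)/2.
    return math.comb(n_claims, 2)

# Quadratic growth: each extra claim must be checked against all
# the claims you already hold, so the total explodes quickly.
for n in (10, 1_000, 1_000_000):
    print(f"{n} claims -> {pairwise_checks(n)} checks")
```

At a million claims that is roughly half a trillion checks, and this ignores that real consistency isn't pairwise at all: contradictions can hide in triples or larger subsets, which pushes the cost up even faster. Hence delegated trust.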
While it may be true that there are eventual hard physics constraints on human cognition, that's not what constrains us now. We have massive excess time and biological compute available; the problem is more that we leave it idle, or use it for foolishness like describing and arguing over our respective delusions. I don't deny that it's fun and seems like we're doing something useful, but I'm very suspicious that humans might be essentially driving in the ditch beside a lovely highway we cannot see.
I don't think the takeaway is necessarily that all credentials are total shit forever. Certifications keep getting reinvented because they do something useful. But it's also been true that certifications keep getting tossed on the heap, because any thoughtful person actively engaged in any field of inquiry realizes their limitations, and sometimes those loom larger in certain moments and from certain angles. Every stamp of approval is ultimately an experiment (and perhaps a time-limited one, given the inexorable march of Goodhart's Law) in trying to draw some binary inference about a continuous character, and lots of times they don't look very interesting. Right now, I think there's lots of cause to believe that the infrastructure of scientific publishing (selective journals with vast fees on both sides of the transaction, impact factors, fussiness about shared authorship, pre-publication peer review) simply doesn't actually do what it says on the label, and most people working in science act accordingly everywhere except their funding arrangements: reading, citing, and discussing pre-prints being the biggest example. Post-publication review is turning out to be much bigger than pre-review, which was of course actually always the case.
And of course it's a sliding scale- if you notice that an article got published in 'Bob's Predatory Journal of Uniformly Politically Biased Results', sure, you're justified in drawing some inference from that. But at the end of the day worshipping institutional pedigrees is exactly what science should be trying to avoid whenever it can.
Exactly! We trust doctors because there's med school, the AMA, board certification, continuing training, etc. For academia, there need to be reliable, peer-reviewed journals that serve as a proxy for trust. Even this system has flaws but doing away with this system will lead to chaos. Also don't forget that wonderful science is done in academia, including stuff that gets developed into drugs/treatments later used by doctors.
To the extent these credentials help create trust at all, which is quite limited, it’s because they signal a doctor is a member of a community that possesses certain undeniable and useful skills, like setting broken bones and managing bacterial infections with antibiotics. That is, the AMA is a craft guild, and I trust a doctor, if I do, in the same sense that I trust a union carpenter. But science is not, primarily, a craft. People generally don’t go to a scientist when they have a specific problem to be reliably solved. The comparison is inapposite.
Well stated. Obviously a different Jerome Powell.
Journals are known to not be trustworthy, and yet academia continues to exist.
Plenty of children trust doctors while having zero knowledge of the things you mention.
“usually polite but rarely honorable” exactly describes the people I interacted with in academia. Thank you. They were grasping, self-protective, dishonest, uncreative and shallow, for the most part. And there weren’t that many exceptions.
I sure do love reading your stuff, Adam! I can't wait to share this with friends. I'm a part of a network trying to figure out how to establish trust in pain research (https://entrust-pe.org/), and through it read an editorial from 1994 by D.G. Altman on the Scandal of Poor Medical Research (https://www.bmj.com/content/308/6924/283). It could have been written today, with even more damning examples/evidence.
I come to the world of research from a lived experience/patient partner perspective (I have yet to find a good term for this, if you have one please do share!), and my lack of accreditation is often used against me. I never wanted to be in this role of activist/advocate for better pain research and care, as with most of us in these roles I came to it through shitty experiences. And I've had a lot of shitty experiences within academia in this 'outsider' role. It's exhausting to have to continually prove one's worth as a human and fight to be heard.
Thankfully there's lots of good folks within (and without) academia who do want change and they make the fight easier. But there's lots of folks who need to 'tug harder on your tether and pull yourself closer to reality, because you’re embarrassing yourself.' The Scandal paper starts with the line 'We need less research, better research, and research done for the right reasons' and I think about that all the damn time. I wonder how many folks doing research have ever asked themselves what the right reasons are? And once identified, asked themselves if research is being done for the right reasons?
Thanks again.
The problem in the modern world with this approach is that the answer is often complicated, or inconvenient in the short term and only obviously right in the medium or long term.
Which in the absence of respect for expertise leaves a lot of people vulnerable to going 'oh, but I don't want to wear a mask' or 'oh, but I like cheap plane rides' or 'chemo makes me feel rubbish and the placebo snake oil makes me feel really good'...
It's easier to sell the wrong answers because they can be as simple, compelling, convenient and cheap as necessary to sell them, whereas the truth is much more constrained.
When things are complicated, it's often because we haven't fully understood them yet. Now that we finally understand how to cure scurvy, for example, it's hard to be simpler than "Eat this lemon." This is a good piece about that: https://slimemoldtimemold.com/2022/01/11/reality-is-very-weird-and-you-need-to-be-prepared-for-that/
Also a theme of one of my recent posts: https://www.experimental-history.com/p/there-are-no-statistics-in-the-kingdom
THIS!
This seems right: "For centuries, the smartest people on earth had one aspiration: produce some minor improvements on ancient scholars like Aristotle and Galen. Accordingly, that’s all they ever did. To invent a new world, humans first had to stop being so impressed with the old world." But it states a *necessary* condition on improvement, not a *sufficient* condition. What else was required? And: do we have that "what else" right now?
I would add "diverse convictions about how that better world might be created, and the agency to attempt them."
Smart questions!
The decline of trust in institutions is the inevitable result of the institutions abusing the trust they were given. Covid sledgehammer policies are one reason why Zeus might at times be a better choice.
https://www.msn.com/en-us/news/world/one-by-one-the-lockdown-myths-are-crumbling/ar-BB1hwLGC?ocid=msedgdhp&pc=DCTS&cvid=c1edd3edf8a5442a8495a93feb7e9149&ei=140
I'm here to invoke the law of equal and opposite advice. This article is very applicable to anyone who reads Experimental History. That kind of person can learn to see the fuzzy boundaries of expertise while still being smart enough to know (1) the basic facts and (2) the limits of their knowledge.
But I think this piece ignores the fact that the distrust of expertise is not all organic. Billions of dollars are spent every year to sow doubt among the populace. It's easy. Very easy to manipulate human psychology. My country has been running a drive to vaccinate girls against HPV. An honourable goal that needs a bit of trust in experts. But parents are refusing to give consent because some bad-faith politicians have convinced them that the girls are being sterilized. A lil blind faith couldn't hurt them even if they "don't know what's in the vaccine."
That same blind faith has, unfortunately, gotten people mistreated at the hands of the government and the medical establishment. (For instance: https://en.wikipedia.org/wiki/Tuskegee_Syphilis_Study)
It's unfortunate, because it means that people are often hesitant to take up medical interventions that are good for them, which sounds like what's happening in this case. But if you want people to trust you, you have to earn it, especially if you've broken that trust in the past.
True. But the field of medicine is a worldwide decentralized system. It most definitely will have such instances. We should be careful not to throw the baby out with the bath water though. And learn to see when discord and mistrust are being seeded from without. Because that is very different from people seeing the broken system themselves.
Not intentionally sterilized but injected with a seriously flawed vaccine that causes injuries. There is always a laudable goal. No one disputes the goal. It is the corrupt pharma system that games trials in the name of profits. You don't have to agree but to be so thoroughly dismissive of the prudence shown by parents is bothersome.
The way I see it, decline in major institutions is not a function of their claims to truth per se, but a reaction to an institution that violates, dehumanizes, stigmatizes and dismisses the range of human experience in order to extract profit and reproduce itself. Modern medicine is excellent at treating illness of identifiable material origin and treatment (like broken bones), but it is notoriously poor at responding to softer issues like substance abuse or mental illness, or even nutrition for that matter. In my experience working in a large hospital system, disengagement from the medical establishment is a rational response to the failures of that system rather than getting duped by some crank. Trust is not earned by experts forming watertight arguments, but about increasing people's felt sense of safety. This applies to academia too.
Also, I think it is an error to assume that "looking at goat entrails or consulting the stars or whatever" is essentially meaningless pseudoscience. Ritual practice carries a wide range of meaning and intention, few of which are intended to replace those of science. I see them operating sympathetically more than antithetically.
The Emergency Room is an interesting example. Apparently, there are big staffing problems in Emergency Departments at hospitals across the country. Most of the "physicians and physician extenders" (physicians, nurse practitioners, and others) are treated as independent contractors. In any case, it costs lots of money to staff an emergency department, we all know the costs are outrageous, and the hospitals are apparently still losing money. All of that to make this prediction: people will stop going to the Emergency Room to have a broken bone fixed. They'll go to a clinic that doesn't take insurance and is staffed by a nurse practitioner, at least until some law is passed that prohibits such a business in the interest of "patient safety."
I always enjoy your posts, they are engaging and thought provoking and I often agree with your critiques of higher ed. I've also lived in the academic world a long time - my credentials were stamped on a diploma in Y2K. That said, I think you've made a very broad generalization here, one that seems to rest on an assumption that all institutions of higher ed are the same. But they aren't. Really it's "the Ivies"/Elites that are in the news with stories about injustice, inappropriate conduct, and questionable practices more often than not, and it's this news that fuels much of the public's mistrust in science / higher ed. However the elites are not actually representative of the masses. I came across another blogger recently who reminded readers that while egregious things happen at places like Harvard, the vast majority of profs and students work and study at places like UC-Stanislaus. The demographics of folks who work in the elite schools compared to folks who work in state colleges and other comprehensive universities are quite different. The entitlements, the entrenchments, these are concentrated in the elites, and more diluted elsewhere. Petty, cruel, non-curious faculty members are probably sprinkled throughout the system -- higher ed is a "safe place" for folks who might not make it outside the tower -- but I do not think they make up the majority. I've gotten my fair share of snubs at conferences over the years when someone looks at my name tag and finds the status it conveys wanting, but I also always find there are many lovely, well meaning, and curious academics out there too who want nothing more than to work towards improving the science in all the right ways.
What we need to clamor for is respect for open source publishing and other public-facing scholarship that invites the public in rather than shuts them out, and we need to move beyond the 19th and 20th century entrenchments that are holding our science back. We need to call out "famous" folks who are actually terrible scientists and/or terrible humans. But all this can happen without a complete system failure. I do not think we should be advocating for a crash and rebuild scenario. In such a scenario the re-set would be tragic: The elites that have overflowing coffers will rebuild. The rest though - the institutions with small endowments and limited funds? -- they will just be gone.
And that's the heart of the problem, as I see it. We have "two worlds" in academia: We have the elite legacy stemming from the olden days where the aristocracy needed something to do so they became scholars. But then changes to societies happened and regular folk got involved in higher ed too. The classist issues in higher ed today perpetuate and we don't talk about that nearly enough.
"over the people who form their worldviews by looking at goat entrails or consulting the stars or whatever"
ahh -- you reference the climate scientists that calculate warming to the third significant digit, while many of their inputs are not known to the first significant digit
You had two really good mentors, then met a bunch of shitty people. Your conclusions seem quite sweeping, as if you trust your sample to be representative - not just of a field or discipline, but the whole of academia (if not any form of institution). Why?
Out of curiosity, what do you see as the incentive for scientists to continue to publish in traditional journals? Lots of gatekeeping institutions derive their strength from coercive power – if you think the country is worse with FDA regulation than without it, you can't just decide to opt out of getting their approval. But if you think the public will afford you the same amount of legitimacy either way, there's nothing legally stopping you from submitting your paper to arXiv instead of Nature, right?
Probably a lot of scientists *don't* believe their work will earn as much respect that way, but what's stopping them from trying it once? Is there a lurking coordination problem?
I think the thing stopping them is fear. As for incentives: https://twitter.com/alicemazzy/status/1545394998732115969
uh oh I think this points to a fundamental underlying disagreement about human nature — serves me right for subscribing to a substack about psychology even though I don't believe in psychology just because it's well-written and interesting
> But if you think the public will afford you the same amount of legitimacy either way, there's nothing legally stopping you from submitting your paper to arXiv instead of Nature, right?
Most researchers aren't concerned about the opinion of the public. They're concerned about the opinion of tenure and grant committees.
Why would they? No upside.
I'm not an academic, but I think you would save on publishing fees this way, right?