Robin Hanson Paths Forward

Robin Hanson posted yesterday about how truth-seeking is sacred to him, so I replied about Paths Forward:

Note that I attempted to bring up Paths Forward with Robin Hanson in 2018: Curiosity – Paths Forward or Prediction Markets?


https://www.overcomingbias.com/2021/11/what-is-sacred-to-me.html

truth-seeking among intellectuals on important topics

Suppose, hypothetically, that you’re wrong about something important. What reasonable series of actions could someone take to correct your error?

Elaborating: Suppose a member of the public knows your error and would like to tell you. Further suppose that he cannot tell you in a paragraph or two. Your initial reaction will be that the idea does not sound promising. That’s because it’s complicated, it’s counter-intuitive, it disagrees with multiple ideas you have, and it relies on several pieces of background knowledge you don’t have. It’s not the kind of easily-understandable, immediately-appealing idea that would go viral. Also, the critic doesn’t have the right credentials, social status or social network to get your attention that way.

I understand that you have to protect your time, energy and attention. I understand that the large majority of people who’d like to teach you something are, in fact, wrong. But do you have a written policy for how you protect your time and attention without blocking good ideas, one which explains to a would-be critic what steps he needs to take to avoid being incorrectly ignored? Is such a policy publicly exposed to critical scrutiny? Do you have public transparency and accountability for how you handle this stuff? I believe not.

It’s also important, for persuasion, that would-be critics who are wrong can get their criticisms addressed. Otherwise they’ll keep thinking you’re wrong with no way to learn better from you. Again you must protect your time, but there are things you can do. Books, blogs and FAQs help but aren’t enough. A discussion forum with an organized community that takes responsibility for answering questions and criticisms would do more. That’d involve proxies who can answer some things in your place, and written escalation policies for when the proxies don’t address issues adequately.

I think these methodological issues, about resource-efficient ways for error correction to happen, are crucial to truth-seeking, and that you and many others are not working on them. I have been: Paths Forward Summary


Hanson replied:

I agree it is important to develop better institutions for this case, but I haven’t come up with anything, so don’t have anything concrete to suggest.

I replied:

I have concrete suggestions. I linked to some of my articles explaining them. One is to publish a policy for how you (and your proxies) will engage with critics, which explains your approach, offers some guarantees if certain conditions are met, etc. Then provide transparency regarding how you follow it so the public can hold you accountable. I do this: Debate Policy · Elliot Temple. I know I have a smaller following than you. You could make a more conservative/limited policy, which filters more aggressively than mine, and that would still put you way ahead of your peers.


I liked a reply that Elias Håkansson wrote to Hanson:

Truth-seeking is one of the least controversial values to hold sacred. It’s almost a humble brag. I’d be more impressed with your vulnerability display if you divulged some situation where you’ve been tempted to sacrifice truth-seeking at the altar of some other more controversial value, such as self-aggrandizement. Like maybe someone produced a really clever argument against some concept you spent many years developing, like prediction markets or grabby aliens or whatever, and you found yourself incapable of reconciling the discrepancy. But instead of signal-boosting that someone found a hole in your argument (in the name of truth-seeking!!), you suppress it or rationalize around it, or even try to keep it secret.

Harsh but not unreasonable.

Robin Hanson replied to a couple of other people this morning but did not reply to me again. I’ll check back in a few days in case he does, but assuming he doesn’t:

What can you learn from this exchange? Got any analysis or conclusions?

He has conceded that he doesn’t know how to deal with soliciting error correction. He could read your stuff and ask you questions, publicly or privately. He could criticize your ideas, publicly or privately. It doesn’t sound like he’s done that. The rest of the comments don’t look very meaty, so there’s not much there to distract him. He wants to look like he’s taking criticism and suggestions seriously, but he isn’t doing it, or he’s doing a bad job.

Yeah. I think the thread is damning.

His reply to me reads like he was not paying attention to what I said (or else he’s an idiot, but I think he just wasn’t paying attention). He appears to have autopiloted to assuming I was raising a problem in hopes he could say the answer, instead of raising a problem that applies to him and that I have an answer to. His reply to me also shifted the issue from applying to individuals like him to being about institutions.

So he refused to pay attention to an idea about Paths Forward. There are no Paths Forward because he just won’t listen at all…

It can be very hard to report software security breaches to businesses for free. They often ignore your communications.

These things are much simpler to listen to, and benefit from, than philosophical corrections. But people still badly screw up anything like Paths Forward.

Contact about software security usually begins privately, which lets companies hide the problem. It often becomes public if they ignore it, so there’s extra incentive to deal with it. Sometimes it becomes public either way, but having already fixed the problem before it goes public makes them look better.

There are additional comments about the difficulty of telling people about software security bugs at Beg Bounties | Hacker News