My Rational Debate Policy [CF Video]


Context: I think it’d be good if this kind of debate policy were normal/common among intellectuals. I think you have correctly identified that lots of intellectuals are wrong about lots of stuff, and you know it, and you’re willing to help, but they won’t debate and instead stay wrong. And I think you’re right to set the example by having such a policy yourself.

With that in mind…

https://www.elliottemple.com/debate-policy

The key idea is being open to error corrections from the public rather than acting in a way where if I’m biased, mistaken or irrational, I’ll stay that way even though people knew better and were willing to help me.

I’d modify the last clause to:
…I’ll stay that way even though people knew better explicitly and were willing and able to help me.

I don’t think your policy will help you in the case that someone has an intuition that’s better than your explicit knowledge, but your explicit knowledge is better than their explicit knowledge. I can think of reasons why that’d be super unlikely, and also reasons why it’d be reasonably likely. I don’t know / I take no position on how likely it actually is.

I also don’t think your policy is likely to help you in the case that someone has better explicit knowledge but lacks sufficient debate skill. I take no position on how likely it is that someone actually has better explicit knowledge than you AND needs to invoke your debate policy. But if you take that scenario as a given (they do have better explicit knowledge AND need to invoke the debate policy) then I think it’s highly likely they’d lack the skill to win the debate despite being correct. Put another way: I think you have the skill to win most debates against most people even if you’re on the wrong side. Note: I’m not talking about devil’s advocate situations - only situations where you genuinely believe the position you’re taking.

I have no major objection. Knowing is most of being “able”, though not all of it. I did have in mind explicit knowledge. I’ll consider an edit.

If they have that intuition plus they know some other stuff, then they could help. I’m actually working on multiple articles about intuition currently and the first one is scheduled to be posted on the CF site tomorrow. The articles will explain how intuition can be discussed and used in explicit, rational debate.

Also, if the person who has the intuition doesn’t know how to explicitly analyze it and talk about it in a discussion, then basically he also doesn’t know that it is better than my explicit conclusion. He ought to be undecided on the matter rather than believe his intuitive idea is better than my position. (Again, my upcoming articles will discuss this more.) So he’s not in a position to correct me because he doesn’t know whether he has an improvement or not. He has an idea which is different and which could potentially be an improvement, but the world has a lot of those. It’s when someone has actually evaluated the matter and reached the conclusion that I’m mistaken that I’d like to hear about it. I don’t care to hear from e.g. everyone who has brainstormed any ideas different from my own. (People are welcome to share brainstorming if they think it’s worth sharing with me. The kind of person who reads a lot of my stuff and posts on my forum would have a reasonable chance to guess which brainstorming would be interesting to me or at least worth me quickly reading. They have some reasonable ability to be selective and not share all their thoughts with me. But if everyone in the world were sharing brainstorming without claiming to have concluded their ideas were actually better than mine, it’d be overwhelming and wouldn’t be useful.)

How well it would help in that case depends partly on my integrity. I could be helped, but it’d also be easier for me to not listen if e.g. I were biased against the idea.

If I acted with poor integrity, then any observer could notice and debate it. So the issue isn’t just the debate skill of the person raising the idea, but also that of the entire audience.

Also, sufficient debate skill has major overlap with the same critical thinking skill needed to create and evaluate the knowledge itself. If they can’t debate it with me in a conclusive way, then how exactly did they debate it internally in their own mind and reach a conclusion? Why can’t they just repeat whatever explicit thought process evaluated the explicit knowledge and use that statement as their debating position? It is possible to be good at rational debate (so you can do it alone or with other people who are behaving rationally) but not know how to deal with irrational debate actions/comments/tricks, so that’s one potential answer. I’m not saying there are no problems here, but it may be less commonly problematic than most people think.

https://ftxfuturefund.org/announcing-the-future-funds-ai-worldview-prize/

We do not plan to read everything written with the aim of claiming these prizes. We plan to rely in part on the judgment of other researchers and people we trust when deciding what to seriously engage with. We also do not plan to explain in individual cases why we did or did not engage seriously.

Our published decisions will be final and not subject to appeal. We also won’t be able to explain in individual cases why we did not offer a prize.

My bold. It’s like an anti-Paths-Forward and anti-debate policy. But transparently, in writing, which is unusual. I think they just have no idea how to organize efficient, rational debate or discussion, and aren’t trying to solve the problem either. AGI isn’t too hard, but that is too hard to even try!?

Also:

If a single person does research leading to multiple updates, Future Fund may—at its discretion—award the single largest prize for which the analysis is eligible (rather than the sum of all such prizes).

If they decide to count whatever you say as multiple arguments instead of one big argument, they may arbitrarily reduce the amount of money they pay you. Although the amount was arbitrary anyway.
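
To make the arithmetic concrete, here’s a minimal sketch of the max-vs-sum difference, assuming made-up prize amounts (the specific figures are illustrative, not taken from the Future Fund announcement):

```python
# Hypothetical illustration of the "single largest prize rather than the sum" clause.
# The amounts below are made up for the example.
prizes_your_work_qualifies_for = [500_000, 1_500_000]  # suppose your writing leads to two eligible updates

payout_as_sum = sum(prizes_your_work_qualifies_for)           # 2,000,000
payout_as_largest_only = max(prizes_your_work_qualifies_for)  # 1,500,000

print(payout_as_sum - payout_as_largest_only)  # 500,000 less, at their discretion
```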