Ah, good, that was what I was looking for: an overview of what you think on the topic. Thank you for that. This allows me to better understand your premises and the definitions you use.
I agree with you on several points: that humans have general intelligence and create knowledge; that the behavior of animals is largely automatic, driven by what’s in their genes; and about the different types of algorithms.
These large-scale, widespread problems causing human suffering seem more important than animal suffering. Even if you hate how factory farms treat animals, you should probably care more that a lot of humans live in terrible conditions including lacking their freedom in major ways.
The problems you point out are very important, indeed, and I’d like to see them solved too. However, I think an important element we should take into account here is scale (and tractability).
How many people do you think are suffering from these problems? How many animals do you think are suffering from factory farming? I’d be curious to see your estimates.
Also, as is common with causes, activists tend to be biased about their issue. Many people who care about the (alleged) suffering of animals do not care much about the suffering of human children, and vice versa.
Do you have a source for that claim? It sounds false to me. Most of the people I know who care about animals also care a lot about humans. I personally have donated quite a lot to charities like the Against Malaria Foundation, and I know several people who have done the same. Do you think that if you asked people interested in animal welfare whether they care about child suffering, they’d say no?
Suffering involves wanting something and getting something else. Reality violates what you want. E.g. you feel pain that you don’t want to feel. Or you taste a food that you don’t want to taste. […] Suffering involves something happening and you interpreting it negatively. That’s another way to look at wanting something (that you would interpret positively or neutrally) but getting something else (that you interpret negatively).
I absolutely agree with that. It’s a very good way of defining suffering, actually.
Animals can’t interpret like this. They can’t create opinions of what is good and bad. This kind of thinking involves knowledge creation.
This is a very important claim at the core of your argument, since it’s what links your position on intelligence and suffering. I think we should try to talk about that first (I will answer about Appearances, Activism and Uncertainty in another response).
Personally, I don’t really understand why knowledge creation is required for thinking that something is bad. When my expectation is “I don’t want to starve to death”, I haven’t created any knowledge here. It’s innate.
Starving to death is indeed something below my expectations, but I don’t see why forming my expectations on the topic required knowledge creation. It’s likely that my genes pushed me toward this default position in an automatic way.
I understand that there are specific situations where a few people might agree to feel pain (“Sometimes humans like pain”, as you said), or even to starve to death given a strong enough reason (a high enough social reward, a strong belief system, wanting to support the next generation). But that is a way of overriding a very powerful innate instinct. The “knowledge creation” part, I think, lies in creating beliefs strong enough to override innate instincts, not the other way around.
I would tend to think that animals have default expectations of what’s good and bad arising from evolutionary pressure (they want food because they can’t survive without it; they don’t want to lose a leg or starve because they’d die), and that they mostly cannot change these. In that case, they would still be in the situation of “wanting something and getting something else” - which, as you said, is suffering.
Why do you think that, for instance, my being fearful of dying is something that requires knowledge creation? I know I can override that fear, to some extent, but it seems like the default position, no?