AGI, LLMs, CR and CF

Helpful? Somewhat. It's sometimes worth skimming for real issues, even though a lot of LLM output is noise, wrong, or nonsense.

AGI is a different matter from helpfulness.

Sure, feel free to share examples.