💠 A key weakness of llms is trying to do everything in one message instead of actually thinking step by step. E.g. if I explain a very difficult problem I'm working through, it'll just pretend it's solving it, when a thoughtful reply would acknowledge the difficulty, break the problem down, make connections to existing techniques, etc. Ask an llm to write something long-form and it'll just start putting words down without any kind of outlining, totally blind to how generic and sloppy the writing is. You could wrap any prompt in "make a plan breaking down the steps to do X" and forward those subtasks to more llm calls (rough sketch below), but how do you know when you've reached a level of detail that can be acted on rather than one that needs further breakdown?
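To make the open question concrete, here's a minimal sketch of that recursive wrapping, assuming a hypothetical `call_llm` helper (a stand-in for whatever model call you'd actually use; the prompts, the `is_actionable` judge, and the depth cap are all arbitrary choices, not a recipe). The unanswered part is the stopping check itself, which here just gets punted to another model call plus a depth limit.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for a real model call."""
    raise NotImplementedError


def is_actionable(task: str) -> bool:
    # Ask the model whether the task is concrete enough to execute in one
    # focused step. This is the weak link: the judge has the same blind
    # spots as the planner it's supposed to keep honest.
    answer = call_llm(
        "Can the following task be completed in one focused step, "
        "with no further planning? Answer YES or NO.\n\n" + task
    )
    return answer.strip().upper().startswith("YES")


def decompose(task: str, depth: int = 0, max_depth: int = 3) -> list[str]:
    # Recursively break a task down until the judge calls each piece
    # actionable, or until an arbitrary depth cap stops the recursion.
    if depth >= max_depth or is_actionable(task):
        return [task]
    plan = call_llm(
        "Break the following task into 3-6 concrete subtasks, "
        "one per line:\n\n" + task
    )
    subtasks = [
        line.strip("-* ").strip()
        for line in plan.splitlines()
        if line.strip()
    ]
    steps: list[str] = []
    for sub in subtasks:
        steps.extend(decompose(sub, depth + 1, max_depth))
    return steps
```

The depth cap is doing the real work here, which is exactly the problem: without a principled notion of "actionable", the recursion either stops too early (still vague) or burns calls splitting hairs.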