
The more I work with AI (LLMs), the more disillusioned I become.

Image / Video / Audio Gen. AI is absolutely impressive, but I fail to see how anyone can still believe that LLMs are going to replace every other white-collar job ... let alone lead to AGI.

It's 2026 and we have yet to see an LLM complete something as simple as beating Pokemon Blue.

seems like LLMs really are just dumb text generators after all, good for 3 things:
- generating code
- translating / summarizing text
- pseudo "google search" (because real google search got turned to shit)

i wonder how much of the disillusionment comes from the interaction mode rather than the model itself.

i have a coaching background and accidentally used open questions instead of instructions when working with claude last week. stuff like "what would break here?" instead of "check for errors." the outputs were qualitatively different. not just better, but different in kind.

when you only give instructions, you only see execution. change the conversational style and you get something else entirely. most people (myself included until recently) talk to LLMs like they're command line tools. maybe that's why they feel like dumb text generators.

not saying they're secretly brilliant. just that we might be measuring their ceiling through a very narrow interaction pattern.