Ehhhh. Saying it’s not intelligence “of any kind,” when it can construct whole relevant sentences, is conflating intelligence with correctness. LLMs represent a lesser form of reasoning - like a pushdown automaton next to a Turing machine: genuinely computing, just with less machinery. They’re plainly doing some of what goes into proper general thinky-thinky behavior. They’re just not doing enough of it to avoid obvious fuckups.
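For anyone who hasn’t run into the automata analogy before, here’s a minimal Python sketch of the gap it leans on: a single stack (pushdown-automaton power) handles balanced parentheses, a context-free task, but no stack machine can recognize a^n b^n c^n, which general computation does trivially. The function names here are just illustrative, not from any source.

```python
def balanced(s: str) -> bool:
    """PDA-style check: one stack suffices for this context-free language."""
    stack = []
    for ch in s:
        if ch == "(":
            stack.append(ch)
        elif ch == ")":
            if not stack:
                return False
            stack.pop()
    return not stack


def abc_equal(s: str) -> bool:
    """Recognize a^n b^n c^n (context-sensitive): beyond any single-stack
    machine, but trivial with general computation."""
    n = len(s) // 3
    return len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n


assert balanced("(()())") and not balanced("(()")
assert abc_equal("aabbcc") and not abc_equal("aabbc")
```

Both machines are doing real computation; one just has strictly less of it available. That’s the shape of the claim about LLMs.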