Discussion
Grammarly is using our identities without permission
drbig: The most interesting part is the realization that if the LLM's input is only the output of a professional (human), then by definition the LLM cannot mimic the process the (human) professional applied to get from whatever input they had to the output they produced. In other words, an LLM can spit out a plausible "output of X", but it cannot encode the process that led X to transform their inputs into their output.
weird-eye-issue: Replace "LLM" with "student" and read that again. You don't just blindly give students outputs; you teach them, which is what you're supposed to do with an LLM.
Eddy_Viscosity2: Is it not possible for the input-to-output process to be inferred by the LLM and then applied to new inputs to create appropriate outputs?
tomhow: Comments moved to https://news.ycombinator.com/item?id=47259366.