
We are uniquely capable of doing that because we invented it :) It's a self-serving definition, a job description.

This isn't an argument against LLMs' capability. But the burden of proof is on the LLMs' side.



True. That capability might be reserved for AGI. The current implementation does feel like a party trick, and I don't enjoy working with it.



