It’s kinda funny. Their magical bullshitting machine scored higher on made-up tests than our magical bullshitting machine, and now the economy is in shambles! It’s like someone losing a year’s wages on sports betting.
Just because people are misusing tech they know nothing about does not mean this isn’t an impressive feat.
If you know what you’re doing, and know enough to tell when they give you garbage, LLMs are really useful. But part of using them correctly is giving them grounding context instead of just blindly asking questions.
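Rough sketch of what I mean by grounding context, assuming the openai Python client; the model name, file path, and question are just placeholders:

```python
# Minimal sketch: a "blind" question vs. the same question grounded in a
# pasted excerpt. Assumes the openai Python package and an OPENAI_API_KEY
# in the environment; docs/retry_policy.md is a hypothetical file.
from openai import OpenAI

client = OpenAI()

# Blind question: nothing to anchor on, so the answer may be confabulated.
blind = [
    {"role": "user", "content": "What does our retry policy do on 503s?"},
]

# Grounded question: include the relevant source/doc text so the answer
# can be checked against it.
excerpt = open("docs/retry_policy.md").read()
grounded = [
    {"role": "system",
     "content": "Answer only from the provided excerpt; say if it is not covered."},
    {"role": "user",
     "content": f"Excerpt:\n{excerpt}\n\nQuestion: What does our retry policy do on 503s?"},
]

for messages in (blind, grounded):
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(resp.choices[0].message.content, "\n---")
```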
It is impressive, but the marketing around it has really, really gone off the deep end.