I've had LLMs make shit up, including math, many times during my undergrad work. I used them as research assistants and never trusted them for anything serious. Mostly just for finding references and checking my work or whatever. People are out there making decisions based on what could be made-up information. The Meta one admitted to me that it's trained to provide information that sounds plausible when it can't find the real information. Reassuring as fuck.


Discussion

They just repeat what they were told.

I'm not against reasonable LLM usage, but it's a crutch for these people.

Same. They don't necessarily just repeat though.