I've had LLMs make shit up, including math, many times during my undergrad work. I used them as research assistants and never trusted them for anything serious — mostly just for finding references and sanity-checking my work. Meanwhile, people out there are making decisions based on what could be made-up information. The Meta one admitted to me that it's trained to provide information that sounds plausible when it can't find the real answer. Reassuring as fuck.