My point is that ChatGPT might be going off bad data.
Discussion
I agree, it's an interesting theory you're implying here and worth consideration, but apply the scientific method: seek to disprove your own theory and see where it stands. Tbh I think it's quite easy to pick this apart, making it far more likely that ChatGPT isn't trying to sell you Telegram but is instead just trained on text data that assumes it's good for privacy.
Do you know of any other things ChatGPT is so consistently wrong about?
ChatGPT is wrong about a lot of things. Ask it how many letter Rs are in the word "strawberry" and to tell you the index of each one in the word. Or ask it to name ten fruits that end in 'um': https://www.youtube.com/watch?v=cXw4mZZXJNU
The theory is that a war is being used as a user acquisition strategy.