I don’t understand why there is a “think harder” button on SuperGrok. Shouldn’t it think harder all the time? Why would someone want it to half-ass its answers?
This makes no sense to me 😂
It's slower and probably costs them more in compute; most of the time you can get a 100% answer with the quick mode anyway
I’m mostly shitposting 😂
I’m a little bit pissed because the new dumb companions take all the computing power, so a simple request now takes 3-10x longer. Going back to GPT Pro whenever my sub is done
I've found Perplexity to be good (got a free sub through my cell provider)
I used Perplexity before; I had a free one-year sub. Then I tried GPT and found it was better. Grok was even better for some tasks, until they added the useless companions
I haven't had the best luck with Grok; maybe I need to give it another try once it speeds up again
Were you using the standalone app or the one inside Twitter? Never tried the one embedded in Twitter, only standalone. Not sure if it makes a difference
Both
Any difference between the two, or just that feature where it can summarize a thread on Twitter?
The full app is more feature-rich, and it still lets me use it without a subscription 😂. I just checked, and I can't query the built-in X one anymore
You have to pay for the built-in one? What’s the point of having it built in, or of getting people to buy it? Very odd choices by Elon
Most requests are sufficiently and quickly served from their internal pre-trained model, but if you need anything more recent than the training data you’ll have to fan out a search (over X’s firehose of posts, for example, or other recent data from external sources). The latter is computationally more expensive and will definitely take longer to produce an answer.
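Roughly what I picture that split looking like, as a toy sketch in Python. Every name here (needs_recent_data, SOURCES, etc.) is made up by me for illustration, not anything xAI has published: a router answers cheap queries straight from the model and only fans out parallel searches when the question needs fresh data, which is where the extra latency and compute come from.

```python
# Hypothetical sketch of "answer from the model" vs. "fan out a search".
# All names and sources are assumptions, purely for illustration.

import time
from concurrent.futures import ThreadPoolExecutor

SOURCES = ["x_firehose", "web_index", "news_feed"]  # hypothetical external sources

def needs_recent_data(query: str) -> bool:
    """Crude heuristic: does the query ask about something after the training cutoff?"""
    recent_markers = ("today", "latest", "this week", "right now")
    return any(marker in query.lower() for marker in recent_markers)

def answer_from_model(query: str) -> str:
    """Cheap path: respond straight from the pre-trained weights."""
    return f"[model answer for: {query}]"

def search_source(source: str, query: str) -> str:
    """Expensive path: each fan-out call adds latency and compute."""
    time.sleep(0.1)  # stand-in for a real retrieval call
    return f"[{source} results for: {query}]"

def handle(query: str) -> str:
    if not needs_recent_data(query):
        return answer_from_model(query)
    # Fan out to every source in parallel, then synthesize with the model.
    with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
        results = list(pool.map(lambda s: search_source(s, query), SOURCES))
    return answer_from_model(query + " | context: " + "; ".join(results))

if __name__ == "__main__":
    print(handle("Explain transformers"))        # fast path, model only
    print(handle("What happened on X today?"))   # slow path, fans out a search
```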
Pretty sure they’ll monetize those features (if they aren’t already).
Sense-making is not something AI is even capable of; after all, these are simply language models… not humans who are constantly trying to make sense of the world 🌎
Are they incentivising not using it to prevent an unwanted increase in bandwidth consumed?
Maybe it's the Grok "Eco" option when you decide not to use it. Greta would be thankful.
Reducing tokens saves the rainforests. 🤙