I don’t understand why there is a “think harder” button on SuperGrok. Shouldn’t it think harder all the time? Why would someone want it to half-ass its answers?

This makes no sense to me 😂

Discussion

If it's ackchewally called superGrok and this isn't a joke, since I literally have no idea and don't keep up with shid...

One would think.

It's like Die Hard.

DIE hard II... Die... Uhhh...HARDER

😂

It’s the name of their subscription, yes 😂

☠️...

I think the “think harder” button means it’s boiling more oceans, sort of like when you mine bitcoins harder

Ah, now I get it. I guess I have to press “think harder” every time to offset the morons who put paint on timeless art to make the weather better

It's slower and probably costs them more in compute; you can probably get a 100% answer with the quick mode most of the time

I’m mostly shitposting 😂

A little bit pissed because of the new dumb companions that take all the computing power; a simple request now takes 3-10x longer. Going back to GPT Pro whenever my sub is done

I've found Perplexity to be good (got a free sub through my cell provider)

I used Perplexity before. I had a free one-year sub. Then I tried GPT and found it was better. Grok was even better for some tasks until they added the useless companions

I haven't had the best luck with Grok; maybe I need to give it another try once it speeds up again

Were you using the standalone app or the one inside Twitter? Never tried the one embedded in Twitter, only standalone. Not sure if it makes a difference

Both

Any difference between the two, or just that feature where it can summarize a thread in Twitter?

The full app is more feature-rich, and it still lets me use it without a subscription 😂. I just checked and I can't query the built-in X one anymore

You have to pay for the built-in one? What’s the point of having it built in, or of getting people to buy it? Very odd choices by Elon

I think that's been his strategy; he wants X to be the everything app

He should have a free tier of Grok inside Twitter. Always. To tempt Twitter users into buying. Could be slow, but great publicity with tons of users

Agreed

Most requests are sufficiently and quickly served from their internal pre-trained model, but if you need anything more recent (post-training data) you’ll have to fan out a search (from X’s firehose of posts, for example, or just more recent data from external sources). The latter is computationally more expensive and will definitely take longer to produce an answer.
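A rough sketch of that split, purely illustrative and not Grok’s actual internals (`needs_recent_info` and `search_recent_sources` are made-up stand-ins): answer straight from the model when the question is covered by training data, and only pay for the slower retrieval fan-out when recency matters.

```python
# Hypothetical sketch: serve from model knowledge when possible,
# fan out to retrieval only when the query needs post-training data.
from dataclasses import dataclass


@dataclass
class Answer:
    text: str
    used_retrieval: bool


def needs_recent_info(query: str) -> bool:
    # Stand-in heuristic; a real system would use a learned router.
    recent_markers = ("today", "latest", "this week", "breaking")
    return any(m in query.lower() for m in recent_markers)


def answer_from_model(query: str) -> str:
    # Placeholder for a single forward pass over the pre-trained model.
    return f"[model answer to: {query}]"


def search_recent_sources(query: str) -> list[str]:
    # Placeholder for fanning out to recent/external sources
    # (e.g. a post firehose or web search); slower and more expensive.
    return [f"[recent doc about: {query}]"]


def handle(query: str) -> Answer:
    if not needs_recent_info(query):
        return Answer(answer_from_model(query), used_retrieval=False)
    docs = search_recent_sources(query)
    augmented = query + "\n\nContext:\n" + "\n".join(docs)
    return Answer(answer_from_model(augmented), used_retrieval=True)


if __name__ == "__main__":
    print(handle("What is the capital of France?"))
    print(handle("What happened in the markets today?"))
```

The second path is the one the thread is complaining about: every extra search round-trip adds latency and compute cost before an answer comes back.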

Pretty sure they’ll monetize those features (if they aren’t already).

Sense-making is not something that AI is even capable of; after all, these are simply language models… not humans who are constantly trying to make sense of the world 🌎

Are they incentivising not using it to prevent unwanted increases in bandwidth consumption?

Maybe it's the Grok "Eco" option when you decide not to use it. Greta would be thankful.

Reducing tokens saves the rainforests. 🤙