nostr:nprofile1qqsr7acdvhf6we9fch94qwhpy0nza36e3tgrtkpku25ppuu80f69kfqppemhxue69uhkummn9ekx7mp0qy0hwumn8ghj7mn0wd68yttjv4kxz7fwv3jhyettwfhhxuewd4jj7qg3waehxw309ahx7um5wgh8w6twv5hsleq7kw

How do you collapse the context of Dork so it does not use so many tokens?

Discussion

The context is very large because there are a lot of components in the MKStack template needed to make a working Nostr-enabled application. You could always manually nuke the CONTEXT.md file, but you're going to get wildly different and potentially bad results. It's optimized to work well.

But the context grows gradually, as I see on ppq. Does that mean it does not load the whole context file from the beginning?

Oh. I haven't used nostr:nprofile1qqsdy27dk8f9qk7qvrm94pkdtus9xtk970jpcp4w48k6cw0khfm06mspzamhxue69uhhyetvv9ujumn0wd68ytnzv9hxgtc58z7q5 so I am not aware of any issues. I'll try and test tomorrow. Goodnight for now.

Also, there seems to be an issue with both Goose and MKStack (is Stacks based on Goose?) whereby prompt caching doesn't work properly when using our API key.

We would love to solve this issue because it seems to be costing users more than double what they'd pay via OpenRouter.

It is on our to-do list to look into this but perhaps you nostr:nprofile1qy88wumn8ghj7mn0wvhxcmmv9uq37amnwvaz7tmwdaehgu3dwfjkccte9ejx2un9ddex7umn9ekk2tcqyqlhwrt96wnkf2w9edgr4cfruchvwkv26q6asdhz4qg08pm6w3djg3c8m4j can also look into why this might be happening.

That is painful to learn 😂

Could you elaborate a bit on "prompt cache"?

Prompt caching is something that can be done with Anthropic models:

https://www.anthropic.com/news/prompt-caching
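For readers unfamiliar with the feature: with the Anthropic Messages API, prompt caching is enabled by attaching a `cache_control` marker to a large, stable prompt prefix (like a big system prompt), so repeated requests reuse the cached prefix instead of paying full input-token price each time. Here is a minimal sketch of what such a request payload looks like; the helper function and example strings are illustrative, not from any of the tools discussed above.

```python
def build_cached_request(system_context: str, user_message: str) -> dict:
    """Build an Anthropic Messages API payload with the system prompt
    marked as cacheable via a `cache_control` block."""
    return {
        "model": "claude-3-5-sonnet-20241022",  # example model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": system_context,
                # Marks this block (and everything before it) for caching;
                # subsequent requests with the same prefix hit the cache.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": user_message}],
    }


payload = build_cached_request(
    "...large template context, e.g. CONTEXT.md contents...",
    "Build me a Nostr app",
)
print(payload["system"][0]["cache_control"]["type"])  # ephemeral
```

If a client sends the system prompt without that `cache_control` marker, every request re-bills the full context, which would explain the roughly doubled costs mentioned above.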

Mhmmm, this sounds cool!

Thanks.

The Stacks agent, Dork, is based on Vercel's AI SDK. Maybe this is something that nostr:nprofile1qqsqgc0uhmxycvm5gwvn944c7yfxnnxm0nyh8tt62zhrvtd3xkj8fhgprdmhxue69uhkwmr9v9ek7mnpw3hhytnyv4mz7un9d3shjqg4waehxw309aex2mrp0yhxgctdw4eju6t09uq3vamnwvaz7tmjv4kxz7fwd4hhxarj9ec82c30hyvdwq or another one of our developers can take a look at when he has some time. Thanks for bringing this to my attention!