Replying to Keyhan Alizadeh

It's a simple standard for how a prompt can be written and managed.

For example, you have an AI agent that evaluates mathematical expressions and explains them with insights, tables and...

You have a prompt for it, which is kept in a repo or somewhere else.

You need to:

1. Update this prompt over time. (OP has versioning.)

2. Know which model(s) you should pass this prompt to. (Different models behave differently on the same prompt. OP has a models section for this: if the model is closed source, your team members will know it; if it's open source, the people who run or develop it will know which models are the best fit.)

3. Know which sampling parameters suit it: low or high temperature, and so on. These can be applied when you pass the prompt to the model.

4. Handle dynamic parts. Say you want to give your agent a name like "Maticagent" and pass user info to it as well, e.g. the user's name is Sam and their last name is Altman. So you need variables in your prompt. You do it like this:

4.1 Define the vars in your op.json file so the next person who reads or uses the prompt understands them.

4.2 Use them in your Jinja file like:

You are {{ agent_name }}, talking with {{ user_name }} {{ user_last_name }}...

You can also use conditions: for example, if the variable is_premium is true, add an extra section to the prompt (see the example files below).
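
To make it concrete, the two files for the math agent above could look roughly like this. I'm improvising the op.json field names and values here (they may not match the exact Open Prompt schema), and I'm calling the template prompt.jinja just as an example:

```json
{
  "version": "1.0.0",
  "models": ["gpt-4o", "llama-3.1-70b-instruct"],
  "parameters": {
    "temperature": 0.2,
    "top_p": 0.9
  },
  "variables": {
    "agent_name": "Name the agent introduces itself with, e.g. Maticagent",
    "user_name": "User's first name",
    "user_last_name": "User's last name",
    "is_premium": "Boolean; when true, an extra section is added to the prompt"
  }
}
```

And the Jinja template:

```jinja
You are {{ agent_name }}, talking with {{ user_name }} {{ user_last_name }}.
Evaluate the user's mathematical expressions and explain them with insights and tables.
{% if is_premium %}
Also give a detailed step-by-step breakdown of how each result was reached.
{% endif %}
```

A low temperature fits an agent that should evaluate expressions consistently, and the models list tells the next person which models this prompt was written for.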

That's it. You can do all of this your own way, for example:

1. Version with git

2. Set parameters in your code, config or a markdown file.

3. Define placeholders like [var_name] or use another template engine's format to support vars, and document the vars somewhere else, like your docs.
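
For example, the hand-rolled version of the same template could just be a text file like:

```text
You are [agent_name], talking with [user_name] [user_last_name]...
```

with your own code doing the substitution and the meaning of each placeholder documented somewhere like a README.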

But Open Prompt lets you organize all of this in the simplest and most dynamic form possible. If you randomly come across a prompt on GitHub that uses Open Prompt, you can find its config, vars and everything else you need in those files, in the expected places.

Also, since it's a standard, a platform like PromptLayer can accept Open Prompt directories and present them in a nice UI for testing, using and...

You import it, they load the info, and then you can update the op.json, set vars, call it with different models, tools and vars, and compare the results. And a lot more will be possible.

Since it's a standard, it can even integrate with stuff like Nostr, as a way to share cool prompts over Nostr.

Very cool, thanks
