7 posts tagged with "preview"

· 3 min read
Aurélien Franky

Managing and organizing your language model initiatives for automation can be a real challenge. If you and your colleagues are constantly exploring different approaches to generate content and automate tasks, you're likely familiar with the frustration of switching between ChatGPT and Office tools to save and share prompts. And reusing your best ideas often involves a time-consuming process of searching for them and copy-pasting them back into ChatGPT.

Workspaces

workspaces

The latest version of Prompt Studio offers you a solution to effortlessly organize your initiatives into dedicated workspaces. Each workspace provides you with a convenient playground where you can access prompts, files, and chats specific to that project.

Currently, the workspaces you create are private, but we have an exciting team collaboration feature in the pipeline that will soon enable you to share your workspaces with your colleagues, enhancing collaboration and productivity.

Chats

chats

Last week, we made improvements to the chat feature in Prompt Studio. These enhancements include several important features that are particularly useful for those who wish to explore different scenarios for chatbots.

  • System Prompt: A system prompt enables you to specify how the language model should behave regardless of the user input. By using this prompt, you can partially override the default behavior of the language model.
  • Roles: Each message in the chat can be assigned a role, such as system, user, assistant, or function. This tagging helps the language model understand how to handle each message. Currently, we follow the OpenAI format for roles, but in the future, it may vary depending on the selected provider.
  • Completions: Every message within the chat contains a set of completions. These can be messages generated by the language model itself or provided by a user. You can easily switch between different completions to compare and evaluate them without worrying about accidentally overwriting your work.
  • Disable Messages: If you want to see how a completion would appear with or without a specific message, there's no need to delete the message entirely. Instead, you can simply disable it temporarily, allowing you to assess the impact of that message on the overall conversation.
  • Editor and Completion Modes: In completion mode, the chat operates like ChatGPT, where you make a request and the language model responds. On the other hand, in editor mode, you have the flexibility to manually add new messages and define their content. This way, you can explore and observe how your language model behaves in specific scenarios.
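In the OpenAI format mentioned above, these concepts map onto a plain array of role-tagged messages. Here is a minimal sketch: the `messages` shape is OpenAI's, while the `disabled` flag and helper are illustrative, not Prompt Studio's actual internals:

```javascript
// A chat as sent to an OpenAI-style chat completions endpoint.
// The system prompt is simply the first message, and disabling a
// message amounts to filtering it out before sending the request.
const messages = [
  { role: "system", content: "You are a terse support bot." },
  { role: "user", content: "How do I reset my password?" },
  { role: "assistant", content: "Use the link on the login page.", disabled: true },
  { role: "user", content: "Which link exactly?" },
];

// Strip disabled messages and internal flags before the request.
function toRequestMessages(chat) {
  return chat
    .filter((m) => !m.disabled)
    .map(({ role, content }) => ({ role, content }));
}
```

Switching a message on and off is then just a matter of toggling the flag and regenerating, without losing the message itself.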

Are you looking for additional features when exploring chat scenarios for your chatbots?

Reach out to us on Discord, we’re happy to help.

· 3 min read
Aurélien Franky

Today is a very special day for me. I've decided to leave my full-time job at Klarna and dedicate myself to building Prompt Studio, hopefully becoming part of something we are all witnessing at the moment: this amazing push towards AI becoming a greater part of our daily lives. Until now I could only invest a few hours of my free time every day, a compromise that left me daydreaming about what Prompt Studio could be and unhappy with the progress I was making. Today is my first day working full-time on Prompt Studio, and I know it is the start of an exciting journey. So what do we want Prompt Studio to become?

Prompt Engineering and Reasoning Engines

Language models are obviously great at processing language, from translation to text generation. But what excites me the most about them is their ability to reason about things. If you haven't seen it yet, this presentation by Andrej Karpathy provides a comprehensive overview of the topic. If we draw a parallel between a language model and the way we think, then a pretrained model is somewhat akin to system 1 thinking: it is fast and automatic.

With Prompt Engineering we add another layer: a more deliberate and targeted approach to get the results we want. We are seeing many novel approaches emerge, from frameworks like Tree of Thought to prompting languages like Guidance. We want Prompt Studio to be not only the place where you build and track text inputs for language models, but also where you can build and share the more complex processes that form system 2 thinking on top of language models.

More than a Collection of Libraries

Prompt Engineering will become a lot more mainstream, and people working with prompts will come from a variety of backgrounds. With the current shortage of developers and a continuous need for software, we think most prompt engineers will come from other domains. This is why we want Prompt Studio to bridge the gap between a tool that is only useful to software engineers and one that can be used by everyone. Our main focus needs to be usability and collaboration features.

Becoming Open Core

We cannot keep up with the enormous strides in the development of AI we have seen in the past months on our own. This is why we want to focus our efforts on the aspects of prompt engineering where our expertise matters the most. We want to provide the collaboration and real-time editing layer that large organizations need, while offering an open source version of Prompt Studio that anyone can adapt and modify for their own needs. This way the editor will always be free and open source, with a layer of additional functionality for paying customers that allows us to dedicate our time to making Prompt Studio better.

Thank you for your support and stay tuned for more updates!

· 2 min read
Aurélien Franky

Until now, OpenAI was the only language model provider you could connect to from Prompt Studio. Over the past week we have been working to decouple ourselves from the OpenAI API and allow you to connect to other providers as well. Here is a list of providers we plan to integrate soon:

Let us know what providers you are interested in connecting with here!

Custom APIs

Being LLM agnostic means you can now use the new Custom API Model Provider to connect to a language model you host yourself behind your own API. This requires a bit of configuration, but it is the most flexible setup, and you can already use it while we add the integrations listed above. The image below shows the setup for integrating a custom GPT-2 model deployed on Huggingface.

huggingface setup

Above is a deployed GPT-2 API on Huggingface Inference Endpoints. Below is the corresponding configuration for the Custom API Model Provider.

config

You can now select the custom provider when making your requests:

using custom provider
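Under the hood, a custom provider like this boils down to a single HTTP request. The sketch below shows roughly what the configuration above encodes for a Hugging Face Inference Endpoint; the URL and token are placeholders, and the helper is illustrative rather than Prompt Studio's actual code:

```javascript
// Build the request a custom-API provider would send to a
// Hugging Face Inference Endpoint (URL and token are placeholders).
function buildCompletionRequest(endpointUrl, apiToken, promptText) {
  return {
    url: endpointUrl,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inputs: promptText }),
    },
  };
}

// Usage: pass the result to fetch(req.url, req.options) and read
// the generated text from the JSON response.
const req = buildCompletionRequest(
  "https://example.endpoints.huggingface.cloud",
  "hf_xxx",
  "Once upon a time"
);
```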

Prompt Studio and your Local Language models

One feature we are looking forward to, is to have a desktop version of Prompt Studio available that would make it easy for you to connect to a language model on your machine. With the work we have done with custom LLM Providers we are one step closer to this.

Trying out Prompt Studio

You want to try out Prompt Studio, but you don’t want to add your own OpenAI key? No problem! You now get 10 free prompts per day to try out Prompt Studio, without needing to provide your own OpenAI API key. Simply select “Prompt Studio” as your LLM provider in the editor to use your free prompts against OpenAI.

What’s next?

Next on our roadmap are collaborative features. We will start work on adding workspaces, sharing prompts, and teams to Prompt Studio. Stay tuned!

· 2 min read
Aurélien Franky

This week we made working with files a lot easier by introducing inline assets. You no longer need to create separate file assets and pass them to your prompts or knowledge bases. You can now drop the file directly onto your prompt variable or onto your knowledge base.

Stay tuned for next week when we bring a new custom API provider for you to connect to your own language models.

Inline Assets

Inline assets are just like normal assets, with the difference that they live only within the scope of another asset. This makes it a lot easier for you to keep your workspace de-cluttered. If you are only going to use a file in a single knowledge base, there is no point in it living on your workspace. Simply create the inline file asset by dropping the file onto the knowledge base. If you no longer need the knowledge base, all inline assets that were created as part of it will be deleted along with it. If you later want to use your file elsewhere, you can always convert it into a primary asset.
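The lifecycle described above can be pictured as a simple ownership model, where an inline asset points at its parent. The names and shapes below are illustrative, not Prompt Studio's actual data model:

```javascript
// Inline assets carry a parentId; deleting the parent cascades to
// them, while promoting one makes it a primary (top-level) asset.
function deleteAsset(workspace, assetId) {
  return workspace.filter((a) => a.id !== assetId && a.parentId !== assetId);
}

function promoteToPrimary(workspace, assetId) {
  return workspace.map((a) =>
    a.id === assetId ? { ...a, parentId: null } : a
  );
}

const workspace = [
  { id: "kb1", parentId: null, type: "knowledge-base" },
  { id: "file1", parentId: "kb1", type: "file" }, // inline asset
];
```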

Adding files

To add a file to a knowledge base in Prompt Studio, you can now simply drag and drop it onto the knowledge base:

drop file on kb

You can do the same in a prompt. Keep in mind that the file size might push your prompt above the limit for your selected model.

drop file on prompt

UX improvements

This week we further improved how prompt versions are presented in Prompt Studio. The new setup separates the template view from the prompts and completions so that you can view all of them at the same time. Figuring out the most convenient setup for this new type of development experience is still a work in progress; let us know what you think of our most recent changes!

prompt version list

· 2 min read
Aurélien Franky

Our focus this week was adding knowledge bases to Prompt Studio, allowing you to circumvent limitations with prompt lengths, and test interactions with your own data.

Knowledge Bases

When generating content through prompts, you sometimes want to make information available to the language model that it was not trained on. This could be because the data was not available at the time of training, or because the data you want it to use is very specific.

To get the results you want, you need to pass that information as part of the prompt you send to the language model. With that context the model will be more accurate in its responses.

However, language models limit the number of tokens that can be provided as part of a prompt. This means that if you have a lot of information you want to include in the prompt, you need a way to decide which parts are most relevant in a given situation. There are many ways to do that; a very popular approach is vector similarity search, where the parts of the text most similar to the user query are passed as context.
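The retrieval step can be sketched with plain cosine similarity over embedded text chunks. In the sketch below, toy 3-dimensional vectors stand in for real embeddings, and the chunk texts are made up for illustration:

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank chunks against the query embedding and keep the top-k
// to pass along as prompt context.
function topK(chunks, queryEmbedding, k) {
  return chunks
    .map((c) => ({ ...c, score: cosine(c.embedding, queryEmbedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}

const chunks = [
  { text: "refund policy", embedding: [0.9, 0.1, 0.0] },
  { text: "shipping times", embedding: [0.1, 0.9, 0.0] },
];
```

In a real knowledge base the embeddings come from an embedding model and the chunks from splitting your files, but the ranking step is essentially this.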

You can now set up a knowledge base to do just that in Prompt Studio.

Adding files to a knowledge base

To build a knowledge base in Prompt Studio, create a new asset of type "knowledge base" and link the files you want to be part of it. Then click "generate knowledge base".

generate knowledge base

Chat Context

We added the concept of chat contexts to Prompt Studio. You can use your knowledge base in a chat by selecting it under "chat context". This allows you to ask questions in the chat about the files in your knowledge base.

generate knowledge base

Let us know how you use knowledge bases and what features you would like to see added next.

UX improvements

This week we also improved the usability of prompt versions. You can now easily revert to a previous version of your prompt template, including previewing the number of tokens each prompt will use.

generate knowledge base

· 3 min read
Aurélien Franky

Over the past month we have improved the developer experience when working with prompt templates. The latest version of Prompt Studio allows you to include external text into your prompts like lists and files. We have also been busy preparing a new feature allowing you to create knowledge bases from your files and integrate them into your prompts and chats. Come back next week to learn more!

Here is what we have been working on over the past weeks:

Prompt Versions and Completions

We have reworked how we present prompt completions: every time you create a completion with Prompt Studio, that version of your prompt is saved. All completions made with the same prompt but with different models or settings (temperature, top P, or maximum length) are grouped together, and you can now compare prompt completions side by side. This makes it really easy to compare how different language models perform, or how parameters like temperature affect your results.
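The grouping described above is essentially keying each completion on its generation parameters. A sketch, with an illustrative completion shape rather than Prompt Studio's actual one:

```javascript
// Group completions that share the same model and settings so
// they can be compared side by side.
function groupCompletions(completions) {
  const groups = new Map();
  for (const c of completions) {
    const key = JSON.stringify([c.model, c.temperature, c.topP, c.maxLength]);
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(c);
  }
  return groups;
}

const completions = [
  { model: "gpt-3.5-turbo", temperature: 0.7, topP: 1, maxLength: 256, text: "A" },
  { model: "gpt-3.5-turbo", temperature: 0.7, topP: 1, maxLength: 256, text: "B" },
  { model: "gpt-4", temperature: 0.7, topP: 1, maxLength: 256, text: "C" },
];
```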

Drafts

A draft is a prompt created from your prompt template that you have not saved yet. In the example below, your prompt template results in two different drafts because the option "generate prompt for each item" is selected for the "storyTone" variable. When you click "Run Prompts", all your drafts are run at once, so you can easily compare results and revert to a previous version of your prompt at a later stage.

DraftsScreenshot
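The "generate prompt for each item" behaviour amounts to expanding the template once per list item. A sketch, using a `{{variable}}` placeholder syntax and the "storyTone" variable from the example above:

```javascript
// Expand a template into one draft per item of a list variable.
function expandDrafts(template, variableName, items) {
  return items.map((item) =>
    template.replaceAll(`{{${variableName}}}`, item)
  );
}

const template = "Write a {{storyTone}} story about a lighthouse keeper.";
const drafts = expandDrafts(template, "storyTone", ["cheerful", "ominous"]);
// Two drafts, one per tone, ready to be run at once.
```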

Preview a Prompt

You can preview the content of your prompt by clicking on the draft item in the prompt version list. All variables are inserted into the prompt and you can make sure everything is configured correctly before running it.

PromptPreviewScreenshot

Comparing completions

Once you run your prompts, completions will be generated and you can now compare them with each other:

CompletionPageScreenshot

With that foundation in place, here are features we will be adding soon:

  • Save a specific version of a prompt template
  • Revert to a version of your prompt template
  • Delete Prompt versions and completions you don't need
  • Run individual prompts instead of all draft prompts

Working with Files

To test your prompts, you need to try them out on different sets of data; the same prompt might perform very differently given different user inputs. This is why we added a new feature allowing you to create files and link them into your prompts. This way you can check whether your prompt performs the way you expect in different scenarios. You can also preview the number of tokens a file will use.

FilePageScreenshot

Currently you need to copy-paste the file content into Prompt Studio. We know this is not ideal and we are working on improving it. Down the line you will be able to:

  • Drag and drop files into the editor
  • Tokenize PDFs and other Documents directly in Prompt Studio

Chats

Interacting in a chat format allows you to build up a question or request across several prompts. This is incredibly useful when you have more complex tasks for a language model. Our current chat interface is a starting point for many new features we intend to add in the coming weeks.

ChatPageScreenshot

There are many things we want to add to this early preview, but we need your help to understand what features you need the most, and what we should work on first. Reach out to us on Github, or get in touch on Discord.

· 4 min read
Aurélien Franky

PromptStudioBanner

We are excited to release an early preview version of Prompt Studio! After several weeks of work, we feel that we have a solid foundation for something that will be useful to anyone building applications on top of Language Models. Prompt Studio is an attempt at providing Prompt Engineers with the kind of tools we are accustomed to in Software development. How Prompt Studio will shape up will depend on you and other early adopters.

Our goal is to tailor this tool to provide the best developer experience for anyone who needs to experiment with, fine-tune, or test prompts. Our mission is to help you find the right words to interact with your AI systems.

Let us know what features you need on Github, or get in touch on Discord.

Why Prompt Studio?

Last month I created a video game as part of the "All by AI" game jam competition on Itch.io. The game is a digital novel entirely generated by GPT-3 where each action you take changes what happens next in the story. Each scene, in turn, defines what actions are available to you, those actions are also created by GPT-3.

AJTU

The challenge was finding the correct words to not only get text that makes sense but also follows a certain writing style and exhibits a certain mood.

A prompt in the game would look like this:

const prompt = `You are a narrator writing in the second person.
The story you tell is a sci-fi story set in the distant future.

Previously the following happened:
1. The protagonist's spaceship crashed on a desertic planet.

Lastly, the protagonist did the following action:
The protagonist explored the surroundings.

Describe what happens next in the story, within a paragraph of 80 words or less.`;

This prompt is composed of several variable pieces that can each affect the quality of the result in different ways. For example, in the case above we need to feed everything that has happened in the story so far back to the model, since it has no memory.
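Since the model is stateless, the game has to rebuild this prompt on every turn from the accumulated events. Roughly, it looks like this; a reconstruction for illustration, not the game's actual code:

```javascript
// Rebuild the narration prompt from the story's event history and
// the player's latest action, on every turn.
function buildPrompt(history, lastAction) {
  const events = history.map((event, i) => `${i + 1}. ${event}`).join("\n");
  return `You are a narrator writing in the second person.
The story you tell is a sci-fi story set in the distant future.

Previously the following happened:
${events}

Lastly, the protagonist did the following action:
${lastAction}

Describe what happens next in the story, within a paragraph of 80 words or less.`;
}
```

Each new scene the model generates gets appended to `history`, so the context grows with the story.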

But how do you know what would happen if you change a word or the order of a sentence? There are no good tools for this type of work as of now (I looked).

So I decided to build my own.

Working with Prompt Studio

When building more complicated applications you quickly outgrow tools like GPT playground. I wanted a tool that allows me to:

  1. keep track of my prompts and their completions
  2. easily compare results between prompts
  3. create large numbers of prompts from a template
  4. compare results across different language models

PromptStudioScreenshot

Over the past month I have been working on implementing these features. There is still work ahead but first I want to check what features you might need from such a tool.

Check out our documentation to learn more about the current capabilities of Prompt Studio.

Prompt Engineering is taking off

As AI becomes capable of performing various tasks at levels similar to those of humans, we will rely on it more to automate certain work. The more we rely on language models for automated tasks, the more prompts, and how we write them, matter. This is not a transitory phase while our AI models get better.

Human language is not precise, and while models might get better at discerning intent in the requests we make of them, there will always be a gap inherent to natural language. With people we are accustomed to a certain back and forth to clear up misunderstandings, but we are much less tolerant of these inefficiencies when it comes to machines.

From this perspective, Prompt Engineering is the search for a way to formalize intent in natural language, and I think that as AI becomes more human-like in its capabilities, making sure the results it produces are aligned with our intent will only become more important.

If you want to learn more about Prompt Engineering here are a few amazing resources:

prompt-engineering-guide

What's next for Prompt Studio?

Our development roadmap is open and you are welcome to tell us what we should focus on next.

Check it out on Github.

We are looking forward to hearing from you!