I Cloned OpenAI's LLM Playground

The motivation

If you have built any AI projects using OpenAI’s GPT-family APIs, chances are you have used OpenAI’s LLM playground before.

Their playground is a great place to test different models, prompts, and parameter settings. The UI is intuitive and simple to use. I have spent countless hours there, and it has helped me tremendously.

Recently, I got a request to prototype an LLM playground with similar functionality, so I thought to myself, “Why not just clone OpenAI’s playground?”

That is exactly what I did.

Here is the deployed app and the open-source repo.

Take a look at the screenshots of the original playground and my clone.

[Screenshot: OpenAI playground]
[Screenshot: My playground clone]

The stack

I used Next.js 13 as the web framework, and Tailwind CSS with the shadcn/ui component library for styling.

One of the nice things about the OpenAI playground is its streaming support. Luckily, the Vercel AI SDK supports streaming from edge functions out of the box, which makes the backend implementation much easier.
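
To give a sense of what that looks like, here is a minimal sketch of a streaming chat route on the edge runtime, following the pattern from the Vercel AI SDK documentation. The route path, model name, and environment variable below are assumptions for illustration, not necessarily what the repo uses.

app/api/chat/route.ts (sketch)

import OpenAI from "openai";
import { OpenAIStream, StreamingTextResponse } from "ai";

// Run on Vercel's edge runtime so tokens are streamed back as they arrive
export const runtime = "edge";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY, // assumed env var name
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Request a streamed chat completion from OpenAI
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo", // placeholder model for this sketch
    stream: true,
    messages,
  });

  // Pipe the token stream straight back to the client
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}

On the client side, the SDK's useChat hook can consume a route like this and append tokens to the message list as they stream in.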

Lastly, I borrowed a few components from Vercel’s AI Chatbot template to prototype faster.

Key challenge

The biggest challenge I encountered was implementing the inline message editing feature. OpenAI’s playground uses textarea elements to render and edit messages.

However, a textarea is a fixed-height element by default; it doesn’t grow or shrink to fit its content. To solve that, I ended up using useEffect hooks to adjust the height dynamically whenever the content changes or the window is resized, as shown below.

dynamic-textarea.tsx

  // Function to auto adjust the textarea height given the content
  const adjustHeight = () => {
    const textarea = combinedRef.current;
    if (textarea) {
      textarea.style.height = "auto";
      textarea.style.height = `${textarea.scrollHeight}px`;
    }
  };

  // Adjust height when the value changes
  useEffect(() => {
    adjustHeight();
  }, [props.value]);

  // Adjust height when window resize happens
  useEffect(() => {
    window.addEventListener("resize", adjustHeight);

    // Cleanup function to remove the event listener
    return () => {
      window.removeEventListener("resize", adjustHeight);
    };
  }, []);
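
For context, those hooks live inside the textarea component itself, and the render side just attaches the ref to a plain textarea. Here is a simplified, self-contained sketch of the whole component. It is an approximation: the real component forwards refs (hence combinedRef), and the rows value and Tailwind classes below are placeholders of mine.

dynamic-textarea.tsx (simplified sketch)

import { useEffect, useRef } from "react";
import type { TextareaHTMLAttributes } from "react";

// Simplified stand-in: the real component merges a forwarded ref into combinedRef;
// a plain local ref is used here to keep the sketch short.
export function DynamicTextarea(
  props: TextareaHTMLAttributes<HTMLTextAreaElement>
) {
  const combinedRef = useRef<HTMLTextAreaElement>(null);

  // Auto-adjust the textarea height to fit its content
  const adjustHeight = () => {
    const textarea = combinedRef.current;
    if (textarea) {
      textarea.style.height = "auto";
      textarea.style.height = `${textarea.scrollHeight}px`;
    }
  };

  // Re-adjust when the controlled value changes
  useEffect(() => {
    adjustHeight();
  }, [props.value]);

  // Re-adjust when the window is resized
  useEffect(() => {
    window.addEventListener("resize", adjustHeight);
    return () => window.removeEventListener("resize", adjustHeight);
  }, []);

  return (
    <textarea
      ref={combinedRef}
      rows={1}
      className="w-full resize-none overflow-hidden"
      {...props}
    />
  );
}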

A related issue is the scroll position of the parent component, which renders the list of messages. Because each message component’s height is dynamic, the scroll position won’t stay at the bottom while message content is being streamed in.

The solution here is another useEffect hook in the parent component to adjust the scroll position whenever the last message content changes.

chat-messages-list.tsx

  // Scroll to the bottom whenever the last message's content changes
  // (needed because the auto-expanding textareas change the list height while streaming)
  useEffect(() => {
    // Scroll the container to the bottom
    const container = containerRef.current;
    if (container) {
      container.scrollTop = container.scrollHeight;
    }
  }, [messages.slice(-1)[0]?.content]);
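
For completeness, here is a rough, self-contained sketch of how that hook sits in the message list component. The Message type and class names are assumptions of mine, and the real component renders each message as an editable textarea; plain text is used here to keep the sketch short.

chat-messages-list.tsx (simplified sketch)

import { useEffect, useRef } from "react";

// Simplified message shape for this sketch
type Message = { id: string; role: "system" | "user" | "assistant"; content: string };

export function ChatMessagesList({ messages }: { messages: Message[] }) {
  const containerRef = useRef<HTMLDivElement>(null);

  // Keep the view pinned to the bottom while the last message streams in
  useEffect(() => {
    const container = containerRef.current;
    if (container) {
      container.scrollTop = container.scrollHeight;
    }
  }, [messages.slice(-1)[0]?.content]);

  // In the real app each message is an editable, auto-resizing textarea
  return (
    <div ref={containerRef} className="flex-1 overflow-y-auto">
      {messages.map((message) => (
        <div key={message.id} className="border-b p-4">
          {message.content}
        </div>
      ))}
    </div>
  );
}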

Future considerations

The cloned prototype is in a state where anyone can run it with their own API key. However, to make this an actually usable product, some key pieces are still missing.

  1. Authentication and user profiles need to be implemented so that users can sign up and configure their keys.
  2. The playground won’t be that useful if users can’t save prompts, so adding a history-saving feature is important.
  3. Right now, there are only three OpenAI models in the prototype. Adding more LLMs, especially open-source ones, would be helpful for different use cases.

Overall, this has been a fun project to build. If you have any questions or want to contribute, feel free to reach out.