JupyterChatbook


This blog post introduces the package “JupyterChatbook”, which provides a Jupyter extension that facilitates interaction with Large Language Models (LLMs).

The Chatbook extension provides the cell magics:

  • %%chatgpt (and the synonym %%openai)
  • %%palm
  • %%dalle
  • %%chat
  • %%chat_meta

The first three provide “shallow” access to the corresponding LLM services. The fourth is the most important: it allows contextual, multi-cell interactions with LLMs. The last one is for managing the chat objects created in a notebook session.

Remark: The chatbook LLM cells use the packages “openai”, [OAIp1], and “google-generativeai”, [GAIp1].

Remark: The results of the LLM cells are automatically copied to the clipboard using the package “pyperclip”, [ASp1].

Here are a couple of videos, [AAv2, AAv3], that provide quick introductions to the features:


Installation

Install from GitHub

pip install -e git+https://github.com/antononcube/Python-JupyterChatbook.git#egg=Python-JupyterChatbook

From PyPI

pip install JupyterChatbook


Setup LLM services access

The API keys for the LLM cells can be specified in the magic lines. If not specified, the API keys are taken from the Operating System (OS) environment variables OPENAI_API_KEY and PALM_API_KEY. (For example, set in the “~/.zshrc” file on macOS.)
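The resolution order described above can be sketched in plain Python. This is an illustrative sketch only; the function name `resolve_api_key` is hypothetical and the package's actual logic may differ:

```python
import os

def resolve_api_key(cli_key=None, env_var="OPENAI_API_KEY"):
    """Return an explicitly given key, else fall back to the OS environment.
    (Hypothetical helper; illustrates the resolution order only.)"""
    if cli_key:
        return cli_key
    key = os.environ.get(env_var)
    if key is None:
        raise ValueError(f"No API key given and {env_var} is not set.")
    return key

# Simulate a key set in the shell profile:
os.environ["PALM_API_KEY"] = "demo-key"
print(resolve_api_key(env_var="PALM_API_KEY"))  # demo-key
```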

One way to set those environmental variables in a notebook session is to use the %env line magic. For example:

%env OPENAI_API_KEY = <YOUR API KEY>

Another way is to use Python code. For example:

import os
os.environ['PALM_API_KEY'] = '<YOUR PALM API KEY>'
os.environ['OPENAI_API_KEY'] = '<YOUR OPENAI API KEY>'


Demonstration notebooks (chatbooks)

Notebook                     Description
Chatbooks-cells-demo.ipynb   How to do multi-cell (notebook-wide) chats?
Chatbook-LLM-cells.ipynb     How to “directly message” LLM services?
DALL-E-cells-demo.ipynb      How to generate images with DALL-E?
Echoed-chats.ipynb           How to see the LLM interaction execution steps?

Notebook-wide chats

Chatbooks have the ability to maintain LLM conversations over multiple notebook cells. A chatbook can have more than one LLM conversation. “Under the hood” each chatbook maintains a database of chat objects. Chat cells are used to give messages to those chat objects.
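The chat-object database described above can be sketched as follows. This is a minimal sketch with hypothetical names (`ChatObject`, `chat_cell`, `chat_db`); the package's actual classes and routing differ, but the idea is the same: each chat ID maps to one conversation that accumulates messages across cells.

```python
# Minimal sketch of a per-notebook database of chat objects.
class ChatObject:
    def __init__(self, chat_id, prompt=""):
        self.chat_id = chat_id
        self.messages = []
        if prompt:
            # The LLM prompt becomes the conversation's system message.
            self.messages.append({"role": "system", "content": prompt})

    def add_user_message(self, text):
        self.messages.append({"role": "user", "content": text})

chat_db = {}

def chat_cell(chat_id, text, prompt=""):
    """Route a cell's text to the chat object with the given ID,
    creating the object on first use."""
    obj = chat_db.setdefault(chat_id, ChatObject(chat_id, prompt))
    obj.add_user_message(text)
    return obj

chat_cell("em12", "Write a vacation email.",
          prompt="Given a topic, write emails in a concise, professional manner")
chat_cell("em12", "Rewrite with manager's name being Jane Doe.")
print(len(chat_db["em12"].messages))  # 3
```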

For example, here is a chat cell with which a new “Email writer” chat object is made, and that new chat object has the identifier “em12”:

%%chat --chat_id em12, --prompt "Given a topic, write emails in a concise, professional manner"
Write a vacation email.

Here is a chat cell in which another message is given to the chat object with identifier “em12”:

%%chat --chat_id em12
Rewrite with manager's name being Jane Doe, and start- and end dates being 8/20 and 9/5.

In this chat cell a new chat object is created:

%%chat -i snowman, --prompt "Pretend you are a friendly snowman. Stay in character for every response you give me. Keep your responses short."
Hi!

And here is a chat cell that sends another message to the “snowman” chat object:

%%chat -i snowman
Who built you? Where?

Remark: Specifying a chat object identifier is not required; i.e., the magic spec %%chat alone can be used. The “default” chat object identifier is “NONE”.

For more examples see the notebook “Chatbook-cells-demo.ipynb”.

Here is a flowchart that summarizes the way chatbooks create and utilize LLM chat objects:


Chat meta cells

Each chatbook session has a dictionary of chat objects. Chatbooks can have chat meta cells that allow access to the chat object “database” as a whole, or to its individual objects.
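The whole-database meta operations can be sketched with plain dictionary code. The names `meta_keys` and `meta_print` are hypothetical stand-ins for what the `--all keys` and `--all print` meta specs do over the session's dictionary of chat objects:

```python
# Sketch (assumed names) of whole-database chat meta operations.
chat_db = {
    "snowman": ["system: Pretend you are a friendly snowman.", "user: Hi!"],
}

def meta_keys(db):
    """List the IDs of all chat objects in the session."""
    return list(db.keys())

def meta_print(db, chat_id=None):
    """Print one chat object, or all of them when no ID is given."""
    targets = db if chat_id is None else {chat_id: db[chat_id]}
    for cid, messages in targets.items():
        print(f"Chat {cid!r}:")
        for m in messages:
            print(" ", m)

print(meta_keys(chat_db))  # ['snowman']
meta_print(chat_db, "snowman")
```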

Here is an example of a chat meta cell (that applies the method print to the chat object with ID “snowman”):

%%chat_meta -i snowman
print

Here is an example of a chat meta cell that creates a new chat object with the LLM prompt specified in the cell (“Guess the word”):

%%chat_meta -i WordGuesser --prompt
We're playing a game. I'm thinking of a word, and I need to get you to guess that word. But I can't say the word itself. I'll give you clues, and you'll respond with a guess. Your guess should be a single word only.

Here is another chat object creation cell using a prompt from the package “LLMPrompts”, [AAp2]:

%%chat_meta -i yoda1 --prompt @Yoda

Here is a table with examples of magic specs for chat meta cells and their interpretation:

cell magic line              cell content                           interpretation
chat_meta -i ew12            print                                  Give the “print out” of the chat object with ID “ew12”
chat_meta --chat_id ew12     messages                               Give the messages of the chat object with ID “ew12”
chat_meta -i sn22 --prompt   You pretend to be a melting snowman.   Create a chat object with ID “sn22” with the prompt in the cell
chat_meta --all              keys                                   Show the keys of the session chat objects DB
chat_meta --all              print                                  Print the repr forms of the session chat objects

Here is a flowchart that summarizes the chat meta cell processing:


DALL-E access

See the notebook “DALL-E-cells-demo.ipynb”.

Here is a screenshot:


Implementation details

The design of this package — and corresponding envisioned workflows with it — follow those of the Raku package “Jupyter::Chatbook”, [AAp3].


References

Packages

[AAp1] Anton Antonov, LLMFunctionObjects Python package, (2023), Python-packages at GitHub/antononcube.

[AAp2] Anton Antonov, LLMPrompts Python package, (2023), Python-packages at GitHub/antononcube.

[AAp3] Anton Antonov, Jupyter::Chatbook Raku package, (2023), GitHub/antononcube.

[ASp1] Al Sweigart, pyperclip (Python package), (2013-2021), PyPI.org/AlSweigart.

[GAIp1] Google AI, google-generativeai (Google Generative AI Python Client), (2023), PyPI.org/google-ai.

[OAIp1] OpenAI, openai (OpenAI Python Library), (2020-2023), PyPI.org.

Videos

[AAv1] Anton Antonov, “Jupyter Chatbook multi cell LLM chats teaser (Raku)”, (2023), YouTube/@AAA4Prediction.

[AAv2] Anton Antonov, “Jupyter Chatbook LLM cells demo (Python)”, (2023), YouTube/@AAA4Prediction.

[AAv3] Anton Antonov, “Jupyter Chatbook multi cell LLM chats teaser (Python)”, (2023), YouTube/@AAA4Prediction.
