Jupyter::Chatbook

In brief

This Raku package is a fork of Brian Duggan's "Jupyter::Kernel", [BDp1].

Here are the opening statements from the README of "Jupyter::Kernel":

"Jupyter::Kernel" is a pure Raku implementation of a Raku kernel for Jupyter clients¹.

Jupyter notebooks provide a web-based (or console-based) Read Eval Print Loop (REPL) for running code and serializing input and output.

It is desirable to include interaction with Large Language Models (LLMs) in "typical" REPL systems or workflows. Having LLM-aware and LLM-chat-endowed notebooks -- chatbooks -- can significantly speed up such workflows.

This repository is mostly for experimental work, but it aims to always be useful for interacting with LLMs via Raku.

Remark: There are several reasons to have a separate package -- a fork of "Jupyter::Kernel" -- instead of extending "Jupyter::Kernel" itself.


Installation and setup

From "Zef ecosystem":

zef install Jupyter::Chatbook

From GitHub:

zef install https://github.com/antononcube/Raku-Jupyter-Chatbook.git

After installing the package "Jupyter::Chatbook", follow the setup instructions of "Jupyter::Kernel".

The default API keys for the chat cells, LLM functions, and chat objects are taken from the Operating System (OS) environment variables OPENAI_API_KEY and PALM_API_KEY. The API keys can also be specified using LLM evaluator and configuration options and objects; see [AA3, AAp2].
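For example, the environment variables can be set in the shell before starting Jupyter. (The key values below are placeholders; replace them with your actual keys.)

```shell
# Placeholder values -- substitute your real API keys here.
export OPENAI_API_KEY="sk-..."
export PALM_API_KEY="..."
```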


Using LLMs in chatbooks

There are four ways to use LLMs in a chatbook:

  1. LLM functions, [AA3, AAp2]
  2. LLM chat objects, [AA4, AAp2]
  3. Code cells with magics for accessing LLMs, e.g. OpenAI's, [AAp3], or PaLM's, [AAp4]
  4. Notebook-wide chats that are distributed over multiple code cells with chat-magic specs

The sections below briefly describe each of these ways and have links to notebooks with more detailed examples.


LLM functions and chat objects

LLM functions, as described in [AA3], are best utilized via a REPL tool or environment. Notebooks are a perfect medium for LLM-function workflows. Here is an example of a code cell that defines an LLM function:

use LLM::Functions;

my &fcp = llm-function({"What is the population of the country $_ ?"});
# -> **@args, *%args { #`(Block|3043322893520) ... }

Here is another cell that can be evaluated multiple times using different country names:

<Niger Gabon>.map({ &fcp($_) })
# (
# 
# As of July 2020, the population of Niger is estimated to be 24,206,644. 
# 
# As of July 2020, the population of Gabon is estimated to be 2,283,286 people.)

For more examples of LLM functions and LLM chat objects see the notebook "Chatbook-LLM-functions-and-chat-objects.ipynb".
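Chat objects can also be created programmatically with llm-chat of "LLM::Functions". Here is a minimal sketch; the prompt text is illustrative:

```raku
use LLM::Functions;

# Create a chat object with an illustrative prompt
my $chat = llm-chat('You are a helpful Raku programming assistant.');

# Send messages to it; each call continues the same conversation
$chat.eval('How do I reverse a list?');
```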

Remark: During their initialization phase chatbooks load the package "LLM::Functions", [AAp2]. The initialization phase also loads the packages "Clipboard", [AAp5], "Data::Translators", [AAp6], "Data::TypeSystem", [AAp7], "Text::Plot", [AAp8], and "Text::SubParsers", [AAp9], which can be used to post-process LLM outputs.


LLM cells

The LLMs of OpenAI (ChatGPT, DALL-E) and Google (PaLM) can be interacted with using "dedicated" notebook cells.

Here is an example of a code cell with PaLM magic spec:

%% palm, max-tokens=600
Generate a horror story about a little girl lost in the forest and getting possessed.

For more examples see the notebook "Chatbook-LLM-cells.ipynb".
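Analogously, here is a sketch of a code cell with OpenAI magic spec (the parameter value is illustrative):

```
%% openai, max-tokens=120
Write a haiku about the Raku programming language.
```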


Notebook-wide chats

Chatbooks have the ability to maintain LLM conversations over multiple notebook cells. A chatbook can have more than one LLM conversation. "Under the hood" each chatbook maintains a database of chat objects. Chat cells are used to give messages to those chat objects.

For example, here is a chat cell with which a new "Email writer" chat object is made, and that new chat object has the identifier "em12":

%% chat-em12, prompt = «Given a topic, write emails in a concise, professional manner»
Write a vacation email.

Here is a chat cell in which another message is given to the chat object with identifier "em12":

%% chat-em12
Rewrite with manager's name being Jane Doe, and start- and end dates being 8/20 and 9/5.

In this chat cell a new chat object is created:

%% chat snowman, prompt = ⎡Pretend you are a friendly snowman. Stay in character for every response you give me. Keep your responses short.⎦
Hi!

And here is a chat cell that sends another message to the "snowman" chat object:

%% chat snowman
Who built you? Where?

Remark: Specifying a chat object identifier is not required; i.e. the magic spec %% chat alone can be used. The "default" chat object identifier is "NONE".

Remark: The magic keyword "chat" can be separated from the identifier of the chat object with the symbols "-", "_", ":", or with any number of (horizontal) white spaces.
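For example, per the remark above, the following magic specs should all address the same "snowman" chat object:

```
%% chat-snowman
%% chat_snowman
%% chat:snowman
%% chat snowman
```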

For more examples see the notebook "Chatbook-LLM-chats.ipynb".

Here is a flowchart that summarizes the way chatbooks create and utilize LLM chat objects:

flowchart LR
    OpenAI{{OpenAI}}
    PaLM{{PaLM}}
    LLMFunc[[LLM::Functions]]
    LLMProm[[LLM::Prompts]]
    CODB[(Chat objects)]
    PDB[(Prompts)]
    CCell[/Chat cell/]
    CRCell[/Chat result cell/]
    CIDQ{Chat ID<br>specified?}
    CIDEQ{Chat ID<br>exists in DB?}
    RECO[Retrieve existing<br>chat object]
    COEval[Message<br>evaluation]
    PromParse[Prompt<br>DSL spec parsing]
    KPFQ{Known<br>prompts<br>found?}
    PromExp[Prompt<br>expansion]
    CNCO[Create new<br>chat object]
    CIDNone["Assume chat ID<br>is 'NONE'"] 
    subgraph Chatbook frontend    
        CCell
        CRCell
    end
    subgraph Chatbook backend
        CIDQ
        CIDEQ
        CIDNone
        RECO
        CNCO
        CODB
    end
    subgraph Prompt processing
        PDB
        LLMProm
        PromParse
        KPFQ
        PromExp 
    end
    subgraph LLM interaction
      COEval
      LLMFunc
      PaLM
      OpenAI
    end
    CCell --> CIDQ
    CIDQ --> |yes| CIDEQ
    CIDEQ --> |yes| RECO
    RECO --> PromParse
    COEval --> CRCell
    CIDEQ -.- CODB
    CIDEQ --> |no| CNCO
    LLMFunc -.- CNCO -.- CODB
    CNCO --> PromParse --> KPFQ
    KPFQ --> |yes| PromExp
    KPFQ --> |no| COEval
    PromParse -.- LLMProm 
    PromExp -.- LLMProm
    PromExp --> COEval 
    LLMProm -.- PDB
    CIDQ --> |no| CIDNone
    CIDNone --> CIDEQ
    COEval -.- LLMFunc
    LLMFunc <-.-> OpenAI
    LLMFunc <-.-> PaLM

Chat meta cells

Each chatbook session has a Hash of chat objects. Chatbooks can have chat meta cells that allow access to the chat object "database" as a whole, or to its individual objects.

Here is an example of a chat meta cell (that applies the method say to the chat object with ID "snowman"):

%% chat snowman meta
say

Here is an example of a chat meta cell that creates a new chat object with the LLM prompt specified in the cell ("Guess the word"):

%% chat-WordGuesser prompt
We're playing a game. I'm thinking of a word, and I need to get you to guess that word. 
But I can't say the word itself. 
I'll give you clues, and you'll respond with a guess. 
Your guess should be a single word only.

Here is another chat object creation cell using a prompt from the package "LLM::Prompts", [AAp10]:

%% chat yoda1 prompt
@Yoda

Here is a table with examples of magic specs for chat meta cells and their interpretation:

| cell magic line  | cell content | interpretation |
|------------------|--------------|----------------|
| chat-ew12 meta   | say          | Give the "print out" of the chat object with ID "ew12" |
| chat-ew12 meta   | messages     | Give the messages of the chat object with ID "ew12" |
| chat sn22 prompt | You pretend to be a melting snowman. | Create a chat object with ID "sn22" with the prompt in the cell |
| chat meta all    | keys         | Show the keys of the session chat objects DB |
| chat all         | keys         | «same as above» |
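For example, per the table above, the keys of the session chat objects database can be listed with a meta cell like:

```
%% chat meta all
keys
```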

Here is a flowchart that summarizes the chat meta cell processing:

flowchart LR
    LLMFunc[[LLM::Functions]]
    CODB[(Chat objects)]
    CCell[/Chat meta cell/]
    CRCell[/Chat meta cell result/]
    CIDQ{Chat ID<br>specified?}
    KCOMQ{Known<br>chat object<br>method?}
    AKWQ{Keyword 'all'<br>specified?} 
    KCODBMQ{Known<br>chat objects<br>DB method?}
    CIDEQ{Chat ID<br>exists in DB?}
    RECO[Retrieve existing<br>chat object]
    COEval[Chat object<br>method<br>invocation]
    CODBEval[Chat objects DB<br>method<br>invocation]
    CNCO[Create new<br>chat object]
    CIDNone["Assume chat ID<br>is 'NONE'"] 
    NoCOM[/Cannot find<br>chat object<br>message/]
    CntCmd[/Cannot interpret<br>command<br>message/]
    subgraph Chatbook
        CCell
        NoCOM
        CntCmd
        CRCell
    end
    CCell --> CIDQ
    CIDQ --> |yes| CIDEQ  
    CIDEQ --> |yes| RECO
    RECO --> KCOMQ
    KCOMQ --> |yes| COEval --> CRCell
    KCOMQ --> |no| CntCmd
    CIDEQ -.- CODB
    CIDEQ --> |no| NoCOM
    LLMFunc -.- CNCO -.- CODB
    CNCO --> COEval
    CIDQ --> |no| AKWQ
    AKWQ --> |yes| KCODBMQ
    KCODBMQ --> |yes| CODBEval
    KCODBMQ --> |no| CntCmd
    CODBEval -.- CODB
    CODBEval --> CRCell
    AKWQ --> |no| CIDNone
    CIDNone --> CIDEQ
    COEval -.- LLMFunc

TODO

  1. TODO Features
    1. DONE Chat-meta cells (simple)
      • DONE meta
      • DONE all
      • DONE prompt
    2. TODO Chat-meta cells (via LLM)
    3. TODO DSL G4T cells
    4. TODO Using pre-prepared prompts
      • This requires implementing "LLM::Prompts".
        • And populating it with a good number of prompts.
  2. TODO Unit tests
    1. DONE PaLM cells
    2. DONE OpenAI cells
    3. DONE MermaidInk cells
    4. TODO DALL-E cells
    5. DONE Chat meta cells
  3. TODO Documentation
    • DONE LLM functions and chat objects in chatbooks
    • DONE LLM cells in chatbooks
    • DONE Notebook-wide chats and chat meta cells
    • TODO All parameters of OpenAI API in Raku
    • TODO All parameters of PaLM API in Raku
    • TODO More details on prompts
    • TODO Introductory video(s)

References

Articles

[AA1] Anton Antonov, "Literate programming via CLI", (2023), RakuForPrediction at WordPress.

[AA2] Anton Antonov, "Generating documents via templates and LLMs", (2023), RakuForPrediction at WordPress.

[AA3] Anton Antonov, "Workflows with LLM functions", (2023), RakuForPrediction at WordPress.

[AA4] Anton Antonov, "Number guessing games: PaLM vs ChatGPT", (2023), RakuForPrediction at WordPress.

[SW1] Stephen Wolfram, "Introducing Chat Notebooks: Integrating LLMs into the Notebook Paradigm", (2023), writings.stephenwolfram.com.

Packages

[AAp1] Anton Antonov, Jupyter::Chatbook Raku package, (2023), GitHub/antononcube.

[AAp2] Anton Antonov, LLM::Functions Raku package, (2023), GitHub/antononcube.

[AAp3] Anton Antonov, WWW::OpenAI Raku package, (2023), GitHub/antononcube.

[AAp4] Anton Antonov, WWW::PaLM Raku package, (2023), GitHub/antononcube.

[AAp5] Anton Antonov, Clipboard Raku package, (2023), GitHub/antononcube.

[AAp6] Anton Antonov, Data::Translators Raku package, (2023), GitHub/antononcube.

[AAp7] Anton Antonov, Data::TypeSystem Raku package, (2023), GitHub/antononcube.

[AAp8] Anton Antonov, Text::Plot Raku package, (2022), GitHub/antononcube.

[AAp9] Anton Antonov, Text::SubParsers Raku package, (2023), GitHub/antononcube.

[AAp10] Anton Antonov, LLM::Prompts Raku package, (2023), GitHub/antononcube.

[BDp1] Brian Duggan, Jupyter::Kernel Raku package, (2017-2023), GitHub/bduggan.

Videos

[AAv1] Anton Antonov, "Raku Literate Programming via command line pipelines", (2023), YouTube/@AAA4Prediction.

[AAv2] Anton Antonov, "Racoons playing with pearls and onions" (2023), YouTube/@AAA4Prediction.

[AAv3] Anton Antonov, "Streamlining ChatGPT code generation and narration workflows (Raku)" (2023), YouTube/@AAA4Prediction.


Footnotes

¹ Jupyter clients are user interfaces for interacting with an interpreter kernel like "Jupyter::Kernel". JupyterLab, Jupyter Notebook, Jupyter Console, and Jupyter QtConsole are the Jupyter-maintained clients. More info can be found on the Jupyter documentation site.