Chatnik

zef:antononcube

Raku package that provides Command Line Interface (CLI) scripts for conversing with persistent Large Language Model (LLM) personas.

"Chatnik" uses files of the host Operating System (OS) to maintain persistent interaction with multiple LLM chat objects.

"Chatnik" brings the LLM chat-object interaction system of the Raku package "Jupyter::Chatbook" to UNIX-like OS terminals. (I.e. an OS shell is used instead of a Jupyter notebook.)

Remark: The following quote is attributed to Ken Thompson about UNIX:

We have persistent objects, they're called files.


Installation

From Zef Ecosystem:

zef install Chatnik

From GitHub:

zef install https://github.com/antononcube/Raku-Chatnik.git
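
After installation the CLI scripts should be available in the shell. A quick sanity check is to ask for the usage message (not reproduced here):

```shell
llm-chat --help
```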

Usage examples

A chat of a few turns

The script llm-chat is used to create and chat with LLM personas (chat objects):

  1. Create and chat with an LLM persona named "yoda1" (using the Yoda chat persona):
llm-chat -i=yoda1 --prompt=@Yoda hi who are you
# Hmmm. Yoda, I am. Jedi Master, wise and old. Help you, I can. Yes, hmmm.
  2. Continue the conversation with "yoda1":
llm-chat -i=yoda1 since when do you use a green light saber
# Green, my lightsaber is. Symbol of a Jedi Consular, it is. Deep connection to the Force, it shows. Long ago, I chose this color, yes. Balance and harmony, it represents. Hmmm. Use the Force, I do. Much to learn, you still have.

Remark: The message input for llm-chat can be given in quotes. For example: llm-chat 'Hi, again!' -i=yoda1.
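
Since "Chatnik" maintains multiple chat objects, a second, independent persona can be started simply by choosing a different chat id. A sketch reusing the @Yoda prompt (the reply is model-dependent and not shown):

```shell
llm-chat -i=yoda2 --prompt=@Yoda 'Hi! A different conversation, this is.'
```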

Chat objects management

The CLI script llm-chat-meta can be used to view and manage the chat objects used by "Chatnik". Here is its usage message:

llm-chat-meta --help
# Usage:
#   llm-chat-meta <command> [-i|--id|--chat-id=<Str>] [--all] -- Meta processing of persistent LLM-chat objects.
#   
#     <command>                  Command, one of: file, messages, clear, delete.
#     -i|--id|--chat-id=<Str>    Chat id; ignored if --all is specified. [default: '']
#     --all                      Whether to apply the command to all chat objects or not. [default: False]
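
Per the usage message above, the file command should report the file backing a chat object; a hedged sketch, with the output omitted since it depends on the host OS:

```shell
llm-chat-meta file -i yoda1
```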

Here we see the messages of "yoda1":

llm-chat-meta messages -i yoda1
# {content => hi who are you, role => user, timestamp => 2026-04-18T11:17:17.389634-04:00}
# {content => Hmmm. Yoda, I am. Jedi Master, wise and old. Help you, I can. Yes, hmmm., role => assistant, timestamp => 2026-04-18T11:17:18.990963-04:00}
# {content => since when do you use a green light saber, role => user, timestamp => 2026-04-18T11:17:19.412694-04:00}
# {content => Green, my lightsaber is. Symbol of a Jedi Consular, it is. Deep connection to the Force, it shows. Long ago, I chose this color, yes. Balance and harmony, it represents. Hmmm. Use the Force, I do. Much to learn, you still have., role => assistant, timestamp => 2026-04-18T11:17:22.278950-04:00}

Here we clear the messages:

llm-chat-meta clear -i yoda1
# Cleared the messages of chat object yoda1.
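
To remove chat objects entirely, the delete command from the usage message above can be used, optionally over all chat objects (a sketch; the confirmation message may differ):

```shell
llm-chat-meta delete --all
```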

Design

Here is a flowchart that describes the interaction between the host Operating System and chat objects database:

flowchart LR
    OpenAI{{OpenAI}}
    Gemini{{Gemini}}
    Ollama{{Ollama}}
    LLMFunc[[LLM::Functions]]
    LLMProm[[LLM::Prompts]]
    CODBOS[(Chat objects<br>file)]
    CODB[(Chat objects)]
    PDB[(Prompts)]
    CCommand[/Chat command/]
    CCommandOutput[/Chat result/]
    CIDQ{Chat ID<br>specified?}
    CIDEQ{Chat ID<br>exists in DB?}
    IngestCODB[Chat objects file<br>ingestion]
    UpdateCODB[Chat objects file<br>update]
    RECO[Retrieve existing<br>chat object]
    COEval[Message<br>evaluation]
    PromParse[Prompt<br>DSL spec parsing]
    KPFQ{Known<br>prompts<br>found?}
    PromExp[Prompt<br>expansion]
    CNCO[Create new<br>chat object]
    CIDNone["Assume chat ID<br>is 'NONE'"] 
    subgraph "OS Shell"    
        CCommand
        CCommandOutput
    end
    subgraph OS file system
        CODBOS
    end
    subgraph PromptProc[Prompt processing]
        PDB
        LLMProm
        PromParse
        KPFQ
        PromExp 
    end
    subgraph LLMInteract[LLM interaction]
      COEval
      LLMFunc
      Gemini
      OpenAI
      Ollama
    end
    subgraph Chatnik backend
        IngestCODB
        CODB
        CIDQ
        CIDEQ
        CIDNone
        RECO
        CNCO
        UpdateCODB
        PromptProc
        LLMInteract
    end
    CCommand --> IngestCODB
    CODBOS -.-> IngestCODB 
    UpdateCODB -.-> CODBOS 
    IngestCODB -.-> CODB
    IngestCODB --> CIDQ
    CIDQ --> |yes| CIDEQ
    CIDEQ --> |yes| RECO
    RECO --> PromParse
    COEval --> CCommandOutput
    CIDEQ -.- CODB
    CIDEQ --> |no| CNCO
    LLMFunc -.- CNCO -.- CODB
    CNCO --> PromParse --> KPFQ
    KPFQ --> |yes| PromExp
    KPFQ --> |no| COEval
    PromParse -.- LLMProm 
    PromExp -.- LLMProm
    PromExp --> COEval 
    LLMProm -.- PDB
    CIDQ --> |no| CIDNone
    CIDNone --> CIDEQ
    COEval -.- LLMFunc
    COEval --> UpdateCODB
    LLMFunc <-.-> OpenAI
    LLMFunc <-.-> Gemini
    LLMFunc <-.-> Ollama

    style PromptProc fill:DimGray,stroke:#333,stroke-width:2px
    style LLMInteract fill:DimGray,stroke:#333,stroke-width:2px
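
The backend flow above boils down to: ingest the chat-objects file, resolve the chat ID (defaulting to 'NONE'), retrieve or create the chat object, evaluate the message, and write the file back. Here is a minimal illustrative sketch in Python with a stubbed LLM call (all names are hypothetical; "Chatnik" itself is implemented in Raku on top of LLM::Functions):

```python
import json
import os
import tempfile

# Hypothetical miniature of the Chatnik backend: a chat-objects "database"
# persisted as a file, keyed by chat ID.
STORE = os.path.join(tempfile.gettempdir(), "chat-objects.json")

def ingest_store(path=STORE):
    # Chat objects file ingestion
    if os.path.exists(path):
        with open(path) as fh:
            return json.load(fh)
    return {}

def update_store(store, path=STORE):
    # Chat objects file update
    with open(path, "w") as fh:
        json.dump(store, fh)

def chat_turn(chat_id, message, llm=lambda msgs: "OK"):
    store = ingest_store()
    chat_id = chat_id or "NONE"                         # assume chat ID is 'NONE'
    chat = store.setdefault(chat_id, {"messages": []})  # retrieve or create
    chat["messages"].append({"role": "user", "content": message})
    reply = llm(chat["messages"])                       # message evaluation (stubbed LLM)
    chat["messages"].append({"role": "assistant", "content": reply})
    update_store(store)
    return reply
```

The stubbed llm callable stands in for the prompt expansion and LLM evaluation steps; persistence between invocations comes entirely from re-reading the store file on every turn, which is what makes the chat objects survive separate shell commands.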

Here is the corresponding UML Sequence diagram:

sequenceDiagram
    participant CCommand as Chat command
    participant IngestCODB as Chat objects file ingestion
    participant CODBOS as Chat objects file
    participant CODB as Chat objects
    participant CIDQ as Chat ID specified?
    participant CIDEQ as Chat ID exists in DB?
    participant RECO as Retrieve existing chat object
    participant PromParse as Prompt DSL spec parsing
    participant KPFQ as Known prompts found?
    participant PromExp as Prompt expansion
    participant COEval as Message evaluation
    participant CCommandOutput as Chat result
    participant CNCO as Create new chat object
    participant CIDNone as Assume chat ID is NONE
    participant UpdateCODB as Chat objects file update
    participant LLMFunc as LLM Functions
    participant LLMProm as LLM Prompts

    CCommand->>IngestCODB: Chat command
    CODBOS--)IngestCODB: Chat objects file
    IngestCODB--)CODB: Chat objects
    IngestCODB->>CIDQ: Chat ID specified?
    CIDQ-->>CIDEQ: Yes
    CIDQ-->>CIDNone: No
    CIDNone->>CIDEQ: Assume chat ID is NONE
    CIDEQ-->>RECO: Yes
    CIDEQ-->>CNCO: No
    CIDEQ--)CODB: Chat objects
    RECO->>PromParse: Prompt DSL spec parsing
    PromParse--)LLMProm: LLM Prompts
    CNCO--)LLMFunc: LLM Functions
    CNCO--)CODB: Chat objects
    CNCO->>PromParse: Prompt DSL spec parsing
    PromParse->>KPFQ: Known prompts found?
    KPFQ-->>PromExp: Yes
    KPFQ-->>COEval: No
    PromExp--)LLMProm: LLM Prompts
    PromExp->>COEval: Message evaluation
    COEval--)LLMFunc: LLM evaluator invocation
    LLMFunc--)COEval: Evaluation result
    COEval->>UpdateCODB: Chat objects file update
    COEval->>CCommandOutput: Chat result

TODO


References

Packages

[AAp1] Anton Antonov, LLM::Functions, Raku package, (2023-2026), GitHub/antononcube.

[AAp2] Anton Antonov, LLM::Prompts, Raku package, (2023-2025), GitHub/antononcube.

[AAp3] Anton Antonov, Jupyter::Chatbook, Raku package, (2023-2026), GitHub/antononcube.

[JSp1] Jonathan Stowe, XDG::BaseDirectory, Raku package, (2016-2026), GitHub/jonathanstowe.