Jupyter::Chatbook
In brief
This Raku package is a fork of Brian Duggan's "Jupyter::Kernel".
Here are the top opening statements of the README of "Jupyter::Kernel":
"Jupyter::Kernel" is a pure Raku implementation of a Raku kernel for Jupyter clients¹.
Jupyter notebooks provide a web-based (or console-based) Read Eval Print Loop (REPL) for running code and serializing input and output.
It is desirable to bring interaction with Large Language Models (LLMs) into "typical" REPL systems and workflows.
Having LLM-aware and LLM-chat-endowed notebooks -- chatbooks -- can really speed up the:
Writing and preparation of documents on variety of subjects
Derivation of useful Raku (actionable) code
Adoption of Raku by newcomers
This repository is mostly for experimental work, but it aims to always be useful for interacting with LLMs via Raku.
Remark: The reason to have a separate package -- a fork of "Jupyter::Kernel" -- is that:
I plan to introduce 4-6 new package dependencies
I expect to do a fair amount of UX experimental implementations and refactoring
Installation and setup
From "Zef ecosystem":
zef install Jupyter::Chatbook
From GitHub:
zef install https://github.com/antononcube/Raku-Jupyter-Chatbook.git
After installing the package "Jupyter::Chatbook", follow the setup instructions of "Jupyter::Kernel".
Using LLMs in chatbooks
There are four ways to use LLMs in a chatbook:
LLM functions, [AA3, AAp4]
LLM chat objects, [AA4, AAp4]
OpenAI, [AAp2], or PaLM, [AAp3], code cells with magics
Notebook-wide chats that are distributed over multiple code cells with chat-magic specs
The sections below briefly describe each of these ways and have links to notebooks with more detailed examples.
LLM functions and chat objects
LLM functions, as described in [AA3], are best utilized in an interactive REPL tool or environment. Notebooks are a perfect medium for LLM-function workflows. Here is an example of a code cell that defines an LLM function:
use LLM::Functions;
my &fcp = llm-function({"What is the population of the country $_ ?"});
Here is another cell that can be evaluated multiple times using different country names:
<Niger Gabon>.map({ &fcp($_) })
For more examples of LLM functions and LLM chat objects see the notebook "Chatbook-LLM-functions-and-chat-objects.ipynb".
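As a brief sketch of the chat-object workflow mentioned above -- assuming the `llm-chat` constructor and `.eval` method of the "LLM::Functions" package [AAp4], and a configured LLM service API key -- a conversation can be carried out programmatically like this:

```raku
use LLM::Functions;

# Create a chat object with a system prompt.
# (llm-chat and .eval are taken here from the "LLM::Functions" API;
# evaluating messages requires network access and an API key.)
my $chat = llm-chat('Pretend you are a friendly snowman. Keep your responses short.');

# Send successive messages; the object keeps the conversation history.
say $chat.eval('Hi!');
say $chat.eval('Who built you? Where?');
```

Chat cells in a chatbook (described below) provide the same functionality without writing this boilerplate.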
LLM cells
The LLMs of OpenAI (ChatGPT, DALL-E) and Google (PaLM) can be interacted with using "dedicated" notebook cells.
Here is an example of a code cell with PaLM magic spec:
%% palm, max-tokens=600
Generate a horror story about a little girl lost in the forest and getting possessed.
For more examples see the notebook "Chatbook-LLM-cells.ipynb".
Notebook-wide chats
Chatbooks can maintain LLM conversations over multiple notebook cells, and a chatbook can have more than one LLM conversation. "Under the hood" each chatbook maintains a database of chat objects; chat cells are used to send messages to those chat objects.
For example, here is a chat cell that creates a new "Email Writer" chat object with the identifier "em12":
%% chat-em12, prompt = «Given a topic, write emails in a concise, professional manner»
Write a vacation email.
Here is a chat cell in which another message is given to the chat object with identifier "em12":
%% chat-em12
Rewrite with manager's name being Jane Doe, and start- and end dates being 8/20 and 9/5.
In this chat cell a new chat object is created:
%% chat-snowman, prompt = ⎡Pretend you are a friendly snowman. Stay in character for every response you give me. Keep your responses short.⎦
Hi!
And here is a chat cell that sends another message to the "snowman" chat object:
%% chat-snowman
Who built you? Where?
Remark: Specifying a chat object identifier is not required, i.e. the magic spec %% chat alone can be used.
The "default" chat object has the identifier "NONE".
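For instance, a chat cell like the following (with a hypothetical message) sends its content to the default chat object:

```
%% chat
What is a good gift for a data scientist?
```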
For more examples see the notebook "Chatbook-LLM-chats.ipynb".
Here is a flowchart that summarizes the way chatbooks create and utilize LLM chat objects:
TODO
TODO Features
TODO DSL G4T cells
TODO Chat-meta cells (simple)
TODO Chat-meta cells (via LLM)
TODO Unit tests
DONE PaLM cells
DONE OpenAI cells
DONE MermaidInk cells
TODO DALL-E cells
TODO Documentation
TODO Long chat
TODO All parameters of OpenAI API in Raku
TODO All parameters of PaLM API in Raku
TODO More details on prompts
TODO Introductory video(s)
References
Articles
[AA1] Anton Antonov, "Literate programming via CLI", (2023), RakuForPrediction at WordPress.
[AA2] Anton Antonov, "Generating documents via templates and LLMs", (2023), RakuForPrediction at WordPress.
[AA3] Anton Antonov, "Workflows with LLM functions", (2023), RakuForPrediction at WordPress.
[AA4] Anton Antonov, "Number guessing games: PaLM vs ChatGPT", (2023), RakuForPrediction at WordPress.
[SW1] Stephen Wolfram, "Introducing Chat Notebooks: Integrating LLMs into the Notebook Paradigm", (2023), writings.stephenwolfram.com.
Packages
[AAp1] Anton Antonov, Text::CodeProcessing Raku package, (2021), GitHub/antononcube.
[AAp2] Anton Antonov, WWW::OpenAI Raku package, (2023), GitHub/antononcube.
[AAp3] Anton Antonov, WWW::PaLM Raku package, (2023), GitHub/antononcube.
[AAp4] Anton Antonov, LLM::Functions Raku package, (2023), GitHub/antononcube.
[AAp5] Anton Antonov, Text::SubParsers Raku package, (2023), GitHub/antononcube.
[AAp6] Anton Antonov, Data::Translators Raku package, (2023), GitHub/antononcube.
[AAp7] Anton Antonov, Clipboard Raku package, (2023), GitHub/antononcube.
[BDp1] Brian Duggan, Jupyter::Kernel Raku package, (2017-2023), GitHub/bduggan.
Videos
[AAv1] Anton Antonov, "Raku Literate Programming via command line pipelines", (2023), YouTube/@AAA4Prediction.
[AAv2] Anton Antonov, "Racoons playing with pearls and onions" (2023), YouTube/@AAA4Prediction.
[AAv3] Anton Antonov, "Streamlining ChatGPT code generation and narration workflows (Raku)" (2023), YouTube/@AAA4Prediction.
Footnotes
¹ Jupyter clients are user interfaces for interacting with an interpreter kernel like "Jupyter::Kernel". Jupyter [Lab | Notebook | Console | QtConsole] are the Jupyter-maintained clients. More info can be found at the Jupyter documentation site.