WWW::Ollama

Raku package for accessing Ollama models.
The implementation is based on Ollama's API, [Ol1], and on observing (and imitating) the
Ollama client of Wolfram Language.
The package has the following features:

- If Ollama is not running, the corresponding executable is found and started.
- If a request specifies a known Ollama "local-evaluation" model that is not available locally, then that model is downloaded first.
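The requests the package sends are plain HTTP calls against Ollama's API. Below is a rough, illustrative sketch of such a call made directly with Cro::HTTP::Client (the package's actual internals may differ), assuming a local Ollama server listening on its default port 11434:

```raku
# Illustrative only: a direct call to Ollama's /api/generate endpoint, [Ol1].
# WWW::Ollama wraps requests like this one; its internals may differ.
use Cro::HTTP::Client;

my $response = await Cro::HTTP::Client.post:
        'http://localhost:11434/api/generate',
        content-type => 'application/json',
        body => { model => 'gemma3:1b', prompt => 'Why is the sky blue?', stream => False };

# Cro deserializes application/json responses into Raku data structures.
my %result = await $response.body;
say %result<response>;
```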
Installation
From GitHub:
zef install https://github.com/antononcube/Raku-WWW-Ollama.git
From the Zef ecosystem:
zef install WWW::Ollama
Usage examples
For detailed usage examples see the basic usage script and notebook in the package repository.
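As a quick orientation, here is a minimal, hypothetical sketch of programmatic use. The client class name and its method names (completion, chat, embedding) are assumptions made for illustration, mirroring the request paths listed in the CLI section below, not the package's confirmed API:

```raku
# Hypothetical sketch: the class and method names below are assumptions.
use WWW::Ollama;

# Make a client object (a functional, object-free interface is still a TODO).
my $client = WWW::Ollama::Client.new;

# Text completion with a small local model.
say $client.completion('Why is the sky blue?', model => 'gemma3:1b');

# Chat-style generation over a sequence of message hashes.
my @messages;
@messages.push: %( role => 'user', content => 'Suggest a Raku one-liner.' );
say $client.chat(@messages);

# Embedding vector for a piece of text.
say $client.embedding('Raku is a gradually typed language.');
```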
CLI
The package provides the Command Line Interface (CLI) script ollama-client for making Ollama LLM requests (completions, chat, embeddings, and model queries).
Here is the usage message:
ollama-client --help
# Usage:
# ollama-client [<words> ...] [--path=<Str>] [-m|--model=<Str>] [-f|--format=<Str>] -- Ollama client invocation.
#
# --path=<Str> Path, one of 'completion', 'chat', 'embedding', 'model-info', 'list-models', or 'list-running-models'. [default: 'completion']
# -m|--model=<Str> Model to use. [default: 'gemma3:1b']
# -f|--format=<Str> Format of the result; one of "json", "hash", "values", or "Whatever". [default: 'Whatever']
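For example, the following invocations request a completion from the default 'gemma3:1b' model and list the locally available models as JSON (assuming a working local Ollama installation):

```shell
ollama-client 'Why is the sky blue?' --model=gemma3:1b
ollama-client --path=list-models --format=json
```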
TODO
- TODO Implementation
    - DONE Reasonable gists for the different objects
    - TODO Authorization
        - DONE Initialize the client with an API key and use that key
        - TODO Pass & use an API key per client method call
        - TODO Automatic discovery and use of OLLAMA_API_KEY
    - TODO Refactor to simpler code
    - TODO Functional interface
        - I.e. without the need to explicitly make a client object
- TODO CLI
    - DONE MVP
    - TODO Detect JSON file with valid chat records
    - TODO Detect JSON string with valid chat records
- DONE Unit tests
    - DONE Client object creation
    - DONE Completion generation
    - DONE Chat generation
    - DONE Embeddings
- TODO Documentation
    - DONE Basic usage script
    - DONE Basic usage notebook
    - TODO Using via the LLM-function framework
    - TODO Benchmarking
    - TODO Demo video
References
[Ol1] "Ollama API", Ollama documentation, https://github.com/ollama/ollama/blob/main/docs/api.md