Plugin directory

The following plugins are available for LLM. Each can be installed with llm install <plugin-name>, and llm plugins lists the plugins you have installed.

Local models

These plugins all help you run LLMs directly on your own computer:

Remote APIs

These plugins can be used to interact with remotely hosted models via their API:

If an API model host provides an OpenAI-compatible API, you can configure LLM to talk to it without needing an extra plugin.
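For example, a locally hosted server that exposes an OpenAI-compatible endpoint can be registered by adding an entry to the extra-openai-models.yaml file in LLM's configuration directory. This is a sketch: the model identifiers and URL below are placeholders, not real defaults.

```yaml
# extra-openai-models.yaml — lives in LLM's configuration directory
- model_id: my-local-model          # alias used on the command line: llm -m my-local-model
  model_name: mistral-7b-instruct   # model name passed through to the API (placeholder)
  api_base: "http://localhost:8000/v1"  # base URL of the OpenAI-compatible endpoint (placeholder)
```

Once saved, the model appears alongside the built-in ones and can be invoked with llm -m my-local-model.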

Embedding models

Embedding models generate embedding vectors for text, which can then be stored and compared.

Extra commands

  • llm-python adds an llm python command for running a Python interpreter in the same virtual environment as LLM. This is useful for debugging, and it also provides a convenient way to interact with the LLM Python API if you installed LLM using Homebrew or pipx.

  • llm-cluster adds an llm cluster command for calculating clusters for a collection of embeddings. Calculated clusters can then be passed to a Large Language Model to generate a summary description.

Just for fun