Package ramalama
Command line tool for working with AI LLM models
https://github.com/containers/ramalama
On first run, RamaLama inspects your system for GPU support, falling back to CPU
support if no GPUs are present. It then uses a container engine such as Podman to
pull an OCI image containing all of the software necessary to run an
AI Model on your system's setup, eliminating the need for users to
configure the system for AI themselves. After this initialization, RamaLama
runs AI Models inside containers based on that OCI image.
Version: 0.14.0
General Commands

| Command | Description |
| --- | --- |
| ramalama | Simple management tool for working with AI Models |
| ramalama-bench | benchmark the specified AI Model |
| ramalama-chat | OpenAI chat with the specified REST API URL |
| ramalama-containers | list all RamaLama containers |
| ramalama-convert | convert AI Models from local storage to OCI Image |
| ramalama-daemon | run a RamaLama REST server |
| ramalama-info | display RamaLama configuration information |
| ramalama-inspect | inspect the specified AI Model |
| ramalama-list | list all downloaded AI Models |
| ramalama-login | login to remote registry |
| ramalama-logout | logout from remote registry |
| ramalama-perplexity | calculate the perplexity value of an AI Model |
| ramalama-pull | pull AI Models from Model registries to local storage |
| ramalama-push | push AI Models from local storage to remote registries |
| ramalama-rag | generate and convert Retrieval Augmented Generation (RAG) data from provided documents into an OCI Image |
| ramalama-rm | remove AI Models from local storage |
| ramalama-run | run specified AI Model as a chatbot |
| ramalama-serve | serve REST API on specified AI Model |
| ramalama-stop | stop the named container that is running an AI Model |
| ramalama-version | display version of RamaLama |
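`ramalama serve` exposes a model behind a REST API, and `ramalama chat` talks to such an endpoint using the OpenAI chat protocol. As a minimal sketch of what a client request looks like, the snippet below builds an OpenAI-style chat-completion request in Python; the base URL, port, and model name are placeholders, not values mandated by RamaLama:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a locally served model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With a server started via e.g. `ramalama serve <model>`, the request could
# be sent with urllib.request.urlopen(build_chat_request(...)).
```

This only constructs the request; sending it requires a running server, which is why `ramalama serve` must be started first.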
File Formats

| Page | Description |
| --- | --- |
| ramalama-oci | RamaLama oci:// Image Format |
| ramalama.conf | specifies default configuration options and command-line flags for RamaLama |
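As an illustration of the kind of defaults ramalama.conf can carry, here is a hedged sketch of a fragment; the key names below are assumptions for illustration, and ramalama.conf(5) documents the authoritative set:

```toml
# Illustrative fragment only; key names are assumptions, see ramalama.conf(5).
[ramalama]
# Container engine used to run models (e.g. podman or docker).
engine = "podman"
# Default OCI image pulled on first run.
image = "quay.io/ramalama/ramalama"
```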
Miscellanea

| Page | Description |
| --- | --- |
| ramalama-cann | setting up RamaLama with Ascend CANN support |
| ramalama-cuda | setting up RamaLama with NVIDIA CUDA support |
| ramalama-macos | setting up RamaLama on macOS |
| ramalama-musa | setting up RamaLama with MUSA support |