Package ramalama: Information
Source package: ramalama
Version: 0.14.0-alt1
Build time: Nov 3, 2025, 01:46 AM in task #399107
Category: Development/Python3
Home page: https://github.com/containers/ramalama
License: MIT
Summary: RamaLama is a command line tool for working with AI LLM models
Description:
RamaLama is a command line tool for working with AI LLM models. On first run, RamaLama inspects your system for GPU support, falling back to CPU support if no GPUs are present. It then uses container engines like Podman to pull the appropriate OCI image with all of the software necessary to run an AI Model for your system's setup. This eliminates the need for the user to configure the system for AI themselves. After this initialization, RamaLama runs the AI Models within a container based on that OCI image.
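A minimal usage sketch of this workflow, assuming the ramalama command is on PATH and that "tinyllama" is one of the configured model short names (the model name here is illustrative); the CLI is invoked from Python via subprocess:

    import subprocess

    # Pull the model; RamaLama selects an OCI image matching the detected GPU/CPU setup.
    subprocess.run(["ramalama", "pull", "tinyllama"], check=True)

    # Start an interactive session with the model, run inside a container.
    subprocess.run(["ramalama", "run", "tinyllama"], check=True)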
Maintainer: Konstantin Lepikhov
Last changed
Nov. 2, 2025 Konstantin Lepikhov 0.14.0-alt1
- 0.14.0.
Oct. 28, 2025 Konstantin Lepikhov 0.13.0-alt1
- 0.13.0.
Oct. 2, 2025 Konstantin Lepikhov 0.12.3-alt1
- 0.12.3.