Package ramalama: Information

    Source package: ramalama
    Version: 0.14.0-alt1
    Build time: Nov 3, 2025, 01:46 AM in task #399107
    License: MIT
    Summary: RamaLama is a command line tool for working with AI LLM models
    Description: 
    RamaLama is a command line tool for working with AI LLM models
    
    On first run RamaLama inspects your system for GPU support, falling back to
    CPU support if no GPUs are present. It then uses a container engine such as
    Podman to pull an OCI image with all of the software needed to run an AI
    model on your system's setup. This eliminates the need for users to
    configure the system for AI themselves. After this initialization, RamaLama
    runs the AI models inside a container based on that OCI image.
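
    A minimal usage sketch (the model name and transport below are examples
    only; any model reachable through a supported transport such as ollama://
    or huggingface:// can be substituted):

        # Run a model interactively; the matching OCI image is pulled on first use
        ramalama run ollama://tinyllama

        # Serve the model over a local REST API instead of an interactive prompt
        ramalama serve ollama://tinyllama

        # List models that have been pulled into local storage
        ramalama list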

    List of RPM packages built from this SRPM:
    python3-module-ramalama (noarch)
    ramalama (noarch)

    Maintainer: Konstantin Lepikhov

    List of contributors:
    Konstantin Lepikhov

    ACL:
    Konstantin Lepikhov
    @everybody

    Build dependencies:
      1. python3-devel
      2. python3-module-argcomplete
      3. python3-module-pyproject-installer >= 0.4.0
      4. python3-module-setuptools
      5. python3-module-wheel
      6. go-md2man
      7. golang

    Last changed:

    Nov. 2, 2025 Konstantin Lepikhov 0.14.0-alt1
    - 0.14.0.
    Oct. 28, 2025 Konstantin Lepikhov 0.13.0-alt1
    - 0.13.0.
    Oct. 2, 2025 Konstantin Lepikhov 0.12.3-alt1
    - 0.12.3.