~ruther/guix-local

560d5c66925970a5104a46051ae74e2dbd16adba — Ludovic Courtès 2 years ago a568ac8
gnu: llama-cpp: Produce a portable binary unless tuned.

* gnu/packages/machine-learning.scm (llama-cpp)[arguments]:
Augment #:configure-flags.
[properties]: New field.

Co-authored-by: John Fremlin <john@fremlin.org>
Change-Id: I9b3d72849107a6988fec94dc4a22614443338cb2
1 file changed, 11 insertions(+), 2 deletions(-)
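With this change the default build is a portable baseline binary, while the new `tunable?` property opts the package into Guix's `--tune` package transformation. A user who wants the CPU-specific code paths (AVX-512, FMA, etc.) back can request a tuned build; a sketch of the usage (CPU name is illustrative, valid names follow GCC's `-march=` values):

```shell
# Default: portable binary, no -march=native / -mfma / AVX-512 code paths.
guix build llama-cpp

# Tuned: rebuild for the local CPU's microarchitecture, enabled by the
# 'tunable?' property added in this commit.
guix build --tune llama-cpp

# Or name a target microarchitecture explicitly (example value):
guix build --tune=skylake-avx512 llama-cpp
```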

M gnu/packages/machine-learning.scm
M gnu/packages/machine-learning.scm => gnu/packages/machine-learning.scm +11 -2
@@ -541,8 +541,16 @@ Performance is achieved by using the LLVM JIT compiler.")
     (build-system cmake-build-system)
     (arguments
      (list
-       #:configure-flags
-       '(list "-DLLAMA_BLAS=ON" "-DLLAMA_BLAS_VENDOR=OpenBLAS")
+       #:configure-flags #~'("-DLLAMA_BLAS=ON"
+                             "-DLLAMA_BLAS_VENDOR=OpenBLAS"
+
+                             "-DLLAMA_NATIVE=OFF" ;no '-march=native'
+                             "-DLLAMA_FMA=OFF"    ;and no '-mfma', etc.
+                             "-DLLAMA_AVX2=OFF"
+                             "-DLLAMA_AVX512=OFF"
+                             "-DLLAMA_AVX512_VBMI=OFF"
+                             "-DLLAMA_AVX512_VNNI=OFF")
+
        #:modules '((ice-9 textual-ports)
                    (guix build utils)
                    ((guix build python-build-system) #:prefix python:)


@@ -580,6 +588,7 @@ Performance is achieved by using the LLVM JIT compiler.")
     (native-inputs (list pkg-config))
     (propagated-inputs
      (list python-numpy python-pytorch python-sentencepiece openblas))
+    (properties '((tunable? . #true))) ;use AVX512, FMA, etc. when available
     (home-page "https://github.com/ggerganov/llama.cpp")
     (synopsis "Port of Facebook's LLaMA model in C/C++")
     (description "This package provides a port to Facebook's LLaMA collection