Package evaluation of LlamaCpp on Julia 1.11.4 (a71dd056e0*) started at 2025-04-08T09:47:03.914

################################################################################
# Set-up
# Installing PkgEval dependencies (TestEnv)...

Set-up completed after 8.92s

################################################################################
# Installation
# Installing LlamaCpp...

Resolving package versions...
Updating `~/.julia/environments/v1.11/Project.toml`
  [4028143c] + LlamaCpp v0.5.0
Updating `~/.julia/environments/v1.11/Manifest.toml`
  [fa961155] + CEnum v0.5.0
  [692b3bcd] + JLLWrappers v1.7.0
  [4028143c] + LlamaCpp v0.5.0
  [21216c6a] + Preferences v1.4.3
  [b873ce64] + ReplMaker v0.2.7
  [98bd2d0b] + llama_cpp_jll v0.0.17+0
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [f43a241f] + Downloads v1.6.0
  [7b1f6079] + FileWatching v1.11.0
  [b77e0a4c] + InteractiveUtils v1.11.0
  [b27032c2] + LibCURL v0.6.4
  [8f399da3] + Libdl v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [ca575930] + NetworkOptions v1.2.0
  [de0858da] + Printf v1.11.0
  [3fa0cd96] + REPL v1.11.0
  [6462fe0b] + Sockets v1.11.0
  [f489334b] + StyledStrings v1.11.0
  [fa267f1f] + TOML v1.0.3
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] + LibCURL_jll v8.6.0+0
  [29816b5a] + LibSSH2_jll v1.11.0+1
  [c8ffd9c3] + MbedTLS_jll v2.28.6+0
  [14a3606d] + MozillaCACerts_jll v2023.12.12
  [83775a58] + Zlib_jll v1.2.13+1
  [8e850ede] + nghttp2_jll v1.59.0+0
Installation completed after 2.09s

################################################################################
# Precompilation
# Precompiling PkgEval dependencies...

Precompiling package dependencies...
Precompilation completed after 20.32s

################################################################################
# Testing
# Testing LlamaCpp

Status `/tmp/jl_1QC3N3/Project.toml`
⌅ [4c88cf16] Aqua v0.7.4
  [fa961155] CEnum v0.5.0
  [4028143c] LlamaCpp v0.5.0
  [b873ce64] ReplMaker v0.2.7
  [98bd2d0b] llama_cpp_jll v0.0.17+0
  [f43a241f] Downloads v1.6.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_1QC3N3/Manifest.toml`
⌅ [4c88cf16] Aqua v0.7.4
  [fa961155] CEnum v0.5.0
  [34da2185] Compat v4.16.0
  [692b3bcd] JLLWrappers v1.7.0
  [4028143c] LlamaCpp v0.5.0
  [21216c6a] Preferences v1.4.3
  [b873ce64] ReplMaker v0.2.7
  [98bd2d0b] llama_cpp_jll v0.0.17+0
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [ade2ca70] Dates v1.11.0
  [f43a241f] Downloads v1.6.0
  [7b1f6079] FileWatching v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [ca575930] NetworkOptions v1.2.0
  [44cfe95a] Pkg v1.11.0
  [de0858da] Printf v1.11.0
  [3fa0cd96] REPL v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization v1.11.0
  [6462fe0b] Sockets v1.11.0
  [f489334b] StyledStrings v1.11.0
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] LibCURL_jll v8.6.0+0
  [e37daf67] LibGit2_jll v1.7.2+0
  [29816b5a] LibSSH2_jll v1.11.0+1
  [c8ffd9c3] MbedTLS_jll v2.28.6+0
  [14a3606d] MozillaCACerts_jll v2023.12.12
  [83775a58] Zlib_jll v1.2.13+1
  [8e850ede] nghttp2_jll v1.59.0+0
  [3f19e933] p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading.
Testing Running tests...
Skipping LlamaCpp.REPL_CTX
Skipping Base.active_repl
Skipping Base.active_repl_backend
Skipping Base.cwstring
Skipping LlamaCpp.REPL_CTX
Test Summary:          | Pass  Total   Time
Code quality (Aqua.jl) |    9      9  41.8s
testing Llama, no model needed
testing run_llama
warning: no usable GPU found, --gpu-layers option will be ignored
warning: one possible reason is that llama.cpp was compiled without GPU support
warning: consult docs/build.md for compilation instructions
warning: no usable GPU found, --gpu-layers option will be ignored
warning: one possible reason is that llama.cpp was compiled without GPU support
warning: consult docs/build.md for compilation instructions
testing run_chat
warning: no usable GPU found, --gpu-layers option will be ignored
warning: one possible reason is that llama.cpp was compiled without GPU support
warning: consult docs/build.md for compilation instructions
[ Info: ENV["LLAMA_JL_MODEL_TESTS"] doesn't exist, _not_ running tests with model
Test Summary: | Pass  Total  Time
LlamaCpp.jl   |    5      5   6.1s
Testing LlamaCpp tests passed
Testing completed after 57.37s

PkgEval succeeded after 96.29s