Package evaluation of FlashAttentionWrapper on Julia 1.13.0-DEV.636 (2bd37fbbab*) started at 2025-05-25T15:24:16.977

################################################################################
# Set-up
# Installing PkgEval dependencies (TestEnv)...

Set-up completed after 8.19s

################################################################################
# Installation
# Installing FlashAttentionWrapper...

Resolving package versions...
    Updating `~/.julia/environments/v1.13/Project.toml`
  [4822a80a] + FlashAttentionWrapper v0.1.0
    Updating `~/.julia/environments/v1.13/Manifest.toml`
  [ab4f0b2a] + BFloat16s v0.5.1
  [d360d2e6] + ChainRulesCore v1.25.1
  [34da2185] + Compat v4.16.0
  [992eb4ea] + CondaPkg v0.2.29
  [53c2dc0f] + DLPack v0.3.1
  [9a962f9c] + DataAPI v1.16.0
  [e2d170a0] + DataValueInterfaces v1.0.0
  [8d63f2c5] + DispatchDoctor v0.4.19
  [4822a80a] + FlashAttentionWrapper v0.1.0
  [82899510] + IteratorInterfaceExtensions v1.0.0
  [692b3bcd] + JLLWrappers v1.7.0
  [0f8b85d8] + JSON3 v1.14.3
  [bb33d45b] + LuxCore v1.2.4
  [1914dd2f] + MacroTools v0.5.16
  [0b3b1443] + MicroMamba v0.1.14
  [bac558e1] + OrderedCollections v1.8.1
  [69de0a69] + Parsers v2.8.3
  [fa939f87] + Pidfile v1.3.0
  [aea7be01] + PrecompileTools v1.3.2
  [21216c6a] + Preferences v1.4.3
  [6099a3de] + PythonCall v0.9.25
  [ae029012] + Requires v1.3.1
  [6c6a2e73] + Scratch v1.2.1
  [856f2bd8] + StructTypes v1.11.0
  [3783bdb8] + TableTraits v1.0.1
  [bd369af6] + Tables v1.12.0
  [e17b2a0c] + UnsafePointers v1.0.0
  [f8abcde7] + micromamba_jll v1.5.8+0
  [4d7b5844] + pixi_jll v0.41.3+0
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [f43a241f] + Downloads v1.7.0
  [7b1f6079] + FileWatching v1.11.0
  [b77e0a4c] + InteractiveUtils v1.11.0
  [ac6e5ff7] + JuliaSyntaxHighlighting v1.12.0
  [4af54fe1] + LazyArtifacts v1.11.0
  [b27032c2] + LibCURL v0.6.4
  [76f85450] + LibGit2 v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.12.0
  [56ddb016] + Logging v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [a63ad114] + Mmap v1.11.0
  [ca575930] + NetworkOptions v1.3.0
  [44cfe95a] + Pkg v1.13.0
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization v1.11.0
  [f489334b] + StyledStrings v1.11.0
  [fa267f1f] + TOML v1.0.3
  [a4e569a6] + Tar v1.10.0
  [8dfed614] + Test v1.11.0
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] + LibCURL_jll v8.12.1+1
  [e37daf67] + LibGit2_jll v1.9.0+0
  [29816b5a] + LibSSH2_jll v1.11.3+1
  [14a3606d] + MozillaCACerts_jll v2025.5.20
  [4536629a] + OpenBLAS_jll v0.3.29+0
  [458c3c95] + OpenSSL_jll v3.5.0+0
  [83775a58] + Zlib_jll v1.3.1+2
  [8e850b90] + libblastrampoline_jll v5.12.0+0
  [8e850ede] + nghttp2_jll v1.65.0+0
  [3f19e933] + p7zip_jll v17.5.0+2
Installation completed after 3.58s

################################################################################
# Precompilation
# Precompiling PkgEval dependencies...

Precompiling package dependencies...
ERROR: LoadError: The following 1 direct dependency failed to precompile:

FlashAttentionWrapper

Failed to precompile FlashAttentionWrapper [4822a80a-c2d5-4bbd-b1da-f89b711347c8] to "/home/pkgeval/.julia/compiled/v1.13/FlashAttentionWrapper/jl_73UqUo" (ProcessExited(1)).
    CondaPkg Found dependencies: /home/pkgeval/.julia/packages/PythonCall/L4cjh/CondaPkg.toml
    CondaPkg Found dependencies: /home/pkgeval/.julia/packages/FlashAttentionWrapper/kDhOX/CondaPkg.toml
    CondaPkg Resolving changes
             + openssl
             + python
             + pytorch
    CondaPkg Initialising pixi
           │ /home/pkgeval/.julia/artifacts/cefba4912c2b400756d043a2563ef77a0088866b/bin/pixi
           │ init
           │ --format pixi
           └ /tmp/jl_KbCRZG/.CondaPkg
✔ Created /tmp/jl_KbCRZG/.CondaPkg/pixi.toml
    CondaPkg Wrote /tmp/jl_KbCRZG/.CondaPkg/pixi.toml
           │ [dependencies]
           │ openssl = ">=3, <3.6"
           │
           │ [dependencies.python]
           │ build = "*cpython*"
           │ channel = "conda-forge"
           │ version = ">=3.8,<4"
           │
           │ [dependencies.pytorch]
           │ channel = "pytorch"
           │ version = "*"
           │
           │ [project]
           │ description = "automatically generated by CondaPkg.jl"
           │ name = ".CondaPkg"
           │ channel-priority = "strict"
           │ platforms = ["linux-64"]
           └ channels = ["conda-forge", "pytorch"]
    CondaPkg Installing packages
           │ /home/pkgeval/.julia/artifacts/cefba4912c2b400756d043a2563ef77a0088866b/bin/pixi
           │ install
           └ --manifest-path /tmp/jl_KbCRZG/.CondaPkg/pixi.toml
Error:   × failed to fetch libcufft-11.2.1.3-he02047a_2.conda
  ├─▶ an io error occurred: failed to unpack `/home/pkgeval/.julia/scratchspaces/992eb4ea-22a4-4c89-a5bb-47a3300528ab/pixi_cache/pkgs/libcufft-11.2.1.3-he02047a_2/targets/x86_64-linux/lib/libcufft.so.11.2.1.3`
  ├─▶ failed to unpack `/home/pkgeval/.julia/scratchspaces/992eb4ea-22a4-4c89-a5bb-47a3300528ab/pixi_cache/pkgs/libcufft-11.2.1.3-he02047a_2/targets/x86_64-linux/lib/libcufft.so.11.2.1.3`
  ├─▶ failed to unpack `targets/x86_64-linux/lib/libcufft.so.11.2.1.3` into `/home/pkgeval/.julia/scratchspaces/992eb4ea-22a4-4c89-a5bb-47a3300528ab/pixi_cache/pkgs/libcufft-11.2.1.3-he02047a_2/targets/x86_64-linux/lib/libcufft.so.11.2.1.3`
  ├─▶ error decoding response body
  ├─▶ request or response body error
  ├─▶ error reading a body from connection
  ╰─▶ stream error received: unexpected internal error encountered
ERROR: LoadError: InitError: failed process: Process(setenv(`/home/pkgeval/.julia/artifacts/cefba4912c2b400756d043a2563ef77a0088866b/bin/pixi install --manifest-path /tmp/jl_KbCRZG/.CondaPkg/pixi.toml`,["PYTHON=", "LANG=C.UTF-8", "PATH=/home/pkgeval/.julia/artifacts/cefba4912c2b400756d043a2563ef77a0088866b/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/bin:/sbin", "JULIA_LOAD_PATH=/tmp/jl_KbCRZG/Project.toml:/home/pkgeval/.julia/environments/v1.13/Project.toml:/opt/julia/share/julia/stdlib/v1.13", "JULIA_NUM_PRECOMPILE_TASKS=1", "JULIA_PKG_PRECOMPILE_AUTO=0", "OPENBLAS_MAIN_FREE=1", "DISPLAY=:1", "HOME=/home/pkgeval", "R_HOME=*", "JULIA_CPU_THREADS=1", "JULIA_DEPOT_PATH=/home/pkgeval/.julia:/usr/local/share/julia:", "JULIA_NUM_THREADS=1", "PIXI_CACHE_DIR=/home/pkgeval/.julia/scratchspaces/992eb4ea-22a4-4c89-a5bb-47a3300528ab/pixi_cache", "OPENBLAS_NUM_THREADS=1", "CI=true", "JULIA_PKGEVAL=true", "LD_LIBRARY_PATH=/opt/julia/bin/../lib/julia:/opt/julia/bin/../lib", "RATTLER_CACHE_DIR=/home/pkgeval/.julia/scratchspaces/992eb4ea-22a4-4c89-a5bb-47a3300528ab/pixi_cache", "PKGEVAL=true"]), ProcessExited(1)) [1]
Stacktrace:
  [1] pipeline_error @ ./process.jl:602 [inlined]
  [2] run(::Cmd; wait::Bool) @ Base ./process.jl:517
  [3] run(::Cmd) @ Base ./process.jl:514
  [4] _run(io::IO, cmd::Cmd, args::Any; flags::Any) @ CondaPkg ~/.julia/packages/CondaPkg/hJuAq/src/resolve.jl:715
  [5] _run(io::IO, cmd::Cmd, args::Any) @ CondaPkg ~/.julia/packages/CondaPkg/hJuAq/src/resolve.jl:709
  [6] (::CondaPkg.var"#63#64"{Bool, Base.PipeEndpoint, String, Vector{CondaPkg.PipPkgSpec}, Vector{CondaPkg.PkgSpec}, Vector{CondaPkg.ChannelSpec}})() @ CondaPkg ~/.julia/packages/CondaPkg/hJuAq/src/resolve.jl:1000
  [7] cd(f::CondaPkg.var"#63#64"{Bool, Base.PipeEndpoint, String, Vector{CondaPkg.PipPkgSpec}, Vector{CondaPkg.PkgSpec}, Vector{CondaPkg.ChannelSpec}}, dir::String) @ Base.Filesystem ./file.jl:112
  [8] resolve(; force::Bool, io::IO, interactive::Bool, dry_run::Bool) @ CondaPkg ~/.julia/packages/CondaPkg/hJuAq/src/resolve.jl:940
  [9] resolve() @ CondaPkg ~/.julia/packages/CondaPkg/hJuAq/src/resolve.jl:718
 [10] envdir() @ CondaPkg ~/.julia/packages/CondaPkg/hJuAq/src/env.jl:70
 [11] init_context() @ PythonCall.C ~/.julia/packages/PythonCall/L4cjh/src/C/context.jl:63
 [12] __init__() @ PythonCall.C ~/.julia/packages/PythonCall/L4cjh/src/C/C.jl:23
 [13] run_module_init(mod::Module, i::Int64) @ Base ./loading.jl:1416
 [14] register_restored_modules(sv::Core.SimpleVector, pkg::Base.PkgId, path::String) @ Base ./loading.jl:1404
 [15] _include_from_serialized(pkg::Base.PkgId, path::String, ocachepath::Nothing, depmods::Vector{Any}; register::Bool) @ Base ./loading.jl:1292
 [16] _include_from_serialized @ ./loading.jl:1246 [inlined]
 [17] _require_search_from_serialized(pkg::Base.PkgId, sourcepath::String, build_id::UInt128, stalecheck::Bool; reasons::Dict{String, Int64}, DEPOT_PATH::Vector{String}) @ Base ./loading.jl:2088
 [18] __require_prelocked(pkg::Base.PkgId, env::String) @ Base ./loading.jl:2600
 [19] _require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2466
 [20] macro expansion @ ./loading.jl:2394 [inlined]
 [21] macro expansion @ ./lock.jl:376 [inlined]
 [22] __require(into::Module, mod::Symbol) @ Base ./loading.jl:2359
 [23] require @ ./loading.jl:2335 [inlined]
 [24] eval_import_path @ ./module.jl:36 [inlined]
 [25] eval_import_path_all(at::Module, path::Expr, keyword::String) @ Base ./module.jl:60
 [26] _eval_using(to::Module, path::Expr) @ Base ./module.jl:137
 [27] top-level scope @ ~/.julia/packages/FlashAttentionWrapper/kDhOX/src/rrule.jl:2
 [28] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:310
 [29] top-level scope @ ~/.julia/packages/FlashAttentionWrapper/kDhOX/src/FlashAttentionWrapper.jl:4
 [30] include(mod::Module, _path::String) @ Base ./Base.jl:309
 [31] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::Nothing) @ Base ./loading.jl:3000
 [32] top-level scope @ stdin:5
 [33] eval(m::Module, e::Any) @ Core ./boot.jl:489
 [34] include_string(mapexpr::typeof(identity), mod::Module, code::String, filename::String) @ Base ./loading.jl:2846
 [35] include_string @ ./loading.jl:2856 [inlined]
 [36] exec_options(opts::Base.JLOptions) @ Base ./client.jl:329
 [37] _start() @ Base ./client.jl:564
during initialization of module C
in expression starting at /home/pkgeval/.julia/packages/FlashAttentionWrapper/kDhOX/src/rrule.jl:2
in expression starting at /home/pkgeval/.julia/packages/FlashAttentionWrapper/kDhOX/src/FlashAttentionWrapper.jl:1
in expression starting at stdin:
in expression starting at /PkgEval.jl/scripts/precompile.jl:37
Precompilation failed after 195.72s

################################################################################
# Testing
# Testing FlashAttentionWrapper

Status `/tmp/jl_qOhMP2/Project.toml`
  [052768ef] CUDA v5.8.1
  [d360d2e6] ChainRulesCore v1.25.1
  [992eb4ea] CondaPkg v0.2.29
  [53c2dc0f] DLPack v0.3.1
  [4822a80a] FlashAttentionWrapper v0.1.0
  [bb33d45b] LuxCore v1.2.4
  [872c559c] NNlib v0.9.30
  [6099a3de] PythonCall v0.9.25
  [e88e6eb3] Zygote v0.7.7
  [9a3f8284] Random v1.11.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_qOhMP2/Manifest.toml`
  [621f4979] AbstractFFTs v1.5.0
  [79e6a3ab] Adapt v4.3.0
  [a9b6321e] Atomix v1.1.1
  [ab4f0b2a] BFloat16s v0.5.1
  [fa961155] CEnum v0.5.0
  [052768ef] CUDA v5.8.1
  [1af6417a] CUDA_Runtime_Discovery v0.3.5
  [082447d4] ChainRules v1.72.3
  [d360d2e6] ChainRulesCore v1.25.1
  [3da002f7] ColorTypes v0.12.1
  [5ae59095] Colors v0.13.1
  [bbf7d656] CommonSubexpressions v0.3.1
  [34da2185] Compat v4.16.0
  [992eb4ea] CondaPkg v0.2.29
  [187b0558] ConstructionBase v1.5.8
  [a8cc5b0e] Crayons v4.1.1
  [53c2dc0f] DLPack v0.3.1
  [9a962f9c] DataAPI v1.16.0
  [a93c6f00] DataFrames v1.7.0
  [864edb3b] DataStructures v0.18.22
  [e2d170a0] DataValueInterfaces v1.0.0
  [163ba53b] DiffResults v1.1.0
  [b552c78f] DiffRules v1.15.1
  [8d63f2c5] DispatchDoctor v0.4.19
  [ffbed154] DocStringExtensions v0.9.4
  [e2ba6199] ExprTools v0.1.10
  [1a297f60] FillArrays v1.13.0
  [53c48c17] FixedPointNumbers v0.8.5
  [4822a80a] FlashAttentionWrapper v0.1.0
  [f6369f11] ForwardDiff v1.0.1
  [0c68f7d7] GPUArrays v11.2.2
  [46192b85] GPUArraysCore v0.2.0
  [61eb1bfa] GPUCompiler v1.5.1
  [096a3bc2] GPUToolbox v0.2.0
  [076d061b] HashArrayMappedTries v0.2.0
  [7869d1d1] IRTools v0.4.14
  [842dd82b] InlineStrings v1.4.3
  [41ab1584] InvertedIndices v1.3.1
  [92d709cd] IrrationalConstants v0.2.4
  [82899510] IteratorInterfaceExtensions v1.0.0
  [692b3bcd] JLLWrappers v1.7.0
  [0f8b85d8] JSON3 v1.14.3
  [63c18a36] KernelAbstractions v0.9.34
  [929cbde3] LLVM v9.4.0
  [8b046642] LLVMLoopInfo v1.0.0
  [b964fa9f] LaTeXStrings v1.4.0
  [2ab3a3ac] LogExpFunctions v0.3.29
  [bb33d45b] LuxCore v1.2.4
  [1914dd2f] MacroTools v0.5.16
  [0b3b1443] MicroMamba v0.1.14
  [e1d29d7a] Missings v1.2.0
  [872c559c] NNlib v0.9.30
  [5da4648a] NVTX v1.0.0
  [77ba4419] NaNMath v1.1.3
  [bac558e1] OrderedCollections v1.8.1
  [69de0a69] Parsers v2.8.3
  [fa939f87] Pidfile v1.3.0
  [2dfb63ee] PooledArrays v1.4.3
  [aea7be01] PrecompileTools v1.3.2
  [21216c6a] Preferences v1.4.3
  [08abe8d2] PrettyTables v2.4.0
  [6099a3de] PythonCall v0.9.25
  [74087812] Random123 v1.7.1
  [e6cf234a] RandomNumbers v1.6.0
  [c1ae055f] RealDot v0.1.0
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.1
  [7e506255] ScopedValues v1.3.0
  [6c6a2e73] Scratch v1.2.1
  [91c51154] SentinelArrays v1.4.8
  [a2af1166] SortingAlgorithms v1.2.1
  [dc90abb0] SparseInverseSubset v0.1.2
  [276daf66] SpecialFunctions v2.5.1
  [90137ffa] StaticArrays v1.9.13
  [1e83bf80] StaticArraysCore v1.4.3
  [10745b16] Statistics v1.11.1
  [892a3eda] StringManipulation v0.4.1
  [09ab397b] StructArrays v0.7.1
  [856f2bd8] StructTypes v1.11.0
  [3783bdb8] TableTraits v1.0.1
  [bd369af6] Tables v1.12.0
  [e689c965] Tracy v0.1.4
  [013be700] UnsafeAtomics v0.3.0
  [e17b2a0c] UnsafePointers v1.0.0
  [e88e6eb3] Zygote v0.7.7
  [700de1a5] ZygoteRules v0.2.7
  [4ee394cb] CUDA_Driver_jll v0.13.0+0
  [76a88914] CUDA_Runtime_jll v0.17.0+0
  [9c1d0b0a] JuliaNVTXCallbacks_jll v0.2.1+0
  [dad2f222] LLVMExtra_jll v0.0.36+0
  [ad6e5548] LibTracyClient_jll v0.9.1+6
  [e98f9f5b] NVTX_jll v3.1.1+0
  [efe28fd5] OpenSpecFun_jll v0.5.6+0
  [1e29f10c] demumble_jll v1.3.0+0
  [f8abcde7] micromamba_jll v1.5.8+0
  [4d7b5844] pixi_jll v0.41.3+0
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [ade2ca70] Dates v1.11.0
  [8ba89e20] Distributed v1.11.0
  [f43a241f] Downloads v1.7.0
  [7b1f6079] FileWatching v1.11.0
  [9fa8497b] Future v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [ac6e5ff7] JuliaSyntaxHighlighting v1.12.0
  [4af54fe1] LazyArtifacts v1.11.0
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.12.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [a63ad114] Mmap v1.11.0
  [ca575930] NetworkOptions v1.3.0
  [44cfe95a] Pkg v1.13.0
  [de0858da] Printf v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization v1.11.0
  [6462fe0b] Sockets v1.11.0
  [2f01184e] SparseArrays v1.12.0
  [f489334b] StyledStrings v1.11.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] LibCURL_jll v8.12.1+1
  [e37daf67] LibGit2_jll v1.9.0+0
  [29816b5a] LibSSH2_jll v1.11.3+1
  [14a3606d] MozillaCACerts_jll v2025.5.20
  [4536629a] OpenBLAS_jll v0.3.29+0
  [05823500] OpenLibm_jll v0.8.5+0
  [458c3c95] OpenSSL_jll v3.5.0+0
  [bea87d4a] SuiteSparse_jll v7.10.1+0
  [83775a58] Zlib_jll v1.3.1+2
  [8e850b90] libblastrampoline_jll v5.12.0+0
  [8e850ede] nghttp2_jll v1.65.0+0
  [3f19e933] p7zip_jll v17.5.0+2
     Testing Running tests...
Precompiling packages...
Info Given FlashAttentionWrapper was explicitly requested, output will be shown live
    CondaPkg Found dependencies: /home/pkgeval/.julia/packages/PythonCall/L4cjh/CondaPkg.toml
    CondaPkg Found dependencies: /home/pkgeval/.julia/packages/FlashAttentionWrapper/kDhOX/CondaPkg.toml
    CondaPkg Resolving changes
             + openssl
             + python
             + pytorch
    CondaPkg Initialising pixi
           │ /home/pkgeval/.julia/artifacts/cefba4912c2b400756d043a2563ef77a0088866b/bin/pixi
           │ init
           │ --format pixi
           └ /tmp/jl_qOhMP2/.CondaPkg
✔ Created /tmp/jl_qOhMP2/.CondaPkg/pixi.toml
    CondaPkg Wrote /tmp/jl_qOhMP2/.CondaPkg/pixi.toml
           │ [dependencies]
           │ openssl = ">=3, <3.6"
           │
           │ [dependencies.python]
           │ build = "*cpython*"
           │ channel = "conda-forge"
           │ version = ">=3.8,<4"
           │
           │ [dependencies.pytorch]
           │ channel = "pytorch"
           │ version = "*"
           │
           │ [project]
           │ description = "automatically generated by CondaPkg.jl"
           │ name = ".CondaPkg"
           │ channel-priority = "strict"
           │ platforms = ["linux-64"]
           └ channels = ["conda-forge", "pytorch"]
    CondaPkg Installing packages
           │ /home/pkgeval/.julia/artifacts/cefba4912c2b400756d043a2563ef77a0088866b/bin/pixi
           │ install
           └ --manifest-path /tmp/jl_qOhMP2/.CondaPkg/pixi.toml
✔ The default environment has been installed.
  595731.3 ms  ✓ FlashAttentionWrapper
  1 dependency successfully precompiled in 596 seconds. 62 already precompiled.
  1 dependency had output during precompilation:
┌ FlashAttentionWrapper
│  [Output was shown above]
└ /tmp/jl_qOhMP2/.CondaPkg/.pixi/envs/default/lib/python3.12/site-packages/torch/_subclasses/functional_tensor.py:295: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at /opt/conda/conda-bld/pytorch_1729647378361/work/torch/csrc/utils/tensor_numpy.cpp:84.)
    cpu = _conversion_method_template(device=torch.device("cpu"))
┌ Error: Python packages [torch and/or flash_attn_2_cuda] not found. Please install the dependencies by running FlashAttentionWrapper.install() and restart the REPL
└ @ FlashAttentionWrapper ~/.julia/packages/FlashAttentionWrapper/kDhOX/src/FlashAttentionWrapper.jl:11
Precompiling packages...
  1554.2 ms  ✓ StaticArrays → StaticArraysChainRulesCoreExt
  16551.0 ms  ✓ NNlib
  2 dependencies successfully precompiled in 18 seconds. 31 already precompiled.
Precompiling packages...
  29507.1 ms  ✓ PrettyTables
  1275.4 ms  ✓ NVTX_jll
  1202.1 ms  ✓ demumble_jll
  1439.5 ms  ✓ CUDA_Driver_jll
  1139.1 ms  ✓ JuliaNVTXCallbacks_jll
  4990.0 ms  ✓ LLVM → BFloat16sExt
  8564.6 ms  ✓ UnsafeAtomics → UnsafeAtomicsLLVM
  77210.3 ms  ✓ DataFrames
  4316.6 ms  ✓ CUDA_Runtime_jll
  2176.3 ms  ✓ NVTX
  11527.1 ms  ✓ GPUArrays
Info Given CUDA was explicitly requested, output will be shown live
WARNING: Constructor for type "Int" was extended in `CUDA` without explicit qualification or import.
  NOTE: Assumed "Int" refers to `Base.Int`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function Int end`.
  Hint: To silence the warning, qualify `Int` as `Base.Int` in the method signature or explicitly `import Base: Int`.
WARNING: Constructor for type "UInt" was extended in `CUDA` without explicit qualification or import.
  NOTE: Assumed "UInt" refers to `Base.UInt`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function UInt end`.
  Hint: To silence the warning, qualify `UInt` as `Base.UInt` in the method signature or explicitly `import Base: UInt`.
WARNING: Constructor for type "BroadcastStyle" was extended in `CUDA` without explicit qualification or import.
  NOTE: Assumed "BroadcastStyle" refers to `Broadcast.BroadcastStyle`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function BroadcastStyle end`.
  Hint: To silence the warning, qualify `BroadcastStyle` as `Broadcast.BroadcastStyle` in the method signature or explicitly `import Broadcast: BroadcastStyle`.
WARNING: Constructor for type "CuArray" was extended in `CUBLAS` without explicit qualification or import.
  NOTE: Assumed "CuArray" refers to `CUDA.CuArray`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function CuArray end`.
  Hint: To silence the warning, qualify `CuArray` as `CUDA.CuArray` in the method signature or explicitly `import CUDA: CuArray`.
WARNING: Constructor for type "SparseVector" was extended in `CUSPARSE` without explicit qualification or import.
  NOTE: Assumed "SparseVector" refers to `SparseArrays.SparseVector`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function SparseVector end`.
  Hint: To silence the warning, qualify `SparseVector` as `SparseArrays.SparseVector` in the method signature or explicitly `import SparseArrays: SparseVector`.
WARNING: Constructor for type "SparseMatrixCSC" was extended in `CUSPARSE` without explicit qualification or import.
  NOTE: Assumed "SparseMatrixCSC" refers to `SparseArrays.SparseMatrixCSC`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function SparseMatrixCSC end`.
  Hint: To silence the warning, qualify `SparseMatrixCSC` as `SparseArrays.SparseMatrixCSC` in the method signature or explicitly `import SparseArrays: SparseMatrixCSC`.
WARNING: Constructor for type "CuVector" was extended in `CUSPARSE` without explicit qualification or import.
  NOTE: Assumed "CuVector" refers to `CUDA.CuVector`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function CuVector end`.
  Hint: To silence the warning, qualify `CuVector` as `CUDA.CuVector` in the method signature or explicitly `import CUDA: CuVector`.
WARNING: Constructor for type "CuMatrix" was extended in `CUSOLVER` without explicit qualification or import.
  NOTE: Assumed "CuMatrix" refers to `CUDA.CuMatrix`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function CuMatrix end`.
  Hint: To silence the warning, qualify `CuMatrix` as `CUDA.CuMatrix` in the method signature or explicitly `import CUDA: CuMatrix`.
WARNING: Constructor for type "CuArray" was extended in `CUSOLVER` without explicit qualification or import.
  NOTE: Assumed "CuArray" refers to `CUDA.CuArray`. This behavior is deprecated and may differ in future versions.
  NOTE: This behavior may have differed in Julia versions prior to 1.12.
  Hint: If you intended to create a new generic function of the same name, use `function CuArray end`.
  Hint: To silence the warning, qualify `CuArray` as `CUDA.CuArray` in the method signature or explicitly `import CUDA: CuArray`.
  112905.0 ms  ✓ CUDA
  38458.6 ms  ✓ Atomix → AtomixCUDAExt
  13 dependencies successfully precompiled in 296 seconds. 90 already precompiled.
  1 dependency had output during precompilation:
┌ CUDA
│  [Output was shown above]
└
Precompiling packages...
  39217.8 ms  ✓ DLPack → CUDAExt
  1 dependency successfully precompiled in 40 seconds. 104 already precompiled.
Precompiling packages...
  40267.8 ms  ✓ NNlib → NNlibCUDAExt
  36599.9 ms  ✓ CUDA → ChainRulesCoreExt
  2 dependencies successfully precompiled in 79 seconds. 108 already precompiled.
Precompiling packages...
  2086.8 ms  ✓ ZygoteRules
  11200.7 ms  ✓ ChainRules
  3633.3 ms  ✓ SpecialFunctions → SpecialFunctionsChainRulesCoreExt
  91324.1 ms  ✓ Zygote
  4 dependencies successfully precompiled in 109 seconds. 62 already precompiled.
Precompiling packages...
  2622.1 ms  ✓ StructArrays → StructArraysStaticArraysExt
  1 dependency successfully precompiled in 3 seconds. 19 already precompiled.
Precompiling packages...
  3852.1 ms  ✓ StructArrays → StructArraysGPUArraysCoreExt
  1 dependency successfully precompiled in 4 seconds. 36 already precompiled.
Precompiling packages...
  3150.4 ms  ✓ NNlib → NNlibSpecialFunctionsExt
  1 dependency successfully precompiled in 4 seconds. 42 already precompiled.
Precompiling packages...
  41145.1 ms  ✓ CUDA → SpecialFunctionsExt
  1 dependency successfully precompiled in 42 seconds. 109 already precompiled.
Precompiling packages...
  4501.4 ms  ✓ NNlib → NNlibForwardDiffExt
  4898.3 ms  ✓ ForwardDiff → ForwardDiffStaticArraysExt
  2 dependencies successfully precompiled in 10 seconds. 48 already precompiled.
Precompiling packages...
  10372.7 ms  ✓ Zygote → ZygoteColorsExt
  1 dependency successfully precompiled in 11 seconds. 71 already precompiled.
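Note: the constructor-extension warnings above carry their own remedy (qualify the owning module, or import the name explicitly). The following is a minimal, hypothetical Julia sketch of that pattern, not taken from CUDA's source; the `Answer` type is invented purely for illustration.

    # Sketch only: two toy modules showing the unqualified vs. qualified form.
    module Unqualified
    struct Answer end
    # Adds a method to Base's `Int` constructor without qualification;
    # on Julia 1.12+ this emits the "Constructor for type ... was extended
    # without explicit qualification or import" warning seen above.
    Int(::Answer) = 42
    end

    module Qualified
    struct Answer end
    # Qualifying the owner (or writing `import Base: Int`) silences the
    # warning, exactly as the hint suggests.
    Base.Int(::Answer) = 42
    end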
     Testing FlashAttentionWrapper tests passed
Testing completed after 1269.36s
PkgEval succeeded after 1562.69s
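Note: two of the failures recorded in this log are recoverable according to the messages themselves: the libcufft download died with a transient-looking stream error during `pixi install`, and FlashAttentionWrapper's precompile output asked for its Python dependencies to be installed. A hypothetical recovery session is sketched below; `CondaPkg.resolve(force=...)` matches the signature visible in the stack trace, `FlashAttentionWrapper.install()` is quoted from the error message, and `CondaPkg.add` is assumed from CondaPkg's public API rather than shown anywhere in this log.

    using CondaPkg

    # Re-run the Conda/pixi resolution; a forced resolve re-invokes
    # `pixi install` and retries the failed libcufft fetch.
    CondaPkg.resolve(force = true)

    # Optional (assumption): torch warned "Failed to initialize NumPy:
    # No module named 'numpy'"; declaring numpy as a Conda dependency
    # would address that warning.
    CondaPkg.add("numpy")

    # Quoted from the error at FlashAttentionWrapper.jl:11:
    # "Please install the dependencies by running FlashAttentionWrapper.install()
    #  and restart the REPL"
    using FlashAttentionWrapper
    FlashAttentionWrapper.install()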