Package evaluation of NDTensors on Julia 1.13.0-DEV.449 (8d6b63c3a6*) started at 2025-04-29T06:42:02.876 ################################################################################ # Set-up # Installing PkgEval dependencies (TestEnv)... Set-up completed after 8.16s ################################################################################ # Installation # Installing NDTensors... Resolving package versions... Updating `~/.julia/environments/v1.13/Project.toml` [23ae76d9] + NDTensors v0.4.8 Updating `~/.julia/environments/v1.13/Manifest.toml` [7d9f7c33] + Accessors v0.1.42 [79e6a3ab] + Adapt v4.3.0 [dce04be8] + ArgCheck v2.5.0 [4fba245c] + ArrayInterface v7.18.0 [4c555306] + ArrayLayouts v1.11.1 [198e06fe] + BangBang v0.4.4 [9718e550] + Baselet v0.1.1 [8e7c35d0] + BlockArrays v1.6.3 [f70d9fcc] + CommonWorldInvalidations v1.0.0 [34da2185] + Compat v4.16.0 [a33af91c] + CompositionsBase v0.1.2 [187b0558] + ConstructionBase v1.5.8 [9a962f9c] + DataAPI v1.16.0 [e2d170a0] + DataValueInterfaces v1.0.0 [244e2a9f] + DefineSingletons v0.1.2 [85a47980] + Dictionaries v0.4.5 [da5c29d0] + EllipsisNotation v1.8.0 [e2ba6199] + ExprTools v0.1.10 [e189563c] + ExternalDocstrings v0.1.1 [1a297f60] + FillArrays v1.13.0 [41a02a25] + Folds v0.2.10 [d9f16b24] + Functors v0.5.2 [f0d1745a] + HalfIntegers v1.6.0 [615f187c] + IfElse v0.1.1 [313cdc1a] + Indexing v1.1.1 [22cec73e] + InitialValues v0.3.1 [842dd82b] + InlineStrings v1.4.3 [3587e190] + InverseFunctions v0.1.17 [82899510] + IteratorInterfaceExtensions v1.0.0 [1914dd2f] + MacroTools v0.5.16 [128add7d] + MicroCollections v0.2.0 [23ae76d9] + NDTensors v0.4.8 [bac558e1] + OrderedCollections v1.8.0 [65ce6f38] + PackageExtensionCompat v1.0.2 [aea7be01] + PrecompileTools v1.3.2 [21216c6a] + Preferences v1.4.3 [42d2dcc6] + Referenceables v0.1.3 [ae029012] + Requires v1.3.1 [efcf1570] + Setfield v1.1.2 [699a6c99] + SimpleTraits v0.9.4 [03a91e81] + SplitApplyCombine v1.2.3 [171d559e] + SplittablesBase v0.1.15 [aedffcd0] + Static v1.2.0 [0d7ed370] + StaticArrayInterface v1.8.0 [90137ffa] + StaticArrays v1.9.13 [1e83bf80] + StaticArraysCore v1.4.3 [5e0ebb24] + Strided v2.3.0 [4db3bf67] + StridedViews v0.4.1 [3783bdb8] + TableTraits v1.0.1 [bd369af6] + Tables v1.12.0 [24d252fe] + ThreadedScans v0.1.0 [a759f4b9] + TimerOutputs v0.5.28 [28d57a85] + Transducers v0.4.84 [9d95972d] + TupleTools v1.6.0 [7e5a90cf] + TypeParameterAccessors v0.3.10 [409d34a3] + VectorInterface v0.5.0 [56f22d72] + Artifacts v1.11.0 [2a0f44e3] + Base64 v1.11.0 [ade2ca70] + Dates v1.11.0 [8ba89e20] + Distributed v1.11.0 [9fa8497b] + Future v1.11.0 [b77e0a4c] + InteractiveUtils v1.11.0 [ac6e5ff7] + JuliaSyntaxHighlighting v1.12.0 [8f399da3] + Libdl v1.11.0 [37e2e46d] + LinearAlgebra v1.12.0 [56ddb016] + Logging v1.11.0 [d6f4376e] + Markdown v1.11.0 [de0858da] + Printf v1.11.0 [9a3f8284] + Random v1.11.0 [ea8e919c] + SHA v0.7.0 [9e88b42a] + Serialization v1.11.0 [6462fe0b] + Sockets v1.11.0 [2f01184e] + SparseArrays v1.12.0 [f489334b] + StyledStrings v1.11.0 [fa267f1f] + TOML v1.0.3 [8dfed614] + Test v1.11.0 [cf7118a7] + UUIDs v1.11.0 [4ec0a83e] + Unicode v1.11.0 [e66e0078] + CompilerSupportLibraries_jll v1.3.0+1 [4536629a] + OpenBLAS_jll v0.3.29+0 [bea87d4a] + SuiteSparse_jll v7.10.1+0 [8e850b90] + libblastrampoline_jll v5.12.0+0 Installation completed after 3.53s ################################################################################ # Precompilation # Precompiling PkgEval dependencies... Precompiling package dependencies... 
ERROR: LoadError: The following 1 direct dependency failed to precompile: Zygote Failed to precompile Zygote [e88e6eb3-aa80-5325-afca-941959d7151f] to "/home/pkgeval/.julia/compiled/v1.13/Zygote/jl_J70UUl" (ProcessExited(1)). WARNING: Constructor for type "Variable" was extended in `Zygote` without explicit qualification or import. NOTE: Assumed "Variable" refers to `IRTools.Variable`. This behavior is deprecated and may differ in future versions. NOTE: This behavior may have differed in Julia versions prior to 1.12. Hint: If you intended to create a new generic function of the same name, use `function Variable end`. Hint: To silence the warning, qualify `Variable` as `IRTools.Variable` in the method signature or explicitly `import IRTools: Variable`. WARNING: Imported binding InteractiveUtils.typesof was undeclared at import time during import to Zygote. ERROR: LoadError: UndefVarError: `typesof` not defined in `Zygote` Suggestion: this global was defined as `InteractiveUtils.typesof` but not assigned a value. Stacktrace: [1] top-level scope @ ~/.julia/packages/Zygote/HdT4O/src/precompile.jl:13 [2] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:304 [3] top-level scope @ ~/.julia/packages/Zygote/HdT4O/src/Zygote.jl:85 [4] macro expansion @ ~/.julia/packages/PrecompileTools/Z8SWe/src/workloads.jl:73 [inlined] [5] macro expansion @ ~/.julia/packages/Zygote/HdT4O/src/Zygote.jl:68 [inlined] [6] include(mod::Module, _path::String) @ Base ./Base.jl:303 [7] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt128}}, source::Nothing) @ Base ./loading.jl:2978 [8] top-level scope @ stdin:5 [9] eval(m::Module, e::Any) @ Core ./boot.jl:489 [10] include_string(mapexpr::typeof(identity), mod::Module, code::String, filename::String) @ Base ./loading.jl:2824 [11] include_string @ ./loading.jl:2834 [inlined] [12] exec_options(opts::Base.JLOptions) @ Base ./client.jl:329 [13] _start() @ Base ./client.jl:563 in expression starting at /home/pkgeval/.julia/packages/Zygote/HdT4O/src/precompile.jl:13 in expression starting at /home/pkgeval/.julia/packages/Zygote/HdT4O/src/Zygote.jl:1 in expression starting at stdin: in expression starting at /PkgEval.jl/scripts/precompile.jl:37 Precompilation failed after 111.73s ################################################################################ # Testing # Testing NDTensors Status `/tmp/jl_cZGSjs/Project.toml` [79e6a3ab] Adapt v4.3.0 [4c555306] ArrayLayouts v1.11.1 [8e7c35d0] BlockArrays v1.6.3 [861a8166] Combinatorics v1.0.2 [85a47980] Dictionaries v0.4.5 [da5c29d0] EllipsisNotation v1.8.0 [1a297f60] FillArrays v1.13.0 [46192b85] GPUArraysCore v0.2.0 [27aeb0d3] JLArrays v0.2.0 [dbb5928d] MappedArrays v0.4.2 [23ae76d9] NDTensors v0.4.8 [6fd5a793] Octavian v0.3.28 [1bc83da4] SafeTestsets v0.1.0 [4db3bf67] StridedViews v0.4.1 [6aa20fa7] TensorOperations v5.2.0 [7e5a90cf] TypeParameterAccessors v0.3.10 [e88e6eb3] Zygote v0.7.6 [37e2e46d] LinearAlgebra v1.12.0 [44cfe95a] Pkg v1.12.0 [9a3f8284] Random v1.11.0 [2f01184e] SparseArrays v1.12.0 [8dfed614] Test v1.11.0 Status `/tmp/jl_cZGSjs/Manifest.toml` [621f4979] AbstractFFTs v1.5.0 [7d9f7c33] Accessors v0.1.42 [79e6a3ab] Adapt v4.3.0 [dce04be8] ArgCheck v2.5.0 [4fba245c] ArrayInterface v7.18.0 [4c555306] ArrayLayouts v1.11.1 [a9b6321e] Atomix v1.1.1 [198e06fe] BangBang v0.4.4 [9718e550] Baselet v0.1.1 [62783981] BitTwiddlingConvenienceFunctions v0.1.6 [8e7c35d0] 
BlockArrays v1.6.3 [fa961155] CEnum v0.5.0 [2a0fbf3d] CPUSummary v0.2.6 [082447d4] ChainRules v1.72.3 [d360d2e6] ChainRulesCore v1.25.1 [fb6a15b2] CloseOpenIntervals v0.1.13 [861a8166] Combinatorics v1.0.2 [bbf7d656] CommonSubexpressions v0.3.1 [f70d9fcc] CommonWorldInvalidations v1.0.0 [34da2185] Compat v4.16.0 [a33af91c] CompositionsBase v0.1.2 [187b0558] ConstructionBase v1.5.8 [adafc99b] CpuId v0.3.1 [9a962f9c] DataAPI v1.16.0 [e2d170a0] DataValueInterfaces v1.0.0 [244e2a9f] DefineSingletons v0.1.2 [85a47980] Dictionaries v0.4.5 [163ba53b] DiffResults v1.1.0 [b552c78f] DiffRules v1.15.1 [ffbed154] DocStringExtensions v0.9.4 [da5c29d0] EllipsisNotation v1.8.0 [e2ba6199] ExprTools v0.1.10 [e189563c] ExternalDocstrings v0.1.1 [1a297f60] FillArrays v1.13.0 [41a02a25] Folds v0.2.10 ⌃ [f6369f11] ForwardDiff v0.10.38 [d9f16b24] Functors v0.5.2 [0c68f7d7] GPUArrays v11.2.2 [46192b85] GPUArraysCore v0.2.0 [f0d1745a] HalfIntegers v1.6.0 [076d061b] HashArrayMappedTries v0.2.0 [3e5b6fbb] HostCPUFeatures v0.1.17 [7869d1d1] IRTools v0.4.14 [615f187c] IfElse v0.1.1 [313cdc1a] Indexing v1.1.1 [22cec73e] InitialValues v0.3.1 [842dd82b] InlineStrings v1.4.3 [3587e190] InverseFunctions v0.1.17 [92d709cd] IrrationalConstants v0.2.4 [82899510] IteratorInterfaceExtensions v1.0.0 [27aeb0d3] JLArrays v0.2.0 [692b3bcd] JLLWrappers v1.7.0 [63c18a36] KernelAbstractions v0.9.34 [929cbde3] LLVM v9.3.1 [8ac3fa9e] LRUCache v1.6.2 [10f19ff3] LayoutPointers v0.1.17 [2ab3a3ac] LogExpFunctions v0.3.29 [bdcacae8] LoopVectorization v0.12.172 [1914dd2f] MacroTools v0.5.16 [d125e4d3] ManualMemory v0.1.8 [dbb5928d] MappedArrays v0.4.2 [128add7d] MicroCollections v0.2.0 [23ae76d9] NDTensors v0.4.8 [77ba4419] NaNMath v1.1.3 [6fd5a793] Octavian v0.3.28 [6fe1bfb0] OffsetArrays v1.17.0 [bac558e1] OrderedCollections v1.8.0 [65ce6f38] PackageExtensionCompat v1.0.2 [1d0040c9] PolyesterWeave v0.2.2 [aea7be01] PrecompileTools v1.3.2 [21216c6a] Preferences v1.4.3 [43287f4e] PtrArrays v1.3.0 [c1ae055f] RealDot v0.1.0 [189a3867] Reexport v1.2.2 [42d2dcc6] Referenceables v0.1.3 [ae029012] Requires v1.3.1 [94e857df] SIMDTypes v0.1.0 [476501e8] SLEEFPirates v0.6.43 [1bc83da4] SafeTestsets v0.1.0 [7e506255] ScopedValues v1.3.0 [efcf1570] Setfield v1.1.2 [699a6c99] SimpleTraits v0.9.4 [dc90abb0] SparseInverseSubset v0.1.2 [276daf66] SpecialFunctions v2.5.1 [03a91e81] SplitApplyCombine v1.2.3 [171d559e] SplittablesBase v0.1.15 [aedffcd0] Static v1.2.0 [0d7ed370] StaticArrayInterface v1.8.0 [90137ffa] StaticArrays v1.9.13 [1e83bf80] StaticArraysCore v1.4.3 [10745b16] Statistics v1.11.1 [5e0ebb24] Strided v2.3.0 [4db3bf67] StridedViews v0.4.1 [09ab397b] StructArrays v0.7.1 [3783bdb8] TableTraits v1.0.1 [bd369af6] Tables v1.12.0 [6aa20fa7] TensorOperations v5.2.0 [24d252fe] ThreadedScans v0.1.0 [8290d209] ThreadingUtilities v0.5.3 [a759f4b9] TimerOutputs v0.5.28 [28d57a85] Transducers v0.4.84 [9d95972d] TupleTools v1.6.0 [7e5a90cf] TypeParameterAccessors v0.3.10 [3a884ed6] UnPack v1.0.2 [013be700] UnsafeAtomics v0.3.0 [409d34a3] VectorInterface v0.5.0 [3d5dd08c] VectorizationBase v0.21.71 [e88e6eb3] Zygote v0.7.6 [700de1a5] ZygoteRules v0.2.7 [dad2f222] LLVMExtra_jll v0.0.35+0 [efe28fd5] OpenSpecFun_jll v0.5.6+0 [0dad84c5] ArgTools v1.1.2 [56f22d72] Artifacts v1.11.0 [2a0f44e3] Base64 v1.11.0 [ade2ca70] Dates v1.11.0 [8ba89e20] Distributed v1.11.0 [f43a241f] Downloads v1.7.0 [7b1f6079] FileWatching v1.11.0 [9fa8497b] Future v1.11.0 [b77e0a4c] InteractiveUtils v1.11.0 [ac6e5ff7] JuliaSyntaxHighlighting v1.12.0 [4af54fe1] LazyArtifacts v1.11.0 
[b27032c2] LibCURL v0.6.4 [76f85450] LibGit2 v1.11.0 [8f399da3] Libdl v1.11.0 [37e2e46d] LinearAlgebra v1.12.0 [56ddb016] Logging v1.11.0 [d6f4376e] Markdown v1.11.0 [ca575930] NetworkOptions v1.3.0 [44cfe95a] Pkg v1.12.0 [de0858da] Printf v1.11.0 [9a3f8284] Random v1.11.0 [ea8e919c] SHA v0.7.0 [9e88b42a] Serialization v1.11.0 [6462fe0b] Sockets v1.11.0 [2f01184e] SparseArrays v1.12.0 [f489334b] StyledStrings v1.11.0 [4607b0f0] SuiteSparse [fa267f1f] TOML v1.0.3 [a4e569a6] Tar v1.10.0 [8dfed614] Test v1.11.0 [cf7118a7] UUIDs v1.11.0 [4ec0a83e] Unicode v1.11.0 [e66e0078] CompilerSupportLibraries_jll v1.3.0+1 [deac9b47] LibCURL_jll v8.12.1+1 [e37daf67] LibGit2_jll v1.9.0+0 [29816b5a] LibSSH2_jll v1.11.3+1 [14a3606d] MozillaCACerts_jll v2024.12.31 [4536629a] OpenBLAS_jll v0.3.29+0 [05823500] OpenLibm_jll v0.8.5+0 [458c3c95] OpenSSL_jll v3.0.16+0 [bea87d4a] SuiteSparse_jll v7.10.1+0 [83775a58] Zlib_jll v1.3.1+2 [8e850b90] libblastrampoline_jll v5.12.0+0 [8e850ede] nghttp2_jll v1.65.0+0 [3f19e933] p7zip_jll v17.5.0+2 Info Packages marked with ⌃ have new versions available and may be upgradable. Testing Running tests... Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_blocksparse.jl Precompiling packages... 7981.6 ms ✓ TypeParameterAccessors → TypeParameterAccessorsJLArraysExt 1 dependency successfully precompiled in 8 seconds. 55 already precompiled. Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_combiner.jl Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_dense.jl DenseTensor basic functionality: Error During Test at /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_dense.jl:148 Test threw exception Expression: Array(J * K) ≈ Array(J) * Array(K) Scalar indexing is disallowed. Invocation of getindex resulted in scalar indexing of a GPU array. This is typically caused by calling an iterating implementation of a method. Such implementations *do not* execute on the GPU, but very slowly on the CPU, and therefore should be avoided. If you want to allow scalar iteration, use `allowscalar` or `@allowscalar` to enable scalar iteration globally or for the operations in question. 
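All six errored tests in this log fail the same way: on this Julia nightly, LinearAlgebra's small-matrix (2x2/3x3) and generic matmul fallbacks index their operands elementwise, which JLArray, like other GPU array types, disallows here. A minimal sketch, not taken from the test suite, of the failure and of the `@allowscalar` escape hatch mentioned in the message, using the packages from the test environment above (exact behavior depends on the Julia and GPUArrays versions):

using JLArrays, LinearAlgebra, GPUArraysCore

GPUArraysCore.allowscalar(false)  # the test environment runs with scalar indexing disallowed

J = JLArray(rand(2, 2))
K = JLArray(rand(2, 2))

# J * K  # on Julia 1.13-DEV this reaches the 2x2 matmul fallback, which calls
#        # getindex on the JLArrays and throws the error reported above

JK = GPUArraysCore.@allowscalar J * K     # escape hatch; fine for tests, slow in general
@assert Array(JK) ≈ Array(J) * Array(K)   # compare on the CPU, as the test intends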
Stacktrace: [1] error(s::String) @ Base ./error.jl:44 [2] errorscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151 [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124 [4] assertscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112 [5] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:50 [inlined] [6] scalar_getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:36 [inlined] [7] _getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:19 [inlined] [8] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:17 [inlined] [9] __matmul2x2_elements(tA::Char, A::JLArrays.JLArray{Float64, 2}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1091 [10] __matmul2x2_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1120 [inlined] [11] _matmul2x2_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1085 [inlined] [12] matmul2x2or3x3_nonzeroalpha!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Bool, β::Bool) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:477 [13] generic_matmatmul_wrapper!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Bool, β::Bool, val::Val{LinearAlgebra.BlasFlag.GEMM}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:498 [14] _mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:328 [inlined] [15] mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:297 [inlined] [16] mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:265 [inlined] [17] * @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:136 [inlined] [18] *(T1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, T2::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/linearalgebra/linearalgebra.jl:16 [19] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/test_dense.jl:15 [20] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined] [21] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/test_dense.jl:15 [inlined] [22] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [23] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/test_dense.jl:19 [inlined] [24] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined] [25] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/test_dense.jl:148 [inlined] [26] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:728 [inlined] WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in initialize_task(Any) at /home/pkgeval/.julia/packages/ThreadingUtilities/nn4y1/src/ThreadingUtilities.jl Precompiling packages... 2121.0 ms ✓ StaticArrayInterface → StaticArrayInterfaceStaticArraysExt 1 dependency successfully precompiled in 2 seconds. 20 already precompiled. WARNING: llvmcall with integer pointers is deprecated. 
Use actual pointers instead, replacing i32 or i64 with i8* or ptr in (::Octavian.LoopMulFunc{P, TC, TA, TB, Α, Β, Md, Kd, Nd, W₁, W₂, R₁, R₂})(Ptr{UInt64}) where {P, TC, TA, TB, Α, Β, Md, Kd, Nd, W₁, W₂, R₁, R₂} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/funcptrs.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in _turbo_!(Base.Val{var"#UNROLL#"}, Base.Val{var"#OPS#"}, Base.Val{var"#ARF#"}, Base.Val{var"#AM#"}, Base.Val{var"#LPSYM#"}, Base.Val{Tuple{var"#LB#", var"#V#"}}, Vararg{Any, var"#num#vargs#"}) where {var"#UNROLL#", var"#OPS#", var"#ARF#", var"#AM#", var"#LPSYM#", var"#LB#", var"#V#", var"#num#vargs#"} at /home/pkgeval/.julia/packages/LoopVectorization/ImqiY/src/reconstruct_loopset.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in matmul_st_only_pack_A!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, Any, Any, Any, Any, Any, Static.StaticFloat64{W₁}, Static.StaticFloat64{W₂}, Static.StaticFloat64{R₁}, Static.StaticFloat64{R₂}) where {T, W₁, W₂, R₁, R₂} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in packaloopmul!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, Any, Any, Any, Any, Any) where {T} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/macrokernels.jl WARNING: llvmcall with integer pointers is deprecated. 
Use actual pointers instead, replacing i32 or i64 with i8* or ptr in (::Octavian.SyncMulFunc{TC, TA, TB, Α, Β, Md, Kd, Nd, BCP, ID, TT, W₁, W₂, R₁, R₂})(Ptr{UInt64}) where {TC, TA, TB, Α, Β, Md, Kd, Nd, BCP, ID, TT, W₁, W₂, R₁, R₂} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/funcptrs.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in sync_mul!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, Any, Any, Any, Any, Any, Ptr{UInt32}, Ptr{T} where T, UInt64, UInt64, Static.StaticFloat64{W₁}, Static.StaticFloat64{W₂}, Static.StaticFloat64{R₁}, Static.StaticFloat64{R₂}) where {T, W₁, W₂, R₁, R₂} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in unsafe_copyto_turbo!(Any, Any, Any, Any) at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/utils.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in __matmul!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) where {T} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. 
Use actual pointers instead, replacing i32 or i64 with i8* or ptr in matmulsplitn!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, Any, Any, Any, Any, Static.StaticInt{Mc}, Any, Any, Any, Any, Base.Val{PACK}, Any, Any, Any, Any) where {T, Mc, PACK} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in waitonmultasks(Any, Any) at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in checktask(Any) at /home/pkgeval/.julia/packages/ThreadingUtilities/nn4y1/src/threadtasks.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in matmul_pack_A_and_B!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, Any, Any, Any, Any, Any, Any, Static.StaticFloat64{W₁}, Static.StaticFloat64{W₂}, Static.StaticFloat64{R₁}, Static.StaticFloat64{R₂}) where {T, W₁, W₂, R₁, R₂} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. 
Use actual pointers instead, replacing i32 or i64 with i8* or ptr in matmul_st_pack_A_and_B!(LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, LayoutPointers.AbstractStridedPointer{T, N, C, B, R, X, O} where O<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where X<:Tuple{Vararg{Union{Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Static.StaticInt{N} where N}, N}} where R where B where C where N where T, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) where {T} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in matmul_only_β!(AbstractArray{T, 2} where T, Any) at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl WARNING: llvmcall with integer pointers is deprecated. Use actual pointers instead, replacing i32 or i64 with i8* or ptr in matmul_only_β!(AbstractArray{T, 2}, Static.StaticInt{0}) where {T} at /home/pkgeval/.julia/packages/Octavian/LeRg7/src/matmul.jl Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diag.jl DiagTensor contractions: Error During Test at /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diag.jl:115 Test threw exception Expression: contract(A, (1, -2), t, (-2, 3)) == A Scalar indexing is disallowed. Invocation of getindex resulted in scalar indexing of a GPU array. This is typically caused by calling an iterating implementation of a method. Such implementations *do not* execute on the GPU, but very slowly on the CPU, and therefore should be avoided. If you want to allow scalar iteration, use `allowscalar` or `@allowscalar` to enable scalar iteration globally or for the operations in question. 
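The repeated "llvmcall with integer pointers is deprecated" warnings above come from ThreadingUtilities, LoopVectorization and Octavian, whose embedded LLVM IR still receives `Ptr` arguments as integers and rebuilds them with `inttoptr`. A hypothetical minimal sketch of that pattern and of the suggested replacement (not code from those packages; it assumes a recent Julia where `Ptr` arguments reach `llvmcall` as LLVM `ptr` values):

# Deprecated pattern the warning refers to: the IR treats the pointer argument
# as an integer, e.g.
#     %p = inttoptr i64 %0 to i64*
#     %v = load i64, i64* %p, align 8
# Replacement: use the argument directly as `ptr`, as the warning suggests.
load_u64(p::Ptr{UInt64}) = Base.llvmcall(
    """
    %v = load i64, ptr %0, align 8
    ret i64 %v
    """,
    UInt64, Tuple{Ptr{UInt64}}, p)

x = Ref{UInt64}(42)
GC.@preserve x begin
    @assert load_u64(Base.unsafe_convert(Ptr{UInt64}, x)) == 42
end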
Stacktrace: [1] error(s::String) @ Base ./error.jl:44 [2] errorscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151 [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124 [4] assertscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112 [5] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:50 [inlined] [6] scalar_getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:36 [inlined] [7] _getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:19 [inlined] [8] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:17 [inlined] [9] __matmul3x3_elements(tA::Char, A::JLArrays.JLArray{Float64, 2}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1153 [10] __matmul3x3_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1190 [inlined] [11] _matmul3x3_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1147 [inlined] [12] matmul3x3!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1213 [13] matmul2x2or3x3_nonzeroalpha! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:470 [inlined] [14] generic_matmatmul_wrapper!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64, val::Val{LinearAlgebra.BlasFlag.GEMM}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:498 [15] _mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:328 [inlined] [16] mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:297 [inlined] [17] mul! @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/src/functions/mul.jl:2 [inlined] [18] mul!! @ ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/mul.jl:2 [inlined] [19] mul!! @ ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/mul.jl:10 [inlined] [20] _contract!(CT::JLArrays.JLArray{Float64, 2}, AT::JLArrays.JLArray{Float64, 2}, BT::JLArrays.JLArray{Float64, 2}, props::NDTensors.ContractionProperties{2, 2, 2}, α::Bool, β::Bool) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/tensoralgebra/contract.jl:174 [21] _contract!(CT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, AT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, BT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, props::NDTensors.ContractionProperties{2, 2, 2}, α::Bool, β::Bool) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/dense/tensoralgebra/contract.jl:230 [22] contract! @ ~/.julia/packages/NDTensors/1oqMy/src/dense/tensoralgebra/contract.jl:213 [inlined] [23] contract! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:165 [inlined] [24] #contract!#1 @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:21 [inlined] [25] contract! @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:7 [inlined] [26] contract! 
(repeats 2 times) @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:37 [inlined] [27] _contract!!(output_tensor::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labelsoutput_tensor::Tuple{Int64, Int64}, tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor2::Tuple{Int64, Int64}, α::Int64, β::Int64) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:143 [28] _contract!! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:131 [inlined] [29] contract!! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:219 [inlined] [30] contract!! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:188 [inlined] [31] contract(tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor2::Tuple{Int64, Int64}, labelsoutput_tensor::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:113 [32] contract(::Type{NDTensors.CanContract{NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}}}, tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor2::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:91 [33] contract(tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor2::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/SimpleTraits/l1ZsK/src/SimpleTraits.jl:331 [34] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:728 [inlined] [35] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/test_diag.jl:115 [inlined] [36] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [37] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/test_diag.jl:1945 DiagTensor contractions: Error During Test at /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diag.jl:116 Test threw exception Expression: contract(A, (-2, 1), t, (-2, 3)) == transpose(A) Scalar indexing is disallowed. Invocation of getindex resulted in scalar indexing of a GPU array. This is typically caused by calling an iterating implementation of a method. Such implementations *do not* execute on the GPU, but very slowly on the CPU, and therefore should be avoided. 
If you want to allow scalar iteration, use `allowscalar` or `@allowscalar` to enable scalar iteration globally or for the operations in question. Stacktrace: [1] error(s::String) @ Base ./error.jl:44 [2] errorscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151 [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124 [4] assertscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112 [5] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:50 [inlined] [6] scalar_getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:36 [inlined] [7] _getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:19 [inlined] [8] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:17 [inlined] [9] __matmul3x3_elements(tA::Char, A::JLArrays.JLArray{Float64, 2}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1158 [10] __matmul3x3_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1190 [inlined] [11] _matmul3x3_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1147 [inlined] [12] matmul3x3!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1213 [13] matmul2x2or3x3_nonzeroalpha! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:470 [inlined] [14] generic_matmatmul_wrapper!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64, val::Val{LinearAlgebra.BlasFlag.SYRK}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:498 [15] _mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:328 [inlined] [16] mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:297 [inlined] [17] mul! @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsJLArraysExt/mul.jl:12 [inlined] [18] mul!!(CM::LinearAlgebra.Transpose{Float64, JLArrays.JLArray{Float64, 2}}, AM::LinearAlgebra.Transpose{Float64, JLArrays.JLArray{Float64, 2}}, BM::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/mul.jl:2 [19] _contract!(CT::JLArrays.JLArray{Float64, 2}, AT::JLArrays.JLArray{Float64, 2}, BT::JLArrays.JLArray{Float64, 2}, props::NDTensors.ContractionProperties{2, 2, 2}, α::Bool, β::Bool) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/tensoralgebra/contract.jl:174 [20] _contract!(CT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, AT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, BT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, props::NDTensors.ContractionProperties{2, 2, 2}, α::Bool, β::Bool) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/dense/tensoralgebra/contract.jl:230 [21] contract! @ ~/.julia/packages/NDTensors/1oqMy/src/dense/tensoralgebra/contract.jl:213 [inlined] [22] contract! 
@ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:165 [inlined] [23] #contract!#1 @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:21 [inlined] [24] contract! @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:7 [inlined] [25] contract! (repeats 2 times) @ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:37 [inlined] [26] _contract!!(output_tensor::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labelsoutput_tensor::Tuple{Int64, Int64}, tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor2::Tuple{Int64, Int64}, α::Int64, β::Int64) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:143 [27] _contract!! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:131 [inlined] [28] contract!! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:219 [inlined] [29] contract!! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:188 [inlined] [30] contract(tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labelstensor2::Tuple{Int64, Int64}, labelsoutput_tensor::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:113 [31] contract(::Type{NDTensors.CanContract{NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}}}, tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor2::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:91 [32] contract(tensor1::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor1::Tuple{Int64, Int64}, tensor2::NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, JLArrays.JLArray{Float64, 1}}}, labels_tensor2::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/SimpleTraits/l1ZsK/src/SimpleTraits.jl:331 [33] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:728 [inlined] [34] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/test_diag.jl:116 [inlined] [35] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [36] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/test_diag.jl:1945 Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diagblocksparse.jl DiagBlockSparse contract: Error During Test at /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diagblocksparse.jl:60 Got exception outside of a @test Scalar 
indexing is disallowed. Invocation of getindex resulted in scalar indexing of a GPU array. This is typically caused by calling an iterating implementation of a method. Such implementations *do not* execute on the GPU, but very slowly on the CPU, and therefore should be avoided. If you want to allow scalar iteration, use `allowscalar` or `@allowscalar` to enable scalar iteration globally or for the operations in question. Stacktrace: [1] error(s::String) @ Base ./error.jl:44 [2] errorscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151 [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124 [4] assertscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112 [5] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:50 [inlined] [6] scalar_getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:36 [inlined] [7] _getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:19 [inlined] [8] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:17 [inlined] [9] __matmul2x2_elements(tA::Char, A::JLArrays.JLArray{Float64, 2}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1091 [10] __matmul2x2_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1120 [inlined] [11] _matmul2x2_elements @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1085 [inlined] [12] matmul2x2!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1134 [13] matmul2x2or3x3_nonzeroalpha! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:466 [inlined] [14] generic_matmatmul_wrapper!(C::JLArrays.JLArray{Float64, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float64, 2}, B::JLArrays.JLArray{Float64, 2}, α::Float64, β::Float64, val::Val{LinearAlgebra.BlasFlag.SYRK}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:498 [15] _mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:328 [inlined] [16] mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:297 [inlined] [17] mul! 
@ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsJLArraysExt/mul.jl:12 [inlined] [18] mul!!(CM::LinearAlgebra.Transpose{Float64, JLArrays.JLArray{Float64, 2}}, AM::JLArrays.JLArray{Float64, 2}, BM::LinearAlgebra.Transpose{Float64, JLArrays.JLArray{Float64, 2}}, α::Float64, β::Float64) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/mul.jl:2 [19] _contract!(CT::JLArrays.JLArray{Float64, 2}, AT::JLArrays.JLArray{Float64, 2}, BT::JLArrays.JLArray{Float64, 2}, props::NDTensors.ContractionProperties{2, 2, 2}, α::Float64, β::Float64) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/abstractarray/tensoralgebra/contract.jl:174 [20] _contract!(CT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, AT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, BT::NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}, props::NDTensors.ContractionProperties{2, 2, 2}, α::Float64, β::Float64) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/dense/tensoralgebra/contract.jl:230 [21] contract! @ ~/.julia/packages/NDTensors/1oqMy/src/dense/tensoralgebra/contract.jl:213 [inlined] [22] contract! @ ~/.julia/packages/NDTensors/1oqMy/src/tensoroperations/generic_tensor_operations.jl:165 [inlined] [23] contract!(output_tensor::NDTensors.Expose.Exposed{JLArrays.JLArray{Float64, 1}, NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}}, labelsoutput_tensor::Tuple{Int64, Int64}, tensor1::NDTensors.Expose.Exposed{Float64, NDTensors.DiagTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Diag{Float64, Float64}}}, labelstensor1::Tuple{Int64, Int64}, tensor2::NDTensors.Expose.Exposed{JLArrays.JLArray{Float64, 1}, NDTensors.DenseTensor{Float64, 2, Tuple{Int64, Int64}, NDTensors.Dense{Float64, JLArrays.JLArray{Float64, 1}}}}, labelstensor2::Tuple{Int64, Int64}, α::Float64, β::Float64) @ NDTensorsGPUArraysCoreExt ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:61 [24] contract! 
@ ~/.julia/packages/NDTensors/1oqMy/ext/NDTensorsGPUArraysCoreExt/contract.jl:77 [inlined] [25] contract!(R::NDTensors.BlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.BlockSparse{Float64, JLArrays.JLArray{Float64, 1}, 2}}, labelsR::Tuple{Int64, Int64}, T1::NDTensors.BlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.BlockSparse{Float64, JLArrays.JLArray{Float64, 1}, 2}}, labelsT1::Tuple{Int64, Int64}, T2::NDTensors.DiagBlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.DiagBlockSparse{Float64, Float64, 2}}, labelsT2::Tuple{Int64, Int64}, contraction_plan::Vector{Tuple{NDTensors.Block{2}, NDTensors.Block{2}, NDTensors.Block{2}}}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/blocksparse/diagblocksparse.jl:671 [26] contract(T1::NDTensors.BlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.BlockSparse{Float64, JLArrays.JLArray{Float64, 1}, 2}}, labelsT1::Tuple{Int64, Int64}, T2::NDTensors.DiagBlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.DiagBlockSparse{Float64, Float64, 2}}, labelsT2::Tuple{Int64, Int64}, labelsR::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/blocksparse/diagblocksparse.jl:621 [27] contract(T1::NDTensors.BlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.BlockSparse{Float64, JLArrays.JLArray{Float64, 1}, 2}}, labelsT1::Tuple{Int64, Int64}, T2::NDTensors.DiagBlockSparseTensor{Float64, 2, Tuple{Vector{Int64}, Vector{Int64}}, NDTensors.DiagBlockSparse{Float64, Float64, 2}}, labelsT2::Tuple{Int64, Int64}) @ NDTensors ~/.julia/packages/NDTensors/1oqMy/src/blocksparse/diagblocksparse.jl:620 [28] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/test_diagblocksparse.jl:67 [inlined] [29] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [30] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/test_diagblocksparse.jl:1945 [31] eval(m::Module, e::Any) @ Core ./boot.jl:489 [32] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/test_diagblocksparse.jl:1 [33] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:304 [34] top-level scope @ ~/.julia/packages/SafeTestsets/raUNr/src/SafeTestsets.jl:4 [35] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined] [36] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:7 [inlined] [37] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined] [38] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:13 [inlined] [39] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [40] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:15 [inlined] [41] eval(m::Module, e::Any) @ Core ./boot.jl:489 [42] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:28 [43] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:304 [44] top-level scope @ none:6 [45] eval(m::Module, e::Any) @ Core ./boot.jl:489 [46] exec_options(opts::Base.JLOptions) @ Base ./client.jl:297 [47] _start() @ Base ./client.jl:563 Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_emptynumber.jl Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_emptystorage.jl Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_linearalgebra.jl Running /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_tupletools.jl Running 
/home/pkgeval/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl NDTensors.BackendSelection.Algorithm type , NamedTuple()Testing Expose jl, Float32: Error During Test at /home/pkgeval/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:22 Got exception outside of a @test Scalar indexing is disallowed. Invocation of getindex resulted in scalar indexing of a GPU array. This is typically caused by calling an iterating implementation of a method. Such implementations *do not* execute on the GPU, but very slowly on the CPU, and therefore should be avoided. If you want to allow scalar iteration, use `allowscalar` or `@allowscalar` to enable scalar iteration globally or for the operations in question. Stacktrace: [1] error(s::String) @ Base ./error.jl:44 [2] errorscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151 [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124 [4] assertscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112 [5] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:50 [inlined] [6] scalar_getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:36 [inlined] [7] _getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:19 [inlined] [8] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:17 [inlined] [9] getindex @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/adjtrans.jl:343 [inlined] [10] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/generic.jl:100 [inlined] [11] _generic_matmatmul_nonadjtrans!(C::JLArrays.JLArray{Float32, 2}, A::JLArrays.JLArray{Float32, 2}, B::LinearAlgebra.Transpose{Float32, JLArrays.JLArray{Float32, 2}}, alpha::Float64, beta::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1023 [12] __generic_matmatmul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1015 [inlined] [13] _generic_matmatmul!(C::JLArrays.JLArray{Float32, 2}, A::JLArrays.JLArray{Float32, 2}, B::LinearAlgebra.Transpose{Float32, JLArrays.JLArray{Float32, 2}}, alpha::Float64, beta::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1009 [14] gemm_wrapper!(C::JLArrays.JLArray{Float32, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float32, 2}, B::JLArrays.JLArray{Float32, 2}, α::Float64, β::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:800 [15] syrk_wrapper! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:720 [inlined] [16] _syrk_herk_gemm_wrapper!(C::JLArrays.JLArray{Float32, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float32, 2}, B::JLArrays.JLArray{Float32, 2}, α::Float64, β::Float64, ::Val{LinearAlgebra.BlasFlag.SYRK}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:505 [17] generic_matmatmul_wrapper!(C::JLArrays.JLArray{Float32, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{Float32, 2}, B::JLArrays.JLArray{Float32, 2}, α::Float64, β::Float64, val::Val{LinearAlgebra.BlasFlag.SYRK}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:499 [18] _mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:328 [inlined] [19] mul! 
@ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:297 [inlined] [20] mul!(CM::NDTensors.Expose.Exposed{JLArrays.JLArray{Float32, 2}, JLArrays.JLArray{Float32, 2}}, AM::NDTensors.Expose.Exposed{JLArrays.JLArray{Float32, 2}, JLArrays.JLArray{Float32, 2}}, BM::NDTensors.Expose.Exposed{JLArrays.JLArray{Float32, 2}, LinearAlgebra.Adjoint{Float32, JLArrays.JLArray{Float32, 2}}}, α::Float64, β::Float64) @ NDTensors.Expose ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/src/functions/mul.jl:2 [21] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:102 [inlined] [22] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [23] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:1945 [24] eval(m::Module, e::Any) @ Core ./boot.jl:489 [25] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:1 [26] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:304 [27] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl:12 [inlined] [28] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [29] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl:1945 [30] eval(m::Module, e::Any) @ Core ./boot.jl:489 [31] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl:1 [32] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:304 [33] top-level scope @ ~/.julia/packages/SafeTestsets/raUNr/src/SafeTestsets.jl:4 [34] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined] [35] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:7 [inlined] [36] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined] [37] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:13 [inlined] [38] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined] [39] macro expansion @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:15 [inlined] [40] eval(m::Module, e::Any) @ Core ./boot.jl:489 [41] top-level scope @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:28 [42] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:304 [43] top-level scope @ none:6 [44] eval(m::Module, e::Any) @ Core ./boot.jl:489 [45] exec_options(opts::Base.JLOptions) @ Base ./client.jl:297 [46] _start() @ Base ./client.jl:563 Testing Expose jl, ComplexF32: Error During Test at /home/pkgeval/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:22 Got exception outside of a @test Scalar indexing is disallowed. Invocation of getindex resulted in scalar indexing of a GPU array. This is typically caused by calling an iterating implementation of a method. Such implementations *do not* execute on the GPU, but very slowly on the CPU, and therefore should be avoided. If you want to allow scalar iteration, use `allowscalar` or `@allowscalar` to enable scalar iteration globally or for the operations in question. 
Stacktrace: [1] error(s::String) @ Base ./error.jl:44 [2] errorscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:151 [3] _assertscalar(op::String, behavior::GPUArraysCore.ScalarIndexing) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:124 [4] assertscalar(op::String) @ GPUArraysCore ~/.julia/packages/GPUArraysCore/aNaXo/src/GPUArraysCore.jl:112 [5] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:50 [inlined] [6] scalar_getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:36 [inlined] [7] _getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:19 [inlined] [8] getindex @ ~/.julia/packages/GPUArrays/uiVyU/src/host/indexing.jl:17 [inlined] [9] getindex @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/adjtrans.jl:343 [inlined] [10] macro expansion @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/generic.jl:100 [inlined] [11] _generic_matmatmul_nonadjtrans!(C::JLArrays.JLArray{ComplexF32, 2}, A::JLArrays.JLArray{ComplexF32, 2}, B::LinearAlgebra.Adjoint{ComplexF32, JLArrays.JLArray{ComplexF32, 2}}, alpha::Float64, beta::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1023 [12] __generic_matmatmul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1015 [inlined] [13] _generic_matmatmul!(C::JLArrays.JLArray{ComplexF32, 2}, A::JLArrays.JLArray{ComplexF32, 2}, B::LinearAlgebra.Adjoint{ComplexF32, JLArrays.JLArray{ComplexF32, 2}}, alpha::Float64, beta::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:1009 [14] gemm_wrapper!(C::JLArrays.JLArray{ComplexF32, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{ComplexF32, 2}, B::JLArrays.JLArray{ComplexF32, 2}, α::Float64, β::Float64) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:800 [15] herk_wrapper! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:753 [inlined] [16] _syrk_herk_gemm_wrapper!(C::JLArrays.JLArray{ComplexF32, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{ComplexF32, 2}, B::JLArrays.JLArray{ComplexF32, 2}, α::Float64, β::Float64, ::Val{LinearAlgebra.BlasFlag.HERK}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:513 [17] generic_matmatmul_wrapper!(C::JLArrays.JLArray{ComplexF32, 2}, tA::Char, tB::Char, A::JLArrays.JLArray{ComplexF32, 2}, B::JLArrays.JLArray{ComplexF32, 2}, α::Float64, β::Float64, val::Val{LinearAlgebra.BlasFlag.HERK}) @ LinearAlgebra /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:499 [18] _mul! @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:328 [inlined] [19] mul! 
      @ /opt/julia/share/julia/stdlib/v1.13/LinearAlgebra/src/matmul.jl:297 [inlined]
   [20] mul!(CM::NDTensors.Expose.Exposed{JLArrays.JLArray{ComplexF32, 2}, JLArrays.JLArray{ComplexF32, 2}}, AM::NDTensors.Expose.Exposed{JLArrays.JLArray{ComplexF32, 2}, JLArrays.JLArray{ComplexF32, 2}}, BM::NDTensors.Expose.Exposed{JLArrays.JLArray{ComplexF32, 2}, LinearAlgebra.Adjoint{ComplexF32, JLArrays.JLArray{ComplexF32, 2}}}, α::Float64, β::Float64)
      @ NDTensors.Expose ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/src/functions/mul.jl:2
   [21] macro expansion
      @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:102 [inlined]
   [22] macro expansion
      @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined]
   [23] top-level scope
      @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:1945
   [24] eval(m::Module, e::Any)
      @ Core ./boot.jl:489
   [25] top-level scope
      @ ~/.julia/packages/NDTensors/1oqMy/src/lib/Expose/test/runtests.jl:1
   [26] include(mapexpr::Function, mod::Module, _path::String)
      @ Base ./Base.jl:304
   [27] macro expansion
      @ ~/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl:12 [inlined]
   [28] macro expansion
      @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined]
   [29] top-level scope
      @ ~/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl:1945
   [30] eval(m::Module, e::Any)
      @ Core ./boot.jl:489
   [31] top-level scope
      @ ~/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl:1
   [32] include(mapexpr::Function, mod::Module, _path::String)
      @ Base ./Base.jl:304
   [33] top-level scope
      @ ~/.julia/packages/SafeTestsets/raUNr/src/SafeTestsets.jl:4
   [34] macro expansion
      @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined]
   [35] macro expansion
      @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:7 [inlined]
   [36] macro expansion
      @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1835 [inlined]
   [37] macro expansion
      @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:13 [inlined]
   [38] macro expansion
      @ /opt/julia/share/julia/stdlib/v1.13/Test/src/Test.jl:1924 [inlined]
   [39] macro expansion
      @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:15 [inlined]
   [40] eval(m::Module, e::Any)
      @ Core ./boot.jl:489
   [41] top-level scope
      @ ~/.julia/packages/NDTensors/1oqMy/test/runtests.jl:28
   [42] include(mapexpr::Function, mod::Module, _path::String)
      @ Base ./Base.jl:304
   [43] top-level scope
      @ none:6
   [44] eval(m::Module, e::Any)
      @ Core ./boot.jl:489
   [45] exec_options(opts::Base.JLOptions)
      @ Base ./client.jl:297
   [46] _start()
      @ Base ./client.jl:563
Test Summary: | Pass Error Total Time
NDTensors | 2718 6 2724 15m38.6s
  /home/pkgeval/.julia/packages/NDTensors/1oqMy/test | 2718 6 2724 15m35.5s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_blocksparse.jl | 1025 1025 3m55.0s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_combiner.jl | 144 144 1m30.8s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_dense.jl | 544 1 545 3m27.3s
      Dense Tensors | 544 1 545 3m26.7s
        test device: cpu | 260 260 25.2s
        test device: jl | 259 1 260 35.6s
          DenseTensor basic functionality | 116 1 117 12.3s
          Random constructor | 8 8 0.1s
          Complex Valued Tensors | 72 72 0.1s
          Custom inds types | 21 21 1.0s
          generic contraction | 36 36 0.0s
          Contraction with size 1 block and NaN | 6 6 22.0s
        Contract with exotic types | 14 14 13.5s
        change backends | 6 6 1.5s
        change backends | 5 5 2m10.6s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diag.jl | 264 2 266 3m21.4s
      DiagTensor basic functionality | 248 248 2m58.0s
      DiagTensor contractions | 9 9 9.3s
      DiagTensor contractions | 7 2 9 11.7s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_diagblocksparse.jl | 28 1 29 1m09.3s
      UniformDiagBlockSparseTensor basic functionality | 10 10 1.0s
      DiagBlockSparse off-diagonal (eltype=Float32) | 2 2 12.5s
      DiagBlockSparse off-diagonal (eltype=Float64) | 2 2 9.2s
      DiagBlockSparse off-diagonal (eltype=ComplexF32) | 2 2 11.6s
      DiagBlockSparse off-diagonal (eltype=ComplexF64) | 2 2 11.7s
      DiagBlockSparse contract | 6 6 9.6s
      DiagBlockSparse contract | 1 1 3.0s
      UniformDiagBlockSparse norm | 2 2 0.0s
      DiagBlockSparse denseblocks | 2 2 0.3s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_emptynumber.jl | 28 28 0.7s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_emptystorage.jl | 24 24 2.4s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_linearalgebra.jl | 454 454 49.4s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/test_tupletools.jl | 7 7 0.2s
    Test /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/lib/runtests.jl | 200 2 202 1m18.7s
      Test NDTensors lib AMDGPUExtensions | 2 2 0.1s
      Test NDTensors lib BackendSelection | 12 12 0.9s
      Test NDTensors lib CUDAExtensions | 2 2 0.0s
      Test NDTensors lib GPUArraysCoreExtensions | 1 1 0.0s
      Test NDTensors lib MetalExtensions | 1 1 0.0s
      Test NDTensors lib Expose | 182 2 184 1m17.4s
        Testing Expose cpu, Float32 | 63 63 18.1s
        Testing Expose cpu, ComplexF32 | 63 63 18.8s
        Testing Expose jl, Float32 | 28 1 29 17.0s
        Testing Expose jl, ComplexF32 | 28 1 29 17.0s
RNG of the outermost testset: Random.Xoshiro(0xf5076fbdca464931, 0x6eada5a8faa6286f, 0xfe9feaac44b152bc, 0xa3bcdbcfc3ebb122, 0xb459f8c46d213818)
ERROR: LoadError: Some tests did not pass: 2718 passed, 0 failed, 6 errored, 0 broken.
in expression starting at /home/pkgeval/.julia/packages/NDTensors/1oqMy/test/runtests.jl:3
Testing failed after 953.66s
ERROR: LoadError: Package NDTensors errored during testing
Stacktrace:
  [1] pkgerror(msg::String)
     @ Pkg.Types /opt/julia/share/julia/stdlib/v1.13/Pkg/src/Types.jl:68
  [2] test(ctx::Pkg.Types.Context, pkgs::Vector{PackageSpec}; coverage::Bool, julia_args::Cmd, test_args::Cmd, test_fn::Nothing, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool)
     @ Pkg.Operations /opt/julia/share/julia/stdlib/v1.13/Pkg/src/Operations.jl:2414
  [3] test
     @ /opt/julia/share/julia/stdlib/v1.13/Pkg/src/Operations.jl:2269 [inlined]
  [4] test(ctx::Pkg.Types.Context, pkgs::Vector{PackageSpec}; coverage::Bool, test_fn::Nothing, julia_args::Cmd, test_args::Cmd, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool, kwargs::@Kwargs{io::IOContext{IO}})
     @ Pkg.API /opt/julia/share/julia/stdlib/v1.13/Pkg/src/API.jl:486
  [5] test(pkgs::Vector{PackageSpec}; io::IOContext{IO}, kwargs::@Kwargs{julia_args::Cmd})
     @ Pkg.API /opt/julia/share/julia/stdlib/v1.13/Pkg/src/API.jl:164
  [6] test(pkgs::Vector{String}; kwargs::@Kwargs{julia_args::Cmd})
     @ Pkg.API /opt/julia/share/julia/stdlib/v1.13/Pkg/src/API.jl:152
  [7] test
     @ /opt/julia/share/julia/stdlib/v1.13/Pkg/src/API.jl:152 [inlined]
  [8] #test#81
     @ /opt/julia/share/julia/stdlib/v1.13/Pkg/src/API.jl:151 [inlined]
  [9] top-level scope
     @ /PkgEval.jl/scripts/evaluate.jl:219
 [10] include(mod::Module, _path::String)
     @ Base ./Base.jl:303
 [11] exec_options(opts::Base.JLOptions)
     @ Base ./client.jl:331
 [12] _start()
     @ Base ./client.jl:563
in expression starting at /PkgEval.jl/scripts/evaluate.jl:210
PkgEval failed after 1106.07s: package fails to precompile
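For context on the six test errors: frame [20] in the traces above shows NDTensors.Expose's mul! forwarding to LinearAlgebra's five-argument mul! on JLArray arguments, and on this Julia 1.13-DEV LinearAlgebra the call falls through to the generic CPU matmul, which scalar-indexes the GPU-style array. A reduced sketch of that call shape follows; the sizes and the 2.0/0.0 scalars are illustrative, and behaviour may differ on other Julia/GPUArrays versions where a GPU-aware fallback is selected instead:

    # Hypothetical reduction of the failing call, not part of the PkgEval run.
    using LinearAlgebra: mul!
    using JLArrays: JLArray

    A = JLArray(rand(ComplexF32, 4, 4))
    C = JLArray(zeros(ComplexF32, 4, 4))

    # Five-argument mul! with an adjoint operand and Float64 scalars, mirroring the
    # argument types in frames [11]-[20] above (computes C = α*A*A' + β*C); on this
    # toolchain it reaches LinearAlgebra's generic matmul and raises the
    # scalar-indexing error shown in the log.
    mul!(C, A, A', 2.0, 0.0)

Note that the same Expose tests pass on the cpu device and only error on the jl (JLArrays) device, which is consistent with the failure being specific to how this LinearAlgebra version dispatches matrix multiplication for GPU-style arrays.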