Package evaluation to test GridapPETSc on Julia 1.12.4 (0f21d93eaa*) started at 2026-01-27T05:37:28.216

################################################################################
# Set-up
# Installing PkgEval dependencies (TestEnv)...

Activating project at `~/.julia/environments/v1.12`
Set-up completed after 7.79s

################################################################################
# Installation
# Installing GridapPETSc...

Resolving package versions...
Installed GridapPETSc ─ v0.5.6
Updating `~/.julia/environments/v1.12/Project.toml`
  [bcdc36c2] + GridapPETSc v0.5.6
Updating `~/.julia/environments/v1.12/Manifest.toml`
  [47edcb42] + ADTypes v1.21.0
  [621f4979] + AbstractFFTs v1.5.0
  [1520ce14] + AbstractTrees v0.4.5
  [79e6a3ab] + Adapt v4.4.0
  [dce04be8] + ArgCheck v2.5.0
  [4fba245c] + ArrayInterface v7.22.0
  [4c555306] + ArrayLayouts v1.12.2
  [15f4f7f2] + AutoHashEquals v2.2.0
  [fbb218c0] + BSON v0.3.9
  [8e7c35d0] + BlockArrays v1.9.3
  [0b6fb165] + ChunkCodecCore v1.0.1
  [4c0bbee4] + ChunkCodecLibZlib v1.0.0
  [55437552] + ChunkCodecLibZstd v1.0.0
  [7a955b69] + CircularArrays v1.4.0
  [944b1d66] + CodecZlib v0.7.8
  [861a8166] + Combinatorics v1.1.0
  [bbf7d656] + CommonSubexpressions v0.3.1
  [187b0558] + ConstructionBase v1.6.0
  [864edb3b] + DataStructures v0.19.3
  [163ba53b] + DiffResults v1.1.0
  [b552c78f] + DiffRules v1.15.1
  [a0c0ee7d] + DifferentiationInterface v0.7.15
  [b4f34e82] + Distances v0.10.12
  [ffbed154] + DocStringExtensions v0.9.5
  [7a1cc6ca] + FFTW v1.10.0
  [442a2c76] + FastGaussQuadrature v1.1.0
  [5789e2e9] + FileIO v1.17.1
  [1a297f60] + FillArrays v1.16.0
  [6a86dc24] + FiniteDiff v2.29.0
  [f6369f11] + ForwardDiff v1.3.1
  [56d4f2e9] + Gridap v0.19.7
  [f9701e48] + GridapDistributed v0.4.10
  [bcdc36c2] + GridapPETSc v0.5.6
  [076d061b] + HashArrayMappedTries v0.2.0
  [92d709cd] + IrrationalConstants v0.2.6
  [42fd0dbc] + IterativeSolvers v0.9.4
  [033835bb] + JLD2 v0.6.3
  [692b3bcd] + JLLWrappers v1.7.1
  [682c06a0] + JSON v1.4.0
  [9c8b4983] + LightXML v0.9.3
⌃ [d3d80556] + LineSearches v7.5.1
  [2ab3a3ac] + LogExpFunctions v0.3.29
  [da04e1cc] + MPI v0.20.23
  [3da0fdf6] + MPIPreferences v0.1.11
  [1914dd2f] + MacroTools v0.5.16
⌅ [d41bc354] + NLSolversBase v7.10.0
  [2774e3e8] + NLsolve v4.5.1
  [77ba4419] + NaNMath v1.1.3
  [b8a86587] + NearestNeighbors v0.4.26
  [6fe1bfb0] + OffsetArrays v1.17.0
  [bac558e1] + OrderedCollections v1.8.1
  [69de0a69] + Parsers v2.8.3
⌅ [5a9dfac6] + PartitionedArrays v0.3.5
  [eebad327] + PkgVersion v0.3.3
  [c74db56a] + PolynomialBases v0.4.25
  [aea7be01] + PrecompileTools v1.3.3
  [21216c6a] + Preferences v1.5.1
  [1fd47b50] + QuadGK v2.11.2
  [3cdcf5f2] + RecipesBase v1.3.4
  [189a3867] + Reexport v1.2.2
  [ae029012] + Requires v1.3.1
  [7e506255] + ScopedValues v1.5.0
  [efcf1570] + Setfield v1.1.2
  [ce78b400] + SimpleUnPack v1.1.0
  [a0a7dd2c] + SparseMatricesCSR v0.6.9
  [276daf66] + SpecialFunctions v2.6.1
  [90137ffa] + StaticArrays v1.9.16
  [1e83bf80] + StaticArraysCore v1.4.4
  [10745b16] + Statistics v1.11.1
  [82ae8749] + StatsAPI v1.8.0
  [ec057cc2] + StructUtils v2.6.2
  [3bb67fe8] + TranscodingStreams v0.11.3
  [4004b06d] + VTKBase v1.0.1
  [64499a7a] + WriteVTK v1.21.2
  [f5851436] + FFTW_jll v3.3.11+0
⌃ [e33a78d0] + Hwloc_jll v2.12.1+0
  [1d5cc7b8] + IntelOpenMP_jll v2025.2.0+0
  [94ce4f54] + Libiconv_jll v1.18.0+0
  [856f044c] + MKL_jll v2025.2.0+0
  [7cb0a576] + MPICH_jll v4.3.2+0
  [f1f71cc9] + MPItrampoline_jll v5.5.4+0
  [9237b28f] + MicrosoftMPI_jll v10.1.4+3
  [656ef2d0] + OpenBLAS32_jll v0.3.30+0
  [fe0851c0] + OpenMPI_jll v5.0.9+0
  [efe28fd5] + OpenSpecFun_jll v0.5.6+0
  [8fa3689e] + PETSc_jll v3.22.0+0
  [aabda75e] + SCALAPACK32_jll v2.2.2+0
  [02c8fc9c] + XML2_jll v2.15.1+0
  [3161d3a3] + Zstd_jll v1.5.7+1
  [1317d2d5] + oneTBB_jll v2022.0.0+1
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [8ba89e20] + Distributed v1.11.0
  [f43a241f] + Downloads v1.7.0
  [7b1f6079] + FileWatching v1.11.0
  [9fa8497b] + Future v1.11.0
  [b77e0a4c] + InteractiveUtils v1.11.0
  [ac6e5ff7] + JuliaSyntaxHighlighting v1.12.0
  [4af54fe1] + LazyArtifacts v1.11.0
  [b27032c2] + LibCURL v0.6.4
  [76f85450] + LibGit2 v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.12.0
  [56ddb016] + Logging v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [a63ad114] + Mmap v1.11.0
  [ca575930] + NetworkOptions v1.3.0
  [44cfe95a] + Pkg v1.12.1
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization v1.11.0
  [6462fe0b] + Sockets v1.11.0
  [2f01184e] + SparseArrays v1.12.0
  [f489334b] + StyledStrings v1.11.0
  [4607b0f0] + SuiteSparse
  [fa267f1f] + TOML v1.0.3
  [a4e569a6] + Tar v1.10.0
  [8dfed614] + Test v1.11.0
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] + LibCURL_jll v8.15.0+0
  [e37daf67] + LibGit2_jll v1.9.0+0
  [29816b5a] + LibSSH2_jll v1.11.3+1
  [14a3606d] + MozillaCACerts_jll v2025.11.4
  [4536629a] + OpenBLAS_jll v0.3.29+0
  [05823500] + OpenLibm_jll v0.8.7+0
  [458c3c95] + OpenSSL_jll v3.5.4+0
  [bea87d4a] + SuiteSparse_jll v7.8.3+2
  [83775a58] + Zlib_jll v1.3.1+2
  [8e850b90] + libblastrampoline_jll v5.15.0+0
  [8e850ede] + nghttp2_jll v1.64.0+1
  [3f19e933] + p7zip_jll v17.7.0+0
Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading. To see why use `status --outdated -m`
Building GridapPETSc → `~/.julia/scratchspaces/44cfe95a-1eb2-52ea-b672-e2afdf69b78f/aeb0905e921f122e11669e3f10faa318687aa3e0/build.log`
Installation completed after 44.96s
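For reference, the installation step above can be reproduced outside PkgEval with plain Pkg commands; a minimal sketch, assuming a clean Julia 1.12 environment (PETSc_jll and the MPI stack are resolved as binary artifacts, so no system PETSc is required):

    using Pkg

    # Pin the version PkgEval resolved above.
    Pkg.add(name="GridapPETSc", version="0.5.6")

    # Run the package build step (writes the build.log referenced above).
    Pkg.build("GridapPETSc")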
################################################################################
# Precompilation
# Precompiling PkgEval dependencies...

Precompiling package dependencies...
Precompiling packages...
  12533.2 ms ✓ SCALAPACK32_jll
  10343.6 ms ✓ PartitionedArrays
  5321.8 ms ✓ PETSc_jll
  50447.9 ms ✓ GridapDistributed
  28261.9 ms ✓ GridapPETSc
  5 dependencies successfully precompiled in 132 seconds. 146 already precompiled.
  1 dependency had output during precompilation:
┌ GridapDistributed
│ WARNING: Constructor for type "FESpace" was extended in `GridapDistributed` without explicit qualification or import.
│ NOTE: Assumed "FESpace" refers to `Gridap.FESpaces.FESpace`. This behavior is deprecated and may differ in future versions.
│ NOTE: This behavior may have differed in Julia versions prior to 1.12.
│ Hint: If you intended to create a new generic function of the same name, use `function FESpace end`.
│ Hint: To silence the warning, qualify `FESpace` as `Gridap.FESpaces.FESpace` in the method signature or explicitly `import Gridap.FESpaces: FESpace`.
└

Precompilation completed after 130.06s
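The GridapDistributed output above is Julia 1.12's new warning for extending a type constructor without an explicit import. Following the warning's own hints, either an explicit import or a fully qualified method signature silences it; a sketch only, assuming the constructor in question is `Gridap.FESpaces.FESpace` (its usual home in Gridap), with a hypothetical wrapper type for illustration:

    using Gridap
    import Gridap.FESpaces: FESpace  # option 1: explicit import before extending

    # Hypothetical wrapper type, used only to illustrate the extension pattern.
    struct ModelWrapper{M}
      model::M
    end

    # With the import above (or `Gridap.FESpaces.FESpace` written out in the
    # signature), adding this method no longer triggers the 1.12 warning.
    function FESpace(w::ModelWrapper, reffe; kwargs...)
      FESpace(w.model, reffe; kwargs...)
    end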
################################################################################
# Testing
# Testing GridapPETSc

Status `/tmp/jl_gPmfJ2/Project.toml`
  [56d4f2e9] Gridap v0.19.7
  [f9701e48] GridapDistributed v0.4.10
  [bcdc36c2] GridapPETSc v0.5.6
  [da04e1cc] MPI v0.20.23
⌅ [5a9dfac6] PartitionedArrays v0.3.5
  [a0a7dd2c] SparseMatricesCSR v0.6.9
  [8fa3689e] PETSc_jll v3.22.0+0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.12.0
  [9a3f8284] Random v1.11.0
  [2f01184e] SparseArrays v1.12.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_gPmfJ2/Manifest.toml`
  [47edcb42] ADTypes v1.21.0
  [621f4979] AbstractFFTs v1.5.0
  [1520ce14] AbstractTrees v0.4.5
  [79e6a3ab] Adapt v4.4.0
  [dce04be8] ArgCheck v2.5.0
  [4fba245c] ArrayInterface v7.22.0
  [4c555306] ArrayLayouts v1.12.2
  [15f4f7f2] AutoHashEquals v2.2.0
  [fbb218c0] BSON v0.3.9
  [8e7c35d0] BlockArrays v1.9.3
  [0b6fb165] ChunkCodecCore v1.0.1
  [4c0bbee4] ChunkCodecLibZlib v1.0.0
  [55437552] ChunkCodecLibZstd v1.0.0
  [7a955b69] CircularArrays v1.4.0
  [944b1d66] CodecZlib v0.7.8
  [861a8166] Combinatorics v1.1.0
  [bbf7d656] CommonSubexpressions v0.3.1
  [187b0558] ConstructionBase v1.6.0
  [864edb3b] DataStructures v0.19.3
  [163ba53b] DiffResults v1.1.0
  [b552c78f] DiffRules v1.15.1
  [a0c0ee7d] DifferentiationInterface v0.7.15
  [b4f34e82] Distances v0.10.12
  [ffbed154] DocStringExtensions v0.9.5
  [7a1cc6ca] FFTW v1.10.0
  [442a2c76] FastGaussQuadrature v1.1.0
  [5789e2e9] FileIO v1.17.1
  [1a297f60] FillArrays v1.16.0
  [6a86dc24] FiniteDiff v2.29.0
  [f6369f11] ForwardDiff v1.3.1
  [56d4f2e9] Gridap v0.19.7
  [f9701e48] GridapDistributed v0.4.10
  [bcdc36c2] GridapPETSc v0.5.6
  [076d061b] HashArrayMappedTries v0.2.0
  [92d709cd] IrrationalConstants v0.2.6
  [42fd0dbc] IterativeSolvers v0.9.4
  [033835bb] JLD2 v0.6.3
  [692b3bcd] JLLWrappers v1.7.1
  [682c06a0] JSON v1.4.0
  [9c8b4983] LightXML v0.9.3
⌃ [d3d80556] LineSearches v7.5.1
  [2ab3a3ac] LogExpFunctions v0.3.29
  [da04e1cc] MPI v0.20.23
  [3da0fdf6] MPIPreferences v0.1.11
  [1914dd2f] MacroTools v0.5.16
⌅ [d41bc354] NLSolversBase v7.10.0
  [2774e3e8] NLsolve v4.5.1
  [77ba4419] NaNMath v1.1.3
  [b8a86587] NearestNeighbors v0.4.26
  [6fe1bfb0] OffsetArrays v1.17.0
  [bac558e1] OrderedCollections v1.8.1
  [69de0a69] Parsers v2.8.3
⌅ [5a9dfac6] PartitionedArrays v0.3.5
  [eebad327] PkgVersion v0.3.3
  [c74db56a] PolynomialBases v0.4.25
  [aea7be01] PrecompileTools v1.3.3
  [21216c6a] Preferences v1.5.1
  [1fd47b50] QuadGK v2.11.2
  [3cdcf5f2] RecipesBase v1.3.4
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.1
  [7e506255] ScopedValues v1.5.0
  [efcf1570] Setfield v1.1.2
  [ce78b400] SimpleUnPack v1.1.0
  [a0a7dd2c] SparseMatricesCSR v0.6.9
  [276daf66] SpecialFunctions v2.6.1
  [90137ffa] StaticArrays v1.9.16
  [1e83bf80] StaticArraysCore v1.4.4
  [10745b16] Statistics v1.11.1
  [82ae8749] StatsAPI v1.8.0
  [ec057cc2] StructUtils v2.6.2
  [3bb67fe8] TranscodingStreams v0.11.3
  [4004b06d] VTKBase v1.0.1
  [64499a7a] WriteVTK v1.21.2
  [f5851436] FFTW_jll v3.3.11+0
⌃ [e33a78d0] Hwloc_jll v2.12.1+0
  [1d5cc7b8] IntelOpenMP_jll v2025.2.0+0
  [94ce4f54] Libiconv_jll v1.18.0+0
  [856f044c] MKL_jll v2025.2.0+0
  [7cb0a576] MPICH_jll v4.3.2+0
  [f1f71cc9] MPItrampoline_jll v5.5.4+0
  [9237b28f] MicrosoftMPI_jll v10.1.4+3
  [656ef2d0] OpenBLAS32_jll v0.3.30+0
  [fe0851c0] OpenMPI_jll v5.0.9+0
  [efe28fd5] OpenSpecFun_jll v0.5.6+0
  [8fa3689e] PETSc_jll v3.22.0+0
  [aabda75e] SCALAPACK32_jll v2.2.2+0
  [02c8fc9c] XML2_jll v2.15.1+0
  [3161d3a3] Zstd_jll v1.5.7+1
  [1317d2d5] oneTBB_jll v2022.0.0+1
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [ade2ca70] Dates v1.11.0
  [8ba89e20] Distributed v1.11.0
  [f43a241f] Downloads v1.7.0
  [7b1f6079] FileWatching v1.11.0
  [9fa8497b] Future v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [ac6e5ff7] JuliaSyntaxHighlighting v1.12.0
  [4af54fe1] LazyArtifacts v1.11.0
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.12.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [a63ad114] Mmap v1.11.0
  [ca575930] NetworkOptions v1.3.0
  [44cfe95a] Pkg v1.12.1
  [de0858da] Printf v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization v1.11.0
  [6462fe0b] Sockets v1.11.0
  [2f01184e] SparseArrays v1.12.0
  [f489334b] StyledStrings v1.11.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] LibCURL_jll v8.15.0+0
  [e37daf67] LibGit2_jll v1.9.0+0
  [29816b5a] LibSSH2_jll v1.11.3+1
  [14a3606d] MozillaCACerts_jll v2025.11.4
  [4536629a] OpenBLAS_jll v0.3.29+0
  [05823500] OpenLibm_jll v0.8.7+0
  [458c3c95] OpenSSL_jll v3.5.4+0
  [bea87d4a] SuiteSparse_jll v7.8.3+2
  [83775a58] Zlib_jll v1.3.1+2
  [8e850b90] libblastrampoline_jll v5.15.0+0
  [8e850ede] nghttp2_jll v1.64.0+1
  [3f19e933] p7zip_jll v17.7.0+0
Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading.
Testing Running tests...
[0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType host available, initializing
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDevice host initialized, default device id 0, view FALSE, init type lazy
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType cuda not available
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType hip not available
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType sycl not available
[0] PetscInitialize_Common(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-liQJllli.(none)
[0] PetscInitialize_Common(): Running on machine: GridapPETSc-primary-liQJllli
[0] PetscInitialize_Common(): BLAS: Environment number of OpenBLAS threads 1 given by OPENBLAS_NUM_THREADS
[0] PetscBLASSetNumThreads(): Setting number of threads used for OpenBLAS provided BLAS 1
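The `[0] ...` lines above are PETSc's own diagnostics, enabled by the `-info` option that the test suite passes when initializing PETSc. The same verbosity can be reproduced through GridapPETSc's documented entry point; a sketch:

    using GridapPETSc

    # `GridapPETSc.with` initializes PETSc with command-line style options,
    # runs the block, and finalizes PETSc on exit.
    GridapPETSc.with(args=split("-info")) do
      # Any PETSc-backed vectors, matrices, or solvers created here emit
      # the verbose [0] ... diagnostics seen in this log.
    end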
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-liQJllli.(none)
Vec Object: 1 MPI process
  type: seq
1.
2.
4.
1.
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374783 max tags = 1073741823
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-liQJllli.(none)
Vec Object: 1 MPI process
  type: seq
1.
2.
4.
1.
Vec Object: 1 MPI process
  type: seq
20.
40.
4.
60.
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 10 unneeded,10 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 3 nodes of 4. Limit used: 5. Using Inode routines
Mat Object: 1 MPI process
  type: seqaij
row 0: (1, 2.)  (3, 3.)  (4, 1.)
row 1: (1, 6.)  (3, 11.)  (4, 5.)
row 2: (1, 4.)  (3, 3.)
row 3: (3, 4.)  (4, 3.)
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 4; storage space: 0 unneeded,10 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines
Mat Object: 1 MPI process
  type: seqaij
row 0: (0, 4.)  (1, -2.)
row 1: (0, -1.)  (1, 6.)  (2, -2.)
row 2: (1, -1.)  (2, 6.)  (3, -2.)
row 3: (2, -1.)  (3, 4.)
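The second matrix printed above is the 4 x 4 system the tests go on to solve (PETSc's row dump uses 0-based indices). For orientation, the same matrix in plain Julia:

    using SparseArrays

    # Sub-, main, and super-diagonals matching the rows printed above:
    # row 0: (0, 4.) (1, -2.), row 1: (0, -1.) (1, 6.) (2, -2.), ...
    A = spdiagm(-1 => fill(-1.0, 3),
                 0 => [4.0, 6.0, 6.0, 4.0],
                 1 => fill(-2.0, 3))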
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.249022930744e-16 is less than relative tolerance 1.000000000000e-05 times initial right-hand side norm 1.145643923739e+00 at iteration 4
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.249022930744e-16 is less than relative tolerance 1.000000000000e-05 times initial right-hand side norm 1.145643923739e+00 at iteration 4
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.291406315399e-16 is less than relative tolerance 1.000000000000e-05 times initial right-hand side norm 1.145643923739e+00 at iteration 4
KSP Object: (p_) 1 MPI process
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: (p_) 1 MPI process
  type: jacobi
    type DIAGONAL
  linear system matrix = precond matrix:
  Mat Object: 1 MPI process
    type: seqaij
    rows=4, cols=4
    total: nonzeros=10, allocated nonzeros=0
    total number of mallocs used during MatSetValues calls=0
      not using I-node routines
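The solver view above shows GMRES preconditioned with Jacobi under the `p_` options prefix. With GridapPETSc, such settings are usually supplied as PETSc options at initialization; a sketch under that assumption (attaching the prefix to the KSP, via PETSc's KSPSetOptionsPrefix, is done inside the test suite itself):

    using GridapPETSc

    # Options with the `p_` prefix, matching `KSP Object: (p_)` above.
    options = "-p_ksp_type gmres -p_pc_type jacobi -p_ksp_rtol 1.0e-5"

    GridapPETSc.with(args=split(options)) do
      solver = PETScLinearSolver()  # reads its settings from the options database
      # ...assemble an AffineFEOperator with Gridap and call solve(solver, op)
    end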
[0] PetscFinalize(): PetscFinalize() called
[0] Petsc_OuterComm_Attr_DeleteFn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374783
[0] Petsc_InnerComm_Attr_DeleteFn(): User MPI_Comm 1140850688 is being unlinked from inner PETSc comm -2080374783
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
[0] Petsc_Counter_Attr_DeleteFn(): Deleting counter data in an MPI_Comm -2080374783
[0] Petsc_OuterComm_Attr_DeleteFn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374784
[0] Petsc_InnerComm_Attr_DeleteFn(): User MPI_Comm 1140850689 is being unlinked from inner PETSc comm -2080374784
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
[0] Petsc_Counter_Attr_DeleteFn(): Deleting counter data in an MPI_Comm -2080374784
19.690486 seconds (7.76 M allocations: 459.426 MiB, 4.72% gc time, 41.28% compilation time: 16% of which was recompilation)
[0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType host available, initializing
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDevice host initialized, default device id 0, view FALSE, init type lazy
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType cuda not available
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType hip not available
[0] PetscDeviceInitializeTypeFromOptions_Private(): PetscDeviceType sycl not available
[0] PetscInitialize_Common(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-liQJllli.(none)
[0] PetscInitialize_Common(): Running on machine: GridapPETSc-primary-liQJllli
[0] PetscInitialize_Common(): BLAS: Environment number of OpenBLAS threads 1 given by OPENBLAS_NUM_THREADS
[0] PetscBLASSetNumThreads(): Setting number of threads used for OpenBLAS provided BLAS 1
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 20 unneeded,0 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 4) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 1 nodes of 4. Limit used: 5. Using Inode routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 14 unneeded,1 used
Testing failed after 40.87s
ERROR: LoadError: Package GridapPETSc errored during testing (received signal: 11)
Stacktrace:
  [1] pkgerror(msg::String)
    @ Pkg.Types /opt/julia/share/julia/stdlib/v1.12/Pkg/src/Types.jl:68
  [2] test(ctx::Pkg.Types.Context, pkgs::Vector{PackageSpec}; coverage::Bool, julia_args::Cmd, test_args::Cmd, test_fn::Nothing, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool)
    @ Pkg.Operations /opt/julia/share/julia/stdlib/v1.12/Pkg/src/Operations.jl:2535
  [3] test
    @ /opt/julia/share/julia/stdlib/v1.12/Pkg/src/Operations.jl:2384 [inlined]
  [4] test(ctx::Pkg.Types.Context, pkgs::Vector{PackageSpec}; coverage::Bool, test_fn::Nothing, julia_args::Cmd, test_args::Cmd, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool, kwargs::@Kwargs{io::IOContext{IO}})
    @ Pkg.API /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:538
  [5] test(pkgs::Vector{PackageSpec}; io::IOContext{IO}, kwargs::@Kwargs{julia_args::Cmd})
    @ Pkg.API /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:169
  [6] test(pkgs::Vector{String}; kwargs::@Kwargs{julia_args::Cmd})
    @ Pkg.API /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:157
  [7] test
    @ /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:157 [inlined]
  [8] #test#81
    @ /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:156 [inlined]
  [9] top-level scope
    @ /PkgEval.jl/scripts/evaluate.jl:223
 [10] include(mod::Module, _path::String)
    @ Base ./Base.jl:306
 [11] exec_options(opts::Base.JLOptions)
    @ Base ./client.jl:317
 [12] _start()
    @ Base ./client.jl:550
in expression starting at /PkgEval.jl/scripts/evaluate.jl:214

PkgEval crashed after 275.07s: a segmentation fault happened
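The segmentation fault (signal 11) appears to occur after a second PETSc session starts, while matrix assembly diagnostics are being logged, and it kills the test process before any summary is printed. To reproduce locally, a sketch (bounds checks help rule out Julia-side indexing errors; a crash inside PETSc's C library would instead need a native debugger attached to the spawned test process):

    using Pkg

    # Same package version as in this evaluation.
    Pkg.add(name="GridapPETSc", version="0.5.6")

    # Re-run the test suite with bounds checking forced on.
    Pkg.test("GridapPETSc"; julia_args=`--check-bounds=yes`)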