Package evaluation of GridapPETSc on Julia 1.11.4 (a71dd056e0*) started at 2025-04-08T18:10:29.571

################################################################################
# Set-up
#

Installing PkgEval dependencies (TestEnv)...
Set-up completed after 9.34s

################################################################################
# Installation
#

Installing GridapPETSc...
Resolving package versions...
Installed GridapPETSc ─ v0.5.3
Updating `~/.julia/environments/v1.11/Project.toml`
  [bcdc36c2] + GridapPETSc v0.5.3
Updating `~/.julia/environments/v1.11/Manifest.toml`
  [47edcb42] + ADTypes v1.14.0
  [621f4979] + AbstractFFTs v1.5.0
  [1520ce14] + AbstractTrees v0.4.5
  [79e6a3ab] + Adapt v4.3.0
  [dce04be8] + ArgCheck v2.5.0
  [4fba245c] + ArrayInterface v7.18.0
  [4c555306] + ArrayLayouts v1.11.1
  [15f4f7f2] + AutoHashEquals v2.2.0
  [fbb218c0] + BSON v0.3.9
  [8e7c35d0] + BlockArrays v1.5.0
  [7a955b69] + CircularArrays v1.4.0
  [944b1d66] + CodecZlib v0.7.8
  [861a8166] + Combinatorics v1.0.2
  [bbf7d656] + CommonSubexpressions v0.3.1
  [34da2185] + Compat v4.16.0
  [187b0558] + ConstructionBase v1.5.8
  [864edb3b] + DataStructures v0.18.22
  [163ba53b] + DiffResults v1.1.0
  [b552c78f] + DiffRules v1.15.1
  [a0c0ee7d] + DifferentiationInterface v0.6.50
  [b4f34e82] + Distances v0.10.12
  [ffbed154] + DocStringExtensions v0.9.4
  [7a1cc6ca] + FFTW v1.8.1
  [442a2c76] + FastGaussQuadrature v1.0.2
  [5789e2e9] + FileIO v1.17.0
  [1a297f60] + FillArrays v1.13.0
  [6a86dc24] + FiniteDiff v2.27.0
⌅ [f6369f11] + ForwardDiff v0.10.38
  [56d4f2e9] + Gridap v0.18.11
  [f9701e48] + GridapDistributed v0.4.7
  [bcdc36c2] + GridapPETSc v0.5.3
  [92d709cd] + IrrationalConstants v0.2.4
  [42fd0dbc] + IterativeSolvers v0.9.4
  [033835bb] + JLD2 v0.5.12
  [692b3bcd] + JLLWrappers v1.7.0
  [682c06a0] + JSON v0.21.4
  [9c8b4983] + LightXML v0.9.1
  [d3d80556] + LineSearches v7.3.0
  [2ab3a3ac] + LogExpFunctions v0.3.29
  [da04e1cc] + MPI v0.20.22
  [3da0fdf6] + MPIPreferences v0.1.11
  [1914dd2f] + MacroTools v0.5.15
  [d41bc354] + NLSolversBase v7.9.1
  [2774e3e8] + NLsolve v4.5.1
  [77ba4419] + NaNMath v1.1.3
  [b8a86587] + NearestNeighbors v0.4.21
  [6fe1bfb0] + OffsetArrays v1.16.0
  [bac558e1] + OrderedCollections v1.8.0
  [d96e819e] + Parameters v0.12.3
  [69de0a69] + Parsers v2.8.1
⌅ [5a9dfac6] + PartitionedArrays v0.3.4
  [eebad327] + PkgVersion v0.3.3
  [c74db56a] + PolynomialBases v0.4.22
⌅ [aea7be01] + PrecompileTools v1.2.1
  [21216c6a] + Preferences v1.4.3
  [1fd47b50] + QuadGK v2.11.2
  [3cdcf5f2] + RecipesBase v1.3.4
  [189a3867] + Reexport v1.2.2
  [ae029012] + Requires v1.3.1
  [efcf1570] + Setfield v1.1.2
  [ce78b400] + SimpleUnPack v1.1.0
  [a0a7dd2c] + SparseMatricesCSR v0.6.9
  [276daf66] + SpecialFunctions v2.5.0
  [90137ffa] + StaticArrays v1.9.13
  [1e83bf80] + StaticArraysCore v1.4.3
  [10745b16] + Statistics v1.11.1
  [82ae8749] + StatsAPI v1.7.0
  [3bb67fe8] + TranscodingStreams v0.11.3
  [3a884ed6] + UnPack v1.0.2
  [4004b06d] + VTKBase v1.0.1
  [64499a7a] + WriteVTK v1.21.1
  [f5851436] + FFTW_jll v3.3.11+0
  [e33a78d0] + Hwloc_jll v2.12.0+0
  [1d5cc7b8] + IntelOpenMP_jll v2025.0.4+0
  [94ce4f54] + Libiconv_jll v1.18.0+0
  [856f044c] + MKL_jll v2025.0.1+1
  [7cb0a576] + MPICH_jll v4.3.0+1
  [f1f71cc9] + MPItrampoline_jll v5.5.3+0
  [9237b28f] + MicrosoftMPI_jll v10.1.4+3
  [656ef2d0] + OpenBLAS32_jll v0.3.29+0
  [fe0851c0] + OpenMPI_jll v5.0.7+2
  [efe28fd5] + OpenSpecFun_jll v0.5.6+0
⌅ [8fa3689e] + PETSc_jll v3.15.2+0
  [02c8fc9c] + XML2_jll v2.13.6+1
  [1317d2d5] + oneTBB_jll v2022.0.0+0
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [8ba89e20] + Distributed v1.11.0
  [f43a241f] + Downloads v1.6.0
  [7b1f6079] + FileWatching v1.11.0
  [9fa8497b] + Future v1.11.0
  [b77e0a4c] + InteractiveUtils v1.11.0
  [4af54fe1] + LazyArtifacts v1.11.0
  [b27032c2] + LibCURL v0.6.4
  [76f85450] + LibGit2 v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.11.0
  [56ddb016] + Logging v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [a63ad114] + Mmap v1.11.0
  [ca575930] + NetworkOptions v1.2.0
  [44cfe95a] + Pkg v1.11.0
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization v1.11.0
  [6462fe0b] + Sockets v1.11.0
  [2f01184e] + SparseArrays v1.11.0
  [4607b0f0] + SuiteSparse
  [fa267f1f] + TOML v1.0.3
  [a4e569a6] + Tar v1.10.0
  [8dfed614] + Test v1.11.0
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] + LibCURL_jll v8.6.0+0
  [e37daf67] + LibGit2_jll v1.7.2+0
  [29816b5a] + LibSSH2_jll v1.11.0+1
  [c8ffd9c3] + MbedTLS_jll v2.28.6+0
  [14a3606d] + MozillaCACerts_jll v2023.12.12
  [4536629a] + OpenBLAS_jll v0.3.27+1
  [05823500] + OpenLibm_jll v0.8.5+0
  [bea87d4a] + SuiteSparse_jll v7.7.0+0
  [83775a58] + Zlib_jll v1.2.13+1
  [8e850b90] + libblastrampoline_jll v5.11.0+0
  [8e850ede] + nghttp2_jll v1.59.0+0
  [3f19e933] + p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
Building GridapPETSc → `~/.julia/scratchspaces/44cfe95a-1eb2-52ea-b672-e2afdf69b78f/601899a547ac1f473be900e6b3157e62cb2db8fc/build.log`
Installation completed after 25.08s

################################################################################
# Precompilation
#

Precompiling PkgEval dependencies...
Precompiling package dependencies...
Precompilation completed after 81.46s

################################################################################
# Testing
#

Testing GridapPETSc
Status `/tmp/jl_ZdA7TN/Project.toml`
  [56d4f2e9] Gridap v0.18.11
  [f9701e48] GridapDistributed v0.4.7
  [bcdc36c2] GridapPETSc v0.5.3
  [da04e1cc] MPI v0.20.22
⌅ [5a9dfac6] PartitionedArrays v0.3.4
  [a0a7dd2c] SparseMatricesCSR v0.6.9
⌅ [8fa3689e] PETSc_jll v3.15.2+0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.11.0
  [9a3f8284] Random v1.11.0
  [2f01184e] SparseArrays v1.11.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_ZdA7TN/Manifest.toml`
  [47edcb42] ADTypes v1.14.0
  [621f4979] AbstractFFTs v1.5.0
  [1520ce14] AbstractTrees v0.4.5
  [79e6a3ab] Adapt v4.3.0
  [dce04be8] ArgCheck v2.5.0
  [4fba245c] ArrayInterface v7.18.0
  [4c555306] ArrayLayouts v1.11.1
  [15f4f7f2] AutoHashEquals v2.2.0
  [fbb218c0] BSON v0.3.9
  [8e7c35d0] BlockArrays v1.5.0
  [7a955b69] CircularArrays v1.4.0
  [944b1d66] CodecZlib v0.7.8
  [861a8166] Combinatorics v1.0.2
  [bbf7d656] CommonSubexpressions v0.3.1
  [34da2185] Compat v4.16.0
  [187b0558] ConstructionBase v1.5.8
  [864edb3b] DataStructures v0.18.22
  [163ba53b] DiffResults v1.1.0
  [b552c78f] DiffRules v1.15.1
  [a0c0ee7d] DifferentiationInterface v0.6.50
  [b4f34e82] Distances v0.10.12
  [ffbed154] DocStringExtensions v0.9.4
  [7a1cc6ca] FFTW v1.8.1
  [442a2c76] FastGaussQuadrature v1.0.2
  [5789e2e9] FileIO v1.17.0
  [1a297f60] FillArrays v1.13.0
  [6a86dc24] FiniteDiff v2.27.0
⌅ [f6369f11] ForwardDiff v0.10.38
  [56d4f2e9] Gridap v0.18.11
  [f9701e48] GridapDistributed v0.4.7
  [bcdc36c2] GridapPETSc v0.5.3
  [92d709cd] IrrationalConstants v0.2.4
  [42fd0dbc] IterativeSolvers v0.9.4
  [033835bb] JLD2 v0.5.12
  [692b3bcd] JLLWrappers v1.7.0
  [682c06a0] JSON v0.21.4
  [9c8b4983] LightXML v0.9.1
  [d3d80556] LineSearches v7.3.0
  [2ab3a3ac] LogExpFunctions v0.3.29
  [da04e1cc] MPI v0.20.22
  [3da0fdf6] MPIPreferences v0.1.11
  [1914dd2f] MacroTools v0.5.15
  [d41bc354] NLSolversBase v7.9.1
  [2774e3e8] NLsolve v4.5.1
  [77ba4419] NaNMath v1.1.3
  [b8a86587] NearestNeighbors v0.4.21
  [6fe1bfb0] OffsetArrays v1.16.0
  [bac558e1] OrderedCollections v1.8.0
  [d96e819e] Parameters v0.12.3
  [69de0a69] Parsers v2.8.1
⌅ [5a9dfac6] PartitionedArrays v0.3.4
  [eebad327] PkgVersion v0.3.3
  [c74db56a] PolynomialBases v0.4.22
⌅ [aea7be01] PrecompileTools v1.2.1
  [21216c6a] Preferences v1.4.3
  [1fd47b50] QuadGK v2.11.2
  [3cdcf5f2] RecipesBase v1.3.4
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.1
  [efcf1570] Setfield v1.1.2
  [ce78b400] SimpleUnPack v1.1.0
  [a0a7dd2c] SparseMatricesCSR v0.6.9
  [276daf66] SpecialFunctions v2.5.0
  [90137ffa] StaticArrays v1.9.13
  [1e83bf80] StaticArraysCore v1.4.3
  [10745b16] Statistics v1.11.1
  [82ae8749] StatsAPI v1.7.0
  [3bb67fe8] TranscodingStreams v0.11.3
  [3a884ed6] UnPack v1.0.2
  [4004b06d] VTKBase v1.0.1
  [64499a7a] WriteVTK v1.21.1
  [f5851436] FFTW_jll v3.3.11+0
  [e33a78d0] Hwloc_jll v2.12.0+0
  [1d5cc7b8] IntelOpenMP_jll v2025.0.4+0
  [94ce4f54] Libiconv_jll v1.18.0+0
  [856f044c] MKL_jll v2025.0.1+1
  [7cb0a576] MPICH_jll v4.3.0+1
  [f1f71cc9] MPItrampoline_jll v5.5.3+0
  [9237b28f] MicrosoftMPI_jll v10.1.4+3
  [656ef2d0] OpenBLAS32_jll v0.3.29+0
  [fe0851c0] OpenMPI_jll v5.0.7+2
  [efe28fd5] OpenSpecFun_jll v0.5.6+0
⌅ [8fa3689e] PETSc_jll v3.15.2+0
  [02c8fc9c] XML2_jll v2.13.6+1
  [1317d2d5] oneTBB_jll v2022.0.0+0
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [ade2ca70] Dates v1.11.0
  [8ba89e20] Distributed v1.11.0
  [f43a241f] Downloads v1.6.0
  [7b1f6079] FileWatching v1.11.0
  [9fa8497b] Future v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [4af54fe1] LazyArtifacts v1.11.0
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.11.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [a63ad114] Mmap v1.11.0
  [ca575930] NetworkOptions v1.2.0
  [44cfe95a] Pkg v1.11.0
  [de0858da] Printf v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization v1.11.0
  [6462fe0b] Sockets v1.11.0
  [2f01184e] SparseArrays v1.11.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] LibCURL_jll v8.6.0+0
  [e37daf67] LibGit2_jll v1.7.2+0
  [29816b5a] LibSSH2_jll v1.11.0+1
  [c8ffd9c3] MbedTLS_jll v2.28.6+0
  [14a3606d] MozillaCACerts_jll v2023.12.12
  [4536629a] OpenBLAS_jll v0.3.27+1
  [05823500] OpenLibm_jll v0.8.5+0
  [bea87d4a] SuiteSparse_jll v7.7.0+0
  [83775a58] Zlib_jll v1.2.13+1
  [8e850b90] libblastrampoline_jll v5.11.0+0
  [8e850ede] nghttp2_jll v1.59.0+0
  [3f19e933] p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading.
Testing Running tests...
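
Before the raw test output, a minimal sketch (not part of the log) of the usage pattern these tests exercise: GridapPETSc initializes PETSc with argv-style options through GridapPETSc.with and finalizes it when the block returns. The specific flags below are an assumption, chosen to match the verbose "[0] ..." tracing and the GMRES solves seen in the output.

    using GridapPETSc
    # -info turns on the "[0] Petsc..." diagnostics that fill this log
    # (assumed flags; any PETSc command-line option can be passed this way).
    options = "-info -ksp_type gmres -ksp_rtol 1.0e-5"
    GridapPETSc.with(args=split(options)) do
      # PETSc objects (PETScVector, PETScMatrix, solvers) are created and
      # used here; they are destroyed before PETSc is finalized.
    end
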
[0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-74ys2yzC.(none)
[0] PetscInitialize(): Running on machine: GridapPETSc-primary-74ys2yzC
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-74ys2yzC.(none)
Vec Object: 1 MPI processes
  type: seq
1.
2.
4.
1.
[0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374783 max tags = 1073741823
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-74ys2yzC.(none)
Vec Object: 1 MPI processes
  type: seq
1.
2.
4.
1.
Vec Object: 1 MPI processes
  type: seq
20.
40.
4.
60.
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 10 unneeded,10 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 3 nodes of 4. Limit used: 5. Using Inode routines
Mat Object: 1 MPI processes
  type: seqaij
row 0: (1, 2.) (3, 3.) (4, 1.)
row 1: (1, 6.) (3, 11.) (4, 5.)
row 2: (1, 4.) (3, 3.)
row 3: (3, 4.) (4, 3.)
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 4; storage space: 0 unneeded,10 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines
Mat Object: 1 MPI processes
  type: seqaij
row 0: (0, 4.) (1, -2.)
row 1: (0, -1.) (1, 6.) (2, -2.)
row 2: (1, -1.) (2, 6.) (3, -2.)
row 3: (2, -1.) (3, 4.)
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] PCSetUp(): Setting up PC for first time
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.260351938337e-16 is less than relative tolerance 1.000000000000e-05 times initial right hand side norm 1.145643923739e+00 at iteration 4
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.260351938337e-16 is less than relative tolerance 1.000000000000e-05 times initial right hand side norm 1.145643923739e+00 at iteration 4
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.315348017250e-16 is less than relative tolerance 1.000000000000e-05 times initial right hand side norm 1.145643923739e+00 at iteration 4
KSP Object: (p_) 1 MPI processes
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: (p_) 1 MPI processes
  type: jacobi
    type DIAGONAL
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=4, cols=4
    total: nonzeros=10, allocated nonzeros=0
    total number of mallocs used during MatSetValues calls=0
      not using I-node routines
[0] PetscFinalize(): PetscFinalize() called
[0] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm -2080374784
[0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374784
[0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 1140850689 is being unlinked from inner PETSc comm -2080374784
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784
[0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm -2080374784
[0] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm -2080374783
[0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374783
[0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 1140850688 is being unlinked from inner PETSc comm -2080374783
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
[0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm -2080374783
15.485668 seconds (5.88 M allocations: 336.894 MiB, 3.08% gc time, 54.46% compilation time)
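
The KSP view above shows a GMRES solver with a Jacobi preconditioner under the options prefix (p_). A minimal sketch of driving such a solver from GridapPETSc, assuming the prefixed flags below (the flags actually passed by the test are not shown in the log):

    using GridapPETSc
    # In PETSc, an options prefix such as "p_" scopes flags to one solver:
    # -p_ksp_type only affects a KSP whose prefix was set to "p_".
    options = "-p_ksp_type gmres -p_pc_type jacobi -p_ksp_rtol 1.0e-5"
    GridapPETSc.with(args=split(options)) do
      ls = PETScLinearSolver()  # configured from the PETSc options database
      # use ls through Gridap's LinearSolver API (symbolic/numerical setup, solve!)
    end
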
[0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-74ys2yzC.(none)
[0] PetscInitialize(): Running on machine: GridapPETSc-primary-74ys2yzC
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 1073741823
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 20 unneeded,0 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 4) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 1 nodes of 4. Limit used: 5. Using Inode routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 14 unneeded,1 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 1
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. Using Inode routines
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 14 unneeded,2 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 1
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 2)/(num_localrows 4) < 0.6. Do not use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines
4×5 GridapPETSc.PETScMatrix:
 0.0  0.0  5.0  0.0  0.0
 0.0  0.0  0.0  0.0  0.0
 0.0  0.0  0.0  0.0  7.0
 0.0  0.0  0.0  0.0  0.0
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 3 X 2; storage space: 6 unneeded,0 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 3) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 1 nodes of 3. Limit used: 5. Using Inode routines
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 4; storage space: 0 unneeded,10 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1
[0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
[0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374783
[0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 1140850689 is being unlinked from inner PETSc comm -2080374783
[0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783
[0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm -2080374783
[0] PetscFinalize(): PetscFinalize() called
8.655846 seconds (3.03 M allocations: 157.370 MiB, 5.26% gc time, 95.08% compilation time)
[1, 2, 3, 5, 6]
[4, 5, 7, 3, 6]
[6, 7, 4]
Vec Object: 1 MPI processes
  type: seq
10.
20.
30.
40.
50.
60.
70.
3-element PartitionedArrays.DebugArray{SparseArrays.SparseMatrixCSC{Float64, Int64}, 1}:
[1] = sparse([1, 2, 3, 1, 3], [1, 2, 3, 4, 5], [9.0, 9.0, 9.0, 1.0, 1.0], 5, 5)
[2] = sparse([1, 2, 2, 2, 1], [1, 2, 3, 4, 5], [9.0, 9.0, 1.0, 9.0, 1.0], 5, 5)
[3] = sparse([1, 2, 1], [1, 2, 3], [9.0, 9.0, 1.0], 3, 3)
Mat Object: 1 MPI processes
  type: seqaij
row 0: (0, 9.) (4, 1.)
row 1: (1, 9.)
row 2: (2, 9.) (5, 1.)
row 3: (3, 9.) (5, 1.)
row 4: (2, 9.) (4, 9.) (6, 1.)
row 5: (3, 1.) (5, 9.)
row 6: (6, 9.)
Linear solve converged due to CONVERGED_RTOL iterations 5
Linear solve converged due to CONVERGED_RTOL iterations 5
Vec Object: 1 MPI processes
  type: seq
140.
180.
330.
420.
790.
580.
630.
Vec Object: 1 MPI processes
  type: seq
10.
20.
30.
40.
50.
60.
70.
Linear solve converged due to CONVERGED_RTOL iterations 5
Linear solve converged due to CONVERGED_RTOL iterations 5
Vec Object: 1 MPI processes
  type: seq
140.
180.
330.
420.
790.
580.
630.
Vec Object: 1 MPI processes
  type: seq
10.
20.
30.
40.
50.
60.
70.
69.862548 seconds (35.26 M allocations: 1.766 GiB, 1.19% gc time, 84.31% compilation time)
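
The [1, 2, 3, 5, 6] / [4, 5, 7, 3, 6] / [6, 7, 4] index sets and the DebugArray of SparseMatrixCSC blocks above come from PartitionedArrays running a 3-part partition inside a single process (debug mode). A sketch of that pattern, assuming the with_debug API of PartitionedArrays v0.3 as pinned in the manifest:

    using PartitionedArrays
    np = 3
    ranks = with_debug() do distribute
      # DebugArray: np "ranks" emulated sequentially in one process,
      # which is why the log shows DebugArray rather than an MPI array
      distribute(LinearIndices((np,)))
    end
    # Local sparse blocks like those printed above would then be wrapped in a
    # PSparseMatrix and handed to GridapPETSc's solvers (assumed flow).
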
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
KSP Object: 1 MPI processes
  type: gmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using DEFAULT norm type for convergence test
PC Object: 1 MPI processes
  type: jacobi
  PC has not been set up so information may be incomplete
    type DIAGONAL
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=4, cols=4
    total: nonzeros=10, allocated nonzeros=0
    total number of mallocs used during MatSetValues calls=0
      not using I-node routines
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
0 KSP Residual norm 2.000000000000e+00
1 KSP Residual norm 4.002966042487e-16
10.016182 seconds (4.01 M allocations: 211.798 MiB, 7.24% gc time, 92.10% compilation time)
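
The next block exercises the nonlinear path: PETSc's SNES, wrapped by GridapPETSc as PETScNonlinearSolver. A minimal sketch with flags assumed from the output below (-snes_monitor produces the "N SNES Function norm ..." lines, -snes_converged_reason the summary lines):

    using GridapPETSc
    options = "-snes_type newtonls -snes_monitor -snes_converged_reason"
    GridapPETSc.with(args=split(options)) do
      nls = PETScNonlinearSolver()
      # use nls through Gridap's nonlinear-solver API on a nonlinear operator
    end
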
0 SNES Function norm 3.605551275464e+00
1 SNES Function norm 4.444444444444e-01
2 SNES Function norm 7.111111111111e-02
3 SNES Function norm 3.936947327951e-03
4 SNES Function norm 1.525925473445e-05
5 SNES Function norm 2.328306437081e-10
6 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 6
0 SNES Function norm 0.000000000000e+00
1 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1
0 SNES Function norm 0.000000000000e+00
1 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1
0 SNES Function norm 1.486606874732e-01
1 SNES Function norm 8.402777777778e-03
2 SNES Function norm 6.831067663990e-05
3 SNES Function norm 4.665073682466e-09
4 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 4
0 SNES Function norm 0.000000000000e+00
1 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1
0 SNES Function norm 0.000000000000e+00
1 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1
0 SNES Function norm 3.605551275464e+00
1 SNES Function norm 4.444444444444e-01
2 SNES Function norm 7.111111111111e-02
3 SNES Function norm 3.936947327951e-03
4 SNES Function norm 1.525925473445e-05
5 SNES Function norm 2.328306437081e-10
6 SNES Function norm 0.000000000000e+00
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 6
2.654435 seconds (1.08 M allocations: 56.024 MiB, 13.85% gc time, 85.62% compilation time)
[0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0
[0] PetscInitialize(): PETSc successfully started: number of processors = 1
[0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-74ys2yzC.(none)
[0] PetscInitialize(): Running on machine: GridapPETSc-primary-74ys2yzC
[0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 1073741823
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 3; storage space: 2 unneeded,1 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. Using Inode routines
4×3 GridapPETSc.PETScMatrix:
 -4.0  0.0  0.0
  0.0  0.0  0.0
  0.0  0.0  0.0
  0.0  0.0  0.0
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 3; storage space: 11 unneeded,1 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. Using Inode routines
4×3 GridapPETSc.PETScMatrix:
 -4.0  0.0  0.0
  0.0  0.0  0.0
  0.0  0.0  0.0
  0.0  0.0  0.0
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
[0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 3; storage space: 0 unneeded,1 used
[0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0
[0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1
[0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines.
[0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. Using Inode routines
4×3 GridapPETSc.PETScMatrix:
 -2.0  0.0  0.0
  0.0  0.0  0.0
  0.0  0.0  0.0
  0.0  0.0  0.0
[0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1
[0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784
4-element GridapPETSc.PETScVector:
 6.0
 0.0
 0.0
 1.0
┌ Warning: 1 objects still not finalized before calling GridapPETSc.Finalize()
└ @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:28
[0] PetscFinalize(): PetscFinalize() called
11.060852 seconds (3.43 M allocations: 176.025 MiB, 3.23% gc time, 96.44% compilation time)
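
The warning above is emitted when GridapPETSc.Finalize() runs while a PETSc wrapper object is still reachable from Julia; the wrapper's finalizer then fires after PetscFinalize(), which appears to be what produces the "Invalid object classid" error block below (note the failing VecDestroy in the PETSc stack). A sketch of the pattern that avoids it, assuming the explicit-GC workaround:

    using GridapPETSc
    GridapPETSc.with() do
      # ... create and use PETScVector / PETScMatrix objects ...
      # Drop references and collect while PETSc is still initialized, so the
      # wrappers can destroy their PETSc handles in time:
      GC.gc()
    end
    # A live object that escapes the block is destroyed by a GC finalizer
    # after PETSc has shut down, triggering errors like the one below.
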
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Invalid object classid 1211222
This could happen if you compile with PETSC_HAVE_DYNAMIC_LIBRARIES, but link with static libraries.
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.15.2, Jul 10, 2021
[0]PETSC ERROR: GridapPETSc on a named GridapPETSc-primary-74ys2yzC by Unknown Tue Apr 8 18:14:54 2025
[0]PETSC ERROR: Configure options --prefix=/workspace/destdir CC=cc FC=gfortran CXX=c++ COPTFLAGS=-O3 CXXOPTFLAGS=-O3 CFLAGS=-fno-stack-protector FOPTFLAGS=-O3 --with-64-bit-indices=0 --with-debugging=0 --with-batch --PETSC_ARCH=x86_64-linux-gnu_double_real_Int32 --with-blaslapack-lib=/workspace/destdir/lib/libopenblas.so --with-blaslapack-suffix= --known-64-bit-blas-indices=0 --with-mpi-lib="[/workspace/destdir/lib/libmpifort.so,/workspace/destdir/lib/libmpi.so]" --known-mpi-int64_t=0 --with-mpi-include=/workspace/destdir/include --with-sowing=0 --with-precision=double --with-scalar-type=real
[0]PETSC ERROR: #1 PetscClassRegLogGetClass() at /workspace/srcdir/petsc-3.15.2/src/sys/logging/utils/classlog.c:263
[0]PETSC ERROR: #2 PetscLogObjDestroyDefault() at /workspace/srcdir/petsc-3.15.2/src/sys/logging/utils/classlog.c:363
[0]PETSC ERROR: #3 PetscHeaderDestroy_Private() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/inherit.c:103
[0]PETSC ERROR: #4 VecDestroy() at /workspace/srcdir/petsc-3.15.2/src/vec/vec/interface/vector.c:403
error in running finalizer: GridapPETSc.PETSC.PetscError(code=1)
macro expansion at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
Finalize at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/src/PETScArrays.jl:25
unknown function (ip: 0x7889b50e8712)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
run_finalizer at /source/src/gc.c:303
jl_gc_run_finalizers_in_list at /source/src/gc.c:395
run_finalizers at /source/src/gc.c:439
jl_mutex_unlock at /source/src/julia_locks.h:80 [inlined]
jl_generate_fptr_impl at /source/src/jitlayers.cpp:545
jl_compile_method_internal at /source/src/gf.c:2536 [inlined]
jl_compile_method_internal at /source/src/gf.c:2423
_jl_invoke at /source/src/gf.c:2940 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
#27 at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/CellData/CellFields.jl:614
map at ./tuple.jl:356 [inlined]
_to_common_domain at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/CellData/CellFields.jl:614
unknown function (ip: 0x788998d22856)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
_operate_cellfields at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/CellData/CellFields.jl:557
evaluate! at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/CellData/CellFields.jl:451 [inlined]
evaluate at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/Arrays/Maps.jl:87 [inlined]
Map at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/Arrays/Maps.jl:90 [inlined]
dot at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/CellData/CellFields.jl:638 [inlined]
a at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonDriver.jl:45 [inlined]
#60 at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/FESpaces/AffineFEOperators.jl:52
unknown function (ip: 0x788998d212be)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
AffineFEOperator at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/FESpaces/AffineFEOperators.jl:36
unknown function (ip: 0x788998d1d7be)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
jl_apply at /source/src/julia.h:2157 [inlined]
do_apply at /source/src/builtins.c:831
AffineFEOperator at /home/pkgeval/.julia/packages/Gridap/lHC3y/src/FESpaces/AffineFEOperators.jl:51
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
#1 at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonDriver.jl:57
#with#2 at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:38
unknown function (ip: 0x78899a19e282)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
with at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:36
unknown function (ip: 0x78899a192c96)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
jl_apply at /source/src/julia.h:2157 [inlined]
do_call at /source/src/interpreter.c:126
eval_value at /source/src/interpreter.c:223
eval_stmt_value at /source/src/interpreter.c:174 [inlined]
eval_body at /source/src/interpreter.c:670
jl_interpret_toplevel_thunk at /source/src/interpreter.c:824
top-level scope at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonDriver.jl:31
jl_toplevel_eval_flex at /source/src/toplevel.c:943
jl_eval_module_expr at /source/src/toplevel.c:215 [inlined]
jl_toplevel_eval_flex at /source/src/toplevel.c:743
jl_toplevel_eval_flex at /source/src/toplevel.c:886
ijl_toplevel_eval_in at /source/src/toplevel.c:994
eval at ./boot.jl:430 [inlined]
include_string at ./loading.jl:2734
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
_include at ./loading.jl:2794
include at ./Base.jl:557
jfptr_include_46977.1 at /opt/julia/lib/julia/sys.so (unknown line)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
jl_apply at /source/src/julia.h:2157 [inlined]
jl_f__call_latest at /source/src/builtins.c:875
include at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:1 [inlined]
macro expansion at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18 [inlined]
macro expansion at /source/usr/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
macro expansion at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18 [inlined]
macro expansion at ./timing.jl:581 [inlined]
top-level scope at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_invoke at /source/src/gf.c:2955
jl_toplevel_eval_flex at /source/src/toplevel.c:934
jl_eval_module_expr at /source/src/toplevel.c:215 [inlined]
jl_toplevel_eval_flex at /source/src/toplevel.c:743
jl_toplevel_eval_flex at /source/src/toplevel.c:886
ijl_toplevel_eval_in at /source/src/toplevel.c:994
eval at ./boot.jl:430 [inlined]
include_string at ./loading.jl:2734
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
_include at ./loading.jl:2794
include at ./Base.jl:557
jfptr_include_46977.1 at /opt/julia/lib/julia/sys.so (unknown line)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
jl_apply at /source/src/julia.h:2157 [inlined]
jl_f__call_latest at /source/src/builtins.c:875
include at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1 [inlined]
macro expansion at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
macro expansion at /source/usr/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
macro expansion at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
macro expansion at ./timing.jl:581 [inlined]
top-level scope at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_invoke at /source/src/gf.c:2955
jl_toplevel_eval_flex at /source/src/toplevel.c:934
jl_eval_module_expr at /source/src/toplevel.c:215 [inlined]
jl_toplevel_eval_flex at /source/src/toplevel.c:743
jl_toplevel_eval_flex at /source/src/toplevel.c:886
ijl_toplevel_eval_in at /source/src/toplevel.c:994
eval at ./boot.jl:430 [inlined]
include_string at ./loading.jl:2734
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
_include at ./loading.jl:2794
include at ./sysimg.jl:38
unknown function (ip: 0x7889b5094082)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
jl_apply at /source/src/julia.h:2157 [inlined]
do_call at /source/src/interpreter.c:126
eval_value at /source/src/interpreter.c:223
eval_stmt_value at /source/src/interpreter.c:174 [inlined]
eval_body at /source/src/interpreter.c:670
jl_interpret_toplevel_thunk at /source/src/interpreter.c:824
top-level scope at none:6
jl_toplevel_eval_flex at /source/src/toplevel.c:943
jl_toplevel_eval_flex at /source/src/toplevel.c:886
ijl_toplevel_eval_in at /source/src/toplevel.c:994
eval at ./boot.jl:430 [inlined]
exec_options at ./client.jl:296
_start at ./client.jl:531
jfptr__start_73523.1 at /opt/julia/lib/julia/sys.so (unknown line)
_jl_invoke at /source/src/gf.c:2948 [inlined]
ijl_apply_generic at /source/src/gf.c:3125
jl_apply at /source/src/julia.h:2157 [inlined]
true_main at /source/src/jlapi.c:900
jl_repl_entrypoint at /source/src/jlapi.c:1059
main at /source/cli/loader_exe.c:58
unknown function (ip: 0x7889b636d249)
__libc_start_main at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
unknown function (ip: 0x4010b8)
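
The monitor trace and -ksp_view output that follow come from the Poisson driver (test/sequential/PoissonDriver.jl, per the backtrace above) solving with CG preconditioned by smoothed-aggregation algebraic multigrid (GAMG). A sketch of the corresponding configuration, with flags assumed from the view below (relative tolerance 1e-10, monitor, converged reason, and view enabled):

    using GridapPETSc
    options = "-ksp_type cg -pc_type gamg -ksp_rtol 1.0e-10 -ksp_monitor -ksp_converged_reason -ksp_view"
    GridapPETSc.with(args=split(options)) do
      ls = PETScLinearSolver()
      # use ls to solve the AffineFEOperator assembled by the driver
    end
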
left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: gamg type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Complexity: grid = 1.15878 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1 MPI processes type: bjacobi number of blocks = 1 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 block size is 1 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998343, max = 1.09818 eigenvalues estimate via cg min 0.516584, max 0.998343 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=76, cols=76 total: nonzeros=2432, allocated nonzeros=2432 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998116, max = 1.09793 eigenvalues estimate via cg min 0.336175, max 0.998116 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines 0 KSP Residual norm 8.550000000000e-03 1 KSP Residual norm 2.039013341310e-04 2 KSP Residual norm 6.195852905698e-06 3 KSP Residual norm 1.300754719285e-07 4 KSP Residual norm 3.121504432740e-09 5 KSP Residual norm 7.977302448122e-11 6 KSP Residual norm 1.598148195898e-12 7 KSP Residual norm 3.741486163257e-14 Linear solve converged due to CONVERGED_RTOL iterations 7 KSP Object: 1 MPI processes type: cg maximum iterations=1000, initial guess is zero tolerances: relative=1e-10, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: gamg type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Complexity: grid = 1.15878 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1 MPI processes type: bjacobi number of blocks = 1 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 block size is 1 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998343, max = 1.09818 eigenvalues estimate via cg min 0.516584, max 0.998343 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=76, cols=76 total: nonzeros=2432, allocated nonzeros=2432 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998116, max = 1.09793 eigenvalues estimate via cg min 0.336175, max 0.998116 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
        linear system matrix = precond matrix:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=729, cols=729
          total: nonzeros=15625, allocated nonzeros=0
          total number of mallocs used during MatSetValues calls=0
            not using I-node routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=729, cols=729
    total: nonzeros=15625, allocated nonzeros=0
    total number of mallocs used during MatSetValues calls=0
      not using I-node routines
  0 KSP Residual norm 8.550000000000e-03
  1 KSP Residual norm 2.039013341310e-04
  2 KSP Residual norm 6.195852905698e-06
  3 KSP Residual norm 1.300754719285e-07
  4 KSP Residual norm 3.121504432740e-09
  5 KSP Residual norm 7.977302448122e-11
  6 KSP Residual norm 1.598148195898e-12
  7 KSP Residual norm 3.741486163257e-14
Linear solve converged due to CONVERGED_RTOL iterations 7
KSP Object: 1 MPI processes
  type: cg
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-10, absolute=1e-50, divergence=10000.
  left preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=3 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
        Threshold for dropping small values in graph on each level = 0. 0. 0.
        Threshold scaling factor for each level not specified = 1.
        AGG specific options
          Symmetric graph false
          Number of levels to square graph 0
          Number smoothing steps 1
        Complexity: grid = 1.15878
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 1 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 1 MPI processes
      type: bjacobi
        number of blocks = 1
        Local solver information for first block is in the following KSP and PC objects on rank 0:
        Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: cholesky
          out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: nd
          factor fill ratio given 5., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqsbaij
                rows=7, cols=7
                package used to perform factorization: petsc
                total: nonzeros=28, allocated nonzeros=28
                    block size is 1
        linear system matrix = precond matrix:
        Mat Object: (mg_coarse_sub_) 1 MPI processes
          type: seqaij
          rows=7, cols=7
          total: nonzeros=49, allocated nonzeros=49
          total number of mallocs used during MatSetValues calls=0
            using I-node routines: found 2 nodes, limit used is 5
      linear system matrix = precond matrix:
      Mat Object: (mg_coarse_sub_) 1 MPI processes
        type: seqaij
        rows=7, cols=7
        total: nonzeros=49, allocated nonzeros=49
        total number of mallocs used during MatSetValues calls=0
          using I-node routines: found 2 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 1 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.0998343, max = 1.09818
        eigenvalues estimate via cg min 0.516584, max 0.998343
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_1_esteig_) 1 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 1 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 1 MPI processes
        type: seqaij
        rows=76, cols=76
        total: nonzeros=2432, allocated nonzeros=2432
        total number of mallocs used during MatSetValues calls=0
          not using I-node routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 1 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.0998116, max = 1.09793
        eigenvalues estimate via cg min 0.336175, max 0.998116
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_2_esteig_) 1 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 1 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 1 MPI processes
        type: seqaij
        rows=729, cols=729
        total: nonzeros=15625, allocated nonzeros=39304
        total number of mallocs used during MatSetValues calls=0
          not using I-node routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=729, cols=729
    total: nonzeros=15625, allocated nonzeros=39304
    total number of mallocs used during MatSetValues calls=0
      not using I-node routines
  0 KSP Residual norm 8.550000000000e-03
  1 KSP Residual norm 2.039013341310e-04
  2 KSP Residual norm 6.195852905698e-06
  3 KSP Residual norm 1.300754719285e-07
  4 KSP Residual norm 3.121504432740e-09
  5 KSP Residual norm 7.977302448122e-11
  6 KSP Residual norm 1.598148195898e-12
  7 KSP Residual norm 3.741486163257e-14
Linear solve converged due to CONVERGED_RTOL iterations 7
KSP Object: 1 MPI processes
  type: cg
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-10, absolute=1e-50, divergence=10000.
  left preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: gamg
    type is MULTIPLICATIVE, levels=3 cycles=v
      Cycles per PCApply=1
      Using externally compute Galerkin coarse grid matrices
      GAMG specific options
        Threshold for dropping small values in graph on each level = 0. 0. 0.
        Threshold scaling factor for each level not specified = 1.
        AGG specific options
          Symmetric graph false
          Number of levels to square graph 0
          Number smoothing steps 1
        Complexity: grid = 1.15878
  Coarse grid solver -- level -------------------------------
    KSP Object: (mg_coarse_) 1 MPI processes
      type: preonly
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_coarse_) 1 MPI processes
      type: bjacobi
        number of blocks = 1
        Local solver information for first block is in the following KSP and PC objects on rank 0:
        Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks
      KSP Object: (mg_coarse_sub_) 1 MPI processes
        type: preonly
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (mg_coarse_sub_) 1 MPI processes
        type: cholesky
          out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: nd
          factor fill ratio given 5., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqsbaij
                rows=7, cols=7
                package used to perform factorization: petsc
                total: nonzeros=28, allocated nonzeros=28
                    block size is 1
        linear system matrix = precond matrix:
        Mat Object: (mg_coarse_sub_) 1 MPI processes
          type: seqaij
          rows=7, cols=7
          total: nonzeros=49, allocated nonzeros=49
          total number of mallocs used during MatSetValues calls=0
            using I-node routines: found 2 nodes, limit used is 5
      linear system matrix = precond matrix:
      Mat Object: (mg_coarse_sub_) 1 MPI processes
        type: seqaij
        rows=7, cols=7
        total: nonzeros=49, allocated nonzeros=49
        total number of mallocs used during MatSetValues calls=0
          using I-node routines: found 2 nodes, limit used is 5
  Down solver (pre-smoother) on level 1 -------------------------------
    KSP Object: (mg_levels_1_) 1 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.0998343, max = 1.09818
        eigenvalues estimate via cg min 0.516584, max 0.998343
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_1_esteig_) 1 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_1_) 1 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 1 MPI processes
        type: seqaij
        rows=76, cols=76
        total: nonzeros=2432, allocated nonzeros=2432
        total number of mallocs used during MatSetValues calls=0
          not using I-node routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  Down solver (pre-smoother) on level 2 -------------------------------
    KSP Object: (mg_levels_2_) 1 MPI processes
      type: chebyshev
        eigenvalue estimates used: min = 0.0998116, max = 1.09793
        eigenvalues estimate via cg min 0.336175, max 0.998116
        eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1]
        KSP Object: (mg_levels_2_esteig_) 1 MPI processes
          type: cg
          maximum iterations=10, initial guess is zero
          tolerances: relative=1e-12, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        estimating eigenvalues using noisy right hand side
      maximum iterations=2, nonzero initial guess
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using NONE norm type for convergence test
    PC Object: (mg_levels_2_) 1 MPI processes
      type: sor
        type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
      linear system matrix = precond matrix:
      Mat Object: 1 MPI processes
        type: seqaij
        rows=729, cols=729
        total: nonzeros=15625, allocated nonzeros=39304
        total number of mallocs used during MatSetValues calls=0
          not using I-node routines
  Up solver (post-smoother) same as down solver (pre-smoother)
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=729, cols=729
    total: nonzeros=15625, allocated nonzeros=39304
    total number of mallocs used during MatSetValues calls=0
      not using I-node routines
130.770350 seconds (85.96 M allocations: 4.277 GiB, 1.91% gc time, 99.42% compilation time)
  0 KSP Residual norm 0.0467654
  1 KSP Residual norm 0.0051782
  2 KSP Residual norm 0.000607254
  3 KSP Residual norm 4.67692e-05
  4 KSP Residual norm 3.52562e-06
  5 KSP Residual norm 3.76699e-06
  6 KSP Residual norm 1.13505e-06
  7 KSP Residual norm 8.08586e-07
  8 KSP Residual norm 1.87761e-07
  9 KSP Residual norm 1.5687e-07
 10 KSP Residual norm 3.40583e-08
 11 KSP Residual norm 1.62582e-08
 12 KSP Residual norm 8.72662e-09
 13 KSP Residual norm 4.20939e-09
 14 KSP Residual norm 1.82524e-09
 15 KSP Residual norm 4.163e-10
Linear solve converged due to CONVERGED_RTOL iterations 15
 29.103882 seconds (14.59 M allocations: 757.531 MiB, 2.62% gc time, 97.10% compilation time)
[ Info: Skipping DarcyDriver since petsc is not configured with mumps.
  5.942170 seconds (8.74 M allocations: 434.683 MiB, 10.88% gc time, 90.46% compilation time)
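The -ksp_monitor/-ksp_view output above is what PETSc prints for a CG solve preconditioned with smoothed-aggregation algebraic multigrid (GAMG). For reference, a minimal sketch of how such options are passed through GridapPETSc; this is illustrative, not the driver the test suite actually runs, and the mesh and weak form are assumptions chosen only to produce a 729-row system like the one shown (8x8x8 cells, first-order Lagrangian nodes, no Dirichlet constraints; a reaction term keeps the problem well posed):

    using Gridap
    using GridapPETSc

    # Hypothetical PETSc options mirroring what the log shows: CG, GAMG,
    # residual monitor, convergence reason, and a full solver view.
    options = "-ksp_type cg -pc_type gamg -ksp_rtol 1.0e-10 -ksp_monitor -ksp_converged_reason -ksp_view"

    GridapPETSc.with(args=split(options)) do
      model = CartesianDiscreteModel((0,1,0,1,0,1), (8,8,8))  # 9^3 = 729 nodes
      reffe = ReferenceFE(lagrangian, Float64, 1)
      V = TestFESpace(model, reffe)
      U = TrialFESpace(V)
      Ω = Triangulation(model)
      dΩ = Measure(Ω, 2)
      a(u,v) = ∫( u*v + ∇(v)⋅∇(u) )dΩ  # reaction + diffusion, SPD without Dirichlet data
      l(v) = ∫( v )dΩ
      op = AffineFEOperator(a, l, U, V)
      solver = LinearFESolver(PETScLinearSolver())
      uh = solve(solver, op)
    end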
  0 SNES Function norm 2.409822369424e+06
  Linear solve converged due to CONVERGED_RTOL iterations 54
  1 SNES Function norm 8.075649416004e+00
  Linear solve converged due to CONVERGED_RTOL iterations 39
  2 SNES Function norm 4.662170753064e-01
  Linear solve converged due to CONVERGED_RTOL iterations 37
  3 SNES Function norm 8.399998666395e-03
  Linear solve converged due to CONVERGED_RTOL iterations 37
  4 SNES Function norm 4.665302402881e-06
  Linear solve converged due to CONVERGED_RTOL iterations 44
  5 SNES Function norm 1.573803618773e-11
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 5
  0 SNES Function norm 2.409822369424e+06
  Linear solve converged due to CONVERGED_RTOL iterations 54
  1 SNES Function norm 8.075649416004e+00
  Linear solve converged due to CONVERGED_RTOL iterations 39
  2 SNES Function norm 4.662170753064e-01
  Linear solve converged due to CONVERGED_RTOL iterations 37
  3 SNES Function norm 8.399998666395e-03
  Linear solve converged due to CONVERGED_RTOL iterations 37
  4 SNES Function norm 4.665302402881e-06
  Linear solve converged due to CONVERGED_RTOL iterations 44
  5 SNES Function norm 1.573803618773e-11
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 5
 48.625455 seconds (25.14 M allocations: 1.264 GiB, 1.80% gc time, 96.65% compilation time)
Linear solve converged due to CONVERGED_RTOL iterations 28
174.415504 seconds (83.37 M allocations: 4.119 GiB, 1.18% gc time, 94.74% compilation time: <1% of which was recompilation)
  0 SNES Function norm 2.409822369424e+06
  Linear solve converged due to CONVERGED_RTOL iterations 258
  1 SNES Function norm 8.077651079465e+00
  Linear solve converged due to CONVERGED_RTOL iterations 225
  2 SNES Function norm 4.661493250862e-01
  Linear solve converged due to CONVERGED_RTOL iterations 215
  3 SNES Function norm 8.397548297111e-03
  Linear solve converged due to CONVERGED_RTOL iterations 246
  4 SNES Function norm 4.669108166700e-06
  Linear solve converged due to CONVERGED_RTOL iterations 313
  5 SNES Function norm 4.635673871892e-11
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 5
  0 SNES Function norm 4.635673871892e-11
  Linear solve converged due to CONVERGED_RTOL iterations 287
  1 SNES Function norm 5.072214559542e-13
Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 1
  0 SNES Function norm 2.409822369424e+06
  Linear solve converged due to CONVERGED_RTOL iterations 258
  1 SNES Function norm 8.077651079465e+00
  Linear solve converged due to CONVERGED_RTOL iterations 225
  2 SNES Function norm 4.661493250862e-01
  Linear solve converged due to CONVERGED_RTOL iterations 215
  3 SNES Function norm 8.397548297115e-03
  Linear solve converged due to CONVERGED_RTOL iterations 246
  4 SNES Function norm 4.669108167910e-06
  Linear solve converged due to CONVERGED_RTOL iterations 313
  5 SNES Function norm 4.635742176199e-11
Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 5
  0 SNES Function norm 4.635742176199e-11
  Linear solve converged due to CONVERGED_RTOL iterations 287
  1 SNES Function norm 5.163441011326e-13
Nonlinear solve converged due to CONVERGED_SNORM_RELATIVE iterations 1
 68.109028 seconds (20.89 M allocations: 1.120 GiB, 1.86% gc time, 59.88% compilation time)
Test Summary: | Pass  Total     Time
SEQUENTIAL    |  172    172  9m41.8s
581.901848 seconds (292.57 M allocations: 14.693 GiB, 1.98% gc time, 89.19% compilation time: <1% of which was recompilation)
┌ Warning: mpiexec() is deprecated, use the non-do-block form
│   caller = ip:0x0
└ @ Core :-1
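The deprecation warning above is emitted by the JLL-generated mpiexec wrapper, which the test harness in test/mpi/mpiexec.jl invokes through its do-block form (see frames [10]-[12] in the stacktraces below). A minimal sketch of the two calling styles, assuming MPICH_jll is the MPI backend (any MPI JLL exposes the same generated API) and with a hypothetical driver.jl:

    using MPICH_jll  # assumption: whichever MPI JLL provides the mpiexec binary

    # Deprecated do-block form, as flagged by the warning:
    mpiexec() do exe
        run(`$exe -n 3 julia --project driver.jl`)
    end

    # Non-do-block form suggested by the warning: calling mpiexec() directly
    # returns a Cmd that already carries the adjusted PATH/LIBPATH environment.
    run(`$(mpiexec()) -n 3 julia --project driver.jl`)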
ERROR: LoadError: ArgumentError: Package Gridap [56d4f2e9-7ea1-5844-9cf6-b9c51ca7ce8e] is required but does not seem to be installed:
 - Run `Pkg.instantiate()` to install all recorded dependencies.
Stacktrace:
  [1] _require(pkg::Base.PkgId, env::String) @ Base ./loading.jl:2515
  [2] __require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2388
  [3] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
  [4] invoke_in_world @ ./essentials.jl:1086 [inlined]
  [5] _require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2375
  [6] macro expansion @ ./loading.jl:2314 [inlined]
  [7] macro expansion @ ./lock.jl:273 [inlined]
  [8] __require(into::Module, mod::Symbol) @ Base ./loading.jl:2271
  [9] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
 [10] invoke_in_world @ ./essentials.jl:1086 [inlined]
 [11] require(into::Module, mod::Symbol) @ Base ./loading.jl:2260
 [12] include(fname::String) @ Main ./sysimg.jl:38
 [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/PartitionedArraysTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/PartitionedArraysTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PartitionedArraysTests.jl:1
PartitionedArrays: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:7
  Got exception outside of a @test
  LoadError: failed process: Process(`/home/pkgeval/.julia/artifacts/05d8c79b270470018e9de8dd24ddb6d7954aff9d/bin/mpiexec -n 3 /opt/julia/bin/julia -C native -J/opt/julia/lib/julia/sys.so --depwarn=yes --check-bounds=yes --pkgimages=existing -g1 --startup-file=no --project=/home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/../.. /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PartitionedArraysTests.jl`, ProcessExited(1))
  Stacktrace:
    [1] pipeline_error @ ./process.jl:598 [inlined]
    [2] run(::Cmd; wait::Bool) @ Base ./process.jl:513
    [3] run @ ./process.jl:510 [inlined]
    [4] (::Main.GridapPETScTests.GridapPETScMPITests.PartitionedArraysTestsRun.var"#2#3"{Int64, String, String, String})(cmd::String) @ Main.GridapPETScTests.GridapPETScMPITests.PartitionedArraysTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:11
    [5] (::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.PartitionedArraysTestsRun.var"#2#3"{Int64, String, String, String}, String})() @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:49
    [6] withenv(::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.PartitionedArraysTestsRun.var"#2#3"{Int64, String, String, String}, String}, ::Pair{String, String}, ::Vararg{Pair{String, String}}) @ Base ./env.jl:265
    [7] withenv_executable_wrapper(f::Function, executable_path::String, PATH::String, LIBPATH::String, adjust_PATH::Bool, adjust_LIBPATH::Bool) @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:48
    [8] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{}) @ Base ./essentials.jl:1055
    [9] invokelatest(::Any, ::Any, ::Vararg{Any}) @ Base ./essentials.jl:1052
   [10] #mpiexec#3 @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:28 [inlined]
   [11] mpiexec @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:25 [inlined]
   [12] run_mpi_driver(; procs::Int64, file::String) @ Main.GridapPETScTests.GridapPETScMPITests.PartitionedArraysTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:7
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/PartitionedArraysTestsRun.jl:3
   [14] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [15] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:1 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:7 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:7 [inlined]
   [19] macro expansion @ ./timing.jl:581 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:7
   [21] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [22] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1 [inlined]
   [23] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [24] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [25] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [26] macro expansion @ ./timing.jl:581 [inlined]
   [27] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6
   [28] include(fname::String) @ Main ./sysimg.jl:38
   [29] top-level scope @ none:6
   [30] eval @ ./boot.jl:430 [inlined]
   [31] exec_options(opts::Base.JLOptions) @ Base ./client.jl:296
   [32] _start() @ Base ./client.jl:531
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PartitionedArraysTestsRun.jl:1
 26.830425 seconds (2.07 M allocations: 109.818 MiB, 23.36% compilation time)
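Each MPI driver below fails the same way: the worker processes are started with --project pointing at the package directory (test/mpi/../..), and in that environment Gridap (or, for PoissonTests, SparseMatricesCSR) is recorded in the manifest but not installed. The remedy the ArgumentError itself proposes would look roughly like the sketch below; the path is taken from the failed command line above, and whether PkgEval's sandbox actually permits installing into it is a separate question:

    using Pkg

    # Activate the project the MPI workers are pointed at and
    # materialize all dependencies recorded in its manifest.
    Pkg.activate("/home/pkgeval/.julia/packages/GridapPETSc/X0jXh")
    Pkg.instantiate()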
ERROR: LoadError: ArgumentError: Package Gridap [56d4f2e9-7ea1-5844-9cf6-b9c51ca7ce8e] is required but does not seem to be installed:
 - Run `Pkg.instantiate()` to install all recorded dependencies.
Stacktrace:
  [1] _require(pkg::Base.PkgId, env::String) @ Base ./loading.jl:2515
  [2] __require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2388
  [3] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
  [4] invoke_in_world @ ./essentials.jl:1086 [inlined]
  [5] _require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2375
  [6] macro expansion @ ./loading.jl:2314 [inlined]
  [7] macro expansion @ ./lock.jl:273 [inlined]
  [8] __require(into::Module, mod::Symbol) @ Base ./loading.jl:2271
  [9] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
 [10] invoke_in_world @ ./essentials.jl:1086 [inlined]
 [11] require(into::Module, mod::Symbol) @ Base ./loading.jl:2260
 [12] include(fname::String) @ Main ./sysimg.jl:38
 [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/PLaplacianTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/PLaplacianTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PLaplacianTests.jl:1
PLaplacianTests: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:8
  Got exception outside of a @test
  LoadError: failed process: Process(`/home/pkgeval/.julia/artifacts/05d8c79b270470018e9de8dd24ddb6d7954aff9d/bin/mpiexec -n 2 /opt/julia/bin/julia -C native -J/opt/julia/lib/julia/sys.so --depwarn=yes --check-bounds=yes --pkgimages=existing -g1 --startup-file=no --project=/home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/../.. /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PLaplacianTests.jl`, ProcessExited(1))
  Stacktrace:
    [1] pipeline_error @ ./process.jl:598 [inlined]
    [2] run(::Cmd; wait::Bool) @ Base ./process.jl:513
    [3] run @ ./process.jl:510 [inlined]
    [4] (::Main.GridapPETScTests.GridapPETScMPITests.PLaplacianTestsRun.var"#2#3"{Int64, String, String, String})(cmd::String) @ Main.GridapPETScTests.GridapPETScMPITests.PLaplacianTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:11
    [5] (::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.PLaplacianTestsRun.var"#2#3"{Int64, String, String, String}, String})() @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:49
    [6] withenv(::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.PLaplacianTestsRun.var"#2#3"{Int64, String, String, String}, String}, ::Pair{String, String}, ::Vararg{Pair{String, String}}) @ Base ./env.jl:265
    [7] withenv_executable_wrapper(f::Function, executable_path::String, PATH::String, LIBPATH::String, adjust_PATH::Bool, adjust_LIBPATH::Bool) @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:48
    [8] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{}) @ Base ./essentials.jl:1055
    [9] invokelatest(::Any, ::Any, ::Vararg{Any}) @ Base ./essentials.jl:1052
   [10] #mpiexec#3 @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:28 [inlined]
   [11] mpiexec @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:25 [inlined]
   [12] run_mpi_driver(; procs::Int64, file::String) @ Main.GridapPETScTests.GridapPETScMPITests.PLaplacianTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:7
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/PLaplacianTestsRun.jl:3
   [14] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [15] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:1 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:8 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:8 [inlined]
   [19] macro expansion @ ./timing.jl:581 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:8
   [21] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [22] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1 [inlined]
   [23] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [24] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [25] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [26] macro expansion @ ./timing.jl:581 [inlined]
   [27] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6
   [28] include(fname::String) @ Main ./sysimg.jl:38
   [29] top-level scope @ none:6
   [30] eval @ ./boot.jl:430 [inlined]
   [31] exec_options(opts::Base.JLOptions) @ Base ./client.jl:296
   [32] _start() @ Base ./client.jl:531
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PLaplacianTestsRun.jl:1
 13.986053 seconds (57.68 k allocations: 3.134 MiB, 1.57% compilation time)
ERROR: LoadError: ArgumentError: Package Gridap [56d4f2e9-7ea1-5844-9cf6-b9c51ca7ce8e] is required but does not seem to be installed:
 - Run `Pkg.instantiate()` to install all recorded dependencies.
Stacktrace:
  [1] _require(pkg::Base.PkgId, env::String) @ Base ./loading.jl:2515
  [2] __require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2388
  [3] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
  [4] invoke_in_world @ ./essentials.jl:1086 [inlined]
  [5] _require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2375
  [6] macro expansion @ ./loading.jl:2314 [inlined]
  [7] macro expansion @ ./lock.jl:273 [inlined]
  [8] __require(into::Module, mod::Symbol) @ Base ./loading.jl:2271
  [9] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
 [10] invoke_in_world @ ./essentials.jl:1086 [inlined]
 [11] require(into::Module, mod::Symbol) @ Base ./loading.jl:2260
 [12] include(fname::String) @ Main ./sysimg.jl:38
 [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/GCTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/PLaplacianTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/GCTests.jl:1
GCTests: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:9
  Got exception outside of a @test
  LoadError: failed process: Process(`/home/pkgeval/.julia/artifacts/05d8c79b270470018e9de8dd24ddb6d7954aff9d/bin/mpiexec -n 4 /opt/julia/bin/julia -C native -J/opt/julia/lib/julia/sys.so --depwarn=yes --check-bounds=yes --pkgimages=existing -g1 --startup-file=no --project=/home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/../.. /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/GCTests.jl`, ProcessExited(1))
  Stacktrace:
    [1] pipeline_error @ ./process.jl:598 [inlined]
    [2] run(::Cmd; wait::Bool) @ Base ./process.jl:513
    [3] run @ ./process.jl:510 [inlined]
    [4] (::Main.GridapPETScTests.GridapPETScMPITests.GCTestsRun.var"#2#3"{Int64, String, String, String})(cmd::String) @ Main.GridapPETScTests.GridapPETScMPITests.GCTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:11
    [5] (::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.GCTestsRun.var"#2#3"{Int64, String, String, String}, String})() @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:49
    [6] withenv(::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.GCTestsRun.var"#2#3"{Int64, String, String, String}, String}, ::Pair{String, String}, ::Vararg{Pair{String, String}}) @ Base ./env.jl:265
    [7] withenv_executable_wrapper(f::Function, executable_path::String, PATH::String, LIBPATH::String, adjust_PATH::Bool, adjust_LIBPATH::Bool) @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:48
    [8] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{}) @ Base ./essentials.jl:1055
    [9] invokelatest(::Any, ::Any, ::Vararg{Any}) @ Base ./essentials.jl:1052
   [10] #mpiexec#3 @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:28 [inlined]
   [11] mpiexec @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:25 [inlined]
   [12] run_mpi_driver(; procs::Int64, file::String) @ Main.GridapPETScTests.GridapPETScMPITests.GCTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:7
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/GCTestsRun.jl:3
   [14] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [15] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:1 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:9 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:9 [inlined]
   [19] macro expansion @ ./timing.jl:581 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:9
   [21] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [22] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1 [inlined]
   [23] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [24] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [25] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [26] macro expansion @ ./timing.jl:581 [inlined]
   [27] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6
   [28] include(fname::String) @ Main ./sysimg.jl:38
   [29] top-level scope @ none:6
   [30] eval @ ./boot.jl:430 [inlined]
   [31] exec_options(opts::Base.JLOptions) @ Base ./client.jl:296
   [32] _start() @ Base ./client.jl:531
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/GCTestsRun.jl:1
 27.594885 seconds (57.67 k allocations: 3.132 MiB, 0.77% compilation time)
ERROR: LoadError: ArgumentError: Package SparseMatricesCSR [a0a7dd2c-ebf4-11e9-1f05-cf50bc540ca1] is required but does not seem to be installed:
 - Run `Pkg.instantiate()` to install all recorded dependencies.
Stacktrace:
  [1] _require(pkg::Base.PkgId, env::String) @ Base ./loading.jl:2515
  [2] __require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2388
  [3] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
  [4] invoke_in_world @ ./essentials.jl:1086 [inlined]
  [5] _require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2375
  [6] macro expansion @ ./loading.jl:2314 [inlined]
  [7] macro expansion @ ./lock.jl:273 [inlined]
  [8] __require(into::Module, mod::Symbol) @ Base ./loading.jl:2271
  [9] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
 [10] invoke_in_world @ ./essentials.jl:1086 [inlined]
 [11] require(into::Module, mod::Symbol) @ Base ./loading.jl:2260
 [12] include(fname::String) @ Main ./sysimg.jl:38
 [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/PoissonTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/PoissonTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PoissonTests.jl:1
PoissonTests: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:10
  Got exception outside of a @test
  LoadError: failed process: Process(`/home/pkgeval/.julia/artifacts/05d8c79b270470018e9de8dd24ddb6d7954aff9d/bin/mpiexec -n 4 /opt/julia/bin/julia -C native -J/opt/julia/lib/julia/sys.so --depwarn=yes --check-bounds=yes --pkgimages=existing -g1 --startup-file=no --project=/home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/../.. /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PoissonTests.jl`, ProcessExited(1))
  Stacktrace:
    [1] pipeline_error @ ./process.jl:598 [inlined]
    [2] run(::Cmd; wait::Bool) @ Base ./process.jl:513
    [3] run @ ./process.jl:510 [inlined]
    [4] (::Main.GridapPETScTests.GridapPETScMPITests.PoissonTestsRun.var"#2#3"{Int64, String, String, String})(cmd::String) @ Main.GridapPETScTests.GridapPETScMPITests.PoissonTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:11
    [5] (::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.PoissonTestsRun.var"#2#3"{Int64, String, String, String}, String})() @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:49
    [6] withenv(::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.PoissonTestsRun.var"#2#3"{Int64, String, String, String}, String}, ::Pair{String, String}, ::Vararg{Pair{String, String}}) @ Base ./env.jl:265
    [7] withenv_executable_wrapper(f::Function, executable_path::String, PATH::String, LIBPATH::String, adjust_PATH::Bool, adjust_LIBPATH::Bool) @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:48
    [8] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{}) @ Base ./essentials.jl:1055
    [9] invokelatest(::Any, ::Any, ::Vararg{Any}) @ Base ./essentials.jl:1052
   [10] #mpiexec#3 @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:28 [inlined]
   [11] mpiexec @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:25 [inlined]
   [12] run_mpi_driver(; procs::Int64, file::String) @ Main.GridapPETScTests.GridapPETScMPITests.PoissonTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:7
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/PoissonTestsRun.jl:3
   [14] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [15] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:1 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:10 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:10 [inlined]
   [19] macro expansion @ ./timing.jl:581 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:10
   [21] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [22] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1 [inlined]
   [23] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [24] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [25] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [26] macro expansion @ ./timing.jl:581 [inlined]
   [27] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6
   [28] include(fname::String) @ Main ./sysimg.jl:38
   [29] top-level scope @ none:6
   [30] eval @ ./boot.jl:430 [inlined]
   [31] exec_options(opts::Base.JLOptions) @ Base ./client.jl:296
   [32] _start() @ Base ./client.jl:531
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/PoissonTestsRun.jl:1
 27.762511 seconds (57.67 k allocations: 3.131 MiB, 0.76% compilation time)
ERROR: LoadError: ArgumentError: Package Gridap [56d4f2e9-7ea1-5844-9cf6-b9c51ca7ce8e] is required but does not seem to be installed:
 - Run `Pkg.instantiate()` to install all recorded dependencies.
Stacktrace:
  [1] _require(pkg::Base.PkgId, env::String) @ Base ./loading.jl:2515
  [2] __require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2388
  [3] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
  [4] invoke_in_world @ ./essentials.jl:1086 [inlined]
  [5] _require_prelocked(uuidkey::Base.PkgId, env::String) @ Base ./loading.jl:2375
  [6] macro expansion @ ./loading.jl:2314 [inlined]
  [7] macro expansion @ ./lock.jl:273 [inlined]
  [8] __require(into::Module, mod::Symbol) @ Base ./loading.jl:2271
  [9] #invoke_in_world#3 @ ./essentials.jl:1089 [inlined]
 [10] invoke_in_world @ ./essentials.jl:1086 [inlined]
 [11] require(into::Module, mod::Symbol) @ Base ./loading.jl:2260
 [12] include(fname::String) @ Main ./sysimg.jl:38
 [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/DarcyTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/DarcyTests.jl:1
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/DarcyTests.jl:1
DarcyTests: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:11
  Got exception outside of a @test
  LoadError: failed process: Process(`/home/pkgeval/.julia/artifacts/05d8c79b270470018e9de8dd24ddb6d7954aff9d/bin/mpiexec -n 4 /opt/julia/bin/julia -C native -J/opt/julia/lib/julia/sys.so --depwarn=yes --check-bounds=yes --pkgimages=existing -g1 --startup-file=no --project=/home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/../.. /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/DarcyTests.jl`, ProcessExited(1))
  Stacktrace:
    [1] pipeline_error @ ./process.jl:598 [inlined]
    [2] run(::Cmd; wait::Bool) @ Base ./process.jl:513
    [3] run @ ./process.jl:510 [inlined]
    [4] (::Main.GridapPETScTests.GridapPETScMPITests.DarcyTestsRun.var"#2#3"{Int64, String, String, String})(cmd::String) @ Main.GridapPETScTests.GridapPETScMPITests.DarcyTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:11
    [5] (::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.DarcyTestsRun.var"#2#3"{Int64, String, String, String}, String})() @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:49
    [6] withenv(::JLLWrappers.var"#2#3"{Main.GridapPETScTests.GridapPETScMPITests.DarcyTestsRun.var"#2#3"{Int64, String, String, String}, String}, ::Pair{String, String}, ::Vararg{Pair{String, String}}) @ Base ./env.jl:265
    [7] withenv_executable_wrapper(f::Function, executable_path::String, PATH::String, LIBPATH::String, adjust_PATH::Bool, adjust_LIBPATH::Bool) @ JLLWrappers ~/.julia/packages/JLLWrappers/GfYNv/src/runtime.jl:48
    [8] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{}) @ Base ./essentials.jl:1055
    [9] invokelatest(::Any, ::Any, ::Vararg{Any}) @ Base ./essentials.jl:1052
   [10] #mpiexec#3 @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:28 [inlined]
   [11] mpiexec @ ~/.julia/packages/JLLWrappers/GfYNv/src/products/executable_generators.jl:25 [inlined]
   [12] run_mpi_driver(; procs::Int64, file::String) @ Main.GridapPETScTests.GridapPETScMPITests.DarcyTestsRun ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/mpiexec.jl:7
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/DarcyTestsRun.jl:3
   [14] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [15] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:1 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:11 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:11 [inlined]
   [19] macro expansion @ ./timing.jl:581 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/mpi/runtests.jl:11
   [21] include(mod::Module, _path::String) @ Base ./Base.jl:557
   [22] include @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1 [inlined]
   [23] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [24] macro expansion @ /opt/julia/share/julia/stdlib/v1.11/Test/src/Test.jl:1704 [inlined]
   [25] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6 [inlined]
   [26] macro expansion @ ./timing.jl:581 [inlined]
   [27] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:6
   [28] include(fname::String) @ Main ./sysimg.jl:38
   [29] top-level scope @ none:6
   [30] eval @ ./boot.jl:430 [inlined]
   [31] exec_options(opts::Base.JLOptions) @ Base ./client.jl:296
   [32] _start() @ Base ./client.jl:531
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/mpi/DarcyTestsRun.jl:1
 27.857216 seconds (57.68 k allocations: 3.131 MiB, 0.76% compilation time)
Test Summary:       | Error  Total     Time
MPI                 |     5      5  2m05.5s
  PartitionedArrays |     1      1    26.8s
  PLaplacianTests   |     1      1    14.0s
  GCTests           |     1      1    27.6s
  PoissonTests      |     1      1    27.8s
  DarcyTests        |     1      1    27.9s
ERROR: LoadError: Some tests did not pass: 0 passed, 0 failed, 5 errored, 0 broken.
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1
Testing failed after 676.3s
ERROR: LoadError: Package GridapPETSc errored during testing
Stacktrace:
 [1] pkgerror(msg::String) @ Pkg.Types /opt/julia/share/julia/stdlib/v1.11/Pkg/src/Types.jl:68
 [2] test(ctx::Pkg.Types.Context, pkgs::Vector{Pkg.Types.PackageSpec}; coverage::Bool, julia_args::Cmd, test_args::Cmd, test_fn::Nothing, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool) @ Pkg.Operations /opt/julia/share/julia/stdlib/v1.11/Pkg/src/Operations.jl:2124
 [3] test @ /opt/julia/share/julia/stdlib/v1.11/Pkg/src/Operations.jl:2007 [inlined]
 [4] test(ctx::Pkg.Types.Context, pkgs::Vector{Pkg.Types.PackageSpec}; coverage::Bool, test_fn::Nothing, julia_args::Cmd, test_args::Cmd, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool, kwargs::@Kwargs{io::IOContext{IO}}) @ Pkg.API /opt/julia/share/julia/stdlib/v1.11/Pkg/src/API.jl:481
 [5] test(pkgs::Vector{Pkg.Types.PackageSpec}; io::IOContext{IO}, kwargs::@Kwargs{julia_args::Cmd}) @ Pkg.API /opt/julia/share/julia/stdlib/v1.11/Pkg/src/API.jl:159
 [6] test @ /opt/julia/share/julia/stdlib/v1.11/Pkg/src/API.jl:147 [inlined]
 [7] #test#74 @ /opt/julia/share/julia/stdlib/v1.11/Pkg/src/API.jl:146 [inlined]
 [8] top-level scope @ /PkgEval.jl/scripts/evaluate.jl:219
in expression starting at /PkgEval.jl/scripts/evaluate.jl:210
PkgEval failed after 850.48s: package tests unexpectedly errored