Package evaluation of GridapPETSc on Julia 1.12.0-DEV.1805 (a080deafdd*) started at 2025-03-24T21:38:04.063 ################################################################################ # Set-up # Installing PkgEval dependencies (TestEnv)... Set-up completed after 8.97s ################################################################################ # Installation # Installing GridapPETSc... Resolving package versions... Installed GridapPETSc ─ v0.5.3 Updating `~/.julia/environments/v1.12/Project.toml` [bcdc36c2] + GridapPETSc v0.5.3 Updating `~/.julia/environments/v1.12/Manifest.toml` [47edcb42] + ADTypes v1.14.0 [621f4979] + AbstractFFTs v1.5.0 [1520ce14] + AbstractTrees v0.4.5 [79e6a3ab] + Adapt v4.3.0 [dce04be8] + ArgCheck v2.5.0 [4fba245c] + ArrayInterface v7.18.0 [4c555306] + ArrayLayouts v1.11.1 [15f4f7f2] + AutoHashEquals v2.2.0 [fbb218c0] + BSON v0.3.9 [8e7c35d0] + BlockArrays v1.5.0 [7a955b69] + CircularArrays v1.4.0 [944b1d66] + CodecZlib v0.7.8 [861a8166] + Combinatorics v1.0.2 [bbf7d656] + CommonSubexpressions v0.3.1 [34da2185] + Compat v4.16.0 [187b0558] + ConstructionBase v1.5.8 [864edb3b] + DataStructures v0.18.22 [163ba53b] + DiffResults v1.1.0 [b552c78f] + DiffRules v1.15.1 [a0c0ee7d] + DifferentiationInterface v0.6.48 [b4f34e82] + Distances v0.10.12 [ffbed154] + DocStringExtensions v0.9.3 [7a1cc6ca] + FFTW v1.8.1 [442a2c76] + FastGaussQuadrature v1.0.2 [5789e2e9] + FileIO v1.17.0 [1a297f60] + FillArrays v1.13.0 [6a86dc24] + FiniteDiff v2.27.0 [f6369f11] + ForwardDiff v0.10.38 [56d4f2e9] + Gridap v0.18.10 [f9701e48] + GridapDistributed v0.4.7 [bcdc36c2] + GridapPETSc v0.5.3 [92d709cd] + IrrationalConstants v0.2.4 [42fd0dbc] + IterativeSolvers v0.9.4 [033835bb] + JLD2 v0.5.12 [692b3bcd] + JLLWrappers v1.7.0 [682c06a0] + JSON v0.21.4 [9c8b4983] + LightXML v0.9.1 [d3d80556] + LineSearches v7.3.0 [2ab3a3ac] + LogExpFunctions v0.3.29 [da04e1cc] + MPI v0.20.22 [3da0fdf6] + MPIPreferences v0.1.11 [1914dd2f] + MacroTools v0.5.15 [d41bc354] + NLSolversBase v7.9.0 [2774e3e8] + NLsolve v4.5.1 [77ba4419] + NaNMath v1.1.2 [b8a86587] + NearestNeighbors v0.4.21 [6fe1bfb0] + OffsetArrays v1.16.0 [bac558e1] + OrderedCollections v1.8.0 [d96e819e] + Parameters v0.12.3 [69de0a69] + Parsers v2.8.1 ⌅ [5a9dfac6] + PartitionedArrays v0.3.4 [eebad327] + PkgVersion v0.3.3 [c74db56a] + PolynomialBases v0.4.22 [aea7be01] + PrecompileTools v1.2.1 [21216c6a] + Preferences v1.4.3 [1fd47b50] + QuadGK v2.11.2 [3cdcf5f2] + RecipesBase v1.3.4 [189a3867] + Reexport v1.2.2 [ae029012] + Requires v1.3.1 [efcf1570] + Setfield v1.1.2 [ce78b400] + SimpleUnPack v1.1.0 [a0a7dd2c] + SparseMatricesCSR v0.6.9 [276daf66] + SpecialFunctions v2.5.0 [90137ffa] + StaticArrays v1.9.13 [1e83bf80] + StaticArraysCore v1.4.3 [10745b16] + Statistics v1.11.1 [82ae8749] + StatsAPI v1.7.0 [3bb67fe8] + TranscodingStreams v0.11.3 [3a884ed6] + UnPack v1.0.2 [4004b06d] + VTKBase v1.0.1 [64499a7a] + WriteVTK v1.21.1 [f5851436] + FFTW_jll v3.3.10+3 [e33a78d0] + Hwloc_jll v2.12.0+0 [1d5cc7b8] + IntelOpenMP_jll v2025.0.4+0 [94ce4f54] + Libiconv_jll v1.18.0+0 [856f044c] + MKL_jll v2025.0.1+1 [7cb0a576] + MPICH_jll v4.3.0+1 [f1f71cc9] + MPItrampoline_jll v5.5.3+0 [9237b28f] + MicrosoftMPI_jll v10.1.4+3 [656ef2d0] + OpenBLAS32_jll v0.3.29+0 [fe0851c0] + OpenMPI_jll v5.0.7+1 [efe28fd5] + OpenSpecFun_jll v0.5.6+0 ⌅ [8fa3689e] + PETSc_jll v3.15.2+0 [02c8fc9c] + XML2_jll v2.13.6+1 [1317d2d5] + oneTBB_jll v2022.0.0+0 [0dad84c5] + ArgTools v1.1.2 [56f22d72] + Artifacts v1.11.0 [2a0f44e3] + Base64 v1.11.0 [ade2ca70] + Dates v1.11.0 [8ba89e20] + 
Distributed v1.11.0 [f43a241f] + Downloads v1.6.0 [7b1f6079] + FileWatching v1.11.0 [9fa8497b] + Future v1.11.0 [b77e0a4c] + InteractiveUtils v1.11.0 [dc6e5ff7] + JuliaSyntaxHighlighting v1.12.0 [4af54fe1] + LazyArtifacts v1.11.0 [b27032c2] + LibCURL v0.6.4 [76f85450] + LibGit2 v1.11.0 [8f399da3] + Libdl v1.11.0 [37e2e46d] + LinearAlgebra v1.11.0 [56ddb016] + Logging v1.11.0 [d6f4376e] + Markdown v1.11.0 [a63ad114] + Mmap v1.11.0 [ca575930] + NetworkOptions v1.2.0 [44cfe95a] + Pkg v1.12.0 [de0858da] + Printf v1.11.0 [9a3f8284] + Random v1.11.0 [ea8e919c] + SHA v0.7.0 [9e88b42a] + Serialization v1.11.0 [6462fe0b] + Sockets v1.11.0 [2f01184e] + SparseArrays v1.12.0 [f489334b] + StyledStrings v1.11.0 [4607b0f0] + SuiteSparse [fa267f1f] + TOML v1.0.3 [a4e569a6] + Tar v1.10.0 [8dfed614] + Test v1.11.0 [cf7118a7] + UUIDs v1.11.0 [4ec0a83e] + Unicode v1.11.0 [e66e0078] + CompilerSupportLibraries_jll v1.2.0+0 [deac9b47] + LibCURL_jll v8.6.0+0 [e37daf67] + LibGit2_jll v1.8.0+0 [29816b5a] + LibSSH2_jll v1.11.0+1 [c8ffd9c3] + MbedTLS_jll v2.28.6+1 [14a3606d] + MozillaCACerts_jll v2024.11.26 [4536629a] + OpenBLAS_jll v0.3.28+3 [05823500] + OpenLibm_jll v0.8.1+3 [bea87d4a] + SuiteSparse_jll v7.8.0+1 [83775a58] + Zlib_jll v1.3.1+1 [8e850b90] + libblastrampoline_jll v5.11.2+0 [8e850ede] + nghttp2_jll v1.63.0+1 [3f19e933] + p7zip_jll v17.5.0+1 Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m` Building GridapPETSc → `~/.julia/scratchspaces/44cfe95a-1eb2-52ea-b672-e2afdf69b78f/601899a547ac1f473be900e6b3157e62cb2db8fc/build.log` Installation completed after 29.12s ################################################################################ # Precompilation # Precompiling PkgEval dependencies... Precompiling package dependencies... 
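For reference, the installation and precompilation steps recorded above correspond to a standard Pkg workflow. A minimal sketch of reproducing this environment locally; the version pin matches what PkgEval resolved here, and resolution may differ on other dates or Julia builds:

```julia
# Sketch only: rebuild a throwaway environment like the /tmp/jl_* one used by PkgEval.
using Pkg

Pkg.activate(; temp = true)                       # temporary project, discarded afterwards
Pkg.add(name = "GridapPETSc", version = "0.5.3")  # pulls Gridap, GridapDistributed, PETSc_jll, MPI stack, ...
Pkg.precompile()                                  # the precompilation step
Pkg.test("GridapPETSc")                           # runs the test suite whose output follows
```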
Precompilation completed after 100.21s ################################################################################ # Testing # Testing GridapPETSc Status `/tmp/jl_YfDWBV/Project.toml` [56d4f2e9] Gridap v0.18.10 [f9701e48] GridapDistributed v0.4.7 [bcdc36c2] GridapPETSc v0.5.3 [da04e1cc] MPI v0.20.22 ⌅ [5a9dfac6] PartitionedArrays v0.3.4 [a0a7dd2c] SparseMatricesCSR v0.6.9 ⌅ [8fa3689e] PETSc_jll v3.15.2+0 [8f399da3] Libdl v1.11.0 [37e2e46d] LinearAlgebra v1.11.0 [9a3f8284] Random v1.11.0 [2f01184e] SparseArrays v1.12.0 [8dfed614] Test v1.11.0 Status `/tmp/jl_YfDWBV/Manifest.toml` [47edcb42] ADTypes v1.14.0 [621f4979] AbstractFFTs v1.5.0 [1520ce14] AbstractTrees v0.4.5 [79e6a3ab] Adapt v4.3.0 [dce04be8] ArgCheck v2.5.0 [4fba245c] ArrayInterface v7.18.0 [4c555306] ArrayLayouts v1.11.1 [15f4f7f2] AutoHashEquals v2.2.0 [fbb218c0] BSON v0.3.9 [8e7c35d0] BlockArrays v1.5.0 [7a955b69] CircularArrays v1.4.0 [944b1d66] CodecZlib v0.7.8 [861a8166] Combinatorics v1.0.2 [bbf7d656] CommonSubexpressions v0.3.1 [34da2185] Compat v4.16.0 [187b0558] ConstructionBase v1.5.8 [864edb3b] DataStructures v0.18.22 [163ba53b] DiffResults v1.1.0 [b552c78f] DiffRules v1.15.1 [a0c0ee7d] DifferentiationInterface v0.6.48 [b4f34e82] Distances v0.10.12 [ffbed154] DocStringExtensions v0.9.3 [7a1cc6ca] FFTW v1.8.1 [442a2c76] FastGaussQuadrature v1.0.2 [5789e2e9] FileIO v1.17.0 [1a297f60] FillArrays v1.13.0 [6a86dc24] FiniteDiff v2.27.0 [f6369f11] ForwardDiff v0.10.38 [56d4f2e9] Gridap v0.18.10 [f9701e48] GridapDistributed v0.4.7 [bcdc36c2] GridapPETSc v0.5.3 [92d709cd] IrrationalConstants v0.2.4 [42fd0dbc] IterativeSolvers v0.9.4 [033835bb] JLD2 v0.5.12 [692b3bcd] JLLWrappers v1.7.0 [682c06a0] JSON v0.21.4 [9c8b4983] LightXML v0.9.1 [d3d80556] LineSearches v7.3.0 [2ab3a3ac] LogExpFunctions v0.3.29 [da04e1cc] MPI v0.20.22 [3da0fdf6] MPIPreferences v0.1.11 [1914dd2f] MacroTools v0.5.15 [d41bc354] NLSolversBase v7.9.0 [2774e3e8] NLsolve v4.5.1 [77ba4419] NaNMath v1.1.2 [b8a86587] NearestNeighbors v0.4.21 [6fe1bfb0] OffsetArrays v1.16.0 [bac558e1] OrderedCollections v1.8.0 [d96e819e] Parameters v0.12.3 [69de0a69] Parsers v2.8.1 ⌅ [5a9dfac6] PartitionedArrays v0.3.4 [eebad327] PkgVersion v0.3.3 [c74db56a] PolynomialBases v0.4.22 [aea7be01] PrecompileTools v1.2.1 [21216c6a] Preferences v1.4.3 [1fd47b50] QuadGK v2.11.2 [3cdcf5f2] RecipesBase v1.3.4 [189a3867] Reexport v1.2.2 [ae029012] Requires v1.3.1 [efcf1570] Setfield v1.1.2 [ce78b400] SimpleUnPack v1.1.0 [a0a7dd2c] SparseMatricesCSR v0.6.9 [276daf66] SpecialFunctions v2.5.0 [90137ffa] StaticArrays v1.9.13 [1e83bf80] StaticArraysCore v1.4.3 [10745b16] Statistics v1.11.1 [82ae8749] StatsAPI v1.7.0 [3bb67fe8] TranscodingStreams v0.11.3 [3a884ed6] UnPack v1.0.2 [4004b06d] VTKBase v1.0.1 [64499a7a] WriteVTK v1.21.1 [f5851436] FFTW_jll v3.3.10+3 [e33a78d0] Hwloc_jll v2.12.0+0 [1d5cc7b8] IntelOpenMP_jll v2025.0.4+0 [94ce4f54] Libiconv_jll v1.18.0+0 [856f044c] MKL_jll v2025.0.1+1 [7cb0a576] MPICH_jll v4.3.0+1 [f1f71cc9] MPItrampoline_jll v5.5.3+0 [9237b28f] MicrosoftMPI_jll v10.1.4+3 [656ef2d0] OpenBLAS32_jll v0.3.29+0 [fe0851c0] OpenMPI_jll v5.0.7+1 [efe28fd5] OpenSpecFun_jll v0.5.6+0 ⌅ [8fa3689e] PETSc_jll v3.15.2+0 [02c8fc9c] XML2_jll v2.13.6+1 [1317d2d5] oneTBB_jll v2022.0.0+0 [0dad84c5] ArgTools v1.1.2 [56f22d72] Artifacts v1.11.0 [2a0f44e3] Base64 v1.11.0 [ade2ca70] Dates v1.11.0 [8ba89e20] Distributed v1.11.0 [f43a241f] Downloads v1.6.0 [7b1f6079] FileWatching v1.11.0 [9fa8497b] Future v1.11.0 [b77e0a4c] InteractiveUtils v1.11.0 [dc6e5ff7] JuliaSyntaxHighlighting v1.12.0 
[4af54fe1] LazyArtifacts v1.11.0 [b27032c2] LibCURL v0.6.4 [76f85450] LibGit2 v1.11.0 [8f399da3] Libdl v1.11.0 [37e2e46d] LinearAlgebra v1.11.0 [56ddb016] Logging v1.11.0 [d6f4376e] Markdown v1.11.0 [a63ad114] Mmap v1.11.0 [ca575930] NetworkOptions v1.2.0 [44cfe95a] Pkg v1.12.0 [de0858da] Printf v1.11.0 [9a3f8284] Random v1.11.0 [ea8e919c] SHA v0.7.0 [9e88b42a] Serialization v1.11.0 [6462fe0b] Sockets v1.11.0 [2f01184e] SparseArrays v1.12.0 [f489334b] StyledStrings v1.11.0 [4607b0f0] SuiteSparse [fa267f1f] TOML v1.0.3 [a4e569a6] Tar v1.10.0 [8dfed614] Test v1.11.0 [cf7118a7] UUIDs v1.11.0 [4ec0a83e] Unicode v1.11.0 [e66e0078] CompilerSupportLibraries_jll v1.2.0+0 [deac9b47] LibCURL_jll v8.6.0+0 [e37daf67] LibGit2_jll v1.8.0+0 [29816b5a] LibSSH2_jll v1.11.0+1 [c8ffd9c3] MbedTLS_jll v2.28.6+1 [14a3606d] MozillaCACerts_jll v2024.11.26 [4536629a] OpenBLAS_jll v0.3.28+3 [05823500] OpenLibm_jll v0.8.1+3 [bea87d4a] SuiteSparse_jll v7.8.0+1 [83775a58] Zlib_jll v1.3.1+1 [8e850b90] libblastrampoline_jll v5.11.2+0 [8e850ede] nghttp2_jll v1.63.0+1 [3f19e933] p7zip_jll v17.5.0+1 Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. Testing Running tests... [0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0 [0] PetscInitialize(): PETSc successfully started: number of processors = 1 [0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-3PVDD95M.(none) [0] PetscInitialize(): Running on machine: GridapPETSc-primary-3PVDD95M [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 1073741823 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-3PVDD95M.(none) Vec Object: 1 MPI processes type: seq 1. 2. 4. 1. [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374783 max tags = 1073741823 [0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-3PVDD95M.(none) Vec Object: 1 MPI processes type: seq 1. 2. 4. 1. Vec Object: 1 MPI processes type: seq 20. 40. 4. 60. [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 10 unneeded,10 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 3 nodes of 4. Limit used: 5. Using Inode routines Mat Object: 1 MPI processes type: seqaij row 0: (1, 2.) (3, 3.) (4, 1.) row 1: (1, 6.) (3, 11.) (4, 5.) row 2: (1, 4.) (3, 3.) row 3: (3, 4.) (4, 3.) [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 4; storage space: 0 unneeded,10 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines Mat Object: 1 MPI processes type: seqaij row 0: (0, 4.) (1, -2.) row 1: (0, -1.) (1, 6.) (2, -2.) row 2: (1, -1.) (2, 6.) (3, -2.) row 3: (2, -1.) (3, 4.) 
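The `[0] ...` diagnostics above are what PETSc prints when its `-info` option is enabled, and the solver views further below show a GMRES/Jacobi KSP under the `(p_)` options prefix. A minimal sketch of how such command-line style options reach PETSc through GridapPETSc's `with` wrapper (which appears in the stack traces at the end of this log); the exact option strings are assumptions chosen to match this output, not necessarily the flags used by the test suite:

```julia
using GridapPETSc

# Assumed options for illustration: verbose -info logging plus a GMRES/Jacobi
# solver under the "p_" prefix, matching the KSP/PC views printed in this log.
options = "-info -p_ksp_type gmres -p_pc_type jacobi -p_ksp_converged_reason"

GridapPETSc.with(args = split(options)) do
  # PETSc vectors, matrices and solves are created here; wrapper objects should
  # be released before `with` returns so PetscFinalize can run cleanly.
end
```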
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] PCSetUp(): Setting up PC for first time [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.260351938337e-16 is less than relative tolerance 1.000000000000e-05 times initial right hand side norm 1.145643923739e+00 at iteration 4 [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] KSPConvergedDefault(): Linear solver has converged. Residual norm 1.260351938337e-16 is less than relative tolerance 1.000000000000e-05 times initial right hand side norm 1.145643923739e+00 at iteration 4 [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged [0] KSPConvergedDefault(): Linear solver has converged. 
Residual norm 1.315348017250e-16 is less than relative tolerance 1.000000000000e-05 times initial right hand side norm 1.145643923739e+00 at iteration 4 KSP Object: (p_) 1 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (p_) 1 MPI processes type: jacobi type DIAGONAL linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=10, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines [0] PetscFinalize(): PetscFinalize() called [0] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm -2080374784 [0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374784 [0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 1140850689 is being unlinked from inner PETSc comm -2080374784 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm -2080374784 [0] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm -2080374783 [0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374783 [0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 1140850688 is being unlinked from inner PETSc comm -2080374783 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 [0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm -2080374783 22.862979 seconds (8.50 M allocations: 484.053 MiB, 2.60% gc time, 60.58% compilation time: 62% of which was recompilation) [0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0 [0] PetscInitialize(): PETSc successfully started: number of processors = 1 [0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-3PVDD95M.(none) [0] PetscInitialize(): Running on machine: GridapPETSc-primary-3PVDD95M [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 1073741823 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 20 unneeded,0 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 4) > 0.6. Use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 1 nodes of 4. Limit used: 5. Using Inode routines [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 14 unneeded,1 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 1 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. 
Using Inode routines [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 5; storage space: 14 unneeded,2 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 1 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 2)/(num_localrows 4) < 0.6. Do not use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines 4×5 GridapPETSc.PETScMatrix: 0.0 0.0 5.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 7.0 0.0 0.0 0.0 0.0 0.0 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 3 X 2; storage space: 6 unneeded,0 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 0 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 3) > 0.6. Use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 1 nodes of 3. Limit used: 5. Using Inode routines [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 4; storage space: 0 unneeded,10 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 0)/(num_localrows 4) < 0.6. Do not use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 4 nodes out of 4 rows. Not using Inode routines [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1 [0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] Petsc_OuterComm_Attr_Delete_Fn(): Removing reference to PETSc communicator embedded in a user MPI_Comm -2080374783 [0] Petsc_InnerComm_Attr_Delete_Fn(): User MPI_Comm 1140850689 is being unlinked from inner PETSc comm -2080374783 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 [0] Petsc_Counter_Attr_Delete_Fn(): Deleting counter data in an MPI_Comm -2080374783 [0] PetscFinalize(): PetscFinalize() called 8.139239 seconds (2.56 M allocations: 131.380 MiB, 6.35% gc time, 93.45% compilation time: 3% of which was recompilation) [1, 2, 3, 5, 6] [4, 5, 7, 3, 6] [6, 7, 4] Vec Object: 1 MPI processes type: seq 10. 20. 30. 40. 50. 60. 70. 3-element PartitionedArrays.DebugArray{SparseArrays.SparseMatrixCSC{Float64, Int64}, 1}: [1] = sparse([1, 2, 3, 1, 3], [1, 2, 3, 4, 5], [9.0, 9.0, 9.0, 1.0, 1.0], 5, 5) [2] = sparse([1, 2, 2, 2, 1], [1, 2, 3, 4, 5], [9.0, 9.0, 1.0, 9.0, 1.0], 5, 5) [3] = sparse([1, 2, 1], [1, 2, 3], [9.0, 9.0, 1.0], 3, 3) Mat Object: 1 MPI processes type: seqaij row 0: (0, 9.) (4, 1.) row 1: (1, 9.) row 2: (2, 9.) (5, 1.) row 3: (3, 9.) (5, 1.) row 4: (2, 9.) (4, 9.) (6, 1.) 
row 5: (3, 1.) (5, 9.) row 6: (6, 9.) Linear solve converged due to CONVERGED_RTOL iterations 5 Linear solve converged due to CONVERGED_RTOL iterations 5 Vec Object: 1 MPI processes type: seq 140. 180. 330. 420. 790. 580. 630. Vec Object: 1 MPI processes type: seq 10. 20. 30. 40. 50. 60. 70. Linear solve converged due to CONVERGED_RTOL iterations 5 Linear solve converged due to CONVERGED_RTOL iterations 5 Vec Object: 1 MPI processes type: seq 140. 180. 330. 420. 790. 580. 630. Vec Object: 1 MPI processes type: seq 10. 20. 30. 40. 50. 60. 70. 68.045259 seconds (37.57 M allocations: 1.844 GiB, 1.63% gc time, 83.63% compilation time: 16% of which was recompilation) 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 KSP Object: 1 MPI processes type: gmres restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using DEFAULT norm type for convergence test PC Object: 1 MPI processes type: jacobi PC has not been set up so information may be incomplete type DIAGONAL linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=4, cols=4 total: nonzeros=10, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 0 KSP Residual norm 2.000000000000e+00 1 KSP Residual norm 4.002966042487e-16 9.060605 seconds (3.69 M allocations: 189.567 MiB, 12.22% gc time, 87.09% compilation time: 5% of which was recompilation) 0 SNES Function norm 3.605551275464e+00 1 SNES Function norm 4.444444444444e-01 2 SNES Function norm 7.111111111111e-02 3 SNES Function norm 3.936947327951e-03 4 SNES Function norm 1.525925473445e-05 5 SNES Function norm 2.328306437081e-10 6 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 6 0 SNES Function norm 0.000000000000e+00 1 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1 0 SNES Function norm 0.000000000000e+00 1 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1 0 SNES Function norm 1.486606874732e-01 1 SNES Function norm 8.402777777778e-03 2 SNES Function norm 6.831067663990e-05 3 SNES Function norm 4.665073682466e-09 4 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 4 0 SNES Function norm 0.000000000000e+00 1 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1 0 SNES 
Function norm 0.000000000000e+00 1 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 1 0 SNES Function norm 3.605551275464e+00 1 SNES Function norm 4.444444444444e-01 2 SNES Function norm 7.111111111111e-02 3 SNES Function norm 3.936947327951e-03 4 SNES Function norm 1.525925473445e-05 5 SNES Function norm 2.328306437081e-10 6 SNES Function norm 0.000000000000e+00 Nonlinear solve converged due to CONVERGED_FNORM_RELATIVE iterations 6 2.627146 seconds (1.15 M allocations: 58.843 MiB, 21.37% gc time, 78.11% compilation time: 7% of which was recompilation) [0] PetscDetermineInitialFPTrap(): Floating point trapping is off by default 0 [0] PetscInitialize(): PETSc successfully started: number of processors = 1 [0] PetscGetHostName(): Rejecting domainname, likely is NIS GridapPETSc-primary-3PVDD95M.(none) [0] PetscInitialize(): Running on machine: GridapPETSc-primary-3PVDD95M [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374784 max tags = 1073741823 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 3; storage space: 2 unneeded,1 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. Using Inode routines 4×3 GridapPETSc.PETScMatrix: -4.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 3; storage space: 11 unneeded,1 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. Using Inode routines 4×3 GridapPETSc.PETScMatrix: -4.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 4 X 3; storage space: 0 unneeded,1 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] MatCheckCompressedRow(): Found the ratio (num_zerorows 3)/(num_localrows 4) > 0.6. Use CompressedRow routines. [0] MatSeqAIJCheckInode(): Found 2 nodes of 4. Limit used: 5. 
Using Inode routines 4×3 GridapPETSc.PETScMatrix: -2.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 [0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1 [0] MatConvert(): Calling duplicate for initial matrix seqaij 0 1 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374784 4-element GridapPETSc.PETScVector: 6.0 0.0 0.0 1.0 ┌ Warning: 1 objects still not finalized before calling GridapPETSc.Finalize() └ @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:28 [0] PetscFinalize(): PetscFinalize() called 8.844418 seconds (2.32 M allocations: 116.653 MiB, 5.71% gc time, 93.87% compilation time: 23% of which was recompilation) 0 KSP Residual norm 8.550000000000e-03 1 KSP Residual norm 2.039013341310e-04 2 KSP Residual norm 6.195852905698e-06 3 KSP Residual norm 1.300754719285e-07 4 KSP Residual norm 3.121504432740e-09 5 KSP Residual norm 7.977302448122e-11 6 KSP Residual norm 1.598148195898e-12 7 KSP Residual norm 3.741486163257e-14 Linear solve converged due to CONVERGED_RTOL iterations 7 KSP Object: 1 MPI processes type: cg maximum iterations=1000, initial guess is zero tolerances: relative=1e-10, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: gamg type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Complexity: grid = 1.15878 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1 MPI processes type: bjacobi number of blocks = 1 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 block size is 1 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998343, max = 1.09818 eigenvalues estimate via cg min 0.516584, max 0.998343 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=76, cols=76 total: nonzeros=2432, allocated nonzeros=2432 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998116, max = 1.09793 eigenvalues estimate via cg min 0.336175, max 0.998116 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines 0 KSP Residual norm 8.550000000000e-03 1 KSP Residual norm 2.039013341310e-04 2 KSP Residual norm 6.195852905698e-06 3 KSP Residual norm 1.300754719285e-07 4 KSP Residual norm 3.121504432740e-09 5 KSP Residual norm 7.977302448122e-11 6 KSP Residual norm 1.598148195898e-12 7 KSP Residual norm 3.741486163257e-14 Linear solve converged due to CONVERGED_RTOL iterations 7 KSP Object: 1 MPI processes type: cg maximum iterations=1000, initial guess is zero tolerances: relative=1e-10, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: gamg type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Complexity: grid = 1.15878 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1 MPI processes type: bjacobi number of blocks = 1 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 block size is 1 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998343, max = 1.09818 eigenvalues estimate via cg min 0.516584, max 0.998343 eigenvalues estimated using cg with translations [0. 0.1; 0. 
1.1] KSP Object: (mg_levels_1_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=76, cols=76 total: nonzeros=2432, allocated nonzeros=2432 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998116, max = 1.09793 eigenvalues estimate via cg min 0.336175, max 0.998116 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=0 total number of mallocs used during MatSetValues calls=0 not using I-node routines 0 KSP Residual norm 8.550000000000e-03 1 KSP Residual norm 2.039013341310e-04 2 KSP Residual norm 6.195852905698e-06 3 KSP Residual norm 1.300754719285e-07 4 KSP Residual norm 3.121504432740e-09 5 KSP Residual norm 7.977302448122e-11 6 KSP Residual norm 1.598148195898e-12 7 KSP Residual norm 3.741486163257e-14 Linear solve converged due to CONVERGED_RTOL iterations 7 KSP Object: 1 MPI processes type: cg maximum iterations=1000, initial guess is zero tolerances: relative=1e-10, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: gamg type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Complexity: grid = 1.15878 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1 MPI processes type: bjacobi number of blocks = 1 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 block size is 1 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998343, max = 1.09818 eigenvalues estimate via cg min 0.516584, max 0.998343 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_1_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=76, cols=76 total: nonzeros=2432, allocated nonzeros=2432 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998116, max = 1.09793 eigenvalues estimate via cg min 0.336175, max 0.998116 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. 
linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=39304 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=39304 total number of mallocs used during MatSetValues calls=0 not using I-node routines 0 KSP Residual norm 8.550000000000e-03 1 KSP Residual norm 2.039013341310e-04 2 KSP Residual norm 6.195852905698e-06 3 KSP Residual norm 1.300754719285e-07 4 KSP Residual norm 3.121504432740e-09 5 KSP Residual norm 7.977302448122e-11 6 KSP Residual norm 1.598148195898e-12 7 KSP Residual norm 3.741486163257e-14 Linear solve converged due to CONVERGED_RTOL iterations 7 KSP Object: 1 MPI processes type: cg maximum iterations=1000, initial guess is zero tolerances: relative=1e-10, absolute=1e-50, divergence=10000. left preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: gamg type is MULTIPLICATIVE, levels=3 cycles=v Cycles per PCApply=1 Using externally compute Galerkin coarse grid matrices GAMG specific options Threshold for dropping small values in graph on each level = 0. 0. 0. Threshold scaling factor for each level not specified = 1. AGG specific options Symmetric graph false Number of levels to square graph 0 Number smoothing steps 1 Complexity: grid = 1.15878 Coarse grid solver -- level ------------------------------- KSP Object: (mg_coarse_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_) 1 MPI processes type: bjacobi number of blocks = 1 Local solver information for first block is in the following KSP and PC objects on rank 0: Use -mg_coarse_ksp_view ::ascii_info_detail to display information for all blocks KSP Object: (mg_coarse_sub_) 1 MPI processes type: preonly maximum iterations=1, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_coarse_sub_) 1 MPI processes type: cholesky out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 5., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=7, cols=7 package used to perform factorization: petsc total: nonzeros=28, allocated nonzeros=28 block size is 1 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (mg_coarse_sub_) 1 MPI processes type: seqaij rows=7, cols=7 total: nonzeros=49, allocated nonzeros=49 total number of mallocs used during MatSetValues calls=0 using I-node routines: found 2 nodes, limit used is 5 Down solver (pre-smoother) on level 1 ------------------------------- KSP Object: (mg_levels_1_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998343, max = 1.09818 eigenvalues estimate via cg min 0.516584, max 0.998343 eigenvalues estimated using cg with translations [0. 0.1; 0. 
1.1] KSP Object: (mg_levels_1_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_1_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=76, cols=76 total: nonzeros=2432, allocated nonzeros=2432 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) Down solver (pre-smoother) on level 2 ------------------------------- KSP Object: (mg_levels_2_) 1 MPI processes type: chebyshev eigenvalue estimates used: min = 0.0998116, max = 1.09793 eigenvalues estimate via cg min 0.336175, max 0.998116 eigenvalues estimated using cg with translations [0. 0.1; 0. 1.1] KSP Object: (mg_levels_2_esteig_) 1 MPI processes type: cg maximum iterations=10, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test estimating eigenvalues using noisy right hand side maximum iterations=2, nonzero initial guess tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (mg_levels_2_) 1 MPI processes type: sor type = local_symmetric, iterations = 1, local iterations = 1, omega = 1. linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=39304 total number of mallocs used during MatSetValues calls=0 not using I-node routines Up solver (post-smoother) same as down solver (pre-smoother) linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=729, cols=729 total: nonzeros=15625, allocated nonzeros=39304 total number of mallocs used during MatSetValues calls=0 not using I-node routines ┌ Warning: -1 objects still not finalized before calling GridapPETSc.Finalize() └ @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:28 [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: General MPI error [0]PETSC ERROR: MPI error 336161797 Invalid communicator, error stack: internal_Comm_rank(87): MPI_Comm_rank(comm=0x84000000, rank=0x7ffc898168c8) failed internal_Comm_rank(53): Invalid communicator [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.15.2, Jul 10, 2021
[0]PETSC ERROR: GridapPETSc on a named GridapPETSc-primary-3PVDD95M by Unknown Mon Mar 24 21:43:05 2025
[0]PETSC ERROR: Configure options --prefix=/workspace/destdir CC=cc FC=gfortran CXX=c++ COPTFLAGS=-O3 CXXOPTFLAGS=-O3 CFLAGS=-fno-stack-protector FOPTFLAGS=-O3 --with-64-bit-indices=0 --with-debugging=0 --with-batch --PETSC_ARCH=x86_64-linux-gnu_double_real_Int32 --with-blaslapack-lib=/workspace/destdir/lib/libopenblas.so --with-blaslapack-suffix= --known-64-bit-blas-indices=0 --with-mpi-lib="[/workspace/destdir/lib/libmpifort.so,/workspace/destdir/lib/libmpi.so]" --known-mpi-int64_t=0 --with-mpi-include=/workspace/destdir/include --with-sowing=0 --with-precision=double --with-scalar-type=real
[0]PETSC ERROR: #1 PetscViewerFlush_ASCII() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/impls/ascii/filev.c:117
[0]PETSC ERROR: #2 PetscViewerFlush() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/flush.c:26
[0]PETSC ERROR: #3 PetscViewerDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/view.c:114
[0]PETSC ERROR: #4 PetscObjectDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:59
[0]PETSC ERROR: #5 PetscObjectRegisterDestroyAll() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:334
[0]PETSC ERROR: #6 PetscFinalize() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/pinit.c:1350
PoissonDriver: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18
  Got exception outside of a @test
  LoadError: Petsc returned with error code: 98
  Stacktrace:
    [1] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
    [2] Finalize() @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:31
    [3] with(f::Main.GridapPETScTests.GridapPETScSequentialTests.PoissonDriver.var"#2#3"; kwargs::@Kwargs{args::Vector{String}}) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:39
    [4] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonDriver.jl:31
    [5] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
    [6] IncludeInto @ ./Base.jl:302 [inlined]
    [7] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18 [inlined]
    [8] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
    [9] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18 [inlined]
   [10] macro expansion @ ./timing.jl:611 [inlined]
   [11] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:18
   [12] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [13] IncludeInto @ ./Base.jl:302 [inlined]
   [14] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [15] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [17] macro expansion @ ./timing.jl:611 [inlined]
   [18] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
   [19] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [20] top-level scope @ none:6
   [21] eval(m::Module, e::Any) @ Core ./boot.jl:485
   [22] exec_options(opts::Base.JLOptions) @ Base ./client.jl:295
   [23] _start() @ Base ./client.jl:558
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonDriver.jl:1
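Reading the trace above, the failure appears to occur inside GridapPETSc.Finalize(): PetscFinalize tries to flush and destroy a registered ASCII viewer whose underlying MPI communicator has already been released (the earlier warning about objects "still not finalized" points the same way), MPI reports "Invalid communicator", and PETSc returns error code 98, which GridapPETSc's error check at Config.jl:88 turns into the exception. A sketch of the life cycle implied by the stack trace (assumptions: `with` is roughly Init, then the driver body, then Finalize; the option string and `GC.gc()` call are illustrative, not taken from the test code):

```julia
using GridapPETSc

GridapPETSc.Init(args = split("-ksp_monitor"))   # PetscInitialize plus option parsing
try
  # driver body: assemble, solve, and release PETSc-backed objects
finally
  GC.gc()                   # illustrative: let Julia finalizers destroy remaining PETSc objects
  GridapPETSc.Finalize()    # PetscFinalize; throws if PETSc returns a nonzero error code
end
```

Under this reading, the two driver failures below are finalization-order problems rather than wrong numerical results: every linear and nonlinear solve earlier in the log reports convergence before the error is raised.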
134.554133 seconds (111.26 M allocations: 5.399 GiB, 1.85% gc time, 98.64% compilation time: 21% of which was recompilation)
[0]PETSC ERROR: #7 PetscViewerFlush_ASCII() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/impls/ascii/filev.c:117
[0]PETSC ERROR: #8 PetscViewerFlush() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/flush.c:26
[0]PETSC ERROR: #9 PetscViewerDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/view.c:114
[0]PETSC ERROR: #10 PetscObjectDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:59
[0]PETSC ERROR: #11 PetscObjectRegisterDestroyAll() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:334
[0]PETSC ERROR: #12 PetscFinalize() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/pinit.c:1350
ElasticityDriver: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:20
  Got exception outside of a @test
  LoadError: Petsc returned with error code: 98
  Stacktrace:
    [1] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
    [2] Finalize() @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:31
    [3] Init(; args::Vector{SubString{String}}, file::String, help::String, finalize_atexit::Bool) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:9
    [4] Init @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:2 [inlined]
    [5] with(f::Main.GridapPETScTests.GridapPETScSequentialTests.ElasticityDriver.var"#9#10"; kwargs::@Kwargs{args::Vector{SubString{String}}}) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:37
    [6] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/ElasticityDriver.jl:50
    [7] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
    [8] IncludeInto @ ./Base.jl:302 [inlined]
    [9] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:20 [inlined]
   [10] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [11] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:20 [inlined]
   [12] macro expansion @ ./timing.jl:611 [inlined]
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:20
   [14] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [15] IncludeInto @ ./Base.jl:302 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [19] macro expansion @ ./timing.jl:611 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
   [21] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [22] top-level scope @ none:6
   [23] eval(m::Module, e::Any) @ Core ./boot.jl:485
   [24] exec_options(opts::Base.JLOptions) @ Base ./client.jl:295
   [25] _start() @ Base ./client.jl:558
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/ElasticityDriver.jl:1
20.692464 seconds (10.97 M allocations: 551.245 MiB, 4.33% gc time, 92.09% compilation time: 5% of which was recompilation)
[0]PETSC ERROR: #13 PetscViewerFlush_ASCII() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/impls/ascii/filev.c:117
[0]PETSC ERROR: #14 PetscViewerFlush() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/flush.c:26
[0]PETSC ERROR: #15 PetscViewerDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/view.c:114
[0]PETSC ERROR: #16 PetscObjectDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:59
[0]PETSC ERROR: #17 PetscObjectRegisterDestroyAll() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:334
[0]PETSC ERROR: #18 PetscFinalize() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/pinit.c:1350
DarcyDriver: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:22
  Got exception outside of a @test
  LoadError: Petsc returned with error code: 98
  Stacktrace:
    [1] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
    [2] Finalize() @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:31
    [3] Init(; args::Vector{SubString{String}}, file::String, help::String, finalize_atexit::Bool) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:9
    [4] Init @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:2 [inlined]
    [5] with(f::Main.GridapPETScTests.GridapPETScSequentialTests.DarcyDriver.var"#2#3"; kwargs::@Kwargs{args::Vector{SubString{String}}}) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:37
    [6] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/DarcyDriver.jl:10
    [7] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
    [8] IncludeInto @ ./Base.jl:302 [inlined]
    [9] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:22 [inlined]
   [10] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [11] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:22 [inlined]
   [12] macro expansion @ ./timing.jl:611 [inlined]
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:22
   [14] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [15] IncludeInto @ ./Base.jl:302 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [19] macro expansion @ ./timing.jl:611 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
   [21] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [22] top-level scope @ none:6
   [23] eval(m::Module, e::Any) @ Core ./boot.jl:485
   [24] exec_options(opts::Base.JLOptions) @ Base ./client.jl:295
   [25] _start() @ Base ./client.jl:558
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/DarcyDriver.jl:1
7.853024 seconds (9.78 M allocations: 473.107 MiB, 9.98% gc time, 80.56% compilation time: 7% of which was recompilation)
[0]PETSC ERROR: #19 PetscViewerFlush_ASCII() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/impls/ascii/filev.c:117
[0]PETSC ERROR: #20 PetscViewerFlush() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/flush.c:26
[0]PETSC ERROR: #21 PetscViewerDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/view.c:114
[0]PETSC ERROR: #22 PetscObjectDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:59
[0]PETSC ERROR: #23 PetscObjectRegisterDestroyAll() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:334
[0]PETSC ERROR: #24 PetscFinalize() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/pinit.c:1350
PLaplacianDriver: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:24
  Got exception outside of a @test
  LoadError: Petsc returned with error code: 98
  Stacktrace:
    [1] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
    [2] Finalize() @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:31
    [3] Init(; args::Vector{SubString{String}}, file::String, help::String, finalize_atexit::Bool) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:9
    [4] Init @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:2 [inlined]
    [5] with(f::Main.GridapPETScTests.GridapPETScSequentialTests.PLaplacianDriver.var"#3#4"; kwargs::@Kwargs{args::Vector{SubString{String}}}) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:37
    [6] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/PLaplacianDriver.jl:15
    [7] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
    [8] IncludeInto @ ./Base.jl:302 [inlined]
    [9] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:24 [inlined]
   [10] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [11] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:24 [inlined]
   [12] macro expansion @ ./timing.jl:611 [inlined]
   [13] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:24
   [14] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [15] IncludeInto @ ./Base.jl:302 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [17] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [18] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [19] macro expansion @ ./timing.jl:611 [inlined]
   [20] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
   [21] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [22] top-level scope @ none:6
   [23] eval(m::Module, e::Any) @ Core ./boot.jl:485
   [24] exec_options(opts::Base.JLOptions) @ Base ./client.jl:295
   [25] _start() @ Base ./client.jl:558
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PLaplacianDriver.jl:1
2.070920 seconds (658.48 k allocations: 32.319 MiB, 37.52% gc time, 27.74% compilation time)
[0]PETSC ERROR: #25 PetscViewerFlush_ASCII() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/impls/ascii/filev.c:117
[0]PETSC ERROR: #26 PetscViewerFlush() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/flush.c:26
[0]PETSC ERROR: #27 PetscViewerDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/view.c:114
[0]PETSC ERROR: #28 PetscObjectDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:59
[0]PETSC ERROR: #29 PetscObjectRegisterDestroyAll() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:334
[0]PETSC ERROR: #30 PetscFinalize() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/pinit.c:1350
PoissonTests: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:28
  Got exception outside of a @test
  LoadError: Petsc returned with error code: 98
  Stacktrace:
    [1] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
    [2] Finalize() @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:31
    [3] Init(; args::Vector{SubString{String}}, file::String, help::String, finalize_atexit::Bool) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:9
    [4] Init @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:2 [inlined]
    [5] #with#2 @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:37 [inlined]
    [6] with @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:36 [inlined]
    [7] main(distribute::Type{PartitionedArrays.DebugArray}, nparts::Tuple{Int64, Int64}, solver::Symbol) @ Main.GridapPETScTests.GridapPETScSequentialTests.PoissonTests ~/.julia/packages/GridapPETSc/X0jXh/test/PoissonTests.jl:47
    [8] main(distribute::Type, nparts::Tuple{Int64, Int64}) @ Main.GridapPETScTests.GridapPETScSequentialTests.PoissonTests ~/.julia/packages/GridapPETSc/X0jXh/test/PoissonTests.jl:31
    [9] (::Main.GridapPETScTests.GridapPETScSequentialTests.PoissonTests.var"#5#6")(distribute::Type) @ Main.GridapPETScTests.GridapPETScSequentialTests.PoissonTests ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonTests.jl:5
   [10] with_debug(f::Main.GridapPETScTests.GridapPETScSequentialTests.PoissonTests.var"#5#6") @ PartitionedArrays ~/.julia/packages/PartitionedArrays/py6uo/src/debug_array.jl:8
   [11] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonTests.jl:4
   [12] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [13] IncludeInto @ ./Base.jl:302 [inlined]
   [14] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:28 [inlined]
   [15] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [16] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:28 [inlined]
   [17] macro expansion @ ./timing.jl:611 [inlined]
   [18] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:28
   [19] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [20] IncludeInto @ ./Base.jl:302 [inlined]
   [21] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [22] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [23] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [24] macro expansion @ ./timing.jl:611 [inlined]
   [25] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
   [26] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [27] top-level scope @ none:6
   [28] eval(m::Module, e::Any) @ Core ./boot.jl:485
   [29] exec_options(opts::Base.JLOptions) @ Base ./client.jl:295
   [30] _start() @ Base ./client.jl:558
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PoissonTests.jl:1
17.283684 seconds (27.37 M allocations: 1.161 GiB, 6.46% gc time, 89.61% compilation time: <1% of which was recompilation)
[0]PETSC ERROR: #31 PetscViewerFlush_ASCII() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/impls/ascii/filev.c:117
[0]PETSC ERROR: #32 PetscViewerFlush() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/flush.c:26
[0]PETSC ERROR: #33 PetscViewerDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/classes/viewer/interface/view.c:114
[0]PETSC ERROR: #34 PetscObjectDestroy() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:59
[0]PETSC ERROR: #35 PetscObjectRegisterDestroyAll() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/destroy.c:334
[0]PETSC ERROR: #36 PetscFinalize() at /workspace/srcdir/petsc-3.15.2/src/sys/objects/pinit.c:1350
PLaplacianTests: Error During Test at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:30
  Got exception outside of a @test
  LoadError: Petsc returned with error code: 98
  Stacktrace:
    [1] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/src/Config.jl:88 [inlined]
    [2] Finalize() @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:31
    [3] Init(; args::Vector{SubString{String}}, file::String, help::String, finalize_atexit::Bool) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:9
    [4] Init @ ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:2 [inlined]
    [5] with(f::Main.GridapPETScTests.GridapPETScSequentialTests.PLaplacianTests.var"#main##0#main##1"{Type{PartitionedArrays.DebugArray}, Tuple{Int64, Int64}, Symbol}; kwargs::@Kwargs{args::Vector{SubString{String}}}) @ GridapPETSc ~/.julia/packages/GridapPETSc/X0jXh/src/Environment.jl:37
    [6] main @ ~/.julia/packages/GridapPETSc/X0jXh/test/PLaplacianTests.jl:62 [inlined]
    [7] main(distribute::Type, nparts::Tuple{Int64, Int64}) @ Main.GridapPETScTests.GridapPETScSequentialTests.PLaplacianTests ~/.julia/packages/GridapPETSc/X0jXh/test/PLaplacianTests.jl:48
    [8] (::Main.GridapPETScTests.GridapPETScSequentialTests.PLaplacianTests.var"#4#5")(distribute::Type) @ Main.GridapPETScTests.GridapPETScSequentialTests.PLaplacianTests ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/PLaplacianTests.jl:5
    [9] with_debug(f::Main.GridapPETScTests.GridapPETScSequentialTests.PLaplacianTests.var"#4#5") @ PartitionedArrays ~/.julia/packages/PartitionedArrays/py6uo/src/debug_array.jl:8
   [10] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/PLaplacianTests.jl:4
   [11] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [12] IncludeInto @ ./Base.jl:302 [inlined]
   [13] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:30 [inlined]
   [14] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [15] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:30 [inlined]
   [16] macro expansion @ ./timing.jl:611 [inlined]
   [17] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/sequential/runtests.jl:30
   [18] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [19] IncludeInto @ ./Base.jl:302 [inlined]
   [20] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [21] macro expansion @ /opt/julia/share/julia/stdlib/v1.12/Test/src/Test.jl:1724 [inlined]
   [22] macro expansion @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5 [inlined]
   [23] macro expansion @ ./timing.jl:611 [inlined]
   [24] top-level scope @ ~/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:5
   [25] include(mapexpr::Function, mod::Module, _path::String) @ Base ./Base.jl:301
   [26] top-level scope @ none:6
   [27] eval(m::Module, e::Any) @ Core ./boot.jl:485
   [28] exec_options(opts::Base.JLOptions) @ Base ./client.jl:295
   [29] _start() @ Base ./client.jl:558
  in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/sequential/PLaplacianTests.jl:1
2.390119 seconds (819.90 k allocations: 38.148 MiB, 34.92% gc time, 33.38% compilation time)
Test Summary:                    | Pass  Error  Total     Time
SEQUENTIAL                       |  147      6    153  5m12.2s
  PETSC                          |   10            10    22.9s
  PETScArrays                    |   28            28     8.1s
  PartitionedArrays (sequential) |   21            21  1m08.0s
  PETScLinearSolvers             |   58            58     9.1s
  PETScNonLinearSolvers          |    6             6     2.6s
  PETScAssembly                  |    8             8     8.8s
  PoissonDriver                  |   16      1     17  2m14.6s
  ElasticityDriver               |           1      1    20.7s
  DarcyDriver                    |           1      1     7.9s
  PLaplacianDriver               |           1      1     2.1s
  PoissonTests                   |           1      1    17.3s
  PLaplacianTests                |           1      1     2.4s
ERROR: LoadError: Some tests did not pass: 147 passed, 0 failed, 6 errored, 0 broken.
in expression starting at /home/pkgeval/.julia/packages/GridapPETSc/X0jXh/test/runtests.jl:1
Attempting to use an MPI routine (internal_Comm_rank) before initializing or after finalizing MPICH
Testing failed after 314.28s
ERROR: LoadError: Package GridapPETSc errored during testing
Stacktrace:
  [1] pkgerror(msg::String) @ Pkg.Types /opt/julia/share/julia/stdlib/v1.12/Pkg/src/Types.jl:68
  [2] test(ctx::Pkg.Types.Context, pkgs::Vector{Pkg.Types.PackageSpec}; coverage::Bool, julia_args::Cmd, test_args::Cmd, test_fn::Nothing, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool) @ Pkg.Operations /opt/julia/share/julia/stdlib/v1.12/Pkg/src/Operations.jl:2359
  [3] test @ /opt/julia/share/julia/stdlib/v1.12/Pkg/src/Operations.jl:2214 [inlined]
  [4] test(ctx::Pkg.Types.Context, pkgs::Vector{Pkg.Types.PackageSpec}; coverage::Bool, test_fn::Nothing, julia_args::Cmd, test_args::Cmd, force_latest_compatible_version::Bool, allow_earlier_backwards_compatible_versions::Bool, allow_reresolve::Bool, kwargs::@Kwargs{io::IOContext{IO}}) @ Pkg.API /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:481
  [5] test(pkgs::Vector{Pkg.Types.PackageSpec}; io::IOContext{IO}, kwargs::@Kwargs{julia_args::Cmd}) @ Pkg.API /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:159
  [6] test(pkgs::Vector{String}; kwargs::@Kwargs{julia_args::Cmd}) @ Pkg.API /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:147
  [7] test @ /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:147 [inlined]
  [8] #test#81 @ /opt/julia/share/julia/stdlib/v1.12/Pkg/src/API.jl:146 [inlined]
  [9] top-level scope @ /PkgEval.jl/scripts/evaluate.jl:219
 [10] include(mod::Module, _path::String) @ Base ./Base.jl:300
 [11] exec_options(opts::Base.JLOptions) @ Base ./client.jl:329
 [12] _start() @ Base ./client.jl:558
in expression starting at /PkgEval.jl/scripts/evaluate.jl:210
PkgEval failed after 484.31s: package tests unexpectedly errored
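Note: the closing MPICH message ("Attempting to use an MPI routine (internal_Comm_rank) before initializing or after finalizing MPICH") indicates that PETSc teardown still ran after MPI had been shut down. The sketch below is not part of GridapPETSc: the helper name safe_finalize is made up, and only MPI.Initialized, MPI.Finalized (standard MPI.jl calls) and GridapPETSc.Finalize (seen in the stack traces above) are assumed to exist. It illustrates one way to avoid triggering PETSc finalization once MPI is no longer usable.

    # Hypothetical guard: only finalize PETSc while MPI is still initialized.
    using MPI
    using GridapPETSc

    function safe_finalize()
      if MPI.Initialized() && !MPI.Finalized()
        GridapPETSc.Finalize()   # same call the failing tests end up in
      else
        @warn "Skipping GridapPETSc.Finalize(): MPI already finalized or never initialized"
      end
    end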