Package evaluation of XGBoost on Julia 1.11.4 (a71dd056e0*) started at 2025-04-08T17:04:35.893

################################################################################
# Set-up
# Installing PkgEval dependencies (TestEnv)...

Set-up completed after 8.35s

################################################################################
# Installation
# Installing XGBoost...

Resolving package versions...
Updating `~/.julia/environments/v1.11/Project.toml`
  [009559a3] + XGBoost v2.5.1
Updating `~/.julia/environments/v1.11/Manifest.toml`
  [1520ce14] + AbstractTrees v0.4.5
  [fa961155] + CEnum v0.5.0
  [9a962f9c] + DataAPI v1.16.0
  [e2d170a0] + DataValueInterfaces v1.0.0
  [82899510] + IteratorInterfaceExtensions v1.0.0
  [692b3bcd] + JLLWrappers v1.7.0
  [0f8b85d8] + JSON3 v1.14.2
  [bac558e1] + OrderedCollections v1.8.0
  [69de0a69] + Parsers v2.8.1
⌅ [aea7be01] + PrecompileTools v1.2.1
  [21216c6a] + Preferences v1.4.3
  [a0a7dd2c] + SparseMatricesCSR v0.6.9
  [10745b16] + Statistics v1.11.1
  [856f2bd8] + StructTypes v1.11.0
  [3783bdb8] + TableTraits v1.0.1
  [bd369af6] + Tables v1.12.0
  [009559a3] + XGBoost v2.5.1
  [4ee394cb] + CUDA_Driver_jll v0.12.1+1
  [76a88914] + CUDA_Runtime_jll v0.16.1+0
  [1d63c593] + LLVMOpenMP_jll v18.1.7+0
  [a5c6f535] + XGBoost_jll v2.0.1+0
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [f43a241f] + Downloads v1.6.0
  [7b1f6079] + FileWatching v1.11.0
  [4af54fe1] + LazyArtifacts v1.11.0
  [b27032c2] + LibCURL v0.6.4
  [76f85450] + LibGit2 v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.11.0
  [56ddb016] + Logging v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [a63ad114] + Mmap v1.11.0
  [ca575930] + NetworkOptions v1.2.0
  [44cfe95a] + Pkg v1.11.0
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization v1.11.0
  [2f01184e] + SparseArrays v1.11.0
  [4607b0f0] + SuiteSparse
  [fa267f1f] + TOML v1.0.3
  [a4e569a6] + Tar v1.10.0
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] + LibCURL_jll v8.6.0+0
  [e37daf67] + LibGit2_jll v1.7.2+0
  [29816b5a] + LibSSH2_jll v1.11.0+1
  [c8ffd9c3] + MbedTLS_jll v2.28.6+0
  [14a3606d] + MozillaCACerts_jll v2023.12.12
  [4536629a] + OpenBLAS_jll v0.3.27+1
  [bea87d4a] + SuiteSparse_jll v7.7.0+0
  [83775a58] + Zlib_jll v1.2.13+1
  [8e850b90] + libblastrampoline_jll v5.11.0+0
  [8e850ede] + nghttp2_jll v1.59.0+0
  [3f19e933] + p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
Installation completed after 8.91s

################################################################################
# Precompilation
# Precompiling PkgEval dependencies...

Precompiling package dependencies...
Precompilation completed after 219.74s

################################################################################
# Testing
# Testing XGBoost

Status `/tmp/jl_wEOKG4/Project.toml`
  [1520ce14] AbstractTrees v0.4.5
  [fa961155] CEnum v0.5.0
  [052768ef] CUDA v5.7.2
  [0f8b85d8] JSON3 v1.14.2
  [bac558e1] OrderedCollections v1.8.0
  [a0a7dd2c] SparseMatricesCSR v0.6.9
  [10745b16] Statistics v1.11.1
  [bd369af6] Tables v1.12.0
  [22787eb5] Term v2.0.7
  [009559a3] XGBoost v2.5.1
  [a5c6f535] XGBoost_jll v2.0.1+0
  [37e2e46d] LinearAlgebra v1.11.0
  [9a3f8284] Random v1.11.0
  [2f01184e] SparseArrays v1.11.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_wEOKG4/Manifest.toml`
  [621f4979] AbstractFFTs v1.5.0
  [1520ce14] AbstractTrees v0.4.5
  [79e6a3ab] Adapt v4.3.0
  [a9b6321e] Atomix v1.1.1
  [ab4f0b2a] BFloat16s v0.5.1
  [fa961155] CEnum v0.5.0
  [052768ef] CUDA v5.7.2
  [1af6417a] CUDA_Runtime_Discovery v0.3.5
  [da1fd8a2] CodeTracking v1.3.9
  [3da002f7] ColorTypes v0.12.1
  [5ae59095] Colors v0.13.0
  [34da2185] Compat v4.16.0
  [a8cc5b0e] Crayons v4.1.1
  [9a962f9c] DataAPI v1.16.0
  [a93c6f00] DataFrames v1.7.0
  [864edb3b] DataStructures v0.18.22
  [e2d170a0] DataValueInterfaces v1.0.0
  [ffbed154] DocStringExtensions v0.9.4
  [e2ba6199] ExprTools v0.1.10
  [53c48c17] FixedPointNumbers v0.8.5
  [0c68f7d7] GPUArrays v11.2.2
  [46192b85] GPUArraysCore v0.2.0
  [61eb1bfa] GPUCompiler v1.3.2
  [096a3bc2] GPUToolbox v0.2.0
  [076d061b] HashArrayMappedTries v0.2.0
  [eafb193a] Highlights v0.5.3
  [842dd82b] InlineStrings v1.4.3
  [41ab1584] InvertedIndices v1.3.1
  [82899510] IteratorInterfaceExtensions v1.0.0
  [692b3bcd] JLLWrappers v1.7.0
  [0f8b85d8] JSON3 v1.14.2
  [63c18a36] KernelAbstractions v0.9.34
  [929cbde3] LLVM v9.2.0
  [8b046642] LLVMLoopInfo v1.0.0
  [b964fa9f] LaTeXStrings v1.4.0
  [1914dd2f] MacroTools v0.5.15
  [e1d29d7a] Missings v1.2.0
  [1c23619d] MyterialColors v0.3.0
  [5da4648a] NVTX v1.0.0
  [bac558e1] OrderedCollections v1.8.0
  [d96e819e] Parameters v0.12.3
  [69de0a69] Parsers v2.8.1
  [2dfb63ee] PooledArrays v1.4.3
⌅ [aea7be01] PrecompileTools v1.2.1
  [21216c6a] Preferences v1.4.3
  [08abe8d2] PrettyTables v2.4.0
  [33c8b6b6] ProgressLogging v0.1.4
  [74087812] Random123 v1.7.0
  [e6cf234a] RandomNumbers v1.6.0
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.1
  [7e506255] ScopedValues v1.3.0
  [6c6a2e73] Scratch v1.2.1
  [91c51154] SentinelArrays v1.4.8
  [a2af1166] SortingAlgorithms v1.2.1
  [a0a7dd2c] SparseMatricesCSR v0.6.9
  [90137ffa] StaticArrays v1.9.13
  [1e83bf80] StaticArraysCore v1.4.3
  [10745b16] Statistics v1.11.1
  [892a3eda] StringManipulation v0.4.1
  [856f2bd8] StructTypes v1.11.0
  [3783bdb8] TableTraits v1.0.1
  [bd369af6] Tables v1.12.0
  [22787eb5] Term v2.0.7
  [a759f4b9] TimerOutputs v0.5.28
  [3a884ed6] UnPack v1.0.2
  [1cfade01] UnicodeFun v0.4.1
  [013be700] UnsafeAtomics v0.3.0
  [009559a3] XGBoost v2.5.1
  [4ee394cb] CUDA_Driver_jll v0.12.1+1
  [76a88914] CUDA_Runtime_jll v0.16.1+0
  [9c1d0b0a] JuliaNVTXCallbacks_jll v0.2.1+0
  [dad2f222] LLVMExtra_jll v0.0.35+0
  [1d63c593] LLVMOpenMP_jll v18.1.7+0
  [e98f9f5b] NVTX_jll v3.1.1+0
  [a5c6f535] XGBoost_jll v2.0.1+0
  [1e29f10c] demumble_jll v1.3.0+0
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [ade2ca70] Dates v1.11.0
  [f43a241f] Downloads v1.6.0
  [7b1f6079] FileWatching v1.11.0
  [9fa8497b] Future v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [4af54fe1] LazyArtifacts v1.11.0
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.11.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [a63ad114] Mmap v1.11.0
  [ca575930] NetworkOptions v1.2.0
  [44cfe95a] Pkg v1.11.0
  [de0858da] Printf v1.11.0
  [3fa0cd96] REPL v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization v1.11.0
  [6462fe0b] Sockets v1.11.0
  [2f01184e] SparseArrays v1.11.0
  [f489334b] StyledStrings v1.11.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] LibCURL_jll v8.6.0+0
  [e37daf67] LibGit2_jll v1.7.2+0
  [29816b5a] LibSSH2_jll v1.11.0+1
  [c8ffd9c3] MbedTLS_jll v2.28.6+0
  [14a3606d] MozillaCACerts_jll v2023.12.12
  [4536629a] OpenBLAS_jll v0.3.27+1
  [bea87d4a] SuiteSparse_jll v7.7.0+0
  [83775a58] Zlib_jll v1.2.13+1
  [8e850b90] libblastrampoline_jll v5.11.0+0
  [8e850ede] nghttp2_jll v1.59.0+0
  [3f19e933] p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading.
Testing Running tests...
Precompiling AtomixCUDAExt...
  16364.4 ms ✓ Atomix → AtomixCUDAExt
  1 dependency successfully precompiled in 18 seconds. 98 already precompiled.
[ Info: XGBoost: starting training.
[ Info: [1] train-rmsle:0.09516384827189184 train-rmse:0.12458323318968342 eval-rmsle:0.09311178558380145 eval-rmse:0.12088154560829874
[ Info: [2] train-rmsle:0.03395206101636909 train-rmse:0.04351749281779705 eval-rmsle:0.03373299245987331 eval-rmse:0.04277789053406580
[ Info: [3] train-rmsle:0.01331363722758162 train-rmse:0.01709045333696795 eval-rmsle:0.01331696804112428 eval-rmse:0.01670001211886211
[ Info: [4] train-rmsle:0.00589786198558646 train-rmse:0.00797049360684598 eval-rmsle:0.00579506668724286 eval-rmse:0.00738355084242434
[ Info: [5] train-rmsle:0.00366354179357722 train-rmse:0.00532751693094597 eval-rmsle:0.00349203361333551 eval-rmse:0.00446209115554762
[ Info: [6] train-rmsle:0.00243987827289096 train-rmse:0.00372948872389998 eval-rmsle:0.00230696799980833 eval-rmse:0.00299441031049603
[ Info: [7] train-rmsle:0.00233072197811878 train-rmse:0.00351171869094617 eval-rmsle:0.00198743457613418 eval-rmse:0.00232108134724293
[ Info: [8] train-rmsle:0.00187464304048571 train-rmse:0.00268984631891293 eval-rmsle:0.00183314865051830 eval-rmse:0.00218473839382444
[ Info: [9] train-rmsle:0.00159687119873875 train-rmse:0.00248948409447635 eval-rmsle:0.00149587022150816 eval-rmse:0.00193510625680311
[ Info: [10] train-rmsle:0.00140161122736119 train-rmse:0.00208175014146188 eval-rmsle:0.00140072429358725 eval-rmse:0.00175193085203870
[ Info: [11] train-rmsle:0.00133461832044545 train-rmse:0.00185869184937647 eval-rmsle:0.00139425759891176 eval-rmse:0.00171232580418471
[ Info: [12] train-rmsle:0.00125521370789571 train-rmse:0.00186169247253542 eval-rmsle:0.00128924998271972 eval-rmse:0.00167666615775793
[ Info: [13] train-rmsle:0.00126216218475461 train-rmse:0.00179938516954769 eval-rmsle:0.00135006814414959 eval-rmse:0.00174141040367590
[ Info: [14] train-rmsle:0.00116890511458032 train-rmse:0.00170827649955664 eval-rmsle:0.00119777567702527 eval-rmse:0.00145940207453431
[ Info: [15] train-rmsle:0.00118974353584588 train-rmse:0.00177087078369563 eval-rmsle:0.00121096513763090 eval-rmse:0.00151873815398286
[ Info: [16] train-rmsle:0.00115370292588551 train-rmse:0.00170877055944155 eval-rmsle:0.00116580214824423 eval-rmse:0.00147338413425719
[ Info: [17] train-rmsle:0.00115343224161791 train-rmse:0.00170742032679273 eval-rmsle:0.00116769293487799 eval-rmse:0.00150531399852515
[ Info: [18] train-rmsle:0.00116895813804995 train-rmse:0.00176534614117505 eval-rmsle:0.00116757024613724 eval-rmse:0.00153388579192881
[ Info: [19] train-rmsle:0.00115367151344788 train-rmse:0.00169788673981039 eval-rmsle:0.00117949399272166 eval-rmse:0.00154123625515960
[ Info: [20] train-rmsle:0.00115139182701327 train-rmse:0.00172168335108636 eval-rmsle:0.00116092991579580 eval-rmse:0.00150312512255332
[ Info: [21] train-rmsle:0.00114728041666402 train-rmse:0.00167748745739459 eval-rmsle:0.00118796190841522 eval-rmse:0.00153513114044318
[ Info: [22] train-rmsle:0.00117711407295660 train-rmse:0.00169646826244684 eval-rmsle:0.00121855834339621 eval-rmse:0.00152822453884797
[ Info: [23] train-rmsle:0.00118725861738746 train-rmse:0.00174999975846509 eval-rmsle:0.00121350033617140 eval-rmse:0.00155144268305313
[ Info: [24] train-rmsle:0.00119284726384868 train-rmse:0.00173113255705680 eval-rmsle:0.00123989025146443 eval-rmse:0.00158063921115342
[ Info: [25] train-rmsle:0.00118569258609211 train-rmse:0.00179151713214355 eval-rmsle:0.00115788172083162 eval-rmse:0.00144977114457099
[ Info: [26] train-rmsle:0.00117353919747055 train-rmse:0.00173753487411019 eval-rmsle:0.00117963404450325 eval-rmse:0.00148180408088154
[ Info: [27] train-rmsle:0.00117541774685521 train-rmse:0.00170835272237206 eval-rmsle:0.00120768725948632 eval-rmse:0.00151422918331702
[ Info: [28] train-rmsle:0.00118625369766517 train-rmse:0.00170582353689105 eval-rmsle:0.00122934585122194 eval-rmse:0.00152303975267601
[ Info: [29] train-rmsle:0.00119239251035479 train-rmse:0.00174310319569286 eval-rmsle:0.00122433257245684 eval-rmse:0.00153754057432816
[ Info: [30] train-rmsle:0.00117908433842026 train-rmse:0.00174496072333165 eval-rmsle:0.00119202599693146 eval-rmse:0.00150090543620498
[ Info: Training rounds complete.
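The 30-round run above is consistent with calling XGBoost.jl's `xgboost` entry point with a two-entry watchlist, so that every round reports each metric for both the train and eval DMatrix. A minimal sketch, assuming hypothetical data and the `watchlist`/`eval_metric` keywords of recent XGBoost.jl releases (the actual test fixtures and parameters are not shown in this log):

    using XGBoost, OrderedCollections

    # Hypothetical data standing in for the test fixtures (non-negative, so rmsle is defined).
    X,  y  = rand(500, 4), rand(500)
    Xe, ye = rand(200, 4), rand(200)
    dtrain = DMatrix(X, y)
    deval  = DMatrix(Xe, ye)

    # Each watchlist entry is evaluated with every metric after each boosting round,
    # which yields per-round train-rmsle/train-rmse/eval-rmsle/eval-rmse lines like those above.
    bst = xgboost(dtrain;
                  num_round=30,
                  watchlist=OrderedDict("train" => dtrain, "eval" => deval),
                  eval_metric=["rmsle", "rmse"])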
[ Info: XGBoost: starting training.
[ Info: Will train until there has been no improvement in 2 rounds.
[ Info: [1] train-rmsle:0.09516384827189184 train-rmse:0.12458323318968342 eval-rmsle:0.09311178558380145 eval-rmse:0.12088154560829874
[ Info: [2] train-rmsle:0.03395206101636909 train-rmse:0.04351749281779705 eval-rmsle:0.03373299245987331 eval-rmse:0.04277789053406580
[ Info: [3] train-rmsle:0.01331363722758162 train-rmse:0.01709045333696795 eval-rmsle:0.01331696804112428 eval-rmse:0.01670001211886211
[ Info: [4] train-rmsle:0.00589786198558646 train-rmse:0.00797049360684598 eval-rmsle:0.00579506668724286 eval-rmse:0.00738355084242434
[ Info: [5] train-rmsle:0.00366354179357722 train-rmse:0.00532751693094597 eval-rmsle:0.00349203361333551 eval-rmse:0.00446209115554762
[ Info: [6] train-rmsle:0.00243987827289096 train-rmse:0.00372948872389998 eval-rmsle:0.00230696799980833 eval-rmse:0.00299441031049603
[ Info: [7] train-rmsle:0.00233072197811878 train-rmse:0.00351171869094617 eval-rmsle:0.00198743457613418 eval-rmse:0.00232108134724293
[ Info: [8] train-rmsle:0.00187464304048571 train-rmse:0.00268984631891293 eval-rmsle:0.00183314865051830 eval-rmse:0.00218473839382444
[ Info: [9] train-rmsle:0.00159687119873875 train-rmse:0.00248948409447635 eval-rmsle:0.00149587022150816 eval-rmse:0.00193510625680311
[ Info: [10] train-rmsle:0.00140161122736119 train-rmse:0.00208175014146188 eval-rmsle:0.00140072429358725 eval-rmse:0.00175193085203870
[ Info: [11] train-rmsle:0.00133461832044545 train-rmse:0.00185869184937647 eval-rmsle:0.00139425759891176 eval-rmse:0.00171232580418471
[ Info: [12] train-rmsle:0.00125521370789571 train-rmse:0.00186169247253542 eval-rmsle:0.00128924998271972 eval-rmse:0.00167666615775793
[ Info: [13] train-rmsle:0.00126216218475461 train-rmse:0.00179938516954769 eval-rmsle:0.00135006814414959 eval-rmse:0.00174141040367590
[ Info: [14] train-rmsle:0.00116890511458032 train-rmse:0.00170827649955664 eval-rmsle:0.00119777567702527 eval-rmse:0.00145940207453431
[ Info: [15] train-rmsle:0.00118974353584588 train-rmse:0.00177087078369563 eval-rmsle:0.00121096513763090 eval-rmse:0.00151873815398286
[ Info: [16] train-rmsle:0.00115370292588551 train-rmse:0.00170877055944155 eval-rmsle:0.00116580214824423 eval-rmse:0.00147338413425719
┌ Info: Xgboost: Stopping.
│ Best iteration: 14.
└ No improvement in eval-rmse result in 2 rounds.
[ Info: Training rounds complete.
[ Info: XGBoost: starting training.
[ Info: Will train until there has been no improvement in 2 rounds.
[ Info: [1] train-rmsle:0.09516384827189184 train-rmse:0.12458323318968342 eval-rmsle:0.09311178558380145 eval-rmse:0.12088154560829874
[ Info: [2] train-rmsle:0.03395206101636909 train-rmse:0.04351749281779705 eval-rmsle:0.03373299245987331 eval-rmse:0.04277789053406580
[ Info: [3] train-rmsle:0.01331363722758162 train-rmse:0.01709045333696795 eval-rmsle:0.01331696804112428 eval-rmse:0.01670001211886211
[ Info: [4] train-rmsle:0.00589786198558646 train-rmse:0.00797049360684598 eval-rmsle:0.00579506668724286 eval-rmse:0.00738355084242434
[ Info: [5] train-rmsle:0.00366354179357722 train-rmse:0.00532751693094597 eval-rmsle:0.00349203361333551 eval-rmse:0.00446209115554762
[ Info: [6] train-rmsle:0.00243987827289096 train-rmse:0.00372948872389998 eval-rmsle:0.00230696799980833 eval-rmse:0.00299441031049603
[ Info: [7] train-rmsle:0.00233072197811878 train-rmse:0.00351171869094617 eval-rmsle:0.00198743457613418 eval-rmse:0.00232108134724293
[ Info: [8] train-rmsle:0.00187464304048571 train-rmse:0.00268984631891293 eval-rmsle:0.00183314865051830 eval-rmse:0.00218473839382444
[ Info: [9] train-rmsle:0.00159687119873875 train-rmse:0.00248948409447635 eval-rmsle:0.00149587022150816 eval-rmse:0.00193510625680311
[ Info: [10] train-rmsle:0.00140161122736119 train-rmse:0.00208175014146188 eval-rmsle:0.00140072429358725 eval-rmse:0.00175193085203870
[ Info: [11] train-rmsle:0.00133461832044545 train-rmse:0.00185869184937647 eval-rmsle:0.00139425759891176 eval-rmse:0.00171232580418471
[ Info: [12] train-rmsle:0.00125521370789571 train-rmse:0.00186169247253542 eval-rmsle:0.00128924998271972 eval-rmse:0.00167666615775793
[ Info: [13] train-rmsle:0.00126216218475461 train-rmse:0.00179938516954769 eval-rmsle:0.00135006814414959 eval-rmse:0.00174141040367590
[ Info: [14] train-rmsle:0.00116890511458032 train-rmse:0.00170827649955664 eval-rmsle:0.00119777567702527 eval-rmse:0.00145940207453431
[ Info: [15] train-rmsle:0.00118974353584588 train-rmse:0.00177087078369563 eval-rmsle:0.00121096513763090 eval-rmse:0.00151873815398286
[ Info: [16] train-rmsle:0.00115370292588551 train-rmse:0.00170877055944155 eval-rmsle:0.00116580214824423 eval-rmse:0.00147338413425719
┌ Info: Xgboost: Stopping.
│ Best iteration: 14.
└ No improvement in eval-rmse result in 2 rounds.
[ Info: Training rounds complete.
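The two early-stopped runs above halt at round 16 with best iteration 14, which matches adding an early-stopping keyword on top of the same watchlist: training stops once the last metric of the last watchlist entry (here eval-rmse) has not improved for two consecutive rounds. A hedged sketch, assuming the `early_stopping_rounds` keyword of recent XGBoost.jl releases and the same hypothetical data as the previous snippet:

    using XGBoost, OrderedCollections

    # Hypothetical data as in the previous sketch.
    dtrain = DMatrix(rand(500, 4), rand(500))
    deval  = DMatrix(rand(200, 4), rand(200))

    # Stop once eval-rmse (last metric of the last watchlist entry) fails to
    # improve for 2 consecutive rounds; the log above reports best iteration 14.
    bst = xgboost(dtrain;
                  num_round=30,
                  watchlist=OrderedDict("train" => dtrain, "eval" => deval),
                  eval_metric=["rmsle", "rmse"],
                  early_stopping_rounds=2)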
[ Info: XGBoost: starting training.
[ Info: Will train until there has been no improvement in 2 rounds.
[ Info: [1] train-rmsle:0.09400195477049222 train-rmse:0.12322671770529799 eval-rmsle:0.09313339368737485 eval-rmse:0.12120867271092552
[ Info: [2] train-rmsle:0.03487183368152718 train-rmse:0.04513222059761323 eval-rmsle:0.03442407220821330 eval-rmse:0.04376119871504091
[ Info: [3] train-rmsle:0.01345630651976374 train-rmse:0.01758327052905027 eval-rmsle:0.01325199322876083 eval-rmse:0.01673821606383434
[ Info: [4] train-rmsle:0.00575009007956296 train-rmse:0.00778971169020835 eval-rmsle:0.00562688547525238 eval-rmse:0.00718863276872639
[ Info: [5] train-rmsle:0.00331128325832219 train-rmse:0.00470488682222526 eval-rmsle:0.00322093534141225 eval-rmse:0.00415931299415092
[ Info: [6] train-rmsle:0.00251547565671247 train-rmse:0.00371619449834243 eval-rmsle:0.00239707351497643 eval-rmse:0.00311402815052861
[ Info: [7] train-rmsle:0.00212331914770491 train-rmse:0.00311067940898930 eval-rmsle:0.00208087226742323 eval-rmse:0.00273267001999106
[ Info: [8] train-rmsle:0.00191662074216459 train-rmse:0.00293733945465145 eval-rmsle:0.00186748114606016 eval-rmse:0.00258671918942795
[ Info: [9] train-rmsle:0.00176634997932151 train-rmse:0.00286029486007627 eval-rmsle:0.00172351167437055 eval-rmse:0.00258621515555209
[ Info: [10] train-rmsle:0.00161890495856948 train-rmse:0.00253080566247204 eval-rmsle:0.00161846783359263 eval-rmse:0.00233267001813942
[ Info: [11] train-rmsle:0.00162215900030326 train-rmse:0.00248301376055555 eval-rmsle:0.00163788081627377 eval-rmse:0.00230947020252175
[ Info: [12] train-rmsle:0.00152668274892861 train-rmse:0.00222891392437491 eval-rmsle:0.00156210387491020 eval-rmse:0.00214132525956894
[ Info: [13] train-rmsle:0.00146300580644028 train-rmse:0.00224000609872601 eval-rmsle:0.00148696817561512 eval-rmse:0.00214482901904615
[ Info: [14] train-rmsle:0.00138691384716277 train-rmse:0.00200881845215055 eval-rmsle:0.00141664950189492 eval-rmse:0.00194508359022985
[ Info: [15] train-rmsle:0.00136958978925420 train-rmse:0.00206230100598218 eval-rmsle:0.00142164104764360 eval-rmse:0.00205109524376891
[ Info: [16] train-rmsle:0.00132957857546709 train-rmse:0.00196568270020401 eval-rmsle:0.00136180902879615 eval-rmse:0.00191012708466523
[ Info: [17] train-rmsle:0.00130170026854564 train-rmse:0.00199074283345849 eval-rmsle:0.00133664716019292 eval-rmse:0.00193702268248683
[ Info: [18] train-rmsle:0.00128327777864138 train-rmse:0.00194318724807855 eval-rmsle:0.00133940096623383 eval-rmse:0.00194758163510130
┌ Info: Xgboost: Stopping.
│ Best iteration: 16.
└ No improvement in eval-rmse result in 2 rounds.
[ Info: Training rounds complete.
[ Info: XGBoost: starting training.
[ Info: Will train until there has been no improvement in 2 rounds.
[ Info: [1] train-rmsle:0.09516384827189184 train-rmse:0.12458323318968342
[ Info: [2] train-rmsle:0.03395206101636909 train-rmse:0.04351749281779705
[ Info: [3] train-rmsle:0.01331363722758162 train-rmse:0.01709045333696795
[ Info: [4] train-rmsle:0.00589786198558646 train-rmse:0.00797049360684598
[ Info: [5] train-rmsle:0.00366354179357722 train-rmse:0.00532751693094597
[ Info: [6] train-rmsle:0.00243987827289096 train-rmse:0.00372948872389998
[ Info: [7] train-rmsle:0.00233072197811878 train-rmse:0.00351171869094617
[ Info: [8] train-rmsle:0.00187464304048571 train-rmse:0.00268984631891293
[ Info: [9] train-rmsle:0.00159687119873875 train-rmse:0.00248948409447635
[ Info: [10] train-rmsle:0.00140161122736119 train-rmse:0.00208175014146188
[ Info: [11] train-rmsle:0.00133461832044545 train-rmse:0.00185869184937647
[ Info: [12] train-rmsle:0.00125521370789571 train-rmse:0.00186169247253542
[ Info: [13] train-rmsle:0.00126216218475461 train-rmse:0.00179938516954769
[ Info: [14] train-rmsle:0.00116890511458032 train-rmse:0.00170827649955664
[ Info: [15] train-rmsle:0.00118974353584588 train-rmse:0.00177087078369563
[ Info: [16] train-rmsle:0.00115370292588551 train-rmse:0.00170877055944155
┌ Info: Xgboost: Stopping.
│ Best iteration: 14.
└ No improvement in train-rmse result in 2 rounds.
[ Info: Training rounds complete.
[ Info: XGBoost: starting training.
[ Info: [1] train-rmse:0.98377194542124635
[ Info: [2] train-rmse:0.80821988024796854
[ Info: [3] train-rmse:0.67041944537635045
[ Info: [4] train-rmse:0.56305742199033382
[ Info: [5] train-rmse:0.47691942771740453
[ Info: [6] train-rmse:0.40445986325425781
[ Info: [7] train-rmse:0.34316456717660027
[ Info: [8] train-rmse:0.29128295989145248
[ Info: [9] train-rmse:0.24735806556534279
[ Info: [10] train-rmse:0.21015731819519615
[ Info: Training rounds complete.
Test Summary: | Pass  Total     Time
XGBoost       |   57     57  1m27.3s
Testing XGBoost tests passed
Testing completed after 154.25s
PkgEval succeeded after 411.42s
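The last two runs differ only in their watchlists: with a train-only watchlist the same stopping rule monitors train-rmse, and the final 10-round run is consistent with a plain call that keeps the default watchlist and the default regression metric (rmse). A sketch under the same assumptions as the previous snippets (hypothetical data; keyword names taken from recent XGBoost.jl releases):

    using XGBoost, OrderedCollections

    X, y   = rand(500, 4), rand(500)    # hypothetical data, as before
    dtrain = DMatrix(X, y)

    # Train-only watchlist: early stopping now monitors train-rmse instead of eval-rmse.
    bst = xgboost(dtrain;
                  num_round=30,
                  watchlist=OrderedDict("train" => dtrain),
                  eval_metric=["rmsle", "rmse"],
                  early_stopping_rounds=2)

    # Plain 10-round fit with the default watchlist and metric, reporting only train-rmse.
    bst2 = xgboost((X, y); num_round=10)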