Package evaluation of SDeMo on Julia 1.13.0-DEV.443 (8a8e3d1f98*) started at 2025-04-29T02:26:26.202

################################################################################
# Set-up
# Installing PkgEval dependencies (TestEnv)...

Set-up completed after 8.46s

################################################################################
# Installation
# Installing SDeMo...

Resolving package versions...
Updating `~/.julia/environments/v1.13/Project.toml`
  [3e5feb82] + SDeMo v1.4.0
Updating `~/.julia/environments/v1.13/Manifest.toml`
  [66dad0bd] + AliasTables v1.1.3
  [7d9fca2a] + Arpack v0.5.4
  [34da2185] + Compat v4.16.0
  [9a962f9c] + DataAPI v1.16.0
  [864edb3b] + DataStructures v0.18.22
  [31c24e10] + Distributions v0.25.119
  [ffbed154] + DocStringExtensions v0.9.4
  [1a297f60] + FillArrays v1.13.0
  [34004b35] + HypergeometricFunctions v0.3.28
  [92d709cd] + IrrationalConstants v0.2.4
  [692b3bcd] + JLLWrappers v1.7.0
  [682c06a0] + JSON v0.21.4
  [2ab3a3ac] + LogExpFunctions v0.3.29
  [e1d29d7a] + Missings v1.2.0
  [6f286f6a] + MultivariateStats v0.10.3
  [bac558e1] + OrderedCollections v1.8.0
  [90014a1f] + PDMats v0.11.34
  [69de0a69] + Parsers v2.8.3
  [aea7be01] + PrecompileTools v1.3.2
  [21216c6a] + Preferences v1.4.3
  [43287f4e] + PtrArrays v1.3.0
  [1fd47b50] + QuadGK v2.11.2
  [189a3867] + Reexport v1.2.2
  [79098fc4] + Rmath v0.8.0
  [3e5feb82] + SDeMo v1.4.0
  [a2af1166] + SortingAlgorithms v1.2.1
  [276daf66] + SpecialFunctions v2.5.1
  [10745b16] + Statistics v1.11.1
  [82ae8749] + StatsAPI v1.7.0
  [2913bbd2] + StatsBase v0.34.4
  [4c63d2b9] + StatsFuns v1.5.0
  [1c621080] + TestItems v1.0.0
⌅ [68821587] + Arpack_jll v3.5.1+1
  [efe28fd5] + OpenSpecFun_jll v0.5.6+0
  [f50d1b31] + Rmath_jll v0.5.1+0
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [f43a241f] + Downloads v1.7.0
  [7b1f6079] + FileWatching v1.11.0
  [b77e0a4c] + InteractiveUtils v1.11.0
  [ac6e5ff7] + JuliaSyntaxHighlighting v1.12.0
  [b27032c2] + LibCURL v0.6.4
  [76f85450] + LibGit2 v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.12.0
  [56ddb016] + Logging v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [a63ad114] + Mmap v1.11.0
  [ca575930] + NetworkOptions v1.3.0
  [44cfe95a] + Pkg v1.12.0
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization v1.11.0
  [2f01184e] + SparseArrays v1.12.0
  [f489334b] + StyledStrings v1.11.0
  [4607b0f0] + SuiteSparse
  [fa267f1f] + TOML v1.0.3
  [a4e569a6] + Tar v1.10.0
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] + LibCURL_jll v8.12.1+1
  [e37daf67] + LibGit2_jll v1.9.0+0
  [29816b5a] + LibSSH2_jll v1.11.3+1
  [14a3606d] + MozillaCACerts_jll v2024.12.31
  [4536629a] + OpenBLAS_jll v0.3.29+0
  [05823500] + OpenLibm_jll v0.8.5+0
  [458c3c95] + OpenSSL_jll v3.0.16+0
  [bea87d4a] + SuiteSparse_jll v7.10.1+0
  [83775a58] + Zlib_jll v1.3.1+2
  [8e850b90] + libblastrampoline_jll v5.12.0+0
  [8e850ede] + nghttp2_jll v1.65.0+0
  [3f19e933] + p7zip_jll v17.5.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`

Installation completed after 4.09s
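Only Arpack_jll is held back from its latest release here (the ⌅ marker). The `status --outdated -m` hint in the listing above has a functional equivalent in the Pkg API; a minimal sketch, using the standard `outdated` and `mode` keywords of Pkg.status:

    using Pkg

    # Functional equivalent of the `status --outdated -m` REPL hint:
    # show manifest packages that compatibility constraints hold back.
    Pkg.status(outdated = true, mode = Pkg.PKGMODE_MANIFEST)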
################################################################################
# Precompilation
# Precompiling PkgEval dependencies...

Precompiling package dependencies...

Precompilation completed after 36.0s

################################################################################
# Testing
# Testing SDeMo

Status `/tmp/jl_2PvGlf/Project.toml`
  [3e5feb82] SDeMo v1.4.0
  [10745b16] Statistics v1.11.1
  [f8b46487] TestItemRunner v1.1.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_2PvGlf/Manifest.toml`
  [66dad0bd] AliasTables v1.1.3
  ⋮ (same dependency set as the installation manifest above, without the `+` markers, plus [8dfed614] Test v1.11.0 and [f8b46487] TestItemRunner v1.1.0)
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading.

Testing Running tests...

┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 2 vars.] MCC val. ≈ -0.0
[ Info: [ 3 vars.] MCC val. ≈ 0.531
[ Info: Optimal var. pool: [1, 12, 2]
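The pair of warnings above, and the backwardselection! warnings that follow, come from the test suite exercising SDeMo's deprecated variable-selection entry points alongside the new unified one. A minimal migration sketch, assuming an already-constructed model `sdm`; the exact argument shape of `variables!` is inferred from the warning text, not from the package documentation:

    using SDeMo

    # Deprecated spellings still exercised by the test suite:
    #   forwardselection!(sdm, ...)
    #   backwardselection!(sdm, ...)
    # Replacements suggested by the warnings (argument shape assumed):
    variables!(sdm, ForwardSelection)
    variables!(sdm, BackwardSelection)

Both strategies step one variable at a time, score each candidate pool with the validation MCC reported in the [ Info: ...] lines, and keep the best-scoring pool.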
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 0 vars.] MCC val. ≈ -0.0
[ Info: [ 1 vars.] MCC val. ≈ 0.767
[ Info: [ 2 vars.] MCC val. ≈ 0.787
[ Info: [ 3 vars.] MCC val. ≈ 0.814
[ Info: Optimal var. pool: [1, 3, 14]
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
[ Info: [19 vars.] MCC val. ≈ -0.0
[ Info: [18 vars.] MCC val. ≈ 0.515
[ Info: Optimal var. pool: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 14, 15, 16, 17, 18, 19]
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
[ Info: [ 9 vars.] MCC val. ≈ -0.0
[ Info: [ 8 vars.] MCC val. ≈ 0.807
[ Info: [ 7 vars.] MCC val. ≈ 0.828
[ Info: [ 6 vars.] MCC val. ≈ 0.835
[ Info: [ 5 vars.] MCC val. ≈ 0.85
[ Info: [ 4 vars.] MCC val. ≈ 0.855
[ Info: Optimal var. pool: [2, 6, 7, 8]
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 2 vars.] MCC val. ≈ -0.0
[ Info: [ 3 vars.] MCC val. ≈ 0.437
[ Info: Optimal var. pool: [12, 13, 1]
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
⋮ (the warning above is emitted 12 times in a row, once per VIF step)
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 0 vars.] MCC val. ≈ -0.0
[ Info: [ 1 vars.] MCC val. ≈ 0.717
[ Info: [ 2 vars.] MCC val. ≈ 0.738
[ Info: [ 3 vars.] MCC val. ≈ 0.752
[ Info: Optimal var. pool: [8, 5, 15]
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
⋮ (the warning above is emitted 14 times in a row, once per VIF step)
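The long runs of stepwisevif! warnings are one warning per elimination step. The criterion involved is the standard variance inflation factor; the helper below only illustrates that quantity and is not SDeMo's implementation:

    using Statistics

    # VIF of predictor j: 1 / (1 - R²), where R² comes from regressing
    # column j of X on all remaining columns (with an intercept term).
    function vif(X::AbstractMatrix, j::Integer)
        y = X[:, j]
        Z = hcat(ones(size(X, 1)), X[:, setdiff(1:size(X, 2), j)])
        fitted = Z * (Z \ y)                    # ordinary least squares
        R2 = 1 - sum(abs2, y .- fitted) / sum(abs2, y .- mean(y))
        return 1 / (1 - R2)
    end

As with the selection routines, the warning text names the replacement, presumably a call of the form variables!(sdm, VarianceInflationFactor) with whatever extra arguments the deprecated stepwisevif! took.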
[ Info: Baseline mcc: -2.8367742685852565e-17
[ Info: Optimal 1 variables model - mcc ≈ 0.7479
[ Info: Optimal 2 variables model - mcc ≈ 0.7554
[ Info: Optimal 3 variables model - mcc ≈ 0.7747
[ Info: Returning model with 4 variables - mcc ≈ 0.7747
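The scores in these lines are Matthews correlation coefficients (MCC): the baseline of -2.8e-17 is zero up to floating-point rounding, i.e. a chance-level model, and the search returns a four-variable model once the score plateaus at ≈ 0.7747. For reference, a self-contained sketch of the standard MCC formula (SDeMo computes this internally; the function below is only illustrative):

    # Matthews correlation coefficient from confusion-matrix counts;
    # -1 is total disagreement, 0 is chance level, +1 is perfect prediction.
    function matthews(tp, tn, fp, fn)
        num = tp * tn - fp * fn
        den = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return den == 0 ? 0.0 : num / den
    end

    matthews(50, 40, 5, 5)    # ≈ 0.798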
[ Info: [ 0%] LOSS: training ≈ 0.5 validation ≈ 0.4765 ( 95%)
[ Info: [ 1%] LOSS: training ≈ 0.4302 validation ≈ 0.401 ( 93%)
[ Info: [ 2%] LOSS: training ≈ 0.396 validation ≈ 0.3609 ( 91%)
[ Info: [ 2%] LOSS: training ≈ 0.3758 validation ≈ 0.336 ( 89%)
[ Info: [ 2%] LOSS: training ≈ 0.3624 validation ≈ 0.3188 ( 88%)
[ Info: [ 3%] LOSS: training ≈ 0.3528 validation ≈ 0.3062 ( 87%)
[ Info: [ 4%] LOSS: training ≈ 0.3454 validation ≈ 0.2965 ( 86%)
⋮ (both losses keep shrinking smoothly; the per-percent progress lines up to 98% are omitted)
[ Info: [ 99%] LOSS: training ≈ 0.266 validation ≈ 0.2062 ( 77%)
[ Info: [100%] LOSS: training ≈ 0.266 validation ≈ 0.2061 ( 77%)
[ Info: [100%] LOSS: training ≈ 0.266 validation ≈ 0.2061 ( 77%)
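The trace above logs both training and validation loss at each percent of an iterative fit; the trailing percentage is the validation-to-training ratio (for instance 0.2061 / 0.266 ≈ 77%). The second trace, below, monitors training loss alone. The loop here is a generic sketch of this kind of monitored gradient-descent fit for a logistic model; it is not SDeMo's trainer, and every name and parameter in it is illustrative:

    using Statistics

    logistic(t) = 1 / (1 + exp(-t))
    xent(p, y) = mean(@. -y * log(p) - (1 - y) * log(1 - p))  # cross-entropy

    function fit_logistic(Xtr, ytr, Xval, yval; epochs = 500, lr = 0.1)
        w = zeros(size(Xtr, 2))
        step = max(1, epochs ÷ 100)          # log roughly once per percent
        for e in 1:epochs
            p = logistic.(Xtr * w)
            w .-= lr .* (Xtr' * (p .- ytr)) ./ length(ytr)  # loss gradient
            if e % step == 0
                Ltr = xent(logistic.(Xtr * w), ytr)
                Lval = xent(logistic.(Xval * w), yval)
                pct = round(Int, 100 * e / epochs)
                @info "[$pct%] LOSS: training ≈ $(round(Ltr; digits = 4)) validation ≈ $(round(Lval; digits = 4))"
            end
        end
        return w
    end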
[ Info: [ 0%] LOSS: training ≈ 0.4215
[ Info: [ 1%] LOSS: training ≈ 0.3806
[ Info: [ 2%] LOSS: training ≈ 0.3564
[ Info: [ 2%] LOSS: training ≈ 0.3399
[ Info: [ 2%] LOSS: training ≈ 0.3279
[ Info: [ 3%] LOSS: training ≈ 0.3185
[ Info: [ 4%] LOSS: training ≈ 0.311
⋮ (the training loss flattens out near 0.2522 by the halfway mark; per-percent lines omitted)
[ Info: [ 99%] LOSS: training ≈ 0.2522
[ Info: [100%] LOSS: training ≈ 0.2522
[ Info: [100%] LOSS: training ≈ 0.2522

Test Summary: | Pass  Total      Time
Package       |  294    294  22m54.3s
Testing SDeMo tests passed

Testing completed after 1436.87s
PkgEval succeeded after 1507.58s
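For reference, the run can be reproduced outside of PkgEval with stock Pkg commands (the Julia used here is a development build, 1.13.0-DEV, so a release version may behave slightly differently):

    using Pkg

    # Install the tested release, then run the same suite PkgEval ran.
    Pkg.add(name = "SDeMo", version = "1.4.0")
    Pkg.test("SDeMo")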