Package evaluation to test SDeMo on Julia 1.14.0-DEV.1840 (ab7ba34e40*) started at 2026-03-03T18:07:30.629

################################################################################
# Set-up

# Installing PkgEval dependencies (TestEnv)...
Activating project at `~/.julia/environments/v1.14`
Set-up completed after 13.81s

################################################################################
# Installation

# Installing SDeMo...
Resolving package versions...
Updating `~/.julia/environments/v1.14/Project.toml`
  [3e5feb82] + SDeMo v1.8.0
Updating `~/.julia/environments/v1.14/Manifest.toml`
  [66dad0bd] + AliasTables v1.1.3
  [7d9fca2a] + Arpack v0.5.4
  [9a962f9c] + DataAPI v1.16.0
  [864edb3b] + DataStructures v0.19.3
  [31c24e10] + Distributions v0.25.123
  [ffbed154] + DocStringExtensions v0.9.5
  [1a297f60] + FillArrays v1.16.0
  [34004b35] + HypergeometricFunctions v0.3.28
  [92d709cd] + IrrationalConstants v0.2.6
  [692b3bcd] + JLLWrappers v1.7.1
  [682c06a0] + JSON v1.4.0
  [2ab3a3ac] + LogExpFunctions v0.3.29
  [e1d29d7a] + Missings v1.2.0
  [6f286f6a] + MultivariateStats v0.10.4
  [bac558e1] + OrderedCollections v1.8.1
  [90014a1f] + PDMats v0.11.37
  [69de0a69] + Parsers v2.8.3
  [aea7be01] + PrecompileTools v1.3.3
  [21216c6a] + Preferences v1.5.2
  [43287f4e] + PtrArrays v1.4.0
  [1fd47b50] + QuadGK v2.11.2
  [189a3867] + Reexport v1.2.2
  [79098fc4] + Rmath v0.9.0
  [3e5feb82] + SDeMo v1.8.0
  [a2af1166] + SortingAlgorithms v1.2.2
  [276daf66] + SpecialFunctions v2.7.1
  [10745b16] + Statistics v1.11.1
  [82ae8749] + StatsAPI v1.8.0
  [2913bbd2] + StatsBase v0.34.10
  [4c63d2b9] + StatsFuns v1.5.2
  [ec057cc2] + StructUtils v2.6.3
  [1c621080] + TestItems v1.0.0
⌅ [68821587] + Arpack_jll v3.5.2+0
  [efe28fd5] + OpenSpecFun_jll v0.5.6+0
  [f50d1b31] + Rmath_jll v0.5.1+0
  [56f22d72] + Artifacts v1.11.0
  [ade2ca70] + Dates v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.13.0
  [56ddb016] + Logging v1.11.0
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v1.0.0
  [9e88b42a] + Serialization v1.11.0
  [2f01184e] + SparseArrays v1.13.0
  [4607b0f0] + SuiteSparse
  [fa267f1f] + TOML v1.0.3
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.3.0+1
  [4536629a] + OpenBLAS_jll v0.3.30+0
  [05823500] + OpenLibm_jll v0.8.7+0
  [bea87d4a] + SuiteSparse_jll v7.10.1+0
  [8e850b90] + libblastrampoline_jll v5.15.0+0
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
Installation completed after 5.68s
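For reference, the installation above can be approximated in a plain Pkg session. This is a sketch, not part of the PkgEval run: the environment name is arbitrary, and a current registry may resolve different transitive versions than the Manifest recorded here.

```julia
# Sketch: recreate the tested configuration (SDeMo v1.8.0) in a fresh
# named environment. Dependency resolution may differ from the Manifest
# above depending on registry state.
using Pkg
Pkg.activate("sdemo-pkgeval"; shared = true)  # arbitrary scratch environment
Pkg.add(name = "SDeMo", version = v"1.8.0")   # the release under test
Pkg.precompile()                              # mirrors the precompilation step
```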
################################################################################
# Precompilation

# Precompiling PkgEval dependencies...
Precompiling package dependencies...
Precompiling packages...
   5707.8 ms  ✓ SDeMo
 321404.0 ms  ✓ CairoMakie
  71826.8 ms  ✓ SDeMo → MakieExtension
  3 dependencies successfully precompiled in 404 seconds. 277 already precompiled.
Precompilation completed after 430.0s

################################################################################
# Testing

# Testing SDeMo
Status `/tmp/jl_aH3tfx/Project.toml`
  [13f3f980] CairoMakie v0.15.8
  [3e5feb82] SDeMo v1.8.0
  [10745b16] Statistics v1.11.1
  [f8b46487] TestItemRunner v1.1.4
  [8dfed614] Test v1.11.0
Status `/tmp/jl_aH3tfx/Manifest.toml`
  [621f4979] AbstractFFTs v1.5.0
  [1520ce14] AbstractTrees v0.4.5
  [79e6a3ab] Adapt v4.5.0
  [35492f91] AdaptivePredicates v1.2.0
  [66dad0bd] AliasTables v1.1.3
  [27a7e980] Animations v0.4.2
  [7d9fca2a] Arpack v0.5.4
  [67c07d97] Automa v1.1.0
  [13072b0f] AxisAlgorithms v1.1.0
  [39de3d68] AxisArrays v0.4.8
  [18cc8868] BaseDirs v1.3.2
  [fa961155] CEnum v0.5.0
  [96374032] CRlibm v1.0.2
  [159f3aea] Cairo v1.1.1
  [13f3f980] CairoMakie v0.15.8
  [d360d2e6] ChainRulesCore v1.26.0
  [a2cac450] ColorBrewer v0.4.2
  [35d6a980] ColorSchemes v3.31.0
  [3da002f7] ColorTypes v0.12.1
  [c3611d14] ColorVectorSpace v0.11.0
  [5ae59095] Colors v0.13.1
  [34da2185] Compat v4.18.1
  [95dc2771] ComputePipeline v0.1.6
  [187b0558] ConstructionBase v1.6.0
  [d38c429a] Contour v0.6.3
  [9a962f9c] DataAPI v1.16.0
  [864edb3b] DataStructures v0.19.3
  [e2d170a0] DataValueInterfaces v1.0.0
  [927a84f5] DelaunayTriangulation v1.6.6
  [31c24e10] Distributions v0.25.123
  [ffbed154] DocStringExtensions v0.9.5
  [4e289a0a] EnumX v1.0.7
  [429591f6] ExactPredicates v2.2.9
  [411431e0] Extents v0.1.6
  [b86e33f2] FFTA v0.3.1
  [5789e2e9] FileIO v1.18.0
  [8fc22ac5] FilePaths v0.9.0
  [48062228] FilePathsBase v0.9.24
  [1a297f60] FillArrays v1.16.0
  [53c48c17] FixedPointNumbers v0.8.5
  [1fa38f19] Format v1.3.7
  [b38be410] FreeType v4.1.1
  [663a7486] FreeTypeAbstraction v0.10.8
  [5c1252a2] GeometryBasics v0.5.10
  [a2bd30eb] Graphics v1.1.3
  [3955a311] GridLayoutBase v0.11.2
  [42e2da0e] Grisu v1.0.2
  [34004b35] HypergeometricFunctions v0.3.28
  [2803e5a7] ImageAxes v0.6.12
  [c817782e] ImageBase v0.1.7
  [a09fc81d] ImageCore v0.10.5
  [82e4d734] ImageIO v0.6.9
  [bc367c6b] ImageMetadata v0.9.10
  [9b13fd28] IndirectArrays v1.0.0
  [d25df0c9] Inflate v0.1.5
  [18e54dd8] IntegerMathUtils v0.1.3
  [a98d9a8b] Interpolations v0.16.2
  [d1acc4aa] IntervalArithmetic v1.0.2
  [8197267c] IntervalSets v0.7.13
  [3587e190] InverseFunctions v0.1.17
  [92d709cd] IrrationalConstants v0.2.6
  [f1662d9f] Isoband v0.1.1
  [c8e1da08] IterTools v1.10.0
  [82899510] IteratorInterfaceExtensions v1.0.0
  [692b3bcd] JLLWrappers v1.7.1
  [682c06a0] JSON v1.4.0
  [b835a17e] JpegTurbo v0.1.6
  [5ab0869b] KernelDensity v0.6.11
  [b964fa9f] LaTeXStrings v1.4.0
  [8cdb02fc] LazyModules v0.3.1
  [2ab3a3ac] LogExpFunctions v0.3.29
  [1914dd2f] MacroTools v0.5.16
  [ee78f7c6] Makie v0.24.8
  [dbb5928d] MappedArrays v0.4.3
  [0a4f8689] MathTeXEngine v0.6.7
  [e1d29d7a] Missings v1.2.0
  [e94cdb99] MosaicViews v0.3.4
  [46d2c3a1] MuladdMacro v0.2.4
  [6f286f6a] MultivariateStats v0.10.4
  [77ba4419] NaNMath v1.1.3
  [f09324ee] Netpbm v1.1.1
  [510215fc] Observables v0.5.5
  [6fe1bfb0] OffsetArrays v1.17.0
  [52e1d378] OpenEXR v0.3.3
  [bac558e1] OrderedCollections v1.8.1
  [90014a1f] PDMats v0.11.37
  [f57f5aa1] PNGFiles v0.4.4
  [19eb6ba3] Packing v0.5.1
  [5432bcbf] PaddedViews v0.5.12
  [69de0a69] Parsers v2.8.3
  [eebad327] PkgVersion v0.3.3
  [995b91a9] PlotUtils v1.4.4
  [647866c9] PolygonOps v0.1.2
  [aea7be01] PrecompileTools v1.3.3
  [21216c6a] Preferences v1.5.2
  [27ebfcd6] Primes v0.5.7
  [92933f4c] ProgressMeter v1.11.0
  [43287f4e] PtrArrays v1.4.0
  [4b34888f] QOI v1.0.2
  [1fd47b50] QuadGK v2.11.2
  [b3c3ace0] RangeArrays v0.3.2
  [c84ed2f1] Ratios v0.4.5
  [189a3867] Reexport v1.2.2
  [05181044] RelocatableFolders v1.0.1
  [ae029012] Requires v1.3.1
  [79098fc4] Rmath v0.9.0
  [5eaf0fd0] RoundingEmulator v0.2.1
  [3e5feb82] SDeMo v1.8.0
  [fdea26ae] SIMD v3.7.2
  [6c6a2e73] Scratch v1.3.0
  [65257c39] ShaderAbstractions v0.5.0
  [992d4aef] Showoff v1.0.3
  [73760f76] SignedDistanceFields v0.4.1
  [699a6c99] SimpleTraits v0.9.5
  [45858cf5] Sixel v0.1.5
  [a2af1166] SortingAlgorithms v1.2.2
  [276daf66] SpecialFunctions v2.7.1
  [860ef19b] StableRNGs v1.0.4
  [cae243ae] StackViews v0.1.2
  [90137ffa] StaticArrays v1.9.17
  [1e83bf80] StaticArraysCore v1.4.4
  [10745b16] Statistics v1.11.1
  [82ae8749] StatsAPI v1.8.0
  [2913bbd2] StatsBase v0.34.10
  [4c63d2b9] StatsFuns v1.5.2
  [09ab397b] StructArrays v0.7.2
  [ec057cc2] StructUtils v2.6.3
  [3783bdb8] TableTraits v1.0.1
  [bd369af6] Tables v1.12.1
  [62fd8b95] TensorCore v0.1.1
  [f8b46487] TestItemRunner v1.1.4
  [1c621080] TestItems v1.0.0
  [731e570b] TiffImages v0.11.6
  [3bb67fe8] TranscodingStreams v0.11.3
  [981d1d27] TriplotBase v0.1.0
  [1cfade01] UnicodeFun v0.4.1
  [1986cc42] Unitful v1.28.0
  [e3aaa7dc] WebP v0.1.3
  [efce3f68] WoodburyMatrices v1.1.0
⌅ [68821587] Arpack_jll v3.5.2+0
  [6e34b625] Bzip2_jll v1.0.9+0
  [4e9b3aee] CRlibm_jll v1.0.1+0
  [83423d85] Cairo_jll v1.18.5+1
  [5ae413db] EarCut_jll v2.2.4+0
  [2e619515] Expat_jll v2.7.3+0
  [b22a6f82] FFMPEG_jll v8.0.1+0
  [a3f928ae] Fontconfig_jll v2.17.1+0
  [d7e528f0] FreeType2_jll v2.13.4+0
  [559328eb] FriBidi_jll v1.0.17+0
  [b0724c58] GettextRuntime_jll v0.22.4+0
  [59f7168a] Giflib_jll v5.2.3+0
  [7746bdde] Glib_jll v2.86.3+0
  [3b182d85] Graphite2_jll v1.3.15+0
  [2e76f6c2] HarfBuzz_jll v8.5.1+0
  [905a6f67] Imath_jll v3.2.2+0
  [aacddb02] JpegTurbo_jll v3.1.4+0
  [c1c5ebd0] LAME_jll v3.100.3+0
  [88015f11] LERC_jll v4.0.1+0
  [1d63c593] LLVMOpenMP_jll v18.1.8+0
  [dd4b983a] LZO_jll v2.10.3+0
⌅ [e9f186c6] Libffi_jll v3.4.7+0
  [7e76a0d4] Libglvnd_jll v1.7.1+1
  [94ce4f54] Libiconv_jll v1.18.0+0
  [4b2f31a3] Libmount_jll v2.41.3+0
  [89763e89] Libtiff_jll v4.7.2+0
  [38a345b3] Libuuid_jll v2.41.3+0
  [e7412a2a] Ogg_jll v1.3.6+0
  [6cdc7f73] OpenBLASConsistentFPCSR_jll v0.3.30+0
  [18a262bb] OpenEXR_jll v3.4.4+0
  [efe28fd5] OpenSpecFun_jll v0.5.6+0
  [91d4177d] Opus_jll v1.6.1+0
  [36c8627f] Pango_jll v1.57.0+0
⌅ [30392449] Pixman_jll v0.44.2+0
  [f50d1b31] Rmath_jll v0.5.1+0
  [ffd25f8a] XZ_jll v5.8.2+0
  [4f6342f7] Xorg_libX11_jll v1.8.13+0
  [0c0b7dd1] Xorg_libXau_jll v1.0.13+0
  [a3789734] Xorg_libXdmcp_jll v1.1.6+0
  [1082639a] Xorg_libXext_jll v1.3.8+0
  [ea2f1a96] Xorg_libXrender_jll v0.9.12+0
  [c7cfdc94] Xorg_libxcb_jll v1.17.1+0
  [c5fb5394] Xorg_xtrans_jll v1.6.0+0
  [9a68df92] isoband_jll v0.2.3+0
  [a4ae2306] libaom_jll v3.13.1+0
  [0ac62f75] libass_jll v0.17.4+0
  [f638f0a6] libfdk_aac_jll v2.0.4+0
  [b53b4c65] libpng_jll v1.6.55+0
  [075b6546] libsixel_jll v1.10.5+0
  [f27f6e37] libvorbis_jll v1.3.8+0
  [c5f90fcd] libwebp_jll v1.6.0+0
⌅ [1270edf5] x264_jll v10164.0.1+0
  [dfaa095f] x265_jll v4.1.0+0
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [8bf52ea8] CRC32c v1.11.0
  [ade2ca70] Dates v1.11.0
  [8ba89e20] Distributed v1.11.0
  [f43a241f] Downloads v1.7.0
  [7b1f6079] FileWatching v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [ac6e5ff7] JuliaSyntaxHighlighting v1.13.0
  [b27032c2] LibCURL v1.0.0
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.13.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [a63ad114] Mmap v1.11.0
  [ca575930] NetworkOptions v1.3.0
  [44cfe95a] Pkg v1.14.0
  [de0858da] Printf v1.11.0
  [3fa0cd96] REPL v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v1.0.0
  [9e88b42a] Serialization v1.11.0
  [1a1011a3] SharedArrays v1.11.0
  [6462fe0b] Sockets v1.11.0
  [2f01184e] SparseArrays v1.13.0
  [f489334b] StyledStrings v1.13.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] LibCURL_jll v8.18.0+0
  [e37daf67] LibGit2_jll v1.9.2+0
  [29816b5a] LibSSH2_jll v1.11.3+1
  [14a3606d] MozillaCACerts_jll v2025.12.2
  [4536629a] OpenBLAS_jll v0.3.30+0
  [05823500] OpenLibm_jll v0.8.7+0
  [458c3c95] OpenSSL_jll v3.5.5+0
  [efcefdf7] PCRE2_jll v10.47.0+0
  [bea87d4a] SuiteSparse_jll v7.10.1+0
  [83775a58] Zlib_jll v1.3.2+0
  [3161d3a3] Zstd_jll v1.5.7+1
  [8e850b90] libblastrampoline_jll v5.15.0+0
  [8e850ede] nghttp2_jll v1.68.0+1
  [3f19e933] p7zip_jll v17.7.0+0
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading.
Testing Running tests...
[ Info: Baseline mcc: 0.0
[ Info: Optimal 1 variables model - mcc ≈ 0.8138
[ Info: Optimal 2 variables model - mcc ≈ 0.8292
[ Info: Optimal 3 variables model - mcc ≈ 0.8295
[ Info: Returning model with 3 variables - mcc ≈ 0.8295
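The five `Info` lines above are a forward variable selection: starting from a baseline MCC of 0, the procedure adds whichever variable most improves the Matthews correlation coefficient and stops once a fourth variable no longer helps. A minimal standalone sketch of that greedy loop, with a toy `score` function standing in for the actual train-and-evaluate step (nothing below is SDeMo's API):

```julia
# MCC from a confusion matrix, for reference (the metric driving selection).
mcc(tp, tn, fp, fn) = (tp * tn - fp * fn) /
    sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))

# Greedy forward selection: add the best-scoring candidate until no
# candidate improves on the current model.
function forward_select(score, candidates)
    selected, best = Int[], 0.0        # baseline model has MCC 0
    while !isempty(candidates)
        gains = [(score(vcat(selected, v)), v) for v in candidates]
        (top, pick) = maximum(gains)   # best achievable score this round
        top <= best && break           # no variable improves the model
        best = top
        push!(selected, pick)
        filter!(!=(pick), candidates)
        @info "Optimal $(length(selected)) variables model - mcc ≈ $(round(top; digits = 4))"
    end
    return selected, best
end

# Toy scorer: pretend variables 3, 7, and 12 each carry signal.
score(vars) = sum(v -> v in (3, 7, 12) ? 0.27 : 0.0, vars; init = 0.0)
forward_select(score, collect(1:19))
```

With the toy scores, the loop stops at three variables, mirroring the run above.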
[ Info: [ 0%] LOSS: training ≈ 0.466 validation ≈ 0.4744 (102%)
[ Info: [ 1%] LOSS: training ≈ 0.3792 validation ≈ 0.4094 (108%)
[ Info: [ 2%] LOSS: training ≈ 0.3379 validation ≈ 0.3815 (113%)
[ Info: [ 2%] LOSS: training ≈ 0.3142 validation ≈ 0.3675 (117%)
[ Info: [ 2%] LOSS: training ≈ 0.299 validation ≈ 0.3599 (120%)
[ Info: [ 3%] LOSS: training ≈ 0.2886 validation ≈ 0.3557 (123%)
[ Info: [ 4%] LOSS: training ≈ 0.281 validation ≈ 0.3532 (126%)
[ Info: [ 4%] LOSS: training ≈ 0.2753 validation ≈ 0.3519 (128%)
[ Info: [ 4%] LOSS: training ≈ 0.2708 validation ≈ 0.3513 (130%)
[ Info: [ 5%] LOSS: training ≈ 0.2673 validation ≈ 0.351 (131%)
[ Info: [ 6%] LOSS: training ≈ 0.2644 validation ≈ 0.351 (133%)
[ Info: [ 6%] LOSS: training ≈ 0.2621 validation ≈ 0.3512 (134%)
[ Info: [ 6%] LOSS: training ≈ 0.2601 validation ≈ 0.3515 (135%)
[ Info: [ 7%] LOSS: training ≈ 0.2584 validation ≈ 0.3518 (136%)
[ Info: [ 8%] LOSS: training ≈ 0.257 validation ≈ 0.3522 (137%)
[ Info: [ 8%] LOSS: training ≈ 0.2557 validation ≈ 0.3526 (138%)
[ Info: [ 8%] LOSS: training ≈ 0.2546 validation ≈ 0.353 (139%)
[ Info: [ 9%] LOSS: training ≈ 0.2537 validation ≈ 0.3534 (139%)
[ Info: [ 10%] LOSS: training ≈ 0.2528 validation ≈ 0.3538 (140%)
[ Info: [ 10%] LOSS: training ≈ 0.252 validation ≈ 0.3542 (141%)
[ Info: [ 10%] LOSS: training ≈ 0.2514 validation ≈ 0.3546 (141%)
[ Info: [ 11%] LOSS: training ≈ 0.2507 validation ≈ 0.3549 (142%)
[ Info: [ 12%] LOSS: training ≈ 0.2502 validation ≈ 0.3552 (142%)
[ Info: [ 12%] LOSS: training ≈ 0.2497 validation ≈ 0.3555 (142%)
[ Info: [ 12%] LOSS: training ≈ 0.2492 validation ≈ 0.3558 (143%)
[ Info: [ 13%] LOSS: training ≈ 0.2488 validation ≈ 0.3561 (143%)
[ Info: [ 14%] LOSS: training ≈ 0.2484 validation ≈ 0.3563 (143%)
[ Info: [ 14%] LOSS: training ≈ 0.248 validation ≈ 0.3566 (144%)
[ Info: [ 14%] LOSS: training ≈ 0.2476 validation ≈ 0.3568 (144%)
[ Info: [ 15%] LOSS: training ≈ 0.2473 validation ≈ 0.357 (144%)
[ Info: [ 16%] LOSS: training ≈ 0.247 validation ≈ 0.3572 (145%)
[ Info: [ 16%] LOSS: training ≈ 0.2468 validation ≈ 0.3574 (145%)
[ Info: [ 16%] LOSS: training ≈ 0.2465 validation ≈ 0.3576 (145%)
[ Info: [ 17%] LOSS: training ≈ 0.2463 validation ≈ 0.3577 (145%)
[ Info: [ 18%] LOSS: training ≈ 0.246 validation ≈ 0.3579 (145%)
[ Info: [ 18%] LOSS: training ≈ 0.2458 validation ≈ 0.358 (146%)
[ Info: [ 18%] LOSS: training ≈ 0.2456 validation ≈ 0.3581 (146%)
[ Info: [ 19%] LOSS: training ≈ 0.2454 validation ≈ 0.3583 (146%)
[ Info: [ 20%] LOSS: training ≈ 0.2452 validation ≈ 0.3584 (146%)
[ Info: [ 20%] LOSS: training ≈ 0.2451 validation ≈ 0.3585 (146%)
[ Info: [ 20%] LOSS: training ≈ 0.2449 validation ≈ 0.3586 (146%)
[ Info: [ 21%] LOSS: training ≈ 0.2448 validation ≈ 0.3587 (147%)
[ Info: [ 22%] LOSS: training ≈ 0.2446 validation ≈ 0.3588 (147%)
[ Info: [ 22%] LOSS: training ≈ 0.2445 validation ≈ 0.3589 (147%)
[ Info: [ 22%] LOSS: training ≈ 0.2443 validation ≈ 0.3589 (147%)
[ Info: [ 23%] LOSS: training ≈ 0.2442 validation ≈ 0.359 (147%)
[ Info: [ 24%] LOSS: training ≈ 0.2441 validation ≈ 0.3591 (147%)
[ Info: [ 24%] LOSS: training ≈ 0.244 validation ≈ 0.3591 (147%)
[ Info: [ 24%] LOSS: training ≈ 0.2439 validation ≈ 0.3592 (147%)
[ Info: [ 25%] LOSS: training ≈ 0.2438 validation ≈ 0.3593 (147%)
[ Info: [ 26%] LOSS: training ≈ 0.2437 validation ≈ 0.3593 (147%)
[ Info: [ 26%] LOSS: training ≈ 0.2436 validation ≈ 0.3593 (148%)
[ Info: [ 26%] LOSS: training ≈ 0.2435 validation ≈ 0.3594 (148%)
[ Info: [ 27%] LOSS: training ≈ 0.2434 validation ≈ 0.3594 (148%)
[ Info: [ 28%] LOSS: training ≈ 0.2433 validation ≈ 0.3595 (148%)
[ Info: [ 28%] LOSS: training ≈ 0.2432 validation ≈ 0.3595 (148%)
[ Info: [ 28%] LOSS: training ≈ 0.2431 validation ≈ 0.3595 (148%)
[ Info: [ 29%] LOSS: training ≈ 0.243 validation ≈ 0.3596 (148%)
[ Info: [ 30%] LOSS: training ≈ 0.243 validation ≈ 0.3596 (148%)
[ Info: [ 30%] LOSS: training ≈ 0.2429 validation ≈ 0.3596 (148%)
[ Info: [ 30%] LOSS: training ≈ 0.2428 validation ≈ 0.3596 (148%)
[ Info: [ 31%] LOSS: training ≈ 0.2428 validation ≈ 0.3597 (148%)
[ Info: [ 32%] LOSS: training ≈ 0.2427 validation ≈ 0.3597 (148%)
[ Info: [ 32%] LOSS: training ≈ 0.2426 validation ≈ 0.3597 (148%)
[ Info: [ 32%] LOSS: training ≈ 0.2426 validation ≈ 0.3597 (148%)
[ Info: [ 33%] LOSS: training ≈ 0.2425 validation ≈ 0.3597 (148%)
[ Info: [ 34%] LOSS: training ≈ 0.2425 validation ≈ 0.3597 (148%)
[ Info: [ 34%] LOSS: training ≈ 0.2424 validation ≈ 0.3597 (148%)
[ Info: [ 34%] LOSS: training ≈ 0.2424 validation ≈ 0.3597 (148%)
[ Info: [ 35%] LOSS: training ≈ 0.2423 validation ≈ 0.3597 (148%)
[ Info: [ 36%] LOSS: training ≈ 0.2422 validation ≈ 0.3597 (149%)
[ Info: [ 36%] LOSS: training ≈ 0.2422 validation ≈ 0.3598 (149%)
[ Info: [ 36%] LOSS: training ≈ 0.2422 validation ≈ 0.3598 (149%)
[ Info: [ 37%] LOSS: training ≈ 0.2421 validation ≈ 0.3598 (149%)
[ Info: [ 38%] LOSS: training ≈ 0.2421 validation ≈ 0.3598 (149%)
[ Info: [ 38%] LOSS: training ≈ 0.242 validation ≈ 0.3598 (149%)
[ Info: [ 38%] LOSS: training ≈ 0.242 validation ≈ 0.3598 (149%)
[ Info: [ 39%] LOSS: training ≈ 0.2419 validation ≈ 0.3597 (149%)
[ Info: [ 40%] LOSS: training ≈ 0.2419 validation ≈ 0.3597 (149%)
[ Info: [ 40%] LOSS: training ≈ 0.2419 validation ≈ 0.3597 (149%)
[ Info: [ 40%] LOSS: training ≈ 0.2418 validation ≈ 0.3597 (149%)
[ Info: [ 41%] LOSS: training ≈ 0.2418 validation ≈ 0.3597 (149%)
[ Info: [ 42%] LOSS: training ≈ 0.2418 validation ≈ 0.3597 (149%)
[ Info: [ 42%] LOSS: training ≈ 0.2417 validation ≈ 0.3597 (149%)
[ Info: [ 42%] LOSS: training ≈ 0.2417 validation ≈ 0.3597 (149%)
[ Info: [ 43%] LOSS: training ≈ 0.2417 validation ≈ 0.3597 (149%)
[ Info: [ 44%] LOSS: training ≈ 0.2416 validation ≈ 0.3597 (149%)
[ Info: [ 44%] LOSS: training ≈ 0.2416 validation ≈ 0.3597 (149%)
[ Info: [ 44%] LOSS: training ≈ 0.2416 validation ≈ 0.3597 (149%)
[ Info: [ 45%] LOSS: training ≈ 0.2415 validation ≈ 0.3597 (149%)
[ Info: [ 46%] LOSS: training ≈ 0.2415 validation ≈ 0.3596 (149%)
[ Info: [ 46%] LOSS: training ≈ 0.2415 validation ≈ 0.3596 (149%)
[ Info: [ 46%] LOSS: training ≈ 0.2414 validation ≈ 0.3596 (149%)
[ Info: [ 47%] LOSS: training ≈ 0.2414 validation ≈ 0.3596 (149%)
[ Info: [ 48%] LOSS: training ≈ 0.2414 validation ≈ 0.3596 (149%)
[ Info: [ 48%] LOSS: training ≈ 0.2414 validation ≈ 0.3596 (149%)
[ Info: [ 48%] LOSS: training ≈ 0.2413 validation ≈ 0.3596 (149%)
[ Info: [ 49%] LOSS: training ≈ 0.2413 validation ≈ 0.3595 (149%)
[ Info: [ 50%] LOSS: training ≈ 0.2413 validation ≈ 0.3595 (149%)
[ Info: [ 50%] LOSS: training ≈ 0.2413 validation ≈ 0.3595 (149%)
[ Info: [ 50%] LOSS: training ≈ 0.2412 validation ≈ 0.3595 (149%)
[ Info: [ 51%] LOSS: training ≈ 0.2412 validation ≈ 0.3595 (149%)
[ Info: [ 52%] LOSS: training ≈ 0.2412 validation ≈ 0.3595 (149%)
[ Info: [ 52%] LOSS: training ≈ 0.2412 validation ≈ 0.3595 (149%)
[ Info: [ 52%] LOSS: training ≈ 0.2412 validation ≈ 0.3594 (149%)
[ Info: [ 53%] LOSS: training ≈ 0.2411 validation ≈ 0.3594 (149%)
[ Info: [ 54%] LOSS: training ≈ 0.2411 validation ≈ 0.3594 (149%)
[ Info: [ 54%] LOSS: training ≈ 0.2411 validation ≈ 0.3594 (149%)
[ Info: [ 55%] LOSS: training ≈ 0.2411 validation ≈ 0.3594 (149%)
[ Info: [ 55%] LOSS: training ≈ 0.2411 validation ≈ 0.3594 (149%)
[ Info: [ 56%] LOSS: training ≈ 0.2411 validation ≈ 0.3593 (149%)
[ Info: [ 56%] LOSS: training ≈ 0.241 validation ≈ 0.3593 (149%)
[ Info: [ 56%] LOSS: training ≈ 0.241 validation ≈ 0.3593 (149%)
[ Info: [ 57%] LOSS: training ≈ 0.241 validation ≈ 0.3593 (149%)
[ Info: [ 57%] LOSS: training ≈ 0.241 validation ≈ 0.3593 (149%)
[ Info: [ 58%] LOSS: training ≈ 0.241 validation ≈ 0.3592 (149%)
[ Info: [ 58%] LOSS: training ≈ 0.241 validation ≈ 0.3592 (149%)
[ Info: [ 59%] LOSS: training ≈ 0.2409 validation ≈ 0.3592 (149%)
[ Info: [ 60%] LOSS: training ≈ 0.2409 validation ≈ 0.3592 (149%)
[ Info: [ 60%] LOSS: training ≈ 0.2409 validation ≈ 0.3592 (149%)
[ Info: [ 60%] LOSS: training ≈ 0.2409 validation ≈ 0.3592 (149%)
[ Info: [ 61%] LOSS: training ≈ 0.2409 validation ≈ 0.3591 (149%)
[ Info: [ 62%] LOSS: training ≈ 0.2409 validation ≈ 0.3591 (149%)
[ Info: [ 62%] LOSS: training ≈ 0.2409 validation ≈ 0.3591 (149%)
[ Info: [ 62%] LOSS: training ≈ 0.2408 validation ≈ 0.3591 (149%)
[ Info: [ 63%] LOSS: training ≈ 0.2408 validation ≈ 0.3591 (149%)
[ Info: [ 64%] LOSS: training ≈ 0.2408 validation ≈ 0.359 (149%)
[ Info: [ 64%] LOSS: training ≈ 0.2408 validation ≈ 0.359 (149%)
[ Info: [ 64%] LOSS: training ≈ 0.2408 validation ≈ 0.359 (149%)
[ Info: [ 65%] LOSS: training ≈ 0.2408 validation ≈ 0.359 (149%)
[ Info: [ 66%] LOSS: training ≈ 0.2408 validation ≈ 0.359 (149%)
[ Info: [ 66%] LOSS: training ≈ 0.2408 validation ≈ 0.359 (149%)
[ Info: [ 66%] LOSS: training ≈ 0.2407 validation ≈ 0.3589 (149%)
[ Info: [ 67%] LOSS: training ≈ 0.2407 validation ≈ 0.3589 (149%)
[ Info: [ 68%] LOSS: training ≈ 0.2407 validation ≈ 0.3589 (149%)
[ Info: [ 68%] LOSS: training ≈ 0.2407 validation ≈ 0.3589 (149%)
[ Info: [ 68%] LOSS: training ≈ 0.2407 validation ≈ 0.3589 (149%)
[ Info: [ 69%] LOSS: training ≈ 0.2407 validation ≈ 0.3588 (149%)
[ Info: [ 70%] LOSS: training ≈ 0.2407 validation ≈ 0.3588 (149%)
[ Info: [ 70%] LOSS: training ≈ 0.2407 validation ≈ 0.3588 (149%)
[ Info: [ 70%] LOSS: training ≈ 0.2407 validation ≈ 0.3588 (149%)
[ Info: [ 71%] LOSS: training ≈ 0.2407 validation ≈ 0.3588 (149%)
[ Info: [ 72%] LOSS: training ≈ 0.2407 validation ≈ 0.3588 (149%)
[ Info: [ 72%] LOSS: training ≈ 0.2406 validation ≈ 0.3587 (149%)
[ Info: [ 72%] LOSS: training ≈ 0.2406 validation ≈ 0.3587 (149%)
[ Info: [ 73%] LOSS: training ≈ 0.2406 validation ≈ 0.3587 (149%)
[ Info: [ 74%] LOSS: training ≈ 0.2406 validation ≈ 0.3587 (149%)
[ Info: [ 74%] LOSS: training ≈ 0.2406 validation ≈ 0.3587 (149%)
[ Info: [ 74%] LOSS: training ≈ 0.2406 validation ≈ 0.3587 (149%)
[ Info: [ 75%] LOSS: training ≈ 0.2406 validation ≈ 0.3586 (149%)
[ Info: [ 76%] LOSS: training ≈ 0.2406 validation ≈ 0.3586 (149%)
[ Info: [ 76%] LOSS: training ≈ 0.2406 validation ≈ 0.3586 (149%)
[ Info: [ 76%] LOSS: training ≈ 0.2406 validation ≈ 0.3586 (149%)
[ Info: [ 77%] LOSS: training ≈ 0.2406 validation ≈ 0.3586 (149%)
[ Info: [ 78%] LOSS: training ≈ 0.2406 validation ≈ 0.3586 (149%)
[ Info: [ 78%] LOSS: training ≈ 0.2406 validation ≈ 0.3585 (149%)
[ Info: [ 78%] LOSS: training ≈ 0.2405 validation ≈ 0.3585 (149%)
[ Info: [ 79%] LOSS: training ≈ 0.2405 validation ≈ 0.3585 (149%)
[ Info: [ 80%] LOSS: training ≈ 0.2405 validation ≈ 0.3585 (149%)
[ Info: [ 80%] LOSS: training ≈ 0.2405 validation ≈ 0.3585 (149%)
[ Info: [ 80%] LOSS: training ≈ 0.2405 validation ≈ 0.3585 (149%)
[ Info: [ 81%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 82%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 82%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 82%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 83%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 84%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 84%] LOSS: training ≈ 0.2405 validation ≈ 0.3584 (149%)
[ Info: [ 84%] LOSS: training ≈ 0.2405 validation ≈ 0.3583 (149%)
[ Info: [ 85%] LOSS: training ≈ 0.2405 validation ≈ 0.3583 (149%)
[ Info: [ 86%] LOSS: training ≈ 0.2405 validation ≈ 0.3583 (149%)
[ Info: [ 86%] LOSS: training ≈ 0.2405 validation ≈ 0.3583 (149%)
[ Info: [ 86%] LOSS: training ≈ 0.2405 validation ≈ 0.3583 (149%)
[ Info: [ 87%] LOSS: training ≈ 0.2405 validation ≈ 0.3583 (149%)
[ Info: [ 88%] LOSS: training ≈ 0.2404 validation ≈ 0.3583 (149%)
[ Info: [ 88%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 88%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 89%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 90%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 90%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 90%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 91%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 92%] LOSS: training ≈ 0.2404 validation ≈ 0.3582 (149%)
[ Info: [ 92%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 92%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 93%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 94%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 94%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 94%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 95%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 96%] LOSS: training ≈ 0.2404 validation ≈ 0.3581 (149%)
[ Info: [ 96%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [ 96%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [ 97%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [ 98%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [ 98%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [ 98%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [ 99%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [100%] LOSS: training ≈ 0.2404 validation ≈ 0.358 (149%)
[ Info: [100%] LOSS: training ≈ 0.2404 validation ≈ 0.3579 (149%)
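In the trace above, the parenthesized percentage is the validation loss expressed as a share of the training loss (e.g. 0.4744/0.466 ≈ 102% at the first step), so a ratio creeping toward ~149% while training loss still falls is the usual overfitting signature. A small self-contained check of that ratio, over a few pairs sampled from the log:

```julia
# (training, validation) loss pairs sampled from the trace above
trace = [(0.466, 0.4744), (0.3792, 0.4094), (0.2452, 0.3584), (0.2404, 0.3579)]

# ratio reported in the log: validation as a percentage of training
ratio(t, v) = round(Int, 100 * v / t)

for (t, v) in trace
    println("training ≈ $t  validation ≈ $v  ($(ratio(t, v))%)")
end

# the step where validation loss bottoms out is a natural stopping point
println("validation minimum at sampled step ", argmin(last.(trace)))
```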
[ Info: [ 0%] LOSS: training ≈ 0.4458
[ Info: [ 1%] LOSS: training ≈ 0.367
[ Info: [ 2%] LOSS: training ≈ 0.3328
[ Info: [ 2%] LOSS: training ≈ 0.3142
[ Info: [ 2%] LOSS: training ≈ 0.3028
[ Info: [ 3%] LOSS: training ≈ 0.2952
[ Info: [ 4%] LOSS: training ≈ 0.2898
[ Info: [ 4%] LOSS: training ≈ 0.2858
[ Info: [ 4%] LOSS: training ≈ 0.2827
[ Info: [ 5%] LOSS: training ≈ 0.2804
[ Info: [ 6%] LOSS: training ≈ 0.2784
[ Info: [ 6%] LOSS: training ≈ 0.2768
[ Info: [ 6%] LOSS: training ≈ 0.2755
[ Info: [ 7%] LOSS: training ≈ 0.2744
[ Info: [ 8%] LOSS: training ≈ 0.2734
[ Info: [ 8%] LOSS: training ≈ 0.2725
[ Info: [ 8%] LOSS: training ≈ 0.2718
[ Info: [ 9%] LOSS: training ≈ 0.2711
[ Info: [ 10%] LOSS: training ≈ 0.2706
[ Info: [ 10%] LOSS: training ≈ 0.27
[ Info: [ 10%] LOSS: training ≈ 0.2695
[ Info: [ 11%] LOSS: training ≈ 0.2691
[ Info: [ 12%] LOSS: training ≈ 0.2687
[ Info: [ 12%] LOSS: training ≈ 0.2683
[ Info: [ 12%] LOSS: training ≈ 0.268
[ Info: [ 13%] LOSS: training ≈ 0.2677
[ Info: [ 14%] LOSS: training ≈ 0.2674
[ Info: [ 14%] LOSS: training ≈ 0.2671
[ Info: [ 14%] LOSS: training ≈ 0.2669
[ Info: [ 15%] LOSS: training ≈ 0.2666
[ Info: [ 16%] LOSS: training ≈ 0.2664
[ Info: [ 16%] LOSS: training ≈ 0.2662
[ Info: [ 16%] LOSS: training ≈ 0.266
[ Info: [ 17%] LOSS: training ≈ 0.2658
[ Info: [ 18%] LOSS: training ≈ 0.2656
[ Info: [ 18%] LOSS: training ≈ 0.2655
[ Info: [ 18%] LOSS: training ≈ 0.2653
[ Info: [ 19%] LOSS: training ≈ 0.2652
[ Info: [ 20%] LOSS: training ≈ 0.265
[ Info: [ 20%] LOSS: training ≈ 0.2649
[ Info: [ 20%] LOSS: training ≈ 0.2648
[ Info: [ 21%] LOSS: training ≈ 0.2646
[ Info: [ 22%] LOSS: training ≈ 0.2645
[ Info: [ 22%] LOSS: training ≈ 0.2644
[ Info: [ 22%] LOSS: training ≈ 0.2643
[ Info: [ 23%] LOSS: training ≈ 0.2642
[ Info: [ 24%] LOSS: training ≈ 0.2641
[ Info: [ 24%] LOSS: training ≈ 0.264
[ Info: [ 24%] LOSS: training ≈ 0.2639
[ Info: [ 25%] LOSS: training ≈ 0.2638
[ Info: [ 26%] LOSS: training ≈ 0.2637
[ Info: [ 26%] LOSS: training ≈ 0.2636
[ Info: [ 26%] LOSS: training ≈ 0.2635
[ Info: [ 27%] LOSS: training ≈ 0.2635
[ Info: [ 28%] LOSS: training ≈ 0.2634
[ Info: [ 28%] LOSS: training ≈ 0.2633
[ Info: [ 28%] LOSS: training ≈ 0.2633
[ Info: [ 29%] LOSS: training ≈ 0.2632
[ Info: [ 30%] LOSS: training ≈ 0.2631
[ Info: [ 30%] LOSS: training ≈ 0.2631
[ Info: [ 30%] LOSS: training ≈ 0.263
[ Info: [ 31%] LOSS: training ≈ 0.2629
[ Info: [ 32%] LOSS: training ≈ 0.2629
[ Info: [ 32%] LOSS: training ≈ 0.2628
[ Info: [ 32%] LOSS: training ≈ 0.2628
[ Info: [ 33%] LOSS: training ≈ 0.2627
[ Info: [ 34%] LOSS: training ≈ 0.2627
[ Info: [ 34%] LOSS: training ≈ 0.2626
[ Info: [ 34%] LOSS: training ≈ 0.2626
[ Info: [ 35%] LOSS: training ≈ 0.2625
[ Info: [ 36%] LOSS: training ≈ 0.2625
[ Info: [ 36%] LOSS: training ≈ 0.2625
[ Info: [ 36%] LOSS: training ≈ 0.2624
[ Info: [ 37%] LOSS: training ≈ 0.2624
[ Info: [ 38%] LOSS: training ≈ 0.2623
[ Info: [ 38%] LOSS: training ≈ 0.2623
[ Info: [ 38%] LOSS: training ≈ 0.2623
[ Info: [ 39%] LOSS: training ≈ 0.2622
[ Info: [ 40%] LOSS: training ≈ 0.2622
[ Info: [ 40%] LOSS: training ≈ 0.2622
[ Info: [ 40%] LOSS: training ≈ 0.2621
[ Info: [ 41%] LOSS: training ≈ 0.2621
[ Info: [ 42%] LOSS: training ≈ 0.2621
[ Info: [ 42%] LOSS: training ≈ 0.2621
[ Info: [ 42%] LOSS: training ≈ 0.262
[ Info: [ 43%] LOSS: training ≈ 0.262
[ Info: [ 44%] LOSS: training ≈ 0.262
[ Info: [ 44%] LOSS: training ≈ 0.2619
[ Info: [ 44%] LOSS: training ≈ 0.2619
[ Info: [ 45%] LOSS: training ≈ 0.2619
[ Info: [ 46%] LOSS: training ≈ 0.2619
[ Info: [ 46%] LOSS: training ≈ 0.2618
[ Info: [ 46%] LOSS: training ≈ 0.2618
[ Info: [ 47%] LOSS: training ≈ 0.2618
[ Info: [ 48%] LOSS: training ≈ 0.2618
[ Info: [ 48%] LOSS: training ≈ 0.2618
[ Info: [ 48%] LOSS: training ≈ 0.2617
[ Info: [ 49%] LOSS: training ≈ 0.2617
[ Info: [ 50%] LOSS: training ≈ 0.2617
[ Info: [ 50%] LOSS: training ≈ 0.2617
[ Info: [ 50%] LOSS: training ≈ 0.2617
[ Info: [ 51%] LOSS: training ≈ 0.2616
[ Info: [ 52%] LOSS: training ≈ 0.2616
[ Info: [ 52%] LOSS: training ≈ 0.2616
[ Info: [ 52%] LOSS: training ≈ 0.2616
[ Info: [ 53%] LOSS: training ≈ 0.2616
[ Info: [ 54%] LOSS: training ≈ 0.2616
[ Info: [ 54%] LOSS: training ≈ 0.2616
[ Info: [ 55%] LOSS: training ≈ 0.2615
[ Info: [ 55%] LOSS: training ≈ 0.2615
[ Info: [ 56%] LOSS: training ≈ 0.2615
[ Info: [ 56%] LOSS: training ≈ 0.2615
[ Info: [ 56%] LOSS: training ≈ 0.2615
[ Info: [ 57%] LOSS: training ≈ 0.2615
[ Info: [ 57%] LOSS: training ≈ 0.2615
[ Info: [ 58%] LOSS: training ≈ 0.2614
[ Info: [ 58%] LOSS: training ≈ 0.2614
[ Info: [ 59%] LOSS: training ≈ 0.2614
[ Info: [ 60%] LOSS: training ≈ 0.2614
[ Info: [ 60%] LOSS: training ≈ 0.2614
[ Info: [ 60%] LOSS: training ≈ 0.2614
[ Info: [ 61%] LOSS: training ≈ 0.2614
[ Info: [ 62%] LOSS: training ≈ 0.2614
[ Info: [ 62%] LOSS: training ≈ 0.2614
[ Info: [ 62%] LOSS: training ≈ 0.2613
[ Info: [ 63%] LOSS: training ≈ 0.2613
[ Info: [ 64%] LOSS: training ≈ 0.2613
[ Info: [ 64%] LOSS: training ≈ 0.2613
[ Info: [ 64%] LOSS: training ≈ 0.2613
[ Info: [ 65%] LOSS: training ≈ 0.2613
[ Info: [ 66%] LOSS: training ≈ 0.2613
[ Info: [ 66%] LOSS: training ≈ 0.2613
[ Info: [ 66%] LOSS: training ≈ 0.2613
[ Info: [ 67%] LOSS: training ≈ 0.2613
[ Info: [ 68%] LOSS: training ≈ 0.2613
[ Info: [ 68%] LOSS: training ≈ 0.2613
[ Info: [ 68%] LOSS: training ≈ 0.2613
[ Info: [ 69%] LOSS: training ≈ 0.2612
[ Info: [ 70%] LOSS: training ≈ 0.2612
[ Info: [ 70%] LOSS: training ≈ 0.2612
[ Info: [ 70%] LOSS: training ≈ 0.2612
[ Info: [ 71%] LOSS: training ≈ 0.2612
[ Info: [ 72%] LOSS: training ≈ 0.2612
[ Info: [ 72%] LOSS: training ≈ 0.2612
[ Info: [ 72%] LOSS: training ≈ 0.2612
[ Info: [ 73%] LOSS: training ≈ 0.2612
[ Info: [ 74%] LOSS: training ≈ 0.2612
[ Info: [ 74%] LOSS: training ≈ 0.2612
[ Info: [ 74%] LOSS: training ≈ 0.2612
[ Info: [ 75%] LOSS: training ≈ 0.2612
[ Info: [ 76%] LOSS: training ≈ 0.2612
[ Info: [ 76%] LOSS: training ≈ 0.2612
[ Info: [ 76%] LOSS: training ≈ 0.2612
[ Info: [ 77%] LOSS: training ≈ 0.2612
[ Info: [ 78%] LOSS: training ≈ 0.2611
[ Info: [ 78%] LOSS: training ≈ 0.2611
[ Info: [ 78%] LOSS: training ≈ 0.2611
[ Info: [ 79%] LOSS: training ≈ 0.2611
[ Info: [ 80%] LOSS: training ≈ 0.2611
[ Info: [ 80%] LOSS: training ≈ 0.2611
[ Info: [ 80%] LOSS: training ≈ 0.2611
[ Info: [ 81%] LOSS: training ≈ 0.2611
[ Info: [ 82%] LOSS: training ≈ 0.2611
[ Info: [ 82%] LOSS: training ≈ 0.2611
[ Info: [ 82%] LOSS: training ≈ 0.2611
[ Info: [ 83%] LOSS: training ≈ 0.2611
[ Info: [ 84%] LOSS: training ≈ 0.2611
[ Info: [ 84%] LOSS: training ≈ 0.2611
[ Info: [ 84%] LOSS: training ≈ 0.2611
[ Info: [ 85%] LOSS: training ≈ 0.2611
[ Info: [ 86%] LOSS: training ≈ 0.2611
[ Info: [ 86%] LOSS: training ≈ 0.2611
[ Info: [ 86%] LOSS: training ≈ 0.2611
[ Info: [ 87%] LOSS: training ≈ 0.2611
[ Info: [ 88%] LOSS: training ≈ 0.2611
[ Info: [ 88%] LOSS: training ≈ 0.2611
[ Info: [ 88%] LOSS: training ≈ 0.2611
[ Info: [ 89%] LOSS: training ≈ 0.2611
[ Info: [ 90%] LOSS: training ≈ 0.2611
[ Info: [ 90%] LOSS: training ≈ 0.2611
[ Info: [ 90%] LOSS: training ≈ 0.2611
[ Info: [ 91%] LOSS: training ≈ 0.2611
[ Info: [ 92%] LOSS: training ≈ 0.261
[ Info: [ 92%] LOSS: training ≈ 0.261
[ Info: [ 92%] LOSS: training ≈ 0.261
[ Info: [ 93%] LOSS: training ≈ 0.261
[ Info: [ 94%] LOSS: training ≈ 0.261
[ Info: [ 94%] LOSS: training ≈ 0.261
[ Info: [ 94%] LOSS: training ≈ 0.261
[ Info: [ 95%] LOSS: training ≈ 0.261
[ Info: [ 96%] LOSS: training ≈ 0.261
[ Info: [ 96%] LOSS: training ≈ 0.261
[ Info: [ 96%] LOSS: training ≈ 0.261
[ Info: [ 97%] LOSS: training ≈ 0.261
[ Info: [ 98%] LOSS: training ≈ 0.261
[ Info: [ 98%] LOSS: training ≈ 0.261
[ Info: [ 98%] LOSS: training ≈ 0.261
[ Info: [ 99%] LOSS: training ≈ 0.261
[ Info: [100%] LOSS: training ≈ 0.261
[ Info: [100%] LOSS: training ≈ 0.261
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/rmAiF/src/variables/vif.jl:24
Test Summary: | Pass  Total      Time
Package       | 3464   3464  11m07.7s
Testing SDeMo tests passed
Testing completed after 732.37s
PkgEval succeeded after 1199.13s
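The deprecation warning repeated through the suite states its own migration path: `stepwisevif!` is being replaced by `variables!` with the `VarianceInflationFactor` strategy (the exact new signature is not shown in the log; check the SDeMo documentation). What the routine computes is the variance inflation factor of each predictor, VIF_i = 1/(1 − R²_i), where R²_i comes from regressing predictor i on all the others; predictors above a cutoff (commonly 10) are treated as collinear and dropped. A standalone sketch of the statistic itself, not SDeMo's implementation:

```julia
using LinearAlgebra, Statistics

# VIF of each column of X: regress column i on an intercept plus the
# remaining columns, and report 1 / (1 - R²).
function vif(X::AbstractMatrix)
    map(1:size(X, 2)) do i
        y = X[:, i]
        Z = hcat(ones(size(X, 1)), X[:, setdiff(1:size(X, 2), i)])
        β = Z \ y                                        # least-squares fit
        r2 = 1 - sum(abs2, y - Z * β) / sum(abs2, y .- mean(y))
        1 / (1 - r2)
    end
end

# Toy check: column 4 is nearly a copy of column 1, so both should
# show large VIFs while columns 2 and 3 stay near 1.
X = randn(200, 3)
X = hcat(X, X[:, 1] .+ 0.05 .* randn(200))
vif(X)
```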