Package evaluation of SDeMo on Julia 1.13.0-DEV.725 (d6294ba973*) started at 2025-06-10T17:13:27.243

################################################################################
# Set-up
#

Installing PkgEval dependencies (TestEnv)...
Set-up completed after 8.18s

################################################################################
# Installation
#

Installing SDeMo...
Resolving package versions...
    Updating `~/.julia/environments/v1.13/Project.toml`
  [3e5feb82] + SDeMo v1.4.0
    Updating `~/.julia/environments/v1.13/Manifest.toml`
  [66dad0bd] + AliasTables v1.1.3
  [7d9fca2a] + Arpack v0.5.4
  [34da2185] + Compat v4.16.0
  [9a962f9c] + DataAPI v1.16.0
  [864edb3b] + DataStructures v0.18.22
  [31c24e10] + Distributions v0.25.120
  [ffbed154] + DocStringExtensions v0.9.5
  [1a297f60] + FillArrays v1.13.0
  [34004b35] + HypergeometricFunctions v0.3.28
  [92d709cd] + IrrationalConstants v0.2.4
  [692b3bcd] + JLLWrappers v1.7.0
  [682c06a0] + JSON v0.21.4
  [2ab3a3ac] + LogExpFunctions v0.3.29
  [e1d29d7a] + Missings v1.2.0
  [6f286f6a] + MultivariateStats v0.10.3
  [bac558e1] + OrderedCollections v1.8.1
  [90014a1f] + PDMats v0.11.35
  [69de0a69] + Parsers v2.8.3
  [aea7be01] + PrecompileTools v1.3.2
  [21216c6a] + Preferences v1.4.3
  [43287f4e] + PtrArrays v1.3.0
  [1fd47b50] + QuadGK v2.11.2
  [189a3867] + Reexport v1.2.2
  [79098fc4] + Rmath v0.8.0
  [3e5feb82] + SDeMo v1.4.0
  [a2af1166] + SortingAlgorithms v1.2.1
  [276daf66] + SpecialFunctions v2.5.1
  [10745b16] + Statistics v1.11.1
  [82ae8749] + StatsAPI v1.7.1
  [2913bbd2] + StatsBase v0.34.5
  [4c63d2b9] + StatsFuns v1.5.0
  [1c621080] + TestItems v1.0.0
⌅ [68821587] + Arpack_jll v3.5.1+1
  [efe28fd5] + OpenSpecFun_jll v0.5.6+0
  [f50d1b31] + Rmath_jll v0.5.1+0
  [0dad84c5] + ArgTools v1.1.2
  [56f22d72] + Artifacts v1.11.0
  [2a0f44e3] + Base64 v1.11.0
  [ade2ca70] + Dates v1.11.0
  [f43a241f] + Downloads v1.7.0
  [7b1f6079] + FileWatching v1.11.0
  [b77e0a4c] + InteractiveUtils v1.11.0
  [ac6e5ff7] + JuliaSyntaxHighlighting v1.12.0
  [b27032c2] + LibCURL v0.6.4
  [76f85450] + LibGit2 v1.11.0
  [8f399da3] + Libdl v1.11.0
  [37e2e46d] + LinearAlgebra v1.12.0
  [56ddb016] + Logging v1.11.0
  [d6f4376e] + Markdown v1.11.0
  [a63ad114] + Mmap v1.11.0
  [ca575930] + NetworkOptions v1.3.0
  [44cfe95a] + Pkg v1.13.0
  [de0858da] + Printf v1.11.0
  [9a3f8284] + Random v1.11.0
  [ea8e919c] + SHA v0.7.0
  [9e88b42a] + Serialization v1.11.0
  [2f01184e] + SparseArrays v1.12.0
  [f489334b] + StyledStrings v1.11.0
  [4607b0f0] + SuiteSparse
  [fa267f1f] + TOML v1.0.3
  [a4e569a6] + Tar v1.10.0
  [cf7118a7] + UUIDs v1.11.0
  [4ec0a83e] + Unicode v1.11.0
  [e66e0078] + CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] + LibCURL_jll v8.12.1+1
  [e37daf67] + LibGit2_jll v1.9.0+0
  [29816b5a] + LibSSH2_jll v1.11.3+1
  [14a3606d] + MozillaCACerts_jll v2025.5.20
  [4536629a] + OpenBLAS_jll v0.3.29+0
  [05823500] + OpenLibm_jll v0.8.5+0
  [458c3c95] + OpenSSL_jll v3.5.0+0
  [bea87d4a] + SuiteSparse_jll v7.10.1+0
  [83775a58] + Zlib_jll v1.3.1+2
  [8e850b90] + libblastrampoline_jll v5.12.0+0
  [8e850ede] + nghttp2_jll v1.65.0+0
  [3f19e933] + p7zip_jll v17.5.0+2
        Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
Installation completed after 4.05s

################################################################################
# Precompilation
#

Precompiling PkgEval dependencies...
Precompiling package dependencies...
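The ⌅ entry above (Arpack_jll) is held back by compatibility constraints. As the Info line suggests, the reason can be inspected once installation has finished; a minimal sketch, using the Pkg REPL command named in the log and its API equivalent:

    pkg> status --outdated -m

    julia> import Pkg
    julia> Pkg.status(; outdated = true, mode = Pkg.PKGMODE_MANIFEST)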
Precompilation completed after 31.5s

################################################################################
# Testing
#

Testing SDeMo
Status `/tmp/jl_ttD4z8/Project.toml`
  [3e5feb82] SDeMo v1.4.0
  [10745b16] Statistics v1.11.1
  [f8b46487] TestItemRunner v1.1.0
  [8dfed614] Test v1.11.0
Status `/tmp/jl_ttD4z8/Manifest.toml`
  [66dad0bd] AliasTables v1.1.3
  [7d9fca2a] Arpack v0.5.4
  [34da2185] Compat v4.16.0
  [9a962f9c] DataAPI v1.16.0
  [864edb3b] DataStructures v0.18.22
  [31c24e10] Distributions v0.25.120
  [ffbed154] DocStringExtensions v0.9.5
  [1a297f60] FillArrays v1.13.0
  [34004b35] HypergeometricFunctions v0.3.28
  [92d709cd] IrrationalConstants v0.2.4
  [692b3bcd] JLLWrappers v1.7.0
  [682c06a0] JSON v0.21.4
  [2ab3a3ac] LogExpFunctions v0.3.29
  [e1d29d7a] Missings v1.2.0
  [6f286f6a] MultivariateStats v0.10.3
  [bac558e1] OrderedCollections v1.8.1
  [90014a1f] PDMats v0.11.35
  [69de0a69] Parsers v2.8.3
  [aea7be01] PrecompileTools v1.3.2
  [21216c6a] Preferences v1.4.3
  [43287f4e] PtrArrays v1.3.0
  [1fd47b50] QuadGK v2.11.2
  [189a3867] Reexport v1.2.2
  [79098fc4] Rmath v0.8.0
  [3e5feb82] SDeMo v1.4.0
  [a2af1166] SortingAlgorithms v1.2.1
  [276daf66] SpecialFunctions v2.5.1
  [10745b16] Statistics v1.11.1
  [82ae8749] StatsAPI v1.7.1
  [2913bbd2] StatsBase v0.34.5
  [4c63d2b9] StatsFuns v1.5.0
  [f8b46487] TestItemRunner v1.1.0
  [1c621080] TestItems v1.0.0
⌅ [68821587] Arpack_jll v3.5.1+1
  [efe28fd5] OpenSpecFun_jll v0.5.6+0
  [f50d1b31] Rmath_jll v0.5.1+0
  [0dad84c5] ArgTools v1.1.2
  [56f22d72] Artifacts v1.11.0
  [2a0f44e3] Base64 v1.11.0
  [ade2ca70] Dates v1.11.0
  [f43a241f] Downloads v1.7.0
  [7b1f6079] FileWatching v1.11.0
  [b77e0a4c] InteractiveUtils v1.11.0
  [ac6e5ff7] JuliaSyntaxHighlighting v1.12.0
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2 v1.11.0
  [8f399da3] Libdl v1.11.0
  [37e2e46d] LinearAlgebra v1.12.0
  [56ddb016] Logging v1.11.0
  [d6f4376e] Markdown v1.11.0
  [a63ad114] Mmap v1.11.0
  [ca575930] NetworkOptions v1.3.0
  [44cfe95a] Pkg v1.13.0
  [de0858da] Printf v1.11.0
  [9a3f8284] Random v1.11.0
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization v1.11.0
  [2f01184e] SparseArrays v1.12.0
  [f489334b] StyledStrings v1.11.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test v1.11.0
  [cf7118a7] UUIDs v1.11.0
  [4ec0a83e] Unicode v1.11.0
  [e66e0078] CompilerSupportLibraries_jll v1.3.0+1
  [deac9b47] LibCURL_jll v8.12.1+1
  [e37daf67] LibGit2_jll v1.9.0+0
  [29816b5a] LibSSH2_jll v1.11.3+1
  [14a3606d] MozillaCACerts_jll v2025.5.20
  [4536629a] OpenBLAS_jll v0.3.29+0
  [05823500] OpenLibm_jll v0.8.5+0
  [458c3c95] OpenSSL_jll v3.5.0+0
  [bea87d4a] SuiteSparse_jll v7.10.1+0
  [83775a58] Zlib_jll v1.3.1+2
  [8e850b90] libblastrampoline_jll v5.12.0+0
  [8e850ede] nghttp2_jll v1.65.0+0
  [3f19e933] p7zip_jll v17.5.0+2
        Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading.
Testing Running tests...
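The test phase below goes through Pkg's standard test entry point; a minimal sketch of reproducing this run outside of PkgEval (which adds sandboxing and time limits on top of plain Pkg):

    import Pkg
    Pkg.add(name = "SDeMo", version = "1.4.0")  # pin the version under evaluation
    Pkg.test("SDeMo")                           # runs the TestItemRunner-based suite shown below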
[ Info: [ 0%] LOSS: training ≈ 0.4914 validation ≈ 0.4951 (101%)
[ Info: [ 1%] LOSS: training ≈ 0.4175 validation ≈ 0.4278 (102%)
[ Info: [ 2%] LOSS: training ≈ 0.3817 validation ≈ 0.3938 (103%)
[ Info: [ 2%] LOSS: training ≈ 0.3607 validation ≈ 0.3735 (104%)
[ Info: [ 2%] LOSS: training ≈ 0.3467 validation ≈ 0.3599 (104%)
[ Info: [ 3%] LOSS: training ≈ 0.3366 validation ≈ 0.3502 (104%)
[ Info: [ 4%] LOSS: training ≈ 0.329 validation ≈ 0.3428 (104%)
[ Info: [ 4%] LOSS: training ≈ 0.3229 validation ≈ 0.3369 (104%)
[ Info: [ 4%] LOSS: training ≈ 0.3179 validation ≈ 0.3322 (104%)
[ Info: [ 5%] LOSS: training ≈ 0.3137 validation ≈ 0.3282 (105%)
[ Info: [ 6%] LOSS: training ≈ 0.3102 validation ≈ 0.3248 (105%)
[ Info: [ 6%] LOSS: training ≈ 0.307 validation ≈ 0.3218 (105%)
[ Info: [ 6%] LOSS: training ≈ 0.3042 validation ≈ 0.3191 (105%)
[ Info: [ 7%] LOSS: training ≈ 0.3018 validation ≈ 0.3168 (105%)
[ Info: [ 8%] LOSS: training ≈ 0.2995 validation ≈ 0.3146 (105%)
[ Info: [ 8%] LOSS: training ≈ 0.2975 validation ≈ 0.3126 (105%)
[ Info: [ 8%] LOSS: training ≈ 0.2956 validation ≈ 0.3108 (105%)
[ Info: [ 9%] LOSS: training ≈ 0.2939 validation ≈ 0.3091 (105%)
[ Info: [ 10%] LOSS: training ≈ 0.2923 validation ≈ 0.3075 (105%)
[ Info: [ 10%] LOSS: training ≈ 0.2908 validation ≈ 0.3061 (105%)
[ Info: [ 10%] LOSS: training ≈ 0.2895 validation ≈ 0.3047 (105%)
[ Info: [ 11%] LOSS: training ≈ 0.2882 validation ≈ 0.3033 (105%)
[ Info: [ 12%] LOSS: training ≈ 0.2869 validation ≈ 0.3021 (105%)
[ Info: [ 12%] LOSS: training ≈ 0.2858 validation ≈ 0.3009 (105%)
[ Info: [ 12%] LOSS: training ≈ 0.2847 validation ≈ 0.2997 (105%)
[ Info: [ 13%] LOSS: training ≈ 0.2837 validation ≈ 0.2986 (105%)
[ Info: [ 14%] LOSS: training ≈ 0.2827 validation ≈ 0.2976 (105%)
[ Info: [ 14%] LOSS: training ≈ 0.2817 validation ≈ 0.2966 (105%)
[ Info: [ 14%] LOSS: training ≈ 0.2808 validation ≈ 0.2956 (105%)
[ Info: [ 15%] LOSS: training ≈ 0.28 validation ≈ 0.2947 (105%)
[ Info: [ 16%] LOSS: training ≈ 0.2792 validation ≈ 0.2938 (105%)
[ Info: [ 16%] LOSS: training ≈ 0.2784 validation ≈ 0.2929 (105%)
[ Info: [ 16%] LOSS: training ≈ 0.2777 validation ≈ 0.2921 (105%)
[ Info: [ 17%] LOSS: training ≈ 0.2769 validation ≈ 0.2913 (105%)
[ Info: [ 18%] LOSS: training ≈ 0.2762 validation ≈ 0.2905 (105%)
[ Info: [ 18%] LOSS: training ≈ 0.2756 validation ≈ 0.2897 (105%)
[ Info: [ 18%] LOSS: training ≈ 0.2749 validation ≈ 0.289 (105%)
[ Info: [ 19%] LOSS: training ≈ 0.2743 validation ≈ 0.2883 (105%)
[ Info: [ 20%] LOSS: training ≈ 0.2737 validation ≈ 0.2876 (105%)
[ Info: [ 20%] LOSS: training ≈ 0.2732 validation ≈ 0.2869 (105%)
[ Info: [ 20%] LOSS: training ≈ 0.2726 validation ≈ 0.2862 (105%)
[ Info: [ 21%] LOSS: training ≈ 0.2721 validation ≈ 0.2856 (105%)
[ Info: [ 22%] LOSS: training ≈ 0.2716 validation ≈ 0.285 (105%)
[ Info: [ 22%] LOSS: training ≈ 0.2711 validation ≈ 0.2844 (105%)
[ Info: [ 22%] LOSS: training ≈ 0.2706 validation ≈ 0.2838 (105%)
[ Info: [ 23%] LOSS: training ≈ 0.2702 validation ≈ 0.2832 (105%)
[ Info: [ 24%] LOSS: training ≈ 0.2697 validation ≈ 0.2827 (105%)
[ Info: [ 24%] LOSS: training ≈ 0.2693 validation ≈ 0.2821 (105%)
[ Info: [ 24%] LOSS: training ≈ 0.2689 validation ≈ 0.2816 (105%)
[ Info: [ 25%] LOSS: training ≈ 0.2685 validation ≈ 0.2811 (105%)
[ Info: [ 26%] LOSS: training ≈ 0.2681 validation ≈ 0.2806 (105%)
[ Info: [ 26%] LOSS: training ≈ 0.2677 validation ≈ 0.2801 (105%)
[ Info: [ 26%] LOSS: training ≈ 0.2673 validation ≈ 0.2796 (105%)
[ Info: [ 27%] LOSS: training ≈ 0.267 validation ≈ 0.2792 (105%)
[ Info: [ 28%] LOSS: training ≈ 0.2666 validation ≈ 0.2787 (105%)
[ Info: [ 28%] LOSS: training ≈ 0.2663 validation ≈ 0.2783 (105%)
[ Info: [ 28%] LOSS: training ≈ 0.266 validation ≈ 0.2779 (104%)
[ Info: [ 29%] LOSS: training ≈ 0.2657 validation ≈ 0.2774 (104%)
[ Info: [ 30%] LOSS: training ≈ 0.2653 validation ≈ 0.277 (104%)
[ Info: [ 30%] LOSS: training ≈ 0.2651 validation ≈ 0.2766 (104%)
[ Info: [ 30%] LOSS: training ≈ 0.2648 validation ≈ 0.2762 (104%)
[ Info: [ 31%] LOSS: training ≈ 0.2645 validation ≈ 0.2759 (104%)
[ Info: [ 32%] LOSS: training ≈ 0.2642 validation ≈ 0.2755 (104%)
[ Info: [ 32%] LOSS: training ≈ 0.264 validation ≈ 0.2751 (104%)
[ Info: [ 32%] LOSS: training ≈ 0.2637 validation ≈ 0.2748 (104%)
[ Info: [ 33%] LOSS: training ≈ 0.2634 validation ≈ 0.2744 (104%)
[ Info: [ 34%] LOSS: training ≈ 0.2632 validation ≈ 0.2741 (104%)
[ Info: [ 34%] LOSS: training ≈ 0.263 validation ≈ 0.2738 (104%)
[ Info: [ 34%] LOSS: training ≈ 0.2627 validation ≈ 0.2735 (104%)
[ Info: [ 35%] LOSS: training ≈ 0.2625 validation ≈ 0.2731 (104%)
[ Info: [ 36%] LOSS: training ≈ 0.2623 validation ≈ 0.2728 (104%)
[ Info: [ 36%] LOSS: training ≈ 0.2621 validation ≈ 0.2725 (104%)
[ Info: [ 36%] LOSS: training ≈ 0.2619 validation ≈ 0.2722 (104%)
[ Info: [ 37%] LOSS: training ≈ 0.2617 validation ≈ 0.272 (104%)
[ Info: [ 38%] LOSS: training ≈ 0.2615 validation ≈ 0.2717 (104%)
[ Info: [ 38%] LOSS: training ≈ 0.2613 validation ≈ 0.2714 (104%)
[ Info: [ 38%] LOSS: training ≈ 0.2611 validation ≈ 0.2711 (104%)
[ Info: [ 39%] LOSS: training ≈ 0.2609 validation ≈ 0.2709 (104%)
[ Info: [ 40%] LOSS: training ≈ 0.2608 validation ≈ 0.2706 (104%)
[ Info: [ 40%] LOSS: training ≈ 0.2606 validation ≈ 0.2704 (104%)
[ Info: [ 40%] LOSS: training ≈ 0.2604 validation ≈ 0.2701 (104%)
[ Info: [ 41%] LOSS: training ≈ 0.2603 validation ≈ 0.2699 (104%)
[ Info: [ 42%] LOSS: training ≈ 0.2601 validation ≈ 0.2696 (104%)
[ Info: [ 42%] LOSS: training ≈ 0.26 validation ≈ 0.2694 (104%)
[ Info: [ 42%] LOSS: training ≈ 0.2598 validation ≈ 0.2692 (104%)
[ Info: [ 43%] LOSS: training ≈ 0.2597 validation ≈ 0.269 (104%)
[ Info: [ 44%] LOSS: training ≈ 0.2595 validation ≈ 0.2688 (104%)
[ Info: [ 44%] LOSS: training ≈ 0.2594 validation ≈ 0.2686 (104%)
[ Info: [ 44%] LOSS: training ≈ 0.2593 validation ≈ 0.2683 (104%)
[ Info: [ 45%] LOSS: training ≈ 0.2591 validation ≈ 0.2681 (103%)
[ Info: [ 46%] LOSS: training ≈ 0.259 validation ≈ 0.2679 (103%)
[ Info: [ 46%] LOSS: training ≈ 0.2589 validation ≈ 0.2678 (103%)
[ Info: [ 46%] LOSS: training ≈ 0.2588 validation ≈ 0.2676 (103%)
[ Info: [ 47%] LOSS: training ≈ 0.2586 validation ≈ 0.2674 (103%)
[ Info: [ 48%] LOSS: training ≈ 0.2585 validation ≈ 0.2672 (103%)
[ Info: [ 48%] LOSS: training ≈ 0.2584 validation ≈ 0.267 (103%)
[ Info: [ 48%] LOSS: training ≈ 0.2583 validation ≈ 0.2668 (103%)
[ Info: [ 49%] LOSS: training ≈ 0.2582 validation ≈ 0.2667 (103%)
[ Info: [ 50%] LOSS: training ≈ 0.2581 validation ≈ 0.2665 (103%)
[ Info: [ 50%] LOSS: training ≈ 0.258 validation ≈ 0.2663 (103%)
[ Info: [ 50%] LOSS: training ≈ 0.2579 validation ≈ 0.2662 (103%)
[ Info: [ 51%] LOSS: training ≈ 0.2578 validation ≈ 0.266 (103%)
[ Info: [ 52%] LOSS: training ≈ 0.2577 validation ≈ 0.2659 (103%)
[ Info: [ 52%] LOSS: training ≈ 0.2576 validation ≈ 0.2657 (103%)
[ Info: [ 52%] LOSS: training ≈ 0.2575 validation ≈ 0.2656 (103%)
[ Info: [ 53%] LOSS: training ≈ 0.2574 validation ≈ 0.2654 (103%)
[ Info: [ 54%] LOSS: training ≈ 0.2573 validation ≈ 0.2653 (103%)
[ Info: [ 54%] LOSS: training ≈ 0.2572 validation ≈ 0.2651 (103%)
[ Info: [ 55%] LOSS: training ≈ 0.2572 validation ≈ 0.265 (103%)
[ Info: [ 55%] LOSS: training ≈ 0.2571 validation ≈ 0.2649 (103%)
[ Info: [ 56%] LOSS: training ≈ 0.257 validation ≈ 0.2647 (103%)
[ Info: [ 56%] LOSS: training ≈ 0.2569 validation ≈ 0.2646 (103%)
[ Info: [ 56%] LOSS: training ≈ 0.2568 validation ≈ 0.2645 (103%)
[ Info: [ 57%] LOSS: training ≈ 0.2568 validation ≈ 0.2643 (103%)
[ Info: [ 57%] LOSS: training ≈ 0.2567 validation ≈ 0.2642 (103%)
[ Info: [ 58%] LOSS: training ≈ 0.2566 validation ≈ 0.2641 (103%)
[ Info: [ 58%] LOSS: training ≈ 0.2565 validation ≈ 0.264 (103%)
[ Info: [ 59%] LOSS: training ≈ 0.2565 validation ≈ 0.2639 (103%)
[ Info: [ 60%] LOSS: training ≈ 0.2564 validation ≈ 0.2638 (103%)
[ Info: [ 60%] LOSS: training ≈ 0.2563 validation ≈ 0.2636 (103%)
[ Info: [ 60%] LOSS: training ≈ 0.2563 validation ≈ 0.2635 (103%)
[ Info: [ 61%] LOSS: training ≈ 0.2562 validation ≈ 0.2634 (103%)
[ Info: [ 62%] LOSS: training ≈ 0.2562 validation ≈ 0.2633 (103%)
[ Info: [ 62%] LOSS: training ≈ 0.2561 validation ≈ 0.2632 (103%)
[ Info: [ 62%] LOSS: training ≈ 0.256 validation ≈ 0.2631 (103%)
[ Info: [ 63%] LOSS: training ≈ 0.256 validation ≈ 0.263 (103%)
[ Info: [ 64%] LOSS: training ≈ 0.2559 validation ≈ 0.2629 (103%)
[ Info: [ 64%] LOSS: training ≈ 0.2559 validation ≈ 0.2628 (103%)
[ Info: [ 64%] LOSS: training ≈ 0.2558 validation ≈ 0.2627 (103%)
[ Info: [ 65%] LOSS: training ≈ 0.2558 validation ≈ 0.2626 (103%)
[ Info: [ 66%] LOSS: training ≈ 0.2557 validation ≈ 0.2625 (103%)
[ Info: [ 66%] LOSS: training ≈ 0.2557 validation ≈ 0.2624 (103%)
[ Info: [ 66%] LOSS: training ≈ 0.2556 validation ≈ 0.2623 (103%)
[ Info: [ 67%] LOSS: training ≈ 0.2556 validation ≈ 0.2623 (103%)
[ Info: [ 68%] LOSS: training ≈ 0.2555 validation ≈ 0.2622 (103%)
[ Info: [ 68%] LOSS: training ≈ 0.2555 validation ≈ 0.2621 (103%)
[ Info: [ 68%] LOSS: training ≈ 0.2554 validation ≈ 0.262 (103%)
[ Info: [ 69%] LOSS: training ≈ 0.2554 validation ≈ 0.2619 (103%)
[ Info: [ 70%] LOSS: training ≈ 0.2553 validation ≈ 0.2618 (103%)
[ Info: [ 70%] LOSS: training ≈ 0.2553 validation ≈ 0.2618 (103%)
[ Info: [ 70%] LOSS: training ≈ 0.2552 validation ≈ 0.2617 (103%)
[ Info: [ 71%] LOSS: training ≈ 0.2552 validation ≈ 0.2616 (103%)
[ Info: [ 72%] LOSS: training ≈ 0.2552 validation ≈ 0.2615 (102%)
[ Info: [ 72%] LOSS: training ≈ 0.2551 validation ≈ 0.2614 (102%)
[ Info: [ 72%] LOSS: training ≈ 0.2551 validation ≈ 0.2614 (102%)
[ Info: [ 73%] LOSS: training ≈ 0.255 validation ≈ 0.2613 (102%)
[ Info: [ 74%] LOSS: training ≈ 0.255 validation ≈ 0.2612 (102%)
[ Info: [ 74%] LOSS: training ≈ 0.255 validation ≈ 0.2612 (102%)
[ Info: [ 74%] LOSS: training ≈ 0.2549 validation ≈ 0.2611 (102%)
[ Info: [ 75%] LOSS: training ≈ 0.2549 validation ≈ 0.261 (102%)
[ Info: [ 76%] LOSS: training ≈ 0.2548 validation ≈ 0.261 (102%)
[ Info: [ 76%] LOSS: training ≈ 0.2548 validation ≈ 0.2609 (102%)
[ Info: [ 76%] LOSS: training ≈ 0.2548 validation ≈ 0.2608 (102%)
[ Info: [ 77%] LOSS: training ≈ 0.2547 validation ≈ 0.2608 (102%)
[ Info: [ 78%] LOSS: training ≈ 0.2547 validation ≈ 0.2607 (102%)
[ Info: [ 78%] LOSS: training ≈ 0.2547 validation ≈ 0.2606 (102%)
[ Info: [ 78%] LOSS: training ≈ 0.2546 validation ≈ 0.2606 (102%)
[ Info: [ 79%] LOSS: training ≈ 0.2546 validation ≈ 0.2605 (102%)
[ Info: [ 80%] LOSS: training ≈ 0.2546 validation ≈ 0.2604 (102%)
[ Info: [ 80%] LOSS: training ≈ 0.2546 validation ≈ 0.2604 (102%)
[ Info: [ 80%] LOSS: training ≈ 0.2545 validation ≈ 0.2603 (102%)
[ Info: [ 81%] LOSS: training ≈ 0.2545 validation ≈ 0.2603 (102%)
[ Info: [ 82%] LOSS: training ≈ 0.2545 validation ≈ 0.2602 (102%)
[ Info: [ 82%] LOSS: training ≈ 0.2544 validation ≈ 0.2602 (102%)
[ Info: [ 82%] LOSS: training ≈ 0.2544 validation ≈ 0.2601 (102%)
[ Info: [ 83%] LOSS: training ≈ 0.2544 validation ≈ 0.2601 (102%)
[ Info: [ 84%] LOSS: training ≈ 0.2544 validation ≈ 0.26 (102%)
[ Info: [ 84%] LOSS: training ≈ 0.2543 validation ≈ 0.2599 (102%)
[ Info: [ 84%] LOSS: training ≈ 0.2543 validation ≈ 0.2599 (102%)
[ Info: [ 85%] LOSS: training ≈ 0.2543 validation ≈ 0.2598 (102%)
[ Info: [ 86%] LOSS: training ≈ 0.2543 validation ≈ 0.2598 (102%)
[ Info: [ 86%] LOSS: training ≈ 0.2542 validation ≈ 0.2597 (102%)
[ Info: [ 86%] LOSS: training ≈ 0.2542 validation ≈ 0.2597 (102%)
[ Info: [ 87%] LOSS: training ≈ 0.2542 validation ≈ 0.2596 (102%)
[ Info: [ 88%] LOSS: training ≈ 0.2542 validation ≈ 0.2596 (102%)
[ Info: [ 88%] LOSS: training ≈ 0.2541 validation ≈ 0.2595 (102%)
[ Info: [ 88%] LOSS: training ≈ 0.2541 validation ≈ 0.2595 (102%)
[ Info: [ 89%] LOSS: training ≈ 0.2541 validation ≈ 0.2595 (102%)
[ Info: [ 90%] LOSS: training ≈ 0.2541 validation ≈ 0.2594 (102%)
[ Info: [ 90%] LOSS: training ≈ 0.2541 validation ≈ 0.2594 (102%)
[ Info: [ 90%] LOSS: training ≈ 0.254 validation ≈ 0.2593 (102%)
[ Info: [ 91%] LOSS: training ≈ 0.254 validation ≈ 0.2593 (102%)
[ Info: [ 92%] LOSS: training ≈ 0.254 validation ≈ 0.2592 (102%)
[ Info: [ 92%] LOSS: training ≈ 0.254 validation ≈ 0.2592 (102%)
[ Info: [ 92%] LOSS: training ≈ 0.254 validation ≈ 0.2592 (102%)
[ Info: [ 93%] LOSS: training ≈ 0.2539 validation ≈ 0.2591 (102%)
[ Info: [ 94%] LOSS: training ≈ 0.2539 validation ≈ 0.2591 (102%)
[ Info: [ 94%] LOSS: training ≈ 0.2539 validation ≈ 0.259 (102%)
[ Info: [ 94%] LOSS: training ≈ 0.2539 validation ≈ 0.259 (102%)
[ Info: [ 95%] LOSS: training ≈ 0.2539 validation ≈ 0.2589 (102%)
[ Info: [ 96%] LOSS: training ≈ 0.2538 validation ≈ 0.2589 (102%)
[ Info: [ 96%] LOSS: training ≈ 0.2538 validation ≈ 0.2589 (102%)
[ Info: [ 96%] LOSS: training ≈ 0.2538 validation ≈ 0.2588 (102%)
[ Info: [ 97%] LOSS: training ≈ 0.2538 validation ≈ 0.2588 (102%)
[ Info: [ 98%] LOSS: training ≈ 0.2538 validation ≈ 0.2588 (102%)
[ Info: [ 98%] LOSS: training ≈ 0.2538 validation ≈ 0.2587 (102%)
[ Info: [ 98%] LOSS: training ≈ 0.2537 validation ≈ 0.2587 (102%)
[ Info: [ 99%] LOSS: training ≈ 0.2537 validation ≈ 0.2587 (102%)
[ Info: [100%] LOSS: training ≈ 0.2537 validation ≈ 0.2586 (102%)
[ Info: [100%] LOSS: training ≈ 0.2537 validation ≈ 0.2586 (102%)
[ Info: [ 0%] LOSS: training ≈ 0.4215
[ Info: [ 1%] LOSS: training ≈ 0.3806
[ Info: [ 2%] LOSS: training ≈ 0.3564
[ Info: [ 2%] LOSS: training ≈ 0.3399
[ Info: [ 2%] LOSS: training ≈ 0.3279
[ Info: [ 3%] LOSS: training ≈ 0.3185
[ Info: [ 4%] LOSS: training ≈ 0.311
[ Info: [ 4%] LOSS: training ≈ 0.3048
[ Info: [ 4%] LOSS: training ≈ 0.2996
[ Info: [ 5%] LOSS: training ≈ 0.2952
[ Info: [ 6%] LOSS: training ≈ 0.2914
[ Info: [ 6%] LOSS: training ≈ 0.2881
[ Info: [ 6%] LOSS: training ≈ 0.2852
[ Info: [ 7%] LOSS: training ≈ 0.2826
[ Info: [ 8%] LOSS: training ≈ 0.2802
[ Info: [ 8%] LOSS: training ≈ 0.2782
[ Info: [ 8%] LOSS: training ≈ 0.2763
[ Info: [ 9%] LOSS: training ≈ 0.2746
[ Info: [ 10%] LOSS: training ≈ 0.2731
[ Info: [ 10%] LOSS: training ≈ 0.2717
[ Info: [ 10%] LOSS: training ≈ 0.2704
[ Info: [ 11%] LOSS: training ≈ 0.2692
[ Info: [ 12%] LOSS: training ≈ 0.2681
[ Info: [ 12%] LOSS: training ≈ 0.2671
[ Info: [ 12%] LOSS: training ≈ 0.2662
[ Info: [ 13%] LOSS: training ≈ 0.2654
[ Info: [ 14%] LOSS: training ≈ 0.2646
[ Info: [ 14%] LOSS: training ≈ 0.2639
[ Info: [ 14%] LOSS: training ≈ 0.2632
[ Info: [ 15%] LOSS: training ≈ 0.2626
[ Info: [ 16%] LOSS: training ≈ 0.262
[ Info: [ 16%] LOSS: training ≈ 0.2614
[ Info: [ 16%] LOSS: training ≈ 0.2609
[ Info: [ 17%] LOSS: training ≈ 0.2605
[ Info: [ 18%] LOSS: training ≈ 0.26
[ Info: [ 18%] LOSS: training ≈ 0.2596
[ Info: [ 18%] LOSS: training ≈ 0.2592
[ Info: [ 19%] LOSS: training ≈ 0.2589
[ Info: [ 20%] LOSS: training ≈ 0.2585
[ Info: [ 20%] LOSS: training ≈ 0.2582
[ Info: [ 20%] LOSS: training ≈ 0.2579
[ Info: [ 21%] LOSS: training ≈ 0.2576
[ Info: [ 22%] LOSS: training ≈ 0.2573
[ Info: [ 22%] LOSS: training ≈ 0.2571
[ Info: [ 22%] LOSS: training ≈ 0.2568
[ Info: [ 23%] LOSS: training ≈ 0.2566
[ Info: [ 24%] LOSS: training ≈ 0.2564
[ Info: [ 24%] LOSS: training ≈ 0.2562
[ Info: [ 24%] LOSS: training ≈ 0.256
[ Info: [ 25%] LOSS: training ≈ 0.2558
[ Info: [ 26%] LOSS: training ≈ 0.2557
[ Info: [ 26%] LOSS: training ≈ 0.2555
[ Info: [ 26%] LOSS: training ≈ 0.2554
[ Info: [ 27%] LOSS: training ≈ 0.2552
[ Info: [ 28%] LOSS: training ≈ 0.2551
[ Info: [ 28%] LOSS: training ≈ 0.255
[ Info: [ 28%] LOSS: training ≈ 0.2548
[ Info: [ 29%] LOSS: training ≈ 0.2547
[ Info: [ 30%] LOSS: training ≈ 0.2546
[ Info: [ 30%] LOSS: training ≈ 0.2545
[ Info: [ 30%] LOSS: training ≈ 0.2544
[ Info: [ 31%] LOSS: training ≈ 0.2543
[ Info: [ 32%] LOSS: training ≈ 0.2542
[ Info: [ 32%] LOSS: training ≈ 0.2541
[ Info: [ 32%] LOSS: training ≈ 0.2541
[ Info: [ 33%] LOSS: training ≈ 0.254
[ Info: [ 34%] LOSS: training ≈ 0.2539
[ Info: [ 34%] LOSS: training ≈ 0.2538
[ Info: [ 34%] LOSS: training ≈ 0.2538
[ Info: [ 35%] LOSS: training ≈ 0.2537
[ Info: [ 36%] LOSS: training ≈ 0.2536
[ Info: [ 36%] LOSS: training ≈ 0.2536
[ Info: [ 36%] LOSS: training ≈ 0.2535
[ Info: [ 37%] LOSS: training ≈ 0.2535
[ Info: [ 38%] LOSS: training ≈ 0.2534
[ Info: [ 38%] LOSS: training ≈ 0.2534
[ Info: [ 38%] LOSS: training ≈ 0.2533
[ Info: [ 39%] LOSS: training ≈ 0.2533
[ Info: [ 40%] LOSS: training ≈ 0.2532
[ Info: [ 40%] LOSS: training ≈ 0.2532
[ Info: [ 40%] LOSS: training ≈ 0.2532
[ Info: [ 41%] LOSS: training ≈ 0.2531
[ Info: [ 42%] LOSS: training ≈ 0.2531
[ Info: [ 42%] LOSS: training ≈ 0.2531
[ Info: [ 42%] LOSS: training ≈ 0.253
[ Info: [ 43%] LOSS: training ≈ 0.253
[ Info: [ 44%] LOSS: training ≈ 0.253
[ Info: [ 44%] LOSS: training ≈ 0.2529
[ Info: [ 44%] LOSS: training ≈ 0.2529
[ Info: [ 45%] LOSS: training ≈ 0.2529
[ Info: [ 46%] LOSS: training ≈ 0.2529
[ Info: [ 46%] LOSS: training ≈ 0.2528
[ Info: [ 46%] LOSS: training ≈ 0.2528
[ Info: [ 47%] LOSS: training ≈ 0.2528
[ Info: [ 48%] LOSS: training ≈ 0.2528
[ Info: [ 48%] LOSS: training ≈ 0.2527
[ Info: [ 48%] LOSS: training ≈ 0.2527
[ Info: [ 49%] LOSS: training ≈ 0.2527
[ Info: [ 50%] LOSS: training ≈ 0.2527
[ Info: [ 50%] LOSS: training ≈ 0.2527
[ Info: [ 50%] LOSS: training ≈ 0.2527
[ Info: [ 51%] LOSS: training ≈ 0.2526
[ Info: [ 52%] LOSS: training ≈ 0.2526
[ Info: [ 52%] LOSS: training ≈ 0.2526
[ Info: [ 52%] LOSS: training ≈ 0.2526
[ Info: [ 53%] LOSS: training ≈ 0.2526
[ Info: [ 54%] LOSS: training ≈ 0.2526
[ Info: [ 54%] LOSS: training ≈ 0.2525
[ Info: [ 55%] LOSS: training ≈ 0.2525
[ Info: [ 55%] LOSS: training ≈ 0.2525
[ Info: [ 56%] LOSS: training ≈ 0.2525
[ Info: [ 56%] LOSS: training ≈ 0.2525
[ Info: [ 56%] LOSS: training ≈ 0.2525
[ Info: [ 57%] LOSS: training ≈ 0.2525
[ Info: [ 57%] LOSS: training ≈ 0.2525
[ Info: [ 58%] LOSS: training ≈ 0.2525
[ Info: [ 58%] LOSS: training ≈ 0.2525
[ Info: [ 59%] LOSS: training ≈ 0.2524
[ Info: [ 60%] LOSS: training ≈ 0.2524
[ Info: [ 60%] LOSS: training ≈ 0.2524
[ Info: [ 60%] LOSS: training ≈ 0.2524
[ Info: [ 61%] LOSS: training ≈ 0.2524
[ Info: [ 62%] LOSS: training ≈ 0.2524
[ Info: [ 62%] LOSS: training ≈ 0.2524
[ Info: [ 62%] LOSS: training ≈ 0.2524
[ Info: [ 63%] LOSS: training ≈ 0.2524
[ Info: [ 64%] LOSS: training ≈ 0.2524
[ Info: [ 64%] LOSS: training ≈ 0.2524
[ Info: [ 64%] LOSS: training ≈ 0.2524
[ Info: [ 65%] LOSS: training ≈ 0.2524
[ Info: [ 66%] LOSS: training ≈ 0.2523
[ Info: [ 66%] LOSS: training ≈ 0.2523
[ Info: [ 66%] LOSS: training ≈ 0.2523
[ Info: [ 67%] LOSS: training ≈ 0.2523
[ Info: [ 68%] LOSS: training ≈ 0.2523
[ Info: [ 68%] LOSS: training ≈ 0.2523
[ Info: [ 68%] LOSS: training ≈ 0.2523
[ Info: [ 69%] LOSS: training ≈ 0.2523
[ Info: [ 70%] LOSS: training ≈ 0.2523
[ Info: [ 70%] LOSS: training ≈ 0.2523
[ Info: [ 70%] LOSS: training ≈ 0.2523
[ Info: [ 71%] LOSS: training ≈ 0.2523
[ Info: [ 72%] LOSS: training ≈ 0.2523
[ Info: [ 72%] LOSS: training ≈ 0.2523
[ Info: [ 72%] LOSS: training ≈ 0.2523
[ Info: [ 73%] LOSS: training ≈ 0.2523
[ Info: [ 74%] LOSS: training ≈ 0.2523
[ Info: [ 74%] LOSS: training ≈ 0.2523
[ Info: [ 74%] LOSS: training ≈ 0.2523
[ Info: [ 75%] LOSS: training ≈ 0.2523
[ Info: [ 76%] LOSS: training ≈ 0.2523
[ Info: [ 76%] LOSS: training ≈ 0.2523
[ Info: [ 76%] LOSS: training ≈ 0.2523
[ Info: [ 77%] LOSS: training ≈ 0.2523
[ Info: [ 78%] LOSS: training ≈ 0.2523
[ Info: [ 78%] LOSS: training ≈ 0.2523
[ Info: [ 78%] LOSS: training ≈ 0.2522
[ Info: [ 79%] LOSS: training ≈ 0.2522
[ Info: [ 80%] LOSS: training ≈ 0.2522
[ Info: [ 80%] LOSS: training ≈ 0.2522
[ Info: [ 80%] LOSS: training ≈ 0.2522
[ Info: [ 81%] LOSS: training ≈ 0.2522
[ Info: [ 82%] LOSS: training ≈ 0.2522
[ Info: [ 82%] LOSS: training ≈ 0.2522
[ Info: [ 82%] LOSS: training ≈ 0.2522
[ Info: [ 83%] LOSS: training ≈ 0.2522
[ Info: [ 84%] LOSS: training ≈ 0.2522
[ Info: [ 84%] LOSS: training ≈ 0.2522
[ Info: [ 84%] LOSS: training ≈ 0.2522
[ Info: [ 85%] LOSS: training ≈ 0.2522
[ Info: [ 86%] LOSS: training ≈ 0.2522
[ Info: [ 86%] LOSS: training ≈ 0.2522
[ Info: [ 86%] LOSS: training ≈ 0.2522
[ Info: [ 87%] LOSS: training ≈ 0.2522
[ Info: [ 88%] LOSS: training ≈ 0.2522
[ Info: [ 88%] LOSS: training ≈ 0.2522
[ Info: [ 88%] LOSS: training ≈ 0.2522
[ Info: [ 89%] LOSS: training ≈ 0.2522
[ Info: [ 90%] LOSS: training ≈ 0.2522
[ Info: [ 90%] LOSS: training ≈ 0.2522
[ Info: [ 90%] LOSS: training ≈ 0.2522
[ Info: [ 91%] LOSS: training ≈ 0.2522
[ Info: [ 92%] LOSS: training ≈ 0.2522
[ Info: [ 92%] LOSS: training ≈ 0.2522
[ Info: [ 92%] LOSS: training ≈ 0.2522
[ Info: [ 93%] LOSS: training ≈ 0.2522
[ Info: [ 94%] LOSS: training ≈ 0.2522
[ Info: [ 94%] LOSS: training ≈ 0.2522
[ Info: [ 94%] LOSS: training ≈ 0.2522
[ Info: [ 95%] LOSS: training ≈ 0.2522
[ Info: [ 96%] LOSS: training ≈ 0.2522
[ Info: [ 96%] LOSS: training ≈ 0.2522
[ Info: [ 96%] LOSS: training ≈ 0.2522
[ Info: [ 97%] LOSS: training ≈ 0.2522
[ Info: [ 98%] LOSS: training ≈ 0.2522
[ Info: [ 98%] LOSS: training ≈ 0.2522
[ Info: [ 98%] LOSS: training ≈ 0.2522
[ Info: [ 99%] LOSS: training ≈ 0.2522
[ Info: [100%] LOSS: training ≈ 0.2522
[ Info: [100%] LOSS: training ≈ 0.2522
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 2 vars.] MCC val. ≈ -0.0
[ Info: [ 3 vars.] MCC val. ≈ 0.499
[ Info: Optimal var. pool: [1, 12, 19]
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 0 vars.] MCC val. ≈ -0.0
[ Info: [ 1 vars.] MCC val. ≈ 0.689
[ Info: [ 2 vars.] MCC val. ≈ 0.696
[ Info: [ 3 vars.] MCC val. ≈ 0.703
[ Info: [ 4 vars.] MCC val. ≈ 0.704
[ Info: [ 5 vars.] MCC val. ≈ 0.724
[ Info: Optimal var. pool: [8, 7, 17, 18, 1]
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
[ Info: [19 vars.] MCC val. ≈ -0.0
[ Info: [18 vars.] MCC val. ≈ 0.628
[ Info: [17 vars.] MCC val. ≈ 0.835
[ Info: Optimal var. pool: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 13, 14, 15, 16, 17, 18]
┌ Warning: backwardselection! will be deprecated - use variables! with BackwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:44
[ Info: [ 9 vars.] MCC val. ≈ -0.0
[ Info: [ 8 vars.] MCC val. ≈ 0.801
[ Info: Optimal var. pool: [1, 2, 3, 4, 5, 6, 7, 9]
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 2 vars.] MCC val. ≈ -0.0
[ Info: [ 3 vars.] MCC val. ≈ 0.515
[ Info: Optimal var. pool: [12, 13, 1]
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: stepwisevif! will be deprecated - use variables! with VarianceInflationFactor instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/vif.jl:24
┌ Warning: forwardselection! will be deprecated - use variables! with ForwardSelection instead
└ @ SDeMo ~/.julia/packages/SDeMo/WBmCZ/src/variables/selection.jl:90
[ Info: [ 0 vars.] MCC val. ≈ -0.0
[ Info: [ 1 vars.] MCC val. ≈ 0.78
[ Info: [ 2 vars.] MCC val. ≈ 0.786
[ Info: Optimal var. pool: [8, 7]
[ Info: Baseline mcc: -2.8367742685852565e-17
[ Info: Optimal 1 variables model - mcc ≈ 0.7399
[ Info: Optimal 2 variables model - mcc ≈ 0.7608
[ Info: Optimal 3 variables model - mcc ≈ 0.7684
[ Info: Optimal 4 variables model - mcc ≈ 0.7686
[ Info: Optimal 5 variables model - mcc ≈ 0.8044
[ Info: Optimal 6 variables model - mcc ≈ 0.8156
[ Info: Returning model with 7 variables - mcc ≈ 0.8156
Test Summary: | Pass  Total      Time
Package       |  298    298  20m35.0s
     Testing SDeMo tests passed
Testing completed after 1293.66s
PkgEval succeeded after 1354.34s
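The deprecation warnings repeated throughout the run all point at the same migration: the standalone helpers stepwisevif!, forwardselection!, and backwardselection! are superseded by variables! parameterized with a selection strategy (VarianceInflationFactor, ForwardSelection, BackwardSelection). A minimal sketch of the suggested style; the model setup (SDeMo.demodata, SDM, kfold) is taken from the SDeMo documentation, and the exact variables! argument conventions are an assumption here:

    using SDeMo

    # Demo data bundled with SDeMo: an instance-by-variable matrix plus labels.
    X, y = SDeMo.demodata()

    # Transform + classifier pairing, as in the package documentation.
    sdm = SDM(MultivariateTransform{PCA}, NaiveBayes, X, y)
    folds = kfold(sdm)

    # Replacement style named by the warnings, instead of the deprecated
    # stepwisevif!/forwardselection!/backwardselection! helpers; whether the
    # strategy is passed as a type or an instance (e.g. with a VIF threshold)
    # may differ between versions - see ?variables!.
    variables!(sdm, VarianceInflationFactor, folds)
    variables!(sdm, ForwardSelection, folds)
    variables!(sdm, BackwardSelection, folds)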