Package evaluation of ConformalPrediction on Julia 1.10.8 (92f03a4775*) started at 2025-02-25T16:12:55.390 ################################################################################ # Set-up # Installing PkgEval dependencies (TestEnv)... Set-up completed after 5.08s ################################################################################ # Installation # Installing ConformalPrediction... Resolving package versions... Updating `~/.julia/environments/v1.10/Project.toml` [98bfc277] + ConformalPrediction v0.1.13 Updating `~/.julia/environments/v1.10/Manifest.toml` [621f4979] + AbstractFFTs v1.5.0 [7d9f7c33] + Accessors v0.1.41 [79e6a3ab] + Adapt v4.2.0 [66dad0bd] + AliasTables v1.1.3 [dce04be8] + ArgCheck v2.4.0 [4fba245c] + ArrayInterface v7.18.0 [a9b6321e] + Atomix v1.1.0 [fbb218c0] + BSON v0.3.9 [198e06fe] + BangBang v0.4.3 [9718e550] + Baselet v0.1.1 [fa961155] + CEnum v0.5.0 [324d7699] + CategoricalArrays v0.10.8 [af321ab8] + CategoricalDistributions v0.1.15 [082447d4] + ChainRules v1.72.2 [d360d2e6] + ChainRulesCore v1.25.1 ⌅ [3da002f7] + ColorTypes v0.11.5 [bbf7d656] + CommonSubexpressions v0.3.1 [34da2185] + Compat v4.16.0 [a33af91c] + CompositionsBase v0.1.2 [ed09eef8] + ComputationalResources v0.3.2 [98bfc277] + ConformalPrediction v0.1.13 [187b0558] + ConstructionBase v1.5.8 [6add18c4] + ContextVariablesX v0.1.3 [a8cc5b0e] + Crayons v4.1.1 [9a962f9c] + DataAPI v1.16.0 [864edb3b] + DataStructures v0.18.20 [e2d170a0] + DataValueInterfaces v1.0.0 [244e2a9f] + DefineSingletons v0.1.2 [8bb1440f] + DelimitedFiles v1.9.1 [b429d917] + DensityInterface v0.4.0 [163ba53b] + DiffResults v1.1.0 [b552c78f] + DiffRules v1.15.1 [31c24e10] + Distributions v0.25.117 [ffbed154] + DocStringExtensions v0.9.3 [cc61a311] + FLoops v0.2.2 [b9860ae5] + FLoopsBase v0.1.1 [5789e2e9] + FileIO v1.16.6 [1a297f60] + FillArrays v1.13.0 [6a86dc24] + FiniteDiff v2.27.0 [53c48c17] + FixedPointNumbers v0.8.5 ⌅ [587475ba] + Flux v0.14.25 [f6369f11] + ForwardDiff v0.10.38 ⌅ [d9f16b24] + Functors v0.4.12 [0c68f7d7] + GPUArrays v11.2.2 [46192b85] + GPUArraysCore v0.2.0 [076d061b] + HashArrayMappedTries v0.2.0 [34004b35] + HypergeometricFunctions v0.3.27 [7869d1d1] + IRTools v0.4.14 [4846b161] + InferOpt v0.6.1 [22cec73e] + InitialValues v0.3.1 [3587e190] + InverseFunctions v0.1.17 [41ab1584] + InvertedIndices v1.3.1 [92d709cd] + IrrationalConstants v0.2.4 [42fd0dbc] + IterativeSolvers v0.9.4 [82899510] + IteratorInterfaceExtensions v1.0.0 [033835bb] + JLD2 v0.5.11 [692b3bcd] + JLLWrappers v1.7.0 [b14d175d] + JuliaVariables v0.2.4 [63c18a36] + KernelAbstractions v0.9.34 [929cbde3] + LLVM v9.2.0 [b964fa9f] + LaTeXStrings v1.4.0 ⌅ [92ad9a40] + LearnAPI v0.1.0 [d3d80556] + LineSearches v7.3.0 [7a12625a] + LinearMaps v3.11.4 [2ab3a3ac] + LogExpFunctions v0.3.29 [c2834f40] + MLCore v1.0.0 ⌃ [7e8f7934] + MLDataDevices v1.5.3 [a7f614a8] + MLJBase v1.7.0 [50ed68f4] + MLJEnsembles v0.4.3 ⌅ [094fc8d1] + MLJFlux v0.5.1 [6ee0df7b] + MLJLinearModels v0.10.0 [e80e1ace] + MLJModelInterface v1.11.0 [d8e11817] + MLStyle v0.4.17 [f1d291b0] + MLUtils v0.4.7 [1914dd2f] + MacroTools v0.5.15 [dbeba491] + Metalhead v0.9.5 [128add7d] + MicroCollections v0.2.0 [e1d29d7a] + Missings v1.2.0 [d41bc354] + NLSolversBase v7.8.3 [872c559c] + NNlib v0.9.27 [77ba4419] + NaNMath v1.1.2 [71a1bf82] + NameResolution v0.1.5 [0b1bfda6] + OneHotArrays v0.2.6 [429524aa] + Optim v1.11.0 ⌅ [3bd65402] + Optimisers v0.3.4 [bac558e1] + OrderedCollections v1.8.0 [90014a1f] + PDMats v0.11.32 [d96e819e] + Parameters v0.12.3 [570af359] + PartialFunctions 
v1.2.0 [85a6dd25] + PositiveFactorizations v0.2.4 [aea7be01] + PrecompileTools v1.2.1 [21216c6a] + Preferences v1.4.3 [8162dcfd] + PrettyPrint v0.2.0 [08abe8d2] + PrettyTables v2.4.0 [33c8b6b6] + ProgressLogging v0.1.4 [92933f4c] + ProgressMeter v1.10.2 [43287f4e] + PtrArrays v1.3.0 [1fd47b50] + QuadGK v2.11.2 [c1ae055f] + RealDot v0.1.0 [3cdcf5f2] + RecipesBase v1.3.4 [189a3867] + Reexport v1.2.2 [42d2dcc6] + Referenceables v0.1.3 [97f35ef4] + RequiredInterfaces v0.1.7 [ae029012] + Requires v1.3.0 [79098fc4] + Rmath v0.8.0 [321657f4] + ScientificTypes v3.1.0 [30f210dd] + ScientificTypesBase v3.0.0 [7e506255] + ScopedValues v1.3.0 [efcf1570] + Setfield v1.1.1 [605ecd9f] + ShowCases v0.1.0 [699a6c99] + SimpleTraits v0.9.4 [a2af1166] + SortingAlgorithms v1.2.1 [dc90abb0] + SparseInverseSubset v0.1.2 [276daf66] + SpecialFunctions v2.5.0 [171d559e] + SplittablesBase v0.1.15 [90137ffa] + StaticArrays v1.9.12 [1e83bf80] + StaticArraysCore v1.4.3 [c062fc1d] + StatisticalMeasuresBase v0.1.2 [64bff920] + StatisticalTraits v3.4.0 [82ae8749] + StatsAPI v1.7.0 [2913bbd2] + StatsBase v0.34.4 [4c63d2b9] + StatsFuns v1.3.2 [892a3eda] + StringManipulation v0.4.1 ⌃ [09ab397b] + StructArrays v0.6.21 [3783bdb8] + TableTraits v1.0.1 [bd369af6] + Tables v1.12.0 [ac1d9e8a] + ThreadsX v0.1.12 [3bb67fe8] + TranscodingStreams v0.11.3 [28d57a85] + Transducers v0.4.84 [3a884ed6] + UnPack v1.0.2 [013be700] + UnsafeAtomics v0.3.0 ⌅ [e88e6eb3] + Zygote v0.6.75 [700de1a5] + ZygoteRules v0.2.7 [dad2f222] + LLVMExtra_jll v0.0.35+0 [efe28fd5] + OpenSpecFun_jll v0.5.6+0 [f50d1b31] + Rmath_jll v0.5.1+0 [0dad84c5] + ArgTools v1.1.1 [56f22d72] + Artifacts [2a0f44e3] + Base64 [ade2ca70] + Dates [8ba89e20] + Distributed [f43a241f] + Downloads v1.6.0 [7b1f6079] + FileWatching [9fa8497b] + Future [b77e0a4c] + InteractiveUtils [4af54fe1] + LazyArtifacts [b27032c2] + LibCURL v0.6.4 [76f85450] + LibGit2 [8f399da3] + Libdl [37e2e46d] + LinearAlgebra [56ddb016] + Logging [d6f4376e] + Markdown [a63ad114] + Mmap [ca575930] + NetworkOptions v1.2.0 [44cfe95a] + Pkg v1.10.0 [de0858da] + Printf [3fa0cd96] + REPL [9a3f8284] + Random [ea8e919c] + SHA v0.7.0 [9e88b42a] + Serialization [6462fe0b] + Sockets [2f01184e] + SparseArrays v1.10.0 [10745b16] + Statistics v1.10.0 [4607b0f0] + SuiteSparse [fa267f1f] + TOML v1.0.3 [a4e569a6] + Tar v1.10.0 [8dfed614] + Test [cf7118a7] + UUIDs [4ec0a83e] + Unicode [e66e0078] + CompilerSupportLibraries_jll v1.1.1+0 [deac9b47] + LibCURL_jll v8.4.0+0 [e37daf67] + LibGit2_jll v1.6.4+0 [29816b5a] + LibSSH2_jll v1.11.0+1 [c8ffd9c3] + MbedTLS_jll v2.28.2+1 [14a3606d] + MozillaCACerts_jll v2023.1.10 [4536629a] + OpenBLAS_jll v0.3.23+4 [05823500] + OpenLibm_jll v0.8.1+4 [bea87d4a] + SuiteSparse_jll v7.2.1+1 [83775a58] + Zlib_jll v1.2.13+1 [8e850b90] + libblastrampoline_jll v5.11.0+0 [8e850ede] + nghttp2_jll v1.52.0+1 [3f19e933] + p7zip_jll v17.4.0+2 Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading. To see why use `status --outdated -m` Installation completed after 11.57s ################################################################################ # Precompilation # Precompiling PkgEval dependencies... Precompiling package dependencies... 
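For reference, the installation phase above maps onto the standard Pkg commands a user would run locally; a minimal sketch (not part of the PkgEval harness output), with the version pinned to match this run:

# Install and precompile the package locally, mirroring the Installation and
# Precompilation phases of this log.
using Pkg
Pkg.add(name = "ConformalPrediction", version = "0.1.13")
Pkg.precompile()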
Precompilation completed after 534.44s ################################################################################ # Testing # Testing ConformalPrediction Status `/tmp/jl_zoPIqr/Project.toml` [4c88cf16] Aqua v0.8.11 [5224ae11] CompatHelperLocal v0.1.27 [98bfc277] ConformalPrediction v0.1.13 [7806a523] DecisionTree v0.12.4 [e30172f5] Documenter v1.8.1 [f6006082] EvoTrees v0.16.9 ⌅ [7acf609c] LightGBM v0.7.2 [add582a8] MLJ v0.20.7 [c6f25543] MLJDecisionTreeInterface v0.4.2 ⌅ [094fc8d1] MLJFlux v0.5.1 [6ee0df7b] MLJLinearModels v0.10.0 [e80e1ace] MLJModelInterface v1.11.0 [636a865e] NearestNeighborModels v0.2.3 [91a5bcdd] Plots v1.40.9 [bd7198b4] TaijaPlotting v1.3.0 [8dfed614] Test Status `/tmp/jl_zoPIqr/Manifest.toml` [a4c015fc] ANSIColoredPrinters v0.0.1 [da404889] ARFFFiles v1.5.0 [621f4979] AbstractFFTs v1.5.0 [1520ce14] AbstractTrees v0.4.5 [7d9f7c33] Accessors v0.1.41 [79e6a3ab] Adapt v4.2.0 [66dad0bd] AliasTables v1.1.3 [4c88cf16] Aqua v0.8.11 [dce04be8] ArgCheck v2.4.0 [ec485272] ArnoldiMethod v0.4.0 [7d9fca2a] Arpack v0.5.4 [4fba245c] ArrayInterface v7.18.0 [a9b6321e] Atomix v1.1.0 [fbb218c0] BSON v0.3.9 [198e06fe] BangBang v0.4.3 [9718e550] Baselet v0.1.1 [d1d4a3ce] BitFlags v0.1.9 [fa961155] CEnum v0.5.0 [324d7699] CategoricalArrays v0.10.8 [af321ab8] CategoricalDistributions v0.1.15 ⌅ [8e462317] CausalInference v0.18.0 [082447d4] ChainRules v1.72.2 [d360d2e6] ChainRulesCore v1.25.1 [944b1d66] CodecZlib v0.7.8 [35d6a980] ColorSchemes v3.29.0 ⌅ [3da002f7] ColorTypes v0.11.5 ⌃ [c3611d14] ColorVectorSpace v0.10.0 [5ae59095] Colors v0.13.0 [861a8166] Combinatorics v1.0.2 [bbf7d656] CommonSubexpressions v0.3.1 [34da2185] Compat v4.16.0 [5224ae11] CompatHelperLocal v0.1.27 [a33af91c] CompositionsBase v0.1.2 [ed09eef8] ComputationalResources v0.3.2 [f0e56b4a] ConcurrentUtilities v2.5.0 [98bfc277] ConformalPrediction v0.1.13 [187b0558] ConstructionBase v1.5.8 [6add18c4] ContextVariablesX v0.1.3 [d38c429a] Contour v0.6.3 [2f13d31b] CounterfactualExplanations v1.4.5 [a8cc5b0e] Crayons v4.1.1 [9a962f9c] DataAPI v1.16.0 [a93c6f00] DataFrames v1.7.0 [864edb3b] DataStructures v0.18.20 [e2d170a0] DataValueInterfaces v1.0.0 [7806a523] DecisionTree v0.12.4 [244e2a9f] DefineSingletons v0.1.2 [8bb1440f] DelimitedFiles v1.9.1 [b429d917] DensityInterface v0.4.0 [163ba53b] DiffResults v1.1.0 [b552c78f] DiffRules v1.15.1 [b4f34e82] Distances v0.10.12 [31c24e10] Distributions v0.25.117 [ffbed154] DocStringExtensions v0.9.3 [e30172f5] Documenter v1.8.1 [792122b4] EarlyStopping v0.3.0 [f446124b] EnergySamplers v1.0.3 [f6006082] EvoTrees v0.16.9 [460bff9d] ExceptionUnwrapping v0.1.11 [411431e0] Extents v0.1.5 [c87230d0] FFMPEG v0.4.2 [cc61a311] FLoops v0.2.2 [b9860ae5] FLoopsBase v0.1.1 [33837fe5] FeatureSelection v0.2.2 [5789e2e9] FileIO v1.16.6 [48062228] FilePathsBase v0.9.23 [1a297f60] FillArrays v1.13.0 [6a86dc24] FiniteDiff v2.27.0 [53c48c17] FixedPointNumbers v0.8.5 ⌅ [587475ba] Flux v0.14.25 [1fa38f19] Format v1.3.7 [f6369f11] ForwardDiff v0.10.38 ⌅ [d9f16b24] Functors v0.4.12 [0c68f7d7] GPUArrays v11.2.2 [46192b85] GPUArraysCore v0.2.0 [28b8d3ca] GR v0.73.13 [68eda718] GeoFormatTypes v0.4.4 [cf35fbd7] GeoInterface v1.4.1 [5c1252a2] GeometryBasics v0.5.5 [d7ba0133] Git v1.3.1 [86223c79] Graphs v1.12.0 [42e2da0e] Grisu v1.0.2 [cd3eb016] HTTP v1.10.15 [076d061b] HashArrayMappedTries v0.2.0 [34004b35] HypergeometricFunctions v0.3.27 [b5f81e59] IOCapture v0.2.5 [7869d1d1] IRTools v0.4.14 [4846b161] InferOpt v0.6.1 [d25df0c9] Inflate v0.1.5 [22cec73e] InitialValues v0.3.1 [842dd82b] InlineStrings 
v1.4.3 [3587e190] InverseFunctions v0.1.17 [41ab1584] InvertedIndices v1.3.1 [92d709cd] IrrationalConstants v0.2.4 [c8e1da08] IterTools v1.10.0 [b3c1a2ee] IterationControl v0.5.4 [42fd0dbc] IterativeSolvers v0.9.4 [82899510] IteratorInterfaceExtensions v1.0.0 [033835bb] JLD2 v0.5.11 [1019f520] JLFzf v0.1.9 [692b3bcd] JLLWrappers v1.7.0 [682c06a0] JSON v0.21.4 [b14d175d] JuliaVariables v0.2.4 [63c18a36] KernelAbstractions v0.9.34 [ec8451be] KernelFunctions v0.10.64 [929cbde3] LLVM v9.2.0 [8ac3fa9e] LRUCache v1.6.1 [b964fa9f] LaTeXStrings v1.4.0 [c52c1a26] LaplaceRedux v1.2.0 [23fbe1c1] Latexify v0.16.6 [a5e1c1ea] LatinHypercubeSampling v1.9.0 [0e77f7df] LazilyInitializedFields v1.3.0 ⌅ [92ad9a40] LearnAPI v0.1.0 ⌅ [7acf609c] LightGBM v0.7.2 [d3d80556] LineSearches v7.3.0 [7a12625a] LinearMaps v3.11.4 [70f5e60a] LinkedLists v0.1.1 [2ab3a3ac] LogExpFunctions v0.3.29 [aa2f6b4e] LogarithmicNumbers v1.4.0 [e6f89c97] LoggingExtras v1.1.0 [c2834f40] MLCore v1.0.0 ⌃ [7e8f7934] MLDataDevices v1.5.3 ⌅ [64a0f543] MLFlowClient v0.5.1 [add582a8] MLJ v0.20.7 [45f359ea] MLJBalancing v0.1.5 [a7f614a8] MLJBase v1.7.0 [c6f25543] MLJDecisionTreeInterface v0.4.2 [50ed68f4] MLJEnsembles v0.4.3 [7b7b8358] MLJFlow v0.5.0 ⌅ [094fc8d1] MLJFlux v0.5.1 [614be32b] MLJIteration v0.6.3 [6ee0df7b] MLJLinearModels v0.10.0 [e80e1ace] MLJModelInterface v1.11.0 [d491faf4] MLJModels v0.17.7 [03970b2e] MLJTuning v0.8.8 [d8e11817] MLStyle v0.4.17 [f1d291b0] MLUtils v0.4.7 [1914dd2f] MacroTools v0.5.15 [d0879d2d] MarkdownAST v0.1.2 [739be429] MbedTLS v1.1.9 [442fdcdd] Measures v0.3.2 [6fafb56a] Memoization v0.2.2 [626554b9] MetaGraphs v0.8.0 [dbeba491] Metalhead v0.9.5 [128add7d] MicroCollections v0.2.0 [e1d29d7a] Missings v1.2.0 [6f286f6a] MultivariateStats v0.10.3 [d41bc354] NLSolversBase v7.8.3 [872c559c] NNlib v0.9.27 [77ba4419] NaNMath v1.1.2 [71a1bf82] NameResolution v0.1.5 [c020b1a1] NaturalSort v1.0.0 [636a865e] NearestNeighborModels v0.2.3 [b8a86587] NearestNeighbors v0.4.21 [46757867] NetworkLayout v0.4.9 [6fe1bfb0] OffsetArrays v1.15.0 [0b1bfda6] OneHotArrays v0.2.6 [8b6db2d4] OpenML v0.3.2 [4d8831e6] OpenSSL v1.4.3 [429524aa] Optim v1.11.0 ⌅ [3bd65402] Optimisers v0.3.4 [bac558e1] OrderedCollections v1.8.0 [90014a1f] PDMats v0.11.32 [65ce6f38] PackageExtensionCompat v1.0.2 [d96e819e] Parameters v0.12.3 [69de0a69] Parsers v2.8.1 [570af359] PartialFunctions v1.2.0 [b98c9c47] Pipe v1.3.0 [ccf2f8ad] PlotThemes v3.3.0 [995b91a9] PlotUtils v1.4.3 [91a5bcdd] Plots v1.40.9 [2dfb63ee] PooledArrays v1.4.3 [85a6dd25] PositiveFactorizations v0.2.4 [aea7be01] PrecompileTools v1.2.1 [21216c6a] Preferences v1.4.3 [8162dcfd] PrettyPrint v0.2.0 [54e16d92] PrettyPrinting v0.4.2 [08abe8d2] PrettyTables v2.4.0 [33c8b6b6] ProgressLogging v0.1.4 [92933f4c] ProgressMeter v1.10.2 [43287f4e] PtrArrays v1.3.0 [1fd47b50] QuadGK v2.11.2 [c1ae055f] RealDot v0.1.0 [3cdcf5f2] RecipesBase v1.3.4 [01d81517] RecipesPipeline v0.6.12 [189a3867] Reexport v1.2.2 [42d2dcc6] Referenceables v0.1.3 [2792f1a3] RegistryInstances v0.1.0 [05181044] RelocatableFolders v1.0.1 [97f35ef4] RequiredInterfaces v0.1.7 [ae029012] Requires v1.3.0 [79098fc4] Rmath v0.8.0 [321657f4] ScientificTypes v3.1.0 [30f210dd] ScientificTypesBase v3.0.0 [6e75b9c4] ScikitLearnBase v0.5.0 [7e506255] ScopedValues v1.3.0 [6c6a2e73] Scratch v1.2.1 [91c51154] SentinelArrays v1.4.8 [efcf1570] Setfield v1.1.1 [605ecd9f] ShowCases v0.1.0 [992d4aef] Showoff v1.0.3 [777ac1f9] SimpleBufferStream v1.2.0 [699a6c99] SimpleTraits v0.9.4 [a2af1166] SortingAlgorithms v1.2.1 [dc90abb0] 
SparseInverseSubset v0.1.2 [276daf66] SpecialFunctions v2.5.0 [171d559e] SplittablesBase v0.1.15 [860ef19b] StableRNGs v1.0.2 [90137ffa] StaticArrays v1.9.12 [1e83bf80] StaticArraysCore v1.4.3 ⌅ [a19d573c] StatisticalMeasures v0.1.7 [c062fc1d] StatisticalMeasuresBase v0.1.2 [64bff920] StatisticalTraits v3.4.0 [82ae8749] StatsAPI v1.7.0 [2913bbd2] StatsBase v0.34.4 [4c63d2b9] StatsFuns v1.3.2 [892a3eda] StringManipulation v0.4.1 ⌃ [09ab397b] StructArrays v0.6.21 [3783bdb8] TableTraits v1.0.1 [bd369af6] Tables v1.12.0 [3eeacb1d] TabularDisplay v1.3.0 [10284c91] TaijaBase v1.2.3 [bd7198b4] TaijaPlotting v1.3.0 [62fd8b95] TensorCore v0.1.1 [ac1d9e8a] ThreadsX v0.1.12 [3bb67fe8] TranscodingStreams v0.11.3 [28d57a85] Transducers v0.4.84 [592b5752] Trapz v2.0.3 [bc48ee85] Tullio v0.3.8 [5c2747f8] URIs v1.5.1 [3a884ed6] UnPack v1.0.2 [1cfade01] UnicodeFun v0.4.1 [1986cc42] Unitful v1.22.0 [45397f5d] UnitfulLatexify v1.6.4 [013be700] UnsafeAtomics v0.3.0 [41fe7b60] Unzip v0.2.0 ⌅ [e88e6eb3] Zygote v0.6.75 [700de1a5] ZygoteRules v0.2.7 ⌅ [68821587] Arpack_jll v3.5.1+1 [6e34b625] Bzip2_jll v1.0.9+0 [83423d85] Cairo_jll v1.18.2+1 [ee1fde0b] Dbus_jll v1.14.10+0 [5ae413db] EarCut_jll v2.2.4+0 [2702e6a9] EpollShim_jll v0.0.20230411+1 [2e619515] Expat_jll v2.6.5+0 ⌅ [b22a6f82] FFMPEG_jll v4.4.4+1 [a3f928ae] Fontconfig_jll v2.15.0+0 [d7e528f0] FreeType2_jll v2.13.3+1 [559328eb] FriBidi_jll v1.0.16+0 [0656b61e] GLFW_jll v3.4.0+2 [d2c73de3] GR_jll v0.73.13+0 [78b55507] Gettext_jll v0.21.0+0 [f8c6e375] Git_jll v2.47.1+0 [7746bdde] Glib_jll v2.82.4+0 [3b182d85] Graphite2_jll v1.3.14+1 [2e76f6c2] HarfBuzz_jll v8.5.0+0 [aacddb02] JpegTurbo_jll v3.1.1+0 [c1c5ebd0] LAME_jll v3.100.2+0 [88015f11] LERC_jll v4.0.1+0 [dad2f222] LLVMExtra_jll v0.0.35+0 [1d63c593] LLVMOpenMP_jll v18.1.7+0 [dd4b983a] LZO_jll v2.10.3+0 ⌅ [e9f186c6] Libffi_jll v3.2.2+2 [d4300ac3] Libgcrypt_jll v1.11.0+0 [7e76a0d4] Libglvnd_jll v1.7.0+0 [7add5ba3] Libgpg_error_jll v1.51.1+0 [94ce4f54] Libiconv_jll v1.18.0+0 [4b2f31a3] Libmount_jll v2.40.3+0 [89763e89] Libtiff_jll v4.7.1+0 [38a345b3] Libuuid_jll v2.40.3+0 ⌅ [0e4427ef] LightGBM_jll v3.3.5+1 [e7412a2a] Ogg_jll v1.3.5+1 [458c3c95] OpenSSL_jll v3.0.16+0 [efe28fd5] OpenSpecFun_jll v0.5.6+0 [91d4177d] Opus_jll v1.3.3+0 [36c8627f] Pango_jll v1.56.1+0 ⌅ [30392449] Pixman_jll v0.43.4+0 ⌅ [c0090381] Qt6Base_jll v6.7.1+1 ⌅ [629bc702] Qt6Declarative_jll v6.7.1+2 ⌅ [ce943373] Qt6ShaderTools_jll v6.7.1+1 ⌃ [e99dba38] Qt6Wayland_jll v6.7.1+1 [f50d1b31] Rmath_jll v0.5.1+0 [a44049a8] Vulkan_Loader_jll v1.3.243+0 [a2964d1f] Wayland_jll v1.21.0+2 [2381bf8a] Wayland_protocols_jll v1.36.0+0 [02c8fc9c] XML2_jll v2.13.6+1 [aed1982a] XSLT_jll v1.1.42+0 [ffd25f8a] XZ_jll v5.6.4+1 [f67eecfb] Xorg_libICE_jll v1.1.1+0 [c834827a] Xorg_libSM_jll v1.2.4+0 [4f6342f7] Xorg_libX11_jll v1.8.6+3 [0c0b7dd1] Xorg_libXau_jll v1.0.12+0 [935fb764] Xorg_libXcursor_jll v1.2.3+0 [a3789734] Xorg_libXdmcp_jll v1.1.5+0 [1082639a] Xorg_libXext_jll v1.3.6+3 [d091e8ba] Xorg_libXfixes_jll v6.0.0+0 [a51aa0fd] Xorg_libXi_jll v1.8.2+0 [d1454406] Xorg_libXinerama_jll v1.1.5+0 [ec84b674] Xorg_libXrandr_jll v1.5.4+0 [ea2f1a96] Xorg_libXrender_jll v0.9.11+1 [14d82f49] Xorg_libpthread_stubs_jll v0.1.2+0 [c7cfdc94] Xorg_libxcb_jll v1.17.0+3 [cc61e674] Xorg_libxkbfile_jll v1.1.2+1 [e920d4aa] Xorg_xcb_util_cursor_jll v0.1.4+0 [12413925] Xorg_xcb_util_image_jll v0.4.0+1 [2def613f] Xorg_xcb_util_jll v0.4.0+1 [975044d2] Xorg_xcb_util_keysyms_jll v0.4.0+1 [0d47668e] Xorg_xcb_util_renderutil_jll v0.3.9+1 [c22f9ab0] Xorg_xcb_util_wm_jll v0.4.1+1 [35661453] 
Xorg_xkbcomp_jll v1.4.6+1 [33bec58e] Xorg_xkeyboard_config_jll v2.39.0+0 [c5fb5394] Xorg_xtrans_jll v1.5.1+0 [3161d3a3] Zstd_jll v1.5.7+1 [35ca27e7] eudev_jll v3.2.9+0 [214eeab7] fzf_jll v0.56.3+0 [1a1c6b14] gperf_jll v3.1.1+1 [a4ae2306] libaom_jll v3.11.0+0 [0ac62f75] libass_jll v0.15.2+0 [1183f4f0] libdecor_jll v0.2.2+0 [2db6ffa8] libevdev_jll v1.11.0+0 [f638f0a6] libfdk_aac_jll v2.0.3+0 [36db933b] libinput_jll v1.18.0+0 [b53b4c65] libpng_jll v1.6.46+0 [f27f6e37] libvorbis_jll v1.3.7+2 [009596ad] mtdev_jll v1.1.6+0 ⌅ [1270edf5] x264_jll v2021.5.5+0 ⌅ [dfaa095f] x265_jll v3.5.0+0 [d8fb68d0] xkbcommon_jll v1.4.1+2 [0dad84c5] ArgTools v1.1.1 [56f22d72] Artifacts [2a0f44e3] Base64 [ade2ca70] Dates [8ba89e20] Distributed [f43a241f] Downloads v1.6.0 [7b1f6079] FileWatching [9fa8497b] Future [b77e0a4c] InteractiveUtils [4af54fe1] LazyArtifacts [b27032c2] LibCURL v0.6.4 [76f85450] LibGit2 [8f399da3] Libdl [37e2e46d] LinearAlgebra [56ddb016] Logging [d6f4376e] Markdown [a63ad114] Mmap [ca575930] NetworkOptions v1.2.0 [44cfe95a] Pkg v1.10.0 [de0858da] Printf [3fa0cd96] REPL [9a3f8284] Random [ea8e919c] SHA v0.7.0 [9e88b42a] Serialization [1a1011a3] SharedArrays [6462fe0b] Sockets [2f01184e] SparseArrays v1.10.0 [10745b16] Statistics v1.10.0 [4607b0f0] SuiteSparse [fa267f1f] TOML v1.0.3 [a4e569a6] Tar v1.10.0 [8dfed614] Test [cf7118a7] UUIDs [4ec0a83e] Unicode [e66e0078] CompilerSupportLibraries_jll v1.1.1+0 [deac9b47] LibCURL_jll v8.4.0+0 [e37daf67] LibGit2_jll v1.6.4+0 [29816b5a] LibSSH2_jll v1.11.0+1 [c8ffd9c3] MbedTLS_jll v2.28.2+1 [14a3606d] MozillaCACerts_jll v2023.1.10 [4536629a] OpenBLAS_jll v0.3.23+4 [05823500] OpenLibm_jll v0.8.1+4 [efcefdf7] PCRE2_jll v10.42.0+1 [bea87d4a] SuiteSparse_jll v7.2.1+1 [83775a58] Zlib_jll v1.2.13+1 [8e850b90] libblastrampoline_jll v5.11.0+0 [8e850ede] nghttp2_jll v1.52.0+1 [3f19e933] p7zip_jll v17.4.0+2 Info Packages marked with ⌃ and ⌅ have new versions available. Those with ⌃ may be upgradable, but those with ⌅ are restricted by compatibility constraints from upgrading. Testing Running tests... WARNING: Method definition _dummy_backedge() in module Memoization at /home/pkgeval/.julia/packages/Memoization/ON3Za/src/Memoization.jl:49 overwritten at /home/pkgeval/.julia/packages/Memoization/ON3Za/src/Memoization.jl:65. ┌ Warning: Unable to determine HTML(edit_link = ...) from remote HEAD branch, defaulting to "master". │ Calling `git remote` failed with an exception. Set JULIA_DEBUG=Documenter to see the error. │ Unless this is due to a configuration error, the relevant variable should be set explicitly. └ @ Documenter ~/.julia/packages/Documenter/iwb7N/src/utilities/utilities.jl:651 [ Info: SetupBuildDirectory: setting up build directory. [ Info: Doctest: running doctests. [ Info: Skipped ExpandTemplates step (doctest only). [ Info: Skipped CrossReferences step (doctest only). [ Info: Skipped CheckDocument step (doctest only). [ Info: Skipped Populate step (doctest only). [ Info: Skipped RenderDocument step (doctest only). Test Summary: | Pass Total Time Doctests: ConformalPrediction | 1 1 25.7s WARNING: using deprecated binding ColorTypes.RGB1 in Colors. , use XRGB instead. WARNING: using deprecated binding Colors.RGB1 in PlotUtils. , use ColorTypes.XRGB{T} where T<:Union{AbstractFloat, FixedPointNumbers.FixedPoint{T, f} where f where T<:Integer} instead. WARNING: PlotUtils.RGB1 is deprecated, use ColorTypes.XRGB{T} where T<:Union{AbstractFloat, FixedPointNumbers.FixedPoint{T, f} where f where T<:Integer} instead. 
likely near /home/pkgeval/.julia/packages/ConformalPrediction/bz1ka/test/aqua.jl:3 WARNING: using deprecated binding ColorTypes.RGB4 in Colors. , use RGBX instead. WARNING: using deprecated binding Colors.RGB4 in PlotUtils. , use ColorTypes.RGBX{T} where T<:Union{AbstractFloat, FixedPointNumbers.FixedPoint{T, f} where f where T<:Integer} instead. WARNING: PlotUtils.RGB4 is deprecated, use ColorTypes.RGBX{T} where T<:Union{AbstractFloat, FixedPointNumbers.FixedPoint{T, f} where f where T<:Integer} instead. likely near /home/pkgeval/.julia/packages/ConformalPrediction/bz1ka/test/aqua.jl:3 [ Info: For silent loading, specify `verbosity=0`. import NearestNeighborModels ✔ [ Info: Training machine(AdaptiveInductiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(AdaptiveInductiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(AdaptiveInductiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(NaiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(NaiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(NaiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = KNNClassifier(K = 5, …), …), …). [ Info: For silent loading, specify `verbosity=0`. import EvoTrees ✔ ┌ Info: Training machine(AdaptiveInductiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123) └ , …), …). ┌ Info: Training machine(AdaptiveInductiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 8016, 7014, 286)) └ , …), …). ┌ Info: Training machine(AdaptiveInductiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 15030, 14028, 172)) └ , …), …). ┌ Info: Training machine(NaiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 22044, 21042, 158)) └ , …), …). ┌ Info: Training machine(NaiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 36072, 35070, 30)) └ , …), …). 
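The classification tests above wrap MLJ classifiers (here KNNClassifier, then EvoTrees, RandomForest, Logistic and DecisionTree classifiers) in the package's conformal classifiers (SimpleInductiveClassifier, NaiveClassifier, AdaptiveInductiveClassifier). A minimal sketch of that pattern, assuming the `conformal_model` constructor from the package README; the `method` and `coverage` keyword names and symbols are assumptions and may differ from the actual API:

using MLJ
using ConformalPrediction

# Synthetic classification data (MLJ utility).
X, y = make_blobs(500, 2; centers = 3)

# Load the KNN classifier seen in the log (the log's "For silent loading, specify `verbosity=0`" hint).
KNNClassifier = @load KNNClassifier pkg=NearestNeighborModels verbosity=0
model = KNNClassifier(K = 5)

# Wrap it in a split-conformal classifier; keyword names assumed.
conf_model = conformal_model(model; method = :simple_inductive, coverage = 0.95)

# Fit and predict via an MLJ machine, as the test suite does.
mach = machine(conf_model, X, y)
fit!(mach)
ŷ = predict(mach, X)   # set-valued / probabilistic predictions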
┌ Info: Training machine(NaiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 49098, 48096, 504)) └ , …), …). ┌ Info: Training machine(SimpleInductiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 63126, 62124, 76)) └ , …), …). ┌ Info: Training machine(SimpleInductiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 160320, 159318, 779)) └ , …), …). ┌ Info: Training machine(SimpleInductiveClassifier(model = EvoTrees.EvoTreeClassifier{EvoTrees.MLogLoss} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 167334, 166332, 665)) └ , …), …). [ Info: For silent loading, specify `verbosity=0`. import MLJDecisionTreeInterface ✔ [ Info: Training machine(AdaptiveInductiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(AdaptiveInductiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(AdaptiveInductiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(NaiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(NaiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(NaiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = RandomForestClassifier(max_depth = -1, …), …), …). [ Info: For silent loading, specify `verbosity=0`. import MLJLinearModels ✔ [ Info: Training machine(AdaptiveInductiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(AdaptiveInductiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(AdaptiveInductiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). 
┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(NaiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(NaiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(NaiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(SimpleInductiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(SimpleInductiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(SimpleInductiveClassifier(model = LogisticClassifier(lambda = 2.220446049250313e-16, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: For silent loading, specify `verbosity=0`. import MLJDecisionTreeInterface ✔ [ Info: Training machine(AdaptiveInductiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(AdaptiveInductiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(AdaptiveInductiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(NaiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(NaiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(NaiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveClassifier(model = DecisionTreeClassifier(max_depth = -1, …), …), …). [ Info: For silent loading, specify `verbosity=0`. import MLJLinearModels ✔ ┌ Warning: This test is skipped as the method is not suitable for Quantile Regression └ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80 [ Info: Training machine(CVPlusRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVPlusRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). 
[ Info: Training machine(NaiveRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.Analytical │ iterative: Bool false └ max_inner: Int64 200 [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(NaiveRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.Analytical │ iterative: Bool false └ max_inner: Int64 200 [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(JackknifePlusAbRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(JackknifePlusRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(CVMinMaxRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVMinMaxRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(JackknifeRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.Analytical │ iterative: Bool false └ max_inner: Int64 200 [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.Analytical │ iterative: Bool false └ max_inner: Int64 200 [ Info: Training machine(JackknifeMinMaxRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeMinMaxRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = RidgeRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(SimpleInductiveRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.Analytical │ iterative: Bool false └ max_inner: Int64 200 [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(SimpleInductiveRegressor(model = RidgeRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.Analytical │ iterative: Bool false └ max_inner: Int64 200 [ Info: For silent loading, specify `verbosity=0`. 
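The regression runs above repeat the same pattern with RidgeRegressor under the naive, jackknife(+/min-max), CV(+/min-max), time-series-ensemble and split-conformal wrappers. A minimal sketch under the same assumptions about `conformal_model` (the `:jackknife_plus` method symbol is an assumption):

using MLJ
using ConformalPrediction

# Synthetic regression data (MLJ utility).
X, y = make_regression(500, 2)

# Ridge regressor as used in the log, wrapped in a jackknife+ conformal regressor.
RidgeRegressor = @load RidgeRegressor pkg=MLJLinearModels verbosity=0
conf_model = conformal_model(RidgeRegressor(lambda = 1.0); method = :jackknife_plus)  # method symbol assumed

mach = machine(conf_model, X, y)
fit!(mach)
predict(mach, X)   # interval-valued predictions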
import MLJLinearModels ✔ ┌ Warning: This test is skipped as the method is not suitable for Quantile Regression └ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80 [ Info: Training machine(CVPlusRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVPlusRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(NaiveRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.ProxGrad │ accel: Bool true │ max_iter: Int64 1000 │ tol: Float64 0.0001 │ max_inner: Int64 100 │ beta: Float64 0.8 └ gram: Bool false [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(NaiveRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.ProxGrad │ accel: Bool true │ max_iter: Int64 1000 │ tol: Float64 0.0001 │ max_inner: Int64 100 │ beta: Float64 0.8 └ gram: Bool false [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(JackknifePlusAbRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(JackknifePlusRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(CVMinMaxRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVMinMaxRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(JackknifeRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.ProxGrad │ accel: Bool true │ max_iter: Int64 1000 │ tol: Float64 0.0001 │ max_inner: Int64 100 │ beta: Float64 0.8 └ gram: Bool false [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.ProxGrad │ accel: Bool true │ max_iter: Int64 1000 │ tol: Float64 0.0001 │ max_inner: Int64 100 │ beta: Float64 0.8 └ gram: Bool false [ Info: Training machine(JackknifeMinMaxRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeMinMaxRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = LassoRegressor(lambda = 1.0, …), …), …). 
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = LassoRegressor(lambda = 1.0, …), …), …). [ Info: Training machine(SimpleInductiveRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.ProxGrad │ accel: Bool true │ max_iter: Int64 1000 │ tol: Float64 0.0001 │ max_inner: Int64 100 │ beta: Float64 0.8 └ gram: Bool false [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(SimpleInductiveRegressor(model = LassoRegressor(lambda = 1.0, …), …), …). ┌ Info: Solver: MLJLinearModels.ProxGrad │ accel: Bool true │ max_iter: Int64 1000 │ tol: Float64 0.0001 │ max_inner: Int64 100 │ beta: Float64 0.8 └ gram: Bool false [ Info: For silent loading, specify `verbosity=0`. import EvoTrees ✔ ┌ Warning: This test is skipped as the method is not suitable for Quantile Regression └ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80 ┌ Info: Training machine(CVPlusRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(CVPlusRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 43086, 42084, 916)) └ , …), …). ┌ Info: Training machine(NaiveRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 84168, 83166, 435)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(NaiveRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 91182, 90180, 721)) └ , …), …). ┌ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 98196, 97194, 607)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. 
┌ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 233466, 232464, 341)) └ , …), …). ┌ Info: Training machine(JackknifePlusAbRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 356712, 355710, 101)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(JackknifePlusAbRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 490980, 489978, 836)) └ , …), …). ┌ Info: Training machine(JackknifePlusRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 614226, 613224, 593)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(JackknifePlusRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 3973932, 3972930, 532)) └ , …), …). ┌ Info: Training machine(CVMinMaxRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 7173318, 7172316, 789)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(CVMinMaxRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 7216404, 7215402, 705)) └ , …), …). 
┌ Info: Training machine(JackknifeRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 7257486, 7256484, 224)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(JackknifeRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 10624206, 10623204, 452)) └ , …), …). ┌ Info: Training machine(JackknifeMinMaxRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 13830606, 13829604, 598)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(JackknifeMinMaxRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 17190312, 17189310, 540)) └ , …), …). ┌ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 20389698, 20388696, 798)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. ┌ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 20551020, 20550018, 480)) └ , …), …). ┌ Info: Training machine(SimpleInductiveRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 20692302, 20691300, 198)) └ , …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. 
┌ Info: Training machine(SimpleInductiveRegressor(model = EvoTrees.EvoTreeRegressor{EvoTrees.MSE} │ - nrounds: 100 │ - L2: 0.0 │ - lambda: 0.0 │ - gamma: 0.0 │ - eta: 0.1 │ - max_depth: 6 │ - min_weight: 1.0 │ - rowsample: 1.0 │ - colsample: 1.0 │ - nbins: 64 │ - alpha: 0.5 │ - monotone_constraints: Dict{Int64, Int64}() │ - tree_type: binary │ - rng: Random.MersenneTwister(123, (0, 20745408, 20744406, 889)) └ , …), …). [ Info: For silent loading, specify `verbosity=0`. import NearestNeighborModels ✔ ┌ Warning: This test is skipped as the method is not suitable for Quantile Regression └ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80 [ Info: Training machine(CVPlusRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVPlusRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(NaiveRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(NaiveRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(JackknifePlusAbRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(JackknifePlusRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(CVMinMaxRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVMinMaxRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(JackknifeRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(JackknifeMinMaxRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeMinMaxRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = KNNRegressor(K = 5, …), …), …). [ Info: Training machine(SimpleInductiveRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. 
[ Info: Training machine(SimpleInductiveRegressor(model = KNNRegressor(K = 5, …), …), …). [ Info: For silent loading, specify `verbosity=0`. import MLJDecisionTreeInterface ✔ ┌ Warning: This test is skipped as the method is not suitable for Quantile Regression └ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80 [ Info: Training machine(CVPlusRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVPlusRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(NaiveRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(NaiveRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(JackknifePlusAbRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(JackknifePlusRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(CVMinMaxRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVMinMaxRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(JackknifeRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(JackknifeMinMaxRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifeMinMaxRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Training machine(SimpleInductiveRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. 
[ Info: Training machine(SimpleInductiveRegressor(model = DecisionTreeRegressor(max_depth = -1, …), …), …). [ Info: For silent loading, specify `verbosity=0`. import MLJLinearModels ✔ [ Info: Training machine(ConformalQuantileRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(ConformalQuantileRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(CVPlusRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVPlusRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Training machine(NaiveRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(NaiveRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). ┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}} │ optim_options: Optim.Options{Float64, Nothing} └ lbfgs_options: @NamedTuple{} NamedTuple() [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Training machine(JackknifePlusAbRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusAbRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Training machine(JackknifePlusRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(JackknifePlusRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Training machine(CVMinMaxRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). [ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable. [ Info: Training machine(CVMinMaxRegressor(model = QuantileRegressor(delta = 0.5, …), …), …). 
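The ConformalQuantileRegressor runs in this block wrap MLJLinearModels' QuantileRegressor for conformalized quantile regression. A heavily hedged sketch; whether `conformal_model` accepts a `:quantile` method symbol for this wrapper is an assumption inferred from the type name printed above:

using MLJ
using ConformalPrediction

X, y = make_regression(500, 2)

# Quantile regressor as printed in the log (delta = 0.5).
QuantileRegressor = @load QuantileRegressor pkg=MLJLinearModels verbosity=0
qr = QuantileRegressor(delta = 0.5)

# Conformalized quantile regression; the method symbol is an assumption.
conf_model = conformal_model(qr; method = :quantile)

mach = machine(conf_model, X, y)
fit!(mach)
predict(mach, X)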
[ Info: Training machine(JackknifeRegressor(model = QuantileRegressor(delta = 0.5, …), …), …).
┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}}
│ optim_options: Optim.Options{Float64, Nothing}
└ lbfgs_options: @NamedTuple{} NamedTuple()
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifeRegressor(model = QuantileRegressor(delta = 0.5, …), …), …).
┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}}
│ optim_options: Optim.Options{Float64, Nothing}
└ lbfgs_options: @NamedTuple{} NamedTuple()
[ Info: Training machine(JackknifeMinMaxRegressor(model = QuantileRegressor(delta = 0.5, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifeMinMaxRegressor(model = QuantileRegressor(delta = 0.5, …), …), …).
[ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = QuantileRegressor(delta = 0.5, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = QuantileRegressor(delta = 0.5, …), …), …).
[ Info: Training machine(SimpleInductiveRegressor(model = QuantileRegressor(delta = 0.5, …), …), …).
┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}}
│ optim_options: Optim.Options{Float64, Nothing}
└ lbfgs_options: @NamedTuple{} NamedTuple()
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(SimpleInductiveRegressor(model = QuantileRegressor(delta = 0.5, …), …), …).
┌ Info: Solver: MLJLinearModels.LBFGS{Optim.Options{Float64, Nothing}, @NamedTuple{}}
│ optim_options: Optim.Options{Float64, Nothing}
└ lbfgs_options: @NamedTuple{} NamedTuple()
[ Info: For silent loading, specify `verbosity=0`.
import MLJDecisionTreeInterface ✔
┌ Warning: This test is skipped as the method is not suitable for Quantile Regression
└ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80
[ Info: Training machine(CVPlusRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(CVPlusRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(NaiveRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(NaiveRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(JackknifePlusAbRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifePlusAbRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(JackknifePlusRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifePlusRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(CVMinMaxRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(CVMinMaxRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(JackknifeRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifeRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(JackknifeMinMaxRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifeMinMaxRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Training machine(SimpleInductiveRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(SimpleInductiveRegressor(model = RandomForestRegressor(max_depth = -1, …), …), …).
[ Info: For silent loading, specify `verbosity=0`.
import MLJLinearModels ✔
┌ Warning: This test is skipped as the method is not suitable for Quantile Regression
└ @ Main ~/.julia/packages/ConformalPrediction/bz1ka/test/regression.jl:80
[ Info: Training machine(CVPlusRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(CVPlusRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Training machine(NaiveRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
┌ Info: Solver: MLJLinearModels.Analytical
│ iterative: Bool false
└ max_inner: Int64 200
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(NaiveRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
┌ Info: Solver: MLJLinearModels.Analytical
│ iterative: Bool false
└ max_inner: Int64 200
[ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifePlusAbMinMaxRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
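[Editor's note] The same pattern repeats for every base learner in the log (KNN, decision tree, random forest, quantile and linear regressors), which is the point of the test matrix: the conformal wrappers are model-agnostic. The sketch below iterates over a couple of base models and method keys; the exact `Symbol` keys accepted by `conformal_model` are assumptions inferred from the regressor names above and should be checked against the package documentation.

    using MLJ
    using ConformalPrediction

    X, y = make_regression(100, 2)  # illustrative data, not the test suite's

    RandomForestRegressor = @load RandomForestRegressor pkg=DecisionTree verbosity=0
    LinearRegressor = @load LinearRegressor pkg=MLJLinearModels verbosity=0

    # Method keys assumed to mirror the regressor names in the log
    # (NaiveRegressor, JackknifePlusRegressor, CVPlusRegressor); verify against the docs.
    for base in (RandomForestRegressor(), LinearRegressor()),
        method in (:naive, :jackknife_plus, :cv_plus)

        conf_model = conformal_model(base; method=method)
        mach = machine(conf_model, X, y)
        fit!(mach, verbosity=0)
    end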
[ Info: Training machine(JackknifePlusAbRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifePlusAbRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Training machine(JackknifePlusRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifePlusRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Training machine(CVMinMaxRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(CVMinMaxRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Training machine(JackknifeRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
┌ Info: Solver: MLJLinearModels.Analytical
│ iterative: Bool false
└ max_inner: Int64 200
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifeRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
┌ Info: Solver: MLJLinearModels.Analytical
│ iterative: Bool false
└ max_inner: Int64 200
[ Info: Training machine(JackknifeMinMaxRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(JackknifeMinMaxRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(TimeSeriesRegressorEnsembleBatch(model = LinearRegressor(fit_intercept = true, …), …), …).
[ Info: Training machine(SimpleInductiveRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
┌ Info: Solver: MLJLinearModels.Analytical
│ iterative: Bool false
└ max_inner: Int64 200
[ Info: Multivariate input for regression with no input variable (`input_var`) specified: defaulting to first variable.
[ Info: Training machine(SimpleInductiveRegressor(model = LinearRegressor(fit_intercept = true, …), …), …).
┌ Info: Solver: MLJLinearModels.Analytical
│ iterative: Bool false
└ max_inner: Int64 200
[ Info: For silent loading, specify `verbosity=0`.
import MLJDecisionTreeInterface ✔
Test Summary:          | Pass  Total      Time
ConformalPrediction.jl |  504    504  21m49.4s
Testing ConformalPrediction tests passed
Testing completed after 1408.55s
PkgEval succeeded after 2011.74s
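[Editor's note] To reproduce a run like this locally, the package's own test suite can be executed through Pkg. The snippet below is a minimal sketch of what PkgEval effectively does (install, then test); it is not a transcript of the PkgEval harness itself.

    using Pkg

    # Install the package and run its registered test suite.
    Pkg.add("ConformalPrediction")
    Pkg.test("ConformalPrediction")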