Fit multiple models with differing numbers of regimes to trend data

find_regimes(
  y,
  sds = NULL,
  min_regimes = 1,
  max_regimes = 3,
  iter = 2000,
  thin = 1,
  chains = 1,
  ...
)

Arguments

y

Data: a time series, or an estimated trend from a fitted DFA model.

sds

Optional time series of standard deviations of the estimates. If supplied, the residual variance is not estimated.

min_regimes

Smallest number of regimes to evaluate; defaults to 1.

max_regimes

Largest number of regimes to evaluate; defaults to 3.

iter

Number of MCMC iterations; defaults to 2000.

thin

MCMC thinning rate; defaults to 1.

chains

Number of MCMC chains; defaults to 1. Note that running multiple chains may result in a "label switching" problem, where the same regimes are identified with different IDs across chains.

...

Other parameters to pass to rstan::sampling().
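Because `...` is forwarded to rstan::sampling(), sampler settings can be tuned directly in the call. A minimal sketch (the control values and seed below are illustrative, not recommendations):

```r
# Hypothetical call: sampler settings passed through `...` are
# forwarded to rstan::sampling(). Values shown are illustrative only.
data(Nile)
fit <- find_regimes(
  y = log(Nile),
  max_regimes = 2,
  iter = 2000,
  chains = 1,
  control = list(adapt_delta = 0.99, max_treedepth = 12),
  seed = 123
)
```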

Examples

data(Nile)
find_regimes(log(Nile), iter = 50, chains = 1, max_regimes = 2)
#> 
#> SAMPLING FOR MODEL 'regime_1' NOW (CHAIN 1).
#> Chain 1: 
#> Chain 1: Gradient evaluation took 1.6e-05 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.16 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1: 
#> Chain 1: 
#> Chain 1: WARNING: There aren't enough warmup iterations to fit the
#> Chain 1:          three stages of adaptation as currently configured.
#> Chain 1:          Reducing each adaptation stage to 15%/75%/10% of
#> Chain 1:          the given number of warmup iterations:
#> Chain 1:            init_buffer = 3
#> Chain 1:            adapt_window = 20
#> Chain 1:            term_buffer = 2
#> Chain 1: 
#> Chain 1: Iteration:  1 / 50 [  2%]  (Warmup)
#> Chain 1: Iteration:  5 / 50 [ 10%]  (Warmup)
#> Chain 1: Iteration: 10 / 50 [ 20%]  (Warmup)
#> Chain 1: Iteration: 15 / 50 [ 30%]  (Warmup)
#> Chain 1: Iteration: 20 / 50 [ 40%]  (Warmup)
#> Chain 1: Iteration: 25 / 50 [ 50%]  (Warmup)
#> Chain 1: Iteration: 26 / 50 [ 52%]  (Sampling)
#> Chain 1: Iteration: 30 / 50 [ 60%]  (Sampling)
#> Chain 1: Iteration: 35 / 50 [ 70%]  (Sampling)
#> Chain 1: Iteration: 40 / 50 [ 80%]  (Sampling)
#> Chain 1: Iteration: 45 / 50 [ 90%]  (Sampling)
#> Chain 1: Iteration: 50 / 50 [100%]  (Sampling)
#> Chain 1: 
#> Chain 1:  Elapsed Time: 0.000986 seconds (Warm-up)
#> Chain 1:                0.00081 seconds (Sampling)
#> Chain 1:                0.001796 seconds (Total)
#> Chain 1: 
#> Warning: The largest R-hat is 1.43, indicating chains have not mixed.
#> Running the chains for more iterations may help. See
#> http://mc-stan.org/misc/warnings.html#r-hat
#> Warning: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable.
#> Running the chains for more iterations may help. See
#> http://mc-stan.org/misc/warnings.html#bulk-ess
#> Warning: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable.
#> Running the chains for more iterations may help. See
#> http://mc-stan.org/misc/warnings.html#tail-ess
#> Warning: Some Pareto k diagnostic values are too high. See help('pareto-k-diagnostic') for details.
#> Warning: Some Pareto k diagnostic values are too high. See help('pareto-k-diagnostic') for details.
#> 
#> SAMPLING FOR MODEL 'hmm_gaussian' NOW (CHAIN 1).
#> Chain 1: 
#> Chain 1: Gradient evaluation took 0.00011 seconds
#> Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 1.1 seconds.
#> Chain 1: Adjust your expectations accordingly!
#> Chain 1: 
#> Chain 1: 
#> Chain 1: WARNING: There aren't enough warmup iterations to fit the
#> Chain 1:          three stages of adaptation as currently configured.
#> Chain 1:          Reducing each adaptation stage to 15%/75%/10% of
#> Chain 1:          the given number of warmup iterations:
#> Chain 1:            init_buffer = 3
#> Chain 1:            adapt_window = 20
#> Chain 1:            term_buffer = 2
#> Chain 1: 
#> Chain 1: Iteration:  1 / 50 [  2%]  (Warmup)
#> Chain 1: Iteration:  5 / 50 [ 10%]  (Warmup)
#> Chain 1: Iteration: 10 / 50 [ 20%]  (Warmup)
#> Chain 1: Iteration: 15 / 50 [ 30%]  (Warmup)
#> Chain 1: Iteration: 20 / 50 [ 40%]  (Warmup)
#> Chain 1: Iteration: 25 / 50 [ 50%]  (Warmup)
#> Chain 1: Iteration: 26 / 50 [ 52%]  (Sampling)
#> Chain 1: Iteration: 30 / 50 [ 60%]  (Sampling)
#> Chain 1: Iteration: 35 / 50 [ 70%]  (Sampling)
#> Chain 1: Iteration: 40 / 50 [ 80%]  (Sampling)
#> Chain 1: Iteration: 45 / 50 [ 90%]  (Sampling)
#> Chain 1: Iteration: 50 / 50 [100%]  (Sampling)
#> Chain 1: 
#> Chain 1:  Elapsed Time: 0.291106 seconds (Warm-up)
#> Chain 1:                0.289729 seconds (Sampling)
#> Chain 1:                0.580835 seconds (Total)
#> Chain 1: 
#> Warning: The largest R-hat is NA, indicating chains have not mixed.
#> Running the chains for more iterations may help. See
#> http://mc-stan.org/misc/warnings.html#r-hat
#> Warning: Bulk Effective Samples Size (ESS) is too low, indicating posterior means and medians may be unreliable.
#> Running the chains for more iterations may help. See
#> http://mc-stan.org/misc/warnings.html#bulk-ess
#> Warning: Tail Effective Samples Size (ESS) is too low, indicating posterior variances and tail quantiles may be unreliable.
#> Running the chains for more iterations may help. See
#> http://mc-stan.org/misc/warnings.html#tail-ess
#> Warning: Some Pareto k diagnostic values are too high. See help('pareto-k-diagnostic') for details.
#> Warning: Some Pareto k diagnostic values are too high. See help('pareto-k-diagnostic') for details.
#> $table
#>   regimes     looic
#> 1       1 -49.71780
#> 2       2  19.44365
#> 
#> $best_model
#> $best_model$model
#> Inference for Stan model: regime_1.
#> 1 chains, each with iter=50; warmup=25; thin=1; 
#> post-warmup draws per chain=25, total post-warmup draws=25.
#> 
#>               mean se_mean   sd   2.5%    25%    50%    75%  97.5% n_eff Rhat
#> mu_k          6.81    0.00 0.01   6.79   6.80   6.81   6.82   6.84    25 0.97
#> sigma_k       0.19    0.01 0.01   0.17   0.18   0.19   0.20   0.21     7 1.16
#> sigmas[1]     0.19    0.01 0.01   0.17   0.18   0.19   0.20   0.21     7 1.16
#>  [ sigmas[2] through sigmas[100] repeat the sigmas[1] row above ]
#> log_lik[1]    0.10    0.02 0.10  -0.07   0.06   0.11   0.16   0.30    18 1.01
#> log_lik[2]   -0.13    0.04 0.12  -0.37  -0.17  -0.10  -0.06   0.09    12 1.07
#>  [ log_lik[3] through log_lik[100] truncated for brevity ]
#> lp__        114.02    0.15 0.74 112.57 113.57 114.22 114.57 114.94    25 1.09
#> 
#> Samples were drawn using NUTS(diag_e) at Tue Sep 28 10:44:09 2021.
#> For each parameter, n_eff is a crude measure of effective sample size,
#> and Rhat is the potential scale reduction factor on split chains (at 
#> convergence, Rhat=1).
#> 
#> $best_model$y
#> Time Series:
#> Start = 1871 
#> End = 1970 
#> Frequency = 1 
#>   [1] 7.021084 7.056175 6.870053 7.098376 7.056175 7.056175 6.700731 7.114769
#>   [9] 7.222566 7.038784 6.902743 6.840547 7.012115 6.901737 6.927558 6.866933
#>  [17] 7.073270 6.683361 6.864848 7.038784 7.003065 7.098376 7.047517 7.130899
#>  [25] 7.138867 7.106606 6.937314 7.003065 6.651572 6.733402 6.773080 6.542472
#>  [33] 6.845880 6.725034 6.552508 6.820016 6.539586 6.927558 6.956545 6.876265
#>  [41] 6.722630 6.587550 6.122493 6.714171 6.553933 7.021084 7.003065 6.723832
#>  [49] 6.638568 6.710523 6.643790 6.739337 6.761573 6.759255 6.548219 6.739337
#>  [57] 6.612041 6.679599 6.946976 6.632002 6.660575 6.762730 6.739337 6.850126
#>  [65] 6.891626 6.799056 6.711740 6.917706 6.647688 6.516193 6.475433 6.740519
#>  [73] 6.699500 6.609349 6.685861 6.946976 6.756932 6.773080 6.742881 6.791221
#>  [81] 6.612041 6.618739 6.731018 6.956545 6.822197 6.893656 6.680855 6.827629
#>  [89] 6.882437 6.703188 6.927558 6.809039 6.803505 7.064759 6.815640 6.614726
#>  [97] 6.823286 6.576470 6.570883 6.606650
#> 
#> $best_model$looic
#> [1] -49.7178
#> 
#> 
#> $n_loo_bad
#> [1] 4
#> 
#> $n_loo_very_bad
#> [1] 3
#> 
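The returned list can be inspected directly. Based on the output above, a sketch of pulling out the model-comparison table and the best-supported model (assigning the result to a variable, which the example above does not do):

```r
# Assumes `m` holds the result of the find_regimes() call above.
m <- find_regimes(log(Nile), iter = 50, chains = 1, max_regimes = 2)

m$table               # LOOIC by number of regimes; lower is better
best <- m$best_model  # list with $model (a stanfit object), $y, $looic
best$looic            # LOOIC of the best-supported model

# The Pareto-k counts flag observations where the LOO approximation
# may be unreliable ($n_loo_bad, $n_loo_very_bad in the output above).
m$n_loo_bad
m$n_loo_very_bad
```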