Sunday, October 19, 2014

Tuning LaplacesDemon II

I am continuing my effort to try all the algorithms in LaplacesDemon. It is quite a bit more work than I expected, but some things are becoming clearer. Now that I am close to the end of this second batch, I have learned that there are loads of adaptive algorithms. The point of those adaptations is not so much to obtain the correct posterior distribution, but rather to gather enough information to set up the other, non-adaptive algorithms which can get the desired posterior. For example, in this post DRAM is the adaptive version of DRM; together they form such a pairing of algorithms.
Given all that, I may redo this same exercise with a different estimation problem, but that is yet to be decided.
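
As a minimal sketch of that workflow (the object name Fit is mine): an adaptive run can be followed by Consort(), which prints a 'Demonic Suggestion' including a proposed follow-up call whose settings can then seed the non-adaptive counterpart.

Fit <- LaplacesDemon(Model, Data = MyData, Initial.Values = Initial.Values,
                     Algorithm = "AMM")
Consort(Fit)  # prints suggestions, e.g. a recommended next algorithm, thinning and burn-in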

Adaptive-Mixture Metropolis

No Specs were supplied; the defaults were used.




Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Algorithm = "AMM")

Acceptance Rate: 0.284
Algorithm: Adaptive-Mixture Metropolis
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
   beta[1]    beta[2]
2.73756468 0.00197592

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
         All Stationary
Dbar  45.095     44.425
pD   234.487      2.231
DIC  279.582     46.656
Initial Values:
[1] -10   0

Iterations: 10000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.05
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 500
Recommended Burn-In of Un-thinned Samples: 5000
Recommended Thinning: 150
Specs: (NOT SHOWN HERE)
Status is displayed every 100 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 1000
Thinning: 10


Summary of All Samples
                Mean          SD        MCSE       ESS          LB      Median
beta[1]  -10.8694827  1.66603784 0.154095363  57.79995 -15.5489329 -10.2511972
beta[2]    0.2682103  0.04423248 0.004039057  58.80563   0.2003859   0.2543845
Deviance  45.0951406 21.65580992 1.178504393 534.75650  42.5189305  43.4676853
LP       -31.3536988 10.82813264 0.589266937 534.75770 -33.8231778 -30.5333506
                  UB
beta[1]   -8.3168119
beta[2]    0.3923192
Deviance  50.0480146
LP       -30.0738495


Summary of Stationary Samples
               Mean        SD        MCSE      ESS          LB      Median
beta[1]  -11.574823 2.1300652 0.205608788 285.1749 -16.3533092 -11.3357688
beta[2]    0.286758 0.0553065 0.005306451 287.9282   0.1924482   0.2804307
Deviance  44.425140 2.1124891 0.191862970 191.2321  42.4735763  43.8636196
LP       -31.027498 1.0677294 0.589266937 190.2838 -34.0072489 -30.7386292
                  UB
beta[1]   -7.9334688
beta[2]    0.4128574
Deviance  50.4694327
LP       -30.0463562

Affine-Invariant Ensemble Sampler

It seems to go somewhere at first, but then gets stuck and never escapes.


Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Iterations = 20000, Status = 2000, Thinning = 35, Algorithm = "AIES",
    Specs = list(Nc = 16, Z = NULL, beta = 1.1, CPUs = 1, Packages = NULL,
        Dyn.libs = NULL))

Acceptance Rate: 0.9773
Algorithm: Affine-Invariant Ensemble Sampler
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
     beta[1]      beta[2]
0.5252284175 0.0004811633

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
        All Stationary
Dbar 43.004     43.005
pD    0.053      0.000
DIC  43.057     43.005
Initial Values:
[1] -10   0

Iterations: 20000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.8
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 399
Recommended Burn-In of Un-thinned Samples: 13965
Recommended Thinning: 27
Specs: (NOT SHOWN HERE)
Status is displayed every 2000 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 571
Thinning: 35


Summary of All Samples
                Mean         SD        MCSE       ESS          LB      Median
beta[1]  -10.2521485 0.72528513 0.153054828  9.623682 -12.7424753  -9.9662793
beta[2]    0.2513389 0.01927108 0.004080774 11.791793   0.2404582   0.2438793
Deviance  43.0041950 0.32410474 0.046924334 74.412690  42.5190044  43.0023753
LP       -30.3005775 0.16647750 0.024331033 74.198162 -30.8672783 -30.2965116
                  UB
beta[1]   -9.8180273
beta[2]    0.3153106
Deviance  44.0738314
LP       -30.0671736


Summary of Stationary Samples
                Mean           SD         MCSE      ESS          LB      Median
beta[1]   -9.9558233 0.0082421078 2.797169e-03 12.56157  -9.9743833  -9.9550952
beta[2]    0.2436365 0.0001725992 5.836733e-05 12.63574   0.2433173   0.2436223
Deviance  43.0047636 0.0021518709 7.092913e-04 12.84743  43.0002894  43.0048552
LP       -30.2976030 0.0009940593 2.433103e-02 12.86979 -30.2995034 -30.2976427
                  UB
beta[1]   -9.9405874
beta[2]    0.2440232
Deviance  43.0088727
LP       -30.2955405

Componentwise Hit-And-Run Metropolis

This run was never able to reach the target.

Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Iterations = 40000, Status = 2000, Thinning = 30, Algorithm = "CHARM")

Acceptance Rate: 0.31229
Algorithm: Componentwise Hit-And-Run Metropolis
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
    beta[1]     beta[2]
3.580895236 0.002467357

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
        All Stationary
Dbar 44.445     45.021
pD    2.023      2.256
DIC  46.468     47.278
Initial Values:
[1] -10   0

Iterations: 40000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.18
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 1064
Recommended Burn-In of Un-thinned Samples: 31920
Recommended Thinning: 31
Specs: (NOT SHOWN HERE)
Status is displayed every 2000 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 1333
Thinning: 30


Summary of All Samples
                Mean         SD       MCSE      ESS          LB      Median
beta[1]  -10.9964257 1.89283881 0.49785194 13.06079 -14.8229746 -10.9766992
beta[2]    0.2717979 0.04913034 0.01300998 11.03343   0.1856506   0.2705021
Deviance  44.4449406 2.01148697 0.18601589 82.06199  42.4984709  43.8291949
LP       -31.0303916 1.00254773 0.09222924 83.71010 -33.6460481 -30.7196890
                  UB
beta[1]   -7.6255135
beta[2]    0.3698866
Deviance  49.6364683
LP       -30.0586484


Summary of Stationary Samples
                Mean         SD       MCSE       ESS        LB     Median
beta[1]   -9.5579858 1.34107513 0.62509204  4.739982 -12.03957  -9.313436
beta[2]    0.2340237 0.03463118 0.01639968  4.804878   0.18134   0.227444
Deviance  45.0214688 2.12434347 0.28430825 16.656844  42.51149  44.655282
LP       -31.3029682 1.05482519 0.09222924 17.255007 -33.92636 -31.139132
                 UB
beta[1]   -7.398433
beta[2]    0.297938
Deviance  50.284536
LP       -30.061842

Delayed Rejection Adaptive Metropolis

This is an interesting algorithm. During sampling one can see it shift from a faster to a slower sampling approach, and the same shift in gears shows up in the plot. Notice the high recommended thinning (270 in the run shown); in some runs it even proposed a thinning of 1000. The manual also states about using DRAM as the final algorithm: 'DRAM may be used if diminishing adaptation occurs and adaptation ceases effectively'. Given that text and these effects, I also tried runs starting from wrong initial values; indeed, it was able to get close to the true values in all such runs.
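
Such a check looks roughly like the following (the wrong start values and the object name FitWrong are mine, chosen only for illustration):

FitWrong <- LaplacesDemon(Model, Data = MyData, Initial.Values = c(10, -1),
                          Thinning = 30, Algorithm = "DRAM")
FitWrong$Summary2[1:2, "Mean"]  # in the runs described above these ended up near the true values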



Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Thinning = 30, Algorithm = "DRAM")

Acceptance Rate: 0.5221
Algorithm: Delayed Rejection Adaptive Metropolis
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
     beta[1]      beta[2]
11.556472479  0.007722216

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
             All Stationary
Dbar     470.735     48.093
pD   1803475.978     35.962
DIC  1803946.712     84.055
Initial Values:
[1] -10   0

Iterations: 10000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.2
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 165
Recommended Burn-In of Un-thinned Samples: 4950
Recommended Thinning: 270
Specs: (NOT SHOWN HERE)
Status is displayed every 100 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 333
Thinning: 30


Summary of All Samples
                 Mean           SD         MCSE       ESS           LB
beta[1]   -11.7526275    2.3119401   0.34580302  43.51566   -17.027894
beta[2]     0.2000943    0.4891327   0.02860336 130.61590    -1.487342
Deviance  470.7346820 1899.1977136 105.17220909 100.44119    42.511693
LP       -244.1848392  949.6008219  52.58640512 100.44148 -3481.777935
              Median           UB
beta[1]  -11.6929022   -7.8194808
beta[2]    0.2842366    0.4423707
Deviance  44.3755257 6945.9261923
LP       -31.0009124  -30.0634427


Summary of Stationary Samples
                Mean         SD       MCSE ESS         LB      Median
beta[1]  -11.7250338 2.48894626  0.2213773 168 -17.044268 -11.6851217
beta[2]    0.2921779 0.06547124  0.0053019 168   0.166793   0.2909701
Deviance  48.0932958 8.48081577  0.7474430 168  42.527075  45.2507580
LP       -32.8641423 4.24023747 52.5864051 168 -47.454476 -31.4673576
                  UB
beta[1]   -7.5769778
beta[2]    0.4250974
Deviance  77.3040995
LP       -30.0696373

Delayed Rejection Metropolis

For this algorithm the instruction is to use a covariance matrix from, for instance, DRAM. So I pulled that matrix from the DRAM run and used its summary of stationary samples for the initial values.
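
Concretely, the hand-over went roughly like this (assuming the DRAM fit above was stored in an object called FitDRAM; that name is mine):

covar <- FitDRAM$Covar          # last proposal covariance of the adaptive run
FitDRAM$Summary2[1:2, "Mean"]   # stationary means, used (rounded) as the new initial values
Fit <- LaplacesDemon(Model, Data = MyData, Initial.Values = c(-11.72, 0.29),
                     Covar = covar, Algorithm = "DRM")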


Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = c(-11.72,
    0.29), Covar = covar, Algorithm = "DRM")

Acceptance Rate: 0.5659
Algorithm: Delayed Rejection Metropolis
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
     beta[1]      beta[2]
11.556472479  0.007722216

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
         All Stationary
Dbar  48.417     48.417
pD    59.001     59.001
DIC  107.419    107.419
Initial Values:
[1] -11.72   0.29

Iterations: 10000
Log(Marginal Likelihood): -38.65114
Minutes of run-time: 0.09
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 0
Recommended Burn-In of Un-thinned Samples: 0
Recommended Thinning: 10
Specs: (NOT SHOWN HERE)
Status is displayed every 100 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 1000
Thinning: 10


Summary of All Samples
                Mean          SD        MCSE      ESS        LB      Median
beta[1]  -11.6326715  2.89304045 0.111808577 891.8417 -18.12638 -11.5067562
beta[2]    0.2883743  0.07495814 0.002893592 894.3397   0.13874   0.2834856
Deviance  48.4174842 10.86289496 0.377770754 897.9490  42.52784  44.7049759
LP       -33.0262590  5.43029899 0.188915825 897.4927 -47.25058 -31.1763877
                 UB
beta[1]   -6.014343
beta[2]    0.452027
Deviance  76.915884
LP       -30.075104


Summary of Stationary Samples
                Mean          SD        MCSE      ESS        LB      Median
beta[1]  -11.6326715  2.89304045 0.111808577 891.8417 -18.12638 -11.5067562
beta[2]    0.2883743  0.07495814 0.002893592 894.3397   0.13874   0.2834856
Deviance  48.4174842 10.86289496 0.377770754 897.9490  42.52784  44.7049759
LP       -33.0262590  5.43029899 0.188915825 897.4927 -47.25058 -31.1763877
                 UB
beta[1]   -6.014343
beta[2]    0.452027
Deviance  76.915884
LP       -30.075104

Differential Evolution Markov Chain

Following LP during the run, one can see this algorithm step its way towards the target distribution. The same is visible in the samples.
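
One way to look at this afterwards (a sketch; Fit is assumed to hold the DEMC run) is via the standard trace plots:

plot(Fit, BurnIn = 0, MyData, PDF = FALSE)  # traces of the betas, Deviance and LP show the stepwise moves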


Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Iterations = 70000, Status = 2000, Thinning = 36, Algorithm = "DEMC",
    Specs = list(Nc = 3, Z = NULL, gamma = 0, w = 0.1))

Acceptance Rate: 0.94571
Algorithm: Differential Evolution Markov Chain
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
    beta[1]     beta[2]
90.26633832  0.04206898

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
          All Stationary
Dbar   89.209     43.944
pD   5238.430      1.706
DIC  5327.639     45.650
Initial Values:
[1] -10   0

Iterations: 70000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.73
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 1164
Recommended Burn-In of Un-thinned Samples: 41904
Recommended Thinning: 32
Specs: (NOT SHOWN HERE)
Status is displayed every 2000 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 1944
Thinning: 36


Summary of All Samples
               Mean          SD        MCSE      ESS           LB      Median
beta[1]  -17.482864   9.5017889  2.35451645 2.787539  -36.9570864 -13.2979369
beta[2]    0.410902   0.2049482  0.04994949 4.145223    0.1652045   0.3204944
Deviance  89.209031 102.3565307 23.04234800 7.261840   42.4986747  44.6939882
LP       -53.548197  51.3373427 11.56914508 7.198904 -167.9946428 -31.1456515
                  UB
beta[1]   -7.1204957
beta[2]    0.7804724
Deviance 317.1315784
LP       -30.0563707


Summary of Stationary Samples
                Mean        SD         MCSE       ESS          LB      Median
beta[1]  -11.9431454 1.7007792  0.215271093 125.13410 -15.7999987 -11.9807022
beta[2]    0.2969431 0.0441033  0.005730276 118.23767   0.2259635   0.2946702
Deviance  43.9443086 1.8471515  0.371329880  63.98536  42.4849394  43.3846382
LP       -30.7905955 0.9340381 11.569145079  63.51772 -33.2807813 -30.5163527
                  UB
beta[1]   -9.0646733
beta[2]    0.4059097
Deviance  48.8635918
LP       -30.0522792

Elliptical Slice Sampler

The manual states: 'This algorithm is applicable only to models in which the prior mean of all parameters is zero.' That is true for my prior, yet I am not impressed at all. Maybe I should center the data or some such, but the current formulation was not a success.

Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Iterations = 60000, Status = 2000, Thinning = 1000, Algorithm = "ESS")

Acceptance Rate: 1
Algorithm: Elliptical Slice Sampler
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
    beta[1]     beta[2]
1.514016386 0.001094917

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
        All Stationary
Dbar 53.903     53.806
pD   11.574     13.346
DIC  65.477     67.152
Initial Values:
[1] -10   0

Iterations: 60000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.77
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 18
Recommended Burn-In of Un-thinned Samples: 18000
Recommended Thinning: 1000
Specs: (NOT SHOWN HERE)
Status is displayed every 2000 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 60
Thinning: 1000


Summary of All Samples
                Mean         SD        MCSE      ESS           LB     Median
beta[1]   -5.9519978 1.12538724 0.199063822 34.62487  -8.11739884  -5.904983
beta[2]    0.1419102 0.02788798 0.004825592 38.79318   0.09329411   0.141184
Deviance  53.9025233 4.81129661 0.854043909 46.96804  46.49740770  53.653605
LP       -35.7152403 2.39947768 0.425934607 46.93833 -41.22733361 -35.594423
                  UB
beta[1]   -3.9669525
beta[2]    0.1932619
Deviance  64.9487765
LP       -32.0237607


Summary of Stationary Samples
                Mean         SD        MCSE      ESS           LB      Median
beta[1]   -5.9962946 1.24391453 0.253903583 22.52123  -8.27636391  -5.9679392
beta[2]    0.1430514 0.03108438 0.006168722 27.21658   0.09456836   0.1467105
Deviance  53.8060933 5.16634523 1.088725764 34.31438  46.18528394  53.6113477
LP       -35.6674227 2.57618453 0.425934607 34.28162 -40.73404524 -35.5728340
                  UB
beta[1]   -4.0287456
beta[2]    0.1938871
Deviance  63.9614942
LP       -31.8687620

Gibbs Sampler

This needs derivatives, hence skipped.

Griddy Gibbs

This algorithm estimates a density over a grid and bases its sampling on that. That may be a bit awkward here, since the two parameters have quite different scales while the same grid is used for both (see the sketch after the output below). With only two parameters it was possible to take a rather large number of grid points. Even so, I am not too happy with the final outcome.

Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Iterations = 30000, Status = 2000, Thinning = 100, Algorithm = "GG",
    Specs = list(Grid = seq(from = -0.25, to = 0.25, len = 13),
        dparm = NULL, CPUs = 1, Packages = NULL, Dyn.libs = NULL))

Acceptance Rate: 1
Algorithm: Griddy-Gibbs
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
     beta[1]      beta[2]
11.378198005  0.008486228

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
          All Stationary
Dbar   66.161     66.161
pD   1339.075   1339.075
DIC  1405.236   1405.236
Initial Values:
[1] -10   0

Iterations: 30000
Log(Marginal Likelihood): NA
Minutes of run-time: 2.09
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 0
Recommended Burn-In of Un-thinned Samples: 0
Recommended Thinning: 900
Specs: (NOT SHOWN HERE)
Status is displayed every 2000 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 300
Thinning: 100


Summary of All Samples
              Mean         SD       MCSE      ESS            LB      Median
beta[1]  -11.00845  3.3782928 0.77873105  23.0348  -18.26815566 -10.7755255
beta[2]    0.27170  0.0909315 0.01994284  30.0812    0.09175425   0.2612613
Deviance  66.16122 51.7508405 2.84979150 300.0000   42.82591844  50.8991200
LP       -41.89256 25.8754878 1.42498409 300.0000 -139.85980754 -34.2665415
                 UB
beta[1]   -4.870858
beta[2]    0.450951
Deviance 262.096671
LP       -30.229348


Summary of Stationary Samples
              Mean         SD       MCSE      ESS            LB      Median
beta[1]  -11.00845  3.3782928 0.77873105  23.0348  -18.26815566 -10.7755255
beta[2]    0.27170  0.0909315 0.01994284  30.0812    0.09175425   0.2612613
Deviance  66.16122 51.7508405 2.84979150 300.0000   42.82591844  50.8991200
LP       -41.89256 25.8754878 1.42498409 300.0000 -139.85980754 -34.2665415
                 UB
beta[1]   -4.870858
beta[2]    0.450951
Deviance 262.096671
LP       -30.229348
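
As noted above, the awkward part is that one grid serves two parameters on very different scales. If I read the manual correctly, Grid may also be supplied as a list with one grid per parameter; I have not verified this, and the particular ranges below are just an illustration, but it would look something like:

Fit <- LaplacesDemon(Model, Data = MyData, Initial.Values = Initial.Values,
    Iterations = 30000, Status = 2000, Thinning = 100, Algorithm = "GG",
    Specs = list(Grid = list(seq(-2.5, 2.5, len = 13),       # grid scaled for beta[1]
                             seq(-0.025, 0.025, len = 13)),  # grid scaled for beta[2]
        dparm = NULL, CPUs = 1, Packages = NULL, Dyn.libs = NULL))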


Hamiltonian Monte Carlo

A set of Specs was found that works. The acceptance rate is a bit high compared to what the manual recommends.
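
One way to arrive at such numbers (my own heuristic, not something the manual prescribes) is to scale epsilon to the posterior spread seen in an earlier run, for instance the AMM fit above (here assumed to be stored as FitAMM):

sds <- apply(FitAMM$Posterior2, 2, sd)  # posterior SDs of beta[1] and beta[2]
Fit <- LaplacesDemon(Model, Data = MyData, Initial.Values = Initial.Values,
    Thinning = 100, Algorithm = "HMC", Specs = list(epsilon = 0.05 * sds, L = 11))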

Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Thinning = 100, Algorithm = "HMC", Specs = list(epsilon = 0.9 *
        c(0.1, 0.01), L = 11))

Acceptance Rate: 0.8385
Algorithm: Hamiltonian Monte Carlo
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
    beta[1]     beta[2]
3.515108412 0.003083421

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
        All Stationary
Dbar 44.429     44.562
pD    1.941      1.869
DIC  46.369     46.431
Initial Values:
[1] -10   0

Iterations: 10000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.59
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 80
Recommended Burn-In of Un-thinned Samples: 8000
Recommended Thinning: 100
Specs: (NOT SHOWN HERE)
Status is displayed every 100 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 100
Thinning: 100


Summary of All Samples
               Mean         SD        MCSE ESS          LB      Median
beta[1]  -11.421073 1.87894067 0.242559120 100 -15.4741232 -11.3104311
beta[2]    0.283175 0.04808956 0.006413119 100   0.1975276   0.2818055
Deviance  44.428764 1.97004886 0.159856183 100  42.5400966  43.7163906
LP       -31.027024 0.98807551 0.080511201 100 -33.5265769 -30.6632451
                  UB
beta[1]   -7.9608191
beta[2]    0.3807289
Deviance  49.3945952
LP       -30.0829425


Summary of Stationary Samples
                Mean         SD        MCSE ESS          LB      Median
beta[1]  -11.0974590 1.92886822 0.226325971  20 -15.4741232 -10.9792610
beta[2]    0.2750898 0.04775153 0.005911688  20   0.2034193   0.2713854
Deviance  44.5623988 1.93322645 0.408444645  20  42.5740058  44.0794655
LP       -31.0902147 0.97037005 0.080511201  20 -33.0194587 -30.8456818
                  UB
beta[1]   -8.2355095
beta[2]    0.3807289
Deviance  48.3972203
LP       -30.0962034

Another set of specs


Call:
LaplacesDemon(Model = Model, Data = MyData, Initial.Values = Initial.Values,
    Thinning = 100, Algorithm = "HMC", Specs = list(epsilon = 3 *
        c(0.1, 0.001), L = 18))

Acceptance Rate: 0.8855
Algorithm: Hamiltonian Monte Carlo
Covariance Matrix: (NOT SHOWN HERE; diagonal shown instead)
    beta[1]     beta[2]
3.640714435 0.003207219

Covariance (Diagonal) History: (NOT SHOWN HERE)
Deviance Information Criterion (DIC):
        All Stationary
Dbar 44.404     44.404
pD    2.051      2.051
DIC  46.455     46.455
Initial Values:
[1] -10   0

Iterations: 10000
Log(Marginal Likelihood): NA
Minutes of run-time: 0.96
Model: (NOT SHOWN HERE)
Monitor: (NOT SHOWN HERE)
Parameters (Number of): 2
Posterior1: (NOT SHOWN HERE)
Posterior2: (NOT SHOWN HERE)
Recommended Burn-In of Thinned Samples: 0
Recommended Burn-In of Un-thinned Samples: 0
Recommended Thinning: 100
Specs: (NOT SHOWN HERE)
Status is displayed every 100 iterations
Summary1: (SHOWN BELOW)
Summary2: (SHOWN BELOW)
Thinned Samples: 100
Thinning: 100


Summary of All Samples
                Mean         SD        MCSE ESS          LB      Median
beta[1]  -11.5949171 1.91103354 0.200246790 100 -15.6570246 -11.5727273
beta[2]    0.2867121 0.04916803 0.005146306 100   0.2097083   0.2865395
Deviance  44.4043210 2.02528350 0.193624072 100  42.4813611  43.7159364
LP       -31.0168639 1.01912132 0.097084186 100 -33.7046786 -30.6665144
                  UB
beta[1]   -8.4758710
beta[2]    0.3936658
Deviance  49.8533556
LP       -30.0520014


Summary of Stationary Samples
                Mean         SD        MCSE ESS          LB      Median
beta[1]  -11.5949171 1.91103354 0.200246790 100 -15.6570246 -11.5727273
beta[2]    0.2867121 0.04916803 0.005146306 100   0.2097083   0.2865395
Deviance  44.4043210 2.02528350 0.193624072 100  42.4813611  43.7159364
LP       -31.0168639 1.01912132 0.097084186 100 -33.7046786 -30.6665144
                  UB
beta[1]   -8.4758710
beta[2]    0.3936658
Deviance  49.8533556
LP       -30.0520014
