ENH: Work-in-progress Stan model for #1
pan14001 committed May 10, 2019
1 parent 4447e68 commit 42003e3
Showing 5 changed files with 120 additions and 0 deletions.
10 changes: 10 additions & 0 deletions README.md
@@ -139,13 +139,23 @@ task workers complete records and become available.

```sh
# From the command-line
python2.7 -m pip install --user --upgrade pip
python2.7 -m pip install --user --upgrade pystan
module purge
module load gcc/4.9.3 # Needed for pystan 2.19.0
cd ~/parallel-slurm/examples
rm -rf joblog submit.out results/ results.csv
for i in {1..5}; do sbatch 03-submit-param-sweep.slurm; done
touch submit.out && tail -f submit.out
# Hit Ctrl+C to exit
```

Output:

```
```

## Next Steps

Hopefully these examples have inspired you to use GNU Parallel to
40 changes: 40 additions & 0 deletions examples/03-submit-param-sweep.slurm
@@ -0,0 +1,40 @@
#!/bin/bash
#SBATCH --ntasks 5
#SBATCH --output submit.out

#SBATCH --dependency singleton
#SBATCH --job-name example-03-parameter-sweep
# Kill the job after 5 minutes to show the resuming feature.
#SBATCH --time 5:00

parallel_opts=$(~/parallel-slurm/parallel_opts.sh)
module load parallel

# Command used to run our model.
model="python model.py"

# Ask the model for the total number of simulations and how many remain.
n_sim=$( $model --sim )
n_remaining=$( $model --remaining )

# Check if all simulations are completed.
echo "Started SLURM job $SLURM_JOB_ID"
if [[ $n_remaining -eq 0 ]]
then
echo "Nothing to run; all $n_sim simulations complete."
echo "Completed SLURM job $SLURM_JOB_ID in $(sacct -nXj $SLURM_JOB_ID -o elapsed)"
exit 0
fi

# Run an interruption-prone program.
seq $n_sim > param_index
echo "Running $n_remaining of total $n_sim simulations."
# It's okay to resubmit this job repeatedly: --resume reads the joblog
# and only reruns the simulations that have not yet completed.
parallel $parallel_opts \
--joblog joblog \
--resume \
--retries 3 \
--line-buffer \
$model \
    :::: param_index
echo "Completed SLURM job $SLURM_JOB_ID in $(sacct -nXj $SLURM_JOB_ID -o elapsed)"
27 changes: 27 additions & 0 deletions examples/model.py
@@ -0,0 +1,27 @@
# Example of Surgical institutional ranking from
# http://www.openbugs.net/Examples/Surgical.html

import os

import pystan

# This example considers mortality rates in 12 hospitals performing
# cardiac surgery in babies. The data are shown below.
#
DATA = {
# Number of hospitals.
'N': 12,
# Number of operations.
'n': [ 47, 148, 119, 810, 211, 196, 148, 215, 207, 97, 256, 360], # noqa: E201, E501
# Number of deaths.
'r': [ 0, 18, 8, 46, 8, 13, 9, 31, 14, 8, 29, 24] # noqa: E201, E501
}

if __name__ == "__main__":
# Disable threading.
os.environ['STAN_NUM_THREADS'] = "1"

    fixed_effects = pystan.StanModel(file="model_indep.stan")
    random_effects = pystan.StanModel(file="model_dep.stan")
    # StanModel.sampling() cannot continue from another model's warmup, so
    # fit each model on its own; iter counts warmup plus retained draws.
    fit_indep = fixed_effects.sampling(data=DATA, warmup=1000, iter=11000)
    fit_dep = random_effects.sampling(data=DATA, warmup=1000, iter=11000)
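
The Slurm script above queries `$model --sim` and `$model --remaining` and hands each task a parameter index from `param_index`, but this work-in-progress `model.py` does not parse any arguments yet. A minimal sketch of that command-line interface, assuming each finished simulation leaves a `results/<index>.csv` file (the `results/` directory the README example clears), could look like this:

```python
import argparse
import glob
import os
import sys

# Assumed total number of simulations; adjust to the real parameter sweep.
N_SIM = 5


def remaining_indices():
    """Return the parameter indices that have no result file yet."""
    done = {os.path.splitext(os.path.basename(path))[0]
            for path in glob.glob("results/*.csv")}
    return [i for i in range(1, N_SIM + 1) if str(i) not in done]


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--sim", action="store_true",
                        help="Print the total number of simulations.")
    parser.add_argument("--remaining", action="store_true",
                        help="Print how many simulations still need to run.")
    parser.add_argument("index", nargs="?", type=int,
                        help="Parameter index passed by GNU Parallel.")
    args = parser.parse_args()
    if args.sim:
        print(N_SIM)
        sys.exit(0)
    if args.remaining:
        print(len(remaining_indices()))
        sys.exit(0)
    # Otherwise fit the Stan model for args.index and write
    # results/<index>.csv so --remaining can see it finished.
```

With an interface like this, `seq $n_sim > param_index` plus `:::: param_index` hand one index per task to GNU Parallel, and an interrupted run only redoes the indices whose result files are missing.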
27 changes: 27 additions & 0 deletions examples/model_dep.stan
@@ -0,0 +1,27 @@
data {
int N;
int<lower = 0> n[N];
int<lower = 0> r[N];
}

parameters {
  real mu;
  real<lower = 0> tau;
  real b[N];
}

transformed parameters {
  real<lower = 0> sigma = 1 / sqrt(tau);
  real<lower = 0, upper = 1> p[N];
  for (i in 1:N) {
    p[i] = inv_logit(b[i]);
  }
}

model {
  // BUGS dnorm(0, 1.0E-6) takes a precision, so the prior sd here is 1.0E3.
  mu ~ normal(0.0, 1.0E3);
  tau ~ gamma(0.001, 0.001);
  for (i in 1:N) {
    b[i] ~ normal(mu, sigma);
    r[i] ~ binomial(n[i], p[i]);
  }
}

generated quantities {
  real<lower = 0, upper = 1> pop_mean = inv_logit(mu);
}
16 changes: 16 additions & 0 deletions examples/model_indep.stan
@@ -0,0 +1,16 @@
data {
int N;
int<lower = 0> n[N];
int<lower = 0> r[N];
}

parameters {
real<lower = 0, upper = 1> p[N];
}

model {
for (i in 1:N) {
    p[i] ~ beta(1.0, 1.0);
    r[i] ~ binomial(n[i], p[i]);
}
}
