Models
For more on these models, check out the Conjugate Prior Wikipedia Table
Supported Likelihoods
Discrete
- Bernoulli / Binomial
- Negative Binomial
- Geometric
- Hypergeometric
- Poisson
- Categorical / Multinomial
Continuous
- Normal
- Multivariate Normal
- Linear Regression (Normal)
- Log Normal
- Uniform
- Exponential
- Pareto
- Gamma
- Beta
- Von Mises
Model Functions
Below are the supported model functions.
bernoulli_beta(x, beta_prior)
Posterior distribution for a bernoulli likelihood with a beta prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
x | NUMERIC | successes from a single trial | required
beta_prior | Beta | Beta distribution prior | required

Returns:

Type | Description
---|---
Beta | Beta distribution posterior
Examples:
Information gain from a single coin flip
```python
from conjugate.distributions import Beta
from conjugate.models import bernoulli_beta

prior = Beta(1, 1)

# Positive outcome
x = 1

posterior = bernoulli_beta(
    x=x,
    beta_prior=prior,
)

posterior.dist.ppf([0.025, 0.975])
# array([0.15811388, 0.98742088])
```
Source code in conjugate/models.py
bernoulli_beta_posterior_predictive(beta)
Posterior predictive distribution for a bernoulli likelihood with a beta prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
beta | Beta | Beta distribution | required

Returns:

Type | Description
---|---
BetaBinomial | BetaBinomial posterior predictive distribution
Source code in conjugate/models.py
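As a sanity check on the closed form, the predictive probability that a single future trial succeeds under a Beta(alpha, beta) distribution is alpha / (alpha + beta). A hand-rolled sketch using scipy directly, not this package, with hypothetical parameter values:

```python
from scipy.stats import betabinom

# Hypothetical Beta parameters, e.g. a Beta(1, 1) prior updated with one success
alpha, beta = 2.0, 1.0

# Closed form: P(next single trial succeeds) = alpha / (alpha + beta)
closed_form = alpha / (alpha + beta)

# Same probability from the BetaBinomial posterior predictive with n=1
via_scipy = betabinom(n=1, a=alpha, b=beta).pmf(1)

assert abs(closed_form - via_scipy) < 1e-9
```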
beta(x_prod, one_minus_x_prod, n, proportional_prior)
Posterior distribution for a Beta likelihood.
Inference on alpha and beta.

Parameters:

Name | Type | Description | Default
---|---|---|---
x_prod | NUMERIC | product of all outcomes | required
one_minus_x_prod | NUMERIC | product of all (1 - outcomes) | required
n | NUMERIC | total number of samples in x_prod and one_minus_x_prod | required
proportional_prior | BetaProportional | BetaProportional prior | required

Returns:

Type | Description
---|---
BetaProportional | BetaProportional posterior distribution
Source code in conjugate/models.py
binomial_beta(n, x, beta_prior)
Posterior distribution for a binomial likelihood with a beta prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
n | NUMERIC | total number of trials | required
x | NUMERIC | successes from those trials | required
beta_prior | Beta | Beta distribution prior | required

Returns:

Type | Description
---|---
Beta | Beta distribution posterior
Examples:
A / B test example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Beta
from conjugate.models import binomial_beta

impressions = np.array([100, 250])
clicks = np.array([10, 35])

prior = Beta(1, 1)
posterior = binomial_beta(
    n=impressions,
    x=clicks,
    beta_prior=prior,
)

ax = plt.subplot(111)
posterior.set_bounds(0, 0.5).plot_pdf(ax=ax, label=["A", "B"])
prior.set_bounds(0, 0.5).plot_pdf(ax=ax, label="prior")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
binomial_beta_posterior_predictive(n, beta)
Posterior predictive distribution for a binomial likelihood with a beta prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
n | NUMERIC | number of trials | required
beta | Beta | Beta distribution | required

Returns:

Type | Description
---|---
BetaBinomial | BetaBinomial posterior predictive distribution
Examples:
A / B test example with 100 new impressions
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Beta
from conjugate.models import binomial_beta, binomial_beta_posterior_predictive

impressions = np.array([100, 250])
clicks = np.array([10, 35])

prior = Beta(1, 1)
posterior = binomial_beta(
    n=impressions,
    x=clicks,
    beta_prior=prior,
)
posterior_predictive = binomial_beta_posterior_predictive(
    n=100,
    beta=posterior,
)

ax = plt.subplot(111)
ax.set_title("Posterior Predictive Distribution with 100 new impressions")
posterior_predictive.set_bounds(0, 50).plot_pmf(
    ax=ax,
    label=["A", "B"],
)
plt.show()
```
Source code in conjugate/models.py
categorical_dirichlet(x, dirichlet_prior)
Posterior distribution of Categorical model with Dirichlet prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
x | NUMERIC | counts | required
dirichlet_prior | Dirichlet | Dirichlet prior on the counts | required

Returns:

Type | Description
---|---
Dirichlet | Dirichlet posterior distribution
Source code in conjugate/models.py
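The conjugate update behind this model is elementwise addition: the posterior Dirichlet concentration is the prior concentration plus the observed counts. A minimal sketch with numpy alone, using hypothetical counts rather than the package internals:

```python
import numpy as np

prior_alpha = np.array([1.0, 1.0, 1.0])  # Dirichlet(1, 1, 1) prior
counts = np.array([5, 2, 1])             # observed category counts

# Posterior concentration parameters
posterior_alpha = prior_alpha + counts
print(posterior_alpha)  # [6. 3. 2.]
```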
categorical_dirichlet_posterior_predictive(dirichlet, n=1)
Posterior predictive distribution of Categorical model with Dirichlet prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
dirichlet | Dirichlet | Dirichlet distribution | required
n | NUMERIC | Number of trials for each sample, defaults to 1. | 1
Source code in conjugate/models.py
exponential_gamma(x_total, n, gamma_prior)
Posterior distribution for an exponential likelihood with a gamma prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all outcomes | required
n | NUMERIC | total number of samples in x_total | required
gamma_prior | Gamma | Gamma prior | required

Returns:

Type | Description
---|---
Gamma | Gamma posterior distribution
Source code in conjugate/models.py
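The update this function performs is the standard Gamma-exponential one: alpha gains the number of observations and beta gains their sum. A from-scratch sketch of that update on simulated data, using numpy only rather than the package internals:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(scale=1.0, size=50)  # simulated data with true rate 1

alpha, beta = 1.0, 1.0  # Gamma(1, 1) prior on the rate
alpha_post = alpha + data.size
beta_post = beta + data.sum()

# Posterior mean of the rate concentrates near the true rate of 1
posterior_mean_rate = alpha_post / beta_post
```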
exponential_gamma_posterior_predictive(gamma)
Posterior predictive distribution for an exponential likelihood with a gamma prior
Parameters:
Name | Type | Description | Default
---|---|---|---
gamma | Gamma | Gamma distribution | required

Returns:

Type | Description
---|---
Lomax | Lomax distribution related to posterior predictive
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Exponential, Gamma
from conjugate.models import exponential_gamma, exponential_gamma_posterior_predictive

true = Exponential(1)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Gamma(1, 1)
posterior = exponential_gamma(
    n=n_samples,
    x_total=data.sum(),
    gamma_prior=prior,
)

prior_predictive = exponential_gamma_posterior_predictive(prior)
posterior_predictive = exponential_gamma_posterior_predictive(posterior)

ax = plt.subplot(111)
prior_predictive.set_bounds(0, 2.5).plot_pdf(ax=ax, label="prior predictive")
true.set_bounds(0, 2.5).plot_pdf(ax=ax, label="true distribution")
posterior_predictive.set_bounds(0, 2.5).plot_pdf(ax=ax, label="posterior predictive")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
gamma(x_total, x_prod, n, proportional_prior)
Posterior distribution for a gamma likelihood.
Inference on alpha and beta.

Parameters:

Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all outcomes | required
x_prod | NUMERIC | product of all outcomes | required
n | NUMERIC | total number of samples in x_total and x_prod | required
proportional_prior | GammaProportional | GammaProportional prior | required

Returns:

Type | Description
---|---
GammaProportional | GammaProportional posterior distribution
Source code in conjugate/models.py
gamma_known_rate(x_prod, n, beta, proportional_prior)
Posterior distribution for a gamma likelihood.
The rate beta is assumed to be known.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_prod | NUMERIC | product of all outcomes | required
n | NUMERIC | total number of samples in x_prod | required
beta | NUMERIC | known rate parameter | required
proportional_prior | GammaKnownRateProportional | GammaKnownRateProportional prior | required

Returns:

Type | Description
---|---
GammaKnownRateProportional | GammaKnownRateProportional posterior distribution
Source code in conjugate/models.py
gamma_known_shape(x_total, n, alpha, gamma_prior)
Gamma likelihood with a gamma prior.
The shape parameter of the likelihood is assumed to be known.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
x_total |
NUMERIC
|
sum of all outcomes |
required |
n |
NUMERIC
|
total number of samples in x_total |
required |
alpha |
NUMERIC
|
known shape parameter |
required |
gamma_prior |
Gamma
|
Gamma prior |
required |
Returns:
Type | Description |
---|---|
Gamma
|
Gamma posterior distribution |
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Gamma
from conjugate.models import gamma_known_shape

known_shape = 2
unknown_rate = 5
true = Gamma(known_shape, unknown_rate)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Gamma(1, 1)
posterior = gamma_known_shape(
    n=n_samples,
    x_total=data.sum(),
    alpha=known_shape,
    gamma_prior=prior,
)

bound = 10
ax = plt.subplot(111)
posterior.set_bounds(0, bound).plot_pdf(ax=ax, label="posterior")
prior.set_bounds(0, bound).plot_pdf(ax=ax, label="prior")
ax.axvline(unknown_rate, color="black", linestyle="--", label="true rate")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
gamma_known_shape_posterior_predictive(gamma, alpha)
Posterior predictive distribution for a gamma likelihood with a gamma prior
Parameters:
Name | Type | Description | Default |
---|---|---|---|
gamma |
Gamma
|
Gamma distribution |
required |
alpha |
NUMERIC
|
known shape parameter |
required |
Returns:
Type | Description |
---|---|
CompoundGamma
|
CompoundGamma distribution related to posterior predictive |
Source code in conjugate/models.py
geometric_beta(x_total, n, beta_prior, one_start=True)
Posterior distribution for a geometric likelihood with a beta prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total |  | sum of all trial outcomes | required
n |  | total number of trials | required
beta_prior | Beta | Beta distribution prior | required
one_start | bool | whether outcomes start at 1, defaults to True. False means outcomes start at 0. | True

Returns:

Type | Description
---|---
Beta | Beta distribution posterior
Examples:
Number of usages until user has good experience
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Beta
from conjugate.models import geometric_beta

data = np.array([3, 1, 1, 3, 2, 1])

prior = Beta(1, 1)
posterior = geometric_beta(
    x_total=data.sum(),
    n=data.size,
    beta_prior=prior,
)

ax = plt.subplot(111)
posterior.set_bounds(0, 1).plot_pdf(ax=ax, label="posterior")
prior.set_bounds(0, 1).plot_pdf(ax=ax, label="prior")
ax.legend()
ax.set(xlabel="chance of good experience")
plt.show()
```
Source code in conjugate/models.py
hypergeometric_beta_binomial(x_total, n, beta_binomial_prior)
Hypergeometric likelihood with a BetaBinomial prior.
The total population size is N and is known. Encode it in the BetaBinomial prior as n=N
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all trial outcomes | required
n | NUMERIC | total number of trials | required
beta_binomial_prior | BetaBinomial | BetaBinomial prior whose n is the known total population size N | required

Returns:

Type | Description
---|---
BetaBinomial | BetaBinomial posterior distribution
Source code in conjugate/models.py
linear_regression(X, y, normal_inverse_gamma_prior, inv=np.linalg.inv)
Posterior distribution for a linear regression model with a normal inverse gamma prior.
Derivation taken from this blog post.
Parameters:
Name | Type | Description | Default
---|---|---|---
X | NUMERIC | design matrix | required
y | NUMERIC | response vector | required
normal_inverse_gamma_prior | NormalInverseGamma | NormalInverseGamma prior | required
inv |  | function to invert matrix, defaults to np.linalg.inv | np.linalg.inv

Returns:

Type | Description
---|---
NormalInverseGamma | NormalInverseGamma posterior distribution
Source code in conjugate/models.py
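The normal inverse gamma update for the coefficients follows the standard textbook form; with a weak prior, the posterior mean approaches the ordinary least squares solution. A hand-rolled numpy sketch of that update on simulated data, not this package's internals:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # design matrix with intercept
true_beta = np.array([1.0, 2.0])
y = X @ true_beta + rng.normal(scale=0.5, size=n)

mu0 = np.zeros(p)
Lambda0 = 1e-6 * np.eye(p)  # weak prior precision on the coefficients

# Posterior precision and mean of the coefficients
Lambda_n = X.T @ X + Lambda0
mu_n = np.linalg.solve(Lambda_n, Lambda0 @ mu0 + X.T @ y)

# With a weak prior this matches ordinary least squares
ols = np.linalg.lstsq(X, y, rcond=None)[0]
assert np.allclose(mu_n, ols, atol=1e-3)
```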
linear_regression_posterior_predictive(normal_inverse_gamma, X, eye=np.eye)
Posterior predictive distribution for a linear regression model with a normal inverse gamma prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
normal_inverse_gamma | NormalInverseGamma | NormalInverseGamma posterior | required
X | NUMERIC | design matrix | required
eye |  | function to get identity matrix, defaults to np.eye | np.eye

Returns:

Type | Description
---|---
MultivariateStudentT | MultivariateStudentT posterior predictive distribution
Source code in conjugate/models.py
log_normal_normal_inverse_gamma(ln_x_total, ln_x2_total, n, normal_inverse_gamma_prior)
Log normal likelihood with a normal inverse gamma prior.
By taking the log of the data, we can use the normal inverse gamma posterior.
Reference: Section 1.2.1
Parameters:
Name | Type | Description | Default
---|---|---|---
ln_x_total | NUMERIC | sum of the log of all outcomes | required
ln_x2_total | NUMERIC | sum of the log of all outcomes squared | required
n | NUMERIC | total number of samples in ln_x_total and ln_x2_total | required
normal_inverse_gamma_prior | NormalInverseGamma | NormalInverseGamma prior | required

Returns:

Type | Description
---|---
NormalInverseGamma | NormalInverseGamma posterior distribution
Examples:
Constructed example

```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import NormalInverseGamma, LogNormal
from conjugate.models import log_normal_normal_inverse_gamma

true_mu = 0
true_sigma = 2.5
true = LogNormal(true_mu, true_sigma)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)
ln_data = np.log(data)

prior = NormalInverseGamma(1, 1, 1, nu=1)
posterior = log_normal_normal_inverse_gamma(
    ln_x_total=ln_data.sum(),
    ln_x2_total=(ln_data**2).sum(),
    n=n_samples,
    normal_inverse_gamma_prior=prior,
)

fig, axes = plt.subplots(ncols=2)

mean, variance = posterior.sample_mean(4000, return_variance=True, random_state=42)

ax = axes[0]
ax.hist(mean, bins=20)
ax.axvline(true_mu, color="black", linestyle="--", label="true mu")

ax = axes[1]
ax.hist(variance, bins=20)
ax.axvline(true_sigma**2, color="black", linestyle="--", label="true sigma^2")
plt.show()
```
Source code in conjugate/models.py
multinomial_dirichlet(x, dirichlet_prior)
Posterior distribution of Multinomial model with Dirichlet prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
x | NUMERIC | counts | required
dirichlet_prior | Dirichlet | Dirichlet prior on the counts | required

Returns:

Type | Description
---|---
Dirichlet | Dirichlet posterior distribution
Examples:
Personal preference for ice cream flavors
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Dirichlet
from conjugate.models import multinomial_dirichlet

kinds = ["chocolate", "vanilla", "strawberry"]
data = np.array([
    [5, 2, 1],
    [3, 1, 0],
    [3, 2, 0],
])

prior = Dirichlet([1, 1, 1])
posterior = multinomial_dirichlet(
    x=data.sum(axis=0),
    dirichlet_prior=prior,
)

ax = plt.subplot(111)
posterior.plot_pdf(ax=ax, label=kinds)
ax.legend()
ax.set(xlabel="Flavor Preference")
plt.show()
```
Source code in conjugate/models.py
multinomial_dirichlet_posterior_predictive(dirichlet, n=1)
Posterior predictive distribution of Multinomial model with Dirichlet prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
dirichlet | Dirichlet | Dirichlet distribution | required
n | NUMERIC | Number of trials for each sample, defaults to 1. | 1
Source code in conjugate/models.py
multivariate_normal(X, normal_inverse_wishart_prior, outer=np.outer)
Multivariate normal likelihood with normal inverse wishart prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
X | NUMERIC | design matrix | required
normal_inverse_wishart_prior | NormalInverseWishart | NormalInverseWishart prior | required
outer |  | function to take outer product, defaults to np.outer | np.outer

Returns:

Type | Description
---|---
NormalInverseWishart | NormalInverseWishart posterior distribution
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import NormalInverseWishart
from conjugate.models import multivariate_normal

true_mean = np.array([1, 5])
true_cov = np.array([
    [1, 0.5],
    [0.5, 1],
])

n_samples = 100
rng = np.random.default_rng(42)
data = rng.multivariate_normal(
    mean=true_mean,
    cov=true_cov,
    size=n_samples,
)

prior = NormalInverseWishart(
    mu=np.array([0, 0]),
    kappa=1,
    nu=3,
    psi=np.array([
        [1, 0],
        [0, 1],
    ]),
)
posterior = multivariate_normal(
    X=data,
    normal_inverse_wishart_prior=prior,
)
```
Source code in conjugate/models.py
multivariate_normal_known_mean(X, mu, inverse_wishart_prior)
Multivariate normal likelihood with known mean and inverse wishart prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
X | NUMERIC | design matrix | required
mu | NUMERIC | known mean | required
inverse_wishart_prior | InverseWishart | InverseWishart prior | required

Returns:

Type | Description
---|---
InverseWishart | InverseWishart posterior distribution
Source code in conjugate/models.py
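With the mean known, the inverse Wishart update is also available in closed form: the degrees of freedom grow by n and the scale matrix gains the scatter around the known mean. A numpy sketch of that textbook update on simulated data, not the package internals:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0])  # known mean
data = rng.multivariate_normal(mu, np.eye(2), size=100)

nu, psi = 3, np.eye(2)  # InverseWishart prior parameters

centered = data - mu
nu_post = nu + data.shape[0]
psi_post = psi + centered.T @ centered  # scatter matrix around the known mean

# Posterior mean of the covariance: psi_post / (nu_post - d - 1)
cov_estimate = psi_post / (nu_post - len(mu) - 1)
```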
multivariate_normal_posterior_predictive(normal_inverse_wishart)
Posterior predictive distribution for a multivariate normal likelihood with a normal inverse wishart prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
normal_inverse_wishart | NormalInverseWishart | NormalInverseWishart posterior | required

Returns:

Type | Description
---|---
MultivariateStudentT | MultivariateStudentT posterior predictive distribution
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import NormalInverseWishart, MultivariateNormal
from conjugate.models import multivariate_normal, multivariate_normal_posterior_predictive

mu_1 = 10
mu_2 = 5
sigma_1 = 2.5
sigma_2 = 1.5
rho = -0.65
true_mean = np.array([mu_1, mu_2])
true_cov = np.array([
    [sigma_1 ** 2, rho * sigma_1 * sigma_2],
    [rho * sigma_1 * sigma_2, sigma_2 ** 2],
])
true = MultivariateNormal(true_mean, true_cov)

n_samples = 100
rng = np.random.default_rng(42)
data = true.dist.rvs(size=n_samples, random_state=rng)

prior = NormalInverseWishart(
    mu=np.array([0, 0]),
    kappa=1,
    nu=2,
    psi=np.array([
        [5 ** 2, 0],
        [0, 5 ** 2],
    ]),
)
posterior = multivariate_normal(
    X=data,
    normal_inverse_wishart_prior=prior,
)
prior_predictive = multivariate_normal_posterior_predictive(prior)
posterior_predictive = multivariate_normal_posterior_predictive(posterior)

ax = plt.subplot(111)
xmax = mu_1 + 3 * sigma_1
ymax = mu_2 + 3 * sigma_2
x, y = np.mgrid[-xmax:xmax:.1, -ymax:ymax:.1]
pos = np.dstack((x, y))
z = true.dist.pdf(pos)
contours = ax.contour(x, y, z, alpha=0.55, colors="black")
for label, dist in zip(["prior", "posterior"], [prior_predictive, posterior_predictive]):
    X = dist.dist.rvs(size=1000)
    ax.scatter(X[:, 0], X[:, 1], alpha=0.15, label=f"{label} predictive")
ax.axvline(0, color="black", linestyle="--")
ax.axhline(0, color="black", linestyle="--")
ax.scatter(data[:, 0], data[:, 1], label="data", alpha=0.5)
ax.scatter(mu_1, mu_2, color="black", marker="x", label="true mean")
ax.set(
    xlabel="x1",
    ylabel="x2",
    title=f"Posterior predictive after {n_samples} samples",
    xlim=(-xmax, xmax),
    ylim=(-ymax, ymax),
)
ax.legend()
plt.show()
```
Source code in conjugate/models.py
negative_binomial_beta(r, n, x, beta_prior)
Posterior distribution for a negative binomial likelihood with a beta prior.
The number of failures r is assumed to be known.
Parameters:
Name | Type | Description | Default
---|---|---|---
r | NUMERIC | number of failures | required
n | NUMERIC | number of trials | required
x | NUMERIC | number of successes | required
beta_prior | Beta | Beta distribution prior | required

Returns:

Type | Description
---|---
Beta | Beta distribution posterior
Source code in conjugate/models.py
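Per the standard conjugate prior table, the update adds the total successes to alpha and r times the number of experiments to beta. A hand-rolled sketch with hypothetical counts, not the package internals:

```python
# Beta(1, 1) prior on the success probability
alpha, beta = 1.0, 1.0

# Hypothetical data: n experiments, each run until r failures, x total successes
r, n, x = 5, 10, 40

alpha_post = alpha + x
beta_post = beta + r * n

posterior_mean = alpha_post / (alpha_post + beta_post)
```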
negative_binomial_beta_posterior_predictive(r, beta)
Posterior predictive distribution for a negative binomial likelihood with a beta prior
The number of failures r is assumed to be known.
Parameters:
Name | Type | Description | Default
---|---|---|---
r | NUMERIC | number of failures | required
beta | Beta | Beta distribution | required

Returns:

Type | Description
---|---
BetaNegativeBinomial | BetaNegativeBinomial posterior predictive distribution
Source code in conjugate/models.py
normal_known_mean(x_total, x2_total, n, mu, inverse_gamma_prior)
Posterior distribution for a normal likelihood with a known mean and a variance prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all outcomes | required
x2_total | NUMERIC | sum of all outcomes squared | required
n | NUMERIC | total number of samples in x_total | required
mu | NUMERIC | known mean | required
inverse_gamma_prior | InverseGamma | InverseGamma prior for variance | required

Returns:

Type | Description
---|---
InverseGamma | InverseGamma posterior distribution for the variance
Source code in conjugate/models.py
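The variance update here is the textbook inverse gamma one: alpha grows by n / 2 and beta by half the squared deviations from the known mean. A numpy sketch on simulated data, not the package internals:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 0.0  # known mean
data = rng.normal(loc=mu, scale=2.0, size=500)  # true variance is 4

alpha, beta = 1.0, 1.0  # InverseGamma(1, 1) prior on the variance
alpha_post = alpha + data.size / 2
beta_post = beta + 0.5 * ((data - mu) ** 2).sum()

# Posterior mean of the variance, beta / (alpha - 1), concentrates near 4
var_estimate = beta_post / (alpha_post - 1)
```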
normal_known_mean_posterior_predictive(mu, inverse_gamma)
Posterior predictive distribution for a normal likelihood with a known mean and a variance prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
mu | NUMERIC | known mean | required
inverse_gamma | InverseGamma | InverseGamma prior | required

Returns:

Type | Description
---|---
StudentT | StudentT posterior predictive distribution
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Normal, InverseGamma
from conjugate.models import normal_known_mean, normal_known_mean_posterior_predictive

unknown_var = 2.5
known_mu = 0
true = Normal(known_mu, unknown_var**0.5)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = InverseGamma(1, 1)
posterior = normal_known_mean(
    n=n_samples,
    x_total=data.sum(),
    x2_total=(data**2).sum(),
    mu=known_mu,
    inverse_gamma_prior=prior,
)

bound = 5
ax = plt.subplot(111)

prior_predictive = normal_known_mean_posterior_predictive(
    mu=known_mu,
    inverse_gamma=prior,
)
prior_predictive.set_bounds(-bound, bound).plot_pdf(ax=ax, label="prior predictive")
true.set_bounds(-bound, bound).plot_pdf(ax=ax, label="true distribution")

posterior_predictive = normal_known_mean_posterior_predictive(
    mu=known_mu,
    inverse_gamma=posterior,
)
posterior_predictive.set_bounds(-bound, bound).plot_pdf(ax=ax, label="posterior predictive")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
normal_known_precision(x_total, n, precision, normal_prior)
Posterior distribution for a normal likelihood with known precision and a normal prior on mean.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all outcomes | required
n | NUMERIC | total number of samples in x_total | required
precision | NUMERIC | known precision | required
normal_prior | Normal | Normal prior for mean | required

Returns:

Type | Description
---|---
Normal | Normal posterior distribution for the mean
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Normal
from conjugate.models import normal_known_precision

unknown_mu = 0
known_precision = 0.5
true = Normal.from_mean_and_precision(unknown_mu, known_precision)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Normal(0, 10)
posterior = normal_known_precision(
    n=n_samples,
    x_total=data.sum(),
    precision=known_precision,
    normal_prior=prior,
)

bound = 5
ax = plt.subplot(111)
posterior.set_bounds(-bound, bound).plot_pdf(ax=ax, label="posterior")
prior.set_bounds(-bound, bound).plot_pdf(ax=ax, label="prior")
ax.axvline(unknown_mu, color="black", linestyle="--", label="true mu")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
normal_known_precision_posterior_predictive(precision, normal)
Posterior predictive distribution for a normal likelihood with known precision and a normal prior on mean.
Parameters:
Name | Type | Description | Default
---|---|---|---
precision | NUMERIC | known precision | required
normal | Normal | Normal posterior distribution for the mean | required

Returns:

Type | Description
---|---
Normal | Normal posterior predictive distribution
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Normal
from conjugate.models import normal_known_precision, normal_known_precision_posterior_predictive

unknown_mu = 0
known_precision = 0.5
true = Normal.from_mean_and_precision(unknown_mu, known_precision)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Normal(0, 10)
posterior = normal_known_precision(
    n=n_samples,
    x_total=data.sum(),
    precision=known_precision,
    normal_prior=prior,
)

prior_predictive = normal_known_precision_posterior_predictive(
    precision=known_precision,
    normal=prior,
)
posterior_predictive = normal_known_precision_posterior_predictive(
    precision=known_precision,
    normal=posterior,
)

bound = 5
ax = plt.subplot(111)
true.set_bounds(-bound, bound).plot_pdf(ax=ax, label="true distribution")
posterior_predictive.set_bounds(-bound, bound).plot_pdf(ax=ax, label="posterior predictive")
prior_predictive.set_bounds(-bound, bound).plot_pdf(ax=ax, label="prior predictive")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
normal_known_variance(x_total, n, var, normal_prior)
Posterior distribution for a normal likelihood with known variance and a normal prior on mean.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all outcomes | required
n | NUMERIC | total number of samples in x_total | required
var | NUMERIC | known variance | required
normal_prior | Normal | Normal prior for mean | required

Returns:

Type | Description
---|---
Normal | Normal posterior distribution for the mean
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Normal
from conjugate.models import normal_known_variance

unknown_mu = 0
known_var = 2.5
true = Normal(unknown_mu, known_var**0.5)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Normal(0, 10)
posterior = normal_known_variance(
    n=n_samples,
    x_total=data.sum(),
    var=known_var,
    normal_prior=prior,
)

bound = 5
ax = plt.subplot(111)
posterior.set_bounds(-bound, bound).plot_pdf(ax=ax, label="posterior")
prior.set_bounds(-bound, bound).plot_pdf(ax=ax, label="prior")
ax.axvline(unknown_mu, color="black", linestyle="--", label="true mu")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
normal_known_variance_posterior_predictive(var, normal)
Posterior predictive distribution for a normal likelihood with known variance and a normal prior on mean.
Parameters:
Name | Type | Description | Default
---|---|---|---
var | NUMERIC | known variance | required
normal | Normal | Normal posterior distribution for the mean | required

Returns:

Type | Description
---|---
Normal | Normal posterior predictive distribution
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Normal
from conjugate.models import normal_known_variance, normal_known_variance_posterior_predictive

unknown_mu = 0
known_var = 2.5
true = Normal(unknown_mu, known_var**0.5)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Normal(0, 10)
posterior = normal_known_variance(
    n=n_samples,
    x_total=data.sum(),
    var=known_var,
    normal_prior=prior,
)

prior_predictive = normal_known_variance_posterior_predictive(
    var=known_var,
    normal=prior,
)
posterior_predictive = normal_known_variance_posterior_predictive(
    var=known_var,
    normal=posterior,
)

bound = 5
ax = plt.subplot(111)
true.set_bounds(-bound, bound).plot_pdf(ax=ax, label="true distribution")
posterior_predictive.set_bounds(-bound, bound).plot_pdf(ax=ax, label="posterior predictive")
prior_predictive.set_bounds(-bound, bound).plot_pdf(ax=ax, label="prior predictive")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
normal_normal_inverse_gamma(x_total, x2_total, n, normal_inverse_gamma_prior)
Posterior distribution for a normal likelihood with a normal inverse gamma prior.
Derivation taken from this paper.
Parameters:
Name | Type | Description | Default
---|---|---|---
x_total | NUMERIC | sum of all outcomes | required
x2_total | NUMERIC | sum of all outcomes squared | required
n | NUMERIC | total number of samples in x_total and x2_total | required
normal_inverse_gamma_prior | NormalInverseGamma | NormalInverseGamma prior | required

Returns:

Type | Description
---|---
NormalInverseGamma | NormalInverseGamma posterior distribution
Source code in conjugate/models.py
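The underlying update is the standard normal inverse gamma one for unknown mean and variance, sketched here from scratch with numpy on simulated data (textbook parametrization with a pseudo-count nu; not the package internals):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, scale=1.5, size=1000)  # true mean 3, variance 2.25

mu0, nu, alpha, beta = 0.0, 1.0, 1.0, 1.0  # prior hyperparameters
n, xbar = data.size, data.mean()

mu_n = (nu * mu0 + n * xbar) / (nu + n)  # posterior mean of the mean
nu_n = nu + n
alpha_n = alpha + n / 2
beta_n = (
    beta
    + 0.5 * ((data - xbar) ** 2).sum()
    + 0.5 * (n * nu / (nu + n)) * (xbar - mu0) ** 2
)

# Posterior mean of the variance, beta_n / (alpha_n - 1), concentrates near 2.25
var_estimate = beta_n / (alpha_n - 1)
```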
normal_normal_inverse_gamma_posterior_predictive(normal_inverse_gamma)
Posterior predictive distribution for a normal likelihood with a normal inverse gamma prior.
Parameters:
Name | Type | Description | Default
---|---|---|---
normal_inverse_gamma | NormalInverseGamma | NormalInverseGamma posterior | required

Returns:

Type | Description
---|---
StudentT | StudentT posterior predictive distribution
Source code in conjugate/models.py
pareto_gamma(n, ln_x_total, x_m, gamma_prior, ln=np.log)
Posterior distribution for a pareto likelihood with a gamma prior.
The parameter x_m is assumed to be known.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
n |
NUMERIC
|
number of samples |
required |
ln_x_total |
NUMERIC
|
sum of the log of all outcomes |
required |
x_m |
NUMERIC
|
The known minimum value |
required |
gamma_prior |
Gamma
|
Gamma prior |
required |
ln |
function to take the natural log, defaults to np.log |
log
|
Returns:
Type | Description |
---|---|
Gamma
|
Gamma posterior distribution |
Examples:
Constructed example
```python
import numpy as np
import matplotlib.pyplot as plt

from conjugate.distributions import Pareto, Gamma
from conjugate.models import pareto_gamma

x_m_known = 1
true = Pareto(x_m_known, 1)

n_samples = 15
data = true.dist.rvs(size=n_samples, random_state=42)

prior = Gamma(1, 1)
posterior = pareto_gamma(
    n=n_samples,
    ln_x_total=np.log(data).sum(),
    x_m=x_m_known,
    gamma_prior=prior,
)

ax = plt.subplot(111)
posterior.set_bounds(0, 2.5).plot_pdf(ax=ax, label="posterior")
prior.set_bounds(0, 2.5).plot_pdf(ax=ax, label="prior")
ax.axvline(x_m_known, color="black", linestyle="--", label="true x_m")
ax.legend()
plt.show()
```
Source code in conjugate/models.py
poisson_gamma(x_total, n, gamma_prior)
Posterior distribution for a poisson likelihood with a gamma prior.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
x_total |
NUMERIC
|
sum of all outcomes |
required |
n |
NUMERIC
|
total number of samples in x_total |
required |
gamma_prior |
Gamma
|
Gamma prior |
required |
Returns:
Type | Description |
---|---|
Gamma
|
Gamma posterior distribution |
Source code in conjugate/models.py
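The conjugate update here is the classic Gamma-Poisson one: alpha gains the total count and beta the number of observations. A numpy sketch on simulated data, not the package internals:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.poisson(lam=4.0, size=200)  # simulated counts with true rate 4

alpha, beta = 1.0, 1.0  # Gamma(1, 1) prior on the rate
alpha_post = alpha + data.sum()
beta_post = beta + data.size

# Posterior mean of the rate concentrates near the true rate of 4
rate_estimate = alpha_post / beta_post
```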
poisson_gamma_posterior_predictive(gamma, n=1)
Posterior predictive distribution for a poisson likelihood with a gamma prior
Parameters:
Name | Type | Description | Default
---|---|---|---
gamma | Gamma | Gamma distribution | required
n | NUMERIC | Number of trials for each sample, defaults to 1. Can be used to scale the distributions to a different unit of time. | 1

Returns:

Type | Description
---|---
NegativeBinomial | NegativeBinomial distribution related to posterior predictive
Source code in conjugate/models.py
uniform_pareto(x_max, n, pareto_prior, max_fn=np.maximum)
Posterior distribution for a uniform likelihood with a pareto prior.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
x_max |
NUMERIC
|
maximum value |
required |
n |
NUMERIC
|
number of samples |
required |
pareto_prior |
Pareto
|
Pareto prior |
required |
max_fn |
elementwise max function, defaults to np.maximum |
maximum
|
Returns:
Type | Description |
---|---|
Pareto
|
Pareto posterior distribution |
Examples:
Get the posterior for this model with simulated data:
```python
from conjugate.distributions import Uniform, Pareto
from conjugate.models import uniform_pareto

true_max = 5
true = Uniform(0, true_max)

n_samples = 10
data = true.dist.rvs(size=n_samples)

prior = Pareto(1, 1)
posterior = uniform_pareto(
    x_max=data.max(),
    n=n_samples,
    pareto_prior=prior,
)
```
Source code in conjugate/models.py
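The Pareto update behind this model is the standard one: the posterior scale is the larger of the prior x_m and the sample maximum, and the shape grows by the number of observations. A from-scratch sketch on simulated data, not the package internals:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.uniform(0, 5, size=10)  # simulated data with true maximum 5

x_m, alpha = 1.0, 1.0  # Pareto(1, 1) prior on the maximum
x_m_post = max(x_m, data.max())
alpha_post = alpha + data.size
```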
von_mises_known_concentration(cos_total, sin_total, n, kappa, von_mises_prior, sin=np.sin, cos=np.cos, arctan2=np.arctan2)
VonMises likelihood with known concentration parameter.
Taken from Section 2.13.1.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
cos_total |
NUMERIC
|
sum of all cosines |
required |
sin_total |
NUMERIC
|
sum of all sines |
required |
n |
NUMERIC
|
total number of samples in cos_total and sin_total |
required |
kappa |
NUMERIC
|
known concentration parameter |
required |
von_mises_prior |
VonMisesKnownConcentration
|
VonMisesKnownConcentration prior |
required |
Returns:
Type | Description |
---|---|
VonMisesKnownConcentration
|
VonMisesKnownConcentration posterior distribution |
Source code in conjugate/models.py
von_mises_known_direction(centered_cos_total, n, proportional_prior)
VonMises likelihood with known direction parameter.
Taken from Section 2.13.2
Parameters:
Name | Type | Description | Default |
---|---|---|---|
centered_cos_total |
NUMERIC
|
sum of all centered cosines. sum cos(x - known direction)) |
required |
n |
NUMERIC
|
total number of samples in centered_cos_total |
required |
proportional_prior |
VonMisesKnownDirectionProportional
|
VonMisesKnownDirectionProportional prior |
required |