"- $f(\\cdot)$ if the (known!) PDF of the variable $X$\n",
"- $f(\\cdot)$ if the (known!) PDF of the variable $X$\n",
"- $G(\\cdot)$ is a function with nice properties.\n",
"- $G(\\cdot)$ is a function with nice properties.\n",
"\n",
"\n",
"The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivaties are also bijective. \n",
"The \"nice properties\" require (in the most general case) that $G(x)$ is a $C^1$ diffeomorphism, which means that it is 1) continuous and differentiable almost everywhere; 2) it is bijective, and 3) its derivatives are also bijective. \n",
"\n",
"\n",
"A simpler requirement is that $G(x)$ is continuous, bijective, and monotonic. That will get us 99% of the way there. Hey, $\\exp$ is continuous, bijective, and monotonic -- what a coincidence!\n"
"A simpler requirement is that $G(x)$ is continuous, bijective, and monotonic. That will get us 99% of the way there. Hey, $\\exp$ is continuous, bijective, and monotonic -- what a coincidence!\n"
]
]
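The change-of-variables idea above can be checked numerically. This is a minimal sketch (not part of the notebook; it assumes NumPy and SciPy are available): with $X \sim \mathcal{N}(0, 1)$ and $Y = G(X) = \exp(X)$, the transformed density is $f_Y(y) = f_X(G^{-1}(y)) \, |dG^{-1}(y)/dy| = f_X(\ln y) / y$, which should agree with the log-normal PDF.

```python
import numpy as np
from scipy import stats

# X ~ Normal(0, 1), Y = G(X) = exp(X); G is continuous, bijective, monotonic.
# Change of variables: f_Y(y) = f_X(G^{-1}(y)) * |d G^{-1}(y)/dy| = f_X(ln y) / y
y = np.linspace(0.1, 5.0, 50)
f_y = stats.norm.pdf(np.log(y)) / y

# exp of a standard normal is log-normal with shape s=1, so the two must match.
assert np.allclose(f_y, stats.lognorm.pdf(y, s=1.0))
```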
...
@@ -412,7 +412,7 @@
],
"source": [
"z_values = pt.dvector(\"z_values\")\n",
"# The funtion `pm.logp` does the magic!\n",
"# The function `pm.logp` does the magic!\n",
"z_logp = pm.logp(z, z_values, jacobian=True)\n",
"z_logp = pm.logp(z, z_values, jacobian=True)\n",
"# We do this rewrite to make the computation more stable.\n",
"# We do this rewrite to make the computation more stable.\n",
"rewrite_graph(z_logp).dprint()"
"rewrite_graph(z_logp).dprint()"
...
@@ -668,7 +668,7 @@
"id": "5f9a7a50",
"metadata": {},
"source": [
"Theese distribution are essentially the same."
"These distribution are essentially the same."
]
},
{
...
@@ -715,7 +715,7 @@
"\n",
"\n",
"So, the inverse of their composition is $G^{-1} \\equiv (J^{-1} \\circ H^{-1}) = J^{-1}(H^{-1}(x)) = J^{-1}(\\ln(x)) = \\frac{\\ln(x) - a}{b}$\n",
"So, the inverse of their composition is $G^{-1} \\equiv (J^{-1} \\circ H^{-1}) = J^{-1}(H^{-1}(x)) = J^{-1}(\\ln(x)) = \\frac{\\ln(x) - a}{b}$\n",
"\n",
"\n",
"For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolutel value of the gradient:\n",
"For the correction term, we need the determinant of the jacobian. Since $G$ is a scalar function, this is just the absolute value of the gradient:\n",