R Beats Python?
November 14, 2010
So bin Laden Had Such an Air of an Immortal Sage
November 9, 2010
iFrame test
November 9, 2010
Berliner Philharmoniker plays Mozart:
Big Buck Bunny:
Update:
Sintel, the latest Blender project movie:
Thoughts after watching Sintel:
Technique alone is not enough; a good dose of melodrama is a must.
OpenBUGS: still has a lot of room to improve
October 30, 2010
I tried Gelman's famous eight-schools example with OpenBUGS, JAGS, and the famous (or notorious) WinBUGS, all driven from R, to compare the RNG performance.
This is the testing code:
library(arm)
library(R2jags)

J <- 8
y <- c(28, 8, -3, 7, -1, 1, 18, 12)
sigma.y <- c(15, 10, 16, 11, 9, 11, 10, 18)
schools.data <- list("J", "y", "sigma.y")
schools.inits <- function()
  list(theta = rnorm(J, 0, 1),
       mu.theta = rnorm(1, 0, 100),
       sigma.theta = runif(1, 0, 100))
schools.parameters <- c("theta", "mu.theta", "sigma.theta")

## using WINE+WinBUGS
print(system.time(schools.sim <- bugs(schools.data, schools.inits, schools.parameters,
                                      "schools.bug", n.chains = 3, n.iter = 10000,
                                      WINE = 'WINEDLLOVERRIDES="ole32,oleaut32,rpcrt=n" /usr/bin/wine')))

## using OpenBUGS
print(system.time(schools.sim <- bugs(schools.data, schools.inits, schools.parameters,
                                      "schools.bug", n.chains = 3, n.iter = 10000,
                                      program = "OpenBUGS")))

## using JAGS
print(system.time(schools.sim <- jags(schools.data, schools.inits, schools.parameters,
                                      "schools.bug", n.chains = 3, n.iter = 10000)))
The results are quite interesting.
The first result is R+WINE+WinBUGS+R2WinBUGS, all of this software at its latest version:
> source('schools.R')
fixme:keyboard:RegisterHotKey (0x10048,13,0x00000002,3): stub
   user  system elapsed
  1.292   0.384   3.663
The second result is R+OpenBUGS, where OpenBUGS is the version provided on OpenBUGS's web site, a 32-bit Linux command-line executable:
   user  system elapsed
  3.500   1.048   5.162
And the last, fully native R+JAGS:
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 39

  |++++++++++++++++++++++++++++++++++++++++++++++++++| 100%
  |**************************************************| 100%
   user  system elapsed
  1.376   0.008   1.384
You can see that even though OpenBUGS does provide a 32-bit Linux version, it is still defeated by WinBUGS+WINE. In fact, WINE's performance penalty is usually significant, and yet WINE+WinBUGS still outperforms OpenBUGS.
Does this mean OpenBUGS needs some more optimization work? Let's wait, participate, and see.
MathML stretch symbol Test
October 25, 2010
\[
E(X^k) = \int_{\mathbb R^1}\frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi\nu}}x^k\left(\frac{\nu}{\nu+x^2}\right)^{\frac{\nu+1}{2}}dx
\]
Conjugate note 2010 Oct 23 (1)
October 24, 2010
Suppose $\boldsymbol x\in(\{0\}\cup\mathbb N)^m$ is a random vector that follows a multinomial distribution with parameter vector $\boldsymbol\theta\in (0,1)^m$.
Assume that $\boldsymbol\theta$ has a Dirichlet prior with parameter $\boldsymbol\alpha\in (2, \infty)^m$.
It is clear that
\[
\mathscr P(\left.\boldsymbol x, \boldsymbol\theta\right|\boldsymbol\alpha)
= \frac{\Gamma\left(1+\sum_{i=1}^mx_i\right)\Gamma\left(\sum_{i=1}^m\alpha_i\right)}{\prod_{i=1}^m\Gamma(x_i+1)\prod_{i=1}^m\Gamma(\alpha_i)}\prod_{i=1}^m\theta_i^{x_i+\alpha_i-1}
\]
Integrating $\boldsymbol\theta$ out over the simplex gives
\[
\mathscr P(\left.\boldsymbol x\right|\boldsymbol\alpha) =
\frac{\Gamma\left(1+\sum_{i=1}^mx_i\right)\Gamma\left(\sum_{i=1}^m\alpha_i\right)\prod_{i=1}^m\Gamma(x_i+\alpha_i)}{\prod_{i=1}^m\Gamma(x_i+1)\prod_{i=1}^m\Gamma(\alpha_i)\Gamma\left(\sum_{i=1}^m(x_i+\alpha_i)\right)}
\]
Thus
\[
\mathscr P(\left.\boldsymbol\theta\right|\boldsymbol x, \boldsymbol\alpha) =
\frac{\Gamma\left(\sum_{i=1}^m(x_i+\alpha_i)\right)}{\prod_{i=1}^m\Gamma(x_i+\alpha_i)}
\prod_{i=1}^m\theta_i^{x_i+\alpha_i-1}
\]
In R, we have:
> library(compositions)
> rmultinom(1, prob=rDirichlet.acomp(1, alpha), size=10)
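The marginal $\mathscr P(\left.\boldsymbol x\right|\boldsymbol\alpha)$ above is the Dirichlet-multinomial pmf. As a sanity check, here is a minimal Python sketch (standard library only; the values of alpha and n below are purely hypothetical) that evaluates it in log space and verifies that it sums to one over all count vectors:

```python
import math
from itertools import product

def ddirmult(x, alpha):
    """Dirichlet-multinomial pmf P(x | alpha), with n = sum(x) fixed."""
    n = sum(x)
    log_p = (math.lgamma(n + 1) + math.lgamma(sum(alpha))
             - math.lgamma(n + sum(alpha)))
    for xi, ai in zip(x, alpha):
        log_p += math.lgamma(xi + ai) - math.lgamma(xi + 1) - math.lgamma(ai)
    return math.exp(log_p)

# hypothetical parameters for the check
alpha = [2.5, 3.0, 4.0]
n = 6

# sum over every count vector (x1, x2, x3) with x1 + x2 + x3 = n
total = sum(ddirmult(x, alpha)
            for x in product(range(n + 1), repeat=len(alpha))
            if sum(x) == n)
print(total)  # numerically 1, since the pmf is normalized
```

The same marginal could also be checked against Monte Carlo draws produced by the rDirichlet.acomp/rmultinom pair shown above.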
Probability note 2010 Oct 13 (1)
October 13, 2010
Consider a collection of random variables $\boldsymbol x = (x_1, x_2, \ldots, x_k)^T$ with joint characteristic function
$\phi_{\boldsymbol x} (\boldsymbol t)$, where the coordinates of $\boldsymbol t = (t_1,t_2,\ldots, t_k)^T$ vary freely (they are not tied to one another). Then the $x_i$ are mutually independent if and only if
\[
\phi_{\boldsymbol x} (\boldsymbol t) = \prod_{i=1}^k\phi_{X_i}(t_i)
\]
Note that the above statement fails if the $t_i$ are not allowed to vary freely; that is, the weaker condition
\[
\phi_{\boldsymbol x} (t\boldsymbol 1) = \prod_{i=1}^k\phi_{X_i}(t)
\quad\text{for all } t\in\mathbb R
\]
does not imply that the $X_i$ are all independent.
The reason is that $\phi_{\boldsymbol x}(t\boldsymbol 1) = \phi_{\sum_i X_i}(t)$ only determines the distribution of the sum, not the joint law.
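A classical example makes this concrete. Let $X$ be standard Cauchy and set $X_1 = X_2 = X$, which are completely dependent. Since $\phi_X(t) = e^{-|t|}$,
\[
\phi_{(X_1,X_2)}(t\boldsymbol 1) = \phi_{2X}(t) = e^{-2|t|} = \phi_{X_1}(t)\,\phi_{X_2}(t),
\]
so the diagonal factorization holds even though $X_1$ and $X_2$ are anything but independent.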
Probability Note 2010 Oct 10 (1)
October 10, 2010
For $X\sim t(\nu)$ and even $k$ (odd moments vanish by symmetry whenever they exist),
\[
E(X^k) = \int_{\mathbb R^1}\frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi\nu}}x^k\left(\frac{\nu}{\nu+x^2}\right)^{\frac{\nu+1}{2}}
dx
\]
\[
= \int_{\mathbb R^1}\frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi\nu}}\left(\frac{x^2}{\nu+x^2}\right)^{\frac{k}{2}} \left(\frac{\nu}{\nu+x^2}\right)^{\frac{\nu-k+1}{2}}
\nu^{\frac{k}{2}}dx
\]
Substituting $y = x^2/(\nu+x^2)$ on $(0,\infty)$ (the factor $2$ from folding $\mathbb R$ onto $(0,\infty)$ cancels the $\frac{1}{2}$'s in the Jacobian),
\[
= 2\int_0^1\frac{\Gamma\left(\frac{\nu+1}{2}\right)\nu^{\frac{k}{2}}}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi\nu}}\left[\frac{1}{2}\nu^{\frac{1}{2}}y^{-\frac{1}{2}}(1-y)^{-\frac{1}{2}} + \frac{1}{2}\nu^{\frac{1}{2}}y^{\frac{1}{2}}(1-y)^{-\frac{3}{2}} \right]y^{\frac{k}{2}}(1-y)^{\frac{\nu+1-k}{2}}dy
\]
\[
= \frac{\Gamma\left(\frac{\nu+1}{2}\right)\nu^{\frac{k}{2}}}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi}}\left[\int_0^1y^{\frac{k+1}{2}-1}(1-y)^{\frac{\nu-k+2}{2}-1} dy + \int_0^1 y^{\frac{k+3}{2}-1}(1-y)^{\frac{\nu-k}{2}-1}dy\right]
\]
\[
= \frac{\Gamma\left(\frac{\nu+1}{2}\right)\nu^{\frac{k}{2}}}{\Gamma\left(\frac{\nu}{2}\right)\Gamma\left(\frac{\nu+3}{2}\right)\sqrt{\pi}}\left[\Gamma\left(\frac{k+1}{2}\right)\Gamma\left(\frac{\nu-k+2}{2}\right) + \Gamma\left(\frac{k+3}{2}\right)\Gamma\left(\frac{\nu-k}{2}\right)\right]
\]
Thus, for $k \geq \nu$, the moment does not exist.
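A standard closed form for the even moments of $t(\nu)$ (for even $k<\nu$) is $E(X^k)=\nu^{k/2}\,\Gamma\left(\frac{k+1}{2}\right)\Gamma\left(\frac{\nu-k}{2}\right)/\left(\sqrt{\pi}\,\Gamma\left(\frac{\nu}{2}\right)\right)$. A minimal Python sketch (the choice $\nu=7$ is purely illustrative) checks it against the well-known variance $\nu/(\nu-2)$ and fourth moment $3\nu^2/((\nu-2)(\nu-4))$:

```python
import math

def t_even_moment(k, nu):
    """E[X^k] for X ~ t(nu), valid for even k with k < nu."""
    assert k % 2 == 0 and k < nu
    return (nu ** (k / 2)
            * math.gamma((k + 1) / 2) * math.gamma((nu - k) / 2)
            / (math.sqrt(math.pi) * math.gamma(nu / 2)))

nu = 7.0  # illustrative value, nothing special about it
print(t_even_moment(2, nu))  # variance: nu/(nu-2) = 1.4
print(t_even_moment(4, nu))  # fourth moment: 3*nu**2/((nu-2)*(nu-4)) = 9.8
```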
Probability note 2010 Oct 09 -(4)
October 10, 2010
Let $\mathscr P(X_n=-1) = \frac{1}{2} - \frac{1}{n+1}$, $\mathscr P(X_n=1)=\frac{1}{2} + \frac{1}{n+1}$, and $\mathscr P(X = -1) = \mathscr P(X=1) = \frac{1}{2}$. It is clear that
\[
\lim_{n\rightarrow\infty}\mathscr P(X_n=-1) = \frac{1}{2} - \lim_{n\rightarrow\infty}\frac{1}{n+1} = \frac{1}{2}
,\]
and
\[
\lim_{n\rightarrow\infty}\mathscr P(X_n=1) = \frac{1}{2} + \lim_{n\rightarrow\infty}\frac{1}{n+1} = \frac{1}{2}
\]
Thus, $\mathscr P(X_n=-1)\rightarrow\mathscr P(X=-1)$ and $\mathscr P(X_n=1)\rightarrow\mathscr P(X=1)$; since the pmfs converge pointwise, $X_n\overset{\mathcal D}{\rightarrow}X$. (Almost-sure convergence, and hence convergence in probability, would additionally require the $X_n$ and $X$ to be defined on a common probability space.)
Probability note 2010 Oct 09 -(3)
October 10, 2010
If $X_n\overset{\mathcal D}{\rightarrow}X$ and $Y_n\overset{\mathcal D}{\rightarrow}Y$, where $X_n\amalg Y_n$ for all $n\in\mathbb N$ and $X\amalg Y$, then
\[
\lim_{n\rightarrow\infty}\phi_{X_n+Y_n}(t) = \lim_{n\rightarrow\infty}\phi_{X_n}(t)\phi_{Y_n}(t) = \lim_{n\rightarrow\infty}\phi_{X_n}(t)\cdot\lim_{n\rightarrow\infty}\phi_{Y_n}(t) = \phi_{X}(t)\phi_{Y}(t) = \phi_{X+Y}(t)
,\]
thus, by Lévy's continuity theorem, $X_n+Y_n\overset{\mathcal D}{\rightarrow}X+Y$.