Brownian Bridge

In the most common formulation, the Brownian bridge process is obtained by taking a standard Brownian motion process \(X\), restricted to the interval \([0,1]\), and conditioning on the event that \(X_1 = 0\). Since \(X_0 = 0\) as well, the process is tied down at both ends, and the path in between forms a bridge. The Brownian bridge turns out to be an interesting stochastic process with surprising applications, including a very important application to statistics. In terms of a definition, however, we will give a list of characterizing properties, as we did for standard Brownian motion and for Brownian motion with drift and scaling.

For a bridge on \([0,T]\), the process has mean zero and variance \(\frac{t(T-t)}{T}\) at time \(t\): uncertainty is greatest in the middle of the bridge and vanishes at the two pinned endpoints. On \([0,1]\) this reduces to \(t(1-t)\). The increments of a Brownian bridge are not independent.
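These properties are easy to check by simulation. The sketch below (in Python, assuming NumPy) builds bridge paths on \([0,1]\) from Brownian motion via the standard construction \(B_t = W_t - t\,W_1\) and compares the empirical midpoint variance to the theoretical value \(t(1-t) = 0.25\):

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_paths = 100, 20000
t = np.linspace(0.0, 1.0, n_steps + 1)

# Standard Brownian motion on [0, 1]: cumulative sums of Gaussian increments.
dW = rng.normal(0.0, np.sqrt(1.0 / n_steps), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), dW.cumsum(axis=1)], axis=1)

# Brownian bridge via B_t = W_t - t * W_1, which pins the path
# to 0 at both t = 0 and t = 1.
B = W - t * W[:, [-1]]

var_mid = B[:, n_steps // 2].var()  # empirical Var(B_{1/2}); theory says 0.25
```

With 20,000 paths the empirical midpoint variance lands close to 0.25, and the variance at \(t = 1\) is exactly zero by construction.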

The Brownian bridge arises as the limit process in Donsker's theorem (a functional extension of the central limit theorem) in the area of empirical processes. It also underlies the asymptotic null distribution of the Kolmogorov–Smirnov test statistic in statistical inference.

More …

Nyquist–Shannon sampling theorem

The Nyquist–Shannon sampling theorem is a fundamental bridge between continuous-time signals (often called “analog signals”) and discrete-time signals (often called “digital signals”). It establishes a sufficient condition for a sample rate that permits a discrete sequence of samples to capture all the information from a continuous-time signal of finite bandwidth.

In order to reconstruct a band-limited signal exactly, sample at a rate greater than twice its highest frequency component (the Nyquist rate). Sampling below that rate causes aliasing: high-frequency content masquerades as lower frequencies.
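Aliasing is easy to demonstrate numerically. In this sketch (Python with NumPy, frequencies chosen for illustration), a 5 Hz sine is sampled at 8 Hz, below its Nyquist rate of 10 Hz, so it shows up in the spectrum at the alias frequency \(|f_s - f| = 3\) Hz:

```python
import numpy as np

f_sig = 5.0   # signal frequency (Hz)
f_s = 8.0     # sampling rate (Hz), below the 10 Hz Nyquist rate -> aliasing
n = 256       # number of samples (an integer number of cycles, so no leakage)

t = np.arange(n) / f_s
x = np.sin(2 * np.pi * f_sig * t)

# Locate the dominant frequency in the sampled signal's spectrum.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1.0 / f_s)
alias = freqs[np.argmax(spectrum)]  # the 5 Hz tone aliases to |8 - 5| = 3 Hz
```

Sampling the same tone at anything above 10 Hz would instead put the spectral peak at 5 Hz, as the theorem guarantees.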

More …

LOESS

Local regression, or local polynomial regression, is a generalization of the moving average and of polynomial regression. Its most common methods, initially developed for scatterplot smoothing, are LOESS (locally estimated scatterplot smoothing) and LOWESS (locally weighted scatterplot smoothing). They are two strongly related non-parametric regression methods that combine multiple regression models in a k-nearest-neighbor-based meta-model.

A smooth curve drawn through a set of data points with this technique is called a LOESS curve, particularly when each smoothed value is given by a weighted quadratic least-squares regression over a local span of x values; when the local fits are linear, the result is usually called a LOWESS curve.
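The k-nearest-neighbor idea can be made concrete in a few lines. This is a minimal LOWESS sketch in Python (local *linear* fits with tricube weights, no robustness iterations); production code would use a library implementation such as statsmodels:

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Locally weighted linear regression (LOWESS) sketch.

    For each point, fit a weighted straight line to its frac*n
    nearest neighbours, weighting by the tricube function so that
    closer neighbours count more.
    """
    n = len(x)
    k = max(2, int(frac * n))
    y_hat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                        # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3    # tricube weights
        # Weighted least squares for intercept + slope at x[i].
        X = np.vstack([np.ones(k), x[idx]]).T
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y[idx])
        y_hat[i] = beta[0] + beta[1] * x[i]
    return y_hat
```

Replacing the straight-line design matrix with a quadratic one turns this into the local-quadratic fit described above.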

More …

Markov Decision Processes

A Markov decision process (MDP) is a discrete-time stochastic control process. It provides a mathematical framework for modeling decision making in situations where outcomes are partly random and partly under the control of a decision maker. MDPs are useful for studying optimization problems solved via dynamic programming and reinforcement learning.
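The dynamic-programming connection can be sketched with value iteration on a toy MDP. The transition matrix `P`, rewards `R`, and discount `gamma` below are hypothetical numbers chosen only for illustration:

```python
import numpy as np

# Hypothetical 2-state, 2-action MDP:
# P[a, s, s'] = transition probability, R[a, s] = expected reward.
P = np.array([[[0.9, 0.1],   # action 0
               [0.4, 0.6]],
              [[0.2, 0.8],   # action 1
               [0.5, 0.5]]])
R = np.array([[1.0, 0.0],    # reward for action 0 in states 0, 1
              [0.5, 2.0]])   # reward for action 1 in states 0, 1
gamma = 0.9                  # discount factor

# Value iteration: repeatedly apply the Bellman optimality operator,
# which is a contraction, so V converges to the optimal value function.
V = np.zeros(2)
for _ in range(500):
    Q = R + gamma * P @ V    # Q[a, s] = R[a, s] + gamma * sum_s' P[a,s,s'] V[s']
    V = Q.max(axis=0)

policy = Q.argmax(axis=0)    # greedy (optimal) action in each state
```

At convergence `V` satisfies the Bellman optimality equation, and the greedy policy read off from `Q` is an optimal policy for this MDP.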

More …

Beginning Bayesian

Beginning Bayes in R

Introduction to Bayesian thinking

library(TeachBayes)          # Jim Albert's teaching package
areas <- c(2, 1, 2, 1, 2)    # relative areas of the five spinner regions
spinner_plot(areas)          # draw the spinner

spinner_probs(areas)         # exact region probabilities: areas / sum(areas)
##   Region  Prob
## 1      1 0.250
## 2      2 0.125
## 3      3 0.250
## 4      4 0.125
## 5      5 0.250
df <- data.frame(Region = 1:5, areas, Probability = areas / sum(areas))

df
##   Region areas Probability
## 1      1     2       0.250
## 2      2     1       0.125
## 3      3     2       0.250
## 4      4     1       0.125
## 5      5     2       0.250
spins <- spinner_data(areas, 1000)   # simulate 1000 spins of the spinner

More …