
Notes on Stochastic Calculus


2019-02-21

This is a brief selection of my notes on the stochastic calculus course. The content may be updated from time to time. Topics range from martingales and Brownian motion (and its variants) to option pricing. $\newcommand{\E}{\text{E}}\newcommand{\P}{\text{P}}\newcommand{\Q}{\text{Q}}\newcommand{\F}{\mathcal{F}}\newcommand{\d}{\text{d}}\newcommand{\N}{\mathcal{N}}\newcommand{\sgn}{\text{sgn}}\newcommand{\tr}{\text{tr}}\newcommand{\bs}{\boldsymbol}\newcommand{\eeq}{\ \!=\mathrel{\mkern-3mu}=\ \!}\newcommand{\eeeq}{\ \!=\mathrel{\mkern-3mu}=\mathrel{\mkern-3mu}=\ \!}\newcommand{\R}{\mathbb{R}}\newcommand{\MGF}{\text{MGF}}$

MGF of Normal Distribution

For $X\sim\N(\mu,\sigma^2)$, we have $\MGF(\theta)=\exp(\theta\mu + \theta^2\sigma^2/2)$, and the moments follow from $\E(X^k) = \MGF^{(k)}(0)$.
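A quick symbolic check of the first two moments from the MGF (a minimal sketch using sympy; the symbols and the check itself are just for illustration):

```python
import sympy as sp

theta, mu, sigma = sp.symbols('theta mu sigma', real=True, positive=True)
mgf = sp.exp(theta * mu + theta ** 2 * sigma ** 2 / 2)   # MGF of N(mu, sigma^2)

m1 = sp.diff(mgf, theta).subs(theta, 0)      # first derivative at 0
m2 = sp.diff(mgf, theta, 2).subs(theta, 0)   # second derivative at 0

print(sp.simplify(m1))   # mu
print(sp.simplify(m2))   # mu**2 + sigma**2
```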

Truncated Normal Distribution

Consider a two-sided truncation to the interval $(a,b)$ of $X\sim\N(\mu,\sigma^2)$; then

$$ \E[X\mid a < X < b] = \mu - \sigma\frac{\phi(\alpha) - \phi(\beta)}{\Phi(\alpha) - \Phi(\beta)} $$

where $\alpha:=(a-\mu)/\sigma$ and $\beta:=(b-\mu)/\sigma$.
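A quick Monte Carlo check of this formula (a minimal sketch; the parameter values are arbitrary and `scipy.stats.norm` supplies $\phi$ and $\Phi$):

```python
import numpy as np
from scipy.stats import norm

mu, sigma, a, b = 1.0, 2.0, -1.0, 4.0                 # illustrative parameters
alpha, beta = (a - mu) / sigma, (b - mu) / sigma

# Closed form for E[X | a < X < b]
closed = mu - sigma * (norm.pdf(alpha) - norm.pdf(beta)) / (norm.cdf(alpha) - norm.cdf(beta))

# Monte Carlo: sample X ~ N(mu, sigma^2) and keep only draws inside (a, b)
x = np.random.default_rng(0).normal(mu, sigma, 1_000_000)
mc = x[(x > a) & (x < b)].mean()

print(closed, mc)   # the two numbers should agree to a couple of decimals
```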

Doob’s Identity

Let $X$ be a MG and $T$ a stopping time, then $\E X_{T\wedge n} = \E X_0$ for any $n$.

Martingale Transform

Define $(Z\cdot X)_n:=\sum_{i=1}^n Z_i(X_i - X_{i-1})$ where $X$ is a MG with $X_0=0$ and $Z_n$ is predictable and bounded; then $(Z\cdot X)$ is a MG. If $X$ is a sub-MG (and $Z\ge 0$), then so is $(Z\cdot X)$. Furthermore, if $Z\in[0,1]$, then $\E(Z\cdot X)_n\le \E X_n$.
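A quick simulation check (a minimal sketch): take $X$ to be a symmetric simple random walk and the predictable strategy $Z_i=\mathbf{1}_{\{X_{i-1}<0\}}$ ("bet only when behind"); the transform should still have mean zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 200_000, 50
steps = rng.choice([-1, 1], size=(n_paths, n_steps))        # MG increments X_i - X_{i-1}
X_prev = np.hstack([np.zeros((n_paths, 1)),
                    np.cumsum(steps, axis=1)[:, :-1]])      # X_{i-1}, known before step i
Z = (X_prev < 0).astype(float)                              # predictable and bounded
transform = (Z * steps).sum(axis=1)                         # (Z . X)_n

print(transform.mean())   # ~ 0: the transform is still a MG
```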

Common MGs

Convex Mapping

If $X$ is a MG and $\phi(\cdot)$ is a convex function (with $\phi(X_n)$ integrable), then $\phi(X)$ is a sub-MG.

$L^p$ and $L^p$ Boundedness

Doob’s Maximal Ineq.

MG Convergence Theorem

Change of Measure

Given the measure $\P$, we define the likelihood ratio $Z:=\d\Q / \d\P$ for another (absolutely continuous) measure $\Q$. Then we have $\E_{\Q}(X) = \E_{\P}(ZX)$ for any integrable r.v. $X$.

Cameron-Martin

Strong Markov Property

If $B$ is a BM and $T=\tau(\cdot)$ is a stopping time, then $\{B_{T+t} - B_T\}_{t\ge 0}$ is a BM indep. of $\{B_t\}_{t\le T}$.

Orthogonal Transform

If $B$ is a standard $k$-BM and $U\in\mathbb{R}^{k\times k}$ is orthogonal, then $UB$ is also a standard $k$-BM.

Doob’s Decomposition

For any sub-MG $X$, we have unique decomposition $X=M+A$ where $M_n:=X_0 + \sum_{i=1}^n [X_i - \E(X_i\mid \F_{i-1})]$ is a martingale and $A_n:=\sum_{i=1}^n[\E(X_i\mid \F_{i-1}) - X_{i-1}]$ is a non-decreasing predictable sequence.
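For a concrete instance, take $X_n=S_n^2$ with $S_n$ a symmetric simple random walk: $\E(X_i\mid\F_{i-1})=X_{i-1}+1$, so $A_n=n$ and $M_n=S_n^2-n$. A minimal simulation check (values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n = 500_000, 30
S = np.cumsum(rng.choice([-1, 1], size=(n_paths, n)), axis=1)

X = S ** 2                      # sub-MG
A = np.arange(1, n + 1)         # compensator A_n = n (predictable, non-decreasing)
M = X - A                       # martingale part

print(M[:, 4].mean(), M[:, -1].mean())   # both ~ 0 = E[M_0]
```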

Gambler’s Ruin

Reflection Principle

For BM $B$ and stopping time $T=\tau(a)$, define $B^*$ s.t. $B_t^*=B_t$ for all $t\le T$ and $B_t^* = 2a - B_t$ for all $t>T$, then $B^*$ is also a BM.

First Passage Time $T:=\tau(a)$

Joint Distribution of BM and its Maximum

For $x > 0$ and $y\le x$, $\P(\max_{s\le t}B_s > x\text{ and }B_t < y) = \Phi\!\left(\frac{y-2x}{\sqrt{t}}\right)$.
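A quick Monte Carlo check of this identity (a minimal sketch; the discretized running maximum is biased slightly low, and all values are illustrative):

```python
import numpy as np
from scipy.stats import norm

t, x, y = 1.0, 1.0, 0.5                  # requires x > 0 and y <= x
n_paths, n_steps = 100_000, 1_000
dt = t / n_steps
rng = np.random.default_rng(3)

B = np.zeros(n_paths)
running_max = np.zeros(n_paths)
for _ in range(n_steps):                 # simulate paths and track the running maximum
    B += rng.normal(0.0, np.sqrt(dt), n_paths)
    np.maximum(running_max, B, out=running_max)

mc = ((running_max > x) & (B < y)).mean()
print(mc, norm.cdf((y - 2 * x) / np.sqrt(t)))   # both ~ 0.067
```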

$2$-BM Stopped on 1 Boundary

Let $X$ and $Y$ be indep. BM. Note that for all $t\ge 0$, from exponential MG we know $\E[\exp(i\theta X_t)]=\exp(-\theta^2 t/2)$. Now define $T=\tau(a)$ for $Y$ and we have $\E[\exp(i\theta X_T)] = \E[\exp(-\theta^2 T /2)]=\exp(-|\theta| a)$, which is the Fourier transform of the Cauchy density $f_a(x)=\frac{1}{\pi}\frac{a}{a^2+x^2}$.

Itô Integral

We define Itô integral $I_t(X) := \int_0^t\! X_s\d W_s$ where $W_t$ is a standard Brownian process and $X_t$ is adapted.

Martingality of Itô Integral

Itô Isometry

This is a direct result of the second martingality property above. Let $X_t$ be nonrandom and continuously differentiable; then

$$ \E\!\left[\left(\int_0^t\! X_s\d W_s\right)^{\!2}\right] = \E\!\left[\int_0^t\! X_s^2\d s\right]. $$
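A quick Monte Carlo check with the nonrandom integrand $X_s=s$, for which both sides equal $t^3/3$ (a minimal sketch; grid and sample sizes are illustrative):

```python
import numpy as np

t, n_paths, n_steps = 1.0, 50_000, 500
rng = np.random.default_rng(4)

s = np.linspace(0.0, t, n_steps + 1)[:-1]                           # left endpoints of the grid
dW = rng.normal(0.0, np.sqrt(t / n_steps), size=(n_paths, n_steps))
I = (s * dW).sum(axis=1)                                            # Ito sum for \int s dW

print((I ** 2).mean(), t ** 3 / 3)                                  # both ~ 1/3
```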

Itô Formula - $f(W_t)$

Let $W_t$ be a standard Brownian motion and let $f:\R\mapsto\R$ be a twice-continuously differentiable function s.t. $f$, $f'$ and $f''$ are all bounded; then for all $t>0$ we have

$$ \d f(W_t) = f'(W_t)\d W_t + \frac{1}{2}f''(W_t) \d t. $$
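As a sanity check, take $f(x)=x^2$ (not bounded, but the formula still holds here), so that $W_t^2=\int_0^t 2W_s\d W_s+t$. A discretized simulation (illustrative) reproduces this pathwise up to discretization error:

```python
import numpy as np

t, n_paths, n_steps = 1.0, 20_000, 1_000
rng = np.random.default_rng(5)

dW = rng.normal(0.0, np.sqrt(t / n_steps), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])   # W at left endpoints

lhs = W[:, -1] ** 2                                       # f(W_t)
rhs = (2 * W_left * dW).sum(axis=1) + t                   # \int f'(W) dW + (1/2) \int f'' dt

print(np.abs(lhs - rhs).mean())                           # small (discretization error only)
```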

Itô Formula - $f(t,W_t)$

Let $W_t$ be a standard Brownian motion and let $f:[0,\infty)\times\R\mapsto\R$ be a twice-continuously differentiable function s.t. its partial derivatives are all bounded; then for all $t>0$ we have

$$ \d f(t, W_t) = f_x\d W_t + \left(f_t + \frac{1}{2}f_{xx}\right) \d t. $$

Wiener Integral

The Wiener integral is the special case of the Itô integral in which the integrand $f(t)$ is a nonrandom function of $t$. The variance of a Wiener integral can be derived using the Itô isometry.

Itô Process

We say $X_t$ is an Itô process if it satisfies

$$ \d X_t = Y_t\d W_t + Z_t\d t $$

where $Y_t$ and $Z_t$ are adapted and $\forall t$

$$ \int_0^t\! \E Y_s^2\d s < \infty\quad\text{and}\quad\int_0^t\! \E|Z_s|\d s < \infty. $$

The quadratic variation of $X_t$ is

$$ [X,X]_t = \int_0^t\! Y_s^2\d s. $$

Itô Product and Quotient

Assume $X_t$ and $Y_t$ are two Itô processes, then

$$ \frac{\d (XY)}{XY} = \frac{\d X}{X} + \frac{\d Y}{Y} + \frac{\d X\d Y}{XY} $$

and

$$ \frac{\d (X/Y)}{X/Y} = \frac{\d X}{X} - \frac{\d Y}{Y} + \left(\frac{\d Y}{Y}\right)^{\!2} - \frac{\d X\d Y}{XY}. $$

Brownian Bridge

A Brownian bridge is a continuous-time stochastic process $X_t$ with both ends pinned: $X_0=X_T=0$. The SDE is

$$ \d X_t = -\frac{X_t}{T-t}\d t + \d W_t, \qquad 0\le t < T, $$

whose solution has the same law as

$$ X_t = W_t - \frac{t}{T}W_T. $$
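A minimal simulation of the representation $X_t=W_t-(t/T)W_T$ (parameters illustrative), checking that both ends are pinned and that $\text{Var}(X_t)=t(T-t)/T$:

```python
import numpy as np

T, n_paths, n_steps = 1.0, 50_000, 500
rng = np.random.default_rng(6)

dW = rng.normal(0.0, np.sqrt(T / n_steps), size=(n_paths, n_steps))
W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])
t = np.linspace(0.0, T, n_steps + 1)

X = W - (t / T) * W[:, [-1]]                           # bridge: pinned at 0 and T
k = n_steps // 2
print(np.abs(X[:, 0]).max(), np.abs(X[:, -1]).max())   # both exactly 0
print(X[:, k].var(), t[k] * (T - t[k]) / T)            # ~ t(T - t)/T
```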

Itô Formula - $u(t, X_t)$

Let $X_t$ be an Itô process. Let $u(t,x)$ be a twice-continuously differentiable function with $u$ and its partial derivatives bounded, then

$$ \d u(t, X_t) = \frac{\partial u}{\partial t}(t, X_t)\d t + \frac{\partial u}{\partial x}(t, X_t)\d X_t + \frac{1}{2}\frac{\partial^2 u}{\partial x^2}(t, X_t)\d [X,X]_t. $$

The Ornstein-Uhlenbeck Process

The OU process describes a stochastic process that tends to return to an “equilibrium” position $0$, with a restoring drift proportional to its distance from the origin. It’s given by the SDE

$$ \d X_t = -\alpha X_t \d t + \d W_t \Rightarrow \d [\exp(\alpha t)X_t] = \exp(\alpha t)\d W_t $$

which solves to

$$ X_t = \exp(-\alpha t)\left[X_0 + \int_0^t\! \exp(\alpha s)\d W_s\right]. $$

Remark: In finance, the OU process is often called the Vasicek model.
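A minimal Euler-Maruyama simulation against the moments implied by the exact solution, $\E X_t=e^{-\alpha t}X_0$ and $\text{Var}(X_t)=(1-e^{-2\alpha t})/(2\alpha)$, which follow from the Wiener-integral form above (parameters illustrative):

```python
import numpy as np

alpha, x0, t = 2.0, 1.0, 1.0
n_paths, n_steps = 100_000, 1_000
dt = t / n_steps
rng = np.random.default_rng(7)

X = np.full(n_paths, x0)
for _ in range(n_steps):                       # Euler-Maruyama for dX = -alpha X dt + dW
    X += -alpha * X * dt + rng.normal(0.0, np.sqrt(dt), n_paths)

print(X.mean(), np.exp(-alpha * t) * x0)                        # ~ e^{-alpha t} X_0
print(X.var(), (1 - np.exp(-2 * alpha * t)) / (2 * alpha))      # ~ (1 - e^{-2 alpha t}) / (2 alpha)
```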

Diffusion Process

The SDE for general diffusion process is $\d X_t = \mu(X_t)\d t + \sigma(X_t)\d W_t$.

Hitting Probability for Diffusion Processes

In order to find $\P(X_T=B)$ where we define $T=\inf\{t\ge 0: X_t=A\text{ or }B\}$, we consider a harmonic function $f(x)$ s.t. $f(X_t)$ is a MG. This gives ODE

$$ f'(x)\mu(x) + f''(x)\sigma^2(x)/2 = 0\Rightarrow f(x) = \int_A^x C_1\exp\left\{-\!\int_A^z\frac{2\mu(y)}{\sigma^2(y)}\d y\right\}\d z + C_2 $$

where $C_{1,2}$ are constants. Then since $f(X_{T\wedge t})$ is a bounded MG, by Doob’s identity we have

$$ \P(X_T=B) = \frac{f(X_0) - f(A)}{f(B) - f(A)}. $$
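As a concrete check, take constant coefficients $\mu(x)=m$ and $\sigma(x)=s$ (BM with drift), for which the formula above gives $f(x)\propto 1-e^{-2m(x-A)/s^2}$. A crude Euler simulation (all parameters illustrative) reproduces the resulting hitting probability:

```python
import numpy as np

m, s = 0.5, 1.0                       # constant drift and volatility: dX = m dt + s dW
A, B, x0 = 0.0, 2.0, 0.8              # boundaries and starting point (illustrative)
dt, n_paths = 1e-3, 20_000
rng = np.random.default_rng(8)

# Closed form from the harmonic (scale) function f(x) = 1 - exp(-2 m (x - A) / s^2)
f = lambda x: 1.0 - np.exp(-2 * m * (x - A) / s ** 2)
closed = (f(x0) - f(A)) / (f(B) - f(A))

# Euler-Maruyama until every path exits (A, B)
X = np.full(n_paths, x0)
alive = np.ones(n_paths, dtype=bool)
hit_B = np.zeros(n_paths, dtype=bool)
while alive.any():
    X[alive] += m * dt + s * rng.normal(0.0, np.sqrt(dt), alive.sum())
    hit_B |= alive & (X >= B)
    alive &= (X > A) & (X < B)

print(closed, hit_B.mean())           # ~ 0.64 for these parameters
```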

Multivariable Itô Formula - $u(\mathbf{W}_t)$

Let $\bs{W_t}$ be a $K$-dimensional standard Brownian motion. Let $u:\R^K\mapsto \R$ be a $C^2$ function with bounded first and second partial derivatives. Then

$$ \d u(\mathbf{W}_t) = \nabla u(\mathbf{W}_t)\cdot \d \mathbf{W}_t + \frac{1}{2}\Delta u(\mathbf{W}_t) \d t $$

where the gradient operator $\nabla$ gives the vector of all first-order partial derivatives, and the Laplace operator (or Laplacian) $\Delta\equiv\nabla^2$ gives the sum of all second-order (unmixed) partial derivatives.

Dynkin’s Formula

If $T$ is a stopping time for $\bs{W_t}$, then for any fixed $t$ we have

$$ \E[u(\mathbf{W}_{T\wedge t})] = u(\bs{0}) + \frac{1}{2}\E\!\left[\int_0^{T\wedge t}\!\!\Delta u(\mathbf{W}_s)\d s\right]. $$

Harmonic Functions

A $C^2$ function $u:\R^k\mapsto\R$ is said to be harmonic in a region $\mathcal{U}$ if $\Delta u(x) = 0$ for all $x\in \mathcal{U}$. Examples are $u(x,y)=2\log(r)$ and $u(x,y,z)=1/r$ where $r$ is defined as the norm.

Remark: For a diffusion process $X_t$, $f$ being harmonic (with respect to the generator of $X_t$) is equivalent to $f(X_t)$ being a MG, i.e. $f'(x)\mu(x) + f''(x)\sigma^2(x)/2 = 0$.

Harmonic Corollary of Dynkin

Let $u$ be harmonic in a bounded open region $\mathcal{U}$, and assume that $u$ and its partials extend continuously to the boundary $\partial \mathcal{U}$. Define $T$ to be the first exit time of Brownian motion from $\mathcal{U}$. For any $\mathbf{x}\in\mathcal{U}$, let $\E^{\mathbf{x}}$ be the expectation under the measure $\P^{\mathbf{x}}$ s.t. $\mathbf{W}_t - \mathbf{x}$ is a $K$-dimensional standard BM. Then $u(\mathbf{x}) = \E^{\mathbf{x}}[u(\mathbf{W}_T)]$.

Multivariate Itô Process

A multivariate Itô process is a continuous-time stochastic process $X_t\in\R$ of the form

$$ X_t = X_0 + \int_0^t! M_s \d s + \int_0^t! \mathbf{N}_s\cdot \d \mathbf{W}_s $$

where $M_t$ is an adapted real-valued process, $\mathbf{N}_t$ is an adapted $\R^K$-valued process and $\mathbf{W}_t$ is a $K$-dimensional standard BM.

General Multivariable Itô Formula - $ u(\mathbf{X}_t)$

Let $\mathbf{W}_t\in\R^K$ be a standard $K$-dimensional BM, and let $\mathbf{X}_t\in\R^m$ be a vector of $m$ multivariate Itô processes satisfying

$$ \d X_t^i = M_t^i\d t + \mathbf{N}_t^i\cdot \d \mathbf{W}_t. $$

Then for any $C^2$ function $u:\R^m\mapsto\R$ with bounded first and second partial derivatives

$$ \d u(\mathbf{X}_t) = \nabla u(\mathbf{X}_t)\cdot \d \mathbf{X}_t + \frac{1}{2}\sum_{i,j=1}^m \frac{\partial^2 u}{\partial x_i\partial x_j}(\mathbf{X}_t)\,\d [X^i,X^j]_t. $$

Knight’s Theorem

Let $\mathbf{W}_t$ be a standard $K$-dimensional BM, and let $\mathbf{U}_t$ be an adapted $K$-dimensional process satisfying

$$ |{\mathbf{U}_t}| = 1\quad\forall t\ge 0. $$

Then we know the following $1$-dimensional Itô process is a standard BM:

$$ X_t := \int_0^t\!\! \mathbf{U}_s\cdot \d \mathbf{W}_s. $$

Radial Process

Let $\mathbf{W}_t$ be a standard $K$-dimensional BM, and let $R_t=|\mathbf{W}_t|$ be the corresponding radial process; then $R_t$ is a Bessel process with parameter $(K-1)/2$, given by

$$ \d R_t = \frac{K-1}{2R_t}\d t + \d W_t^{\sgn} $$

where we define $\d W_t^{\sgn} := \sgn(\mathbf{W}_t)\cdot \d \mathbf{W}_t$.

Bessel Process

A Bessel process with parameter $a$ is a stochastic process $X_t$ given by

$$ \d X_t = \frac{a}{X_t}\d t+ \d W_t. $$

Since this is just a special case of diffusion processes, we know the corresponding harmonic function is $f(x)=C_1x^{-2a+1} + C_2$, and the hitting probability is

$$ \P(X_T=B) = \frac{f(X_0) - f(A)}{f(B) - f(A)} = \begin{cases} 1 & \text{if }a > 1/2,\\ (X_0/B)^{1-2a} & \text{otherwise}, \end{cases} $$

where the second equality takes $A=0$.

Itô’s Representation Theorem

Let $W_t$ be a standard $1$-dimensional Brownian motion and let $\F_t$ be the $\sigma$-algebra of all events determined by the path $\{W_s\}_{s\le t}$. If $Y$ is any r.v. with mean $0$ and finite variance that is measurable with respect to $\F_t$ for some $t > 0$, then

$$ Y = \int_0^t\! A_s\d W_s $$

for some adapted process $A_t$ that satisfies

$$ \E(Y^2) = \int_0^t\! \E(A_s^2)\d s. $$

This theorem is of importance in finance because it implies that, in the Black-Scholes setting, every contingent claim can be hedged.

Special case: let $Y_t=f(W_t)$ be any mean $0$ r.v. with $f\in C^2$. Let $u(s,x):=\E[f(W_t)\mid W_s = x]$, then

$$ Y_t = f(W_t) = \int_0^t\! u_x(s,W_s)\d W_s. $$

Assumptions of the Black-Scholes Model

Black-Scholes Model

Under a risk-neutral measure $\P$, the discounted share price $S_t / M_t$ is a martingale and thus

$$ \frac{S_t}{M_t} = \frac{S_0}{M_0}\exp\left\{\sigma W_t - \frac{\sigma^2t}{2}\right\} $$

where we used the fact that $\mu_t = r_t$ by the Fundamental Theorem.

Contingent Claims

A European contingent claim with expiration date $T > 0$ and payoff function $f:\R\mapsto\R$ is a tradeable asset that pays $f(S_T)$ at time $T$. By the Fundamental Theorem, the discounted price of this claim at any $t\le T$ is $\E[f(S_T)/M_T\mid \F_t]$. In order to calculate this conditional expectation, let $g(W_T):= f(S_T)/M_T$; then by the Markov property of BM we know $\E[g(W_T)\mid \F_t] = \E[g(W_t + W_{T-t}^*)\mid \F_t]$ where $W_t$ is $\F_t$-measurable and $W_{T-t}^* := W_T - W_t$ is independent of $\F_t$.

Black-Scholes Formula

The discounted time-$t$ price of a European contingent claim with expiration date $T$ and payoff function $f$ is

$$ \E[f(S_T)/M_T\mid \F_t] = \frac{1}{M_T}\E\!\left[f\!\left(S_t\exp\!\left\{\sigma W_{T-t}^* - \frac{\sigma^2(T-t)}{2} + R_T - R_t\right\}\right)\middle|\F_t\right] $$

where $S_t$ is $\F_t$-measurable and $W_{T-t}^*$ is independent of $\F_t$, so the expectation is computed using the $\N(0,T-t)$ distribution of $W_{T-t}^*$. Note here $R_t := \int_0^t r_s\d s$ is the cumulative (continuously compounded) interest rate, so that $M_t = M_0e^{R_t}$.
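For the call payoff $f(x)=(x-K)^+$ with constant $r$ and $\sigma$, the expectation above reduces to the classic Black-Scholes call price $S_0\Phi(d_1)-Ke^{-rT}\Phi(d_2)$. A Monte Carlo evaluation of the risk-neutral expectation agrees with it (a minimal sketch at $t=0$ with $M_0=1$; parameter values are illustrative):

```python
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0     # illustrative parameters
rng = np.random.default_rng(9)

# Monte Carlo of the risk-neutral expectation (at t = 0, R_T - R_0 = r T)
W_T = rng.normal(0.0, np.sqrt(T), 1_000_000)
S_T = S0 * np.exp(sigma * W_T - 0.5 * sigma ** 2 * T + r * T)
mc = np.exp(-r * T) * np.maximum(S_T - K, 0.0).mean()

# Classic closed form for the call payoff f(x) = (x - K)^+
d1 = (np.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
closed = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(mc, closed)   # the two prices should agree to ~2 decimals
```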

Black-Scholes PDE

Under the risk-neutral probability measure, the discounted price of the claim is a martingale, i.e. it has no drift term. So we can apply Itô's formula to $M_t^{-1}u(t,S_t)$ and set the drift to zero, which yields the following PDE

$$ u_t(t,S_t) + r_t S_tu_x(t,S_t) + \frac{\sigma^2S_t^2}{2}u_{xx}(t,S_t) = r_t u(t,S_t) $$

with terminal condition $u(T,S_T)=f(S_T)$. Note here everything is under the BS model.

Hedging in Continuous Time

A replicating portfolio for a contingent claim, consisting of stock and cash, is given by

$$ V_t = \alpha_t M_t + \beta_t S_t $$

where $\alpha_t = [u(t,S_t) - S_t u_x(t,S_t)]/M_t$ and $\beta_t = u_x(t,S_t)$.

Barrier Option

A barrier option pays 1 USD at time $T$ if $\max_{t\le T} S_t \ge AS_0$ and 0 USD otherwise. This is a simple example of a path-dependent option. Other common examples are knock-ins, knock-outs, lookbacks and Asian options.

The time-$0$ price of such barrier options is calculated from

$$ \begin{align*} V_0 &= \exp(-rT)\P\!\left(\max_{t\le T} S_t \ge AS_0\right) = \exp(-rT)\P\!\left(\max_{t\le T} (W_t + \mu t) \ge a\right)\\ &= \exp(-rT)\P_{\mu}\!\left(\max_{t\le T} W_t \ge a\right) \end{align*} $$

where $\mu=r\sigma^{-1} - \sigma/2$ and $a = \sigma^{-1}\log A$. Now, by Cameron-Martin we know

$$ \begin{align*} \P_{\mu}\!\left(\max_{t\le T} W_t \ge a\right) &= \E_0[Z_T\cdot \mathbf{1}_{\{\max_{t\le T} W_t\ge a\}}] = \E_0[\exp(\mu W_T - \mu^2 T / 2)\cdot \mathbf{1}_{\{\max_{t\le T} W_t\ge a\}}] \\ &= \exp(- \mu^2 T / 2)\cdot \E_0[\exp(\mu W_T)\cdot \mathbf{1}_{\{\max_{t\le T} W_t\ge a\}}] \end{align*} $$

and by reflection principle we have

$$ \begin{align*} \E_0[\exp(\mu W_T)\cdot \mathbf{1}_{\{\max_{t\le T} W_t\ge a\}}] &= e^{\mu a}\int_0^{\infty} (e^{\mu y} + e^{-\mu y}) \P(W_T - a \in \d y) \\&= e^{\mu^2 T/2}\left[\Phi(\mu\sqrt{T} - a/\sqrt{T}) + e^{2\mu a}\Phi(-\mu\sqrt{T}-a/\sqrt{T})\right], \end{align*} $$

so that $\P_{\mu}\!\left(\max_{t\le T} W_t \ge a\right) = \Phi(\mu\sqrt{T} - a/\sqrt{T}) + e^{2\mu a}\Phi(-\mu\sqrt{T}-a/\sqrt{T})$.
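A quick Monte Carlo check of the resulting barrier price $V_0$ against this formula (a minimal sketch; discrete monitoring biases the simulated maximum slightly downward, and all parameter values are illustrative):

```python
import numpy as np
from scipy.stats import norm

r, sigma, T, A = 0.03, 0.2, 1.0, 1.2     # pays 1 if max_{t<=T} S_t >= A * S_0
mu = r / sigma - sigma / 2
a = np.log(A) / sigma

# Closed form derived above
closed = np.exp(-r * T) * (norm.cdf(mu * np.sqrt(T) - a / np.sqrt(T))
                           + np.exp(2 * mu * a) * norm.cdf(-mu * np.sqrt(T) - a / np.sqrt(T)))

# Monte Carlo: under the risk-neutral measure, log(S_t / S_0) / sigma = W_t + mu t
n_paths, n_steps = 100_000, 1_000
dt = T / n_steps
rng = np.random.default_rng(10)
W = np.zeros(n_paths)
running_max = np.zeros(n_paths)
for i in range(1, n_steps + 1):
    W += rng.normal(0.0, np.sqrt(dt), n_paths)
    np.maximum(running_max, W + mu * i * dt, out=running_max)
mc = np.exp(-r * T) * (running_max >= a).mean()

print(closed, mc)   # MC is slightly below the closed form due to discrete monitoring
```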

Exponential Process

The exponential process

$$ Z_t = \exp\!\left\{\int_0^t\! Y_s\d W_s - \frac{1}{2}\int_0^t\! Y_s^2\d s\right\} $$

is a positive MG given

$$ \E\!\left[\int_0^t\! Z_s^2Y_s^2\d s\right] < \infty. $$

Specifically, for constant $Y_t\equiv\theta$ the exponential martingale $Z_t = \exp(\theta W_t - \theta^2 t/2)$ satisfies the SDE $\d Z_t = \theta Z_t \d W_t$.

Girsanov’s Theorem

Assume that under the probability measure $\P$ the exponential process $Z_t(Y)$ is a MG and $W_t$ is a standard BM. Define the absolutely continuous probability measure $\Q$ on $\F_t$ with likelihood ratio $Z_t$, i.e. $(\d\Q/\d\P)_{\F_t} = Z_t$; then under $\Q$ the process

$$ W_t^* := W_t - \int_0^t\! Y_s\d s $$

is a standard BM. Girsanov’s Theorem shows that drift can be added or removed by change of measure.

Novikov’s Theorem

The exponential process

$$ Z_t = \exp\!\left\{\int_0^t\! Y_s \d W_s - \frac{1}{2}\!\int_0^t\! Y_s^2 \d s\right\} $$

is a MG given

$$ \E\left[\exp\!\left\{\frac{1}{2}\!\int_0^t\! Y_s^2\d s\right\}\right] < \infty. $$

This theorem gives another way to show whether an exponential process is a MG.

Standard BM to OU Process

Assume $W_t$ is a standard BM under $\P$, and define the likelihood ratio $Z_t = (\d\Q/\d\P)_{\F_t}$ as above with $Y_t = -\alpha W_t$; then by Girsanov, $W_t$ is an OU process under $\Q$.

Fundamental Principle of Statistical Mechanics

If a system can be in one of a collection of states $\{\omega_i\}_{i\in\mathcal{I}}$, the probability of finding it in a particular state $\omega_i$ is proportional to $\exp\{-H(\omega_i)/kT\}$ where $k$ is Boltzmann’s constant, $T$ is temperature and $H(\cdot)$ is energy.

Conditioned Brownian Motion

If $W_t$ is a standard BM with $W_0 = x \in (0, A)$, how does $W_t$ behave conditional on the event that it hits $A$ before $0$? Define $T:=\inf\{t\ge 0: W_t\in\{0,A\}\}$, and let $\Q^x$ denote $\P^x$ conditioned on the event $\{W_T=A\}$.

Then the likelihood ratios are

$$ \left(\frac{\d\Q^x}{\d\P^x}\right)_{\!\F_T} \!= \frac{\mathbf{1}_{\{W_T=A\}}}{\P^x\{W_T=A\}} \Rightarrow \left(\frac{\d\Q^x}{\d\P^x}\right)_{\!\F_{T\wedge t}} \!= \E\!\left[\left(\frac{\d\Q^x}{\d\P^x}\right)_{\!\F_T}\middle|\F_{T\wedge t}\right] = \frac{W_{T\wedge t}}{x}. $$

Notice

$$ \begin{align*} \frac{W_{T\wedge t}}{x} &= \exp\left\{\log W_{T\wedge t}\right\} / x \overset{\text{Itô}}{\eeq} \exp\left\{\log W_0 + \int_0^{T\wedge t}W_s^{-1}\d W_s - \frac{1}{2}\int_0^{T\wedge t} W_s^{-2}\d s\right\} / x \\&= \exp\left\{\int_0^{T\wedge t}W_s^{-1}\d W_s - \frac{1}{2}\int_0^{T\wedge t} W_s^{-2}\d s\right\} \end{align*} $$

which is a Girsanov likelihood ratio, so we conclude $W_t$ is a BM under $\Q^x$ with drift $W_t^{-1}$, or equivalently

$$ W_t^* = W_t - \int_0^{T\wedge t}W_s^{-1}\d s $$

is a standard BM with initial point $W_0^* = x$.

Lévy Process

A one-dimensional Lévy process is a continuous-time random process $\{X_t\}_{t\ge 0}$ with $X_0=0$ and stationary, independent increments. Lévy processes are defined to be a.s. right continuous with left limits.

Remark: Up to drift and scaling, Brownian motion is the only Lévy process with continuous paths.

First-Passage-Time Process

Let $B_t$ be a standard BM. Define the FPT process as $\tau_x = \inf\{t\ge 0: B_t \ge x\}$. Then $\{\tau_{x}\}_{x\ge 0}$ is a Lévy process called the one-sided stable-$1/2$ process. In particular, the sample path $x\mapsto \tau_x$ is non-decreasing in $x$. Lévy processes with non-decreasing paths are called subordinators.

Poisson Process

A Poisson process with rate (or intensity) $\lambda > 0$ is a Lévy process $N_t$ such that for any $t\ge 0$ the distribution of the random variable $N_t$ is the Poisson distribution with mean $\lambda t$. Thus, for any $k=0,1,2,\cdots$ we have $\P(N_t=k) = (\lambda t)^k\exp(-\lambda t)\ /\ k!$ for all $t > 0$.

Remark 1: (Superposition Theorem) If $N_t$ and $M_t$ are independent Poisson processes of rates $\lambda$ and $\mu$ respectively, then the superposition $N_t + M_t$ is a Poisson process of rate $\lambda+\mu$.

Remark 2: (Exponential Interval) Successive inter-arrival times are i.i.d. exponential r.v.s. with common mean $1/\lambda$.

Remark 3: (Thinning Property) Thinning a Poisson-$\lambda$ process with independent Bernoulli-$p$ marks yields a Poisson process with rate $\lambda p$.

Remark 4: (Compounding) Every compound Poisson process is a Lévy process. We call $\lambda F$ the Lévy measure, where $F$ is the compounding distribution.
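A minimal simulation of a Poisson process from i.i.d. exponential inter-arrival times (Remark 2), checking that $N_t$ has mean and variance $\lambda t$ (parameters illustrative):

```python
import numpy as np

lam, t, n_paths = 4.0, 3.0, 200_000
rng = np.random.default_rng(11)

max_jumps = int(lam * t * 5) + 20                              # generous cap on the number of jumps
gaps = rng.exponential(1.0 / lam, size=(n_paths, max_jumps))   # inter-arrival times, mean 1/lambda
arrivals = np.cumsum(gaps, axis=1)
N_t = (arrivals <= t).sum(axis=1)                              # number of arrivals up to time t

print(N_t.mean(), N_t.var(), lam * t)                          # mean and variance both ~ lambda * t
```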

MGF of Poisson

For $N\sim\text{Pois}(\lambda)$, we have $\MGF(\theta)=\exp[\lambda (e^{\theta}-1)]$.

For $X_t=\sum_{i=1}^{N_t}\!Y_i$ where $N_t\sim\text{Pois}(\lambda t)$ and the $Y_i$ are i.i.d., independent of $N_t$, with $\MGF_Y(\theta) = \psi(\theta) < \infty$, we have $\MGF_{X_t}(\theta)=\exp[\lambda t (\psi(\theta) - 1)]$.
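A quick Monte Carlo check of the compound-Poisson MGF with standard normal jumps, so that $\psi(\theta)=e^{\theta^2/2}$ (a minimal sketch; parameters illustrative):

```python
import numpy as np

lam, t, theta = 3.0, 2.0, 0.5
n_paths = 500_000
rng = np.random.default_rng(12)

N = rng.poisson(lam * t, n_paths)        # number of jumps by time t
X = rng.normal(0.0, np.sqrt(N))          # sum of N i.i.d. N(0,1) jumps ~ N(0, N)

mc = np.exp(theta * X).mean()
closed = np.exp(lam * t * (np.exp(theta ** 2 / 2) - 1))
print(mc, closed)                        # both ~ 2.2
```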

Law of Small Numbers

The Binomial-$(n,p_n)$ distribution, where $n\to\infty$ and $p_n\to 0$ s.t. $np_n\to\lambda > 0$, converges to the Poisson-$\lambda$ distribution.

Poisson-Exponential Martingale

If $N_t$ is a Poisson process with rate $\lambda$, then $Z_t=\exp[\theta N_t - (e^{\theta} - 1) \lambda t]$ is a martingale for any $\theta\in\R$.

Remark: Similar to Cameron-Martin, let $N_t$ be a Poisson process with rate $\lambda$ under $\P$, let $\Q$ be the measure s.t. the likelihood ratio $(\d\Q/\d\P)_{\F_t}=Z_t$ is defined as above, then $N_t$ under $\Q$ is a Poisson process with rate $\lambda e^{\theta}$.

Let $X_t$ be a compound Poisson process with Lévy measure $\lambda F$, and let the MGF of the compounding distribution $F$ be $\psi(\theta)$; then $Z_t=\exp[\theta X_t - (\psi(\theta) - 1)\lambda t]$ is a martingale for any $\theta\in\R$.

Vector Lévy Process

A $K$-dimensional Lévy process is a continuous-time random process $\{\mathbf{X}_t\}_{t\ge 0}$ with $\mathbf{X}_0=\bs{0}$ and stationary, independent increments. Like the one-dimensional version, vector Lévy processes are defined to be a.s. right continuous with left limits.

Remark: Given a non-random linear transform $F:\R^K\mapsto \R^M$ and a $K$-dimensional Lévy process $\{\mathbf{X}_t\}_{t\ge 0}$, the process $\{F(\mathbf{X}_t)\}_{t\ge 0}$ is a Lévy process on $\R^M$.