
Central Limit Theorem and Lévy Stable Processes in Econophysics
Explore the Central Limit Theorem (CLT), which states that the sum of independent random variables tends towards a normal distribution, and delve into Lévy stable distributions as attracting fixed points for distributions. Learn how these concepts are applied in analyzing market returns and predicting future statistics in the field of Econophysics.
Presentation Transcript
Lecture 11: CLT and Lévy Stable Processes. John Rundle, Econophysics PHYS 255
Main Point The Gaussian (normal) distribution is an attracting fixed point in the space of distributions for distributions with finite mean and variance. More generally, the Lévy stable distributions discussed here are attracting fixed points under certain conditions. These ideas are applied to the distributions of market returns to determine whether the returns of single assets or portfolios are characterized by stable distributions. If the distributions are stable, we can postulate that future statistics will be characterized by the same distribution.
Central Limit Theorem https://en.wikipedia.org/wiki/Central_limit_theorem In probability theory, the Central Limit Theorem (CLT) establishes that, for the most commonly studied scenarios, when independent random variables are added, their sum tends toward a normal distribution (commonly known as a bell curve) even if the original variables themselves are not normally distributed. In more precise terms, given certain conditions, the arithmetic mean of a sufficiently large number of iterates of independent random variables, each with a well-defined (finite) expected value and finite variance, will be approximately normally distributed, regardless of the underlying distribution. The theorem is a key concept in probability theory because it implies that probabilistic and statistical methods that work for normal distributions can be applicable to many problems involving other types of distributions.
Central Limit Theorem https://en.wikipedia.org/wiki/Central_limit_theorem To illustrate the meaning of the theorem, suppose that a sample is obtained containing a large number of observations, each observation being randomly generated in a way that does not depend on the values of the other observations, and that the arithmetic average of the observed values is computed. If this procedure is performed many times, the central limit theorem says that the computed values of the average will be distributed according to the normal distribution (commonly known as a "bell curve"). A simple example: if one flips a fair coin many times, the number of heads in a series of flips will be approximately normally distributed, with mean equal to half the total number of flips in each series (Bernoulli trials).
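A minimal simulation of the coin-flip example, offered as an illustration rather than as part of the lecture; the sample sizes are arbitrary choices and NumPy is the only assumed dependency:

```python
# Illustrative CLT sketch: the number of heads in repeated series of
# fair-coin flips is approximately normal with mean n/2 and std sqrt(n)/2.
import numpy as np

rng = np.random.default_rng(0)
n_flips, n_series = 100, 50_000

# Number of heads in each series of 100 fair-coin flips (Bernoulli trials).
heads = rng.binomial(n=n_flips, p=0.5, size=n_series)

print("sample mean:", heads.mean())  # close to n_flips / 2 = 50
print("sample std :", heads.std())   # close to sqrt(n*p*(1-p)) = 5
# A histogram of `heads` is well approximated by a normal curve
# with mean 50 and standard deviation 5.
```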
Central Limit Theorem http://www.slideshare.net/ShakeelNouman1/sampling-and-sampling-distributions
Central Limit Theorem https://en.wikipedia.org/wiki/Central_limit_theorem In more general usage, a central limit theorem is any of a set of weak-convergence theorems in probability theory. They all express the fact that a sum of many independent and identically distributed (i.i.d.) random variables, or alternatively, random variables with specific types of dependence, will tend to be distributed according to one of a small set of attractor distributions. When the variance of the i.i.d. variables is finite, the attractor distribution is the normal distribution. In contrast, the sum of a number of i.i.d. random variables with power-law tail distributions decreasing as |x|^−(α+1) where 0 < α < 2 (and therefore having infinite variance) will tend to an alpha-stable distribution with stability parameter (or index of stability) α as the number of variables grows.
Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution A non-degenerate distribution is a stable distribution if it satisfies the following property: let X1 and X2 be independent copies of a random variable X. Then X is said to be stable if, for any constants a > 0 and b > 0, the random variable aX1 + bX2 has the same distribution as cX + d for some constants c > 0 and d. The distribution is said to be strictly stable if this holds with d = 0. Since the normal distribution, the Cauchy distribution, and the Lévy distribution all have this property, they are special cases of stable distributions. Such distributions form a four-parameter family of continuous probability distributions parametrized by location and scale parameters μ and c, respectively, and two shape parameters β and α, roughly corresponding to measures of asymmetry and concentration, respectively (see the figures on the next slide).
Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution Of the four parameters defining the family, most attention has been focused on the stability parameter α (see panel). Stable distributions have 0 < α ≤ 2, with the upper bound α = 2 corresponding to the normal distribution and α = 1 to the Cauchy distribution. The distributions have undefined variance for α < 2 and undefined mean for α ≤ 1. The importance of stable probability distributions is that they are "attractors" for properly normed sums of independent and identically distributed (i.i.d.) random variables.
Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution In probability theory, a distribution or a random variable is said to be stable if a linear combination of two independent copies of a random sample has the same distribution, up to location and scale parameters. The stable distribution family is also sometimes referred to as the Lévy alpha-stable distribution, after Paul Lévy, the first mathematician to have studied it.
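The defining property can be checked numerically. This sketch (an addition, not from the slides; NumPy assumed) uses the standard Cauchy distribution, for which taking a = b = 1 gives c = 2 and d = 0, so X1 + X2 should match 2X in distribution:

```python
# Numerical check of stability for the Cauchy distribution:
# for independent standard Cauchy X1, X2, the sum X1 + X2 has the
# same distribution as 2X (a Cauchy with scale 2).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

x1 = rng.standard_cauchy(n)
x2 = rng.standard_cauchy(n)
x = rng.standard_cauchy(n)

# Compare central quantiles (extreme tails are too noisy to compare).
q = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(x1 + x2, q))  # quantiles of the summed sample
print(np.quantile(2 * x, q))    # should closely match the line above
```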
Lévy Stable Processes - Phase Diagram https://en.wikipedia.org/wiki/Stable_distribution From: Martin Sewell, Characterization of Financial Time Series, UCL Research Note RN/11/01, January 20, 2011
Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution A generalized central limit theorem: Another important property of stable distributions is the role they play in a generalized central limit theorem. The central limit theorem states that the sum of a number of independent and identically distributed (i.i.d.) random variables with finite variances will tend to a normal distribution as the number of variables grows. A generalization due to Gnedenko and Kolmogorov states that the sum of a number of random variables with symmetric distributions having power-law tails (Pareto tails), decreasing as |x|^−(α+1) where 0 < α < 2 (and therefore having infinite variance), will tend to a stable distribution f(x; α, 0, c, 0) as the number of summands grows. If α > 2 then the sum converges to a stable distribution with stability parameter equal to 2, i.e. a Gaussian distribution.
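A sketch of this generalized CLT under stated assumptions (NumPy and SciPy available; the tail index α = 1.5 and sample sizes are arbitrary choices): normalized sums of symmetric Pareto-tailed variables take on the shape of a symmetric stable law, up to a scale constant that is not matched here.

```python
# Generalized CLT sketch: sums of symmetric power-law variables with
# tail index alpha = 1.5 (infinite variance), normalized by n**(1/alpha),
# approach a symmetric alpha-stable law rather than a Gaussian.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)
alpha = 1.5
n, m = 1000, 20_000  # n summands per sum, m independent sums

# Symmetric Pareto-tailed variables: P(|X| > x) ~ x**(-alpha).
x = rng.pareto(alpha, size=(m, n)) * rng.choice([-1.0, 1.0], size=(m, n))

# For 0 < alpha < 2 the right normalization is n**(1/alpha), not sqrt(n).
s = x.sum(axis=1) / n ** (1.0 / alpha)

# The shape matches a symmetric (beta = 0) stable law; the two quantile
# rows below agree up to a constant scale factor.
q = [0.25, 0.5, 0.75]
print(np.quantile(s, q))
print(levy_stable.ppf(q, alpha, 0.0))
```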
Comparison to Data http://finance.martinsewell.com/stylized-facts/distribution/ PDF of returns for the Shanghai market data with Δt = 1 (daily returns). "This plot is compared to a stable symmetric Lévy distribution using the value α = 1.44 determined from the slope [in a log-log plot of the central peak of the PDF as a function of the time increment]. Two attempts to fit a Gaussian are also shown. The wider Gaussian is chosen to have the same standard deviation as the empirical data. However, the peak in the data is much narrower and higher than this Gaussian, and the tails are fatter. The narrower Gaussian is chosen to fit the central portion; however, the standard deviation is now too small. It can be seen that the data has tails which are much fatter and furthermore have a non-Gaussian functional dependence." Johnson, Jefferies and Hui (2003)
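The comparison Johnson, Jefferies and Hui describe can be imitated on synthetic data. Everything below (the α = 1.44 stable sample standing in for the Shanghai returns, the IQR-matched Gaussian, the 5σ threshold) is an illustrative assumption, not their actual data or procedure; NumPy and SciPy are assumed.

```python
# Fat tails vs. a Gaussian fit: synthetic "returns" from a stable law
# with alpha = 1.44 have far more extreme events than a Gaussian
# matched to their central portion.
import numpy as np
from scipy.stats import levy_stable, norm

rng = np.random.default_rng(3)
returns = levy_stable.rvs(1.44, 0.0, size=50_000, random_state=rng)

# Gaussian matched to the interquartile range (the "narrower Gaussian").
iqr = np.subtract(*np.quantile(returns, [0.75, 0.25]))
sigma = iqr / (norm.ppf(0.75) - norm.ppf(0.25))

# Tail comparison: the stable sample shows many more 5-sigma events.
print("P(|r| > 5*sigma), stable sample:", np.mean(np.abs(returns) > 5 * sigma))
print("P(|r| > 5*sigma), Gaussian:     ", 2 * norm.sf(5.0))
```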
Fourier Transforms of Probability Distributions https://en.wikipedia.org/wiki/Fourier_transform Recall that the Fourier transform decomposes a function of time (a signal) into the frequencies that make it up, in a way similar to how a musical chord can be expressed as the frequencies (or pitches) of its constituent notes. The Fourier transform of a function of time itself is a complex-valued function of frequency, whose absolute value represents the amount of that frequency present in the original function, and whose complex argument is the phase offset of the basic sinusoid in that frequency. The Fourier transform is called the frequency domain representation of the original signal.
Fourier Transforms https://en.wikipedia.org/wiki/Fourier_transform The term Fourier transform refers to both the frequency domain representation and the mathematical operation that associates the frequency domain representation to a function of time. The Fourier transform is not limited to functions of time, but in order to have a unified language, the domain of the original function is commonly referred to as the time domain. For many functions of practical interest, one can define an operation that reverses this: the inverse Fourier transform of a frequency domain representation combines the contributions of all the different frequencies to recover the original function of time.
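A small numerical illustration of this round trip (an addition, assuming NumPy; the test signal and its two component frequencies are arbitrary choices):

```python
# FFT round trip: a signal built from two sinusoids is decomposed into
# peaks at the corresponding frequencies, and the inverse FFT recovers
# the original time-domain signal.
import numpy as np

fs = 1000                    # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)  # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

# The two dominant frequencies recovered from the spectrum.
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))  # -> [50.0, 120.0]

# The inverse transform reconstructs the original signal.
recovered = np.fft.irfft(spectrum, n=signal.size)
print(np.allclose(recovered, signal))  # -> True
```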
Fourier Transforms https://en.wikipedia.org/wiki/Fourier_transform
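Only the title of this formula slide survives; presumably it showed the transform pair in the convention of the cited Wikipedia article, reproduced here for reference. (The characteristic functions on the following slides use the probabilists' convention φ(t) = E[e^(itX)], without the 2π in the exponent.)

```latex
\hat{f}(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx ,
\qquad
f(x) = \int_{-\infty}^{\infty} \hat{f}(\xi)\, e^{2\pi i x \xi}\, d\xi .
```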
Example: Normal Distribution https://en.wikipedia.org/wiki/Normal_distribution
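The formulas on this slide did not survive extraction; presumably it showed the normal density and its characteristic function:

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\; e^{-\frac{(x-\mu)^2}{2\sigma^2}} ,
\qquad
\varphi(t) = \mathbb{E}\!\left[e^{itX}\right] = e^{\,i\mu t - \frac{1}{2}\sigma^2 t^2} .
```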
Example: Lorentz Distribution Characteristic Function https://en.wikipedia.org/wiki/Cauchy_distribution Recall:
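Presumably the slide recalled the Cauchy (Lorentz) density and its characteristic function, whose |t| dependence distinguishes it from the Gaussian case:

```latex
f(x; x_0, \gamma) = \frac{1}{\pi\gamma\left[1 + \left(\frac{x - x_0}{\gamma}\right)^{2}\right]} ,
\qquad
\varphi(t) = e^{\,i x_0 t - \gamma |t|} .
```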
Lévy Stable Processes https://en.wikipedia.org/wiki/Stable_distribution
Lévy Stable Processes: Fourier Transform https://en.wikipedia.org/wiki/Stable_distribution
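The formula this slide presumably displayed is the characteristic function that defines the stable family (in the parametrization used by the cited article); the normal and Cauchy cases above are recovered at α = 2 and at α = 1, β = 0:

```latex
\varphi(t;\alpha,\beta,c,\mu) =
\exp\!\Big( i t \mu \;-\; |c t|^{\alpha}
\big( 1 - i \beta \,\operatorname{sgn}(t)\, \Phi \big) \Big),
\qquad
\Phi =
\begin{cases}
\tan\!\left(\dfrac{\pi\alpha}{2}\right), & \alpha \neq 1, \\[6pt]
-\dfrac{2}{\pi} \log |t|, & \alpha = 1 .
\end{cases}
```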
Infinitely Divisible Processes https://en.wikipedia.org/wiki/Infinite_divisibility_(probability) A probability distribution is infinitely divisible if it can be expressed as the probability distribution of the sum of an arbitrary number of independent and identically distributed random variables. The characteristic function of any infinitely divisible distribution is then called an infinitely divisible characteristic function. More rigorously, the probability distribution F is infinitely divisible if, for every positive integer n, there exist n independent identically distributed random variables Xn1, ..., Xnn whose sum Sn = Xn1 + ... + Xnn has the distribution F.
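A numerical sketch of this definition for the Poisson case (an addition, assuming NumPy): a Poisson(λ) variable has the same law as the sum of n i.i.d. Poisson(λ/n) variables, for every n.

```python
# Infinite divisibility of the Poisson distribution: Poisson(lam) equals,
# in distribution, the sum of n independent Poisson(lam / n) variables.
import numpy as np

rng = np.random.default_rng(4)
lam, n, m = 3.0, 10, 100_000

direct = rng.poisson(lam, size=m)
summed = rng.poisson(lam / n, size=(m, n)).sum(axis=1)

# The two empirical distributions agree up to sampling noise.
for k in range(7):
    print(k, np.mean(direct == k), np.mean(summed == k))
```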
Infinitely Divisible Processes https://en.wikipedia.org/wiki/Infinite_divisibility_(probability) Every infinitely divisible probability distribution corresponds in a natural way to a Lévy process. A Lévy process is a stochastic process { Lt : t ≥ 0 } with stationary, independent increments. Stationary means that for s < t, the probability distribution of Lt − Ls depends only on t − s; independent increments means that the difference Lt − Ls is independent of the corresponding difference on any interval not overlapping with [s, t], and similarly for any finite number of mutually non-overlapping intervals.
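A Lévy process can be sampled directly from this definition by drawing stationary, independent increments. The sketch below (an illustration, assuming SciPy; α, the step size, and the horizon are arbitrary) uses α-stable increments, with α = 2 reducing to Brownian motion.

```python
# Sample path of an alpha-stable Levy process on a grid of step dt.
# Stationarity: each increment over dt has scale dt**(1/alpha).
# Independence: increments are drawn i.i.d.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(5)
alpha, dt, n_steps = 1.5, 0.01, 1000

increments = levy_stable.rvs(alpha, 0.0, scale=dt ** (1 / alpha),
                             size=n_steps, random_state=rng)
path = np.cumsum(increments)  # L_t at t = dt, 2*dt, ..., n_steps*dt

print(path[-1])  # position of the Levy flight at t = 10
```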
Infinitely Divisible Processes https://en.wikipedia.org/wiki/Infinite_divisibility_(probability) The Poisson distribution, the negative binomial distribution, and the Gamma distribution are examples of infinitely divisible distributions, as are the normal distribution, the Cauchy distribution, and all other members of the stable distribution family. The uniform distribution and the binomial distribution are not infinitely divisible, nor is any other distribution with bounded (finite) support. The Student's t-distribution is infinitely divisible, while the distribution of the reciprocal of a random variable having a Student's t-distribution is not.