# Random Variables

A random variable assigns a numerical value to each sample point in a sample space.


## Definitions

**X** is the name of the random variable.

**P(X = x)** is the probability that the random variable is equal to a chosen sample.

These definitions can be seen in practice with an **example**: the sample space generated by rolling 2 dice.

- The sample space of the two rolls is:

  - S = {(1,1),(1,2),(1,3),...,(6,4),(6,5),(6,6)}

- If the random variable is the sum of the values on the two dice, the set of possible values is:

  - S = {2,3,4,...,10,11,12}
- The random variable of this sample space is X. When X equals a value, it refers to the set of sample points that produce that value.

  - (X = 3) = {(1,2),(2,1)}

- The probability that X equals 3 corresponds to 2 of the 36 sample points in the sample space.

  - P(X = 3) = 2/36 = 1/18
- Greater than and less than can also be used. Here is another example: the probability of rolling a sum greater than or equal to 4. Notice how the probability can be found using the **complement**, which requires adding fewer probabilities.

- P(X >= 4)
- = 1 - P(X < 4)
- = 1 - (P(X = 2) + P(X = 3))
- = 1 - (1/36 + 2/36)
- = 11/12
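The dice probabilities above can be checked by enumerating the sample space directly; a minimal Python sketch (the helper `prob` is an illustrative name, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Sample space of two fair dice: 36 equally likely ordered pairs.
space = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event: favourable sample points over 36, as an exact fraction."""
    return Fraction(sum(1 for s in space if event(s)), len(space))

# X = sum of the two dice.
p3 = prob(lambda s: s[0] + s[1] == 3)             # P(X = 3)
p_ge4 = prob(lambda s: s[0] + s[1] >= 4)          # P(X >= 4), counted directly
p_ge4_comp = 1 - prob(lambda s: s[0] + s[1] < 4)  # P(X >= 4), via the complement

print(p3)          # 1/18
print(p_ge4)       # 11/12
print(p_ge4_comp)  # 11/12
```

Counting the complement only needs the sums 2 and 3, matching the working above.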

## Distributions

### Cumulative Distribution Function

The cumulative distribution function gives the accumulated probability for the random variable X. It is defined as F(x) = P(X <= x): at each value of x it gives the probability that X takes a value of x or smaller. F(x) increases from 0 towards 1, since the total probability is 1. The shape of the function depends on the mean and variance of the distribution. If the mean is zero, the graph is centred at zero, and it shifts left or right as the mean changes. A larger variance makes the spread greater, so the curve rises more gradually.
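As an illustration of these properties for the normal distribution, the CDF can be evaluated with the standard error function; `normal_cdf`, `mu`, and `sigma` are illustrative names, not from the notes:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """F(x) = P(X <= x) for a normal random variable, via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# At the mean, the CDF is exactly 0.5; shifting the mean shifts the whole curve.
print(normal_cdf(0.0))          # 0.5 (standard normal, centred at 0)
print(normal_cdf(2.0, mu=2.0))  # 0.5 (curve shifted right by the mean)

# A larger variance spreads probability out: less mass within one unit of the mean.
print(normal_cdf(1.0) - normal_cdf(-1.0))                        # ~0.683
print(normal_cdf(1.0, sigma=2.0) - normal_cdf(-1.0, sigma=2.0))  # ~0.383
```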

### Probability Density Function

The probability density function shows the probability at each point rather than the accumulated probability of that point and everything below it. For a discrete random variable it is defined as P(x) = P(X = x).

The term "expected value" is defined as the mean of a random variable X:

- E(X) = μ

It can be calculated by integrating x multiplied by the density function f(x):

- E(X) = ∫ x f(x) dx

Variance, denoted as σ^{2} or Var(X), can be calculated with the following formula:

- Var(X) = E(X^{2}) - E(X)^{2}

### Transformations

It is often required to scale, shift, add and subtract random variables. This gives a new random variable with a new expected value and variance. The following rules describe how a linear transformation affects them:

- E(aX + b) = aE(X) + b

- Var(aX + b) = a^{2}Var(X)
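Both rules can be verified numerically against the two-dice distribution; the choice a = 3, b = 2 is arbitrary:

```python
from fractions import Fraction
from itertools import product

# Exact pmf of X = sum of two fair dice.
pmf = {}
for i, j in product(range(1, 7), repeat=2):
    pmf[i + j] = pmf.get(i + j, 0) + Fraction(1, 36)

def expectation(f):
    """E(f(X)) computed directly from the pmf."""
    return sum(f(x) * p for x, p in pmf.items())

a, b = 3, 2
e_x = expectation(lambda x: x)
var_x = expectation(lambda x: x**2) - e_x**2

# Compute E and Var of Y = aX + b directly, then compare with the rules.
e_y = expectation(lambda x: a * x + b)
var_y = expectation(lambda x: (a * x + b) ** 2) - e_y**2

assert e_y == a * e_x + b     # E(aX + b) = aE(X) + b
assert var_y == a**2 * var_x  # Var(aX + b) = a^2 Var(X)
print(e_y, var_y)             # 23 105/2
```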

Example:

- Where W, X and Y are independent, normally distributed random variables, find the mean and variance of W, where:
- W = 3X + Y

- μ_{X} = 11, μ_{Y} = 9
- μ_{W} = 3μ_{X} + μ_{Y} = 3*11 + 9 = 42

- σ^{2}_{X} = 5, σ^{2}_{Y} = 7
- σ^{2}_{W} = 3^{2}σ^{2}_{X} + 1^{2}σ^{2}_{Y} = 9*5 + 7 = 52
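Applying the transformation rules directly gives the same arithmetic as a two-line check:

```python
# Independent X and Y with W = 3X + 1Y: means and variances combine linearly,
# with the variance picking up the squared coefficients.
mu_x, mu_y = 11, 9
var_x, var_y = 5, 7

mu_w = 3 * mu_x + 1 * mu_y           # 3*11 + 9 = 42
var_w = 3**2 * var_x + 1**2 * var_y  # 9*5 + 7 = 52

print(mu_w, var_w)  # 42 52
```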

## Standardisation

Standardisation transforms a normal distribution into the standard normal distribution, with mean 0 and variance 1. This allows us to use a standard table for **normal distributions**. The following formula is used:

- Z = (X - μ) / σ

where Z is the standardised random variable.
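A small worked sketch of standardisation (the values μ = 50, σ = 8 and the query point 60 are made up for illustration; `phi` plays the role of the standard table):

```python
import math

def standardise(x, mu, sigma):
    """Z = (X - mu) / sigma maps X onto a standard normal variable."""
    return (x - mu) / sigma

def phi(z):
    """Standard normal CDF -- the quantity a standard table lists."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# P(X <= 60) for X ~ N(50, 8^2): standardise, then look up Phi(z).
z = standardise(60, mu=50, sigma=8)
print(round(z, 2))       # 1.25
print(round(phi(z), 4))  # 0.8944
```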

## End

This is the end of this topic in Numerical Methods & Statistics.