expectation

consider a discrete random variable $X$ that takes the values $x_1, x_2, \ldots$ with the probabilities $p_1, p_2, \ldots$ respectively. the expected value of $X$, denoted by $E[X]$, is defined as
\[ E[X] = \sum_i x_i p_i \]
such that the series is absolutely convergent. a more general formula, for a function $g$ of $X$:
\[ E[g(X)] = \sum_i g(x_i) p_i. \]
if $X$ is a constant random variable, i.e. it receives only one value $c$ with a probability of 1, then $E[X] = c$. if $c$ is constant, then for each random variable $X$ whose expected value is finite, $E[cX] = c \, E[X]$.
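a minimal numeric sketch of the discrete definition (the die values and probabilities below are my own example, not from the sources cited here):

#+begin_src python
# E[X] = sum_i x_i * p_i for a discrete random variable
# example: a fair six-sided die
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expected_value = sum(x * p for x, p in zip(values, probs))
print(expected_value)  # 3.5

# a constant random variable takes one value with probability 1, so E[X] = c
print(sum(x * p for x, p in zip([7], [1.0])))  # 7.0
#+end_src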
let $X$ and $Y$ be two random variables defined over the same probability space, then:
\[ E[X + Y] = E[X] + E[Y]. \]
let $X \sim \mathrm{Bin}(n, p)$; we look at $X$ as the number of successes from a sequence of $n$ independent trials, each succeeding with probability $p$, so $E[X] = np$.
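one way to see $E[X] = np$ (assuming the binomial setup above) is the standard indicator decomposition together with additivity of expectation:

\begin{align*}
X &= X_1 + \cdots + X_n, \quad \text{where } X_i = 1 \text{ if trial } i \text{ succeeds and } 0 \text{ otherwise}, \\
E[X_i] &= 1 \cdot p + 0 \cdot (1 - p) = p, \\
E[X] &= E[X_1] + \cdots + E[X_n] = np.
\end{align*}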
different definitions that mostly say the same thing:
if a variable quantity $X$ can take on the particular values $(x_1, \ldots, x_n)$ in $n$ mutually exclusive and exhaustive situations with the respective probabilities $(p_1, \ldots, p_n)$ assigned to them, then the quantity
\[ \langle X \rangle = E(X) = \sum_{i=1}^{n} p_i x_i \]
is called the expectation of $X$.
[cite:@jaynes_prob_2003]
let $X$ be a real-valued random variable. if $E[|X|] < \infty$, then $X$ is called integrable and we call
\[ E[X] := \int X \, dP \]
the expectation or mean of $X$. if $E[X] = 0$, then $X$ is called centered.
[cite:@klenke_prob_2020]
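for example, any integrable $X$ can be centered by subtracting its mean: by linearity and $E[c] = c$,
\[ E\bigl[X - E[X]\bigr] = E[X] - E[X] = 0, \]
so $X - E[X]$ is centered.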
the expected value of a random variable $X$, denoted by $E[X]$, is
\[ E[X] = \begin{cases} \int_{-\infty}^{\infty} x \, f_X(x) \, dx & \text{if } X \text{ is continuous,} \\ \sum_x x \, P(X = x) & \text{if } X \text{ is discrete,} \end{cases} \]
provided that the integral or sum exists. if $E[|X|] = \infty$, we say that $E[X]$ does not exist.
[cite:@berger_inference_2002]
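a small numeric check of the continuous case (a sketch; the exponential density and rate are my own choice, and it assumes scipy is available):

#+begin_src python
import numpy as np
from scipy.integrate import quad

# E[X] = integral of x * f_X(x) dx
# example: exponential distribution with rate lam, for which E[X] = 1 / lam
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # density on [0, inf)

expected_value, _ = quad(lambda x: x * f(x), 0, np.inf)
print(expected_value)  # ~0.5, i.e. 1 / lam
#+end_src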
if $a$ and $b$ are constants and $X$ is a random variable, then $E[aX + b] = a \, E[X] + b$. more generally, for constants $a_1, \ldots, a_n$ and random variables $X_1, \ldots, X_n$ with finite expectations:
\[ E\left[\sum_{i=1}^{n} a_i X_i\right] = \sum_{i=1}^{n} a_i \, E[X_i]. \]
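a quick sanity check of the linearity property on a small discrete distribution (the values, probabilities, and constants are arbitrary):

#+begin_src python
values = [0, 1, 2]
probs = [0.2, 0.5, 0.3]
a, b = 3.0, -1.0

# expectation of a discrete variable given its values (reusing the fixed probs)
E = lambda vs: sum(v * p for v, p in zip(vs, probs))

lhs = E([a * x + b for x in values])  # E[aX + b]
rhs = a * E(values) + b               # a * E[X] + b
print(lhs, rhs)                       # both 2.3
#+end_src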