Appendix:Glossary of probability and statistics

The following is a glossary of terms related to probability and statistics.

Contents: A B C D E F G H I J K L M N O P Q R S T U V W X Y Z

A

atomic event
Another name for elementary event.
applied statistics
The application of statistical methods to practical problems in other fields, for example in engineering, economics, or software testing.

B

bias (of a sample)
A systematic tendency of a sample to be unrepresentative of the population from which it was drawn.
bias (of an estimator)
The difference between the expected value of an estimator and the true value of the quantity it estimates.
Bayesian probability
(or personal probability) An interpretation of probability as a degree of belief in a proposition, which is updated as new evidence becomes available.
Bayesian inference
Statistical inference in which Bayes' theorem is used to combine a prior probability with the likelihood of the observed data, yielding a posterior probability for a hypothesis or parameter.
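As a rough illustration of Bayesian updating, the sketch below applies Bayes' theorem to a diagnostic-test scenario with invented numbers (99% sensitivity, a 5% false-positive rate, 1% prior prevalence); it is only a sketch of the idea, not a prescribed method.

    # Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
    # All numbers below are hypothetical, chosen only to illustrate the update.
    prior = 0.01           # P(disease): assumed prevalence
    sensitivity = 0.99     # P(positive | disease)
    false_positive = 0.05  # P(positive | no disease)

    # Total probability of observing a positive test.
    p_positive = sensitivity * prior + false_positive * (1 - prior)

    # Posterior probability of disease given a positive test.
    posterior = sensitivity * prior / p_positive
    print(round(posterior, 3))  # about 0.167
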

C

categorical variable
A nominal variable.
conditional distribution
Given two jointly distributed random variables X and Y, the conditional probability distribution of Y given X (written "Y | X") is the probability distribution of Y when X is known to be a particular value.
conditional probability
the probability of some event A, given that event B has occurred. Conditional probability is written P(A|B), and is read "the probability of A, given B".
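For a finite sample space the definition can be applied by direct counting. The sketch below (a toy example, not part of the glossary proper) takes two fair dice, lets B be "the first die shows a 6" and A be "the sum is at least 10", and computes P(A|B) as the fraction of outcomes in B that also lie in A.

    from itertools import product

    # All 36 equally likely outcomes of rolling two fair dice.
    outcomes = list(product(range(1, 7), repeat=2))

    B = [o for o in outcomes if o[0] == 6]          # first die shows a 6
    A_and_B = [o for o in B if o[0] + o[1] >= 10]   # ...and the sum is at least 10

    # P(A|B) = P(A and B) / P(B), here a simple ratio of counts.
    print(len(A_and_B) / len(B))  # 0.5
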
completeness
A property of a statistic: a statistic is complete if no non-trivial function of it has an expected value of zero for every value of the parameter.
correlation coefficient
See correlation.
correlation
A numeric measure of the strength of the linear relationship between two random variables (one can use it to quantify, for example, how shoe size and height are related in a population). An example is the Pearson product-moment correlation coefficient, which is found by dividing the covariance of the two variables by the product of their standard deviations. Independent variables have a correlation of 0, although a correlation of 0 does not by itself imply independence. Synonym: correlation coefficient.
covariance
between two random variables X and Y, with expected values E(X) = μ and E(Y) = ν, is defined as the expected value of the random variable (X - μ)(Y - ν), and is written cov(X, Y) or σXY. It is used for measuring correlation.
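As a concrete sketch of both definitions, the snippet below computes the covariance of a small made-up paired data set and then the Pearson correlation coefficient by dividing it by the product of the standard deviations (population formulas; the data are invented).

    from math import sqrt

    # Small made-up paired data set (say, shoe size and height in cm).
    x = [38, 40, 42, 44, 46]
    y = [160, 168, 171, 178, 185]

    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n

    # Covariance: the average value of (X - mean_x) * (Y - mean_y).
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / n

    # Population standard deviations.
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x) / n)
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y) / n)

    # Pearson product-moment correlation coefficient.
    r = cov / (sd_x * sd_y)
    print(cov, round(r, 3))
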
credence
A subjective estimate of probability.

D

data set
A sample and the associated data points.
data point
A typed measurement - it can be a Boolean value, a real number, a vector (in which case it's also called a data vector), etc.
distribution function
The function describing the probability distribution of a random variable. For a continuous random variable this is the probability density function, which cannot be negative and whose integral over the whole range of values is equal to 1.

E

eclectic probability
?
efficiency
?
elementary event
(or atomic event) is an event with only one element. For example, when pulling a card out of a deck, "getting the jack of spades" is an elementary event, while "getting a king or an ace" is not.
estimator
A function of the known data that is used to estimate an unknown parameter; an estimate is the result of applying the function to a particular set of data. For example, the sample mean is an estimator of the population mean.
expected value
(or expectation) of a random variable is the sum of the probability of each possible outcome of the experiment multiplied by its payoff ("value"). Thus, it represents the average amount one "expects" to win per bet if bets with identical odds are repeated many times. For example, the expected value of a six-sided die roll is 3.5. The concept is similar to the mean. The expected value of random variable X is typically written E(X) or μ (mu).
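The die example can be checked directly from the definition; this small sketch sums outcome times probability using exact fractions.

    from fractions import Fraction

    # Expected value of a fair six-sided die: sum of outcome * probability.
    expected = sum(k * Fraction(1, 6) for k in range(1, 7))
    print(expected)  # 7/2, i.e. 3.5
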
experiment
Any procedure that can, at least in principle, be repeated and that has a well-defined set of possible outcomes (its sample space); rolling a die or drawing a card are examples.
event
A subset of the sample space, to which a probability can be assigned. For example, on rolling a die, "getting a five or a six" is an event (with a probability of one third if the die is fair).

F

frequency probability
An interpretation of probability that defines the probability of an event as the limit of its relative frequency in a large number of repeated trials.

G

generating function
A function, such as the probability-generating function or the moment-generating function, that encodes the distribution of a random variable so that probabilities or moments can be recovered from its derivatives.

I

independence or statistical independence
Two events are independent if the outcome of one does not affect that of the other (for example, getting a 1 on one die roll does not affect the probability of getting a 1 on a second roll). Similarly, when we assert that two random variables are independent, we intuitively mean that knowing something about the value of one of them does not yield any information about the value of the other.
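As a sketch of the definition for events, the snippet below enumerates two die rolls and checks that P(A and B) equals P(A) times P(B) when A is "the first roll is a 1" and B is "the second roll is a 1".

    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
    total = len(outcomes)

    p_A = sum(1 for o in outcomes if o[0] == 1) / total     # P(first roll is 1)
    p_B = sum(1 for o in outcomes if o[1] == 1) / total     # P(second roll is 1)
    p_AB = sum(1 for o in outcomes if o == (1, 1)) / total  # P(both are 1)

    # Independence: P(A and B) equals P(A) * P(B) (up to floating-point error).
    print(abs(p_AB - p_A * p_B) < 1e-12)  # True
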
interval variable
A variable with the features of an ordinal variable, such that equal differences between its values represent equivalent intervals.

J

joint distribution
Given two random variables X and Y, the joint distribution of X and Y is the probability distribution of X and Y together.
joint probability
the probability of two events occurring together. The joint probability of A and B is written P(A ∩ B) or P(A, B).

K

kurtosis
A measure of the "peakedness" of the probability distribution of a real-valued random variable. Higher kurtosis means more of the variance is due to infrequent extreme deviations, as opposed to frequent modestly-sized deviations.

L

likelihood function
(or just likelihood) is a conditional probability function considered as a function of its second argument, with its first argument held fixed. For example, imagine pulling a numbered ball with the number k from a bag of n balls, numbered 1 to n. Then you could describe a likelihood function for the random variable N as the probability of getting k given that there are n balls: the likelihood will be 1/n for n greater than or equal to k, and 0 for n smaller than k. Unlike a probability distribution function, this likelihood function will not sum to 1 over the sample space.
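The ball-and-bag example can be sketched directly: the likelihood of a candidate bag size n, given that ball number k was drawn, is 1/n for n greater than or equal to k and 0 otherwise, and summing it over candidate values of n does not give 1.

    def likelihood(n, k):
        """Likelihood of bag size n, given that ball number k was drawn."""
        return 1 / n if n >= k else 0.0

    k = 3  # the observed ball number (illustrative)
    values = [likelihood(n, k) for n in range(1, 11)]  # candidate bag sizes 1..10
    print(values)       # 0 for n < 3, then 1/3, 1/4, ..., 1/10
    print(sum(values))  # about 1.43, so this is not a probability distribution over n
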

M

marginal distribution
given two jointly distributed random variables X and Y, the marginal distribution of X is simply the probability distribution of X ignoring information about Y.
marginal probability
The probability of an event, ignoring any information about other events. The marginal probability of A is written P(A). Contrast with conditional probability.
mean
Of a random variable, its expected value. The mean (or sample mean) of a data set is the average of its values.
moment about the mean
(or central moment) the expected value of a specified power of the deviation of a random variable from its mean; the nth moment about the mean of a random variable X is E[(X - μ)^n]. The second moment about the mean is the variance.
mutual independence
A collection of events is mutually independent if for any subset of the collection, the joint probability of all events occurring is equal to the product of the joint probabilities of the individual events. Think of the result of a series of coin-flips. This is a stronger condition than pairwise independence.
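The standard illustration of the gap between the two notions is sketched below: take two fair coin flips and a third variable that records whether they agree; any two of the three corresponding events are independent, but the collection is not mutually independent.

    from itertools import product

    outcomes = list(product([0, 1], repeat=2))  # four equally likely pairs of flips

    def p(event):
        """Probability of an event given as a predicate over an outcome (x, y)."""
        return sum(1 for o in outcomes if event(o)) / len(outcomes)

    A = lambda o: o[0] == 1     # first flip is heads
    B = lambda o: o[1] == 1     # second flip is heads
    C = lambda o: o[0] == o[1]  # the two flips agree

    # Pairwise independence holds for every pair:
    print(p(lambda o: A(o) and B(o)) == p(A) * p(B))  # True
    print(p(lambda o: A(o) and C(o)) == p(A) * p(C))  # True
    print(p(lambda o: B(o) and C(o)) == p(B) * p(C))  # True

    # ...but the joint probability of all three is not the product of the three:
    print(p(lambda o: A(o) and B(o) and C(o)) == p(A) * p(B) * p(C))  # False
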
median
the middle value when the data are arranged in order of size; if there is an even number of values, the median is usually taken to be the average of the two middle values.
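A sketch of the procedure, including the usual convention for an even number of values:

    def median(values):
        """Middle value of the sorted data; average of the two middle values if the count is even."""
        ordered = sorted(values)
        n = len(ordered)
        mid = n // 2
        if n % 2 == 1:
            return ordered[mid]
        return (ordered[mid - 1] + ordered[mid]) / 2

    print(median([7, 1, 3]))      # 3
    print(median([7, 1, 3, 10]))  # 5.0, the average of 3 and 7
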

N

nominal variable
A variable with values whose order is insignificant.

O

ordinal variable
A variable with values whose order is significant.

P

pairwise independence
a pairwise independent collection of random variables is a set of random variables any two of which are independent.
parameter
A quantity, often unobserved, that indexes a family of probability distributions or describes a population; population parameter, distribution parameter, and unobserved parameter are largely interchangeable terms for it. Often written θ. See statistical parameter.
population or statistical population
is a set of entities about which statistical inferences are to be drawn, often based on random sampling. One can also talk about a population of measurements or values.
population parameter
See statistical parameter.
posterior probability
in Bayesian inference, the probability of a hypothesis or parameter value after the observed data have been taken into account; it is obtained from the prior probability via Bayes' theorem.
prior probability
in Bayesian inference, the probability assigned to a hypothesis or parameter value before the data are observed.
probability density
Is used to describe probability in a continuous probability distribution. For example, you can't say that the probability of a man being exactly six feet tall is 20%, but you can say that he has a 20% chance of being between five and six feet tall. Probability density is given by a probability density function. Contrast with probability mass.
probability density function
gives the probability density of a continuous random variable; the probability that the variable falls in a given interval is the integral of this function over that interval.
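As a numerical sketch of the height example under probability density above, assume (purely for illustration) that heights follow a normal distribution with mean 5.75 feet and standard deviation 0.25 feet; the probability of a height between five and six feet is then the integral of the density over that interval.

    from math import exp, pi, sqrt

    # Hypothetical normal model of height in feet; the parameters are invented.
    MU, SIGMA = 5.75, 0.25

    def density(x):
        """Normal probability density function with mean MU and standard deviation SIGMA."""
        return exp(-((x - MU) ** 2) / (2 * SIGMA ** 2)) / (SIGMA * sqrt(2 * pi))

    # Approximate the integral of the density over [5, 6] with a midpoint Riemann sum.
    steps = 10_000
    width = (6.0 - 5.0) / steps
    probability = sum(density(5.0 + (i + 0.5) * width) * width for i in range(steps))
    print(round(probability, 4))  # roughly 0.84 under these assumed parameters
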
probability distribution
A function that gives the probability of each element, or each measurable subset, of a given sample space.
probability measure
gives the probability of events in a probability space.
probability space
A sample space over which a probability measure has been defined.
probability theory
The branch of mathematics concerned with the analysis of random phenomena, including random variables, probability distributions, and stochastic processes.

R

random function
A function whose values are determined by chance; equivalently, a stochastic process viewed as a randomly chosen function of its argument.
random variable
A quantity whose values are random and to which a probability distribution is assigned, such as the possible outcomes of a roll of a die.
discrete random variable
A random variable that takes values in a finite or countably infinite set, such as the number of heads in a series of coin flips.
continuous random variable
A random variable that can take any value in a continuous range; its distribution is described by a probability density function.
random vector (or multivariate random variable)
a vector whose components are random variables on the same probability space.
ratio variable
A variable with the features of an interval variable for which the ratio of any two values is meaningful, making the operations of multiplication and division meaningful.

S

sample or statistical sample
That part of a population which is actually observed.
sample space
The set of possible outcomes of an experiment. For example, the sample space for rolling a six-sided die will be {1, 2, 3, 4, 5, 6}.
sampling
A process of selecting observations in order to obtain knowledge about a population. There are many methods of choosing the sample on which the observations are made.
sampling distribution
The probability distribution, under repeated sampling of the population, of a given statistic.
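A simulation sketch: repeatedly drawing samples of ten fair die rolls and recording the sample mean of each gives an empirical picture of the sampling distribution of the mean (the exact printed values depend on the random seed).

    import random
    from statistics import mean, stdev

    random.seed(0)  # fixed seed so the sketch is reproducible

    def sample_mean(sample_size):
        """Mean of one random sample of fair die rolls."""
        return mean(random.randint(1, 6) for _ in range(sample_size))

    # Each repetition of the sampling process yields one value of the statistic.
    means = [sample_mean(10) for _ in range(5_000)]

    # The distribution of these means is centred near 3.5, with a much smaller
    # spread than that of a single die roll.
    print(round(mean(means), 2), round(stdev(means), 2))
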
skewness
A measure of the asymmetry of the probability distribution of a real-valued random variable. Roughly speaking, a distribution has positive skew (right-skewed) if the higher tail is longer and negative skew (left-skewed) if the lower tail is longer (confusing the two is a common error).
standard deviation
The most commonly used measure of statistical dispersion. It is the square root of the variance, and is generally written σ (sigma).
standardized moment
A moment about the mean divided by the standard deviation raised to the same power, which makes it independent of scale; the third standardized moment is the skewness and the fourth is the kurtosis.
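A sketch that computes the second, third, and fourth moments about the mean of a small made-up sample and standardizes the latter two, giving population-style estimates of skewness and kurtosis.

    from math import sqrt

    data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up sample
    n = len(data)
    mu = sum(data) / n

    def central_moment(k):
        """k-th moment about the mean of the sample (dividing by n)."""
        return sum((x - mu) ** k for x in data) / n

    variance = central_moment(2)
    sigma = sqrt(variance)

    skewness = central_moment(3) / sigma ** 3  # third standardized moment
    kurtosis = central_moment(4) / sigma ** 4  # fourth standardized moment

    print(variance, round(skewness, 3), round(kurtosis, 3))
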
statistic
The result of applying a statistical algorithm to a data set. It can also be described as an observable random variable.
statistical inference
Inference about a population from a random sample drawn from it or, more generally, about a random process from its observed behavior during a finite period of time.
statistical dispersion
(also called statistical variability) is a measure of how spread out the values in a data set or distribution are. It can be expressed by the variance or the standard deviation.
statistical parameter
is a parameter that indexes a family of probability distributions.
sufficiency
A property of a statistic: a statistic is sufficient for a parameter if it contains all of the information about that parameter that is present in the sample.

V

variance
of a random variable is a measure of its statistical dispersion, indicating how far from the expected value its values typically are. The variance of random variable X is typically written var(X), σX², or simply σ².
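A short sketch using Python's standard library to compute the variance and standard deviation of a small made-up sample, in both the population form (dividing by n) and the sample form (dividing by n - 1).

    from statistics import pvariance, pstdev, variance, stdev

    data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up sample

    # Population variance and standard deviation (divide by n).
    print(pvariance(data), pstdev(data))  # 4 and 2.0 for this data

    # Sample variance and standard deviation (divide by n - 1).
    print(variance(data), round(stdev(data), 3))
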

See also