- published: 26 Jan 2013
- views: 183741
In statistics, dependence refers to any statistical relationship between two random variables or two sets of data. Correlation refers to any of a broad class of statistical relationships involving dependence.
Familiar examples of dependent phenomena include the correlation between the physical statures of parents and their offspring, and the correlation between the demand for a product and its price. Correlations are useful because they can indicate a predictive relationship that can be exploited in practice. For example, an electrical utility may produce less power on a mild day based on the correlation between electricity demand and weather. In this example there is a causal relationship, because extreme weather causes people to use more electricity for heating or cooling; however, statistical dependence is not sufficient to demonstrate the presence of such a causal relationship.
Formally, dependence refers to any situation in which random variables do not satisfy a mathematical condition of probabilistic independence. In loose usage, correlation can refer to any departure of two or more random variables from independence, but technically it refers to any of several more specialized types of relationship between mean values. There are several correlation coefficients, often denoted ρ or r, measuring the degree of correlation. The most common of these is the Pearson correlation coefficient, which is sensitive only to a linear relationship between two variables (which may exist even if one is a nonlinear function of the other). Other correlation coefficients have been developed to be more robust than the Pearson correlation – that is, more sensitive to nonlinear relationships.
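The Pearson coefficient described above can be computed directly from its definition (covariance divided by the product of the standard deviations). A minimal sketch in plain Python, where `pearson_r` is an illustrative helper name, not a standard library function:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfectly linear relationship gives r close to 1:
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))

# A purely nonlinear (quadratic, symmetric) relationship gives r close to 0,
# illustrating that Pearson's r detects only linear association:
print(pearson_r([-2, -1, 0, 1, 2], [4, 1, 0, 1, 4]))
```

The second call shows why "sensitive only to a linear relationship" matters: the variables are perfectly dependent (y = x²) yet their Pearson correlation is zero, which is one motivation for rank-based alternatives such as Spearman's coefficient.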
Karl Pearson FRS (27 March 1857 – 27 April 1936) was an influential English mathematician who is credited with establishing the discipline of mathematical statistics.
In 1911 he founded the world's first university statistics department at University College London. He was a proponent of eugenics, and a protégé and biographer of Sir Francis Galton.
A sesquicentenary conference was held in London on 23 March 2007, to celebrate the 150th anniversary of his birth.
Carl Pearson, later known as Karl Pearson (1857–1936), was born to William Pearson and Fanny Smith, who had three children: Arthur, Carl (Karl), and Amy. William Pearson also sired an illegitimate son, Frederick Mockett.
Pearson's mother, Fanny Pearson née Smith, came from a family of master mariners who sailed their own ships from Hull; his father read law at Edinburgh and was a successful barrister and Queen's Counsel (QC). The Pearson family came from the North Riding of Yorkshire.
"Carl Pearson" inadvertently became "Karl Pearson" when he enrolled at the University of Heidelberg in 1879, which changed the spelling. He used both variants of his name until 1884 when he finally adopted Karl — supposedly also after Karl Marx[citation needed], though some argue otherwise. Eventually he became universally known as "KP".
- Statistics 101: Understanding Correlation
- What Is Correlation?
- The Correlation Coefficient - Explained in Three Steps
- Correlation & Regression Video 1
- How Ice Cream Kills! Correlation vs. Causation
- Correlation Coefficient
- Correlation and causality | Statistical studies | Probability and Statistics | Khan Academy
- Linear Regression and Correlation - Introduction
- How to Calculate Pearson's Correlation Coefficient
- The danger of mixing up causality and correlation: Ionica Smeets at TEDxDelft
- Correlation Explanation with Demo
- Correlation analysis using Excel
- Phase Correlation Meter - Creating Tracks
- Spearman's Rank Correlation Coefficient : ExamSolutions Maths Revision