Does non-zero correlation imply dependence?



We know that zero correlation does not imply independence. What I am interested in is whether non-zero correlation implies dependence — that is, if $\text{Corr}(X,Y) \neq 0$ for some random variables $X$ and $Y$, can we say in general that $f_{X,Y}(x,y) \neq f_X(x)f_Y(y)$?

Answer:



Yes, because

$$\text{Corr}(X,Y) \neq 0 \implies \text{Cov}(X,Y) \neq 0$$

$$\implies E(XY) - E(X)E(Y) \neq 0$$

$$\implies \int\!\!\int xy\,f_{X,Y}(x,y)\,dx\,dy - \int x f_X(x)\,dx \int y f_Y(y)\,dy \neq 0$$

$$\implies \int\!\!\int xy\,f_{X,Y}(x,y)\,dx\,dy - \int\!\!\int xy\,f_X(x)f_Y(y)\,dx\,dy \neq 0$$

$$\implies \int\!\!\int xy\left[f_{X,Y}(x,y) - f_X(x)f_Y(y)\right]\,dx\,dy \neq 0$$

which would be impossible if $f_{X,Y}(x,y) - f_X(x)f_Y(y) = 0,\;\forall\,\{x,y\}$. So

$$\text{Corr}(X,Y) \neq 0 \implies \exists\,\{x,y\} : f_{X,Y}(x,y) \neq f_X(x)f_Y(y)$$
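To make the implication concrete, here is a minimal numerical sketch (my own illustration, not part of the original answer) using a discrete pair with $Y = X$: the covariance is non-zero, and the joint pmf visibly differs from the product of the marginals, exactly as the derivation predicts.

```python
# X is a fair coin (0 or 1) and Y = X, so the pair is clearly dependent.
# We check that Cov(X, Y) != 0 and that the joint pmf differs from the
# product of the marginals at some point.

from itertools import product

# Joint pmf of (X, Y) with Y = X: mass 1/2 on (0, 0) and 1/2 on (1, 1).
joint = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}

# Marginals obtained by summing the joint pmf.
p_x = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
p_y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

e_x = sum(x * p for x, p in p_x.items())              # E[X] = 0.5
e_y = sum(y * p for y, p in p_y.items())              # E[Y] = 0.5
e_xy = sum(x * y * p for (x, y), p in joint.items())  # E[XY] = 0.5

cov = e_xy - e_x * e_y                                # 0.25, non-zero
print("Cov(X, Y) =", cov)

# Non-zero covariance forces the joint pmf to differ from the product
# of the marginals somewhere; here it differs at every point, e.g.
# joint[(0, 1)] = 0.0 while p_x[0] * p_y[1] = 0.25.
mismatch = [(x, y) for x, y in product((0, 1), repeat=2)
            if joint[(x, y)] != p_x[x] * p_y[y]]
print("points where joint != product of marginals:", mismatch)
```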

Question: what happens with random variables that have no densities?


Alecos, I have a dumb question. What does the fancy arrow mean in, e.g., line 1? I imagine something like "imply," but I'm uncertain.
Sycorax says Reinstate Monica

@user777 You mean $\implies$? Indeed, it means "implies".
Alecos Papadopoulos

A reason to use the implication arrow only in informal arguments: is the implication arrow left- or right-associative?
kasterma

`\implies` produces $\implies$, which looks better than `\Rightarrow`, which produces $\Rightarrow$.
Dilip Sarwate


Let $X$ and $Y$ denote random variables such that $E[X^2]$ and $E[Y^2]$ are finite. Then $E[XY]$, $E[X]$ and $E[Y]$ are all finite.

Restricting our attention to such random variables, let $A$ denote the statement that $X$ and $Y$ are independent random variables and $B$ the statement that $X$ and $Y$ are uncorrelated random variables, that is, $E[XY] = E[X]E[Y]$. Then we know that $A$ implies $B$, that is, independent random variables are uncorrelated random variables. (Indeed, one definition of independent random variables is that $E[g(X)h(Y)]$ equals $E[g(X)]E[h(Y)]$ for all measurable functions $g(\cdot)$ and $h(\cdot)$.) This is usually expressed as
$$A \implies B.$$
But $A \implies B$ is logically equivalent to $\neg B \implies \neg A$, that is,

correlated random variables are dependent random variables.
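As a small sketch (my own, under an assumed discrete setup) of the definition used above: when the joint pmf factors as the product of the marginals, $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$ for arbitrary functions $g$ and $h$, which in particular gives $E[XY] = E[X]E[Y]$.

```python
# For an independent discrete pair, E[g(X)h(Y)] = E[g(X)] E[h(Y)].
# The pmfs and the functions g, h below are arbitrary illustrative choices.

def expect(pmf, f):
    # Expectation of f under a pmf given as {value: probability}.
    return sum(f(v) * p for v, p in pmf.items())

p_x = {0: 0.3, 1: 0.7}
p_y = {-1: 0.5, 2: 0.5}

# Independence: the joint pmf is the product of the marginals.
joint = {(x, y): px * py for x, px in p_x.items() for y, py in p_y.items()}

g = lambda x: x ** 2 + 1
h = lambda y: 3 * y

lhs = sum(g(x) * h(y) * p for (x, y), p in joint.items())  # E[g(X)h(Y)]
rhs = expect(p_x, g) * expect(p_y, h)                      # E[g(X)] E[h(Y)]
print(lhs, rhs)  # the two sides agree
```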

If $E[XY]$, $E[X]$ or $E[Y]$ are not finite or do not exist, then it is not possible to say whether $X$ and $Y$ are uncorrelated or not in the classical meaning of uncorrelated random variables being those for which $E[XY] = E[X]E[Y]$. For example, $X$ and $Y$ could be independent Cauchy random variables (for which the mean does not exist). Are they uncorrelated random variables in the classical sense?
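A quick simulation sketch (my own illustration) of why the Cauchy case is problematic: since the mean does not exist, sample means over independent blocks never settle toward a common value, so the quantities in $E[XY] - E[X]E[Y]$ have no classical meaning.

```python
# Standard Cauchy draws via the inverse-CDF method: tan(pi * (U - 1/2)).
# With no finite mean, block averages keep jumping around instead of
# concentrating, unlike for any distribution with a finite mean.

import math
import random

random.seed(0)  # fixed seed so the run is reproducible

def cauchy_sample(n):
    return [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]

# Sample means over independent blocks of 10,000 draws each.
block_means = [sum(cauchy_sample(10_000)) / 10_000 for _ in range(5)]
print("Cauchy block means:", block_means)

spread = max(block_means) - min(block_means)
print("spread of block means:", spread)
```

For a distribution with a finite mean, the law of large numbers would make these block means cluster tightly; for the Cauchy distribution they do not.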


The nice thing about this answer is that it applies whether or not the random variables in question admit a density function, as opposed to other answers on this thread. This is because expectations can be defined with Stieltjes integrals using the CDF, with no mention of the density.
ahfoss


Here is a purely logical proof. If $A \implies B$, then necessarily $\neg B \implies \neg A$, as the two are equivalent. Thus if $\neg B$, then $\neg A$. Now replace $A$ with independence and $B$ with non-correlation.

Think about the statement "if a volcano erupts, there are going to be damages". Now think about a case where there are no damages. Clearly the volcano didn't erupt, or we would have a contradiction.

Similarly, consider the statement "if $X, Y$ are independent, then $X, Y$ are non-correlated". Now consider the case where $X, Y$ are correlated. Clearly they can't be independent, for if they were, they would also be non-correlated, a contradiction. Thus conclude dependence.
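The contrapositive step this answer relies on can be verified exhaustively (a tiny sketch of my own, not part of the answer):

```python
# Exhaustively verify that (A => B) is logically equivalent to
# (not B => not A) over all four truth-value combinations.

def implies(p, q):
    # Material implication: p => q is "not p or q".
    return (not p) or q

for a in (False, True):
    for b in (False, True):
        assert implies(a, b) == implies(not b, not a)

print("A => B is equivalent to (not B) => (not A) for all truth values")
```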


If you read my answer carefully, you will see that I too used the argument that you have made in your answer, namely that $A \implies B$ is the same as $\neg B \implies \neg A$.
Dilip Sarwate

@DilipSarwate Edited to reflect that.
Tony
Licensed under cc by-sa 3.0 with attribution required.