How would a composite variable be strongly correlated with one variable but not the other?
I have two variables x1 and x2 which measure relatively similar things (r ~ 0.6), with x2 slightly larger than x1 on average. I then created a new variable x3 by subtracting the two: x3 = x1 - x2.
However, when I ran the Pearson correlations, x3 was strongly negatively correlated with x2, as expected (r ~ -0.6), but only weakly correlated with x1 (r ~ 0.1). How is this possible?
correlation
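For concreteness, here is a minimal synthetic sketch of the setup (Python/NumPy; the means and standard deviations are invented, not the asker's data) that reproduces roughly this pattern of correlations:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical stand-ins for x1 and x2: correlation ~0.6, with x2 a bit
# larger on average and somewhat more variable than x1 (assumed values).
mean = [10.0, 11.0]
sd1, sd2, rho = 1.0, 1.4, 0.6
cov = [[sd1**2, rho * sd1 * sd2],
       [rho * sd1 * sd2, sd2**2]]
x1, x2 = rng.multivariate_normal(mean, cov, size=n).T
x3 = x1 - x2

print(np.corrcoef(x1, x2)[0, 1])  # ~ 0.6
print(np.corrcoef(x3, x2)[0, 1])  # strongly negative, ~ -0.7
print(np.corrcoef(x3, x1)[0, 1])  # small, ~ 0.14
```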
A scatter plot matrix should help. – Nick Cox, Dec 11 '18 at 20:37
Possible duplicate of When A and B are positively related variables, can they have opposite effect on their outcome variable C? – sds, Dec 11 '18 at 21:07
I have a vague memory of an even closer duplicate but I can not find it. – Martijn Weterings, Dec 13 '18 at 14:07
4 Answers
Here's a simple example. Suppose $\varepsilon_1$ and $\varepsilon_2$ are independent standard normal random variables. Define $X_1 = \varepsilon_1$, $X_2 = X_1 + \varepsilon_2$, and $X_3 = X_1 - X_2$. The correlation of $X_1$ with $X_2$ is then $\tfrac{1}{\sqrt{2}} \approx 0.71$. Likewise, the correlation of $X_2$ with $X_3$ is $-\tfrac{1}{\sqrt{2}}$. But the correlation of $X_1$ with $X_3$ is the correlation of $\varepsilon_1$ with $\varepsilon_1 - (\varepsilon_1 + \varepsilon_2) = -\varepsilon_2$, which is 0 since the $\varepsilon_i$s are independent.
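A quick simulation (a Python/NumPy sketch, not part of the original answer) reproduces these three numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Independent standard normal noise terms, as in the example above.
eps1 = rng.standard_normal(n)
eps2 = rng.standard_normal(n)

x1 = eps1          # X1 = eps1
x2 = x1 + eps2     # X2 = X1 + eps2
x3 = x1 - x2       # X3 = X1 - X2 = -eps2

print(np.corrcoef(x1, x2)[0, 1])  # ~  0.71  ( 1/sqrt(2))
print(np.corrcoef(x2, x3)[0, 1])  # ~ -0.71  (-1/sqrt(2))
print(np.corrcoef(x1, x3)[0, 1])  # ~  0.00
```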
This happens by construction of $x_3$. Because $x_1$ and $x_2$ are closely related in terms of their Pearson correlation, subtracting one from the other cancels much of what they have in common, so the difference can end up only weakly correlated with $x_1$. The easiest way to see this is the extreme case of perfect agreement, i.e., $x_2 = x_1$: then $x_3 = x_1 - x_2 = 0$ is constant, so it retains none of the shared variation (strictly, its correlation with $x_1$ is then undefined; with a little noise it would be near $0$).
You can make the argument more formal using the definition of the Pearson correlation and looking at the covariation between $x_3$ and $x_1$: it will be reduced, and by how much depends on the correlation between $x_1$ and $x_2$, i.e., $r_{12}$, and on their standard deviations. Everything else being equal, the larger $r_{12}$, the smaller $r_{13}$.
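To see the "larger $r_{12}$, smaller $r_{13}$" point numerically, here is a sketch (Python/NumPy, assuming equal unit variances for $x_1$ and $x_2$, which is not stated in the answer) that sweeps $r_{12}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

for rho in (0.0, 0.3, 0.6, 0.9):
    # Bivariate normal (x1, x2) with unit variances and correlation rho.
    cov = np.array([[1.0, rho], [rho, 1.0]])
    x1, x2 = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    x3 = x1 - x2
    r13 = np.corrcoef(x1, x3)[0, 1]
    # With equal variances the exact value is sqrt((1 - rho) / 2).
    print(f"r12 = {rho:.1f}  ->  r13 ~ {r13:.3f} (exact {np.sqrt((1 - rho) / 2):.3f})")
```

As $r_{12}$ goes from 0 to 0.9, $r_{13}$ drops from about 0.71 to about 0.22.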
By "covariation", do you mean "covariance"? – Kodiologist, Dec 11 '18 at 16:53
@Kodiologist Are the two interchangeable? Or do they mean different things? – Cowthulhu, Dec 12 '18 at 14:44
@Cowthulhu "Covariance" has a specific definition in statistics, but I'm not familiar with the word "covariation". – Kodiologist, Dec 12 '18 at 14:50
@Kodiologist Gotcha, I had never heard "covariance" referred to as "covariation" either, so I was just wondering. Very new though, so don't take that as much of an indicator :). Thanks – Cowthulhu, Dec 12 '18 at 14:59
You can rewrite your equation $x_3 = x_1 - x_2$ as $x_2 = x_1 - x_3$. Then, regardless of what you pick as $x_1$ and $x_3$, $x_2$ will be correlated with both $x_1$ and $x_3$, but there is no reason to expect $x_1$ and $x_3$ to be correlated with each other. For instance, let $x_1$ = number of letters in the title of the Best Picture Oscar winner, $x_3$ = number of named hurricanes, and $x_2$ = number of letters in the title of the Best Picture Oscar winner minus the number of named hurricanes. Then $x_3 = x_1 - x_2$ holds exactly, but that doesn't mean $x_3$ will be correlated with $x_1$.
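A small simulation in the same spirit (Python/NumPy; the Poisson stand-ins are invented, not the actual Oscar-title or hurricane data) shows the construction at work:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Two independent series, stand-ins for the examples in the answer.
x1 = rng.poisson(12, n)   # e.g. letters in the Best Picture title
x3 = rng.poisson(15, n)   # e.g. number of named hurricanes
x2 = x1 - x3              # constructed so that x3 = x1 - x2 holds exactly

print(np.corrcoef(x2, x1)[0, 1])  # clearly positive (~ 0.67)
print(np.corrcoef(x2, x3)[0, 1])  # clearly negative (~ -0.75)
print(np.corrcoef(x1, x3)[0, 1])  # ~ 0: the identity forces no relation here
```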
Let $\operatorname{Var}(X_1) = \sigma_1^2$, $\operatorname{Var}(X_2) = \sigma_2^2$, and $\operatorname{Cov}(X_1,X_2) = \sigma_{12} = \rho\sigma_1\sigma_2$.
Then $\operatorname{Var}(X_3) = \operatorname{Var}(X_1 - X_2) = \sigma_1^2 + \sigma_2^2 - 2\sigma_{12}$,
$\operatorname{Cov}(X_1,X_3) = \sigma_1^2 - \sigma_{12}$,
$\operatorname{Cov}(X_2,X_3) = \sigma_{12} - \sigma_2^2$,
$\operatorname{Corr}(X_1,X_3) = \frac{\sigma_1^2 - \sigma_{12}}{\sqrt{\sigma_1^2(\sigma_1^2 + \sigma_2^2 - 2\sigma_{12})}}$,
$\operatorname{Corr}(X_2,X_3) = \frac{\sigma_{12} - \sigma_2^2}{\sqrt{\sigma_2^2(\sigma_1^2 + \sigma_2^2 - 2\sigma_{12})}}$.
So whether $|\operatorname{Corr}(X_1,X_3)|$ is smaller than, equal to, or larger than $|\operatorname{Corr}(X_2,X_3)|$ depends on $\sigma_1^2$ and $\sigma_2^2$; it cannot be determined from the correlation coefficient $\rho$ alone.
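Plugging in hypothetical numbers (a Python sketch; the standard deviations below are invented to roughly match the pattern in the question, with $\sigma_2 > \sigma_1$ as an assumption) shows how the two correlations can differ sharply even with the same $\rho$:

```python
import numpy as np

def corrs_with_difference(sigma1, sigma2, rho):
    """Corr(X1, X3) and Corr(X2, X3) for X3 = X1 - X2, via the formulas above."""
    s12 = rho * sigma1 * sigma2
    var3 = sigma1**2 + sigma2**2 - 2 * s12
    r13 = (sigma1**2 - s12) / np.sqrt(sigma1**2 * var3)
    r23 = (s12 - sigma2**2) / np.sqrt(sigma2**2 * var3)
    return r13, r23

# rho = 0.6 as in the question; sigma2 > sigma1 is assumed, not given.
# The means play no role: only the variances and rho enter the correlations.
print(corrs_with_difference(sigma1=1.0, sigma2=1.4, rho=0.6))  # ~ (0.14, -0.71)
```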