I was asked this question in an interview.
Let's say we have a correlation matrix of the form
$$R = \begin{pmatrix} 1 & 0.6 & 0.8 \\ 0.6 & 1 & \gamma \\ 0.8 & \gamma & 1 \end{pmatrix}$$
I was asked to find the value of $\gamma$, given this correlation matrix.
I thought I could do something with the eigenvalues, since they all have to be greater than or equal to 0 (the matrix is positive semidefinite), but I didn't think that approach would yield an answer. I was missing the trick.
Can you give a hint for solving it?
pearson-r
correlation-matrix
beginner
Answers:
We already know that $\gamma$ is bounded in $[-1, 1]$. A correlation matrix must be positive semidefinite, and hence its principal minors must be nonnegative.
Thus,
$$\det(R) = 1(1 - \gamma^2) - 0.6(0.6 - 0.8\gamma) + 0.8(0.6\gamma - 0.8) \ge 0$$
$$\Longrightarrow\; 0.96\gamma - \gamma^2 \ge 0 \;\Longrightarrow\; \gamma(\gamma - 0.96) \le 0 \;\Longrightarrow\; 0 \le \gamma \le 0.96.$$
(The $2 \times 2$ principal minors $1 - 0.6^2$ and $1 - 0.8^2$ are automatically nonnegative, so the determinant is the only binding condition.)
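A quick symbolic check of this determinant condition, a minimal sketch in Python with SymPy that assumes the matrix layout reconstructed in the question, recovers the same interval:

    import sympy as sp

    g = sp.symbols('gamma', real=True)
    R = sp.Matrix([[1, 0.6, 0.8],
                   [0.6, 1, g],
                   [0.8, g, 1]])
    # Expand the determinant (the only binding principal minor here).
    print(sp.expand(R.det()))                                 # roughly -1.0*gamma**2 + 0.96*gamma
    # Solve the PSD boundary condition det(R) >= 0 for gamma.
    print(sp.solve_univariate_inequality(R.det() >= 0, g))    # 0 <= gamma <= 0.96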
Here is a simpler (and perhaps more intuitive) solution:
Think of covariance as an inner product over an abstract vector space. Then the entries in the correlation matrix are $\cos\langle v_i, v_j \rangle$ for vectors $v_1, v_2, v_3$, where the angle bracket $\langle v_i, v_j \rangle$ denotes the angle between $v_i$ and $v_j$.
It is not hard to visualize that $\langle v_2, v_3 \rangle$ is bounded by $|\langle v_1, v_2 \rangle \pm \langle v_1, v_3 \rangle|$. The bound on its cosine ($\gamma$) is therefore $\cos[\langle v_1, v_2 \rangle \pm \langle v_1, v_3 \rangle]$. Basic trigonometry then gives $\gamma \in [0.6 \times 0.8 - 0.6 \times 0.8,\; 0.6 \times 0.8 + 0.6 \times 0.8] = [0, 0.96]$.
Edit: Note that the terms in the last line are really $\cos\langle v_1, v_2 \rangle \cos\langle v_1, v_3 \rangle \mp \sin\langle v_1, v_2 \rangle \sin\langle v_1, v_3 \rangle$; the second appearance of 0.6 and 0.8 is a coincidence, thanks to $0.6^2 + 0.8^2 = 1$.
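A quick numeric check of this bound (a minimal NumPy sketch; the angles are just the arccosines of the two given correlations):

    import numpy as np

    a = np.arccos(0.6)   # angle between v1 and v2
    b = np.arccos(0.8)   # angle between v1 and v3
    # The angle between v2 and v3 lies in [|a - b|, a + b], so its cosine gamma
    # lies between cos(a + b) and cos(a - b).
    print(np.cos(a + b), np.cos(a - b))   # approximately 0.0 and 0.96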
Here is what I meant in my initial comment to the answer, and what I feel @yangle may be talking about (although I did not follow / check their calculation).
Picture the three variables as unit vectors X, Y, Z: the two known correlations (0.6 and 0.8) fix the angles between Y and each of X and Z, while Z is free to spin around Y on a cone, and $\gamma = \operatorname{corr}(X, Z)$ is the cosine of the angle between X and Z.
As Z spins, two positions are extreme with respect to X, and both occur when Z falls into the plane of X and Y: one is between X and Y, and the other is on the opposite side of Y. At exactly these positions the configuration XYZ (the correlation matrix) is singular, and they give the minimal and maximal angle (hence correlation) that Z can attain with respect to X.
Picking the trigonometric formula for the cosine of a sum or difference of angles in a plane, we have
$$\gamma \in [\cos(\angle XY + \angle YZ),\; \cos(\angle XY - \angle YZ)] = [0.6 \times 0.8 - 0.8 \times 0.6,\; 0.6 \times 0.8 + 0.8 \times 0.6] = [0, 0.96].$$
This geometric view is just another (and, in the 3D case, a more specific and simpler) look at what @rightskewed expressed in algebraic terms (minors etc.).
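To connect the two views numerically: at the two extreme positions the configuration is singular, so the determinant of the full correlation matrix vanishes. A minimal NumPy check, assuming the matrix layout reconstructed in the question:

    import numpy as np

    def corr(gamma):
        return np.array([[1.0, 0.6, 0.8],
                         [0.6, 1.0, gamma],
                         [0.8, gamma, 1.0]])

    for gamma in (0.0, 0.5, 0.96):
        R = corr(gamma)
        print(gamma, np.linalg.det(R), np.linalg.matrix_rank(R))
    # At gamma = 0 and gamma = 0.96 the determinant is numerically zero and the
    # rank drops to 2; at gamma = 0.5 the matrix is strictly positive definite.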
Playing around with principal minors may be fine for 3 by 3 or maybe 4 by 4 problems, but it runs out of gas and numerical stability in higher dimensions.
For a single "free" parameter problem such as this, it's easy to see that the set of all values making the matrix psd will be a single interval (the psd cone is convex, so its preimage under this affine parametrization is convex, i.e., an interval). Therefore, it is sufficient to find the minimum and maximum such values. This can easily be accomplished by numerically solving a pair of linear SemiDefinite Programming (SDP) problems: minimize $\gamma$ and maximize $\gamma$, each subject to $R(\gamma) \succeq 0$, where $R(\gamma)$ is the correlation matrix with $\gamma$ in the unknown entries.
For example, these problems can be formulated and numerically solved using YALMIP under MATLAB.
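For illustration, the same pair of SDPs can be sketched in Python with CVXPY instead (this is my assumption of an equivalent formulation; the original suggestion is YALMIP under MATLAB, and any SDP modeling layer would do):

    import cvxpy as cp

    # Unknown 3x3 symmetric matrix constrained to match the known correlations.
    R = cp.Variable((3, 3), symmetric=True)
    gamma = R[1, 2]
    constraints = [
        R >> 0,                                    # positive semidefinite
        R[0, 0] == 1, R[1, 1] == 1, R[2, 2] == 1,  # unit diagonal
        R[0, 1] == 0.6, R[0, 2] == 0.8,            # known off-diagonal entries
    ]
    gmin = cp.Problem(cp.Minimize(gamma), constraints).solve()
    gmax = cp.Problem(cp.Maximize(gamma), constraints).solve()
    print(gmin, gmax)   # approximately 0.0 and 0.96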
Fast, easy, and reliable.
BTW, if the smarty pants interviewer asking the question doesn't know that SemiDefinite Programming, which is well-developed and has sophisticated and easy to use numerical optimizers for reliably solving practical problems, can be used to solve this problem, and many much more difficult variants, tell him/her that this is no longer 1870, and it's time to take advantage of modern computational developments.
Let us consider the following convex set
$$\mathcal{E} := \left\{ (x, y, z) \in \mathbb{R}^3 \;:\; \begin{pmatrix} 1 & x & y \\ x & 1 & z \\ y & z & 1 \end{pmatrix} \succeq 0 \right\}$$
which is a spectrahedron named the 3-dimensional elliptope.
Intersecting this elliptope with the planes defined by $x = 0.6$ and by $y = 0.8$, we obtain a line segment.
The boundary of the elliptope is a cubic surface defined by
$$\det \begin{pmatrix} 1 & x & y \\ x & 1 & z \\ y & z & 1 \end{pmatrix} = 1 - x^2 - y^2 - z^2 + 2xyz = 0.$$
If $x = 0.6$ and $y = 0.8$, then the cubic equation above boils down to the quadratic equation
$$z^2 - 0.96\, z = 0, \qquad \text{i.e.,} \quad z (z - 0.96) = 0.$$
Thus, the intersection of the elliptope with the two planes is the line segment parametrized by
$$(x, y, z) = (0.6,\; 0.8,\; 0.96\, t), \qquad t \in [0, 1],$$
so $\gamma = z$ ranges over $[0, 0.96]$.
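A short symbolic verification of the boundary cubic and its restriction to the two planes (a minimal SymPy sketch; the elliptope definition is the one written above):

    import sympy as sp

    x, y, z = sp.symbols('x y z', real=True)
    E = sp.Matrix([[1, x, y],
                   [x, 1, z],
                   [y, z, 1]])
    cubic = sp.expand(E.det())        # 2*x*y*z - x**2 - y**2 - z**2 + 1
    quad = cubic.subs({x: sp.Rational(3, 5), y: sp.Rational(4, 5)})
    print(sp.factor(quad))            # -z*(z - 24/25), so z lies between 0 and 24/25 = 0.96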
Every positive semi-definite matrix is a covariance matrix (and, when it has unit diagonal, a correlation matrix), and vice versa.
To see this, start with a positive semi-definite matrix $A$ and take its eigendecomposition (which exists by the spectral theorem, since $A$ is symmetric), $A = U D U^T$, where $U$ is a matrix of orthonormal eigenvectors and $D$ is a diagonal matrix with the eigenvalues on the diagonal. Then let $B = U D^{1/2} U^T$, where $D^{1/2}$ is a diagonal matrix with the square roots of the eigenvalues on the diagonal.
Then take a vector $x$ with i.i.d. mean-zero, variance-one entries, and note that $Bx$ also has mean zero and covariance (and, if $A$ has unit diagonal, correlation) matrix $B B^T = A$.
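A minimal NumPy sketch of this construction, using one admissible value of the interview matrix ($\gamma = 0.5$) as the target $A$ (that specific value is just an assumption for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    A = np.array([[1.0, 0.6, 0.8],
                  [0.6, 1.0, 0.5],
                  [0.8, 0.5, 1.0]])          # PSD, since 0 <= 0.5 <= 0.96
    w, U = np.linalg.eigh(A)                 # A = U D U^T
    B = U @ np.diag(np.sqrt(w)) @ U.T        # B = U D^{1/2} U^T
    x = rng.standard_normal((3, 200_000))    # i.i.d. mean-0, variance-1 entries
    print(np.allclose(B @ B.T, A))           # True
    print(np.cov(B @ x))                     # close to A, up to sampling error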
Now, to see that every correlation/covariance matrix is positive semi-definite is simple: let $R = E[x x^T]$ be a correlation matrix (for standardized $x$). Then $R = R^T$ is easy to see, and $a^T R a = E[(a^T x)^2] \ge 0$, so the Rayleigh quotient is non-negative for any non-zero $a$, and $R$ is positive semi-definite.
Now, noting that a symmetric matrix is positive semi-definite if and only if its eigenvalues are non-negative, we see that your original approach would work: compute the characteristic polynomial and look at its roots to see whether they are non-negative. Note that testing for positive definiteness is easy with Sylvester's criterion (as mentioned in another answer's comment: a matrix is positive definite if and only if its leading principal minors are all positive); there is an extension to the semidefinite case (all principal minors must be non-negative), but then you have to check $2^n$ minors, versus just $n$ for positive definiteness.
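Following that original approach on the interview matrix, a minimal NumPy sketch (again assuming the matrix layout reconstructed in the question) scans $\gamma$ and keeps the values where the smallest eigenvalue is non-negative:

    import numpy as np

    def min_eig(gamma):
        R = np.array([[1.0, 0.6, 0.8],
                      [0.6, 1.0, gamma],
                      [0.8, gamma, 1.0]])
        return np.linalg.eigvalsh(R).min()

    gammas = np.linspace(-1, 1, 2001)
    feasible = gammas[[min_eig(g) >= -1e-9 for g in gammas]]
    print(feasible.min(), feasible.max())   # approximately 0.0 and 0.96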