Articles of correlation

Finding the autocorrelation of a sine wave.

The autocorrelation of $\sin(t)$ is defined as $$\displaystyle \int_{-\infty}^{\infty} \sin(t+\tau)\sin(t)\,dt$$ I’ve tried using the Wiener–Khinchin theorem, which says that $$Corr(g,g)\Longleftrightarrow|G(f)|^2$$ I’ve tried inverting the squared Fourier transform of the sine wave, but don’t see any solution. I can’t find a derivation online for this, so I came here wondering if any of you could offer a […]
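A side note on why the raw integral is troublesome: for a periodic signal, the integral over all of $t$ diverges, and the usual workaround is the time-average (power-signal) form $R(\tau)=\lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T}\sin(t)\sin(t+\tau)\,dt$, which works out to $\tfrac{1}{2}\cos\tau$. A quick numerical sanity check of that value (an editor's sketch, not part of the original question):

```python
import numpy as np

# Time-average autocorrelation of sin(t):
#   R(tau) = lim_{T->inf} (1/2T) * integral_{-T}^{T} sin(t) sin(t+tau) dt
# The raw integral over all t diverges for a periodic signal, so we use a
# large finite T and compare against the expected value cos(tau)/2.
T, h = 10_000.0, 0.01
t = np.arange(-T, T, h)          # integration grid (rectangle rule)
for tau in (0.0, 0.5, 1.0):
    R = h * np.sum(np.sin(t) * np.sin(t + tau)) / (2 * T)
    print(tau, R, np.cos(tau) / 2)
```

The agreement improves as $T$ grows, since the boundary terms decay like $1/T$.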

How to increase the correlation?

I have three vectors of numbers with the same dimensionality, $A$, $B$, and $C$. What is the number $x$ that maximizes the correlation of $A$ and $B+xC$? To what extent can I increase the correlation? Thanks
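For anyone exploring this numerically, a brute-force sketch (with made-up data; `A`, `B`, `C` are just random stand-ins): scan candidate values of $x$ and keep the one giving the largest correlation. A closed form also exists by setting the derivative of the correlation with respect to $x$ to zero; the grid search is only illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=100)
B = A + rng.normal(size=100)   # partially correlated with A
C = rng.normal(size=100)

# Grid search over x: pick the x maximizing corr(A, B + x*C).
xs = np.linspace(-5, 5, 1001)
corrs = np.array([np.corrcoef(A, B + x * C)[0, 1] for x in xs])
best_x = xs[corrs.argmax()]
print(best_x, corrs.max())
```

Since $x=0$ is in the grid, the result is never worse than the plain correlation of $A$ and $B$.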

Wedge Product Formula For Sine. Analogous Formula Generalizing Cosine to Higher Dimensions?

So I was daydreaming about linear algebra today (in a class that had nothing to do with linear algebra) when I stumbled across an interesting relationship. I was thinking about how determinants are really the area spanned by column vectors, and I had the thought that one could measure linear independence (in $\mathbb{R}^2$ in […]

Find ratio / division between two numbers

I am reverse engineering custom software for a stepper motor. The original software eases in and out of any motion, and the duration of the ramp-up is directly related to the speed the motor is ramping up to. In other words, when the motor is ramping up to a high speed, […]

Bounds on off-diagonal entries of a correlation matrix

Assume that all the off-diagonal entries of an $n \times n$ correlation matrix are equal to $q$. Find upper and lower bounds on the possible values of $q$. I know that the matrix must be positive semidefinite, but how do I proceed to get the upper and lower bounds? Thanks!

Correlation between two linear sums of random variables

I understand how to create random variables with a prespecified correlational structure using a Cholesky decomposition. But I would like to be able to solve the inverse problem: given random variables $X_1, X_2, \dots, X_n$, and two different linear sums of those variables $V_1=a_{11}X_1+a_{12}X_2+\dots+a_{1n}X_n$ and $V_2=a_{21}X_1 + a_{22}X_2 +\dots+a_{2n}X_n$, I wish to calculate the […]
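The standard formula here is $\operatorname{Corr}(V_1,V_2)=\dfrac{a_1^\top \Sigma\, a_2}{\sqrt{(a_1^\top \Sigma\, a_1)(a_2^\top \Sigma\, a_2)}}$, where $\Sigma$ is the covariance matrix of the $X_i$. A sketch with hypothetical example values (the matrix and coefficient vectors below are invented for illustration):

```python
import numpy as np

# Hypothetical covariance matrix of X1..X3 and coefficient vectors a1, a2
Sigma = np.array([[1.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.0]])
a1 = np.array([1.0, 2.0, 0.5])
a2 = np.array([0.5, -1.0, 1.0])

# Corr(V1, V2) = a1' Sigma a2 / sqrt((a1' Sigma a1)(a2' Sigma a2))
corr = (a1 @ Sigma @ a2) / np.sqrt((a1 @ Sigma @ a1) * (a2 @ Sigma @ a2))
print(corr)
```

A simulation with the same $\Sigma$ reproduces this value up to sampling noise.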

How to prove inverse direction for correlation coefficient?

To show: if $|\operatorname{Cor}(X,Y)| = 1$, then there exist $a, b \in \mathbb{R}$ such that $Y = bX + a$. Any ideas or hints on how to proceed? Basically, I have to prove that if the absolute value of the correlation between two random variables is 1, then they must be linearly related. So far, $$ |\operatorname{Cor}(X, Y)| = 1 […]
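One standard hint (a sketch, assuming $X$ and $Y$ have finite nonzero variances): set $b = \operatorname{Cov}(X,Y)/\operatorname{Var}(X)$ and expand the variance of the residual $Y - bX$:

```latex
\operatorname{Var}(Y - bX)
  = \operatorname{Var}(Y) + b^2\operatorname{Var}(X) - 2b\operatorname{Cov}(X,Y)
  = \operatorname{Var}(Y)\left(1 - \operatorname{Cor}(X,Y)^2\right).
```

If $|\operatorname{Cor}(X,Y)| = 1$, this variance is zero, so $Y - bX$ is almost surely a constant $a$, giving $Y = bX + a$.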

Pearson correlation and metric properties

Assume that the data set has been $z$-standardized to zero mean and unit variance (and that it contains no constant vectors). Then Pearson’s $r$ reduces to the covariance: $$\rho(X,Y) := \frac{Cov(X,Y)}{\sigma(X)\sigma(Y)} = Cov(X,Y)$$ Now I’m investigating the dissimilarity function $$d(X,Y):=\sqrt{1 - \rho(X,Y)}$$ which is the square root of a common transformation of $\rho$ for use […]
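A quick empirical probe of the metric question (a sketch with random data, not a proof): check the triangle inequality for $d(X,Y)=\sqrt{1-\rho(X,Y)}$ on $z$-standardized vectors.

```python
import numpy as np

rng = np.random.default_rng(3)

def z(v):
    # z-standardize: zero mean, unit variance
    return (v - v.mean()) / v.std()

def d(x, y):
    # dissimilarity sqrt(1 - Pearson r); max() guards tiny rounding above 1
    return np.sqrt(max(1 - np.corrcoef(x, y)[0, 1], 0.0))

# Monte-Carlo triangle-inequality check on random standardized vectors
for _ in range(500):
    X, Y, Z = (z(rng.normal(size=30)) for _ in range(3))
    assert d(X, Z) <= d(X, Y) + d(Y, Z) + 1e-12
print("triangle inequality held in all trials")
```

Note that for standardized length-$n$ vectors, $\lVert x-y\rVert^2 = 2n(1-\rho)$, so $d$ is a rescaled Euclidean distance, which is why the check never fails.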

Is correlation (in some sense) transitive?

If we know that $A$ has some correlation with $B$ ($\rho_{AB}$), and that $B$ has some correlation with $C$ ($\rho_{BC}$), is there anything we can say about the correlation between $A$ and $C$ ($\rho_{AC}$)? Thanks.
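One known partial answer (stated here as a sketch): positive semidefiniteness of the $3\times 3$ correlation matrix forces $\rho_{AB}\rho_{BC} - \sqrt{(1-\rho_{AB}^2)(1-\rho_{BC}^2)} \le \rho_{AC} \le \rho_{AB}\rho_{BC} + \sqrt{(1-\rho_{AB}^2)(1-\rho_{BC}^2)}$, so correlation is only "transitive" in a useful sense when $\rho_{AB}$ and $\rho_{BC}$ are both large. A Monte-Carlo check with synthetic data:

```python
import numpy as np

# For sample correlation matrices (which are always PSD), rho_AC must lie
# within the interval rho_AB*rho_BC -/+ sqrt((1-rho_AB^2)(1-rho_BC^2)).
rng = np.random.default_rng(1)
for _ in range(200):
    data = rng.normal(size=(3, 50))
    data[1] += data[0]          # induce some correlation B ~ A
    data[2] += data[1]          # and C ~ B
    r = np.corrcoef(data)
    r_ab, r_bc, r_ac = r[0, 1], r[1, 2], r[0, 2]
    slack = np.sqrt((1 - r_ab**2) * (1 - r_bc**2))
    assert r_ab * r_bc - slack - 1e-12 <= r_ac <= r_ab * r_bc + slack + 1e-12
print("bound held in all trials")
```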

Determining variance from sum of two random correlated variables

I understand that the variance of the sum of two independent normally distributed random variables is the sum of the variances, but how does this change when the two random variables are correlated?
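The general identity is $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X,Y)$, which reduces to the sum of variances when the covariance is zero. An empirical illustration with example values (the covariance matrix below is invented for the demo):

```python
import numpy as np

# Var(X + Y) = Var(X) + Var(Y) + 2*Cov(X, Y) for correlated X, Y.
rng = np.random.default_rng(42)
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])        # example covariance structure
X, Y = rng.multivariate_normal([0.0, 0.0], Sigma, size=100_000).T

C = np.cov(X, Y)                       # sample covariance matrix (ddof=1)
lhs = np.var(X + Y, ddof=1)
rhs = C[0, 0] + C[1, 1] + 2 * C[0, 1]  # Var(X) + Var(Y) + 2 Cov(X, Y)
print(lhs, rhs)                        # equal up to floating-point rounding
```

The two sample quantities agree exactly as an algebraic identity, and both are close to the theoretical value $2 + 1 + 2\cdot 0.8 = 4.6$ up to sampling noise.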