Vector visualization of multicollinearity

  • #1
Trollfaz
The general linear model is
$$y=a_0+\sum_{i=1}^{k} a_i x_i$$
In regression analysis one collects ##n## observations of ##y## at different values of the ##x_i##; we need ##n \gg k##, or the fit runs into many problems. For each regressor, and for the response ##y##, we stack the observations into vectors ##\textbf{x}_i## and ##\textbf{y}##, each a vector in ##\mathbb{R}^n##. Multicollinearity is the problem that there is significant correlation between the ##x_i##. In practice some degree of multicollinearity always exists. So does a complete absence of multicollinearity mean that all the ##\textbf{x}_i## are orthogonal to each other, i.e.
$$\textbf{x}_i\cdot\textbf{x}_j=0$$
for all ##i \neq j##? And does strong multicollinearity mean that one or more of the vectors makes a very small angle with the subspace spanned by the other vectors? As far as I know, perfect multicollinearity means ##\operatorname{rank}(X)<k##, where ##X## is the ##n \times k## matrix whose ##i##th column is ##\textbf{x}_i##.
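The geometric picture in the question can be checked numerically. Below is a minimal sketch using NumPy (the data, the helper name `angle_with_others`, and the near-collinear construction are all illustrative assumptions, not from the thread): it projects each column of ##X## onto the span of the other columns via least squares and reports the angle. A nearly collinear column makes a tiny angle with that subspace; an independent column makes a large one.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3

# Two nearly collinear regressors plus one independent one (illustrative data)
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)   # almost a copy of x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def angle_with_others(X, i):
    """Angle (degrees) between column i and the span of the remaining columns."""
    xi = X[:, i]
    others = np.delete(X, i, axis=1)
    # Least-squares projection of xi onto span(others)
    coef, *_ = np.linalg.lstsq(others, xi, rcond=None)
    proj = others @ coef
    cos_theta = xi @ proj / (np.linalg.norm(xi) * np.linalg.norm(proj))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

for i in range(k):
    print(f"column {i}: angle = {angle_with_others(X, i):.2f} degrees")
```

Columns 0 and 1 come out with angles near zero (strong multicollinearity), while the independent column 2 sits at a large angle to the span of the other two.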
 
  • #2
Perfect multicollinearity means that at least one predictor variable (column) is a perfect linear combination of one or more of the other variables. Typically the variables are the columns of the matrix and the observations are the rows. In this situation, the matrix will not be full rank.
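This rank-deficiency claim is easy to demonstrate. A short sketch (the data and coefficients are made up for illustration): build a column that is an exact linear combination of two others and check the rank with NumPy's SVD-based `matrix_rank`.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50

a = rng.normal(size=n)
b = rng.normal(size=n)
c = 2 * a - 3 * b          # perfect linear combination of a and b
X = np.column_stack([a, b, c])

print(np.linalg.matrix_rank(X))  # 2, not 3: X is rank-deficient
```

Because the third column adds no new direction in ##\mathbb{R}^n##, the rank is 2 rather than 3, which is exactly rank(X) < k.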
 
