
Sketch 5

Cross product in $\mathbb{R}^3$ induces a linear functional


Huh...

the deeper ideas behind cross products, determinants, and those sorts of things have always eluded me. As I was proving Lemma 3.3 from Chapter 3 of Millman and Parker's book on differential geometry, I was trying to understand whether the notational abuse behind the cross product had any motivation or provided insight, and whether there is a basis-independent way to understand the cross product. That led me to the ideas in this sketch.

Date Started: March 14, 2024
Date Finished: March 14, 2024

Determinants are an indicator of linear independence

In my mind, there are two halves to linear algebra. I don't have an a priori name for the first half, so I'll call it "tensor arithmetic". I call it this because it has to do with simply tabulating data and performing arithmetical operations, almost as if we were working with sheets in Excel: we can add rows or columns, take dot products, and so on. The idea comes from the fact that the field operations on $\mathbb{R}$ induce componentwise operations on $\mathbb{R}^n$, and in general on "tensors", i.e. multidimensional arrays with entries in $\mathbb{R}$. The other half is what I think of as "linear algebra" proper, which I see as the more algebraic/geometric perspective: the study of linear maps and spaces, and their geometry.

Maybe this distinction is really a figment of my imagination, since after all the thing which I am calling "tensor arithmetic" really does feel like what we generally call algebra, but I think the distinction comes from what we care about. In tensor arithmetic, I care about the data and manipulations themselves, irrespective of any algebraic or geometric interpretation. Because of this, there is a fixed "basis", which is just the list of indicator rows with a one in each different position (the standard basis vectors). In the linear algebra perspective, I prefer basis-independent ideas. I would like to think of $\mathbb{R}^3$ as some abstract space or module which just has an origin $0$ specified. There is no basis. This alludes to the geometric picture, where I have notions like perpendicularity and parallelism which, in classical geometry, did not require us to choose some sort of basis.

Notice that determinants are an indicator of linear independence. I'll start with the tensor arithmetic perspective. If we have a list of vectors $\{v_1, \ldots, v_k\}$ in $\mathbb{R}^k$ (so that the matrix below is square), then we can form a matrix $M$ with this data by simply arranging the vectors as rows in a tabular format. It turns out that $\det M$ is zero exactly when some nontrivial linear combination of the vectors is zero, where a linear combination is essentially some way to scale and add the rows. This can be seen by looking recursively at the definition of the determinant, which we can understand through the Leibniz formula and the Laplace expansion. From a more algebraic or geometric perspective, the determinant is the product of the eigenvalues of a linear transformation (counted with multiplicity). As such, it is zero exactly when zero is an eigenvalue (since $\mathbb{R}$ is an integral domain), which happens exactly when the null space has nonzero dimension, so that the row space doesn't have maximal dimension (the rows are not linearly independent). From both perspectives, then, we can see that determinants are an indicator of linear independence.
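
To make this concrete, here is a minimal numerical sketch (using numpy; the vectors $u$ and $v$ are just made-up examples, not anything from the discussion above): a matrix whose rows are linearly dependent has determinant zero, up to floating point error, while one with independent rows does not.

```python
import numpy as np

# Stack three vectors as the rows of a matrix and test whether the determinant
# vanishes. The third row of A is a linear combination of the first two, so
# det(A) should be (numerically) zero; B has independent rows, so det(B) != 0.
u = np.array([1.0, 2.0, 3.0])
v = np.array([0.0, 1.0, -1.0])

A = np.vstack([u, v, 2 * u - 3 * v])    # dependent rows
B = np.vstack([u, v, [1.0, 0.0, 0.0]])  # independent rows

print(np.linalg.det(A))  # ~0 (up to floating point error)
print(np.linalg.det(B))  # nonzero (here -5.0)
```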

To clarify, I still feel that determinants provide more information than just an indicator of linear independence. For example, they also carry information about orientation, and about the amount of "dilation" a linear operator applies to volumes in a vector space. However, I am still coming to understand what determinants are, so maybe I'll understand them better in a future post.
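
As a small sketch of those two extra pieces of information (the matrices below are just illustrative choices of mine): the sign of the determinant records orientation, and its absolute value records how volumes dilate.

```python
import numpy as np

# The permutation matrix swapping e1 and e2 reverses orientation but preserves
# volume; scaling every axis by 2 preserves orientation and dilates volumes by 2^3.
swap = np.array([[0.0, 1.0, 0.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0]])
scale = 2.0 * np.eye(3)

print(np.linalg.det(swap))   # -1.0: orientation-reversing, volumes unchanged
print(np.linalg.det(scale))  #  8.0: orientation-preserving, volumes multiplied by 8
```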

Cross products motivate the construction of a linear functional

The cross product is generally defined as (I'll call this the tensor arithmetic definition) $$u \times v = (u_2 v_3 - u_3 v_2,\; u_3 v_1 - u_1 v_3,\; u_1 v_2 - u_2 v_1).$$ But this definition is not immediately obvious and can be hard to remember, so we sometimes give the following "memory tool", which is also notational abuse (I'll call this the linear algebraic definition and will make it rigorous below): $$u \times v = \det \begin{pmatrix} e_1 & e_2 & e_3 \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{pmatrix},$$ where we have performed notational abuse because $e_1$, $e_2$, and $e_3$ are vectors and not scalars. In a more tensor arithmetic sense, maybe I could argue that we don't really care what they are, as long as they have the right arithmetical properties.
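
As a quick check that the memory tool computes the right thing, here is a sketch (with arbitrary test vectors of my own choosing) that evaluates the tensor arithmetic definition directly and also carries out the cofactor expansion of the determinant along its symbolic first row, comparing both against numpy's built-in cross product.

```python
import numpy as np

def cross_components(u, v):
    """Tensor arithmetic definition, written out componentwise."""
    return np.array([u[1] * v[2] - u[2] * v[1],
                     u[2] * v[0] - u[0] * v[2],
                     u[0] * v[1] - u[1] * v[0]])

def cross_cofactors(u, v):
    """Cofactor expansion of det(e; u; v) along the symbolic first row.

    The coefficient of e_j is the 2x2 minor obtained by deleting column j
    of the two numerical rows, with alternating signs.
    """
    rows = np.vstack([u, v])
    return np.array([(-1) ** j * np.linalg.det(np.delete(rows, j, axis=1))
                     for j in range(3)])

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
print(cross_components(u, v))  # [-3.  6. -3.]
print(cross_cofactors(u, v))   # same, up to floating point error
print(np.cross(u, v))          # numpy agrees
```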

On the other hand, though, I think this "notational abuse" is actually more intuitive and deeper than the definition itself! Firstly, to avoid the notational abuse, notice that we can introduce the following linear functional on $\mathbb{R}^3$: $$\times_{u,v}(x) = \det \begin{pmatrix} x_1 & x_2 & x_3 \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{pmatrix} = \det \begin{pmatrix} x \\ u \\ v \end{pmatrix} = x \cdot (u \times v),$$ where $x \cdot (u \times v)$ is the inner product or dot product of $x$ with $u \times v$. From above, we know that $\times_{u,v}$ is an indicator which tells us when $\{x, u, v\}$ is a linearly independent collection. In our case, $\times_{u,v}(x) \neq 0$ if and only if $\{x, u, v\}$ is a linearly independent set, if and only if $\{x, u, v\}$ is a basis for $\mathbb{R}^3$. With this symbol, we can see some facts almost immediately:
$$u \times v = (\times_{u,v}(e_1), \times_{u,v}(e_2), \times_{u,v}(e_3)), \quad \text{i.e.} \quad (u \times v)_j = \times_{u,v}(e_j),$$
$$u \cdot (u \times v) = \times_{u,v}(u) = \det \begin{pmatrix} u_1 & u_2 & u_3 \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{pmatrix} = 0,$$
$$v \cdot (u \times v) = \times_{u,v}(v) = \det \begin{pmatrix} v_1 & v_2 & v_3 \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{pmatrix} = 0,$$
$$\times_{u,v}(u \times v) = \langle u \times v, u \times v \rangle = \|u \times v\|^2 \geq 0,$$
and when $u$ and $v$ are independent, $\times_{u,v}(x) = x \cdot (u \times v) = 0$ exactly when $x$ lies in the plane of $u$ and $v$ (this is illuminating because it shows how we can motivate the tensor arithmetic definition rigorously from our geometric insights).
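
Here is a small numerical sketch of the facts above (the particular $u$ and $v$ are arbitrary choices, not anything from the text): feeding the standard basis vectors to $\times_{u,v}$ recovers the components of $u \times v$, the functional vanishes on $u$ and $v$ themselves, and on $u \times v$ it returns $\|u \times v\|^2$.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.0, 2.0])

def phi(x):
    """The linear functional x |-> det(x; u; v)."""
    return np.linalg.det(np.vstack([x, u, v]))

basis = np.eye(3)          # rows are e1, e2, e3
uxv = np.cross(u, v)

print([phi(e) for e in basis])  # the three components of u x v
print(uxv)                      # same components, via np.cross
print(phi(u), phi(v))           # both ~0: repeated rows kill the determinant
print(phi(uxv), uxv @ uxv)      # phi(u x v) equals ||u x v||^2
```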

It is nice to see that we can actually recover the cross product from our linear functional! This is the first fact above. We also get another interesting idea: $u \times v$ is a solution to $\times_{u,v}(x) = \|x\|^2$. I wonder if this idea is particularly illuminating. As of now, I don't see how it is. What prevents us from doing anything nice quickly is that $\|x\|^2$ isn't linear. On the other hand, I think these ideas somewhat motivate higher-dimensional generalizations of the cross product. I'd also be curious to explore connections and constructions with polynomials that we might motivate from these more geometric constructions.
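
To illustrate what such a generalization might look like, here is a sketch of the standard "cross product" of $n-1$ vectors in $\mathbb{R}^n$ (this specific construction is my addition here, not something developed above): define $w$ by requiring $\langle x, w \rangle = \det(x; v_1; \ldots; v_{n-1})$ for all $x$, i.e. take the cofactor expansion along a symbolic first row exactly as in $\mathbb{R}^3$. The same repeated-row argument then shows $w$ is orthogonal to each $v_i$.

```python
import numpy as np

def generalized_cross(*vectors):
    """Given n-1 vectors in R^n, return w with <x, w> = det(x; v_1; ...; v_{n-1})."""
    vs = np.vstack(vectors)  # (n-1) x n matrix of the given vectors
    n = vs.shape[1]
    assert vs.shape[0] == n - 1, "need n-1 vectors in R^n"
    # Cofactor expansion along the symbolic first row, as in the R^3 case.
    return np.array([(-1) ** j * np.linalg.det(np.delete(vs, j, axis=1))
                     for j in range(n)])

# In R^4, the result is orthogonal to each of the three inputs, just as
# u x v is orthogonal to u and v.
a = np.array([1.0, 0.0, 2.0, -1.0])
b = np.array([0.0, 1.0, 1.0, 3.0])
c = np.array([2.0, -1.0, 0.0, 1.0])
w = generalized_cross(a, b, c)
print(w @ a, w @ b, w @ c)  # all ~0
```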

As usual, contact me at aathreyakadambi at gmail dot com if you'd like to discuss any of these ideas!

References

  1. "Determinant." Wikipedia, Wikimedia Foundation, 28 Feb. 2024, en.wikipedia.org/wiki/Determinant.
  2. "Laplace Expansion." Wikipedia, Wikimedia Foundation, 12 Feb. 2024, en.wikipedia.org/wiki/Laplace_expansion.
  3. "Leibniz Formula for Determinants." Wikipedia, Wikimedia Foundation, 12 Nov. 2023, en.wikipedia.org/wiki/Leibniz_formula_for_determinants.
  4. Millman, Richard S., and George D. Parker. Elements of Differential Geometry. Tan Chiang, 1984.
  5. "Rule of Sarrus." Wikipedia, Wikimedia Foundation, 11 Feb. 2024, en.wikipedia.org/wiki/Rule_of_Sarrus.