In general, one can align a 3D vector $\vec A$ to another 3D vector $\vec B$ by rotating $\vec A$ around the axis $\vec A \times \vec B$ by the angle $\arccos(\vec A \cdot \vec B)$.
But what I want to find is the rotation axis and angle to align two (what I call) 3D oriented vectors.
Each oriented vector consists of a pair of regular normalized 3D vectors, describing the head direction and the face direction. Think about a person: the head direction would be the vector from ground towards head, and the face direction would be the vector from head towards the tip of the nose (the two vectors are always perpendicular). For example: (all vectors are direction vectors located around the origin; they’re displaced here for better visualization)
^ (A's head)

+> (A's face)



(B's head) <+o

V (B's face)
If I rotate $\vec A$ around the axis perpendicular to the screen (as the above formula would, applied to the head vectors), their faces would not end up pointing in the same direction (they’d point in opposite directions). But there is an axis that will align both upon rotation. (For the above case, it should be the vector coplanar with the screen, at a 45-degree angle between vectors $\vec A$ and $\vec B$.)
Now, I think there is always such an axis that will align any 3D oriented vector $\vec A$ with any other 3D oriented vector $\vec B$ upon rotation. I want to know how to find that axis and the rotation angle, so as to build a rotation matrix or quaternion.
OK. Let’s say that $(v_1, v_2)$ is the first vector pair (your “oriented vector”). I’m going to extend that to $(v_1, v_2, v_1 \times v_2)$, a triple of mutually orthogonal unit vectors; in fact, the basis $(v_1, v_2, v_1 \times v_2)$ is positively oriented (i.e., it can be rotated to align with the standard unit vectors $e_1$, $e_2$, and $e_3$, in the $x$-, $y$-, and $z$-directions, respectively, in that order). Let $V$ be the matrix whose columns are $v_1$, $v_2$, and $v_1 \times v_2$. Then
$$
T_V : \mathbb R^3 \to \mathbb R^3 : v \mapsto Vv
$$
is a linear transformation taking $e_1$ to $v_1$, $e_2$ to $v_2$, and $e_3$ to $v_1 \times v_2$. Clear so far?
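In code, the construction of $V$ looks like this (a minimal NumPy sketch of my own; the example vectors are an arbitrary orthonormal pair):

```python
import numpy as np

v1 = np.array([0.0, 0.0, 1.0])   # head direction (unit length)
v2 = np.array([1.0, 0.0, 0.0])   # face direction, perpendicular to v1

# Columns of V are v1, v2, v1 x v2, so V maps e1 -> v1, e2 -> v2, e3 -> v1 x v2.
V = np.column_stack([v1, v2, np.cross(v1, v2)])

e1 = np.array([1.0, 0.0, 0.0])
print(np.allclose(V @ e1, v1))   # True: T_V takes e1 to v1
```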
Build a similar matrix $W$ for your other pair of vectors (extended by the cross product in the same way).
Now consider the transformation
$$
S : \mathbb R^3 \to \mathbb R^3 : v \mapsto W V^t v = W V^{-1} v .
$$
It takes the vector $v_1$ to $e_1$ (when you multiply by $V^{-1}$) and then to $w_1$ (when you multiply by $W$); similarly, it takes $v_2$ to $w_2$. And it also happens to take $v_1 \times v_2$ to $w_1 \times w_2$.
And it’s easy to build, because rather than computing $V^{-1}$, you can simply compute $V^t$: the columns of $V$ are orthonormal, which means that $V^t = V^{-1}$.
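Putting it together, here is a NumPy sketch (my own, with arbitrarily chosen example pairs) that builds $S = W V^t$ and then extracts the axis and angle the question asks for, using the standard trace/skew-part formulas for a rotation matrix. Note that last step is not spelled out above, and the axis formula degenerates when the angle is $0$ or $\pi$:

```python
import numpy as np

def frame(head, face):
    """Columns (head, face, head x face); head and face must be orthonormal."""
    return np.column_stack([head, face, np.cross(head, face)])

# Two example oriented vectors (hypothetical choices for illustration):
V = frame(np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0]))  # pair A
W = frame(np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0]))  # pair B

S = W @ V.T  # rotation taking (v1, v2) to (w1, w2); V.T equals V's inverse

# Standard axis/angle extraction from a rotation matrix; the skew part
# vanishes when the angle is 0 or pi, so the axis is undefined there.
angle = np.arccos(np.clip((np.trace(S) - 1.0) / 2.0, -1.0, 1.0))
axis = np.array([S[2, 1] - S[1, 2], S[0, 2] - S[2, 0], S[1, 0] - S[0, 1]])
axis /= np.linalg.norm(axis)

print(np.allclose(S @ np.array([0.0, 1.0, 0.0]), [0.0, 0.0, 1.0]))  # True: v1 -> w1
```

From the axis and angle you can then build a quaternion $(\cos\frac\theta2, \sin\frac\theta2\,\hat u)$ in the usual way, or just use $S$ itself as the rotation matrix.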