## Lecture 18 (24.11.2015)

Definition 4.2: Let $I$ be a finite index set, and $R,\overline{R}\in M_{I^2}$. The partition function $Z(D)$ of an oriented diagram $D$ is defined as the number represented by $D$ when $D$ is translated into graphical matrix notation, replacing positive crossings by matrix elements of $R$, and negative crossings by matrix elements of $\bar R$, as in

Remarks:

• The partition function can be viewed as a sum over colourings, but with a notion of colouring that is different to the one that we used in Chapter 2. Namely, we call a colouring of edges of a diagram $D$ an assignment of colours (or abstract colours, i.e. elements of our index set $I$) to the edges of the diagram. At each crossing, we have two strands (one over and one under), and three arcs (the overstrand is one arc, and the understrand is split into two arcs at the crossing), but four edges. These edges are defined by taking the two underarcs as edges, and splitting the overarc also into two pieces. Thus a colouring of edges looks like
at a crossing. With this understanding of colouring, we have the symbolic formula
$$Z(D)=\sum_{{\rm colourings}}\prod_{{\rm crossings}} \left( R \;{\rm or}\;\overline{R}\right).$$
• To get some training in computing partition functions, see exercise 14.
• From the definition of $Z$, it is clear that it depends on the choice of index set $I$ as well as the choice of tensors $R,\bar R$. When I want to emphasize this dependence, I write something like $Z_{I,R,\bar R}$ instead of $Z$.
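To see the symbolic formula in action, here is a small numerical sketch in Python with numpy. Everything in it is a made-up illustration: we take $I=\{0,1\}$, a random tensor $R\in M_{I^2}$, and a hypothetical closed diagram with two positive crossings whose matrix translation is the $M_{I^2}$-trace of $R\cdot R$. The brute-force sum over edge colourings then reproduces the trace.

```python
import numpy as np

n = 2                       # |I|; the index set is I = {0, 1}
rng = np.random.default_rng(0)
R = rng.standard_normal((n, n, n, n))   # R[a,b,c,d] stands for R^{ab}_{cd}

# Hypothetical closed diagram: two positive crossings stacked on two
# strands and closed up, translating to the M_{I^2}-trace of R.R.
# Brute-force "sum over colourings, product over crossings":
Z_colourings = sum(
    R[a, b, c, d] * R[c, d, a, b]
    for a in range(n) for b in range(n)
    for c in range(n) for d in range(n)
)

# The same number via the graphical matrix-notation translation:
Z_trace = np.einsum('abcd,cdab->', R, R)

assert np.isclose(Z_colourings, Z_trace)
```

The point is only that "sum over colourings, product over crossings" and "translate into matrix notation and contract" are two descriptions of the same number.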

From the definition, it is not clear if $Z$ has interesting invariance properties – in fact, this will be the case only if $I, R, \bar R$ are chosen appropriately. In the following, we want to find choices of $I, R, \bar R$ that turn $Z$ into an invariant. We therefore have to consider the behaviour of the partition function under the Reidemeister moves, and begin with move II,

Since we are dealing with oriented diagrams here, we have to distinguish two versions of this move: One where both strands point in the same direction, and one where they point in opposite directions. In the first case, the left hand side of the type II move can be translated into graphical matrix notation as
and the right hand side as

If we demand invariance under this type II move, we must therefore have $(R\bar R)^{ab}_{cd}=\delta^a_c\delta^b_d$ for all $a,b,c,d\in I$, which is equivalent to $R{\bar R}=1$, with $1\in M_{I^2}$ the identity tensor. In other words, $\bar R$ must be the inverse of $R$.
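This condition can be spot-checked numerically for the simplest nontrivial choice, the "swap" tensor $R^{ab}_{cd}=\delta^a_d\delta^b_c$, which is its own inverse. A sketch (the choice $|I|=3$ is arbitrary):

```python
import numpy as np

n = 3                                     # |I|, chosen arbitrarily
eye = np.eye(n)
one = np.einsum('ac,bd->abcd', eye, eye)  # 1^{ab}_{cd} = delta^a_c delta^b_d
R = np.einsum('ad,bc->abcd', eye, eye)    # swap: R^{ab}_{cd} = delta^a_d delta^b_c
Rbar = R                                  # the swap is its own inverse

# (R Rbar)^{ab}_{cd} = sum_{i,j} R^{ab}_{ij} Rbar^{ij}_{cd}
RRbar = np.einsum('abij,ijcd->abcd', R, Rbar)
assert np.allclose(RRbar, one)            # condition R Rbar = 1
```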

We now consider the other oriented type II move, where the strands have opposite orientations. Here the left hand side translates to
The two inner lines represent sums over two indices, say $i$ and $j$, so that this partial diagram can be written as $$\sum_{i,j}{\bar R}^{bi}_{aj}{R}^{cj}_{di},$$ which does not simplify to a matrix product as in the first case.

The right hand side of the type II move is
which still translates to $\delta^a_c\delta^b_d$. Hence we get as a second condition $$\sum_{i,j}{\bar R}^{bi}_{aj}{R}^{cj}_{di}=\delta^a_c\delta^b_d,$$which must again hold for all $a,b,c,d\in I$.

We now turn to the type III move,
For our purposes it will be convenient to rotate this picture. We first look at the situation with all arrows pointing down:

As for the type II move, we could now go ahead and translate into graphical matrix notation to extract another condition on $R$. This condition will however look simpler when we first make some changes to the above two diagrams on the left and right hand side.

So let us add a type II move on the left two strands at the top of the left hand side diagram, and another type II move on the right two strands at the bottom of the right hand side diagram. Then the two diagrams above turn into

The two blue boxes indicate the areas in which the two type II moves were inserted.

If we demand our first two equations on $R$ and $\bar R$, implying invariance of the partition function under both oriented type II moves, then the two diagrams just drawn will give the same (local contribution to the) partition function if and only if the partition function does not change under the type III move initially considered.

We now focus attention on the middle part of these two diagrams, as emphasized here by the green box:

Note that the parts of the diagrams that are not covered by the green rectangle are identical on the left and right hand sides. Hence we will have invariance under the initially considered type III move if and only if the parts inside the green rectangle are identical. This inner part of the diagrams has a beautiful representation in terms of tensors, which we derive now — making use of our graphical notation for tensor products, matrix products, and identities.

For the left hand side, we get

and for the right hand side, we get

Since both sides must be the same for invariance under this type III move, we find the condition

$$(R\otimes 1)(1\otimes R)(R\otimes 1)=(1\otimes R)(R\otimes 1)(1\otimes R).$$

This famous equation is known as the Yang-Baxter equation in honour of its independent discoverers Yang and Baxter, who encountered it in theoretical physics.

It has connections to many areas in mathematics and physics, such as statistical mechanics, the braid group, Hopf algebras/quantum groups, permutation symmetries, quantum field theory, and more.

Note that by taking the inverse of both sides, we see that if $R$ satisfies the Yang-Baxter equation (YBE), then so does ${\bar R}=R^{-1}$.
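The simplest solution of the YBE is again the swap tensor $R^{ab}_{cd}=\delta^a_d\delta^b_c$: as an operator it just exchanges the two strands, so both sides of the equation reduce to the braid relation for transpositions. A numerical sketch, with the convention that a tensor in $M_{I^3}$ is stored as `T[a,b,c,d,e,f]` $=T^{abc}_{def}$:

```python
import numpy as np

n = 3
eye = np.eye(n)
R = np.einsum('ad,bc->abcd', eye, eye)     # swap tensor, a known YBE solution

# (R (x) 1) and (1 (x) R) as tensors in M_{I^3}
R1 = np.einsum('abde,cf->abcdef', R, eye)  # R acts on the first two strands
R2 = np.einsum('ad,bcef->abcdef', eye, R)  # R acts on the last two strands

def mul(S, T):
    """Matrix product in M_{I^3}: contract lower indices of S with upper indices of T."""
    return np.einsum('abcijk,ijkdef->abcdef', S, T)

lhs = mul(mul(R1, R2), R1)                 # (R(x)1)(1(x)R)(R(x)1)
rhs = mul(mul(R2, R1), R2)                 # (1(x)R)(R(x)1)(1(x)R)
assert np.allclose(lhs, rhs)               # Yang-Baxter equation
```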

Our discussion of the type III move was so far restricted to one particular choice of orientation: “All arrows down”. But one can check that once invariance under both versions of the type II move holds, also the type III move with other choices of orientations will leave the partition function invariant. As an example, let us look at this sequence of diagrams:

Starting from the left, we have a type III move configuration, but now with two strands oriented upwards and one downwards. We next carry out a type II move as shown, and observe that in the middle area of the second diagram, all orientations point up. This is the situation we have considered before (“all up” is of course equivalent to “all down”, just by turning the paper around), so that we may slide the strand in the background past the crossing as shown. In a third step, we can carry out two more type II moves, and arrive at a diagram that differs from the first one by a type III move. Hence we also have invariance under the type III move with “middle line pointing down, the other two up”, and similarly under all versions of oriented type III moves.

We have thus proven the following theorem.

Theorem 4.3: Let $I$ be a finite index set, and $R, {\bar R}\in M_{I^2}$ such that the following three conditions hold:

• $R{\bar R}=1$
• $\sum_{i,j}{\bar R}^{bi}_{aj}{R}^{cj}_{di}=\delta^a_c\delta^b_d,$ for all $a,b,c,d\in I$.
• The Yang-Baxter equation holds for $R$ (and thus also for ${\bar R}=R^{-1}$).

Then the corresponding partition function $Z_{I,R,\bar R}$ is an invariant of oriented diagrams under regular isotopy.

We have seen many invariants already, but this construction is somewhat different: First, it seems that we might get not just one new invariant, but many — any choice of $I,R,\bar R$ satisfying the three conditions above will yield one. On the other hand, it is not evident how to pick $I, R, \bar R$ such that these conditions are satisfied. In fact, it is not even clear if these conditions admit any solutions at all.

In this context, in particular the Yang-Baxter equation is challenging to solve. If $I$ has $|I|=n$ elements, then $R\in M_{I^2}$ has $n^4$ components. However, the YBE is an equation for tensors in $M_{I^3}$ (three upper and three lower indices). In other words, the YBE is a system of $n^6$ nonlinear coupled equations for $n^4$ unknowns. Although many solutions of the Yang-Baxter equation are known, its solution theory is far from complete. At this stage knot theory makes contact with other fields of mathematics that study algebraic structures related to the conditions of Thm. 4.3.

I want to give at least one example where everything works out easily. This example is inspired by the bracket polynomial / Kauffman bracket. Looking at the unknot, we saw in the last lecture that the partition function of the usual unknot diagram is $Z(U)=n$, with $n=|I|$. The Kauffman bracket, on the other hand, gives $\langle U \rangle =1$, as you (hopefully) remember.

Going one step further, we might look at the typical diagram $U_r$ of the $r$-unlink,
($r$ loops). This translates to

which we see to be $\sum_{i_1}1\cdot\sum_{i_2}1\cdots\sum_{i_r}1=n^r$. The Kauffman bracket, on the other hand, satisfies $\langle U_r\rangle = d^{r-1}=(-A^2-A^{-2})^{r-1}$.

If we try to find $I, R, \bar R$ such that the partition function resembles the Kauffman bracket, we therefore first have to take out one factor of $d$, i.e. we aim for $Z(D)=d\cdot\langle D\rangle$ to match the powers. Looking at the unknot diagram, this immediately tells us that we need $Z(U)=n=d\cdot\langle U\rangle=d=-A^2-A^{-2}$, i.e. $$n=-A^2-A^{-2}.$$ When we fix $A$ in such a way that it satisfies this equation, the partition function will coincide with the bracket polynomial times $(-A^2-A^{-2})$ on all the trivial diagrams considered above.

Can we also choose $R,\bar R$ in such a way that $Z(D)=d\cdot\langle D\rangle$ for all diagrams? To achieve that, we must consider the recursion relation of the Kauffman bracket, which we recall to be
Reading the lines with orientation from top to bottom for the partial diagram on the left hand side, we have a negative crossing, i.e. this equation should correspond to $\bar R$. For $R$, we consider the flipped crossing,

We now want to read this recursion relation in the graphical matrix notation language;

This expression means in formulas: $$R^{ab}_{cd}=A\cdot \delta^a_c\delta^b_d+A^{-1}\cdot \delta^{ab}\delta_{cd}.$$ The variable $A$ is, as always, just a number and should not be confused with a matrix/tensor. The first term we recognise as the matrix elements of $1\in M_{I^2}$, because $1^{ab}_{cd}=\delta^a_c\delta^b_d$. The second term is the tensor $U\in M_{I^2}$ which was introduced in Exercise 4.1, with $U^{ab}_{cd}=\delta^{ab}\delta_{cd}$. Thus $$R=A\cdot 1+A^{-1}\cdot U.$$ In complete analogy, we can read off $\bar R$, and find $${\bar R}=A^{-1}\cdot 1+A\cdot U.$$
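These tensors are easy to build numerically. Choosing $|I|=n=2$ forces $A^2+A^{-2}=-2$, which is satisfied by $A=i$; the specific values $n=2$, $A=i$ below are just one admissible choice. A sketch:

```python
import numpy as np

n, A = 2, 1j                 # n = -A^2 - A^{-2} requires A^2 = -1 when n = 2
eye = np.eye(n)
one = np.einsum('ac,bd->abcd', eye, eye)   # 1^{ab}_{cd} = delta^a_c delta^b_d
U = np.einsum('ab,cd->abcd', eye, eye)     # U^{ab}_{cd} = delta^{ab} delta_{cd}

R = A * one + (1 / A) * U
Rbar = (1 / A) * one + A * U

# Component check of R^{ab}_{cd} = A delta^a_c delta^b_d + A^{-1} delta^{ab} delta_{cd}
for a in range(n):
    for b in range(n):
        for c in range(n):
            for d in range(n):
                expected = A * (a == c) * (b == d) + (1 / A) * (a == b) * (c == d)
                assert np.isclose(R[a, b, c, d], expected)
```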

Now $R, \bar R$ are defined, and we can check if the three conditions in Theorem 4.3 are satisfied. There are two ways of doing this: On the one hand, one can calculate algebraically, multiplying tensors etc. On the other hand, one can use the graphical calculus. We want to do both.

For the graphical calculus, it is however necessary to first check that our assignment of $R,\bar{R}$ to crossings does not depend on the angle at which we look at the diagram, i.e. it must be invariant under rotations. (This little complication arises because the partial diagrams in the recursion relation of the Kauffman bracket carry no arrows: the Kauffman bracket does not depend on orientation, and produces two different splits (the A/B splits of the bracket polynomial). Working with oriented crossings, there is always only one split that fits the orientations, as we know from the skein relations of the Jones, HOMFLY, and Conway polynomials. So here we must check by another method that our graphical notation does not become ambiguous when rotating the diagrams.)

A rotation through 180 degrees gives

Viewing all strands as having arrows top to bottom, we recognise the left hand side to be $R^{ab}_{cd}$, whereas the right hand side is $R^{dc}_{ba}$. These two expressions had better be the same if our graphical notation is to be consistent. So we check $$R^{dc}_{ba}=A\delta^d_b\delta^c_a+A^{-1}\delta^{dc}\delta_{ba}=A\delta^a_c\delta^b_d+A^{-1}\delta^{ab}\delta_{cd}=R^{ab}_{cd}.$$

Similarly, a rotation through 90 degrees gives

which gives us $R^{ab}_{cd}$ on the left and ${\bar R}^{ca}_{db}$ on the right hand side. We check $${\bar R}^{ca}_{db}=A\delta^{ca}\delta_{db}+A^{-1}\delta^c_d\delta^a_b=A\delta^a_c\delta^b_d+A^{-1}\delta^{ab}\delta_{cd}=R^{ab}_{cd}.$$
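Both rotation identities can also be spot-checked numerically; a sketch, again with the sample choice $n=2$, $A=i$:

```python
import numpy as np

n, A = 2, 1j
eye = np.eye(n)
one = np.einsum('ac,bd->abcd', eye, eye)
U = np.einsum('ab,cd->abcd', eye, eye)
R = A * one + (1 / A) * U
Rbar = (1 / A) * one + A * U

# 180-degree rotation: R^{dc}_{ba} must equal R^{ab}_{cd}
assert np.allclose(np.einsum('dcba->abcd', R), R)

# 90-degree rotation: Rbar^{ca}_{db} must equal R^{ab}_{cd}
assert np.allclose(np.einsum('cadb->abcd', Rbar), R)
```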

This shows that our notation is independent of rotations of our diagrams. We can now proceed to graphically verify the three conditions of Theorem 4.3, and begin with the first one, $R\bar{R}=1$. Graphically, this calculation reads

Here we have used the graphical notation for $R$ and $\bar R$. In the third step, we used that the closed loop is a trace over the identity matrix, which is ${\rm Tr}(1)=n=|I|$, and in the fourth step, we have used that $n=d=-A^2-A^{-2}$, which causes the term $A^2+A^{-2}+n$ to vanish.

If one prefers an algebraic calculation, one gets $$R\bar{R}=(A\cdot 1+A^{-1}U)(A^{-1}\cdot 1+AU)=1+A^2U+A^{-2}U+U^2.$$ Now one can check, similar to the solution of exercise 4.1 last time, that $U$ satisfies $U^2=nU$. Using again $n+A^2+A^{-2}=0$ then gives $$R{\bar R}=1+(A^2+A^{-2}+n)U=1,$$ as in the graphical calculation.
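In fact all three conditions of Theorem 4.3 can be verified at once numerically for this $R,\bar R$. A sketch, using the sample admissible choice $n=2$, $A=i$ (so that $n=-A^2-A^{-2}$), with tensors stored as `T[a,b,c,d]` $=T^{ab}_{cd}$:

```python
import numpy as np

n, A = 2, 1j                               # sample admissible choice: n = -A^2 - A^{-2}
eye = np.eye(n)
one = np.einsum('ac,bd->abcd', eye, eye)
U = np.einsum('ab,cd->abcd', eye, eye)
R = A * one + (1 / A) * U
Rbar = (1 / A) * one + A * U

assert np.allclose(np.einsum('abij,ijcd->abcd', U, U), n * U)   # U^2 = n U

# Condition 1: R Rbar = 1
assert np.allclose(np.einsum('abij,ijcd->abcd', R, Rbar), one)

# Condition 2: sum_{i,j} Rbar^{bi}_{aj} R^{cj}_{di} = delta^a_c delta^b_d
assert np.allclose(np.einsum('biaj,cjdi->abcd', Rbar, R), one)

# Condition 3: Yang-Baxter equation, checked in M_{I^3}
R1 = np.einsum('abde,cf->abcdef', R, eye)  # R (x) 1
R2 = np.einsum('ad,bcef->abcdef', eye, R)  # 1 (x) R
mul = lambda S, T: np.einsum('abcijk,ijkdef->abcdef', S, T)
assert np.allclose(mul(mul(R1, R2), R1), mul(mul(R2, R1), R2))
```

By Theorem 4.3, this choice therefore yields a regular-isotopy invariant, which by construction matches $d\cdot\langle D\rangle$ on the trivial diagrams considered above.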