\documentclass[12pt]{article}
\usepackage{amsmath, amsfonts, graphicx}
\title{Combinatorial Interpretations for the Markov Numbers}
\author{Andy Itsara, Gregg Musiker, James Propp, and Rui Viana}
\newtheorem{theorem}{Theorem}[section]
\begin{document}
\maketitle
\begin{abstract}
In this paper we generalize the concept of Markov numbers by constructing Markov polynomials. We then present a combinatorial model, based on matchings of a family of graphs called Snakes, that interprets these Markov polynomials. Of the four proofs we found for this correspondence, two are presented formally here, while the other two are only sketched. We end by conjecturing a generalization of this model.
\end{abstract}
\section{Markov Polynomials}
\subsection{Definitions}
Markov triples are triples of positive integers $(a,b,c)$ satisfying
$$a^2+b^2+c^2=3abc,$$
and the Markov numbers are the numbers that appear in such triples. Such triples can be obtained from $(1,1,1)$ through a sequence of local moves of the type
$$a'=\frac{b^2+c^2}{a}$$
(for instance, $(1,1,1)\rightarrow(1,1,2)\rightarrow(1,2,5)\rightarrow(2,5,29)\rightarrow\cdots$).

We define Markov polynomials to be triples of rational functions $(f(x,y,z), g(x,y,z), h(x,y,z))$ satisfying the relation
$$f^2 + g^2 + h^2 = \frac{x^2+y^2+z^2}{xyz}fgh.$$
Equivalently, if we let
$$K(a,b,c):=\frac{a^2+b^2+c^2}{abc},$$
we may write the above relation as
$$K(f,g,h)=K(x,y,z).$$
Clearly the triple $(x,y,z)$ satisfies this relation. Furthermore, we can define three symmetric operators, $u$, $v$ and $w$, that act on triples of polynomials and allow us to construct Markov triples from a given base case. Define:
$$u(f,g,h)=\left(\frac{g^2+h^2}{f},g,h\right)=(f',g,h)$$
$$v(f,g,h)=\left(f,\frac{f^2+h^2}{g},h\right)=(f,g',h)$$
$$w(f,g,h)=\left(f,g,\frac{f^2+g^2}{h}\right)=(f,g,h')$$
If $(f,g,h)$ is a Markov triple, then so are $u(f,g,h)$, $v(f,g,h)$ and $w(f,g,h)$. Thus, taking the triple $(x,y,z)$ as our starting point, we can encode a Markov triple as a string of the letters $u$, $v$ and $w$. Furthermore, since each operator is an involution, i.e.\ $o(o(f,g,h))=(f,g,h)$ for $o\in\{u,v,w\}$, we can restrict ourselves to reduced words, namely words that do not contain the same letter in consecutive positions.

Finally, we derive the two equations that will play a central role in our model. Let $(f,g,h)$ be a Markov triple and let $h'=\frac{f^2+g^2}{h}$ be the third entry of $w(f,g,h)$. Then,
\begin{eqnarray}
hh' &=& f^2 + g^2 \\
h + h' &=& K(x,y,z)fg
\end{eqnarray}
Equation (1) is just the definition of $h'$, and (2) follows from (1) together with the relation $f^2+g^2+h^2=K(x,y,z)fgh$. Note that similar equations hold for the operators $u$ and $v$; the analysis is completely symmetric.
\subsection{Laurentness property}
As proved in [2], Markov polynomials have the Laurentness property. Here we present a simpler proof of this fact. From (2), $h'=K(x,y,z)fg - h$. If $f$, $g$ and $h$ are Laurent polynomials, then $h'$ is built from Laurent polynomials by sums, differences and products (note that $K(x,y,z)$ is itself a Laurent polynomial), and therefore it is also a Laurent polynomial. Hence, since our starting point is $(x,y,z)$, by induction any polynomial constructed using the operators $u$, $v$ and $w$ has the Laurentness property. Our combinatorial model will also show that these Laurent polynomials have only positive coefficients.
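As an illustration of the construction and of equations (1) and (2), here are the first two steps starting from $(x,y,z)$. Applying $w$ gives
$$w(x,y,z)=\left(x,\;y,\;\frac{x^2+y^2}{z}\right),$$
and indeed, with $f=x$, $g=y$, $h=z$ and $h'=\frac{x^2+y^2}{z}$, we have $hh'=x^2+y^2=f^2+g^2$ and $h+h'=\frac{x^2+y^2+z^2}{z}=K(x,y,z)\,xy$. Applying $v$ to the resulting triple gives
$$v(w(x,y,z))=\left(x,\;\frac{x^4+2x^2y^2+y^4+x^2z^2}{yz^2},\;\frac{x^2+y^2}{z}\right).$$
Both triples consist of Laurent polynomials with positive coefficients, as claimed above.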
\section{The combinatorial model: Snake Graphs}
\subsection{Definition}
Snakes are undirected weighted graphs that will serve as our combinatorial model for the Markov polynomials. As with the Markov polynomials, Snakes come in triples and are constructed with the use of three symmetric operators that act on these triples. The starting point, however, is simply a triple of empty graphs. We will call the graph operators $U$, $V$ and $W$, and, as the names suggest, they will have effects analogous to those of the operators $u$, $v$ and $w$.

A Snake will be represented by a string of the letters $U$, $V$ and $W$; as before, we do not allow the same letter to appear in consecutive positions, so that words representing Snakes are in bijection with reduced words representing Markov polynomials. The operator $W$ replaces the third component of the triple by a copy of the other two components adjoined by an extra unit, which contains four vertices and four edges: two of weight $x$ and two of weight $y$. The extra unit is connected to the ends of the other two components by four edges of weight $z$, as in the following figure:
\begin{figure}[h] \centering \includegraphics[height=1.2in]{picture1} \caption{(a) The adjoining unit. (b) The operator $W$ in action.} \end{figure}
Note that the extra unit could be connected to the Snakes in two different positions. As shown below, one of these positions yields a Snake that is symmetric with respect to a 180-degree rotation, and that will be the chosen position of the connecting unit. The operators $U$ and $V$ are symmetric to $W$, except that $U$ acts on the first component of the triple and $V$ on the second one.

A small drawback of our model is that this assignment of weights to the edges generates a direct correspondence with Markov polynomials only if the operator $W$ is the first one applied. Otherwise, we would have to reassign the weights of the edges. Since such constructions would be symmetric to this case, we will only consider Snakes whose construction begins with an application of the operator $W$, i.e., the first operator applied must be $W$. For clarification, figure 2 shows an example of a possible sequence of applications of the operators $U$, $V$ and $W$.
\begin{figure}[h] \centering \includegraphics[height=1.0in]{picture2} \caption{An example of Snakes.} \end{figure}
Note that there is a natural way of embedding Snakes in a square grid. For this reason, we will call the extra unit used to connect Snakes a square unit. In this embedding, X-edges are always vertical, Y-edges are always horizontal, and Z-edges can have either orientation.
\subsection{Symmetry}
To complete the definition of Snakes we must prove that one of the possible positions of the connecting unit forms a symmetric Snake.
\begin{figure}[h] \centering \includegraphics[height=1.0in]{picture3} \caption{The two positions for the square unit.} \end{figure}
By induction, suppose we have a triple of symmetric Snakes $F$, $G$ and $FG$, where $FG$ was formed by an application of $W$. Then, if we apply $V$ to this triple, the new Snake we construct can have either $F$ or $FG$ on top. In each of these cases there is a unique position of the extra unit that makes the resulting $FGF$ (or $FFG$) Snake symmetric.
\begin{figure}[h] \centering \includegraphics[height=1.2in]{picture4} \caption{(a) The initial triple. (b) $F$ on top. (c) $FG$ on top.} \end{figure}
Now, if $F$ and $G$ are symmetric, then Snake (b) is clearly symmetric. Furthermore, Snakes (b) and (c) are the same Snake: since $FG$ is symmetric, we can rotate the top $FG$ in Snake (c) to obtain Snake (b). Therefore, Snake (c) is also symmetric. Later on, we will make use of both representations shown here to derive important facts about Snakes.
\section{Snake Markov Polynomials}
\subsection{Definition}
We will denote the weighted sum of all (perfect) matchings of a Snake $F$ by $M(F)$, and we will call it the Snake Matching Polynomial. The Snake Markov Polynomial $P(F)$ is defined to be $M(F)$ divided by a monomial in $x$, $y$ and $z$. The exponents of this monomial depend on the Snake and on the first operator applied. If $W$ is the first operator applied to the base case, as we have been assuming so far, then the monomial is $x^{\frac{h-2}{2}}y^{\frac{w-2}{2}}z^{\frac{n}{4}}$, where $h$ is the height of the Snake, $w$ its width and $n$ its number of vertices. For clarification, the Snake represented in figure 5 has height 2, width 4 and 8 vertices. Its Snake Matching Polynomial is $M(F)=x^4+2x^2y^2+y^4+x^2z^2$ and, therefore, its Snake Markov Polynomial is $M(F)$ divided by $yz^2$.
\begin{figure}[h] \centering \includegraphics[height=1.0in]{picture5} \caption{(a) The $VW$ Snake.} \end{figure}
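As a quick consistency check, this quotient is exactly the middle entry of the triple computed at the end of Section 1 by applying $w$ and then $v$ to $(x,y,z)$:
$$P(F)=\frac{x^4+2x^2y^2+y^4+x^2z^2}{yz^2}.$$
Similarly, the Snake produced by $W$ alone consists of a single square unit (the other two components of the starting triple are empty, so no Z-edges are attached). It has height 2, width 2 and 4 vertices, its two matchings give $M=x^2+y^2$, the associated monomial is $z$, and its Snake Markov Polynomial is therefore $\frac{x^2+y^2}{z}$, the last entry of $w(x,y,z)$. The theorem of the next subsection says that this is no coincidence.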
\subsection{A bijection between polynomials}
\begin{theorem}
Let $x_{n} \cdots x_1$ be a reduced word representing a Markov polynomial, and let $X_{n} \cdots X_1$ be a word representing a Snake, where $X_{i}$ is the upper-case version of $x_{i}$. Then $x_{n} \cdots x_1 = P(X_{n} \cdots X_1)$.
\end{theorem}
In other words, the triple of polynomials represented by $x_{n} \cdots x_1$ is the same as the triple of Snake Markov Polynomials of the triple of Snakes represented by $X_{n} \cdots X_{1}$, where $x_{i}$ and $X_{i}$ denote analogous operators ($u$ and $U$, $v$ and $V$, and $w$ and $W$) acting on triples of polynomials and on triples of Snake graphs, respectively.
\section{Proofs}
There are many different ways of proving the above theorem. The most direct proofs are based on a technique called graphical condensation (as described in [3]). A formal proof that makes use of this technique is presented here, as well as the sketch of a second proof. Furthermore, we present an important property of cuts of Snake graphs that leads to two other proofs of the same theorem. One of these proofs is presented formally, whereas the second is only sketched.
\subsection{Equations}
All the proofs presented here are based on mathematical induction and on the fact that Theorem 3.1 holds for the necessary base cases, which we will not check here. We will concentrate our efforts on proving that the operators $u$, $v$ and $w$ are equivalent to $U$, $V$ and $W$. If this is so, the induction step is established and Theorem 3.1 follows. To prove that $u$, $v$ and $w$ are equivalent to $U$, $V$ and $W$, it suffices to prove the analogue of either (1) or (2) for the Snake Markov Polynomials of triples of Snakes generated by $U$, $V$ and $W$. In other words, given a triple of graphs $(F,G,H)$ with $W(F,G,H) = (F,G,H')$, we have to prove that
\begin{eqnarray}
P(H)P(H') &=& P(F)^2 + P(G)^2 \\
P(H) + P(H') &=& K(x,y,z)P(F)P(G)
\end{eqnarray}
Let $G=FH$, i.e.\ $(F,G,H)=V(F,G',H)$. We can ``simplify'' the above equations by canceling out some of the terms in the denominators to obtain
\begin{eqnarray}
M(H)M(H') &=& z^{\frac{n_H}{2}+2}y^{w_H}x^{h_H}M(F)^2 + M(G)^2 \\
z^{\frac{n_F}{2}+2}y^{w_F}x^{h_F}M(H) + M(H') &=& (x^2+y^2+z^2)M(F)M(G)
\end{eqnarray}
where $n_H$ is the number of vertices of $H$, $w_H$ is the width of $H$ and $h_H$ is the height of $H$, and similar definitions apply to $n_F$, $w_F$ and $h_F$.
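To make the induction step explicit: suppose that, by induction, $P(F)=f$, $P(G)=g$ and $P(H)=h$, where $(f,g,h)$ is the Markov triple represented by the corresponding reduced word. If (4) holds, then
$$P(H')=K(x,y,z)P(F)P(G)-P(H)=K(x,y,z)fg-h,$$
which by equation (2) is exactly $h'$, the new Markov polynomial produced by $w$; the cases of $u$ and $v$ are symmetric. (Alternatively, (3) together with (1) yields the same conclusion.) It therefore suffices to establish (5) or (6).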
\subsection{Graphical Condensation}
Graphical condensation consists of merging two graphs into a double graph, and then splitting this double graph into two other graphs. The idea is to find a relationship between the matchings of the two initial graphs and the matchings of the two resulting ones. In his article on graphical condensation [3], Eric Kuo proved the following theorems:
\begin{theorem}
Let $G=(V_1,V_2,E)$ be a planar bipartite graph in which $|V_1|=|V_2|$. Let vertices $a,b,c$ and $d$ appear on $G$ as in figure 6(a). If $a,c \in V_1$ and $b,d \in V_2$, then
$$W(G)W(G-\{a,b,c,d\}) = W(G-\{a,b\})W(G-\{c,d\}) + W(G-\{a,d\})W(G-\{b,c\}).$$
\end{theorem}
\begin{theorem}
Let $G=(V_1,V_2,E)$ be a planar bipartite graph in which $|V_1|=|V_2|$. Let vertices $a,b,c$ and $d$ appear on $G$ as in figure 6(b). If $a,b \in V_1$ and $c,d \in V_2$, then
$$W(G-\{a,d\})W(G-\{b,c\}) = W(G)W(G-\{a,b,c,d\}) + W(G-\{a,c\})W(G-\{b,d\}).$$
\end{theorem}
Here $W(G)$ denotes the weighted sum of the matchings of the graph $G$, and $G-S$, where $S$ is a set of vertices, is the graph obtained from $G$ by removing all the vertices in $S$ together with all edges incident to them.
\begin{figure}[h] \centering \includegraphics[height=1.7in]{picture6} \caption{(a) Set-up for Theorem 4.1. (b) Set-up for Theorem 4.2.} \end{figure}
\subsection{Using Graphical Condensation I}
The first proof shows that equation (6) holds by an application of Theorem 4.2 to the graph $H'$ (figure 7). As defined before, $(F,G,H') = W(V(F,G',H))$, so we can represent $H'$ as a combination of $F$s, $H$s and square units as in figure 7.
\begin{figure}[h] \centering \includegraphics[height=1.5in]{picture7} \caption{$H'=HFF$ and its labeled vertices for the first graphical condensation proof.} \end{figure}
Note that if we label the vertices $a,b,c,d$ as in figure 7 and think of our Snakes as bipartite graphs, then $a$ and $b$ belong to the same part of the bipartition, and $c$ and $d$ to the other part. Hence we can apply Theorem 4.2. Let us count the weighted matchings of each of the graphs needed for the application of Theorem 4.2:
\begin{itemize}
\item $H'-\{a,d\}$\newline We obtain three disconnected components:
\begin{itemize}
\item an X-edge;
\item a copy of $H$;
\item two copies of $F$ connected by a square unit. The square unit has two matchings: one uses its two X-edges, in which case we obtain $M(F)^2x^2$ weighted matchings for $FF$, and the other uses its two Y-edges, in which case we obtain $M(F)^2y^2$ weighted matchings for $FF$. There is also a third possibility, in which we use the Z-edges connected to the square unit; by the symmetry of $F$, it is not hard to see that this case contributes $M(F)^2z^2$ further matchings of $FF$.
\end{itemize}
Therefore,
\begin{eqnarray}
M(H'-\{a,d\}) &=& x(x^2+y^2+z^2)M(H)M(F)^2
\end{eqnarray}
\item $H'-\{b,c\}$\newline Again, we obtain three disconnected components:
\begin{itemize}
\item a Y-edge;
\item a copy of $F$;
\item a copy of $HF=G$.
\end{itemize}
Therefore,
\begin{eqnarray}
M(H'-\{b,c\}) &=& yM(G)M(F)
\end{eqnarray}
\item $H'-\{a,c\}$ or $H'-\{b,d\}$\newline The central $F$ part, as well as the two square units of $H'$, have forced matchings, while the other $F$ and $H$ parts of the graph are ``free''. Counting the forced edges, we obtain
\begin{eqnarray}
M(H'-\{a,c\}) &=& z^{\frac{n_F}{4}+1}y^{\frac{w_F}{2}}x^{\frac{h_F}{2}+1}M(F)M(H)
\end{eqnarray}
\begin{eqnarray}
M(H'-\{b,d\}) &=& z^{\frac{n_F}{4}+1}y^{\frac{w_F}{2}+1}x^{\frac{h_F}{2}}M(F)M(H)
\end{eqnarray}
\item $H'-\{a,b,c,d\}$\newline
\begin{eqnarray}
M(H'-\{a,b,c,d\}) &=& xyM(F)^2M(H)
\end{eqnarray}
\end{itemize}
Using Theorem 4.2 we can put equations (7)-(11) together to obtain
\begin{eqnarray}
z^{\frac{n_F}{2}+2}y^{w_F}x^{h_F}M(H) + M(H') &=& (x^2+y^2+z^2)M(F)M(G)
\end{eqnarray}
which is exactly equation (6).
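In more detail, Theorem 4.2 applied to $H'$ with the labels of figure 7 states that
$$M(H'-\{a,d\})M(H'-\{b,c\}) = M(H')M(H'-\{a,b,c,d\}) + M(H'-\{a,c\})M(H'-\{b,d\}).$$
Substituting (7)-(11) turns this identity into
$$xy(x^2+y^2+z^2)M(F)^3M(G)M(H) = xyM(F)^2M(H)M(H') + z^{\frac{n_F}{2}+2}y^{w_F+1}x^{h_F+1}M(F)^2M(H)^2,$$
and dividing both sides by the common factor $xyM(F)^2M(H)$ yields (12).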
\subsection{Using Graphical Condensation II}
By a similar application of Theorem 4.1 to $H'$, with a different assignment of the vertices $a,b,c,d$ (see figure 8), we can prove equation (5). Since this proof does not provide any further information, it is not shown here.
\begin{figure}[h] \centering \includegraphics[height=1.5in]{picture8} \caption{$H'=FHF$ and its labeled vertices for the second graphical condensation proof.} \end{figure}
\subsection{Cuts of Snakes}
Snake graphs have the property that their width along the ``spinal column'' of the Snake is always two. This fact can be used to prove a useful property relating cuts to matchings of Snakes. Consider two different matchings of the same Snake. If we ``superimpose'' them into a single Snake, then in general we can find a cut through the resulting ``double-snake'' that separates the corners of the Snake. But there are particular matchings for which no cut of the ``double-snake'' can be found. Those matchings must, together, use all the edges that constitute the border of the graph.
\begin{figure}[h] \centering \includegraphics[height=0.8in]{picture9} \caption{(a) Matching 1. (b) Matching 2. (c) The ``double-snake'' formed from matchings 1 and 2.} \end{figure}
As illustrated in figure 9, there are only two matchings of the same Snake that do not share a common cut when superimposed on top of each other: one of them uses only X-edges and horizontal Z-edges (matching 1), and the other uses only Y-edges and vertical Z-edges (matching 2).
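For instance, for the Snake of figure 5 these are the matchings of weight $x^2z^2$ (the two outer X-edges together with the two horizontal Z-edges) and of weight $y^4$ (the four Y-edges; this Snake happens to have no vertical Z-edges). Superimposed, they use every edge on the border of the Snake, so no cut separating its corners exists.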
\subsection{Using Cuts I}
This special property of Snakes leads to two other proofs of Theorem 3.1. First we prove equation (5), and then we sketch the proof of equation (6). Equation (5) expresses a bijection between the matchings of the graph $H \cup H'$ and the matchings of $F \cup F$ and $G \cup G$. First note that $G \cup G$ can be represented as the union of a copy of $G$ with a rotated copy of $G$, since Snakes are symmetric.
\begin{figure}[h] \centering \includegraphics[height=1in]{picture10} \caption{Using Cuts I.} \end{figure}
If one of the pairs of Z-edges connecting to the $H$ parts of $G \cup G$ is not used in a matching, then we can rearrange the parts of this graph to obtain a matching of $H \cup H'$ (figure 11, (a) and (b)). If both pairs of Z-edges are used, then the $H$ parts will have a common cut when superimposed (since they both have the end vertices of a Y-edge used by the ``exterior'' graphs, which is equivalent to them both having a Y-edge), and we can rearrange the parts of the graph along a cut to get a matching of $H \cup H'$ (figure 11, (c) and (d)).
\begin{figure}[h] \centering \includegraphics[height=2.2in]{picture11} \caption{(a) We rearrange the parts of $G \cup G \cdots$ (b) to obtain $H \cup H'$. (c) We use a cut of $H$ to rearrange the parts of $G \cup G \cdots$ (d) to obtain $H \cup H'$.} \end{figure}
Now, to complete the bijection, we look at the matchings of the $H$ part of $H'$ that do not share a common cut with the matchings of $H$ in the graph $H \cup H'$. These matchings are unique, as shown in the previous subsection, and to include them we only have to add $z^{\frac{n_H}{2}+2}y^{w_H}x^{h_H}M(F)^2$ matchings to the equation. Therefore,
$$M(H)M(H') = z^{\frac{n_H}{2}+2}y^{w_H}x^{h_H}M(F)^2 + M(G)^2,$$
which is equation (5).
\subsection{Using Cuts II}
There is a second proof based on cut arguments, but it will not be shown here, as it is fairly complicated and involves many unimportant details. The new argument would prove equation (6) by analyzing the different possibilities for the matchings of the bottom connecting square unit in figure 4(c).
\section{A conjectured generalization}
One direct generalization of the Markov numbers is obtained by considering $n$-tuples, instead of triples, of numbers, and redefining all the operators to act on $n$-tuples.
\subsection{Definition}
Let
$$K(X_1,X_2,\ldots,X_n) = \frac{\sum_{i=1}^n X_i^2}{\prod_{i=1}^n X_i}.$$
If $(x_1,x_2,\ldots,x_n)$ is our starting $n$-tuple, then $(Y_1,Y_2,\ldots,Y_n)$ is a Markov $n$-tuple if, and only if,
$$K(Y_1,Y_2,\ldots,Y_n) = K(x_1,x_2,\ldots,x_n).$$
Define the operators $O_i$ analogously to $O_n$ below:
$$O_n(Y_1,Y_2,\ldots,Y_n) = \left(Y_1,Y_2,\ldots,Y_{n-1},\frac{\sum_{i=1}^{n-1} Y_i^2}{Y_n}\right) = (Y_1,Y_2,\ldots,Y_n')$$
It can be verified algebraically that these operators take Markov $n$-tuples to Markov $n$-tuples.
\subsection{Laurent polynomials}
As before, there is a simple proof that all the polynomials generated by the operators $O_i$ are Laurent polynomials:
$$Y_n + Y_n' = Y_n + \frac{\sum_{i=1}^{n-1} Y_i^2}{Y_n} = K(Y_1,Y_2,\ldots,Y_n)\prod_{i=1}^{n-1} Y_i = K(x_1,x_2,\ldots,x_n)\prod_{i=1}^{n-1} Y_i.$$
Therefore, if $Y_1,Y_2,\ldots,Y_n$ are Laurent polynomials, then $Y_n'$ is built from Laurent polynomials by sums, differences and products, and is thus a Laurent polynomial itself. By induction, all polynomials generated by the operators $O_i$ from $(x_1,x_2,\ldots,x_n)$ have the Laurentness property.
\subsection{A Guess}
The Laurentness of these $n$-tuples of polynomials strongly suggests that there exists a combinatorial interpretation for them. Furthermore, we should expect Snakes to be a particular case of such a general interpretation. We have not yet been able to produce such a model. Our best guess so far is to use $(n-1)$-dimensional units instead of square units in the construction of the graphs. These new units would have $n-1$ matchings. The hard part of the problem is to find a way of making the ``inter-unit'' connections (the analogue of the Z-edges in the case $n=3$).
\begin{thebibliography}{1}
\bibitem{Ac:1} T. Ace, ``Markov numbers,'' web page at http://www.qnet.com/\~{}crux/markov.html.
\bibitem{FZ:1} S. Fomin and A. Zelevinsky, ``The Laurent phenomenon.''
\bibitem{Ku:1} E. Kuo, ``Applications of Graphical Condensation for Enumerating Matchings and Tilings.''
\end{thebibliography}
\end{document}