My primary research interest is in symmetric function theory and its rich interplay with algebraic combinatorics, representation theory, and algebraic geometry. I also enjoy probability, enumerative combinatorics, and combinatorial game theory.

Symmetric Functions

Symmetric functions are ubiquitous throughout mathematics, with Schur functions playing a central role in combinatorics, geometry, and representation theory. Multiplicities of irreducible representations, dimensions of algebraic varieties, and many other integers arising from algebraic constructions can be computed as coefficients in expansions with respect to the Schur basis and its generalizations.

A symmetric polynomial is a polynomial in variables \( x_1, x_2, \ldots, x_n \) that is invariant under any permutation of its variables. For example, \[ s_{(2,1)}(x_1,x_2,x_3) = x_2 x_3^2 + x_1 x_3^2 + x_2^2 x_3 + 2x_1 x_2 x_3 + x_1^2 x_3 + x_1 x_2^2 + x_1^2 x_2 \] is symmetric in its three variables \(x_1, x_2, x_3\) since any reordering of them again results in the same polynomial. This is the Schur polynomial \(s_{\lambda}(X)\) in three variables indexed by the partition \(\lambda = (2,1)\), this is the character \(\mathrm{ch}(V^{\lambda})\) of the irreducible Schur module \(V^{\lambda}\) for the general linear group \(\mathrm{GL}(3)\), and this is a polynomial representative of the Schubert class \( [X_{\lambda}] \) in the cohomology ring of the grassmannian \( \mathrm{Gr}(3,5) \).
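For readers who like to compute, the expansion above can be reproduced by brute force from one standard description of the Schur polynomial as the generating function of semistandard Young tableaux. The following is a minimal Python sketch (standard library only; the function names ssyt and schur_monomials are my own), intended as an illustration rather than production code.

from itertools import product
from collections import Counter

def ssyt(shape, n):
    # Enumerate semistandard Young tableaux of the given shape with entries
    # in {1, ..., n}: rows weakly increase left to right, columns strictly
    # increase top to bottom.
    cells = [(r, c) for r, row_len in enumerate(shape) for c in range(row_len)]
    for values in product(range(1, n + 1), repeat=len(cells)):
        T = dict(zip(cells, values))
        rows_ok = all(T[(r, c)] <= T[(r, c + 1)] for (r, c) in cells if (r, c + 1) in T)
        cols_ok = all(T[(r, c)] < T[(r + 1, c)] for (r, c) in cells if (r + 1, c) in T)
        if rows_ok and cols_ok:
            yield T

def schur_monomials(shape, n):
    # The Schur polynomial s_shape(x_1, ..., x_n) as a Counter mapping
    # exponent vectors (a_1, ..., a_n) to coefficients, summing x^T over SSYT.
    poly = Counter()
    for T in ssyt(shape, n):
        exponents = tuple(sum(1 for v in T.values() if v == i) for i in range(1, n + 1))
        poly[exponents] += 1
    return poly

# s_{(2,1)}(x1, x2, x3): seven distinct monomials, with x1*x2*x3 appearing twice.
print(schur_monomials((2, 1), 3))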

A quintessential problem in the theory of symmetric functions is to prove that a given function is symmetric and Schur positive. Previously, such problems required a new set of tools for each new function. I pioneered and continue to develop a general framework for giving combinatorial proofs of Schur positivity using a structure called a dual equivalence graph, which groups combinatorial objects into equivalence classes that are manifestly Schur positive. I extended this machinery to type B, giving a framework for combinatorial proofs of Schur P-positivity, and to the full polynomial ring, giving a framework for combinatorial proofs of Demazure positivity. One component of my current research program is to continue to expand dual equivalence to other settings.

Macdonald polynomials

Macdonald introduced a basis of symmetric functions \(P_{\mu}(X;q,t)\) with two additional parameters, \(q\) and \(t\), that simultaneously generalizes the Hall–Littlewood and Jack symmetric functions, both of which generalize Schur functions. Garsia introduced the transformed Macdonald polynomials \(\widetilde{H}_{\mu}(X;q,t)\), for which the Kostka–Macdonald coefficients \(K_{\lambda,\mu}(q,t)\) give the change of basis from Macdonald polynomials to Schur functions, \[ \widetilde{H}_{\mu}(X;q,t) = \sum_{\lambda} K_{\lambda,\mu}(q,t) s_{\lambda}(X) . \] A priori, \(K_{\lambda,\mu}(q,t)\) is a rational function in \(q\) and \(t\) with rational coefficients.

The Macdonald Positivity Theorem, first conjectured by Macdonald based on hand calculations, states that the Kostka–Macdonald coefficients are polynomials in the parameters with nonnegative integer coefficients. For example, \[ \widetilde{H}_{(2,1)}(x_1,x_2,x_3;q,t) = s_{(3)}(x_1,x_2,x_3) + (q+t) s_{(2,1)}(x_1,x_2,x_3) + qt s_{(1,1,1)}(x_1,x_2,x_3) . \] Garsia and Haiman conjectured that the transformed Macdonald polynomials could be realized as the bigraded characters of certain modules for the diagonal action of the symmetric group on two sets of variables. Once resolved, this conjecture gives a representation-theoretic interpretation of Kostka–Macdonald coefficients as the graded multiplicity of an irreducible representation in the Garsia–Haiman module. Following an idea outlined by Procesi, Haiman proved this conjecture by analyzing the algebraic geometry of the isospectral Hilbert scheme of points in the plane, thereby establishing Macdonald positivity. This proof, however, is purely geometric and does not offer a combinatorial interpretation of the coefficients.

In 2004, Haglund conjectured an elegant combinatorial formula for the monomial expansion of Macdonald polynomials that was subsequently proved by Haglund, Haiman, and Loehr. The formula may be stated as \[ \widetilde{H}_{\mu}(X;q,t) = \sum_{T:\mu\rightarrow \mathbb{N}} q^{\mathrm{inv}(T)} t^{\mathrm{maj}(T)} X^{T}, \] where the sum is over certain fillings of the diagram of \(\mu\) and \(\mathrm{inv}\) and \(\mathrm{maj}\) are nonnegative integer statistics. For example, \[ \widetilde{H}_{(2,1)}(x_1,x_2,x_3;q,t) = x_1^3 + x_2^3 + x_3^3 + (1+q+t) (x_1^2 x_2 + x_1^2 x_3 + x_1 x_2^2 + x_1 x_3^2 + x_2^2 x_3 + x_2 x_3^2) + (1 + 2q + 2t + q t) x_1 x_2 x_3 . \] This formula establishes that Kostka–Macdonald coefficients are polynomials with integer coefficients but falls short of proving nonnegativity. One objective of my current research program is to utilize dual equivalence to prove Macdonald positivity combinatorially and give an explicit combinatorial formula for Kostka–Macdonald coefficients.
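As a quick consistency check between the Schur expansion of \(\widetilde{H}_{(2,1)}\) displayed earlier and the monomial expansion above, the sketch below (assuming the sympy library is available; the helper schur and the variable names are mine) builds Schur polynomials from the bialternant formula \(s_{\lambda} = \det(x_i^{\lambda_j + n - j}) / \det(x_i^{n-j})\) and expands \(s_{(3)} + (q+t)\,s_{(2,1)} + qt\,s_{(1,1,1)}\) in three variables.

import sympy as sp

x1, x2, x3, q, t = sp.symbols('x1 x2 x3 q t')
X = (x1, x2, x3)

def schur(lam, X):
    # Schur polynomial via the bialternant (ratio of alternants) formula.
    n = len(X)
    lam = tuple(lam) + (0,) * (n - len(lam))
    num = sp.Matrix(n, n, lambda i, j: X[i] ** (lam[j] + n - 1 - j))
    den = sp.Matrix(n, n, lambda i, j: X[i] ** (n - 1 - j))
    return sp.cancel(num.det() / den.det())

# The Schur expansion of H~_{(2,1)} stated above, expanded into monomials.
H = sp.expand(schur((3,), X) + (q + t) * schur((2, 1), X) + q * t * schur((1, 1, 1), X))
print(H)

# The coefficient of x1*x2*x3 should be 1 + 2q + 2t + q*t.
print(sp.Poly(H, *X).coeff_monomial(x1 * x2 * x3))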

Schubert Calculus

Schubert calculus began around 1879 with Hermann Schubert asking, and in special cases answering, enumerative questions in geometry. For example, how many lines in space meet four given lines? To answer this, Schubert considered the case where the first line intersects the second and the third intersects the fourth, in which case the answer is 2 (the line connecting the two points of intersection and the line of intersection of the two planes spanned by the two pairs of intersecting lines). He then asserted, by his principle of conservation of number, that the general answer, if finite, must also be 2. Hilbert, in his 15th problem, set out the task of making rigorous Schubert’s principle of conservation of number. Cohomology solves this problem and leads to modern Schubert calculus and intersection theory, which have ramifications in geometry, topology, and combinatorics, and even play a central role in string theory.

For the flag manifold parameterizing complete flags of subspaces in complex affine \(n\)-space, Borel showed that the cohomology ring is naturally identified with the quotient of the polynomial ring in \(n\) variables by the ideal generated by positive degree symmetric polynomials. The Schubert cell decomposition studied by Bernstein, Gelfand, and Gelfand and by Demazure gives a geometrically important linear basis of Schubert classes \([X_w]\) indexed by permutations. This allows one to compute intersection numbers by carrying out a product in the cohomology ring and expanding the result in the Schubert basis. In particular, Schubert calculus aims to understand the Littlewood–Richardson coefficients \(c_{u,v}^w\) given by \[ [X_u \cap X_v] = [X_u] [X_v] = \sum_w c_{u,v}^w [X_w] . \]

Schubert polynomials were introduced by Lascoux and Schützenberger as polynomial representatives of Schubert classes in the cohomology of the flag manifold with nice algebraic and combinatorial properties. Schubert polynomials form an integral basis for the full polynomial ring, and their structure constants are precisely the Littlewood–Richardson coefficients of the corresponding Schubert classes, i.e. \[ \mathfrak{S}_u \mathfrak{S}_v = \sum_w c_{u,v}^w \mathfrak{S}_w . \] Therefore Schubert polynomials give a way to compute intersection numbers without working modulo the ideal of symmetric polynomials. For example, the set of lines incident to a given line in space corresponds to the class \([X_{132}]\) and hence is represented by the Schubert polynomial \(\mathfrak{S}_{132} = x_1 + x_2\). The intersection of four such sets, the lines incident to four given lines in space, corresponds to the product \([X_{132}]^4\), which we may compute in the polynomial ring as \[ \mathfrak{S}_{132}^4 = \left(x_1+x_2\right)^4 = x_1^4+4 x_1^3 x_2+6 x_1^2 x_2^2+4 x_1 x_2^3+x_2^4 . \] A single fixed line in space, i.e. a point in the space of lines, corresponds to the class \([X_{3412}]\), and so the number of lines in space incident to four given lines is the coefficient of \([X_{3412}]\) in \([X_{132}]^4\), that is, the coefficient of \(\mathfrak{S}_{3412} = x_1^2 x_2^2\) in the Schubert expansion of \(\mathfrak{S}_{132}^4\). To compute this, we fully express the right-hand side as a sum of Schubert polynomials, which becomes \[ \mathfrak{S}_{132}^4 = \mathfrak{S}_{162345}+3\mathfrak{S}_{25134}+2\mathfrak{S}_{3412}, \] and so the number of lines in space incident to four given lines is computed to be two.

For the grassmannian, obtained from the complete flag manifold by remembering only a single subspace, the Schubert polynomials indexed by grassmannian permutations are symmetric and coincide with Schur polynomials. Therefore the classical Littlewood–Richardson rule for Schur polynomials gives a formula for \(c_{u,v}^w\) when \(u\) and \(v\) are grassmannian permutations with a common descent (in which case \(w\) is as well). One objective of my current research program is to use tools and techniques from symmetric function theory to solve the Schubert problem by giving an explicit, nonnegative combinatorial rule for multiplying Schubert polynomials.
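The worked example above can also be checked mechanically. Divided difference operators \(\partial_i f = (f - s_i f)/(x_i - x_{i+1})\) send \(\mathfrak{S}_w\) to \(\mathfrak{S}_{w s_i}\) when this shortens \(w\) and to \(0\) otherwise, so composing them along a reduced word for \(w\) extracts the coefficient of \(\mathfrak{S}_w\) from a homogeneous polynomial of degree \(\ell(w)\). The sketch below (assuming sympy; the function names are mine, and \((2,3,1,2)\) is a reduced word for \(w = 3412\)) recovers the coefficient \(2\) computed above.

import sympy as sp

x = sp.symbols('x1:5')  # x1, x2, x3, x4

def divided_difference(i, f):
    # d_i f = (f - s_i f) / (x_i - x_{i+1}), where s_i swaps x_i and x_{i+1}.
    swapped = f.subs({x[i - 1]: x[i], x[i]: x[i - 1]}, simultaneous=True)
    return sp.cancel((f - swapped) / (x[i - 1] - x[i]))

def schubert_coefficient(reduced_word, f):
    # For f homogeneous of degree l(w), applying the divided differences along
    # a reduced word for w (rightmost letter first) leaves the constant
    # coefficient of S_w in the Schubert expansion of f.
    for i in reversed(reduced_word):
        f = divided_difference(i, f)
    return sp.expand(f)

f = (x[0] + x[1]) ** 4                        # S_{132}^4 = (x1 + x2)^4
print(schubert_coefficient((2, 3, 1, 2), f))  # coefficient of S_{3412}; prints 2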

Combinatorics Seminar

I organize the USC Combinatorics Seminar, where researchers present current results in algebraic or enumerative combinatorics.