
Journal of Algebraic Combinatorics 9 (1999), 295–313
© 1999 Kluwer Academic Publishers. Manufactured in The Netherlands.
Murphy Operators and the Centre
of the Iwahori-Hecke Algebras of Type A
ANDREW MATHAS
mathas@maths.usyd.edu.au
Department of Mathematics, Imperial College, 180 Queen’s Gate, London SW7 2BZ, UK
Received October 24, 1996
Abstract. In this paper we introduce a family of polynomials indexed by pairs of partitions and show that if
these polynomials are self-orthogonal then the centre of the Iwahori-Hecke algebra of the symmetric group is
precisely the set of symmetric polynomials in the Murphy operators.
Keywords: Hecke algebra, Murphy operator, symmetric group
1. Introduction
In [4] Murphy showed that for any field F the centre of the group algebra of the symmetric
group Sn on n symbols is the set of symmetric polynomials in the Murphy operators. One
consequence of this result is a relatively easy proof of the Nakayama conjecture for FSn .
Given an invertible element q in a ring R let H = H_{R,q}(S_n) be the associated Iwahori-Hecke algebra. Then H contains elements which are q-analogues of the Murphy operators
of the symmetric group and once again the symmetric polynomials in these elements belong
to the centre of H . It is natural therefore to make the following conjecture.
Conjecture 1.1 [2] For any ring R and any invertible element q in R the centre of the
Iwahori-Hecke algebra H R,q (Sn ) is the set of symmetric polynomials in the Murphy
operators.
Dipper and James [2, Theorem 2.14] have proved this conjecture in the case where H
is semisimple; unfortunately, there is a gap in their proof for the non-semisimple case. As
for the symmetric group, one of the reasons why this conjecture is interesting is that as
a corollary one can prove the Nakayama conjecture for H (this was proved by Gordon
James and the author in [3], with one direction being done previously in [2]).
In this paper we reduce Conjecture 1.1 to a purely combinatorial problem of showing that certain polynomials are orthogonal. To the best of our knowledge these polynomials have not appeared elsewhere in the literature; it seems likely that they will be of independent interest.
Current address: School of Mathematics and Statistics, University of Sydney, NSW 2006, Australia.
One of the reasons why the conjecture for H is more difficult to prove than in the
symmetric group case is that the multiplication in H is much more complicated. In the
first section of this paper we overcome this difficulty by proving a combinatorial result
which allows us to rewrite an arbitrary product of Murphy operators as a linear combination
of less complicated products (modulo “unimportant” terms). In the second section we apply
this result towards Conjecture 1.1.
2. Rewriting rules for Murphy operators
Throughout this paper we fix a positive integer n and let Sn = S({1, 2, . . . , n}) be the
symmetric group on {1, 2, . . . , n}.
Let R be a commutative ring with 1 and q an invertible element of R. Then the Iwahori-Hecke algebra H = H_{R,q}(S_n) is the unital associative R-algebra with generators T_1, T_2, ..., T_{n−1} and relations

    T_i² = (q − 1)T_i + q,
    T_j T_{j+1} T_j = T_{j+1} T_j T_{j+1},
    T_i T_j = T_j T_i,    if |i − j| ≥ 2,

for all 1 ≤ i, j < n.
Given an integer i, where 1 ≤ i < n, let si = (i, i + 1); then {s1 , s2 , . . . , sn−1 } is the
(standard) set of Coxeter generators for the symmetric group Sn . Suppose that w ∈ Sn and
write w = s_{i_1} s_{i_2} ··· s_{i_k}; this expression for w is reduced if k is minimal, in which case we say that w has length ℓ(w) = k. Given such a reduced expression for w let T_w = T_{i_1} T_{i_2} ··· T_{i_k};
the relations in H ensure that Tw is independent of the choice of reduced expression for w.
Moreover, {Tw | w ∈ Sn } is a basis for H .
Definition 2.1 [2] The Murphy operators of H are the elements L_1 = 0 and

    L_i = q^{1−i} T_{(1,i)} + q^{2−i} T_{(2,i)} + ··· + q^{−1} T_{(i−1,i)},    for i = 2, ..., n.
Using the defining relations in H it is a straightforward exercise to show that the following holds.
Lemma 2.2 [2, Section 2] Suppose that 1 ≤ i < n and 1 ≤ j ≤ n. Then
(i) T_i and L_j commute when i ≠ j − 1, j.
(ii) T_i commutes with L_i L_{i+1} and L_i + L_{i+1}.
(iii) L_i and L_j commute.
By (i) and (ii) the symmetric polynomials in the Murphy operators belong to the centre
of H .
Now that we have amassed sufficient notation to explain Conjecture 1.1 in the general case, we restrict our attention to the generic Iwahori-Hecke algebra (where we renormalise everything).

Let q^{1/2} be an indeterminate over Z and let A = Z[q^{1/2}, q^{−1/2}] be the ring of Laurent polynomials in q^{1/2} with integer coefficients. Then H = H_{A,q}(S_n) is the generic Iwahori-Hecke algebra of S_n.

For any w ∈ S_n we let T̃_w = q^{−ℓ(w)/2} T_w. Define α = q^{1/2} − q^{−1/2}; then the multiplication in H is completely determined by

    T̃_i T̃_w = T̃_{s_i w},              if ℓ(s_i w) > ℓ(w),
    T̃_i T̃_w = T̃_{s_i w} + α T̃_w,     if ℓ(s_i w) < ℓ(w),

for all i = 1, 2, ..., n − 1 and w ∈ S_n.
We define L̃_i = T̃_{(1,i)} + T̃_{(2,i)} + ··· + T̃_{(i−1,i)} for i = 2, ..., n. Because L̃_i = q^{1/2} L_i we also call L̃_i a Murphy operator; this should cause no confusion. In fact, the Murphy operators are quite hard to work with directly; instead we work with the elements L_1 = 1 and

    L_i = T̃_{i−1} T̃_{i−2} ··· T̃_1 T̃_1 ··· T̃_{i−2} T̃_{i−1},    for i = 2, ..., n,

which we call L-Murphy operators. This terminology is justified by the following well-known lemma.

Lemma 2.3 Suppose that 1 ≤ i ≤ n. Then L_i = α L̃_i + 1.
Proof: When i = 1 there is nothing to prove. Therefore, by induction and using the fact that T̃_i T̃_{(j,i)} T̃_i = T̃_{(j,i+1)} when j = 1, 2, ..., i − 1, we find

    L_{i+1} = T̃_i L_i T̃_i = T̃_i (α L̃_i + 1) T̃_i
            = α (T̃_{(i+1,i−1)} + T̃_{(i+1,i−2)} + ··· + T̃_{(i+1,1)}) + T̃_i²
            = α L̃_{i+1} + 1,

since T̃_i² = 1 + α T̃_i = 1 + α T̃_{(i,i+1)}. □
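Lemma 2.3 is also easy to check by machine for small rank. The following sketch (illustrative code of my own, not from the paper; the rank N = 4 and all names are assumptions) models H in its T̃-basis, with coefficients that are integer polynomials in α, using the multiplication rule displayed earlier, and verifies L_i = α L̃_i + 1 in H_{A,q}(S_4).

```python
from itertools import combinations

N = 4  # rank: work inside the Hecke algebra of S_4 (any small N would do)

def s(i):
    """The simple transposition s_i = (i, i+1), as a tuple of values."""
    p = list(range(1, N + 1))
    p[i - 1], p[i] = p[i], p[i - 1]
    return tuple(p)

def compose(u, w):
    """Product of permutations: (uw)(m) = u(w(m))."""
    return tuple(u[w[m] - 1] for m in range(N))

def length(w):
    """Coxeter length of w = number of inversions."""
    return sum(1 for a, b in combinations(range(N), 2) if w[a] > w[b])

def padd(p, q):
    """Add two polynomials in alpha, stored as dicts {power: coefficient}."""
    out = dict(p)
    for k, c in q.items():
        out[k] = out.get(k, 0) + c
    return {k: c for k, c in out.items() if c}

def pshift(p, d):
    """Multiply a polynomial in alpha by alpha^d."""
    return {k + d: c for k, c in p.items()}

def eadd(h, w, poly):
    """Add poly * T~_w to the element h (a dict {permutation: polynomial})."""
    h[w] = padd(h.get(w, {}), poly)
    if not h[w]:
        del h[w]

def T(i, h):
    """Left-multiply the element h by T~_i, using the rule displayed above."""
    out, si = {}, s(i)
    for w, poly in h.items():
        sw = compose(si, w)
        eadd(out, sw, poly)                  # T~_i T~_w always contributes T~_{s_i w}
        if length(sw) < length(w):
            eadd(out, w, pshift(poly, 1))    # ... plus alpha T~_w if the length drops
    return out

one = {tuple(range(1, N + 1)): {0: 1}}

def T_word(word):
    """The product T~_{i_1} T~_{i_2} ... T~_{i_m} (applied to the identity)."""
    h = one
    for i in reversed(word):
        h = T(i, h)
    return h

def L_script(i):
    """The L-Murphy operator L_i = T~_{i-1} ... T~_1 T~_1 ... T~_{i-1}."""
    return T_word(list(range(i - 1, 0, -1)) + list(range(1, i)))

def L_tilde(i):
    """The Murphy operator L~_i = sum of T~_{(j,i)}, via reduced words for (j,i)."""
    out = {}
    for j in range(1, i):
        word = list(range(i - 1, j, -1)) + [j] + list(range(j + 1, i))
        for w, poly in T_word(word).items():
            eadd(out, w, poly)
    return out

# Lemma 2.3: L_i = alpha * L~_i + 1 for i = 2, ..., N.
for i in range(2, N + 1):
    rhs = {w: pshift(p, 1) for w, p in L_tilde(i).items()}
    eadd(rhs, tuple(range(1, N + 1)), {0: 1})
    assert L_script(i) == rhs
```

The same machinery can be extended to check Lemma 2.2 by brute force for small N.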
We remark that the easiest way to verify Lemma 2.2 is to first prove the corresponding result for the L-Murphy operators.
Define the inner product ⟨ , ⟩ : H ⊗ H → A to be the A-linear extension of the map

    ⟨T̃_x, T̃_y⟩ = 1 if x = y, and ⟨T̃_x, T̃_y⟩ = 0 otherwise,

where x, y ∈ S_n. An important property of the inner product, which we shall apply without mention, is that

    ⟨h_1 h_2, h_3⟩ = ⟨h_2, h_1* h_3⟩,    for all h_1, h_2, h_3 ∈ H,
where * : H → H is the unique A-linear anti-isomorphism of H such that T̃_w* = T̃_{w^{−1}} for all w ∈ S_n. This is easy to check because ⟨h_1, h_3⟩ is the coefficient of 1 in h_1* h_3.
Following Dipper and James [2], if x ∈ S_n we say that x appears in h ∈ H if ⟨T̃_x, h⟩ ≠ 0; equivalently, if h = Σ_{w∈S_n} a_w T̃_w then a_x ≠ 0 (a_w ∈ A).
A partition λ = (λ_1, λ_2, ...) of n is a weakly decreasing sequence of non-negative integers such that Σ_i λ_i = n. Write λ = (λ_1, ..., λ_k) if λ_i = 0 whenever i > k, and let

    S_λ = S_{λ_1} × S_{λ_2} × ··· × S_{λ_k}

be the corresponding Young, or parabolic, subgroup of S_n.
Definition 2.4 [2] Let λ be a partition of n and suppose that s_{i_1}, s_{i_2}, ..., s_{i_k} are the standard Coxeter generators for S_λ, where n > i_1 > i_2 > ··· > i_k > 0. Then u_λ ∈ S_λ is the permutation s_{i_1} s_{i_2} ··· s_{i_k}.
Example 2.5 Suppose that λ = (3, 3, 2, 1), a partition of 9. Then

    S_λ = S_3 × S_3 × S_2 × S_1 = S({1, 2, 3}) × S({4, 5, 6}) × S({7, 8}) × S({9})

is generated by the set of simple transpositions {s_1, s_2, s_4, s_5, s_7}. Therefore u_λ is the element s_7 s_5 s_4 s_2 s_1 = (1, 2, 3)(4, 5, 6)(7, 8).
Remark 2.6 The elements u λ are elements of minimal length in their conjugacy class (in
fact they are Coxeter elements in Sλ ). The reason why they are of interest to us is that
Dipper and James [2, Theorem 2.12] have shown that if c is in the centre of H then there
exists a partition λ such that u λ appears in c.
Our initial aim is to prove the following result.
Theorem 2.7 Suppose that λ is a partition of n and that 1 ≤ i_1, i_2, ..., i_k ≤ n. Then u_λ appears in L̃_{i_1} L̃_{i_2} ··· L̃_{i_k} only if ℓ(u_λ) ≤ k.
Because the Murphy operators commute we may assume that i_1 ≥ i_2 ≥ ··· ≥ i_k. Furthermore, by Lemma 2.3, we may replace the Murphy operator L̃_i in the statement of Theorem 2.7 with the L-Murphy operator L_i. We shall need quite a few lemmas before we can approach the summit.
Definition 2.8 An element w of S_n is decreasing if it has a reduced expression of the form w = s_{i_1} s_{i_2} ··· s_{i_r} for some n > i_1 > i_2 > ··· > i_r > 0.
In particular, each u_λ is decreasing. To prove Theorem 2.7 we actually give a recursive formula for calculating the inner products ⟨T̃_w, L̃_{i_1} L̃_{i_2} ··· L̃_{i_k}⟩ whenever w is a decreasing element of length k.
We define an equivalence relation =_dec on H whereby h_1 =_dec h_2 if and only if ⟨T̃_w, h_1⟩ = ⟨T̃_w, h_2⟩ for all decreasing elements w ∈ S_n (h_1, h_2 ∈ H). A note of warning: =_dec does not respect the multiplication in H.
We need one more set of definitions before we can start to work. Henceforth, we fix an integer i where 1 ≤ i < n and let S_i = S({1, 2, ..., i}) and H_i = H_{A,q}(S_i), which we consider as a subalgebra of H in the natural way. We also let H_i^+ = Σ_{w∈S_i} N[α] T̃_w and H^+ = H_n^+. Note that H^+ is closed under multiplication and that L_i ∈ H_i^+ for i = 1, 2, ..., n.
Lemma 2.9 Suppose that h ∈ H^+ and that w appears in T̃_i h for some w ∈ S_n such that ℓ(s_i w) > ℓ(w). Then s_i w appears in T̃_i h.

Proof: By assumption, T̃_i T̃_w = T̃_{s_i w}, so

    ⟨T̃_{s_i w}, T̃_i h⟩ = ⟨T̃_i T̃_w, T̃_i h⟩ = ⟨T̃_w, T̃_i² h⟩ = ⟨T̃_w, h⟩ + α ⟨T̃_w, T̃_i h⟩.

Now h and T̃_i h belong to H^+, so ⟨T̃_w, h⟩ and ⟨T̃_w, T̃_i h⟩ are both polynomials in α with non-negative coefficients. Therefore, no cancellation can occur and ⟨T̃_{s_i w}, T̃_i h⟩ ≠ 0 because ⟨T̃_w, T̃_i h⟩ ≠ 0. □
The following easy but useful lemma is from [2].
Lemma 2.10 Let Y be a Young subgroup of S_n and suppose that w ∈ S_n appears in T̃_x h or h T̃_x for some x ∉ Y and some h ∈ H(Y). Then w ∉ Y.

Proof: It suffices to consider the case where h = T̃_y for some y ∈ Y. The lemma now follows easily by induction on ℓ(y). □
The next seemingly innocuous result will get us half way to Theorem 2.7.
Proposition 2.11 Suppose that w = s_i v for some v ∈ S_i and 1 < i < n. Let x ∈ S_i and suppose that w appears in T̃_i T̃_x T̃_i h for some h ∈ H_i. Then x ∈ S_{i−1}.

Proof: Because w appears in T̃_i T̃_x T̃_i h there exists some z ∈ S_i such that w appears in T̃_i T̃_x T̃_i T̃_z. By assumption, ℓ(w) = ℓ(v) + 1 so T̃_i T̃_w = T̃_v + α T̃_w; therefore,

    0 ≠ ⟨T̃_w, T̃_i T̃_x T̃_i T̃_z⟩ = ⟨T̃_i T̃_w, T̃_x T̃_i T̃_z⟩ = ⟨T̃_v, T̃_x T̃_i T̃_z⟩ + α ⟨T̃_w, T̃_x T̃_i T̃_z⟩.

However, ⟨T̃_v, T̃_x T̃_i T̃_z⟩ = ⟨T̃_v T̃_{z^{−1}}, T̃_x T̃_i⟩ = 0 by Lemma 2.10; so

    0 ≠ ⟨T̃_w, T̃_i T̃_x T̃_i T̃_z⟩ = α ⟨T̃_w, T̃_x T̃_i T̃_z⟩ = α ⟨T̃_i T̃_v T̃_{z^{−1}}, T̃_x T̃_i⟩ = α ⟨T̃_i T̃_v T̃_{z^{−1}}, T̃_{x s_i}⟩.

Therefore, x s_i appears in T̃_i T̃_v T̃_{z^{−1}} and so there exists some y ∈ S_i such that s_i y = x s_i. Hence, x ∈ s_i S_i s_i ∩ S_i = S_{i−1} as required. □
The relevance of the proposition to Theorem 2.7 is revealed by the following corollaries.
Corollary 2.12 Suppose that i ≥ 1, let h ∈ H_i, and suppose that w is decreasing. Then

    ⟨T̃_w, L_{i+1} h⟩ = α ⟨T̃_{s_i w}, h⟩,   if s_i w ∈ S_i,
    ⟨T̃_w, L_{i+1} h⟩ = ⟨T̃_w, h⟩,          if w ∈ S_i,
    ⟨T̃_w, L_{i+1} h⟩ = 0,                  otherwise.

In particular, w appears in L_{i+1} h if and only if either
(i) w ∈ S_i and w appears in h, or
(ii) w = s_i v where v ∈ S_i is decreasing and v appears in h.

Proof: If w ∉ S_{i+1} then w does not appear in L_{i+1} h. In general, by Lemma 2.3 we have

    ⟨T̃_w, L_{i+1} h⟩ = α ⟨T̃_w, T̃_{(i+1,i)} h⟩ + α ⟨T̃_w, T̃_{(i+1,i−1)} h⟩ + ··· + α ⟨T̃_w, T̃_{(i+1,1)} h⟩ + ⟨T̃_w, h⟩.

If w ∈ S_i then w does not appear in T̃_{(i+1,j)} h for any 1 ≤ j ≤ i by Lemma 2.10, so ⟨T̃_w, L_{i+1} h⟩ = ⟨T̃_w, h⟩ as claimed. On the other hand, if w = s_i v for some v ∈ S_i then by Proposition 2.11, w does not appear in T̃_{(i+1,j)} h for 1 ≤ j < i, and by Lemma 2.10 w does not appear in h, so ⟨T̃_w, L_{i+1} h⟩ = α ⟨T̃_{s_i v}, T̃_i h⟩ = α ⟨T̃_v, h⟩ + α² ⟨T̃_v, T̃_i h⟩. However, ⟨T̃_v, T̃_i h⟩ = 0 by Lemma 2.10 again, so the corollary follows. □
This translates into a result of Bögeholz; we generalise both of these corollaries below.
Corollary 2.13 [1, Satz 5.1] Suppose that n > i_1 > i_2 > ··· > i_k > 1 and that w is decreasing. Then w appears in L̃_{i_1} L̃_{i_2} ··· L̃_{i_k} if and only if w = s_{i_1−1} s_{i_2−1} ··· s_{i_k−1}. Moreover, in this case, ⟨T̃_w, L̃_{i_1} L̃_{i_2} ··· L̃_{i_k}⟩ = 1.

Proof: By Corollary 2.12, if 1 ≤ i < n and h ∈ H_i then

    ⟨T̃_w, L̃_{i+1} h⟩ = ⟨T̃_v, h⟩ if s_i w = v ∈ S_i, and ⟨T̃_w, L̃_{i+1} h⟩ = 0 otherwise.

The result now follows easily by induction on k. □
Remark 2.14 If w = s_{i_1−1} s_{i_2−1} ··· s_{i_k−1}, for distinct i_1, i_2, ..., i_k as above, we see that ⟨T_w, L_{i_1} L_{i_2} ··· L_{i_k}⟩ = q^k. More generally, if w ∈ S_n is any element of length k and 2 ≤ i_1, ..., i_k ≤ n then ⟨T_w, L_{i_1} L_{i_2} ··· L_{i_k}⟩ = q^k ⟨T̃_w, L̃_{i_1} L̃_{i_2} ··· L̃_{i_k}⟩. We have renormalised the T-basis and the Murphy operators in order to avoid having to keep track of these units.
The corollary tells us which decreasing elements of Sn appear in a product of distinct Murphy operators. We now turn to the general case, which will follow by essentially
expanding part (ii) of the lemma below.
Lemma 2.15 Suppose that i ≥ 1 and that r ≥ 1. Then
(i) T̃_i L_{i+1}^r = L_i^r T̃_i + α Σ_{s=0}^{r−1} L_{i+1}^{r−s} L_i^s.
(ii) L_{i+1}^r = T̃_i L_i^r T̃_i + α Σ_{s=1}^{r−1} L_i^{r−s} T̃_i L_i^s + α² Σ_{s=1}^{r−1} s L_{i+1}^{r−s} L_i^s.
Proof: First consider (i). When r = 1 the formula reduces to the observation that T̃_i L_{i+1} = L_i T̃_i + α L_{i+1}. By Lemma 2.2, L_i and L_{i+1} commute, so by induction,

    T̃_i L_{i+1}^{r+1} = (L_i T̃_i + α L_{i+1}) L_{i+1}^r = L_i T̃_i L_{i+1}^r + α L_{i+1}^{r+1}
                      = L_i^{r+1} T̃_i + α Σ_{s=0}^{r} L_{i+1}^{r+1−s} L_i^s,

proving (i). For (ii), note that T̃_i^{−1} = T̃_i − α, so using (i) twice we find

    L_{i+1}^r = (T̃_i − α)(L_i^r T̃_i + α Σ_{s=0}^{r−1} L_{i+1}^{r−s} L_i^s)
              = T̃_i L_i^r T̃_i + α Σ_{s=0}^{r−1} (T̃_i L_{i+1}^{r−s}) L_i^s − α L_i^r T̃_i − α² Σ_{s=0}^{r−1} L_{i+1}^{r−s} L_i^s
              = T̃_i L_i^r T̃_i + α Σ_{s=0}^{r−1} (L_i^{r−s} T̃_i + α Σ_{m=0}^{r−s−1} L_{i+1}^{r−s−m} L_i^m) L_i^s − α L_i^r T̃_i − α² Σ_{s=0}^{r−1} L_{i+1}^{r−s} L_i^s
              = T̃_i L_i^r T̃_i + α Σ_{s=1}^{r−1} L_i^{r−s} T̃_i L_i^s + α² Σ_{s=1}^{r−1} s L_{i+1}^{r−s} L_i^s,

as required. □
The next few lemmas investigate which decreasing elements can appear in products of the form T̃_i L_i^r T̃_i; these are by far the most difficult terms appearing in (ii) of the lemma. We begin with a technical result which is another application of Proposition 2.11. At first sight the h appearing in the lemma plays no role; however, it is crucial for our applications because the equivalence relation =_dec does not respect multiplication, as noted above.
For convenience, given i and j, where 1 ≤ i, j < n, we define

    T̃_{i··j} = T̃_i T̃_{i+1} ··· T̃_j,   if i < j,
    T̃_{i··j} = T̃_i,                   if i = j,
    T̃_{i··j} = T̃_i T̃_{i−1} ··· T̃_j,   if j < i.

In particular, L_i = T̃_{i−1··1} T̃_{1··i−1}.
Lemma 2.16 Let h ∈ H_j^+ and suppose that n ≥ j ≥ i ≥ 2 and s ≥ 1. Then T̃_{j··i} L_{i−1}^s T̃_{i−1} T̃_{i··j} h =_dec 0.

Proof: For k ≥ i let A_k = T̃_{k··i} L_{i−1}^s T̃_{i−1} T̃_{i··k}; then we must show that A_j h =_dec 0 for all h ∈ H_j^+. If this is not so then there must exist a decreasing element w ∈ S_{j+1} which appears in A_j h for some h ∈ H_j^+. By Lemma 2.9 we may assume that w = s_j v_j, where v_j ∈ S_j. Since A_j h = T̃_j A_{j−1} T̃_j h we can apply Proposition 2.11 to find an element v_{j−1} ∈ S_{j−1} which appears in A_{j−1}. By Lemma 2.9, s_{j−1} v_{j−1} appears in A_{j−1}, so we can apply Proposition 2.11 once more to find an element v_{j−2} ∈ S_{j−2} which appears in A_{j−2}. Continuing in this way we see that there exists an element v_{i−1} ∈ S_{i−1} which appears in L_{i−1}^s T̃_{i−1}. However, this is impossible by Lemma 2.10, so A_j h =_dec 0 as claimed. □
To proceed with the expansion of T̃_i L_i^r T̃_i we require two closely related families of polynomials.

Definition 2.17 Define a(s) and b(s) to be the polynomials in N[α²] given by a(0) = b(0) = 1, and

    a(s) = Σ_{m=1}^{s} C(s+m−1, 2m−1) α^{2m}    and    b(s) = Σ_{m=0}^{s} C(s+m, 2m) α^{2m},

for s ≥ 1, where C(·,·) denotes a binomial coefficient.

We remark that even though it would have been more natural to define a(0) to be the 0-polynomial, it is much more convenient to have it equal to 1. Also note that because α² = q − 2 + q^{−1}, both a(s) and b(s) are actually Laurent polynomials in q (rather than q^{1/2}). The closed forms for a(s) and b(s) were noticed by John Graham; all that we really need, however, is that they are the polynomials determined by the recurrence formulae below.
Lemma 2.18 Suppose that s ≥ 1. Then
(i) a(s+1) = a(s) + α² b(s).
(ii) b(s) = a(s) + b(s−1).
(iii) a(s) = α² Σ_{m=1}^{s} m a(s−m).
(iv) b(s) = 1 + α² Σ_{m=1}^{s} m b(s−m).

Proof: The first two statements follow from the definitions using the well-known identity C(s, t) = C(s−1, t) + C(s−1, t−1); the final two statements then follow from (i) and (ii) and induction. □
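Both the closed forms of Definition 2.17 and the recurrences just proved are easy to confirm by machine; a minimal sketch (code and names are mine), with polynomials in v = α² stored as coefficient lists:

```python
from math import comb

# Polynomials in v = alpha^2 are stored as coefficient lists [c0, c1, c2, ...].
def a(s):
    """a(s) = sum_{m=1}^{s} C(s+m-1, 2m-1) v^m, with a(0) = 1."""
    if s == 0:
        return [1]
    return [0] + [comb(s + m - 1, 2 * m - 1) for m in range(1, s + 1)]

def b(s):
    """b(s) = sum_{m=0}^{s} C(s+m, 2m) v^m."""
    return [comb(s + m, 2 * m) for m in range(s + 1)]

def padd(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [x + y for x, y in zip(p, q)]

def vmul(p):
    """Multiply by v = alpha^2."""
    return [0] + p

def trim(p):
    """Drop trailing zeros so that equal polynomials compare equal."""
    while p and p[-1] == 0:
        p = p[:-1]
    return p

# Lemma 2.18(i)-(iv), checked for s up to 20.
for s in range(1, 21):
    assert trim(a(s + 1)) == trim(padd(a(s), vmul(b(s))))     # (i)
    assert trim(b(s)) == trim(padd(a(s), b(s - 1)))           # (ii)
    rhs3, rhs4 = [0], [1]
    for m in range(1, s + 1):
        rhs3 = padd(rhs3, vmul([m * c for c in a(s - m)]))
        rhs4 = padd(rhs4, vmul([m * c for c in b(s - m)]))
    assert trim(a(s)) == trim(rhs3)                           # (iii)
    assert trim(b(s)) == trim(rhs4)                           # (iv)
```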
We now return to the expansion of L_{i+1}^r. The next lemma reveals the origin of the a-polynomials.
Lemma 2.19 Let r, i and j be integers such that r ≥ 1 and n ≥ j ≥ i ≥ 2, and suppose that h ∈ H_j^+. Then

    T̃_{j··i} L_i^r T̃_{i··j} h =_dec Σ_{s=0}^{r−1} a(s) T̃_{j··i−1} L_{i−1}^{r−s} T̃_{i−1··j} L_{i−1}^s h.

Proof: We argue by induction on r. When r = 1 it is clear from the definitions that T̃_{j··i} L_i T̃_{i··j} h = T̃_{j··i−1} L_{i−1} T̃_{i−1··j} h, so there is nothing to prove. Suppose then that r > 1. By Lemma 2.2, L_{i−1} commutes with T̃_{j··i} and with T̃_{i··j}. Therefore, using Lemma 2.15(ii) and Lemma 2.16,

    T̃_{j··i} L_i^r T̃_{i··j} h =_dec T̃_{j··i−1} L_{i−1}^r T̃_{i−1··j} h + α² Σ_{m=1}^{r−1} m T̃_{j··i} L_i^{r−m} T̃_{i··j} L_{i−1}^m h.

Because L_{i−1}^m h ∈ H_j^+ we may apply induction to expand the right-hand sum:

    α² Σ_{m=1}^{r−1} m T̃_{j··i} L_i^{r−m} T̃_{i··j} L_{i−1}^m h
        =_dec α² Σ_{m=1}^{r−1} Σ_{t=0}^{r−m−1} m a(t) T̃_{j··i−1} L_{i−1}^{r−m−t} T̃_{i−1··j} L_{i−1}^{t+m} h
        = α² Σ_{m=1}^{r−1} Σ_{s=m}^{r−1} m a(s−m) T̃_{j··i−1} L_{i−1}^{r−s} T̃_{i−1··j} L_{i−1}^s h
        = Σ_{s=1}^{r−1} T̃_{j··i−1} L_{i−1}^{r−s} T̃_{i−1··j} L_{i−1}^s h · α² Σ_{m=1}^{s} m a(s−m).

Now a(s) = α² Σ_{m=1}^{s} m a(s−m) by Lemma 2.18(iii). Since a(0) = 1, combining the last two equations yields the lemma. □
A composition of r into l parts is a sequence σ = (σ_1, σ_2, ..., σ_l) of non-negative integers such that Σ_{i=1}^{l} σ_i = r. A strict composition of r is a composition σ = (σ_1, σ_2, ..., σ_l) of r in which each part σ_i is strictly positive. Let C(r, l) be the set of all compositions of r into l parts and C(r) be the set of all strict compositions of r. For example, C(3, 2) = {(3, 0), (2, 1), (1, 2), (0, 3)} and C(3) = {(3), (2, 1), (1, 2), (1, 1, 1)}.
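For later computations it is convenient to be able to enumerate these sets; a short sketch (function names are mine):

```python
def compositions(r, l):
    """C(r, l): all l-tuples of non-negative integers summing to r."""
    if l == 0:
        return [()] if r == 0 else []
    return [(first,) + rest
            for first in range(r, -1, -1)
            for rest in compositions(r - first, l - 1)]

def strict_compositions(r):
    """C(r): all tuples of strictly positive integers summing to r."""
    if r == 0:
        return [()]
    return [(first,) + rest
            for first in range(r, 0, -1)
            for rest in strict_compositions(r - first)]
```

These reproduce the two example sets above, in the order displayed.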
Lemma 2.20 Let i, j and r be integers such that r ≥ 1 and n ≥ j ≥ i ≥ 2, and suppose that h ∈ H_j^+. Then

    T̃_{j··i} L_i^r T̃_{i··j} h =_dec Σ_{σ∈C(r−1,i)} a(σ_{i−1}) ··· a(σ_1) L_{j+1} L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h.

Proof: This time we argue by induction on i. If i = 2 then, using Lemma 2.19 and noting that L_1 = 1, we have

    T̃_{j··2} L_2^r T̃_{2··j} h =_dec Σ_{s=0}^{r−1} a(s) T̃_{j··1} T̃_{1··j} L_1^s h = Σ_{s=0}^{r−1} a(s) L_{j+1} h = Σ_{σ∈C(r−1,2)} a(σ_1) L_{j+1} h,

as required. Now suppose that i > 2. Then, by Lemma 2.19 and induction,

    T̃_{j··i} L_i^r T̃_{i··j} h =_dec Σ_{s=0}^{r−1} a(s) T̃_{j··i−1} L_{i−1}^{r−s} T̃_{i−1··j} L_{i−1}^s h
        = Σ_{s=0}^{r−1} Σ_{σ∈C(r−s−1,i−1)} a(s) a(σ_{i−2}) ··· a(σ_1) L_{j+1} L_{i−2}^{σ_{i−2}} ··· L_2^{σ_2} L_{i−1}^s h
        = Σ_{σ∈C(r−1,i)} a(σ_{i−1}) ··· a(σ_1) L_{j+1} L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h,

as required. □
For convenience, given a composition σ ∈ C(r−1, i) we let L^σ = L_i^{σ_i} L_{i−1}^{σ_{i−1}} ··· L_1^{σ_1}. We are now ready to compute the needed inner products of the form ⟨T̃_w, h⟩ where h is a product of Murphy operators.
Proposition 2.21 Let r, i and k be integers such that r ≥ 1, n > i ≥ 2 and k ≥ r + 1, and suppose that w = s_i v is a decreasing element of length at least k. Finally, let h = L_{j_{r+1}} L_{j_{r+2}} ··· L_{j_k} where i ≥ j_{r+1} ≥ ··· ≥ j_k ≥ 2. If ℓ(w) = k then

    ⟨T̃_w, L_{i+1}^r h⟩ = Σ_{σ∈C(r−1,i), σ_1=0} α b(σ_i) a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L^σ h⟩.

If ℓ(w) > k then ⟨T̃_w, L_{i+1}^r h⟩ = 0.
Proof: We proceed by simultaneous induction on k and r. If r = 1 then ⟨T̃_w, L_{i+1} h⟩ = α ⟨T̃_v, h⟩, by Corollary 2.12, and the theorem follows.
Suppose then that r > 1 and that the theorem holds for smaller k. Now by Lemma 2.15(ii),

    L_{i+1}^r h = T̃_i L_i^r T̃_i h + α Σ_{s=1}^{r−1} L_i^{r−s} T̃_i L_i^s h + α² Σ_{s=1}^{r−1} s L_{i+1}^{r−s} L_i^s h.

We deal with each of these three terms in turn.
Since h ∈ H_i^+ we may use Lemma 2.20, with j = i, and Corollary 2.12 to see that

    ⟨T̃_w, T̃_i L_i^r T̃_i h⟩ = Σ_{σ∈C(r−1,i)} a(σ_{i−1}) ··· a(σ_1) ⟨T̃_w, L_{i+1} L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h⟩
        = α Σ_{σ∈C(r−1,i)} a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h⟩.

Now v is of length at least k − 1, so by induction on k it cannot appear in L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h unless this is a product of at least k − 1 non-trivial L-Murphy operators. Consequently, to get a non-zero contribution to this sum we require that σ_1 = σ_i = 0. Therefore, noting that b(0) = 1,

    ⟨T̃_w, T̃_i L_i^r T̃_i h⟩ = α Σ_{σ∈C(r−1,i), σ_1=σ_i=0} b(σ_i) a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L^σ h⟩.

By Lemma 2.20 and Corollary 2.12,

    α Σ_{s=1}^{r−1} ⟨T̃_w, L_i^{r−s} T̃_i L_i^s h⟩ = α Σ_{s=1}^{r−1} ⟨T̃_v, T̃_i L_i^{r−s} T̃_i L_i^s h⟩
        = α Σ_{s=1}^{r−1} Σ_{σ∈C(r−s−1,i)} a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L_{i+1} L_i^s L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h⟩
        = α Σ_{s=1}^{r−1} Σ_{σ∈C(r−s−1,i), σ_1=σ_i=0} a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L_i^s L_{i−1}^{σ_{i−1}} ··· L_2^{σ_2} h⟩
        = α Σ_{σ∈C(r−1,i), σ_1=0, σ_i≠0} a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L^σ h⟩,

where the terms in the third line corresponding to σ_1 ≠ 0 or σ_i ≠ 0 disappear because, as above, v cannot appear in a product of fewer than k − 1 non-trivial L-Murphy operators. Finally, by induction on r,

    α² Σ_{s=1}^{r−1} s ⟨T̃_w, L_{i+1}^{r−s} L_i^s h⟩ = α² Σ_{s=1}^{r−1} Σ_{σ∈C(r−s−1,i), σ_1=0} s α b(σ_i) a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L_i^s L^σ h⟩
        = α Σ_{σ∈C(r−1,i), σ_1=0, σ_i≠0} a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L^σ h⟩ · α² Σ_{m=1}^{σ_i} m b(σ_i − m).

By Lemma 2.18(iv), b(σ_i) = 1 + α² Σ_{m=1}^{σ_i} m b(σ_i − m). Therefore, adding up the three contributions to ⟨T̃_w, L_{i+1}^r h⟩ calculated above, we find

    ⟨T̃_w, L_{i+1}^r h⟩ = Σ_{σ∈C(r−1,i), σ_1=0} α b(σ_i) a(σ_{i−1}) ··· a(σ_1) ⟨T̃_v, L^σ h⟩.

By induction the inner products on the right-hand side are zero if ℓ(w) > k, so the result follows. □
As a corollary we obtain Theorem 2.7.
Corollary 2.22 Suppose that w is decreasing and that n ≥ i 1 ≥ i 2 ≥ · · · ≥ i k > 1
is a sequence of positive integers. Then w appears in L i1 L i2 . . . L ik only if `(w) ≤ k.
Moreover, if `(w) = k then w appears in L i1 L i2 . . . L ik only if w = si1 +1 v for some
v ∈ Si1 −1 .
Proof: If k = 1 there is nothing to prove, so suppose k > 1. Then, if w appears in
L i1 L i2 . . . L ik , by Lemma 2.9 either w = si1 −1 v, for some v ∈ Si1 −1 , or `(si1 −1 w) >
2
`(w) and si1 −1 w appears in L i1 L i2 . . . L ik . Now apply Theorem 2.21.
Next we describe exactly which decreasing elements of maximal length appear in a
product of Murphy operators; for this we need some definitions.
A sequence of integers i = (i_1, i_2, ..., i_k) is decreasing if n > i_1 > i_2 > ··· > i_k > 0; i is weakly decreasing if i_1 ≥ i_2 ≥ ··· ≥ i_k. If i is decreasing then we write w_i = s_{i_1} s_{i_2} ··· s_{i_k} (cf. Definition 2.4). Note that w_i is a decreasing element of S_{i_1+1} of length k.
Definition 2.23 Let i = (i_1, i_2, ..., i_k) and j = (j_1, j_2, ..., j_k) be two weakly decreasing sequences of length k. Then i → j if
(i) i_m ≥ j_m for m = 1, 2, ..., k, and
(ii) {i_1, i_2, ..., i_k} ⊆ {j_1, j_2, ..., j_k}.
In particular, note that → is transitive and that i → j only if i_1 = j_1.
Example 2.24 Starting from i = (4, 4, 4) the paths leading to decreasing 3-tuples (omitting (4, 4, 3) and (4, 4, 2) and the edges from (4, 4, 4) to the three decreasing sequences) are shown in a labelled graph. [The graph is not reproduced here; its vertices are weakly decreasing 3-tuples between (4, 4, 4) and the decreasing tuples (4, 3, 2), (4, 3, 1) and (4, 2, 1), with edges labelled by products of the polynomials a(s) and b(s).]
The next result says, for example, that the only decreasing elements of length 3 which appear in L_5³ are s_4 s_3 s_2, s_4 s_3 s_1 and s_4 s_2 s_1.
By Proposition 2.21 the labels on the graph give the inner products ⟨T̃_w, L_5³⟩ when w is one of these elements. For example, ⟨T̃_4 T̃_3 T̃_1, L_5³⟩ = α³ b(2) a(1) + α³ b(1) a(1). The second term comes from the omitted edge (4, 4, 4) → (4, 3, 1), labelled b(1) a(1).
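The relation → and the claims of Example 2.24 can be checked mechanically; a small sketch (helper names are mine):

```python
def arrow(i, j):
    """Test whether i -> j in the sense of Definition 2.23."""
    return (len(i) == len(j)
            and all(im >= jm for im, jm in zip(i, j))
            and set(i) <= set(j))

# All decreasing 3-tuples j with (4, 4, 4) -> j.
start = (4, 4, 4)
targets = [j for j in
           ((a, b, c) for a in range(1, 6) for b in range(1, a) for c in range(1, b))
           if arrow(start, j)]
# targets == [(4, 2, 1), (4, 3, 1), (4, 3, 2)]
```

By Theorem 2.25 below, with i = (5, 5, 5) and i − (1³) = (4, 4, 4), the three targets correspond exactly to the decreasing elements s_4 s_2 s_1, s_4 s_3 s_1 and s_4 s_3 s_2 mentioned above.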
Theorem 2.25 Suppose that j = (j_1, j_2, ..., j_k) is a decreasing sequence and suppose that i = (i_1, i_2, ..., i_k) is weakly decreasing. Then w = w_j appears in L_{i_1} L_{i_2} ··· L_{i_k} if and only if (i − (1^k)) → j, where i − (1^k) = (i_1 − 1, i_2 − 1, ..., i_k − 1).
Proof: Let i = i_1 − 1 and suppose that i_1 = ··· = i_r > i_{r+1} (if necessary, let i_{k+1} = 0). We argue by induction on k. When k = 1 there is nothing to prove, so suppose that k > 1. Since w is of length k, by Proposition 2.21 and Corollary 2.12, w appears in L_{i_1} L_{i_2} ··· L_{i_k} only if w = s_i v for some v ∈ S_i. As (i − (1^k)) → j only if i_1 − 1 = j_1, we may assume that w = s_i v for some v ∈ S_i.
If r = 1 then ⟨T̃_w, L_{i+1}^r L_{i_{r+1}} ··· L_{i_k}⟩ = α ⟨T̃_v, L_{i_{r+1}} ··· L_{i_k}⟩, by Corollary 2.12, and the result follows by induction on k. If r > 1 then, by Proposition 2.21,

    (†)    ⟨T̃_w, L_{i+1}^r L_{i_{r+1}} ··· L_{i_k}⟩ = α Σ_{σ∈C(r−1,i), σ_1=0} a(σ_1) ··· a(σ_{i−1}) b(σ_i) ⟨T̃_v, L^σ L_{i_{r+1}} ··· L_{i_k}⟩.

Now if (i − (1^k)) → j then by induction there exists a composition σ such that v appears in L^σ L_{i_{r+1}} ··· L_{i_k}, which is a summand on the right-hand side of (†). Now all of the terms in this sum belong to H^+, so no cancellation can occur. Therefore, w appears in L_{i+1}^r L_{i_{r+1}} ··· L_{i_k} as required.
Conversely, if w appears in L_{i_1} L_{i_2} ··· L_{i_k} then v appears in L^σ L_{i_{r+1}} ··· L_{i_k} for some σ ∈ C(r−1, i) with σ_1 = 0, by (†). Write L^σ L_{i_{r+1}} ··· L_{i_k} = L_{i_2'} ··· L_{i_k'} for some weakly decreasing sequence (i_2', ..., i_k'). By induction, (i_2' − 1, ..., i_k' − 1) → (j_2, ..., j_k). On the other hand, (i_1 − 1, ..., i_k − 1) → (i_1 − 1, i_2' − 1, ..., i_k' − 1). By transitivity, (i − (1^k)) → j. □
Finally, we deduce the corresponding results for products of the Murphy operators L̃ i .
Theorem 2.26 Suppose that i = (i_1, i_2, ..., i_k) is weakly decreasing and j = (j_1, j_2, ..., j_l) is decreasing, and let w = w_j. Then
(i) w_j appears in L̃_{i_1} L̃_{i_2} ··· L̃_{i_k} only if ℓ(w_j) ≤ k.
(ii) If l = ℓ(w_j) = k then w_j appears in L̃_{i_1} L̃_{i_2} ··· L̃_{i_k} if and only if (i − (1^k)) → j.
(iii) Suppose that ℓ(w) = k and j_1 = i_1 − 1, and let i = i_1 − 1 and let r be the integer such that i_1 = ··· = i_r and i_r > i_{r+1}. Then

    ⟨T̃_{w_j}, L̃_{i+1}^r L̃_{i_{r+1}} ··· L̃_{i_k}⟩ = Σ_{σ∈C(r−1,i), σ_1=0} b(σ_i) a(σ_{i−1}) ··· a(σ_2) ⟨T̃_v, L̃^σ L̃_{i_{r+1}} ··· L̃_{i_k}⟩,

where v = s_{j_2} ··· s_{j_k} and L̃^σ = L̃_i^{σ_i} L̃_{i−1}^{σ_{i−1}} ··· L̃_2^{σ_2}.
Proof: Since L_i = α L̃_i + 1 by Lemma 2.3, (i) follows from Proposition 2.21. Consequently, when ℓ(w) = k we may expand the L_{i_j}'s using (i) to see that ⟨T̃_w, L_{i_1} L_{i_2} ··· L_{i_k}⟩ = α^k ⟨T̃_w, L̃_{i_1} L̃_{i_2} ··· L̃_{i_k}⟩; (ii) and (iii) follow from this observation combined with Theorem 2.25 and Proposition 2.21 respectively. □
Applying Theorem 2.26(iii) recursively gives a purely combinatorial algorithm for computing the inner products hT̃w , L̃ i1 . . . L̃ ik i for any decreasing element w of length k and any
weakly decreasing sequence (i 1 , . . . , i k ).
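One possible implementation of that algorithm is sketched below (my own code, not the paper's; polynomials in v = α² are dicts mapping powers to coefficients, and all names are assumptions). The recursion follows Theorem 2.26(iii), with Corollary 2.22 supplying the vanishing condition:

```python
from math import comb

def a(s):
    """a(s) as a dict {power of v: coefficient}, v = alpha^2; a(0) = 1."""
    return {0: 1} if s == 0 else {m: comb(s + m - 1, 2 * m - 1) for m in range(1, s + 1)}

def b(s):
    """b(s) as a dict {power of v: coefficient}."""
    return {m: comb(s + m, 2 * m) for m in range(s + 1)}

def padd(p, q):
    out = dict(p)
    for k, c in q.items():
        out[k] = out.get(k, 0) + c
    return {k: c for k, c in out.items() if c}

def pmul(p, q):
    out = {}
    for k1, c1 in p.items():
        for k2, c2 in q.items():
            out[k1 + k2] = out.get(k1 + k2, 0) + c1 * c2
    return out

def compositions(r, l):
    if l == 0:
        return [()] if r == 0 else []
    return [(x,) + rest for x in range(r + 1) for rest in compositions(r - x, l - 1)]

def inner(j, i):
    """<T~_{w_j}, L~_{i_1} ... L~_{i_k}> for a decreasing sequence j and a weakly
    decreasing sequence i of the same length k (so that l(w_j) = k)."""
    if not j:
        return {0: 1}
    if j[0] != i[0] - 1:
        return {}                        # w_j cannot appear (Corollary 2.22)
    r = sum(1 for x in i if x == i[0])   # i_1 = ... = i_r > i_{r+1}
    m = i[0] - 1
    total = {}
    # sigma = (sigma_2, ..., sigma_m) records the exponent of L~_t in L~^sigma.
    for sigma in compositions(r - 1, m - 1):
        coeff = {0: 1}
        if sigma:
            coeff = b(sigma[-1])             # b(sigma_m); sigma_1 = 0 throughout
            for s in sigma[:-1]:
                coeff = pmul(coeff, a(s))    # a(sigma_{m-1}) ... a(sigma_2)
        rest = [t for t, e in zip(range(2, m + 1), sigma) for _ in range(e)]
        new_i = tuple(sorted(rest + list(i[r:]), reverse=True))
        total = padd(total, pmul(coeff, inner(j[1:], new_i)))
    return total
```

For example, inner((3, 2, 1), (4, 4, 4)) returns {0: 1, 1: 5, 2: 5, 3: 1}, i.e. 1 + 5v + 5v² + v³, which agrees with the entry m_{(3)(3)} of the matrix M̃_3 displayed in the next section.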
3. Symmetric polynomials

In this section we return to Conjecture 1.1 and reduce it to the conjecture that Σ_ν Γ_{λν} Γ_{νµ} = δ_{λµ} (Kronecker delta), for certain polynomials Γ_{λν} ∈ N[α²] (λ, µ, and ν partitions of some integer k < n).
integer k < n).
If σ = (σ1 , σ2 , . . .) is a composition, let σ̄ be the partition obtained from σ by reordering
its parts in weakly decreasing order. For example, if σ = (3, 0, 2, 3, 1) then σ̄ = (3, 3, 2, 1).
Definition 3.1 Let k and n be positive integers and let µ be a partition of k. Then the µth monomial symmetric function in L̃_2, ..., L̃_n is

    M̃_µ(n) = Σ_{σ∈C(k,n−1), σ̄=µ} L̃_2^{σ_1} L̃_3^{σ_2} ··· L̃_n^{σ_{n−1}}.

Notice that M̃_µ(n) belongs to the centre of H_n by Lemma 2.2 since it is a symmetric polynomial in the Murphy operators. Also, M̃_µ(n) is an element of H_n^+, and M̃_µ(n) = M̃_µ(m) if and only if n = m. For example,

    M̃_{(2,1)}(3) = L̃_2² L̃_3 + L̃_2 L̃_3²,
    M̃_{(2,1)}(4) = L̃_2² L̃_3 + L̃_2 L̃_3² + L̃_2² L̃_4 + L̃_3² L̃_4 + L̃_2 L̃_4² + L̃_3 L̃_4².
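The monomials of M̃_µ(n) are easily enumerated as exponent sequences; a sketch (names mine):

```python
def compositions(r, l):
    """C(r, l): all l-tuples of non-negative integers summing to r."""
    if l == 0:
        return [()] if r == 0 else []
    return [(x,) + rest for x in range(r, -1, -1) for rest in compositions(r - x, l - 1)]

def monomials(mu, n):
    """The exponent sequences (sigma_1, ..., sigma_{n-1}) occurring in M~_mu(n):
    sigma_t is the exponent of L~_{t+1}, and reordering sigma gives mu."""
    k = sum(mu)
    target = tuple(sorted(mu, reverse=True))
    return [sigma for sigma in compositions(k, n - 1)
            if tuple(sorted((x for x in sigma if x), reverse=True)) == target]
```

With mu = (2, 1), monomials(mu, 3) has two elements and monomials(mu, 4) has six, matching the displayed expansions of M̃_{(2,1)}(3) and M̃_{(2,1)}(4).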
Even though M̃_µ(n) ≠ M̃_µ(m) for n ≠ m, these elements look the same as far as decreasing elements of length k are concerned. Recall the permutation u_λ from Definition 2.4.
Theorem 3.2 Suppose that λ is a partition of n and that µ is a partition of k, where k < n. Then
(i) ⟨T̃_{u_λ}, M̃_µ(n)⟩ ≠ 0 if and only if ℓ(u_λ) ≤ k.
(ii) If ℓ(u_λ) ≤ k and m > n then ⟨T̃_{u_λ}, M̃_µ(m)⟩ = ⟨T̃_{u_λ}, M̃_µ(n)⟩.

Proof: Because all of the summands of M̃_µ(n) belong to H^+, no cancellation can occur in ⟨T̃_{u_λ}, M̃_µ(n)⟩, and the first statement is immediate from Theorem 2.26(ii). For the second statement, suppose that u_λ = s_i v for some decreasing element v ∈ S_i. Then u_λ ∈ S_{i+1}, so without loss of generality we may take n = i + 1 and m > n. Therefore, by Theorem 2.26(ii),

    ⟨T̃_{u_λ}, M̃_µ(m)⟩ = Σ_{σ∈C(k,m−1), σ̄=µ} ⟨T̃_{n−1} T̃_v, L̃_m^{σ_{m−1}} ··· L̃_2^{σ_1}⟩
        = Σ_{σ∈C(k,n−1), σ̄=µ} ⟨T̃_{n−1} T̃_v, L̃_n^{σ_{n−1}} ··· L̃_2^{σ_1}⟩
        = ⟨T̃_{u_λ}, M̃_µ(n)⟩,

proving (ii). □
Let λ be a partition of n. As illustrated in Example 2.5, if λ = (λ_1, ..., λ_k) then u_λ is the permutation

    (1, ..., λ_1)(λ_1+1, ..., λ_1+λ_2) ··· (λ_1+···+λ_{k−1}+1, ..., n)

and ℓ(u_λ) = (λ_1 − 1) + (λ_2 − 1) + ··· + (λ_k − 1). Using this observation the next lemma follows easily.

Lemma 3.3 Suppose that n ≥ 2k. Then the map µ ↦ λ = µ + (1^{n−k}) gives a one-to-one correspondence between the partitions µ of k and the partitions λ of n such that ℓ(u_λ) = k (where we write µ + (1^{n−k}) for the partition (µ_1 + 1, µ_2 + 1, ..., µ_{n−k} + 1)).
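Lemma 3.3 can be confirmed for small n; a sketch (the partition helpers are mine):

```python
def partitions(n, largest=None):
    """All partitions of n as weakly decreasing tuples of positive parts."""
    if largest is None:
        largest = n
    if n == 0:
        return [()]
    return [(first,) + rest
            for first in range(min(n, largest), 0, -1)
            for rest in partitions(n - first, first)]

def u_length(lam):
    """l(u_lambda) = (lambda_1 - 1) + ... + (lambda_k - 1)."""
    return sum(part - 1 for part in lam)

def correspond(mu, n):
    """The map mu -> mu + (1^{n-k}) of Lemma 3.3 (pad mu with zeros to n-k parts)."""
    k = sum(mu)
    padded = list(mu) + [0] * (n - k - len(mu))
    return tuple(part + 1 for part in padded)

n, k = 6, 3   # note n >= 2k, as the lemma requires
image = [correspond(mu, n) for mu in partitions(k)]
# image is exactly the set of partitions lambda of n with l(u_lambda) = k
```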
Consequently, if µ is a partition of k of length l then

    u_{µ+(1^l)} = u_{µ+(1^{l+1})} = u_{µ+(1^{l+2})} = ···,

where we identify these permutations under the natural embeddings S_1 ↪ S_2 ↪ ···. Accordingly, we relabel the permutations u_λ using partitions of k, where k is the length of u_λ.

Definition 3.4 Suppose that λ is a partition of k of length l. Let v_λ = u_{λ+(1^l)}.

Thus, v_λ ∈ S_{k+l} and we can consider v_λ as an element of S_n whenever n ≥ k + l. In particular, v_λ ∈ S_{2k} for any partition λ of k.
Consequently, if λ and µ are partitions of k < n then, by Theorem 3.2,

    m_{λµ} = ⟨T̃_{v_λ}, M̃_µ(n)⟩ = ⟨T̃_{v_λ}, M̃_µ(2k)⟩ ≠ 0

depends only upon k and not upon n (and ⟨T̃_{v_λ}, M̃_µ(n)⟩ = 0 when v_λ ∉ S_n). By definition, m_{λµ} ∈ N[α²]. In principle, Theorem 2.26 can be used to calculate the polynomials m_{λµ}; for example, m_{(2,1)(3)} = a(2)b(1) + a(1)² by Example 2.24.
We let M̃_k = (m_{λµ}) be the matrix of these inner products as λ and µ run over the partitions of k. Letting v = α², and indexing the rows and columns lexicographically (beginning with (1³)), the full matrix when k = 3 is

            | 1    v² + 3v        v³ + 3v²          |
    M̃_3 =  | 1    v² + 4v + 1    v³ + 4v² + 2v     |
            | 1    v² + 5v + 3    v³ + 5v² + 5v + 1 |.

Notice that when we set v = 0, or equivalently q = 1, M̃_3 is lower unitriangular. Using Theorem 2.26 one can show that m_{λλ}(0) = 1 and that m_{λµ} has non-zero constant term if and only if λ ⊵ µ, where ⊵ is the usual dominance ordering on partitions. The point is that in the expansion of Theorem 2.26(iii) only the b polynomials can contribute a non-zero constant term to the coefficients. So in general the matrix M̃_k is lower unitriangular when v = 0.
More surprising is the following:

Conjecture 3.5 Suppose that k is a positive integer. Then det M̃_k = 1. In particular, M̃_k is invertible over the Laurent polynomial ring A = Z[q^{1/2}, q^{−1/2}].
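For k = 3 the conjecture can be verified directly from the matrix displayed above; a sketch (my own polynomial helpers, with the entries of M̃_3 copied as coefficient lists in v):

```python
def padd(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [x + y for x, y in zip(p, q)]

def pmul(p, q):
    out = [0] * (len(p) + len(q) - 1)
    for i, x in enumerate(p):
        for j, y in enumerate(q):
            out[i + j] += x * y
    return out

def pneg(p):
    return [-x for x in p]

def det3(M):
    """Determinant of a 3x3 matrix of polynomials, by expansion along row 0."""
    total = [0]
    for c, sign in ((0, 1), (1, -1), (2, 1)):
        r = [M[1][x] for x in range(3) if x != c]
        s = [M[2][x] for x in range(3) if x != c]
        minor = padd(pmul(r[0], s[1]), pneg(pmul(r[1], s[0])))
        term = pmul(M[0][c], minor)
        total = padd(total, term if sign == 1 else pneg(term))
    return total

# M~_3, constant term first: e.g. [0, 3, 1] stands for 3v + v^2.
M3 = [
    [[1], [0, 3, 1], [0, 0, 3, 1]],     # 1, v^2+3v,      v^3+3v^2
    [[1], [1, 4, 1], [0, 2, 4, 1]],     # 1, v^2+4v+1,    v^3+4v^2+2v
    [[1], [3, 5, 1], [1, 5, 5, 1]],     # 1, v^2+5v+3,    v^3+5v^2+5v+1
]
```

Here det3(M3) is the constant polynomial 1, as the conjecture predicts for k = 3.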
This conjecture was made by Gordon James during discussions with Richard Dipper
and the author. James also conjectured that Theorems 2.7 and 3.2 were true. Using these
results we next show that Conjecture 3.5 implies Conjecture 1.1. Below we give an explicit
conjecture for the inverse of M̃k .
Let R be a commutative ring and q̄ an invertible element in R. As in Definition 3.1, given a partition µ of k < n we define M_µ(n) ∈ H_{R,q̄} to be the monomial symmetric function in the Murphy operators L_2, ..., L_n of H_{R,q̄}. By Lemma 2.2, M_µ(n) belongs to the centre of H_{R,q̄}.
Theorem 3.6 Let R be a commutative ring with 1, q̄ an invertible element in R, and
suppose that the matrices M̃k are invertible for all 0 ≤ k < n. Then
{M_µ(n) | µ a partition of k, 0 ≤ k < n, such that v_µ ∈ S_n}
is a basis for the centre of H_{R,q̄}. Consequently, the centre of H_{R,q̄} is precisely the set of symmetric polynomials in the Murphy operators; thus, Conjecture 3.5 implies Conjecture 1.1.
Proof: By Remark 2.14, for any partitions λ and µ of k, the inner product ⟨T_{v_λ}, M_µ(n)⟩ in H_{R,q̄} is equal to the polynomial q^{ℓ(v_λ)}⟨T̃_{v_λ}, M̃_µ(n)⟩ evaluated at q = q̄ (note that m_{λµ} is a polynomial in α² = q − 2 + q^{−1}). Thus we may work in the generic Hecke algebra and then specialize q to q̄.
First consider the case where R is the rational function field ℚ(q) and let M̃(n) be the matrix of inner products ⟨T̃_{v_λ}, M̃_µ(n)⟩ where λ and µ vary over all partitions of all integers k with 0 ≤ k < n. By Theorem 2.26(i), M̃(n) has the form

$$\tilde M(n) = \begin{pmatrix}
\tilde M_1(n) & & & 0 \\
 & \tilde M_2(n) & & \\
 & & \tilde M_3(n) & \\
* & & & \ddots
\end{pmatrix},$$
where M̃_k(n) = (⟨T̃_{v_λ}, M̃_µ(n)⟩) is the submatrix of inner products in M̃(n) where λ and µ run over all of the partitions of k. By Theorem 3.2, when v_λ ∈ S_n the entries in the λth row of M̃_k(n) are equal to the corresponding entries of the matrix M̃_k; in particular they are independent of n. When v_λ ∉ S_n, all of the entries in this row of M̃_k(n) are zero.
Now, by assumption the matrix M̃_k has full rank. Consequently the rank of M̃_k(n) is greater than or equal to the number of partitions λ of k such that v_λ ∈ S_n (0 ≤ k < n). Therefore, the H_{ℚ(q),q}-submodule spanned by the monomial symmetric functions M_µ(n) has dimension greater than or equal to the number of partitions of n. However, by [2, Theorem 2.26] (see also Remark 2.6), the centre of H_{ℚ(q),q} has dimension less than or equal to the number of partitions of n. This completes the proof of the theorem when R = ℚ(q). (In fact this argument holds whenever R is a field.)
Now because the matrices M̃_k are invertible over A = ℤ[q^{1/2}, q^{−1/2}], the monomial symmetric functions {M_µ(n)} span an A-lattice M in the centre of H_{ℚ(q),q} and M ⊗_A R is the centre of H_{R,q̄} (here we consider R as an A-module by specifying that q acts as q̄ on R). Therefore, the {M_µ(n) = q̄^{ℓ(v_µ)} M̃_µ(n) ⊗ 1} are linearly independent and span the centre of H_{R,q̄}. □
Although we have not been able to prove Conjecture 3.5, we do have an explicit conjecture
for the inverse matrix of M̃k . Recall that C (k) is the set of strict compositions of k. There
is a well-known one-to-one correspondence between the compositions σ in C (k) and the
subsets of {1, 2, . . . , k − 1} where
σ = (σ1 , . . . , σl ) ↔ Jσ = {σ1 , σ1 + σ2 , . . . , σ1 + · · · + σl−1 }.
For example, J_{(1^k)} = {1, 2, . . . , k − 1} and J_{(k)} = ∅. Given two partitions λ and µ of k define
cλµ = (−1)`(µ) |{σ ∈ C (k) | Jλ ⊆ Jσ and σ̄ = µ}|,
and let C_k = (c_{λµ}). For example, when k = 4,

$$C_4 = \begin{pmatrix}
1 & & & & \\
1 & -1 & & & \\
1 & -2 & 1 & & \\
1 & -2 & & 1 & \\
1 & -3 & 1 & 2 & -1
\end{pmatrix},$$

where the rows and columns of C_4 are ordered lexicographically and all omitted entries are zero.
So, |c(k)µ | is equal to the number of compositions σ in C (k) such that σ̄ = µ; more
generally, |cλµ | is the number of refinements of λ into compositions σ such that σ̄ = µ.
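The counts c_{λµ} are easy to compute by machine. The sketch below (the helper names are ours, for illustration only) enumerates the 2^{k−1} strict compositions of k through the subset correspondence above, assembles C_4 from the definition of c_{λµ}, and checks as a sanity test that C_4 squares to the identity.

```python
# Compute C_k = (c_{lambda,mu}) from the definition: c_{lambda,mu} is
# (-1)^{l(mu)} times the number of strict compositions sigma of k with
# J_lambda contained in J_sigma whose decreasing rearrangement is mu.
from itertools import combinations

def compositions(k):
    """All 2**(k-1) strict compositions of k, via subsets of {1,...,k-1}."""
    for r in range(k):
        for cuts in combinations(range(1, k), r):
            cuts = (0,) + cuts + (k,)
            yield tuple(cuts[i + 1] - cuts[i] for i in range(len(cuts) - 1))

def J(sigma):
    """The subset of {1,...,k-1} of proper partial sums of sigma."""
    sums, total = set(), 0
    for part in sigma[:-1]:
        total += part
        sums.add(total)
    return sums

def c(lam, mu):
    count = sum(1 for s in compositions(sum(lam))
                if J(lam) <= J(s) and tuple(sorted(s, reverse=True)) == mu)
    return (-1) ** len(mu) * count

# Partitions of 4, ordered lexicographically beginning with (1^4).
parts = [(1, 1, 1, 1), (2, 1, 1), (2, 2), (3, 1), (4,)]
C4 = [[c(l, m) for m in parts] for l in parts]
print(C4[4])  # [1, -3, 1, 2, -1]  -- the row indexed by lambda = (4)

# Sanity check: C_4 squares to the identity matrix.
I5 = [[int(i == j) for j in range(5)] for i in range(5)]
print([[sum(C4[i][t] * C4[t][j] for t in range(5)) for j in range(5)]
       for i in range(5)] == I5)  # True
```

Replacing `parts` by the lexicographically ordered partitions of any k gives C_k in the same way.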
Conjecture 3.7 The inverse matrix of M̃k is Ck M̃k Ck .
In particular this implies Conjecture 3.5 and hence, by Theorem 3.6, Conjecture 1.1.
It is not hard to show that C_k² = I for all k; in fact this reduces to the well-known identity

$$\sum_{J \subseteq S} (-1)^{|J|} = \begin{cases} 1, & \text{if } S = \emptyset, \\ 0, & \text{otherwise.} \end{cases}$$
So, in fact Conjecture 3.7 claims that M̃_k^{−1} = C_k M̃_k C_k^{−1}. Further, if Γ_k = C_k M̃_k then Conjecture 3.7 says that Γ_k² = I. Equivalently, if Γ_k = (Γ_{λµ}) then Conjecture 3.7 holds if and only if $\sum_\nu \Gamma_{\lambda\nu}\Gamma_{\nu\mu} = \delta_{\lambda\mu}$.

By the above remarks Γ_k is lower unitriangular when v = 0 and, assuming Conjecture 3.5, det Γ_k = det C_k = ±1. It would also appear to be true that (−1)^{ℓ(λ)} Γ_{λµ} is a polynomial in v with non-negative coefficients. The matrices Γ_k for k = 1, 2, 3, and 4 are as follows.
$$\Gamma_1 = (1), \qquad
\Gamma_2 = \begin{pmatrix} 1 & v \\ 0 & -1 \end{pmatrix}, \qquad
\Gamma_3 = \begin{pmatrix}
-1 & -v^2 - 3v & -v^3 - 3v^2 \\
0 & v + 1 & v^2 + 2v \\
0 & -1 & -v - 1
\end{pmatrix},$$
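The k = 3 case can be checked by machine. In the following sketch (using the third-party sympy library) the matrix M̃_3 is copied from the text, while C_3 is our own hand computation from the definition of c_{λµ}; the code verifies that Γ_3 = C_3 M̃_3 agrees with the displayed matrix and that Γ_3² = I, as Conjecture 3.7 predicts.

```python
# Check Conjecture 3.7 for k = 3: Gamma_3 = C_3 * M~_3 must square to I.
# C_3 below is computed by hand from the definition of c_{lambda,mu}
# (an assumption to be checked against that definition).
import sympy as sp

v = sp.symbols('v')

M3 = sp.Matrix([
    [1, v**2 + 3*v,     v**3 + 3*v**2],
    [1, v**2 + 4*v + 1, v**3 + 4*v**2 + 2*v],
    [1, v**2 + 5*v + 3, v**3 + 5*v**2 + 5*v + 1],
])
C3 = sp.Matrix([
    [-1, 0,  0],
    [-1, 1,  0],
    [-1, 2, -1],
])

G3 = (C3 * M3).applyfunc(sp.expand)
G3_expected = sp.Matrix([
    [-1, -v**2 - 3*v, -v**3 - 3*v**2],
    [0,   v + 1,       v**2 + 2*v],
    [0,  -1,          -v - 1],
])
print(G3 == G3_expected)                            # True
print((G3 * G3).applyfunc(sp.expand) == sp.eye(3))  # True
```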
and Γ_4 is the matrix

$$\begin{pmatrix}
1 & v^3 + 4v^2 + 6v & v^4 + 4v^3 + 3v^2 & v^5 + 7v^4 + 15v^3 + 12v^2 & v^6 + 8v^5 + 20v^4 + 16v^3 \\
0 & -v^2 - 2v - 1 & -v^3 - 3v^2 - v & -v^4 - 6v^3 - 9v^2 - 4v & -v^5 - 7v^4 - 14v^3 - 8v^2 \\
0 & v & v^2 + 2v + 1 & v^3 + 5v^2 + 4v & v^4 + 6v^3 + 9v^2 + 4v \\
0 & v + 1 & v^2 + 2v & v^3 + 5v^2 + 5v + 1 & v^4 + 6v^3 + 9v^2 + 3v \\
0 & -1 & -v - 1 & -v^2 - 4v - 2 & -v^3 - 5v^2 - 5v - 1
\end{pmatrix}.$$
The reader may check that det Γ_k = ±1 and that Γ_k² = I as we have claimed. We have checked Conjecture 3.7 for k ≤ 7.
Acknowledgments
I would like to thank Gordon James and Richard Dipper for many invaluable discussions on
this problem, without which this work could not have started. I also wish to thank James for
his comments on earlier versions of this paper, Meinolf Geck for writing his GAP programs
for Iwahori-Hecke algebras [5] which helped me guess the form of Theorem 2.21 and the
referee for several useful comments. Supported in part by SERC grant GR/J37690.
References
1. H. Bögeholz, “Die Darstellung des Zentrums der Hecke-Algebra vom Typ A_n aus symmetrischen Polynomen in Murphy-Operatoren,” Diplomarbeit, Univ. Stuttgart, 1994.
2. R. Dipper and G. James, “Blocks and idempotents of Hecke algebras of general linear groups,” Proc. London Math. Soc. 54 (1987), 57–82.
3. G.D. James and A. Mathas, “A q-analogue of the Jantzen-Schaper theorem,” Proc. London Math. Soc. 74 (3) (1997), 241–274.
4. G.E. Murphy, “The idempotents of the symmetric group and Nakayama’s conjecture,” J. Algebra 81 (1983), 258–265.
5. M. Schönert et al., “GAP: groups, algorithms, and programming,” Lehrstuhl D für Mathematik, RWTH Aachen, 3.4.4 edition, 1997.