{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "### Overview\n", "\n", "This code produces examples of some of the objects in the paper \"Minimal special degenerations and duality\" by Daniel Juteau, Paul Levy, Eric Sommers (https://arxiv.org/abs/2310.00521).\n", "\n", "Author: Eric Sommers\n", "Date: November 2, 2023" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Generate all partitions of n, which correspond to nipotent orbits in $GL_n$. Methods for compacting printing." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def partition(n, k):\n", " if n == 0:\n", " yield []\n", " else:\n", " for i in range(min(n,k), 0, -1):\n", " for part in partition(n-i, i):\n", " yield [i]+part\n", " \n", "def Partitions(n):\n", " '''Return a list of all integer partitions of n'''\n", " return [i for i in partition(n,n)]\n", "\n", "#print(Partitions(7))\n", "\n", "def short_hand(part):\n", " short_part = list(set([(x,part.count(x))for x in part]))\n", " short_part.sort(key=lambda x:x[0],reverse=True)\n", " return short_part\n", "\n", "def print_compact(part):\n", " stringy= f'['\n", " short = short_hand(part)\n", " for x in short:\n", " if x[1]>1:\n", " stringy+=f\"{x[0]}^{x[1]}, \"\n", " else:\n", " stringy+=f\"{x[0]}, \"\n", " stringy=stringy[:-2]+f']'\n", " return stringy\n", "\n", "for part in Partitions(5):\n", " print(print_compact(part))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Nilpotent orbits in other classical types\n", "\n", "These methods determine the nilpotent orbits in other classical types. They also include the information for the special nilpotent orbits." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def data_for_type(typee,rank):\n", " '''epsilon is the first entry of the tuple and epsilon' is the second'''\n", " data = [('B',0,1,2*rank+1),('C',1,0,2*rank),('D',0,0,2*rank),('Ca',1,1,2*rank)]\n", " for x in data:\n", " if x[0]==typee:\n", " return x[1],x[2], x[3]\n", " \n", "def check_even_mult(part,d):\n", " '''\n", " parts of size congruent to d should have even multiplicity\n", " '''\n", " for v in part:\n", " if v%2==d and part.count(v)%2==1:\n", " return 0\n", " return 1\n", "\n", "def all_parts(typee,rank):\n", " '''Find all partitions of a given typee and rank'''\n", " epsilon, pos_parity, n = data_for_type(typee,rank)\n", " return [lam for lam in Partitions(n) if check_even_mult(lam,epsilon)]\n", "\n", "## Example \n", "for x in all_parts('B',4):\n", "#for x in all_parts('D',4):\n", " print(print_compact(x))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Special nilpotent orbits\n", "\n", "There are four families of special nilpotent orbits in the classical Lie algebras. The usual special orbits in types B,C,D and a new class that we highlight in our paper in type C. 
" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def check_special(part,d,pos_parity):\n", " '''\n", " parts of size congruent to d should be have position given by pos_parity, epsilon' in paper.\n", " '''\n", " for i,v in enumerate(part):\n", " if v%2==d and (i==0 or part[i-1]%2!=d) and i%2!=pos_parity:\n", " return 0\n", " return 1\n", " \n", "def special_parts(typee,rank):\n", " '''All special partitions of a given typee and rank'''\n", " parts = all_parts(typee,rank)\n", " epsilon, pos_parity, rank = data_for_type(typee,rank)\n", " return [lam for lam in parts if check_even_mult(lam,epsilon) if check_special(lam,epsilon,pos_parity)]\n", "\n", "def make_all_specials(rank):\n", " orbits= {}\n", " for typee in ['B','C','D','Ca']:\n", " orbits[typee] = special_parts(typee,rank)\n", " return orbits\n", "\n", "# pair = ('B','C')\n", "pair = ('D','Ca')\n", "for i in range(6,10):\n", " print(f\"Number of special orbits in rank {i} for {pair[0]} and {pair[1]} is {len(special_parts(pair[0],i))} and {len(special_parts(pair[1],i))}, respectively.\")\n", "\n", "# for part in special_parts('D',4):\n", "# print(print_compact(part))\n", "\n", "N=4\n", "all_orbits = make_all_specials(N)\n", "print(f\"\\nSpecial orbits for rank {N}:\")\n", "for typee, orbits in all_orbits.items():\n", " print(typee)\n", " for part in orbits:\n", " print(print_compact(part))\n", " print()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### The partial order on nilpotent orbits\n", "\n", "Methods using the dominance order and finding the edges the in Hasse diagram of the dominance order." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "\n", "def cumsum(lis,pad):\n", " summy = list(lis)\n", " for i in range(1,len(lis)):\n", " summy[i]+=summy[i-1]\n", " if len(summy)=l2[i] for i in range(len(l1))])\n", "\n", "def comparable(lam,mu):\n", " csl = cumsum(lam,sum(lam))\n", " csm = cumsum(mu,sum(mu))\n", " if vec_diff_positive(csl,csm):\n", " return 1\n", " elif vec_diff_positive(csm,csl):\n", " return 2\n", " return -1\n", "\n", "def hasse_edges(parts):\n", " '''all minimal (special) degenerations'''\n", " all_edges = {}\n", " for i,x in enumerate(parts):\n", " for j,y in enumerate(parts[i+1:],start=i+1):\n", " if comparable(x,y)==1:\n", " all_edges[(i,j)]=0\n", " for i,x in enumerate(parts):\n", " for j,y in enumerate(parts[i+1:],start=i+1):\n", " if (i,j) not in all_edges:\n", " continue\n", " for k,z in enumerate(parts[j+1:],start=j+1):\n", " if (j,k) not in all_edges:\n", " continue\n", " if (i,k) in all_edges:\n", " all_edges.pop((i,k))\n", " return all_edges\n", "\n", "All_specials_for_rank = make_all_specials(4) # make all 4 types of specials of rank 4\n", "hasse_edges(All_specials_for_rank['D']).keys()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### The type of a Slodowy slice between special nilpotent orbits\n", "\n", "These methods carry out the procedure in Tables 1 and 2 of the paper to find the type of the slice." 
] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "def shrink_length(lam):\n",
  "    '''Remove the first column of the partition'''\n",
  "    return [x-1 for x in lam if x >=2]\n",
  "\n",
  "def trim_partitions_length(lam, mu):\n",
  "    '''Remove initial columns from both partitions while they have the same number of parts; skim counts the columns removed'''\n",
  "    l,m = list(lam), list(mu)\n",
  "    skim = 0\n",
  "    while len(l)==len(m):\n",
  "        skim += 1\n",
  "        l,m = shrink_length(l), shrink_length(m)\n",
  "    return l,m, skim\n",
  "\n",
  "def trim_partitions_width(lam, mu):\n",
  "    '''Remove the common leading parts of the two partitions; the last return value counts how many were removed'''\n",
  "    for i,x in enumerate(lam):\n",
  "        if len(mu)<=i+1 or x != mu[i]:\n",
  "            break\n",
  "    return lam[i:], mu[i:],i\n",
  "\n",
  "def trim_both(lam,mu):\n",
  "    l,m,skim = trim_partitions_length(lam,mu)\n",
  "    l,m, height = trim_partitions_width(l,m)\n",
  "    return l,m,skim,height\n",
  "\n",
  "# trim_partitions_length([4,4,2],[5,3,2])\n",
  "# trim_partitions_width([6,5,2],[6,3,1])\n",
  "# trim_both([6,5,2],[6,3,1])\n",
  "\n",
  "def classify_special_pairs2(lam,mu,typee):\n",
  "    '''The procedure in Tables 1 and 2'''\n",
  "    epsilon, pos_parity, rank = data_for_type(typee,sum(lam)//2)\n",
  "    lam,mu,skim,height = trim_both(lam,mu)\n",
  "    if len(lam)==1:\n",
  "        if len(mu)==2:\n",
  "            if height%2 != pos_parity:\n",
  "                return f\"C_{lam[0]//2}^*\"\n",
  "            else:\n",
  "                return f\"C_{lam[0]//2}\"\n",
  "        if len(mu)==3 and mu[1]==mu[2]==1:\n",
  "            return f\"B_{lam[0]//2} (1)\"\n",
  "    if sum(mu)==len(mu): # all 1's\n",
  "        if lam[0]==3 and skim%2 == epsilon:\n",
  "            return f\"b^sp_{len(mu)//2} (1)\"\n",
  "        if lam[0]==2 and skim%2 == epsilon:\n",
  "            #return f\"c^sp_{len(mu)//2}\"\n",
  "            return f\"d^+_{len(mu)//2}\"\n",
  "        if lam[0]==2 and skim%2 != epsilon:\n",
  "            return f\"c^sp_{len(mu)//2}\"\n",
  "            #return f\"d_{len(mu)//2}\"\n",
  "    if sum(mu)==2*len(mu) and mu[0]==2: # all 2's\n",
  "        if lam[0]==3 and skim%2 != epsilon:\n",
  "            return f\"b^sp_{len(mu)//2} (2)\"\n",
  "        if lam[0]==4 and skim%2 != epsilon:\n",
  "            return f\"d_{len(mu)//2+1}/V_4\"\n",
  "    if len(mu)==4 and mu[2]==mu[3]==1 and mu[0]==mu[1]:\n",
  "        return f\"2B_{lam[0]//2}\"\n",
  "    if len(mu)==3 and mu[2]==2 and mu[0]==mu[1]:\n",
  "        return f\"B_{mu[0]//2} (2)\"" ] },
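 { "cell_type": "markdown", "metadata": {}, "source": [ "As an added illustration (not asserting any particular values, and using only the functions defined above), the next cell applies `classify_special_pairs2` to each minimal degeneration between special orbits of $D_4$ and prints whatever label the procedure returns, with the orbits shown as partitions." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "## Illustrative demo: run the classification on each Hasse edge between special orbits of D_4.\n",
  "## The printed labels are simply whatever classify_special_pairs2 returns for that edge.\n",
  "orbits_D4 = make_all_specials(4)['D']\n",
  "for (i, j) in hasse_edges(orbits_D4):\n",
  "    lam, mu = orbits_D4[i], orbits_D4[j]\n",
  "    print(print_compact(lam), '>', print_compact(mu), ':', classify_special_pairs2(lam, mu, 'D'))" ] },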
 { "cell_type": "markdown", "metadata": {}, "source": [ "### Methods related to the duality of the special orbits" ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "## These methods deal with the type of the dual Lie algebra under the various maps\n",
  "\n",
  "def across_type(typee):\n",
  "    initial = ['B','C','D','Ca']\n",
  "    across = ['C','B','Ca','D']\n",
  "    return(across[initial.index(typee)])\n",
  "\n",
  "def LS_dual_type(typee):\n",
  "    initial = ['B','C','D','Ca']\n",
  "    dual = ['C','B','D','Ca']\n",
  "    return(dual[initial.index(typee)])\n",
  "\n",
  "def internal_dual_type(typee):\n",
  "    initial = ['B','C','D','Ca']\n",
  "    dual = ['B','C','Ca','D']\n",
  "    return(dual[initial.index(typee)])\n",
  "\n",
  "def dual_square(typee):\n",
  "    return [typee,across_type(typee),internal_dual_type(typee),LS_dual_type(typee)]" ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "## These methods carry out the duality among the actual partitions\n",
  "\n",
  "def transpose(lam):\n",
  "    '''Usual partition transpose'''\n",
  "    dual = []\n",
  "    l = list(lam)\n",
  "    while(len(l)>0):\n",
  "        dual.append(len(l))\n",
  "        l = shrink_length(l)\n",
  "    return dual\n",
  "\n",
  "def duality(lam):\n",
  "    return transpose(lam)\n",
  "\n",
  "def sp_to_sp_from_lemma2_2(lam,typee,orbits):\n",
  "    '''Map between special orbits of the same codimension'''\n",
  "    epsilon, pos_parity, rank = data_for_type(typee,sum(lam)//2)\n",
  "    parti = []\n",
  "    lam_values = sorted(list(set(lam)),reverse=1)+[0]\n",
  "    #print(lam_values)\n",
  "    for s in lam_values:\n",
  "        mult = len([x for x in lam if x==s])\n",
  "        if s%2==epsilon:\n",
  "            parti+=[s]*mult\n",
  "            continue\n",
  "        height = len([x for x in lam if x>=s])\n",
  "        if mult%2==1:\n",
  "            if height%2 == pos_parity:\n",
  "                parti+=[s]*(mult-1)\n",
  "                if s>1:\n",
  "                    parti+=[s-1]\n",
  "            else:\n",
  "                parti+=([s+1]+[s]*(mult-1))\n",
  "        if mult%2==0:\n",
  "            if height%2 == pos_parity:\n",
  "                parti+=([s+1]+[s]*(mult-2))\n",
  "                if s>1:\n",
  "                    parti+=[s-1]\n",
  "            else:\n",
  "                parti+=[s]*mult\n",
  "    return parti\n",
  "\n",
  "# brute force way\n",
  "def collapse(lam, orbits):\n",
  "    '''The collapse map for lam relative to a set of valid partitions in orbits'''\n",
  "    for mu in orbits:\n",
  "        if comparable(lam,mu)==1:\n",
  "            return mu\n",
  "\n",
  "def sp_to_sp_same_dim(lam,typee,orbits):\n",
  "    '''Map between special orbits of the same codimension, using the collapse map instead'''\n",
  "    llam = list(lam)\n",
  "    if typee == 'C': # C to B\n",
  "        llam[0]+=1\n",
  "    elif typee == 'B':\n",
  "        llam[-1]-=1\n",
  "        if llam[-1]==0:\n",
  "            llam.pop(-1)\n",
  "    elif typee == 'Ca':\n",
  "        pass\n",
  "    elif typee == 'D':\n",
  "        llam[0]+=1\n",
  "        llam[-1]-=1\n",
  "    return collapse(llam, orbits[across_type(typee)])\n",
  "\n",
  "def test_special_to_special_methods_coincide(rank1, how_many=2): ## compare the two methods for going between specials of the same codim\n",
  "    for rank in range(rank1,rank1+how_many):\n",
  "        print(rank)\n",
  "        orbits = make_all_specials(rank)\n",
  "        for typee in orbits:\n",
  "            #print(typee)\n",
  "            for lam in orbits[typee]:\n",
  "                j_image = sp_to_sp_same_dim(lam,typee,orbits)\n",
  "                j_alt = sp_to_sp_from_lemma2_2(lam,typee,orbits)\n",
  "                if(j_image != j_alt):\n",
  "                    print(\"boo\")\n",
  "                    return 0\n",
  "    return 1\n",
  "\n",
  "## Check that the two different methods coincide\n",
  "#print(test_special_to_special_methods_coincide(8,4))\n",
  "\n",
  "def make_square_of_partitions(lam,typee,All_specials_for_rank):\n",
  "    j_image = sp_to_sp_same_dim(lam,typee, All_specials_for_rank)\n",
  "    #j_image = sp_to_sp_from_lemma2_2(lam,typee, All_specials_for_rank)\n",
  "    duals = [duality(lam),duality(j_image)] # this is correct, but switches type with regular duality\n",
  "    return [lam, j_image]+duals\n",
  "\n",
  "def make_all_squares_for_type(typee,All_specials_for_rank):\n",
  "    square_dictionary = {}\n",
  "    for lam in All_specials_for_rank[typee]:\n",
  "        square_dictionary[tuple(lam)] = make_square_of_partitions(lam, typee,All_specials_for_rank)\n",
  "    return square_dictionary\n",
  "\n",
  "## Show the squares of partitions for each partition.\n",
  "# All_specials_for_rank = make_all_specials(4)\n",
  "# for typee in All_specials_for_rank:\n",
  "#     print(typee)\n",
  "#     All_squares_for_type = make_all_squares_for_type(typee,All_specials_for_rank)\n",
  "#     for lam in All_squares_for_type:\n",
  "#         count=0\n",
  "#         for mu in All_squares_for_type[lam]:\n",
  "#             print(print_compact(mu),end=' ')\n",
  "#             count+=1\n",
  "#             if count%2==0:\n",
  "#                 print()\n",
  "#         print()" ] },
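 { "cell_type": "markdown", "metadata": {}, "source": [ "As an added sanity check of the comment in `make_square_of_partitions` (that the plain transpose gives the duality but switches the type), the next cell verifies for a small rank that the transpose of every special partition again appears among the specials of the type given by `internal_dual_type`, printing any partition for which this fails." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "## Added sanity check: the transpose of each special partition should again be special,\n",
  "## for the type returned by internal_dual_type; any exceptions are printed.\n",
  "check_rank = 4\n",
  "specials_check = make_all_specials(check_rank)\n",
  "for typee in specials_check:\n",
  "    target = specials_check[internal_dual_type(typee)]\n",
  "    for lam in specials_check[typee]:\n",
  "        if duality(lam) not in target:\n",
  "            print('unexpected:', typee, print_compact(lam))\n",
  "print('transpose check finished for rank', check_rank)" ] },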
 { "cell_type": "markdown", "metadata": {}, "source": [ "### Singularities of slices under duality\n", "\n", "This is the main part of the code: it shows the results in the tables in the paper and how the slices behave under the three maps." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "typee = 'C'\n",
  "rank = 5\n",
  "#rank = 15\n",
  "dual_types = dual_square(typee)\n",
  "\n",
  "#epsilon, pos_parity, rank = data_for_type(typee,rank)\n",
  "All_specials_for_rank = make_all_specials(rank)\n",
  "these_special_orbits = All_specials_for_rank[typee]\n",
  "\n",
  "hasse = hasse_edges(All_specials_for_rank[typee])\n",
  "edges = list(hasse)\n",
  "#print(edges)\n",
  "\n",
  "square_dictionary = make_all_squares_for_type(typee,All_specials_for_rank)\n",
  "epsilon, pos_parity, n = data_for_type(typee,rank)\n",
  "\n",
  "for x in edges:\n",
  "    lam,mu = [these_special_orbits[y] for y in x]\n",
  "    lam, j_lam, dlam, dj_lam = square_dictionary[tuple(lam)]\n",
  "    mu, j_mu, dmu, dj_mu = square_dictionary[tuple(mu)]\n",
  "    if typee in ['B','C']:\n",
  "        edge = (these_special_orbits.index(dmu),these_special_orbits.index(dlam))\n",
  "    else:\n",
  "        edge = (these_special_orbits.index(dj_mu),these_special_orbits.index(dj_lam))\n",
  "    edges.remove(edge) # remove the corresponding dual edge so that it is not processed again\n",
  "\n",
  "    print(classify_special_pairs2(lam,mu,dual_types[0]),' -> ',end='') #,lamB,muB,end=' ')\n",
  "    print(classify_special_pairs2(j_lam,j_mu,dual_types[1]),end=' ')\n",
  "    print(f\"{print_compact(lam)},{print_compact(mu)}->{print_compact(j_lam)},{print_compact(j_mu)}\") #, end=\" \")\n",
  "\n",
  "    print(classify_special_pairs2(dmu,dlam,dual_types[2]),' -> ',end='')\n",
  "    print(classify_special_pairs2(dj_mu,dj_lam,dual_types[3]), end=' ')\n",
  "    print(f\"{print_compact(dmu)},{print_compact(dlam)}->{print_compact(dj_mu)},{print_compact(dj_lam)}\") #, end=\" \")\n",
  "    print()" ] },
 { "cell_type": "markdown", "metadata": {}, "source": [ "### Lusztig's Canonical Quotient\n", "\n", "The following code shows the rank of the canonical quotient, which is a vector space over $\\mathbb F_2$. In type D, it is relative to $O(2n)$." ] },
 { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [
  "def canonical_quotient_size(lam, typee, rank):\n",
  "    epsilon, pos_parity, rank = data_for_type(typee,rank)\n",
  "    corners = sum([1 for i,v in enumerate(lam) if\n",
  "                   (i==len(lam)-1 or (i<len(lam)-1 and v>lam[i+1]))\n",
  "                   and v%2!=epsilon and (i+1)%2==pos_parity])\n",
  "    if typee=='B':\n",
  "        return corners-1\n",
  "    else:\n",
  "        return corners\n",
  "\n",
  "rank = 5\n",
  "All_specials_for_rank = make_all_specials(rank)\n",
  "for typee in All_specials_for_rank:\n",
  "    print(typee)\n",
  "    dual_types = dual_square(typee)\n",
  "    for lam in All_specials_for_rank[typee]:\n",
  "        count=0\n",
  "        for i,mu in enumerate(make_square_of_partitions(lam,typee,All_specials_for_rank)):\n",
  "            print(print_compact(mu),canonical_quotient_size(mu,dual_types[i], rank),end=' ')\n",
  "            count+=1\n",
  "            if count%2==0:\n",
  "                print()\n",
  "        print()" ] }
 ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.6" } }, "nbformat": 4, "nbformat_minor": 4 }