Symmetry should be a familiar concept to all of us
We speak of shapes as being "symmetrical" or "unsymmetrical" or even "more symmetrical than other shapes"
Symmetry is fundamental in mathematics and science, so we must specify what we mean by symmetry in a rigorous way
Group Theory is the study of symmetry
The application of Group Theory extends far beyond geometric figures, as it also helps us study abstract mathematical concepts that exhibit "symmetric behavior"
A symmetry operation is defined as a transformation that causes every point of the body to be coincident with an equivalent point of the body in its original orientation
In other words, it is an action that moves the object into an equivalent configuration such that both distances and shape are preserved and the transformed object is indistinguishable from its original position
It is the inherent symmetry of the object that allows it to be moved and still look the same
Although symmetry operations leave objects indistinguishable, this does not necessarily mean that the object has to be in a state that is identical to the original one
Being indistinguishable means that in the absence of labelling or other external indication, it is impossible to tell whether a symmetry operation is performed
This means that the effects of different symmetry operators are not necessarily the same
We can therefore distinguish one symmetry operation from another simply by comparing the configuration of the transformed object
Symmetry operations that lead to the same configuration are qualitatively the same
A symmetry operation is executed by a symmetry operator, Sym
This is exactly the same principle by which linear transformations are achieved using linear operators
The symmetry operators are linear operators, so they have the same mathematical properties
Even if we do not perform symmetry operations, objects still possess an abstract geometrical entity which we denote as the symmetry element
A symmetry element can be any geometric entity about which the corresponding symmetry operation can take place
Alternatively, we can define the symmetry element of a symmetry operation as a geometric object that is invariant to the corresponding transformation
These symmetry elements allow us to quantify how symmetric an object is
Symmetry operations and symmetry elements are closely interrelated
A symmetry operation can only be defined with respect to the corresponding symmetry element
The existence of a symmetry element can only be demonstrated by showing that the corresponding symmetry operation exists
Symmetry operations are nothing but linear transformations that result in a configuration that is indistinguishable from the original one
We can interpret this using vectors. Suppose we apply a symmetry operation to all the vectors in a space; the vectors will collectively transform in a way that looks indistinguishable from the original vector space
Loosely speaking, a symmetry operation transforms the entire vector space as a whole such that the relative positions of vectors within said space remain unchanged
Due to this property, symmetry operations fall under a special category of linear transformation, which we refer to as unitary transformations
Unitary transformations are transformations that preserve lengths of vectors and the angle between any two vectors
Information about the alignments and lengths of vectors is all contained within an inner product, so we can say that a unitary transformation preserves the inner product
⟨U|W⟩ = ⟨U_{trans}|W_{trans}⟩
The value of the inner product remains unchanged when both vectors are transformed by the same unitary operator
Sym|U⟩ = |U_{trans}⟩ and Sym|W⟩ = |W_{trans}⟩
The product of a unitary operator with its hermitian transpose will give the identity operator
Since we can view the inner product as a matrix-vector multiplication between a hermitian-transposed ket (a bra) and another ket, we can rewrite the previous relation as
⟨U|W⟩ = (Sym|U⟩)^{H}(Sym|W⟩)
We can then distribute the hermitian transpose and rewrite the equation in terms of bras where possible
⟨U|W⟩ = (⟨U|Sym^{H})(Sym|W⟩)
We know that inserting the identity operator inside an inner product does not affect its value
⟨U|I|W⟩ = ⟨U|Sym^{H}Sym|W⟩
Simply by comparing terms, we can conclude that
Sym^{H}Sym = I
Alternatively, we can say that the inverse of a unitary operator is given by its hermitian transpose
Sym^{H} = Sym^{-1}
The columns of a unitary matrix form a set of orthonormal vectors
Since symmetry operators are just linear operators and linear operators can be represented by matrices, it shouldn't be surprising that symmetry operators can also be represented by matrices
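To make this concrete, here is a minimal numerical sketch: the 120° rotation matrix and the two vectors are arbitrary illustrative choices, standing in for a symmetry operator and a pair of kets. It checks that Sym^{H}Sym = I, that the hermitian transpose is the inverse, that the inner product is preserved, and that the columns are orthonormal

```python
import numpy as np

# A 120-degree rotation about the z-axis, used here as an example symmetry operator
theta = 2 * np.pi / 3
Sym = np.array([[np.cos(theta), -np.sin(theta), 0],
                [np.sin(theta),  np.cos(theta), 0],
                [0,              0,             1]])

# Unitarity: Sym^H Sym is the identity, so Sym^H is the inverse
print(np.allclose(Sym.conj().T @ Sym, np.eye(3)))            # True
print(np.allclose(Sym.conj().T, np.linalg.inv(Sym)))         # True

# Inner-product preservation for two arbitrary vectors
u = np.array([1.0, 2.0, 3.0])
w = np.array([-1.0, 0.5, 4.0])
print(np.isclose(np.vdot(u, w), np.vdot(Sym @ u, Sym @ w)))  # True

# The columns of the (real) unitary matrix are orthonormal
print(np.allclose(Sym.T @ Sym, np.eye(3)))                   # True
```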
A group is a collection of all the symmetry operations that can be performed on an entity
There are four defining properties inherent to a complete collection of symmetry operations of any entity, and we can use these properties to define the four axioms of a group
All groups must obey the Closure Axiom
Since all symmetry operations of a group leave the corresponding entity indistinguishable, applying multiple symmetry operations consecutively will also have the net effect of leaving the object indistinguishable
Since applying two operations consecutively amounts to multiplying said operators in the corresponding order, the product of two symmetry operators of a group will yield another symmetry operator of the same group
♣ ∈ G and ♠ ∈ G ⟹ ♣♠ ∈ G and ♠♣ ∈ G
In other words, a group exhibits closure with respect to multiplication, where every product of two operations in the group is equivalent to one of the operations of said group
It is therefore not possible to generate a new symmetry operation by combining those in the group
All groups must obey the Associative Axiom
Since symmetry operators are linear operators and the composition of linear operators is associative, the composition of symmetry operators must also be associative
(♣♡)♠=♣(♡♠)
Note that, just like regular linear operators, the multiplication of symmetry operators generally does not commute
All groups must obey the Identity Axiom
Applying the identity operator amounts to applying a transformation that does not move anything around, which must therefore leave any entity in an identical configuration
The identity operator must therefore be a symmetry operation for any entity
In other words, all groups must contain one and only one identity operator
I ∈ G for all groups G
Moreover, the composition of any member of the group with the identity element in any order will always return the same member
All groups must obey the Inverse Axiom
Symmetry operators are linear operators that preserve the dimension of the input, so they can be represented by square matrices with non-zero determinant. In other words, all symmetry operators have an inverse
The inverse undoes a symmetry operation by transforming the entity from an indistinguishable configuration back to its original configuration
original ENTITY ─(applying symmetry operator ♠)→ indistinguishable ENTITY
indistinguishable ENTITY ─(applying inverse symmetry operator ♠^{-1})→ original ENTITY
Hence, the inverse must also be part of the group of the corresponding symmetry operator
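As a sketch of the four axioms in action, the snippet below takes the three rotations that leave an equilateral triangle indistinguishable (the cyclic group commonly labelled C3) and checks closure, associativity, identity and inverses numerically; the 2×2 rotation matrices and the tolerance are illustrative choices

```python
import numpy as np

def rot(angle):
    """2x2 rotation matrix through the given angle (radians)."""
    return np.array([[np.cos(angle), -np.sin(angle)],
                     [np.sin(angle),  np.cos(angle)]])

# The three rotations that leave an equilateral triangle indistinguishable
group = [rot(0.0), rot(2 * np.pi / 3), rot(4 * np.pi / 3)]   # E, C3, C3^2

def index_of(M, group):
    """Return the index of the group element equal to M, or None if absent."""
    for k, G in enumerate(group):
        if np.allclose(M, G, atol=1e-10):
            return k
    return None

# Closure: every product of two elements lands back in the set
print(all(index_of(A @ B, group) is not None for A in group for B in group))

# Associativity: (AB)C == A(BC) for every triple
print(all(np.allclose((A @ B) @ C, A @ (B @ C))
          for A in group for B in group for C in group))

# Identity: the set contains the identity matrix
print(index_of(np.eye(2), group) is not None)

# Inverses: every element's inverse is also in the set
print(all(index_of(np.linalg.inv(A), group) is not None for A in group))
```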
The study of groups is not just about what a particular set of symmetry operations is
The essence of group theory lies in understanding how the symmetry operations in any given group combine with each other
The Closure Axiom states that the product of symmetry operators within a group must yield another symmetry operator within the same group
We can therefore summarize all the possible combination of symmetry operations of a group using a group multiplication table
Each entry is obtained by applying an operation in the top row followed by an operation from the first column
Each row or column of any group multiplication table contains each symmetry operator in the said group once and once only
Suppose ♣ , ♠, ♢ and ♡ are in the same group and somehow we obtain these two relations
♣♡ = ♢ and ♠♡ = ♢
We can then multiply both equations from the right by ♡^{-1} and see that different operators cannot give the same product with another operator in the same order
♣ = ♢♡^{-1} and ♠ = ♢♡^{-1} ⟹ ♣ = ♠
This guarantees that each symmetry operator in a group will not appear more than once in each row or column
There are as many entries in each row or column as the order of the group, so if each entry has to be different, the whole group has to be exhausted
In other words, each row or column of the multiplication table contains all the symmetry operators of the group, but the orders may be different
Each group has a unique multiplication table and we can extract useful information from the table
All possible symmetry operations that can be applied to an object are listed in any column or row of the table
The order of the group, h , is given by the number of columns or rows of the table
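The sketch below builds such a multiplication table for the six symmetry operations of an equilateral triangle (order h = 6), representing each operation by the permutation it induces on the three vertices; the labels E, C3, C3^2, s_a, s_b, s_c are illustrative names, and the table follows the convention that each entry is the column operation applied first, followed by the row operation. It then checks the rearrangement property that every row and column contains each operation exactly once

```python
# Each symmetry operation of the triangle is stored as a permutation of the
# vertex labels (0, 1, 2): perm[i] is the vertex that vertex i is sent to.
ops = {
    "E":    (0, 1, 2),   # identity
    "C3":   (1, 2, 0),   # rotation by 120 degrees
    "C3^2": (2, 0, 1),   # rotation by 240 degrees
    "s_a":  (0, 2, 1),   # reflection fixing vertex 0
    "s_b":  (2, 1, 0),   # reflection fixing vertex 1
    "s_c":  (1, 0, 2),   # reflection fixing vertex 2
}

def compose(q, p):
    """Apply permutation p first, then q (the operator product qp)."""
    return tuple(q[p[i]] for i in range(3))

names = list(ops)
lookup = {perm: name for name, perm in ops.items()}

# Entry in row r, column c: apply the column operation first, then the row operation
table = [[lookup[compose(ops[r], ops[c])] for c in names] for r in names]

print(" " * 6 + " ".join(f"{c:>5}" for c in names))
for r, row in zip(names, table):
    print(f"{r:>5} " + " ".join(f"{x:>5}" for x in row))

# Rearrangement property: every row and every column contains each operation once
print(all(sorted(row) == sorted(names) for row in table))
print(all(sorted(col) == sorted(names) for col in zip(*table)))
```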
A group can contain many different symmetry operators, but we can categorize these symmetry operators into different "classes"
Loosely speaking, symmetry operators within the same class have similar actions
Symmetry operators that give essentially the same operation physically but differ in labeling belong in the same class
Symmetry operations in the same class are operations that can be interchanged if the coordinate system is transformed into an indistinguishable configuration
♣ ─(coordinate system transformed to an equivalent configuration)→ ♠
In other words, these symmetry operations only differ in the basis chosen
We can therefore interpret these operators as the same operator expressed in different but equivalent bases
It is then quite obvious that we can relate the two operators using a similarity transformation
♣ = X^{-1}♠X
The change of basis matrix, X, must transform the coordinate system into an equivalent configuration, which means X can be any symmetry operator of the group
We can define mathematically what it means for two symmetry operations to be in the same class
Two symmetry operators belong to the same class if
♣ = X^{-1}♠X, where X is an arbitrary symmetry operator in the group and X^{-1} is its inverse
Symmetry operators that fulfill this requirement are called conjugate operators
Symmetry operators of the same class have certain properties
If ♣ and ♠ are in the same class, there must exist two equalities of the form ♣ = X^{-1}♠X and ♠ = Y^{-1}♣Y, where Y = X^{-1} is also a member of the group
If ♡ and ♢ are both conjugate to ♣ , then ♡ and ♢ are conjugate to each other
All mutually conjugate elements form a class
Each element must belong to one class and one class only
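As an illustration, the sketch below takes the same triangle group, now written as 2×2 rotation and reflection matrices (an illustrative choice of representation), and sorts the six operations into classes by brute-force conjugation with every member of the group

```python
import numpy as np

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

def refl(phi):
    """Reflection across the line through the origin at angle phi."""
    return np.array([[np.cos(2*phi),  np.sin(2*phi)],
                     [np.sin(2*phi), -np.cos(2*phi)]])

# The six symmetry operations of an equilateral triangle as 2x2 matrices
names = ["E", "C3", "C3^2", "s_a", "s_b", "s_c"]
mats  = [rot(0), rot(2*np.pi/3), rot(4*np.pi/3),
         refl(0), refl(np.pi/3), refl(2*np.pi/3)]

def same(A, B):
    return np.allclose(A, B, atol=1e-10)

# Two operators are conjugate if A = X^{-1} B X for some X in the group
def conjugate(A, B):
    return any(same(A, np.linalg.inv(X) @ B @ X) for X in mats)

# Group the operations into classes of mutually conjugate elements
classes = []
for name, A in zip(names, mats):
    for cls in classes:
        if conjugate(A, mats[names.index(cls[0])]):
            cls.append(name)
            break
    else:
        classes.append([name])

print(classes)   # expected: [['E'], ['C3', 'C3^2'], ['s_a', 's_b', 's_c']]
```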
While the multiplication table of a group contains a lot of useful information, it is often quite inconvenient to work with these abstract symmetry operations
When doing maths, we want to work with things that can be computed
We therefore wish to develop representations for these abstract groups so that we can perform computation more easily
Any set of elements which combine according to the multiplication table is said to form a representation of the group
The elements in the representation combine among themselves in a manner parallel to the way in which the symmetry operators in the group combine
We have established that symmetry operators are a special kind of unitary operator, which can be represented by unitary matrices
Symmetry operations result in an indistinguishable configuration, so the transformation must preserve the dimension
Hence, symmetry operators can be represented by square unitary matrices
A set of unitary matrices that multiply together in the same manner as a group multiplication table can be a representation of that group
Suppose ♣ , ♠ and ♡ are members of a group, then their corresponding matrix representations are denoted by Γ(♣) , Γ(♠) and Γ(♡) respectively
If the product of ♣ and ♠ gives ♡, then the matrix multiplication between Γ(♣) and Γ(♠) must also give Γ(♡)
♣♠ = ♡ ⟺ Γ(♣)Γ(♠) = Γ(♡)
In general, we can always create a multiplication table using the matrix representations that is parallel to the multiplication table of the symmetry operators
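The following sketch checks this parallel-multiplication property for one concrete choice of representation, the 3×3 permutation matrices describing how the triangle's symmetry operations shuffle its three vertices; the construction of Γ here is an illustrative choice, not the only possible representation

```python
import numpy as np
from itertools import permutations

# The six vertex permutations of the triangle (the same group as before)
ops = list(permutations(range(3)))          # all permutations of (0, 1, 2)

def compose(q, p):
    """Apply p first, then q."""
    return tuple(q[p[i]] for i in range(3))

def Gamma(p):
    """3x3 permutation matrix representing p: basis vector i is sent to p[i]."""
    M = np.zeros((3, 3))
    for i in range(3):
        M[p[i], i] = 1.0
    return M

# Homomorphism property: Gamma(q∘p) == Gamma(q) @ Gamma(p) for every pair
print(all(np.allclose(Gamma(compose(q, p)), Gamma(q) @ Gamma(p))
          for q in ops for p in ops))       # True
```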
Before we go any further, we shall remind ourselves what a block-diagonalized matrix is trying to say
Each column of the matrix expresses a transformed basis vector in terms of the original basis, while each row gives the components of the transformed basis vectors along a particular original basis vector
We can see that each transformed basis vector only has non-zero components along basis vectors within the same block
In practice, this means that basis vectors in the same block couple to each other and transform together, while basis vectors in different blocks transform independently
We can therefore break down the transformation of any arbitrary vector into a sum of transformations
We are effectively just breaking down the matrix and vectors into smaller matrices and vectors of lower dimensions
⊕ is an operation called the direct sum. The direct sum of two subspaces VA and VB of a vector space is another subspace whose elements can be written uniquely as sums of one vector of VA and one vector of VB
V_A ⊕ V_B = V_C ⟹ |a⟩ ⊕ |b⟩ = |c⟩, where |a⟩ ∈ V_A, |b⟩ ∈ V_B and |c⟩ ∈ V_C
In other words, we can break down the vector space into different subspaces according to how the matrices are block diagonalized
Using the above property, we can reduce the redundant information carried by matrices in a representation
Let us consider a representation, Γ , where all of the matrices are in the same block-diagonal form
We know that multiplication between block-diagonal matrices of the same form can be achieved by multiplying each block separately
The above representation effectively decomposes into multiple separate representations of different dimensions
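A small sketch of this block-wise behaviour, with arbitrarily chosen 1×1 and 2×2 blocks: multiplying two block-diagonal matrices of the same structure gives the same result as multiplying the corresponding blocks separately and reassembling the direct sum

```python
import numpy as np

def direct_sum(*blocks):
    """Assemble a block-diagonal matrix from the given square blocks."""
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    pos = 0
    for b in blocks:
        k = b.shape[0]
        out[pos:pos + k, pos:pos + k] = b
        pos += k
    return out

# Two arbitrary matrices sharing the same 1 + 2 block structure
A1, A2 = np.array([[2.0]]), np.array([[0.0, 1.0], [-1.0, 0.0]])
B1, B2 = np.array([[3.0]]), np.array([[1.0, 2.0], [0.0, 1.0]])

A = direct_sum(A1, A2)
B = direct_sum(B1, B2)

# Multiplying the full matrices equals multiplying each block separately
print(np.allclose(A @ B, direct_sum(A1 @ B1, A2 @ B2)))   # True
```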
The significance of representation is that the set of elements in the representations conveys the relationship of the symmetry operators in a group
Although the block matrices do not correspond to linear transformations of the symmetry operations, the multiplication-relations between them are parallel to that of the symmetry operators in the group, and so they too can be valid representations of the group
There are two types of representations, reducible and irreducible
A representation is called reducible if it is possible to find a similarity transformation matrix which is able to convert all the matrices in the representation such that the resulting matrices can be blocked into smaller matrices
If it is not possible to find a similarity transformation which will reduce all of the matrices of a given representation in the above manner, then the representation is said to be irreducible
We can keep reducing the dimension of a representation until we reach the irreducible representations. This means that all reducible representations can be "constructed" from the irreducible representations
We can always find a basis such that the matrix corresponding to the reducible representation is cast in a block-diagonal form, with the non-zero diagonal blocks being matrices of irreducible representations
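As a minimal sketch of such a reduction, consider the two-element group consisting of the identity and a mirror operation that swaps two basis vectors; the change-of-basis matrix X below (the symmetric and antisymmetric combinations of the old basis) is one particular choice that casts both matrices of this two-dimensional representation in block-diagonal (here fully diagonal) form

```python
import numpy as np

# A two-dimensional representation of the group {E, sigma}, where sigma swaps
# the two basis vectors
E     = np.eye(2)
sigma = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

# Change of basis to the symmetric/antisymmetric combinations of the old basis
X = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# The similarity transformation X^{-1} M X casts both matrices of the
# representation in 1x1 + 1x1 block-diagonal form
for name, M in [("E", E), ("sigma", sigma)]:
    print(name, np.round(np.linalg.inv(X) @ M @ X, 10))

# sigma becomes diag(1, -1): the 2D representation decomposes into two
# one-dimensional irreducible representations with characters (1, 1) and (1, -1)
```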
Suppose we are able to extract all the irreducible representations of a group; we can then list them all out
Since the block matrices can have different dimensions, so too can the irreducible representations
In every group, there must exist a one-dimensional trivial irreducible representation, Γ_1^{irrep}, where the effect of all symmetry operations on the corresponding one-dimensional subspace is to do nothing
Recall that each irreducible representation tells us how the vectors in the corresponding subspace transform
We can then know what effect each symmetry operator has on any subspace and the vectors within it
We can therefore interpret each irreducible representation as the symmetry label of a particular subspace
The dimension of an irreducible representation tells us the dimension of the subspace it is acting on
Alternatively, we can say that the dimension of an irreducible representation is the maximum number of linearly independent vectors the corresponding subspace can contain
To see how much of the original basis is retained, we can break down the transformed basis vector into a vector sum of a scaled basis vector and a vector that is not of interest
Using the previous result, we know that the diagonal terms {a_{11}, a_{22}, a_{33}, ⋯, a_{nn}} give the "amount" of each retained basis vector, while the off-diagonal entries give additional information on exactly how to build the new vectors from the old
To see how many of the original basis vectors are retained after the transformation, we simply have to sum up the diagonal terms. We refer to the "amount" of preserved basis vector as the character of the transformation, χ
χ(A) = Σ_{i=1}^{n} a_{ii}
In other words, the character of a matrix is given by the sum of its diagonal values
The consequence of adopting such a definition for the character of a square matrix is that the character of the product of two matrices is equal to the character of the product taken in the swapped order
This is quite obvious if we consider the rules of matrix-multiplication
χ(AB) = Σ_{i=1}^{n} [ Σ_{j=1}^{n} a_{ij} b_{ji} ]
We can then swap the order of summation
χ(AB) = Σ_{j=1}^{n} [ Σ_{i=1}^{n} b_{ji} a_{ij} ]
The result is nothing but the character of the product of the two matrices multiplied in the other direction
χ(AB)=χ(BA)
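A two-line numerical check of this identity, using randomly generated matrices (the seed and matrix size are arbitrary)

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

# The character (trace) of AB equals that of BA, even though AB != BA in general
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # True
print(np.allclose(A @ B, B @ A))                      # False: the matrices do not commute
```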
The character of a transformation is basis-independent
We can change the basis of a transformation using the similarity transformation
∵ ♣ ─(similarity transformation)→ ♠  ∴ ♣ = X^{-1}♠X
The character of X−1♠X is given by
χ(X^{-1}♠X)
The commutativity of trace allows us to rearrange the order of multiplication
χ(X^{-1}♠X) = χ(XX^{-1}♠)
The product of an operator and its inverse is just the identity, so we can conclude
χ(X^{-1}♠X) = χ(♠)
The implication of this is that symmetry operators of the same class must have the same character. However, two symmetry operators are not necessarily in the same class just because they share the same character
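Returning to the triangle example, with the same illustrative 2×2 rotation and reflection matrices as before, the sketch below evaluates the character of each operation and shows that operations in the same class indeed share a character

```python
import numpy as np

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

def refl(phi):
    return np.array([[np.cos(2*phi),  np.sin(2*phi)],
                     [np.sin(2*phi), -np.cos(2*phi)]])

# Characters of the six triangle symmetries in this 2x2 matrix representation
ops = {"E": rot(0), "C3": rot(2*np.pi/3), "C3^2": rot(4*np.pi/3),
       "s_a": refl(0), "s_b": refl(np.pi/3), "s_c": refl(2*np.pi/3)}

for name, M in ops.items():
    print(name, round(np.trace(M), 10))

# E has character 2, both rotations have character -1, and all three
# reflections have character 0: operators in the same class share a character
```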
We are interested in knowing what the characters of the symmetry operators of a point group are
It does not really make sense to talk about the character of a symmetry operator
χ(Ω)
It makes more sense to talk about the character of the matrix representation of the symmetry operator
χ(Γ_i(Ω)), the character of Ω in the i-th representation
To save time writing, we usually just denote it as χ_i(Ω)
We can set up a table where each row corresponds to one representation
It turns out for almost all the application of group theory, we do not need the matrices, only their characters
If we treat the characters of an irreducible representation, one per class, as the components of a vector, the dimension of such a vector is equal to the number of classes in the group, which means these vectors inhabit a vector space whose dimension equals the number of classes
The maximum number of linearly independent, let alone mutually orthogonal, vectors in an n-dimensional vector space is n
Since the set of characters of each irreducible representation represents one such vector, there must be the same number of irreducible representations as there are classes
Number of Classes=Number of Irreducible Representations
The sum of the squares of the dimensions of the irreducible representations of a group is equal to the order of the group
d_i is the dimension of the matrices in the i-th irreducible representation
Σ_{i=1}^{N} d_i^{2} = h
Since the character of the identity operator in an n-dimensional representation must always be n, the dimension of a representation is conveniently given by the character of the identity operator in said representation
Σ_{i=1}^{N} [χ_i(I)]^{2} = h
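Both counting rules can be checked against the well-known character table of the symmetry group of the equilateral triangle (commonly labelled C3v), which has order h = 6, three classes (E, 2C3, 3σ_v) and three irreducible representations A1, A2 and E; the table itself is standard, while the way it is encoded below is just one convenient choice

```python
# Character table of the symmetry group of the equilateral triangle (C3v)
# Columns: classes E, 2C3, 3sigma_v (with 1, 2 and 3 members respectively)
class_sizes = [1, 2, 3]
char_table = {
    "A1": [1,  1,  1],
    "A2": [1,  1, -1],
    "E":  [2, -1,  0],
}
h = sum(class_sizes)                       # order of the group: 6

# Number of irreducible representations equals the number of classes
print(len(char_table) == len(class_sizes))             # True

# The dimension of each irrep is its character under the identity operation
dims = [chars[0] for chars in char_table.values()]
print(sum(d**2 for d in dims) == h)                    # True: 1 + 1 + 4 = 6
```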
We can determine the number of times each irreducible representation appears in a reducible representation
Recall that all reducible representations can always be written as a block-diagonalized matrix via a similarity transformation, and the character of a transformation is basis-independent
Since some irreducible representations can appear more than once, we shall rewrite it as follows, where n_i is the number of times the i-th irreducible representation appears in the decomposition
χ_k^{reducible}(operator) = Σ_i n_i χ_i^{irrep}(operator)
In order to make use of the orthogonality of the row vectors, we shall multiply both sides by χ_j^{irrep}(operator) and sum both sides over all the operators in the group
A simple algebraic rearrangement reveals that the number of times the j-th irreducible representation appears in the k-th reducible representation is given by the expression
n_j = (1/h) Σ_{operator} χ_j^{irrep}(operator) χ_k^{reducible}(operator)
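As a worked sketch of this reduction formula, the snippet below decomposes an example reducible representation of the triangle group C3v whose character over the classes (E, 2C3, 3σ_v) is (3, 0, 1); this particular character is the illustrative one carried by three equivalent basis vectors that the operations permute among themselves, and it decomposes into A1 + E

```python
# Reduction of a reducible representation of C3v using
# n_j = (1/h) * sum over operations of chi_j_irrep * chi_k_reducible
class_sizes = [1, 2, 3]                    # classes E, 2C3, 3sigma_v
h = sum(class_sizes)                       # order of the group: 6
char_table = {"A1": [1, 1, 1], "A2": [1, 1, -1], "E": [2, -1, 0]}

# An example reducible character, e.g. that carried by three equivalent basis
# vectors permuted by the triangle's symmetry operations
chi_red = [3, 0, 1]

# The sum over all operations is done class by class, weighting by class size
for name, chi_irrep in char_table.items():
    n = sum(g * a * b for g, a, b in zip(class_sizes, chi_irrep, chi_red)) / h
    print(name, n)

# Output: A1 appears once, A2 not at all, E once, so the reducible
# representation decomposes as A1 + E
```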