# Characters

Characters were introduced by Frobenius and studied before their representation-theoretic origin was understood. For many purposes we do not need the general orthogonality results for matrix coefficients proved in the last section – characters are sufficient. They are easy to tabulate and compute with, they have beautiful orthogonality properties of their own, and they express deep properties of the irreducible representations.

As we move towards traditional character theory we want a more traditional way of thinking about the group algebra, so we will identify $g \in G$ with the function $\delta_g$ that has value $1$ at $g$ and $0$ elsewhere. We already mentioned this possibility in Section 2.3, but there we preferred to think of the group algebra as the convolution ring of functions on the group. Now we will move freely between the two points of view, writing $f = \sum_{g \in G} f (g) \delta_g = \sum_{g \in G} f (g) g$.

If $M$ is a $d \times d$ matrix, the trace of $M$ is an additive analog of the determinant. It is defined as the sum of the diagonal entries of $M$.

Lemma 2.4.1: (i) If $A$ and $B$ are $d \times d$ matrices then $\operatorname{tr} (A B) = \operatorname{tr} (B A)$.

(ii) If $M$ and $g$ are $d \times d$ matrices and $g$ is invertible, then $\operatorname{tr} (g M g^{- 1}) = \operatorname{tr} (M)$.

(iii) Let $V$ be a finite-dimensional vector space, and let $T : V \longrightarrow V$ be a linear transformation. Then the trace of a matrix representing $T$ with respect to some basis of $V$ is independent of the choice of basis.

For (i), it is actually only necessary that $A B$ and $B A$ be square. Indeed, $A$ could be $m \times n$ and $B$ could be $n \times m$.

Proof.

The $i, k$-th entry of $A B$ is $\sum_j a_{i j} b_{j k}$, so the sum of the diagonal entries of $A B$ is $\sum_{i, j} a_{i j} b_{j i}$. This expression is symmetric in $A$ and $B$, proving (i). For (ii), applying (i) to the pair $g M$ and $g^{- 1}$, we have $\operatorname{tr} (g M g^{- 1}) = \operatorname{tr} (g^{- 1} (g M)) = \operatorname{tr} (M)$. For (iii), changing the basis of $V$ replaces a matrix $M$ representing $T$ by $g M g^{- 1}$, where $g$ is the change of basis matrix, so (iii) follows from (ii).

In view of Lemma 2.4.1 we may define the trace $\operatorname{tr} (T)$ of a linear transformation to be the trace of any matrix representing $T$, and this is well defined.
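These identities are easy to confirm numerically. The following sketch (using NumPy; the matrices and the random seed are arbitrary illustrative choices) checks (i) and the conjugation invariance underlying (ii) and (iii):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
M = rng.standard_normal((4, 4))
g = np.eye(4) + 0.1 * rng.standard_normal((4, 4))  # a small perturbation of I, so invertible

# (i) tr(AB) = tr(BA)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# (ii) conjugation by an invertible matrix preserves the trace,
# which is why the trace of a linear transformation (iii) is well defined
assert np.isclose(np.trace(g @ M @ np.linalg.inv(g)), np.trace(M))
print("trace identities hold")
```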

Lemma 2.4.2: The trace of $T$ is the sum of its eigenvalues.

This is similar to the fact that the determinant of $T$ is the product of its eigenvalues. In both statements, we count each eigenvalue with the multiplicity that it occurs as a root of the characteristic polynomial.

Proof.

First assume that the ground field is algebraically closed. Using the Jordan canonical form, we may choose a basis of $V$ with respect to which the matrix $M$ representing $T$ is upper triangular. If $M = \left(\begin{array}{cccc} \lambda_1 & \ast & \cdots & \ast\\ 0 & \lambda_2 & & \ast\\ \vdots & & \ddots & \vdots\\ 0 & 0 & \cdots & \lambda_d \end{array}\right)$ then the eigenvalues of $M$ are $\lambda_1, \cdots, \lambda_d$, and the trace is $\lambda_1 + \ldots + \lambda_d$. This proves the statement if the ground field is algebraically closed.

In general, neither the characteristic polynomial nor the trace is changed if we extend the ground field, so the assumption of algebraic closure involves no loss of generality.
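Lemma 2.4.2 can likewise be checked numerically; in this sketch (NumPy, an arbitrary random matrix) the eigenvalues are computed with their multiplicities and summed:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))

# eigenvalues, counted with multiplicity as roots of the characteristic polynomial
eigenvalues = np.linalg.eigvals(M)

# their sum equals the trace (the sum is real even though individual eigenvalues may not be)
assert np.isclose(eigenvalues.sum(), np.trace(M))
print("trace equals the sum of the eigenvalues")
```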

Proposition 2.4.1: If $(\pi, V)$ is a representation, then $\pi (g)$ is diagonalizable for every $g \in G$.

Proof.

This follows from Proposition 2.3.1 since $\pi (g)$ has finite order.

Proposition 2.4.2: The character of the contragredient representation $\hat{\pi}$ is the complex conjugate of the character of $\pi$: $\chi_{\hat{\pi}} (g) = \overline{\chi_{\pi} (g)}$.

We recall that the contragredient representation is defined by (2.3.1). The bar here denotes complex conjugation.

Proof.

Since $\pi (g)$ has finite order, it is diagonalizable. With $g \in G$ fixed, we take as a basis of $V$ a set of eigenvectors $v_1, \cdots, v_d$ of $\pi (g)$, and we use the dual basis $v_1^{\ast}, \cdots, v_d^{\ast}$ of $V^{\ast}$. If the eigenvalues of $\pi (g)$ are $\lambda_1, \cdots, \lambda_d$, then we may write $\pi (g) v_i = \lambda_i v_i$, and so $\pi (g^{- 1}) v_i = \lambda_i^{- 1} v_i$. By Exercise 2.3.1, we have $\hat{\pi} (g) v_i^{\ast} = \pi (g^{- 1})^{\ast} v_i^{\ast} = \lambda_i^{- 1} v_i^{\ast}$. Thus $\chi_{\hat{\pi}} (g) = \sum \lambda_i^{- 1} .$ But each $\lambda_i$ is a root of unity, hence has complex absolute value $1$, so $\lambda_i^{- 1} = \overline{\lambda_i}$, and $\chi_{\hat{\pi}} (g) = \sum \overline{\lambda_i} = \overline{\chi_{\pi} (g)} .$
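As a small numerical illustration, we may take (purely as an example) the two-dimensional representation $\pi (k) = \operatorname{diag} (\omega^k, \omega^{2 k})$ of the cyclic group $\mathbb{Z} / 5$, where $\omega = e^{2 \pi i / 5}$, and check that the character of the contragredient is the complex conjugate:

```python
import numpy as np

# a two-dimensional unitary representation of Z/5, chosen only for illustration:
# pi(k) = diag(w^k, w^(2k)), where w = exp(2*pi*i/5)
w = np.exp(2j * np.pi / 5)

def pi(k):
    return np.diag([w ** k, w ** (2 * k)])

def pi_hat(k):
    # contragredient: pihat(g) is the transpose of pi(g^{-1})
    return pi(-k).T

for k in range(5):
    # the character of the contragredient is the complex conjugate
    assert np.isclose(np.trace(pi_hat(k)), np.conj(np.trace(pi(k))))
print("chi of the contragredient equals the conjugate of chi on Z/5")
```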

If $V$ is a $d$-dimensional vector space then the space $\operatorname{End}_{\mathbb{C}} (V)$ is $d^2$-dimensional, being isomorphic to the ring of $d \times d$ matrices.

Lemma 2.4.3: Let $V$ be a $d$-dimensional vector space, and let $A, B \in \operatorname{End} (V)$. Let $T : \operatorname{End} (V) \longrightarrow \operatorname{End} (V)$ be the linear transformation that maps $X \in \operatorname{End} (V)$ to $A X B$. Then $\operatorname{tr} (T) = \operatorname{tr} (A) \operatorname{tr} (B)$.

Proof.

After choosing a basis of $V$, linear transformations are represented by matrices. We may therefore replace $A, B$ and $X$ by matrices in order to compute the traces. Let $A = (a_{i j})$, $B = (b_{i j})$ and $X = (x_{i j})$. Now the $(i, l)$-th entry in $Y = A X B$ is $y_{i l} = \sum_{j, k} a_{i j} x_{j k} b_{k l} .$ In order to parse this let $\mu = (i, l)$ and $\nu = (j, k)$; we thus write $y_{\mu} = \sum_{\nu} c_{\mu \nu} x_{\nu}, \hspace{2em} c_{\mu \nu} = a_{i j} b_{k l} .$ Thus the trace is the sum of the diagonal terms, $c_{\mu \mu}$. If $\nu = \mu$ then $i = j$ and $l = k$, so $\operatorname{tr} (T) = \sum c_{\mu \mu} = \sum_{i, l} a_{i i} b_{l l} = \sum_i a_{i i} \sum_l b_{l l} = \operatorname{tr} (A) \operatorname{tr} (B) .$
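Lemma 2.4.3 can also be seen via the Kronecker product: with the column-major vec convention, $\operatorname{vec} (A X B) = (B^t \otimes A) \operatorname{vec} (X)$, so the matrix of $T$ is $B^t \otimes A$, whose trace is $\operatorname{tr} (B) \operatorname{tr} (A)$. A numerical sketch (NumPy, arbitrary random matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
d = 3
A = rng.standard_normal((d, d))
B = rng.standard_normal((d, d))
X = rng.standard_normal((d, d))

# with column-major vec, vec(A X B) = (B^t kron A) vec(X),
# so the matrix of T : X -> A X B is kron(B.T, A)
T = np.kron(B.T, A)
assert np.allclose(T @ X.flatten('F'), (A @ X @ B).flatten('F'))

# tr(T) = tr(B^t) tr(A) = tr(A) tr(B), as in Lemma 2.4.3
assert np.isclose(np.trace(T), np.trace(A) * np.trace(B))
print("tr(X -> AXB) = tr(A) tr(B)")
```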

Now if $(\pi, V)$ is a representation of $G$, let $\chi_{\pi} (g) = \operatorname{tr} \; \pi (g)$. This function will be called the character of the representation $\pi$. It is constant on conjugacy classes since if $x$ is conjugate to $y$ in $G$, we can write $x = g y g^{- 1}$, so $\pi (x) = \pi (g) \pi (y) \pi (g)^{- 1}$, and so $\pi (x)$ and $\pi (y)$ have the same trace. More generally, if $f$ is any function on $G$ that is constant on the conjugacy classes, we call $f$ a class function. Thus a character is a class function.
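As a concrete check, here is a small sketch computing the character of the permutation representation of $S_3$ on $\mathbb{C}^3$ (its value at $\sigma$ is the number of fixed points of $\sigma$) and verifying that it is a class function:

```python
from itertools import permutations

# S3, with sigma acting on {0, 1, 2}; chi is the character of the
# permutation representation, i.e. the number of fixed points of sigma
G = list(permutations(range(3)))

def compose(a, b):            # (a o b)(i) = a[b[i]]
    return tuple(a[b[i]] for i in range(3))

def inverse(a):
    inv = [0] * 3
    for i, j in enumerate(a):
        inv[j] = i
    return tuple(inv)

chi = {s: sum(1 for i in range(3) if s[i] == i) for s in G}

# chi(g x g^{-1}) = chi(x): the character is a class function
for g in G:
    for x in G:
        assert chi[compose(compose(g, x), inverse(g))] == chi[x]
print("the permutation character of S3 is a class function")
```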

Lemma 2.4.4: If $(\pi, V)$ is irreducible, then $\chi_{\pi} \in \mathcal{R}_V$. In fact, give $V$ an invariant inner product, and let $v_1, \cdots, v_D$ be an orthonormal basis of $V$. Then
 $\chi_{\pi} = \sum_i f_{v_i, v_i},$ (2.4.1)

where $f_{v_i, v_i}$ is the matrix coefficient (2.3.3).

Proof.

If $\pi (g)$ is represented as a matrix using this basis, the $i, j$-th coefficient $\pi_{i j} (g)$ is given by (2.3.4) as $f_{v_j, v_i} (g)$. Summing the diagonal entries gives the statement.

Lemma 2.4.5: Let $\chi$ and $\chi'$ be the characters of nonisomorphic irreducible representations. Then they are orthogonal.

Proof.

By Lemma 2.4.4, $\chi$ and $\chi'$ lie in $\mathcal{R}_V$ and $\mathcal{R}_{V'}$ where $V$ and $V'$ are nonisomorphic irreducible modules. They are orthogonal by Proposition 2.3.6.

Lemma 2.4.6: If $f \in \mathbb{C}[G]$ then $(\delta_g \ast f) (x) = f (g^{- 1} x)$ and $(f \ast \delta_g) (x) = f (x g^{- 1})$.

Proof.

We have $(\delta_g \ast f) (x) = \sum_y \delta_g (y) f (y^{- 1} x) = f (g^{- 1} x)$ since only the term $y = g$ contributes. Similarly $(f \ast \delta_g) (x) = \sum_y f (x y^{- 1}) \delta_g (y) = f (x g^{- 1}) .$
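Lemma 2.4.6 can be verified directly on a small group. This sketch implements convolution in $\mathbb{C}[S_3]$; the test function $f$ is an arbitrary choice:

```python
from itertools import permutations

# the convolution ring C[S3]: functions f : S3 -> C with
# (f1 * f2)(x) = sum over y of f1(y) f2(y^{-1} x)
G = list(permutations(range(3)))

def compose(a, b):
    return tuple(a[b[i]] for i in range(3))

def inverse(a):
    inv = [0] * 3
    for i, j in enumerate(a):
        inv[j] = i
    return tuple(inv)

def conv(f1, f2):
    return {x: sum(f1[y] * f2[compose(inverse(y), x)] for y in G) for x in G}

def delta(g):
    return {x: 1 if x == g else 0 for x in G}

f = {x: i * i + 1 for i, x in enumerate(G)}   # an arbitrary test function
g = (1, 0, 2)

# Lemma 2.4.6: delta_g translates the argument on the left or on the right
assert conv(delta(g), f) == {x: f[compose(inverse(g), x)] for x in G}
assert conv(f, delta(g)) == {x: f[compose(x, inverse(g))] for x in G}
print("delta_g acts by left and right translation")
```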

Proposition 2.4.3: The number of isomorphism classes of irreducible representations is equal to the number of conjugacy classes of $G$.

Proof.

Both numbers will be shown to equal the dimension of the center $Z$ of $\mathbb{C}[G]$. We will exhibit two bases of $Z$. First, we note that in order to be in the center of $\mathbb{C}[G]$, a function $f : G \longrightarrow \mathbb{C}$ must be a class function, since by Lemma 2.4.6 $(\delta_g \ast f \ast \delta_{g^{- 1}}) (x) = f (g^{- 1} x g),$ and a necessary and sufficient condition that $\delta_g \ast f \ast \delta_{g^{- 1}} = f$ for all $g \in G$ is that $f$ be constant on conjugacy classes.

Thus, the characteristic functions of the conjugacy classes form a basis of $Z$ as a complex vector space, and $\dim (Z)$ is the number of conjugacy classes.

On the other hand, we saw in the previous section that $\mathbb{C}[G] \cong \prod_i \mathcal{R}_{V_i},$ where $V_i$ runs through the isomorphism classes of irreducible $G$-modules, and that $\mathcal{R}_{V_i}$ is isomorphic to $\operatorname{Mat}_{D_i} (\mathbb{C})$ where $D_i = \dim (V_i)$. It is easy to see that the center of $\operatorname{Mat}_{D_i} (\mathbb{C})$ is one-dimensional, spanned by the identity matrix. So the center of $\mathbb{C}[G]$ is the direct product of the one-dimensional centers of these ideals. Its dimension is thus equal to the number of irreducible representations.

Let us fix representatives $(\pi_1, V_1), \cdots, (\pi_h, V_h)$ of the isomorphism classes of the irreducible $G$-modules. Let $\chi_1, \cdots, \chi_h$ be their characters; we call them the irreducible characters of $G$. We have proved that the $\chi_i$ are orthogonal class functions. Thus they are linearly independent. We've also proved that their number is equal to the number of conjugacy classes of $G$, so they span the space of class functions. We will show that $\left\langle \chi_i, \chi_i \right\rangle = 1$, so once we have established this, we will have an orthonormal basis of the space of class functions.

If $\chi : G \longrightarrow \mathbb{C}^{\times}$ is a linear character, that is, a homomorphism, then $\chi$ is among the $\chi_i$. Indeed, we may associate with $\chi$ a one-dimensional module $V_{\chi} =\mathbb{C}$ with the representation $\pi_{\chi} : G \longrightarrow \operatorname{End} (\mathbb{C}) \cong \mathbb{C}$, namely $\pi_{\chi} (g) x = \chi (g) x$ for $g \in G$. One particular linear character is the character $\mathbf{1}$ that takes value $1$ for all $g$. We will order the $\chi_i$ so that $\chi_1 =\mathbf{1}$. We call $\pi_1$ the trivial representation, its module the trivial $G$-module, and we call its character the trivial character. Thus $\chi_1 (g) = 1$ for all $g$. We say that a $G$-module is trivial if it is isomorphic to the trivial $G$-module.

Although we are not ready to prove $\left\langle \chi_i, \chi_i \right\rangle = 1$ in general, at least $\left\langle \chi_1, \chi_1 \right\rangle = 1$ is clear since
 $\left\langle \chi_1, \chi_1 \right\rangle = \frac{1}{|G|} \sum_{g \in G} 1 \cdot 1 = 1$ (2.4.2)

We will reduce the general case to this special case.

Suppose that $(\sigma, W)$ is a representation of $G$. Let $W^G = \{ w \in W \mid \sigma (g) w = w \text{ for all } g \in G \} .$ This is the submodule of invariants.

Lemma 2.4.7: The invariants form a submodule of $W$, and if $U$ is any irreducible submodule of $W$, then $U$ is trivial if and only if $U \subset W^G$.

Proof.

Both statements are straightforward. The set $W^G$ is clearly a subspace, and $G$ acts trivially on it, so it is a submodule. If an irreducible submodule $U$ is contained in $W^G$, then $G$ acts trivially on $U$; since every subspace of $U$ is then a submodule, irreducibility forces $\dim (U) = 1$, so $U$ is trivial. Conversely, if $U$ is trivial then every $\sigma (g)$ fixes $U$ pointwise, so $U \subset W^G$.

Lemma 2.4.8: Let $(\sigma, W)$ be a representation of $G$. Then $\dim (W^G) = \frac{1}{|G|} \sum_{g \in G} \chi_{\sigma} (g) .$

Proof.

Decompose $W$ into a direct sum of irreducible modules: $W = W_1 \oplus W_2 \oplus \cdots$. Suppose that $m_i$ of the $W_j$ are isomorphic to $V_i$. By Lemma 2.4.7, $W^G$ is the sum of those $W_j$ that are isomorphic to the trivial module $V_1$, so $m_1 = \dim (W^G)$. We have $\chi_{\sigma} = \sum m_i \chi_i$, and so \begin{eqnarray*} & \frac{1}{|G|} \sum_{g \in G} \chi_{\sigma} (g) = \left\langle \chi_{\sigma}, \mathbf{1} \right\rangle = \sum m_i \left\langle \chi_i, \chi_1 \right\rangle . & \end{eqnarray*} Since $\mathbf{1}= \chi_1$, and since the other $\chi_i$ are orthogonal to it, this equals $m_1 \left\langle \chi_1, \chi_1 \right\rangle = m_1$ by (2.4.2).
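For example, for the permutation representation of $S_3$ on $\mathbb{C}^3$ the invariants are the multiples of $(1, 1, 1)$, and averaging the character does give dimension $1$:

```python
from itertools import permutations

# W = C^3 with the permutation action of S3; the invariants are the
# multiples of (1,1,1), so dim(W^G) should be 1
G = list(permutations(range(3)))
chi = [sum(1 for i in range(3) if s[i] == i) for s in G]  # trace = number of fixed points

dim_invariants = sum(chi) / len(G)   # (1/|G|) * sum of character values
assert dim_invariants == 1
print("dim(W^G) =", dim_invariants)
```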

Theorem 2.4.1: (Schur orthogonality for characters.) We have $\left\langle \chi_i, \chi_j \right\rangle = \left\{ \begin{array}{ll} 1 & \text{if i = j,}\\ 0 & \text{otherwise} . \end{array} \right.$ The irreducible characters are thus an orthonormal basis of the space of class functions.

We will sometimes refer to this as row orthogonality, since when we come to make tables of characters, there are orthogonality properties for both the rows and the columns of the character table.

Proof.

The only thing that we haven't proved yet is that $\left\langle \chi_i, \chi_i \right\rangle = 1$, and we have proved this in the special case that $\chi_i = \chi_1$ is the trivial character. To reduce to the special case, we define a $G$-module structure on $W = \operatorname{End}_{\mathbb{C}} (V_i)$, namely if $T \in \operatorname{End}_{\mathbb{C}} (V_i)$ is any linear transformation, we define $(\theta (g) T) (v) = \pi_i (g) T \pi_i (g)^{- 1} (v)$ for $g \in G$, $v \in V_i$.

In this action, the $G$-invariants $W^G$ comprise all linear transformations $T$ such that $\pi_i (g) T \pi_i (g)^{- 1} = T$, which is to say $\pi_i (g) T = T \pi_i (g)$. In other words, $W^G = \operatorname{Hom}_G (V_i, V_i)$, which is one-dimensional by Schur's Lemma. Thus by Lemma 2.4.8, $\frac{1}{|G|} \sum_{g \in G} \chi_{\theta} (g) = 1.$ The Theorem will follow if we prove
 $\chi_{\theta} (g) = | \chi_i (g) |^2,$ (2.4.3)

since then we will get $\left\langle \chi_i, \chi_i \right\rangle = \frac{1}{|G|} \sum_{g \in G} | \chi_i (g) |^2 = 1.$

To compute $\chi_{\theta}$, we may choose a basis $v_1, \cdots, v_D$ of eigenvectors for $\pi_i (g)$, which is diagonalizable by Proposition 2.3.1. Now let us consider the effect of $\theta (g)$ on a linear transformation $T$, which we represent by a matrix $(t_{\alpha \beta})$ using this basis. (We are using $\alpha$ and $\beta$ for subscripts instead of the usual $i, j$.)

If $\lambda_1, \cdots, \lambda_D$ are the eigenvalues of $\pi_i (g)$, so that $\pi_i (g) v_{\alpha} = \lambda_{\alpha} v_{\alpha}$, we have $\pi_i (g)^{- 1} v_{\alpha} = \lambda_{\alpha}^{- 1} v_{\alpha} = \overline{\lambda_{\alpha}} v_{\alpha}$ since the $\lambda_{\alpha}$ are roots of unity, hence have absolute value one, and their inverses are their complex conjugates. Then $\theta (g) T$ is represented by the matrix $\left(\begin{array}{ccc} \lambda_1 & & \\ & \ddots & \\ & & \lambda_D \end{array}\right) \left(\begin{array}{ccc} t_{11} & \cdots & t_{1 D}\\ \vdots & & \vdots\\ t_{D 1} & \cdots & t_{D D} \end{array}\right) \left(\begin{array}{ccc} \overline{\lambda_1} & & \\ & \ddots & \\ & & \overline{\lambda_D} \end{array}\right) = \left(\begin{array}{ccc} \lambda_1 \overline{\lambda_1} t_{11} & \cdots & \lambda_1 \overline{\lambda_D} t_{1 D}\\ \vdots & & \vdots\\ \lambda_D \overline{\lambda_1} t_{D 1} & \cdots & \lambda_D \overline{\lambda_D} t_{D D} \end{array}\right) .$ We see that the $\alpha, \beta$ entry is multiplied by $\lambda_{\alpha} \overline{\lambda_{\beta}}$, so the trace of $\theta (g)$ on the $D^2$-dimensional vector space $W = \operatorname{End}_{\mathbb{C}} (V_i)$ is $\sum_{\alpha, \beta} \lambda_{\alpha} \overline{\lambda_{\beta}} = \left( \sum_{\alpha} \lambda_{\alpha} \right) \overline{\left( \sum_{\beta} \lambda_{\beta} \right)} = \chi_i (g) \overline{\chi_i (g)} = | \chi_i (g) |^2,$ and the Theorem is proved.
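Schur orthogonality can be checked directly against the character table of $S_3$, whose three irreducible characters are the trivial character, the sign character, and the character of the two-dimensional representation:

```python
# the character table of S3: conjugacy classes are the identity, the three
# transpositions, and the two 3-cycles, with class sizes 1, 3, 2
sizes = [1, 3, 2]
table = [
    [1,  1,  1],   # trivial character
    [1, -1,  1],   # sign character
    [2,  0, -1],   # character of the two-dimensional representation
]
order = sum(sizes)   # |S3| = 6

def inner(chi, theta):
    # <chi, theta> = (1/|G|) sum over g of chi(g) * conj(theta(g));
    # the values here are real, so conjugation is omitted
    return sum(n * a * b for n, a, b in zip(sizes, chi, theta)) / order

for i in range(3):
    for j in range(3):
        assert inner(table[i], table[j]) == (1 if i == j else 0)
print("<chi_i, chi_j> = delta_ij for S3")
```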

Proposition 2.4.4: If $\chi, \theta$ are the characters of representations $(\pi, V)$ and $(\sigma, U)$, respectively, then
 $\left\langle \chi, \theta \right\rangle = \dim \; \operatorname{Hom}_{\mathbb{C}[G]} (V, U) .$ (2.4.4)

Proof.

Suppose $V = V' \oplus V''$, where $V'$ and $V''$ are $G$-submodules of $V$. Let $\chi', \chi''$ be the characters of the representations $\pi'$ and $\pi''$ of $G$ on $V'$ and $V''$. Then $\chi = \chi' + \chi''$, so $\left\langle \chi, \theta \right\rangle = \left\langle \chi', \theta \right\rangle + \left\langle \chi'', \theta \right\rangle .$ On the other hand $\operatorname{Hom}_{\mathbb{C}[G]} (V, U) \cong \operatorname{Hom}_{\mathbb{C}[G]} (V', U) \oplus \operatorname{Hom}_{\mathbb{C}[G]} (V'', U),$ so if $\left\langle \chi', \theta \right\rangle = \dim \; \operatorname{Hom}_{\mathbb{C}[G]} (V', U) \hspace{2em} \operatorname{and} \hspace{2em} \left\langle \chi'', \theta \right\rangle = \dim \; \operatorname{Hom}_{\mathbb{C}[G]} (V'', U)$ then $\left\langle \chi, \theta \right\rangle = \dim \; \operatorname{Hom}_{\mathbb{C}[G]} (V, U)$. Thus we may reduce to the case where $V$ is irreducible, and similarly, we may reduce to the case $U$ is irreducible.

Now assuming both $V$ and $U$ are irreducible, $\left\langle \chi, \theta \right\rangle = \left\{ \begin{array}{ll} 1 & \text{if V \cong U,}\\ 0 & \text{otherwise} \end{array} \right.$ by Schur orthogonality (Theorem 2.4.1), while $\dim \; \operatorname{Hom}_{\mathbb{C}[G]} (V, U) = \left\{ \begin{array}{ll} 1 & \text{if V \cong U,}\\ 0 & \text{otherwise} \end{array} \right.$ by Schur's Lemma (Proposition 2.3.4). The statement is now proved.
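As an illustration of (2.4.4), the permutation character of $S_3$ on $\mathbb{C}^3$ decomposes as the trivial character plus the two-dimensional character, and the inner products compute the multiplicities:

```python
# decomposing the permutation representation of S3 on C^3 via inner products:
# its character (the fixed-point count) on the classes e, transpositions, 3-cycles
sizes = [1, 3, 2]
perm = [3, 1, 0]
trivial = [1, 1, 1]
sign = [1, -1, 1]
std = [2, 0, -1]

def inner(chi, theta):
    return sum(n * a * b for n, a, b in zip(sizes, chi, theta)) / 6

# by (2.4.4) these inner products are dim Hom(C^3, V_i): the permutation
# module contains the trivial module and the two-dimensional module once each
assert inner(perm, trivial) == 1
assert inner(perm, sign) == 0
assert inner(perm, std) == 1
assert inner(perm, perm) == 2   # the sum of the squares of the multiplicities
print("C^3 = trivial + two-dimensional as an S3-module")
```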

Proposition 2.4.5: The constant $d (V)$ in Proposition 2.3.7 equals $\dim (V)$.

Proof.

We take the inner product of the irreducible character $\chi$ with itself using (2.4.1). We have $1 = \left\langle \chi, \chi \right\rangle = \left\langle \sum_i f_{v_i, v_i}, \sum_j f_{v_j, v_j} \right\rangle = \sum_{i, j} \left\langle f_{v_i, v_i}, f_{v_j, v_j} \right\rangle .$ By Proposition 2.3.7, we have $\left\langle f_{v_i, v_i}, f_{v_j, v_j} \right\rangle = \left\{ \begin{array}{ll} \frac{1}{d (V)} & \text{if i = j,}\\ 0 & \text{otherwise} . \end{array} \right.$ There are thus $\dim (V)$ nonzero contributions, so $d (V) = \dim (V)$.

Henceforth we will denote the dimension of the irreducible representation $\pi_i$ as $d_i$ – previously we denoted it $D_i$ to minimize the possibility of confusion since we had not yet proved Proposition 2.4.5. We note that $\chi_i (1) = d_i$, because $\chi_i (1) = \operatorname{tr} \; \pi_i (1)$ and of course $\pi_i (1)$ is represented by the identity matrix, so its trace is the dimension of the vector space.

We recall that if $g \in G$ then $C_G (g)$ is its centralizer, the group of all elements that commute with it.

Proposition 2.4.6: (Column orthogonality.) Let $g$ and $h$ be elements of $G$. Then $\sum_{i = 1}^h \chi_i (g) \overline{\chi_i (h)} = \left\{ \begin{array}{ll} 0 & \text{if g and h are not conjugate},\\ |C_G (g) | & \text{if g and h are conjugate} . \end{array} \right.$

The orthogonality relation of Theorem 2.4.1 is called row orthogonality. The terms "row" and "column" in this context refer to the rows and columns of character tables, which we will come to later.

Proof.

We recall that an $n \times n$ matrix $M = (m_{i j})$ is called unitary if $M \bar{M}^t = I$. This means that $\sum_j m_{i j} \overline{m_{k j}} = \delta_{i k}$, where the Kronecker delta $\delta_{i k}$ is $1$ if $i = k$ and $0$ if $i \neq k$. Now if $M$ is unitary, so is its transpose: since $M$ and $\bar{M}^t$ are inverses, $\bar{M}^t M = I$, and taking conjugates, $M^t \bar{M} = I$.

Let $g_1, \cdots, g_h$ be representatives of the conjugacy classes of $G$, and set $m_{i j} = \frac{\chi_i (g_j)}{\sqrt{|C_G (g_j) |}} .$ We rewrite the row orthogonality relation $\frac{1}{| G|} \sum_{g \in G} \chi_i (g) \overline{\chi_k (g)} = \delta_{i k}$ by grouping the elements of each conjugacy class together. Indeed, $\chi_i (g) \overline{\chi_k (g)}$ is constant on the conjugacy class of $g_j$, and that conjugacy class has $|G| / |C_G (g_j) |$ elements, by Proposition 1.5.2. Thus $\delta_{i k} = \frac{1}{| G|} \sum_{j = 1}^h \frac{|G|}{|C_G (g_j) |} \chi_i (g_j) \overline{\chi_k (g_j)} = \sum_{j = 1}^h m_{i j} \overline{m_{k j}} .$ Thus $M = (m_{i j})$ is unitary, and so is its transpose, which means that $\delta_{i k} = \sum_j m_{j i} \overline{m_{j k}} = \frac{1}{\sqrt{|C_G (g_i) | \cdot |C_G (g_k) |}} \sum_j \chi_j (g_i) \overline{\chi_j (g_k)} .$ Thus $\sum_j \chi_j (g_i) \overline{\chi_j (g_k)} = \left\{ \begin{array}{ll} |C_G (g_i) | & \text{if i = k,}\\ 0 & \text{if i \neq k,} \end{array} \right.$ which is equivalent to the statement.
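Column orthogonality is likewise easy to check against the character table of $S_3$:

```python
# column orthogonality for S3: the columns are indexed by the classes
# e, transpositions, 3-cycles, with centralizer orders 6, 2, 3
table = [
    [1,  1,  1],   # trivial
    [1, -1,  1],   # sign
    [2,  0, -1],   # two-dimensional representation
]
centralizers = [6, 2, 3]

for a in range(3):
    for b in range(3):
        # sum over the irreducibles of chi_i(g_a) * conj(chi_i(g_b));
        # the table entries are real, so conjugation is omitted
        s = sum(table[i][a] * table[i][b] for i in range(3))
        assert s == (centralizers[a] if a == b else 0)
print("column orthogonality holds for S3")
```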

## Summary: what we have proved

Since the proofs were long, here is a summary of Schur orthogonality. We started with matrix coefficients, but we'll start the summary with characters.

If $(\pi, V)$ is a representation, its character $\chi_{\pi}$ is the function $G \longrightarrow \mathbb{C}$ defined by $\chi_{\pi} (g) = \operatorname{tr} \; \pi (g)$. The characters are class functions, meaning that they are constant on conjugacy classes. If $h$ is the number of conjugacy classes of the group, the number of isomorphism classes of irreducible representations of $G$ is also $h$, and if representatives of these are $(\pi_i, V_i)$, let $\chi_i = \chi_{\pi_i}$. These are the irreducible characters of $G$. Among them are the linear characters, which are the homomorphisms $G \longrightarrow \mathbb{C}^{\times}$.

We proved that, with respect to the inner product
 $\left\langle f_1, f_2 \right\rangle = \frac{1}{|G|} \sum_{g \in G} f_1 (g) \overline{f_2 (g)},$ (2.4.5)

the irreducible characters are an orthonormal basis for the space of class functions. This orthogonality relation is sometimes called row orthogonality, and there is also column orthogonality, which is the relation $\sum_{i = 1}^h \chi_i (g) \overline{\chi_i (h)} = \left\{ \begin{array}{ll} 0 & \text{if g and h are not conjugate},\\ |C_G (g) | & \text{if g and h are conjugate} . \end{array} \right.$ We proved that $\sum d_i^2 = |G|, \hspace{2em} d_i = \chi_i (1) = \dim (V_i) .$ We also gave the following concrete interpretation of the inner product: if $\chi, \theta$ are the characters of $G$-modules $V$ and $U$, then $\left\langle \chi, \theta \right\rangle = \dim \; \operatorname{Hom}_{\mathbb{C}[G]} (V, U) .$ The degree $\deg (\chi)$ of a character $\chi$ is the dimension of the corresponding $G$-module. Since the trace of the identity map on a vector space is equal to the dimension, $\deg (\chi) = \chi (1)$.

We also proved that we can obtain an orthonormal basis of the space $L^2 (G)$, which is a finite-dimensional Hilbert space with the inner product (2.4.5), as follows. For each $V_i$ pick a $G$-invariant inner product $\left\langle \;, \; \right\rangle$ and an orthonormal basis $v_{i j}$ ($j = 1, \cdots, d_i$) with respect to this inner product. Then the matrix coefficients $\sqrt{d_i} f_{v_{i j}, v_{i k}} (g) = \sqrt{d_i} \left\langle \pi_i (g) v_{i j}, v_{i k} \right\rangle$ are an orthonormal basis of $L^2 (G)$.
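This orthonormal basis can be exhibited concretely for $S_3$: realizing the two-dimensional representation on the plane $x + y + z = 0$ with an orthonormal basis of that plane (so its matrices are orthogonal), the $1 + 1 + 4 = 6$ scaled matrix coefficients are orthonormal in $L^2 (S_3)$. A sketch:

```python
import numpy as np
from itertools import permutations

G = list(permutations(range(3)))

def perm_matrix(s):
    M = np.zeros((3, 3))
    for i, j in enumerate(s):
        M[j, i] = 1.0
    return M

def sign(s):
    return round(np.linalg.det(perm_matrix(s)))   # +1 or -1

# the two-dimensional representation realized on the plane x+y+z = 0,
# using an orthonormal basis of that plane (so the matrices are orthogonal)
U = np.column_stack([
    np.array([1.0, -1.0, 0.0]) / np.sqrt(2),
    np.array([1.0, 1.0, -2.0]) / np.sqrt(6),
])
reps = [
    lambda s: np.array([[1.0]]),               # trivial
    lambda s: np.array([[float(sign(s))]]),    # sign
    lambda s: U.T @ perm_matrix(s) @ U,        # two-dimensional
]

# the scaled matrix coefficients sqrt(d_i) * rho_i(g)[j, k], as vectors on G
basis = []
for rho in reps:
    d = rho(G[0]).shape[0]
    for j in range(d):
        for k in range(d):
            basis.append(np.sqrt(d) * np.array([rho(g)[j, k] for g in G]))

# Gram matrix in the L^2(G) inner product (1/|G|) sum over g of f1(g) f2(g)
gram = np.array([[np.dot(f1, f2) / len(G) for f2 in basis] for f1 in basis])
assert np.allclose(gram, np.eye(6))
print("the 6 scaled matrix coefficients are an orthonormal basis of L^2(S3)")
```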

We will give another short summary at the end of Section 2.6.