There are many plausible criteria for the optimal segmentation of
points into classes. Some of them, such as "maximize the sum of
average similarities within the classes", "minimize the sum of
average distances within the classes", and "minimize the sum of
average hitting times within the classes", are essentially of the
same type. The last criterion uses a matrix of similarities as the
basis of a random walk, whose properties correspond to the
properties of the underlying graph.
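To make the random-walk interpretation concrete, the following
sketch (a minimal illustration in Python with numpy; the function
names are ours, and we assume only that $W$ is a symmetric
nonnegative similarity matrix on a connected graph) row-normalizes
$W$ into the transition matrix $P$ of the walk and computes the mean
hitting times of a single target point by solving a linear system.
\begin{verbatim}
import numpy as np

def transition_matrix(W):
    # Row-normalize a symmetric nonnegative similarity matrix W:
    # P[i, k] = W[i, k] / d[i], where d[i] is the weighted degree of i.
    d = W.sum(axis=1)
    return W / d[:, None]

def mean_hitting_times(P, j):
    # Expected number of steps to first reach state j from each state,
    # obtained by solving (I - P restricted to the states != j) h = 1;
    # by definition the hitting time of j from j itself is 0.
    n = P.shape[0]
    idx = [i for i in range(n) if i != j]
    A = np.eye(n - 1) - P[np.ix_(idx, idx)]
    h = np.zeros(n)
    h[idx] = np.linalg.solve(A, np.ones(n - 1))
    return h
\end{verbatim}
The third criterion above then averages such hitting times within
each prospective class.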
Each of the criteria focuses on a particular characteristic of the
classes which is considered to be most important. Flexibility is
obtained by using different similarity measures (keyword: feature
selection) or distances, and of course different weights to compute
the averages; one concrete choice of similarity measure is sketched
below.
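For instance, a common family of similarity measures is the Gaussian
kernel on feature vectors. In the following sketch the parameters
\texttt{sigma} and \texttt{feature\_weights} are illustrative
assumptions, not prescribed by any of the criteria; it shows how
feature selection can be realized by reweighting the coordinates
before measuring distances.
\begin{verbatim}
import numpy as np

def gaussian_similarity(X, sigma=1.0, feature_weights=None):
    # W[i, k] = exp(-||x_i - x_k||^2 / (2 sigma^2)); an optional
    # per-feature weight vector realizes a simple form of feature
    # selection by rescaling coordinates before taking distances.
    if feature_weights is not None:
        X = X * np.sqrt(np.asarray(feature_weights))
    diff = X[:, None, :] - X[None, :, :]
    sq = (diff ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))
\end{verbatim}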
It turns out that a criterion of this type is closely related to the
spectral properties of a suitable matrix; in \cite{ta3} we give a
very general presentation of this relationship. Since the
segmentation is indicated by piecewise (almost) constant
eigenvectors, the methods are called spectral clustering methods.
Here we present concrete examples of segmentation criteria and the
corresponding matrices whose eigenvectors have to be considered. In
each case, piecewise (almost) constant eigenvectors corresponding to
the largest or the smallest eigenvalues, depending on whether the
criterion is to be maximized or minimized, indicate the optimal
solution to the criterion.
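As a minimal numerical sketch of this principle (assuming a
symmetric nonnegative similarity matrix $W$, a known number $k$ of
classes, and numpy/scipy; reading off the piecewise constant
structure with k-means is one common choice, not the only one), one
can compute the eigenvectors for the $k$ largest eigenvalues of the
random-walk matrix $P = D^{-1}W$ and cluster their rows:
\begin{verbatim}
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_segmentation(W, k):
    # Eigenvectors of P = D^{-1} W for the k largest eigenvalues are
    # (almost) piecewise constant on the classes. We compute them via
    # the symmetric matrix S = D^{-1/2} W D^{-1/2}, which has the same
    # spectrum as P, and then cluster the rows of the eigenvector matrix.
    d = W.sum(axis=1)
    S = W / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)            # eigenvalues ascending
    U = vecs[:, -k:] / np.sqrt(d)[:, None]    # back to eigenvectors of P
    _, labels = kmeans2(U, k, minit='++', seed=0)
    return labels

# Example: a similarity matrix with two clearly separated blocks.
W = np.array([[1.0, 0.9, 0.1, 0.0],
              [0.9, 1.0, 0.0, 0.1],
              [0.1, 0.0, 1.0, 0.9],
              [0.0, 0.1, 0.9, 1.0]])
print(spectral_segmentation(W, 2))   # e.g. [0 0 1 1], up to label permutation
\end{verbatim}
Rows of the eigenvector matrix belonging to the same class are
(almost) identical, so clustering the rows recovers the segmentation.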