Let S denote the set of 2 by 2 matrices with integer entries and
determinant 1, and let T denote those matrices of S which are
congruent to the identity matrix I mod 3 (so \(\begin{pmatrix} a & b \\ c & d \end{pmatrix}\) ∈ T means that a, b, c, d ∈ ℤ, ad − bc = 1, and 3 divides b, c, a − 1, and d − 1; "∈" means "is in").
- (a) Let f : T→ℝ (the real numbers) be a function such that for every X, Y∈T with Y≠I, either f(XY) > f(X) or f(XY⁻¹) > f(X) (or both). Show that given two finite nonempty subsets A, B of T, there are matrices a∈A and b∈B such that if a'∈A, b'∈B, and a'b' = ab, then a' = a and b' = b.
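Equivalently (a symbolic restatement, not in the original wording), part (a) asks for a pair whose product determines its factors:

\[
\exists\, a \in A,\ b \in B \ \text{such that}\ \forall\, a' \in A,\ b' \in B:\quad a'b' = ab \;\Longrightarrow\; (a', b') = (a, b).
\]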
- (b) Show that there is no f : S→ℝ such that for every X, Y∈S with Y≠±I, either f(XY) > f(X) or f(XY⁻¹) > f(X).
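As a remark on the contrast with part (a) (an observation, not part of the problem statement): S contains elements of finite order other than ±I. For instance,

\[
Y = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \in S,
\qquad Y^2 = -I, \qquad Y^4 = I,
\]

while Y ∉ T, since Y is not congruent to I mod 3.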