5 Upper bounds on $R(3, k)$
Theorem 5.1 (Ajtai-Komlós-Szemerédi, 1980s + Shearer, 1980s).
$R(3, k) \le (1 + o(1)) \frac{k^2}{\ln k}$.
Theorem 5.2 (Shearer, 1980s).
Assuming that:
- $G$ is a triangle-free graph on $n$ vertices with average degree $d$
Then
$$\alpha(G) \ge (1 - o(1)) \frac{n \ln d}{d},$$
where the $o(1)$ term tends to $0$ as $d \to \infty$.
Remark.
Given a graph $G$ on $n$ vertices with maximum degree $\Delta$,
we can use a greedy algorithm to show $\alpha(G) \ge \frac{n}{\Delta + 1}$:
pick a vertex not adjacent to anything picked so far, and chuck away its neighbours. At each iteration,
we use up at most $\Delta + 1$ vertices, hence we can run the algorithm for at least $\frac{n}{\Delta + 1}$ steps.
Theorem 5.2 beats this simple argument by a factor of about $\ln d$, under the additional
assumption that $G$ is triangle-free.
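The greedy argument above can be sketched in a few lines. This is a minimal illustration under our own choices (the random test graph and function names are not from the notes):

```python
import random

def greedy_independent_set(adj):
    """Greedy algorithm from the remark: repeatedly pick a surviving
    vertex and delete it together with its neighbours.  Each step
    removes at most Delta + 1 vertices, so the independent set found
    has size at least n / (Delta + 1)."""
    remaining = set(adj)
    indep = set()
    while remaining:
        v = min(remaining)          # any choice of surviving vertex works for the bound
        indep.add(v)
        remaining -= {v} | adj[v]   # chuck away v and its neighbours
    return indep

# Sanity check on a random graph (hypothetical example).
random.seed(0)
n, p = 200, 0.05
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < p:
            adj[u].add(v); adj[v].add(u)

I = greedy_independent_set(adj)
Delta = max(len(adj[v]) for v in range(n))
assert all(u not in adj[v] for u in I for v in I)   # I is independent
assert len(I) >= n / (Delta + 1)                    # the greedy bound
print(len(I), n / (Delta + 1))
```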
Remark.
If we take $G(n, p)$ and modify it (for example, by deleting an edge from every triangle), one can show that there exist graphs with no triangles, average degree about $d$, and
$$\alpha(G) \le (2 + o(1)) \frac{n \ln d}{d}.$$
So Theorem 5.2 is in some sense sharp. In fact, understanding what happens between these
$(1 - o(1)) \frac{n \ln d}{d}$ and $(2 + o(1)) \frac{n \ln d}{d}$ cases is
a big open problem.
Sketch pad:
-
Idea I:
Greedily build an independent set $I$.
-
Idea II:
Choose a set at random? Choose each vertex independently with probability $p$. But note that if
$p \gg \frac{1}{d}$, then for each vertex,
we are likely to pick some of its neighbours (about $pd \gg 1$ of them in expectation), so the chosen set is far from independent.
-
Idea III:
Use the Local Lemma? Too much dependence.
-
Idea IV:
Combine Ideas I and II by using a random greedy process. Take a random vertex, remove its neighbours, and repeat.
Then, using the triangle-freeness, we should get that the density decreases.
Heuristic: Let’s say we build an independent set $I$ using this process,
with $|I| = \alpha n$. Assumption:
“the set $I$
looks like a random set of its density – apart from the fact that it’s independent”.
Let’s consider the largest $\alpha$
we can hope for. Say a vertex $v$
is open if $N(v) \cap I = \emptyset$.
We might expect
$$\#\{\text{open vertices}\} \approx n(1 - \alpha)^d \approx n e^{-\alpha d}.$$
Note that once $n e^{-\alpha d} \le \alpha n$,
the process is basically over (“the number of vertices left that we could potentially use is smaller
than the number of vertices we have so far, so not really any point in trying to collect them”). So solve
$e^{-\alpha d} = \alpha$. A first estimate shows
that we need $\alpha \le (1 + o(1)) \frac{\ln d}{d}$,
and then using this we can solve to get $\alpha = (1 - o(1)) \frac{\ln d}{d}$.
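The heuristic equation $e^{-\alpha d} = \alpha$ can also be solved numerically. A quick sketch (the helper name is ours) confirms $\alpha = (1 - o(1)) \frac{\ln d}{d}$, though the convergence is slow:

```python
import math

def solve_alpha(d, lo=1e-12, hi=1.0):
    """Bisection solve of alpha = exp(-alpha*d): the density at which the
    expected number of open vertices, n*exp(-alpha*d), drops to the size
    alpha*n of the set built so far."""
    g = lambda a: a - math.exp(-a * d)   # increasing; g(lo) < 0 < g(hi)
    for _ in range(200):
        mid = (lo + hi) / 2
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for d in (10, 100, 10_000, 1_000_000):
    alpha = solve_alpha(d)
    # ratio alpha * d / ln d tends to 1, but slowly
    print(d, alpha, math.log(d) / d, alpha * d / math.log(d))
```

Indeed $\alpha d$ here is the Lambert $W$ value $W(d) = \ln d - \ln \ln d + o(\ln \ln d)$, which is why the ratio to $\ln d$ approaches $1$ only slowly.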
Fact (Jensen’s inequality). Let $f$
be a convex function on an interval $J \subseteq \mathbb{R}$.
Let $X$ be
a random variable taking values in $J$. Then $\mathbb{E}[f(X)] \ge f(\mathbb{E}[X])$.
Fact. Let $G$ be a graph on $n$ vertices
with average degree $d$.
Then
$$\frac{1}{n} \sum_{v \in V(G)} d(v)^2 \ge d^2.$$
Proof.
Let $v$ be a uniformly random vertex of $G$. Then
$$\mathbb{E}[d(v)] = \frac{1}{n} \sum_{v \in V(G)} d(v) = \frac{1}{n} \sum_{v} \sum_{u} \mathbb{1}_{uv \in E(G)} = \frac{2e(G)}{n} = d.$$
The inequality follows by Jensen. □
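As a sanity check that an inequality of this shape holds, here is the convex case $f(x) = x^2$, i.e. $\frac{1}{n} \sum_v d(v)^2 \ge d^2$, verified on a star $K_{1,5}$ (a hypothetical example, not from the notes):

```python
# Degree sequence of the star K_{1,5}: one centre of degree 5, five leaves.
degrees = [5, 1, 1, 1, 1, 1]
n = len(degrees)
d_avg = sum(degrees) / n               # average degree d = 2e(G)/n
lhs = sum(x * x for x in degrees) / n  # (1/n) * sum_v d(v)^2
rhs = d_avg ** 2                       # d^2
assert lhs >= rhs                      # Jensen: E[X^2] >= (E[X])^2
print(lhs, rhs)
```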
Now we will prove Shearer, 1980s (in fact a slight strengthening). We discussed earlier that the proof will
involve repeatedly removing vertices and tracking a density increase. Shearer found a very elegant way of
tracking this that makes the proof very clean and magical-looking, but the key idea is really the sketch
argument given above.
Define
$$f(d) = \frac{d \ln d - d + 1}{(d - 1)^2}.$$
Note $f(d) = (1 - o(1)) \frac{\ln d}{d}$ as $d \to \infty$,
and $f(0) = 1$, $f(1) = \frac{1}{2}$ (a removable singularity). We
have $f' \le 0$, $f'' \ge 0$, and hence
$$f(y) \ge f(x) + (y - x) f'(x) \quad \text{for all } x, y \ge 0,$$
by the remainder form of Taylor’s theorem.
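The claimed properties of $f$ can be checked numerically. In this sketch the explicit derivative formula $f'(d) = \frac{2(d-1) - (d+1)\ln d}{(d-1)^3}$ is our own computation, and we also check the differential equation for $f$ that the proof below relies on:

```python
import math

def f(d):
    # Shearer's function f(d) = (d ln d - d + 1) / (d - 1)^2
    return (d * math.log(d) - d + 1) / (d - 1) ** 2

def fprime(d):
    # Derivative computed by hand: f'(d) = (2(d-1) - (d+1) ln d) / (d-1)^3
    return (2 * (d - 1) - (d + 1) * math.log(d)) / (d - 1) ** 3

for d in (2.0, 5.0, 50.0, 1e4):
    # The differential equation (d+1) f(d) = 1 + (d - d^2) f'(d)
    assert abs((d + 1) * f(d) - (d - d * d) * fprime(d) - 1) < 1e-9
    assert fprime(d) < 0            # f is decreasing

print(f(1e4) * 1e4 / math.log(1e4))  # ratio f(d) * d / ln d, tends to 1 slowly
```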
Theorem 5.3 (Shearer on average degree).
Assuming that:
-
$G$ is an
$n$ vertex graph
-
$G$ is triangle-free
-
$G$ has average degree $d$
Then $\alpha(G) \ge f(d) \cdot n$.
Proof.
We induct on $n$ (the base case $n = 1$ is clear). Let $v \in V(G)$ be a uniformly random vertex.
Define $G' = G \setminus (\{v\} \cup N(v))$, with $n' = n - 1 - d(v)$ vertices and average degree $d'$.
We have
$$n'd' = 2e(G') = nd - 2\sum_{u \in N(v)} d(u),$$
where we used that $N(v)$ is an independent set,
by the triangle-free property, so that every edge meeting $\{v\} \cup N(v)$ is counted exactly once in $\sum_{u \in N(v)} d(u)$.
Using the earlier working (add $v$ to an independent set of $G'$, and apply the induction hypothesis to $G'$), we have
$$\alpha(G) \ge 1 + \alpha(G') \ge 1 + n'f(d').$$
We want
$$1 + n'f(d') \ge nf(d)$$
for some choice of $v$. We claim this holds on average. Average of left hand side is, using $f(d') \ge f(d) + (d' - d)f'(d)$, at least
$$1 + \mathbb{E}[n']f(d) + \big(\mathbb{E}[n'd'] - \mathbb{E}[n']d\big)f'(d) \ge 1 + (n - 1 - d)f(d) + (d - d^2)f'(d),$$
where we used $\mathbb{E}[d(v)] = d$, $\mathbb{E}\big[\sum_{u \in N(v)} d(u)\big] = \frac{1}{n}\sum_u d(u)^2 \ge d^2$ (by the Fact), and $f'(d) \le 0$.
The right hand side is $nf(d) = (n - 1 - d)f(d) + (d + 1)f(d)$.
In fact,
$f$ satisfies the differential equation
$$(d + 1)f(d) = 1 + (d - d^2)f'(d).$$
So the inequality holds on average, and thus there exists a good choice of $v$ for the
induction. □
In this proof, note that the exact form of the function $f$
is not essential and is just helpful to make the presentation as short as possible.
We now give another proof of Theorem 5.2, but with some constant $c > 0$ instead
of $1 - o(1)$, and
this time we’ll assume the maximum degree is bounded rather than the average degree:
Second proof of Theorem 5.2.
Let $G$ be a triangle-free graph on $n$ vertices with maximum degree $\Delta$. Choose an independent set $I$
uniformly at random among all independent sets of $G$ (including the empty set). We’ll show
$$\mathbb{E}|I| \ge c \cdot \frac{n \ln \Delta}{\Delta}.$$
This is an example of what is known as the “hard-core model”. It is called “hard” because we have a
hard constraint on every edge (no edge may have both endpoints in $I$).
Let $Y_v$, for
$v \in V(G)$, be
defined by
$$Y_v = \Delta \cdot \mathbb{1}_{v \in I} + |N(v) \cap I|.$$
Note, for any fixed independent set $I$,
$$\sum_{v \in V(G)} Y_v = \Delta \sum_{v} \mathbb{1}_{v \in I} + \sum_{v} |N(v) \cap I| = \Delta|I| + \sum_{u \in I} d(u) \le 2\Delta|I|.$$
So $\mathbb{E}|I| \ge \frac{1}{2\Delta} \sum_{v \in V(G)} \mathbb{E}[Y_v]$,
and so it is enough to show $\mathbb{E}[Y_v] \ge c' \ln \Delta$
for every $v \in V(G)$ and
some constant $c' > 0$.
Now fix $v \in V(G)$
and define
$$N[v] = \{v\} \cup N(v).$$
We now bound $\mathbb{E}[Y_v]$ by conditioning on $I \setminus N[v]$.
So now fix some independent set $J \subseteq V(G) \setminus N[v]$.
We consider
$$\mathbb{E}\big[Y_v \;\big|\; I \setminus N[v] = J\big],$$
where conditioning on $I \setminus N[v] = J$
means revealing exactly which vertices outside $N[v]$ lie in $I$.
Let $A = \{u \in N(v) : N(u) \cap J = \emptyset\}$ and $a = |A|$.
Now observe that, under this conditioning, $I \cap N[v]$ is uniform
over all independent sets in $G[\{v\} \cup A]$.
So since $A$ is an independent set (as $G$ is triangle-free) and $v$ is adjacent to all of $A$, the set $I \cap N[v]$
is either $\{v\}$ or
any subset of $A$: there are $2^a + 1$ equally likely outcomes.
Since $\sum_{S \subseteq A} |S| = a \cdot 2^{a-1}$, we have
$$\mathbb{E}\big[Y_v \;\big|\; I \setminus N[v] = J\big] = \frac{\Delta + a \cdot 2^{a-1}}{2^a + 1}.$$
We are happy with this equality, because if
$a$ is large, then the second term is large, and if
$a$ is small then the first term is large.
Continuing the above calculation, we have
$$\frac{\Delta + a \cdot 2^{a-1}}{2^a + 1} \ge \max\left(\frac{\Delta}{2^{a+1}}, \frac{a}{4}\right).$$
Now solve $\frac{\Delta}{2^{a+1}} = \frac{a}{4}$,
which is $a = (1 - o(1)) \log_2 \Delta$, so
$\mathbb{E}\big[Y_v \mid I \setminus N[v] = J\big] \ge \left(\frac{1}{4} - o(1)\right) \log_2 \Delta$ whatever $J$ is. So $\mathbb{E}[Y_v] \ge \left(\frac{1}{4} - o(1)\right) \log_2 \Delta$,
and thus
$$\mathbb{E}|I| \ge \frac{1}{2\Delta} \sum_{v \in V(G)} \mathbb{E}[Y_v] \ge c \cdot \frac{n \ln \Delta}{\Delta}$$
as desired. □
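To make the hard-core model concrete, one can enumerate all independent sets of a small triangle-free graph and compute $\mathbb{E}|I|$ exactly. The Petersen graph (3-regular, triangle-free) is a convenient example of ours, not one from the notes:

```python
from itertools import combinations

# Petersen graph: outer 5-cycle on 0..4, inner pentagram on 5..9, spokes.
# It is 3-regular and triangle-free, so the hard-core model applies.
edges = set()
for i in range(5):
    edges.add(frozenset((i, (i + 1) % 5)))          # outer cycle
    edges.add(frozenset((5 + i, 5 + (i + 2) % 5)))  # inner pentagram
    edges.add(frozenset((i, i + 5)))                # spokes
V = range(10)

def independent(S):
    return all(frozenset(e) not in edges for e in combinations(S, 2))

# Enumerate every independent set (including the empty set), i.e. the
# support of the uniform measure that the second proof samples from.
ind_sets = [S for r in range(11) for S in combinations(V, r) if independent(S)]
avg = sum(len(S) for S in ind_sets) / len(ind_sets)
print(len(ind_sets), avg)   # number of independent sets, and E|I|
```

For the Petersen graph this enumeration finds 76 independent sets with $\mathbb{E}|I| = 180/76 \approx 2.37$, strictly smaller than the maximum $\alpha = 4$: the uniform measure spreads mass over small sets too, which is why the theorem only claims a bound of the form $c\, n \ln \Delta / \Delta$.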
5.1 Recap of $R(3, k)$
bounds proved in this course
We have shown
$$c \left(\frac{k}{\ln k}\right)^{3/2} \le R(3, k) \le (1 + o(1)) \frac{k^2}{\ln k}.$$
Kim showed $R(3, k) \ge c' \frac{k^2}{\ln k}$ for some constant $c' > 0$, so $R(3, k) = \Theta\!\left(\frac{k^2}{\ln k}\right)$.
Triangle-free process: Define the random graph process
$G_0, G_1, G_2, \ldots$, with $G_0$ the empty graph on $n$ vertices. Given $G_i$,
define
$$E_i = \{e \in E(K_n) \setminus E(G_i) : G_i + e \text{ is triangle-free}\}.$$
Now sample $e_{i+1} \in E_i$ uniformly
at random, and define $G_{i+1} = G_i + e_{i+1}$ (stopping when $E_i = \emptyset$).
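A minimal simulation of the triangle-free process, as just defined (a sketch only; the function names are ours, and this is not Kim's argument):

```python
import random

def triangle_free_process(n, seed=0):
    """Run the triangle-free process on n vertices: starting from the
    empty graph, repeatedly add a uniformly random edge whose addition
    keeps the graph triangle-free, until no such edge remains."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    while True:
        # Adding uv creates a triangle iff u and v have a common neighbour.
        open_edges = [(u, v) for u in range(n) for v in range(u + 1, n)
                      if v not in adj[u] and not (adj[u] & adj[v])]
        if not open_edges:          # E_i is empty: the process terminates
            return adj
        u, v = rng.choice(open_edges)
        adj[u].add(v); adj[v].add(u)

G = triangle_free_process(30)
# The final graph is triangle-free (and maximal triangle-free).
assert all(not (G[u] & G[v]) for u in G for v in G[u])
print(sum(len(G[v]) for v in G) // 2)   # number of edges of the final graph
```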
This method is not what Kim used, but it was used by others later to improve the lower bound.
Up until very recently, every proof of a lower bound on
$R(3, k)$ has
used a variant of the triangle-free process.
Remark.
There is some similarity between the triangle-free process and the first proof of Theorem 5.2 that
we saw: both are about a process where each step is uniformly random once we condition on a desired
property. So we might expect a related theorem to hold for the triangle-free process, and indeed this is the
case.
The Ramsey numbers $R(3, k)$
are still of interest, as the focus is now on determining the right constant factor.
We now finish this section by using Theorem 5.2 to deduce the upper bound on
$R(3, k)$:
Theorem 5.4 (Shearer upper bound on $R(3, k)$).
$R(3, k) \le (1 + o(1)) \frac{k^2}{\ln k}$.
Proof.
Let $n = (1 + \varepsilon) \frac{k^2}{\ln k}$ for a fixed $\varepsilon > 0$, and consider a red/blue colouring of the edges of $K_n$; we may assume there is no blue triangle, otherwise done. If there is a vertex of degree $\ge k$
in the blue graph then we have an independent set of size $k$,
because the blue graph must be triangle-free, so its blue neighbourhood contains no blue edge, and we are done. So the max degree of the blue graph is $< k$, and hence its average degree is $d < k$.
So apply Shearer, 1980s to get (for large $k$)
that
$$\alpha(G_{\text{blue}}) \ge (1 - o(1)) \frac{n \ln d}{d} \ge (1 - o(1)) \frac{n \ln k}{k} \ge (1 - o(1))(1 + \varepsilon) k \ge k$$
(where $G_{\text{blue}}$
is the blue graph, and we used that $\frac{\ln d}{d}$ is decreasing). □