6 Analysis on the $p$-biased cube
Let $p \in (0,1)$. We define a measure $\mu_p$ on $\{-1,1\}^n$ by picking each $x_i$ independently to be $-1$ with probability $p$ and $1$ with probability $q$, where $q = 1 - p$. This convention looks counterintuitive at first (placing $p$ for the probability of the smaller value), but recall that under the correspondence $x_i \leftrightarrow \frac{1-x_i}{2}$ between $\{-1,1\}$ and $\{0,1\}$, we have $1 \leftrightarrow 0$ and $-1 \leftrightarrow 1$. So this convention means that we “take the identity most of the time, and take the non-identity with probability $p$”.
Consider the random variable $x_i$ (where $x \sim \mu_p$). Then $\mathbb{E}[x_i] = q - p = 1 - 2p$ and
\[ \mathrm{Var}(x_i) = \mathbb{E}[x_i^2] - (\mathbb{E}[x_i])^2 = 1 - (1-2p)^2 = 4pq, \]
so $\sigma(x_i) = 2\sqrt{pq}$.
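These moment computations are easy to sanity-check numerically; here is a minimal Python sketch (the value $p = 0.3$ is an arbitrary choice, not from the notes):

```python
# Check that for x_i ~ mu_p with Pr[x_i = -1] = p, Pr[x_i = 1] = q = 1 - p:
# E[x_i] = 1 - 2p, Var(x_i) = 4pq, sigma = 2*sqrt(pq).
p = 0.3
q = 1 - p
mean = q - p              # E[x_i]
var = 1 - mean ** 2       # Var(x_i), using x_i^2 = 1 always
sigma = var ** 0.5
assert abs(mean - (1 - 2 * p)) < 1e-12
assert abs(var - 4 * p * q) < 1e-12
assert abs(sigma - 2 * (p * q) ** 0.5) < 1e-12
print(round(mean, 6), round(var, 6))
```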
Let $\mu = \mathbb{E}[x_i] = 1 - 2p$. We write $\sigma$ for $\sigma(x_i)$, so $\sigma = 2\sqrt{pq}$.
Let $\chi_i = \frac{x_i - \mu}{\sigma}$. The $\chi_i$ will play the role of the $x_i$ in the unbiased case.
Fix $S \subseteq [n]$. Then given $x \in \{-1,1\}^n$, we define
\[ \chi_S(x) = \prod_{i \in S} \chi_i(x), \]
and $\chi_\emptyset \equiv 1$.
Lemma 6.1.
$\langle \chi_S, \chi_T \rangle = \mathbb{1}[S = T]$ for every $S, T \subseteq [n]$.
Proof.
\[ \langle \chi_S, \chi_T \rangle = \mathbb{E}_{x \sim \mu_p}\Big[\prod_{i \in S \cap T} \chi_i(x)^2 \prod_{i \in S \triangle T} \chi_i(x)\Big] = \prod_{i \in S \cap T} \mathbb{E}[\chi_i^2] \prod_{i \in S \triangle T} \mathbb{E}[\chi_i] = \mathbb{1}[S = T], \]
since the $\chi_i$ are independent with $\mathbb{E}[\chi_i] = 0$ and $\mathbb{E}[\chi_i^2] = 1$. □
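Orthonormality can be verified by brute force for small $n$; the following Python sketch (with arbitrary choices $n = 3$, $p = 0.3$) checks it over all pairs of subsets:

```python
# Brute-force check that the chi_S are orthonormal under mu_p.
from itertools import product
from math import sqrt, prod

n, p = 3, 0.3
q = 1 - p
mu, sigma = 1 - 2 * p, 2 * sqrt(p * q)

def weight(x):  # mu_p probability of a point x in {-1,1}^n
    return prod(p if xi == -1 else q for xi in x)

def chi(S, x):  # chi_S(x) = prod_{i in S} (x_i - mu) / sigma
    return prod((x[i] - mu) / sigma for i in S)

def inner(S, T):  # <chi_S, chi_T> under mu_p
    return sum(weight(x) * chi(S, x) * chi(T, x)
               for x in product((-1, 1), repeat=n))

subsets = [tuple(i for i in range(n) if b >> i & 1) for b in range(2 ** n)]
for S in subsets:
    for T in subsets:
        assert abs(inner(S, T) - (1.0 if S == T else 0.0)) < 1e-9
print("orthonormal")
```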
Definition 6.2 ($p$-biased Fourier coefficient).
Let $f: \{-1,1\}^n \to \mathbb{R}$ and $S \subseteq [n]$.
The $p$-biased Fourier coefficient $\hat f(S)$ is defined by $\hat f(S) = \langle f, \chi_S \rangle = \mathbb{E}_{x \sim \mu_p}[f(x)\chi_S(x)]$.
Because the $\chi_S$ form an orthonormal basis, it follows that
\[ f = \sum_{S \subseteq [n]} \hat f(S)\,\chi_S \qquad\text{and}\qquad \|f\|_2^2 = \sum_{S \subseteq [n]} \hat f(S)^2. \]
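The expansion and Parseval can be checked numerically; a Python sketch with a hypothetical test function $f(x) = x_1x_2 + \frac12 x_3$ (any function works):

```python
# Check f = sum_S fhat(S) chi_S pointwise, and Parseval: E[f^2] = sum fhat(S)^2.
from itertools import product
from math import sqrt, prod

n, p = 3, 0.3
q = 1 - p
mu, sigma = 1 - 2 * p, 2 * sqrt(p * q)
points = list(product((-1, 1), repeat=n))

def weight(x):
    return prod(p if xi == -1 else q for xi in x)

def chi(S, x):
    return prod((x[i] - mu) / sigma for i in S)

def f(x):  # arbitrary test function
    return x[0] * x[1] + 0.5 * x[2]

subsets = [tuple(i for i in range(n) if b >> i & 1) for b in range(2 ** n)]
fhat = {S: sum(weight(x) * f(x) * chi(S, x) for x in points) for S in subsets}

for x in points:  # Fourier expansion recovers f pointwise
    assert abs(f(x) - sum(fhat[S] * chi(S, x) for S in subsets)) < 1e-9

norm2 = sum(weight(x) * f(x) ** 2 for x in points)  # E[f^2]
assert abs(norm2 - sum(c ** 2 for c in fhat.values())) < 1e-9
print("parseval ok")
```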
Definition 6.3.
Let $i \in [n]$. Then define $\partial_i f$ by
\[ \partial_i f(x) = \frac{f(x^{i \to 1}) - f(x^{i \to -1})}{2}, \]
where $x^{i \to a}$ denotes $x$ with its $i$th coordinate set to $a$. Then define $\mathrm{Inf}_i(f)$ to be $\mathbb{E}_{x \sim \mu_p}[(\partial_i f(x))^2]$.
Now,
\[ \partial_i \chi_S = \frac{\chi_S(x^{i \to 1}) - \chi_S(x^{i \to -1})}{2}. \]
If $i \notin S$ then this is $0$. Otherwise, it is
\[ \frac{\chi_i(1) - \chi_i(-1)}{2}\,\chi_{S \setminus \{i\}} = \frac{1}{\sigma}\,\chi_{S \setminus \{i\}}. \]
It follows that
\[ \partial_i f = \frac{1}{\sigma}\sum_{S \ni i} \hat f(S)\,\chi_{S \setminus \{i\}}, \]
and therefore that
\[ \mathrm{Inf}_i(f) = \mathbb{E}[(\partial_i f)^2] = \frac{1}{\sigma^2}\sum_{S \ni i} \hat f(S)^2. \]
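The Fourier formula for the influence can be verified numerically; a Python sketch using majority on 3 bits (an arbitrary choice of Boolean test function):

```python
# Check Inf_i(f) = E[(d_i f)^2] = (1/sigma^2) * sum_{S: i in S} fhat(S)^2.
from itertools import product
from math import sqrt, prod

n, p = 3, 0.3
q = 1 - p
mu, sigma = 1 - 2 * p, 2 * sqrt(p * q)
points = list(product((-1, 1), repeat=n))

def weight(x):
    return prod(p if xi == -1 else q for xi in x)

def chi(S, x):
    return prod((x[i] - mu) / sigma for i in S)

def f(x):  # majority on 3 bits
    return 1 if sum(x) > 0 else -1

def d(i, x):  # discrete derivative (f(x^{i->1}) - f(x^{i->-1})) / 2
    up = list(x); up[i] = 1
    dn = list(x); dn[i] = -1
    return (f(tuple(up)) - f(tuple(dn))) / 2

subsets = [tuple(i for i in range(n) if b >> i & 1) for b in range(2 ** n)]
fhat = {S: sum(weight(x) * f(x) * chi(S, x) for x in points) for S in subsets}

for i in range(n):
    inf_i = sum(weight(x) * d(i, x) ** 2 for x in points)
    fourier = sum(fhat[S] ** 2 for S in subsets if i in S) / sigma ** 2
    assert abs(inf_i - fourier) < 1e-9
print("influence formula ok")
```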
Noise and stability
Fix $p$, so that we don’t have to write subscripts everywhere.
Let $\rho \in [0,1]$ and let $x \in \{-1,1\}^n$. We say that $y \sim N_\rho(x)$ if $y_i = x_i$ with probability $\rho$ and with probability $1-\rho$, $y_i$ is $\mu_p$-random, with all $y_i$s independent.
If $x \sim \mu_p$, then $y \sim \mu_p$ as well. So $\mathbb{E}[\chi_i(x)\chi_i(y)] = \rho$ for each $i$. Then we say that $x$ and $y$ are $\rho$-correlated, and write $(x,y) \sim N_\rho$.
Note: $N_\rho$ depends on $p$, but we don’t write $N_\rho^{(p)}$ (i.e. we omit the $p$ for convenience).
Definition 6.4.
The $p$-biased noise operator $T_\rho$ is given by the formula
\[ T_\rho f(x) = \mathbb{E}_{y \sim N_\rho(x)}[f(y)]. \]
Lemma 6.5.
For every $S \subseteq [n]$,
$T_\rho\,\chi_S = \rho^{|S|}\chi_S$.
Reminders: given $x$, the coordinates $y_i$ of $y \sim N_\rho(x)$ are independent, and $\mathbb{E}_{y_i \sim \mu_p}[\chi_i(y_i)] = 0$.
Proof.
\[ T_\rho \chi_S(x) = \mathbb{E}_{y \sim N_\rho(x)}\prod_{i \in S}\chi_i(y_i) = \prod_{i \in S}\mathbb{E}[\chi_i(y_i) \mid x] = \prod_{i \in S}\big(\rho\,\chi_i(x_i) + (1-\rho)\cdot 0\big) = \rho^{|S|}\chi_S(x). \qquad\square \]
Corollary 6.6.
$T_\rho f = \sum_{S \subseteq [n]} \rho^{|S|}\hat f(S)\,\chi_S$.
Proof.
$T_\rho f = T_\rho \sum_S \hat f(S)\chi_S = \sum_S \hat f(S)\,T_\rho\chi_S = \sum_S \rho^{|S|}\hat f(S)\chi_S$.
□
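The eigenvalue relation of Lemma 6.5 can be checked by summing the noise transition probabilities directly; a Python sketch (parameters $n = 3$, $p = 0.3$, $\rho = 0.6$ are arbitrary):

```python
# Check that T_rho chi_S = rho^{|S|} chi_S pointwise.
from itertools import product
from math import sqrt, prod

n, p, rho = 3, 0.3, 0.6
q = 1 - p
mu, sigma = 1 - 2 * p, 2 * sqrt(p * q)
points = list(product((-1, 1), repeat=n))

def chi(S, x):
    return prod((x[i] - mu) / sigma for i in S)

def transition(x, y):  # Pr[y | x]: keep x_i w.p. rho, resample from mu_p w.p. 1-rho
    pr = 1.0
    for xi, yi in zip(x, y):
        resample = p if yi == -1 else q
        pr *= rho * (1 if yi == xi else 0) + (1 - rho) * resample
    return pr

def T(g, x):  # (T_rho g)(x) = E_{y ~ N_rho(x)} g(y)
    return sum(transition(x, y) * g(y) for y in points)

subsets = [tuple(i for i in range(n) if b >> i & 1) for b in range(2 ** n)]
for S in subsets:
    for x in points:
        assert abs(T(lambda y: chi(S, y), x) - rho ** len(S) * chi(S, x)) < 1e-9
print("noise operator ok")
```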
Definition 6.7.
Let $f: \{-1,1\}^n \to \mathbb{R}$, $\rho \in [0,1]$. Then
\[ \mathrm{Stab}_\rho(f) = \mathbb{E}_{(x,y) \sim N_\rho}[f(x)f(y)] = \langle f, T_\rho f\rangle. \]
Remark.
By Corollary 6.6, $\mathrm{Stab}_\rho(f) = \sum_S \rho^{|S|}\hat f(S)^2$,
as in the unbiased case.
The Margulis–Russo formula
We shall be considering more than one value of $p$.
Convention: let $p \in (0,1)$. If we write $\mathbb{E}_p$, $\mathrm{Inf}_i^{(p)}$, and so on, then all definitions should be understood to be $p$-biased.
Lemma 6.8.
Let $f: \mathbb{R}^n \to \mathbb{R}$ be a multilinear function, and write $f$ also for its restriction to $\{-1,1\}^n$.
Then $\frac{\partial f}{\partial x_i} = \partial_i f$ on $\{-1,1\}^n$.
Note: whenever we write $\partial_i f$, it is implicit that we are restricting to $\{-1,1\}^n$, since $\partial_i f$ is only defined there.
We will give 3 proofs!
Proof 1.
Write $x^S = \prod_{i \in S} x_i$.
Then $\frac{\partial x^S}{\partial x_i} = \mathbb{1}[i \in S]\,x^{S \setminus \{i\}} = \partial_i x^S$.
Then by linearity, we’re done. □
You might think the above proof is a bit odd, since it uses the monomials $x^S$ even though we’re in the $p$-biased case. I would agree with you!
Proof 2.
Write $f = \sum_S \hat f(S)\chi_S$.
Then
\[ \frac{\partial \chi_S}{\partial x_i} = \mathbb{1}[i \in S]\,\frac{1}{\sigma}\,\chi_{S \setminus \{i\}} = \partial_i \chi_S, \]
and by linearity, we’re done. □
Proof 3.
Induction on $n$. Write $f = x_n g + h$, where $g$ and $h$ do not depend on $x_n$. For $i = n$ both sides equal $g$, and for $i < n$,
\[ \partial_i f = x_n\,\partial_i g + \partial_i h = x_n\,\frac{\partial g}{\partial x_i} + \frac{\partial h}{\partial x_i} = \frac{\partial f}{\partial x_i}, \]
where the first equality used linearity in the last coordinate, and the second equality is by the induction
hypothesis. □
Theorem 6.9 (The Margulis–Russo formula).
Let $f: \{-1,1\}^n \to \mathbb{R}$ be as above, extended multilinearly to $\mathbb{R}^n$. Then
\[ \frac{d}{dp}\,\mathbb{E}_{x \sim \mu_p}[f(x)] = -2\sum_{i=1}^n \mathbb{E}_{x \sim \mu_p}[\partial_i f(x)]. \]
Remark.
Later we will care instead about $\frac{d}{dp}\Pr_{\mu_p}[f = -1]$.
But for now it is easier to work with $\frac{d}{dp}\,\mathbb{E}_{\mu_p}[f]$.
Proof.
By multilinearity and independence, $\mathbb{E}_{\mu_p}[f] = f(\mu,\dots,\mu)$, so by the chain rule
\[ \frac{d}{dp}\,\mathbb{E}_{\mu_p}[f] = \sum_{i=1}^n \frac{\partial f}{\partial x_i}(\mu,\dots,\mu)\,\frac{d\mu}{dp} = \frac{d\mu}{dp}\sum_{i=1}^n \mathbb{E}_{\mu_p}[\partial_i f], \]
where the second equality used Lemma 6.8 together with the multilinearity of $\partial_i f$.
But $\mu = 1 - 2p$, so
$\frac{d\mu}{dp} = -2$. The
result follows. □
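Theorem 6.9 can be checked numerically via a finite difference in $p$; a Python sketch using majority on 3 bits as a hypothetical test function:

```python
# Check d/dp E_{mu_p}[f] = -2 * sum_i E_{mu_p}[d_i f] by central difference.
from itertools import product
from math import prod

n = 3
points = list(product((-1, 1), repeat=n))

def f(x):  # majority on 3 bits
    return 1 if sum(x) > 0 else -1

def d(i, x):  # discrete derivative
    up = list(x); up[i] = 1
    dn = list(x); dn[i] = -1
    return (f(tuple(up)) - f(tuple(dn))) / 2

def weight(x, p):  # mu_p weight; Pr[x_i = -1] = p
    return prod(p if xi == -1 else 1 - p for xi in x)

def mean_f(p):
    return sum(weight(x, p) * f(x) for x in points)

p, h = 0.3, 1e-6
lhs = (mean_f(p + h) - mean_f(p - h)) / (2 * h)
rhs = -2 * sum(sum(weight(x, p) * d(i, x) for x in points) for i in range(n))
assert abs(lhs - rhs) < 1e-6
print("margulis-russo ok")
```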
Corollary 6.10.
Let $f: \{-1,1\}^n \to \{-1,1\}$
be a monotone Boolean function. Then
\[ \frac{d}{dp}\Pr_{x \sim \mu_p}[f(x) = -1] = \sum_{i=1}^n \mathrm{Inf}_i^{(p)}(f). \]
Proof.
$\Pr_{\mu_p}[f = -1] = \frac{1 - \mathbb{E}_{\mu_p}[f]}{2}$,
so
\[ \frac{d}{dp}\Pr_{\mu_p}[f = -1] = -\frac12\,\frac{d}{dp}\,\mathbb{E}_{\mu_p}[f]. \]
From the proof of Theorem 6.9, this is $\sum_{i=1}^n \mathbb{E}_{\mu_p}[\partial_i f]$.
Since $f$ is monotone, $\partial_i f \in \{0,1\}$,
so this equals
\[ \sum_{i=1}^n \Pr_{x \sim \mu_p}[\partial_i f(x) = 1] = \sum_{i=1}^n \mathbb{E}_{\mu_p}[(\partial_i f)^2] = \sum_{i=1}^n \mathrm{Inf}_i^{(p)}(f). \qquad\square \]
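Corollary 6.10 can likewise be checked numerically for the (monotone) majority function on 3 bits; a Python sketch:

```python
# Check d/dp Pr_{mu_p}[f = -1] = sum_i Inf_i(f) for monotone f (majority).
from itertools import product
from math import prod

n = 3
points = list(product((-1, 1), repeat=n))

def f(x):  # majority on 3 bits (monotone)
    return 1 if sum(x) > 0 else -1

def d(i, x):
    up = list(x); up[i] = 1
    dn = list(x); dn[i] = -1
    return (f(tuple(up)) - f(tuple(dn))) / 2

def weight(x, p):
    return prod(p if xi == -1 else 1 - p for xi in x)

def pr_true(p):  # Pr[f = -1]
    return sum(weight(x, p) for x in points if f(x) == -1)

p, h = 0.3, 1e-6
lhs = (pr_true(p + h) - pr_true(p - h)) / (2 * h)
total_inf = sum(sum(weight(x, p) * d(i, x) ** 2 for x in points)
                for i in range(n))
assert abs(lhs - total_inf) < 1e-6
print("russo ok")
```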
Remark.
Suppose that $0 < p_0 < p_1 < 1$
are such that $\Pr_{\mu_{p_0}}[f = -1] = \epsilon$,
$\Pr_{\mu_{p_1}}[f = -1] = 1 - \epsilon$. Then by MVT
there exists $p \in (p_0, p_1)$
such that
\[ \frac{d}{dp}\Pr_{\mu_p}[f = -1] = \frac{1 - 2\epsilon}{p_1 - p_0}. \]
So by the Margulis–Russo formula, there exists $p \in (p_0, p_1)$
such that $\sum_i \mathrm{Inf}_i^{(p)}(f) = \frac{1-2\epsilon}{p_1 - p_0}$.
So if $p_1 - p_0$ isn’t small,
then there exists $p \in (p_0, p_1)$
such that $\epsilon \le \Pr_{\mu_p}[f = -1] \le 1 - \epsilon$
and $\sum_i \mathrm{Inf}_i^{(p)}(f)$
isn’t too large.
Let $f: \{-1,1\}^n \to \mathbb{R}$. Then
for each $i \in [n]$,
$E_i f$ is
defined by
\[ E_i f(x) = \mathbb{E}_{y_i \sim \mu_p}\big[f(x_1,\dots,x_{i-1},y_i,x_{i+1},\dots,x_n)\big]. \]
Lemma 6.11.
For every $f: \{-1,1\}^n \to \mathbb{R}$
and every $i \in [n]$,
$f = \sigma\chi_i\,\partial_i f + E_i f$, and
$\sigma\chi_i\,\partial_i f$ and $E_i f$
are
orthogonal.
Proof.
Reminders: $x_i = \sigma\chi_i + \mu$, and we may write $f = x_i g + h$, where $g = \partial_i f$ and $h$ do not depend on $x_i$.
So
\[ f = (\sigma\chi_i + \mu)g + h = \sigma\chi_i\,g + (\mu g + h). \]
Also, $E_i f = \mathbb{E}[x_i]\,g + h = \mu g + h$.
Therefore,
\[ f = \sigma\chi_i\,\partial_i f + E_i f. \]
Orthogonality is easy ($\partial_i f$
and $E_i f$ don’t depend
on $x_i$, and then use
the fact that $\chi_i$
has average $0$).
□
Lemma 6.12 ($p$-biased Bonami Lemma).
Let $f: \{-1,1\}^n \to \mathbb{R}$ have
degree at most $d$.
Then $\|f\|_4 \le c^d\|f\|_2$,
where $c = 2(pq)^{-1/4}$.
Proof.
Induction on $n$ (the case $n = 0$ is trivial).
Let $g = \sigma\,\partial_n f$,
$h = E_n f$, so that $f = \chi_n g + h$ by Lemma 6.11. By
orthogonality,
\[ \|f\|_2^2 = \|g\|_2^2 + \|h\|_2^2. \]
Note also that $g$
has degree at most $d - 1$, so by the induction hypothesis $\|g\|_4 \le c^{d-1}\|g\|_2$ and $\|h\|_4 \le c^d\|h\|_2$.
Using that $\mathbb{E}[\chi_n] = 0$,
$\mathbb{E}[\chi_n^2] = 1$, we get
\[ \|f\|_4^4 = \mathbb{E}[\chi_n^4]\,\mathbb{E}[g^4] + 4\,\mathbb{E}[\chi_n^3]\,\mathbb{E}[g^3h] + 6\,\mathbb{E}[g^2h^2] + \mathbb{E}[h^4], \]
so, by Hölder, $\|f\|_4^4 \le \mathbb{E}[\chi_n^4]\,\|g\|_4^4 + 4\,|\mathbb{E}[\chi_n^3]|\,\|g\|_4^3\|h\|_4 + 6\|g\|_4^2\|h\|_4^2 + \|h\|_4^4$.
Also, writing $\lambda = (pq)^{-1/2}$,
\[ |\mathbb{E}[\chi_n^3]| = \frac{|q-p|}{\sqrt{pq}} \le \lambda \qquad\text{and}\qquad \mathbb{E}[\chi_n^4] = \frac{p^3+q^3}{pq} \le \lambda^2. \]
So
\[ \|f\|_4^4 \le \lambda^2\|g\|_4^4 + 4\lambda\,\|g\|_4^3\|h\|_4 + 6\|g\|_4^2\|h\|_4^2 + \|h\|_4^4. \]
Apply $2xy \le x^2 + y^2$
with $x = \|g\|_4^2$,
$y = \|g\|_4\|h\|_4$. Then
\[ \|f\|_4^4 \le (\lambda^2 + 2\lambda)\|g\|_4^4 + (2\lambda + 6)\|g\|_4^2\|h\|_4^2 + \|h\|_4^4. \]
Choose $c$
such that $\lambda^2 + 2\lambda \le c^4$
and $2\lambda + 6 \le 2c^2$.
$c = 2\lambda^{1/2} = 2(pq)^{-1/4}$ will
do, since $\lambda \ge 2$.
So, by the induction hypothesis,
\[ \|f\|_4^4 \le c^{4d}\big(\|g\|_2^4 + 2\|g\|_2^2\|h\|_2^2 + \|h\|_2^4\big) = c^{4d}\|f\|_2^4. \]
□
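The inequality of Lemma 6.12 can be spot-checked numerically; a Python sketch for a degree-1 test function, with the constant $c = 2(pq)^{-1/4}$ (one admissible choice) and an arbitrary, fairly biased $p = 0.1$:

```python
# Check ||f||_4 <= c^d ||f||_2 for f = chi_1 + chi_2 + chi_3 (degree d = 1).
from itertools import product
from math import sqrt, prod

n, p = 3, 0.1
q = 1 - p
mu, sigma = 1 - 2 * p, 2 * sqrt(p * q)
points = list(product((-1, 1), repeat=n))

def weight(x):
    return prod(p if xi == -1 else q for xi in x)

def chi(i, x):
    return (x[i] - mu) / sigma

def f(x):  # degree-1 test function
    return sum(chi(i, x) for i in range(n))

def norm(r):  # L^r norm under mu_p
    return sum(weight(x) * abs(f(x)) ** r for x in points) ** (1 / r)

c = 2 / (p * q) ** 0.25
assert norm(4) <= c ** 1 * norm(2) + 1e-9
print("bonami ok")
```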
Remark.
This proof does not recover the $p = \frac12$
case that we saw before (Bonami’s Lemma): in Lemma 4.1 we proved Lemma 6.12 for $p = \frac12$
but with $c = \sqrt3$,
whereas in the above
proof we only get $c = 2\sqrt2$
in the case $p = \frac12$.
Corollary 6.13.
Let $0 \le \rho < 1/c$, with $c$ as in Lemma 6.12.
Then for every $f: \{-1,1\}^n \to \mathbb{R}$
we have $\|T_\rho f\|_4 \le \|f\|_2$.
Proof.
Writing $f = \sum_{d \ge 0} f^{=d}$ for the decomposition into homogeneous degree-$d$ parts, Lemma 6.12, the triangle inequality and Cauchy–Schwarz give
\[ \|T_\rho f\|_4 \le \sum_d \rho^d\|f^{=d}\|_4 \le \sum_d (\rho c)^d\|f^{=d}\|_2 \le \Big(\sum_d(\rho c)^{2d}\Big)^{1/2}\|f\|_2 = K_\rho\|f\|_2, \]
with $K_\rho = (1 - \rho^2c^2)^{-1/2}$. Applying this bound to the tensor powers $f^{\otimes k}$ and taking $k$th roots, by the tensor power trick, the result follows. □
Corollary 6.14.
For every $f$
with $\deg(f) \le d$, we
also have $\|f\|_2 \le c^d\|f\|_{4/3}$.
Proof.
Identical to the uniform case, but with a different $c$.
□
Remark.
As before, this gives us that $\|f^{\le d}\|_2 \le c^d\|f\|_{4/3}$ for every $f$,
i.e. $\sum_{|S| \le d}\hat f(S)^2 \le c^{2d}\|f\|_{4/3}^2$.
Theorem 6.15 ($p$-biased Friedgut junta theorem).
Let $f: \{-1,1\}^n \to \{-1,1\}$ be a Boolean
function and suppose that $\mathrm{Inf}(f) := \sum_{i=1}^n \mathrm{Inf}_i(f) \le K$.
Then for every $\epsilon > 0$ there exists an $m$-junta $g$
with $m \le \frac{4\sigma^4K^3}{\epsilon^2}\,c^{8\sigma^2K/\epsilon}$
and $\|f - g\|_2^2 \le \epsilon$.
Proof.
Let $\tau > 0$
(to be chosen later) and let
\[ J = \{i \in [n] : \mathrm{Inf}_i(f) \ge \tau\}. \]
Let $g = \sum_{S \subseteq J}\hat f(S)\chi_S$.
Then $g$ is a $J$-junta and $|J| \le K/\tau$.
Let $d = 2\sigma^2K/\epsilon$.
Then
\[ \|f - g\|_2^2 = \sum_{S \not\subseteq J}\hat f(S)^2 \le \sum_{\substack{|S| \le d \\ S \not\subseteq J}}\hat f(S)^2 + \sum_{|S| > d}\hat f(S)^2. \]
By hypothesis, the second term is at most $\frac1d\sum_S|S|\hat f(S)^2 = \frac{\sigma^2\,\mathrm{Inf}(f)}{d} \le \frac{\epsilon}{2}$.
But
\[ \sum_{\substack{|S| \le d \\ S \not\subseteq J}}\hat f(S)^2 \le \sum_{i \notin J}\sum_{\substack{|S| \le d \\ S \ni i}}\hat f(S)^2 = \sigma^2\sum_{i \notin J}\|(\partial_i f)^{\le d-1}\|_2^2 \le \sigma^2 c^{2d}\sum_{i \notin J}\mathrm{Inf}_i(f)^{3/2}, \]
using the remark after Corollary 6.14 applied to $\partial_i f$ (which takes values in $\{-1,0,1\}$, so $\|\partial_i f\|_{4/3}^2 = \mathrm{Inf}_i(f)^{3/2}$).
Therefore,
\[ \sum_{\substack{|S| \le d \\ S \not\subseteq J}}\hat f(S)^2 \le \sigma^2c^{2d}\,\tau^{1/2}\sum_{i=1}^n\mathrm{Inf}_i(f) \le \sigma^2c^{2d}\tau^{1/2}K. \]
Set $\tau = \big(\frac{\epsilon}{2\sigma^2c^{2d}K}\big)^2$.
Then we get the bound. □
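A key identity behind the choice of $d$ in Friedgut-type arguments is $\sum_S |S|\,\hat f(S)^2 = \sigma^2\sum_i \mathrm{Inf}_i(f)$; a Python sketch checks it numerically (majority on 3 bits as a hypothetical test function):

```python
# Check sum_S |S| fhat(S)^2 = sigma^2 * sum_i Inf_i(f).
from itertools import product
from math import sqrt, prod

n, p = 3, 0.3
q = 1 - p
mu, sigma = 1 - 2 * p, 2 * sqrt(p * q)
points = list(product((-1, 1), repeat=n))

def weight(x):
    return prod(p if xi == -1 else q for xi in x)

def chi(S, x):
    return prod((x[i] - mu) / sigma for i in S)

def f(x):  # majority on 3 bits
    return 1 if sum(x) > 0 else -1

def d(i, x):
    up = list(x); up[i] = 1
    dn = list(x); dn[i] = -1
    return (f(tuple(up)) - f(tuple(dn))) / 2

subsets = [tuple(i for i in range(n) if b >> i & 1) for b in range(2 ** n)]
fhat = {S: sum(weight(x) * f(x) * chi(S, x) for x in points) for S in subsets}

lhs = sum(len(S) * fhat[S] ** 2 for S in subsets)
rhs = sigma ** 2 * sum(sum(weight(x) * d(i, x) ** 2 for x in points)
                       for i in range(n))
assert abs(lhs - rhs) < 1e-9
print("total influence identity ok")
```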
Remark.
As in the unbiased case, we always have that $\mathrm{Var}(f) \le \sigma^2\,\mathrm{Inf}(f)$,
i.e. the hypothesis $\mathrm{Inf}(f) \le K$ is nontrivial only
if $K \ge \mathrm{Var}(f)/\sigma^2$.
.