Resistive networks: V = IR.
Resistors in series: R = R_1 + R_2.
Resistors in parallel: 1/R = 1/R_1 + 1/R_2.
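These two combination rules can be checked with a tiny script (a minimal sketch; the helper names `series` and `parallel` are mine):

```python
def series(*rs):
    # Resistors in series: resistances add.
    return sum(rs)

def parallel(*rs):
    # Resistors in parallel: conductances (1/R) add.
    return 1.0 / sum(1.0 / r for r in rs)

# Two unit resistors: 2 ohms in series, 1/2 ohm in parallel.
print(series(1, 1))    # 2
print(parallel(1, 1))  # 0.5
```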
THEOREM: Take a graph G and think of each edge as a unit resistor. Let
R_uv = the effective resistance between u and v. Then:

    C_uv = 2m R_uv

(C_uv = commute time u --> v --> u.)

I.e., say we set the voltage at v to 0, and the voltage at u to V_u such
that the current flowing is 2m. Then, V_u = C_uv.
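A quick numerical sanity check of the theorem, using the standard identity that R_uv can be read off the Moore-Penrose pseudoinverse of the graph Laplacian (a sketch on a toy path graph; the variable names are mine):

```python
import numpy as np

# Toy graph: path 0 - 1 - 2, each edge a unit resistor, so m = 2.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A     # graph Laplacian
Lp = np.linalg.pinv(L)             # Moore-Penrose pseudoinverse

def effective_resistance(u, v):
    # Standard identity: R_uv = (e_u - e_v)^T L^+ (e_u - e_v).
    return Lp[u, u] + Lp[v, v] - 2 * Lp[u, v]

m = int(A.sum()) // 2
R = effective_resistance(0, 2)
print(R)          # approx 2.0: two unit resistors in series
print(2 * m * R)  # approx 8.0 = C_uv; indeed H_02 = H_20 = 4 on this path
```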
A little bit more about V = IR: consider a "Y" network. If we put in a
single 2-volt battery, we will get 1 amp of current. If we hook up two
2-volt batteries, we will get 2/3, 2/3, and 4/3 amps of current along
the 3 edges. So, we can't measure "effective resistance" based on the
current from a single battery if there are other batteries floating
around.
Also, say we have two copies of graph G. In one, we have voltages
V1, ..., Vn, and in the other we have voltages V1', ..., Vn'. Say we
add the two together to get voltages V1+V1', ..., Vn+Vn'. Then the
currents add too (superposition).
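Superposition is easy to verify numerically (a toy example of mine, a triangle of unit resistors, grounding node 0 and solving the Laplacian system for each current-injection pattern):

```python
import numpy as np

# Triangle graph with unit resistors.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

def voltages(inj):
    # Ground node 0 (V_0 = 0) and solve L V = inj for the other nodes.
    V = np.zeros(3)
    V[1:] = np.linalg.solve(L[1:, 1:], inj[1:])
    return V

I1 = np.array([-1.0, 1.0, 0.0])   # 1 amp in at node 1, out at node 0
I2 = np.array([-1.0, 0.0, 1.0])   # 1 amp in at node 2, out at node 0
V_sum = voltages(I1) + voltages(I2)
V_both = voltages(I1 + I2)
print(np.allclose(V_sum, V_both))  # True: voltages, hence edge currents, add
```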
Idea of proof: we will take our "2m current battery" and split it into
two experiments that are easier to analyze separately.

Experiment #1: inject d(x) units of current into each node x not equal
to v, and remove 2m - d(v) units from v. [Note: is this legal?
Batteries set a specific voltage, not a specific current. But, we can
imagine slowly increasing the voltage at each x until we get the
desired current. This is OK because of negative feedback: increasing
the voltage at x only decreases the current coming from y.]
Theorem 1: H_xv = V_x - V_v. (The voltage at x equals the hitting time
from x to v.)

Experiment #2: inject 2m - d(u) units into u, and remove d(x) from each
x != u. Assuming Theorem 1, this is just Experiment #1 (with u in place
of v) negated, so H_xu = V_u - V_x.

Adding the two, we get H_uv + H_vu = V_u - V_v in Experiment 0, where
we have a single battery sending 2m amps of current from u to v.

So, all we need to do is prove Theorem 1.
Proof of Thm 1: Clearly true for x = v. Now let's consider x != v.
Let's define the voltage at v to be 0.

Let's look at the neighbors of x. The current out of x is d(x). Also,
for each neighbor w, the current on edge x->w equals (V_x - V_w)/1.
So:

    d(x) = SUM_{w in N(x)} [V_x - V_w]        (N(x) = neighbors of x)

We can rewrite this as:

    V_x = 1 + AVG_{w in N(x)} V_w
Now, let's look at H_xv. We can write H_xv in terms of what happens
after one step, like this:

    H_xv = 1 + AVG_{w in N(x)} H_wv

So, {H_xv} and {V_x} are solutions to the same set of equations. Now,
this set of equations has a single solution once we fix V_v = 0 (or
H_vv = 0). We can see this by noticing that if you increase the value
at one node, then either some neighbor must increase by even more, or
else all neighbors must increase by exactly this amount -- so any
change would have to propagate all the way to v, where the value is
pinned at 0.
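Theorem 1 can be checked numerically on a toy example of mine (a 4-cycle with v = node 0): solve Experiment #1 for the voltages, solve the one-step equations for the hitting times, and compare.

```python
import numpy as np

# 4-cycle; v = node 0. m = 4 edges, every degree is 2.
n = 4
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
deg = A.sum(axis=1)
L = np.diag(deg) - A
m = int(A.sum()) // 2

# Experiment #1: inject d(x) at each x != 0, remove 2m - d(0) from node 0.
inj = deg.copy()
inj[0] = -(2 * m - deg[0])
V = np.zeros(n)
V[1:] = np.linalg.solve(L[1:, 1:], inj[1:])   # ground V_0 = 0

# Hitting times to 0 via the system H_x = 1 + AVG_{w in N(x)} H_w, H_0 = 0.
M = np.eye(n - 1) - A[1:, 1:] / deg[1:, None]
H = np.linalg.solve(M, np.ones(n - 1))
print(V[1:])  # [3. 4. 3.]
print(H)      # [3. 4. 3.] -- matches, as Theorem 1 predicts
```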
==============================================================================
Markov Chains:
- defn. P_ij = prob of going to j given that you're in state i.
- write a prob distribution as a row vector q. Then, one step of the
  walk = qP.
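For example (a toy 3-state chain of my own, where each state moves to one of the other two uniformly):

```python
import numpy as np

# Row i of P is the next-state distribution from state i.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

q = np.array([1.0, 0.0, 0.0])  # start in state 0 with probability 1
q = q @ P                      # one step of the walk
print(q)                       # [0.  0.5 0.5]
```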
- if the underlying graph (include only edges with non-zero prob) is
  strongly connected, then the chain is "irreducible".
- for finite irreducible MCs, "aperiodic" -> for any starting
  distribution, after finitely many steps, we have non-zero prob on
  every state.
- stationary distribution: left eigenvector of P with eigenvalue 1.
  Note: this is the largest eigenvalue. All other eigenvectors have
  sum of entries equal to zero.
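A minimal numerical check (the graph, a 3-node path, is my own toy example): for a random walk on an undirected graph, the stationary distribution is pi_i = d(i)/2m, and it is the left eigenvector of P with eigenvalue 1.

```python
import numpy as np

# Random walk on the path 0 - 1 - 2.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

# Left eigenvector of P with eigenvalue 1 = right eigenvector of P^T.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmax(np.real(w))])
pi = pi / pi.sum()          # normalize to a probability distribution
print(pi)                   # [0.25 0.5  0.25] = degrees / 2m
print(np.allclose(pi @ P, pi))  # True: pi is stationary
```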
- rapid mixing -> want to approach the stationary distribution in only
  polylog(n) steps.
- a small second eigenvalue gives rapid mixing -> can see intuitively
  why: the components of q along the other eigenvectors shrink by a
  factor of (at most) the second eigenvalue at each step.
Expander graphs
---------------
Several roughly equivalent definitions (even in the book they use
several). Basic idea: a constant-degree graph where all small sets of
nodes expand when you look at their neighborhoods.
Here's one (from Chap 5) [OR-concentrator]:

Defn: An (N, d, alpha, c) expander is a bipartite graph on X union Y,
with |X| = |Y| = N, each node in X of degree d, such that for any
subset S of X with |S| < alpha*N, we have |N(S)| >= c|S|.
(Interesting for c > 1 and d = constant.)

(Nearly equivalent: a directed graph on N nodes, with out-degree d,
where for all S of size at most alpha*N, |N(S) - S| >= c*|S|, for
c > 0.)
Say we put in edges at random (for each node in X, pick a random set of
d nodes in Y to be its neighborhood). Claim: for d = 12, w.h.p., every
set S of < n/4 vertices in X expands by at least a factor of 2.
Fix a set S of size s. How many ways can S have a neighborhood of size
<= r? At most

    (n choose r) * (r choose d)^s,

since we first fix a set of size r, and then choose the neighborhood of
each node in S within that set. So, the prob that S has a neighborhood
of size at most r is

    <= (n choose r) * (r choose d)^s / (n choose d)^s.

So, Prob[there exists a set of s vertices with <= 2s neighbors] is at
most

    (n choose s) * (n choose 2s) * (2s choose d)^s / (n choose d)^s
    < (n choose 2s)^2 * (2s choose d)^s / (n choose d)^s   [since s < n/3]
    < const * (n choose 2s)^2 * (2s/n)^{ds}                [since d is constant]
    < (ne/(2s))^{4s} * (2s/n)^{ds}.

Now, plug in d = 12 and get:

    = (4es^2/n^2)^{4s}

-> if s is large, it is still at most n/4, so the base is at most e/4,
   and (e/4)^{4s} is tiny.
-> if s is small, then this is still at most O(1/n^8).
So, random graphs are expanders.
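This is easy to play with experimentally (a sketch of mine: exhaustively checking all sets of size < n/4 is exponential, so we only spot-check random sets of each size -- a sanity check, not a proof):

```python
import random

random.seed(1)
n, d = 40, 12

# Random bipartite graph: each node in X gets d random distinct
# neighbors in Y (both sides indexed 0..n-1).
nbrs = [set(random.sample(range(n), d)) for _ in range(n)]

def expands(S):
    # Does S expand by at least a factor of 2?
    N_S = set().union(*(nbrs[x] for x in S))
    return len(N_S) >= 2 * len(S)

ok = all(expands(random.sample(range(n), s))
         for s in range(1, n // 4)
         for _ in range(200))
print(ok)  # True, with high probability, per the claim
```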
Note: determining if a graph is an expander, though, is
co-NP-hard.