Blow-up lemma

The blow-up lemma, proved by János Komlós, Gábor N. Sárközy, and Endre Szemerédi in 1997,[1][2] is an important result in extremal graph theory, particularly within the context of the regularity method. It states that the regular pairs in the statement of Szemerédi's regularity lemma behave like complete bipartite graphs in the context of embedding spanning graphs of bounded degree.

Definitions and Statement

To formally state the blow-up lemma, we first need to define the notion of a super-regular pair.

Super-regular pairs

A pair [math]\displaystyle{ (A,B) }[/math] of disjoint subsets of the vertex set of a graph is called [math]\displaystyle{ (\varepsilon, \delta) }[/math]-super-regular if for every [math]\displaystyle{ X \subset A }[/math] and [math]\displaystyle{ Y \subset B }[/math] satisfying

[math]\displaystyle{ |X| \gt \varepsilon |A| }[/math] and [math]\displaystyle{ |Y| \gt \varepsilon |B| }[/math]

we have

[math]\displaystyle{ e(X,Y) \gt \delta |X| |Y| }[/math]

and furthermore,

[math]\displaystyle{ \deg(a) \gt \delta |B| }[/math] for all [math]\displaystyle{ a \in A }[/math] and [math]\displaystyle{ \deg(b) \gt \delta |A| }[/math] for all [math]\displaystyle{ b \in B }[/math].[1]

Here [math]\displaystyle{ e(X, Y) }[/math] denotes the number of pairs [math]\displaystyle{ (x,y) }[/math] with [math]\displaystyle{ x \in X }[/math] and [math]\displaystyle{ y \in Y }[/math] such that [math]\displaystyle{ \{x,y\} }[/math] is an edge.
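
For very small examples the definition can be checked directly. The following brute-force checker is our own illustrative helper, not part of the lemma's machinery; it is exponential in the number of vertices, so it is only usable on tiny instances:

```python
import itertools

def is_super_regular(A, B, edges, eps, delta):
    """Brute-force check of the (eps, delta)-super-regularity conditions
    for a bipartite pair (A, B) with the given edge list.  Exponential in
    |A| + |B|, so usable only on tiny examples."""
    adj = {v: set() for v in A | B}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    # Minimum-degree conditions: every vertex sees more than a
    # delta-fraction of the opposite side.
    if any(len(adj[a] & B) <= delta * len(B) for a in A):
        return False
    if any(len(adj[b] & A) <= delta * len(A) for b in B):
        return False

    def subsets(S):
        S = list(S)
        for r in range(len(S) + 1):
            yield from itertools.combinations(S, r)

    # Density condition: every pair of large subsets spans many edges.
    for X in subsets(A):
        if len(X) <= eps * len(A):
            continue
        for Y in subsets(B):
            if len(Y) <= eps * len(B):
                continue
            e_XY = sum(1 for x in X for y in Y if y in adj[x])
            if e_XY <= delta * len(X) * len(Y):
                return False
    return True
```

For instance, a complete bipartite pair satisfies the conditions for any [math]\displaystyle{ \varepsilon, \delta \lt 1 }[/math], while an empty pair already fails the degree conditions.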

Statement of the Blow-up Lemma

Given a graph [math]\displaystyle{ R }[/math] of order [math]\displaystyle{ r }[/math] and positive parameters [math]\displaystyle{ \delta, \Delta }[/math], there exists a positive [math]\displaystyle{ \varepsilon = \varepsilon(\delta, \Delta, r) }[/math] such that the following holds. Let [math]\displaystyle{ n_1, n_2,\dots,n_r }[/math] be arbitrary positive integers and let us replace the vertices [math]\displaystyle{ v_1, v_2, \dots,v_r }[/math] of [math]\displaystyle{ R }[/math] with pairwise disjoint sets [math]\displaystyle{ V_1, V_2, \dots, V_r }[/math] of sizes [math]\displaystyle{ n_1, n_2, \dots, n_r }[/math] (blowing up). We construct two graphs on the same vertex set [math]\displaystyle{ V = \bigcup V_i }[/math]. The first graph [math]\displaystyle{ \mathbf R }[/math] is obtained by replacing each edge [math]\displaystyle{ \{v_i, v_j\} }[/math] of [math]\displaystyle{ R }[/math] with the complete bipartite graph between the corresponding vertex sets [math]\displaystyle{ V_i }[/math] and [math]\displaystyle{ V_j }[/math]. A sparser graph [math]\displaystyle{ G }[/math] is constructed by replacing each edge [math]\displaystyle{ \{v_i, v_j\} }[/math] with an [math]\displaystyle{ (\varepsilon, \delta) }[/math]-super-regular pair between [math]\displaystyle{ V_i }[/math] and [math]\displaystyle{ V_j }[/math]. If a graph [math]\displaystyle{ H }[/math] with [math]\displaystyle{ \Delta(H) \le \Delta }[/math] is embeddable into [math]\displaystyle{ \mathbf R }[/math] then it is already embeddable into [math]\displaystyle{ G }[/math].[1]
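
The complete blow-up [math]\displaystyle{ \mathbf R }[/math] is easy to construct explicitly. The following sketch builds the clusters [math]\displaystyle{ V_i }[/math] and the complete bipartite graphs between them (the representation as edge lists over integer-labelled vertices is our own choice):

```python
def blow_up(R_edges, sizes):
    """Build the complete blow-up of a graph R: vertex i of R becomes a
    cluster V_i of sizes[i] fresh integer-labelled vertices, and each edge
    (i, j) of R becomes the complete bipartite graph between V_i and V_j.
    (An illustrative helper, not code from the paper.)"""
    clusters, next_label = [], 0
    for n in sizes:
        clusters.append(list(range(next_label, next_label + n)))
        next_label += n
    edges = [(u, v)
             for i, j in R_edges
             for u in clusters[i]
             for v in clusters[j]]
    return clusters, edges
```

Blowing up a single edge with cluster sizes [math]\displaystyle{ n_1 = 2, n_2 = 3 }[/math], for example, yields the complete bipartite graph with [math]\displaystyle{ 2 \cdot 3 = 6 }[/math] edges.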

Proof Sketch

The proof of the blow-up lemma is based on using a randomized greedy algorithm (RGA) to embed the vertices of [math]\displaystyle{ H }[/math] into [math]\displaystyle{ G }[/math] one at a time. The argument then proceeds by bounding the failure probability of the algorithm and showing that it is less than 1 (and in fact [math]\displaystyle{ o(1) }[/math]) for an appropriate choice of parameters. Since the algorithm succeeds with non-zero probability, an embedding must exist.

Attempting to directly embed all the vertices of [math]\displaystyle{ H }[/math] in this manner does not work, because the algorithm may get stuck when only a small number of vertices remain. Instead, we set aside a small fraction of the vertices of [math]\displaystyle{ H }[/math], called buffer vertices, and embed the rest of the vertices first. The buffer vertices are subsequently embedded by using Hall's marriage theorem to find a perfect matching between them and the remaining unoccupied vertices of [math]\displaystyle{ G }[/math].

Notation

We borrow all notation introduced in previous sections. Let [math]\displaystyle{ n = |V(G)| = \sum n_i }[/math]. Since [math]\displaystyle{ H }[/math] can be embedded into [math]\displaystyle{ \mathbf R }[/math], we can write [math]\displaystyle{ V(H) = X = \bigcup_{i \le r} X_i }[/math] with [math]\displaystyle{ |X_i| = |V_i| }[/math] for all [math]\displaystyle{ i }[/math]. For a vertex [math]\displaystyle{ x \in X_i }[/math], let [math]\displaystyle{ V_x }[/math] denote [math]\displaystyle{ V_i }[/math]. For [math]\displaystyle{ x \in X_i, y \in X_j }[/math],

[math]\displaystyle{ d_{xy} = \frac{e(V_i,V_j)}{|V_i| |V_j|} }[/math]

denotes the density of edges between the corresponding vertex sets of [math]\displaystyle{ G }[/math]. [math]\displaystyle{ \phi:V(H) \to V(G) }[/math] is the embedding that we wish to construct. [math]\displaystyle{ T }[/math] is the final time after which the algorithm concludes.

Outline of the algorithm

Phase 0: Initialization

  1. Greedily choose the set of buffer vertices [math]\displaystyle{ B }[/math] from the vertices of [math]\displaystyle{ H }[/math] as a maximal set of vertices that are pairwise at distance at least [math]\displaystyle{ 4 }[/math] from each other.
  2. Order the remaining vertices (those in [math]\displaystyle{ H \setminus B }[/math]) in a list [math]\displaystyle{ L }[/math], placing the neighbors of [math]\displaystyle{ B }[/math] first.
  3. Declare a queue [math]\displaystyle{ q }[/math] of presently prioritized vertices, which is initially empty.
  4. Declare an array of sets [math]\displaystyle{ F_z }[/math] indexed by the vertices of [math]\displaystyle{ H }[/math], representing the set of all "free spots" of [math]\displaystyle{ z }[/math], that is, the set of unoccupied vertices in [math]\displaystyle{ G }[/math] the vertex [math]\displaystyle{ z }[/math] could be mapped to without violating any of the adjacency conditions from the already-embedded neighbors of [math]\displaystyle{ z }[/math] in [math]\displaystyle{ H }[/math]. [math]\displaystyle{ F_z }[/math] is initialized to [math]\displaystyle{ V_z }[/math].
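
The buffer-selection step above can be sketched as a plain greedy procedure over BFS balls. This is our own simplified rendering; the actual proof imposes further conditions on [math]\displaystyle{ B }[/math] that we omit:

```python
from collections import deque

def choose_buffer(H_adj):
    """Greedily pick a maximal set of vertices of H that are pairwise at
    distance at least 4, as in step 1 of Phase 0.  H_adj maps each vertex
    to its list of neighbours.  (A plain greedy sketch only.)"""
    def ball(v, radius):
        # All vertices within the given BFS distance of v, inclusive.
        dist, dq = {v: 0}, deque([v])
        while dq:
            u = dq.popleft()
            if dist[u] == radius:
                continue
            for w in H_adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    dq.append(w)
        return set(dist)

    buffer_set, blocked = [], set()
    for v in H_adj:                    # any fixed order works for a sketch
        if v not in blocked:
            buffer_set.append(v)
            blocked |= ball(v, 3)      # exclude everything at distance <= 3
    return buffer_set
```

On a path with eight vertices, for instance, the greedy scan picks the first endpoint and then the next vertex at distance 4 from it.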

Phase 1: Randomized Greedy Embedding

  1. Choose a vertex [math]\displaystyle{ x }[/math] from the set of remaining vertices as follows:
    1. If the queue [math]\displaystyle{ q }[/math] of prioritized vertices is non-empty, then choose the vertex from [math]\displaystyle{ q }[/math]
    2. Otherwise, choose a vertex from the list [math]\displaystyle{ L }[/math] of remaining vertices
  2. Choose the image [math]\displaystyle{ \phi(x) }[/math] in [math]\displaystyle{ G }[/math] for the vertex [math]\displaystyle{ x }[/math] randomly from the set of "good" choices, where a choice is good if and only if none of the resulting free sets [math]\displaystyle{ F_z(t) }[/math] would differ too much in size from their expected value.
  3. Update the free sets [math]\displaystyle{ F_z(t) }[/math], and add to the queue [math]\displaystyle{ q }[/math] of prioritized vertices any vertex whose free set has shrunk too much relative to its size at the last update.
  4. Abort if the queue [math]\displaystyle{ q }[/math] contains a sufficiently large fraction of any of the sets [math]\displaystyle{ X_i }[/math].
  5. If there are non-buffer vertices left to be embedded in either [math]\displaystyle{ L }[/math] or [math]\displaystyle{ q }[/math], update time [math]\displaystyle{ t }[/math] and go back to step 1; otherwise move on to phase 2.
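
Stripped of the queue, the goodness filter, and the abort condition, the core of Phase 1 is the following loop. This is a deliberately bare-bones sketch of our own, so unlike the genuine RGA it can get stuck:

```python
import random

def greedy_embed(H_adj, cluster_of, G_adj, seed=0):
    """Bare-bones randomized greedy embedding in the spirit of Phase 1:
    embed the vertices of H one at a time, drawing each image at random
    from the current free set.  The queue, the goodness filter, and the
    abort condition of the real algorithm are omitted, so this sketch can
    get stuck where the genuine RGA would not (it then returns None)."""
    rng = random.Random(seed)
    phi, used = {}, set()
    # free[x]: vertices of G still compatible with x's embedded neighbours.
    free = {x: set(cluster_of[x]) for x in H_adj}
    for x in H_adj:                       # embed in a fixed order
        candidates = sorted(free[x] - used)
        if not candidates:
            return None                   # the sketch got stuck
        phi[x] = rng.choice(candidates)
        used.add(phi[x])
        for z in H_adj[x]:                # shrink free sets of H-neighbours
            if z not in phi:
                free[z] &= G_adj[phi[x]]
    return phi
```

Embedding a 4-cycle into the complete bipartite graph [math]\displaystyle{ K_{2,2} }[/math], say, succeeds for every sequence of random choices, since every free set remains large enough at each step.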

Phase 2: Kőnig-Hall matching for remaining vertices

Consider the set of vertices left to be embedded, which is precisely [math]\displaystyle{ B }[/math], and the set of free spots [math]\displaystyle{ \bigcup_{b \in B} F_b(T) }[/math]. Form a bipartite graph between these two sets, joining each [math]\displaystyle{ b \in B }[/math] to [math]\displaystyle{ F_b(T) }[/math], and find a perfect matching in this bipartite graph. Embed according to this matching.
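
The matching step can be carried out with any bipartite matching routine. The following augmenting-path implementation (Kuhn's algorithm, a generic routine rather than anything specific to the paper) returns an assignment of the buffer vertices or reports that Hall's condition fails:

```python
def hall_matching(free_spots):
    """Match each leftover buffer vertex b to a distinct vertex of G from
    its free set free_spots[b], using augmenting paths (Kuhn's algorithm).
    Returns a dict b -> image, or None when Hall's condition fails.
    (A generic matching routine, not the paper's specific argument.)"""
    owner = {}            # image vertex -> buffer vertex currently using it

    def try_assign(b, seen):
        for v in free_spots[b]:
            if v in seen:
                continue
            seen.add(v)
            # Use v if it is free, or if its owner can be re-routed.
            if v not in owner or try_assign(owner[v], seen):
                owner[v] = b
                return True
        return False

    for b in free_spots:
        if not try_assign(b, set()):
            return None   # some set of buffer vertices has too few free spots
    return {b: v for v, b in owner.items()}
```

When two buffer vertices share a single free spot, the routine correctly reports a violation of Hall's condition by returning None.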

Proof of correctness

The proof of correctness is technical and quite involved, so we omit the details. The core argument proceeds as follows:

Step 1: most vertices are good, and enough vertices are free

Prove simultaneously by induction on [math]\displaystyle{ t }[/math] that if [math]\displaystyle{ x }[/math] is the vertex embedded at time [math]\displaystyle{ t }[/math], then

  1. only a small fraction of the choices in [math]\displaystyle{ F_x(t) }[/math] are bad
  2. all of the free sets [math]\displaystyle{ F_z(t+1) }[/math] are fairly large for unembedded vertices [math]\displaystyle{ z }[/math]

Step 2: the "main lemma"

Consider [math]\displaystyle{ 1 \le i \le r, Y \subseteq X_i }[/math], and [math]\displaystyle{ A \subseteq V_i }[/math] such that [math]\displaystyle{ |A| }[/math] is not too small. Consider the event [math]\displaystyle{ E_{A,Y} }[/math] where

  1. no vertices are embedded in [math]\displaystyle{ A }[/math] during the first phase
  2. for every [math]\displaystyle{ y \in Y }[/math] there is a time [math]\displaystyle{ t_y }[/math] such that the fraction of free vertices of [math]\displaystyle{ y }[/math] in [math]\displaystyle{ A }[/math] at time [math]\displaystyle{ t_y }[/math] was small.

Then, we prove that the probability of [math]\displaystyle{ E_{A,Y} }[/math] happening is low.

Step 3: phase 1 succeeds with high probability

The only way that the first phase can fail is if it aborts, since by the first step there are always sufficiently many good choices for the next embedding. The algorithm aborts only when the queue grows too long. The argument then proceeds by union-bounding over all modes of failure, noting that for any particular choice of [math]\displaystyle{ 1 \le i \le r }[/math], [math]\displaystyle{ Y \subseteq X_i, |Y| \ge \delta_Q |X_i| }[/math] and [math]\displaystyle{ A = V_i }[/math], with [math]\displaystyle{ Y }[/math] representing a subset of the queue that caused the failure, the triple [math]\displaystyle{ (i,Y,A) }[/math] satisfies the conditions of the "main lemma", and thus has a low probability of occurring.

Step 4: no queue in initial phase

Recall that the list was set up so that neighbors of vertices in the buffer get embedded first. The time until all of these vertices get embedded is called the initial phase. Prove by induction on [math]\displaystyle{ t }[/math] that no vertices get added to the queue during the initial phase. It follows that all of the neighbors of the buffer vertices get embedded before the rest of the vertices.

Step 5: buffer vertices have enough free spots

For any [math]\displaystyle{ x \in B }[/math] and [math]\displaystyle{ v \in V_x }[/math], we can find a sufficiently large lower bound on the probability that [math]\displaystyle{ \phi(N_H(x)) \subseteq N_G(v) }[/math], conditional on the assumption that [math]\displaystyle{ v }[/math] was free before any of the vertices in [math]\displaystyle{ N_H(x) }[/math] were embedded.

Step 6: phase 2 succeeds with high probability

By Hall's marriage theorem, phase 2 fails if and only if Hall's condition is violated. For this to happen, there must be some [math]\displaystyle{ 1 \le i \le r }[/math] and [math]\displaystyle{ S \subseteq X_i }[/math] such that [math]\displaystyle{ |\bigcup_{z \in S} F_z(T)| \lt |S| }[/math]. [math]\displaystyle{ |S| }[/math] cannot be too small by largeness of free sets (step 1). If [math]\displaystyle{ |S| }[/math] is too large, then with high probability [math]\displaystyle{ \bigcup_{z \in S} F_z(T) = V_i(T) }[/math], so the probability of failure in such a case would be low. If [math]\displaystyle{ |S| }[/math] is neither too small nor too large, then noting that [math]\displaystyle{ A := V_i(T) \setminus \bigcup_{z \in S} F_z(T) }[/math] is a large set of unused vertices, we can use the main lemma and union-bound the failure probability.[1][2][3]

Applications

The blow-up lemma has a number of applications to embedding spanning subgraphs of bounded degree into dense graphs.

Pósa-Seymour Conjecture

In 1962, Lajos Pósa conjectured that every [math]\displaystyle{ n }[/math]-vertex graph with minimum degree at least [math]\displaystyle{ \frac{2n}3 }[/math] contains the square of a Hamiltonian cycle,[4] generalizing Dirac's theorem. The conjecture was further extended by Paul Seymour in 1974 to the following:

Every graph on [math]\displaystyle{ n }[/math] vertices with minimum degree at least [math]\displaystyle{ \frac{kn}{k+1} }[/math] contains the [math]\displaystyle{ k }[/math]-th power of a Hamiltonian cycle.

The blow-up lemma was used by Komlós, Sárközy, and Szemerédi to prove the conjecture for all sufficiently large values of [math]\displaystyle{ n }[/math] (for a fixed [math]\displaystyle{ k }[/math]) in 1998.[5]

Alon-Yuster Conjecture

In 1995, Noga Alon and Raphael Yuster considered the generalization of the well-known Hajnal–Szemerédi theorem to arbitrary [math]\displaystyle{ H }[/math]-factors (instead of just complete graphs), and proved the following statement:

For every fixed graph [math]\displaystyle{ H }[/math] with [math]\displaystyle{ h }[/math] vertices, any graph [math]\displaystyle{ G }[/math] with [math]\displaystyle{ n }[/math] vertices and with minimum degree [math]\displaystyle{ d \ge \frac{\chi(H)-1}{\chi(H)}n }[/math] contains [math]\displaystyle{ (1-o(1))n/h }[/math] vertex-disjoint copies of [math]\displaystyle{ H }[/math].

They also conjectured that the result holds with only a constant (instead of linear) error:

For every integer [math]\displaystyle{ h }[/math] there exists a constant [math]\displaystyle{ c(h) }[/math] such that for every graph [math]\displaystyle{ H }[/math] with [math]\displaystyle{ h }[/math] vertices, any graph [math]\displaystyle{ G }[/math] with [math]\displaystyle{ n }[/math] vertices and with minimum degree [math]\displaystyle{ d \ge \frac{\chi(H)-1}{\chi(H)}n }[/math] contains at least [math]\displaystyle{ n/h-c(h) }[/math] vertex disjoint copies of [math]\displaystyle{ H }[/math].[6]

This conjecture was proven by Komlós, Sárközy, and Szemerédi in 2001 using the blow-up lemma.[7]

History and Variants

The blow-up lemma, first published in 1997 by Komlós, Sárközy, and Szemerédi,[1] emerged as a refinement of existing proof techniques using the regularity method to embed spanning graphs, as in the proof of the Bollobás conjecture on spanning trees,[8] work on the Pósa-Seymour conjecture about the minimum degree necessary to contain the k-th graph power of a Hamiltonian cycle,[9][4] and the proof of the Alon-Yuster conjecture on the minimum degree needed for a graph to have a perfect H-factor.[7] The proofs of all of these theorems relied on using a randomized greedy algorithm to embed the majority of vertices, and then using a Kőnig–Hall-type argument to find an embedding for the remaining vertices.[1] The first proof of the blow-up lemma also used a similar argument. Later in 1997, however, the same authors published another paper that improved the randomized algorithm, making it deterministic.[2]

Peter Keevash found a generalization of the blow-up lemma to hypergraphs in 2010.[3]

Stefan Glock and Felix Joos discovered a variant of the blow-up lemma for rainbow graphs in 2018.[10]

In 2019, Peter Allen, Julia Böttcher, Hiep Hàn, Yoshiharu Kohayakawa, and Yury Person found sparse analogues of the blow-up lemma for embedding bounded-degree graphs into random and pseudorandom graphs.[11]

References

  1. 1.0 1.1 1.2 1.3 1.4 1.5 Komlós, János; Sárközy, Gábor N.; Szemerédi, Endre (1997), "Blow-up lemma", Combinatorica 17 (1): 109–123, doi:10.1007/BF01196135 
  2. 2.0 2.1 2.2 Komlós, János; Sárközy, Gábor N.; Szemerédi, Endre (1998), "An algorithmic version of the blow-up lemma", Random Structures & Algorithms 12 (3): 297–312, doi:10.1002/(SICI)1098-2418(199805)12:3<297::AID-RSA5>3.3.CO;2-W 
  3. 3.0 3.1 Keevash, Peter (2011-05-10). "A hypergraph blow-up lemma". Random Structures & Algorithms 39 (3): 275–367. doi:10.1002/rsa.20362. 
  4. 4.0 4.1 Komlós, János; Sárközy, Gábor N.; Szemerédi, Endre (1996). "On the square of a Hamiltonian cycle in dense graphs" (in en). Random Structures & Algorithms 9 (1–2): 193–211. doi:10.1002/(SICI)1098-2418(199608/09)9:1/2<193::AID-RSA12>3.0.CO;2-P. ISSN 1098-2418. https://onlinelibrary.wiley.com/doi/abs/10.1002/%28SICI%291098-2418%28199608/09%299%3A1/2%3C193%3A%3AAID-RSA12%3E3.0.CO%3B2-P. 
  5. Komlós, János; Sárközy, Gábor N.; Szemerédi, Endre (1998-03-01). "Proof of the Seymour conjecture for large graphs" (in en). Annals of Combinatorics 2 (1): 43–60. doi:10.1007/BF01626028. ISSN 0219-3094. https://doi.org/10.1007/BF01626028. 
  6. Alon, Noga; Yuster, Raphael (1996-03-01). "H-Factors in Dense Graphs" (in en). Journal of Combinatorial Theory, Series B 66 (2): 269–282. doi:10.1006/jctb.1996.0020. ISSN 0095-8956. 
  7. 7.0 7.1 Komlós, János; Sárközy, Gábor; Szemerédi, Endre (2001-05-28). "Proof of the Alon–Yuster conjecture" (in en). Discrete Mathematics 235 (1): 255–269. doi:10.1016/S0012-365X(00)00279-X. ISSN 0012-365X. 
  8. Komlós, János; Sárközy, Gábor N.; Szemerédi, Endre (1995). "Proof of a Packing Conjecture of Bollobás" (in en). Combinatorics, Probability and Computing 4 (3): 241–255. doi:10.1017/S0963548300001620. ISSN 1469-2163. https://www.cambridge.org/core/journals/combinatorics-probability-and-computing/article/abs/proof-of-a-packing-conjecture-of-bollobas/9168DE00E6AAFD68459328BA6C23C137. 
  9. Komlós, János; Sárközy, Gábor N.; Szemerédi, Endre (1998). "On the Pósa-Seymour conjecture" (in en). Journal of Graph Theory 29 (3): 167–176. doi:10.1002/(SICI)1097-0118(199811)29:3<167::AID-JGT4>3.0.CO;2-O. ISSN 1097-0118. https://onlinelibrary.wiley.com/doi/abs/10.1002/%28SICI%291097-0118%28199811%2929%3A3%3C167%3A%3AAID-JGT4%3E3.0.CO%3B2-O. 
  10. Glock, Stefan; Joos, Felix (2020-02-20). "A rainbow blow-up lemma". Random Structures & Algorithms 56 (4): 1031–1069. doi:10.1002/rsa.20907. 
  11. Allen, Peter; Böttcher, Julia; Hàn, Hiep; Kohayakawa, Yoshiharu; Person, Yury (2019-03-19). "Blow-up lemmas for sparse graphs". arXiv:1612.00622 [math.CO].