Balls into bins via local search: cover time and maximum load

We study a natural process for allocating m balls into n bins that are organized as the vertices of an undirected graph G. Balls arrive one at a time. When a ball arrives, it first chooses a vertex u of G uniformly at random. Then the ball performs a local search in G starting from u until it reaches a vertex with locally minimum load, where the ball is finally placed. Then the next ball arrives and this procedure is repeated. For the case m = n, we give an upper bound for the maximum load on graphs with bounded degrees. We also propose the study of the cover time of this process, defined as the smallest m so that every bin has at least one ball allocated to it. We establish an upper bound for the cover time on graphs with bounded degrees. Our bounds for the maximum load and the cover time are tight when the graph is vertex-transitive or sufficiently homogeneous. We also give upper bounds for the maximum load when m > n.


INTRODUCTION
A very simple procedure for allocating m balls into n bins is to place each ball into a bin chosen independently and uniformly at random. We refer to this process as the 1-choice process. It is well known that, when m = n, the maximum load for the 1-choice process (i.e., the maximum number of balls allocated to any single bin) is Θ(log n / log log n) with high probability [10]. Alternatively, in the d-choice process, balls arrive sequentially one after the other, and when a ball arrives, it chooses d bins independently and uniformly at random, and places itself in the bin that currently has the smallest load among the d bins (ties are broken uniformly at random). It was shown by Azar, Broder, Karlin, and Upfal [3] and Karp, Luby, and Meyer auf der Heide [7] that the maximum load for the d-choice process with m = n and d ≥ 2 is Θ(log log n / log d). The constants omitted in the Θ-notation are known and, as shown by Vöcking [11], they can be reduced with a slight modification of the d-choice process. Berenbrink, Czumaj, Steger, and Vöcking [4] extended these results to the case m ≫ n.

In some applications, it is important to allow each ball to choose bins in a correlated way. For example, such correlations occur naturally in distributed systems, where the bins represent processors that are interconnected as a graph and the balls represent tasks that need to be assigned to processors. From a practical point of view, letting each task choose d independent random bins may be undesirable, since the cost of accessing two bins that are far apart in the graph may be higher than the cost of accessing two nearby bins. Furthermore, in some contexts, tasks are actually created by the processors, which are then able to forward tasks to other processors to achieve a more balanced load distribution. In such settings, allocating balls close to the processor that created them is certainly desirable, as it reduces the costs of probing the load of a processor and allocating the task.
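Both classical processes are easy to simulate. The sketch below (our own code; function names are not from the paper) contrasts the maximum loads of the 1-choice and 2-choice processes with m = n; for simplicity, ties in the d-choice process are broken by the order of the probes rather than uniformly at random.

```python
import random

def one_choice(m, n, rng):
    """1-choice process: each ball goes to a uniformly random bin."""
    loads = [0] * n
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return loads

def d_choice(m, n, d, rng):
    """d-choice process: each ball probes d uniform bins and joins the
    least loaded one (ties broken by probe order, a simplification)."""
    loads = [0] * n
    for _ in range(m):
        best = min((rng.randrange(n) for _ in range(d)), key=lambda b: loads[b])
        loads[best] += 1
    return loads

rng = random.Random(0)
n = 10_000
print(max(one_choice(n, n, rng)))   # grows like log n / log log n
print(max(d_choice(n, n, 2, rng)))  # grows like log log n / log 2, much smaller
```

Already at n = 10,000 the gap between the two maximum loads is clearly visible.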
With this motivation in mind, Bogdan, Sauerwald, Stauffer, and Sun [5] introduced a natural allocation process called local search allocation. Consider that the bins are organized as the vertices of a graph G = (V , E) with n = |V |. At each step a ball is "born" at a vertex chosen independently and uniformly at random from V , which we call the birthplace of the ball. Then, starting from its birthplace, the ball performs a local search in G, where in each step the ball moves to the adjacent vertex with the smallest load, provided that the load is strictly smaller than the load of the vertex the ball is currently in. We assume that ties are broken independently and uniformly at random. The local search ends when the ball visits the first vertex that is a local minimum, which is a vertex for which no neighbor has a smaller load. After that, the next ball is born and the procedure above is repeated. See Fig. 1 for an illustration.
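The local search allocation just described can be sketched directly from the definition (our own code; the adjacency-list representation and names are not from the paper):

```python
import random

def local_search_allocation(G, m, rng):
    """Allocate m balls into the vertices of G via local search.

    G: adjacency list, G[v] = list of neighbours of v.
    Each ball is born at a uniform vertex, then repeatedly moves to the
    neighbour of smallest load, provided that load is strictly smaller
    than its current vertex's load (ties broken uniformly at random),
    and is placed at the first local minimum it reaches.
    """
    n = len(G)
    loads = [0] * n
    for _ in range(m):
        v = rng.randrange(n)  # birthplace of the ball
        while True:
            down = [u for u in G[v] if loads[u] < loads[v]]
            if not down:
                break  # v is a local minimum
            best = min(loads[u] for u in down)
            v = rng.choice([u for u in down if loads[u] == best])
        loads[v] += 1
    return loads

# Example: n balls on a cycle C_n
n = 1000
G = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
loads = local_search_allocation(G, n, random.Random(1))
print(max(loads))
```

Note that the resulting load vector is automatically "smooth": adjacent vertices differ in load by at most 1 (this is Lemma 2.2 below).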
The main result in [5] establishes that when G is an expander graph with bounded maximum degree, the maximum load after n balls have been allocated is Θ(log log n) with high probability. Hence, local search allocation on bounded-degree expanders achieves the same maximum load (up to constants) as the d-choice process, but has the extra benefit of requiring only local information during the allocation. In [5], it was also established that the maximum load on d-dimensional grids is Θ((log n / log log n)^{1/(d+1)}).

Results
In this paper we derive upper and lower bounds for the maximum load that hold for all bounded-degree graphs. In addition, we propose the study of another natural quantity, which we refer to as the cover time. In order to state our results, we need to introduce the following two quantities, which are related to the local neighborhood growth of G:

R_1 := min{ r ≥ 1 : r · |B_u^r| ≥ log n / log log n for all u ∈ V }  and  R_2 := min{ r ≥ 1 : r · |B_u^r| ≥ log n for all u ∈ V },

where B_u^r denotes the set of vertices within distance at most r from vertex u. Note that R_1 ≤ R_2 for all G. For bounded-degree expander graphs, we have |B_u^r| = e^{Θ(r)}, which implies that R_1 and R_2 are both of order log log n, whereas for d-dimensional grids we have |B_u^r| = Θ(r^d), which gives R_1 = Θ((log n / log log n)^{1/(d+1)}) and R_2 = Θ((log n)^{1/(d+1)}). For the sake of clarity, we state our results here for vertex-transitive graphs only. In later sections we state our results in full generality, which requires a more refined definition of R_1 and R_2. We also highlight that for all the results below (and throughout this paper) we assume that ties are broken independently and uniformly at random; the impact of tie-breaking procedures in local search allocation was investigated in [5, Theorem 1.5].
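For a concrete graph, the quantities R_1 and R_2 can be computed by breadth-first search. The sketch below (our own code) does this under our reading of the definitions above, namely R_i is the smallest r with r · |B_u^r| above the respective threshold for every vertex u:

```python
import math
from collections import deque

def ball_sizes(G, rmax):
    """|B_u^r| for every vertex u and every r = 0..rmax, via truncated BFS.
    G is an adjacency list."""
    n = len(G)
    sizes = []
    for u in range(n):
        dist = [-1] * n
        dist[u] = 0
        q = deque([u])
        while q:
            v = q.popleft()
            if dist[v] == rmax:
                continue
            for w in G[v]:
                if dist[w] == -1:
                    dist[w] = dist[v] + 1
                    q.append(w)
        sizes.append([sum(1 for d in dist if 0 <= d <= r) for r in range(rmax + 1)])
    return sizes

def R(G, threshold):
    """Smallest r >= 1 with r * |B_u^r| >= threshold for every u."""
    r = 1
    while True:
        sizes = ball_sizes(G, r)
        if all(r * s[r] >= threshold for s in sizes):
            return r
        r += 1

# Example: R_1 and R_2 on a cycle C_n
n = 256
G = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
logn = math.log(n)
R1 = R(G, logn / math.log(logn))
R2 = R(G, logn)
print(R1, R2)  # R1 <= R2 always
```

On a cycle, |B_u^r| = 2r + 1 for all u, so both quantities are tiny even for large n.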
Maximum load. We derive an upper bound for the maximum load after n balls have been allocated. Our bound holds for all bounded-degree graphs, and is tight for vertex-transitive graphs (and, more generally, for graphs where the neighborhood growth is sufficiently homogeneous across different vertices).

Theorem 1.1 (Maximum load when m = n). Let G be any vertex-transitive graph with bounded degrees. Then, with probability at least 1 − n^{−1}, the maximum load after n balls have been allocated is Θ(R_1).

Theorem 1.1 is a special case of Theorem 4.1, which gives a more precise version of the result above and generalizes it to non-transitive graphs; in particular, we obtain that for any graph with bounded degrees the maximum load is O(R_1) with high probability. We state and prove Theorem 4.1 in Section 4.
As mentioned above, for bounded-degree expanders we have R_1 = Θ(log log n), and for d-dimensional grids we have R_1 = Θ((log n / log log n)^{1/(d+1)}). Hence the results for bounded-degree graphs in [5] are special cases of Theorems 1.1 and 4.1. Furthermore, the proof of Theorems 1.1 and 4.1 uses novel techniques, and is substantially shorter than the proofs in [5]. In particular, in [5], different proofs are needed to handle the cases of expander graphs and grids, and the proof for expander graphs uses a rather involved recursive application of Azuma's inequality. Our proof here, on the other hand, applies to all bounded-degree graphs and is quite elegant. We construct a subtle coupling between local search allocation and the 1-choice process, which allows us to express the probability that a vertex has a large load in terms of the probability that a certain number of balls are born at given vertices, which is much simpler to control (see Lemma 3.4 below).
Our second result establishes an upper bound for the maximum load when m ≥ n. We point out that all other results known so far were limited to the case m = n. We establish that, when m = Ω(R_2 n), the maximum load is of order m/n (i.e., of the same order as the average load). We note that the difference between the maximum load and the average load for the local search allocation is always bounded above by the diameter of the graph (see Lemma 2.2 below). This is in some sense similar to the d-choice process, where the difference between the maximum load and the average load does not depend on m [4].

Cover time. We propose to study the following natural quantity related to any process that allocates balls into bins. Define the cover time as the first time at which all bins have at least one ball allocated to them. This is in analogy with the cover time of random walks on graphs, which is the first time at which the random walk has visited all vertices of the graph. Note that for the 1-choice process, the cover time corresponds to the time of a coupon collector problem, which is known to be of order n log n with high probability [9, Theorem 5.13]. For the d-choice process with d = O(1), the cover time is also of order n log n. We show that for the local search allocation the cover time can be much smaller than n log n. Our next theorem establishes that the cover time for vertex-transitive bounded-degree graphs is Θ(R_2 n) with high probability.

Theorem 1.3 (Cover time for bounded-degree graphs). Let G be any vertex-transitive graph with bounded degrees. Then, with probability at least 1 − n^{−1}, the cover time of local search allocation on G is Θ(R_2 n).
The theorem above is a special case of Theorem 5.2, which we state and prove in Section 5; in particular, we establish there that the upper bound of O(R_2 n) holds for all bounded-degree graphs. Since R_2 = O(√(log n)) for all connected graphs, it follows that the cover time of any connected, bounded-degree graph is at most O(n √(log n)), which is significantly smaller than the cover time of the d-choice process, which is Θ(n log n) for any d = O(1). In particular, we obtain R_2 = Θ(log log n) for bounded-degree expanders, and R_2 = Θ((log n)^{1/(d+1)}) for d-dimensional grids. Our final result provides a general upper bound on the cover time for dense graphs. Theorem 1.4 below is a special case of Theorem 5.3, which gives an upper bound on the cover time for all regular graphs. We state and prove Theorem 5.3 in Section 5.

Theorem 1.4 (Cover time for dense graphs). Let G be any d-regular graph with d = Ω(log n · log log n). Then, with probability at least 1 − n^{−1}, the cover time is Θ(n).
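The gap between the two cover times is easy to observe in simulation. The sketch below (our own code, run here on a cycle) compares the 1-choice cover time, which behaves like a coupon collector time of order n log n, with the local search cover time:

```python
import random

def cover_time_one_choice(n, rng):
    """Rounds until every bin is nonempty when balls go to uniform bins
    (the coupon collector problem; of order n log n)."""
    loads, covered, t = [0] * n, 0, 0
    while covered < n:
        v = rng.randrange(n)
        if loads[v] == 0:
            covered += 1
        loads[v] += 1
        t += 1
    return t

def cover_time_local_search(G, rng):
    """Rounds until every vertex is nonempty under local search allocation."""
    n = len(G)
    loads, covered, t = [0] * n, 0, 0
    while covered < n:
        v = rng.randrange(n)
        while True:
            down = [u for u in G[v] if loads[u] < loads[v]]
            if not down:
                break  # local minimum reached
            best = min(loads[u] for u in down)
            v = rng.choice([u for u in down if loads[u] == best])
        if loads[v] == 0:
            covered += 1
        loads[v] += 1
        t += 1
    return t

n = 2000
G = [[(i - 1) % n, (i + 1) % n] for i in range(n)]  # cycle
print(cover_time_one_choice(n, random.Random(2)))    # around n log n
print(cover_time_local_search(G, random.Random(3)))  # noticeably smaller
```

Intuitively, local search covers faster because balls born at already-covered vertices slide downhill toward empty vertices instead of being wasted.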

BACKGROUND AND NOTATION
In this section we recall some basic properties of the local search allocation that will be useful in our proofs. Let G = (V, E) be an undirected, not necessarily connected, graph with n vertices, and let Δ be the maximum degree of G. We assume that, in the local search allocation, ties are broken independently and uniformly at random.
We denote by d_G(u, v) the distance between u and v in G, and define B_u^r := {v ∈ V : d_G(u, v) ≤ r}. Let X_v^(m) denote the load of v (i.e., the number of balls allocated to v) after m balls have been allocated; initially X_v^(0) = 0 for all v ∈ V. Let X_max^(m) denote the maximum load after m balls have been allocated; i.e., X_max^(m) := max_{v ∈ V} X_v^(m). Also, denote by T_cov = T_cov(G) the cover time of G, which we define as the first time at which all vertices have load at least 1; more formally, T_cov := min{m ≥ 1 : X_v^(m) ≥ 1 for all v ∈ V}. Let U_i ∈ V denote the birthplace of ball i, and for each m ≥ 0 and v ∈ V, let X̃_v^(m) := Σ_{i=1}^m 1(U_i = v) denote the number of balls with birthplace v, where 1(·) denotes the indicator function; note that (X̃_v^(m))_{v ∈ V} is the load vector of the 1-choice process. For vectors A = (a_1, a_2, ..., a_n) and A' = (a'_1, a'_2, ..., a'_n) such that Σ_{i=1}^n a_i = Σ_{i=1}^n a'_i, we say that A majorizes A' if, for each κ = 1, 2, ..., n, the sum of the κ largest entries of A is at least the sum of the κ largest entries of A'. More formally, if j_1, j_2, ..., j_n are distinct integers from {1, 2, ..., n} such that a_{j_1} ≥ a_{j_2} ≥ ··· ≥ a_{j_n}, and j'_1, j'_2, ..., j'_n are distinct integers such that a'_{j'_1} ≥ a'_{j'_2} ≥ ··· ≥ a'_{j'_n}, then Σ_{i=1}^κ a_{j_i} ≥ Σ_{i=1}^κ a'_{j'_i} for each κ. For two random variables Y and Z on ℝ, we say that Y stochastically dominates Z if we can couple the probability distributions of Y and Z so that Y ≥ Z with probability 1.
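Majorization is straightforward to check numerically; the helper below (our own code) implements the definition via sorted prefix sums:

```python
def majorizes(a, b):
    """True iff vector a majorizes vector b: equal sums, and every prefix of
    a sorted in decreasing order dominates the corresponding prefix of b."""
    if len(a) != len(b) or sum(a) != sum(b):
        return False
    sa, sb = sorted(a, reverse=True), sorted(b, reverse=True)
    ta = tb = 0
    for x, y in zip(sa, sb):
        ta += x
        tb += y
        if ta < tb:
            return False
    return True

print(majorizes([3, 1, 0], [2, 1, 1]))  # True: (3,1,0) is "more unbalanced"
print(majorizes([2, 1, 1], [3, 1, 0]))  # False
```

In the lemmas below, the 1-choice load vector plays the role of the more unbalanced vector a, and the local search load vector plays the role of b.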
The lemma below establishes that the load vector obtained by the 1-choice process majorizes the load vector obtained by the local search allocation. This implies that the maximum load in the 1-choice process stochastically dominates the maximum load obtained by the local search allocation. As a consequence, we have that X_max^(n) = O(log n / log log n) and T_cov = O(n log n) for all G. Later, in Section 3, we state and prove Lemma 3.2, which is a generalization of Lemma 2.1.

Lemma 2.1 (Comparison with 1-choice).
For any fixed k ≥ 0, we can couple the local search load vector X^(k) and the 1-choice load vector X̃^(k) so that, with probability 1, X̃^(k) majorizes X^(k). Consequently, for every k, the maximum load of the 1-choice process after k balls stochastically dominates X_max^(k).

For any v ∈ V, let N_v be the set of neighbors of v in G. The next lemma establishes that the local search allocation always maintains a smoothed load vector, in the sense that the loads of any two adjacent vertices differ by at most 1.

Lemma 2.2 (Smoothness).
For any k ≥ 0, any v ∈ V and any u ∈ N_v, we have that |X_v^(k) − X_u^(k)| ≤ 1.

Proof. In order to obtain a contradiction, suppose that X_v^(k) ≥ X_u^(k) + 2, and let j be the last ball allocated to v. Since loads never decrease, at the moment ball j is born we have X_v^(j−1) = X_v^(k) − 1 ≥ X_u^(k) + 1 > X_u^(j−1). Therefore, the moment the jth ball is born, vertex v has at least one neighbor with load strictly smaller than that of v. Hence ball j is not allocated to v, establishing a contradiction.
The next lemmas establish that the load vector satisfies a Lipschitz and a monotonicity condition.

Lemma 2.3 (Lipschitz property).
Let k ≥ 1 be fixed and u_1, u_2, ..., u_k ∈ V be arbitrary. Let (X_v^(k))_{v∈V} be the loads of the vertices of G after the local search allocation places k balls with birthplaces u_1, u_2, ..., u_k. Let i ∈ {1, 2, ..., k} be fixed, and let (Y_v^(k))_{v∈V} be the loads after the local search allocation places k balls with birthplaces u_1, ..., u_{i−1}, u'_i, u_{i+1}, ..., u_k; i.e., (Y_v^(k))_{v∈V} is obtained from (X_v^(k))_{v∈V} by changing the birthplace of the ith ball from u_i to u'_i. Then, there exists a coupling such that, with probability 1,

Σ_{v∈V} |X_v^(k) − Y_v^(k)| ≤ 2.   (2.3)

Proof. We refer to the process defining the variables X^(k) as the X process, and to the process defining the variables Y^(k) as the Y process. For each v ∈ V and i ≥ 1, we define ξ_v^(i) to be an independent and uniformly random permutation of the neighbors of v. We use this permutation in both the X and Y processes to break ties when ball i is at vertex v. Since the first i − 1 balls have the same birthplaces in both processes, we have that X_v^(i−1) = Y_v^(i−1) for all v ∈ V. Now, when adding the ith ball, we let v_i be the vertex to which this ball is allocated in the X process and v'_i the vertex to which it is allocated in the Y process, so that the two load vectors differ at most at the two vertices v_i and v'_i:

X_v^(i) = Y_v^(i) for all v ∉ {v_i, v'_i}.   (2.5)

If i = k, then this implies (2.3) and the lemma holds. For the case i < k, we add ball i + 1 and define v_{i+1} and v'_{i+1} so that (2.5) holds with i replaced by i + 1; the proof of the lemma is then completed by induction. We assume that v_i ≠ v'_i, since otherwise (2.3) clearly holds. We note that v_{i+1} and v'_{i+1} will not necessarily be the vertices to which ball i + 1 is allocated in the X and Y processes; their role is to be the only vertices whose loads in the X and Y processes may differ. The definition of v_{i+1} and v'_{i+1} will vary depending on the situation. For this, let ball i + 1 be born at u_{i+1}, and define w to be the vertex at which ball i + 1 is allocated in the X process and w' the vertex at which it is allocated in the Y process.
We can assume that w ≠ w', since otherwise (2.5) holds with i replaced by i + 1 by setting v_{i+1} = v_i and v'_{i+1} = v'_i. Now we analyze ball i + 1. First, it is crucial to note that, during its local search, if ball i + 1 does not enter v'_i in the Y process and does not enter v_i in the X process, then it follows the same path in both processes. In order to see this, suppose that during the local search of ball i + 1, the ball is at a vertex z ∈ V \ {v_i, v'_i} and has so far performed the same steps in both the X and Y processes. It is enough to show that, in the next step of the local search, if ball i + 1 does not go to v_i or v'_i, then it goes to the same vertex in both processes; this establishes the observation by induction. There are two cases. First, if z is not a neighbor of v_i or v'_i, then the neighbors of z have the same loads in the X and Y processes, so ball i + 1 takes the same step in each process. On the other hand, if z is a neighbor of v_i (the same reasoning applies with v_i replaced by v'_i), then, given that ball i + 1 never enters v_i, z must have a neighbor z' ≠ v_i such that z' has the smallest load among the neighbors of z in both X and Y, and, in case of ties, z' is the neighbor of smallest load appearing first in ξ_z^(i+1). Since the local searches of ball i + 1 in the X and Y processes break ties according to the same permutation ξ_z^(i+1), we obtain that ball i + 1 moves to z' in both X and Y.
As a consequence of the above reasoning, since we are in the case w ≠ w', we can assume without loss of generality that ball i + 1 eventually visits v_i in the Y process. In this case, since the local search performed by ball i in the X process stops at vertex v_i, we have that v_i is a local minimum for ball i + 1 in the Y process, which implies that w' = v_i. (The case when ball i + 1 visits v'_i in the X process follows by a symmetric argument.) Setting v_{i+1} = w and v'_{i+1} = v'_i then yields (2.5) with i replaced by i + 1, completing the induction.

Lemma 2.4 (Monotonicity). Let (Y_v^(k−1))_{v∈V} be the loads after the local search allocation places the k − 1 balls with birthplaces u_1, ..., u_{i−1}, u_{i+1}, ..., u_k; i.e., the process obtained by removing ball i. There exists a coupling such that, with probability 1, Y_v^(k−1) ≤ X_v^(k) for all v ∈ V.

Proof. Let G' be the graph obtained from G by adding an isolated vertex w; i.e., G' has vertex set V ∪ {w} and the same edge set as G. Applying Lemma 2.3 to G' with the same choice of u_1, ..., u_k ∈ V and with u'_i = w gives Σ_{v∈V∪{w}} |X_v^(k) − Y_v^(k)| ≤ 2. Since in the modified process ball i is born at the isolated vertex w and hence allocated there, the restriction of Y^(k) to V is exactly the load vector obtained by removing ball i, and the claim follows.

In many of our proofs we analyze a continuous-time variant where the number of balls is not fixed, but is given by a Poisson random variable with mean m. Equivalently, in this variant balls are born at each vertex according to a Poisson process of rate 1/n. We refer to this as the Poissonized version. We will use the Poissonized versions of both the local search allocation and the 1-choice process in our proofs. Since the probability that a mean-m Poisson random variable takes the value m is of order Θ(m^{−1/2}), we obtain the following relation.
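The Poissonized version is easy to simulate directly: by the independence of Poisson thinning, drawing a Poisson(m) total and scattering the balls uniformly is the same as giving each vertex an independent Poisson(m/n) number of births. The sketch below (our own code) uses Knuth's elementary sampler, which is adequate for the small per-vertex means arising here.

```python
import math
import random

def poisson(lam, rng):
    """Sample Poisson(lam) by Knuth's product-of-uniforms method
    (fine for small lam; not suitable for large means)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def poissonized_births(n, m, rng):
    """Poissonized arrivals: vertex v receives an independent Poisson(m/n)
    number of births; the total is then Poisson(m), concentrated around m."""
    return [poisson(m / n, rng) for _ in range(n)]

rng = random.Random(4)
births = poissonized_births(10_000, 10_000, rng)
print(sum(births))  # close to 10_000 (standard deviation 100)
```

The de-Poissonization step then transfers high-probability statements back to the fixed-m model at the cost of a factor of order √m in the failure probability.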

KEY TECHNICAL ARGUMENT
In this section we prove a key technical result (Lemma 3.2 below) that will play a central role in our proofs later. Let μ : V → ℤ be any integer function on the vertices of G satisfying the following property: for any two neighbors u, v ∈ V, we have |μ(u) − μ(v)| ≤ 1. We see μ as an initial attribution of weights to the vertices of G, and run the local search allocation with the weights playing the role of the loads. Then, for any m ≥ 1, after m balls are allocated, we define the weight of vertex v by W_v^(m) := μ(v) + X_v^(m). Note that for any m ≥ 1 and v ∈ V, the weight W_v can increase by at most one after each step; i.e., W_v^(m) ≤ W_v^(m−1) + 1. The lemma below establishes that a ball cannot be allocated to a vertex with larger weight than the vertex where the ball is born.
Proceeding inductively over the steps of the local search (each step moves the ball to a vertex of strictly smaller weight), we obtain the claim. For the proof of Lemma 3.2, which states that the weight vector of the 1-choice process majorizes the weight vector of the local search allocation, we need the following result from [3].
Let v = (v_1, v_2, ..., v_n) and u = (u_1, u_2, ..., u_n) be two vectors such that v_1 ≥ v_2 ≥ ··· ≥ v_n and u_1 ≥ u_2 ≥ ··· ≥ u_n. If v majorizes u, then v + e_i also majorizes u + e_i, where e_i is the ith unit vector.
Proof of Lemma 3.2. The proof is by induction on m. Clearly, for m = 0, the two weight vectors coincide. For the induction step, let i_1, i_2, ..., i_n be distinct elements of V such that W_{i_1}^(m−1) ≥ W_{i_2}^(m−1) ≥ ··· ≥ W_{i_n}^(m−1), and similarly let j_1, j_2, ..., j_n be distinct elements of V such that the weights of the 1-choice process satisfy W'_{j_1}^(m−1) ≥ W'_{j_2}^(m−1) ≥ ··· ≥ W'_{j_n}^(m−1). Let ℓ be a uniformly random integer from 1 to n. Then, for the local search process (W_v^(m))_{v∈V}, let the birthplace of ball m be vertex i_ℓ, and for the 1-choice process (W'_v^(m))_{v∈V}, let the birthplace of ball m be j_ℓ. For the local search process, ball m need not be allocated at vertex i_ℓ, so let ι be the integer such that i_ι is the vertex to which ball m is allocated.
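The downhill property used above (Lemma 3.1) can be checked empirically. In the sketch below (our own code), the local search is guided by the weights W_v = μ(v) + (current load), and we assert, for every ball, that it ends at a vertex whose weight at birth time is at most the weight of its birthplace.

```python
import random

def local_search_with_weights(G, mu, m, rng):
    """Local search allocation started from initial integer weights mu
    (mu should differ by at most 1 across each edge). Returns final weights."""
    W = list(mu)  # W[v] = mu(v) + number of balls allocated at v
    for _ in range(m):
        b = rng.randrange(len(G))  # birthplace of the ball
        v = b
        while True:
            down = [u for u in G[v] if W[u] < W[v]]
            if not down:
                break  # v is a local minimum of the weights
            best = min(W[u] for u in down)
            v = rng.choice([u for u in down if W[u] == best])
        # Lemma 3.1: the ball is never placed at larger weight than its birthplace
        assert W[v] <= W[b]
        W[v] += 1
    return W

# Example: a cycle with initial weights given by the distance to vertex 0
n = 200
G = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
mu = [min(i, n - i) for i in range(n)]
W = local_search_with_weights(G, mu, n, random.Random(5))
```

The same smoothness argument as in Lemma 2.2 shows that the weight vector stays Lipschitz across edges throughout the process.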
In order to prove that the weight vector of the 1-choice process after ball m still majorizes that of the local search allocation, note that by Lemma 3.1 the ball in the local search process is allocated to a vertex whose weight is at most that of its birthplace; combining this with Lemma 3.3 yields that majorization is preserved, completing the induction. □

Now we illustrate the usefulness of the above result by relating the probability that a vertex has a certain load to the probability that balls are born in a neighborhood around the vertex. Recall that the load vector is smooth (cf. Lemma 2.2), which means that if a vertex v has load ℓ, then a vertex at distance r from v has load at least ℓ − r and at most ℓ + r. Lemma 3.4 makes this quantitative: for any v ∈ V and any ℓ, m ≥ 1, the probability that X_v^(m) ≥ ℓ is bounded in terms of the probability that sufficiently many balls of the 1-choice process are born in a neighborhood of v.

From now on, we consider only the Poissonized version. Let K^(2m) be the sum of the weights of the vertices with weight at least ℓ/16. On the event that K^(2m) is small, we can estimate the weight of the vertices that reach weight ℓ/16 as follows. For each vertex, let balls arrive according to a rate-1 Poisson point process up to time 2m/n, or until the vertex reaches weight ℓ/16, whichever happens first. Then, if the vertex reaches weight ℓ/16, continue adding balls for an additional time interval of length 2m/n. This construction stochastically dominates the weights of the vertices by the memoryless property of the Poisson process. The probability that a vertex v with μ(v) = −k reaches weight ℓ/16 is at most (2me/(n(ℓ/16 + k)))^{ℓ/16 + k} ≤ 2^{−(ℓ/16 + k)}, since 2me/(n(ℓ/16 + k)) ≤ 1/2 for all k ≥ 0 and x! ≥ (x/e)^x for any integer x. Now, any Bernoulli random variable with mean p ≤ 1/2 is stochastically dominated by a Poisson random variable with mean 2p, which follows from the fact that e^{−2p} ≤ 1 − p for 0 ≤ p ≤ 1/2. Using this, and denoting by N_S^k the set of vertices at distance k from S, we obtain that the number of vertices reaching weight ℓ/16 is stochastically dominated by a Poisson random variable, which establishes the first part of the lemma. The second part of Proposition 3.6 holds by setting S = B_u^ℓ. If u has load k > ℓ, then by smoothness the total number of balls allocated to B_u^ℓ is at least Σ_{v∈B_u^ℓ}(k − ℓ). Then setting k = 2ℓ and applying the first part of the proposition yields the result.

MAXIMUM LOAD
We start by stating a stronger version of Theorem 1.1 which also holds for non-transitive graphs. For γ ∈ (0, 1/2], let R_1^(γ) be the largest r ≥ 1 for which there exists a set S ⊆ V with |S| ≥ n^{1/2+γ} such that r · |B_u^r| < log n / log log n for all u ∈ S. The quantity R_1^(γ) can be seen as the "lower bound counterpart" of R_1 which, as mentioned earlier, relates to the (worst-case) local neighborhood growth in G. The condition |S| ≥ n^{1/2+γ} ensures that there are sufficiently many vertices for which the neighborhood growth can be bounded uniformly. Lower bounding the size of S is necessary, since otherwise G could consist of a small part with low local neighborhood growth, while the remaining part has high local neighborhood growth. The theorem below establishes that, for any bounded-degree graph, if there exists a γ ∈ (0, 1/2] for which R_1^(γ) = Θ(R_1), then the maximum load is Θ(R_1).

Proof. We start by establishing a lower bound for X_max^(n). We first consider the Poissonized versions of the local search allocation and the 1-choice process (recall the definition of these variants from the paragraph preceding Lemma 2.5). We slightly abuse notation and let X_v^(m) and X̃_v^(m) also denote the loads in the Poissonized versions. For any v ∈ V and any ℓ > 0, Lemma 3.4 and the above inequality give a lower bound on the probability that X_v^(n) ≥ ℓ in terms of the number of balls born in the neighborhood of v, where N_v^r is the set of vertices at distance r from v, and the last inequality follows since the function z^{−z} is decreasing for all z ≥ 1.
where the last step follows for all ℓ ≥ 2. Given γ > 0, set ℓ = γ R_1^(γ)/4. Hence, since |B_v^r| log r is increasing with r, there exists a set S with |S| = n^{1/2+γ} such that the bound above applies to every v ∈ S. Let Y = Y(γ) be the random variable defined as the number of vertices v ∈ S satisfying X_v^(n) ≥ γ R_1^(γ)/4. Let K be the total number of balls allocated in the Poissonized version of the local search allocation. Note that E[K] = n and, by the last statement of Lemma A.4, Pr[K > 2en] ≤ 2^{1−2en}. Regard Y as a function of the K independently chosen birthplaces U_1, U_2, ..., U_K. Then, for any given K, Y is 1-Lipschitz by Lemma 2.3. Applying (4.1) to the above inequality and then applying Lemma A.1, we obtain that Y ≥ 1 with high probability. This result can then be translated to the non-Poissonized model via Lemma 2.5. Now we establish the upper bound, for which we consider the non-Poissonized process. For any fixed u ∈ V, we have from the second part of Proposition 3.6 (with m = n) that the load of u is O(R_1) with probability at least 1 − n^{−2}. Taking the union bound over all u completes the proof. □

Proof of Theorem 1.2. Applying Proposition 3.6 with ℓ = m/n + cR_2 for a constant c ≥ 300, we obtain a tail bound on the load of a fixed vertex. By taking c sufficiently large, the right-hand side can be made smaller than n^{−2}.
If u has load k, then by smoothness the number of balls allocated to vertices in B_u^{R_2} is at least (k − R_2)|B_u^{R_2}|. Therefore, on the complement of the above event, the load of u is at most m/n + O(R_2). Taking the union bound over all u completes the proof. □

COVER TIME
Our next result is a lower bound on the minimum load, which in turn gives an upper bound for the cover time. Given that for the cover time we only need all vertices to have a nonzero load, it may seem a bit crude that in the proposition below we require all vertices to have load of order m/n after m balls have been allocated. However, we believe that the cover time and the blanket time (which is the first time at which all vertices have load of order m/n) are in fact of the same order for many natural graph classes, including vertex-transitive graphs (cf. the discussion in Section 6).
Proof. Fix an arbitrary vertex u ∈ V. We will use the concept of weights defined in Section 3. Define Z̃ to be the minimum weight of all vertices in V in the 1-choice process. Using Lemma A.3, we obtain a concentration bound for Z̃, where the last inequality holds since m/n ≥ CR_2 = ω(1) for bounded-degree graphs. Now define Z as the sum of the loads of the |B_u^{R_2}| vertices with smallest loads. We now apply a variant of Azuma's inequality for martingales (Lemma A.2) in order to show that Z is likely to be at least |B_u^{R_2}| ℓ/4. Let A_0 = E[Z] and define A_1, A_2, ..., A_m to be the martingale adapted to the filtration F_i generated by U_1, U_2, ..., U_i; i.e., A_i = E[Z | F_i]. We first establish that the martingale (A_i) satisfies the 1-Lipschitz condition: changing the birthplace of ball i while keeping all other birthplaces the same can change Z by at most one, cf. Lemma 2.3. In the following we derive an upper bound on the second moment of A_i − A_{i−1}, exploiting the simple fact that Z depends only on the set of the |B_u^{R_2}| vertices with smallest loads, where the expectation is taken with respect to U_i. Since |ζ_u − ζ_{u'}| ≤ 1 for all u, u' ∈ V, we can bound this second moment as follows. Consider a given realization of U_1, U_2, ..., U_{i−1}, U_{i+1}, ..., U_m, and let Γ ⊂ V be the set of |B_u^{R_2}| vertices with smallest loads (note that we skip the ith ball in the definition of Γ). Then, upon adding the ith ball, ζ_u and ζ_{u'} can only differ if at least one of u or u' is in Γ. Hence,

Now, Lemma A.2 gives
Clearly, E[Z] ≤ m|B_u^{R_2}|/n. Using the value of ℓ from (5.1), and using that m ≥ CR_2 n from the condition in the statement of the proposition, together with our coupling (which gives that the corresponding quantity for the local search allocation is at least Z), we conclude that, with probability at least 1 − n^{−2}, every vertex of B_u^{R_2} has load of order m/n. For γ ∈ (0, 1/2], define R_2^(γ) as the largest r ≥ 1 for which there exists a set S' ⊆ V with |S'| ≥ n^{1/2+γ} such that r|B_u^r| < log n for all u ∈ S'.
Note that R (γ ) 2 is non-increasing with γ . Also, when G is vertex transitive, we have R 2 = R (γ ) 2 + 1 for all γ ∈ (0, 1/2], because in this case, for any given r, the size of B r u is the same for all u ∈ V . The theorem below establishes that, for any bounded-degree graph, if there exists a γ ∈ (0, 1/2] for which R (γ ) 2 = (R 2 ), then the cover time is (R 2 ).
Proof. The result is shown by a coupling with the following stochastic process, introduced in [1], which we call the coupon collector process. Initially, every node of G is uncovered. Then, in each round i, a node U_i is chosen independently and uniformly at random. If node U_i is uncovered, then it becomes covered. Otherwise, if U_i has any uncovered neighbor, then a uniformly random node among these uncovered neighbors becomes covered. For this process, let us denote by C^(i) the set of covered nodes after round i. We shall prove that there is a coupling so that, for every round i, C^(i) ⊆ {v ∈ V : X_v^(i) ≥ 1}; in other words, every node which is covered by the coupon collector process in round i is also covered by the local search allocation after the allocation of ball i.
The coupling is shown by induction. Clearly, the claim holds for i = 1. Consider now the execution of any round i + 1, assuming that the induction hypothesis holds for round i. In our coupling, we choose the same node v as the chosen node U_{i+1} in both processes.
In the first case, we assume that v is uncovered in the coupon collector process. Then the coupon collector process will cover node v in round i + 1. If v has not yet been covered by the local search allocation, then we have X_v^(i) = 0, so v is a local minimum and ball i + 1 will be allocated to node v in round i + 1. Otherwise, v has been covered previously. In either case, we conclude that node v is covered after round i + 1 in the local search allocation.
For the second case, suppose that v is covered in the coupon collector process. Then the coupon collector process will try to cover an uncovered neighbor of v, if one exists. This uncovered neighbor is chosen uniformly at random from all uncovered neighbors of v. This random experiment can be described by first choosing a random ranking of all deg(v) neighbors and then picking the uncovered neighbor with the highest rank, say node u. In our coupling, we let the local search allocation use the same ranking of the deg(v) neighbors. This, together with the induction hypothesis, guarantees that if there is a node u which becomes covered by the coupon collector process, then this node u also becomes covered by the local search allocation, if it was not already covered in an earlier round.
Combining the two cases, we have shown that there is a coupling such that C^(i) ⊆ {v ∈ V : X_v^(i) ≥ 1} for every integer i ≥ 1. Since it was shown in [1] that, with probability 1 − n^{−c} for some constant c > 0, O(n(1 + (log n · log d)/d)) rounds suffice for the coupon collector process to cover all nodes, the theorem follows. □
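The covering process from [1] used in the coupling above can be sketched as follows (our own code; on the complete graph every round covers exactly one new node, so covering takes exactly n rounds):

```python
import random

def coupon_collector_with_forwarding(G, rng):
    """Covering process from [1]: in each round pick a uniform node; if it is
    uncovered, cover it; otherwise cover a uniformly random uncovered
    neighbour, if any. Returns the number of rounds to cover every node."""
    n = len(G)
    covered = [False] * n
    remaining, t = n, 0
    while remaining:
        t += 1
        v = rng.randrange(n)
        if not covered[v]:
            covered[v] = True
            remaining -= 1
        else:
            unc = [u for u in G[v] if not covered[u]]
            if unc:
                covered[rng.choice(unc)] = True
                remaining -= 1
    return t

# Example: on the complete graph, forwarding makes every round productive
n = 500
G = [[u for u in range(n) if u != v] for v in range(n)]
print(coupon_collector_with_forwarding(G, random.Random(3)))
```

On sparse graphs the process wastes rounds whenever a covered node has no uncovered neighbour, which is where the O(n(1 + (log n · log d)/d)) bound from [1] comes in.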

Blanket time
In analogy with the cover time for random walks, for each δ > 1 we can define the blanket time as the first time at which the load of each vertex is in the interval (1/δ · m/n, δ · m/n). The quantity of interest is then at least

min{ β ≥ 0 : β(2β + 1) · (log n)/(log log n)^3 ≥ ε · (log n)/(log log log n) } = Θ( (log log n)^{1.5} / (log log log n)^{0.5} ),

where the first inequality follows from (6.2).

Open questions
1. For any vertex-transitive graph (not necessarily of bounded degrees), does it hold that X_max^(n) = Θ(R_1) and T_cov = Θ(R_2 n) with high probability?
2. For any vertex-transitive graph (not necessarily of bounded degrees) and any m = ω(nR_2), does it hold that X_max^(m) = m/n + Θ(R_2) with high probability?
3. For any vertex-transitive graph, is the blanket time of order nR_2 for all ε ∈ (0, 1)? In particular, is the blanket time of the same order as the cover time for all vertex-transitive graphs?
4. Let G = (V, E) and G' = (V, E') be two graphs such that E ⊂ E'. For any m, does it hold that the maximum load on G' is stochastically dominated by the maximum load on G?