arXiv:math/9811124v1 [math.PR] 20 Nov 1998

A COMPARISON INEQUALITY FOR SUMS OF INDEPENDENT RANDOM VARIABLES

STEPHEN J. MONTGOMERY-SMITH AND ALEXANDER R. PRUSS

Abstract. We give a comparison inequality that allows one to estimate the tail probabilities of sums of independent Banach space valued random variables in terms of those of independent identically distributed random variables. More precisely, let X_1, ..., X_n be independent Banach-valued random variables. Let I be a random variable independent of X_1, ..., X_n and uniformly distributed over {1, ..., n}. Put X̃_1 = X_I, and let X̃_2, ..., X̃_n be independent identically distributed copies of X̃_1. Then P(‖X_1 + ··· + X_n‖ ≥ λ) ≤ c P(‖X̃_1 + ··· + X̃_n‖ ≥ λ/c) for all λ ≥ 0, where c is an absolute constant.

The independent Banach-valued random variables X_1, ..., X_n are said to regularly cover (the distribution of) a random variable Y provided that

E[g(Y)] = (1/n) ∑_{k=1}^n E[g(X_k)],

for all Borel functions g for which either side is defined [8]. An easy way of constructing Y, given the independent Banach-valued random variables X_1, ..., X_n, is to let I be a random variable independent of X_1, ..., X_n, with values in {1, 2, ..., n} and with each value having equal probability 1/n, and then put Y = X_I. It is easy to see that then X_1, ..., X_n regularly cover Y. This construction will be useful for our proofs. If the variables are real valued, then the regular covering condition is easily seen to be equivalent to the condition that the distribution function F of Y is the arithmetic mean of the respective distribution functions F_1, ..., F_n of X_1, ..., X_n.
Date: November 20, 1998.
1991 Mathematics Subject Classification. Primary 60G50, 60E15; Secondary 60F15.
Key words and phrases. Comparison inequalities, sums of independent random variables, sums of independent identically distributed random variables, rates of convergence in the law of large numbers.
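As a quick numerical illustration of the construction Y = X_I above, the following sketch checks the regular covering identity by Monte Carlo (the normal distributions and the test function g are illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# Illustrative choice of independent, non-identically distributed variables:
# X_k ~ Normal(k, 1) for k = 1, ..., n.
X = rng.normal(loc=np.arange(1, n + 1), scale=1.0, size=(trials, n))

# Y = X_I, with I uniform on {1, ..., n} and independent of the X_k.
I = rng.integers(0, n, size=trials)
Y = X[np.arange(trials), I]

# Regular covering: E[g(Y)] should equal (1/n) * sum_k E[g(X_k)].
g = np.cos  # any bounded Borel test function
print("E[g(Y)]           ≈", g(Y).mean())
print("(1/n)·Σ E[g(X_k)] ≈", g(X).mean())  # averages over trials and over k
```

The two printed values agree up to Monte Carlo error, as the regular covering property predicts.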
A variable X′ is said to be a copy of X if it has the same distribution as X. The main purpose of this paper is then to prove the following result.

Theorem 1. There exists an absolute constant c ∈ (0, ∞) such that if X_1, ..., X_n are independent Banach-valued random variables which regularly cover a random variable X̃_1, then:

(1)  P(‖X_1 + ··· + X_n‖ ≥ λ) ≤ c P(‖X̃_1 + ··· + X̃_n‖ ≥ λ/c),

for all λ ≥ 0, where X̃_2, ..., X̃_n are independent copies of X̃_1.

Remark 1. In the case where the random variables are symmetric, this was shown in [9] (strictly speaking, it was only shown in the real-valued case, but the proof also works for the Banach-valued case).

Remark 2. The inequality converse to (1) is false, even in the special case of symmetric real random variables. For, suppose that c is an absolute constant such that

(2)  P(|X̃_1 + ··· + X̃_n| ≥ λ) ≤ c P(|X_1 + ··· + X_n| ≥ λ/c),

for all λ ≥ 0, whenever the conditions of Theorem 1 hold with symmetric variables. Fix any n > max(1, c). Put X_2 ≡ ··· ≡ X_n ≡ 0. Let X_1 be such that P(X_1 = 1) = P(X_1 = −1) = 1/2. Put λ = n. Then the right hand side of (2) is zero, since |X_1 + ··· + X_n| ≡ 1 and λ/c = n/c > 1. But the left hand side of (2) is non-zero, since it is easy to see that P(X̃_i = 1) = (2n)^{-1} for each i (as the X̃_i are identically distributed, and X̃_1 can be taken to be X_I where I is independent of everything else and uniformly distributed on {1, ..., n}), so that P(|X̃_1 + ··· + X̃_n| ≥ n) ≥ (2n)^{-n} > 0.
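The counterexample in Remark 2 is easy to check numerically. Here is a minimal sketch; n = 3 is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 3, 2_000_000  # any n > 1 works for the illustration

# X_1 = ±1 with probability 1/2 each, X_2 = ... = X_n = 0, so |X_1 + ... + X_n| ≡ 1
# and the right hand side of (2) vanishes at λ = n (since n/c > 1).

# Each X~_i is an independent copy of X_I: with probability 1/n it draws a fresh
# ±1 (the X_1 slot), otherwise it equals one of the zero variables.
picks_X1 = rng.integers(0, n, size=(trials, n)) == 0
signs = rng.choice([-1, 1], size=(trials, n))
Xt = np.where(picks_X1, signs, 0)
St = Xt.sum(axis=1)

print("P(X~_1 = 1)   ≈", np.mean(Xt[:, 0] == 1), "  exact: 1/(2n) =", 1 / (2 * n))
print("P(|S~_n| ≥ n) ≈", np.mean(np.abs(St) >= n),
      "  exact: 2(2n)^{-n} =", 2 * (2 * n) ** (-n))
```

The simulated left hand side of (2) is strictly positive while the right hand side is identically zero, so no converse inequality can hold with an absolute constant.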

Remark 3. The main consequence of Theorem 1 is that any upper bound on tail probabilities of sums of independent identically distributed random variables automatically gives a bound on tail probabilities of sums of non-identically distributed independent random variables.

Remark 4. For a very simple application, we give another proof of one side of a result from [8] on randomly sampled Riemann sums. Let f ∈ L^2[0, 1]. For 1 ≤ k ≤ n, let x_{nk} be uniformly distributed over [(k − 1)/n, k/n], and assume x_{n1}, ..., x_{nn} are independent for each fixed n. Define the randomly sampled Riemann sum R_n f = n^{-1} ∑_{k=1}^n f(x_{nk}). Then the result says that R_n f converges almost surely to the Lebesgue integral A = ∫_0^1 f. (For a converse in the case where all the x_{nk} are independent, not just for fixed n, see [8].)
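The almost sure convergence is easy to observe in simulation; in the following sketch the integrand f(x) = x^2 (so A = 1/3) is an illustrative choice, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
f, A = lambda x: x ** 2, 1.0 / 3.0  # illustrative f ∈ L^2[0,1]; A = ∫ f

for n in (10, 100, 1_000, 10_000):
    x = (np.arange(n) + rng.random(n)) / n  # x_nk uniform on [(k-1)/n, k/n]
    Rn = f(x).mean()                        # R_n f = (1/n) Σ f(x_nk)
    print(f"n = {n:6d}   R_n f ≈ {Rn:.6f}   |R_n f - A| ≈ {abs(Rn - A):.1e}")
```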


For, by Borel–Cantelli it suffices to show that

(3)  ∑_{n=1}^∞ P(|R_n f − A| ≥ ε) < ∞,

for all ε > 0. Let X_1, X_2, ... be independent identically distributed random variables with the same distribution as f. Note that f(x_{n1}), ..., f(x_{nn}) regularly cover X_1, and f(x_{n1}) − A, ..., f(x_{nn}) − A regularly cover X_1 − A. Since f ∈ L^2, the variable X_1 has a finite second moment, and moreover E[X_1] = A, so that by the Hsu–Robbins law of large numbers [6] (see also [3, 4]), we have


∑_{n=1}^∞ P(|(X_1 − A) + ··· + (X_n − A)|/n ≥ ε) < ∞,

for all ε > 0. By Theorem 1 and the fact that f(x_{n1}) − A, ..., f(x_{nn}) − A regularly cover X_1 − A, we obtain (3).

To prove Theorem 1, we need some definitions and lemmata. If X is a random variable, then let X^s = X − X′ be the symmetrization of X, where X′ is an independent copy of X. We shall always choose symmetrizations so that we have (X_1 + ··· + X_k)^s = X_1^s + ··· + X_k^s whenever we need this identity. Write ‖X‖_p = (E[‖X‖^p])^{1/p}, where ‖·‖ is the norm on the Banach space in which our random variables take values.

Lemma 1. Let X be a Banach-valued random variable with ‖X‖_2 < ∞. Then

‖X‖_2 ≤ ‖X^s‖_2 + ‖E[X]‖ ≤ 3‖X‖_2.

Proof. Let X′ be an independent copy of X so that X^s = X − X′. Let A be the sigma-algebra generated by X. Then E[X^s | A] = X − E[X′] = X − E[X], and so

‖X‖_2 = ‖E[X^s + E[X] | A]‖_2 ≤ ‖X^s + E[X]‖_2 ≤ ‖X^s‖_2 + ‖E[X]‖,

where the first inequality used the fact that conditional expectation is a contraction on the Banach-valued L^p spaces, p ≥ 1 (see, e.g., [2, Theorem V.1.4]). The rest of the Lemma follows from the triangle inequality.
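A quick Monte Carlo sanity check of Lemma 1, for an arbitrary real-valued test distribution (so the Banach norm is just |·|); the distribution is an illustrative choice, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
m = 1_000_000

# Arbitrary skewed, non-centered test distribution.
X  = rng.exponential(scale=2.0, size=m) - 0.5
Xp = rng.exponential(scale=2.0, size=m) - 0.5  # independent copy X'
Xs = X - Xp                                    # symmetrization X^s = X - X'

norm2 = lambda v: np.sqrt(np.mean(v ** 2))     # Monte Carlo estimate of ||.||_2
lhs, mid, rhs = norm2(X), norm2(Xs) + abs(X.mean()), 3 * norm2(X)
print(f"||X||_2 ≈ {lhs:.3f}  ≤  ||X^s||_2 + |E[X]| ≈ {mid:.3f}  ≤  3||X||_2 ≈ {rhs:.3f}")
```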
Lemma 2. Let X_1, ..., X_n be independent random variables, and let X̃_1, ..., X̃_n be independent identically distributed random variables such that X_1, ..., X_n regularly cover X̃_1. Put S_n = X_1 + ··· + X_n and S̃_n = X̃_1 + ··· + X̃_n. Then:

‖S_n‖_2 ≤ 12‖S̃_n‖_2.

Proof. Let I_1, ..., I_n be independent random variables uniformly distributed on the set {1, ..., n}. Let {X_{i,j}}_{1≤i,j≤n} and {X′_{i,j}}_{1≤i,j≤n} be independent arrays of independent random variables, with the arrays independent of the I_i, and such that X_{i,j} and X′_{i,j} both have the same distribution as X_j for all i and j. Without loss of generality we can put X̃_i = X_{i,I_i}. Set X̃′_i = X′_{i,I_i}, and let S̃′_n = X̃′_1 + ··· + X̃′_n. Let (X′_1, ..., X′_n) be an independent copy of (X_1, ..., X_n), and put S′_n = X′_1 + ··· + X′_n. Observe that X_1 − X′_1, ..., X_n − X′_n regularly cover X̃_i − X̃′_i for all i, and that moreover the X_i − X′_i are symmetric. Thus, by [9, Proposition 1] (which, though stated for real-valued random variables, holds for the Banach-valued case as well, and with the same proof) we have:

(4)  ‖S_n − S′_n‖_2 ≤ 4‖S̃_n − S̃′_n‖_2.

Also, it is clear that E[S_n] = E[S̃_n]. Combining this with Lemma 1, we see that:

‖S_n‖_2 ≤ ‖S_n − S′_n‖_2 + ‖E[S_n]‖ ≤ 4‖S̃_n − S̃′_n‖_2 + 4‖E[S̃_n]‖ ≤ 12‖S̃_n‖_2,

as desired.
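A Monte Carlo illustration of Lemma 2, with an arbitrary non-identically-distributed family of our own choosing (the factor 12 is far from tight in this example):

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 20, 400_000

# Arbitrary non-iid family: X_k uniform on [-k, k], k = 1, ..., n.
widths = np.arange(1, n + 1)
Sn = (rng.uniform(-1, 1, size=(trials, n)) * widths).sum(axis=1)

# X~_1, ..., X~_n: iid copies of X_I, built from fresh indices and fresh draws.
I = rng.integers(0, n, size=(trials, n))
St = (rng.uniform(-1, 1, size=(trials, n)) * widths[I]).sum(axis=1)

norm2 = lambda v: np.sqrt(np.mean(v ** 2))
print(f"||S_n||_2 ≈ {norm2(Sn):.3f}   12·||S~_n||_2 ≈ {12 * norm2(St):.3f}")
```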

The following Lemma is in effect a special case of a result of Hitczenko [5].

Lemma 3. Let X_1, ..., X_n be independent identically distributed Banach-valued random variables with ‖X_i‖ < L almost surely for all i. Let S_k = X_1 + ··· + X_k. Then:

(E[‖S_n‖])^2 ≥ c(E[‖S_n‖^2] − c^{-1}L^2),

where c ∈ (0, ∞) is an absolute constant.

Proof of Lemma 3. By the work of Hitczenko [5], if S* = max_k ‖S_k‖ and X* = max_k ‖X_k‖, then for q ≥ p:

‖S*‖_q ≤ c_0 (q/p)(‖S*‖_p + ‖X*‖_q),

for a finite absolute constant c_0. By [7, Corollary 4] we have ‖S*‖_p ≤ c_1‖S_n‖_p for an absolute constant c_1, as the X_i are identically distributed. The desired inequality easily follows from this with c = 8c_0^2 if we let q = 2 and p = 1.
Proof of Theorem 1. Let I_1, ..., I_n, {X_{i,j}}_{1≤i,j≤n}, {X′_{i,j}}_{1≤i,j≤n}, S′_n and S̃′_n be as in the proof of Lemma 2. Applying [9, Proposition 1] (which works for Banach-valued variables, as already stated), we see that

(5)  P(‖S_n − S′_n‖ ≥ λ) ≤ 8P(‖S̃_n − S̃′_n‖ ≥ λ/2) ≤ 16P(‖S̃_n‖ ≥ λ/4),

for all λ, where the second inequality followed from the inequality

P(‖X^s‖ ≥ t) ≤ P(‖X‖ ≥ t/2) + P(‖X′‖ ≥ t/2) = 2P(‖X‖ ≥ t/2),

where X′ is an independent copy of X such that X^s = X − X′. Note that S_n^s = S_n − S′_n. Let M be a median of ‖S_n‖. It is easy to see that

(6)  P(‖S_n‖ − M ≥ λ) ≤ 2P(‖S_n^s‖ ≥ λ),

for all λ. (For, if ‖S_n‖ − M ≥ λ, there is at least probability 1/2 that ‖S′_n‖ ≤ M, in which case ‖S_n − S′_n‖ ≥ ‖S_n‖ − ‖S′_n‖ ≥ ‖S_n‖ − M ≥ λ.) We now claim that in general in our present setting:

(7)  P(‖S̃_n‖ ≥ εM) ≥ δ,

for absolute constants ε, δ ∈ (0, 1) to be determined later (they will be determined in accordance with (12), (18), (20), (25) and (26), below). To prove (7), suppose that on the contrary we have:

(8)  P(‖S̃_n‖ ≥ εM) ≤ δ.

Since the X̃_i are independent and identically distributed, by a maximal inequality for sums of independent and identically distributed random variables [7, Corollary 4] together with (8), we have:

(9)  P(max_{1≤k≤n} ‖S̃_k‖ ≥ c_1 εM) ≤ c_1 P(‖S̃_n‖ ≥ εM) ≤ c_1 δ,


where c_1 ∈ [1, ∞) is an absolute constant. By the elementary inequality

P(max_{1≤k≤n} ‖U_k‖ ≥ 2t) ≤ P(max_{1≤k≤n} ‖∑_{i=1}^k U_i‖ ≥ t),

valid for all t if the U_i are independent (since if ‖U_k‖ ≥ 2t then ‖∑_{i=1}^k U_i‖ ≥ t or ‖∑_{i=1}^{k−1} U_i‖ ≥ t), it follows from (9) that

(10)  P(max_{1≤k≤n} ‖X̃_k‖ ≥ 2c_1 εM) ≤ c_1 δ.

Let L = 2c_1εM. Set Y_k = X_k · 1_{‖X_k‖<L}. Put Ỹ_k = X̃_k · 1_{‖X̃_k‖<L}. Note that Y_1, ..., Y_n regularly cover Ỹ_k for each k. Let T_n = Y_1 + ··· + Y_n and put T̃_n = Ỹ_1 + ··· + Ỹ_n. By (10), we have:

(11)  P(⋃_{k=1}^n {X̃_k ≠ Ỹ_k}) ≤ c_1 δ.

Let p = P(‖X̃_k‖ ≥ L). Note that this does not depend on k, since the X̃_k are identically distributed. Note also that the left hand side of (11) is equal to 1 − (1 − p)^n. Henceforth we will assume that

(12)  δ < 1/(2c_1).

Now, if x ∈ [0, 1] is such that 1 − (1 − x)^n ≤ 1/2, then nx ≤ 2(1 − (1 − x)^n). Then, using this observation, together with (11), (12) and the condition that X_1, ..., X_n regularly cover X̃_1:
(13)  P(⋃_{k=1}^n {X_k ≠ Y_k}) ≤ ∑_{k=1}^n P(X_k ≠ Y_k) = ∑_{k=1}^n P(‖X_k‖ ≥ L) = nP(‖X̃_1‖ ≥ L) = np ≤ 2(1 − (1 − p)^n) = 2P(⋃_{k=1}^n {X̃_k ≠ Ỹ_k}) ≤ 2c_1 δ.

Now, by (5), (6) and (8), it follows that P(‖S_n‖ − M ≥ 4εM) ≤ 32δ. Using (13), it then follows that:

(14)  P(‖T_n‖ − M ≥ 4εM) ≤ (32 + 2c_1)δ.


Moreover, by (8) and (11):

(15)  P(‖T̃_n‖ ≥ εM) ≤ (1 + c_1)δ.

Observe that ‖Ỹ_i‖ < L almost surely. Lemma 3 then shows that:

(16)  (E[‖T̃_n‖])^2 ≥ c_2(E[‖T̃_n‖^2] − c_2^{-1}L^2),

where c_2 ∈ (0, ∞) is an absolute constant. Now, by (14) we have:

(17)  E[‖T_n‖^2] ≥ [1 − (32 + 2c_1)δ]M^2.

Henceforth, we will assume that δ is sufficiently small that

(18)  1 − (32 + 2c_1)δ ≥ 1/2.

Using Lemma 2 we see that E[‖T_n‖^2] ≤ 144E[‖T̃_n‖^2]. Combining this with (17) and (18), we see that

(19)  E[‖T̃_n‖^2] ≥ M^2/288.

Assume that ε > 0 is sufficiently small that c_2^{-1}L^2 ≤ M^2/(2 · 288). Since L = 2c_1εM, this assumption is equivalent to:

(20)  ε ≤ (48c_1)^{-1}c_2^{1/2}.

Thus by (19):

(21)  c_2^{-1}L^2 ≤ E[‖T̃_n‖^2]/2.

Then, by (16),

(22)  (E[‖T̃_n‖])^2 ≥ c_2(E[‖T̃_n‖^2] − c_2^{-1}L^2) ≥ (1/2)c_2 E[‖T̃_n‖^2].

The elementary inequality P(|Ξ| ≥ λE[|Ξ|]) ≥ (1 − λ)^2(E[|Ξ|])^2/E[|Ξ|^2] (see, e.g., [1, Exercise 3.3.11]) then implies that

(23)  P(‖T̃_n‖ ≥ (1/2)E[‖T̃_n‖]) ≥ (1 − 1/2)^2 · (1/2)c_2 = c_2/8.

Now, by (19) and (22) we have E[‖T̃_n‖] ≥ ((1/2)c_2/288)^{1/2}M = c_2^{1/2}M/24, so that (23) gives:

(24)  P(‖T̃_n‖ ≥ c_2^{1/2}M/48) ≥ c_2/8.

If we choose ε and δ such that

(25)  0 < ε ≤ c_2^{1/2}/48

and

(26)  0 < (1 + c_1)δ < c_2/8


and satisfying the other conditions required in the above argument (namely (12), (18) and (20)), we will obtain from (24) a contradiction to (15). Hence, if we take ε and δ to be absolute constants in (0, 1) satisfying these assumptions, we obtain (7).

Now, combining (5) and (6), we see that:

(27)  P(‖S_n‖ − M ≥ λ) ≤ 32P(‖S̃_n‖ ≥ λ/4),

for all λ. There are now two cases to be considered. Suppose first that λ ≤ 2M. Then using (7):

(28)  P(‖S_n‖ ≥ λ) ≤ 1 ≤ δ^{-1}P(‖S̃_n‖ ≥ εM) ≤ δ^{-1}P(‖S̃_n‖ ≥ ελ/2).

On the other hand, suppose that λ > 2M. In that case, if ‖S_n‖ ≥ λ then ‖S_n‖ − M > λ − λ/2 = λ/2, so that

(29)  P(‖S_n‖ ≥ λ) ≤ P(‖S_n‖ − M ≥ λ/2) ≤ 32P(‖S̃_n‖ ≥ λ/8),

by (27). Inequality (1) follows from (28) for λ ≤ 2M and from (29) for λ > 2M, if we let c = max(32, 2/ε, δ^{-1}).
References

1. Kai Lai Chung, A course in probability theory, 2nd ed., Academic Press, San Diego, 1974.
2. J. Diestel and J. J. Uhl, Jr., Vector measures, Math. Surveys, vol. 15, Amer. Math. Soc., Providence, RI, 1977.
3. Paul Erdős, On a theorem of Hsu and Robbins, Ann. Math. Statist. 20 (1949), 286–291.
4. Paul Erdős, Remark on my paper "On a theorem of Hsu and Robbins", Ann. Math. Statist. 21 (1950), 138.
5. P. Hitczenko, On a domination of sums of random variables by sums of conditionally independent ones, Ann. Probab. 22 (1994), 453–468.
6. P. L. Hsu and H. Robbins, Complete convergence and the law of large numbers, Proc. Nat. Acad. Sci. U.S.A. 33 (1947), 25–31.
7. S. J. Montgomery-Smith, Comparison of sums of independent identically distributed random vectors, Probab. Math. Statist. 14 (1993), 281–285.
8. Alexander R. Pruss, Randomly sampled Riemann sums and complete convergence in the law of large numbers for a case without identical distribution, Proc. Amer. Math. Soc. 124 (1996), 919–929.
9. Alexander R. Pruss, Comparisons between tail probabilities of sums of independent symmetric random variables, Ann. Inst. H. Poincaré Probab. Statist. 33 (1997), 651–671.

Department of Mathematics, University of Missouri, Columbia, MO 65211, U.S.A.

Department of Philosophy, University of Pittsburgh, Pittsburgh, PA 15260, U.S.A.

E-mail address: stephen@math.missouri.edu
E-mail address: pruss+@pitt.edu
URL: http://www.missouri.edu/~stephen
URL: http://www.pitt.edu/~pruss

