
# Decoupling Inequalities for the Tail Probabilities of Multivariate U-statistics

by Victor H. de la Peña¹ and S. J. Montgomery-Smith²
Columbia University and University of Missouri, Columbia

Abstract

arXiv:math/9309211v2 [math.FA] 6 Dec 1999

In this paper we present a decoupling inequality that shows that multivariate U-statistics can be studied as sums of (conditionally) independent random variables. This result has important implications in several areas of probability and statistics, including the study of random graphs and multiple stochastic integration. More precisely, we get the following result:

Theorem 1. Let $\{X_i\}$ be a sequence of independent random variables in a measurable space $(S, \mathcal{S})$, and let $\{X_i^{(j)}\}$, $j = 1, \ldots, k$, be $k$ independent copies of $\{X_i\}$. Let $f_{i_1 i_2 \cdots i_k}$ be families of functions of $k$ variables taking $S \times \cdots \times S$ into a Banach space $(B, \|\cdot\|)$. Then, for all $n \ge k \ge 2$, $t > 0$, there exist numerical constants $C_k$ depending on $k$ only so that

$$P\Big(\Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}(X_{i_1}, X_{i_2}, \ldots, X_{i_k}) \Big\| \ge t\Big) \le C_k \, P\Big(C_k \Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(2)}, \ldots, X_{i_k}^{(k)}\big) \Big\| \ge t\Big).$$

The reverse bound holds if, in addition, the following symmetry condition holds almost surely:

$$f_{i_1 i_2 \cdots i_k}(X_{i_1}, X_{i_2}, \ldots, X_{i_k}) = f_{i_{\pi(1)} i_{\pi(2)} \cdots i_{\pi(k)}}\big(X_{i_{\pi(1)}}, X_{i_{\pi(2)}}, \ldots, X_{i_{\pi(k)}}\big)$$

for all permutations $\pi$ of $(1, \ldots, k)$.

1. Introduction

In this paper we provide the multivariate extension of the tail probability decoupling inequality for generalized U-statistics of order two and quadratic forms presented in de la Peña and Montgomery-Smith (1993). This type of inequality permits the transfer of some results for sums of independent random variables to the case of U-statistics. Our work builds mainly on recent work of Kwapien and Woyczynski (1992), as well as on results for U-statistics from Giné and Zinn (1992) and papers dealing with inequalities for multilinear forms of symmetric and hypercontractive random variables in de la Peña, Montgomery-Smith and Szulga (1992) and de la Peña (1992). It is to be remarked that the decoupling inequalities for multilinear forms introduced in McConnell and Taqqu (1986) provided us with our first exposure to this decoupling problem. For a more expanded list of references on the subject see, for example, Kwapien and Woyczynski (1992).
¹,² Supported in part by NSF grants.
² Supported by the University of Missouri Research Board.
AMS 1991 subject classifications: Primary 60E15. Secondary 60D05.
Key words and phrases: U-statistics, decoupling.

2. Main Result

Theorem 1. Let $\{X_i\}$ be a sequence of independent random variables in a measurable space $(S, \mathcal{S})$, and let $\{X_i^{(j)}\}$, $j = 1, \ldots, k$, be $k$ independent copies of $\{X_i\}$. Let $f_{i_1 i_2 \cdots i_k}$ be families of functions of $k$ variables taking $S \times \cdots \times S$ into a Banach space $(B, \|\cdot\|)$. Then, for all $n \ge k \ge 2$, $t > 0$, there exist numerical constants $C_k$, $\tilde{C}_k$ depending on $k$ only so that

$$P\Big(\Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(1)}, \ldots, X_{i_k}^{(1)}\big) \Big\| \ge t\Big) \le C_k \, P\Big(C_k \Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(2)}, \ldots, X_{i_k}^{(k)}\big) \Big\| \ge t\Big).$$

If, in addition, the symmetry condition

$$f_{i_1 i_2 \cdots i_k}(X_{i_1}, X_{i_2}, \ldots, X_{i_k}) = f_{i_{\pi(1)} i_{\pi(2)} \cdots i_{\pi(k)}}\big(X_{i_{\pi(1)}}, X_{i_{\pi(2)}}, \ldots, X_{i_{\pi(k)}}\big)$$

holds almost surely for all permutations $\pi$ of $(1, \ldots, k)$, then

$$P\Big(\Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(2)}, \ldots, X_{i_k}^{(k)}\big) \Big\| \ge t\Big) \le \tilde{C}_k \, P\Big(\tilde{C}_k \Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(1)}, \ldots, X_{i_k}^{(1)}\big) \Big\| \ge t\Big).$$

Note: In this paper we use the notation $\{i_1 \ne i_2 \ne \cdots \ne i_k\}$ to denote that all of $i_1, \ldots, i_k$ are different.

3. Preliminary Results

Throughout this paper we will be using two results found in earlier work. The first one comes from de la Peña and Montgomery-Smith (1993). For completeness we reproduce the proof here.

Lemma 1. Let $X, Y$ be two i.i.d. random variables. Then

$$(1) \qquad P(\|X\| \ge t) \le 3\, P\big(\|X + Y\| \ge \tfrac{2t}{3}\big).$$

Proof: Let $X, Y, Z$ be i.i.d. random variables. Then

$$P(\|X\| \ge t) = P\big(\|(X + Y) + (X + Z) - (Y + Z)\| \ge 2t\big)$$
$$\le P(\|X + Y\| \ge 2t/3) + P(\|X + Z\| \ge 2t/3) + P(\|Y + Z\| \ge 2t/3) = 3\, P(\|X + Y\| \ge 2t/3).$$

The second result comes from Kwapien and Woyczynski (1992) and can also be found in de la Peña and Montgomery-Smith (1993).
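Lemma 1 holds for any i.i.d. pair, so it can be sanity-checked by exact enumeration on a finite distribution. The sketch below (the support points are an arbitrary choice, not from the paper) computes both sides of (1) with exact rational arithmetic for real-valued $X, Y$.

```python
from itertools import product
from fractions import Fraction

# X, Y i.i.d. uniform on a small finite set of real values (arbitrary choice).
values = [-3, -1, 0, 2, 5]
p = Fraction(1, len(values))

t = Fraction(2)

# P(|X| >= t), computed exactly.
lhs = sum(p for x in values if abs(x) >= t)

# P(|X + Y| >= 2t/3), computed exactly over the product space.
rhs = sum(p * p for x, y in product(values, repeat=2) if abs(x + y) >= 2 * t / 3)

# Lemma 1: P(|X| >= t) <= 3 P(|X + Y| >= 2t/3).
assert lhs <= 3 * rhs
```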

Proposition 1. Let $Y$ be any mean zero random variable with values in a Banach space $(B, \|\cdot\|)$. Then, for all $a \in B$,

$$(2) \qquad P(\|a + Y\| \ge \|a\|) \ge \frac{\kappa}{4},$$

where $\kappa = \inf_{x' \in B'} \frac{(E|x'(Y)|)^2}{E(x'(Y))^2}$. (Here $B'$ denotes the family of linear functionals on $B$.)

Proof: Note first that if $\xi$ is a random variable for which $E\xi = 0$, then $P(\xi \ge 0) \ge \frac{1}{4} \frac{(E|\xi|)^2}{E\xi^2}$. From this we deduce that $P(x'(Y) \ge 0) \ge \frac{1}{4} \frac{(E|x'(Y)|)^2}{E(x'(Y))^2}$. The result then follows, because if $x' \in B'$ is such that $\|x'\| = 1$ and $x'(a) = \|a\|$, then $\{\|a + Y\| \ge \|a\|\}$ contains $\{x'(a + Y) \ge x'(a)\} = \{x'(Y) \ge 0\}$.

Lemma 2. Let $x, a_{i_1}, a_{i_1 i_2}, \ldots, a_{i_1 i_2 \cdots i_k}$ belong to a Banach space $(B, \|\cdot\|)$. Let $\{\epsilon_i\}$ be a sequence of symmetric Bernoulli random variables. Then

$$P\Big(\Big\| x + \sum_{r=1}^{k} \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_r \le n} a_{i_1 \cdots i_r} \epsilon_{i_1} \cdots \epsilon_{i_r} \Big\| \ge \|x\|\Big) \ge c_k^{-1},$$

for a universal constant $1 < c_k < \infty$ depending on $k$ only.

Proof: Suppose first that $x, a_{i_1}, a_{i_1 i_2}, \ldots, a_{i_1 i_2 \cdots i_k}$ are in $\mathbb{R}$. Since the $\epsilon$'s are hypercontractive, by equation (1.4) of Kwapien and Szulga (1991) and the easy argument of the proof of Lemma 3 in de la Peña and Montgomery-Smith (1993), for some $\sigma > 0$ we get

$$\Big(E\Big| \sum_{r=1}^{k} \sum_{1 \le i_1 \ne \cdots \ne i_r \le n} a_{i_1 \cdots i_r} \epsilon_{i_1} \cdots \epsilon_{i_r} \Big|^4\Big)^{1/4} = \Big(E\Big| \sum_{r=1}^{k} \sum_{1 \le i_1 < \cdots < i_r \le n} b_{i_1 \cdots i_r} \epsilon_{i_1} \cdots \epsilon_{i_r} \Big|^4\Big)^{1/4}$$
$$\le \sigma^{-k} \Big(E\Big| \sum_{r=1}^{k} \sum_{1 \le i_1 < \cdots < i_r \le n} b_{i_1 \cdots i_r} \epsilon_{i_1} \cdots \epsilon_{i_r} \Big|^2\Big)^{1/2} = \sigma^{-k} \Big(E\Big| \sum_{r=1}^{k} \sum_{1 \le i_1 \ne \cdots \ne i_r \le n} a_{i_1 \cdots i_r} \epsilon_{i_1} \cdots \epsilon_{i_r} \Big|^2\Big)^{1/2},$$

where $b_{i_1 \cdots i_r} = \sum_{\pi \in S_r} a_{i_{\pi(1)} \cdots i_{\pi(r)}}$, and $S_r$ denotes the set of all permutations of $\{1, \ldots, r\}$. Next, observe that $\|\xi\|_4 \le c\|\xi\|_2$ implies that $\|\xi\|_2 \le c^2\|\xi\|_1$. Take $x' \in B'$ so that $\|x'\| = 1$ and $x'(x) = \|x\|$; then

$$P\Big(\Big\| x + \sum_{r=1}^{k} \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_r \le n} a_{i_1 \cdots i_r} \epsilon_{i_1} \cdots \epsilon_{i_r} \Big\| \ge \|x\|\Big) \ge P\Big(x'(x) + \sum_{r=1}^{k} \sum_{1 \le i_1 \ne \cdots \ne i_r \le n} x'(a_{i_1 \cdots i_r}) \epsilon_{i_1} \cdots \epsilon_{i_r} \ge x'(x)\Big)$$
$$= P\Big(\sum_{r=1}^{k} \sum_{1 \le i_1 \ne \cdots \ne i_r \le n} x'(a_{i_1 \cdots i_r}) \epsilon_{i_1} \cdots \epsilon_{i_r} \ge 0\Big) \ge c_k^{-1},$$

the last step following from Proposition 1 together with the moment comparison above.
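The moment comparison invoked in the proof ($\|\xi\|_4 \le c\|\xi\|_2$ implies $\|\xi\|_2 \le c^2\|\xi\|_1$, a consequence of Hölder's inequality) can be checked exactly for a first-order Rademacher sum by enumerating all sign patterns; the coefficients below are an arbitrary illustrative choice.

```python
from itertools import product

# xi = sum_i a_i * e_i with e_i symmetric Bernoulli; L^p norms computed
# exactly by enumerating all sign patterns (coefficients are arbitrary).
a = [1.0, 2.0, -0.5, 3.0]
signs = list(product([-1, 1], repeat=len(a)))

def norm(p):
    # L^p norm of xi under the uniform measure on the 2^len(a) sign patterns.
    return (sum(abs(sum(ai * si for ai, si in zip(a, s))) ** p
                for s in signs) / len(signs)) ** (1.0 / p)

n1, n2, n4 = norm(1), norm(2), norm(4)
c = n4 / n2                 # the best constant with ||xi||_4 <= c ||xi||_2
assert n2 <= c ** 2 * n1    # Hoelder: ||xi||_2 <= c^2 ||xi||_1
```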

Note: Throughout this paper we will use $c_k$ and $C_k$ to denote numerical constants that depend on $k$ only and may change from application to application.

4. Proof of the Upper Bound

Our proof of this result is obtained by applying the argument used in the proof of the upper bound in the bivariate case plus an inductive argument. Let $\{\sigma_i\}$ be a sequence of independent symmetric Bernoulli random variables, $P(\sigma_i = 1) = \frac{1}{2}$ and $P(\sigma_i = -1) = \frac{1}{2}$. Consider random variables $(Z_i^{(1)}, Z_i^{(2)})$ such that $(Z_i^{(1)}, Z_i^{(2)}) = (X_i^{(1)}, X_i^{(2)})$ if $\sigma_i = 1$ and $(Z_i^{(1)}, Z_i^{(2)}) = (X_i^{(2)}, X_i^{(1)})$ if $\sigma_i = -1$. Then $(1 + \sigma_i)$ and $(1 - \sigma_i)$ are either $0$ or $2$, and these random variables can be used to transform the problem from one involving $X$'s to one involving $Z$'s. Let us first illustrate the argument in the case $k = 3$:

$$(3) \qquad 2^3 f_{i_1 i_2 i_3}\big(Z_{i_1}^{(1)}, Z_{i_2}^{(1)}, Z_{i_3}^{(2)}\big) = (1 + \sigma_{i_1})(1 + \sigma_{i_2})(1 + \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(1)}, X_{i_2}^{(1)}, X_{i_3}^{(2)}\big)$$
$$+ (1 + \sigma_{i_1})(1 + \sigma_{i_2})(1 - \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(1)}, X_{i_2}^{(1)}, X_{i_3}^{(1)}\big) + (1 + \sigma_{i_1})(1 - \sigma_{i_2})(1 + \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(1)}, X_{i_2}^{(2)}, X_{i_3}^{(2)}\big)$$
$$+ (1 - \sigma_{i_1})(1 + \sigma_{i_2})(1 + \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(2)}, X_{i_2}^{(1)}, X_{i_3}^{(2)}\big) + (1 + \sigma_{i_1})(1 - \sigma_{i_2})(1 - \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(1)}, X_{i_2}^{(2)}, X_{i_3}^{(1)}\big)$$
$$+ (1 - \sigma_{i_1})(1 + \sigma_{i_2})(1 - \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(2)}, X_{i_2}^{(1)}, X_{i_3}^{(1)}\big) + (1 - \sigma_{i_1})(1 - \sigma_{i_2})(1 + \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(2)}, X_{i_2}^{(2)}, X_{i_3}^{(2)}\big)$$
$$+ (1 - \sigma_{i_1})(1 - \sigma_{i_2})(1 - \sigma_{i_3}) f_{i_1 i_2 i_3}\big(X_{i_1}^{(2)}, X_{i_2}^{(2)}, X_{i_3}^{(1)}\big),$$

where, in each factor, the sign "+" is chosen if the superscript of $X_{i_m}$ agrees with that of $Z_{i_m}$, and "−" otherwise. Next, set

$$T_{n,3} = \sum_{1 \le i_1 \ne i_2 \ne i_3 \le n} \; \sum_{1 \le j_1, j_2, j_3 \le 2} f_{i_1 i_2 i_3}\big(X_{i_1}^{(j_1)}, X_{i_2}^{(j_2)}, X_{i_3}^{(j_3)}\big),$$

that is, the sum of $f_{i_1 i_2 i_3}$ over all eight choices of superscripts. Letting $\mathcal{G}_2 = \sigma(X_i^{(1)}, X_i^{(2)}, i = 1, \ldots, n)$ we get

$$T_{n,3} = 2^3 \sum_{1 \le i_1 \ne i_2 \ne i_3 \le n} E\big(f_{i_1 i_2 i_3}(Z_{i_1}^{(1)}, Z_{i_2}^{(1)}, Z_{i_3}^{(2)}) \,\big|\, \mathcal{G}_2\big).$$

More generally, for any $1 \le l_1, \ldots, l_k \le 2$, one can obtain the expansion

$$(4) \qquad 2^k f_{i_1 \cdots i_k}\big(Z_{i_1}^{(l_1)}, \ldots, Z_{i_k}^{(l_k)}\big) = \sum_{1 \le j_1, \ldots, j_k \le 2} (1 \pm \sigma_{i_1}) \cdots (1 \pm \sigma_{i_k}) \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big),$$

where the $m$-th factor is $(1 + \sigma_{i_m})$ if $j_m = l_m$ and $(1 - \sigma_{i_m})$ otherwise. The appropriate extension of $T_{n,3}$ is

$$T_{n,k} = \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le 2} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big).$$

Again,

$$T_{n,k} = 2^k \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} E\big(f_{i_1 \cdots i_k}(Z_{i_1}^{(l_1)}, \ldots, Z_{i_k}^{(l_k)}) \,\big|\, \mathcal{G}_2\big).$$
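Identity (4) is a pointwise algebraic fact, so it can be verified by brute force for small $k$. The sketch below checks it for $k = 2$ with the scalar kernel $f(x, y) = xy$ and arbitrary values of the $X$'s (both choices are illustrative only), enumerating all sign patterns of the $\sigma$'s.

```python
from itertools import product

# X[i][j] stands for X_i^{(j)}: two indices, two independent "copies"
# (the numeric values are arbitrary).
X = {1: {1: 0.7, 2: -1.3},
     2: {1: 2.0, 2: 0.4}}

def f(x, y):
    return x * y  # an arbitrary kernel of two variables

l1, l2 = 1, 2  # the fixed superscripts (l_1, l_2) on the Z's

checked = 0
for s1, s2 in product([-1, 1], repeat=2):
    sigma = {1: s1, 2: s2}
    # (Z_i^{(1)}, Z_i^{(2)}) = (X_i^{(1)}, X_i^{(2)}) if sigma_i = 1, swapped otherwise.
    Z = {i: ({1: X[i][1], 2: X[i][2]} if sigma[i] == 1
             else {1: X[i][2], 2: X[i][1]}) for i in (1, 2)}
    lhs = 2 ** 2 * f(Z[1][l1], Z[2][l2])
    # Right side of (4): the m-th factor has sign "+" iff j_m = l_m.
    rhs = sum(((1 + sigma[1]) if j1 == l1 else (1 - sigma[1])) *
              ((1 + sigma[2]) if j2 == l2 else (1 - sigma[2])) *
              f(X[1][j1], X[2][j2])
              for j1, j2 in product((1, 2), repeat=2))
    assert abs(lhs - rhs) < 1e-12
    checked += 1
```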

From Lemma 1 we get

$$P\Big(\Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(1)}\big) \Big\| \ge t\Big)$$
$$\le 3\, P\Big(3 \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \big\{ f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(1)}\big) + f_{i_1 \cdots i_k}\big(X_{i_1}^{(2)}, \ldots, X_{i_k}^{(2)}\big) \big\} \Big\| \ge 2t\Big)$$
$$= 3\, P\Big(3 \Big\| T_{n,k} + \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \big\{ f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(1)}\big) + f_{i_1 \cdots i_k}\big(X_{i_1}^{(2)}, \ldots, X_{i_k}^{(2)}\big) \big\} - T_{n,k} \Big\| \ge 2t\Big)$$
$$\le 3\, P(3 \|T_{n,k}\| \ge t) + 3\, P\Big(3 \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{\substack{1 \le j_1, \ldots, j_k \le 2 \\ \text{not all } j\text{'s equal}}} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge t\Big)$$
$$(5) \qquad \le 3\, P(3 \|T_{n,k}\| \ge t) + \sum_{\substack{1 \le j_1, \ldots, j_k \le 2 \\ \text{not all } j\text{'s equal}}} C_k\, P\Big(C_k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge t\Big).$$

(Recall that $C_k$, $c_k$ are numerical constants that depend on $k$ only and may change from application to application.) Observe also that, using (4) and the fact that the $\sigma$'s are independent of the $X$'s, Lemma 2 with $x = T_{n,k}$ gives, for any fixed $1 \le l_1, \ldots, l_k \le 2$,

$$(6) \qquad P\Big(2^k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(Z_{i_1}^{(l_1)}, \ldots, Z_{i_k}^{(l_k)}\big) \Big\| \ge \|T_{n,k}\| \,\Big|\, \mathcal{G}_2\Big) \ge c_k^{-1}.$$

Integrating over $\{\|T_{n,k}\| \ge t\}$ and using the fact that $\{(X_i^{(1)}, X_i^{(2)}) : i = 1, \ldots, n\}$ has the same joint distribution as $\{(Z_i^{(1)}, Z_i^{(2)}) : i = 1, \ldots, n\}$, we obtain

$$(7) \qquad P\Big(2^k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(l_1)}, \ldots, X_{i_k}^{(l_k)}\big) \Big\| \ge t\Big) = P\Big(2^k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(Z_{i_1}^{(l_1)}, \ldots, Z_{i_k}^{(l_k)}\big) \Big\| \ge t\Big) \ge c_k^{-1} P(\|T_{n,k}\| \ge t).$$

It is obvious that the upper bound decoupling inequality holds for the case of U-statistics of order 1. Assume that it holds for U-statistics of orders $2, \ldots, k-1$. Putting (5) and (7) together, with $1 \le l_1, \ldots, l_k \le 2$ not all equal, we get

$$P\Big(\Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(1)}\big) \Big\| \ge t\Big) \le 3\, P(3 \|T_{n,k}\| \ge t) + \sum_{\substack{1 \le j_1, \ldots, j_k \le 2 \\ \text{not all } j\text{'s equal}}} C_k\, P\Big(C_k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge t\Big)$$
$$\le \sum_{\substack{1 \le j_1, \ldots, j_k \le 2 \\ \text{not all } j\text{'s equal}}} C_k\, P\Big(C_k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge t\Big)$$
$$\le C_k\, P\Big(C_k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(k)}\big) \Big\| \ge t\Big),$$

where again the last line follows by the decoupling result for U-statistics of orders $2, \ldots, k-1$ of the inductive hypothesis. Since the statement "not all $j$'s equal" means that fewer than $k$ of the $j$'s are equal, the variables whose $j$'s are equal can be decoupled using (conditionally on the other variables) the decoupling inequalities for U-statistics of orders $2, \ldots, k-1$. Next we give the proof of the lower bound.

5. Proof of the Lower Bound

In order to show the lower bound we require the following result.

Lemma 3. Let $1 \le l \le k$. Then there is a constant $C_k$ such that

$$P\Big(\Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(1)}, \ldots, X_{i_k}^{(1)}\big) \Big\| \ge t\Big) \ge C_k^{-1} P\Big(\Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le l} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, X_{i_2}^{(j_2)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge C_k t\Big).$$
Proof: Let $\{\delta_r\}$, $r = 1, \ldots, l$, be a sequence of random variables for which $P(\delta_r = 1) = \frac{1}{l}$ and $P(\delta_r = 0) = 1 - \frac{1}{l}$, and $\sum_{r=1}^{l} \delta_r = 1$. Set $\epsilon_r = \delta_r - \frac{1}{l}$ for $r = 1, \ldots, l$. Then it is easy to see that there exists $\sigma_l > 0$ depending only upon $l$ such that, for any real number $x_0$ and any sequence of real constants $\{a_r\}$,

$$(8) \qquad \Big\| x_0 + \sum_{r=1}^{l} a_r \epsilon_r \Big\|_4 \le \sigma_l^{-1} \Big\| x_0 + \sum_{r=1}^{l} a_r \epsilon_r \Big\|_2.$$

One can also use the results of Section 6.9 of Kwapien and Woyczynski (1992) (pp. 180-181) to assert this, since the $\epsilon$'s satisfy the conditions 1 through 3 stated there.
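The distribution of $(\delta_1, \ldots, \delta_l)$ places a single 1 in a uniformly chosen coordinate, so its first moments can be computed exactly by enumerating the $l$ outcomes. A minimal check ($l = 4$ is an arbitrary choice):

```python
from fractions import Fraction

l = 4  # number of coordinates (arbitrary illustrative choice)

# (delta_1, ..., delta_l) puts a single 1 in a uniformly chosen coordinate;
# enumerate its l equally likely outcomes exactly.
outcomes = [[1 if r == pos else 0 for r in range(l)] for pos in range(l)]
p = Fraction(1, l)

assert all(sum(d) == 1 for d in outcomes)       # sum_r delta_r = 1 always
for r in range(l):
    E_delta = sum(p * d[r] for d in outcomes)
    assert E_delta == Fraction(1, l)            # P(delta_r = 1) = 1/l
    E_eps = sum(p * (d[r] - Fraction(1, l)) for d in outcomes)
    assert E_eps == 0                           # eps_r = delta_r - 1/l is centered
```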

Let $\{(\delta_{i1}, \ldots, \delta_{il}), i = 1, \ldots, n\}$ be $n$ independent copies of $(\delta_1, \ldots, \delta_l)$. As before, we define

$$(9) \qquad \epsilon_{ij} = \delta_{ij} - \frac{1}{l}.$$

Since the vectors $E_i = (\epsilon_{i1}, \ldots, \epsilon_{il})$, $i = 1, \ldots, n$, are independent, by an argument given in Kwapien and Szulga (1991), for all constants $x_0, a_{ij}$ in $\mathbb{R}$,

$$(10) \qquad \Big\| x_0 + \sum_{i=1}^{n} \sum_{r=1}^{l} a_{ir} \epsilon_{ir} \Big\|_4 \le \sigma_l^{-1} \Big\| x_0 + \sum_{i=1}^{n} \sum_{r=1}^{l} a_{ir} \epsilon_{ir} \Big\|_2,$$

and, recentering, we obtain

$$(11) \qquad \Big\| x_0 + \sum_{i=1}^{n} \sum_{r=1}^{l} a_{ir} \delta_{ir} \Big\|_4 \le \sigma_l^{-1} \Big\| x_0 + \sum_{i=1}^{n} \sum_{r=1}^{l} a_{ir} \delta_{ir} \Big\|_2.$$

Next we use the sequence $E_i$, $i = 1, \ldots, n$, in defining the analogue of the $Z$'s used in our proof of the upper bound. For each $i$, let $Z_i = X_i^{(j)}$ if $\delta_{ij} = 1$. Then $\{Z_i, i = 1, \ldots, n\}$ has the same joint distribution as $\{X_i^{(1)}, i = 1, \ldots, n\}$, and

$$f_{i_1 \cdots i_k}(Z_{i_1}, \ldots, Z_{i_k}) = \sum_{1 \le j_1, j_2, \ldots, j_k \le l} \delta_{i_1 j_1} \cdots \delta_{i_k j_k} \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big).$$

The fact that $E \delta_{i_r j_r} = \frac{1}{l}$ for all $i_r, j_r$ gives

$$E\big(f_{i_1 \cdots i_k}(Z_{i_1}, \ldots, Z_{i_k}) \,\big|\, \mathcal{G}_l\big) = \Big(\frac{1}{l}\Big)^k \sum_{1 \le j_1, \ldots, j_k \le l} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big),$$

where $\mathcal{G}_l = \sigma((X_i^{(1)}, \ldots, X_i^{(l)}), i = 1, \ldots, n)$. Let

$$U_n = \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 i_2 \cdots i_k}(Z_{i_1}, \ldots, Z_{i_k}) = \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le l} \delta_{i_1 j_1} \cdots \delta_{i_k j_k} \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big).$$

Let $D_i = (\delta_{i1}, \ldots, \delta_{il})$. Since the $D$'s are independent of the $X$'s, if we let

$$g_{i_1 \cdots i_k}(D_{i_1}, \ldots, D_{i_k}) = \sum_{1 \le j_1, \ldots, j_k \le l} \delta_{i_1 j_1} \cdots \delta_{i_k j_k} \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big),$$

then, since $f_{i_1 \cdots i_k}(X_{i_1}, \ldots, X_{i_k}) = f_{i_{\pi(1)} \cdots i_{\pi(k)}}(X_{i_{\pi(1)}}, \ldots, X_{i_{\pi(k)}})$,

we have that $g_{i_1 \cdots i_k}(D_{i_1}, \ldots, D_{i_k}) = g_{i_{\pi(1)} \cdots i_{\pi(k)}}(D_{i_{\pi(1)}}, \ldots, D_{i_{\pi(k)}})$. Therefore, the two-sided decoupling inequality in de la Peña (1992) can be applied and, for every convex increasing function $\Phi$, every $\mathcal{G}_l$-measurable function $T$, and $k$ independent copies $D_i^{(r)}$, $r = 1, \ldots, k$, of $D_i$, there exist numerical constants $A_k, B_k$ so that

$$E\Big(\Phi\Big(A_k \Big\| T + \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} g_{i_1 \cdots i_k}\big(D_{i_1}^{(1)}, \ldots, D_{i_k}^{(k)}\big) \Big\|\Big) \,\Big|\, \mathcal{G}_l\Big) \le E\Big(\Phi\Big(\Big\| T + \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} g_{i_1 \cdots i_k}(D_{i_1}, \ldots, D_{i_k}) \Big\|\Big) \,\Big|\, \mathcal{G}_l\Big)$$
$$\le E\Big(\Phi\Big(B_k \Big\| T + \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} g_{i_1 \cdots i_k}\big(D_{i_1}^{(1)}, \ldots, D_{i_k}^{(k)}\big) \Big\|\Big) \,\Big|\, \mathcal{G}_l\Big).$$

This result with (11) shows that, conditionally on $\mathcal{G}_l$,

$$(12) \qquad \|U_n - T_n\|_4 \le \sigma_l^{-k} \frac{B_k}{A_k} \|U_n - T_n\|_2,$$

where

$$T_n = E(U_n \,|\, \mathcal{G}_l) = \Big(\frac{1}{l}\Big)^k \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le l} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, X_{i_2}^{(j_2)}, \ldots, X_{i_k}^{(j_k)}\big).$$

(See also the proofs of Lemma 2 and of Lemma 6.5.1 of Kwapien and Woyczynski (1992).) Thus we have that

$$(13) \qquad P(\|U_n\| \ge \|T_n\| \,|\, \mathcal{G}_l) \ge c_k^{-1}.$$

This follows from the use of (12) and Proposition 1 with $a = T_n$ and $Y = U_n - T_n$. We also use the fact that, for any random variable $\xi$ and positive constant $c$, $\|\xi\|_4 \le c\|\xi\|_2$ implies that $\|\xi\|_2 \le c^2\|\xi\|_1$ (see also the proof of Lemma 2 for the approach used to transfer the problem from one on Banach space valued random variables to one on real valued ones). Integrating (13) over the set $\{\|T_n\| \ge t\}$ we get

$$P\Big(\Big\| \sum_{1 \le i_1 \ne i_2 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, X_{i_2}^{(1)}, \ldots, X_{i_k}^{(1)}\big) \Big\| \ge t\Big) = P\Big(\Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}(Z_{i_1}, Z_{i_2}, \ldots, Z_{i_k}) \Big\| \ge t\Big)$$
$$\ge c_k^{-1} P\Big(C_k \Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le l} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, X_{i_2}^{(j_2)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge t\Big),$$

and Lemma 3 is proved.
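Proposition 1, which drives (13), reduces on the line to the bound $P(\xi \ge 0) \ge \frac{1}{4}(E|\xi|)^2 / E\xi^2$ for centered $\xi$. This can be checked exactly on a small discrete distribution; the values below are an arbitrary illustrative choice.

```python
from fractions import Fraction

# A centered random variable xi taking finitely many values (arbitrary choice).
vals = [Fraction(v) for v in (-1, -1, 2)]      # mean zero: (-1 - 1 + 2)/3 = 0
p = Fraction(1, len(vals))
assert sum(p * v for v in vals) == 0

E_abs = sum(p * abs(v) for v in vals)          # E|xi|
E_sq = sum(p * v * v for v in vals)            # E xi^2
lower = E_abs ** 2 / (4 * E_sq)                # (E|xi|)^2 / (4 E xi^2)

prob = sum(p for v in vals if v >= 0)          # P(xi >= 0)
assert prob >= lower
```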

The end of the proof of the lower bound follows by using induction and the iterative procedure introduced to obtain the proof of the lower bound of the multivariate decoupling inequality in de la Peña (1992). We give a different expression of the same proof, motivated by ideas from de la Peña, Montgomery-Smith and Szulga (1992). We will use $S_k$ to denote the set of permutations of $\{1, \ldots, k\}$. The Mazur-Orlicz formula tells us that, for any $1 \le j_1, \ldots, j_k \le k$,

$$\sum_{0 \le \delta_1, \ldots, \delta_k \le 1} (-1)^{k - \delta_1 - \cdots - \delta_k} \, \delta_{j_1} \cdots \delta_{j_k}$$

is $0$ unless $(j_1, \ldots, j_k)$ is a permutation of $(1, \ldots, k)$, in which case it is $1$. Hence

$$\sum_{\pi \in S_k} f_{i_1 \cdots i_k}\big(X_{i_1}^{(\pi(1))}, \ldots, X_{i_k}^{(\pi(k))}\big) = \sum_{0 \le \delta_1, \ldots, \delta_k \le 1} (-1)^{k - \delta_1 - \cdots - \delta_k} \sum_{1 \le j_1, \ldots, j_k \le k} \delta_{j_1} \cdots \delta_{j_k} \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big).$$

By the symmetry properties of $f$,

$$\sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(k)}\big) = \frac{1}{k!} \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{0 \le \delta_1, \ldots, \delta_k \le 1} (-1)^{k - \delta_1 - \cdots - \delta_k} \sum_{1 \le j_1, \ldots, j_k \le k} \delta_{j_1} \cdots \delta_{j_k} \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big).$$

Therefore,

$$P\Big(\Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} f_{i_1 \cdots i_k}\big(X_{i_1}^{(1)}, \ldots, X_{i_k}^{(k)}\big) \Big\| \ge t\Big) \le \sum_{0 \le \delta_1, \ldots, \delta_k \le 1} P\Big(\Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le k} \delta_{j_1} \cdots \delta_{j_k} \, f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge k!\, t / 2^k\Big)$$
$$= \sum_{l=1}^{k} \binom{k}{l} P\Big(\Big\| \sum_{1 \le i_1 \ne \cdots \ne i_k \le n} \; \sum_{1 \le j_1, \ldots, j_k \le l} f_{i_1 \cdots i_k}\big(X_{i_1}^{(j_1)}, \ldots, X_{i_k}^{(j_k)}\big) \Big\| \ge k!\, t / 2^k\Big),$$

and this combined with Lemma 3 is sufficient to show the result.
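The Mazur-Orlicz coefficient identity used above is a finite combinatorial fact, so it can be verified exhaustively for small $k$; the sketch below checks every tuple $(j_1, \ldots, j_k)$ for $k = 3$.

```python
from itertools import product, permutations

k = 3
perms = set(permutations(range(1, k + 1)))

coeffs = {}
for js in product(range(1, k + 1), repeat=k):
    # Mazur-Orlicz coefficient: sum over delta in {0,1}^k of
    # (-1)^(k - delta_1 - ... - delta_k) * delta_{j_1} * ... * delta_{j_k}.
    coeffs[js] = sum((-1) ** (k - sum(delta)) * int(all(delta[j - 1] for j in js))
                     for delta in product((0, 1), repeat=k))

# The coefficient is 1 exactly on permutations of (1, ..., k) and 0 elsewhere.
assert all(c == (1 if js in perms else 0) for js, c in coeffs.items())
```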

References

1. de la Peña, V. H. (1992). Decoupling and Khintchine's inequalities for U-statistics. Ann. Probab. 20 (4), 1877-1892.
2. de la Peña, V. H., Montgomery-Smith, S. J. and Szulga, J. (1992). Contraction and decoupling inequalities for multilinear forms and U-statistics. Preprint.
3. de la Peña, V. H. and Montgomery-Smith, S. J. (1993). Bounds on the tail probability of U-statistics and quadratic forms. Preprint.
4. Giné, E. and Zinn, J. (1992). A remark on convergence in distribution of U-statistics. Preprint.
5. Kwapien, S. and Szulga, J. (1991). Hypercontraction methods in moment inequalities for series of independent random variables in normed spaces. Ann. Probab. 19 (1), 369-379.
6. Kwapien, S. and Woyczynski, W. (1992). Random Series and Stochastic Integrals: Single and Multiple. Birkhäuser.
7. McConnell, T. and Taqqu, M. (1986). Decoupling inequalities for multilinear forms in independent symmetric random variables. Ann. Probab. 14 (3), 943-954.

