On the application of strong approximation to weak convergence of products of sums for dependent random variables

We present the application of the strong approximation theorems to the study of weak convergence of products of sums of positive random variables. We focus our attention on sequences of dependent random variables such as associated and mixing sequences.


Introduction
The study of weak convergence of products of sums of random variables originated in the paper of Arnold and Villaseñor [1], who considered asymptotic properties of sums of records and proved that
$$\frac{\sum_{k=1}^{n} \log S_k - n\log n + n}{\sqrt{2n}} \xrightarrow{d} N \quad \text{as } n \to \infty,$$
where $S_k$ is a partial sum of a sequence of i.i.d. random variables with the exponential distribution with mean equal to one. Here and in the sequel $N$ is a standard normal random variable. This result may be equivalently written as
$$\left( \frac{\prod_{k=1}^{n} S_k}{n!} \right)^{1/\sqrt{2n}} \xrightarrow{d} e^{N}.$$
The assumption of a particular distribution of the random variables was dropped by Rempała and Wesołowski [7], who proved the following general result.
Theorem 1 Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of i.i.d. positive and square integrable random variables. Let us introduce the following notation: $\mu = EX_1$, $\sigma^2 = \operatorname{Var}(X_1)$, $\gamma = \sigma/\mu$ and $S_n = X_1 + \cdots + X_n$. Then
$$\left( \frac{\prod_{k=1}^{n} S_k}{n!\,\mu^n} \right)^{1/(\gamma\sqrt{n})} \xrightarrow{d} e^{\sqrt{2}N}. \quad (1)$$

Remark 1 Equivalently, by taking logarithms, (1) may be written as
$$\frac{1}{\gamma\sqrt{n}} \sum_{k=1}^{n} \log\frac{S_k}{k\mu} \xrightarrow{d} \sqrt{2}N. \quad (2)$$

The above result has attracted the attention of many researchers, and several papers concerning related problems have appeared (cf. [2], [3], [4], [9] and [8]). It is a natural question whether the assumption of independence of the random variables in Theorem 1 may be relaxed. As far as we know, the only result in this direction was obtained by Liu and Lin [3], who considered $\varphi$-mixing sequences.

In this paper we present a general method of proving analogues of Theorem 1 for weakly dependent sequences. Our approach is based on the so-called strong approximation theorems (the strong invariance principle, the Hungarian construction; cf. [6]). Under appropriate conditions imposed on the dependence structure of a weakly dependent sequence of random variables, the partial sums of the sequence may be approximated by a Brownian motion, and results similar to Theorem 1 may be obtained using this approximation. In order to present a uniform approach, we shall assume that strong approximation is possible with a certain rate of accuracy, instead of imposing specific dependence structures. The particular cases will be discussed in Section 3, where we consider mixing sequences and associated sequences, which have recently been playing a very important role in various areas of applied mathematics, including mathematical physics. Associated random variables are sometimes called random variables satisfying the FKG inequalities, and they appear in Ising models of ferromagnets [5].
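As a numerical illustration of Theorem 1, the following Monte Carlo sketch (with hypothetical parameter choices, not taken from the paper) uses i.i.d. exponential variables with mean one, so that $\mu = 1$ and $\gamma = \sigma/\mu = 1$, and estimates the spread of the logarithmic form of the statistic; the sample standard deviation should be close to $\sqrt{2} \approx 1.414$.

```python
import math
import random
import statistics

def log_statistic(n, rng):
    """One realization of (1/(gamma*sqrt(n))) * sum_{k<=n} log(S_k/(k*mu))
    for i.i.d. Exp(1) variables, where mu = 1 and gamma = sigma/mu = 1."""
    s = 0.0
    total = 0.0
    for k in range(1, n + 1):
        s += rng.expovariate(1.0)   # S_k
        total += math.log(s / k)    # log(S_k / (k * mu))
    return total / math.sqrt(n)     # divide by gamma * sqrt(n), gamma = 1

rng = random.Random(12345)
samples = [log_statistic(2000, rng) for _ in range(400)]
# Theorem 1 predicts convergence to sqrt(2)*N, i.e. standard deviation sqrt(2).
print(round(statistics.mean(samples), 3), round(statistics.pstdev(samples), 3))
```

The sample mean drifts to zero only slowly (the bias of $\log(S_k/k)$ is of order $1/k$), while the spread stabilizes near $\sqrt{2}$; increasing `n` sharpens the agreement.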

Main results
We begin by introducing the main condition, which will be used in the sequel.
Condition 1 Let $(X_n)_{n\in\mathbb{N}}$ be a sequence of positive random variables (r.v.'s) with the same mean $EX_k = \mu$, such that these r.v.'s may be redefined, without changing their distribution, on some (possibly richer) probability space together with a Brownian motion $B(t)$ in such a way that
$$|S_n - \mu n - \sigma B(n)| \le \varepsilon_n(\omega), \quad (3)$$
where $\frac{\varepsilon_n(\omega)}{\sqrt{n}} \to 0$ almost surely as $n \to \infty$.
For independent random variables $\sigma^2 = \operatorname{Var}(X_k)$. In the case of dependent random variables usually $\sigma^2 = \lim_{n\to\infty} \frac{\operatorname{Var}(S_n)}{n}$, and in the stationary case
$$\sigma^2 = \operatorname{Var}(X_1) + 2\sum_{k=1}^{\infty} \operatorname{Cov}(X_1, X_{1+k}).$$

Remark 2 The condition (3) is satisfied if the r.v.'s are i.i.d. with $E|X_1|^p < \infty$ for some $p > 2$; in this case we even have $\frac{\varepsilon_n(\omega)}{n^{1/p}} \to 0$ almost surely as $n \to \infty$.
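To make the stationary formula concrete, here is a small exact computation for a hypothetical 1-dependent moving average $X_n = \varepsilon_n + \varepsilon_{n+1}$ with $(\varepsilon_n)$ i.i.d. of unit variance (an illustrative toy model, not from the paper); the long-run variance from the series formula matches the limit of $\operatorname{Var}(S_n)/n$.

```python
# Long-run variance for a stationary, 1-dependent toy sequence
# X_n = e_n + e_{n+1}, where (e_n) are i.i.d. with Var(e_n) = 1.
# Then Var(X_n) = 2, Cov(X_n, X_{n+1}) = 1, and higher-lag covariances vanish.
var_x1 = 2.0
cov = {1: 1.0}  # Cov(X_1, X_{1+k}) for k >= 1; zero for k >= 2

# Stationary formula: sigma^2 = Var(X_1) + 2 * sum_{k>=1} Cov(X_1, X_{1+k})
sigma2 = var_x1 + 2.0 * sum(cov.values())

# Direct check against Var(S_n)/n:
# S_n = e_1 + 2*e_2 + ... + 2*e_n + e_{n+1}, so Var(S_n) = 4n - 2.
def var_sn_over_n(n):
    return (4 * n - 2) / n

print(sigma2, var_sn_over_n(10**6))
```

Here $\sigma^2 = 4$ and $\operatorname{Var}(S_n)/n = 4 - 2/n \to 4$, so the two computations agree in the limit.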
Unfortunately, if we assume only the second moments to be finite, then we have only $\varepsilon_n(\omega) = o(\sqrt{n \log\log n})$ almost surely, which is not sufficient for our further considerations.
We shall show that the r.v.'s $\log\frac{S_n}{n\mu}$ may be approximated by $\frac{\sigma}{\mu} \cdot \frac{B(n)}{n}$.

Lemma 1 Under Condition 1 we have
$$\frac{\mu\sqrt{n}}{\sigma} \left| \log\frac{S_n}{n\mu} - \frac{\sigma B(n)}{\mu n} \right| \longrightarrow 0 \quad \text{almost surely as } n \to \infty. \quad (4)$$

Proof. By (3) and the law of the iterated logarithm for Brownian motion, $\frac{S_n - n\mu}{n\mu} \to 0$ almost surely, so for sufficiently large $n$ we may apply the elementary bound $|\log(1+x) - x| \le x^2$, valid for $|x| \le 1/2$, with $x = \frac{S_n - n\mu}{n\mu}$. Consequently,
$$\left| \log\frac{S_n}{n\mu} - \frac{\sigma B(n)}{\mu n} \right| \le \frac{|S_n - n\mu - \sigma B(n)|}{n\mu} + \left( \frac{S_n - n\mu}{n\mu} \right)^2 \le \frac{\varepsilon_n(\omega)}{n\mu} + \frac{2}{n^2\mu^2} \left( \varepsilon_n^2(\omega) + \sigma^2 B^2(n) \right),$$
by using the inequality $(a+b)^2 \le 2(a^2+b^2)$; thus
$$\frac{\mu\sqrt{n}}{\sigma} \left| \log\frac{S_n}{n\mu} - \frac{\sigma B(n)}{\mu n} \right| \le \frac{\varepsilon_n(\omega)}{\sigma\sqrt{n}} + \frac{2}{\sigma\mu n^{3/2}} \left( \varepsilon_n^2(\omega) + \sigma^2 B^2(n) \right) \longrightarrow 0 \quad \text{almost surely},$$
where we used (3) and the law of the iterated logarithm for Brownian motion (which gives $B^2(n) = O(n \log\log n)$ a.s.). Now we see that by (3) also $\frac{\varepsilon_n^2(\omega)}{n^{3/2}} \to 0$ almost surely, which completes the proof.

Theorem 2 Under Condition 1 we have:
$$\left( \frac{S_n}{n\mu} \right)^{\frac{\mu\sqrt{n}}{\sigma}} \xrightarrow{d} e^{N}. \quad (5)$$

Proof. By Lemma 1, $\frac{\mu\sqrt{n}}{\sigma} \log\frac{S_n}{n\mu}$ has the same limiting distribution as $\frac{B(n)}{\sqrt{n}}$, which is a standard normal random variable for every $n$; the conclusion follows by taking exponentials.
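The approximation of $\log(S_n/(n\mu))$ described above suggests that $\frac{\mu\sqrt{n}}{\sigma} \log\frac{S_n}{n\mu}$ is asymptotically standard normal. The following Monte Carlo sketch (hypothetical parameters, not from the paper) checks this for i.i.d. Exp(1) variables, for which $\mu = \sigma = 1$.

```python
import math
import random
import statistics

rng = random.Random(2024)

def normalized_log(n):
    # sqrt(n) * (mu/sigma) * log(S_n / (n*mu)), with mu = sigma = 1 for Exp(1)
    s = sum(rng.expovariate(1.0) for _ in range(n))
    return math.sqrt(n) * math.log(s / n)

samples = [normalized_log(5000) for _ in range(500)]
# The limit should be standard normal: mean near 0, standard deviation near 1.
print(round(statistics.mean(samples), 3), round(statistics.pstdev(samples), 3))
```

The residual bias is of order $1/\sqrt{n}$, so already at moderate `n` the empirical mean and standard deviation sit close to $0$ and $1$.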
Our main result concerning convergence of products of sums will be formulated in the following theorem.
Theorem 3 Under Condition 1 we have:
$$\left( \prod_{k=1}^{n} \frac{S_k}{k\mu} \right)^{\frac{\mu}{\sigma\sqrt{n}}} \xrightarrow{d} e^{\sqrt{2}N}. \quad (6)$$

Proof. By Lemma 1 we have
$$\frac{\mu}{\sigma\sqrt{n}} \sum_{k=1}^{n} \log\frac{S_k}{k\mu} - \frac{1}{\sqrt{n}} \sum_{k=1}^{n} \frac{B(k)}{k} \longrightarrow 0 \quad \text{almost surely as } n \to \infty,$$
and the conclusion follows by the same argument as in the proof of Theorem 2, since $\frac{1}{\sqrt{n}} \sum_{k=1}^{n} \frac{B(k)}{k}$ is a centered Gaussian random variable whose variance $2 - \frac{1}{n}\sum_{k=1}^{n}\frac{1}{k}$ converges to $2$.

Examples
Let us recall that random variables $X_1, \dots, X_n$ are associated if, for any coordinatewise nondecreasing functions $f, g: \mathbb{R}^n \to \mathbb{R}$,
$$\operatorname{Cov}(f(X_1, \dots, X_n), g(X_1, \dots, X_n)) \ge 0,$$
whenever this covariance exists. A sequence $(X_n)_{n\in\mathbb{N}}$ is called associated if its every finite subcollection is associated. Let us note that independent random variables are associated, increasing functions of associated variables are associated, and positively correlated Gaussian random variables as well as random variables with $MTP_2$ densities are associated. Furthermore, associated and uncorrelated random variables are independent. The following Cox–Grimmett coefficient is often used:
$$u(n) = \sup_{k \ge 1} \sum_{j: |j-k| \ge n} \operatorname{Cov}(X_j, X_k).$$
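As a toy computation of the Cox–Grimmett coefficient, consider again the hypothetical 1-dependent sequence $X_m = \varepsilon_m + \varepsilon_{m+1}$ with $(\varepsilon_m)$ i.i.d. of unit variance; it is associated, being an increasing function of independent variables, and its covariances vanish beyond lag 1, so $u(n)$ can be computed exactly.

```python
# Cox-Grimmett coefficient u(n) = sup_k sum_{j : |j-k| >= n} Cov(X_j, X_k)
# for the toy associated sequence X_m = e_m + e_{m+1}, (e_m) i.i.d.,
# Var(e_m) = 1 (hypothetical illustrative model).
# Covariance structure: Cov(X_j, X_k) = 2 if j == k, 1 if |j-k| == 1, else 0.
def cov(j, k):
    d = abs(j - k)
    return 2.0 if d == 0 else (1.0 if d == 1 else 0.0)

def u(n, window=50):
    # Covariances vanish beyond lag 1, so a finite window of indices
    # suffices to realize the supremum for this stationary sequence.
    best = 0.0
    for k in range(1, window + 1):
        tail = sum(cov(j, k) for j in range(1, window + 1) if abs(j - k) >= n)
        best = max(best, tail)
    return best

print(u(0), u(1), u(2))
```

Here $u(0) = 4$ coincides with the limiting variance $\operatorname{Var}(S_n)/n \to 4$ of this sequence, and $u(n) = 0$ for $n \ge 2$, so the coefficient decays as fast as one could ask.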
The following strong invariance principle was proved by Yu [10]: under appropriate moment and covariance decay conditions, we can redefine the sequence $(X_n)_{n\in\mathbb{N}}$, without changing its distribution, on a richer probability space together with a standard Brownian motion process $B(t)$, $t \ge 0$, such that, for some $\varepsilon > 0$,
$$S_n - B(\operatorname{Var}(S_n)) = O(n^{1/2-\varepsilon}) \quad \text{almost surely.} \quad (7)$$