Question

How to prove E(|X+Y|) is at least E(|X|) for i.i.d. random variables

Original question: 2.3 $E(|X+Y|)$. Let $X,Y$ be independent real random variables with the same distribution and finite expectation.

  1. We want to show that $E(|X+Y|)\ge E(|X|)$. (a) First treat the case $X\in\{-1,1\}$ using $p=P(X=1),\ q=1-p$. General case: Let $a=\sum_{x\ge 0}xP(X=x),\ b=\sum_{x<0}|x|P(X=x),\ p=P(X\ge 0),\ q=1-p$. (b) Express $\sum_{x,y\ge 0}|x+y|P(X=x)P(Y=y)$ using $a,b,p,q$. (c) Show $\sum_{x\ge 0,y<0}|x+y|P(X=x)P(Y=y)\ge aq-bp$. (d) Deduce $E(|X+Y|)\ge 2(ap+bq+|aq-bp|)$. (e) Conclude $E(|X+Y|)\ge E(|X|)$.

Expert Verified Solution


Expert intro: This proof is a nice example of how absolute values and independence interact. The trick is not to attack everything at once, but to break the sample space into sign cases and keep track of the contributions cleanly.

Detailed walkthrough

We want to prove

$$E(|X+Y|)\ge E(|X|)$$

for independent real random variables $X$ and $Y$ with the same distribution and finite expectation.


1) The two-point case $X\in\{-1,1\}$

Let

$$p=P(X=1), \qquad q=1-p.$$

Since $X$ and $Y$ are i.i.d., the possible values of $X+Y$ are $2$, $0$, and $-2$, with probabilities $p^2$, $2pq$, and $q^2$ respectively. A direct computation gives

$$E(|X+Y|)=2p^2+2q^2=2(p^2+q^2).$$

Also,

$$E(|X|)=1.$$

So we need to show

$$2(p^2+q^2)\ge 1.$$

But

$$p^2+q^2=(p+q)^2-2pq=1-2pq,$$

hence

$$2(p^2+q^2)=2-4pq\ge 1,$$

because $pq\le \tfrac14$. This proves the special case.
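The two-point case is easy to check numerically. The sketch below (the helper name is mine) verifies $E(|X+Y|)=2(p^2+q^2)\ge 1$ across a grid of $p$ values:

```python
from itertools import product

def expected_abs_sum_two_point(p):
    """E(|X+Y|) for i.i.d. X, Y on {-1, 1} with P(X=1) = p."""
    probs = {1: p, -1: 1 - p}
    return sum(abs(x + y) * probs[x] * probs[y]
               for x, y in product(probs, repeat=2))

# Check E(|X+Y|) = 2(p^2 + q^2) >= 1 = E(|X|) on a grid of p values.
for k in range(101):
    p = k / 100
    e = expected_abs_sum_two_point(p)
    assert abs(e - 2 * (p**2 + (1 - p)**2)) < 1e-12
    assert e >= 1 - 1e-12
```

The minimum is attained at $p=q=\tfrac12$, where $E(|X+Y|)=1=E(|X|)$.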


2) General case

Define

$$a=\sum_{x\ge 0} xP(X=x), \qquad b=\sum_{x<0}|x|P(X=x), \qquad p=P(X\ge 0), \qquad q=1-p.$$

Then

$$E(|X|)=a+b.$$
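For concreteness, here is a quick Python check of the identity $E(|X|)=a+b$; the particular discrete distribution is made up for illustration:

```python
# A hypothetical discrete distribution: value -> probability.
dist = {-2: 0.2, -1: 0.1, 0: 0.3, 1: 0.25, 3: 0.15}

a = sum(x * pr for x, pr in dist.items() if x >= 0)   # a = sum_{x>=0} x P(X=x)
b = sum(-x * pr for x, pr in dist.items() if x < 0)   # b = sum_{x<0} |x| P(X=x)
p = sum(pr for x, pr in dist.items() if x >= 0)       # p = P(X >= 0)
q = 1 - p

E_abs_X = sum(abs(x) * pr for x, pr in dist.items())
assert abs(E_abs_X - (a + b)) < 1e-12                 # E(|X|) = a + b
```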

Now split the expectation of X+Y|X+Y| into sign regions.

(b) The nonnegative–nonnegative part

If $x\ge 0$ and $y\ge 0$, then $|x+y|=x+y$, so

$$\sum_{x,y\ge 0}|x+y|P(X=x)P(Y=y) = \sum_{x,y\ge 0}(x+y)P(X=x)P(Y=y).$$

Using independence, this factors as

$$\sum_{x\ge 0}xP(X=x)\sum_{y\ge 0}P(Y=y) + \sum_{x\ge 0}P(X=x)\sum_{y\ge 0}yP(Y=y) = ap + pa = 2ap.$$

The symmetric region $x<0,\ y<0$ contributes $2bq$ in the same way, since there $|x+y|=|x|+|y|$.

(c) The mixed-sign part

For $x\ge 0$ and $y<0$, we have $|x+y|\ge x-|y|$. Therefore

$$\sum_{x\ge 0,y<0}|x+y|P(X=x)P(Y=y) \ge \sum_{x\ge 0,y<0}(x-|y|)P(X=x)P(Y=y).$$

By independence, the right-hand side equals

$$aq-bp.$$

Since also $|x+y|\ge |y|-x$, the same sum is bounded below by $bp-aq$ as well, hence by $|aq-bp|$.

(d) Combine the regions

By symmetry, the analogous estimate holds on the region $x<0,\ y\ge 0$. Adding the contributions of the four sign regions, we obtain

$$E(|X+Y|)\ge 2ap+2bq+2|aq-bp| = 2(ap+bq+|aq-bp|).$$

(e) Conclude the inequality

Recall that $E(|X|)=a+b$. The bound in (d) is symmetric under exchanging $(a,p)\leftrightarrow(b,q)$, so we may assume $aq\ge bp$. Using $p+q=1$,

$$2(ap+bq+|aq-bp|)=2(ap+aq)+2b(q-p)=2a+2b(q-p),$$

so it suffices to show $2a+2b(q-p)\ge a+b$, i.e. $a\ge b(4p-1)$. If $4p\le 1$ this is immediate. If $q=0$ then $P(X<0)=0$, hence $b=0$ and the claim is trivial. Otherwise $aq\ge bp$ gives $a\ge bp/q$, and $p/q\ge 4p-1$ is equivalent to $p+q\ge 4pq$, i.e. $4pq\le 1$, which holds because $pq\le\tfrac14$. Hence

$$E(|X+Y|)\ge 2(ap+bq+|aq-bp|)\ge a+b=E(|X|).$$

That is the desired result.

💡 Pitfall guide

The main trap is losing track of which terms belong to which sign region. If you try to expand $|x+y|$ everywhere without splitting into cases, the algebra gets messy fast. Another common issue is confusing $E(|X|)$ with $|E(X)|$. Those are not the same thing, and here the absolute value is inside the expectation for a reason.

🔄 Real-world variant

If $X$ and $Y$ were not identically distributed, the same inequality would need extra assumptions and may fail. If independence were removed, the decomposition into products like $P(X=x)P(Y=y)$ would no longer work, and the proof would need a different strategy.

🔍 Related terms

independent random variables, absolute value inequality, expectation
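As a final sanity check, the full chain $E(|X+Y|)\ge 2(ap+bq+|aq-bp|)\ge E(|X|)$ can be verified numerically on random discrete distributions; the support values below are arbitrary:

```python
import random
from itertools import product

random.seed(0)

def random_dist(values=(-3, -1, 0, 2, 5)):
    """A random probability distribution on a small set of values."""
    w = [random.random() for _ in values]
    s = sum(w)
    return {v: wi / s for v, wi in zip(values, w)}

for _ in range(1000):
    dist = random_dist()
    a = sum(x * pr for x, pr in dist.items() if x >= 0)
    b = sum(-x * pr for x, pr in dist.items() if x < 0)
    p = sum(pr for x, pr in dist.items() if x >= 0)
    q = 1 - p

    # E(|X+Y|) for i.i.d. X, Y with this distribution.
    E_abs_sum = sum(abs(x + y) * dist[x] * dist[y]
                    for x, y in product(dist, repeat=2))
    bound = 2 * (a * p + b * q + abs(a * q - b * p))
    E_abs_X = a + b

    # E(|X+Y|) >= 2(ap + bq + |aq - bp|) >= E(|X|)
    assert E_abs_sum >= bound - 1e-9
    assert bound >= E_abs_X - 1e-9
```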

FAQ

Why split the proof into sign cases?

Because $|X+Y|$ behaves differently depending on the signs of $X$ and $Y$. Splitting into sign regions makes each contribution easy to bound.

Does this result require X and Y to be independent?

Yes, independence is used to factor probabilities into products and carry out the expectation calculation cleanly.
