
Let $X_1, \dots, X_n$ be i.i.d. random variables with mean $\mu$ and variance $\sigma^2$.

Is there a method that can be used to compute $\mathbb{E}[1/(X_1+\cdots+X_n)]$?


Assuming the expectation exists, and further assuming the $X_i$ are positive random variables, we can use the identity $1/a = \int_0^\infty \mathrm{e}^{-ta}\,\mathrm{d}t$ for $a > 0$:

$$

\mathbb{E}\left(\frac{1}{X_1+\cdots+X_n}\right) = \mathbb{E}\left( \int_0^\infty \exp\left(-t (X_1+\cdots+X_n) \right) \mathrm{d}t \right)

$$

Interchanging the integral over $t$ with the expectation (justified by Tonelli's theorem, since the integrand is nonnegative):

$$

\mathbb{E}\left( \int_0^\infty \exp\left(-t (X_1+\cdots+X_n) \right) \mathrm{d}t \right) = \int_0^\infty \mathbb{E}\left(\exp\left(-t (X_1+\cdots+X_n) \right) \right) \mathrm{d}t

$$

Using the i.i.d. property, the expectation factors into a product of $n$ identical terms:

$$

\int_0^\infty \mathbb{E}\left(\exp\left(-t (X_1+\cdots+X_n) \right) \right) \mathrm{d}t = \int_0^\infty \mathbb{E}\left(\exp\left(-t X \right) \right)^n \mathrm{d}t

$$

So if you know the Laplace transform $\mathcal{L}_X(t) = \mathbb{E}\left(\mathrm{e}^{-t X} \right)$ of a single $X_i$, we have:

$$

\mathbb{E}\left(\frac{1}{X_1+\cdots+X_n}\right) = \int_0^\infty \mathcal{L}_X(t)^n \mathrm{d} t

$$
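As a concrete check of this formula: for $X \sim \mathrm{Exponential}(\lambda)$ the Laplace transform is $\mathcal{L}_X(t) = \lambda/(\lambda+t)$, and the integral evaluates in closed form to $\lambda/(n-1)$ for $n \ge 2$ (consistent with the sum being $\mathrm{Gamma}(n, \lambda)$). The sketch below, with function names of my own choosing, approximates the integral numerically and recovers this value:

```python
def laplace_exponential(t, lam=1.0):
    """Laplace transform E[exp(-t X)] of an Exponential(lam) variable."""
    return lam / (lam + t)

def expected_reciprocal_sum(n, lam=1.0, upper=200.0, steps=200_000):
    """Approximate E[1/(X_1+...+X_n)] = integral_0^inf L_X(t)^n dt
    via the trapezoidal rule on [0, upper]; the integrand decays
    like t^(-n), so the truncation error is negligible for n >= 2."""
    h = upper / steps
    total = 0.5 * (laplace_exponential(0.0, lam) ** n
                   + laplace_exponential(upper, lam) ** n)
    for k in range(1, steps):
        total += laplace_exponential(k * h, lam) ** n
    return total * h

# For n = 5 i.i.d. Exponential(1) variables, the exact value is
# lam/(n-1) = 1/4.
print(expected_reciprocal_sum(n=5))  # close to 0.25
```

The same numerical approach works for any distribution whose Laplace transform is available, replacing `laplace_exponential` accordingly.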

As others have commented, you need more information in general. However, you do have an estimate from convexity: assuming the random variables are positive, Jensen's inequality applied to the convex function $x \mapsto 1/x$ gives

$$E\left[ \frac{1}{X_1 + \cdots + X_n} \right] \ge \frac{1}{E[X_1 + \cdots + X_n]} = \frac{1}{n\mu}.$$
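A quick Monte Carlo sketch of this bound, using exponential variates purely as an illustrative choice (for $\mathrm{Exponential}(1)$ with $n = 5$, the exact expectation is $1/(n-1) = 0.25$, strictly above the bound $1/(n\mu) = 0.2$):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

n, mu = 5, 1.0       # n i.i.d. Exponential(rate 1/mu) variables, each with mean mu
trials = 200_000

# Monte Carlo estimate of E[1/(X_1 + ... + X_n)]
estimate = sum(
    1.0 / sum(random.expovariate(1.0 / mu) for _ in range(n))
    for _ in range(trials)
) / trials

jensen_bound = 1.0 / (n * mu)  # 1/E[X_1 + ... + X_n]

print(estimate, jensen_bound)  # the estimate exceeds the bound
```

The gap between the estimate and the bound reflects how far $1/x$ deviates from linearity over the bulk of the distribution of the sum.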
