Expectation of inverse of sum of random variables

Let $X_1, \dots, X_n$ be i.i.d. random variables with mean $\mu$ and variance $\sigma^2$.

Is there a method that can be used to compute $\mathbb{E}[1/(X_1+\cdots+X_n)]$?

Answers

Assuming the expectation exists, and further assuming the $X_i$ are positive random variables, use the identity $\frac{1}{x} = \int_0^\infty e^{-tx}\,\mathrm{d}t$ for $x > 0$:
$$
\mathbb{E}\left(\frac{1}{X_1+\cdots+X_n}\right) = \mathbb{E}\left( \int_0^\infty \exp\left(-t (X_1+\cdots+X_n) \right) \mathrm{d}t \right)
$$
Interchanging the integral over $t$ with the expectation (justified by Tonelli's theorem, since the integrand is non-negative):
$$
\mathbb{E}\left( \int_0^\infty \exp\left(-t (X_1+\cdots+X_n) \right) \mathrm{d}t \right) = \int_0^\infty \mathbb{E}\left(\exp\left(-t (X_1+\cdots+X_n) \right) \right) \mathrm{d}t
$$
Using the i.i.d. property, the expectation of the product factors into a product of identical expectations:
$$
\int_0^\infty \mathbb{E}\left(\exp\left(-t (X_1+\cdots+X_n) \right) \right) \mathrm{d}t = \int_0^\infty \left(\mathbb{E}\left[\exp\left(-t X_1 \right)\right]\right)^n \mathrm{d}t
$$
So if you know the Laplace transform $\mathcal{L}_X(t) = \mathbb{E}\left(\mathrm{e}^{-t X_1} \right)$, then:
$$
\mathbb{E}\left(\frac{1}{X_1+\cdots+X_n}\right) = \int_0^\infty \mathcal{L}_X(t)^n \mathrm{d} t
$$
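
For example (a quick check of the formula, not part of the original answer): if the $X_i$ are Exponential with rate $\lambda$, then $\mathcal{L}_X(t) = \lambda/(\lambda+t)$ and, for $n \ge 2$,
$$
\mathbb{E}\left(\frac{1}{X_1+\cdots+X_n}\right) = \int_0^\infty \left(\frac{\lambda}{\lambda+t}\right)^n \mathrm{d}t = \frac{\lambda}{n-1},
$$
which agrees with computing the expectation directly from the $\operatorname{Gamma}(n,\lambda)$ density of $S = X_1+\cdots+X_n$.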

As others have commented, you need more information to get an exact value in general. However, you do have an estimate: assuming the random variables are positive, Jensen's inequality applied to the convex function $x \mapsto 1/x$ gives
$$\mathbb{E}\left[ \frac{1}{X_1 + \cdots + X_n} \right] \ge \frac{1}{\mathbb{E}[X_1 + \cdots + X_n]} = \frac{1}{n\mu}$$
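
A minimal numerical sketch (not part of either answer) illustrating both the integral formula and the Jensen lower bound, assuming exponential $X_i$ with rate $\lambda$ so the closed form $\lambda/(n-1)$ is available for comparison; the parameter values below are placeholders:

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)

lam, n = 2.0, 5           # assumed example values: rate of each exponential, number of summands
mu = 1.0 / lam            # mean of each X_i

# Laplace transform of an Exponential(lam) variable: L_X(t) = lam / (lam + t)
laplace = lambda t: lam / (lam + t)

# E[1/(X_1 + ... + X_n)] via the integral of L_X(t)^n over (0, infinity)
integral, _ = quad(lambda t: laplace(t) ** n, 0, np.inf)

# Monte Carlo estimate of the same expectation
sums = rng.exponential(scale=1.0 / lam, size=(200_000, n)).sum(axis=1)
mc = np.mean(1.0 / sums)

print(f"integral formula   : {integral:.5f}")        # should be close to lam/(n-1) = 0.5
print(f"closed form        : {lam / (n - 1):.5f}")
print(f"Monte Carlo        : {mc:.5f}")
print(f"Jensen lower bound : {1.0 / (n * mu):.5f}")  # 1/(n*mu) = 0.4 <= 0.5
```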