
I am studying this sequence with some help from a small computer program which I wrote.

$$\left(1 + \frac{k}{n}\right)^{n+1}$$

I am studying it for different integer values of $k$ (positive and negative).


I have so far built this theory:

1) For $k < 0$ the sequence is increasing after a certain value of $n$.

2) For $k = 1, 2$ the sequence is decreasing after a certain value of $n$.

3) For $k \ge 3$ the sequence is increasing after a certain value of $n$.

I know that its limit is $e^k$ but I want to know about its monotonicity.

Can anyone prove or refute this hypothesis?

I am looking for a simple solution based only on mathematical induction and basic facts about sequences. Assume that I know all the standard theorems about sequences, monotonicity, convergence, etc., and that I have proved the following somewhat related statement.

**For all integers k the sequence $(1 + \frac{k}{n})^{n}$ is increasing
(at least for all sufficiently large n) and its limit is $e^k$**.

I also know Bernoulli’s inequality (which seems useful for such problems).
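The three cases above can be probed numerically along the lines of the small program mentioned earlier. This is only a sketch (the helper name `eventually` is made up, and checking a finite window of $n$ of course proves nothing), but it is consistent with the hypothesis:

```python
import math

def a(k, n):
    # The sequence under study: (1 + k/n)^(n+1)
    return (1 + k / n) ** (n + 1)

def eventually(k, n0=100, n1=200):
    """Return 'increasing' or 'decreasing' if the finite window
    [n0, n1] is monotone, else 'mixed'.  A rough numeric probe only."""
    diffs = [a(k, n + 1) - a(k, n) for n in range(n0, n1)]
    if all(d > 0 for d in diffs):
        return "increasing"
    if all(d < 0 for d in diffs):
        return "decreasing"
    return "mixed"

for k in (-2, -1, 1, 2, 3, 4):
    print(k, eventually(k))
```

For $k = 2$ the consecutive differences are already tiny at $n \approx 100$ (the limit $e^2$ is nearly reached), so the window must start well past any non-monotone initial segment.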

Earlier today I built another hypothesis which now seems wrong to me.



We define $f(x)=(1+k/x)^{x+1}$ and want to show that this is eventually decreasing if $k=1,2$, while it is eventually increasing if $k\ge 3.$ Writing $\exp(a)$ for $e^a,$ we have $f(x)=\exp(g(x))$ where

$$g(x)=(x+1)\ln(1+k/x).$$

Note that since $\exp(u)$ is strictly increasing, we know that $f$ is increasing/decreasing iff $g$ is so.

Differentiating and simplifying gives

$$g'(x)=\ln(1+k/x)-\frac{k(x+1)}{x(x+k)}.\tag{1}$$
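As a quick sanity check on this simplification (not part of the argument), the closed form can be compared against a central-difference approximation of $g'$; the helper names below are throwaway:

```python
import math

def g(x, k):
    # g(x) = (x+1) * ln(1 + k/x), so that f = exp(g)
    return (x + 1) * math.log(1 + k / x)

def g_prime_formula(x, k):
    # The simplified derivative (1): ln(1 + k/x) - k(x+1)/(x(x+k))
    return math.log(1 + k / x) - k * (x + 1) / (x * (x + k))

def g_prime_numeric(x, k, h=1e-6):
    # Central difference as an independent check of the algebra
    return (g(x + h, k) - g(x - h, k)) / (2 * h)

for k in (-1, 1, 2, 3):
    for x in (10.0, 50.0, 200.0):
        assert abs(g_prime_formula(x, k) - g_prime_numeric(x, k)) < 1e-6
```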

Now we can apply the series $\ln(1+t)=t-t^2/2+t^3/3-\cdots$ with $t=k/x.$ Since we are only interested in large $x$, eventually $k/x<1$, so the log series converges. Furthermore, for such $x$ the log series is an alternating series whose terms have strictly decreasing absolute values. This means that "tail ends" of the log series have a definite sign: a tail starting with a positive term has a positive sum, and a tail starting with a negative term has a negative sum.

Define $h(k)$ as the fraction $k(x+1)/[x(x+k)]$ subtracted from the log term on the right-hand side of $(1).$ First assume $k\ge 3$; taking the first two terms of the log series, namely $k/x-(1/2)(k/x)^2,$ and subtracting $h(k)$ gives

$$\frac{k[(k-2)x-k^2]}{2x^2(x+k)}.$$

Since $k \ge 3$ this is eventually positive. Restoring the rest of the log series, the tail begins with the positive term $+(1/3)(k/x)^3,$ so the total sum remains positive, and we have shown that eventually $g'(x)>0$ in the case $k \ge 3.$
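This algebra can be double-checked exactly with rational arithmetic (a sketch only; the helper names are made up). The identity holds at every sample point, and the sign is positive once $(k-2)x > k^2$:

```python
from fractions import Fraction as F

def two_terms_minus_h(k, x):
    # First two log-series terms at t = k/x, minus h(k) = k(x+1)/(x(x+k))
    t = F(k) / x
    return t - t**2 / 2 - F(k) * (x + 1) / (x * (x + k))

def claimed(k, x):
    # The displayed closed form: k[(k-2)x - k^2] / (2x^2(x+k))
    return F(k) * ((k - 2) * x - k**2) / (2 * x**2 * (x + k))

for k in (3, 4, 5):
    for n in range(20, 60):
        x = F(n)
        assert two_terms_minus_h(k, x) == claimed(k, x)
        assert claimed(k, x) > 0   # eventually positive, as stated
```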

In the two cases $k=1,2$ we need the first *three* terms of the log series, and then we subtract $h(k)$ as above, this time wanting the result to be eventually negative. Now the tail begins with the negative term $-(1/4)(k/x)^4,$ so the tail makes the total series negative, giving $g'(x)<0$ for large $x.$

What we find is that for $k=1$ the first three terms with $h$ subtracted come to $(2-3x)/(6x^3),$ while for $k=2$ they come to $-4(x-4)/(3x^3(x+2))$; each is eventually negative, as desired, so $g'(x)<0$ for large $x.$
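These two closed forms can likewise be checked exactly at sample points (a sketch with a throwaway helper name, using exact rationals so there is no rounding):

```python
from fractions import Fraction as F

def three_terms_minus_h(k, x):
    # First three log-series terms at t = k/x, minus h(k) = k(x+1)/(x(x+k))
    t = F(k) / x
    return t - t**2 / 2 + t**3 / 3 - F(k) * (x + 1) / (x * (x + k))

for n in range(5, 40):
    x = F(n)
    # k = 1: the closed form (2 - 3x)/(6x^3)
    assert three_terms_minus_h(1, x) == (2 - 3 * x) / (6 * x**3)
    # k = 2: the closed form -4(x - 4)/(3x^3(x + 2))
    assert three_terms_minus_h(2, x) == F(-4) * (x - 4) / (3 * x**3 * (x + 2))
```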
