If $0<a<1, 0<b<1$, $a+b=1$, then prove that $a^{2b}+ b^{2a} \le 1$

I have been really struggling with this problem … please help!
Let $a, b$ be real numbers. If $0<a<1$, $0<b<1$, and $a+b=1$, then prove that $a^{2b} + b^{2a} \le 1$.

What I have thought so far:
Without loss of generality we may assume $a \le b$, since $a^{2b} + b^{2a}$ is symmetric in $a$ and $b$; this gives $0 < a \le 1/2 \le b < 1$. But then I am stuck.
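One small thing I did check (just the equality cases, nothing deep): the bound is attained exactly at the symmetric point and approached at the ends of the range, so there is no slack to give away anywhere:
$$a=b=\tfrac{1}{2}:\qquad a^{2b}+b^{2a}=\left(\tfrac{1}{2}\right)^{1}+\left(\tfrac{1}{2}\right)^{1}=1,$$
$$a\to 0^{+},\ b\to 1^{-}:\qquad a^{2b}+b^{2a}\to 0+1=1.$$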
I also thought of using the method of Lagrange multipliers, but it produces very messy calculations; a sketch of where it gets stuck is below.
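For the record, here is roughly where that approach bogs down (my own sketch of the stationarity condition, not a solution): with $f(a,b)=a^{2b}+b^{2a}$ and the constraint $g(a,b)=a+b-1=0$, the condition $\nabla f=\lambda\nabla g$ reads
$$2b\,a^{2b-1}+2\,b^{2a}\ln b \;=\;\lambda\;=\;2a\,b^{2a-1}+2\,a^{2b}\ln a,$$
which holds at $a=b=\tfrac{1}{2}$ by symmetry, but the mixture of powers and logarithms makes it hard to rule out other critical points by hand.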

Any help is welcome 🙂
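For what it is worth, a rough numerical sweep (not a proof, of course; the grid resolution below is an arbitrary choice) never produced a value above $1$:

```python
import numpy as np

# Evaluate f(a) = a^(2b) + b^(2a) with b = 1 - a on a fine grid in (0, 1).
# Purely a sanity check of the claimed bound, not a proof.
a = np.linspace(1e-6, 1 - 1e-6, 200_001)   # arbitrary grid resolution
b = 1.0 - a
vals = a ** (2 * b) + b ** (2 * a)

print(vals.max())                        # largest value seen (approximately 1, near a = b = 1/2)
print(bool(np.all(vals <= 1 + 1e-12)))   # True: no grid point exceeds 1
```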

========================

Not having a good day with websites. I have downloaded what seems to be the source of the question, a 2009 paper by Vasile Cirtoaje that runs about 14 pages, and then a short answer in a four-page document by Yin Li, probably from the same time or not much later. The question was also posted on MO by a selfish guy who knew the status of the problem but was hoping for a better answer; a complete answer was given there in 2010 by fedja: https://mathoverflow.net/questions/17189/is-there-a-good-reason-why-a2b-b2a-1-when-ab1

I have both PDFs, by Cirtoaje and by Li; email me if you cannot find them yourself.

This is not a reasonable homework question, so I would like to know more of the story: what course it was assigned in, for example.

========================

Excerpts from Yin Li of Binzhou University, Shandong, 2009 or 2010. I believe he just spelled Jensen's incorrectly; see JENSEN, stated below for reference.

[image: excerpts from Yin Li's note]
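For readers who just want the reference: Jensen's inequality, in the weighted form such arguments typically invoke, says that for a convex function $\varphi$ and weights $w_1,\dots,w_n\ge 0$ with $\sum_i w_i=1$,
$$\varphi\!\left(\sum_{i=1}^{n}w_i x_i\right)\le\sum_{i=1}^{n}w_i\,\varphi(x_i),$$
with the inequality reversed when $\varphi$ is concave. How Li actually deploys it is in the excerpt and in his paper, which I am not reproducing here.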

========================