A convex optimization problem over two vectors

I have the following optimization problem in the vectors $\mathbf{x} = (x_1,\ldots,x_n)$ and $\mathbf{y}=(y_1,\ldots,y_n)$, where every $x_i,y_i\in\mathbb{R}_{\geq 0}$:

$$\begin{array}{ll} \text{maximize} & \displaystyle\sum_{i=1}^n \left(\frac{y_i}{x_i}\right)^ux_i\\ \text{subject to} & \displaystyle\sum_{i=1}^n x_i = 1\\ & \displaystyle\sum_{i=1}^n y_i = 1\\ & \mathbf{a} \leq \mathbf{x} \leq \mathbf{b}\\ & \mathbf{c} \leq \mathbf{y} \leq \mathbf{d}\end{array}$$

where $u \in (0,1)$ and nonnegative vectors $\mathbf{a}$, $\mathbf{b}$, $\mathbf{c}$, $\mathbf{d}$ are given. My questions are:

  • Is there an efficient algorithm to solve this?
  • Have you seen this, or a very similar optimization problem, before?

Addendum: The following conditions are also known:
$$
\sum_{i=1}^n a_i<1,\ \sum_{i=1}^n c_i<1,\quad \mbox{and}\quad \sum_{i=1}^n b_i>1,\ \sum_{i=1}^n d_i>1
$$

Answer:

With strict inequalities $a<x<b$, $c<y<d$ the maximum would most likely not be attained, since the optimal solution typically lies on the boundary, with equality for some $x_i,y_i$. Is that critical for you?

Otherwise, the objective function can be rewritten as

$$\sum_{i=1}^n x_i^{1-u} y_i^u$$

since $\left(\frac{y_i}{x_i}\right)^u x_i = x_i^{1-u}y_i^u$, and this is concave for $u\in(0,1)$ and positive $x,y$ (each term is a weighted geometric mean of $x_i$ and $y_i$). The constraints are simple and the number of variables is moderate, so this is a medium-scale convex problem; I would expect most standard algorithms to perform well here, for example a primal-dual interior-point method or even a (quasi-)Newton method.
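As a minimal sketch of this direct approach, one can hand the (negated) objective to SciPy's SLSQP solver, which accepts the box bounds and the two equality constraints as is. The data `n`, `u`, `a`, `b`, `c`, `d` below are made-up placeholders chosen so that the feasibility conditions from the addendum hold.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data (assumptions, not from the original question),
# chosen so that sum(a) < 1 < sum(b) and sum(c) < 1 < sum(d).
n, u = 5, 0.3
a = np.full(n, 0.05); b = np.full(n, 0.5)
c = np.full(n, 0.05); d = np.full(n, 0.5)

def neg_obj(z):
    # z stacks x and y; we minimize the negative of the concave objective.
    x, y = z[:n], z[n:]
    return -np.sum(x**(1 - u) * y**u)

constraints = [
    {"type": "eq", "fun": lambda z: np.sum(z[:n]) - 1.0},  # sum x_i = 1
    {"type": "eq", "fun": lambda z: np.sum(z[n:]) - 1.0},  # sum y_i = 1
]
bounds = list(zip(a, b)) + list(zip(c, d))                 # a <= x <= b, c <= y <= d

# Feasible interior starting point: uniform distributions.
z0 = np.concatenate([np.full(n, 1.0 / n), np.full(n, 1.0 / n)])

res = minimize(neg_obj, z0, method="SLSQP", bounds=bounds, constraints=constraints)
x_opt, y_opt = res.x[:n], res.x[n:]
print(res.success, -res.fun)
```

Since the problem is concave with affine constraints, any local maximizer found this way is a global one.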

Another possibility is dual decomposition. If we introduce Lagrange multipliers $\lambda$ and $\mu$ for the equality constraints, the Lagrangian becomes
\begin{align}
L(x,y,\lambda,\mu)&=\sum_{i=1}^nx_i^{1-u}y_i^u+\lambda\left(\sum_{i=1}^nx_i-1\right)+\mu\left(\sum_{i=1}^ny_i-1\right)\\
&=\sum_{i=1}^n(x_i^{1-u}y_i^u+\lambda x_i+\mu y_i)-\lambda-\mu,
\end{align}
thus, to evaluate the dual function at given $\lambda,\mu$ one has to solve $n$ independent small problems in $(x_i,y_i)$,
$$
g_i(\lambda,\mu)=\max_{x_i,\,y_i}\ \bigl(x_i^{1-u}y_i^u+\lambda x_i+\mu y_i\bigr)\quad\text{subject to } a_i\le x_i\le b_i,\ c_i\le y_i\le d_i,
$$
and then solve the unconstrained dual problem
$$
\min_{\lambda,\mu}\left(\sum_{i=1}^ng_i(\lambda,\mu)-\lambda-\mu\right).
$$
If the inequality constraints are compatible with the equality constraints, i.e.
$$
\sum a_i<1,\ \sum c_i<1,\ \sum b_i>1,\ \sum d_i>1,
$$
then the problem is feasible, there is no duality gap, and a primal solution can be recovered from the dual solution.
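A minimal sketch of this dual decomposition, under the same made-up data as above: each $g_i(\lambda,\mu)$ is evaluated by a small box-constrained maximization (here with L-BFGS-B), the outer minimization over $(\lambda,\mu)$ is done with Nelder-Mead (a subgradient method would also do), and a primal candidate is read off from the subproblem maximizers at the optimal multipliers.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical problem data (assumptions, same feasibility conditions as in the addendum).
n, u = 5, 0.3
a = np.full(n, 0.05); b = np.full(n, 0.5)
c = np.full(n, 0.05); d = np.full(n, 0.5)

def subproblem(lam, mu, i):
    """Solve the i-th subproblem: max x^(1-u) y^u + lam*x + mu*y
       over a_i <= x <= b_i, c_i <= y <= d_i. Returns (optimal value, maximizer)."""
    def neg(z):
        x, y = z
        return -(x**(1 - u) * y**u + lam * x + mu * y)
    z0 = [0.5 * (a[i] + b[i]), 0.5 * (c[i] + d[i])]
    res = minimize(neg, z0, method="L-BFGS-B", bounds=[(a[i], b[i]), (c[i], d[i])])
    return -res.fun, res.x

def dual(lm):
    # Dual function: sum_i g_i(lambda, mu) - lambda - mu (convex in (lambda, mu)).
    lam, mu = lm
    return sum(subproblem(lam, mu, i)[0] for i in range(n)) - lam - mu

# Unconstrained minimization of the (possibly nondifferentiable) dual function.
res = minimize(dual, x0=[-1.0, -1.0], method="Nelder-Mead")
lam_opt, mu_opt = res.x

# Primal recovery: collect the subproblem maximizers at the optimal multipliers.
xy = np.array([subproblem(lam_opt, mu_opt, i)[1] for i in range(n)])
x_opt, y_opt = xy[:, 0], xy[:, 1]
print("sum x:", x_opt.sum(), "sum y:", y_opt.sum(),
      "objective:", np.sum(x_opt**(1 - u) * y_opt**u))
```

The printed sums of $x$ and $y$ serve as a check that the recovered primal point approximately satisfies the equality constraints; if the subproblem maximizers are not unique at the optimal multipliers, some additional averaging or projection onto the simplex constraints may be needed.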