
What is the Orthogonal Projection onto the $ {L}_{1} $ Unit Ball?

Namely, given $ x \in {\mathbb{R}}^{n} $, what would be:

$$ {\mathcal{P}}_{ { \left\| \cdot \right\| }_{1} \leq 1 } \left( x \right) = \arg \min_{{ \left\| y \right\| }_{1} \leq 1} \left\{ {\left\| y - x \right\|}_{2}^{2} \right\} $$


Thank You.


Hint: Because of the symmetry of the problem, you may assume $x$ lies in the first orthant, i.e., $x \ge 0$, and that $x$ is outside the $\ell_1$ unit ball (otherwise the answer is trivially $y = x$). Under these assumptions we certainly have $ 0 \leq y^{*} \leq x $, where $y^{*}$ is the unique optimal solution. To find $y^{*}$ you need to solve the following **quadratic program**:

$$ \begin{aligned}
& \text{minimize} && \sum_{i=1}^{n} (x_i - y_i)^2 \\
& \text{subject to} && y \geq 0, \\
& && \sum_{i=1}^{n} y_i = 1.
\end{aligned} $$

Note that this is a smooth convex optimization problem with linear constraints, so it is easy to solve!

To find a closed-form solution, set up the KKT system.

Note that once you have the solution of the problem above, you can characterize the solution for every position of $x$ in space. For example, let $x = (-1, 2, 0, 0, 3)$. You know the solution of the above problem for $\bar{x} = (1, 2, 0, 0, 3)$; call it $\bar{y} = (y_1, y_2, \dots, y_5)$. Then the solution corresponding to $x$ is $y = (-y_1, y_2, \dots, y_5)$.
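This sign-restoring reduction can be sketched as follows; `project_nonneg` is a placeholder for whatever routine solves the first-orthant subproblem above:

```python
import math

def restore_signs(x, project_nonneg):
    """Solve the projection for |x| (the first-orthant case), then
    flip each coordinate back to the sign of the original x."""
    y_bar = project_nonneg([abs(v) for v in x])
    return [math.copysign(v, s) for v, s in zip(y_bar, x)]
```

With the example from the text, `restore_signs([-1, 2, 0, 0, 3], project_nonneg)` projects $\bar{x} = (1, 2, 0, 0, 3)$ and negates the first coordinate of the result.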

$$ \DeclareMathOperator{\sign}{sign} $$

The Lagrangian of the problem can be written as:

$$ \begin{align}
L \left( x, \lambda \right) & = \frac{1}{2} {\left\| x - y \right\|}^{2} + \lambda \left( {\left\| x \right\|}_{1} - 1 \right) \\
& = \sum_{i = 1}^{n} \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \lambda \left| {x}_{i} \right| \right) - \lambda && \text{component-wise form}
\end{align} $$

The Dual Function is given by:

$$ \begin{align}

g \left( \lambda \right) = \inf_{x} L \left( x, \lambda \right)

\end{align} $$

The above can be minimized component-wise: each term $ \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \lambda \left| {x}_{i} \right| \right) $ is minimized by the soft-thresholding operator:

$$ \begin{align}

{x}_{i}^{\ast} = \sign \left( {y}_{i} \right) { \left( \left| {y}_{i} \right| - \lambda \right) }_{+}

\end{align} $$

where $ {\left( t \right)}_{+} = \max \left( t, 0 \right) $.
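The operator is one line of code; a minimal sketch (the function name is mine):

```python
import math

def soft_threshold(y, lam):
    """Soft-thresholding operator: sign(y) * max(|y| - lam, 0)."""
    return math.copysign(max(abs(y) - lam, 0.0), y)
```

Values with $|y_i| \leq \lambda$ are set to zero; the rest are shrunk toward zero by $\lambda$ while keeping their sign.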

Now all that is needed is to find the optimal $ \lambda \geq 0 $, which is given by the root of the following function (the constraint of the KKT system):

$$ \begin{align}
h \left( \lambda \right) & = \sum_{i = 1}^{n} \left| {x}_{i}^{\ast} \left( \lambda \right) \right| - 1 \\
& = \sum_{i = 1}^{n} { \left( \left| {y}_{i} \right| - \lambda \right) }_{+} - 1
\end{align} $$

The above is a piecewise-linear function of $ \lambda $, and its derivative is given by:

$$ \begin{align}
\frac{\mathrm{d} }{\mathrm{d} \lambda} h \left( \lambda \right) & = \frac{\mathrm{d} }{\mathrm{d} \lambda} \sum_{i = 1}^{n} { \left( \left| {y}_{i} \right| - \lambda \right) }_{+} \\
& = \sum_{i = 1}^{n} -{ \mathbf{1} }_{\left\{ \left| {y}_{i} \right| - \lambda > 0 \right\}}
\end{align} $$

Hence the root can be found by Newton iteration.
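A minimal sketch of the whole scheme (names are mine, not from the MATLAB code referenced below): since $h$ is convex and piecewise linear and we start at $\lambda = 0$ where $h > 0$, the Newton iterates increase monotonically and land on the root within at most $n$ steps.

```python
import math

def project_l1_ball(y, radius=1.0):
    """Project y onto the l1 ball of the given radius, using Newton's
    method on the piecewise-linear dual residual h(lambda)."""
    if sum(abs(v) for v in y) <= radius:
        return list(y)           # already inside the ball
    lam = 0.0
    for _ in range(len(y) + 1):  # finitely many linear pieces
        h = sum(max(abs(v) - lam, 0.0) for v in y) - radius
        dh = -sum(1 for v in y if abs(v) > lam)
        if dh == 0 or h <= 0.0:
            break
        lam -= h / dh            # h > 0, dh < 0, so lam increases
    # soft-threshold each coordinate with the optimal lambda
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in y]
```

For example, projecting $(3, -1, 0.5)$ gives $(1, 0, 0)$: the optimal multiplier is $\lambda = 2$.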

In a similar manner the projection onto the simplex (see @Ashkan's answer) can be calculated.

The Lagrangian in that case is given by:

$$ \begin{align}
L \left( x, \mu \right) & = \frac{1}{2} {\left\| x - y \right\|}^{2} + \mu \left( \boldsymbol{1}^{T} x - 1 \right)
\end{align} $$

The trick is to leave the non-negativity constraint implicit.

Hence the Dual Function is given by:

$$ \begin{align}
g \left( \mu \right) & = \inf_{x \succeq 0} L \left( x, \mu \right) \\
& = \inf_{x \succeq 0} \sum_{i = 1}^{n} \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \mu {x}_{i} \right) - \mu && \text{component-wise form}
\end{align} $$

Again, taking advantage of the component-wise form, the solution is given by:

$$ \begin{align}
{x}_{i}^{\ast} = { \left( {y}_{i} - \mu \right) }_{+}
\end{align} $$

where the solution enforces the non-negativity constraint by projecting onto $ {\mathbb{R}}_{+} $.

Again, the solution is obtained by finding the $ \mu $ that satisfies the constraint (pay attention: since this is an equality constraint, $ \mu $ can take any value and is not restricted to be non-negative like $ \lambda $ above).

The objective function (from the KKT system) is given by:

$$ \begin{align}
h \left( \mu \right) = \sum_{i = 1}^{n} {x}_{i}^{\ast} - 1 & = \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{+} - 1
\end{align} $$

The above is a piecewise-linear function of $ \mu $, and its derivative is given by:

$$ \begin{align}
\frac{\mathrm{d} }{\mathrm{d} \mu} h \left( \mu \right) & = \frac{\mathrm{d} }{\mathrm{d} \mu} \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{+} \\
& = \sum_{i = 1}^{n} -{ \mathbf{1} }_{\left\{ {y}_{i} - \mu > 0 \right\}}
\end{align} $$

Hence the root can be found by Newton iteration.
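The same Newton scheme, sketched for the simplex case (the function name is mine): here $\mu$ is unrestricted in sign, so it is initialized to the left of the root, where $h(\mu) \geq 0$, and the iterates again increase monotonically to the root.

```python
def project_simplex(y):
    """Project y onto the probability simplex {x : x >= 0, sum(x) = 1}
    via Newton's method on the piecewise-linear residual h(mu)."""
    mu = min(y) - 1.0            # h(mu) >= 0 here, left of the root
    for _ in range(len(y) + 1):  # finitely many linear pieces
        h = sum(max(v - mu, 0.0) for v in y) - 1.0
        dh = -sum(1 for v in y if v > mu)
        if dh == 0 or h <= 0.0:
            break
        mu -= h / dh             # h > 0, dh < 0, so mu increases
    return [max(v - mu, 0.0) for v in y]
```

For example, projecting $(2, 0)$ onto the simplex gives $(1, 0)$ with $\mu = 1$, while $(0.5, 0.5)$ is already on the simplex and is returned unchanged ($\mu = 0$).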

I wrote MATLAB code which implements both projections; see Mathematics StackExchange Question 2338491 on GitHub.

There is a test which compares the result to a reference calculated by CVX.
