What is the Orthogonal Projection onto the $ {L}_{1} $ Unit Ball?

Namely, given $ x \in {\mathbb{R}}^{n} $ what would be:

$$ {\mathcal{P}}_{ { \left\| \cdot \right\| }_{1} \leq 1 } \left( x \right) = \arg \min_{{ \left\| y \right\| }_{1} \leq 1} \left\{ {\left\| y - x \right\|}_{2}^{2} \right\} $$


Thank You.


Hint: Because of the symmetry of the problem, you may assume $x$ lies in the first orthant, i.e., $x \ge 0$, and that $x$ lies outside the $\ell_1$ unit ball (otherwise the answer is trivially $y = x$). Under these assumptions we are guaranteed that $ 0 \leq y^{*} \leq x$, where $y^{*}$ is the unique optimal solution. To find $y^{*}$ you need to solve the following **quadratic program**:

$$ \begin{aligned}
& \text{minimize}
& & \sum_{i=1}^{n} (x_i - y_i)^2 \\
& \text{subject to}
& & y \geq 0, \\
& & & \sum_{i=1}^{n} y_i = 1,
\end{aligned} $$

Note that this is a smooth convex optimization problem with linear constraints, so it is easy to solve.
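As a sanity check, the quadratic program above can be handed to any general-purpose solver. A minimal sketch using SciPy's SLSQP (the function name `project_simplex_qp` is my own, not from the referenced code):

```python
import numpy as np
from scipy.optimize import minimize


def project_simplex_qp(x):
    """Solve min ||x - y||^2  s.t.  y >= 0, sum(y) = 1 with a generic QP solver."""
    n = len(x)
    res = minimize(
        lambda y: np.sum((x - y) ** 2),          # objective
        x0=np.full(n, 1.0 / n),                  # feasible starting point
        jac=lambda y: 2.0 * (y - x),             # gradient of the objective
        bounds=[(0.0, None)] * n,                # y >= 0
        constraints=[{"type": "eq", "fun": lambda y: y.sum() - 1.0}],
        method="SLSQP",
    )
    return res.x
```

This is convenient for verifying a closed-form solution, though the KKT approach below is far cheaper for large $n$.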

To find a closed-form solution, set up the KKT system.

Note that once you have the solution of the problem above, you can characterize the solution for every position of $x$ in space. For example, let $x = (-1, 2, 0, 0, 3)$. You know the solution of the problem above for $\bar{x} = (1, 2, 0, 0, 3)$; call it $\bar{y} = (y_1, y_2, \dots, y_n)$. Then the solution corresponding to $x$ is $y = (-y_1, y_2, \dots, y_n)$.
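The sign-flipping step in the example above is a one-liner; a minimal sketch (the helper name `restore_signs` is my own):

```python
import numpy as np


def restore_signs(x, y_bar):
    """Map the solution y_bar computed for |x| back to the solution for x."""
    return np.sign(x) * y_bar


x = np.array([-1.0, 2.0, 0.0, 0.0, 3.0])   # the example point from the text
x_bar = np.abs(x)                           # (1, 2, 0, 0, 3)
# given y_bar, the solution for x_bar, the solution for x is restore_signs(x, y_bar)
```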

$$ \DeclareMathOperator{\sign}{sign} $$

The Lagrangian of the problem can be written as:

$$ \begin{align}
L \left( x, \lambda \right) & = \frac{1}{2} {\left\| x - y \right\|}^{2} + \lambda \left( {\left\| x \right\|}_{1} - 1 \right) \\
& = \sum_{i = 1}^{n} \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \lambda \left| {x}_{i} \right| \right) - \lambda && \text{component-wise form}
\end{align} $$

The dual function is given by:

$$ \begin{align}

g \left( \lambda \right) = \inf_{x} L \left( x, \lambda \right)

\end{align} $$

The above can be solved component-wise; each term $ \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \lambda \left| {x}_{i} \right| \right) $ is minimized by the soft-thresholding operator:

$$ \begin{align}
{x}_{i}^{\ast} = \sign \left( {y}_{i} \right) { \left( \left| {y}_{i} \right| - \lambda \right) }_{+}
\end{align} $$

where $ {\left( t \right)}_{+} = \max \left( t, 0 \right) $.
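In code, the soft-thresholding operator is a one-liner; a minimal NumPy sketch:

```python
import numpy as np


def soft_threshold(y, lam):
    """Component-wise soft-thresholding: sign(y) * max(|y| - lam, 0)."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)
```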

All that remains is to find the optimal $ \lambda \geq 0 $, which is given by the root of the constraint function of the KKT system:

$$ \begin{align}
h \left( \lambda \right) & = \sum_{i = 1}^{n} \left| {x}_{i}^{\ast} \left( \lambda \right) \right| - 1 \\
& = \sum_{i = 1}^{n} { \left( \left| {y}_{i} \right| - \lambda \right) }_{+} - 1
\end{align} $$

The above is a piecewise-linear function of $ \lambda $, and its derivative is given by:

$$ \begin{align}
\frac{\mathrm{d} }{\mathrm{d} \lambda} h \left( \lambda \right) & = \frac{\mathrm{d} }{\mathrm{d} \lambda} \sum_{i = 1}^{n} { \left( \left| {y}_{i} \right| - \lambda \right) }_{+} \\
& = -\sum_{i = 1}^{n} { \mathbf{1} }_{\left\{ \left| {y}_{i} \right| - \lambda > 0 \right\}}
\end{align} $$

Hence it can be solved using a Newton iteration.
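Putting the pieces together: since $h(\lambda)$ is nonincreasing and piecewise linear, a plain bisection (used here in place of the Newton iteration, for brevity) locates the root. A minimal NumPy sketch, with $y$ the point to project as in the derivation above:

```python
import numpy as np


def project_l1_ball(y, n_iter=60):
    """Project y onto the l1 unit ball by finding the root of h(lam)."""
    if np.abs(y).sum() <= 1.0:
        return y.copy()                           # already inside the ball
    lo, hi = 0.0, np.abs(y).max()                 # h(lo) > 0 > h(hi): a valid bracket
    for _ in range(n_iter):
        lam = 0.5 * (lo + hi)
        h = np.maximum(np.abs(y) - lam, 0.0).sum() - 1.0
        if h > 0.0:
            lo = lam
        else:
            hi = lam
    lam = 0.5 * (lo + hi)
    # soft-threshold with the optimal multiplier
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)
```

Bisection trades the quadratic convergence of Newton for robustness at the kinks of $h$; for an exact $O(n \log n)$ solution one can instead sort $|y|$ and solve the active linear piece directly.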

In a similar manner the projection onto the simplex (see @Ashkan's answer) can be calculated.

The Lagrangian in that case is given by:

$$ \begin{align}
L \left( x, \mu \right) & = \frac{1}{2} {\left\| x - y \right\|}^{2} + \mu \left( \boldsymbol{1}^{T} x - 1 \right)
\end{align} $$

The trick is to leave the non-negativity constraint implicit.

Hence the dual function is given by:

$$ \begin{align}
g \left( \mu \right) & = \inf_{x \succeq 0} L \left( x, \mu \right) \\
& = \inf_{x \succeq 0} \sum_{i = 1}^{n} \left( \frac{1}{2} { \left( {x}_{i} - {y}_{i} \right) }^{2} + \mu {x}_{i} \right) - \mu && \text{component-wise form}
\end{align} $$

Again, taking advantage of the component-wise form, the solution is given by:

$$ \begin{align}
{x}_{i}^{\ast} = { \left( {y}_{i} - \mu \right) }_{+}
\end{align} $$

where the solution enforces the non-negativity constraint by projecting onto $ {\mathbb{R}}_{+} $.

Again, the solution is found by locating the $ \mu $ that satisfies the constraint (note that since this was an equality constraint, $ \mu $ can take any value and is not restricted to be non-negative like $ \lambda $ above).

The constraint function (from the KKT system) is given by:

$$ \begin{align}
h \left( \mu \right) = \sum_{i = 1}^{n} {x}_{i}^{\ast} - 1 & = \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{+} - 1
\end{align} $$

The above is a piecewise-linear function of $ \mu $, and its derivative is given by:

$$ \begin{align}
\frac{\mathrm{d} }{\mathrm{d} \mu} h \left( \mu \right) & = \frac{\mathrm{d} }{\mathrm{d} \mu} \sum_{i = 1}^{n} { \left( {y}_{i} - \mu \right) }_{+} \\
& = -\sum_{i = 1}^{n} { \mathbf{1} }_{\left\{ {y}_{i} - \mu > 0 \right\}}
\end{align} $$

Hence it can be solved using a Newton iteration.
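The same bisection idea works for the simplex case; the only differences are that the threshold is applied to $y_i$ rather than $|y_i|$, and that the bracket must extend below zero because $\mu$ may be negative. A minimal sketch:

```python
import numpy as np


def project_simplex(y, n_iter=60):
    """Project y onto the probability simplex by finding the root of h(mu)."""
    lo, hi = y.min() - 1.0, y.max()              # h(lo) >= 0 >= h(hi): a valid bracket
    for _ in range(n_iter):
        mu = 0.5 * (lo + hi)
        h = np.maximum(y - mu, 0.0).sum() - 1.0
        if h > 0.0:
            lo = mu
        else:
            hi = mu
    mu = 0.5 * (lo + hi)
    return np.maximum(y - mu, 0.0)               # clip with the optimal multiplier
```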

I wrote MATLAB code which implements them both at Mathematics StackExchange Question 2338491 – GitHub.

There is a test which compares the result to a reference calculated by CVX.
