# Generously Feasible?

In my machine learning class I was given a weight vector that is said to have the property of being "generously feasible".

Formally, what does "generously feasible" mean? I can't seem to find a definition.

#### Answer

A feasible weight vector is one that lies on the correct side of every constraint hyperplane $\{\vec{w} : \langle \vec{w}, \vec{x} \rangle = 0\}$ defined by a training input $\vec{x}$. The difficulty is that the perceptron adds $\vec{x}$ or $-\vec{x}$ to the weights on each mistake, so if a candidate solution lies within a distance $|\vec{x}|$ of some constraint plane, an update of that size can jump right across the plane, and the weight vector can keep oscillating around it. Hence, for the convergence argument to go through, the target solution must sit inside the feasible region with some room to spare; that enlarged-margin region is called the "generously feasible" region.

So a "generously feasible" weight vector is one that lies within the feasible region by a margin at least as great as the length of the input vector that defines each constraint plane. Formally, for every training case $(\vec{x}, y)$ with $y \in \{+1, -1\}$, a generously feasible $\vec{w}$ satisfies $y \, \langle \vec{w}, \vec{x} \rangle \ge |\vec{x}|^2$, since the distance from $\vec{w}$ to the plane $\langle \vec{w}, \vec{x} \rangle = 0$ is $|\langle \vec{w}, \vec{x} \rangle| / |\vec{x}|$.
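As a sketch, the margin condition above can be written as a small check (the function name and data here are illustrative, not from the source):

```python
import numpy as np

def is_generously_feasible(w, X, y):
    """Check whether w satisfies every constraint with margin >= ||x||.

    Each training case (x_i, y_i) with y_i in {+1, -1} defines the
    constraint y_i * <w, x_i> > 0.  The distance from w to the plane
    {v : <v, x_i> = 0} is |<w, x_i>| / ||x_i||, so demanding that this
    distance be at least ||x_i|| is equivalent to
    y_i * <w, x_i> >= ||x_i||**2.
    """
    for x_i, y_i in zip(X, y):
        if y_i * np.dot(w, x_i) < np.dot(x_i, x_i):
            return False
    return True
```

For example, with a single input $\vec{x} = (1, 0)$ labelled $+1$, the vector $(2, 2)$ is generously feasible (margin $2 \ge 1$) while $(0.5, 0.5)$ is feasible but not generously so.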

Every time the perceptron makes a mistake, the squared distance from its current weight vector to every generously feasible weight vector decreases by at least the squared length of the input vector used in the update.
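That claim can be checked numerically. Writing the mistake-driven update as $\vec{w}' = \vec{w} + y\vec{x}$, we get $|\vec{w}^* - \vec{w}'|^2 = |\vec{w}^* - \vec{w}|^2 - 2y\langle \vec{w}^*, \vec{x}\rangle + 2y\langle \vec{w}, \vec{x}\rangle + |\vec{x}|^2$, which is at most $|\vec{w}^* - \vec{w}|^2 - |\vec{x}|^2$ because $y\langle \vec{w}^*, \vec{x}\rangle \ge |\vec{x}|^2$ (generous feasibility) and $y\langle \vec{w}, \vec{x}\rangle \le 0$ (a mistake). A minimal sketch, assuming made-up toy data and a hand-picked generously feasible vector `w_star`:

```python
import numpy as np

# hypothetical linearly separable toy data, labels in {+1, -1}
X = np.array([[2.0, 1.0], [1.0, 3.0], [-2.0, -1.5], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])

# hand-picked generously feasible vector: y_i * <w*, x_i> >= ||x_i||^2 for all i
w_star = np.array([5.0, 5.0])
assert all(yi * (w_star @ xi) >= xi @ xi for xi, yi in zip(X, y))

w = np.zeros(2)
for _ in range(100):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:                  # mistake: apply the update
            d2_before = np.sum((w_star - w) ** 2)
            w = w + yi * xi
            d2_after = np.sum((w_star - w) ** 2)
            # squared distance to w_star drops by at least ||x||^2
            assert d2_before - d2_after >= xi @ xi - 1e-9
            mistakes += 1
    if mistakes == 0:                           # converged: all cases correct
        break
```

Since the squared distance to `w_star` starts finite and shrinks by at least $\min_i |\vec{x}_i|^2$ per mistake, the number of mistakes is bounded, which is the heart of the convergence proof.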