Articles on computational complexity

Factoring extremely large integers

The question is about factoring extremely large integers, but you can have a look at this question to see the context if it helps. Please note that I am not very familiar with mathematical notation, so I would appreciate a verbose description of any equations. The Problem: The integer in question will ALWAYS be a power of […]
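The excerpt's constraint is cut off, so exactly what the integer is a power of is unknown. Assuming the promise is that $N$ is a perfect power, a sketch of how to recover the base and exponent with exact arbitrary-precision arithmetic (no floating-point roots, which lose precision for extremely large integers):

```python
def integer_kth_root(n: int, k: int) -> int:
    """Largest r with r**k <= n, by binary search on exact integers."""
    lo, hi = 0, 1
    while hi ** k <= n:          # find an upper bound by doubling
        hi *= 2
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** k <= n:
            lo = mid
        else:
            hi = mid - 1
    return lo

def perfect_power(n: int):
    """Return (b, k) with b**k == n and k >= 2, or None if no such pair exists.
    Only exponents up to log2(n) need checking, since the base is at least 2."""
    for k in range(2, n.bit_length() + 1):
        r = integer_kth_root(n, k)
        if r ** k == n:
            return (r, k)
    return None
```

For example, `perfect_power(8)` gives `(2, 3)` and `perfect_power(10)` gives `None`. Once the base is recovered, "factoring" $N = b^k$ reduces to factoring the much smaller $b$.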

$O(n)$ algorithm to determine a number that appears more than $n/2$ times in an array of size $n$

I am trying to find an $O(n)$ algorithm that determines a number appearing more than $n/2$ times in an array of size $n$, assuming that such a number does exist in the array, without resorting to an algorithm that computes the median. This problem came with a hint stating that Property 1: if an array […]

Confusion related to the definition of NP problems

I have some confusion related to the definition of NP problems. According to Wikipedia: Intuitively, NP is the set of all decision problems for which the instances where the answer is “yes” have efficiently verifiable proofs of the fact that the answer is indeed “yes”. More precisely, these proofs have to be verifiable in polynomial […]
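The "efficiently verifiable proofs" phrase is often clearest with a concrete verifier. As a sketch (SUBSET-SUM chosen as the example problem here, not taken from the excerpt): the problem is in NP because a claimed solution, the certificate, can be checked in polynomial time even though finding one may be hard:

```python
def verify_subset_sum(numbers, target, certificate_indices):
    """Polynomial-time verifier: does the certified subset sum to target?
    The certificate is a list of (distinct, in-range) indices into numbers."""
    if len(set(certificate_indices)) != len(certificate_indices):
        return False                       # indices must be distinct
    if any(i < 0 or i >= len(numbers) for i in certificate_indices):
        return False                       # indices must be in range
    return sum(numbers[i] for i in certificate_indices) == target
```

Here `verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [2, 4])` accepts (since $4 + 5 = 9$), while a wrong certificate is rejected. "Yes" instances have some accepting certificate; "no" instances have none, which is the asymmetry the definition builds in.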

Is there an efficient algorithm to compute a minimal polynomial for the root of a polynomial with algebraic coefficients?

An algebraic number is defined as a root of a polynomial with rational coefficients. It is known that every algebraic number $\alpha$ has a unique minimal polynomial, the monic polynomial with rational coefficients of smallest degree of which $\alpha$ is a root. It is also known that the algebraic numbers are algebraically closed, meaning that […]
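Such algorithms do exist in practice, typically via resultants or lattice methods. As an illustrative sketch (assuming SymPy is available; its `minimal_polynomial` accepts algebraic expressions built from radicals, which is one way of presenting a root of a polynomial with algebraic coefficients):

```python
from sympy import sqrt, symbols, minimal_polynomial, expand

x = symbols('x')

# alpha = sqrt(2) + sqrt(3) is algebraic; compute its minimal polynomial.
# By hand: alpha^2 = 5 + 2*sqrt(6), so (alpha^2 - 5)^2 = 24,
# giving alpha^4 - 10*alpha^2 + 1 = 0.
p = minimal_polynomial(sqrt(2) + sqrt(3), x)

# p should agree with the hand computation above.
assert expand(p - (x**4 - 10*x**2 + 1)) == 0
```

The degree bound behind such algorithms comes from the tower law: a root of a degree-$m$ polynomial whose coefficients have degrees at most $d$ over $\mathbb{Q}$ has degree at most $m \cdot d^m$ over $\mathbb{Q}$, so the search space is finite.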

Why are Hornsat, 3sat and 2sat not equivalent?

I have been reading a little bit about complexity theory recently, and I’ve hit a bit of a stumbling block. The Horn satisfiability problem is solvable in linear time, but the Boolean satisfiability problem in general is NP-hard. So far so good; I understand that. Apparently Boolean satisfiability is reducible to satisfiability of a Boolean […]
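Part of what makes Horn clauses special (at most one positive literal per clause) is that unit propagation alone decides satisfiability — no backtracking. A sketch of the marking algorithm; the clause representation is an assumption of this sketch: each clause is a pair `(negated_vars, positive_var_or_None)`, encoding $(a_1 \wedge \dots \wedge a_k) \to p$:

```python
def horn_sat(clauses):
    """Return the minimal satisfying assignment (set of true variables),
    or None if the Horn formula is unsatisfiable.
    Each clause is (set_of_negated_vars, positive_var_or_None)."""
    true_vars = set()
    changed = True
    while changed:
        changed = False
        for negs, pos in clauses:
            if negs <= true_vars:            # clause body fully satisfied
                if pos is None:
                    return None              # goal clause violated: UNSAT
                if pos not in true_vars:
                    true_vars.add(pos)       # the head is forced true
                    changed = True
    return true_vars
```

For example, `[(set(), 'a'), ({'a'}, 'b')]` yields `{'a', 'b'}`, while adding the goal clause `({'a', 'b'}, None)` makes it unsatisfiable. General 3-SAT clauses can have two or three positive literals, which breaks this forced-propagation structure; that is the intuitive gap between the problems.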

At what point does exponential growth dominate polynomial growth?

It’s well-known that exponential growth eventually overtakes polynomial growth (link, link). So for any non-negative integer $d$ and positive $\epsilon$, there exists $t^* \ge 0$ for which $$ 1 + t + \frac{t^2}{2!} + \ldots + \frac{t^d}{d!} \le e^{\epsilon t} $$ for all $t \ge t^*$. In other words, it takes $t^*$ seconds for the […]
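A numerical sketch of locating such a $t^*$: scan a grid and record the last point where the degree-$d$ partial sum still exceeds $e^{\epsilon t}$. The grid step and the sample values $d = 2$, $\epsilon = 1/2$ below are illustrative choices, not taken from the question:

```python
import math

def taylor_partial(t, d):
    """Degree-d Taylor partial sum of e^t: 1 + t + ... + t^d / d!."""
    return sum(t**k / math.factorial(k) for k in range(d + 1))

def crossover(d, eps, step=0.01, t_max=1000.0):
    """Approximate t*: the first grid point after the last t where
    the partial sum still exceeds e^(eps*t)."""
    last_fail = None
    t = 0.0
    while t <= t_max:
        if taylor_partial(t, d) > math.exp(eps * t):
            last_fail = t
        t += step
    return 0.0 if last_fail is None else last_fail + step
```

For $d = 2$, $\epsilon = 0.5$, the crossover sits near $t \approx 7$ (solving $1 + t + t^2/2 = e^{t/2}$). Note the crossover grows as $\epsilon$ shrinks or $d$ grows, which is the quantitative content of "eventually".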

Using $O(n)$ to determine limits of form $1^{\infty},\frac{0}{0},0\times\infty,{\infty}^0,0^0$?

Is it sufficient to use $O(n)$ repeatedly on $1^{\infty},\frac{0}{0},0\times\infty,{\infty}^0,0^0$ to get determinate forms? For example, if we look at $\frac{0}{0}$, then $$\frac{O(f(n))}{O(g(n))}$$ should simplify matters, and if needed we can repeat again to get $$\frac{O(O(f(n)))}{O(O(g(n)))}$$ My question is: would it at most be sufficient to repeatedly apply $O(n)$ to find the limit? Using this approach, what […]

Complexity of counting the number of triangles of a graph

The trivial approach to counting the number of triangles in a simple graph $G$ of order $n$ is to check, for every triple $(x,y,z) \in {V(G)\choose 3}$, whether $x,y,z$ forms a triangle. This procedure has complexity $O(n^3)$. It is well known that if $A$ is the adjacency matrix of $G$ then […]
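The adjacency-matrix identity the excerpt is heading toward is presumably $\#\text{triangles} = \operatorname{tr}(A^3)/6$: each triangle contributes six closed walks of length 3 (three starting vertices, two directions). A sketch with NumPy (assumed available):

```python
import numpy as np

def count_triangles(adj):
    """Number of triangles in a simple undirected graph, via trace(A^3) / 6.
    adj is a symmetric 0/1 adjacency matrix with zero diagonal."""
    A = np.asarray(adj)
    return int(np.trace(A @ A @ A)) // 6
```

For the triangle $K_3$ this returns 1, and for $K_4$ it returns 4. The interesting complexity consequence is that matrix multiplication can be done in $O(n^{\omega})$ with $\omega < 2.38$, beating the trivial $O(n^3)$ triple loop.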

Are derivatives actually bounded?

Suppose you have a function $f$ which is infinitely differentiable, with the property that $$ f^{(n)} (0) = (n!)^2 $$ and, in general, $$ f^{(n)} (a) = O((n!)^2)$$ for any $a \in \mathbb{R}$. This function is then nowhere well defined by a Taylor series. But I don’t immediately accept that this function “doesn’t exist”. What if we […]
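A one-line check of the excerpt's claim that the Taylor series at $0$ defines nothing: the series coefficients are $f^{(n)}(0)/n! = (n!)^2/n! = n!$, so by the Cauchy–Hadamard formula the radius of convergence is
$$ R = \frac{1}{\limsup_{n\to\infty} (n!)^{1/n}} = 0, $$
since $(n!)^{1/n} \to \infty$. Note this rules out only the Taylor representation, not the function itself: smooth functions with prescribed derivatives $f^{(n)}(0) = (n!)^2$ at a point do exist by Borel's theorem, which is presumably where the question is going.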

Calculating run times of programs with asymptotic notation

When calculating the run time of programs using asymptotic notation, I know how to set up the sums for things like for loops, but I’m getting stuck on evaluating the sums. Sorry if this is a dumb question, but say we have something like for i=4 to n^2 for j = 5 to 3i log […]
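The loop in the excerpt is truncated, so the bounds below are assumptions: take "for i = 4 to n^2, for j = 5 to 3i" with inclusive endpoints. The inner loop then runs $3i - 4$ times, so the total count is $\sum_{i=4}^{n^2} (3i - 4)$, which is $\Theta(n^4)$. A sketch that checks the closed form against brute force:

```python
def brute_count(n):
    """Count inner-loop iterations of the (assumed) nested loop directly."""
    total = 0
    for i in range(4, n * n + 1):          # i = 4 .. n^2 inclusive
        for j in range(5, 3 * i + 1):      # j = 5 .. 3i inclusive
            total += 1
    return total

def closed_form(n):
    """sum_{i=4}^{m} (3i - 4) with m = n^2:
    3 * (m(m+1)/2 - 6) accounts for sum of 3i (subtracting i = 1..3),
    and -4 * (m - 3) accounts for the constant term over m - 3 values of i."""
    m = n * n
    return 3 * (m * (m + 1) // 2 - 6) - 4 * (m - 3)
```

For `n = 3` both give 93. The usual technique is exactly this: turn each loop into a sum, shift the index range so standard formulas like $\sum_{i=1}^{m} i = m(m+1)/2$ apply, and keep only the dominant term for the asymptotic answer.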