Help understanding the complexity of my algorithm (summation)

As an exercise, I wrote an algorithm that sums all elements in an array that are at most i. Given an input array A, it produces an output array B, whereby B[i] = sum of all elements in A that are <= i.
A’s largest element is k.

My algorithm is as follows:

function summation(A, k)
    Sorted = merge_sort(A)
    j = 1
    B[0] = 0
    for i = 1 to k:
        B[i] = B[i - 1]
        while j <= Sorted.length and Sorted[j] <= i:
            B[i] = B[i] + Sorted[j]
            j = j + 1
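For concreteness, here is a runnable Python sketch of the pseudocode above (0-indexed, with the `j` increment made explicit and the bounds check done before the array access; it assumes the elements of A are positive integers bounded by k):

```python
def summation(A, k):
    """Return B where B[i] = sum of the elements of A that are <= i, for i = 0..k."""
    sorted_a = sorted(A)          # stands in for merge_sort, O(n log n)
    n = len(sorted_a)
    B = [0] * (k + 1)             # B[0] = 0
    j = 0
    for i in range(1, k + 1):
        B[i] = B[i - 1]
        # j only moves forward: elements <= i - 1 were already consumed,
        # so this loop adds exactly the elements of value i
        while j < n and sorted_a[j] <= i:
            B[i] += sorted_a[j]
            j += 1
    return B

# Example: A = [1, 2, 2, 5], k = 5  ->  B = [0, 1, 5, 5, 5, 10]
```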

Merge sort runs in O(n log n). Now, intuitively I would say that the worst-case running time of the algorithm is n log n + k + n (since the inner while loop can only execute n times in total, regardless of k).

However, when I express this as a sum, I get n log n + kn instead. Expressing the for and while loops:

$$\sum^k_{i=1}\sum^n_{j=1}1 = \sum^k_{i=1}n = kn$$

i.e. The algorithm’s runtime would be: n log n + kn
Any advice on where I am wrong?

Answer:

Your sum expression is not wrong, but it is a rather loose bound on the running time: it assumes the inner loop runs n times on every iteration of the outer loop, when in fact each element of Sorted is consumed at most once over the whole run. In order to get the desired running time of $n \log n + k + n$ you could use the following sum
$$\sum_{i=1}^k \left(1 + \text{number of items of size } i\right)$$
since the inner loop is only executed while there are items of size $i$. You need the additional 1 for the assignment $B[i] = B[i-1]$ in the outer loop. The sum can be simplified to obtain the suggested running time of $n + k$ for the loops.
$$\sum_{i=1}^k \left(1 + \text{number of items of size } i\right) = k + \sum_{i=1}^k \left(\text{number of items of size } i\right) = k + n$$
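To see this concretely, one can count the loop operations empirically. The sketch below (names are mine, not from the post) tallies one unit for each outer-loop assignment and one for each inner-loop iteration; for any array of positive integers bounded by k, the total comes out to exactly k + n:

```python
def loop_cost(A, k):
    """Count the operations of the two loops: k outer assignments
    plus one inner iteration per consumed element."""
    sorted_a = sorted(A)
    n = len(sorted_a)
    j = 0
    outer = inner = 0
    for i in range(1, k + 1):
        outer += 1                      # the B[i] = B[i-1] assignment
        while j < n and sorted_a[j] <= i:
            inner += 1                  # each element is consumed at most once
            j += 1
    return outer + inner                # = k + n when all elements are in 1..k
```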