
Which is correct?

$$ \prod_a ab = \left[ \prod_a a\right]b $$

or

$$ \prod_a ab = \prod_a \left[ ab \right] $$

I’d say the latter, but with $\sum$ we have

$$\sum_a a + b = \left [ \sum_a a \right ] + b $$

Do they work differently, or did I guess wrong? Also, are these things defined somewhere (or at least documented)? Wikipedia and Mathworld let me down on this one.


If $b$ does not depend on the index of summation, $$\sum_a(ab)=\left(\sum_aa\right)b=b\sum_aa\;,$$ so there’s no ambiguity. If $b$ does depend on the index of summation, either $\sum\limits_aab$ is to be understood as $\sum\limits_a(ab)$, or the writer made a bad mistake.

The expression $\sum\limits_aa+b$, however, is potentially ambiguous and should not be used; write $\sum\limits_a(a+b)$ if that’s the desired interpretation, and $\left(\sum\limits_aa\right)+b$ or, better, $b+\sum\limits_aa$ if **that’s** the desired interpretation. When reading, you have to use your head and pay attention to the context. E.g., something like $\sum\limits_{k=1}^na_k+b_k$ is surely intended to be read as $\sum\limits_{k=1}^n(a_k+b_k)$, though it’s a horribly sloppy way of writing it. $\sum\limits_{k=1}^na_k+b$, on the other hand, probably means $b+\sum\limits_{k=1}^na_k$, but if you see it being evaluated as $$\sum\limits_{k=1}^na_k+b=A(n)+nb\;,$$ it probably meant $\sum\limits_{k=1}^n(a_k+b)$, and the $nb$ is the result of summing those $n$ $b$ terms.
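The two readings of $\sum\limits_{k=1}^na_k+b$ really do produce different values, differing by exactly $(n-1)b$. A small numeric illustration (the values are made up for the example):

```python
# Two readings of  sum_{k=1}^n a_k + b  with hypothetical terms:
a = [2, 3, 5, 7]   # a_1 .. a_n
b = 10
n = len(a)

reading_outside = sum(a) + b             # (sum_k a_k) + b
reading_inside = sum(x + b for x in a)   # sum_k (a_k + b) = (sum_k a_k) + n*b

print(reading_outside)  # 27
print(reading_inside)   # 57 = 17 + 4*10
```

The gap between the two readings, $(n-1)b$, grows with the number of terms, which is why seeing an $nb$ in an evaluation is such a strong hint that the summand included $b$.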

This is an important question, and it is a shame that there is no agreed-upon answer. One thing that is clear is that a multiplication to the right of the $\sum$ operator always binds more strongly than the $\sum$. On the left there is no choice: $\sum$ acts first in cases like $2\sum_ia_i$. No parentheses are needed either in cases like

$$
\prod_{i\in I}\sum_{j\in J}a_{i,j} = \sum_{f:I\to J}\prod_{i\in I}a_{i,f(i)},
$$

in whose left hand side the summation is applied before multiplication, not because of precedence rules but because the order of the operators used leaves no room for another interpretation.

Although some seem to disagree, it seems clear to me that between $\sum$ and “$+$” or binary “$-$” to its right, the summation binds stronger (and therefore ends at the mentioned operator). It is true that $\sum_ia_i+b$ might be written less confusingly as $b+\sum_ia_i$, but how about $\sum_ia_i+\sum_jb_j$, which certainly does not get less confusing by writing it as $\sum_jb_j+\sum_ia_i$? I think it is easy to find plenty of such examples in the literature, where two independent summations are added or subtracted without using parentheses; if the convention were either that such “$+$” or “$-$” binds stronger than summation, or that no precedence is defined at all, parentheses enclosing the first summation would be required (for the second there is no ambiguity). Note that the ambiguity at stake here is not just syntactic but semantic: including a second summation into the summand of the first one would add the terms of the second summation many more times than with separate summations, giving a different result. Based on the fact that people do add and subtract summations without parentheses, I think that there is agreement (even if unconscious to some) that an unparenthesised “$+$” or “$-$” does terminate a summation that is in progress.
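The semantic (not merely syntactic) nature of this ambiguity is easy to see numerically: absorbing the second summation into the summand of the first repeats its terms once for every value of the outer index. A sketch with made-up values:

```python
# Two readings of  sum_i a_i + sum_j b_j  with hypothetical terms:
a = [1, 2, 3]
b = [10, 20]

separate = sum(a) + sum(b)             # (sum_i a_i) + (sum_j b_j)
nested = sum(ai + sum(b) for ai in a)  # sum_i (a_i + sum_j b_j)

print(separate)  # 36
print(nested)    # 96 = 6 + 3*30
```

The nested reading counts $\sum_jb_j$ three times (once per value of $i$), so the two parsings genuinely disagree.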

For $\prod$ and multiplication the situation *ought to* be the same as for $\sum$ and addition, but I think there is less agreement about this. For one thing, cases that reveal what people unconsciously think by what they do in practice are far rarer than for addition. Leafing through Concrete Mathematics, which is replete with summations, I had a hard time finding any examples for products. However, I found this one on page 490, in a list of “basic analogies”:

$$
\sum_{k\in K}\left(a_k+b_k\right) = \sum_{k\in K}a_k+\sum_{k\in K}b_k
\longleftrightarrow
\prod_{k\in K}a_kb_k = \left(\prod_{k\in K}a_k\right)\left(\prod_{k\in K}b_k\right).
$$

The second identity illustrates that these authors believe that the situation is contrary to the additive one: a multiplication does not terminate a product in progress (on the left), while multiplying two products requires additional parentheses (the second pair is just for aesthetic reasons). Actually the whole line (notably the placing of parentheses) shows beautifully how multiplicative notation is *not analogous* to additive notation.
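Note that, unlike the additive case, the two sides of the second identity agree as numbers; only the grouping differs. A quick check with made-up factors (Python’s `math.prod` playing the role of $\prod$):

```python
# Under the Concrete Mathematics reading, prod_k a_k b_k means
# prod_k (a_k * b_k), which equals (prod_k a_k) * (prod_k b_k).
from math import prod

a = [2, 3, 5]    # hypothetical factors a_k
b = [7, 11, 13]  # hypothetical factors b_k

lhs = prod(x * y for x, y in zip(a, b))  # prod_k (a_k * b_k)
rhs = prod(a) * prod(b)                  # (prod_k a_k)(prod_k b_k)

print(lhs == rhs)  # True
```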

Another example in the same book is $\sum_{k=1}^n(1+z/k)e^{-z/k}$ (page 535) without additional parentheses. Also the Jacobi triple product (actually a triply infinite product) gets written without additional parentheses: TAOCP Vol 4A page 396 has $\prod_{k=1}^\infty(1-u^kv^{k-1})(1-u^{k-1}v^k)(1-u^kv^k)$ rather than the more prudent $\prod_{k=1}^\infty\bigl((1-u^kv^{k-1})(1-u^{k-1}v^k)(1-u^kv^k)\bigr)$, or $\prod_{k=1}^\infty(1-u^kv^{k-1})\prod_{k=1}^\infty(1-u^{k-1}v^k)\prod_{k=1}^\infty(1-u^kv^k)$, which would use the opposite convention. And here is an interesting one where a first product encloses a second product, from [A. Borodin, Duke Math J. 140(3) (2007); Proposition 5.1]; note how $n$ lives all the way to the end.

$$
\prod_{n\geq1}{1\over1-s^{nN}}
\prod_{p\in\overline{1,N}:A[p]=1\atop q\in\overline{1,N}:B[q]=1}
{1\over1-s^{(p-q)(N)+(n-1)N}}
$$

But I am sure I have also seen examples of that opposite convention: multiplication of independent products without additional parentheses. A quick search revealed quite a few instances in Stanley’s Enumerative Combinatorics 2, pages 370 and 458; note however that Stanley often, but not always, writes a dot for the multiplication, as in $\prod_x f(x)\cdot\prod_yg(y)$, which presumably is there to suggest the proper parsing of the formula.

In conclusion, for products involving a multiplication, different people use different conventions; but, surprisingly, mathematicians seem to prefer to err on the side of writing too few parentheses rather than too many (computer programmers are quite the opposite).

**Added** Thinking of this a bit longer, I realise that there is a point of view that makes the above seem much more coherent:

The precedences of all large operators like $\sum$, $\prod$, $\int$ (and of others like $\bigoplus$, $\coprod$ as well, in cases where it would be relevant) with respect to binary operators to their right are all equal, and sit between the precedence of ‘$+$’ and ‘$-$’ on one hand and the precedence of multiplication (notated by juxtaposition) and ‘$/$’ on the other hand.

The $\cdot$ operator used to separate products without the need of parentheses can be considered a syntactic variant of multiplication introduced with the explicit purpose of having lower precedence than $\prod$ (but it would then also be lower than that of $\sum$, allowing one to write $\sum_ia_i\cdot\sum_jb_j$ without parentheses, which is an interesting proposal).
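As a numeric aside (with made-up factors), the grouping really matters when multiplying independent products: absorbing the second product into the first one’s factor repeats its value once per index of the outer product, raising it to the $n$-th power.

```python
# Two readings of  prod_i a_i  prod_j b_j  with hypothetical factors:
from math import prod

a = [2, 3]  # factors a_i
b = [5, 7]  # factors b_j

separate = prod(a) * prod(b)              # (prod_i a_i) * (prod_j b_j)
nested = prod(ai * prod(b) for ai in a)   # prod_i (a_i * prod_j b_j)

print(separate)  # 210
print(nested)    # 7350 = 6 * 35**2
```

So a low-precedence ‘$\cdot$’ that terminates the product in progress selects the first reading, while juxtaposition under the Concrete Mathematics convention would select the second.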

Those expressions are inherently ambiguous. In part this is a fault of the notation, since it lacks a delimiter that signifies the end of the sum. Contrast this with integrals, which have ‘$dx$’ delimiting the end of the integrand:

$$\rm \int (f(x) + g(x))\ dx\quad vs.\quad \sum_k\ f(k) + g(k) $$

This is one reason why some authors write indefinite sums in an analogous form:

$$\rm \sum\ (f(k) + g(k))\ \Delta k$$

Generally it is best to avoid use of such ambiguous expressions by inserting parentheses as need be to make clear the intended parsing.

I think the common convention is

$$
\sum_a a+b = \left(\sum_a a\right) + b
$$

but

$$
\prod_a ab = \prod_a (ab).
$$

This is the first time that I have realized that this convention is, in some sense, inconsistent.
