Intuition behind the concept of indicator random variables.

I am studying the "Randomized Algorithms" chapter in the book "Introduction to Algorithms" by Cormen et al.

In this chapter the book introduces the concept of an indicator random variable and states that the expected value of an indicator random variable is:

$$E[I_A] = P(A)$$

I am having difficulty understanding why this is called an indicator random variable (specifically, why "indicator" and why "random") and how this concept is useful in analyzing algorithm running times. It has been some time since I studied probability in school, but I am familiar with the basic concepts of probability, so you can base your answer on that premise.

As you can see from the equation, all it says is that the expected value of an indicator random variable of an event is equal to the probability of that event. We already have the concept of probability, so why should we learn this new concept whose expected value happens to be the same as the probability?

Answers:

As the name implies, an indicator random variable indicates something: the value of $I_A$ is $1$ precisely when the event $A$ occurs, and is $0$ when $A$ does not occur (that is, when $A^c$ occurs). Think of $I_A$ as a Boolean variable that indicates the occurrence of the event $A$. This Boolean variable takes the value $1$ with probability $P(A)$, and so its average value is $P(A)$. In terms of long-run frequencies, $I_A$ will have value $1$ on roughly $N\cdot P(A)$ of $N$ trials of the experiment, and so the long-run average value of $I_A$ over these $N$ trials will be approximately $\frac{N\cdot P(A)}{N} = P(A)$.
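The long-run frequency argument above is easy to check empirically. Here is a minimal simulation sketch (the event $A$ chosen here, "a fair die shows $5$ or $6$", is just an illustrative assumption, so $P(A) = 1/3$):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

N = 100_000
indicator_sum = 0
for _ in range(N):
    roll = random.randint(1, 6)
    # Indicator random variable: 1 exactly when the event A (roll >= 5) occurs
    I_A = 1 if roll >= 5 else 0
    indicator_sum += I_A

# The sample mean of I_A over N trials approximates E[I_A] = P(A) = 1/3
empirical_mean = indicator_sum / N
print(empirical_mean)
```

With a large $N$, the printed value lands close to $1/3$, matching $E[I_A] = P(A)$.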

An indicator function is a function that returns the value of 1 when something is true: $$\mathbf{1}[A] = \left\{\begin{array}{cc} 1, & A\ \mathrm{ is\ true,} \\ 0, & A\ \mathrm{ is\ false.} \end{array}\right.$$

Therefore, taking the expectation of an indicator is essentially the same as computing the expected value of a Bernoulli random variable: the value $1$ times the probability that $A$ is true, plus the value $0$ times the probability that it is false, which is just $P(A)$.
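As for why this is useful in algorithm analysis: the payoff is linearity of expectation. Writing a complicated count as a sum of indicators lets you compute its expectation term by term, even when the indicators are dependent. A classic example in the spirit of the CLRS chapter is counting how many times a running maximum is updated while scanning a uniformly random permutation: the $i$-th element is a new maximum with probability $1/i$, so the expected number of updates is the harmonic number $H_n = \sum_{i=1}^{n} 1/i$. A sketch (function names here are my own):

```python
import random
from fractions import Fraction

def expected_record_updates(n):
    # E[X] = sum of E[X_i] = sum of 1/i, by linearity of expectation,
    # where X_i indicates "element i is the max of the first i elements".
    return sum(Fraction(1, i) for i in range(1, n + 1))

def count_record_updates(perm):
    # Count how many times the running maximum is updated during one scan.
    best = float("-inf")
    updates = 0
    for x in perm:
        if x > best:
            best = x
            updates += 1
    return updates

n = 10
theory = float(expected_record_updates(n))   # H_10, about 2.93

random.seed(1)
trials = 50_000
avg = sum(count_record_updates(random.sample(range(n), n))
          for _ in range(trials)) / trials
print(theory, avg)
```

The empirical average over many random permutations comes out close to $H_{10}$, even though the indicators $X_1, \dots, X_n$ are not independent. That is the whole point: probabilities of individual indicator events are easy, and linearity of expectation stitches them together.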