WEEK 2 topic: "Why do we need Probability Theory?"

In this video, it is explained that if we have a population of 10000 and from this population, we take a sample of 100, then the total number of ways of doing that is 6.5 * 10^241, which is quite a big number.

This must have been evaluated using the combination formula as

(10000 * 9999 * 9998 * … * 9901) / (100!)
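If I read the lecture right, that expression is just the standard binomial coefficient with the common 9900! factor cancelled:

$$\binom{10000}{100} = \frac{10000!}{100! \cdot 9900!} = \frac{10000 \times 9999 \times \cdots \times 9901}{100!} \approx 6.52 \times 10^{241}$$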

This is not a large population size, yet my computer, and even a Google search, could not evaluate this value of (10000 C 100), which is small compared to the millions of people involved in actual scenarios.

What does it mean? Does it require a more powerful computer like a supercomputer? Or is there something else that I need to be aware of?

Kindly provide your insights. Thank you.



1 Like

One of the primary reasons is that general-purpose computers have 64- or 32-bit architectures, which limit the range of values the built-in data types can hold. No standard data type can contain numbers as large as 10^241, so the machine cannot directly compute values of that order. Hope this helps!!
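Just to illustrate that point with a rough sketch (my own example, not from the lecture): a 64-bit signed long tops out around 9.2 * 10^18, so it overflows long before it could hold even 100!, let alone a number of the order of 10^241. Arbitrary-precision types like Java's BigInteger have no fixed width, which is why the code further down this thread uses it.

import java.math.BigInteger;

public class OverflowDemo {
    public static void main(String[] args) {
        // The largest value a 64-bit signed long can hold: 9223372036854775807 (about 9.2 * 10^18)
        System.out.println("Long.MAX_VALUE = " + Long.MAX_VALUE);

        // Computing 100! with a fixed-width long silently wraps around and gives garbage
        long factLong = 1;
        for (int i = 1; i <= 100; i++) {
            factLong *= i;
        }
        System.out.println("100! using long (overflowed): " + factLong);

        // BigInteger has no fixed width, so it stores the exact 158-digit value of 100!
        BigInteger factBig = BigInteger.ONE;
        for (int i = 1; i <= 100; i++) {
            factBig = factBig.multiply(BigInteger.valueOf(i));
        }
        System.out.println("100! using BigInteger has " + factBig.toString().length() + " digits");
    }
}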

4 Likes

So does that mean there exist some other techniques to determine the chance of observing a trend throughout the population of 10,000 people, given that this trend is observed in our collected sample of 100 people? Or does it actually require heavy machines to compute? In the real world these numbers tend to be even bigger, so how do people do it then?

1 Like

Yes, more complex computations require the use of high-performance computing devices such as GPUs and TPUs. Libraries such as NumPy are used to help solve complex computations in an efficient manner, and there are other libraries such as PyTorch and TensorFlow too. However, I am not aware of alternate techniques for this specific task of observing a trend in the population. You may give these libraries a try. Hope this helps!!

2 Likes

Hi,

Let me take this chance to explain.

Suppose there is an upcoming election in a city of 10,000 people and we are interested in knowing the preference of its people between the left-wing and the right-wing party.

So, given the above objective, you would have to reach each individual in the city before reaching a conclusion.

I believe 10,000 is a very big number, as reaching 10,000 persons is a time-consuming and cost-intensive process, and probably we can't even ask each individual about their preference, for various obvious reasons.

So, what would we do?

We would take a sample of 100 citizens such that it represents the population (knowledge of sampling techniques is required).

Please notice that we could select these 100 citizens in (10000 C 100) ways. Now ponder…

What if the 100 citizens we happened to choose all favour the left wing over the right wing? Such unrepresentative samples are among those (10000 C 100) possibilities, and drawing one of them would lead us to a wrong conclusion.

I would like to enumerate two problems that we are facing:

  1. We can’t work with the population because of cost, time and other constraints.
  2. We can’t rely on just one sample because that could be misleading.

Therefore we need some scientific technique to solve these two problems and come up with a conclusion, and I think this is where we need probability theory and inferential statistics.
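To make the second problem concrete, here is a minimal simulation sketch (the 60% support figure is just an assumed number for illustration, not from the lecture): even if 60% of the 10,000 citizens truly favour the left-wing party, different random samples of 100 give different sample proportions, and probability theory is what tells us how much they can be expected to vary.

import java.util.Random;

public class SamplingDemo {
    public static void main(String[] args) {
        int populationSize = 10000;
        int sampleSize = 100;
        int trueSupporters = 6000;   // assumption for illustration: 60% favour the left-wing party
        Random rng = new Random(42);

        // Population: the first 6000 entries favour left-wing (1), the rest favour right-wing (0)
        int[] population = new int[populationSize];
        for (int i = 0; i < trueSupporters; i++) {
            population[i] = 1;
        }

        // Draw five independent random samples of 100 (without replacement) and report each count
        for (int trial = 1; trial <= 5; trial++) {
            boolean[] chosen = new boolean[populationSize];
            int favour = 0;
            for (int k = 0; k < sampleSize; k++) {
                int idx;
                do {
                    idx = rng.nextInt(populationSize);
                } while (chosen[idx]);
                chosen[idx] = true;
                favour += population[idx];
            }
            System.out.println("Sample " + trial + ": " + favour + " out of " + sampleSize + " favour left-wing");
        }
    }
}

The five counts scatter around 60 rather than landing exactly on it, which is the sample-to-sample variability that a single sample cannot reveal on its own.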

Thanks for reading.

7 Likes

Nicely explained. Thank you. I have also felt that the course has just started and there is a lot to learn and become familiar with. So patience is the key.

3 Likes

Hiee!!

Just to add my understanding of "why do we need probability theory".
Whenever we take a sample out of a population (and we do that because we do not have enough resources, such as cost and time, to work on the entire population), we make inferences about the population. But we can never be 100% sure that whatever inferences we make from that sample will fully (100%) hold for the population. Therefore, after analysing the sample, we always talk about the 'chance' of any event related to that population.

This 'chance' term comes in because there are certain factors which are out of our control, and these factors constitute the randomness in the population. To account for this randomness and quantify how much chance we have that our inferences are correct, probability theory is required!!!

Hope that helps!!!

Regards

4 Likes

Hello Friend
I know we can't do this calculation on a calculator, but being a Java coder, we can write code to calculate this nPr or nCr.

import java.math.BigInteger;

public class nCr {
    public static void main(String[] args) {
        int n = 10000;   // population size
        int r = 100;     // sample size

        // Numerator: n * (n-1) * ... * (n-r+1), which is nPr
        BigInteger nPr = BigInteger.ONE;
        for (int i = 0; i < r; i++) {
            nPr = nPr.multiply(BigInteger.valueOf(n - i));
        }

        // Denominator: r!
        BigInteger rFactorial = BigInteger.ONE;
        for (int i = 1; i <= r; i++) {
            rFactorial = rFactorial.multiply(BigInteger.valueOf(i));
        }

        System.out.println(nPr);                    // This will give nPr.

        System.out.println(nPr.divide(rFactorial)); // This will give nCr.
    }
}

Result of nCr = 65208469245472575695415972927215718683781335425416743372210247172869206520770178988927510291340552990847853030615947098118282371982392705479271195296127415562705948429404753632271959046657595132854990606768967505457396473467998111950929802400
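For what it's worth, the leading digits (6.52…) of this result agree with the ≈ 6.5 * 10^241 figure quoted in the video.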