
Probability, Statistics and AI



- Overview

Artificial intelligence (AI) and probability theory are closely related. Statistics and probability theory provide the mathematical foundations for many AI techniques. For example, Bayesian statistics is used to model uncertainty and make predictions in AI systems. 

Here are some ways probability and statistics are used in AI:

  • Probabilistic reasoning: A form of knowledge representation that uses the concept of probability to indicate the degree of uncertainty in knowledge.
  • Probabilistic models: Used to analyze data with statistical techniques.
  • Machine learning: Heavily utilizes statistics, which is built upon probability theory.
  • Statistical techniques: Help identify the most informative features and discard redundant or irrelevant ones.


Probability enables us to reason about uncertainty, while statistics quantifies and explains it.
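
For instance, Bayes' rule gives a concrete way to update a degree of belief when new evidence arrives. The following Python sketch is a minimal, self-contained illustration; the prior and likelihood values are hypothetical numbers chosen only for the example.

# Minimal sketch of probabilistic reasoning with Bayes' rule.
# The prior and likelihood values below are hypothetical, for illustration only.

def bayes_update(prior, likelihood, false_positive_rate):
    """Return P(hypothesis | positive evidence) via Bayes' rule."""
    evidence = likelihood * prior + false_positive_rate * (1.0 - prior)
    return (likelihood * prior) / evidence

# Belief that a machine part is faulty, updated after a positive diagnostic test.
prior = 0.01            # P(faulty)
likelihood = 0.95       # P(positive test | faulty)
false_positive = 0.05   # P(positive test | not faulty)

posterior = bayes_update(prior, likelihood, false_positive)
print(f"P(faulty | positive test) = {posterior:.3f}")  # about 0.161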

Statistics are important for understanding and improving AI systems. Statistical models enable AI algorithms to learn from data, adapt to new information, and make informed decisions. Statistical inference is also important for evaluating the performance and reliability of AI systems. 

Statistical techniques are essential for validating and refining ML models. For example, techniques like hypothesis testing, cross-validation, and bootstrapping help quantify the performance of models and avoid problems like over-fitting. 
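
As an illustration, the sketch below runs 5-fold cross-validation with scikit-learn on synthetic data; the dataset, model choice, and fold count are assumptions made for the example, not a prescription.

# A minimal sketch of k-fold cross-validation (assumes scikit-learn is installed).
# The dataset here is synthetic, generated only for illustration.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = LogisticRegression(max_iter=1000)

# Each of the 5 folds is held out once for evaluation, which estimates how well
# the model generalizes to unseen data and helps reveal over-fitting.
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print("fold accuracies:", scores)
print("mean accuracy: %.3f (+/- %.3f)" % (scores.mean(), scores.std()))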

Many performance metrics used in ML algorithms, such as accuracy, precision, recall, F-score, and root mean squared error, are grounded in statistics.
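
To make these metrics concrete, the short Python sketch below computes them on toy predictions; the labels and values are made up purely for illustration.

# Toy illustration of common evaluation metrics (assumes NumPy and scikit-learn).
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])   # made-up ground-truth labels
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0])   # made-up model predictions

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("F-score  :", f1_score(y_true, y_pred))

# Root mean squared error for a regression-style prediction.
y_true_reg = np.array([3.0, 5.0, 2.5, 7.0])
y_pred_reg = np.array([2.8, 5.4, 2.0, 6.5])
print("RMSE     :", np.sqrt(np.mean((y_true_reg - y_pred_reg) ** 2)))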

A basic understanding of probability and statistical concepts and their application to solving real-world problems is necessary. This prerequisite provides a solid background in applications of probability and statistics that will serve as a foundation for AI and advanced techniques, including statistical concepts, probability theory, random and multivariate variables, data and sampling distributions, descriptive statistics, and hypothesis testing.


- Modern Statistics

Statistics is both a body of theory and a set of methods of analysis. The subject matter of statistics covers a wide range, extending from the planning of experiments and other studies that generate data to the collection, analysis, presentation, and interpretation of those data. Numerical data constitute the raw material of statistics.

The essence of modern statistics, however, is the theory and methodology of drawing inferences that extend beyond the particular set of data examined, and of making decisions based on appropriate analysis of such data.


- Types of Statistics

Statistics can be classified into two categories:

  • Descriptive Statistics
  • Inferential Statistics

 

In statistics, descriptive statistics describe the data, whereas inferential statistics help you make predictions from the data. In inferential statistics, data are drawn from a sample and used to generalize to the population.

In general, to infer means to draw a conclusion from evidence. Statistical inference, therefore, means drawing conclusions about a population from sample data, using various statistical analysis techniques.
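
The distinction can be made concrete with a short sketch: descriptive statistics summarize a sample, while an inferential step (here, an approximate confidence interval) generalizes from that sample to the population. The data below are synthetic, and the normal approximation is an assumption of the example.

# Descriptive vs. inferential statistics on a hypothetical sample (assumes NumPy).
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=170.0, scale=8.0, size=100)   # e.g., heights in cm

# Descriptive statistics: summarize the sample itself.
print("mean  :", sample.mean())
print("median:", np.median(sample))
print("std   :", sample.std(ddof=1))

# Inferential statistics: generalize from the sample to the population,
# here with an approximate 95% confidence interval for the population mean.
se = sample.std(ddof=1) / np.sqrt(sample.size)
print("95% CI for the mean:", (sample.mean() - 1.96 * se, sample.mean() + 1.96 * se))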


- AI and Probability

Probability is the likelihood of an event occurring. In artificial intelligence (AI), probability is used to model and reason about uncertain situations. For example, AI can calculate the probability that a person with a certain height and weight will be obese. 
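
A hedged sketch of that height-and-weight example: a Gaussian naive Bayes classifier fitted to synthetic data estimates such a conditional probability. The data, labels, and model choice are assumptions made purely for illustration.

# Illustrative only: estimate P(obese | height, weight) with Gaussian naive Bayes
# on synthetic data (assumes NumPy and scikit-learn).
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n = 500
height = rng.normal(170.0, 10.0, n)              # cm
weight = rng.normal(75.0, 15.0, n)               # kg
bmi = weight / (height / 100.0) ** 2
obese = (bmi >= 30).astype(int)                  # toy label derived from BMI

X = np.column_stack([height, weight])
model = GaussianNB().fit(X, obese)

# Probability that a person 165 cm tall weighing 90 kg is obese, per this toy model.
p = model.predict_proba([[165.0, 90.0]])[0, 1]
print(f"P(obese | height=165, weight=90) = {p:.2f}")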

Probabilistic reasoning is a form of knowledge representation that uses probability to indicate the degree of uncertainty in knowledge. In AI, probabilistic models are used to analyze data with statistical techniques. Probabilistic reasoning was one of the earliest machine learning methods.

AI can be used to predict outcomes, scenarios, and actions based on simulations, models, and optimization. This can help test hypotheses, explore options, and make informed decisions. For example, in industrial settings, AI can facilitate predictive maintenance by monitoring machinery and equipment data.


- Probability vs. Statistics

Probability and statistics are related areas of mathematics which concern themselves with analyzing the relative frequency of events. Still, there are fundamental differences in the way they see the world:  

  • Probability deals with predicting the likelihood of future events, while statistics involves the analysis of the frequency of past events.  
  • Probability is primarily a theoretical branch of mathematics, which studies the consequences of mathematical definitions. Statistics is primarily an applied branch of mathematics, which tries to make sense of observations in the real world.

 

Both subjects are important, relevant, and useful. But they are different, and understanding the distinction is crucial in properly interpreting the relevance of mathematical evidence. Many a gambler has gone to a cold and lonely grave for failing to make the proper distinction between probability and statistics.

This distinction will perhaps become clearer if we trace the thought process of a mathematician encountering her first craps game:

  • If this mathematician were a probabilist, she would see the dice and think, "Six-sided dice? Presumably each face is equally likely to land face up. Assuming each face comes up with probability 1/6, I can figure out what my chances of crapping out are."
  • If instead a statistician wandered by, she would see the dice and think, "Those dice may look OK, but how do I know that they are not loaded? I'll watch for a while and keep track of how often each number comes up. Then I can decide whether my observations are consistent with the assumption of equal-probability faces. Once I'm confident enough that the dice are fair, I'll call a probabilist to tell me how to play."

 

In summary, probability theory enables us to find the consequences of a given ideal world, while statistical theory enables us to measure the extent to which our world is ideal.
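
Both viewpoints can be sketched in a few lines of Python: the probabilist computes the chance of crapping out under the fair-dice assumption, and the statistician tests that assumption against observed rolls with a chi-square goodness-of-fit test. The rolls here are simulated, standing in for real observations.

# Probabilist vs. statistician on the same dice (assumes NumPy and SciPy).
import numpy as np
from scipy.stats import chisquare

# Probabilist: assuming each face has probability 1/6, the chance of
# crapping out (rolling 2, 3, or 12 with two dice) is 4/36.
p_craps = (1 + 2 + 1) / 36
print(f"P(craps on the come-out roll) = {p_craps:.3f}")

# Statistician: watch the dice and test whether the faces look fair.
rng = np.random.default_rng(2)
rolls = rng.integers(1, 7, size=600)              # 600 simulated rolls of one die
observed = np.bincount(rolls, minlength=7)[1:]    # counts of faces 1..6
stat, p_value = chisquare(observed)               # H0: all faces equally likely
print("observed counts:", observed)
print(f"chi-square p-value = {p_value:.3f}")      # a large p-value is consistent with fair dice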



- Machine Learning, Probability and Statistics

Machine learning (ML) is an interdisciplinary field that uses statistics, probability, and algorithms to learn from data and provide insights that can be used to build intelligent applications.

Statistics and machine learning are two very closely related fields. In fact, the line between the two can be very fuzzy at times. Nevertheless, there are methods that clearly belong to the field of statistics that are not only useful, but invaluable when working on a machine learning project. 

It would be fair to say that statistical methods are required to effectively work through a machine learning predictive modeling project.

  • Basic Statistics  —  Mean, Median, Mode, Variance, Covariance, etc.
  • Basic Rules of Probability  —  Events (dependent and independent), sample space, conditional probability.
  • Random variables  —  continuous and discrete, expectation, variance, distribution (joint and conditional).
  • Bayes' Theorem  —  Updates the probability of a hypothesis as new evidence arrives. Bayesian software helps machines identify patterns and make decisions.
  • Maximum Likelihood Estimation (MLE)  —  Parameter estimation; requires knowledge of basic probability concepts such as joint probability and independence of events (see the sketch after this list).
  • Common distributions  —  Binomial, Poisson, Bernoulli, Gaussian, Exponential. 
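
As a small illustration of the MLE and distribution items above, the sketch below estimates parameters for a Bernoulli and a Gaussian distribution from synthetic samples with known "true" values; both estimators happen to have simple closed forms.

# Maximum likelihood estimation for two common distributions (assumes NumPy).
# The data are synthetic, drawn from distributions with known parameters.
import numpy as np

rng = np.random.default_rng(3)

# Bernoulli: the MLE of p is the sample mean of the 0/1 outcomes.
flips = rng.binomial(1, p=0.3, size=1000)
print(f"Bernoulli MLE: p = {flips.mean():.3f} (true value 0.3)")

# Gaussian: the MLEs of the mean and variance are the sample mean and
# the (1/n) sample variance.
x = rng.normal(loc=5.0, scale=2.0, size=1000)
mu_hat = x.mean()
sigma2_hat = ((x - mu_hat) ** 2).mean()
print(f"Gaussian MLE: mu = {mu_hat:.3f}, sigma^2 = {sigma2_hat:.3f} (true values 5.0, 4.0)")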

 

- Statistical Machine Learning

Statistical ML is the application of statistical methods to make predictions about unseen data. It provides mathematical tools for analyzing the behavior and performance of machine learning algorithms. 

Statistical ML is based on statistical learning theory, which is a framework for ML that draws from statistics and functional analysis. Statistical learning theory deals with the statistical inference problem of finding a predictive function based on data. 

Statistical ML is broadly the same as ML, but the main distinction between them lies in the culture: ML practice emphasizes building models that make accurate predictions on new, unseen data, while statistical learning places more emphasis on inference, interpretability, and understanding the process that generated the data.

Some of the more complex ML algorithms, such as neural networks, have statistical principles at their core. The loss functions they minimize, such as cross-entropy, derive from statistical ideas like maximum likelihood, and optimization techniques like gradient descent are used to fit the models to data.
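
As a sketch of that connection, the following Python example fits a linear model by gradient descent on the mean squared error, which coincides with maximum likelihood estimation under Gaussian noise. The data, learning rate, and iteration count are toy values chosen for the example.

# Gradient descent on mean squared error for a linear model (assumes NumPy).
# Minimizing MSE is equivalent to maximum likelihood under Gaussian noise.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=200)
y = 3.0 * X + 1.0 + rng.normal(scale=0.5, size=200)   # true slope 3, intercept 1

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    error = w * X + b - y
    w -= lr * 2.0 * np.mean(error * X)   # gradient of MSE with respect to w
    b -= lr * 2.0 * np.mean(error)       # gradient of MSE with respect to b

print(f"fitted slope = {w:.2f}, intercept = {b:.2f} (true values 3.0, 1.0)")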


 

[More to come ...]


