
Probability, Statistics and AI



- Overview

Statistics is important for understanding and improving artificial intelligence (AI) systems. Statistical models enable AI algorithms to learn from data, adapt to new information, and make informed decisions. Statistical inference is also important for evaluating the performance and reliability of AI systems.

Statistical techniques are essential for validating and refining machine learning models. For example, techniques like hypothesis testing, cross-validation, and bootstrapping help quantify the performance of models and avoid problems like overfitting.
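As a concrete illustration, the bootstrap can be sketched with nothing but Python's standard library: resample the observed values with replacement many times and read a confidence interval off the percentiles of the resampled means. The scores below are hypothetical model-accuracy values, invented for illustration.

```python
import random
import statistics

def bootstrap_ci(data, n_resamples=10000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_resamples):
        # Sample with replacement, same size as the original data
        resample = [rng.choice(data) for _ in data]
        means.append(statistics.fmean(resample))
    means.sort()
    lo = means[int((alpha / 2) * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Hypothetical accuracy scores from repeated evaluation runs
scores = [0.82, 0.79, 0.85, 0.81, 0.78, 0.84, 0.80, 0.83]
low, high = bootstrap_ci(scores)
```

The percentile method used here is the simplest bootstrap interval; more refined variants (such as BCa) correct for bias and skew.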

Many performance metrics used in machine learning, such as accuracy, precision, recall, F-score, and root mean squared error, are grounded in statistics.
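These metrics are simple enough to compute from first principles. A minimal sketch for binary classification and regression follows; the labels and predictions are toy values chosen only to exercise the formulas.

```python
import math

def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

def rmse(y_true, y_pred):
    """Root mean squared error for regression outputs."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

# Toy data: one positive example is missed (a false negative)
acc, prec, rec, f1 = classification_metrics([1, 0, 1, 1, 0, 1],
                                            [1, 0, 0, 1, 0, 1])
```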
A statistics degree can be beneficial for students who want to work in AI research. Many students choose to major in both statistics and computer science.

A basic understanding of probability and statistical concepts and their application to solving real-world problems is necessary. This prerequisite provides a solid background in applications of probability and statistics that will serve as a foundation for artificial intelligence and advanced techniques, including statistical concepts, probability theory, random and multivariate variables, data and sampling distributions, descriptive statistics, and hypothesis testing.


- Modern Statistics

Statistics is both a body of theory and a set of methods of analysis. The subject matter of statistics covers a wide range, extending from the planning of experiments and other studies that generate data to the collection, analysis, presentation, and interpretation of those data. Numerical data constitute the raw material of statistics.

The essence of modern statistics, however, is the theory and methodology of drawing inferences that extend beyond the particular set of data examined, and of making decisions based on appropriate analysis of the data.



- AI and Probability

Probability is the likelihood of an event occurring. In artificial intelligence (AI), probability is used to model and reason about uncertain situations. For example, AI can calculate the probability that a person with a certain height and weight will be obese. 

Probabilistic reasoning is a form of knowledge representation that uses probability to indicate the degree of uncertainty in knowledge. In AI, probabilistic models are used to examine data with statistical techniques. Probabilistic reasoning was one of the earliest machine learning methods.
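Bayes' theorem is the workhorse of this kind of reasoning: it updates a prior belief in light of evidence. The sketch below applies it to a diagnostic-test scenario; the base rate, sensitivity, and false-positive rate are hypothetical numbers chosen for illustration.

```python
def bayes_posterior(prior, likelihood, false_positive_rate):
    """P(H | E) via Bayes' theorem.

    P(H | E) = P(E | H) * P(H) / P(E), where the evidence P(E) is
    expanded over the two hypotheses H and not-H.
    """
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: 1% base rate, 95% sensitivity, 5% false positives
posterior = bayes_posterior(prior=0.01, likelihood=0.95,
                            false_positive_rate=0.05)
```

Even with a fairly accurate test, the low base rate keeps the posterior well under 20% — a classic example of why uncertainty must be reasoned about explicitly rather than guessed.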

AI can be used to predict outcomes, scenarios, and actions based on simulations, models, and optimization. This can help test hypotheses, explore options, and make informed decisions. For example, in industrial settings, AI can facilitate predictive maintenance by monitoring machinery and equipment data.

Please refer to the following for more information.

Harvard University: Probability Cheat Sheet

Carnegie Mellon University: Probability Cheat Sheet


- Probability versus Statistics

Probability and statistics are related areas of mathematics that concern themselves with analyzing the relative frequency of events. Still, there are fundamental differences in the way they see the world:

  • Probability deals with predicting the likelihood of future events, while statistics involves the analysis of the frequency of past events.  
  • Probability is primarily a theoretical branch of mathematics, which studies the consequences of mathematical definitions. Statistics is primarily an applied branch of mathematics, which tries to make sense of observations in the real world.

Both subjects are important, relevant, and useful. But they are different, and understanding the distinction is crucial in properly interpreting the relevance of mathematical evidence. Many a gambler has gone to a cold and lonely grave for failing to make the proper distinction between probability and statistics.

This distinction will perhaps become clearer if we trace the thought process of a mathematician encountering her first craps game:

  • If this mathematician were a probabilist, she would see the dice and think: "Six-sided dice? Presumably each face of the dice is equally likely to land face up. Now assuming that each face comes up with probability 1/6, I can figure out what my chances of crapping out are."
  • If instead a statistician wandered by, she would see the dice and think: "Those dice may look OK, but how do I know that they are not loaded? I'll watch a while, and keep track of how often each number comes up. Then I can decide if my observations are consistent with the assumption of equal-probability faces. Once I'm confident enough that the dice are fair, I'll call a probabilist to tell me how to play."
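The statistician's procedure above can be sketched as a simulation: roll a (software) die many times, tally the faces, and compute a chi-square statistic against the fair-die hypothesis. The seed and number of rolls are arbitrary choices.

```python
import random

def chi_square_uniform(counts):
    """Chi-square statistic against the equal-probability hypothesis."""
    n = sum(counts)
    expected = n / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(42)
rolls = [rng.randint(1, 6) for _ in range(6000)]
counts = [rolls.count(face) for face in range(1, 7)]
stat = chi_square_uniform(counts)
# For a fair die the statistic should usually be small; with 5 degrees
# of freedom, values above ~11.07 would be suspicious at the 5% level.
```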

In summary, probability theory enables us to find the consequences of a given ideal world, while statistical theory enables us to measure the extent to which our world is ideal.



- Machine Learning, Probability and Statistics

Machine learning is an interdisciplinary field that uses statistics, probability, and algorithms to learn from data and provide insights that can be used to build intelligent applications.

Statistics and machine learning are two very closely related fields. In fact, the line between the two can be very fuzzy at times. Nevertheless, there are methods that clearly belong to the field of statistics that are not only useful, but invaluable when working on a machine learning project. 

It would be fair to say that statistical methods are required to effectively work through a machine learning predictive modeling project.

  • Basic Statistics  —  Mean, Median, Mode, Variance, Covariance, etc.
  • Basic Rules of Probability  —  Events (dependent and independent), sample space, conditional probability.
  • Random variables  —  continuous and discrete, expectation, variance, distribution (joint and conditional).
  • Bayes' Theorem  —  computes the validity of beliefs. Bayesian software helps machines identify patterns and make decisions.
  • Maximum Likelihood Estimation (MLE)  —  Parameter estimation. Knowledge of basic probability concepts (joint probability and independence of events) is required.
  • Common distributions  —  Binomial, Poisson, Bernoulli, Gaussian, Exponential. 
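Several of these items can be illustrated directly with Python's standard library. The data below are made up for illustration; the last snippet shows that the maximum likelihood estimate of a Bernoulli parameter is simply the sample proportion of successes.

```python
import statistics

# Basic descriptive statistics on toy data
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
y = [1.0, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0, 6.0]

mean_x = statistics.fmean(x)     # arithmetic mean
var_x = statistics.pvariance(x)  # population variance

# Sample covariance computed by hand (avoids version-specific helpers)
mean_y = statistics.fmean(y)
cov_xy = sum((a - mean_x) * (b - mean_y)
             for a, b in zip(x, y)) / (len(x) - 1)

# MLE for a Bernoulli parameter: the fraction of observed successes
flips = [1, 0, 1, 1, 0, 1, 1, 1]
p_hat = sum(flips) / len(flips)
```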


- Statistical Machine Learning

Statistical machine learning is the application of statistical methods to make predictions about unseen data. It provides mathematical tools for analyzing the behavior and performance of machine learning algorithms. 

Statistical machine learning is based on statistical learning theory, which is a framework for machine learning that draws from statistics and functional analysis. Statistical learning theory deals with the statistical inference problem of finding a predictive function based on data. 

Statistical machine learning is broadly the same as machine learning; the main distinction between them is one of culture and emphasis. Machine learning practitioners focus on building models that make accurate predictions on new, unseen data, while statistical learning places more weight on inference, interpretability, and quantifying the uncertainty of those predictions.

Some of the more complex machine learning algorithms, such as neural networks, have statistical principles at their core: the objectives they minimize, such as cross-entropy loss, are derived from statistical ideas like maximum likelihood, and optimization techniques such as gradient descent are used to fit these models to data.
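As a minimal sketch of that connection, gradient descent on mean squared error fits a simple linear model, and minimizing squared error coincides with maximum likelihood estimation under Gaussian noise. The synthetic data, learning rate, and iteration count below are arbitrary choices for illustration.

```python
import random

# Synthetic data from a known line y = 2x + 1 plus Gaussian noise
rng = random.Random(0)
xs = [i / 10 for i in range(100)]
ys = [2.0 * x + 1.0 + rng.gauss(0, 0.1) for x in xs]

# Gradient descent on mean squared error (the Gaussian log-likelihood,
# up to a constant) for the model y ≈ w*x + b
w, b, lr, n = 0.0, 0.0, 0.01, len(xs)
for _ in range(3000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b
```

After training, `w` and `b` should recover the generating parameters (2.0 and 1.0) up to the noise level. In practice, libraries use vectorized or stochastic variants of this loop, but the statistical interpretation is the same.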


