
Scalars, Vectors, Matrices, and Tensors

(Photo: The University of Chicago - Alvin Wei-Cheng Wong)

- Overview

Scalars, vectors, matrices, and tensors are mathematical objects in linear algebra: 

  • Scalars: A single number, such as 7, -4.2, or π
  • Vectors: An ordered list of numbers, which can be written as a row or a column
  • Matrices: A 2-dimensional array of numbers, typically written as m × n with m rows and n columns
  • Tensors: An n-dimensional array of numbers that generalizes all of the above objects

 

Scalars are zero-dimensional, vectors are one-dimensional, matrices are two-dimensional, and tensors can have any number of dimensions. Tensors are often used to represent multi-dimensional data, such as color images, volumetric data, or time series data. 
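These dimensionalities can be checked directly in code. Below is a minimal sketch using NumPy (the array names and sizes are illustrative assumptions), where `ndim` reports the number of dimensions of each object:

```python
import numpy as np

scalar = np.array(7)                         # 0-dimensional: a single number
vector = np.array([1.0, 2.0, 3.0])           # 1-dimensional: a list of numbers
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])  # 2-dimensional: rows and columns
image = np.zeros((32, 32, 3))                # 3-dimensional tensor: a 32x32 RGB image

# ndim reports the number of dimensions (the "rank") of each object
print(scalar.ndim, vector.ndim, matrix.ndim, image.ndim)  # 0 1 2 3
```

The 3-dimensional array is one example of the multi-dimensional data mentioned above: a color image has height, width, and color channels.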

In machine learning, a vector is an element of a vector space: a collection of objects equipped with rules for addition and scalar multiplication. A vector can be pictured as a directed line segment, though not every vector is a directed line segment. 

Scalars, vectors, and matrices are fundamental structures of linear algebra and are essential for understanding deep learning. 

Scalars, vectors, and matrices are used to represent inputs like text and pictures, which allows machine learning or deep learning models to be trained and deployed. 

Broadly speaking, in linear algebra data is represented in the form of linear equations. These linear equations are in turn represented in the form of matrices and vectors.

Please refer to the following for more details:

 

- Linear Equations

In linear algebra, a linear equation is an algebraic equation in which the highest power of each variable is 1. It is also known as a first-degree equation. 

The standard form of a linear equation in one variable is Ax + B = 0, where x is a variable, A is a coefficient, and B is a constant. The standard form for linear equations in two variables is Ax + By = C. For example, 2x + 3y = 5 is a linear equation in standard form. 

When graphed, a linear equation in one or two variables always represents a straight line. For example, x + 2y = 4 is a linear equation, and its graph is a straight line. 
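Pairing x + 2y = 4 with a second, illustrative equation, 3x + y = 7, gives a system of two linear equations. As a sketch of the matrix-and-vector representation described above, NumPy can solve such a system numerically:

```python
import numpy as np

# System of linear equations:
#    x + 2y = 4
#   3x +  y = 7
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])   # coefficient matrix
b = np.array([4.0, 7.0])     # constants

solution = np.linalg.solve(A, b)
print(solution)  # [2. 1.], i.e., x = 2 and y = 1
```

Substituting back confirms the answer: 2 + 2(1) = 4 and 3(2) + 1 = 7.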

To solve a linear equation, apply inverse operations to both sides until the variable is isolated. For example, if a number is added to the term containing x, subtract that number from both sides of the equation.
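As a small illustration (the numbers here are assumed for the example), solving 2x + 3 = 11 by inverse operations:

```python
# Solve 2x + 3 = 11 with inverse operations:
#   subtract 3 from both sides: 2x = 8
#   divide both sides by 2:      x = 4
x = (11 - 3) / 2
print(x)  # 4.0
```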
 

Linear Algebra in TensorFlow (Scalars, Vectors and Matrices)

You can use scalars, vectors, and matrices in TensorFlow to build machine learning models. For example, you could use a vector to represent the features of an image, and you could use a matrix to represent the weights of a neural network.

In TensorFlow, a scalar is a 0-dimensional tensor, a vector is a 1-dimensional tensor, and a matrix is a 2-dimensional tensor. Tensors are the basic data structures in TensorFlow, and they are used to represent data in machine learning models. 

Here are some examples of how to create and use scalars, vectors, and matrices in TensorFlow:


import tensorflow as tf

# Create a scalar (a 0-dimensional tensor)
scalar = tf.constant(1.0)

# Create a vector (a 1-dimensional tensor)
vector = tf.constant([1.0, 2.0, 3.0])

# Create a matrix (a 2-dimensional tensor with 2 rows and 3 columns)
matrix = tf.constant([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])

# Add two scalars (named `total` to avoid shadowing Python's built-in `sum`)
total = scalar + scalar

# Multiply a vector by a scalar (elementwise)
product = vector * scalar

# Multiply a matrix by a vector (tf.matmul expects two matrices,
# so use tf.linalg.matvec for a matrix-vector product)
matvec = tf.linalg.matvec(matrix, vector)

# Print the results
print(total)
print(product)
print(matvec)



This code will print the following output (TensorFlow shows each tensor's value, shape, and dtype):

tf.Tensor(2.0, shape=(), dtype=float32)
tf.Tensor([1. 2. 3.], shape=(3,), dtype=float32)
tf.Tensor([14. 32.], shape=(2,), dtype=float32)
 
 

In TensorFlow, computation is described using data flow graphs. Each node of the graph represents an instance of a mathematical operation (like addition, division, or multiplication) and each edge is a multi-dimensional data set (tensor) on which the operations are performed.
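As a rough sketch of this idea (assuming TensorFlow 2.x, where `tf.function` traces Python code into a dataflow graph; the function name `affine` is illustrative), the operations that become graph nodes can be inspected:

```python
import tensorflow as tf

@tf.function
def affine(x, w, b):
    # One multiplication node and one addition node in the graph
    return x * w + b

# Trace the function into a concrete graph for scalar float inputs
concrete = affine.get_concrete_function(
    tf.TensorSpec([], tf.float32),
    tf.TensorSpec([], tf.float32),
    tf.TensorSpec([], tf.float32),
)

# Each graph node is an operation; the edges between them carry tensors
op_types = [op.type for op in concrete.graph.get_operations()]
print(op_types)
```

The printed list includes the input placeholders plus the multiplication and addition nodes, mirroring the node-and-edge description above.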

 
 

[More to come ...]

 
