
Neural Networks

[Artificial Neural Network - SpringerLink]

 

 

- Neural Networks Facts

What exactly is a neural network trying to do? Like any other model, it’s trying to make a good prediction. We have a set of inputs and a set of target values — and we are trying to get predictions that match those target values as closely as possible.
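As a minimal sketch of "predictions that match target values," the snippet below scores two sets of predictions against the same targets using mean squared error. The function name `mse` and all the numbers are illustrative choices, not from any particular library:

```python
def mse(predictions, targets):
    """Mean squared error: the average squared gap between prediction and target."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

targets = [1.0, 0.0, 1.0]
good = [0.9, 0.1, 0.8]   # predictions close to the targets
bad = [0.1, 0.9, 0.2]    # predictions far from the targets

# A lower score means the predictions match the targets more closely.
assert mse(good, targets) < mse(bad, targets)
```

Training a network amounts to adjusting its internal weights so that a score like this one keeps going down.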

Many of the recent advances in artificial intelligence are due to deep learning. Without deep learning, we would not have self-driving cars, chatbots, or personal assistants like Alexa and Siri. The Google Translate app would still be as primitive as it was 10 years ago (before Google switched to neural networks for the app), and Netflix or YouTube would have no idea which movies or TV series we like or dislike. Behind all these technologies are neural networks.

Deep Learning is a subset of Machine Learning, which in turn is a subset of Artificial Intelligence. Artificial Intelligence is a general term that refers to techniques that enable computers to mimic human behavior. Machine Learning represents the set of algorithms, trained on data, that make all of this possible.

Deep Learning, on the other hand, is just a type of Machine Learning, inspired by the structure of a human brain. Deep learning algorithms attempt to draw similar conclusions as humans would by continually analyzing data with a given logical structure. To achieve this, deep learning uses a multi-layered structure of algorithms called neural networks.

The design of the neural network is based on the structure of the human brain. Just as we use our brains to identify patterns and classify different types of information, neural networks can be taught to perform the same tasks on data.

The individual layers of neural networks can also be thought of as a sort of filter that works from gross to subtle, increasing the likelihood of detecting and outputting a correct result. The human brain works similarly. Whenever we receive new information, the brain tries to compare it with known objects. The same concept is also used by deep neural networks.

Neural networks enable us to perform many tasks, such as clustering, classification, or regression. With neural networks, we can group or sort unlabeled data according to similarities among the samples in this data. Or, in the case of classification, we can train the network on a labeled dataset in order to classify the samples in this dataset into different categories. Artificial neural networks have capabilities that enable deep learning models to solve tasks that conventional machine learning models cannot.


 

- Artificial Neural Networks (ANNs)

Artificial neural networks (ANNs), usually simply called neural networks (NNs), are computing systems vaguely inspired by the biological neural networks that constitute animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons.

An artificial neuron that receives a signal processes it and can signal the neurons connected to it. The "signal" at a connection is a real number, and the output of each neuron is computed by some non-linear function of the sum of its inputs. The connections are called edges. Neurons and edges typically have a weight that adjusts as learning proceeds; the weight increases or decreases the strength of the signal at a connection. Neurons may have a threshold such that a signal is sent only if the aggregate signal crosses that threshold. Typically, neurons are aggregated into layers. Different layers may perform different transformations on their inputs. Signals travel from the first layer (the input layer) to the last layer (the output layer), possibly after traversing the layers multiple times.
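The description above can be sketched in a few lines: each neuron computes a weighted sum of its inputs plus a bias, applies a non-linear function (a sigmoid here, as one common choice), and passes the result on to the next layer. The weights, biases, and inputs below are made-up values for illustration only:

```python
import math

def sigmoid(x):
    """A common non-linear activation: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Weighted sum of incoming signals, then the non-linear activation.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(total)

def layer(inputs, weight_rows, biases):
    # Each row of weights defines one neuron in the layer.
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

x = [0.5, -1.0]                                      # input layer
h = layer(x, [[1.0, -0.5], [0.3, 0.8]], [0.0, 0.1])  # hidden layer (2 neurons)
y = layer(h, [[1.2, -0.7]], [0.05])                  # output layer (1 neuron)
```

Because of the sigmoid, every value in `h` and `y` lands strictly between 0 and 1; learning would consist of adjusting the weight rows and biases rather than the code structure.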

  

- Neural Networks and AI Research

Neural networks, a significant area of AI research, are currently proving to be valuable for more natural user interfaces through voice recognition and natural language processing, allowing humans to interact with machines similarly to how they interact with each other. By design, neural networks model the biological function of animal brains to interpret and react to specific inputs such as words and tone of voice. As the underlying technologies continue to develop, AI has the potential to enhance online learning, adaptive learning software, and simulations in ways that more intuitively respond to and engage with students.

While neural networks have been around since the 1940s (the earliest versions were called "perceptrons"), it is only in the last several decades that they have become a major part of artificial intelligence. This is due to the arrival of a technique called "backpropagation," which allows networks to adjust their hidden layers of neurons when the outcome doesn't match what the creator is hoping for, such as a network designed to recognize dogs that misidentifies a cat.
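The idea behind backpropagation can be sketched on the smallest possible network, one weight and one input with a squared-error loss: compute the prediction, measure the error against the target, and nudge the weight against the gradient of the loss. The learning rate and data below are arbitrary illustrative choices:

```python
def train_single_weight(x, target, w=0.0, lr=0.1, steps=100):
    """Gradient descent on loss = (w*x - target)**2, the core move of backprop."""
    for _ in range(steps):
        prediction = w * x
        error = prediction - target
        grad = 2 * error * x    # d(loss)/dw, by the chain rule
        w -= lr * grad          # adjust the weight against the gradient
    return w

# The ideal weight here is 3.0, since 3.0 * 2.0 == 6.0.
w = train_single_weight(x=2.0, target=6.0)
```

In a real multilayer network, the same chain-rule step is applied layer by layer, from the output back toward the input, which is where the name "backpropagation" comes from.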

Another important advance has been the arrival of deep learning neural networks, in which different layers of a multilayer network extract different features until the network can recognize what it is looking for. The idea of deep learning can be summarized as: by using brain simulations, hope to (a) make learning algorithms much better and easier to use, and (b) make revolutionary advances in machine learning and AI. Many researchers see this as our best shot at progress towards real AI.

 

[Palace of Justice, Vienna, Austria - Civil Engineering Discoveries]

- Neural Networks and Deep Learning

Deep Learning is a subfield of machine learning concerned with algorithms inspired by the structure and function of the brain, called artificial neural networks. In short, deep learning is large neural networks.

The core of deep learning, according to Andrew Ng, is that we now have computers fast enough, and data sets large enough, to actually train large neural networks. Because we can now build very large neural networks and have access to huge amounts of data, deep learning is taking off.

As we construct larger neural networks and train them with more and more data, their performance continues to increase. This is generally different from other machine learning techniques, whose performance eventually reaches a plateau. For most older generations of learning algorithms, performance plateaus. Deep learning is the first class of algorithms that scales in this way: performance just keeps getting better as you feed them more data.

 

- Deep Neural Networks and Living Brains

Researchers at the Cyber-Physical Systems Group at the USC Viterbi School of Engineering, in conjunction with the University of Illinois at Urbana-Champaign, have developed a new computational model that predicts how information deep inside the brain could flow from one network to another, and how these neuronal network clusters self-optimize over time.

Their work, chronicled in the paper “Network Science Characteristics of Brain-Derived Neuronal Cultures Deciphered From Quantitative Phase Imaging Data,” is believed to be the first study to observe this self-optimization phenomenon in in vitro neuronal networks, and counters existing models. 

Their findings can open new research directions for biologically inspired artificial intelligence and the detection and diagnosis of brain cancer, and may contribute to or inspire new Parkinson's treatment strategies.

Similarly, researchers have demonstrated that the deep neural networks most proficient at classifying speech, music and simulated scents have architectures that seem to parallel the brain’s auditory and olfactory systems. Such parallels also show up in deep nets that can look at a 2D scene and infer the underlying properties of the 3D objects within it, which helps to explain how biological perception can be both fast and incredibly rich. All these results hint that the structures of living neural systems embody certain optimal solutions to the tasks they have taken on.
 
 

 
[More to come ...]
 