
Note for Artificial Neural Network - ANN by Aman Kumar

  • Artificial Neural Network - ANN
  • Note
  • Dr. A PJ Abdul Kalam Tech University Lucknow - AKTU
  • Computer Science Engineering
  • B.Tech

Text from page-3

❑ ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process.
❑ The amazing thing about a neural network is that you don't have to program it to learn explicitly: it learns all by itself, just like a brain!
❑ An artificial neural network (ANN), usually called a neural network (NN), is a mathematical or computational model inspired by the structure and/or functional aspects of biological neural networks.
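To make the "mathematical model" idea concrete, here is a minimal illustrative sketch (not taken from these notes) of a single artificial neuron: it computes a weighted sum of its inputs plus a bias and passes the result through a sigmoid activation function. The function name, weights, and bias values are assumptions chosen only for illustration.

```python
import math

def neuron_output(inputs, weights, bias):
    """One artificial neuron: sigmoid of the weighted sum of inputs plus a bias."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))  # sigmoid activation

# Illustrative values only: two inputs with assumed weights and bias.
print(neuron_output([0.5, 1.0], weights=[0.8, -0.4], bias=0.1))
```

In this picture, "learning by example" amounts to adjusting the weights and bias from training data rather than programming the input-output mapping by hand.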

Text from page-4

WHY ANN?
• Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques.
• A trained neural network can be thought of as an "expert" in the category of information it has been given to analyse. This expert can then be used to provide projections given new situations of interest and answer "what if" questions.

Text from page-5

Other advantages
• Adaptive learning: an ability to learn how to do tasks based on the data given for training or initial experience.
• Self-organisation: an ANN can create its own organisation or representation of the information it receives during learning time.
• Real-time operation: ANN computations may be carried out in parallel, and special hardware devices are being designed and manufactured which take advantage of this capability.
• Fault tolerance via redundant information coding: partial destruction of a network leads to a corresponding degradation of performance. However, some network capabilities may be retained even with major network damage.

Text from page-6

Brief History of Neural Networks
❑ The earliest work in neural computing goes back to the 1940s, when McCulloch and Pitts introduced the first neural network computing model.
❑ In the 1950s, Rosenblatt's work resulted in a two-layer network, the perceptron, which was capable of learning certain classifications by adjusting connection weights. Although the perceptron was successful in classifying certain patterns, it had a number of limitations.
❑ The perceptron was not able to solve the classic XOR (exclusive or) problem. Such limitations led to a decline of the field of neural networks; however, the perceptron had laid the foundations for later work in neural computing.
❑ In the early 1980s, researchers showed renewed interest in neural networks. More recent work includes Boltzmann machines, Hopfield nets, competitive learning models, multilayer networks, and adaptive resonance theory models.
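The notes state that Rosenblatt's perceptron learned by adjusting connection weights and that it could not solve XOR. The sketch below (an illustration, not from the notes) shows a Rosenblatt-style learning rule trained on AND, which is linearly separable and therefore learnable, and on XOR, which is not, so a single perceptron cannot reach full accuracy. Function names, the learning rate, and the epoch count are assumptions.

```python
def train_perceptron(samples, epochs=25, lr=0.1):
    """Rosenblatt-style rule: nudge the weights whenever the prediction is wrong."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - pred          # 0 if correct, +1 or -1 if wrong
            w[0] += lr * error * x1        # adjust connection weights
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def accuracy(samples, w, b):
    hits = sum(1 for (x1, x2), t in samples
               if (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t)
    return hits / len(samples)

AND_DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # linearly separable
XOR_DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # not linearly separable

for name, data in [("AND", AND_DATA), ("XOR", XOR_DATA)]:
    w, b = train_perceptron(data)
    print(name, "training accuracy:", accuracy(data, w, b))  # AND reaches 1.0; XOR cannot
```

This single-layer limitation is what the multilayer networks mentioned above were later able to overcome.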
