WHAT IS DEEP LEARNING?

Deep learning is not a new subject. For a long time, scientists have wanted to build machines that can think for themselves. The field had to pass through several stages to reach its current state, and many researchers have contributed to it along the way. It can even be said that some of the important algorithms in this subject have defeated human minds. Some of the important ideas in this story are representation learning, cybernetics, and connectionism.



According to Ian Goodfellow, Yoshua Bengio, and Aaron Courville, the dream of thinking machines is ancient: long before computers existed, myths such as those of Pygmalion, Daedalus, and Hephaestus imagined artificial beings that could learn and think by themselves. So what is deep learning? A simple definition is that it allows computers to learn from experience and to understand the world in terms of a hierarchy of concepts. This hierarchy allows the computer to learn complicated concepts by building them out of simpler ones, and its applications now appear in many common computing systems.

It is very interesting to note that artificial intelligence systems have already defeated human minds in some famous contests. Chess has long been considered an important test of intelligence, since a strong player must read the ability and intentions of an opponent. Garry Kasparov was the greatest chess player of his time and had won a lot of matches. In 1997, IBM's chess system Deep Blue defeated this champion.

Artificial intelligence has a deep connection with machine learning. AI systems have the ability to acquire their own knowledge by extracting patterns from raw data; this capability is known as machine learning. Once a machine has learned how to perform a specific task from data, that machine is said to be trained, and one can say it has successfully applied a machine learning algorithm. An example is the Naive Bayes algorithm, which can separate legitimate email from spam email. Another example: a useful feature for speaker identification from sound is an estimate of the size of the speaker's vocal tract, which gives a strong clue as to whether the speaker is a man, a woman, or a child. So what is a feature in this whole scenario?
A feature can be anything that serves as a distinguishing factor for an algorithm; it is one of the specific things from which an algorithm learns something about a picture or pattern. As an example, suppose we would like to write a program to detect cars in photographs. We know that cars have wheels, so we might use the presence of a wheel as a feature. A representation learning algorithm can discover a good set of features in minutes for a simple task, or in hours to months for a complex task, whereas deep learning goes further and builds complex concepts on top of representation learning.

It is also important to note that deep learning has had three main waves. The first, roughly from the 1940s to the 1960s, was called cybernetics: models took n inputs, associated them with an output y, and learned weights for the inputs, an idea that is still useful today. In the second wave, called connectionism or parallel distributed processing, the key insight was that a large number of simple computational units networked together can achieve intelligent behavior. The third wave, from about 2006 onward, gave us today's complex algorithms, which try to build such systems out of simpler ones.
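The Naive Bayes spam filter mentioned earlier can be sketched in a few lines. This is only an illustrative toy (the training messages and word-count approach here are assumptions for the example, not a production filter): the algorithm counts how often each word appears in spam and legitimate ("ham") mail, then picks the class that makes a new message most probable.

```python
import math
from collections import Counter

def train_naive_bayes(docs, labels):
    """Count word frequencies per class and how many docs each class has."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    class_counts = Counter(labels)
    for doc, label in zip(docs, labels):
        word_counts[label].update(doc.lower().split())
    return word_counts, class_counts

def classify(doc, word_counts, class_counts):
    """Pick the class with the highest log-probability, using add-one smoothing."""
    vocab = set(word_counts["spam"]) | set(word_counts["ham"])
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)  # class prior
        n_words = sum(word_counts[label].values())
        for word in doc.lower().split():
            # P(word | class) with add-one smoothing so unseen words don't zero out
            score += math.log((word_counts[label][word] + 1) / (n_words + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy training data, entirely made up for illustration
docs = ["win free money now", "free prize claim now",
        "meeting agenda for monday", "lunch with the team"]
labels = ["spam", "spam", "ham", "ham"]
wc, cc = train_naive_bayes(docs, labels)
print(classify("claim your free money", wc, cc))  # prints "spam"
```

Even this tiny version shows the general shape of machine learning: the "knowledge" (word statistics) is extracted from raw data rather than hand-written as rules.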

In a nutshell, it can be said that deep learning is a central part of artificial intelligence. In this whole process, devices learn things from experience, and complex algorithms are used to extract features from raw data. The field has developed over three main waves: cybernetics, connectionism (parallel distributed processing), and the current era of deep learning.
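The cybernetics-era idea described above, taking n inputs, associating them with an output y, and learning a weight for each input, can be sketched as a simple perceptron. The learning rate and the AND-gate training data below are illustrative choices for the example, not something from the original sources:

```python
def perceptron_train(samples, epochs=10, lr=1.0):
    """Learn weights w and bias b so that step(w . x + b) matches the targets."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            error = target - y
            # Classic perceptron update: nudge weights toward the correct answer
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

def predict(x, w, b):
    """Weighted sum of the n inputs, thresholded to produce the output y."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy data: the logical AND function, which is linearly separable
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(samples)
print([predict(x, w, b) for x, _ in samples])  # prints [0, 0, 0, 1]
```

A single weighted-sum unit like this can only draw a straight decision boundary; the later connectionist wave networked many such units together, and deep learning stacks them into the hierarchies of concepts discussed above.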




