
1. Deep Learning https://store.theartofservice.com/the-deep-learning-toolkit.html

2. Computer vision - Related fields: Several related techniques (e.g., neural-net and deep-learning-based image and feature analysis and classification) have their background in biology.

3. Machine learning - Representation learning: Deep learning algorithms discover multiple levels of representation, or a hierarchy of features, with higher-level, more abstract features defined in terms of (or generating) lower-level features.

4. Artificial neural network - History: In the 1990s, neural networks were overtaken in popularity in machine learning by support vector machines and other, much simpler methods such as linear classifiers. Renewed interest in neural nets was sparked in the 2000s by the advent of deep learning.

5. Artificial neural network - Recent improvements: Between 2009 and 2012, the recurrent neural networks and deep feedforward neural networks developed in Jürgen Schmidhuber's research group at the Swiss AI Lab IDSIA won eight international competitions in pattern recognition and machine learning (2012 Kurzweil AI interview with Jürgen Schmidhuber on the eight competitions won by his deep learning team 2009–2012, http://www.kurzweilai.net/how-bio-inspired-deep-learning-keeps-winning-competitions). One example is multi-dimensional long short-term memory (LSTM) (Graves, Alex; and Schmidhuber, Jürgen; Offline Handwriting Recognition with Multidimensional Recurrent Neural Networks; in Bengio, Yoshua; Schuurmans, Dale; Lafferty, John; Williams, Chris K).

6. Artificial neural network - Recent improvements: Deep learning feedforward networks, such as convolutional neural networks, alternate convolutional layers and max-pooling layers, topped by several pure classification layers.
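The alternation of convolution and pooling described above can be sketched numerically. The kernel and pooling sizes below are illustrative assumptions, not values from any cited network; the sketch only tracks how the spatial size of a feature map shrinks on its way to the final classification layers:

```python
# Sketch: spatial size of a feature map through a deep feedforward network
# that alternates convolution and max-pooling layers. Sizes are illustrative.

def conv_output(size, kernel, stride=1):
    """Valid (no-padding) convolution output size along one dimension."""
    return (size - kernel) // stride + 1

def pool_output(size, window):
    """Non-overlapping max-pooling output size along one dimension."""
    return size // window

def stack(size, layers):
    """Apply a list of ('conv', kernel) / ('pool', window) layers to a square input."""
    for kind, param in layers:
        size = conv_output(size, param) if kind == 'conv' else pool_output(size, param)
    return size

# A 28x28 input through two conv/pool pairs, as in early digit-recognition nets.
layers = [('conv', 5), ('pool', 2), ('conv', 5), ('pool', 2)]
print(stack(28, layers))  # → 4: the spatial size fed to the classification layers
```

Each convolution trims the border (28 → 24 → 8 across the two conv layers here) and each pooling halves the map, which is why a few such pairs suffice to reduce an image to a small grid of high-level features.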

7. Andrew Ng: He researches primarily in artificial intelligence, machine learning, and deep learning. His early work includes the Stanford Autonomous Helicopter project, which developed one of the most capable autonomous helicopters in the world, and the STAIR (STanford Artificial Intelligence Robot) project, which resulted in ROS (Robot Operating System), a widely used open-source robotics software platform.

8. Andrew Ng - Machine learning research: Among its notable results was a neural network trained using deep learning algorithms on 16,000 CPU cores that learned to recognize higher-level concepts, such as cats, after watching only YouTube videos, and without ever having been told what a cat is.

9. Ben Goertzel - Papers: Goertzel, Ben (2011). Integrating a Compositional Spatiotemporal Deep Learning Network with Symbolic Representation/Reasoning within an Integrative Cognitive Architecture via an Intermediary Semantic Network. Proceedings of AAAI Symposium on Cognitive Systems, Arlington, VA.

10. Ben Goertzel - Papers: Goertzel, Ben (2011). Imprecise Probability as a Linking Mechanism Between Deep Learning, Symbolic Cognition and Local Feature Detection in Vision Processing. Proceedings of AGI-11, Lecture Notes in AI, Springer Verlag. http://goertzel.org/VisualAttention_AGI_11.pdf

11. Serbo-Croatian - Croatian linguists: "At the end of the 15th century [in Dubrovnik and Dalmatia], sermons and poems were exquisitely crafted in the Croatian language by those men whose names are widely renowned by deep learning and piety."

12. Pattern recognition - Regression algorithms (predicting real-valued labels): Neural networks and deep learning methods.

13. Deep learning: 'Deep learning' is a set of algorithms in machine learning that attempt to learn multiple levels of representation, corresponding to different levels of abstraction. It typically uses artificial neural networks. The levels in these learned statistical models correspond to distinct levels of concepts, where higher-level concepts are defined from lower-level ones, and the same lower-level concepts can help to define many higher-level concepts.

14. Deep learning: Deep learning is part of a broader family of machine learning methods based on learning representations. An observation (e.g., an image) can be represented in many ways (e.g., as a vector of pixels), but some representations make it easier to learn tasks of interest (e.g., is this the image of a human face?) from examples, and research in this area attempts to define what makes better representations and how to learn them.
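A toy sketch of why the choice of representation matters. The four points, their labels, and the derived `x * y` feature below are invented for illustration: the same observations are inseparable by a threshold on a raw coordinate, but trivially separable under a better-chosen feature:

```python
# Points labeled by the XOR of the signs of their coordinates. In the raw
# (x, y) representation no single-feature threshold separates the classes,
# but the derived feature x*y makes the task trivial.

points = [(1, 1), (-1, -1), (1, -1), (-1, 1)]
labels = [0, 0, 1, 1]  # XOR-of-signs labels

def separable_by_threshold(features, labels):
    """True if some threshold on one feature splits the two classes."""
    pairs = sorted(zip(features, labels))
    seq = [label for _, label in pairs]
    # separable iff all 0-labels precede all 1-labels, or vice versa
    return seq == sorted(seq) or seq == sorted(seq, reverse=True)

raw_x = [p[0] for p in points]            # raw first coordinate
product = [x * y for x, y in points]      # a better representation

print(separable_by_threshold(raw_x, labels))    # False: raw feature fails
print(separable_by_threshold(product, labels))  # True: task becomes trivial
```

Deep learning automates this step: instead of an analyst hand-crafting `x * y`, the network is expected to discover such task-simplifying features from examples.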

15. Deep learning: Ronan Collobert has said that deep learning is just a buzzword for neural nets.

16. Deep learning - Introduction: The term deep learning gained traction in the mid-2000s after a publication by Geoffrey Hinton and Ruslan Salakhutdinov (Learning multiple layers of representation, http://www.cs.toronto.edu/~hinton/absps/tics.pdf).

17. Deep learning - Introduction: In 1992, Jürgen Schmidhuber had already implemented a very similar idea for the more general case of unsupervised deep hierarchies of recurrent neural networks, and had also experimentally shown its benefits for speeding up supervised learning (Schmidhuber, Jürgen; Learning complex, extended sequences using the principle of history compression; Neural Computation, 4(2):234–242, 1992. Schmidhuber, Jürgen; My First Deep Learning System of 1991 + Deep Learning Timeline 1962–2013, http://www.idsia.ch/~juergen/firstdeeplearner.html).

18. Deep learning - Introduction: Advances in hardware have been an important enabling factor for the resurgence of neural networks and the advent of deep learning, in particular the availability of powerful and inexpensive graphics processing units (GPUs) also suitable for general-purpose computing.

19. Deep learning - Introduction: Deep learning has attracted the attention of such thinkers as Ray Kurzweil, who was hired by Google to do deep learning research.

20. Deep learning - Introduction: Gary Marcus has expressed skepticism of deep learning's capabilities, noting that…

21. Deep learning - Fundamental concepts: The appropriate number of levels, and the structure that relates these factors, are also things that a deep learning algorithm is expected to discover from examples.

22. Deep learning - Fundamental concepts: Deep learning algorithms often involve other important ideas that correspond to broad a priori beliefs about these unknown underlying factors.

23. Deep learning - Fundamental concepts: Many deep learning algorithms are actually framed as unsupervised learning, e.g., using many examples of natural images to discover good representations of them. Because most of these algorithms can be applied to unlabeled data, they can leverage large amounts of it, even when the data cannot be associated with labels of the immediate tasks of interest.
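A minimal sketch of this unsupervised setting, with all data, sizes, and the learning rate chosen purely for illustration: a one-hidden-unit linear autoencoder with tied weights, trained by plain gradient descent to reconstruct unlabeled 2-D points, discovers the direction along which the data varies without ever seeing a label:

```python
# Unlabeled 2-D points lying along the direction (0.6, 0.8).
data = [(t * 0.6, t * 0.8) for t in (-1.0, -0.5, 0.0, 0.5, 1.0)]

def reconstruct(w, x):
    h = w[0] * x[0] + w[1] * x[1]      # encode: 2 numbers -> 1 hidden value
    return (h * w[0], h * w[1])        # decode with the same (tied) weights

def loss(w):
    """Total squared reconstruction error over the unlabeled dataset."""
    return sum((xh - xi) ** 2
               for x in data
               for xh, xi in zip(reconstruct(w, x), x))

def grad(w):
    """Analytic gradient of the reconstruction loss w.r.t. the tied weights."""
    g = [0.0, 0.0]
    for x in data:
        h = w[0] * x[0] + w[1] * x[1]
        xhat = (h * w[0], h * w[1])
        for k in range(2):
            for i in range(2):
                d = x[k] * w[i] + (h if i == k else 0.0)
                g[k] += 2.0 * (xhat[i] - x[i]) * d
    return g

w = [1.0, 0.0]                         # deliberately misaligned start
for _ in range(500):
    g = grad(w)
    w = [w[0] - 0.05 * g[0], w[1] - 0.05 * g[1]]

print(round(loss(w), 6))               # near zero: the direction was recovered
```

The training signal is reconstruction of the input itself, so no labels are needed; the single hidden value per example is exactly the kind of learned representation the passage describes, just at toy scale.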

24. Deep learning - Deep learning in artificial neural networks: Deep learning neural networks date back at least to the 1980 Neocognitron of Kunihiko Fukushima.

25. Deep learning - Deep learning in artificial neural networks: Another method is the long short-term memory (LSTM) network of 1997 by Hochreiter and Schmidhuber (Hochreiter, Sepp; and Schmidhuber, Jürgen; Long Short-Term Memory; Neural Computation, 9(8):1735–1780, 1997). In 2009, deep multidimensional LSTM networks demonstrated the power of deep learning with many nonlinear layers by winning three ICDAR 2009 competitions in connected handwriting recognition, without any prior knowledge about the three different languages to be learned (Graves, Alex; and Schmidhuber, Jürgen; Offline Handwriting Recognition with Multidimensional Recurrent Neural Networks; in Bengio, Yoshua; Schuurmans, Dale; Lafferty, John; Williams, Chris K).
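A scalar, forward-only sketch of the LSTM cell idea. The weights below are arbitrary illustrative values, not learned, and a real LSTM uses vector-valued gates; the point is only the mechanism: input, forget, and output gates regulate an additively updated memory cell, which is what lets information persist across many time steps:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM time step. W maps gate name -> (w_x, w_h, bias)."""
    def pre(name):
        w_x, w_h, b = W[name]
        return w_x * x + w_h * h_prev + b
    i = sigmoid(pre('input'))          # how much new information to write
    f = sigmoid(pre('forget'))         # how much old memory to keep
    o = sigmoid(pre('output'))         # how much memory to expose
    g = math.tanh(pre('cand'))         # candidate value to write
    c = f * c_prev + i * g             # additive memory-cell update
    h = o * math.tanh(c)               # hidden state / output
    return h, c

# Arbitrary, hand-picked weights purely for demonstration.
W = {name: (0.5, 0.5, 0.0) for name in ('input', 'forget', 'output', 'cand')}
h, c = 0.0, 0.0
for x in (1.0, 0.0, 0.0, 0.0):         # an input pulse, then silence
    h, c = lstm_step(x, h, c, W)
print(h > 0.0, c > 0.0)                # the pulse persists in the memory cell
```

Because the cell state `c` is updated additively (`f * c_prev + i * g`) rather than being squashed through a nonlinearity at every step, error signals can survive over long time lags, which is the property behind the handwriting-recognition wins cited above.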

26. Deep learning - Deep learning in artificial neural networks: As of 2011, the state of the art in deep learning feedforward networks alternates convolutional layers and max-pooling layers.

27. Deep learning - Deep learning in artificial neural networks: Such supervised deep learning methods were also the first artificial pattern recognizers to achieve human-competitive performance on certain tasks (D. C. Ciresan, U. Meier, J. Schmidhuber; Multi-Column Deep Neural Networks for Image Classification; IEEE Conf. on Computer Vision and Pattern Recognition, CVPR 2012).

28. Deep learning - Deep learning in the human brain: These models share the interesting property that various proposed learning dynamics in the brain (e.g., a wave of neurotrophic growth factor) conspire to support the self-organization of just the sort of inter-related neural networks utilized in the later, purely computational deep learning models. Such networks appear analogous to one way of understanding the neocortex: as a hierarchy of filters in which each layer captures some of the information in the operating environment, then passes the remainder, as well as a modified base signal, to layers further up the hierarchy.

29. Deep learning - Deep learning in the human brain: The theory of deep learning therefore sees the coevolution of culture and cognition as a fundamental condition of human evolution (Shrager, J., Johnson, M).