Knowledge

Deep learning


using generative models of deep belief nets (DBN) would overcome the main difficulties of neural nets. However, it was discovered that replacing pre-training with large amounts of training data for straightforward backpropagation, when using DNNs with large, context-dependent output layers, produced error rates dramatically lower than the then-state-of-the-art Gaussian mixture model (GMM)/hidden Markov model (HMM) systems, and also lower than more advanced generative model-based systems. The nature of the recognition errors produced by the two types of systems was characteristically different, offering technical insights into how to integrate deep learning into the existing highly efficient run-time speech decoding system deployed by all major speech recognition systems. Analysis around 2009–2010, contrasting the GMM (and other generative speech models) with DNN models, stimulated early industrial investment in deep learning for speech recognition. That analysis was done with comparable performance (less than a 1.5% difference in error rate) between discriminative DNNs and generative models. In 2010, researchers extended deep learning from
The need for training data does not stop once an ANN is trained. Rather, there is a continued demand for human-generated verification data to constantly calibrate and update the ANN. For this purpose, Facebook introduced the feature that once a user is automatically recognized in an image, they receive a notification. They can choose whether or not they like to be publicly labeled on the image, or tell Facebook that it is not them in the picture. This user interface is a mechanism to generate "a constant stream of verification data" to further train the network in real time. As Mühlhoff argues, the involvement of human users to generate training and verification data is so typical for most commercial end-user applications of deep learning that such systems may be referred to as "human-aided artificial intelligence".

Rosenblatt (1958) proposed the perceptron, an MLP with 3 layers: an input layer, a hidden layer with randomized weights that did not learn, and an output layer. He later published a 1962 book that also introduced variants and computer experiments, including a version with four-layer perceptrons "with adaptive preterminal networks" where the last two layers have learned weights (here he credits H. D. Block and B. W. Knight). The book cites an earlier network by R. D. Joseph (1960) "functionally equivalent to a variation of" this four-layer system (the book mentions Joseph over 30 times). Should Joseph therefore be considered the originator of proper adaptive

For recurrent neural networks, in which a signal may propagate through a layer more than once, the CAP depth is potentially unlimited. No universally agreed-upon threshold of depth divides shallow learning from deep learning, but most researchers agree that deep learning involves CAP depth higher than two. CAP of depth two has been shown to be a universal approximator in the sense that it can emulate any function; beyond that, more layers do not add to the function-approximation ability of the network. Deep models (CAP > two) are able to extract better features than shallow models, and hence extra layers help in learning the features effectively.

Deep BSDE addresses the computational challenges faced by traditional numerical methods in high-dimensional settings. Specifically, traditional methods like finite difference methods or Monte Carlo simulations often struggle with the curse of dimensionality, where computational cost increases exponentially with the number of dimensions. Deep BSDE methods, however, employ deep neural networks to approximate solutions of high-dimensional partial differential equations (PDEs), effectively reducing the computational burden.

The integration of physics-informed neural networks (PINNs) into the deep BSDE framework enhances its capability by embedding the underlying physical laws directly into the neural network architecture. This ensures that the solutions not only fit the data but also adhere to the governing stochastic differential equations. PINNs leverage the power of deep learning while respecting the constraints imposed by the physical models, resulting in more accurate and reliable solutions for financial mathematics problems.

Deep TAMER was introduced in 2018 in a collaboration between the U.S. Army Research Laboratory (ARL) and UT researchers. Deep TAMER used deep learning to provide a robot with the ability to learn new tasks through observation. Using Deep TAMER, a robot learned a task with a human trainer, watching video streams or observing a human perform a task in person.
The robot later practiced the task with the help of some coaching from the trainer, who provided feedback such as "good job" and "bad job".

In 2008, researchers at the University of Texas at Austin (UT) developed a machine learning framework called Training an Agent Manually via Evaluative Reinforcement, or TAMER, which proposed new methods for robots or computer programs to learn how to perform tasks by interacting with a human instructor. First developed as TAMER, a new algorithm called Deep TAMER was later introduced in 2018 during a collaboration between

Batching (computing the gradient on several training examples at once rather than individual examples) speeds up computation. The large processing capabilities of many-core architectures (such as GPUs or the Intel Xeon Phi) have produced significant speedups in training, because of the suitability of such processing architectures for matrix and vector computations.

These were designed for unsupervised learning of deep generative models. However, they were more computationally expensive than backpropagation. The Boltzmann machine learning algorithm, published in 1985, was briefly popular before being eclipsed by the backpropagation algorithm in 1986 (p. 112). A 1988 network became state of the art in

These filters are well-tuned to their operating environment. A 1995 description stated, "...the infant's brain seems to organize itself under the influence of waves of so-called trophic-factors ... different regions of the brain become connected sequentially, with one layer of tissue maturing before another and so on until the whole brain is mature".

Long short-term memory (LSTM) was published in 1995. LSTM can learn "very deep learning" tasks with long credit assignment paths that require memories of events that happened thousands of discrete time steps before. That LSTM was not yet the modern architecture, which required a "forget gate", introduced in 1999; this became the standard RNN architecture.

Recurrent neural networks (RNN) were further developed in the 1980s. Recurrence is used for sequence processing, and when a recurrent network is unrolled, it mathematically resembles a deep feedforward layer. Consequently, they have similar properties and issues, and their developments had mutual influences. In RNN, two early influential works were the

The CMAC (cerebellar model articulation controller) is one such kind of neural network. It does not require learning rates or randomized initial weights. The training process can be guaranteed to converge in one step with a new batch of data, and the computational complexity of the training algorithm is linear with respect to the number of neurons involved.

Gaussian mixture model/hidden Markov model (GMM-HMM) technology, based on generative models of speech trained discriminatively. Key difficulties have been analyzed, including gradient diminishing and weak temporal correlation structure in neural predictive models. Additional difficulties were the lack of training data and limited computing power.
In further reference to the idea that artistic sensitivity might be inherent in relatively low levels of the cognitive hierarchy, a published series of graphic representations of the internal states of deep (20-30 layers) neural networks attempting to discern within essentially random data the images
Deep learning has been shown to produce competitive results in medical applications such as cancer cell classification, lesion detection, organ segmentation and image enhancement. Modern deep learning tools demonstrate high accuracy in detecting various diseases and the helpfulness of their use by
In 2023, Murray et al. developed a deep learning architecture capable of determining whether a defendant should be tried as a child or an adult. Their software was able to estimate subject age with significant accuracy. The same team has developed architectures capable of performing ante-mortem
A deep neural network (DNN) is an artificial neural network with multiple layers between the input and output layers. There are different types of neural networks but they always consist of the same components: neurons, synapses, weights, biases, and functions. These components as a whole function in
that constitute animal brains. Such systems learn (progressively improve their ability) to do tasks by considering examples, generally without task-specific programming. For example, in image recognition, they might learn to identify images that contain cats by analyzing example images that have been
A main criticism concerns the lack of theory surrounding some methods. Learning in the most common deep architectures is implemented using well-understood gradient descent. However, the theory surrounding other algorithms, such as contrastive divergence, is less clear. (e.g., does it converge? If so,
method in which the system "learns from millions of examples". It translates "whole sentences at a time, rather than pieces". Google Translate supports over one hundred languages. The network encodes the "semantics of the sentence rather than simply memorizing phrase-to-phrase translations". GT uses
As deep learning moves from the lab into the world, research and experience show that artificial neural networks are vulnerable to hacks and deception. By identifying patterns that these systems use to function, attackers can modify inputs to ANNs in such a way that the ANN finds a match that human
Although a systematic comparison between the human brain organization and the neuronal encoding in deep networks has not yet been established, several analogies have been reported. For example, the computations performed by deep learning units could be similar to those of actual neurons and neural
have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer. By 2019, graphics processing units (GPUs), often with AI-specific enhancements, had displaced CPUs as the dominant method for training large-scale
For example, a DNN that is trained to recognize dog breeds will go over the given image and calculate the probability that the dog in the image is a certain breed. The user can review the results and select which probabilities the network should display (above a certain threshold, etc.) and return
and Lapa in 1965. They regarded it as a form of polynomial regression, or a generalization of Rosenblatt's perceptron. A 1971 paper described a deep network with eight layers trained by this method, which is based on layer-by-layer training through regression analysis. Superfluous hidden units are
In 2016, researchers used one ANN to doctor images in trial-and-error fashion, identify another's focal points, and thereby generate images that deceived it. The modified images looked no different to human eyes. Another group showed that printouts of doctored images, then photographed, successfully
Large-scale automatic speech recognition is the first and most convincing successful case of deep learning. LSTM RNNs can learn "Very Deep Learning" tasks that involve multi-second intervals containing speech events separated by thousands of discrete time steps, where one time step corresponds to
are considered promising for energy-efficient deep learning hardware where the same basic device structure is used for both logic operations and data storage. In 2020, Marega et al. published experiments with a large-area active channel material for developing logic-in-memory devices and circuits
The 2009 NIPS Workshop on Deep Learning for Speech Recognition was motivated by the limitations of deep generative models of speech, and the possibility that, given more capable hardware and large-scale data sets, deep neural nets might become practical. It was believed that pre-training DNNs
Traditional weather prediction systems solve a very complex system of partial differential equations. GraphCast is a deep learning based model, trained on a long history of weather data to predict how weather patterns change over time. It is able to predict weather conditions for up to 10 days
Image reconstruction is the reconstruction of the underlying images from the image-related measurements. Several works have shown superior performance of deep learning methods compared to analytical methods for various applications, e.g., spectral imaging and ultrasound imaging.
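A minimal, illustrative sketch of the learned-reconstruction idea follows, using PyTorch with random placeholder images standing in for real spectral or ultrasound measurements; the two-layer network and all names are assumptions made for illustration, not a published reconstruction model.

```python
import torch
from torch import nn

# Placeholder "measurements": clean images corrupted by noise. Real spectral
# or ultrasound pipelines use physical forward models instead.
clean = torch.rand(8, 1, 32, 32)
noisy = clean + 0.1 * torch.randn_like(clean)

# A tiny CNN learns the measurement-to-image mapping from example pairs.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(100):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(noisy), clean)
    loss.backward()
    optimizer.step()
```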
DNNs are typically feedforward networks in which data flows from the input layer to the output layer without looping back. At first, the DNN creates a map of virtual neurons and assigns random numerical values, or "weights", to connections between them. The weights and inputs are multiplied and
database, offering researchers the opportunity to identify materials with desired properties for various applications. This development has implications for the future of scientific discovery and the integration of AI in material science research, potentially expediting material innovation and
Recommendation systems have used deep learning to extract meaningful features for a latent factor model for content-based music and journal recommendations. Multi-view deep learning has been applied for learning user preferences from multiple domains. The model uses a hybrid collaborative and
return an output between 0 and 1. If the network did not accurately recognize a particular pattern, an algorithm would adjust the weights. That way the algorithm can make certain parameters more influential, until it determines the correct mathematical manipulation to fully process the data.
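As a toy illustration of the loop just described, the sketch below (plain NumPy, with made-up data and an assumed single sigmoid output unit trained on squared error) multiplies weights and inputs, squashes the sum into an output between 0 and 1, and adjusts the weights whenever the output misses the target:

```python
import numpy as np

def sigmoid(z):
    # Squash the weighted sum into the (0, 1) output range described above.
    return 1.0 / (1.0 + np.exp(-z))

# Made-up data: one input vector and its known target label.
x = np.array([0.5, -1.2, 3.0])
target = 1.0

rng = np.random.default_rng(0)
w = rng.normal(size=3)  # random initial "weights" on the connections
b = 0.0

for _ in range(100):
    y = sigmoid(w @ x + b)             # weights and inputs are multiplied
    grad = (y - target) * y * (1 - y)  # squared-error gradient through the sigmoid
    w -= 0.1 * grad * x                # make the correct answer more likely
    b -= 0.1 * grad
```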
As of 2017, neural networks typically have a few thousand to a few million units and millions of connections. Despite this number being several orders of magnitude less than the number of neurons in a human brain, these networks can perform many tasks at a level beyond that of humans (e.g.,
Although CNNs trained by backpropagation had been around for decades, and GPU implementations of NNs for years, including CNNs, faster implementations of CNNs on GPUs were needed to progress on computer vision. Later, as deep learning became widespread, specialized hardware and algorithm
is always challenging, since many data points must be considered and analyzed before a target segment can be created and used in ad serving by any ad server. Deep learning has been used to interpret large, many-dimensioned advertising datasets. Many data points are collected during the
regularization randomly omits units from the hidden layers during training. This helps to exclude rare dependencies. Finally, data can be augmented via methods such as cropping and rotating such that smaller training sets can be increased in size to reduce the chances of overfitting.
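A minimal NumPy sketch of both ideas, assuming "inverted" dropout and simple flip/crop augmentation (the function names, rates, and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, training=True):
    # Randomly omit hidden units during training; scale survivors so the
    # expected activation is unchanged when dropout is disabled at test time.
    if not training:
        return activations
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

def augment(image):
    # Grow a small training set with label-preserving transforms:
    # a mirror image and a slightly shifted crop of the original.
    flipped = image[:, ::-1]
    top = rng.integers(0, 4)
    left = rng.integers(0, 4)
    cropped = image[top:top + image.shape[0] - 4, left:left + image.shape[1] - 4]
    return [image, flipped, cropped]
```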
The impact of deep learning in industry began in the early 2000s, when CNNs already processed an estimated 10% to 20% of all the checks written in the US, according to Yann LeCun. Industrial applications of deep learning to large-scale speech recognition started around 2010.
Like the neocortex, neural networks employ a hierarchy of layered filters in which each layer considers information from a prior layer (or the operating environment), and then passes its output (and possibly the original input) to other layers. This process yields a self-organizing stack of transducers. The first representational layer may attempt to identify basic shapes such as lines and circles, the second layer may compose and encode arrangements of edges, the third layer may encode a nose and eyes, and the fourth layer may recognize that the image contains a face.
Subsequent run of the network on an input image (left): The network correctly detects the starfish. However, the weakly weighted association between ringed texture and sea urchin also confers a weak signal to the latter from one of two intermediate nodes. In addition, a
Some deep learning architectures display problematic behaviors, such as confidently classifying unrecognizable images as belonging to a familiar category of ordinary images (2014) and misclassifying minuscule perturbations of correctly classified images (2013).
language models. This lets the strength of the acoustic modeling aspects of speech recognition be more easily analyzed. The error rates listed below, including these early results and measured as percent phone error rates (PER), have been summarized since 1991.
Deep architectures include many variants of a few basic approaches. Each architecture has found success in specific domains. It is not always possible to compare the performance of multiple architectures, unless they have been evaluated on the same data sets.
in the early 1990s. These developmental theories were instantiated in computational models, making them predecessors of deep learning systems. These developmental models share the property that various proposed learning dynamics in the brain (e.g., a wave of
by discovering over 2 million new materials within a relatively short timeframe. GNoME employs deep learning techniques to efficiently explore potential material structures, achieving a significant increase in the identification of stable inorganic
Typically, neurons are organized in layers. Different layers may perform different kinds of transformations on their inputs. Signals travel from the first (input), to the last (output) layer, possibly after traversing the layers multiple times.
data set. MNIST is composed of handwritten digits and includes 60,000 training examples and 10,000 test examples. As with TIMIT, its small size lets users test multiple configurations. A comprehensive list of results on this set is available.
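Assuming the Keras convenience loader is available, the standard split described above can be inspected directly:

```python
from tensorflow.keras.datasets import mnist

# MNIST ships pre-split exactly as described above:
# 60,000 training examples and 10,000 test examples of 28x28 digits.
(x_train, y_train), (x_test, y_test) = mnist.load_data()
print(x_train.shape, y_train.shape)  # (60000, 28, 28) (60000,)
print(x_test.shape, y_test.shape)    # (10000, 28, 28) (10000,)
```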
Deep learning algorithms can be applied to unsupervised learning tasks. This is an important benefit because unlabeled data are more abundant than the labeled data. Examples of deep structures that can be trained in an unsupervised manner are
A word embedding can be thought of as a representational layer in a deep learning architecture that transforms an atomic word into a positional representation of the word relative to other words in the dataset; the position is represented as a point in a vector space.
OpenAI estimated the hardware computation used in the largest deep learning projects from AlexNet (2012) to AlphaZero (2017) and found a 300,000-fold increase in the amount of computation required, with a doubling-time trendline of 3.4 months.
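Those two figures are mutually consistent, as a quick check shows (assuming exact 300,000-fold growth over the roughly five years between AlexNet in 2012 and AlphaZero in 2017):

$$
\log_2(300{,}000) \approx 18.2 \ \text{doublings}, \qquad
18.2 \times 3.4\ \text{months} \approx 62\ \text{months} \approx 5.2\ \text{years}.
$$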
The starfish match with a ringed texture and a star outline, whereas most sea urchins match with a striped texture and oval shape. However, the instance of a ring-textured sea urchin creates a weakly weighted association between them.
reducing costs in product development. The use of AI and deep learning suggests the possibility of minimizing or eliminating manual lab experiments and allowing scientists to focus more on the design and analysis of unique compounds.
were used for the first time to predict various properties of molecules in a large toxicology data set. In 2019, generative neural networks were used to produce molecules that were validated experimentally all the way into mice.
observers would not recognize. For example, an attacker can make subtle changes to an image such that the ANN finds a match even though the image looks to a human nothing like the search target. Such manipulation is termed an "adversarial attack".
Deep learning-based image recognition has become "superhuman", producing more accurate results than human contestants. This first occurred in 2011 in recognition of traffic signs, and in 2014, with recognition of human faces.
the StyleGAN (2018), based on the Progressive GAN by Tero Karras et al. Here the GAN generator is grown from small to large scale in a pyramidal fashion. Image generation by GAN reached popular success, and provoked discussions concerning
The original goal of the neural network approach was to solve problems in the same way that a human brain would. Over time, attention focused on matching specific mental abilities, leading to deviations from biology such as
Deep learning-trained vehicles now interpret 360° camera views. Another example is Facial Dysmorphology Novel Analysis (FDNA) used to analyze cases of human malformation connected to a large database of genetic syndromes.
Each connection (synapse) between neurons can transmit a signal to another neuron. The receiving (postsynaptic) neuron can process the signal(s) and then signal downstream neurons connected to it. Neurons may have state, generally represented by
activation functions and was generalised to feed-forward multi-layer architectures in 1991 by Kurt Hornik. Recent work also showed that universal approximation holds for non-bounded activation functions such as
a probabilistic context-free grammar (PCFG) implemented by an RNN. Recursive auto-encoders built atop word embeddings can assess sentence similarity and detect paraphrasing. Deep neural architectures provide the best results for constituency parsing,
Closely related to the progress that has been made in image recognition is the increasing application of deep learning techniques to various visual art tasks. DNNs have proven themselves capable, for example, of
useful feature representations from the data automatically. This does not eliminate the need for hand-tuning; for example, varying numbers of layers and layer sizes can provide different degrees of abstraction.
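The sketch below illustrates how the layer count and layer sizes enter as hand-tuned choices, using standard PyTorch building blocks; the helper function and its arguments are illustrative, not a prescribed recipe:

```python
from torch import nn

def make_mlp(in_dim, hidden_sizes, out_dim):
    # The number and widths of hidden layers are hand-tuned hyperparameters;
    # each additional layer offers a further level of abstraction.
    layers, prev = [], in_dim
    for width in hidden_sizes:
        layers += [nn.Linear(prev, width), nn.ReLU()]
        prev = width
    layers.append(nn.Linear(prev, out_dim))
    return nn.Sequential(*layers)

shallow = make_mlp(784, [128], 10)         # one level of learned features
deep = make_mlp(784, [512, 256, 128], 10)  # several levels of abstraction
```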
and they are also still a long way from integrating abstract knowledge, such as information about what objects are, what they are for, and how they are typically used. The most powerful A.I. systems, like

The extra layers enable composition of features from lower layers, potentially modeling complex data with fewer units than a similarly performing shallow network. For instance, it was proved that sparse
A large percentage of candidate drugs fail to win regulatory approval. These failures are caused by insufficient efficacy (on-target effect), undesired interactions (off-target effects), or unanticipated
5276:". David E. Rumelhart, James L. McClelland, and the PDP research group. (editors), Parallel distributed processing: Explorations in the microstructure of cognition, Volume 1: Foundation. MIT Press, 1986. 2889:
on which they were trained demonstrate a visual appeal: the original research notice received well over 1,000 comments, and was the subject of what was for a time the most frequently accessed article on
The system's predictions were validated through autonomous robotic experiments, demonstrating a noteworthy success rate of 71%. The data of newly discovered materials is publicly available through the
into layers and "training" them to process data. The adjective "deep" refers to the use of multiple layers (ranging from three to several hundred or thousands) in the network. Methods used can be either
that can be used to measure age. Galkin et al. used deep neural networks to train an epigenetic aging clock of unprecedented accuracy using >6,000 blood samples. The clock uses information from 1000
were developed for generative modeling. They are trained by training one restricted Boltzmann machine, then freezing it and training another one on top of the first one, and so on, then optionally
Using word embedding as an RNN input layer allows the network to parse sentences and phrases using an effective compositional vector grammar. A compositional vector grammar can be thought of as
pruned using a separate validation set. Since the activation functions of the nodes are Kolmogorov-Gabor polynomials, these were also the first deep networks with multiplicative units or "gates."
artificial general intelligence (AGI) architectures. These issues may possibly be addressed by deep learning architectures that internally form states homologous to image-grammar decompositions of observed entities and events.
as "cat" or "no cat" and using the analytic results to identify cats in other images. They have found most use in applications difficult to express with a traditional computer algorithm using
1307:
achieved for the first time superhuman performance in a visual pattern recognition contest, outperforming traditional methods by a factor of 3. It then won more contests. They also showed how
populations. Similarly, the representations developed by deep learning models are similar to those measured in the primate visual system both at the single-unit and at the population levels.
generative models and deep belief networks, may be closer to biological reality. In this respect, generative neural network models have been related to neurobiological evidence about sampling-based processing in the cerebral cortex.

real numbers, typically between 0 and 1. Neurons and synapses may also have a weight that varies as learning proceeds, which can increase or decrease the strength of the signal that it sends downstream.
to apply CNN to phoneme recognition. It used convolutions, weight sharing, and backpropagation. In 1988, Wei Zhang applied a backpropagation-trained CNN to alphabet recognition. In 1989,
voice command system open a particular web address, and hypothesized that this could "serve as a stepping stone for further attacks (e.g., opening a web page hosting drive-by malware)".
in 1673 to networks of differentiable nodes. The terminology "back-propagating errors" was actually introduced in 1962 by Rosenblatt, but he did not know how to implement this, although
not as an all-encompassing solution. Despite the power of deep learning methods, they still lack much of the functionality needed to realize this goal entirely. Research psychologist
algorithm have been proposed in order to increase its processing realism. Other researchers have argued that unsupervised forms of deep learning, such as those based on hierarchical
for parallel convolutional processing. The authors identify two key advantages of integrated photonics over its electronic counterparts: (1) massively parallel data transfer through
11736: 11587:
has also built a dedicated system to handle large deep learning models, the CS-2, based on the largest processor in the industry, the second-generation Wafer Scale Engine (WSE-2).
The principle of elevating "raw" features over hand-crafted optimization was first explored successfully in the architecture of deep autoencoder on the "raw" spectrogram or linear
hypothesized that these behaviors are due to limitations in their internal representations and that these limitations would inhibit integration into heterogeneous multi-component
to transform the data into a more suitable representation for a classification algorithm to operate on. In the deep learning approach, features are not hand-crafted and the model
backward stochastic differential equations (BSDE). This method is particularly useful for solving high-dimensional problems in financial mathematics. By leveraging the powerful function approximation capabilities of
The debut of DNNs for speaker recognition in the late 1990s and speech recognition around 2009–2011, and of LSTM around 2003–2007, accelerated progress in eight major areas:
A variety of approaches have been used to investigate the plausibility of deep learning models from a neurobiological perspective. On the one hand, several variants of the
that was not included in the training gives a weak signal for the oval shape, also resulting in a weak signal for the sea urchin output. These weak signals may result in a
accuracy, known as the "degradation" problem. In 2015, two techniques were developed to train very deep networks: the Highway Network was published in May 2015, and the
The word "deep" in "deep learning" refers to the number of layers through which the data is transformed. More precisely, deep learning systems have a substantial
that can then find other instances of it. A refinement is to search using only parts of the image, to identify images from which that piece may have been taken.
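One published technique for crafting such imperceptible perturbations is the fast gradient sign method (FGSM); it is named here as a representative example, not necessarily the attack used in the studies above. A minimal PyTorch sketch:

```python
import torch
import torch.nn.functional as F

def fgsm(model, image, label, eps=0.01):
    # Fast gradient sign method: move every pixel a tiny step in the
    # direction that increases the classifier's loss. The change is
    # imperceptible to humans but can flip the predicted class.
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    return (image + eps * image.grad.sign()).detach()
```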
and (2) extremely high data modulation speeds. Their system can execute trillions of multiply-accumulate operations per second, indicating the potential of
DNNs can model complex non-linear relationships. DNN architectures generate compositional models where the object is expressed as a layered composition of primitives.
10955: 10924:
However, current neural networks do not intend to model the brain function of organisms, and are generally seen as low-quality models for that purpose.
Neural networks have been used for implementing language models since the early 2000s. LSTM helped to improve machine translation and language modeling.
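A skeletal LSTM language model in PyTorch, showing the embed-recur-project structure such systems share; the sizes and names are illustrative assumptions, not a specific published system:

```python
import torch
from torch import nn

class TinyLSTMLM(nn.Module):
    # Minimal LSTM language model: embed tokens, run them through an
    # LSTM, and project each hidden state to next-token logits.
    def __init__(self, vocab_size=10000, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, tokens):  # tokens: (batch, seq_len) integer ids
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)  # (batch, seq_len, vocab_size)

logits = TinyLSTMLM()(torch.randint(0, 10000, (2, 16)))
```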
The initial success in speech recognition was based on small-scale recognition tasks using TIMIT. The data set contains 630 speakers from eight major
A key advance for the deep learning revolution was hardware advances, especially GPU. Some early work dated back to 2004. In 2009, Raina, Madhavan, and
produced work on "Intelligent Machinery" that was not published in his lifetime, containing "ideas related to artificial evolution and learning RNNs."
concerns the capacity of networks with bounded width but the depth is allowed to grow. Lu et al. proved that if the width of a deep neural network with
credit assignment path (CAP) depth. The CAP is the chain of transformations from input to output. CAPs describe potentially causal connections between input and output. For a
Schmidhuber's principle of artificial curiosity) became state of the art in generative modeling during the 2014–2018 period. Excellent image quality is achieved by

Watson (...) use techniques like deep learning as just one element in a very complicated ensemble of techniques, ranging from the statistical technique of
the proposed label. Each mathematical manipulation as such is considered a layer, and complex DNNs have many layers, hence the name "deep" networks.
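A small sketch of the review-and-threshold step described above, with hypothetical breed probabilities and labels:

```python
import numpy as np

def top_predictions(probs, labels, threshold=0.1):
    # Keep only the class probabilities above a user-chosen threshold,
    # sorted from most to least likely.
    order = np.argsort(probs)[::-1]
    return [(labels[i], float(probs[i])) for i in order if probs[i] >= threshold]

probs = np.array([0.62, 0.25, 0.08, 0.05])           # hypothetical network output
labels = ["beagle", "basset hound", "boxer", "pug"]  # hypothetical breeds
print(top_predictions(probs, labels))
# [('beagle', 0.62), ('basset hound', 0.25)]
```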
as well as a range of large-vocabulary speech recognition tasks, have steadily improved. Convolutional neural networks were superseded for ASR by
In medical informatics, deep learning was used to predict sleep quality based on data from wearables and predictions of health complications from electronic health record data.
13511: 10240:
In 2014, the state of the art was training "very deep neural networks" with 20 to 30 layers. Stacking too many layers led to a steep reduction in
applied backpropagation to neural networks in 1982 (his 1974 PhD thesis, reprinted in a 1994 book, did not yet describe the algorithm). In 1986,
These applications include learning methods such as "Shrinkage Fields for Effective Image Restoration", which trains on an image dataset, and
where each RNN tries to predict its own next input, which is the next unexpected input of the RNN below. This "neural history compressor" uses
into thinking ordinary people were celebrities, potentially allowing one person to impersonate another. In 2017 researchers added stickers to
Realistically, deep learning is only part of the larger challenge of building intelligent machines. Such techniques lack ways of representing
in which a hierarchy of layers is used to transform input data into a slightly more abstract and composite representation. For example, in an
Speaker Recognition benchmark. It was deployed in the Nuance Verifier, representing the first major industrial application of deep learning.
DNNs are prone to overfitting because of the added layers of abstraction, which allow them to model rare dependencies in the training data.
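A common practical guard, named here as a representative technique rather than one drawn from this article's sources, is early stopping on a held-out validation set. A sketch assuming a PyTorch-style model with state_dict(); the helper names are hypothetical:

```python
import copy

def train_with_early_stopping(model, train_one_epoch, val_loss, patience=5):
    # Track loss on a held-out validation set and stop once it has not
    # improved for `patience` epochs, then restore the best weights.
    best, best_state, waited = float("inf"), None, 0
    for _ in range(1000):
        train_one_epoch(model)
        loss = val_loss(model)
        if loss < best:
            best, best_state, waited = loss, copy.deepcopy(model.state_dict()), 0
        else:
            waited += 1
            if waited >= patience:
                break
    model.load_state_dict(best_state)
    return model
```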
7206:"Keynote talk: 'Achievements and Challenges of Deep Learning - From Speech Analysis and Recognition To Language and Multimodal Processing'" 3963: 2946:
tricked an image classification system. One defense is reverse image search, in which a possible fake image is submitted to a site such as
Neural style transfer – capturing the style of a given artwork and applying it in a visually pleasing manner to an arbitrary photograph or video
to predict the reactions of the environment to these patterns. This was called "artificial curiosity". In 2014, this principle was used in
et al., that classifies digits, was applied by several banks to recognize hand-written numbers on checks digitized in 32x32 pixel images.
11728: 11530:"System for the Recognizing of Pigmented Skin Lesions with Fusion and Analysis of Heterogeneous Data Based on a Multimodal Neural Network" 3890:
Alternatively, engineers may look for other types of neural networks with more straightforward and convergent training algorithms. CMAC (cerebellar model articulation controller
to classify non-linearly separable pattern classes. Subsequent developments in hardware and hyperparameter tunings have made end-to-end
Simplified example of training a neural network in object detection: The network is trained by multiple images that are known to depict
to large vocabulary speech recognition, by adopting large output layers of the DNN based on context-dependent HMM states constructed by decision trees.
In the 1980s, backpropagation did not work well for deep learning with long credit assignment paths. To overcome this problem, in 1991,
In reality, textures and outlines would not be represented by single nodes, but rather by associated weight patterns of multiple nodes.
Most Deep Learning systems rely on training and verification data that is generated and/or annotated by humans. It has been argued in
7310:
6871: 2643: 1924:, where each speaker reads 10 sentences. Its small size lets many configurations be tried. More importantly, the TIMIT task concerns 10697: 11692:
information retrieval, spoken language understanding, machine translation, contextual entity linking, writing style recognition,
9363: 5456:"Computerized detection of clustered microcalcifications in digital mammograms using a shift-invariant artificial neural network" 4814: 4116: 3183: 2529:
request/serve/click internet advertising cycle. This information can form the basis of machine learning to improve ad selection.
186: 11294: 10044: 8372: 6843: 14140: 8977: 6635:
5000: 110: 7551: 1019:
at multiple self-organizing time scales. This can substantially facilitate downstream deep learning. The RNN hierarchy can be
639:
layer-by-layer method. Deep learning helps to disentangle these abstractions and pick out which features improve performance.
14043: 13744:
13425: 12473: 11604: 10800: 8917: 8856: 8215:"Unidirectional Long Short-Term Memory Recurrent Neural Network with Recurrent Output Layer for Low-Latency Speech Synthesis" 8143: 7894: 7626: 7454: 7155: 6697: 6210: 5775: 4744: 4596: 4375: 4270: 4097: 4067: 3955: 3868: 3733: 3400: 2586: 2487:' color values to probabilities over possible image classes. In practice, the probability distribution of Y is obtained by a 1322:
created an FNN that learned to recognize higher-level concepts, such as cats, only from watching unlabeled images taken from
1208: 405: 331: 285: 240: 235: 11104: 9975:
8214: 8175: 8075: 7693: 6638: 6330: 3079: 14930: 14090: 13480: 11782:
10985: 8556: 7724:
3463: 3188: 2623:
in both forward and inverse problems in a data driven manner. One example is the reconstructing fluid flow governed by the
2443:, a deep-learning based system, achieved a level of accuracy significantly higher than all previous computational methods. 2156: 1241:
using supervised backpropagation. They could model high-dimensional probability distributions, such as the distribution of
485: 13759: 13572: 13001:
7412:; Chen, Yu-Hsin; Yang, Tien-Ju; Emer, Joel (2017). "Efficient Processing of Deep Neural Networks: A Tutorial and Survey". 7389: 3725:
3051: 3001:", false data is continually smuggled into a machine learning system's training set to prevent it from achieving mastery. 2998: 15031: 14582: 14319: 13780: 2839: 1035:
network. In 1993, a neural history compressor solved a "Very Deep Learning" task that required more than 1000 subsequent
790: 384: 356: 351: 245: 17: 9044:
8529: 7940:. Proceedings of the International Conference on Neural Information Processing Systems (NIPS 2014). pp. 2672–2680. 4205: 1789:
for optimal parameters may not be feasible due to the cost in time and computational resources. Various tricks, such as
12489:
9569: 9074: 8636:
8282: 6583:
6308: 3428: 2246: 1829:
were designed to speed up deep learning algorithms. Deep learning processors include neural processing units (NPUs) in
721:; if the width is smaller or equal to the input dimension, then a deep neural network is not a universal approximator. 344: 213: 203: 193: 13378: 10000: 5061:
1488:
for "conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing."
1273:
How deep learning is a subset of machine learning and how machine learning is a subset of artificial intelligence (AI)
14843: 14470: 14277: 14133: 14062: 13992: 13642: 11436: 11090: 10654: 10576:"Using transcriptomics to guide lead optimization in drug discovery projects: Lessons learned from the QSTAR project" 9735: 9546: 9279: 9060: 8880:
8437: 8091: 7966: 6669:(2006). "Connectionist temporal classification: Labelling unsegmented sequence data with recurrent neural networks". 6324: 6048: 5839: 5325: 5203: 5010: 3098: 3058: 2283: 2279: 1786: 974:
hardware. In 1991, a CNN was applied to medical image object segmentation and breast cancer detection in mammograms.
548: 433: 316: 262: 228: 95: 13849: 11628: 10633: 8074:
4316:
1781:
DNNs must consider many training parameters, such as the size (number of layers and number of units per layer), the
843:
with learning hidden units? Unfortunately, the learning algorithm was not a functional one, and fell into oblivion.
14798: 10448: 6662: 3750: 2654: 2468: 2364: 1204: 870:. In computer experiments conducted by Amari's student Saito, a five layer MLP with two modifiable layers learned 749: 398: 302: 148: 10395: 7033: 4464:
4408: 2830:
Deep learning has attracted both criticism and comment, in some cases from outside the field of computer science.
989: 10722: 7108:"New types of deep neural network learning for speech recognition and related applications: An overview (ICASSP)" 5737: 3263: 2910: 2500: 993: 671: 80: 10895: 8658: 7910:
6352: 5455: 5408: 5362:"Parallel distributed processing model with local space-invariant interconnections and its optical architecture" 5361: 2627:. Using physics informed neural networks does not require the often expensive mesh generation that conventional 2611:
The United States Department of Defense applied deep learning to train robots in new tasks through observation.
528:
programs, where they have produced results comparable to and in some cases surpassing human expert performance.
14985: 14925: 14523: 14117: 10946: 9795: 9572:(30 September 1991). "Several Improvements to a Recurrent Error Propagation Network Phone Recognition System". 7671:
6732: 3065: 3036: 3032: 2853: 2675:
globally, at a very detailed level, and in under a minute, with precision similar to state of the art systems.
2455:
and called Neural Joint Entropy Estimator (NJEE). Such an estimation provides insights on the effects of input
1439:
In 2015, Google's speech recognition improved by 49% by an LSTM-based model, which they made available through
1401: 1319: 1130:
have been explored for many years. These methods never outperformed non-uniform internal-handcrafting Gaussian
1085: 748:, respectively. More specifically, the probabilistic interpretation considers the activation nonlinearity as a 481: 14096: 13119:"Deep Neural Networks Reveal a Gradient in the Complexity of Neural Representations across the Ventral Stream" 9624: 8399: 5822:
3373:
2088:
Other types of deep models including tensor-based models and integrated deep generative/discriminative models.
531:
Early forms of neural networks were inspired by information processing and distributed communication nodes in
14518: 14207: 11473:"Liver Cancer Detection Using Hybridized Fully Convolutional Neural Network Based on Deep Learning Framework" 10327:"Deep Learning for Natural Language Processing: Theory and Practice (CIKM2014 Tutorial) - Microsoft Research" 9241: 5671:"Learning complex, extended sequences using the principle of history compression (based on TR FKI-148, 1991)" 2975:, potentially leading attackers and defenders into an arms race similar to the kind that already defines the 2922: 2819: 2815: 2620: 2464: 1349:
by a significant margin over shallow machine learning methods. Further incremental improvements included the
847: 477: 3135:(the embedding of annotation or computation tasks in the flow of a game), (2) "trapping and tracking" (e.g. 817:
which is essentially a non-learning RNN architecture consisting of neuron-like threshold elements. In 1972,
14960: 14357: 14314: 14267: 14262: 13449: 12524:
9148:
6810: 5314:
2939: 2628: 2436: 2072: 1708: 1685: 1451: 1450:
Deep learning is part of state-of-the-art systems in various disciplines, particularly computer vision and
1120: 1108: 897: 757: 556: 552: 13880: 13348:"A Google DeepMind Algorithm Uses Deep Learning and More to Master the Game of Go | MIT Technology Review" 12820:"Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons" 5310: 4896:
4289:
3047: 15011: 14307: 14233: 13408:
10851:
5196:
3601:
2624: 2369: 2223: 1913:
about 10 ms. LSTM with forget gates is competitive with traditional speech recognizers on certain tasks.
1887: 1617:, or passing information in the reverse direction and adjusting the network to reflect that information. 1238: 1091:
During 1985–1995, inspired by statistical mechanics, several architectures and methods were developed by
886: 875: 863: 682: 501: 267: 218: 115: 13177: 9527:
In 2009, Raina, Madhavan, and Ng reported a 100M deep belief network trained on 30 Nvidia GeForce GTX 280 GPUs, an early demonstration of GPU-based deep learning. They reported up to 70 times faster training.
Neural networks entered a lull, and simpler models that use task-specific handcrafted features such as Gabor filters and support vector machines (SVMs) became the preferred choices in the 1990s and 2000s.
The term deep learning was introduced to the machine learning community by Rina Dechter in 1986, and to artificial neural networks by Igor Aizenberg and colleagues in 2000, in the context of Boolean threshold neurons, although the history of its appearance is apparently more complicated.
Early deep speech systems were trained on "raw" spectrogram or linear filter-bank features, showing their superiority over Mel-Cepstral features that contain stages of fixed transformation from spectrograms. The raw features of speech, waveforms, later produced excellent larger-scale results.

Sepp Hochreiter's diploma thesis (1991) implemented the neural history compressor, and identified and analyzed the vanishing gradient problem.
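A toy numerical illustration (an assumption-laden sketch, not Hochreiter's analysis itself): backpropagating through a chain of sigmoid units with unit weights multiplies the gradient by the local derivative, which is at most 0.25 per layer, so the surviving gradient shrinks geometrically with depth.

from math import exp

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + exp(-z))

z = 0.5  # assumed constant pre-activation at every layer
for depth in (1, 10, 50, 100):
    g = 1.0
    for _ in range(depth):
        s = sigmoid(z)
        g *= s * (1.0 - s)  # local derivative of each sigmoid layer
    print(f"depth {depth:3d}: surviving gradient factor ~ {g:.3e}")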
Much of this human microwork is often not recognized as such. The philosopher Rainer Mühlhoff distinguishes five types of "machinic capture" of human microwork to generate training data: (1) gamification (the embedding of annotation or computation tasks in the flow of a game); (2) "trapping and tracking" (e.g., CAPTCHAs for image recognition or click-tracking on Google search results pages); (3) exploitation of social motivations (e.g., tagging faces on Facebook to obtain labeled facial images); (4) information mining (e.g., by leveraging quantified-self devices such as activity trackers); and (5) clickwork.
An ANN processes information in a way that mimics functions of the human brain, and can be trained like any other ML algorithm.
Around the same time, deep learning started impacting the field of art. Early examples included Google DeepDream (2015) and neural style transfer (2015), both of which were based on pretrained image classification neural networks, such as VGG-19.
In 2003, LSTM became competitive with traditional speech recognizers on certain tasks. In 2006, Alex Graves, Santiago Fernández, Faustino Gomez, and Schmidhuber combined it with connectionist temporal classification (CTC) in stacks of LSTMs. In 2009, it became the first RNN to win a pattern recognition contest, in connected handwriting recognition.
Importantly, a deep learning process can learn which features to optimally place at which level on its own. This does not eliminate the need for hand-tuning; for example, varying numbers of layers and layer sizes can provide different degrees of abstraction.
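As a sketch of this layering (randomly initialized and purely illustrative, with assumed sizes): each layer re-describes the previous layer's output as a new set of features, and training determines what each level comes to encode.

import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

layer_sizes = [784, 256, 64, 10]  # assumed: e.g. pixels -> ... -> class scores
weights = [rng.normal(0, 0.05, (m, n)) for n, m in zip(layer_sizes, layer_sizes[1:])]

h = rng.random(layer_sizes[0])  # a dummy input "image"
for level, W in enumerate(weights, start=1):
    h = relu(W @ h)  # each level is a learned re-representation of the last
    print(f"level {level}: {h.shape[0]} features")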
In 2017, Covariant.ai was launched, which focuses on integrating deep learning into factories.
In November 2023, researchers at Google DeepMind and Lawrence Berkeley National Laboratory announced that they had developed an AI system known as GNoME. This system has contributed to materials science by discovering over 2 million new materials within a relatively short timeframe.
The probabilistic interpretation led to the introduction of dropout as a regularizer in neural networks. It was introduced by researchers including Hopfield, Widrow and Narendra, and popularized in surveys such as the one by Bishop.
Lu et al. proved that if the width of a deep neural network with ReLU activation is strictly larger than the input dimension, then the network can approximate any Lebesgue integrable function; if the width is smaller than or equal to the input dimension, then a deep neural network is not a universal approximator.
In the 1990s, most speech recognition researchers moved away from neural nets to pursue generative modeling. An exception was at SRI International in the late 1990s.
ANNs have been trained to defeat ANN-based anti-malware software by repeatedly attacking a defense with malware that was continually altered by a genetic algorithm until it tricked the anti-malware while retaining its ability to damage the target. In 2016, another group demonstrated that certain sounds could make the Google Now voice command system open a particular web address. ANNs can, however, be further trained to detect attempts at deception, potentially leading attackers and defenders into an arms race similar to the kind that already defines the malware defense industry.
In 1991, Schmidhuber also published adversarial neural networks that contest with each other in the form of a zero-sum game, where one network's gain is the other network's loss. The first network is a generative model that models a probability distribution over output patterns. The second network learns by gradient descent to predict the reactions of the environment to these patterns.
The speaker recognition team led by Larry Heck reported significant success with deep neural networks in speech processing in the 1998 National Institute of Standards and Technology Speaker Recognition evaluation.
Mühlhoff argues that in most commercial end-user applications of deep learning, such as Facebook's face recognition system, human involvement in generating training data does not end once the network is deployed.
Deep reinforcement learning has been used to approximate the value of possible direct marketing actions, defined in terms of RFM variables. The estimated value function was shown to have a natural interpretation as customer lifetime value.
The success in image classification was then extended to the more challenging task of generating descriptions (captions) for images, often as a combination of CNNs and LSTMs.
These dynamics are somewhat analogous to the neural networks utilized in deep learning models. Like the neocortex, neural networks employ a hierarchy of layered filters in which each layer considers information from a prior layer (or the operating environment), and then passes its output (and possibly the original input) to other layers.
As with ANNs, many issues can arise with naively trained DNNs. Two common issues are overfitting and computation time.
Others point out that deep learning should be looked at as a step towards realizing strong AI, not as an all-encompassing solution.
(Does it converge? If so, how fast? What is it approximating?) Deep learning methods are often looked at as a black box, with most confirmations done empirically, rather than theoretically.
Neural networks have been used on a variety of tasks, including computer vision, speech recognition, machine translation, social network filtering, playing board and video games, and medical diagnosis.
The deep learning revolution started around CNN- and GPU-based computer vision.
Deep neural networks can be used to estimate the entropy of a stochastic process; the approach is called the Neural Joint Entropy Estimator (NJEE). Practically, the DNN is trained as a classifier that maps an input vector or matrix X to an output probability distribution over the possible classes of random variable Y, given input X. For example, in image classification tasks, the NJEE maps a vector of pixels' color values to probabilities over possible image classes. NJEE uses continuously differentiable activation functions, such that the conditions for the universal approximation theorem hold. It is shown that this method provides a strongly consistent estimator and outperforms other methods in case of large alphabet sizes.
Certain classes of functions are exponentially easier to approximate with DNNs than with shallow networks.
In 2013, DeepMind developed a system capable of learning how to play Atari video games using only pixels as data input. In 2015 they demonstrated their AlphaGo system, which learned the game of Go well enough to beat a professional Go player.
Visual art processing of Jimmy Wales in France, with the style of Munch's "The Scream" applied using neural style transfer.
In 1989, Yann LeCun et al. applied backpropagation to recognizing handwritten ZIP codes on mail. Training required 3 days. In 1990, Wei Zhang implemented a CNN on optical computing hardware.
The clock uses information from 1000 CpG sites and predicts people with certain conditions older than healthy controls: IBD, frontotemporal dementia, ovarian cancer, obesity.
Henry J. Kelley had a continuous precursor of backpropagation in 1960 in the context of control theory.
A multi-view deep learning approach to recommendation outperforms the content-based approach and enhances recommendations in multiple tasks.
Research has explored the use of deep learning to predict the biomolecular targets, off-targets, and toxic effects of environmental chemicals in nutrients, household products and drugs.
In 1986, Rumelhart et al. popularised backpropagation but did not cite the original work.
Representing images on multiple layers of abstraction in deep learning.
Hochreiter proposed recurrent residual connections to solve the vanishing gradient problem. This led to the long short-term memory (LSTM), published in 1997.
The classic universal approximation theorem concerns the capacity of feedforward neural networks with a single hidden layer of finite size to approximate continuous functions.
Google Translate uses a neural network to translate between more than 100 languages.
Since the 2010s, advances in both machine learning algorithms and computer hardware have led to more efficient methods for training deep neural networks that contain many layers of non-linear hidden units and a very large output layer.
All major commercial speech recognition systems (e.g., Microsoft Cortana, Xbox, Skype Translator, Amazon Alexa, Google Now, Apple Siri, Baidu and iFlyTek voice search, and a range of Nuance speech products) are based on deep learning.
Neural networks such as DeepDream are capable of generating striking imagery based on random visual input fields.
In 1991, Schmidhuber proposed a hierarchy of RNNs pre-trained one level at a time by self-supervised learning (the neural history compressor).
Deep neural networks are generally interpreted in terms of the universal approximation theorem or probabilistic inference.
Other key techniques in this field are negative sampling and word embedding.
Subsequently, specialized hardware and algorithmic optimizations were developed specifically for deep learning.
Deep neural networks have shown unparalleled performance in predicting protein structure, according to the sequence of the amino acids that make it up. In 2020, AlphaFold achieved a level of accuracy significantly higher than all previous computational methods.
Most modern deep learning models are based on multi-layered neural networks, such as convolutional neural networks and transformers.
and is a basic goal of both human language acquisition and
2726:
Deep learning is closely related to a class of theories of brain development (specifically, neocortical development) proposed by cognitive neuroscientists in the early 1990s.
An ANN is based on a collection of connected units called artificial neurons (analogous to biological neurons in a biological brain).
Physics informed neural networks have been used to solve partial differential equations in both forward and inverse problems in a data-driven manner.
Deep learning is being successfully applied to financial fraud detection, tax evasion detection, and anti-money laundering.
Early learning RNNs based on Hebbian learning were published by Kaoru Nakano in 1971. Already in 1948, Alan Turing produced work on "Intelligent Machinery" that was not published in his lifetime, containing "ideas related to artificial evolution and learning RNNs".
Forensic applications include post-mortem matching and determination of subject sex.
Google Translate uses English as an intermediate between most language pairs.
A common evaluation set for image classification is the MNIST database data set. MNIST is composed of handwritten digits and includes 60,000 training examples and 10,000 test examples. As with TIMIT, its small size lets users test multiple configurations.
Richard Green explains how deep learning is used with a
Some common deep learning network architectures include fully connected networks, deep belief networks, recurrent neural networks, convolutional neural networks, generative adversarial networks, transformers, and neural radiance fields.
The deep backward stochastic differential equation method is a numerical method that combines deep learning with the backward stochastic differential equation (BSDE) framework.
AtomNet is a deep learning system for structure-based rational drug design. AtomNet was used to predict novel candidate biomolecules for disease targets such as the Ebola virus and multiple sclerosis.
Major areas of progress included: scale-up/out and accelerated DNN training and decoding; sequence discriminative training; feature processing by deep models with solid understanding of the underlying mechanisms; adaptation of DNNs and related deep models; multi-task and transfer learning by DNNs and related deep models; CNNs and how to design them to best exploit domain knowledge of speech; RNNs and their rich LSTM variants; and other types of deep models, including tensor-based models and integrated deep generative/discriminative models.
Deep learning architectures can be constructed with a greedy layer-by-layer method.
The "P" in ChatGPT refers to such pre-training.
2655:Physics-informed neural networks 2653:In addition, the integration of 2365:Customer relationship management 2359:Customer relationship management 2053:Sequence discriminative training 1530: 1506: 896:Deep learning architectures for 750:cumulative distribution function 444:and is centered around stacking 14099:from the original on 2016-04-16 13899:from the original on 2017-10-10 13648:from the original on 2015-05-13 13553:from the original on 2009-11-27 12952:Current Opinion in Neurobiology 11739:from the original on 2018-11-16 11674:from the original on 2018-01-02 11453:from the original on 2021-05-09 11107:from the original on 9 May 2021 10961:from the original on 2017-05-16 10906:from the original on 2020-04-30 10700:from the original on 2015-09-08 10636:from the original on 2020-07-16 10050:from the original on 2014-11-27 9798:from the original on 2014-01-13 9716:from the original on 2020-09-24 9627:from the original on 2020-09-22 9019:from the original on 2021-05-09 8959:from the original on 2017-08-12 8793:LeCun, Y.; et al. (1998). 8774:from the original on 9 May 2021 8691:from the original on 2017-05-16 8591:from the original on 2021-05-09 8532:from the original on 2020-01-26 8457: 8440:from the original on 2021-01-07 8410:from the original on 2017-06-29 8381:from the original on 2020-11-02 8231:from the original on 2021-05-09 8213:Zen, Heiga; Sak, Hasim (2015). 8136:10.1109/ICCCI50826.2021.9402569 8098:from the original on 2016-03-09 7934:Generative Adversarial Networks 7591:from the original on 2017-08-09 7557:from the original on 2014-09-29 7392:from the original on 2020-05-18 7292:from the original on 2017-10-12 7246:from the original on 2017-10-12 7216:from the original on 2017-09-26 7120:from the original on 2017-09-26 7060:IEEE Signal Processing Magazine 6849:from the original on 2015-12-23 6724: 6707: 6644:from the original on 2021-05-09 6604:from the original on 9 May 2021 6586: 6577: 6482:from the original on 2021-05-09 6425:IEEE Signal Processing Magazine 6405:from the original on 2021-04-27 6344: 6333:from the original on 2021-05-09 6266: 6219: 6201:Sejnowski, Terrence J. (2018). 6194: 6019: 5966: 5784: 5576: 5553: 5447: 5400: 5212: 5129:"Who Invented Backpropagation?" 4889: 4820:from the original on 2017-08-29 4760:Ivakhnenko, A.G. (March 1970). 4753: 4643: 4387:from the original on 2017-01-11 4361:Bishop, Christopher M. (2006). 4234:from the original on 2016-03-14 4129: 3893: 3884: 3765:from the original on 2019-10-20 3715: 3445:from the original on 2017-01-10 3024:needs additional citations for 2932: 2911:artificial general intelligence 2501:universal approximation theorem 2447:Deep Neural Network Estimations 2260:Recent developments generalize 2069:by DNNs and related deep models 1896: 1893:in data-heavy AI applications. 1157:, SRI researched in speech and 1086:generative adversarial networks 672:universal approximation theorem 589:model, the raw input may be an 482:generative adversarial networks 81:Artificial general intelligence 30:For the TV series episode, see 14936:Recurrent neural network (RNN) 14926:Differentiable neural computer 13510:Knight, Will (14 March 2017). 13377:Metz, Cade (6 November 2017). 13145:10.1523/jneurosci.5023-14.2015 11727:Czech, Tomasz (28 June 2018). 9533:. Linguistic Data Consortium. 9047:. SC '17, ACM. pp. 1–12. 8941:Dahl, G.; et al. (2013). 8224:. ICASSP. pp. 4470–4474. 5286:Waibel, Alex (December 1987). 5168:. Springer. pp. 762–770. 
3486: 3456: 3255: 2621:partial differential equations 2615:Partial differential equations 1402:Generative adversarial network 945: 784: 13: 1: 14981:Variational autoencoder (VAE) 14941:Long short-term memory (LSTM) 14208:Computational learning theory 13176:Metz, C. (12 December 2013). 12912:10.1016/s0896-6273(00)81098-3 12526:Behavioral and Brain Sciences 11757:Nuñez, Michael (2023-11-29). 10531:Nature Reviews Drug Discovery 9655:10.21437/Interspeech.2014-433 9098:The Journal of Supercomputing 8674:10.21437/Interspeech.2010-343 7278:10.21437/Interspeech.2011-169 6571:10.1016/s0167-6393(99)00077-1 6541:10.1016/S0167-6393(99)00080-1 5991:10.1016/S0364-0213(85)80012-4 3919:10.1016/j.ifacol.2020.12.1888 3249: 3229:Scale space and deep learning 2820:U.S. Army Research Laboratory 2376:actions, defined in terms of 2300:Drug discovery and toxicology 2274:(GT) uses a large end-to-end 2124:voice search, and a range of 1695: 1686:Convolutional neural networks 1634:playing board and video games 1043:refers to such pre-training. 898:convolutional neural networks 848:Group method of data handling 553:convolutional neural networks 478:convolutional neural networks 14961:Convolutional neural network 14053:Prince, Simon J. D. (2023). 12845:10.1371/journal.pcbi.1002211 12503:10.1016/0893-6080(96)00033-0 11706:10.1016/j.knosys.2019.105048 11658:Schmidt, Uwe; Roth, Stefan. 10593:10.1016/j.drudis.2014.12.014 9840:10.1016/j.neunet.2012.02.023 7619:10.1007/978-3-642-40763-5_51 7366:10.1016/j.patcog.2004.01.013 6898:Trends in Cognitive Sciences 6880:Trends in Cognitive Sciences 6737:Trends in Cognitive Sciences 6514:10.1016/0893-6080(94)90027-2 6244:10.1016/0022-2836(88)90564-5 6232:Journal of Molecular Biology 6203:The deep learning revolution 5944:10.1016/j.neunet.2020.04.008 5711:Schmidhuber, Jürgen (1993). 5669:Schmidhuber, Jürgen (1992). 4778:10.1016/0005-1098(70)90092-0 4589:10.1007/978-1-4615-7566-5_15 4086:Hassoun, Mohamad H. (1995). 4032:10.1016/0893-6080(91)90009-t 3908:. 21st IFAC World Congress. 3693:10.1016/j.neunet.2014.09.003 2437:predicting protein structure 1902:Automatic speech recognition 1452:automatic speech recognition 1245:, but convergence was slow. 1121:protein structure prediction 1109:restricted Boltzmann machine 978:-5 (1998), a 7-level CNN by 962:et al. created a CNN called 719:Lebesgue integrable function 7: 14956:Multilayer perceptron (MLP) 14079:; Courville, Aaron (2016). 14055:Understanding deep learning 13418:10.1109/devlrn.2008.4640845 11849:Peplow, Mark (2023-11-29). 11597:10.1109/CSCITA.2017.8066548 11498:10.1109/ACCESS.2020.3006362 11386:10.1016/j.media.2017.07.005 9582:10.13140/RG.2.2.15418.90567 9362:Woodie, Alex (2021-11-01). 9013:Tech. Rep. UTML TR 2010-003 8910:10.1109/icassp.2013.6639349 8849:10.1109/icassp.2013.6639347 7692:Szegedy, Christian (2015). 7204:Li, Deng (September 2014). 6949:10.1162/neco.2006.18.7.1527 6824:10.1162/neco.2006.18.7.1527 5585:"Finding Structure in Time" 5560:Jordan, Michael I. (1986). 4794:Ivakhnenko, Alexey (1971). 
4705:Principles of Neurodynamics 3268:KI - Künstliche Intelligenz 3177: 2814:As of 2008, researchers at 2606: 2370:Deep reinforcement learning 2224:Natural language processing 2218:Natural language processing 1803: 1589:, (analogous to biological 1031:network into a lower level 876:stochastic gradient descent 864:stochastic gradient descent 683:feedforward neural networks 542: 502:natural language processing 116:Natural language processing 10: 15076: 15032:Artificial neural networks 14946:Gated recurrent unit (GRU) 14172:Differentiable programming 14116:: CS1 maint: postscript ( 13879:Gibney, Elizabeth (2017). 13598:Alex Hern (18 June 2015). 13207:Gibney, Elizabeth (2016). 12964:10.1016/j.conb.2004.07.007 12824:PLOS Computational Biology 12462:Elman, Jeffrey L. (1998). 12260:10.1109/TUFFC.2020.3010186 11863:10.1038/d41586-023-03745-5 11809:10.1038/s41586-023-06735-9 11324:10.1109/TNNLS.2022.3204919 10294:10.1186/s12967-023-04011-y 10185:10.1109/taslp.2014.2383614 9613:10.1109/taslp.2014.2339736 9498:10.1038/s41586-020-03070-1 8723:10.1162/neco.1997.9.8.1735 8275:Neural Networks for Babies 6910:10.1016/j.tics.2007.09.004 6749:10.1016/j.tics.2007.09.004 5632:"Neural Sequence Chunkers" 5601:10.1207/s15516709cog1402_1 4340:10.1016/j.acha.2015.12.005 4163:10.1038/s41467-017-00181-8 3199:Differentiable programming 2837: 2682: 2406: 2391: 2362: 2304:For more information, see 2303: 2235:. Word embedding, such as 2221: 2135: 2085:and its rich LSTM variants 1997:Monophone DBN-DNN on fbank 1905: 1571:biological neural networks 1556:Artificial neural networks 1495: 1051:vanishing gradient problem 795:feedforward neural network 779: 626:feedforward neural network 524:, material inspection and 169:Hybrid intelligent systems 91:Recursive self-improvement 32:Deep Learning (South Park) 29: 27:Branch of machine learning 14999: 14913: 14857: 14786: 14719: 14591: 14491: 14484: 14438: 14402: 14365:Artificial neural network 14345: 14221: 14188:Automatic differentiation 14161: 13893:10.1038/nature.2017.22784 12781:10.1038/s41562-017-0186-2 12671:10.1162/neco.1996.8.5.895 12548:10.1017/s0140525x97001581 12433:10.1162/08997660260293319 12019:10.1016/j.cma.2019.112789 11954:10.1016/j.jcp.2018.10.045 11266:Shead, Sam (2020-11-30). 11069:. ACM. pp. 533–540. 10865:10.1038/s41587-019-0224-x 9698:10.1186/s13636-015-0068-3 9438:10.1038/s41586-020-2861-0 9128:10.1007/s11227-017-1994-x 7234:Yu, D.; Deng, L. (2010). 7144:Yu, D.; Deng, L. (2014). 7005:10.4249/scholarpedia.5947 6287:10.1142/s0218001493000455 6102:21.11116/0000-0002-D6D3-E 6093:10.1162/neco.1995.7.5.889 5893:10.1109/TAMD.2010.2056368 5770:. John Wiley & Sons. 5690:10.1162/neco.1992.4.2.234 5079:BIT Numerical Mathematics 4919:Trans. IECE (in Japanese) 4811:10.1109/TSMC.1971.4308320 4622:10.1109/TSMC.1972.4309133 4448:10.1103/RevModPhys.39.883 4428:Reviews of Modern Physics 4204:Deng, L.; Yu, D. (2014). 4092:. MIT Press. p. 48. 4056:Haykin, Simon S. (1999). 
3861:10.1007/978-1-4757-3115-6 3813:10.4249/scholarpedia.5947 3393:10.1109/cvpr.2012.6248110 3280:10.1007/s13218-012-0198-z 3244:Topological deep learning 2962:facial recognition system 2899: 2833: 2732:cognitive neuroscientists 2689:An epigenetic clock is a 2565:Financial fraud detection 2157:remotely operated vehicle 1957:Bayesian Triphone GMM-HMM 1766:{\displaystyle \ell _{1}} 1735:{\displaystyle \ell _{2}} 1676:Recurrent neural networks 1498:Artificial neural network 1408:et al., 2014) (based on 1222:In 2006, publications by 986:Recurrent neural networks 952:time delay neural network 920:Gottfried Wilhelm Leibniz 866:was published in 1967 by 827:recurrent neural networks 803:recurrent neural networks 744:, related to fitting and 630:recurrent neural networks 474:recurrent neural networks 14193:Neuromorphic engineering 14156:Differentiable computing 14106:, introductory textbook. 13962:10.1177/1461444819885334 13575:. Google Research Blog. 12719:10.3389/fncom.2016.00073 12203:10.1109/TCI.2021.3075349 11132:JMIR mHealth and uHealth 7080:10.1109/msp.2012.2205597 6882:, 11, pp. 428–434, 2007. 6026:Smolensky, Paul (1986). 5194:Werbos, Paul J. (1994). 4520:Hopfield, J. J. (1982). 4303:10.1109/TSSC.1969.300225 2960:spectacles could fool a 2477:probability distribution 2430:electronic health record 2255:named-entity recognition 1949:Randomly Initialized RNN 1862:field-effect transistors 1827:deep learning processors 1662:multivariate polynomials 1265:Deep learning revolution 1187: 1078:probability distribution 1017:internal representations 1009:self-supervised learning 889:(rectified linear unit) 872:internal representations 858:The first deep learning 466:fully connected networks 293:Artificial consciousness 14966:Residual neural network 14382:Artificial Intelligence 13949:New Media & Society 13123:Journal of Neuroscience 12617:10.1073/pnas.88.10.4433 12335:10.1126/science.adi2336 12143:10.1073/pnas.1718942115 12075:10.1126/science.aaw4741 11694:Knowledge-Based Systems 11547:10.3390/cancers14071819 11075:10.1145/2649387.2649442 10659:www.datascienceassn.org 10485:"MT on and for the Web" 10362:The Keyword Google Blog 10001:"Deep Learning for NLP" 9348:10.1145/3140659.3080246 9053:10.1145/3126908.3126912 8799:Proceedings of the IEEE 7447:10.1145/1553374.1553486 6445:10.1109/msp.2009.932166 6172:10.1126/science.7761831 5993:(inactive 2024-08-07). 5510:Proceedings of the IEEE 5350:, 1, pp. 541–551, 1989. 4921:. J62-A (10): 658–665. 4860:10.1214/aoms/1177729586 4650:Rosenblatt, F. (1958). 4409:"bibliotheca Augustana" 3525:Bengio, Yoshua (2009). 3115:that not only low-paid 2927:artificial intelligence 2704:frontotemporal dementia 2625:Navier-Stokes equations 2382:customer lifetime value 2291:Forensic Identification 1839:tensor processing units 1785:, and initial weights. 1636:and medical diagnosis. 1381:residual neural network 1370:generating descriptions 1217:handwriting recognition 1198:support vector machines 1023:into a single RNN, by 676:probabilistic inference 442:biological neuroscience 438:representation learning 164:Evolutionary algorithms 54:Artificial intelligence 13917:Tubaro, Paola (2020). 13634:Goertzel, Ben (2015). 13454:governmentciomedia.com 13070:10.1098/rstb.2017.0043 13058:Phil. Trans. R. Soc. B 12769:Nature Human Behaviour 11356:Medical Image Analysis 9576:. CUED/F-INFENG/TR82. 9007:Hinton, G. E. (2010). 8884:. pp. 8624–8628. 8843:. pp. 8614–8618. 6980:"Deep belief networks" 6319:. Icassp'92: 617–620. 6039:. MIT Press. pp.  5801:Long Short Term Memory 4805:. SMC-1 (4): 364–378. 
4718:Joseph, R. D. (1960). 4616:. SMC-2 (3): 380–388. 4547:10.1073/pnas.79.8.2554 3788:"Deep belief networks" 3377:. pp. 3642–3649. 3121:Amazon Mechanical Turk 3005:Data collection ethics 2886: 2511:Medical image analysis 2388:Recommendation systems 2276:long short-term memory 2196: 2163: 1813:commercial cloud AI . 1767: 1736: 1704:and computation time. 1580:rule-based programming 1546:result for sea urchin. 1484:were awarded the 2018 1301:Luca Maria Gambardella 1274: 1215:contest, in connected 1103:, etc., including the 1059:long short-term memory 841:multilayer perceptrons 622:credit assignment path 561:propositional formulas 518:medical image analysis 490:neural radiance fields 65: 42: 14921:Neural Turing machine 14509:Human image synthesis 13787:. 11 September 2017. 13516:MIT Technology Review 13352:MIT Technology Review 11429:10.1109/ICCVW.2017.18 11247:MIT Technology Review 9887:MIT Technology Review 9676:Tóth, Laszló (2015). 9240:Ray, Tiernan (2019). 8464:MIT Technology Review 5639:TR FKI-148, TU Munich 4142:Nature Communications 3786:Hinton, G.E. (2009). 3625:10.1109/tpami.2013.50 2919:commonsense reasoning 2862: 2826:Criticism and comment 2790:DeepMind Technologies 2352:graph neural networks 2208:Neural Style Transfer 2190: 2183:Visual art processing 2154: 1944:error rate (PER) (%) 1843:Google Cloud Platform 1768: 1737: 1392:neural style transfer 1295:In 2011, a CNN named 1272: 860:multilayer perceptron 825:in 1982. Other early 799:multilayer perceptron 704:rectified linear unit 567:such as the nodes in 64: 40: 15012:Computer programming 14991:Graph neural network 14566:Text-to-video models 14544:Text-to-image models 14392:Large language model 14377:Scientific computing 14183:Statistical manifold 14178:Information geometry 13412:. pp. 292–297. 11635:. 13 November 2018. 11591:. pp. 174–177. 11203:10.1093/jamia/ocw112 11145:10.2196/mhealth.6562 10853:Nature Biotechnology 10580:Drug Discovery Today 10400:Google Research Blog 8837:Ramabhadran, Bhuvana 7887:10.1109/CVPR.2016.90 7494:10.1162/neco_a_00052 7272:. pp. 437–440. 6637:. pp. 175–184. 6559:Speech Communication 6529:Speech Communication 5651:: CS1 maint: year ( 5433:10.1364/AO.30.004211 5386:10.1364/AO.29.004790 5125:Schmidhuber, Juergen 4841:; Monro, S. (1951). 4707:. Spartan, New York. 4656:Psychological Review 4583:. pp. 172–186. 3214:Liquid state machine 3155:(e.g. by leveraging 3141:search results pages 3033:improve this article 2866:causal relationships 2764:deep belief networks 2661:Image reconstruction 2648:deep neural networks 2505:consistent estimator 2497:activation functions 2481:image classification 2337:rational drug design 2322:biomolecular targets 2021:Ensemble DNN/CNN/RNN 1882:in conjunction with 1872:hardware accelerator 1750: 1742:-regularization) or 1719: 1644:Deep neural networks 1597:). Each connection ( 1464:image classification 1347:ImageNet competition 1345:won the large-scale 1235:deep belief networks 1228:Ruslan Salakhutdinov 1117:wake-sleep algorithm 998:cognitive psychology 711:deep neural networks 687:continuous functions 645:deep belief networks 569:deep belief networks 470:deep belief networks 106:General game playing 14358:In-context learning 14198:Pattern recognition 13856:. 10 October 2017. 13303:10.1038/nature16961 13295:2016Natur.529..484S 13225:2016Natur.529..445G 13129:(27): 10005–10014. 13003:Nature Neuroscience 12836:2011PLSCB...7E2211B 12608:1991PNAS...88.4433M 12326:2023Sci...382.1416L 12310:(6677): 1416–1421. 12134:2018PNAS..115.8505H 12067:2020Sci...367.1026R 12061:(6481): 1026–1030. 
12010:2020CMAME.360k2789M 11945:2019JCoPh.378..686R 11800:2023Natur.624...80M 11489:2020IEEEA...8l9889D 11378:2017arXiv170205747L 10725:on 28 February 2015 9961:10.3390/arts6040018 9923:10.3390/arts6020005 9430:2020Natur.587...72M 9306:consumer.huawei.com 9120:2017arXiv170207908V 8582:2014arXiv1409.3215S 8335:10.1038/nature16961 8327:2016Natur.529..484S 8054:. PMLR: 2256–2265. 7973:. December 14, 2018 7358:2004PatRe..37.1311O 7346:Pattern Recognition 7178:. 3 December 2015. 7072:2012ISPM...29...82H 6996:2009SchpJ...4.5947H 6667:Schmidhuber, Jürgen 6474:Bengio, Y. (1991). 6437:2009ISPM...26...75B 6164:1995Sci...268.1158H 6158:(5214): 1158–1161. 6140:Hinton, Geoffrey E. 6069:Hinton, Geoffrey E. 5918:Schmidhuber, Jürgen 5877:Schmidhuber, Jürgen 5859:Schmidhuber, Jürgen 5832:10.1049/cp:19991218 5628:Schmidhuber, Jürgen 5472:1994MedPh..21..517Z 5454:Zhang, Wei (1994). 5425:1991ApOpt..30.4211Z 5407:Zhang, Wei (1991). 5378:1990ApOpt..29.4790Z 5360:Zhang, Wei (1990). 5324:Zhang, Wei (1988). 5237:1986Natur.323..533R 4538:1982PNAS...79.2554H 4497:Schmidhuber, Jürgen 4440:1967RvMP...39..883B 4154:2017NatCo...8..138O 4002:on 10 October 2015. 3977:1989MCSS....2..303C 3804:2009SchpJ...4.5947H 3341:10.1038/nature14539 3333:2015Natur.521..436L 3224:Reservoir computing 2882:deductive reasoning 2774:Commercial activity 2737:nerve growth factor 2631:methods relies on. 2467:that maps an input 1823:electronic circuits 1626:machine translation 1441:Google Voice Search 1213:pattern recognition 1159:speaker recognition 1136:Hidden Markov model 891:activation function 610:feature engineering 535:, particularly the 506:machine translation 258:Machine translation 174:Systems integration 111:Knowledge reasoning 48:Part of a series on 18:Deep neural network 14951:Echo state network 14839:Jürgen Schmidhuber 14534:Facial recognition 14529:Speech recognition 14439:Software libraries 13732:10.1561/0600000018 13383:The New York Times 13358:on 1 February 2016 13064:(1740): 20170043. 12659:Neural Computation 12580:, pp. B5–B6, 1995. 12421:Neural Computation 11423:. pp. 82–89. 11038:Microsoft Research 10805:KQED Future of You 10774:The Globe and Mail 10331:Microsoft Research 10215:Microsoft Research 10142:Microsoft Research 10111:Microsoft Research 9539:10.35111/17gk-bn40 9165:2018-11-18 at the 9157:Ting Qin, et al. 
" 8711:Neural Computation 7472:Neural Computation 7316:Microsoft Research 7176:Microsoft Research 7036:2016-04-23 at the 6937:Neural Computation 6874:2018-05-22 at the 6811:Neural Computation 6700:2018-11-18 at the 6380:10338.dmlcz/135496 6081:Neural Computation 5798:(21 August 1995), 5796:Jürgen Schmidhuber 5744:2015-03-06 at the 5678:Neural Computation 5348:Neural Computation 5272:2022-10-13 at the 5091:10.1007/bf01931367 4970:10.1007/bf00344251 4927:10.1007/bf00344251 4413:www.hs-augsburg.de 4225:10.1561/2000000039 4122:2019-02-13 at the 3985:10.1007/bf02551274 3836:2016-04-19 at the 3556:10.1561/2200000006 3204:Echo state network 3194:Compressed sensing 3153:information mining 2940:adversarial attack 2915:Learning a grammar 2878:Bayesian inference 2870:logical inferences 2670:Weather prediction 2596:crystal structures 2526:mobile advertising 2520:Mobile advertising 2459:on an independent 2453:stochastic process 2394:Recommender system 2345:multiple sclerosis 2266:sentence embedding 2251:sentiment analysis 2197: 2164: 2161:mussel aquaculture 2029:Bidirectional LSTM 1908:Speech recognition 1763: 1732: 1622:speech recognition 1587:artificial neurons 1410:Jürgen Schmidhuber 1305:Jürgen Schmidhuber 1275: 1143:speech recognition 1128:speech recognition 1066:Jürgen Schmidhuber 1005:Jürgen Schmidhuber 940:David E. Rumelhart 906:Kunihiko Fukushima 883:Kunihiko Fukushima 700:Kunihiko Fukushima 593:(represented as a 573:Boltzmann machines 533:biological systems 498:speech recognition 446:artificial neurons 66: 43: 15047: 15046: 14809:Stephen Grossberg 14782: 14781: 14057:. The MIT Press. 14045:978-3-031-45467-7 13955:(10): 1868–1884. 13427:978-1-4244-2661-4 13289:(7587): 484–489. 13219:(7587): 445–446. 12602:(10): 4433–4437. 12475:978-0-262-55030-7 12427:(10): 2497–2529. 12396:Aging and Disease 12244:(12): 2584–2594. 12118:(34): 8505–8510. 11606:978-1-5090-4381-1 11483:: 129889–129898. 10894:Gregory, Barber. 9647:Proc. Interspeech 9607:(10): 1533–1545. 9217:. December 2019. 8919:978-1-4799-0356-6 8858:978-1-4799-0356-6 8805:(11): 2278–2324. 8510:10.1109/72.963769 8321:(7587): 484–489. 8181:on 24 April 2018. 8145:978-1-7281-5875-4 7896:978-1-4673-8851-1 7628:978-3-642-38708-1 7478:(12): 3207–3220. 7456:978-1-60558-516-1 7157:978-1-4471-5779-3 6212:978-0-262-03803-4 6077:Zemel, Richard S. 5979:Cognitive Science 5777:978-0-7803-5369-5 5736:S. Hochreiter., " 5589:Cognitive Science 5516:(11): 2278–2324. 5231:(6088): 533–536. 5075:Linnainmaa, Seppo 5057:Linnainmaa, Seppo 4879:IEEE Transactions 4746:978-0-444-00020-0 4701:Rosenblatt, Frank 4598:978-1-4615-7568-9 4466:IEEE Transactions 4377:978-0-387-31073-2 4272:978-0-262-01802-9 4099:978-0-262-08239-6 4069:978-0-13-273350-2 4062:. Prentice Hall. 3906:IFAC-PapersOnLine 3870:978-0-7923-7824-2 3735:978-1-5225-8218-2 3402:978-1-4673-1228-8 3327:(7553): 436–444. 
3239:Stochastic parrot 3161:activity trackers 3109: 3108: 3101: 3083: 2985:genetic algorithm 2760:generative models 2741:self-organization 2728:brain development 2716:Insilico Medicine 2600:Materials Project 2591:materials science 2577:Materials science 2555:film colorization 2533:Image restoration 2152: 2132:Image recognition 2067:transfer learning 2044: 2043: 2005:Convolutional DNN 1981:Monophone DBN-DNN 1810:computer hardware 1690:acoustic modeling 1680:language modeling 1329:In October 2012, 1147:SRI International 1113:Helmholtz machine 1105:Boltzmann machine 1013:predictive coding 972:optical computing 852:Alexey Ivakhnenko 587:image recognition 565:generative models 432:methods based on 423: 422: 159:Bayesian networks 86:Intelligent agent 16:(Redirected from 15067: 15037:Machine learning 15027: 15026: 15007: 14762:Action selection 14752:Self-driving car 14559:Stable Diffusion 14524:Speech synthesis 14489: 14488: 14353:Machine learning 14229:Gradient descent 14150: 14143: 14136: 14127: 14126: 14121: 14115: 14107: 14105: 14104: 14092:978-0-26203561-3 14068: 14049: 14021: 14020: 14018: 14016: 13989: 13983: 13982: 13964: 13940: 13927: 13926: 13914: 13908: 13907: 13905: 13904: 13876: 13870: 13869: 13867: 13865: 13846: 13831: 13830: 13828: 13826: 13817:. 18 June 2018. 13807: 13801: 13800: 13798: 13796: 13777: 13771: 13770: 13768: 13767: 13758:. Archived from 13751: 13745: 13742: 13736: 13735: 13725: 13705: 13699: 13698: 13696: 13684: 13678: 13677: 13675: 13663: 13657: 13656: 13654: 13653: 13647: 13640: 13631: 13620: 13619: 13617: 13615: 13595: 13589: 13588: 13586: 13584: 13568: 13562: 13561: 13559: 13558: 13538: 13532: 13531: 13529: 13527: 13507: 13501: 13500: 13498: 13496: 13476: 13470: 13469: 13467: 13465: 13446: 13440: 13439: 13405: 13399: 13398: 13396: 13394: 13374: 13368: 13367: 13365: 13363: 13354:. Archived from 13344: 13338: 13337: 13336: 13330: 13261: 13255: 13254: 13236: 13204: 13198: 13197: 13195: 13193: 13173: 13167: 13166: 13156: 13138: 13114: 13108: 13107: 13089: 13049: 13043: 13042: 12998: 12992: 12991: 12947: 12941: 12940: 12914: 12890: 12884: 12883: 12865: 12847: 12830:(11): e1002211. 12815: 12809: 12808: 12764: 12758: 12757: 12739: 12721: 12697: 12691: 12690: 12654: 12648: 12647: 12637: 12619: 12587: 12581: 12574: 12568: 12567: 12541: 12521: 12515: 12514: 12497:(7): 1119–1129. 12486: 12480: 12479: 12459: 12453: 12452: 12416: 12410: 12409: 12407: 12387: 12381: 12380: 12378: 12377: 12362: 12356: 12355: 12337: 12319: 12294: 12288: 12287: 12253: 12229: 12223: 12222: 12196: 12172: 12166: 12165: 12155: 12145: 12127: 12103: 12097: 12096: 12086: 12046: 12040: 12039: 12021: 11989: 11983: 11982: 11956: 11924: 11918: 11917: 11915: 11913: 11894: 11883: 11882: 11846: 11840: 11839: 11829: 11811: 11779: 11773: 11772: 11770: 11769: 11754: 11748: 11747: 11745: 11744: 11724: 11718: 11717: 11689: 11683: 11682: 11680: 11679: 11673: 11666: 11655: 11649: 11648: 11646: 11644: 11625: 11619: 11618: 11584: 11578: 11577: 11567: 11549: 11525: 11519: 11518: 11500: 11468: 11462: 11461: 11459: 11458: 11412: 11406: 11405: 11371: 11350: 11344: 11343: 11317: 11308:(4): 5488–5500. 11299: 11290: 11281: 11280: 11278: 11277: 11263: 11257: 11256: 11254: 11253: 11239: 11233: 11232: 11222: 11182: 11176: 11175: 11157: 11147: 11123: 11117: 11116: 11114: 11112: 11060: 11054: 11053: 11051: 11049: 11029: 11023: 11022: 11012: 11002: 10976: 10970: 10969: 10967: 10966: 10960: 10953: 10942: 10936: 10935: 10933: 10921: 10915: 10914: 10912: 10911: 10891: 10885: 10884: 10859:(9): 1038–1040. 
10848: 10842: 10841: 10839: 10827: 10821: 10820: 10818: 10816: 10797: 10791: 10790: 10788: 10786: 10765: 10756: 10755: 10753: 10741: 10735: 10734: 10732: 10730: 10721:. Archived from 10715: 10709: 10708: 10706: 10705: 10690: 10684: 10681: 10675: 10674: 10672: 10670: 10651: 10645: 10644: 10642: 10641: 10622: 10616: 10615: 10605: 10595: 10571: 10565: 10564: 10546: 10522: 10516: 10513: 10507: 10506: 10504: 10502: 10497:on 29 March 2017 10496: 10490:. Archived from 10489: 10480: 10471: 10470: 10468: 10466: 10444: 10438: 10437: 10435: 10422: 10416: 10415: 10413: 10411: 10391: 10378: 10377: 10375: 10373: 10353: 10347: 10346: 10344: 10342: 10323: 10317: 10316: 10306: 10296: 10272: 10266: 10265: 10254:10.1002/dac.3259 10237: 10231: 10230: 10228: 10226: 10206: 10197: 10196: 10164: 10158: 10157: 10155: 10153: 10133: 10127: 10126: 10124: 10122: 10102: 10096: 10095: 10093: 10091: 10085: 10074: 10065: 10059: 10058: 10056: 10055: 10049: 10038: 10029: 10023: 10022: 10020: 10018: 10012: 10005: 9996: 9987: 9986: 9984: 9972: 9966: 9965: 9963: 9939: 9928: 9927: 9925: 9901: 9890: 9879: 9873: 9872: 9870: 9858: 9852: 9851: 9833: 9813: 9807: 9806: 9804: 9803: 9784: 9778: 9777: 9775: 9762: 9756: 9755: 9753: 9751: 9731: 9725: 9724: 9722: 9721: 9715: 9700: 9682: 9673: 9667: 9666: 9642: 9636: 9635: 9633: 9632: 9592: 9586: 9585: 9566: 9560: 9559: 9557: 9555: 9524: 9518: 9517: 9491: 9471: 9460: 9459: 9449: 9409: 9403: 9402: 9400: 9399: 9384: 9378: 9377: 9375: 9374: 9359: 9353: 9352: 9350: 9340: 9316: 9310: 9309: 9298: 9292: 9291: 9289: 9287: 9272:"AI and Compute" 9268: 9262: 9261: 9259: 9257: 9237: 9231: 9230: 9228: 9226: 9207: 9201: 9200: 9198: 9196: 9176: 9170: 9155: 9149: 9146: 9140: 9139: 9113: 9093: 9087: 9086: 9084: 9082: 9034: 9028: 9027: 9025: 9024: 9004: 8998: 8997: 8995: 8993: 8974: 8968: 8967: 8965: 8964: 8958: 8947: 8938: 8932: 8931: 8903: 8893: 8877: 8871: 8870: 8833:Sainath, Tara N. 8829: 8823: 8822: 8811:10.1109/5.726791 8790: 8784: 8783: 8781: 8779: 8760: 8751: 8750: 8717:(8): 1735–1780. 8706: 8700: 8699: 8697: 8696: 8690: 8663: 8654: 8648: 8647: 8645: 8633: 8624: 8623: 8621: 8609: 8600: 8599: 8597: 8596: 8590: 8575: 8561: 8552: 8541: 8540: 8538: 8537: 8504:(6): 1333–1340. 8489: 8480: 8479: 8477: 8475: 8470:on 31 March 2019 8466:. Archived from 8455: 8449: 8448: 8446: 8445: 8425: 8419: 8418: 8416: 8415: 8395: 8389: 8388: 8387: 8386: 8369: 8363: 8362: 8309: 8303: 8302: 8296: 8288: 8270: 8264: 8263: 8261: 8260: 8246: 8240: 8239: 8237: 8236: 8230: 8219: 8210: 8204: 8203: 8201: 8189: 8183: 8182: 8180: 8174:. Archived from 8173: 8164: 8158: 8157: 8129: 8120:. pp. 1–4. 8113: 8107: 8106: 8104: 8103: 8087: 8078: 8072: 8066: 8065: 8063: 8045: 8036: 8030: 8029: 8027: 8025: 8010: 8004: 8003: 8001: 7989: 7983: 7982: 7980: 7978: 7971:SyncedReview.com 7963: 7957: 7956: 7954: 7952: 7946: 7939: 7928: 7922: 7921: 7919: 7907: 7901: 7900: 7880: 7858: 7852: 7851: 7849: 7833: 7827: 7826: 7824: 7812: 7806: 7805: 7804: 7788: 7782: 7780: 7778: 7766: 7760: 7758: 7756: 7743: 7737: 7735: 7733: 7721: 7715: 7714: 7712: 7698: 7689: 7683: 7682: 7680: 7668: 7662: 7661: 7659: 7647: 7641: 7640: 7606: 7600: 7599: 7597: 7596: 7590: 7583: 7572: 7566: 7565: 7563: 7562: 7556: 7537: 7528: 7522: 7521: 7487: 7467: 7461: 7460: 7430: 7424: 7423: 7421: 7406: 7400: 7399: 7398: 7397: 7379: 7370: 7369: 7352:(6): 1311–1314. 
7341: 7332: 7331: 7329: 7327: 7307: 7301: 7300: 7298: 7297: 7270:Interspeech 2011 7261: 7255: 7254: 7252: 7251: 7231: 7225: 7224: 7222: 7221: 7201: 7192: 7191: 7189: 7187: 7168: 7162: 7161: 7141: 7130: 7129: 7127: 7125: 7119: 7112: 7103: 7092: 7091: 7051: 7040: 7024: 7018: 7017: 7007: 6975: 6969: 6968: 6943:(7): 1527–1554. 6928: 6922: 6921: 6889: 6883: 6866:G. E. Hinton., " 6864: 6858: 6857: 6855: 6854: 6848: 6818:(7): 1527–1554. 6807: 6795: 6789: 6788: 6786: 6784: 6728: 6722: 6711: 6705: 6691: 6685: 6684: 6682: 6659: 6653: 6652: 6650: 6649: 6643: 6632: 6623: 6614: 6613: 6611: 6609: 6590: 6584: 6581: 6575: 6574: 6554: 6545: 6544: 6524: 6518: 6517: 6497: 6491: 6490: 6488: 6487: 6471: 6465: 6464: 6420: 6414: 6413: 6411: 6410: 6404: 6372:10.1109/29.21701 6357: 6348: 6342: 6341: 6339: 6338: 6305: 6299: 6298: 6270: 6264: 6263: 6223: 6217: 6216: 6198: 6192: 6191: 6148:Frey, Brendan J. 6136: 6130: 6129: 6128: 6122: 6104: 6073:Neal, Radford M. 6061: 6055: 6054: 6032: 6023: 6017: 6016: 6010: 6002: 5970: 5964: 5963: 5937: 5914: 5905: 5904: 5873: 5867: 5866: 5855: 5846: 5845: 5819: 5813: 5812: 5788: 5782: 5781: 5759: 5753: 5734: 5723: 5721: 5719: 5708: 5702: 5701: 5675: 5666: 5657: 5656: 5650: 5642: 5636: 5624: 5613: 5612: 5580: 5574: 5573: 5557: 5551: 5550: 5548: 5546: 5532:10.1109/5.726791 5525: 5507: 5498: 5492: 5491: 5480:10.1118/1.597177 5451: 5445: 5444: 5404: 5398: 5397: 5357: 5351: 5340: 5334: 5333: 5321: 5315: 5306:Alexander Waibel 5303: 5297: 5296: 5294: 5283: 5277: 5263: 5257: 5256: 5245:10.1038/323533a0 5216: 5210: 5209: 5191: 5185: 5184: 5182: 5180: 5174: 5163: 5151: 5145: 5144: 5142: 5140: 5121: 5112: 5109: 5103: 5102: 5071: 5065: 5064: 5053: 5047: 5046: 5027:Kelley, Henry J. 5023: 5017: 5016: 4996: 4990: 4989: 4953: 4947: 4946: 4914: 4908: 4907: 4905: 4893: 4887: 4886: 4875:Amari, Shun'ichi 4871: 4865: 4864: 4862: 4835: 4829: 4828: 4826: 4825: 4819: 4800: 4791: 4782: 4781: 4757: 4751: 4750: 4730: 4724: 4723: 4715: 4709: 4708: 4697: 4688: 4687: 4668:10.1037/h0042519 4647: 4641: 4640: 4632: 4626: 4625: 4609: 4603: 4602: 4576: 4570: 4569: 4559: 4549: 4532:(8): 2554–2558. 4517: 4511: 4510: 4508: 4493: 4474: 4473: 4472:(21): 1197–1206. 4461: 4452: 4451: 4423: 4417: 4416: 4405: 4396: 4395: 4393: 4392: 4386: 4369: 4358: 4352: 4351: 4333: 4313: 4307: 4306: 4286: 4277: 4276: 4256: 4243: 4242: 4240: 4239: 4233: 4210: 4201: 4186: 4185: 4175: 4165: 4133: 4127: 4113: 4104: 4103: 4083: 4074: 4073: 4053: 4044: 4043: 4015: 4004: 4003: 4001: 3995:. Archived from 3960: 3954:Cybenko (1989). 3951: 3940: 3939: 3921: 3912:(2): 1385–1390. 3897: 3891: 3888: 3882: 3881: 3879: 3877: 3846: 3840: 3824: 3818: 3817: 3815: 3783: 3774: 3773: 3771: 3770: 3764: 3757: 3746: 3740: 3739: 3719: 3713: 3712: 3686: 3666: 3645: 3644: 3618: 3609:(8): 1798–1828. 3598: 3583: 3582: 3580: 3578: 3572: 3566:. Archived from 3549: 3531: 3522: 3509: 3508: 3506: 3505: 3490: 3484: 3483: 3481: 3479: 3460: 3454: 3453: 3451: 3450: 3444: 3433: 3424: 3415: 3414: 3386: 3370: 3361: 3360: 3318: 3309: 3300: 3299: 3259: 3159:devices such as 3113:media philosophy 3104: 3097: 3093: 3090: 3084: 3082: 3041: 3017: 3009: 2923:production rules 2806:Google Translate 2691:biochemical test 2685:Epigenetic clock 2679:Epigenetic clock 2559:Deep Image Prior 2547:super-resolution 2539:inverse problems 2457:random variables 2417:ANN was used in 2374:direct marketing 2278:(LSTM) network. 
2272:Google Translate 2153: 2102:Skype Translator 2077:domain knowledge 1936: 1935: 1922:American English 1852:Atomically thin 1847:Cerebras Systems 1837:servers such as 1772: 1770: 1769: 1764: 1762: 1761: 1741: 1739: 1738: 1733: 1731: 1730: 1595:biological brain 1534: 1510: 1434:Stable Diffusion 1426:Diffusion models 1388:Google DeepDream 1359:Andrew Zisserman 1082:gradient descent 1074:generative model 932:Seppo Linnainmaa 837:Frank Rosenblatt 730:machine learning 580:machine learning 430:machine learning 415: 408: 401: 322:Existential risk 144:Machine learning 45: 44: 21: 15075: 15074: 15070: 15069: 15068: 15066: 15065: 15064: 15050: 15049: 15048: 15043: 14995: 14909: 14875:Google DeepMind 14853: 14819:Geoffrey Hinton 14778: 14715: 14641:Project Debater 14587: 14485:Implementations 14480: 14434: 14398: 14341: 14283:Backpropagation 14217: 14203:Tensor calculus 14157: 14154: 14124: 14109: 14108: 14102: 14100: 14093: 14073:Goodfellow, Ian 14065: 14046: 14029: 14027:Further reading 14024: 14014: 14012: 13991: 13990: 13986: 13941: 13930: 13923:Global Dialogue 13915: 13911: 13902: 13900: 13877: 13873: 13863: 13861: 13854:Singularity Hub 13848: 13847: 13834: 13824: 13822: 13809: 13808: 13804: 13794: 13792: 13779: 13778: 13774: 13765: 13763: 13754:Eisner, Jason. 13752: 13748: 13743: 13739: 13723:10.1.1.681.2190 13706: 13702: 13685: 13681: 13664: 13660: 13651: 13649: 13645: 13638: 13632: 13623: 13613: 13611: 13596: 13592: 13582: 13580: 13569: 13565: 13556: 13554: 13539: 13535: 13525: 13523: 13508: 13504: 13494: 13492: 13477: 13473: 13463: 13461: 13456:. 16 May 2018. 13448: 13447: 13443: 13428: 13406: 13402: 13392: 13390: 13375: 13371: 13361: 13359: 13346: 13345: 13341: 13331: 13277:Hassabis, Demis 13273:Sutskever, Ilya 13262: 13258: 13234:10.1038/529445a 13205: 13201: 13191: 13189: 13174: 13170: 13115: 13111: 13050: 13046: 13015:10.1038/nn.4244 12999: 12995: 12948: 12944: 12891: 12887: 12816: 12812: 12765: 12761: 12698: 12694: 12655: 12651: 12588: 12584: 12575: 12571: 12522: 12518: 12491:Neural Networks 12487: 12483: 12476: 12460: 12456: 12417: 12413: 12388: 12384: 12375: 12373: 12363: 12359: 12295: 12291: 12230: 12226: 12173: 12169: 12104: 12100: 12047: 12043: 11990: 11986: 11925: 11921: 11911: 11909: 11896: 11895: 11886: 11847: 11843: 11794:(7990): 80–85. 11780: 11776: 11767: 11765: 11755: 11751: 11742: 11740: 11725: 11721: 11690: 11686: 11677: 11675: 11671: 11664: 11656: 11652: 11642: 11640: 11627: 11626: 11622: 11607: 11585: 11581: 11526: 11522: 11469: 11465: 11456: 11454: 11439: 11413: 11409: 11351: 11347: 11297: 11291: 11284: 11275: 11273: 11264: 11260: 11251: 11249: 11241: 11240: 11236: 11183: 11179: 11124: 11120: 11110: 11108: 11093: 11061: 11057: 11047: 11045: 11030: 11026: 10977: 10973: 10964: 10962: 10958: 10951: 10943: 10939: 10922: 10918: 10909: 10907: 10892: 10888: 10849: 10845: 10828: 10824: 10814: 10812: 10807:. 27 May 2015. 10799: 10798: 10794: 10784: 10782: 10767: 10766: 10759: 10742: 10738: 10728: 10726: 10717: 10716: 10712: 10703: 10701: 10692: 10691: 10687: 10682: 10678: 10668: 10666: 10653: 10652: 10648: 10639: 10637: 10624: 10623: 10619: 10572: 10568: 10544:10.1038/nrd4090 10523: 10519: 10514: 10510: 10500: 10498: 10494: 10487: 10481: 10474: 10464: 10462: 10445: 10441: 10423: 10419: 10409: 10407: 10392: 10381: 10371: 10369: 10354: 10350: 10340: 10338: 10325: 10324: 10320: 10273: 10269: 10238: 10234: 10224: 10222: 10207: 10200: 10169:Hakkani-Tur, D. 
10165: 10161: 10151: 10149: 10134: 10130: 10120: 10118: 10103: 10099: 10089: 10087: 10083: 10072: 10066: 10062: 10053: 10051: 10047: 10036: 10030: 10026: 10016: 10014: 10010: 10003: 9997: 9990: 9973: 9969: 9940: 9931: 9902: 9893: 9880: 9876: 9859: 9855: 9831:10.1.1.226.8219 9818:Neural Networks 9814: 9810: 9801: 9799: 9786: 9785: 9781: 9763: 9759: 9749: 9747: 9732: 9728: 9719: 9717: 9713: 9680: 9674: 9670: 9643: 9639: 9630: 9628: 9593: 9589: 9567: 9563: 9553: 9551: 9549: 9525: 9521: 9472: 9463: 9410: 9406: 9397: 9395: 9386: 9385: 9381: 9372: 9370: 9360: 9356: 9317: 9313: 9300: 9299: 9295: 9285: 9283: 9278:. 16 May 2018. 9270: 9269: 9265: 9255: 9253: 9238: 9234: 9224: 9222: 9215:InformationWeek 9209: 9208: 9204: 9194: 9192: 9177: 9173: 9167:Wayback Machine 9156: 9152: 9147: 9143: 9094: 9090: 9080: 9078: 9063: 9035: 9031: 9022: 9020: 9005: 9001: 8991: 8989: 8976: 8975: 8971: 8962: 8960: 8956: 8945: 8939: 8935: 8920: 8901:10.1.1.752.9151 8878: 8874: 8859: 8830: 8826: 8791: 8787: 8777: 8775: 8762: 8761: 8754: 8707: 8703: 8694: 8692: 8688: 8661: 8655: 8651: 8634: 8627: 8610: 8603: 8594: 8592: 8588: 8559: 8553: 8544: 8535: 8533: 8490: 8483: 8473: 8471: 8458:Hof, Robert D. 8456: 8452: 8443: 8441: 8426: 8422: 8413: 8411: 8396: 8392: 8384: 8382: 8371: 8370: 8366: 8310: 8306: 8290: 8289: 8285: 8277:. Sourcebooks. 8271: 8267: 8258: 8256: 8248: 8247: 8243: 8234: 8232: 8228: 8217: 8211: 8207: 8190: 8186: 8178: 8171: 8165: 8161: 8146: 8114: 8110: 8101: 8099: 8088: 8081: 8073: 8069: 8043: 8037: 8033: 8023: 8021: 8016:. witness.org. 8012: 8011: 8007: 7990: 7986: 7976: 7974: 7965: 7964: 7960: 7950: 7948: 7944: 7937: 7929: 7925: 7908: 7904: 7897: 7859: 7855: 7834: 7830: 7813: 7809: 7789: 7785: 7767: 7763: 7744: 7740: 7722: 7718: 7696: 7690: 7686: 7669: 7665: 7648: 7644: 7629: 7607: 7603: 7594: 7592: 7588: 7581: 7573: 7569: 7560: 7558: 7554: 7535: 7529: 7525: 7468: 7464: 7457: 7431: 7427: 7407: 7403: 7395: 7393: 7380: 7373: 7342: 7335: 7325: 7323: 7308: 7304: 7295: 7293: 7262: 7258: 7249: 7247: 7232: 7228: 7219: 7217: 7202: 7195: 7185: 7183: 7170: 7169: 7165: 7158: 7142: 7133: 7123: 7121: 7117: 7110: 7104: 7095: 7052: 7043: 7038:Wayback Machine 7025: 7021: 6976: 6972: 6929: 6925: 6904:(10): 428–434. 6890: 6886: 6876:Wayback Machine 6865: 6861: 6852: 6850: 6846: 6805: 6796: 6792: 6782: 6780: 6743:(10): 428–434. 6729: 6725: 6712: 6708: 6702:Wayback Machine 6692: 6688: 6660: 6656: 6647: 6645: 6641: 6630: 6624: 6617: 6607: 6605: 6592: 6591: 6587: 6582: 6578: 6555: 6548: 6525: 6521: 6502:Neural Networks 6498: 6494: 6485: 6483: 6472: 6468: 6421: 6417: 6408: 6406: 6402: 6355: 6349: 6345: 6336: 6334: 6327: 6306: 6302: 6271: 6267: 6224: 6220: 6213: 6199: 6195: 6137: 6133: 6123: 6062: 6058: 6051: 6030: 6024: 6020: 6004: 6003: 5971: 5967: 5922:Neural Networks 5915: 5908: 5874: 5870: 5856: 5849: 5842: 5820: 5816: 5792:Sepp Hochreiter 5789: 5785: 5778: 5760: 5756: 5746:Wayback Machine 5735: 5726: 5717: 5709: 5705: 5673: 5667: 5660: 5644: 5643: 5634: 5625: 5616: 5581: 5577: 5558: 5554: 5544: 5542: 5505: 5499: 5495: 5460:Medical Physics 5452: 5448: 5405: 5401: 5358: 5354: 5341: 5337: 5322: 5318: 5304: 5300: 5292: 5284: 5280: 5274:Wayback Machine 5264: 5260: 5217: 5213: 5206: 5192: 5188: 5178: 5176: 5172: 5161: 5152: 5148: 5138: 5136: 5135:on 30 July 2024 5127:(25 Oct 2014). 5122: 5115: 5110: 5106: 5072: 5068: 5054: 5050: 5037:(10): 947–954. 
5024: 5020: 5013: 4997: 4993: 4954: 4950: 4915: 4911: 4894: 4890: 4872: 4868: 4836: 4832: 4823: 4821: 4817: 4798: 4792: 4785: 4758: 4754: 4747: 4731: 4727: 4716: 4712: 4698: 4691: 4648: 4644: 4633: 4629: 4610: 4606: 4599: 4577: 4573: 4518: 4514: 4494: 4477: 4462: 4455: 4424: 4420: 4407: 4406: 4399: 4390: 4388: 4384: 4378: 4367: 4359: 4355: 4314: 4310: 4287: 4280: 4273: 4257: 4246: 4237: 4235: 4231: 4208: 4202: 4189: 4134: 4130: 4124:Wayback Machine 4114: 4107: 4100: 4084: 4077: 4070: 4054: 4047: 4020:Neural Networks 4016: 4007: 3999: 3958: 3952: 3943: 3898: 3894: 3889: 3885: 3875: 3873: 3871: 3847: 3843: 3838:Wayback Machine 3825: 3821: 3784: 3777: 3768: 3766: 3762: 3755: 3747: 3743: 3736: 3720: 3716: 3671:Neural Networks 3667: 3648: 3599: 3586: 3576: 3574: 3573:on 4 March 2016 3570: 3547:10.1.1.701.9550 3529: 3523: 3512: 3503: 3501: 3492: 3491: 3487: 3477: 3475: 3470:. 25 May 2017. 3462: 3461: 3457: 3448: 3446: 3442: 3431: 3425: 3418: 3403: 3371: 3364: 3316: 3314:"Deep Learning" 3310: 3303: 3264:"Deep Learning" 3260: 3256: 3252: 3180: 3157:quantified-self 3129:Rainer Mühlhoff 3105: 3094: 3088: 3085: 3048:"Deep learning" 3042: 3040: 3030: 3018: 3007: 2935: 2902: 2842: 2836: 2828: 2776: 2756:backpropagation 2724: 2687: 2681: 2672: 2663: 2637: 2617: 2609: 2583:Google DeepMind 2579: 2571:fraud detection 2567: 2535: 2522: 2513: 2475:X to an output 2461:random variable 2449: 2411: 2405: 2396: 2390: 2367: 2361: 2313: 2302: 2293: 2226: 2220: 2185: 2142: 2140: 2138:Computer vision 2134: 1943: 1910: 1904: 1899: 1884:frequency combs 1835:cloud computing 1833:cellphones and 1806: 1757: 1753: 1751: 1748: 1747: 1726: 1722: 1720: 1717: 1716: 1698: 1646: 1615:backpropagation 1553: 1552: 1551: 1550: 1549: 1547: 1535: 1527: 1526: 1511: 1500: 1494: 1492:Neural networks 1478:Geoffrey Hinton 1343:Geoffrey Hinton 1335:Alex Krizhevsky 1290:GeForce GTX 280 1267: 1230:, Osindero and 1190: 1101:Geoffrey Hinton 1093:Terry Sejnowski 1047:Sepp Hochreiter 1027:a higher level 992:(1986) and the 948: 924:Henry J. 
Kelley 912:Backpropagation 885:introduced the 868:Shun'ichi Amari 819:Shun'ichi Amari 787: 782: 668: 666:Interpretations 549:neural networks 545: 522:climate science 494:computer vision 455:semi-supervised 434:neural networks 428:is a subset of 419: 390: 389: 380: 372: 371: 347: 337: 336: 308:Control problem 288: 278: 277: 189: 179: 178: 139: 131: 130: 101:Computer vision 76: 35: 28: 23: 22: 15: 12: 11: 5: 15073: 15063: 15062: 15045: 15044: 15042: 15041: 15040: 15039: 15034: 15021: 15020: 15019: 15014: 15000: 14997: 14996: 14994: 14993: 14988: 14983: 14978: 14973: 14968: 14963: 14958: 14953: 14948: 14943: 14938: 14933: 14928: 14923: 14917: 14915: 14911: 14910: 14908: 14907: 14902: 14897: 14892: 14887: 14882: 14877: 14872: 14867: 14861: 14859: 14855: 14854: 14852: 14851: 14849:Ilya Sutskever 14846: 14841: 14836: 14831: 14826: 14821: 14816: 14814:Demis Hassabis 14811: 14806: 14804:Ian Goodfellow 14801: 14796: 14790: 14788: 14784: 14783: 14780: 14779: 14777: 14776: 14771: 14770: 14769: 14759: 14754: 14749: 14744: 14739: 14734: 14729: 14723: 14721: 14717: 14716: 14714: 14713: 14708: 14703: 14698: 14693: 14688: 14683: 14678: 14673: 14668: 14663: 14658: 14653: 14648: 14643: 14638: 14633: 14632: 14631: 14621: 14616: 14611: 14606: 14601: 14595: 14593: 14589: 14588: 14586: 14585: 14580: 14579: 14578: 14573: 14563: 14562: 14561: 14556: 14551: 14541: 14536: 14531: 14526: 14521: 14516: 14511: 14506: 14501: 14495: 14493: 14486: 14482: 14481: 14479: 14478: 14473: 14468: 14463: 14458: 14453: 14448: 14442: 14440: 14436: 14435: 14433: 14432: 14427: 14422: 14417: 14412: 14406: 14404: 14400: 14399: 14397: 14396: 14395: 14394: 14387:Language model 14384: 14379: 14374: 14373: 14372: 14362: 14361: 14360: 14349: 14347: 14343: 14342: 14340: 14339: 14337:Autoregression 14334: 14329: 14328: 14327: 14317: 14315:Regularization 14312: 14311: 14310: 14305: 14300: 14290: 14285: 14280: 14278:Loss functions 14275: 14270: 14265: 14260: 14255: 14254: 14253: 14243: 14238: 14237: 14236: 14225: 14223: 14219: 14218: 14216: 14215: 14213:Inductive bias 14210: 14205: 14200: 14195: 14190: 14185: 14180: 14175: 14167: 14165: 14159: 14158: 14153: 14152: 14145: 14138: 14130: 14123: 14122: 14091: 14077:Bengio, Yoshua 14069: 14063: 14050: 14044: 14030: 14028: 14025: 14023: 14022: 13984: 13928: 13909: 13871: 13832: 13802: 13772: 13746: 13737: 13716:(4): 259–362. 13700: 13679: 13658: 13621: 13590: 13563: 13547:The New Yorker 13533: 13502: 13471: 13441: 13426: 13400: 13369: 13339: 13256: 13199: 13168: 13109: 13044: 13009:(3): 356–365. 12993: 12958:(4): 481–487. 12942: 12905:(2): 383–394. 12885: 12810: 12775:(9): 657–664. 12759: 12692: 12665:(5): 895–938. 12649: 12582: 12569: 12539:10.1.1.41.7854 12532:(4): 537–556. 12516: 12481: 12474: 12454: 12411: 12382: 12357: 12289: 12224: 12167: 12098: 12041: 11984: 11919: 11884: 11841: 11774: 11749: 11719: 11684: 11650: 11620: 11605: 11579: 11520: 11463: 11437: 11407: 11345: 11282: 11258: 11234: 11197:(2): 361–370. 11177: 11118: 11091: 11055: 11024: 10971: 10937: 10916: 10886: 10843: 10822: 10792: 10757: 10736: 10710: 10685: 10676: 10646: 10617: 10586:(5): 505–513. 10566: 10517: 10508: 10472: 10439: 10417: 10379: 10348: 10318: 10267: 10232: 10198: 10179:(3): 530–539. 
10159: 10128: 10097: 10060: 10024: 9988: 9967: 9929: 9891: 9874: 9853: 9808: 9792:yann.lecun.com 9779: 9757: 9726: 9668: 9637: 9587: 9570:Robinson, Tony 9561: 9547: 9519: 9461: 9404: 9379: 9354: 9311: 9293: 9263: 9232: 9202: 9185:airesearch.com 9171: 9150: 9141: 9088: 9061: 9029: 8999: 8969: 8933: 8918: 8872: 8857: 8824: 8785: 8752: 8701: 8649: 8625: 8601: 8542: 8481: 8450: 8420: 8390: 8364: 8304: 8284:978-1492671206 8283: 8265: 8254:awards.acm.org 8241: 8205: 8184: 8159: 8144: 8108: 8079: 8067: 8031: 8005: 7984: 7958: 7923: 7902: 7895: 7853: 7828: 7807: 7783: 7761: 7738: 7716: 7684: 7663: 7642: 7627: 7601: 7567: 7523: 7462: 7455: 7425: 7401: 7371: 7333: 7302: 7256: 7226: 7193: 7163: 7156: 7131: 7093: 7041: 7019: 6970: 6923: 6884: 6859: 6790: 6723: 6706: 6686: 6680:10.1.1.75.6306 6654: 6615: 6585: 6576: 6565:(2): 181–192. 6546: 6535:(2): 225–254. 6519: 6508:(2): 331–339. 6492: 6466: 6415: 6366:(3): 328–339. 6343: 6325: 6300: 6281:(4): 899–916. 6265: 6238:(4): 865–884. 6218: 6211: 6193: 6131: 6087:(5): 889–904. 6056: 6049: 6018: 5985:(1): 147–169. 5965: 5906: 5887:(3): 230–247. 5868: 5863:Proc. SAB'1991 5847: 5840: 5814: 5783: 5776: 5754: 5724: 5703: 5684:(2): 234–242. 5658: 5630:(April 1991). 5614: 5595:(2): 179–211. 5575: 5552: 5523:10.1.1.32.9552 5493: 5446: 5419:(29): 4211–7. 5413:Applied Optics 5399: 5372:(32): 4790–7. 5366:Applied Optics 5352: 5335: 5316: 5298: 5278: 5258: 5211: 5204: 5186: 5146: 5113: 5104: 5085:(2): 146–160. 5066: 5048: 5043:10.2514/8.5282 5018: 5011: 4991: 4964:(4): 193–202. 4948: 4909: 4888: 4885:(16): 279–307. 4866: 4830: 4783: 4772:(2): 207–219. 4752: 4745: 4725: 4710: 4689: 4662:(6): 386–408. 4642: 4627: 4604: 4597: 4571: 4512: 4475: 4453: 4434:(4): 883–893. 4418: 4397: 4376: 4353: 4324:(2): 233–268. 4308: 4297:(4): 322–333. 4278: 4271: 4244: 4219:(3–4): 1–199. 4187: 4128: 4105: 4098: 4075: 4068: 4045: 4026:(2): 251–257. 4005: 3971:(4): 303–314. 3941: 3892: 3883: 3869: 3841: 3819: 3775: 3741: 3734: 3728:. IGI Global. 3714: 3646: 3584: 3510: 3485: 3455: 3416: 3401: 3362: 3301: 3274:(4): 357–363. 
3253: 3251: 3248: 3247: 3246: 3241: 3236: 3231: 3226: 3221: 3216: 3211: 3206: 3201: 3196: 3191: 3186: 3179: 3176: 3107: 3106: 3021: 3019: 3012: 3006: 3003: 2999:data poisoning 2934: 2931: 2901: 2898: 2835: 2832: 2827: 2824: 2775: 2772: 2739:) support the 2723: 2720: 2708:ovarian cancer 2683:Main article: 2680: 2677: 2671: 2668: 2662: 2659: 2636: 2633: 2616: 2613: 2608: 2605: 2578: 2575: 2566: 2563: 2534: 2531: 2521: 2518: 2512: 2509: 2448: 2445: 2419:bioinformatics 2409:Bioinformatics 2407:Main article: 2404: 2403:Bioinformatics 2401: 2392:Main article: 2389: 2386: 2363:Main article: 2360: 2357: 2306:Drug discovery 2301: 2298: 2292: 2289: 2262:word embedding 2233:word embedding 2222:Main article: 2219: 2216: 2215: 2214: 2211: 2205: 2184: 2181: 2168:MNIST database 2136:Main article: 2133: 2130: 2090: 2089: 2086: 2080: 2070: 2060: 2057: 2054: 2051: 2042: 2041: 2038: 2034: 2033: 2030: 2026: 2025: 2022: 2018: 2017: 2014: 2010: 2009: 2006: 2002: 2001: 1998: 1994: 1993: 1990: 1986: 1985: 1982: 1978: 1977: 1974: 1970: 1969: 1966: 1962: 1961: 1958: 1954: 1953: 1950: 1946: 1945: 1940: 1906:Main article: 1903: 1900: 1898: 1895: 1854:semiconductors 1805: 1802: 1760: 1756: 1729: 1725: 1709:Regularization 1697: 1694: 1645: 1642: 1630:social network 1544:false positive 1536: 1529: 1528: 1512: 1505: 1504: 1503: 1502: 1501: 1496:Main article: 1493: 1490: 1406:Ian Goodfellow 1355:Karen Simonyan 1339:Ilya Sutskever 1266: 1263: 1259:decision trees 1189: 1186: 1076:that models a 990:Jordan network 947: 944: 928:control theory 904:introduced by 789:There are two 786: 783: 781: 778: 746:generalization 691:George Cybenko 667: 664: 544: 541: 510:bioinformatics 421: 420: 418: 417: 410: 403: 395: 392: 391: 388: 387: 381: 378: 377: 374: 373: 370: 369: 364: 359: 354: 348: 343: 342: 339: 338: 335: 334: 329: 324: 319: 314: 305: 300: 295: 289: 284: 283: 280: 279: 276: 275: 270: 265: 260: 255: 254: 253: 243: 238: 233: 232: 231: 226: 221: 211: 206: 204:Earth sciences 201: 196: 194:Bioinformatics 190: 185: 184: 181: 180: 177: 176: 171: 166: 161: 156: 151: 146: 140: 137: 136: 133: 132: 129: 128: 123: 118: 113: 108: 103: 98: 93: 88: 83: 77: 72: 71: 68: 67: 57: 56: 50: 49: 26: 9: 6: 4: 3: 2: 15072: 15061: 15060:Deep learning 15058: 15057: 15055: 15038: 15035: 15033: 15030: 15029: 15022: 15018: 15015: 15013: 15010: 15009: 15006: 15002: 15001: 14998: 14992: 14989: 14987: 14984: 14982: 14979: 14977: 14974: 14972: 14969: 14967: 14964: 14962: 14959: 14957: 14954: 14952: 14949: 14947: 14944: 14942: 14939: 14937: 14934: 14932: 14929: 14927: 14924: 14922: 14919: 14918: 14916: 14914:Architectures 14912: 14906: 14903: 14901: 14898: 14896: 14893: 14891: 14888: 14886: 14883: 14881: 14878: 14876: 14873: 14871: 14868: 14866: 14863: 14862: 14860: 14858:Organizations 14856: 14850: 14847: 14845: 14842: 14840: 14837: 14835: 14832: 14830: 14827: 14825: 14822: 14820: 14817: 14815: 14812: 14810: 14807: 14805: 14802: 14800: 14797: 14795: 14794:Yoshua Bengio 14792: 14791: 14789: 14785: 14775: 14774:Robot control 14772: 14768: 14765: 14764: 14763: 14760: 14758: 14755: 14753: 14750: 14748: 14745: 14743: 14740: 14738: 14735: 14733: 14730: 14728: 14725: 14724: 14722: 14718: 14712: 14709: 14707: 14704: 14702: 14699: 14697: 14694: 14692: 14691:Chinchilla AI 14689: 14687: 14684: 14682: 14679: 14677: 14674: 14672: 14669: 14667: 14664: 14662: 14659: 14657: 14654: 14652: 14649: 14647: 14644: 14642: 14639: 14637: 14634: 14630: 14627: 14626: 14625: 14622: 14620: 14617: 14615: 14612: 14610: 14607: 14605: 14602: 14600: 14597: 14596: 14594: 14590: 14584: 14581: 
14577: 14574: 14572: 14569: 14568: 14567: 14564: 14560: 14557: 14555: 14552: 14550: 14547: 14546: 14545: 14542: 14540: 14537: 14535: 14532: 14530: 14527: 14525: 14522: 14520: 14517: 14515: 14512: 14510: 14507: 14505: 14502: 14500: 14497: 14496: 14494: 14490: 14487: 14483: 14477: 14474: 14472: 14469: 14467: 14464: 14462: 14459: 14457: 14454: 14452: 14449: 14447: 14444: 14443: 14441: 14437: 14431: 14428: 14426: 14423: 14421: 14418: 14416: 14413: 14411: 14408: 14407: 14405: 14401: 14393: 14390: 14389: 14388: 14385: 14383: 14380: 14378: 14375: 14371: 14370:Deep learning 14368: 14367: 14366: 14363: 14359: 14356: 14355: 14354: 14351: 14350: 14348: 14344: 14338: 14335: 14333: 14330: 14326: 14323: 14322: 14321: 14318: 14316: 14313: 14309: 14306: 14304: 14301: 14299: 14296: 14295: 14294: 14291: 14289: 14286: 14284: 14281: 14279: 14276: 14274: 14271: 14269: 14266: 14264: 14261: 14259: 14258:Hallucination 14256: 14252: 14249: 14248: 14247: 14244: 14242: 14239: 14235: 14232: 14231: 14230: 14227: 14226: 14224: 14220: 14214: 14211: 14209: 14206: 14204: 14201: 14199: 14196: 14194: 14191: 14189: 14186: 14184: 14181: 14179: 14176: 14174: 14173: 14169: 14168: 14166: 14164: 14160: 14151: 14146: 14144: 14139: 14137: 14132: 14131: 14128: 14119: 14113: 14098: 14094: 14088: 14085:. MIT Press. 14084: 14083: 14082:Deep Learning 14078: 14074: 14070: 14066: 14064:9780262048644 14060: 14056: 14051: 14047: 14041: 14037: 14032: 14031: 14010: 14006: 14002: 13998: 13994: 13988: 13980: 13976: 13972: 13968: 13963: 13958: 13954: 13950: 13946: 13939: 13937: 13935: 13933: 13924: 13920: 13913: 13898: 13894: 13890: 13886: 13882: 13875: 13859: 13855: 13851: 13845: 13843: 13841: 13839: 13837: 13820: 13816: 13815:The Daily Dot 13812: 13806: 13790: 13786: 13782: 13776: 13762:on 2017-12-30 13761: 13757: 13750: 13741: 13733: 13729: 13724: 13719: 13715: 13711: 13704: 13695: 13690: 13683: 13674: 13669: 13662: 13644: 13637: 13630: 13628: 13626: 13609: 13605: 13601: 13594: 13578: 13574: 13567: 13552: 13548: 13544: 13537: 13521: 13517: 13513: 13506: 13490: 13486: 13482: 13475: 13459: 13455: 13451: 13445: 13437: 13433: 13429: 13423: 13419: 13415: 13411: 13404: 13388: 13384: 13380: 13373: 13357: 13353: 13349: 13343: 13335: 13328: 13324: 13320: 13316: 13312: 13308: 13304: 13300: 13296: 13292: 13288: 13284: 13283: 13278: 13274: 13270: 13266: 13265:Silver, David 13260: 13252: 13248: 13244: 13240: 13235: 13230: 13226: 13222: 13218: 13214: 13210: 13203: 13187: 13183: 13179: 13172: 13164: 13160: 13155: 13150: 13146: 13142: 13137: 13132: 13128: 13124: 13120: 13113: 13105: 13101: 13097: 13093: 13088: 13083: 13079: 13075: 13071: 13067: 13063: 13059: 13055: 13048: 13040: 13036: 13032: 13028: 13024: 13020: 13016: 13012: 13008: 13004: 12997: 12989: 12985: 12981: 12977: 12973: 12969: 12965: 12961: 12957: 12953: 12946: 12938: 12934: 12930: 12926: 12922: 12918: 12913: 12908: 12904: 12900: 12896: 12889: 12881: 12877: 12873: 12869: 12864: 12859: 12855: 12851: 12846: 12841: 12837: 12833: 12829: 12825: 12821: 12814: 12806: 12802: 12798: 12794: 12790: 12786: 12782: 12778: 12774: 12770: 12763: 12755: 12751: 12747: 12743: 12738: 12733: 12729: 12725: 12720: 12715: 12711: 12707: 12703: 12696: 12688: 12684: 12680: 12676: 12672: 12668: 12664: 12660: 12653: 12645: 12641: 12636: 12631: 12627: 12623: 12618: 12613: 12609: 12605: 12601: 12597: 12593: 12586: 12579: 12573: 12565: 12561: 12557: 12553: 12549: 12545: 12540: 12535: 12531: 12527: 12520: 12512: 12508: 12504: 12500: 12496: 12492: 12485: 12477: 12471: 12468:. MIT Press. 
12467: 12466: 12458: 12450: 12446: 12442: 12438: 12434: 12430: 12426: 12422: 12415: 12406: 12401: 12397: 12393: 12386: 12372: 12368: 12361: 12353: 12349: 12345: 12341: 12336: 12331: 12327: 12323: 12318: 12313: 12309: 12305: 12301: 12293: 12285: 12281: 12277: 12273: 12269: 12265: 12261: 12257: 12252: 12247: 12243: 12239: 12235: 12228: 12220: 12216: 12212: 12208: 12204: 12200: 12195: 12190: 12186: 12182: 12178: 12171: 12163: 12159: 12154: 12149: 12144: 12139: 12135: 12131: 12126: 12121: 12117: 12113: 12109: 12102: 12094: 12090: 12085: 12080: 12076: 12072: 12068: 12064: 12060: 12056: 12052: 12045: 12037: 12033: 12029: 12025: 12020: 12015: 12011: 12007: 12003: 11999: 11995: 11988: 11980: 11976: 11972: 11968: 11964: 11960: 11955: 11950: 11946: 11942: 11938: 11934: 11930: 11923: 11907: 11903: 11899: 11893: 11891: 11889: 11880: 11876: 11872: 11868: 11864: 11860: 11856: 11852: 11845: 11837: 11833: 11828: 11823: 11819: 11815: 11810: 11805: 11801: 11797: 11793: 11789: 11785: 11778: 11764: 11760: 11753: 11738: 11734: 11730: 11723: 11715: 11711: 11707: 11703: 11699: 11695: 11688: 11670: 11663: 11662: 11654: 11638: 11634: 11633:FloydHub Blog 11630: 11624: 11616: 11612: 11608: 11602: 11598: 11594: 11590: 11583: 11575: 11571: 11566: 11561: 11557: 11553: 11548: 11543: 11539: 11535: 11531: 11524: 11516: 11512: 11508: 11504: 11499: 11494: 11490: 11486: 11482: 11478: 11474: 11467: 11452: 11448: 11444: 11440: 11438:9781538610343 11434: 11430: 11426: 11422: 11418: 11411: 11403: 11399: 11395: 11391: 11387: 11383: 11379: 11375: 11370: 11365: 11361: 11357: 11349: 11341: 11337: 11333: 11329: 11325: 11321: 11316: 11311: 11307: 11303: 11296: 11289: 11287: 11272: 11269: 11262: 11248: 11244: 11238: 11230: 11226: 11221: 11216: 11212: 11208: 11204: 11200: 11196: 11192: 11188: 11181: 11173: 11169: 11165: 11161: 11156: 11151: 11146: 11141: 11137: 11133: 11129: 11122: 11106: 11102: 11098: 11094: 11092:9781450328944 11088: 11084: 11080: 11076: 11072: 11068: 11067: 11059: 11043: 11039: 11035: 11028: 11020: 11016: 11011: 11006: 11001: 11000:10.2196/12957 10996: 10993:(5): e12957. 10992: 10988: 10987: 10982: 10975: 10957: 10950: 10949: 10941: 10932: 10927: 10920: 10905: 10901: 10897: 10890: 10882: 10878: 10874: 10870: 10866: 10862: 10858: 10854: 10847: 10838: 10833: 10826: 10810: 10806: 10802: 10796: 10780: 10776: 10775: 10770: 10764: 10762: 10752: 10747: 10740: 10724: 10720: 10714: 10699: 10695: 10689: 10680: 10664: 10660: 10656: 10650: 10635: 10631: 10627: 10621: 10613: 10609: 10604: 10599: 10594: 10589: 10585: 10581: 10577: 10570: 10562: 10558: 10554: 10550: 10545: 10540: 10536: 10532: 10528: 10521: 10512: 10493: 10486: 10479: 10477: 10460: 10456: 10455: 10450: 10443: 10434: 10429: 10421: 10405: 10401: 10397: 10390: 10388: 10386: 10384: 10367: 10363: 10359: 10352: 10336: 10332: 10328: 10322: 10314: 10310: 10305: 10300: 10295: 10290: 10286: 10282: 10278: 10271: 10263: 10259: 10255: 10251: 10248:(12): e3259. 10247: 10243: 10236: 10220: 10216: 10212: 10205: 10203: 10194: 10190: 10186: 10182: 10178: 10174: 10170: 10163: 10147: 10143: 10139: 10132: 10116: 10112: 10108: 10101: 10082: 10078: 10071: 10064: 10046: 10042: 10035: 10028: 10009: 10002: 9995: 9993: 9983: 9978: 9971: 9962: 9957: 9953: 9949: 9945: 9938: 9936: 9934: 9924: 9919: 9915: 9911: 9907: 9900: 9898: 9896: 9889: 9888: 9883: 9878: 9869: 9864: 9857: 9849: 9845: 9841: 9837: 9832: 9827: 9823: 9819: 9812: 9797: 9793: 9789: 9783: 9774: 9769: 9761: 9745: 9741: 9737: 9730: 9712: 9708: 9704: 9699: 9694: 9690: 9686: 9679: 9672: 9664: 9660: 9656: 9652: 9649:: 1915–1919. 