Deep Learning

Key terms in machine learning:
- Artificial Neural Networks: models patterned loosely after biological neuronal networks.
- Stochastic Gradient Descent: an optimization method used to reduce error by incrementally tuning the weights in a network.
- Supervised Learning: training a network by providing labeled data at the input layer.
- Unsupervised Learning: allowing the machine to find patterns in unlabeled data.
- Deep Learning: using non-linear transforms to correlate data through complex layers and abstractions.

Just as general computing changed the way society operates, so too will general learning methodologies revolutionize the way software is written and the way information is processed. The rapid evolution of machine learning techniques is ushering in an age of digital thinking the world has never seen before. With the ability to abstract information at multiple layers and draw correlations autonomously, self-learning software algorithms are making pattern recognition, image recognition, translation, and many other high-order functions tasks no longer limited to the domain of humans.

Deep learning is a powerful example of one such family of techniques. By consuming input vectors and processing them across enormous numbers of interconnected units and layers, a deep learning algorithm can teach itself to detect edges, motifs, shapes, and entire objects within an image. Thanks to widely available, inexpensive, and powerful parallel processors called Graphics Processing Units (GPUs), the performance of these once-intractable sets of calculations has exploded. Although the structure of a deep learning machine is no secret, the cutting edge of these algorithms typically requires access to extremely large archives of labeled images, sounds, and text, so that the machine can be trained, or train itself, to make the connections needed to produce the desired output.

After recognizing the potential of modern self-learning machines, one can surmise that the future of artificial intelligence is right around the corner, and that the creative, problem-solving faculties of the human mind are one step closer to being replicated algorithmically.
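
To make the key terms above concrete, here is a minimal sketch of supervised learning with stochastic gradient descent on a tiny artificial neural network. It assumes only NumPy; the toy XOR dataset, layer sizes, learning rate, and epoch count are illustrative choices, not values prescribed by the text.

```python
# Supervised learning with stochastic gradient descent (SGD) on a tiny network.
# All specifics (XOR task, 8 hidden units, lr=0.5) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised task: labeled inputs X and targets y (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A small artificial neural network: one hidden layer of 8 units.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # step size for each weight update

for epoch in range(5000):
    # Visit one labeled example at a time (the "stochastic" part of SGD).
    for i in rng.permutation(len(X)):
        x_i, y_i = X[i:i+1], y[i:i+1]

        # Forward pass: non-linear transforms, layer by layer.
        h = sigmoid(x_i @ W1 + b1)
        out = sigmoid(h @ W2 + b2)

        # Error (squared loss) gradients via backpropagation.
        d_out = (out - y_i) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)

        # Tune the weights a small step against the gradient.
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * x_i.T @ d_h;  b1 -= lr * d_h.sum(axis=0)

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))  # approx. [0, 1, 1, 0]
```

Swapping the hand-written gradient step for a framework's built-in SGD optimizer changes none of the ideas; the loop is shown explicitly only so the "calculate error, then tune weights" cycle is visible.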
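
The layer-by-layer abstraction described above, from edges to motifs to whole objects, is commonly realized as a stack of convolutional layers. The sketch below assumes the PyTorch library; the 3x32x32 input size, channel counts, and ten-class output are illustrative assumptions rather than anything specified here.

```python
# A stack of layers that builds increasingly abstract representations of an image.
# Sizes and class count are illustrative assumptions.
import torch
from torch import nn

model = nn.Sequential(
    # Early convolutional layers tend to respond to edges and simple textures.
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # Middle layers combine edges into motifs and parts of shapes.
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # Deeper layers respond to whole shapes and object parts.
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # A final fully connected layer maps those abstractions to object classes.
    nn.Flatten(), nn.Linear(64 * 4 * 4, 10),
)

# The same code runs on a GPU when one is available, which is what makes the
# massive parallel arithmetic of deep networks practical.
device = "cuda" if torch.cuda.is_available() else "cpu"
scores = model.to(device)(torch.randn(1, 3, 32, 32, device=device))
print(scores.shape)  # torch.Size([1, 10])
```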