Artificial Intelligence (AI)

AI is a broad label for any machine that can think like a human. In the most generalized sense, it is something that could answer any question we might have and do anything a human can do. That is a rigid way to think about AI, and it is not very realistic. More usefully, a machine is said to have artificial intelligence if it can interpret data, potentially learn from that data, and use that knowledge to adapt and achieve specific goals. AI today typically cannot do most of the things humans can do, but even with its limitations, it plays a huge role in our everyday lives.

Uses of AI – some are obvious, but there are many less obvious examples (e.g. online shopping, ad selection, online bank transactions, job applications).

The way AI and automation are changing everything from commerce to jobs is analogous to the Industrial Revolution in the 18th century. It is a global change.

John McCarthy – computer scientist who coined the term "artificial intelligence" in 1956, using it to name the Dartmouth Summer Research Project on Artificial Intelligence, also known as the Dartmouth Conference.

Marvin Minsky – cognitive scientist and founder of the AI Lab at MIT, who took part in the Dartmouth Conference. Some of his predictions about AI turned out to be wrong: he claimed in 1970 that within three to eight years we would have a machine with the general intelligence of an average human being.

The scientists at Dartmouth underestimated how much data and computing power an AI would need to solve complex, real-world problems. Most kinds of AI do not have senses, a body, or a brain that can automatically judge many different things the way a human does. Modern AI systems are just programs running on machines. We need to give an AI a lot of data, and we need to label that data with the information it is trying to learn. Then it needs a powerful computer to make sense of all that data.

AI Winter – a period when progress stalled and interest and funding in the field froze; the field stayed largely frozen until around the mid-2000s.
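To make "labeled data" concrete, here is a minimal sketch in Python. The dataset, the (height, weight) features, and the nearest-neighbor rule are all hypothetical illustrations, not any specific real system: the point is only that each piece of raw data is paired with a label telling the learner what that data represents.

```python
# Each example pairs raw data (height_cm, weight_kg) with a label.
# All values here are made up for illustration.
labeled_data = [
    ((160, 55), "child"),
    ((150, 45), "child"),
    ((180, 80), "adult"),
    ((175, 75), "adult"),
]

def predict(features):
    """Classify new data by the label of the nearest labeled example."""
    def distance(example):
        f, _label = example
        return sum((a - b) ** 2 for a, b in zip(f, features))
    _, label = min(labeled_data, key=distance)
    return label

print(predict((178, 78)))  # → adult
print(predict((152, 48)))  # → child
```

With only four labeled examples this is a toy, but it shows why the notes stress volume: the more labeled examples a learner sees, the better its judgments on new data tend to be.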
Still, there were many changes in the last half-century that led us to the AI revolution.

Two Big Developments in Computing:

1. First development: a huge increase in computing power and in how fast computers could process data. The IBM 7090 was the most advanced computer in 1956. It filled a room, stored data on giant magnetic tapes, and took instructions via punch cards. It could perform about 200,000 operations per second. The speed of a computer is linked to the number of transistors it has to perform those operations. Moore's Law – every two years, the number of transistors that fit in the same amount of space roughly doubles.
2005 – Computers started to have enough computing power to mimic certain brain functions within AI; the AI winter began showing signs of thawing.
2007 – The first iPhone was released (about 400 million operations per second).
2017 – iPhone X (about 600 billion operations per second).
Modern supercomputer – around 30 quadrillion operations per second.

2. Second development: the Internet and social media. In the past 20 years, the world has become much more interconnected. Basically, anything that generates data is participating in the modern world.

Machine Learning

Machine learning brings the promise of deriving meaning from all of that data. Machine learning is a set of tools and techniques we can use to answer questions with our data. There is an enormous amount of data in the world today, generated by people, computers, phones, and other devices, and it will only continue to grow in the coming years. Traditionally, humans have analyzed data and adapted systems to changes in data patterns. However, as the volume of data surpasses our ability to make sense of it and manually write those rules, we will turn increasingly to automated systems that can learn from the data and, importantly, from changes in the data, adapting to a shifting landscape.
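The Moore's Law figures above can be turned into simple arithmetic. As a simplification, this sketch assumes operations per second double every two years along with transistor counts (real hardware does not track the law that cleanly); the only input figure taken from the notes is the IBM 7090's ~200,000 ops/sec in 1956.

```python
def moores_law(start_ops, start_year, end_year):
    """Project ops/sec assuming a doubling every two years."""
    doublings = (end_year - start_year) // 2
    return start_ops * 2 ** doublings

# IBM 7090 in 1956: ~200,000 ops/sec (figure from the notes above).
print(f"{moores_law(200_000, 1956, 2017):.1e}")  # → 2.1e+14
```

Thirty doublings over 61 years multiply the starting speed by about a billion, which is why the gap between the room-sized IBM 7090 and a modern supercomputer is plausible under a simple doubling rule.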
Immediate Applications of Machine Learning: image recognition, fraud detection, recommendation systems, text and speech systems, etc.

The day will soon come when our technology will be expected to be personalized, insightful, and self-correcting.

Basically, machine learning is defined as using data to answer questions. Training – using data to inform the creation and fine-tuning of a predictive model, which can then serve up predictions on previously unseen data. Prediction – using that model to make inferences. Everything hinges on DATA. Data is the key to unlocking machine learning, just as much as machine learning is the key to unlocking the hidden insight in data.
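The training/prediction split described above can be sketched in a few lines. The model here, a least-squares line fit, is a hypothetical stand-in for whatever predictive model a real system would use; the shape of the workflow is what matters: train on known data, then predict on unseen data.

```python
def train(points):
    """Training: fit y = a*x + b to the data by least squares."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def predict(model, x):
    """Prediction: infer an answer for previously unseen data."""
    a, b = model
    return a * x + b

model = train([(1, 2), (2, 4), (3, 6)])  # learn from known data
print(predict(model, 10))                # → 20.0 (new, unseen input)
```

Note that the quality of `model` depends entirely on the data passed to `train`, which is the sense in which everything hinges on data.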