Deep learning, big data and artificial general intelligence: 2011–present

In the first decades of the 21st century, access to large amounts of data (known as "big data"), cheaper and faster computers, and advanced machine learning techniques were successfully applied to many problems throughout the economy. The McKinsey Global Institute estimated in its widely cited report "Big data: The next frontier for innovation, competition, and productivity" that "by 2009, nearly all sectors in the US economy had at least an average of 200 terabytes of stored data".


By 2016, the market for AI-related products, hardware, and software had reached more than $8 billion, and The New York Times reported that interest in AI had reached a "frenzy".


Applications of big data also began to reach into other fields, such as training models in ecology and various applications in economics.


Advances in deep learning (particularly deep convolutional neural networks and recurrent neural networks) drove progress and research in image and video processing, text analysis, and speech recognition.


Deep learning
Main article: Deep learning

Deep learning is a branch of machine learning that models high-level abstractions in data by using a deep graph with many processing layers.
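To make "many processing layers" concrete, here is a minimal sketch in Python/NumPy of the forward pass of a deep feedforward network. The layer sizes and random weights are illustrative assumptions, not taken from any particular system.

import numpy as np

def relu(x):
    # Rectified linear unit, a common per-layer nonlinearity
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# A "deep graph" here is simply a stack of layers: each applies a linear
# map followed by a nonlinearity, so later layers can model higher-level
# abstractions of the input. Sizes below are illustrative assumptions.
layer_sizes = [784, 256, 128, 64, 10]          # e.g. image pixels -> class scores
weights = [rng.normal(0.0, 0.1, (m, n))
           for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    # Pass the input through every layer in turn
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
    return x @ weights[-1] + biases[-1]        # final layer: raw scores

x = rng.normal(size=784)                       # a dummy input vector
print(forward(x).shape)                        # -> (10,)

In practice the weights would be learned by gradient descent rather than left random; the sketch only shows how the layered structure composes.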


According to the universal approximation theorem, depth is not necessary for a neural network to approximate arbitrary continuous functions; a single hidden layer suffices, given enough units. Even so, there are many problems common to shallow networks (such as overfitting) that deep architectures help avoid.
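The theorem can be illustrated numerically. The sketch below is an assumed toy setup (random tanh features with only the output weights fit by least squares) showing a single hidden layer approximating a continuous function; it is a demonstration of the idea, not a proof.

import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-np.pi, np.pi, 400)[:, None]
y = np.sin(3.0 * x).ravel()                 # a continuous target function

# One hidden layer of random tanh units; only the output weights are
# fit (by least squares). With enough hidden units, such a shallow
# network can approximate any continuous function on a compact set.
hidden = 200
W = rng.normal(0.0, 2.0, (1, hidden))       # input-to-hidden weights (random)
b = rng.normal(0.0, 2.0, hidden)            # hidden biases (random)
H = np.tanh(x @ W + b)                      # hidden activations, (400, hidden)

w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
print(f"max approximation error: {np.max(np.abs(H @ w_out - y)):.4f}")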


Deep neural networks can therefore represent far more complex models in practice than their shallow counterparts.


However, deep learning has problems of its own. A common problem for recurrent neural networks is the vanishing gradient problem: as gradients are propagated backward through the layers, they shrink exponentially and effectively vanish. Many methods have been developed to address this, such as long short-term memory (LSTM) units.
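A quick numeric sketch of why gradients vanish: backpropagating through a recurrent network multiplies the gradient by roughly (activation derivative × recurrent weight) once per time step, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically. The weight value below is an illustrative assumption.

import numpy as np

def dsigmoid(h):
    # Derivative of the logistic sigmoid, bounded above by 0.25
    s = 1.0 / (1.0 + np.exp(-h))
    return s * (1.0 - s)

rng = np.random.default_rng(2)
grad, w = 1.0, 0.9                          # w: assumed recurrent weight
for step in range(1, 51):
    grad *= dsigmoid(rng.normal()) * w      # chain rule, one step back
    if step % 10 == 0:
        print(f"after {step:2d} steps back: gradient ~ {grad:.3e}")

LSTM units counter this by carrying state through an additive path whose gradient stays close to 1, instead of repeatedly multiplying by small factors.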


State-of-the-art deep neural network architectures can sometimes rival human accuracy in fields like computer vision, on benchmarks such as the MNIST handwritten-digit database and traffic sign recognition.


Language-processing engines backed by powerful search (such as IBM's Watson) can beat humans at answering general trivia questions, and recent developments in deep learning have produced striking results in competitions against humans in games such as Go and Doom (which, being a first-person shooter, has sparked some controversy).


Big data
Main article: Big data

Big data refers to collections of data that cannot be captured, managed, and processed by conventional software tools within an acceptable time frame. It is a massive, diverse information asset that requires new processing models to deliver stronger decision-making power, insight discovery, and process optimization. In Big Data: A Revolution That Will Transform How We Live, Work, and Think, Viktor Mayer-Schönberger and Kenneth Cukier argue that big data means analyzing all of the data rather than relying on random sampling (sample surveys). The 5V characteristics of big data (proposed by IBM) are Volume, Velocity, Variety, Value, and Veracity.


The strategic significance of big data technology lies not in holding huge amounts of information, but in extracting the meaning from that data. In other words, if big data is likened to an industry, the key to profitability is increasing the "processing capability" of the data and realizing its "value added" through processing.


Artificial general intelligence
Main article: Artificial general intelligence

Artificial general intelligence (AGI) refers to the ability of a machine to perform "general intelligent action". Academic sources reserve the term "strong AI" for machines capable of experiencing consciousness.



