
Does artificial intelligence need computing power?

Publish: 2021-04-21 20:25:34
1.

According to reports, the 2017 Intel China Industry Summit was held in Suzhou yesterday. At the meeting, Dr. Jerry Kaplan, an internationally renowned AI expert and technology-innovation entrepreneur, delivered a speech on the development of artificial intelligence and how it can drive industrial change

The rise of machine learning is inseparable from the rapid growth of computing power. Over the past 30 years, computer speed has increased a million-fold: if computers 30 years ago moved at the speed of a snail, today's move at the speed of a rocket

As computers get ever faster and data volumes grow ever larger, machine learning becomes a better and better match, especially as we enter the 5G era, which further promotes the interaction between deduction and reasoning, and between perception and the real world. In the future we may be able to build flexible robots with strong perceptual abilities

I hope artificial intelligence technology can achieve greater development

2.

The types of chips that provide computing power for AI include GPUs, FPGAs, and ASICs

A GPU is a microprocessor specialized in image operations on personal computers, workstations, game consoles, and some mobile devices (such as tablets and smartphones). It is similar to a CPU, except that the GPU is designed to perform the complex mathematical and geometric calculations necessary for graphics rendering

An FPGA can implement the function of any digital device; even a high-performance CPU can be implemented on an FPGA. In 2015, Intel acquired the FPGA leader Altera for about US$16.7 billion, one purpose being to develop the FPGA's special computing strengths in the field of artificial intelligence

ASIC refers to integrated circuits designed and manufactured to the requirements of specific users or specific electronic systems. Strictly speaking, an ASIC is a special-purpose chip, different from a traditional general-purpose chip: it is designed for one specific need. The TPU that Google recently unveiled for AI deep-learning computation is also an ASIC

Extended information:

Chips, also called integrated circuits, can be divided into many kinds by function: some control power-supply voltage output, some process audio and video, and some handle complex computation. Algorithms can only run with the help of chips, and because each chip delivers different computing power in different scenarios, an algorithm's processing speed and energy consumption differ accordingly. With the artificial intelligence market developing rapidly, people are looking for chips that run deep-learning algorithms faster and at lower energy cost

3. The view of the 10th Power computing-power leasing platform: whether to rent computing power depends on the enterprise's situation. Large enterprises with strong resources can usually purchase hardware and software at scale and build their own computing centers. However, many small and medium-sized enterprises still face "insufficient computing power, high cost, and difficult access". In addition, some enterprises' demand for computing power is elastic: building their own computing center would cost a great deal and bring problems such as poor scalability and low efficiency. For this reason, many enterprises prefer to "rent computing power".
4. AI has a great deal to do with computing power. The driving forces of artificial intelligence are algorithms, data, and computing power; all three are indispensable, and together they are the necessary conditions for AI's achievements.
On the computing-power side, once we have data we must train, and train repeatedly: running through the training set just once is not enough, much as a child who hears a lesson only once is unlikely to learn it, unless they are a prodigy who never forgets. Beyond training, AI also needs to run inference on hardware, and all of this requires computing support.
So artificial intelligence must have computing power, and as systems become more and more intelligent, more and stronger computing power is needed.
5.

The principle of artificial intelligence can be simply described as:

Artificial Intelligence = mathematical calculation

The intelligence of the machine comes from "algorithms". Early on, it was found that 1 and 0 could be represented by a circuit being on or off. Organize many circuits together, arrange them differently, and they can express many things: colors, shapes, letters. Logic elements (transistors) then form the chain "input (press the switch) - computation (current flows through the circuit) - output (the light turns on)"
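The on/off encoding described above is exactly how letters are stored today. A minimal illustration (the choice of ASCII and of the letter "A" is ours, not the article's):

```python
# Bits on wires can encode letters: ASCII maps each character to a
# fixed pattern of 1s and 0s, just like the on/off circuits above.
letter = "A"
bits = format(ord(letter), "08b")  # the letter as eight on/off states
print(bits)                        # → 01000001
print(chr(int(bits, 2)))           # → A
```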

But Go cannot be exhausted this way. However powerful the machine, there are limits: the number of possible Go positions far exceeds the total number of atoms in the (known) universe. Even today's most powerful supercomputers would need tens of thousands of years; until quantum computers mature, it is practically out of reach for electronic computers. So the programmers added an extra layer of algorithm to AlphaGo:

A. First decide where to calculate: which positions deserve computation and which can be ignored

B. Then calculate, with that focus

-- In essence, it is still computation. There is no "perception"
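The scale claim above can be checked with a quick back-of-the-envelope computation (using 3^361 as a simple upper bound on board configurations, and a commonly cited estimate for the atoms):

```python
# Each of Go's 361 points can be black, white, or empty, giving
# 3**361 board configurations (an upper bound on legal positions).
positions = 3 ** 361
atoms = 10 ** 80  # common estimate of atoms in the observable universe
print(len(str(positions)))     # → 173 (i.e. positions ≈ 10**172)
print(positions > atoms ** 2)  # → True: beyond even atoms squared
```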

In step A, how does it judge "where to calculate"?

This is the core problem of artificial intelligence: the process of "learning"

Think about it carefully: how do humans learn?

All human cognition comes from summarizing observed phenomena and predicting the future according to the patterns found

When you see a four-legged, short-haired, medium-sized, long-muzzled animal that barks and is called a dog, you will classify every similar animal you see in the future as a dog. However, the way machines learn is qualitatively different from the way humans do:

By observing just a few features, humans can infer most of the unknown: shown one corner, they work out the other three

A machine must observe many dogs before it can tell whether the dog running past is a dog

Can such a slow learner be expected to rule mankind?

It simply relies on computing power! Brute force is the key to success

Specifically, its "learning" algorithm is called a "neural network" (which sounds more impressive). It needs two preconditions:

1. Feed it a large amount of data for trial and error, gradually adjusting its accuracy;

2. The more layers the neural network has, the more accurate the computation (up to a limit), and the more computing power it demands
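The "trial and error, gradually adjust" loop in precondition 1 can be sketched with the simplest possible learner. This toy perceptron (our own hypothetical example, learning the AND function) is not the article's code, but it shows the error-correction idea in miniature:

```python
# Toy perceptron: feed it data, let it make errors, nudge the weights.
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights for a 2-input threshold unit by error correction."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # trial and error
            w[0] += lr * err * x1        # adjust toward the right answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Teach it the AND function purely from examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
print(preds)  # → [0, 0, 0, 1]
```

A single unit like this is exactly the "perceptron" mentioned below; stacking many of them in layers gives the modern neural network.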

Neural networks have therefore been around for many years (back then they were called "perceptrons"), but limited by data volume and computing power, they did not take off

"Neural network" certainly sounds far more high-end than "perceptron"! This tells us once again how important a good name is

Now both conditions have been met: big data and cloud computing. Whoever has the data can do AI. At present, the common application fields of AI are as follows:

Image recognition (security identification, fingerprints, beautification, image search, medical imaging diagnosis) uses convolutional neural networks (CNNs), which mainly extract features along the spatial dimension to identify images

Natural language processing (human-computer dialogue, translation) uses recurrent neural networks (RNNs), which mainly extract features along the time dimension, because words come in sequence: when a word appears determines its meaning
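The spatial feature extraction a CNN performs boils down to convolution: sliding a small filter over the input and recording how strongly each neighborhood matches it. A minimal 1-D sketch (our own illustration, not the article's code; real CNNs use 2-D filters plus nonlinearities and pooling):

```python
# Minimal 1-D convolution: the core operation a CNN uses to extract
# local spatial features by sliding a small filter over the input.
def conv1d(signal, kernel):
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# An edge-detecting filter [-1, 1] responds where the signal jumps.
print(conv1d([0, 0, 1, 1, 0], [-1, 1]))  # → [0, 1, 0, -1]
```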

The design of the neural-network algorithm determines how well it can model reality. Andrew Ng (Wu Enda), one of the top figures in the field, once designed convolutional networks up to 100 layers deep (too many layers easily cause overfitting)

Once we deeply understand what computation means (following explicit mathematical rules), a consequence follows:

The world has quantum (random) characteristics, which sets theoretical limits on computers. In fact, a computer cannot even generate a truly random number

-- Machines are still clumsy
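The "no true randomness" point is easy to demonstrate: a computer's "random" numbers come from a deterministic pseudo-random generator, so the same seed reproduces the same sequence exactly (shown here with Python's standard `random` module; the seed value 42 is arbitrary):

```python
import random

# Pseudo-random generators are deterministic: reseeding with the
# same value replays the exact same "random" sequence.
random.seed(42)
a = [random.randint(0, 9) for _ in range(5)]
random.seed(42)
b = [random.randint(0, 9) for _ in range(5)]
print(a == b)  # → True: identical sequences, so not truly random
```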

For more in-depth AI knowledge, feel free to ask via private message

6. Artificial intelligence (a branch of Computer Science)

Artificial intelligence, abbreviated AI, is a new technical science that researches and develops theories, methods, technologies, and application systems for simulating, extending, and expanding human intelligence. It is a branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can respond in ways similar to human intelligence. Research in this field includes robotics, speech recognition, image recognition, natural language processing, and expert systems. Since the birth of artificial intelligence, its theory and technology have matured steadily and its application fields keep expanding, but there is still no unified definition.
Artificial intelligence simulates the information processes of human consciousness and thinking. It is not human intelligence, but it can think like humans and may eventually surpass human intelligence. However, this kind of advanced AI that can think for itself still requires breakthroughs in scientific theory and engineering.
Artificial intelligence is a very challenging science; those who work on it must understand computer science, psychology, and philosophy. It is also a very broad science, composed of different fields such as machine learning and computer vision. Generally speaking, one main goal of AI research is to enable machines to handle complex tasks that usually require human intelligence, though different eras and different people understand such "complex work" differently.
The definition of artificial intelligence can be divided into two parts: "artificial" and "intelligence". "Artificial" is easy to understand and not very controversial. Sometimes we must consider what humans can produce, or whether human intelligence is high enough to create artificial intelligence, and so on. But generally speaking, an "artificial system" is an artificial system in the ordinary sense.
There are many more questions about "intelligence", which touches on consciousness, the self, the mind (including the unconscious mind), and so on. It is generally accepted that the only intelligence people understand is their own, yet our understanding of our own intelligence is very limited, as is our understanding of the necessary elements of human intelligence, so it is hard to define what "artificially" manufactured "intelligence" would be. The study of artificial intelligence therefore often involves the study of human intelligence itself, and intelligence in animals or other artificial systems is also generally regarded as a related research topic.
Artificial intelligence has received more and more attention in the computer field and has been applied in robotics, economic and political decision-making, control systems, and simulation systems.
7. Although people both inside and outside the industry have felt the wave of artificial intelligence in recent years, AI technology did not appear only recently. Since the 1950s and 1960s, AI algorithms and techniques have had periods of popularity; over time they have continued to evolve, going through cycles of boom and decline.
In recent years, everyone has felt AI's heat and sustained development. We believe this round of rapid AI development benefits from years of rapid progress in IT, which supplies artificial intelligence with the computing power and data that support its algorithms.
In recent years, enterprises' research and development of AI technology and their various AI applications have kept landing in practice, directly driving the rapid development of the AI industry as a whole. The core AI industry is now close to 100 billion yuan in overall scale, already a sizeable industry. Judging from the trend, the overall market is expected to reach 160 billion yuan this year, so growth remains very fast.
What are the advantages of deep learning?
To recognize a pattern, the usual approach is to extract its features in some way. The feature-extraction method is sometimes designed or specified by hand, and sometimes summarized by the computer itself given relatively more data. Deep learning proposes letting the computer learn the pattern features automatically, integrating feature learning into the modeling process and thereby reducing the incompleteness caused by hand-designed features. At present, some machine-learning applications with deep learning at their core have achieved recognition or classification performance beyond existing algorithms in application scenarios that meet specific conditions.
If you are interested in artificial intelligence and deep learning, you can look at the AI deep-learning courses jointly organized by China Public Education and the Chinese Academy of Sciences, taught in person by CAS experts.
8.

There is not much difference between artificial intelligence and traditional programming; the main difference is that AI needs a large amount of data and computing power to fit its models

AI = big data + algorithms (deep learning, rule-based, knowledge-based, statistics-based, etc., mostly recursive or loop-heavy structures) + computing power (very high; the more computing power, the better intelligent algorithms work)

Traditional software = data structures (little data compared with AI) + algorithms (not very complex for the machine, with less recursion) + computing power (not much needed)

3D simulation software = data structures (a medium amount of data compared with common application software) + algorithms (similar to AI algorithms but mostly without heavy recursion or matrix operations) + medium computing power (what 3D simulation needs is not low: lower than AI algorithms but higher than ordinary application software, though some special applications may demand even more than 3D software)

By now you can see that an AI program is not so different from ordinary software! The difference lies in the algorithms. Traditional programming relies mostly on logical operations, while AI algorithms include logical operations plus more complex modeling and fitting. Master linear algebra thoroughly, and AI algorithms are not out of reach

9. With the development and integration of artificial intelligence, big data, and computing power, the three have merged into an organic, intelligent whole. Their connotations and extensions have diversified, applications in the various sub-fields are rich and overlapping, each contains the others, and the differences and boundaries between AI, big data, and computing power grow ever more blurred.
At this stage, applications of AI and big data have penetrated fields such as industry, agriculture, medicine, national defense, economy, and education, and the commercial and social value generated is almost unlimited. With the development of AI and the Internet of Things, cloud computing is no longer limited to storage and computation and has become an important driving force for the development and transformation of many industries. You can learn more about AI, big data, and computing power on the 10th Power computing-power platform.
10. The answer should be ABD.
C is different from the others: it belongs to the industry, not the technology.