The difference between AI computing power and ARM computing power

Cristiano Amon, President of Qualcomm, at the Snapdragon Ride launch, presenting the combination of SoCs, accelerators and the autonomous driving software stack that makes up Snapdragon Ride (Image source: CNET/James Martin)
Snapdragon Ride provides automakers with a scalable solution that supports autonomous vehicles across three segments:
1. L1/L2 active safety ADAS: for vehicles with automatic emergency braking, traffic sign recognition and lane keeping assistance
2. L2+ ADAS: for vehicles capable of highway autopilot, self-parking, and driving in stop-and-go urban traffic
3. L4/L5 fully autonomous driving: for autonomous driving in urban environments, driverless taxis and robotic logistics
The Snapdragon Ride platform is built on a family of different Snapdragon SoCs and accelerators. It uses scalable, modular high-performance heterogeneous multi-core CPUs, energy-efficient AI and computer vision engines, and GPUs.
The ADAS SoC family and the accelerator family both use heterogeneous computing, and with Qualcomm's new-generation AI Engine, the ADAS SoCs can efficiently manage the large volumes of data produced by in-vehicle systems.
Thanks to the combination of these SoCs and accelerators, the Snapdragon Ride platform can be configured for the needs of different segments of the autonomous driving market while maintaining good thermal efficiency, from 30 TOPS devices for L1/L2 applications up to devices delivering more than 700 TOPS at 130 W for L4/L5 driving.
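For a rough sense of scale implied by these figures alone, the top-end configuration works out to about 700 TOPS / 130 W ≈ 5.4 TOPS per watt, and its raw throughput is roughly 23 times that of the 30 TOPS entry-level tier (a back-of-the-envelope ratio derived only from the numbers quoted above, not from any additional Qualcomm specification).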
In addition, the new Snapdragon Ride autonomous driving software stack is a modular and scalable solution integrated into the Snapdragon Ride platform.
According to Qualcomm, the platform's software framework can host customer-specific software stack components alongside Snapdragon Ride autonomous driving stack components.
The Snapdragon Ride platform also supports passive or air-cooled thermal designs, which further simplifies vehicle design and improves reliability while reducing cost.
Arm, BlackBerry QNX, Infineon, Synopsys, Elektrobit and ON Semiconductor have now joined Qualcomm's autonomous driving ecosystem as software and hardware suppliers for the Snapdragon Ride platform.
Arm's functional safety solutions, Synopsys' automotive DesignWare interface IP, ARC processor IP and STAR Memory System, BlackBerry QNX's OS for Safety and Hypervisor for Safety, Infineon's AURIX microcontrollers, and ON Semiconductor's ADAS sensors will all be integrated into Qualcomm's autonomous driving platform.
Elektrobit also plans to work with Qualcomm to jointly develop a new-generation AUTOSAR architecture suitable for mass production, on which EB corbos software and the Snapdragon Ride platform will be integrated.
Snapdragon Ride is expected to be delivered to automakers and Tier 1 suppliers for early development in the first half of 2020, and Qualcomm Technologies estimates that vehicles equipped with it will enter production in 2023.
Second, Qualcomm has been deeply engaged in the automotive business for many years. Even before the release of Snapdragon Ride, it had long been working in the intelligent vehicle field: for more than a decade, Qualcomm Technologies, a subsidiary of Qualcomm, has provided advanced wireless communication solutions for GM's connected vehicle applications, including the safety applications supported by GM's connected in-vehicle devices.
In telematics, infotainment and in-car connectivity, Qualcomm Technologies' total order value has exceeded 7 billion US dollars (about 48.7 billion yuan).
According to figures released at Qualcomm's CES 2020 press conference, more than one million vehicles to date use Qualcomm's automotive solutions.
Clearly, Qualcomm's push into the automotive field has taken another step forward.
During CES 2020, in addition to Snapdragon Ride, Qualcomm also launched a new Car-to-Cloud service, which is expected to be available in the second half of 2020.
According to Qualcomm, the Qualcomm Technologies Car-to-Cloud service supports Soft SKU chip specifications with soft upgrade capability, which not only helps automotive customers keep pace with changing consumer needs, but also lets chipsets be upgraded in the field to support new functions as new performance requirements or features emerge.
Soft SKU also lets customers build on common hardware, saving dedicated investment across different development projects. With Qualcomm's cloud-based Soft SKU, automakers can offer consumers a variety of customized services and create rich, immersive in-car experiences through personalized features.
In addition, Qualcomm's Car-to-Cloud service supports global cellular connectivity, which can be used for initial provisioning and to provide a wireless communication connection throughout the vehicle's life cycle.
According to Nakul Duggal, senior vice president of product management at Qualcomm Technologies, combined with the Snapdragon 4G and 5G platforms and the Snapdragon digital cockpit platform, Qualcomm's Car-to-Cloud service can help automakers and Tier 1 suppliers meet the new expectations of today's car owners, including flexible, continuous technology upgrades and access to new functions throughout the vehicle's life cycle. Qualcomm Technologies also announced at CES 2020 that it will continue to deepen its cooperation with General Motors: as a long-term partner, GM will work with Qualcomm Technologies on the digital cockpit, telematics and ADAS (advanced driver assistance systems).
Conclusion: the giants are storming into autonomous driving. Huawei has previously said it wants to build intelligent core sensors for vehicles, such as lidar and millimeter-wave radar; Arm then led the establishment of the Autonomous Vehicle Computing Consortium; and now mobile chip giant Qualcomm has released a new autonomous driving platform, taking another step forward in the automotive and autonomous driving field.
On the one hand, more players entering the game is conducive to getting self-driving cars on the road faster and better; on the other hand, as more hard-core players expand their business boundaries, competition in the market will only become more intense.
This article comes from a Car Home contributor and does not represent Car Home's position.
Single-CPU development can no longer meet the needs of practical applications, and the AI era must rely on parallel computing. At present, the mainstream parallel computing architecture is the heterogeneous parallel computing platform. If you need computing power as a service, you can turn to the "Tenth Power" platform.

At present, blockchain training courses on the market vary widely in quality, and both the course content and the teaching formats differ greatly.

3. Bitcoin
How Bitcoin works, the Bitcoin system architecture, cryptographic algorithms (Go implementation), consensus algorithms (Go implementation), Bitcoin transaction principles and transaction scripts, Bitcoin RPC programming (Node.js implementation), and Bitcoin source code analysis; a small hashing sketch appears after this list.
4. Blockchain 2.0: Ethereum
How Ethereum works and its infrastructure, basic Ethereum concepts (accounts, transactions, gas), the Mist and MetaMask wallets, Ethereum transactions, ERC-20 standard token development and deployment, the Remix IDE, smart contracts and Solidity, Solidity deployment, backup and invocation, framework technologies (Truffle and Web3), DApp development practice, and Geth; a toy token-ledger sketch also follows the list below.
5. Blockchain 3.0: Hyperledger Fabric
Introduction to the Hyperledger project, Fabric deployment and use, Fabric configuration management, Fabric architecture design, Fabric CA application and configuration, and application development practice.
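As a concrete illustration of the "cryptographic algorithm" topic above, the sketch below shows Bitcoin's double SHA-256 hash and a toy proof-of-work search. Python is used here only for brevity (the course itself lists Go and Node.js), and the header bytes and two-zero-byte difficulty target are invented for the example.

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes block headers and transactions with SHA-256 applied twice.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Toy proof-of-work: find a nonce whose double hash starts with two zero bytes.
header = b"toy-block-header"          # hypothetical header bytes, not a real block
nonce = 0
while True:
    digest = double_sha256(header + nonce.to_bytes(4, "little"))
    if digest[:2] == b"\x00\x00":     # vastly easier than Bitcoin's real difficulty
        break
    nonce += 1
print(nonce, digest.hex())
```

Similarly, for the ERC-20 topic, the following toy in-memory ledger mirrors the balanceOf/transfer semantics that a real Solidity token contract implements; it illustrates the standard's accounting model only and is not deployable contract code.

```python
class ToyToken:
    """Minimal in-memory sketch of ERC-20-style balances and transfers."""
    def __init__(self, supply: int, owner: str):
        self.balances = {owner: supply}          # address -> integer balance

    def balance_of(self, addr: str) -> int:
        return self.balances.get(addr, 0)

    def transfer(self, sender: str, to: str, amount: int) -> bool:
        if self.balance_of(sender) < amount:     # ERC-20 transfers fail on insufficient funds
            return False
        self.balances[sender] -= amount
        self.balances[to] = self.balance_of(to) + amount
        return True

token = ToyToken(1_000_000, "alice")
token.transfer("alice", "bob", 250)
print(token.balance_of("alice"), token.balance_of("bob"))   # 999750 250
```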
The Xueshuo Innovation Blockchain Technology Workstation of Lianqiao Education Online is the only approved "Blockchain Technology" pilot workstation under the "Smart Learning Workshop 2020 Xueshuo Innovation Workstation" program launched by the Center for School Planning, Construction and Development of China's Ministry of Education. On the basis of providing diversified growth paths for students, the workstation promotes reform of a training model that combines professional degree study with industry, teaching and research, and builds an applied, interdisciplinary talent training system.
In terms of server hardware architecture, an AI server is a heterogeneous server. Different combinations can be adopted depending on the application, such as CPU + GPU, CPU + TPU, or CPU + other accelerators. Compared with an ordinary server, there is no fundamental difference in memory, storage or networking; the main difference is that big data, cloud computing and artificial intelligence workloads need much larger memory and storage capacity to handle the collection and processing of all kinds of data.
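As a minimal sketch of what "CPU + GPU heterogeneous" means in practice for an AI workload (assuming PyTorch as the framework, which this article does not mention): the host CPU orchestrates the program while tensors and model weights are placed on the accelerator when one is available.

```python
import torch

# Choose an accelerator if one is present, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 256).to(device)   # weights live on the chosen device
x = torch.randn(64, 1024, device=device)        # input batch allocated on the same device
y = model(x)                                    # the matrix multiply runs on the GPU when available
print(y.device, y.shape)
```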
As is well known, an ordinary server provides computing power through its CPUs, which use a largely serial architecture and are good at logic and floating-point operations. Because logic-heavy code requires a great deal of branching and jumping, CPU structure is complex, and increases in computing power mainly depend on adding more cores.
However, with big data, cloud computing, artificial intelligence and the Internet of Things, the amount of data on the Internet is growing exponentially, which seriously challenges services that rely on the CPU as the main source of computing power. CPU process technology and the core count of a single CPU are approaching their limits, yet the data keeps growing, so server data processing capability must be improved in other ways. It is in this environment that the AI server emerged.
At its core, artificial intelligence is not so different from traditional programming; the main difference is that it needs large amounts of data and computing power to fit a model.
AI = big data + algorithms (deep learning, rule-based, knowledge-based, statistical, etc., mostly iterative or recursive in structure) + computing power (with very high computing power, intelligent algorithms work better)
Traditional software = data structures (a small amount of data relative to AI) + algorithms (not very complex for the machine, with little recursion) + computing power (modest requirements)
3D simulation software = data structures (a medium amount of data relative to common application software) + algorithms (similar to AI algorithms, but mostly without recursion or matrix operations) + medium computing power (higher than ordinary application software but lower than AI workloads, although some specialized applications may demand even more than 3D software). A short sketch contrasting the first two follows this comparison.
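A minimal sketch of the contrast drawn above, assuming NumPy: the "traditional" version encodes the rule by hand, while the "AI-style" version recovers the same rule by fitting parameters to data. The Celsius-to-Fahrenheit rule and the noise level are invented purely for illustration.

```python
import numpy as np

# Traditional programming: the rule is written by hand.
def celsius_to_fahrenheit(c):
    return 1.8 * c + 32.0

# AI-style: the same rule is learned by fitting parameters to (noisy) data.
rng = np.random.default_rng(0)
c = np.linspace(-40, 100, 50)                               # training inputs
f = celsius_to_fahrenheit(c) + rng.normal(0, 0.5, c.shape)  # noisy observations
X = np.stack([c, np.ones_like(c)], axis=1)                  # design matrix [c, 1]
w, *_ = np.linalg.lstsq(X, f, rcond=None)                   # least-squares fit of slope and intercept
print("learned slope and intercept:", w)                    # close to 1.8 and 32.0
```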
By now it should be clear that an AI program is not that different from ordinary software; the difference lies in the algorithms. Traditional programming is based mostly on logic operations, while AI algorithms include logic operations plus more complex modeling and fitting. With a thorough grasp of linear algebra, AI algorithms are not out of reach.
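As one concrete piece of that linear algebra (a standard textbook result, not something specific to this article), the least-squares fit used in the sketch above has the closed-form solution

$$\hat{w} = (X^{\top}X)^{-1}X^{\top}y,$$

where X is the matrix of input samples and y is the vector of targets; gradient-based training of neural networks generalizes this idea of minimizing a fitting error to non-linear models.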
Nowadays, AI servers on the market generally adopt the CPU + GPU form, because the GPU, unlike the CPU, uses a parallel computing model and excels at dense data operations such as graphics rendering and machine learning. NVIDIA has a clear lead in GPUs: a single card can integrate thousands of cores, and a system with 16 NVIDIA Tesla V100 Tensor Core 32GB GPUs can exceed 10,240 Tensor Cores with roughly two quadrillion operations per second of deep-learning performance. Years of market development have confirmed that CPU + GPU heterogeneous servers have plenty of room to grow in the current environment.
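A small benchmark sketch of the parallelism argument above, again assuming PyTorch: a dense matrix multiply decomposes into a huge number of independent dot products, which is exactly the workload pattern GPUs accelerate. Absolute timings depend entirely on the hardware at hand.

```python
import time
import torch

def bench(device: str, n: int = 2048) -> float:
    # Multiply two n-by-n matrices and return the elapsed time in seconds.
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()      # finish allocation before timing
    t0 = time.perf_counter()
    _ = a @ b                         # thousands of independent dot products
    if device == "cuda":
        torch.cuda.synchronize()      # wait for the asynchronous GPU kernel
    return time.perf_counter() - t0

print("cpu :", bench("cpu"))
if torch.cuda.is_available():
    print("cuda:", bench("cuda"))     # typically far faster for dense workloads
```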
Undeniably, every industry has to weather storms on the way from its beginnings to maturity, and the competition that accompanies this development is what drives an industry's sustainable growth. The AI server can be regarded as a trend and a rising force, but it still has a long way to go. The above is the answer from "Tenth Power", an Inspur server distribution platform.
