Data center decentralization
The purpose of data centering and standardization is to eliminate differences between features: it puts features measured on different scales onto the same scale, so that each feature influences the parameters comparably. In short, when the features along different dimensions of the raw data have inconsistent scales (units), the data should be preprocessed with centering and standardization.
Extended information:
Because the independent variables in raw data often have different units, analysis becomes more difficult, and with a large amount of data the results may also be distorted by rounding error. Centering and standardizing the data help eliminate the influence of differing dimensions and orders of magnitude and avoid unnecessary errors.
In regression analysis it is usually necessary to center and standardize the raw data; centering and standardization yield data with a mean of 0 and a standard deviation of 1.
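As a rough illustration of the above, here is a minimal NumPy sketch (the two-feature matrix and its values are made up for the example) that centers and then standardizes the data so each column ends up with mean 0 and standard deviation 1:

```python
import numpy as np

# Toy feature matrix: two features on very different scales
# (say, an amount in yuan and an age in years -- invented values).
X = np.array([[50_000.0, 25.0],
              [80_000.0, 40.0],
              [65_000.0, 33.0],
              [90_000.0, 52.0]])

# Centering: subtract each column's mean, so every feature has mean 0.
X_centered = X - X.mean(axis=0)

# Standardization: also divide by each column's standard deviation,
# so every feature has mean 0 and standard deviation 1.
X_standardized = X_centered / X.std(axis=0)

print(X_standardized.mean(axis=0))  # approximately [0, 0]
print(X_standardized.std(axis=0))   # [1, 1]
```

Libraries such as scikit-learn offer the same transformation through StandardScaler; the point is simply that after this step the features are on a comparable scale.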
Because it was just a new thing, there was only a white paper with two options at that time, so when evangelists promoted it they usually started from IPFS and Protocol Labs. Sequoia Capital and other investment institutions constituted the best endorsement, and it became an ideal benchmark. Do you really need the awareness to keep pace with the times, rather than being educated by the market without any resistance? If you don't even know what IPFS is, you might as well forget it and go home to farm.
I will summarize its characteristics in three aspects:
1. Decentralization
It may be argued that the distributed storage of cloud service providers is also decentralized, since the data is stored in different data centers. But have you ever considered that the data still sits inside a single cloud provider's data centers, which is essentially centralized?
What decentralization emphasizes is that data can be stored in storage space anywhere on the network, all over the world, much like BT and eMule: every node is a data source, and anyone can rent out space.
2. Cost performance
The big data explosion is an indisputable fact, and mining value from data will inevitably carry a cost; the cost of this kind of storage is lower than the storage cost of traditional distributed storage.
3. Safety
Due to the small number of nodes, a data center may suffer high latency, network instability and other problems, which is why the major cloud service providers have to invest heavily in building data centers and optimizing their networks. In decentralized storage, by contrast, data is split into many pieces scattered across nodes, and only the person who holds the private key can reassemble the pieces and view the complete data (a minimal sketch of this chunk-and-reassemble idea follows below). With many nodes, recovery is fast even if something goes wrong.
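To make the sharding idea concrete, here is a deliberately simplified Python sketch of content-addressed chunking (an illustration only, not how IPFS or any particular network actually implements storage; the chunk size and function names are invented for the example). A blob is split into pieces, each piece is keyed by the hash of its content, and only someone holding the ordered hash list (plus the decryption key, if the pieces were encrypted first) can put the original data back together:

```python
import hashlib

CHUNK_SIZE = 256 * 1024  # 256 KiB per chunk -- an arbitrary size for this sketch


def split_and_address(data: bytes):
    """Split a blob into fixed-size chunks and key each chunk by the SHA-256
    hash of its content (content addressing). In a decentralized network the
    chunks could live on any node; the ordered hash list is enough to request
    the pieces back from whoever stores them."""
    order, chunks = [], {}
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        order.append(digest)
        chunks[digest] = chunk
    return order, chunks


def reassemble(order, chunks):
    """Rebuild the blob from its chunks, verifying every hash along the way."""
    out = bytearray()
    for digest in order:
        chunk = chunks[digest]
        if hashlib.sha256(chunk).hexdigest() != digest:
            raise ValueError("corrupted or tampered chunk")
        out += chunk
    return bytes(out)


blob = b"some file contents " * 100_000  # roughly 1.9 MB of toy data
order, chunks = split_and_address(blob)
assert reassemble(order, chunks) == blob
```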
Here are some suggestions for you: everyone's success depends on the circumstances of the moment. As the saying goes, "the times make the hero," and a popular line on the Internet puts it this way: "standing in the right wind, even a pig can fly!"
A note: in addition, make more use of the Internet, keep an eye on new developments in this area, and broaden your knowledge!
Web search results: 5G and distributed
1. Tencent
Tencent earlier opened a cloud data center in Silicon Valley to keep up with the growth of its cloud-services market share in the United States. Half of WeChat's 770 million daily users spend more than 90 minutes a day on the service.
2. Alibaba
Although Alibaba is the largest e-commerce company in the world, its fastest-growing business is cloud services. It operates cloud data centers around the world and has begun selling big data services to small and medium-sized enterprises. Alibaba Cloud entered Gartner's IaaS Magic Quadrant in June.
3. Baidu
Baidu focuses on artificial intelligence, with more than 1,300 people devoted to its development. Baidu has also announced a partnership with NVIDIA to strengthen its AI capabilities in the cloud and in autonomous driving through Volta GPUs.
4. IBM
Blockchain, cloud computing and artificial intelligence account for about 40% of IBM's revenue. By 2017 it had also landed major cloud-services deals with BMW and Bombardier.
5. Alphabet (Google's parent)
Artificial intelligence remains one of the key fields for Alphabet and its subsidiaries, even though Google Cloud Platform is not explicitly mentioned in the company summary. MIT reports that the DeepMind machine-learning algorithms used by Alphabet can cut data center cooling energy consumption by 40%.
Data standardization means subtracting the mean from each value and then dividing by the standard deviation; centering means subtracting a variable's mean from it.
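In symbols (a standard formulation, with x a variable, mu its mean and sigma its standard deviation):

```latex
% Centering: shift the variable so its mean becomes 0
x_{\text{centered}} = x - \mu

% Standardization (z-score): mean 0 and standard deviation 1
z = \frac{x - \mu}{\sigma},
\qquad \mu = \frac{1}{n}\sum_{i=1}^{n} x_i,
\qquad \sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (x_i - \mu)^2}
```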
