An analysis of the current status of several popular projects in this track shows a situation where the worldwide supply of computing power exceeds the demand from AI developers for training and inference tasks. This does not mean the demand does not exist. Sam Altman, co-founder and CEO of OpenAI, has proposed raising as much as $7 trillion to build advanced chip fabrication capacity roughly ten times the scale of TSMC, to be used for chip production and model training. Research from Stanford University likewise indicates that, regardless of the language model, once the training parameter scale exceeds a certain critical value, its performance (such as accuracy) improves dramatically. This is consistent with the "work miracles through great effort" rule of scaling, and since such miracles depend on massive, tightly coordinated training runs, it implies that the decentralization of computing power still has many challenges to solve in practice.
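To make the "critical value" claim concrete: the emergent-ability finding is usually pictured as a curve that sits near chance accuracy until model scale crosses a threshold, then jumps sharply. The sketch below is a purely hypothetical illustration of that shape; the threshold, slope, and chance level are made-up parameters, not measured results from any study.

```python
# Toy sketch of the "critical value" (emergent ability) effect described
# above. All numbers are hypothetical, chosen only to show the shape:
# accuracy stays near chance until the parameter count crosses a
# threshold, then rises sharply. This is NOT real benchmark data.
import math

def toy_accuracy(n_params: float,
                 threshold: float = 1e10,  # hypothetical critical scale
                 chance: float = 0.25,     # e.g. 4-way multiple choice
                 steepness: float = 6.0) -> float:
    """Model accuracy as a logistic function of log10(parameter count)."""
    x = steepness * (math.log10(n_params) - math.log10(threshold))
    return chance + (1.0 - chance) / (1.0 + math.exp(-x))

for n in (1e8, 1e9, 1e10, 1e11, 1e12):
    print(f"{n:.0e} params -> accuracy ~ {toy_accuracy(n):.2f}")
```

In the actual studies the x-axis is typically training compute or parameter count on a log scale, and the threshold differs from task to task; the point here is only the abrupt-looking jump once scale is large enough.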