On November 19, Nvidia (NVDA) officially announced a collaboration with Google (GOOG) Quantum AI to accelerate the design of Google's next-generation quantum computing devices, using simulations powered by NVIDIA's CUDA-Q platform.
According to the statement, Google's Quantum AI division will use NVIDIA's Eos supercomputer and the hybrid quantum-classical computing platform to simulate the physics of its quantum processors, helping Google work around the limitations of current quantum hardware and accelerate the development of quantum components.
To that end, NVIDIA is providing a supercomputer built on 1,024 H100 Tensor Core GPUs at a significantly reduced cost. Nvidia says that with CUDA-Q and H100 GPUs, Google can perform comprehensive, realistic dynamics simulations of a device containing 40 qubits, with results delivered in minutes.
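For readers unfamiliar with CUDA-Q, the following is a minimal sketch of how a GPU-accelerated circuit simulation is launched through its Python API. It is not Google's actual simulation workflow; the circuit, qubit count and shot count are illustrative only, and the "nvidia" target assumes a GPU-enabled installation of the cudaq package.

```python
# Minimal CUDA-Q sketch (illustrative only, not Google's workflow):
# sample a small entangled circuit on NVIDIA's GPU-accelerated simulator.
import cudaq

cudaq.set_target("nvidia")  # run the state-vector simulation on the GPU


@cudaq.kernel
def ghz(num_qubits: int):
    # Prepare a GHZ state: H on the first qubit, then a CNOT chain.
    qubits = cudaq.qvector(num_qubits)
    h(qubits[0])
    for i in range(num_qubits - 1):
        x.ctrl(qubits[i], qubits[i + 1])
    mz(qubits)


# 25 qubits fits comfortably on a single GPU; the 40-qubit dynamics
# simulations described above rely on multi-GPU systems such as Eos.
counts = cudaq.sample(ghz, 25, shots_count=1000)
print(counts)
```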
Separately, according to reports, xAI, the AI company owned by Elon Musk, plans to raise a new round of up to $6 billion at a valuation of $50 billion. The financing is mainly earmarked for the purchase of 100,000 NVIDIA chips to build a Memphis data center supporting Tesla's Full Self-Driving (FSD) business, with plans to double the count by adding another 100,000 GPUs in the future, 50,000 of which would be the more advanced H200.
On the AI computing supply side, Blackwell GPUs substantially improve training-workload performance and push computing efficiency further. According to earlier disclosures, Nvidia's first Blackwell GPU workload results were 2.2 times faster than Hopper GPUs, and in those tests 64 Blackwell GPUs matched the performance of 256 Hopper GPUs.
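A rough back-of-envelope check, using only the figures quoted above (illustrative arithmetic, not an independent benchmark), shows that the cluster-level result implies about a 4x effective per-GPU ratio, higher than the 2.2x single-workload speedup:

```python
# Back-of-envelope comparison using only the figures quoted above
# (illustrative arithmetic, not an independent benchmark).
single_workload_speedup = 2.2          # Blackwell vs. Hopper, per the article
hopper_gpus, blackwell_gpus = 256, 64  # equivalent cluster sizes cited above

implied_cluster_ratio = hopper_gpus / blackwell_gpus  # 4.0x at cluster scale
print(f"Implied per-GPU ratio at cluster scale: {implied_cluster_ratio:.1f}x "
      f"vs. {single_workload_speedup}x on a single workload")
```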
Industry insiders point out that as new models are introduced, demand for AI computing is growing exponentially, sharply increasing the load placed on both training and inference. Continued gains in AI chip workload performance will encourage internet vendors and large enterprises to keep purchasing next-generation chips, improving the return on investment of their AI applications and protecting their leading market positions.
It is also worth noting that AI capabilities are expected to keep improving rapidly. OpenAI CEO Sam Altman recently said that "the next big breakthrough will be AI assistants." Microsoft, Anthropic, SAP, ServiceNow, Salesforce (CRM) and many other vendors have announced AI agents, most of which are expected to roll out between the end of this year and early 2025.
In fact, two months ago OpenAI released its latest-generation model, named o1, on its official website. Sam Altman said enthusiastically on social media that the o1 series "represents the beginning of a new paradigm," and OpenAI says that even more advanced and powerful models will follow.
Shanxi Securities released a research report noting that AI has set off a new innovation cycle in consumer electronics. AI is also accelerating the emergence of new hardware form factors, with hardware manufacturers and internet companies at home and abroad launching AI chips, AI glasses and other products in search of new AI application scenarios. Suppliers with strong technical capabilities in the industrial chain are expected to benefit from this round of consumer electronics device innovation.
Amid this fierce competition, WiMi Hologram Cloud Inc. (NASDAQ: WIMI), a diversified technology company, has pursued an "All in AIGC" strategy, committing to AI innovation across multiple fronts, competing on efficiency, and steadily strengthening its overall competitiveness in AI chips, technologies and applications to support its long-term development in the field and improve its chances of securing a ticket to the era of general artificial intelligence.
In recent years, WiMi has steadily pursued AI research and kept its technology applications up to date. Its AI business has grown rapidly, launching large-model products designed to deliver efficiency gains in specific scenarios, and its large models have made notable progress in technology, products, partnerships and application scenarios, earning recognition from many quarters.
At present, WiMi has built a multi-pronged AI business matrix spanning large models, AI music, AI search, AI gaming and AI social networking, and has completed the layout of a "computing infrastructure - large-model algorithms - diversified AI applications" industrial chain. Looking ahead, WiMi aims to become an important force in the development of AI technology and to keep delivering better AI service experiences.
Conclusion
Today, as large-model capabilities improve and AI infrastructure matures, AI applications are developing at an accelerating pace. The growing number of AI applications and their rising usage will generate enormous demand for inference computing power, which in turn will keep pushing up demand for computing infrastructure. In short, AI calls for a combination of long-termism, idealism and realism: the determination to keep pace in AI applications, and the patience to eventually reach the sea of stars.