AI chips have been set ablaze by Sora! Masayoshi Son raises $100 billion, taking aim at NVIDIA

Mondo Technology Updated on 2024-02-21

Before you know it, it has already been five months since ARM listed on NASDAQ.

As the largest IPO in the history of the semiconductor industry, ARM has seen its share price fluctuate somewhat since listing, but the overall trend has been satisfying. At press time, ARM shares were hovering around $121, for a total market capitalization of $125.2 billion. SoftBank, which bet big on ARM and won that tough battle, has profited handsomely, largely making up for its losses on WeWork and Chinese tech stocks.

Now that SoftBank is back on its feet, it plans to take another swing at semiconductors, training its sights on the hottest sector of the moment: AI chips.

According to Bloomberg, Son is planning to raise $100 billion to invest in AI chip companies. The new project is reportedly named "Izanagi", after the god of creation and life in Japanese mythology; the romanized name also happens to contain "AGI", a not-so-subtle hint at its ambitions.

Looking back at the investment history of Masayoshi Son and SoftBank, chasing the hottest trends has always been a core strategy. With the ARM precedent in hand, SoftBank has all the more confidence in the semiconductor industry.

However, the AI chip boom is scorching hot. Before SoftBank entered the game, traditional semiconductor giants, Internet giants, and start-ups had already piled in and were fighting fiercely. Son's ambition is to challenge NVIDIA for the throne, but the journey is bound to be full of ups and downs.

(Image source: Unsplash)

The Izanagi project has only just begun, SoftBank has made no official comment, and outsiders know very little about it. From the limited reports so far, only three things can be pinned down: the scale of the investment, the way it will be funded, and the focus on AI chips.

Reportedly, not all of the $100 billion will come out of Son's own pocket: SoftBank will directly contribute about $30 billion, with the rest borne by other investment institutions. It is unclear which big backers SoftBank will bring into the project, but reports suggest that Middle Eastern capital is the most likely to join.

This speculation is not groundless, because SoftBank and Middle Eastern capital have long cooperated closely, with many successful joint investments in the past. The well-known SoftBank Vision Fund, for example, has strong backing from the sovereign wealth funds of Saudi Arabia and Abu Dhabi.

Of course, the partnership has also had its painful lessons. In fiscal year 2022, the Vision Fund lost about $32 billion, and its main backer, Saudi Arabia's Public Investment Fund, suffered a combined loss of $15.6 billion, forcing Son to apologize in person. The good news is that Middle Eastern capital is pushing a global investment strategy and remains ambitious in the technology sector, despite the past setbacks.

In addition to the support of its Middle Eastern backers, SoftBank's own performance has strengthened Son's determination to take on the semiconductor industry again.

According to the latest financial report, SoftBank posted a net profit of $6.4 billion in the third quarter of fiscal year 2023 (calendar Q4 2023), nearly twice market expectations and its first quarterly profit since the third quarter of 2022. Within that, the first Vision Fund (SVF1) recorded $5.3 billion in investment gains for the quarter, and the transfer of part of its ARM stake alone brought in $5.5 billion.

(Image source: SoftBank earnings report)

As early as the second quarter of last year, SVF1 was the first segment to return to profitability, and it remains SoftBank's most reliable pillar. The fund's outlook is widely viewed as positive, which is directly tied to its portfolio: according to the financial report, the fair value of SVF1's holdings rose 4.2% quarter-on-quarter in the fourth quarter. Within the portfolio, Internet companies such as ByteDance and Didi performed well, but the biggest contributor to the valuation was none other than ARM.

ARM's previously released financial report showed fourth-quarter revenue of $824 million and adjusted operating profit of $338 million, both well ahead of market expectations (roughly $786 million and $270 million), driving its stock price up nearly 90% over three trading days. Meanwhile, SoftBank's own share price hit its highest level since May 2021 on February 13.

There is no doubt about the importance Son attaches to ARM; after all, it is SoftBank's signature achievement in the semiconductor industry. The success of that big bet has given Son plenty of confidence, and he will surely go all out in this new push into AI chips.

Judging by past practice, SoftBank is likely to scout promising chip companies around the world and allocate the $100 billion carefully. However, it is not yet clear how the funds will be distributed, no potential targets have surfaced, and many details of the plan remain to be clarified.

But what is certain is that SoftBank has come prepared this time, with plenty of ammunition. The already crowded AI chip industry is about to see an even fiercer scuffle.

It is hard to say when SoftBank will officially make its move. For now, the AI chip track has three main camps: 1. traditional semiconductor giants; 2. Internet and technology companies; 3. start-ups of all stripes, ready to jump in with backing from capital and the big players.

Broadly speaking, AI chips fall into three types: GPU, FPGA, and ASIC, of which the GPU has the widest range of applications, the strongest performance, and the highest market demand. IDC's statistics show that NVIDIA's dominance of the AI chip industry is very stable: together with AMD it accounts for nearly all of the global discrete GPU market, with NVIDIA alone holding a share of more than 80%.

Semiconductor giants led by AMD and Intel have been working hard to improve computing power in an attempt to close the gap with NVIDIA. Among them, AMD is the most likely to break through.

AMD's latest flagship products are the Instinct MI300X and MI300A, launched in December last year. The former is a data center GPU built for generative AI and positioned against NVIDIA's H100; the latter is an upgraded APU designed for supercomputing scenarios. The MI300X reportedly packs 153 billion transistors, a clear advantage over the H100's 80 billion, and its memory capacity also exceeds the H100's.

However, NVIDIA's advantages remain comprehensive. Beyond the H100's FLOPS (floating-point operations per second) holding its own against any comparable competitor, NVIDIA's surrounding ecosystem alone is enough to leave most rivals behind. Moreover, the MI300X lacks a built-in Transformer acceleration engine, so large-model training times are expected to be longer than on the H100, which undermines its economics.

For Internet and technology companies, the situation is more complicated: on the one hand, they cannot do without NVIDIA's GPUs; on the other, they are trying to break free of that dependence by developing chips of their own.

As Silicon Valley's "hoarding king", Meta bought up large numbers of H100s in advance and started its own chip development early.

In early February, a Meta spokesperson revealed that its second-generation self-developed AI chip, Artemis, will go into production this year and is expected to be used first in data centers. However, the production capacity and computing power of Meta's self-developed chips cannot keep up with demand in the short term, so they will be used alongside NVIDIA GPUs. Zuckerberg has said that Meta will deploy 350,000 H100s by the end of this year, and that its total GPU fleet will then be equivalent to roughly 600,000 H100s.

Microsoft and Google, old rivals, are also racing neck and neck.

Google, for its part, released its self-developed TPU v5p chip in December last year, which doubles FLOPS and triples high-bandwidth memory compared with the previous generation and trains large models 2.8 times faster. Betting on the TPU has been Google's long-term strategy for nearly a decade, and it is now starting to pay off. In addition, Google is broadly involved in CPUs and VCUs; the latter is very helpful for video processing, which carries extra strategic weight now that Sora is all the rage.

Microsoft is even more assertive. In November last year, with Jensen Huang in attendance, it unveiled Azure Maia 100, an AI-specific chip for cloud training and inference that will be used first in Azure cloud services. Microsoft CEO Satya Nadella said products powered by the new silicon will ship this year, and claimed that Microsoft now offers "the fastest CPU among all cloud computing companies".

SoftBank is arriving a little late, but fortunately it has plenty of ammunition. If Son really can raise the $100 billion as planned, he still has a chance of becoming a Tier 1 player in the AI chip melee.

To catch up with NVIDIA, the chasers in AI chips, from SoftBank, Google, Microsoft, and OpenAI to Meta, Intel, and AMD, will have to work a great deal harder.

While Son was stockpiling ammunition, NVIDIA's Jensen Huang was also preparing a counterattack.

On the one hand, NVIDIA is working to increase production capacity to meet customer demand.

According to UBS analysts, NVIDIA has recently cut its AI GPU lead times significantly, from 8-11 months to 3-4 months. In the firm's analysis, this means either that NVIDIA has raised capacity and can work through its backlog of orders faster, or that it is planning additional capacity to cope with even higher demand in the future.

Whatever the reason, NVIDIA's capacity potential exceeds market expectations. Demand for the H100 is not in doubt, and once capacity ramps up, NVIDIA will be able to capture even more of the market. UBS also raised its price target on NVIDIA to $850 in the same research note, a clear sign of the capital market's confidence in its prospects.

On the other hand, NVIDIA is expanding its sphere of influence, reaching both upstream and downstream along the AI chip industry chain.

In early February, it was reported that NVIDIA would enter the custom data center chip business and had set up a dedicated internal team. NVIDIA has also made a string of notable moves recently: launching the AI chatbot "Chat with RTX", announcing three new chips at CES to build out its AI PC business, and investing in AI companies such as Runway and Cohere, covering segments from AI video generation and cloud computing to foundation models.

Having expanded its territory, NVIDIA can exploit the synergies between upstream and downstream, making customers even more dependent on its products and supporting ecosystem. Constrained by technical barriers, Internet companies will find this approach hard to imitate, and most will likely remain firmly in NVIDIA's grip.

Challenging NVIDIA is the industry's common choice; that it is hard to win is the industry's common consensus. Yet in 2024, AGI is igniting demand for AI computing power, and the AI chip pie is expanding enormously.

Since the start of 2024, the AI large-model boom has shown no signs of cooling; if anything, it is heating up. OpenAI released its text-to-video tool Sora, and although it is not the first of its kind, as the flagship product of the pioneer of the AGI camp, Sora still shook the technology industry. Beyond igniting heated discussion in tech circles, the film and entertainment industries are watching Sora closely, and there were even reports that "on the first working day after the holiday, every VC was in meetings discussing Sora".

After Sora's debut, Musk was as anxious as an ant on a hot pan, precisely because Sora is currently the product closest to the concept of AGI, and AGI is the blue-ocean market Musk has his eye on.

AGI refers to an intelligent computer system that can fully imitate human emotions and behaviors and achieve self-learning, self-improvement, and self-correction. Although Sora has so far only demonstrated the ability to create video content, it has shown a strong grasp of the real world.

Why did 360's Zhou Hongyi say that Sora has shortened the arrival of the AGI era from ten years to one? It comes down to its strong understanding of the real world. The defining feature of AGI is its feedback on the rules of the real world, especially physical states, natural laws, and chemical changes. Although Sora currently only helps with content creation, who can be sure that, with enough training, OpenAI will not build a true AGI product on top of it?

However, the AGI "world simulator" that Sora represents cannot run without computing infrastructure. As Li Tao, chairman and CEO of the global AI company APUS, put it, every deep-learning AI model depends on powerful computing power behind it, and Sora is no exception. OpenAI CEO Sam Altman has publicly stated that the company's work and products still need far more computing power than it currently has. As early as October last year, Reuters reported that OpenAI had invested in at least three semiconductor design companies, among them the start-up Cerebras.

With the explosion of AGI, global demand for AI computing power will surge further. On the one hand, text-to-video applications will require hundreds or thousands of times the computing power of ChatGPT and text-to-image applications. On the other hand, AGI technology is entering more scenarios, and more AI hardware, software, and consumer applications will appear, creating even greater demand for computing power.

At the start of the new year, a large number of companies have proclaimed a pivot to AI or even "all in AI", and their investment has only grown compared with last year. As Lei Technology's earlier report "OPPO and Meizu go all in! Will 2024 be the 'first year of AI hardware'?" tallied, leading companies across industries, including OPPO, Meizu, Lenovo, Xiaopeng, and Geely, have announced plans to increase AI investment. Lenovo teased its upcoming AI PC, the Lenovo Xiaoxin, on social media and announced internally that it would comprehensively accelerate its AI transformation, focusing on AI-embedded smart terminals, AI-oriented infrastructure, and AI-native solutions and services. Meizu set itself a "three-year pledge", announcing that it would halt traditional smartphone projects and fully commit to the new AI hardware track. OPPO CEO Chen Mingyong said the company's strategy is to be the leader and popularizer of AI phones, and it has set up a dedicated AI center.

According to data cited in a research report at the end of last year, the global AI chip market was worth about $45 billion in 2023 and is expected to reach $400 billion by 2027, a compound annual growth rate of more than 70% over that period.
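A quick back-of-the-envelope check of those cited figures, assuming the report treats 2023 to 2027 as four compounding years:

$$
\text{CAGR} = \left(\frac{400}{45}\right)^{1/4} - 1 \approx 0.73,
$$

which is consistent with the "more than 70%" characterization.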

In Intelligent Pro's view, the emergence of the startling Sora shows that text-to-video technology is progressing faster than people expected, compressing the arrival of the AGI era from 10 years to 1. By that logic, the 2023 research-report estimates of AI computing demand are already badly out of date, and real demand is hundreds or thousands of times what people previously assumed.

It is precisely with this in mind that SoftBank is betting $100 billion on AI chips: the market opportunity is still vast, and seen this way, for all AI chip players, whether anyone can surpass NVIDIA is not the most important question.
