China's AI New Prologue: From a Hundred Models to an Application Wave

Mondo Technology Updated on 2024-01-31

Author丨Ni Yuqing, Tao Li.

Editor丨Lin Hong.

Source丨Visual China.

The era of AI in the technology industry is coming.

Since the launch of ChatGPT in November 2022, large models have been advancing at breakneck speed. If 2023 is taken as a dividing year in the history of human civilization, what came before was AI's chaotic early period, and what comes after is the era of generative AI.

In 2023, China's technology industry coalesced into a "war of a hundred models": from foundation models to industry-specific models, every company raced ahead, and in the second half of the year more and more applications emerged and were commercialized in vertical industries. Looking back on the year, the industry rose together, but the rise was accompanied by controversy over duplicated construction and the unresolved problem of industrialization.

As the race heats up, can the industry find a viable profit model? How can large models land better in different scenarios? And how can the computing power constraint be solved?

A number of industry experts interviewed by 21st Century Business Herald believe that although there are currently more than 200 large models in China, they will face fierce competition and elimination in 2024. On the other hand, generative AI applications will keep innovating: B-end (enterprise) industry applications have great potential and are showing new business models and opportunities, while the C-end (consumer) track is mainly focused on productivity tools and entertainment.

Overall, after the boom of the first half of 2023, the industry grew more rational about the industrialization of generative AI and large models in the second half of the year, showing a pattern of innovation tempered by prudence.

Cai Yuezhou, director of the Digital Economy Research Office at the Institute of Quantitative and Technical Economics of the Chinese Academy of Social Sciences, believes the hundred-model war reflects the penetration of a general-purpose technology into every corner of the economy, and that how to meet the challenges and seize the opportunities behind this all-round disruptive innovation is an important topic. China, for example, faces a computing power problem, but it also has enormous demand, including demand for large models. "In the future, application scenarios will still be our advantage, driving improvements in domestic supply capacity, the optimization and iteration of technical capabilities, and breeding opportunities," he said.

2023: The A side and B side of the "hundred-model" war

According to data from PitchBook, a venture capital data analytics firm, total financing in the global AIGC field reached 23.2 billion US dollars (about 165.6 billion yuan) as of October 15, 2023, an increase of 250.2% over the whole of 2022, of which 8.2 billion US dollars was raised in the AI core field. The total number of AIGC start-ups worldwide has exceeded 1,500.

The fiery momentum is plain to see. Extraordinary Capital, for example, has tracked more than 1,000 domestic and foreign AI application-layer companies on its platform and has provided in-depth financing services to more than 60 of them, a financing conversion rate of over 10%.

But while AI entrepreneurs abound, AI commercialization has never been easy, and institutions remain cautious about investing.

Even with the entire AI track in full swing, capital support remains modest compared with previous years. Hu Xiaojing, a partner at Extraordinary Capital, told 21st Century Business Herald that in the AI application layer, the companies favored by institutions this year are those that can quickly find application scenarios and cut in on a very sharp point of rigid demand, opening up a new market and quickly showing growth on the revenue side.

These popular applications can be divided into "AI+" and "+AI". "AI+" refers to AI-native applications built entirely on generative AI technology, while "+AI" refers to applications that add AI tools to existing businesses to greatly improve efficiency or generate additional purchases.

At the same time, the hot trend of the AI application layer is constantly evolving, and the two types of enterprises have different development timelines.

Hu Xiaojing further pointed out that the first half of the year was very good for "+AI" companies, because expanding application scenarios on top of existing customer needs shows results quickly. In the second half of the year, many AI-native applications also launched products and began serving customers.

Globally, a "1+N" pattern has formed: OpenAI's ChatGPT is the "1", the superstar product, while the other "N" applications make breakthroughs in their respective fields. The pattern is still shifting, of course, and domestic companies are racing to catch up.

Competition is also fierce at the large model level. fasion.AI founder and CEO Cheng Bin told reporters: "In February we were still discussing which two or three domestic companies had the strength to train real large models; by the end of the year more than 100 large model companies had emerged, each feeling it had to build its own wheel, which in a sense also wastes resources."

On this point, 360 Intelligent Brain expert Ge Canhui feels the same way. He predicts that far fewer of China's 200-plus large models will remain by this time next year. "Five to ten companies will be left in the public domain, and the private domain will also be a very cruel knockout round," he said.

Talking about trends for 2024, Ge Canhui said frankly that at the application level China is especially fond of building super apps, but in the AI field many API capabilities must be opened up for their value to be maximized. In the agent era in particular, cooperation will beat doing everything yourself, and interoperability between apps will become very convenient; that is the trend. At present, the 360 Intelligent Brain model is 100% compatible with OpenAI's API, and third-party open-source ecosystems are also easier to use with it.

"We hope to be able to plug easily into open-source ecosystem toolchains such as LangChain and LlamaIndex. The most important hard-power catch-up is still in the underlying hardware and underlying models. Many domestic start-ups are pushing to catch up with privately deployable models, and on consumer-grade hardware the gap between domestic private models and foreign ones is not large," Ge Canhui said with confidence.
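To make the notion of "100% compatible with OpenAI's API" concrete, here is a minimal sketch: the standard OpenAI Python client is simply pointed at a different endpoint. The base_url, API key and model name below are hypothetical placeholders, not any vendor's actual published interface.

```python
# Minimal sketch of calling an OpenAI-API-compatible model service.
# The endpoint, key and model name are illustrative placeholders only.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-llm-vendor.cn/v1",  # assumed compatible endpoint
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="example-chat-model",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize today's industry news in one sentence."}],
)
print(response.choices[0].message.content)
```

Because the request and response shapes match, toolchains built against the OpenAI interface, such as LangChain or LlamaIndex, can often be reused with only this endpoint change.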

2024: A wave of adoption

As AI technology continues to develop, large models have produced rich innovation in both C-end and B-end applications; many hidden challenges remain, however.

Hu Xiaojing pointed out that domestic investment in C-end large model products was relatively small this year, mainly because of uncertainty at the domestic foundation-model layer. On the one hand, no one knew at the outset how fast base models would progress, and investors worried that C-end products could easily be overturned by the next iteration of the underlying model; on the other hand, they worried that start-ups could not compete with large domestic companies. This cautious attitude led institutions to wait and see on C-end opportunities.

Still, there are many areas of opportunity on the C-end. Hu Xiaojing explained that C-end innovation is concentrated in two areas: "save time" and "kill time". "Save time" focuses on saving the user's time, for example productivity tools; while improving individual productivity, these tools have formed an ecosystem of individual payment and can even be driven by individuals into enterprise purchases. "Kill time", on the other hand, covers emotional companionship, games and the like, meeting users' entertainment needs. Together the two categories cover the mainstream C-end demand for the new paradigm of generative AI.

In B-end applications, large models show two main trends. One is to use AI tools to enhance existing B-end software solutions, continuously strengthening product barriers and customer value. The other is the rise of AI agents, which splice together solutions to different tasks to form a complete workflow. Hu Xiaojing believes this is an important opportunity for the next generation of software; although AI agents are still in their early stages, their potential should not be underestimated.
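As a rough illustration of "task solutions spliced into a workflow", the sketch below chains a few ordinary functions through a shared state. The step names and logic are invented for illustration; a real agent would let the model itself choose and order the steps.

```python
# Toy agent-style workflow: independent task solutions chained into one pipeline.
# All step names and logic are illustrative placeholders.
from typing import Callable

Step = Callable[[dict], dict]

def extract_requirements(state: dict) -> dict:
    state["requirements"] = f"requirements parsed from: {state['request']}"
    return state

def draft_answer(state: dict) -> dict:
    state["draft"] = f"draft produced for: {state['requirements']}"
    return state

def review_answer(state: dict) -> dict:
    state["approved"] = state["draft"].startswith("draft")
    return state

def run_workflow(request: str, steps: list[Step]) -> dict:
    """Pass a shared state dict through each task solution in order."""
    state = {"request": request}
    for step in steps:
        state = step(state)
    return state

print(run_workflow("summarize Q4 sales for the board", [extract_requirements, draft_answer, review_answer]))
```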

She also specifically noted that the business model for To B software services has changed dramatically: after the SaaS subscription model, usage-based charging is emerging, that is, billing by tokens. This new charging approach brings businesses new revenue opportunities, but it also means competing on low prices and sustaining customer usage.
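A back-of-the-envelope sketch of what billing by tokens means in practice; the unit prices here are invented placeholders, not any vendor's real rates.

```python
# Hypothetical token-metered pricing, purely for illustration.
PRICE_PER_1K_PROMPT = 0.01      # yuan per 1,000 prompt tokens (assumed)
PRICE_PER_1K_COMPLETION = 0.03  # yuan per 1,000 completion tokens (assumed)

def monthly_bill(prompt_tokens: int, completion_tokens: int) -> float:
    """Usage-based billing: cost scales with tokens consumed rather than per-seat SaaS fees."""
    return (prompt_tokens / 1000) * PRICE_PER_1K_PROMPT + \
           (completion_tokens / 1000) * PRICE_PER_1K_COMPLETION

# e.g. 20 million prompt tokens and 5 million completion tokens in a month
print(f"{monthly_bill(20_000_000, 5_000_000):.2f} yuan")  # 350.00 yuan under these assumed rates
```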

Luan Jian, head of the large model team at the AI Lab of Xiaomi Group's Technical Committee, told reporters that AI adoption has never been smooth sailing but has gone through ups and downs, with several peaks and several harsh winters along the way. "Large models are currently at a relatively good stage for To B applications, because real gains in efficiency have been demonstrated. By contrast, C-end users are not very sticky with large models and mostly accept them only for entertainment," he said.

At present, the application direction of many large model companies is no longer to train a genuine industry model, but to use a knowledge graph to organize an enterprise's internal knowledge base, then use the large model to retrieve from that knowledge base, trace answers back to their sources and summarize them, and finally deliver the result to users. For "save time" type businesses, the real path to becoming a productivity tool still needs further exploration.
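A minimal sketch of this "organize the knowledge base, retrieve, then summarize with the model" pattern, under stated assumptions: the keyword-overlap retrieval and the llm() stub below are toy stand-ins for a real knowledge graph, vector store and model call.

```python
# Toy "retrieve then summarize" workflow over an internal knowledge base.
# The retrieval scoring and the llm callable are illustrative stand-ins only.

def retrieve(question: str, documents: list[str], top_k: int = 3) -> list[str]:
    """Rank internal documents by naive keyword overlap with the question."""
    terms = set(question.lower().split())
    ranked = sorted(documents, key=lambda d: len(terms & set(d.lower().split())), reverse=True)
    return ranked[:top_k]

def answer(question: str, documents: list[str], llm) -> str:
    """Build a grounded prompt from retrieved passages and let the model summarize."""
    context = "\n".join(f"- {doc}" for doc in retrieve(question, documents))
    prompt = (
        "Answer using only the internal knowledge below, and cite the lines you used.\n"
        f"Knowledge:\n{context}\n\nQuestion: {question}"
    )
    return llm(prompt)  # llm is any text-in/text-out callable, e.g. a chat-completion wrapper

if __name__ == "__main__":
    docs = [
        "Expense reports must be filed within 30 days of the purchase date.",
        "VPN access requires written approval from a department manager.",
    ]
    # Echo stub in place of a real model call, so the sketch runs without credentials.
    print(answer("How long do I have to file an expense report?", docs, llm=lambda p: p))
```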

Cheng Bin believes: "In a sense, large models bring technological equality. When the technical threshold is flattened, reasonable and effective application becomes the ultimate killer capability. At present, the real beneficiaries of large model applications are companies that already had closed business loops and can quickly use large models to cut costs, raise efficiency and even extend their business boundaries. From the second half of this year to the first half of next year, a batch of companies with very creative AI-driven products should emerge in a concentrated burst."

The hard road of commercialization

As the industrial application of generative AI rolls forward, the commercialization process also faces a series of challenges.

Li Jingmei, partner and chief product officer of Lanzhou Technology, said that when the company was founded in 2021 it chose pre-trained models because they offer higher efficiency and lower delivery costs in To B applications. The development of large models, however, has been accompanied by a significant change: a surge in parameter counts. Pre-trained models used to be small, but to give large models broader general capabilities, parameter counts have reached the tens and even hundreds of billions.

She recalled that in the first half of the year everyone was running hot, very excited and with very high expectations; in the second half, rationality set in and the challenges of the commercialization stage became prominent. To solve the commercialization problem, finding benchmark customers has become key: working with benchmark customers yields real user feedback and surfaces problems in the application process, allowing timely improvement and iteration.

On the difficulties of commercialization, Lu Wenchao, deputy general manager of Daguan Data, said that the value a single scenario can generate is currently limited, and customers' willingness to pay is not proportional to the actual input cost. To solve this, he proposed several options. The first is bundling: integrating multiple product lines and directions into a paid overall solution. The second is selling manpower, that is, providing an experienced engineering team to meet customers' consulting and development needs. The third is upgrading and expanding the original product, improving it to increase its value and widen its application fields.

On the challenge of To C commercialization, one AI industry technology leader told 21st Century Business Herald that although commercializing To C is relatively difficult, success is still possible once a suitable scenario is found. "In the To B field, many leading companies do have a strong demand for AIGC, and large enterprises are more willing to pay; the key is whether our features can really solve enterprises' real pain points," he said.

Li Jingmei also raised the problem of "hallucination", a pain point that has not been completely solved in To B scenarios. Luan Jian added that although hallucinations make models unreliable in some settings, tolerance for mistakes is relatively high in the entertainment field, where there is still room for large models' creativity to grow.

Xu Xiaotian, chief architect of data and artificial intelligence at IBM Labs, also raised the issue of security and trustworthiness: "The credibility of the model is a big problem, especially when facing To B customers. If the model cannot be made credible, it is very difficult to put it into production."

Together, these views and practical experiences depict the complex landscape facing the commercial use of large models. Exploring the path to commercialization requires rational thinking and close cooperation with users, while tackling hallucination in model output and ensuring credibility have become key considerations in the process.

Breaking the computing power dilemma

As large models evolve, the demand for computing power keeps growing. Against this backdrop, industry experts discussed the bottlenecks and possible breakthroughs in large-model computing power: some believe the domestic computing power situation is looking up, while others point out that the global shortage will persist for a long time.

"We are now preparing a great deal of computing power in every respect, but it will never be enough; that is the status quo," Ge Canhui said bluntly. This reflects how quickly the computing power required by large-model applications is growing: even though many enterprises have prepared well, supply still lags the ever-increasing demand.

Lu Wenchao of Daguan Data feels the shortage too. "When we go to a customer's site we are generally running inference, and the computing power requirements of the inference stage are actually moderate, not very high, but there is still a shortage," he said. "For example, the A100 and A800 were already the GPUs in short supply and hard to buy, and the latest news is that even ordinary cards like the A10 and A16 are hard to buy, with very long order cycles."

In the domestic market, the cost of obtaining computing power has increased significantly, and the demand for domestic innovation has become more urgent.

At the national level, the layout is accelerating. The National Development and Reform Commission, the National Data Administration, the Cyberspace Administration of China, the Ministry of Industry and Information Technology, and the National Energy Administration recently jointly issued the "Implementation Opinions on Deeply Implementing the 'Eastern Data, Western Computing' Project and Accelerating the Construction of a National Integrated Computing Power Network", proposing that by the end of 2025 a comprehensive computing infrastructure system will have taken shape. New computing power in national hub node areas is to account for more than 60% of the country's newly added computing power, and the utilization rate of computing resources at national hub nodes is to significantly exceed the national average. Computing power networks with 1 ms latency within cities, 5 ms latency within regions, and 20 ms latency across national hub nodes are to be initially realized in demonstration areas.

At the enterprise level, optimizing models and strengthening reserves are imperative. "We especially hope to make models smaller, and China's local GPU manufacturers, together with their ecosystems, will greatly reduce costs; a large part of our cost is now GPU-related," Lu Wenchao said.

In this context, domestic chip manufacturers face a development opportunity, but compared with NVIDIA the ecosystem remains a problem. Cheng Bin believes that in terms of raw computing power, some domestic products can already be benchmarked against foreign ones, but the lack of ecosystem support keeps them from being rolled out at scale. Localization is currently moving faster on the inference side. "When it comes to GPT-4-level and further training, the computing power bottleneck still exists," he said.

Despite all the challenges, ChatGPT still fills the entire tech community with excitement, delight and imagination. In entrepreneurs' minds it may take many forms, and an AI assistant with strong generalization, versatility and practicality would presumably upend traditional modes of production and work. Getting there, however, requires work on chips, algorithms, data, software and more, just as a person needs not only bones and flesh but also a brain and a soul.

According to the CCID Research Institute under the Ministry of Industry and Information Technology, there are currently more than 19 large language model developers in China, of which 15 have had their model products pass regulatory filing. The market for China's Chinese-language large models is estimated to reach 13.23 billion yuan in 2023, a growth rate of 110%.

It is foreseeable that in 2024, once model capabilities complete the leap from quantitative to qualitative change, the market will show even more astonishing explosive growth.

Cai Yuezhou notes that China's overall self-sufficiency rate for integrated circuit chips was roughly 9.9% in 2021 and 10.1% in 2022; overall supply falls short of demand, and there are also obstacles in adapting domestically developed Xinchuang chips. "The key to turning the crisis into an opportunity is that everyone has demand. For the entire industrial chain this actually brings opportunities for domestic production: China has such huge demand, and application scenarios are also an advantage."

Overall, the localization of computing power chips faces many challenges but also huge opportunities. By optimizing models, improving the adaptability of domestic chips, solving Xinchuang adaptation problems, and expanding the share of domestic chips, the computing power field can expect new development opportunities.


Editor: Zhong Hailing, intern: Tan Yahan.
