Where will the AI frenzy go?

Mondo Technology Updated on 2024-02-24

Author: Mark Collier, Chief Operating Officer of the OpenInfra Foundation.

OpenInfra-hosted projects include OpenStack for cloud computing, StarlingX for edge computing, Zuul for CI/CD, and Kata Containers for secure containers, an area currently in high demand.

With the support of more than 100,000 members in 187 countries, the OpenInfra Foundation focuses on open source infrastructure. Looking ahead to 2024, any decision about the future direction of technology cannot avoid the rapidly evolving field of artificial intelligence. In this article, Mark proposes five hot topics in AI.

1. AI models will become smaller and more efficient

New large language models (LLMs) will improve memory efficiency through compression techniques, allowing complex models to run on performance-constrained hardware with minimal loss of accuracy. We have already seen technologies like Mistral's Mixtral sparse mixture-of-experts (MoE) model and Microsoft's Phi-2 make this happen. Especially for inference, these new ways of building LLMs can run on increasingly small hardware, such as laptops, iPhones, or Android devices. This capability will expand the use of AI and make it easier for businesses of all sizes to adopt AI strategies.
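One of the compression techniques behind this trend is weight quantization: storing model weights as 8-bit integers instead of 32-bit floats cuts memory use by roughly 4x at a small cost in precision. The snippet below is a minimal sketch of symmetric per-tensor int8 quantization using NumPy; the function names and the toy weight matrix are illustrative, not taken from any particular library.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor quantization: map floats onto the int8 range [-127, 127]."""
    scale = float(np.max(np.abs(weights))) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

# A toy "weight matrix": the int8 copy needs 4x less memory than float32.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print(np.max(np.abs(w - w_hat)))  # reconstruction error, bounded by scale / 2
```

Production systems go further, for example with 4-bit formats and per-channel or per-block scales that keep the error lower than a single per-tensor scale can.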

2. Global collaboration will help open-source AI catch up with GPT-4

Open source AI will catch up with and surpass GPT-4 and GPT-4V, which are the most advanced technologies of the moment but are not open. Investment in open-source AI models is increasing, with new releases appearing every day, and researchers from all over the world are testing these models and learning from each other. France's Mistral (which has raised $400 million) and China's 01.AI (Zero One Everything) are just two of the many companies taking advantage of open access. The bottom line is that these businesses are willing to listen to community feedback and remove restrictions to make these technologies more and more open.

To that end, I have been involved in an important effort led by the Open Source Initiative (OSI) to draft a definition of "open source AI". AI involves more than just source code, which means it does not fit the traditional definition of open source maintained by the OSI. This will become very important as open source models continue to improve.

3. Hosting providers offering access to open-source AI will become more competitive

There are many avenues for hosting open-source AI models, as evidenced by the rapid emergence of startups that are not only reducing costs but also finding increasingly efficient ways to host models. With enough GPU power, users can run AI models locally on their laptops or desktops, but competition on the server side will grow, driving down the cost of serving these models.

4. Software advancements will intensify the competition for AI hardware

Nvidia dominates the key components of training and running AI models with its CUDA architecture and software stack. In 2024, however, AMD and other companies are expected to launch competitive hardware products along with improved supporting software, such as AMD's investment in PyTorch support and CUDA alternatives. The competition for AI hardware will intensify, and Nvidia's industry leadership will be challenged.

5. Artificial intelligence will usher in a breakthrough in the field of entertainment

ChatGPT is the fastest consumer online product to exceed 100 million users, and its easy-to-use chat interface brought natural language processing (NLP) and LLM technology into the global spotlight. Personally, I think something equally impactful will happen in the field of entertainment. I expect that in 2024 we will see a short film or series generated entirely by AI released in theaters or on mainstream streaming platforms.

Needless to say, as systems become smarter, the pace of change is accelerating, and this momentum is set to continue in 2024. Individuals and organizations alike will be affected by this acceleration of innovation, and in this era of rapid change the resonant question is: how can we collectively keep up with this unprecedented wave of AI development?

About the author

Mark Collier has been involved in a number of disruptive technological changes, from Dell's first GPU, to MusicMatch's digital music transformation, to Yahoo's first streaming music on-demand service. In 2010, while working at Rackspace, an early pioneer in cloud computing, Mark co-founded OpenStack, an open-source cloud computing project, with NASA and 25 other organizations around the world. In 2012, Mark co-founded the OpenStack Foundation and has served as its COO for over a decade, managing one of the fastest-growing open source projects of all time. In 2020, the OpenStack Foundation officially became the OpenInfra Foundation. Now, Mark is helping to expand the Foundation's mission, bringing the open collaboration approach honed by the OpenStack community to more open source projects in the cloud infrastructure market, including Kata Containers, the Zuul CI/CD platform, and the StarlingX distributed computing platform.

About the OpenInfra Foundation: OpenInfra is committed to building a diverse open source community and promoting the use of open source infrastructure software in production. With the support of more than 110,000 community members in 187 countries and regions around the world, the OpenInfra Foundation hosts open source projects and conducts community practices in areas such as artificial intelligence, containers and cloud-native applications, edge computing, and data center clouds. Welcome to OpenInfra!
