Reconnecting OpenAI's broken "thigh"

Updated on 2024-01-28

Author丨Li Shuangshuang

Editor丨Hai Yao

Source丨Rain AI official website

On November 30, following a review by CFIUS (the Committee on Foreign Investment in the United States), the US government forced the Saudi backer to divest its stake in Rain AI on national-security grounds.

Saudi Aramco's venture arm, Prosperity7 Ventures, had previously led Rain AI's largest funding round, a $25 million Series A. According to PitchBook data, Prosperity7 has transferred its stake to the Silicon Valley venture firm Grep VC. Both firms declined to comment on the report.

OpenAI co-founder Sam Altman and a number of venture capital firms are also investors in the company.

According to Wisebeast analysis, Rain AI was founded in 2017 and is an AI chip company. Its most recent funding round, in August 2023, was a $2.1 million investment from Toy Ventures. To date, Rain AI has raised a total of $32.1 million at a valuation of about $90 million.

Investors include OpenAI hardware expert Scott Gray; Sam Altman, fresh from the OpenAI boardroom drama; Daniel Gross, an avid investor in AI startups; Tinder founder Justin Mateen; Web2Asia founder George Godula; and individuals and institutions such as Y Combinator, Soma Capital, Buckley Ventures, FundersClub, Liquid 2 Ventures, Pegasus Tech Ventures, and Loup Ventures.

Rain AI's core product is a "brain-like" AI chip based on neuromorphic technology. The company says its NPU could give AI developers up to 100x the computing power and 10,000x the energy efficiency for training, and that its AI accelerator strikes a record balance between speed, power, area, accuracy, and cost. The Rain AI team currently numbers about 40 people, including experts in chip architecture and design, deep learning algorithms, semiconductors, and compilers.

On LinkedIn, founder Gordon Wilson describes Rain AI as building an artificial brain and an integrated hardware-software platform to dramatically cut the cost of artificial intelligence; the team is researching new hardware substrates and AI training algorithms to narrow the cost gap between artificial intelligence and biological intelligence. "One of the reasons the hardware we've built is great is that it significantly reduces power consumption compared with comparable AI hardware."

10,000 times more energy efficient than GPUs

In the 1950s, American engineers built the first ANNs (artificial neural networks) aimed at solving practical problems. They studied computational models that mimic the mechanisms and functions of biological neural networks for machine learning and artificial intelligence tasks, but such models struggled with hard computational problems involving large datasets.

Major manufacturers such as HP and Intel have also tried neural network processors. In 2016, Intel acquired Nervana Systems to develop the Nervana NNP-T training chip, but abandoned it in less than three years because of the long development cycle and high cost. (For details, see "Leaving Intel for a Second Venture, Two Years Later the Company Sold for $1.3 Billion".)

Nervana NNP-T

Deep learning research accumulated for many years until the recent rise of large model architectures and large language models, and researchers are once again in a dilemma: on today's digital computer systems, running and training large models such as ChatGPT is both time-consuming and power-hungry.

One camp argues that while some existing commercial chips use analog in-memory processing technology, they still consume a lot of power. From an engineering point of view, the backpropagation algorithm widely used for AI training is not compatible with the parallelism of large models, which potentially constrains hardware design and suggests that a very different approach to training deep networks is needed.

This is where Rain AI comes in. Its goal is to build a complete chip that solves these problems with new hardware and new algorithms. Its slogan: "Redefining the limits of AI computing."

Rain AI's core product is a brain-like chip based on neuromorphic technology; neuromorphic computing is a branch of brain-inspired computing. The neuromorphic processing unit (NPU) Rain AI introduced is a computer chip designed to mimic the structure and function of the human brain, and is claimed to be the world's first end-to-end analog, trainable AI circuit. The NPU is meant to be used together with new analog AI training and inference algorithms (equilibrium-style "balancing" algorithms) and a new analog chip architecture. Implementing these analog algorithms is what the company says makes the NPU 10,000 times more energy efficient than a GPU.

In principle, unlike a digital system that computes with "0" and "1", the Rain AI chip acts as an "artificial synapse" that mimics the brain's neural connections through a 3D array of memristors (a device HP Labs designed about a decade ago). Rain AI co-founder Gordon Wilson explains that this puts the processing layer and the memory layer in the same place, allowing AI algorithms to run faster and more energy-efficiently.
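To make the "compute where the memory is" idea concrete, here is a minimal numerical sketch of how a generic memristor crossbar performs a matrix-vector multiply: weights are stored as conductances, inputs arrive as row voltages, and each column current is a dot product by Ohm's and Kirchhoff's laws. The array size and scaling constants are illustrative assumptions, not Rain AI's actual design.

```python
# Toy model of an analog in-memory matrix-vector multiply on a memristor crossbar.
# Assumption: weights map linearly to conductances; sizes and scales are made up.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.uniform(0.1, 1.0, size=(4, 3))   # target weight matrix (4 inputs, 3 outputs)
g_scale = 1e-6                                  # hypothetical weight-to-conductance scale (siemens)
G = weights * g_scale                           # conductance stored at each crosspoint
V = np.array([0.2, -0.1, 0.3, 0.05])            # input activations encoded as row voltages

I = G.T @ V                                     # column currents: Ohm's law + Kirchhoff's current law
y = I / g_scale                                 # read-out rescales currents back to weight units
print(np.allclose(y, weights.T @ V))            # True: the MVM happened without moving the weights
```

Because the weights never leave the array, the costly memory traffic of a conventional digital accelerator simply does not occur, which is the source of the claimed energy savings.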

"We make principled design decisions across the full stack to narrow the abstraction gap between neural networks and their implementation on the chip." On its official website, Rain AI describes its approach as transformative hardware-software co-design.

Rain AI is pioneering a digital in-memory computing (D-IMC) paradigm to improve AI processing, data storage, and data movement. Its core design scales to high-volume production and supports both training and inference. When combined with Rain AI's proprietary quantization algorithms, the accelerator is said to maintain FP32 accuracy (single-precision floating point, adequate for most scientific and general-purpose computing tasks). Rain AI has also developed a proprietary interconnect between its D-IMC blocks and RISC-V cores that can be efficiently and freely reprogrammed for any operator.
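Rain AI's quantization algorithms are proprietary and undisclosed; as a hedged illustration of the general idea of storing weights in low precision while staying close to FP32 results, the sketch below uses plain symmetric int8 per-tensor quantization. The matrix sizes are arbitrary.

```python
# Generic symmetric int8 weight quantization: store weights in 8 bits,
# compute, then rescale; the result stays close to the FP32 reference.
# This is a textbook scheme, not Rain AI's proprietary algorithm.
import numpy as np

rng = np.random.default_rng(0)
w_fp32 = rng.normal(size=(256, 256)).astype(np.float32)
x = rng.normal(size=256).astype(np.float32)

scale = np.abs(w_fp32).max() / 127.0                          # per-tensor symmetric scale
w_int8 = np.clip(np.round(w_fp32 / scale), -127, 127).astype(np.int8)

y_ref = w_fp32 @ x                                            # FP32 reference result
y_q = (w_int8.astype(np.float32) @ x) * scale                 # low-precision weights, rescaled output
print(np.max(np.abs(y_q - y_ref)))                            # small gap relative to the reference
```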

According to available information, the initial chip architecture is based on the traditional open-source RISC-V architecture backed by Google, Qualcomm, and other technology companies, and is intended for edge devices far from the data center, such as phones, drones, cars, and robots.

"It's not just the chips; they want to control the ecosystem, to own the semiconductors, their own algorithms, and their own training, inference, and fine-tuning pipeline," one programmer noted. The Rain AI team is offering IP licensing for its digital in-memory compute blocks and software stack, tailored for ultra-low-latency, high-performance on-device AI workloads including headsets, smartwatches, smart cars, smart controllers, and more, with IP for custom SoCs available now.

Beyond that, Rain AI co-designs fine-tuning algorithms (such as LoRA) with its hardware to enable efficient real-time training, which it says can improve AI accuracy by at least 10% in real-world deployment environments.
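For readers unfamiliar with LoRA, here is a minimal PyTorch-style sketch of the technique the article names: a frozen pretrained weight plus a small trainable low-rank update. The layer sizes, rank, and scaling are illustrative and do not reflect Rain AI's actual software stack.

```python
# Minimal LoRA-style adapter: freeze the base weight, train only two small
# low-rank matrices A and B. Dimensions and hyperparameters are illustrative.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, rank=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)                  # frozen pretrained weight
        self.base.bias.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_features, rank))  # zero init: no change at step 0
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale

layer = LoRALinear(512, 512)
out = layer(torch.randn(4, 512))   # during fine-tuning, only A and B receive gradients
```

Because only the small adapter matrices are updated, this kind of fine-tuning needs far less compute and memory traffic than retraining the full weight, which is why it pairs naturally with constrained on-device hardware.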

Ultimately, Rain AI is trying to build AI accelerator chips with integrated memory, but on a completely different architecture, along with its own "CUDA" so that software runs seamlessly on the hardware without consuming as much power as existing systems. According to the official website, this type of chip will be launched soon.

Shipping in two years

Rain AI wrote in a job posting: we can finally fit ChatGPT-sized models onto a chip the size of a thumbnail.

Rain AI's goals match the urgency of former CEO and founder Gordon Wilson: "Right now, Rain AI is probably the most ambitious AI chip company in the world."

Gordon Wilson, former CEO and founder of Rain AI. Source: Rain AI official website.

Gordon was born into a family of entrepreneurs who loved science fiction, and imagination was never in short supply in his daily life; he turned a childhood fascination with science fiction into a passion for entrepreneurship. At 25, Gordon attended the University of Florida, where he met fellow student Jack Kendall and professor Juan Claudio Nino; neither of them had any experience running a company, but all shared a keen interest in artificial intelligence.

In college, Gordon chaired the university's DSI (a multidisciplinary community), ran seminars on topics such as Python and machine learning, and mentored many projects; at that point he realized he could make a living from digital work. During this time he also saw that neural networks had been developing rapidly since 2012.

"How does the brain optimize, and what algorithms does it use? That's a question even Yoshua Bengio (a leading expert in AI and a Turing Award winner) is interested in," says Jack, who, after being inspired by AlexNet, saw the physical limitations of modern neural networks and turned his focus to the hardware side.

Over the years, Jack's work in Nino's research group at the University of Florida produced a method for connecting artificial neurons on a chip using coaxial nanowires, which laid the groundwork for Rain AI. It was in this group that Gordon and Jack got to know each other, and through Jack's interdisciplinary thinking they realized that the potential of AI could be explored simultaneously from perspectives such as neuroscience, physics, and chemistry.

In June 2017, the trio co-founded Rain Neuromorphics, with Gordon as CEO, Jack as CTO building the framework, and Juan as scientific advisor. In 2018 they joined Y Combinator's S18 batch to secure working capital, and in August they raised $5 million in seed funding from backers including YC, Liquid 2, and Soma Capital. Notably, Scott Gray and Sam Altman also placed bets in this round; at the time, Altman was still the CEO of YC.

The ties between OpenAI and Rain AI do not end there. According to an interview with Gordon, Rain AI has been working with the OpenAI team on algorithms since May 2020. OpenAI engineer Scott Gray is listed as an advisor on the official website, alongside Jackson Hu, a well-known Chinese semiconductor expert, and Arijit Raychowdhury, an expert in in-memory computing. Gordon mentioned on InsightFinder's podcast that they developed and researched neuromorphic techniques with deep learning pioneer Yoshua Bengio.

"One of the key constraints preventing us from deploying neural networks at scale is the cost of the hardware," Gordon said. In the company's early years he saw how much energy and time went into running neural networks on NVIDIA GPUs, "so we tried to bring our hardware closer to the human brain, doing a lot of crazy things, such as using nanowire research developed at Stanford University."

It wasn't all smooth sailing either: they hit a wall with flash memory and changed their hardware direction to ensure the durability of the base layer, the switching speed of the devices, and their power consumption. "Simulate the neural circuits of a real brain and use the natural matrix-inversion behavior to get faster second-order calculations," Jack said. In July 2020, a study the Rain AI team conducted with the Canadian research institute Mila showed that it is feasible to train neural networks entirely on analog hardware.
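The 2020 Rain-Mila work centered on equilibrium propagation, a two-phase, physics-friendly alternative to backpropagation. The toy sketch below shows the contrastive learning rule on a single linear layer; the network, energy function, and hyperparameters are simplified assumptions for illustration, not the paper's actual analog model.

```python
# Toy equilibrium propagation on one linear layer: relax to an energy minimum
# (free phase), relax again while weakly nudged toward the target (nudged phase),
# then update weights from the difference of local correlations.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 4, 2
W = rng.normal(scale=0.1, size=(n_in, n_out))    # the "synaptic" weights
beta, lr = 0.5, 0.1                               # nudging strength, learning rate

def settle(x, y=None, steps=200, dt=0.1):
    """Gradient-descend the energy E(s) = 0.5*||s||^2 - s.(W^T x), plus a weak pull toward y if given."""
    s = np.zeros(n_out)
    for _ in range(steps):
        grad = s - x @ W                          # dE/ds
        if y is not None:
            grad += beta * (s - y)                # gradient of the weak clamping cost
        s -= dt * grad
    return s

x, y = rng.normal(size=n_in), np.array([1.0, 0.0])
s_free = settle(x)                                # free-phase equilibrium
s_nudged = settle(x, y)                           # nudged-phase equilibrium
W += (lr / beta) * (np.outer(x, s_nudged) - np.outer(x, s_free))  # contrastive, purely local update
```

In this linear special case the update reduces to the classic delta rule as the nudge becomes small, which is the sense in which letting the physics settle "computes" the gradient without an explicit backward pass.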

Then, in October 2021, Rain AI launched a demonstration chip that can run neural network training and inference at very low power. In November 2022, Jack and Texas A&M University professor Su-In Yi published a paper in Nature Electronics showing that the memristor arrays Rain AI uses can deliver real improvements in energy efficiency.

Rain AI's progress can be described as "one major update a year". According to investor documents, Rain AI chips were to enter testing as early as this month (December), with the first products available in October 2024 and commercial shipments in 2025; however, the investment turmoil around Rain AI may make bringing the chip to market harder and delay fulfillment of its prepaid orders.

Rain AI has told investors it has been in talks with Google, Oracle, Meta, and Amazon. After multiple rounds of fundraising, Rain AI's idea is gradually moving from a laboratory project toward a promising large-scale commercial application.

It may be able to reduce costs and increase efficiency for OpenAI

Recently, Gordon resigned as CEO of Rain AI and stepped back into the role of executive advisor. The company's current CEO is former Chief Operating Officer William Passo, who holds a doctorate from Harvard Law School, is a Duke University alumnus, and is a former attorney at Freshfields Bruckhaus Deringer.

William Passo. Image source: Rain AI official website.

William Passo said in a blog post that Nvidia's GPUs are a key "bottleneck" for AI innovation, and that its monopoly has created an insurmountable barrier to entry for startups.

According to reports on a closed-door meeting in May, even OpenAI is severely GPU-constrained and has delayed many of its short-term plans; for example, the fine-tuning API is currently limited by GPU availability, and running and managing fine-tuning is very compute-intensive. ChatGPT users can also clearly feel that GPT-4 is "lazier" than GPT-3.5, which may be related to efforts to save compute.

In recent months, Altman has been in discussions with Middle Eastern investors to raise funds to start new chip companies that would help OpenAI and other companies diversify their chip supply, according to people familiar with the matter.

After all, Sam Altman himself believes that "the pace of AI progress may depend on new chip designs and supply chains." That said, OpenAI has also placed a $51 million order with Rain AI, highlighting its willingness to spend to secure chips for groundbreaking AI projects. If Rain AI's chips are successfully applied, they could provide "low-cost, high-performance" hardware for AI companies such as OpenAI and Anthropic, reconnecting the "thigh" that OpenAI has broken because of high costs.

Jack believes that now, when everyone is focused on the limitations of large models, is a good time to get closer to AGI. "Improvements in chip performance drive research into more efficient architectures and algorithms. That leads to a research peak, maybe in five years, maybe in ten. New hardware will also appear, and that's when we'll see AI on par with humans."
