After delays, Elon Musk prepares for the acid test of his generative AI

by Andrea

When Elon Musk announced the launch of his artificial intelligence, Grok, in October 2023, competitors such as Alphabet (Google’s parent company) and OpenAI were already at the forefront of the market. A few billion dollars, the world’s largest AI supercomputer and several delays later, the owner of Tesla is preparing for one of his biggest tests in this market.

“Grok 3 is coming soon,” the billionaire wrote in a post on his own social network, X (formerly Twitter), last Friday (the 3rd). The new version is his biggest bet so far, the result of ten times the computational power used for Grok 2.

The launch has already slipped past Musk’s earlier predictions, and there have been no major updates on its progress. In an interview with conservative psychologist Jordan B. Peterson in July 2024, he said he expected it to be “the most powerful AI in the world at that moment.”


But Alphabet, OpenAI (and its investor Microsoft), Anthropic (and its investor Amazon) and Meta have also poured truckloads of money into their AI initiatives to claim that spot.

Largest AI supercomputer

Since Musk created xAI, his artificial intelligence company, two investment rounds have been announced. In the most recent one, on the eve of Christmas, asset managers such as BlackRock and Sequoia Capital joined a US$6 billion funding round that could value the company at up to US$40 billion, according to the New York Times.

Logo of xAI, Elon Musk’s company (Photo: Dado Ruvic/Illustration/Reuters)

“It takes a lot of computing power,” Musk said in an X post.


Grok’s vaunted “rebellious” temperament, in the style of the controversial billionaire who conceived it, is being refined on a supercomputer with 100,000 Nvidia Hopper Tensor Core GPUs, the largest cluster dedicated to AI training in the world.

Every big tech company is betting on generative AI and cloud processing, but xAI appears to have gained an edge with its cluster.

In a supercomputer, all the cards must operate in near-perfect synchrony. With fiber-optic interconnects, that is only possible when the cards are within a certain distance of one another, which would normally impose a physical limit of around 50,000 GPUs. xAI is already in the process of doubling its capacity to 200,000.


The limits of ‘brute force’

“A question being asked in the world of artificial intelligence is how long AIs will keep growing and improving through sheer computational brute force before hitting a ceiling. Companies are finding it difficult. Grok’s own release has been delayed, and a new version of Claude [Anthropic’s AI model] was also postponed,” says Pedro Calais, professor and undergraduate coordinator at XP Educação.

It is still not clear to experts how much the models keep improving as computational power grows beyond a given point, an improvement ceiling known as the scaling limit. So far, the strategy has worked, but with computing power expanding at breakneck speed, that limit may be approaching.

Although investors have pressured the largest publicly traded technology companies with the infrastructure to develop generative AI, the companies remain convinced of the strategy.


In a blog post published last Friday, the vice-chairman and president of Microsoft said the company expects to invest US$80 billion in building data centers for AI training. Microsoft is the provider of cloud services for OpenAI’s processing.

In key benchmarks (tests that compare different models on specific capabilities, such as math problems), Grok 2 was competitive with its rivals, although some of these exams are viewed with skepticism by researchers, since models can be fine-tuned for those specific tests.
