Former Intel CEO Pat Gelsinger is using DeepSeek instead of OpenAI at his startup, Gloo

DeepSeek's open source AI reasoning model, R1, sparked an Nvidia sell-off and sent its consumer app to the top of the app stores.
DeepSeek last month said it trained a model using some 2,000 of Nvidia's H800 GPUs, at a cost of about $5.5 million, over roughly two months. Last week, it published a paper showing that its latest model matches the performance of the world's most advanced reasoning models. Those models are trained in data centers filled with billions of dollars' worth of Nvidia's faster, far more expensive AI chips.
The combination of DeepSeek's high performance and low cost has sent the tech industry into a frenzy. Pat Gelsinger, for example, took to X to post, "Thank you DeepSeek team."
Gelsinger is, of course, a longtime hardware engineer, the recently departed CEO of Intel, and the current chairman of his own IPO-bound startup, Gloo, a messaging and engagement platform for churches. He left Intel in December after a four-year run in which he attempted to challenge Nvidia with Intel's alternative AI GPU, Gaudi 3.
Gelsinger wrote that DeepSeek should remind the tech industry of three important lessons: lower costs mean wider adoption; ingenuity thrives under constraints; and "Open wins. DeepSeek will help reset the increasingly closed world of foundational AI model work," he wrote. OpenAI and Anthropic are both closed source.
Gelsinger told TechCrunch that R1 is so impressive that Gloo has already decided not to adopt and pay for OpenAI. Gloo is building an AI service called Kallm, which will offer a chatbot and other services.
"My Gloo engineers are running R1 today," he said. "They could've run o1; well, they can only access o1 through its APIs."
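To illustrate the distinction Gelsinger is drawing, here is a minimal sketch, not Gloo's actual stack: an open-weights model like R1 can be served on hardware you control (here, assuming a local Ollama install hosting the deepseek-r1 weights and exposing its OpenAI-compatible endpoint), while o1 is reachable only through OpenAI's hosted API.

```python
# Illustrative sketch only -- assumes Ollama is installed locally and
# `ollama run deepseek-r1` has pulled the open R1 weights. Not Gloo's setup.
from openai import OpenAI

prompt = [{"role": "user", "content": "Summarize this sermon in one sentence."}]

# Open weights: R1 served from your own machine via Ollama's
# OpenAI-compatible endpoint. No external API, no per-token fee.
local = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
r1 = local.chat.completions.create(model="deepseek-r1", messages=prompt)
print(r1.choices[0].message.content)

# Closed weights: o1 is only available as a hosted service, so every call
# goes to OpenAI's API (reads OPENAI_API_KEY from the environment).
hosted = OpenAI()
o1 = hosted.chat.completions.create(model="o1", messages=prompt)
print(o1.choices[0].message.content)
```

The two calls look nearly identical, which is the point: the difference is not the interface but who holds the weights and who gets paid per token.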
Instead, within two weeks, Gloo expects to have rebuilt Kallm from scratch "with our own foundational model that's all open source," he said. "That's exciting."
He said he thinks DeepSeek will make AI so affordable that AI won't just be everywhere; good AI will be everywhere. "I want better AI in my Oura Ring. I want better AI in my hearing aid. I want more AI in my phone. I want better AI in my embedded devices, like the voice recognition in my EV," he said.
Gelsinger's delighted reaction stands in contrast to that of others who were less thrilled that high-performing foundational reasoning models are now dramatically more affordable. AI was supposed to be getting more expensive, not less.
Some reacted by insisting DeepSeek must have fudged its numbers somehow, and that training must have cost far more than claimed. Some speculated it couldn't have used high-end chips, given U.S. export restrictions on AI chips to China. Others poked holes in its performance, finding spots where other models did better. Still others believe that OpenAI's next model, o3, will outperform R1 when it is released, and that the established order will be restored.
Gelsinger shrugs it all off. "You'll never have full transparency, given that most of the work was done in China," he said. "But still, all the evidence is that its training was 10x-50x cheaper."
DeepSeek proves that AI can be moved forward "through engineering creativity, not by throwing more hardware and compute resources at the problem. That's exciting," he said.
Gelsinger also shrugs off the usual objections to relying on a Chinese developer, such as concerns about privacy and censorship.
"Having the Chinese remind us of the power of open ecosystems is a bit of an embarrassment for our community," he said.