In a bid to control the escalating costs of its partnership with OpenAI, Microsoft has begun developing its own in-house conversational artificial intelligence (AI). The move comes as the technology giant looks for ways to ease the heavy financial burden of funding OpenAI, which it has backed since 2019. While the partnership has been fruitful, producing breakthroughs like ChatGPT, the growing expenses have prompted Microsoft to explore more cost-effective alternatives.
According to reports from The Information, a Microsoft executive overseeing a team of 1,500 researchers has instructed some of them to work on an internal conversational AI project. The primary goal is to create large language models that are both cheaper to run and smaller than those developed by OpenAI. Even if these models don't match OpenAI's performance, the strategy is that they will still cut operational costs, according to current and former employees familiar with the matter.
Microsoft has already begun integrating parts of this in-house AI software into its existing products. Notably, it powers a chatbot in the Bing search engine whose capabilities closely resemble those of ChatGPT. The primary motivation behind these efforts, as emphasized by The Information, is the pressing need to rein in the ever-increasing expense of running current AI models.
Earlier this year, Microsoft relaunched Bing with OpenAI's conversational AI technology built in. The ChatGPT-style chat experience played a pivotal role in pushing Bing past 100 million daily active users for the first time in its history in March.
Microsoft’s concerns over costs have been a recurring theme in its partnership with OpenAI. In 2019, Microsoft invested approximately $1 billion in OpenAI, a crucial step in the development of the technology underpinning ChatGPT, which went on to revolutionize the market three years later. ChatGPT became the fastest application to surpass 100 million users, a record later eclipsed only by the launch of Threads.
At the beginning of this year, Microsoft further solidified its alliance with OpenAI, becoming the exclusive cloud provider for the infrastructure that runs OpenAI's products and services. This new agreement is estimated to have involved an additional $10 billion investment from Microsoft.
Despite the undeniable success of ChatGPT, the mounting costs remain a significant concern. Developing ChatGPT proved so expensive that OpenAI reportedly lost approximately $540 million in 2022, and running the service is estimated to cost at least $700,000 per day, with a significant portion of that spending going to servers, according to an April estimate by SemiAnalysis.
In a multi-pronged approach to mitigating these costs, Microsoft has embarked on several initiatives, including developing its own custom chips in collaboration with AMD and even exploring nuclear power for the data centers that house Microsoft Cloud and AI.
As Microsoft takes steps to ensure the sustainability of its AI initiatives, the tech world watches closely to see how these developments will shape the future of conversational AI and the broader landscape of artificial intelligence.