February 29, 2024

The tech industry loves garage startup stories. From Hewlett-Packard to Google, the stories of companies that started from nothing and became giants have inspired generations of entrepreneurs.

But startups trying to succeed with today’s hottest technology — the artificial intelligence used in chatbots like ChatGPT and Google Bard — require a lot of money and computing power, and may make those inspiring stories a thing of the past.

In 2019, Aidan Gomez and Nick Frosst left Google to create an artificial intelligence startup called Cohere in Toronto to compete with their former employer. A few months later, they returned to Google and asked if it would sell them the vast computing power needed to build their own artificial intelligence technology. After Google CEO Sundar Pichai personally approved the arrangement, the tech giant gave them what they wanted.

“It’s Game of Thrones,” said David Katz, a partner at Radical Ventures, Cohere’s first investor. Big players like Google, Microsoft and Amazon control the chips, he added. “They’re controlling computing power.”

Building groundbreaking AI startups is difficult without the backing of “hyperscalers,” who control the vast data centers capable of running AI systems. That has allowed the industry’s giants to once again dominate in what many expect to be the most significant shift in the tech industry in decades.

OpenAI, the startup behind ChatGPT, recently raised $10 billion in funding from Microsoft. It will funnel much of that money back into Microsoft, which pays for the time on massive computer server farms run by big companies. These machines, which span thousands of specialized computer chips, are critical to improving and expanding the skills of ChatGPT and similar technologies.

Competitors can’t keep up with OpenAI unless they have similar computing power. Cohere recently raised $270 million, bringing its total funding to more than $440 million. It will use most of the money to buy computing power from companies like Google.

Other startups have made similar arrangements, most notably a Silicon Valley firm called Anthropic, founded in 2021 by a group of former OpenAI researchers; Character.AI, founded by two of Google’s top researchers; and Inflection AI, founded by a former Google executive. Inflection raised $1.3 billion in funding last week, bringing its total funding to $1.5 billion.

At Google, Mr. Gomez was part of a small research team that designed the Transformer, the fundamental technology used to create chatbots like ChatGPT and Google Bard.

The Transformer is a powerful example of what scientists call a neural network, a mathematical system that can learn skills by analyzing data. Neural networks have been around for years, helping to power everything from voice-enabled digital assistants like Siri to instant translation services like Google Translate.

The Transformer took this idea into new territory, running on hundreds or even thousands of computer chips and analyzing far more data, far faster.

Using this technique, companies like Google and OpenAI have begun building systems that can learn from large amounts of digital text, including Wikipedia articles, digital books and chat logs. As these systems analyzed more and more data, they learned to generate text on their own, including term papers, blog posts, poetry and computer code.

These systems, known as large language models, now underpin chatbots such as Google Bard and ChatGPT.

Long before ChatGPT arrived, Mr. Gomez left Google to start his own company with Mr. Frosst and another Toronto entrepreneur, Ivan Zhang. The goal was to build a large language model comparable to Google’s.

At Google, he and his fellow researchers had access to an almost unlimited amount of computing power. After leaving the company, he needed something similar. So he and his co-founders bought it from Google, which sells access to the same chips through its cloud computing service.

Over the next three years, Cohere built a large language model comparable to those of almost any competitor. Now it’s selling the technology to other businesses. The idea is to provide any company with the technology it needs to build and run its own AI applications, from chatbots to search engines to personal tutors.

“Our strategy is to build a platform that others can build on and experiment with,” Mr. Gomez said.

OpenAI offers a similar service built around GPT-4, which many businesses are already using to build chatbots and other applications. The technology can analyze, generate and edit text, and it is quickly expanding into images and sound. OpenAI is preparing a version of GPT-4 that can examine a photo, describe it instantly and even answer questions about it.

Microsoft Chief Executive Satya Nadella said the company’s arrangement with OpenAI is part of a mutually beneficial relationship it has long cultivated with smaller rivals. “I grew up in a company that was always doing these kinds of deals with other companies,” he told The New York Times earlier this year.

As the industry races to catch up to GPT-4, entrepreneurs, investors and experts are debating who will emerge as the ultimate winner. Most agree that OpenAI is leading the field. But Cohere and a handful of other companies are developing similar technology.

The tech giants are in a good position because they have the massive resources needed to push these systems further than anyone else. Google also holds a patent on the Transformer, the underlying technology behind the artificial intelligence systems Cohere and many other companies are building.

But there’s a wildcard: open source software.

Meta, another giant with the computing power needed to build the next wave of artificial intelligence, recently open-sourced its latest large language model, meaning anyone can reuse it and build on top of it. Many in the field believe this free software will allow anyone to compete.

“Having the collective intelligence of every researcher on the planet will beat any company,” says Amr Awadallah, CEO of AI startup Vectara and a former Google executive. But even then, he added, they will still have to pay for access to the data centers of larger competitors.
