Google and OpenAI are Walmarts besieged by fruit stands.
While OpenAI is now synonymous with machine learning, it may soon face a new kind of threat: rapidly multiplying open source projects that push the state of the art and leave the deep-pocketed but unwieldy corporations in their dust. At minimum, this threat will keep the dominant players on their toes. We have come to expect this kind of disruption on something like a weekly basis, but the situation was put in stark perspective by a widely shared leaked document.
As the memo puts it: "We have no moat, and neither does OpenAI." While GPT-4 and other proprietary models have claimed the lion's share of attention, and indeed income, the head start their makers gained through funding and infrastructure matters less than it once seemed. The pace of OpenAI's releases may look blistering by the standards of major software launches, and GPT-3 and GPT-4 were certainly hot on each other's heels, but those releases are still occurring on the scale of months and years.
According to the memo, a foundation language model from Meta leaked in fairly rough form in March. Within weeks, people tinkering on laptops and penny-a-minute servers had added core capabilities like instruction tuning, multiple modalities, and reinforcement learning from human feedback. The big players were probably poking around the code too, but they didn't replicate the level of collaboration and experimentation occurring in subreddits. The titanic computation problem that seemed to pose an insurmountable obstacle, a moat, to challengers is already a relic of a different era of AI development.
Few would have guessed that smaller could actually be better. The business paradigm being pursued by OpenAI and others is a direct descendant of the SaaS model: you offer carefully gated access to high-value software or a service through an application programming interface, and you charge for it. It's a straightforward and proven approach that makes perfect sense when you've invested hundreds of millions into a single product.
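The shape of that gated-access model is familiar to any developer. As a minimal sketch (the endpoint, model name, and key format below are illustrative placeholders, not any specific vendor's API): the provider never ships the model itself, only a metered, authenticated request path to it.

```python
import json

# Illustrative only: a generic API-gated model service, not a real vendor's endpoint.
API_URL = "https://api.example.com/v1/completions"

def build_request(api_key: str, prompt: str, model: str = "big-proprietary-model"):
    """Assemble the authenticated, metered request a SaaS client would send.

    The API key is the 'gate': without it, there is no access to the model,
    which stays on the provider's servers.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "prompt": prompt, "max_tokens": 64})
    return API_URL, headers, body

url, headers, body = build_request("sk-...", "Summarize this contract clause.")
```

The open source alternative inverts this: the weights sit on your own machine, and there is no gate left to charge at.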
If GPT-4 generalizes well to answering questions about precedents in contract law, great; never mind that a huge proportion of its "intellect" is dedicated to parroting the style of every author who ever published a work in the English language. GPT-4 is like a Walmart: nobody really wants to go there, so the company makes sure there's no other option. But customers are starting to wonder, why am I walking through 50 aisles of junk to buy a few apples?
It beggars the premise of their business: that these systems are so hard to build and run that they have to do it for you. It starts to look like these companies picked and engineered a version of AI that fit their existing business model, not the other way around. Once upon a time, you had to offload heavy computation to a mainframe and your terminal was just a display. That was a different era, of course, and we have long since been able to fit the whole application on a personal computer. That process has repeated many times since, as our devices have multiplied their capacity for computation.
These days, when something is done on a supercomputer, everyone knows it's just a matter of time before it can be done on ordinary hardware. For the incumbents, that time came a lot quicker than expected, because they weren't the ones doing the optimizing; at this rate, they may never be. That doesn't mean they're out of luck.
Google didn't get where it is by being the best, not for a long time anyway. Being a Walmart has its benefits. Companies don't want to have to hunt down the bespoke solution that performs the task they want 30% faster if they can get a decent price from their existing vendor and not rock the boat. Meanwhile, people are iterating on LLaMA so fast that they're running out of camelids to name the variants after.
(Incidentally, thanks to those developers for giving me an excuse to scroll through hundreds of pictures of cute, tawny vicuñas instead of working.) Few enterprise IT departments are going to cobble together an implementation of Stability's open source derivative-in-progress of a quasi-legal leaked Meta model. They have a business to run! But at the same time, I stopped using the paid options for image editing and creation years ago; the open source alternatives like Gimp and Paint.net have gotten that good.
At this point, the argument goes in a different direction. The distance from the first situation to the second is going to be much shorter than anyone thought, and there doesn't appear to be a dam in sight. The memo's prescription is to embrace it: open up, publish, collaborate, share, compromise.
Its authors conclude that Google should establish itself as a leader in the open source community by cooperating with, rather than ignoring, the broader conversation. That likely means taking some uncomfortable steps, like publishing the model weights, which in turn means giving up some control over its models. They consider that compromise inevitable.
We can't hope to both drive innovation and control it. There's too much opportunity out there in Cerebral Valley.