If you look past the financial headlines, what are today’s AI startups building?
News coverage of the AI boom has been hectic and has mainly fallen into a few categories: financial, Big Tech, concern and hype, and startup activity. The financial side is simple: Investors are working to put capital into companies that are either building new AI-powered products or embedding AI into existing ones.
The Big Tech collection is also easy to understand: Google and Microsoft are racing to own the cloud layer underneath major AI technology and building generative AI services into their existing productivity and search products. Meta, Amazon and Baidu are also busy. The list goes on.
Hype is not hard to find, nor is the doomer perspective. Reality will likely land somewhere in between. I suspect we'll grow accustomed to having AI-powered tools and services around us at all times, and some use cases will prove positive while others prove negative.
But these conversations often don’t actually discuss what is being built. So, this morning, I’ll go back through our recent generative AI coverage to provide a few notes on what folks are working to create. I am approaching the topic as a generalist who has a pro-tech, pro-progress and pro-capitalism perspective tempered by a dash of anxiety. Call me an optimist with an asterisk.
Fair enough? Let’s get to work.
Looking past the money
We’re going to look at Together, Contextual AI, Instabase, Adept and Cohere.
Together
Together last month raised $20 million and caught my eye since it is clearly on the open source side of the current AI divide.
There has long been a debate over the merits of closed source software versus open source — ask any of your friends who use Linux on their personal machines if you want some context. TechCrunch+ has a little history for you here.
The company is building “a cloud platform for running, training and fine-tuning open source models that the co-founders claim will offer scalable compute at ‘dramatically lower’ prices than the dominant vendors (e.g., Google Cloud, AWS, Azure).” That indicates a future where open source AI tech will be more generally accessible.
One of Together’s first projects, RedPajama, aims to foster a set of open source generative models, including “chat” models along the lines of OpenAI’s ChatGPT. The startup also intends to help other companies tune and employ open source generative AI models.
If the company’s projects work out, we could see more and better open source generative AI models being brought to market in a relatively simple manner. Sure, Together is not going to make headlines like OpenAI, but its work could help foster a strong open source generative AI ecosystem, providing a real counterweight to the AI majors and closed source generative AI models. Very cool.
Contextual AI
How large companies will use generative AI tools is a question worth pondering. Use cases are not too hard to come up with, but companies are still working out how to get solid results.
There are a few issues, though. Companies want to keep certain data away from AI models entirely, and other data segregated from people both inside and outside their walls. There is also the risk that generative AI can get a bit too creative at times, and it is often too generalist to be useful for more niche businesses right now.
Contextual AI wants to solve all those issues at once. The company claims to be building “the next generation of foundation models that provide fully customizable, trustworthy, privacy-aware AI.” That’s a big claim, but if the company succeeds, it would go a long way toward creating generative AI tools that corporations operating under complex and evolving global privacy and data-residency regimes can use without worrying.
TechCrunch has been working to understand the intersection of modern AI tools and the enterprise. Here’s our AI reporter Kyle Wiggers’ take when he covered Contextual AI:
Our colleague Ron Miller has mused about how generative AI’s future in the enterprise could be smaller, more focused language models. I don’t dispute that. But perhaps instead of exclusively fine-tuned, enterprise-focused LLMs, it’ll be a combination of “smaller” models and existing LLMs augmented with troves of company-specific documents.
There will be a few winners in this area. The enterprise is a huge space, but if the future that optimists expect does arrive, the corporate use case will require a lot more than just single-sign-on support. We should consider how quickly tools to make generative AI work in the enterprise will mature and reach the market, and if they can do so before interest wanes in the new tech.
Instabase
What happens when you are a company that uses AI to help customers summarize unstructured data and generative AI comes around? I suspect that there’s some neat stuff at the intersection of comprehension and generation. We’re again in the enterprise space and I am starting to notice a trend forming, at least at the generative AI startups that TechCrunch covers.
Instabase is not a new company. It has been in the market since 2015 and has interesting positioning: It already has enterprise customers and has worked through the data and privacy issues those customers bring, which gives it a solid base to build on.
Today, that work comes in the form of Instabase’s AI Hub, which offers a host of methods to extract data from documents and a way to talk with datasets.
Adept
Of all the companies that we’re discussing this morning, I am most excited about Adept. Here’s how Kyle explained the company:
Adept’s vision, at a high level, is to create what it refers to as an “AI teammate” trained to use a wide variety of different software tools and APIs. Instead of investigating ways to generate text or images, like startups OpenAI and Stability AI, Adept’s studying how people use computers — specifically how they browse the web and navigate software — to train an AI model that can turn text instructions into sets of digital actions.
When generative AI took off, I hoped it would eventually be daisy-chained with other tools to create something truly impressive. Here’s how I sketched the future when I wrote about Adept back in March (emphasis added):
- LLMs like GPT-4 are tools that anyone can use and get value from; evidence for this comes from their massive and seemingly sticky popularity.
- Companies, including startups, are working to bake today’s AI tools into products and services, some of which appear to be more operating system-level than stuck in the application layer.
- The same tech appears to be increasingly competent at writing code that it could, in theory, execute if given the proper tooling.
The rendezvous of our three bullet points is a computing system that can do more than answer queries — it can help build things. If simple development tasks can be handled by AI, then we can effectively treat software development in many cases as disposable instead of catastrophically expensive. (I presume, in this view, that LLM output can be translated into action, which I think is a small leap at the moment.)
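To make that "LLM output translated into action" leap concrete, here is a minimal sketch. The model call is stubbed out with a hard-coded string — the function name `fake_llm` and the snippet it returns are illustrative assumptions, not any vendor's actual API — and the point is only to show the shape of generate-then-execute:

```python
# Sketch of "LLM output translated into action": generate code, then run it.
# fake_llm is a stand-in for a real model API call (hypothetical, for illustration).

def fake_llm(prompt: str) -> str:
    """Stand-in for a real model call; returns a tiny generated program."""
    return "result = sum(range(1, 11))  # total of 1..10"

def run_generated_code(snippet: str) -> dict:
    """Execute model-generated code in an isolated namespace.

    A real system would sandbox this far more aggressively (subprocess,
    containers, etc.) -- exec() on untrusted model output is unsafe.
    """
    namespace: dict = {}
    # Expose only a whitelist of builtins to the generated snippet.
    exec(snippet, {"__builtins__": {"sum": sum, "range": range}}, namespace)
    return namespace

ns = run_generated_code(fake_llm("add the numbers 1 through 10"))
print(ns["result"])  # → 55
```

Even this toy version makes the gap visible: generation is easy, but the tooling around execution (sandboxing, permissions, verification) is where the real work lives.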
Your computer is going to get a lot smarter and more useful in the coming years. At this point, the operating system of the future is probably the one that manages to get as much AI power into it as quickly and safely as possible so that everyday folks are supercharged in their abilities. It could be as big a step-change in our output and capabilities as the internet. And I say that as someone who is terminally online.
So, yeah, what Adept is building has me pretty stoked.
The next step in this journey is probably alternative input mechanisms: Instead of Siri buckling under the weight of you asking it to send a text message, imagine if you could speak directly to your computer and have it do stuff quickly for you. For me, that would look like: Open a new Chrome tab and take me to the TechCrunch+ home page; mute Slack for the next 30 minutes so that I can focus; write a program that pulls all TechCrunch+ subscriber growth data from our analytics tool and charts it in a spreadsheet using TechCrunch colors. That sort of thing.
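The plumbing behind that kind of assistant is an intent-to-action mapping. Below is a toy sketch of the idea, assuming a keyword router in place of the model-driven intent parsing a real product like Adept would use; every phrase and action here is invented for illustration:

```python
# Toy "speak to your computer" dispatcher: map plain-language requests
# onto registered actions. A real system would use an AI model to parse
# intent; simple substring matching stands in for that step here.

from typing import Callable

ACTIONS: dict[str, Callable[[], str]] = {
    "open techcrunch": lambda: "opening a new tab at the TechCrunch+ home page",
    "mute slack": lambda: "muting Slack notifications for 30 minutes",
}

def dispatch(utterance: str) -> str:
    """Find the first registered phrase in the request and run its action."""
    key = utterance.lower()
    for phrase, action in ACTIONS.items():
        if phrase in key:
            return action()
    return "sorry, I don't know how to do that yet"

print(dispatch("Please open TechCrunch in a new tab"))
```

The hard part, of course, is the step this sketch skips: turning free-form speech into the right action reliably, which is exactly the model training Adept is pursuing.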
I'm excited about this particular use case: Generative AI could become the digital assistant we've long hungered for.
Cohere
It appears that the “build enterprise-ready AI models” effort is a real locus of startup activity and venture investment. That’s what Cohere is working on, too. What sticks out to me is the company’s approach to cloud agnosticism and the fact that it “takes a hands-on approach, working with customers to create custom LLMs based on their proprietary data.”
I’m curious how much hand-holding enterprises will need to get up to speed with generative AI, and whether that will come from in-house experts or as a service offered by AI software providers. If it’s the latter, it would resemble the professional services that SaaS companies offer: generally priced near gross-margin neutral, but a useful lever for growing high-value ARR. Perhaps Cohere will blaze the trail here for us to watch and learn from.
That list was far more enterprise-focused than I initially hoped for, but we shouldn’t be too surprised. Companies have been collecting libraries of data for years now, and firms like Databricks and Snowflake have long helped them store and crunch that data. If generative AI is only as good as the data it is trained on, then the people with the most data will probably be able to do the most.
Here’s hoping that we get some more startups like Adept, though. We need companies that take more speculative — and therefore more fantastic — bets on what today’s LLMs can really do.