
What impact did advances at the foundation model level have on Pinecone and its customers, and how does that relate to creating an ML 'stack'?

Edo Liberty

Founder & CEO at Pinecone

A stack—that's exactly how people use it. 

People use OpenAI—and other great machine learning and language models like Hugging Face, like Cohere—but people really see OpenAI and Pinecone as peanut butter and jelly. They're like “Oh, this is the natural combo.” 

They'll take language, whether text or documents or paragraphs or search queries, pass it through OpenAI's models, get embeddings, store those embeddings in Pinecone, and query them by similarity, by relevance, and so on.
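A minimal sketch of that flow, assuming the `openai` and `pinecone` Python clients; the model name, index name, sample documents, and API key are placeholders rather than anything prescribed in the conversation:

```python
from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI()                        # reads OPENAI_API_KEY from the environment
pc = Pinecone(api_key="YOUR_PINECONE_API_KEY")
index = pc.Index("docs")                        # an existing Pinecone index

documents = [
    "Pinecone is a managed vector database.",
    "Embeddings map text to points in a high-dimensional space.",
]

# 1. Pass text through an embedding model to get vectors.
resp = openai_client.embeddings.create(model="text-embedding-3-small", input=documents)
vectors = [
    {"id": f"doc-{i}", "values": d.embedding, "metadata": {"text": documents[i]}}
    for i, d in enumerate(resp.data)
]

# 2. Store those embeddings in Pinecone.
index.upsert(vectors=vectors)

# 3. Embed a query the same way and retrieve by similarity.
query = "What does Pinecone do?"
q_emb = openai_client.embeddings.create(
    model="text-embedding-3-small", input=[query]
).data[0].embedding
results = index.query(vector=q_emb, top_k=3, include_metadata=True)
for match in results.matches:
    print(match.score, match.metadata["text"])
```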

People build what's called retrieval augmented generation (RAG). So instead of searching for a single answer in a Q&A system, they use Pinecone to retrieve the most semantically relevant results and then use generative AI models to synthesize those results into one answer. Suddenly, people can build a ChatGPT kind of thing on their own data, and it's liberating.
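A rough sketch of that RAG pattern, continuing from the snippet above (same `openai_client` and `index`); the chat model name and prompt wording are illustrative assumptions, not a recipe from the interview:

```python
question = "How do teams combine OpenAI and Pinecone?"

# 1. Embed the question and retrieve the most relevant passages from Pinecone.
q_emb = openai_client.embeddings.create(
    model="text-embedding-3-small", input=[question]
).data[0].embedding
matches = index.query(vector=q_emb, top_k=5, include_metadata=True).matches
context = "\n".join(m.metadata["text"] for m in matches)

# 2. Ask a generative model to synthesize the retrieved passages into one answer.
completion = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(completion.choices[0].message.content)
```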

When I was in grade school, you had to summarize some topic, and back then, Google wasn't a thing, so you had to go and pick up five books and then you'd summarize them. People are now doing the same thing with AI. They search and then they find 10 relevant answers and they synthesize them into actual text.

So this ability to freely move between text, images, and their numeric representations, to store them in Pinecone, search them, retrieve and annotate them, to give the model access to hundreds of millions or billions of actual records, actual embeddings of these documents, and to build applications with those two components ends up being incredibly powerful.

And again, people do all sorts of crazy stuff with it, really building these amazing, amazing applications. Just the creativity that comes out of it, it's unreal. It's kind of crazy, because, well, I don't want to share exact numbers, but we've basically tripled the number of new customers per day that onboard onto Pinecone. And I don't think I've met any two customers that do exactly the same thing. It's kind of nutty: a lot of semantic search, a lot of question answering and so on, but it's always with a twist. They always have their own flavor, they have their own data, they have their own model, they have some business logic on top of it.
