Contextual AI emerged from stealth mode today with $20 million in seed funding. The startup aims to build the "next generation" of large language models (LLMs) for enterprise use. Backed by investors including Bain Capital Ventures, Lightspeed, Greycroft, and SV Angel, the company wants to clear several roadblocks that have kept generative AI from catching on with businesses.
Co-founded by Douwe Kiela and Amanpreet Singh, Contextual AI plans to harness a technique known as retrieval-augmented generation (RAG), which Kiela researched extensively while at Meta. The technique improves LLMs' performance by incorporating external sources like files and webpages.
With RAG, the LLM can generate "context-aware" responses: it retrieves relevant data from those external sources and packages the results with the original prompt. This approach addresses LLMs' typical weaknesses around attribution and customization, and it makes integrating data sources easier without necessarily requiring model retraining or fine-tuning.
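In broad strokes, the RAG pattern described here looks something like the sketch below: rank a small document store against the user's question, prepend the top matches (with source names, for attribution) to the prompt, and send the combined text to the model. This is a minimal illustration only; the toy keyword scorer, the sample documents, and the stubbed `call_llm` function are assumptions for demonstration, not Contextual AI's actual system.

```python
# Minimal sketch of retrieval-augmented generation (RAG):
# retrieve relevant external documents, pack them into the prompt
# alongside the user's question, then ask the model to answer with citations.
from typing import List, Tuple

# Hypothetical document store standing in for an enterprise's files and webpages.
DOCUMENTS = [
    ("policy.pdf", "Employees may carry over up to five unused vacation days per year."),
    ("handbook.html", "Remote work requests must be approved by a direct manager."),
    ("faq.txt", "The IT helpdesk is available on weekdays from 8am to 6pm."),
]

def score(query: str, text: str) -> int:
    """Toy relevance score: how many query words appear in the document."""
    words = set(query.lower().split())
    return sum(1 for w in words if w in text.lower())

def retrieve(query: str, k: int = 2) -> List[Tuple[str, str]]:
    """Return the k documents most relevant to the query."""
    ranked = sorted(DOCUMENTS, key=lambda doc: score(query, doc[1]), reverse=True)
    return ranked[:k]

def build_prompt(query: str, sources: List[Tuple[str, str]]) -> str:
    """Package the retrieved passages with the original prompt, keeping source names for attribution."""
    context = "\n".join(f"[{name}] {text}" for name, text in sources)
    return (
        "Answer using only the context below and cite the source in brackets.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (an API or a locally hosted LLM)."""
    return f"<model response to a {len(prompt)}-character prompt>"

if __name__ == "__main__":
    question = "How many vacation days can I carry over?"
    sources = retrieve(question)
    print(call_llm(build_prompt(question, sources)))
```

In a production system the keyword scorer would typically be replaced by an embedding-based vector search, but the overall flow, retrieve, assemble the prompt, generate, is the same.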
Kiela argues that the company's approach will address these issues and unlock the true potential of language models for enterprise use cases. He adds that enterprises need assurance about the accuracy, reliability, and traceability of generative AI's answers.
The startup is reportedly in discussions with Fortune 500 companies to trial its technology. The newly secured seed funding will mainly be directed towards product development, including investment in a compute cluster to train LLMs.