Egnyte, a cloud content management and storage platform, is introducing new generative AI tools and outlining its roadmap for further AI integration. Notably, Egnyte has been building AI into its platform for over a decade and is not simply capitalizing on the current AI trend.
The company is now introducing a ChatGPT-like interface that lets customers interact with the content stored in Egnyte. Users can use it to summarize documents or to generate key points and summaries from transcripts of audio and video.
While AI has long been part of Egnyte's platform, it previously worked behind the scenes on tasks like privacy, security, and infrastructure management. This announcement moves AI to center stage, making it accessible to any business user who wants to interact with content stored in Egnyte.
David Spitz, Chief Strategy Officer at Egnyte, explains that the company is opening its intelligence engine to every user on the platform. Through a chat-based interface, users will be able to ask questions about stored content and get answers.
Future plans include the ability to query across multiple documents and folders to find content related to a particular subject. Spitz likens this feature to a "natural language-based search," and it is expected to bring powerful search capabilities to the content repository.
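Egnyte has not disclosed how this cross-document search will be implemented. As a rough illustration of the general idea, here is a minimal sketch that ranks stored documents against a natural-language query. It uses a toy bag-of-words vector and cosine similarity purely for demonstration; a production system would use learned embeddings from a language model, and all document names and contents below are invented.

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": a bag-of-words term-frequency vector.
    # Real systems would use dense embeddings from a language model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, documents):
    # Rank stored documents by similarity to a natural-language query.
    q = embed(query)
    ranked = sorted(documents.items(),
                    key=lambda kv: cosine(q, embed(kv[1])),
                    reverse=True)
    return [name for name, _ in ranked]

# Hypothetical repository contents for illustration only.
docs = {
    "q3-report.docx": "quarterly revenue and financial results for the sales team",
    "trial-notes.txt": "clinical trial observations for the life sciences study",
    "onboarding.pdf": "new employee onboarding checklist and HR policies",
}
print(search("financial results", docs))  # most relevant document first
```

The point of the sketch is only that a free-text query can be matched against every document in a repository at once, which is what distinguishes this kind of search from navigating folders by hand.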
Egnyte's co-founder and CEO, Vineet Jain, shared that the solutions are being built on foundational models such as GPT-3.5 and GPT-4, using Microsoft Azure. These models will need fine-tuning to meet specific customer needs, especially in fields like life sciences and financial services where specialized terminology is common.
While following the same path as many other enterprise companies, Egnyte is adding its own spin to stay competitive. Like others, it is offering the new tooling to a limited number of customers while fine-tuning it in production environments. The company has not committed to a general availability date, though it is likely to come later this year.