Snowflake and NVIDIA to help businesses harness their data for Generative AI
At Snowflake Summit 2023, Snowflake and NVIDIA announced a partnership to give businesses of all sizes an accelerated path to creating customized generative AI applications with their own proprietary data, all securely within the Snowflake Data Cloud.
With the NVIDIA NeMo platform for developing large language models (LLMs) and NVIDIA GPU-accelerated computing, Snowflake will enable enterprises to use the data in their Snowflake accounts to build custom LLMs for advanced generative AI services, including chatbots, search and summarization. Because the LLMs can be customized without moving the data, proprietary information remains fully secured and governed within the Snowflake platform.
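The announcement does not detail the integration's programming interface, but a rough sketch of the workflow it describes, reading governed data with Snowflake's Snowpark for Python and fine-tuning an open model on NVIDIA GPUs, might look like the following. Hugging Face Transformers stands in here for NeMo, and the connection settings, the SUPPORT_TICKETS table and the base model are hypothetical placeholders rather than anything specified in the announcement.

# Illustrative sketch only: the Snowflake-NVIDIA integration announced at
# Summit 2023 is not a public API in this article. This shows the general
# shape of the workflow: read governed data with Snowpark for Python, then
# fine-tune an open LLM on GPUs. Hugging Face Transformers is used as a
# stand-in for NeMo; connection details, table and model are placeholders.
from snowflake.snowpark import Session
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

# 1. Connect to the Snowflake account and pull curated text for tuning.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<db>", "schema": "<schema>",
}).create()
rows = session.table("SUPPORT_TICKETS").select("TICKET_TEXT").to_pandas()

# 2. Tokenize the proprietary text into a training dataset.
tokenizer = AutoTokenizer.from_pretrained("gpt2")   # placeholder base model
tokenizer.pad_token = tokenizer.eos_token
dataset = Dataset.from_pandas(rows.rename(columns={"TICKET_TEXT": "text"}))

def tokenize(batch):
    out = tokenizer(batch["text"], truncation=True,
                    padding="max_length", max_length=512)
    out["labels"] = out["input_ids"].copy()          # causal-LM objective
    return out

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# 3. Fine-tune on NVIDIA GPUs (fp16 assumes a CUDA device is available).
model = AutoModelForCausalLM.from_pretrained("gpt2")
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="custom-llm", num_train_epochs=1,
                           per_device_train_batch_size=4, fp16=True),
    train_dataset=tokenized,
)
trainer.train()
trainer.save_model("custom-llm")

In the integration the companies describe, the equivalent of steps 2 and 3 would run with NeMo on NVIDIA accelerated computing inside the Data Cloud, so the training data never leaves Snowflake's governance boundary.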
“Snowflake’s partnership with NVIDIA will bring high-performance machine learning and artificial intelligence to our vast volumes of proprietary and structured enterprise data, a new frontier to bringing unprecedented insights, predictions and prescriptions to the global world of business,” said Frank Slootman, chairman and CEO of Snowflake.
“Data is essential to creating generative AI applications that understand the complex operations and unique voice of every company,” said Jensen Huang, founder and CEO of NVIDIA. “Together, NVIDIA and Snowflake will create an AI factory that helps enterprises turn their own valuable data into custom generative AI models to power groundbreaking new applications — right from the cloud platform that they use to run their businesses.”
NVIDIA and Snowflake’s collaboration represents a new opportunity for enterprises. It will enable them to use their proprietary data — which can range from hundreds of terabytes to petabytes of raw and curated business information — to create and fine-tune custom LLMs that power business-specific applications and services.
By integrating AI technology from Snowflake and NVIDIA, customers can quickly and easily build, deploy and manage customized applications that bring the power of generative AI to all parts of their business across a variety of use cases. In addition, expanding AI capabilities in the Data Cloud enables these customers to create generative AI applications where their governed data already resides, a benefit that significantly reduces cost and latency while maintaining the security of their data.
“More enterprises than we expected are training or at least fine-tuning their own AI models, as they increasingly appreciate the value of their own data assets,” said Alexander Harrowell, principal analyst for advanced computing for AI at technology research group Omdia. “Similarly, enterprises are beginning to operate more diverse fleets of AI models for business-specific applications. Supporting them in this trend is one of the biggest open opportunities in the sector.”