
Snowflake Partners with Mistral AI to Bring Industry-Leading Language Models to Enterprises Through Snowflake Cortex


Insights

The strategic partnership between Snowflake and Mistral AI represents a significant development in the data analytics and artificial intelligence sectors. By integrating Mistral AI's Large Language Models (LLMs) into Snowflake's Data Cloud, the collaboration is poised to enhance the capabilities of enterprise customers in harnessing AI for data analysis. This move could potentially lead to increased demand for Snowflake's services, thereby impacting its revenue and market share. Investors may view this partnership as a positive signal for Snowflake's commitment to innovation and its ability to maintain a competitive edge in the rapidly evolving AI landscape.

Furthermore, the investment by Snowflake Ventures into Mistral AI's Series A round indicates confidence in Mistral AI's technology and its market potential. This financial backing could accelerate Mistral AI's growth and product development, which in turn may create synergies for Snowflake's offerings. The market will likely monitor the adoption rates of Mistral AI's LLMs within Snowflake's customer base, as this could serve as an indicator of the partnership's success and its impact on Snowflake's performance.

The announcement highlights the technical advancements of Mistral AI's LLMs, including the flagship Mistral Large model, which boasts top performance benchmarks, unique reasoning capacities and multilingual fluency. These features are critical in differentiating Snowflake's offerings in a market where AI and machine learning are becoming increasingly commoditized. By providing access to state-of-the-art AI models, Snowflake is likely to attract a diverse clientele looking to implement advanced AI-powered applications.

Additionally, the serverless experience offered by Snowflake Cortex, which leverages NVIDIA's computing platform and tools like the NVIDIA Triton Inference Server, simplifies the AI deployment process. This ease of use could lead to broader adoption among organizations that may not have the in-house expertise to manage complex AI infrastructure. This democratization of AI could result in a wider user base for Snowflake, potentially leading to increased data storage and processing needs, which are key revenue drivers for the company.

The emphasis on security, privacy and governance in the partnership between Snowflake and Mistral AI is particularly relevant given the increasing concerns around data protection in AI applications. Snowflake's commitment to maintaining the security and privacy boundaries of the Data Cloud while enabling access to powerful LLMs is a crucial selling point for enterprises that are wary of potential data breaches or misuse.

The partnership's focus on keeping enterprise data within Snowflake's security perimeter while enabling AI functionalities addresses a significant market need for secure AI solutions. This approach could help Snowflake to gain trust and expand its customer base, especially among industries that handle sensitive information. The long-term implications of this focus on security could shape industry standards for AI implementations, potentially leading to more stringent regulations and expectations for data protection within AI services.

  • Mistral AI’s newest and most powerful model, Mistral Large, is available in the Snowflake Data Cloud for customers to securely harness generative AI with their enterprise data
  • Snowflake Ventures partners with Mistral AI to expand generative AI capabilities and empower more developers to seamlessly tap into the power of leading large language models
  • Snowflake Cortex LLM Functions, now in public preview, enables users to quickly, easily, and securely build generative AI apps

No-Headquarters/BOZEMAN, Mont. & PARIS--(BUSINESS WIRE)-- Snowflake (NYSE: SNOW), the Data Cloud company, and Mistral AI, one of Europe’s leading providers of AI solutions, today announced a global partnership to bring Mistral AI’s most powerful language models directly to Snowflake customers in the Data Cloud. Through this multi-year partnership, which includes a parallel investment in Mistral’s Series A from Snowflake Ventures, Mistral AI and Snowflake will deliver the capabilities enterprises need to seamlessly tap into the power of large language models (LLMs), while maintaining security, privacy, and governance over their most valuable asset — their data.

Snowflake Partners with Mistral AI to Bring Industry-Leading Language Models to Enterprises Through Snowflake Cortex (Graphic: Business Wire)

With the new Mistral AI partnership, Snowflake customers gain access to Mistral AI’s newest and most powerful LLM, Mistral Large, with benchmarks making it one of the world’s top-performing models. Beyond the benchmarks, Mistral AI’s new flagship model has unique reasoning capacities, is proficient in code and mathematics, and is fluent in five languages — French, English, German, Spanish, and Italian — in line with Mistral AI’s commitment to promoting the cultural and linguistic specificities of generative AI technology. It can also process hundreds of pages of documents in a single call. In addition, Snowflake customers gain access to Mixtral 8x7B, Mistral AI’s open-source model that exceeds OpenAI’s GPT-3.5 in speed and quality on most benchmarks, alongside Mistral 7B, Mistral AI’s first foundation model, optimized for low latency with a low memory requirement and high throughput for its size. Mistral AI’s models are now available to customers in public preview as part of Snowflake Cortex, Snowflake’s fully managed LLM and vector search service that enables organizations to accelerate analytics and quickly build AI apps securely with their enterprise data.
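As an illustrative sketch of what this access can look like in practice, the short Python example below uses the Snowpark API to call Mistral Large through the Cortex COMPLETE function. The connection parameters and prompt are placeholders, and the exact function name and 'mistral-large' model identifier should be confirmed against Snowflake's Cortex documentation for the public preview.

```python
# Illustrative sketch only: connection details and the prompt are placeholders,
# and the COMPLETE function / 'mistral-large' identifier are assumptions based
# on the Cortex public preview described above; verify against current docs.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
}).create()

# Call Mistral Large through the Cortex COMPLETE function; the data and the
# model call both stay inside Snowflake's security and governance perimeter.
row = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE("
    "'mistral-large', "
    "'Summarize the main risks mentioned in our latest customer contracts.')"
).collect()[0]
print(row[0])
```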

“By partnering with Mistral AI, Snowflake is putting one of the most powerful LLMs on the market directly in the hands of our customers, empowering every user to build cutting-edge, AI-powered apps with simplicity and scale,” said Sridhar Ramaswamy, CEO of Snowflake. “With Snowflake as the trusted data foundation, we’re transforming how enterprises harness the power of LLMs through Snowflake Cortex so they can cost-effectively address new AI use cases within the security and privacy boundaries of the Data Cloud.”

“Snowflake’s commitments to security, privacy, and governance align with Mistral AI’s ambition to put frontier AI in everyone’s hands and to be accessible everywhere. Mistral AI shares Snowflake’s values for developing efficient, helpful, and trustworthy AI models that advance how organizations around the world tap into generative AI,” said Arthur Mensch, CEO and co-founder of Mistral AI. “With our models available in the Snowflake Data Cloud, we are able to further democratize AI so users can create more sophisticated AI apps that drive value at a global scale.”

At Snowday 2023, Snowflake Cortex first announced support for industry-leading LLMs for specialized tasks such as sentiment analysis, translation, and summarization, alongside foundation LLMs — starting with Meta AI’s Llama 2 model — for use cases including retrieval-augmented generation (RAG). Snowflake is continuing to invest in its generative AI efforts by partnering with Mistral AI and advancing the suite of foundation LLMs in Snowflake Cortex, providing organizations with an easy path to bring state-of-the-art generative AI to every part of their business. To deliver a serverless experience that makes AI accessible to a broad set of users, Snowflake Cortex eliminates long procurement cycles and the complex management of GPU infrastructure by partnering with NVIDIA to deliver a full-stack accelerated computing platform that leverages the NVIDIA Triton Inference Server, among other tools.

With Snowflake Cortex LLM Functions now in public preview, Snowflake users can leverage AI with their enterprise data to support a wide range of use cases. Using specialized functions, any user with SQL skills can leverage smaller LLMs to cost-effectively address specific tasks such as sentiment analysis, translation, and summarization in seconds. For more complex use cases, Python developers can go from concept to full-stack AI apps such as chatbots in minutes, combining the power of foundation LLMs — including Mistral AI’s LLMs in Snowflake Cortex — with chat elements, in public preview soon, within Streamlit in Snowflake. This streamlined experience also holds true for RAG with Snowflake’s integrated vector functions and vector data types, both in public preview soon, while ensuring the data never leaves Snowflake’s security and governance perimeter.
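To make the workflow above concrete, here is a minimal sketch, again in Python via the Snowpark API, of the kind of task-specific calls described in this section. The customer_reviews table and its review_text column are hypothetical, and the SENTIMENT, TRANSLATE, and SUMMARIZE function signatures should be checked against the Cortex documentation as the public preview evolves.

```python
# Illustrative sketch only: the table, column, and connection settings are
# hypothetical; confirm the Cortex function signatures in Snowflake's docs.
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
}).create()

# Apply the task-specific Cortex LLM Functions to a column of customer
# reviews from plain SQL: sentiment scoring, French-to-English translation,
# and summarization, all without the data leaving Snowflake.
rows = session.sql("""
    SELECT
        review_text,
        SNOWFLAKE.CORTEX.SENTIMENT(review_text)              AS sentiment_score,
        SNOWFLAKE.CORTEX.TRANSLATE(review_text, 'fr', 'en')  AS english_text,
        SNOWFLAKE.CORTEX.SUMMARIZE(review_text)              AS summary
    FROM customer_reviews
    LIMIT 10
""").collect()

for row in rows:
    print(row)
```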

Snowflake is committed to furthering AI innovation not just for its customers and the Data Cloud ecosystem, but the wider technology community. As a result, Snowflake recently joined the AI Alliance, an international community of developers, researchers, and organizations dedicated to promoting open, safe, and responsible AI. Through the AI Alliance, Snowflake will continue to comprehensively and openly address both the challenges and opportunities of generative AI in order to further democratize its benefits.

Learn More:

  • Register for Snowflake Data Cloud Summit 2024, June 3-6, 2024 in San Francisco, to get the latest on Snowflake’s AI innovations and upcoming announcements, here.
  • Learn how organizations are bringing generative AI and LLMs to their enterprise data in this video.
  • Dig into how all users can harness the power of generative AI and LLMs within seconds through Snowflake Cortex.
  • Stay on top of the latest news and announcements from Snowflake on LinkedIn and Twitter.

Forward Looking Statements

This press release contains express and implied forward-looking statements, including statements regarding (i) Snowflake’s business strategy, (ii) Snowflake’s products, services, and technology offerings, including those that are under development or not generally available, (iii) market growth, trends, and competitive considerations, and (iv) the integration, interoperability, and availability of Snowflake’s products with and on third-party platforms. These forward-looking statements are subject to a number of risks, uncertainties and assumptions, including those described under the heading “Risk Factors” and elsewhere in the Quarterly Reports on Form 10-Q and the Annual Reports on Form 10-K that Snowflake files with the Securities and Exchange Commission. In light of these risks, uncertainties, and assumptions, actual results could differ materially and adversely from those anticipated or implied in the forward-looking statements. These statements speak only as of the date the statements are first made and are based on information available to us at the time those statements are made and/or management's good faith belief as of that time. Except as required by law, Snowflake undertakes no obligation, and does not intend, to update the statements in this press release. As a result, you should not rely on any forward-looking statements as predictions of future events.

Any future product information in this press release is intended to outline general product direction. This information is not a commitment, promise, or legal obligation for us to deliver any future products, features, or functionality; and is not intended to be, and shall not be deemed to be, incorporated into any contract. The actual timing of any product, feature, or functionality that is ultimately made available may be different from what is presented in this press release.

© 2024 Snowflake Inc. All rights reserved. Snowflake, the Snowflake logo, and all other Snowflake product, feature and service names mentioned herein are registered trademarks or trademarks of Snowflake Inc. in the United States and other countries. All other brand names or logos mentioned or used herein are for identification purposes only and may be the trademarks of their respective holder(s). Snowflake may not be associated with, or be sponsored or endorsed by, any such holder(s).

About Snowflake

Snowflake enables every organization to mobilize their data with Snowflake’s Data Cloud. Customers use the Data Cloud to unite siloed data, discover and securely share data, power data applications, and execute diverse AI/ML and analytic workloads. Wherever data or users live, Snowflake delivers a single data experience that spans multiple clouds and geographies. Thousands of customers across many industries, including 691 of the 2023 Forbes Global 2000 (G2K) as of January 31, 2024, use the Snowflake Data Cloud to power their businesses. Learn more at snowflake.com.

For Snowflake

Kaitlyn Hopkins

Senior Product PR Lead, Snowflake

press@snowflake.com

For Mistral AI

Alexandra van Weddingen

Alva Conseil

avanweddingen@alvaconseil.com

Source: Snowflake Inc.

FAQ

What is the partnership between Snowflake and Mistral AI about?

The partnership aims to bring Mistral AI's powerful language models directly to Snowflake customers in the Data Cloud, empowering enterprises to tap into large language models while ensuring security, privacy, and governance over their data.

What are the benefits of Mistral AI's newest LLM, Mistral Large, for Snowflake customers?

Snowflake customers gain access to Mistral Large, one of the world's top-performing models with unique reasoning capacities, proficiency in code and mathematics, fluency in five languages, and the ability to process hundreds of pages of documents in a single call.

What is Snowflake Cortex and how does it enable organizations to accelerate analytics and build AI apps securely?

Snowflake Cortex is Snowflake's fully managed LLM and vector search service that allows organizations to quickly build AI apps securely with their enterprise data, leveraging industry-leading LLMs and foundation models for various use cases.

How does Snowflake Cortex LLM Functions in public preview benefit Snowflake users?

Snowflake users can leverage AI with their enterprise data using specialized functions to address tasks like sentiment analysis, translation, and summarization in seconds, while Python developers can create full-stack AI apps like chatbots in minutes.

What steps has Snowflake taken to make AI more accessible to a broad set of users?

Snowflake Cortex eliminates the long-cycled procurement and complex management of GPU infrastructure by partnering with NVIDIA, delivering a serverless experience that leverages NVIDIA Triton Inference Server and other tools to make AI more accessible.
