Snowflake announces advancements to Snowflake Cortex AI and Snowflake ML

Snowflake pitches Cortex AI as ‘easy, efficient and trusted’ enterprise AI for thousands of organizations.

At its annual user conference, Snowflake Summit 2024, Snowflake announced new innovations and enhancements to Snowflake Cortex AI.

These innovations included chat experiences empowering organizations to develop chatbots within minutes.

Snowflake also said it is further democratizing how any user can customize AI for specific industry use cases through a new no-code interactive interface, access to industry-leading LLMs and serverless fine-tuning.

The path to operationalizing models is being accelerated with an integrated experience for ML through Snowflake ML – enabling developers to build, discover, and govern models and features across the ML lifecycle.

Snowflake’s unified platform for genAI and ML is pitched as allowing every part of the business to extract more value from their data – while enabling full security, governance and control to deliver ‘responsible, trusted’ AI at scale.

Baris Gultekin, Head of AI, Snowflake, said: “Our latest advancements to Snowflake Cortex AI remove the barriers to entry so all organizations can harness AI to build powerful AI applications at scale and unlock unique differentiation with their enterprise data in the AI Data Cloud.”

Snowflake is unveiling two new chat capabilities, Snowflake Cortex Analyst and Snowflake Cortex Search – both due in public preview soon – allowing users to develop chatbots against their structured and unstructured data in a matter of minutes, without operational complexity.

Cortex Analyst, built with Meta’s Llama 3 and Mistral Large models, allows businesses to securely build applications on top of their analytical data in Snowflake.

In addition, Cortex Search harnesses state-of-the-art retrieval and ranking technology from Neeva (acquired by Snowflake in May 2023) alongside Snowflake Arctic embed, so users can build applications against documents and other text-based datasets through enterprise-grade hybrid search – a combination of vector and text search – delivered as a service. Snowflake also announced Snowflake Cortex Guard, coming soon, which leverages Meta’s Llama Guard.

With Cortex Guard, Snowflake says it’s further unlocking trusted AI for enterprises, helping customers ensure that available models are safe and usable.
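For developers who want a feel for the building blocks behind these chat capabilities before Cortex Analyst and Cortex Search reach public preview, the sketch below calls the already-available SNOWFLAKE.CORTEX.COMPLETE SQL function from Snowpark Python. The connection parameters and the SUPPORT_TICKETS table are hypothetical, and the preview services will expose their own interfaces.

```python
# Minimal sketch: calling a Cortex LLM function over table data from Snowpark Python.
# The SUPPORT_TICKETS table is hypothetical; Cortex Analyst and Cortex Search expose
# their own interfaces once they reach public preview.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Summarize each ticket with a Cortex-hosted model (Mistral Large shown here).
rows = session.sql("""
    SELECT
        TICKET_ID,
        SNOWFLAKE.CORTEX.COMPLETE(
            'mistral-large',
            CONCAT('Summarize this support ticket in one sentence: ', TICKET_TEXT)
        ) AS SUMMARY
    FROM SUPPORT_TICKETS
    LIMIT 10
""").collect()

for row in rows:
    print(row["TICKET_ID"], row["SUMMARY"])
```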

In addition to enabling the easy development of custom chat experiences, Snowflake is providing customers with pre-built AI-powered experiences, which are fueled by Snowflake’s world-class models. With Document AI (also available soon), users can easily extract content like invoice amounts or contract terms from documents using Snowflake’s industry-leading multimodal LLM, Snowflake Arctic-TILT, which outperforms GPT-4 and secured a top score in the DocVQA benchmark test – the standard for visual document question answering.
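Document AI itself was still listed as ‘available soon’ at the time of the announcement, so the sketch below is an illustration only: it shows the same kind of question-answering extraction using the separate, already-available SNOWFLAKE.CORTEX.EXTRACT_ANSWER function over plain text in a hypothetical CONTRACTS table, rather than the multimodal Arctic-TILT path.

```python
# Illustration only: Document AI (Arctic-TILT) was not yet generally available at the
# time of this announcement, so this uses the separate SNOWFLAKE.CORTEX.EXTRACT_ANSWER
# function over plain text instead. CONTRACTS is a hypothetical table; `session` is a
# snowflake.snowpark.Session created as in the previous sketch.
rows = session.sql("""
    SELECT
        CONTRACT_ID,
        SNOWFLAKE.CORTEX.EXTRACT_ANSWER(
            CONTRACT_TEXT,
            'What is the total contract value?'
        ) AS EXTRACTED_ANSWER
    FROM CONTRACTS
    LIMIT 5
""").collect()

for row in rows:
    print(row["CONTRACT_ID"], row["EXTRACTED_ANSWER"])
```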

Organizations including Northern Trust harness Document AI to intelligently process documents at scale to lower operational overhead with higher efficiency. Snowflake is also advancing its breakthrough text-to-SQL assistant, Snowflake Copilot, which combines the strengths of Mistral Large with Snowflake’s proprietary SQL generation model to accelerate productivity for every SQL user.

Other initiatives include:

Snowflake AI & ML Studio

Snowflake Cortex AI provides customers with a robust set of state-of-the-art models from leading providers including Google, Meta, Mistral AI and Reka, in addition to Snowflake’s top-tier open source LLM Snowflake Arctic, to accelerate AI development.

Snowflake is further democratizing how any user can bring these powerful models to their enterprise data with the new Snowflake AI & ML Studio – a no-code interactive interface for teams to get started with AI development and productize their AI applications faster.

Cortex Fine-Tuning

To help organizations further enhance LLM performance and deliver more personalized experiences, Snowflake is introducing Cortex Fine-Tuning, accessible through AI & ML Studio or a simple SQL function. This serverless customization is available for a subset of Meta and Mistral AI models. These fine-tuned models can then be easily used through a Cortex AI function, with access managed using Snowflake role-based access controls.
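A rough sketch of what the SQL entry point looks like follows; the exact SNOWFLAKE.CORTEX.FINETUNE argument order and options may differ from what ultimately shipped, and the model and table names are hypothetical.

```python
# Hedged sketch of serverless fine-tuning through the SNOWFLAKE.CORTEX.FINETUNE SQL
# function; the exact argument order may differ from what shipped. TRAIN_DATA is a
# hypothetical table with PROMPT and COMPLETION columns, and `session` is a
# snowflake.snowpark.Session as in the earlier sketches.
job = session.sql("""
    SELECT SNOWFLAKE.CORTEX.FINETUNE(
        'CREATE',
        'my_db.my_schema.support_mistral',           -- name for the tuned model
        'mistral-7b',                                -- base model to customize
        'SELECT prompt, completion FROM TRAIN_DATA'  -- training data query
    ) AS JOB_ID
""").collect()
print(job[0]["JOB_ID"])  # poll progress with SNOWFLAKE.CORTEX.FINETUNE('DESCRIBE', <job_id>)

# Once training completes, the tuned model is called like any other Cortex model,
# e.g. SNOWFLAKE.CORTEX.COMPLETE('my_db.my_schema.support_mistral', <prompt>),
# with access managed through Snowflake role-based access controls.
```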

Streamline Model and Feature Management with Unified, Governed MLOps Through Snowflake ML

Once ML models and LLMs are developed, most organizations struggle with continuously operating them in production on evolving data sets. Snowflake ML brings MLOps capabilities to the AI Data Cloud, so teams can seamlessly discover, manage, and govern their features, models, and metadata across the entire ML lifecycle – from data pre-processing to model management. These centralized MLOps capabilities also integrate with the rest of Snowflake’s platform, including Snowflake Notebooks and Snowpark ML for a simple end-to-end experience.

Snowflake Model Registry and Feature Store

Snowflake’s suite of MLOps capabilities includes the Snowflake Model Registry, which allows users to govern the access and use of all types of AI models so they can deliver more personalized experiences and cost-saving automations with trust and efficiency. In addition, Snowflake is announcing the Snowflake Feature Store, an integrated solution for data scientists and ML engineers to create, store, manage and serve consistent ML features for model training and inference, and ML Lineage, so teams can trace the usage of features, datasets and models across the end-to-end ML lifecycle.
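For a sense of how these pieces fit together in code, here is a minimal sketch using the snowflake-ml-python Registry and Feature Store APIs; the database, schema, warehouse and table names are hypothetical, and API details may vary by library version.

```python
# Minimal sketch with snowflake-ml-python; API details may vary by library version.
# `session` is a snowflake.snowpark.Session, and the database, schema, warehouse and
# table names below are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from snowflake.ml.registry import Registry
from snowflake.ml.feature_store import CreationMode, Entity, FeatureStore, FeatureView
from snowflake.snowpark.functions import sum as sum_

# Train a toy churn model locally, then log it so it is versioned and governed in Snowflake.
X = pd.DataFrame({"TENURE_MONTHS": [1, 12, 24, 36], "TOTAL_SPEND": [10.0, 200.0, 900.0, 1500.0]})
y = [1, 0, 0, 0]
clf = LogisticRegression().fit(X, y)

registry = Registry(session=session, database_name="ML", schema_name="REGISTRY")
model_version = registry.log_model(
    clf,
    model_name="CHURN_MODEL",
    version_name="V1",
    sample_input_data=X,  # used to infer the model signature
)

# Register a reusable, customer-keyed feature view in the Feature Store.
fs = FeatureStore(
    session=session,
    database="ML",
    name="FEATURES",
    default_warehouse="ML_WH",
    creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
)
customer = Entity(name="CUSTOMER", join_keys=["CUSTOMER_ID"])
fs.register_entity(customer)

spend_fv = FeatureView(
    name="CUSTOMER_SPEND",
    entities=[customer],
    feature_df=session.table("RAW.SALES")
    .group_by("CUSTOMER_ID")
    .agg(sum_("AMOUNT").alias("TOTAL_SPEND")),
    refresh_freq="1 day",  # kept fresh so training and inference see consistent features
)
fs.register_feature_view(spend_fv, version="V1")
```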

Also announced at Snowflake Summit 2024 were new innovations to Snowflake’s single, unified platform that provide thousands of organizations with increased flexibility and interoperability across their data; new tools that accelerate how developers build in the AI Data Cloud; a new collaboration with NVIDIA that customers and partners can harness to build customized AI data applications in Snowflake; and the Polaris Catalog, a vendor-neutral, fully open catalog implementation for Apache Iceberg.
