Generative AI in Azure Machine Learning: Operationalizing App Development for AI Transformation

In the age of generative AI, leaders find themselves at an intersection of innovation and purpose. The question that reverberates through boardrooms and data science conferences is this: How can this new constellation of cutting-edge technologies be harnessed to advance organizational objectives with creativity and business agility, while balancing social responsibility? Generative AI is not just a tool; its promise, combined with unique business data, fuels competitive advantage.

The Azure Machine Learning team has been at the forefront of innovation, infusing Generative AI capabilities into our platform over the last year, including the addition of many open-source (OSS) models to our model catalog. At Microsoft Inspire, we announced the addition of Meta’s Llama 2 models and the Falcon models from the Technology Innovation Institute.

The field of MLOps has evolved to incorporate LLMOps: the practice of selecting, fine-tuning, and managing large transformer models to meet a variety of needs within an organization, and in particular of monitoring these solutions at scale. This represents a new operational skill set for many companies. We’ve invested in this specialized area by adding several capabilities to our Azure Machine Learning platform.

This month, we are excited to announce several new additions to our Generative AI portfolio in Azure Machine Learning.

Discover, customize, and deploy vision and multi-modality models in the Azure Machine Learning model catalog

We’re constantly looking for new, effective ways for machine learning professionals and developers to easily discover, prompt engineer, fine-tune, and deploy pre-trained large AI models. At Build, we announced the public preview of foundation models in the Azure Machine Learning model catalog. The model catalog serves as a central hub to explore collections of foundation models from Hugging Face, Meta, and Azure OpenAI Service. Today marks another milestone: the preview of a diverse suite of new open-source vision models in the model catalog, spanning image classification, object detection, and image segmentation.

With these new capabilities, developers will be able to easily integrate powerful vision models into their applications and drive AI innovation across industries such as predictive maintenance, smart retail solutions, and autonomous vehicles.
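As a sketch of what deployment from the catalog can look like, the following CLI v2 deployment YAML targets a managed online endpoint. The model name, version, endpoint name, and instance type below are placeholders, not a definitive configuration; replace them with a vision model chosen from the catalog and a SKU sized for it.

```yaml
# deployment.yml -- illustrative managed online deployment for a
# catalog vision model (placeholder names throughout)
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: vision-deploy
endpoint_name: my-vision-endpoint
# Foundation models in the catalog are referenced via the shared registry:
model: azureml://registries/azureml/models/<model-name>/versions/<version>
instance_type: Standard_NC6s_v3
instance_count: 1
```

With the endpoint created first (`az ml online-endpoint create --name my-vision-endpoint`), the deployment can then be submitted with `az ml online-deployment create --file deployment.yml --all-traffic`.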


Figure 1. Discover vision models in Azure Machine Learning model catalog.

In addition, we are thrilled to announce significant updates to AutoML for Images and NLP (natural language processing), integral components of the Azure Machine Learning suite. The updated architecture transitions from a monolithic design to a modular, component-based training pipeline, offering enhanced scalability, flexibility, debuggability, and reliability. Now you can easily incorporate the latest foundation models from the model catalog into your projects, tailor pipelines to specific tasks such as object detection or text classification, and save on computational costs through efficient component reuse. Whether you are fine-tuning existing models or exploring new architectures, these updates make it easier to execute, monitor, and scale your machine learning projects—all while leveraging the latest innovations in the AI landscape.
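For orientation, a component-based AutoML image job can be described in CLI v2 YAML along these lines. The compute name, data asset, and limits here are illustrative placeholders rather than a recommended setup:

```yaml
# automl-image-job.yml -- illustrative AutoML image object detection job
$schema: https://azuremlschemas.azureedge.net/latest/autoMLJob.schema.json
type: automl
experiment_name: automl-image-object-detection
task: image_object_detection
primary_metric: mean_average_precision
compute: azureml:gpu-cluster            # placeholder compute target
training_data:
  type: mltable
  path: azureml:training-mltable:1      # placeholder data asset
target_column_name: label
limits:
  timeout_minutes: 60
```

The job is submitted like any other CLI v2 job, e.g. `az ml job create --file automl-image-job.yml`.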

– Read the announcement blog for more information about new vision models.

– Read the AutoML & NLP Blog to dive deeper into AutoML enhancements.

Introducing a code-first experience in prompt flow for streamlined development

Large Language Models (LLMs) have enabled many intelligent tasks that were previously infeasible. As a result, there is a strong need for building AI applications that leverage LLMs. As LLMs rapidly evolve, prompt engineering and LLMOps play a crucial role in harnessing the full potential of LLMs with tailored AI-powered solutions that meet specific business needs.

To streamline the iterative processes of tuning quality through prompt engineering, we introduced Azure Machine Learning prompt flow at Build 2023, an interactive studio experience to design, experiment, evaluate, and deploy LLM workflows.

Prompt flow offers a range of benefits that help users transition from ideation to experimentation and ultimately to production-ready LLM-infused applications. When talking with customers, the three most common questions we hear are: how do I manage prompt versions, how do I integrate with CI/CD processes, and how do I export and deploy my prompt flows?

To address these questions and extend the platform toward more robust LLMOps, we’re introducing a code-first experience in prompt flow through our SDK, CLI, and VS Code extension, now available in preview. Developers can export a flow folder from the prompt flow UI and check it into their preferred code repository, ensuring that their workflows and prompts are version-controlled and tracked efficiently. The prompt flow SDK lets developers test flows locally and inspect single-run outputs; they can also submit flow batch runs to a cloud workspace and rigorously evaluate the results, giving them the capability to handle extensive testing scenarios.

To facilitate a smooth CI/CD pipeline, the prompt flow CLI and SDK offer seamless integration with Azure DevOps and GitHub Actions. The prompt flow extension for VS Code enhances the development experience by allowing rapid testing, refinement, and debugging of flows, all in an interface that mirrors the studio UI. Users can also import a local flow directly into the Azure Machine Learning UI, or export a flow folder to a local directory with the CLI, keeping local development in sync with the cloud and harnessing the full power of Azure Machine Learning.
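A typical local loop with the `pf` CLI might look like the following sketch; the flow folder and data file names are placeholders for assets you would export from the studio or author locally.

```shell
# Illustrative prompt flow CLI loop (preview); folder and file names are
# placeholders.
pip install promptflow promptflow-tools          # install the SDK and CLI

pf flow test --flow ./my-chat-flow               # run the flow once locally
pf run create --flow ./my-chat-flow \
    --data ./test-data.jsonl --name batch-run-1  # batch run over a dataset
pf run show-details --name batch-run-1           # inspect per-line outputs
```

The same flow folder can then be committed to a repository and exercised from Azure DevOps or GitHub Actions as part of CI/CD.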

– Check out this demo video to learn how code-first experiences in prompt flow work in practice.

– Visit our documentation to learn more about prompt flow.

Monitor your generative AI applications in production with Azure Machine Learning

Monitoring models in production is an essential part of the AI lifecycle. Changes in data and consumer behavior can influence your application over time, resulting in outdated AI systems that produce undesired outputs, which can negatively impact business outcomes and expose organizations to compliance and reputational risks. Unfortunately, monitoring generative AI applications for safety, quality, and performance is arduous without pre-built tooling. Starting today, in preview, Azure Machine Learning enables organizations to monitor their generative AI applications in production.

Users can now collect production data using Model Data Collector, analyze key safety and quality evaluation metrics on a recurring basis, receive timely alerts about critical issues, and visualize the results over time in a rich dashboard within the Azure Machine Learning studio.

This capability integrates with Azure Machine Learning’s pre-built evaluation, annotation, and measurement pipelines to evaluate generation safety and quality. You can now monitor your application for key metrics such as coherence, fluency, groundedness, relevance, and similarity, while configuring your own custom thresholds.
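To make the thresholding idea concrete, here is a minimal sketch of the logic, not the Azure Machine Learning API: each quality metric is scored (e.g. on a 1-5 scale), and any metric whose aggregate score falls below its configured threshold is flagged for an alert. All names here are hypothetical.

```python
# Illustrative sketch of custom-threshold monitoring for generation
# quality metrics; names and the 1-5 scale are assumptions, not the
# Azure Machine Learning API.

DEFAULT_THRESHOLDS = {
    "coherence": 4.0,
    "fluency": 4.0,
    "groundedness": 4.0,
    "relevance": 4.0,
    "similarity": 4.0,
}

def breached_metrics(scores, thresholds=DEFAULT_THRESHOLDS):
    """Return metrics whose aggregate score fell below the threshold."""
    return sorted(m for m, s in scores.items()
                  if m in thresholds and s < thresholds[m])

# Example: groundedness has drifted below its threshold.
print(breached_metrics({"coherence": 4.6, "groundedness": 3.2, "fluency": 4.1}))
# -> ['groundedness']
```

In the real service these thresholds are configured in the monitoring setup and evaluated on a recurring schedule, with breaches surfaced as notifications in the studio dashboard.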

For performance, using prompt flow’s system metrics, you can also view and track token consumption for your applications, such as total prompt and completion token counts across your application usage.
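The accounting behind those numbers is simple to sketch: per-request prompt and completion counts are summed into run-level totals. The record field names below are hypothetical, chosen to mirror common usage-reporting conventions.

```python
# Illustrative token-accounting sketch; field names are assumptions,
# not prompt flow's actual schema.

def summarize_token_usage(requests):
    """Aggregate prompt/completion token counts across request records."""
    prompt = sum(r.get("prompt_tokens", 0) for r in requests)
    completion = sum(r.get("completion_tokens", 0) for r in requests)
    return {
        "prompt_tokens": prompt,
        "completion_tokens": completion,
        "total_tokens": prompt + completion,
    }

usage = summarize_token_usage([
    {"prompt_tokens": 120, "completion_tokens": 40},
    {"prompt_tokens": 95, "completion_tokens": 60},
])
print(usage)
# -> {'prompt_tokens': 215, 'completion_tokens': 100, 'total_tokens': 315}
```

Tracking these totals over time helps correlate cost with usage patterns and spot regressions introduced by prompt changes.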

Together, these capabilities can help you better identify and diagnose issues, understand usage patterns, and inform how you optimize your application with prompt engineering. Ultimately, model monitoring for generative AI enables more accurate, responsible, and compliant applications in production.

Visit our documentation to learn more about this feature.


Figure 2. Within the monitoring overview page, users can configure monitoring for their application, view overall performance, and review notifications.


Figure 3. Within the monitoring details page, users can view time-series metrics, histograms, detailed performance, and resolve notifications.

Next Steps

Our announcements this month showcase how Azure Machine Learning is on a continuous cycle of improvement as we listen to our customers and advance our platform. In just a few months at Microsoft Ignite, we are set to unveil several new capabilities that will help data scientists and developers unlock the power of Generative AI within their organizations. We hope you will join us to learn more. In the meantime, try Azure Machine Learning for free, and we encourage you to join our Insiders Program. If you’re a current Azure Machine Learning customer, please visit our documentation website for additional information.

Learn more here!

Article originally posted here by Richard Tso. Reposted with permission.

ODSC Community

The Open Data Science community is passionate and diverse, and we always welcome contributions from data science professionals! All of the articles under this profile are from our community, with individual authors mentioned in the text itself.