We couldn’t be more excited to introduce you to the first group of experts, practitioners, core contributors, and boundary breakers who will be joining us at ODSC East to share their expertise. Learn more about our first-announced sessions coming to the event this April 23rd-25th below.
Causal AI: from Data to Action
Dr. Andre Franca | CTO | connectedFlow
Explore the world of Causal AI for data science practitioners, with a focus on understanding cause-and-effect relationships within data to drive optimal decisions. Takeaways include:
- The dangers of using post-hoc explainability methods as tools for decision-making, and where traditional ML falls short.
- How to determine what is causal and what isn't, with a brief introduction to structure learning and causal discovery methods.
- Accurately estimating the impact we can make on our system and using this knowledge to derive the best possible actions to take.
Get practical experience and return to work ready to start implementing these concepts in your daily work.
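The gap between correlation and causation that the session addresses can be made concrete with a small simulation (an illustrative sketch, not material from the talk): a hidden confounder biases the naive estimate of a treatment's effect, while adjusting for the confounder recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Hidden confounder influences both treatment and outcome.
z = rng.normal(size=n)
t = 0.8 * z + rng.normal(size=n)             # treatment
y = 2.0 * t + 3.0 * z + rng.normal(size=n)   # true causal effect of t is 2.0

def ols(X, y):
    """Least-squares coefficients for y ~ X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive: regress y on t alone -> biased upward by the confounder.
naive = ols(np.column_stack([t, np.ones(n)]), y)[0]

# Adjusted: control for z -> recovers the causal effect.
adjusted = ols(np.column_stack([t, z, np.ones(n)]), y)[0]

print(f"naive estimate:    {naive:.2f}")     # well above 2.0
print(f"adjusted estimate: {adjusted:.2f}")  # close to 2.0
```

In real data the confounder is rarely observed directly, which is where the structure learning and causal discovery methods covered in the session come in.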
The AI Paradigm Shift: Under the Hood of Large Language Models
Valentina Alto | Azure Specialist – Data and Artificial Intelligence | Microsoft
Develop an understanding of Generative AI and Large Language Models, including the architecture behind them, their functioning, and how to leverage their unique conversational capabilities. You will also become familiar with the concept of LLM as a reasoning engine that can power your applications, paving the way to a new landscape of software development in the era of Generative AI. In this session, you’ll have the opportunity to explore examples of LLM-powered applications in Python using popular AI orchestrators, such as LangChain.
Enabling Complex Reasoning and Action with ReAct, LLMs, and LangChain
Giuseppe Zappia | Principal Machine Learning Specialist Solutions Architect | AWS
Shelbee Eigenbrode | Principal Solutions Architect | AWS
Learn how to employ the ReAct technique, which uses human-like reasoning traces to create action plans, to allow an LLM to determine where to find information to service different types of user queries, using LangChain to orchestrate the process. You’ll see how it uses Retrieval Augmented Generation (RAG) to answer questions based on external data, as well as other tools for performing more specialized tasks to enrich the output of your LLM.
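The Thought → Action → Observation loop at the core of ReAct can be sketched in a few lines. This is a hand-rolled toy, not LangChain's actual agent executor (which the session uses): a scripted policy stands in for the LLM, and a dictionary lookup stands in for a RAG retriever, so the control flow is easy to follow.

```python
# Minimal sketch of the ReAct loop (Thought -> Action -> Observation).
# A scripted policy stands in for the LLM; in practice LangChain drives
# this loop with a real model and real tools.

def search_docs(query: str) -> str:
    """Toy retrieval tool standing in for a RAG vector store."""
    docs = {"odsc east dates": "ODSC East runs April 23rd-25th."}
    return docs.get(query.lower(), "no result")

def calculator(expression: str) -> str:
    return str(eval(expression))  # toy only; never eval untrusted input

TOOLS = {"search_docs": search_docs, "calculator": calculator}

def scripted_policy(question: str, observations: list) -> tuple:
    """Stand-in for the LLM: choose the next action from history."""
    if not observations:
        return ("search_docs", "odsc east dates")
    return ("finish", observations[-1])

def react_loop(question: str, max_steps: int = 5) -> str:
    observations = []
    for _ in range(max_steps):
        action, arg = scripted_policy(question, observations)
        if action == "finish":
            return arg
        observations.append(TOOLS[action](arg))  # observation fed back in
    return "gave up"

print(react_loop("When is ODSC East?"))  # -> ODSC East runs April 23rd-25th.
```

Swapping the scripted policy for an LLM call, and the dictionary for a vector store, gives the pattern the session builds out in full.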
Everything About Large Language Models: Pre-training, Fine-tuning, RLHF & State of the Art
Chandra Khatri | Chief Scientist and Head of AI | Got It AI
Generative Large Language Models like GPT-4 have revolutionized the entire tech ecosystem. In this talk, you’ll explore how these foundation models are trained, and learn the steps and core components behind these LLMs. The talk will also cover how smaller, domain-specific models can outperform general-purpose foundation models like ChatGPT on target use cases.
Topological Deep Learning: Going Beyond Graph Data
Dr. Mustafa Hajij | Assistant Professor | University of San Francisco
This talk introduces the foundation of topological deep learning, a rapidly growing field concerned with the development of deep learning models for data supported on topological domains such as simplicial complexes, cell complexes, and hypergraphs, which generalize many domains encountered in scientific computation, including images and sequence data. It introduces the main notions while maintaining intuitive conceptualization, implementation, and relevance to a wide range of practical applications. It also demonstrates the framework’s practical relevance with applications ranging from drug discovery to mesh and image segmentation.
Feature Stores in Practice: Build and Deploy a Model with Featureform, Redis, Databricks, and Sagemaker
Simba Khadder | Founder & CEO | Featureform
This session provides a practical blueprint to efficiently harness feature stores within ML workflows, effectively bridging the chasm between theoretical understanding and actionable implementation. Participants will walk away with a solid grasp of feature stores, equipped with the knowledge to drive meaningful insights and enhancements in their real-world ML platforms and projects.
Deploying Trustworthy Generative AI
Krishnaram Kenthapadi | Chief AI Officer & Chief Scientist | Fiddler AI
Generative AI models have raised several ethical and social concerns that need to be addressed. These include lack of interpretability, bias and discrimination, privacy violations, lack of model robustness, fake and misleading content, copyright implications, plagiarism, and the environmental impact associated with training and inference of generative AI models. In this talk, you’ll explore the need for adopting responsible AI principles when developing and deploying large language models (LLMs) and other generative AI models, and see a roadmap for putting responsible AI for generative AI into practice through real-world LLM use cases.
Using Graphs for Large Feature Engineering Pipelines
Wes Madrigal | ML Engineer | Mad Consulting
This talk will outline the complexity of feature engineering from raw entity-level data, the reduction in complexity that comes with composable compute graphs, and an example of the working solution. You will explore one possible application through a case study of the impact on a logistics & supply chain machine learning problem.
Stable Diffusion: A New Frontier for Text-to-Image Paradigm
Sandeep Singh | Head of Applied AI/Computer Vision | Beans.ai
Discover Stable Diffusion, a new text-to-image generation model that is more stable and efficient than previous models and able to generate high-quality images from text descriptions.
By the end of this session, attendees will be able to:
- Understand the basics of Stable Diffusion and how it works.
- Know the landscape of tools and libraries in the Stable Diffusion domain.
- Generate images from text descriptions using Stable Diffusion.
- Apply Stable Diffusion to their own projects and workflows.
- Understand the process of fine-tuning open-source models to achieve the tasks at hand.
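The iterative denoising at the heart of diffusion models can be sketched with a toy 1-D example (this is not the actual Stable Diffusion pipeline, which runs a trained U-Net over image latents conditioned on a text prompt): starting from pure noise, the sample is pulled back toward a target signal a fraction at a time.

```python
import numpy as np

# Toy illustration of iterative denoising: a "perfect" noise predictor
# pulls a random sample back toward a known target signal step by step.
# Real Stable Diffusion replaces this oracle with a learned U-Net acting
# on image latents, conditioned on the text prompt's embedding.

rng = np.random.default_rng(42)
target = np.sin(np.linspace(0, 2 * np.pi, 64))  # stands in for an "image"

x = rng.normal(size=64)  # start from pure noise
steps = 50
for step in range(steps):
    predicted_noise = x - target              # oracle: exact residual noise
    x = x - predicted_noise / (steps - step)  # remove a fraction each step

print(f"final error: {np.abs(x - target).max():.2e}")  # effectively zero
```

The real sampler differs in many ways (noise schedules, stochastic steps, classifier-free guidance), but the shape of the loop, repeatedly predicting and removing noise, is the same.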
Better Features for Real-time Decisions Using Feature Engines
Mike Del Balso | Co-founder and CEO | Tecton
Nick Acosta | Developer Advocate | Tecton
Learn how leading ML teams use feature platforms to develop, operate, and manage features for production ML through a sample use case that demonstrates how feature engines can make it easy to build & productionize powerful feature pipelines.
Explore Tecton through a hands-on workshop that will take you through the concepts and code that will help you build a modern technical architecture that simplifies the process of managing real-time ML models and features.
Building Using Llama 2
Amit Sangani | Director of Partner Engineering | Meta
In this session, you’ll explore hands-on, engaging content that gives developers a basic understanding of Llama 2 models, with a focus on how to access and use them, as well as how to build core components of an AI chatbot using LangChain and Tools. You will also discuss core concepts of Prompt Engineering and Fine-Tuning and programmatically implement them using Responsible AI principles.
Aligning Open-source LLMs Using Reinforcement Learning from Feedback
Sinan Ozdemir | AI & LLM Expert, Author, Founder + CTO | LoopGenius
This session will focus on the core concepts of LLM fine-tuning, with a particular emphasis on reinforcement learning mechanisms. Engaging in hands-on exercises, you will gain practical experience in data preprocessing, quality assessment, and implementing reinforcement learning techniques for manual alignment.
No-Code and Low-Code AI: A Practical Project-Driven Approach to ML
Gwendolyn D. Stripling, PhD | Lead AI & ML Content Developer | Google Cloud
No-code machine learning (ML) is a way to build and deploy ML models without having to write any code. Low-code ML is a way to build and deploy ML models with minimal coding. Both methods can be valuable for businesses and individuals who do not have the skills or resources to develop ML models themselves.
Over the course of this session, you will develop an understanding of no-code and low-code frameworks, how they are used in the ML workflow, how they can be used for data ingestion and analysis, and for building, training, and deploying ML models.
Sign me up!
Don’t miss this chance to learn from some of the data practitioners defining the future of the industry. Register for ODSC East today to save 60% on any pass.