Arize's Amber Roberts and Xander Song join Jon Krohn this week, sharing invaluable insights into ML Observability, drift detection, retraining strategies, and the crucial task of ensuring fairness and ethical considerations in AI development.

This episode is brought to you by Posit, the open-source data science company, by AWS Inferentia, and by Anaconda, the world's most popular Python distribution. Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.

In this episode you will learn:
• What is ML Observability [05:07]
• What is Drift [08:18]
• The different kinds of model drift [15:31]
• How frequently production models should be retrained [25:15]
• Arize's open-source product, Phoenix [30:49]
• How ML Observability relates to discovering model biases [50:30]
• Arize case studies [57:13]
• What is a developer advocate [1:04:51]

Additional materials: www.superdatascience.com/689