Merged LLMs are the future, and we're exploring how with Mark McQuade and Charles Goddard from Arcee AI on this episode with Jon Krohn. Learn how to combine multiple LLMs without adding bulk, train more efficiently, and dive into different expert approaches. Discover how smaller models can outperform larger ones and how open-source projects can deliver big enterprise wins. This episode is packed with must-know insights for data scientists and ML engineers. Don't miss out!

Interested in sponsoring a SuperDataScience Podcast episode? Email natalie@superdatascience.com for sponsorship information.

In this episode you will learn:
• Explanation of Charles' job title: Chief of Frontier Research [03:31]
• Model merging technology: combining multiple LLMs without increasing size [04:43]
• Using MergeKit for model merging (a minimal config sketch follows these notes) [14:49]
• Evolutionary model merging: using evolutionary algorithms to discover strong merges [22:55]
• Commercial applications and success stories [28:10]
• Mixture of Experts (MoE) vs. Mixture of Agents [37:57]
• The Spectrum project: efficient training by targeting specific modules [54:28]
• The future of Small Language Models (SLMs) and their advantages [01:01:22]

Additional materials: www.superdatascience.com/801
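
For readers who want to try model merging hands-on, here is a minimal sketch of driving MergeKit (Arcee AI's open-source merging toolkit) from Python. The two Hugging Face model IDs, the merge weights, and the output path are placeholder assumptions for illustration, not settings discussed in the episode; the config follows MergeKit's documented YAML format, and the merge runs through its `mergekit-yaml` command-line entry point.

```python
# Minimal MergeKit sketch: linearly average two same-architecture 7B models.
# Assumes `pip install mergekit`; the model IDs below are placeholders, not
# models discussed in the episode.
import subprocess
from pathlib import Path

# A linear merge averages weights parameter-by-parameter, so the merged
# model stays the same size as either parent -- no added bulk.
CONFIG = """\
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
    parameters:
      weight: 0.5
  - model: HuggingFaceH4/zephyr-7b-beta
    parameters:
      weight: 0.5
merge_method: linear
dtype: bfloat16
"""

config_path = Path("merge-config.yml")
config_path.write_text(CONFIG)

# MergeKit's CLI reads the YAML recipe and writes the merged weights
# and tokenizer to the output directory.
subprocess.run(["mergekit-yaml", str(config_path), "./merged-model"], check=True)
```

Swapping `merge_method` (e.g., to `slerp` or `ties`) changes the merging strategy without changing the rest of the workflow.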