Mistral + Haystack Collection: build RAG pipelines that rock 🤘
A collection of notebooks and resources for building Retrieval Augmented Generation (RAG) pipelines with Mistral models and Haystack.
💻 For other great Haystack notebooks, check out the 👩🏻🍳 Haystack Cookbook.
- A great, in-depth blog post by Hugging Face on the Mixture of Experts (MoE) architecture, which is the basis of Mixtral 8x7B.
- Zephyr: Direct Distillation of LM Alignment
  A technical report by the Hugging Face H4 team explaining how they trained Zephyr, a strong 7B model fine-tuned from Mistral. The central question: ⚗️ how to effectively distill the capabilities of GPT-4 into smaller models? The report is insightful and well worth reading; I have summarized it here.