Discussion
Mistral Forge
mark_l_watson: I am rooting for Mistral with their different approach: not really competing on the largest and advanced models, instead doing custom engineering for customers and generally serving the needs of EU customers.
dmix: This is definitely the smart path for making $$ in AI. I noticed MongoDB is also going into this market with https://www.voyageai.com/ targeting business RAG applications and offering consulting for company-specific models.
bsjshshsb: Is training or FT > context? Anyone have experience? Is it possible to retrain daily or hourly as info changes?
andai: They mention pretraining too, which surprises me. I thought that was prohibitively expensive? It's feasible for small models, but I thought small models were not reliable for factual information?
ryeguy_24: How many proprietary use cases truly need pre-training or even fine-tuning as opposed to a RAG approach? And at what point does it make sense to pre-train/fine-tune? Curious.
baby: RAG is dead
loeg: Is it??
bigyabai: In what, X's hype circles? Embeddings are used in production constantly.
CharlesW: And yet your blog says you think NFTs are alive. Curious. But seriously, RAG/retrieval is thriving. It'll be part of the mix alongside long context, reranking, and tool-based context assembly for the foreseeable future.
charcircuit: Using tools and skills to retrieve data or files is anything but dead.
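The retrieval step the thread is debating can be sketched minimally. This is a toy illustration, not any particular vendor's pipeline: it uses bag-of-words vectors and cosine similarity in place of a real learned embedding model, and the documents and query are invented for the example.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; production RAG uses a learned embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

# Hypothetical corpus for illustration.
docs = [
    "Mistral offers custom model engineering for enterprise customers.",
    "RAG retrieves relevant documents and adds them to the prompt.",
    "Fine-tuning updates model weights on domain-specific data.",
]

top = retrieve("how does RAG add documents to context", docs)
# The retrieved snippets are prepended to the prompt as context.
prompt = "Context:\n" + "\n".join(top) + "\n\nQuestion: how does RAG work?"
```

The point the pro-RAG commenters are making falls out of this shape: when the underlying documents change hourly, only `docs` needs updating, with no retraining step, whereas fine-tuning or pretraining would bake the information into weights.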