User Guide

Learn how to get the most out of the RAG, Memory, MCP Servers, Artifacts, and API features in Chat LLM.

Getting Started

Chat LLM offers advanced AI features to enhance your experience. Here's what you need to know:

What are RAG, Memory, and MCP Servers?

  • RAG (Retrieval-Augmented Generation): Enhances AI responses by retrieving the most relevant passages from your documents and supplying them to the model alongside your question
  • Memory: Enables the AI to remember important information about you across multiple conversations
  • MCP Servers: Specialized tools that extend your AI with external knowledge sources and capabilities (like Context7 for technical documentation)
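The RAG flow described above boils down to two steps: retrieve the document passages most relevant to your question, then prepend them to the prompt before it reaches the model. The sketch below is a minimal illustration only; it uses keyword overlap as a stand-in for the vector search a real RAG pipeline performs, and all function names and documents are hypothetical.

```python
# Minimal RAG sketch. Keyword overlap stands in for real vector search;
# every name and document here is a hypothetical placeholder.

def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k documents sharing the most words with the question."""
    q_words = set(question.lower().split())
    return sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )[:top_k]

def build_prompt(question: str, documents: list[str]) -> str:
    """Augment the user's question with retrieved context."""
    context = retrieve(question, documents)
    return "Context:\n" + "\n".join(context) + "\n\nQuestion: " + question

docs = [
    "The billing page lets you update your payment method.",
    "API keys are created under Settings > Developer.",
    "Dark mode can be toggled in the appearance settings.",
]
prompt = build_prompt("How do I create an api key?", docs)
```

The point to notice is that the model never searches anything itself: retrieval happens first, and the model only sees the question plus whatever context was selected for it.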

When to Use Each Feature

  • Use RAG when you need the AI to reference specific documents or knowledge bases
  • Use Memory when you want the AI to remember personal preferences or details across conversations
  • Use MCP Servers when you need specialized tools or knowledge sources (like Context7 for development questions with best practices)

Combining All Features

For the best experience, you can use all features together:

Example: Complete Technical Support Experience

  • Memory knows you prefer detailed technical explanations
  • RAG finds relevant technical documentation about your specific issue
  • MCP Servers (like Context7) add current best practices and production-ready patterns
  • Result: You receive a personalized, comprehensive solution tailored to your problem
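Concretely, the combined experience amounts to assembling one prompt from all three sources before the model answers. The sketch below shows that assembly with hypothetical placeholder strings; in Chat LLM each section is populated automatically by the corresponding feature.

```python
# Hypothetical sketch of how the three features combine into one prompt.
# In Chat LLM each list is filled automatically; the literals are placeholders.

memory = ["User prefers detailed technical explanations."]          # Memory
rag_chunks = ["Error E42 occurs when the config file is missing."]  # RAG retrieval
mcp_results = ["Best practice: validate config at startup."]        # MCP server (e.g. Context7)

def assemble_prompt(question: str) -> str:
    """Join user profile, retrieved documents, and tool results into one prompt."""
    sections = [
        "User profile:\n" + "\n".join(memory),
        "Relevant documents:\n" + "\n".join(rag_chunks),
        "Tool results:\n" + "\n".join(mcp_results),
        "Question: " + question,
    ]
    return "\n\n".join(sections)

prompt = assemble_prompt("Why am I seeing error E42?")
```

Because each section is independent, any feature you have not enabled simply contributes nothing, and the rest still work as before.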

See the guide for each feature to learn how to set it up and use it.