toomuch.sh

LangChain

Framework for building LLM-powered applications and agents

/ Agents & Automation | oss
#framework #agents #rag #llm-apps #developer-tools

Getting Started

  1. Install LangChain via pip: pip install langchain langchain-openai (or your preferred provider package).
  2. Set your LLM provider API key and create a simple chain to verify the connection (first sketch after this list).
  3. Build a RAG (Retrieval-Augmented Generation) pipeline by adding a document loader, text splitter, and vector store (second sketch below).
  4. Use LangGraph for complex agentic workflows with cycles, branching, and persistent state (third sketch below).
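
A minimal sketch of steps 1-2, assuming the langchain-openai provider package, an OpenAI API key in the OPENAI_API_KEY environment variable, and an example model name; any chat model integration composes the same way.

    from langchain_openai import ChatOpenAI
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_core.output_parsers import StrOutputParser

    # Compose prompt -> model -> parser with the LCEL pipe syntax.
    llm = ChatOpenAI(model="gpt-4o-mini")  # example model name; reads OPENAI_API_KEY
    prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
    chain = prompt | llm | StrOutputParser()

    print(chain.invoke({"text": "LangChain is a framework for building LLM apps."}))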
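
A rough sketch of the step-3 RAG pipeline, assuming langchain-community, langchain-text-splitters, faiss-cpu, and a hypothetical local notes.txt file; any loader, splitter, or vector store from the integrations can be swapped in.

    from langchain_community.document_loaders import TextLoader
    from langchain_community.vectorstores import FAISS
    from langchain_text_splitters import RecursiveCharacterTextSplitter
    from langchain_openai import ChatOpenAI, OpenAIEmbeddings
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.runnables import RunnablePassthrough

    # Ingest: load, chunk, embed, and index the documents.
    docs = TextLoader("notes.txt").load()  # hypothetical file
    splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
    retriever = FAISS.from_documents(splitter.split_documents(docs), OpenAIEmbeddings()).as_retriever()

    def format_docs(docs):
        return "\n\n".join(d.page_content for d in docs)

    # Retrieve: stuff the retrieved chunks into the prompt, then generate.
    prompt = ChatPromptTemplate.from_template(
        "Answer using only this context:\n{context}\n\nQuestion: {question}"
    )
    rag_chain = (
        {"context": retriever | format_docs, "question": RunnablePassthrough()}
        | prompt
        | ChatOpenAI(model="gpt-4o-mini")
        | StrOutputParser()
    )
    print(rag_chain.invoke("What do the notes say about deadlines?"))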
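
A bare-bones LangGraph sketch for step 4 (pip install langgraph), showing a single-node stateful graph rather than a full agent; the node body is a placeholder where an LLM or tool call would normally go.

    from typing import TypedDict
    from langgraph.graph import StateGraph, START, END

    class State(TypedDict):
        question: str
        answer: str

    def answer_node(state: State) -> dict:
        # Placeholder logic; a real node would call a model or tool here.
        return {"answer": f"You asked: {state['question']}"}

    graph = StateGraph(State)
    graph.add_node("answer", answer_node)
    graph.add_edge(START, "answer")
    graph.add_edge("answer", END)
    app = graph.compile()

    print(app.invoke({"question": "What is LangGraph?"}))

Real agents add more nodes, conditional edges for branching, and a checkpointer for persistent state across runs.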

Key Features

  • Modular architecture with composable components for chains, prompts, memory, retrievers, and output parsers.
  • LangGraph provides a graph-based framework for building stateful, multi-step agent workflows with cycles.
  • LangSmith observability offers tracing, evaluation, and monitoring for debugging and optimizing LLM applications (tracing setup sketched after this list).
  • Extensive integrations connect to 100+ LLM providers, vector stores, document loaders, and embedding models.
  • RAG support includes built-in tools for document ingestion, chunking, embedding, and retrieval-augmented generation.
  • Cross-language support with libraries in both Python and TypeScript/JavaScript, sharing the core abstractions across both.
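
A minimal sketch of enabling LangSmith tracing, assuming a LangSmith account and API key; the environment variables follow the LangSmith docs, and existing chains are then traced without code changes.

    import os

    os.environ["LANGCHAIN_TRACING_V2"] = "true"               # turn on tracing for all runs
    os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-key>"  # placeholder key

    # Any chain invoked after these are set sends traces to LangSmith automatically.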

// related tools