
LangChain Cheat Sheet

Visit KDnuggets.com for more cheatsheets and additional learning resources.

LangChain simplifies building applications with language models through reusable components and pre-built chains. It makes models data-aware and agentic for more dynamic interactions. The modular architecture supports rapid development and customization.

LLMs

An interface for the OpenAI GPT-3.5-turbo LLM:

from langchain.llms import OpenAI

llm = OpenAI(temperature=0.9)
text = "What do you know about KDnuggets?"
llm(text)

>>> KDnuggets is one of the most popular data science websites which focusses....

An interface for a Hugging Face Hub LLM:

from langchain import HuggingFaceHub

llm = HuggingFaceHub(repo_id="togethercomputer/LLaMA-2-7B-32K",
                     model_kwargs={"temperature": 0, "max_length": 64})
llm("How old is KDnuggets?")

>>> KDnuggets was founded in 1997, making it 23 years old.

Prompt Templates

LangChain facilitates prompt management and optimization through the use of prompt templates.

from langchain import PromptTemplate

template = """Question: {question}

Make the answer more engaging by incorporating puns.

Answer: """

prompt = PromptTemplate.from_template(template)

llm(prompt.format(question="Could you provide some information on the impact of global warming?"))

>>> Global warming is no laughing matter, but that doesn....
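
Templates can also hold more than one placeholder. A small sketch; the template and variable names below are illustrative, not from the cheat sheet:

# A template with two input variables; format() fills them by keyword.
from langchain import PromptTemplate

multi_prompt = PromptTemplate.from_template(
    "Summarize the topic '{topic}' for a {audience} audience in two sentences."
)

llm(multi_prompt.format(topic="vector databases", audience="beginner"))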

Chains

Combining LLMs and prompt templates can enhance multi-step workflows.

from langchain import LLMChain

llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "Could you provide some information on the impact of global warming?"
llm_chain.run(question)

>>> Global warming is no laughing matter—but it sure is.....
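
Chains can also be composed so that the output of one step feeds the next. A rough sketch using SimpleSequentialChain; the two prompts are invented for illustration:

# Each LLMChain has a single input and output, so SimpleSequentialChain can
# pass the first chain's result straight into the second chain.
from langchain import LLMChain, PromptTemplate
from langchain.chains import SimpleSequentialChain

outline_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate.from_template("Write a one-line outline about {topic}."),
)
expand_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate.from_template("Expand this outline into a short paragraph: {outline}"),
)

overall_chain = SimpleSequentialChain(chains=[outline_chain, expand_chain], verbose=True)
overall_chain.run("the impact of global warming")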

Agents and Tools

A tool is a function that performs a specific task, such as a Google search, a database lookup, or a Python REPL. Agents use LLMs to choose a sequence of actions to execute.

from langchain.agents import load_tools
from langchain.agents import initialize_agent

tools = load_tools(["wikipedia", "llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)

agent.run("Can you tell me the distance between Earth and the moon? And could you please convert it into miles? Thank you.")

>>> Action: Wikipedia
Action Input: Earth-moon distance
Action: Calculator
Action Input: 385400/1.609
Final Answer: The distance between Earth and the Moon is approximately 239,527.66 miles.
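
Agents are not limited to the built-in tools; a plain Python function can be wrapped as a tool. A hedged sketch; the word-counting tool below is invented for illustration:

# Wrap a custom function as a Tool and hand it to the agent alongside llm-math.
from langchain.agents import Tool, initialize_agent, load_tools

def count_words(text: str) -> str:
    """Return the number of words in the input text."""
    return str(len(text.split()))

word_counter = Tool(
    name="Word Counter",
    func=count_words,
    description="Useful for counting how many words a piece of text contains.",
)

tools = load_tools(["llm-math"], llm=llm) + [word_counter]
agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
agent.run("How many words are in the sentence 'LangChain makes agents easy to build'?")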

Memory

LangChain simplifies persistent state management in chain or agent calls with a standard interface.

from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

conversation = ConversationChain(
    llm=llm, verbose=True,
    memory=ConversationBufferMemory()
)

conversation.predict(input="How can one overcome anxiety?")

>>> To overcome anxiety, it may be helpful to focus on the....

conversation.predict(input="Tell me more..")

>>> To be mindful of the present, it can be helpful to pra.....
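
When conversations grow long, the remembered history can be bounded. A brief sketch assuming the windowed buffer variant; the k value is an arbitrary example:

# ConversationBufferWindowMemory keeps only the last k exchanges instead of
# the full transcript, which limits prompt length on long conversations.
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory

windowed_conversation = ConversationChain(
    llm=llm,
    memory=ConversationBufferWindowMemory(k=2),
    verbose=True,
)

windowed_conversation.predict(input="How can one overcome anxiety?")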

Document Loaders

By combining language models with your own text data, you can answer personalized queries. You can load CSV, Markdown, PDF, and more.

from langchain.document_loaders import TextLoader

raw_document = TextLoader("/work/data/Gregory.txt").load()
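
The same pattern works for other formats. A short sketch with the CSV loader; the file path is a placeholder, not a file from the cheat sheet:

# Each CSV row becomes one Document, with the row contents in page_content.
from langchain.document_loaders import CSVLoader

csv_documents = CSVLoader(file_path="/work/data/example.csv").load()
print(len(csv_documents))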

Vector Stores

One common method for storing and searching unstructured data is to embed it as vectors, then embed queries and retrieve the most similar vectors.

from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import FAISS

# Text Splitter
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
documents = text_splitter.split_documents(raw_document)

# Vector Store
db = FAISS.from_documents(documents, OpenAIEmbeddings())

# Similarity Search
query = "When was Gregory born?"
docs = db.similarity_search(query)
print(docs[0].page_content)

>>> Gregory I. Piatetsky-Shapiro (born 7 April 1958) is a data scientist and the co-founder of the KDD conferences.....
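
The index can be persisted so documents are not re-embedded on every run. A hedged sketch; the folder name "faiss_index" is an arbitrary choice:

# Save the FAISS index to disk, then reload it with the same embedding model.
db.save_local("faiss_index")

from langchain.vectorstores import FAISS
from langchain.embeddings.openai import OpenAIEmbeddings

db = FAISS.load_local("faiss_index", OpenAIEmbeddings())
docs = db.similarity_search("When was Gregory born?")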

Retrievers

A retriever is an interface that returns documents based on an unstructured query. When combined with an LLM, it generates a natural response instead of simply displaying the text from the document.

from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

qa_chain = RetrievalQA.from_chain_type(llm, retriever=db.as_retriever())
qa_chain({"query": "When was Gregory born?"})

>>> {'query': 'When was Gregory born?',
 'result': 'Gregory Piatetsky-Shapiro was born on April 7, 1958.'}
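
The retriever can also be queried directly before wiring it into a chain. A small sketch; the k value is an arbitrary example:

# Limit the retriever to the 2 most similar chunks and inspect them directly.
retriever = db.as_retriever(search_kwargs={"k": 2})

relevant_docs = retriever.get_relevant_documents("When was Gregory born?")
for doc in relevant_docs:
    print(doc.page_content)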
description", verbose=True) TextLoader("/work/data/Gregory.txt").load() Subscribe to KDnuggets News

Abid Ali Awan | 2023
