How to invoke runnables in parallel
The RunnableParallel (also known as a RunnableMap) primitive is an object whose values are runnables (or things that can be coerced to runnables, like functions). It runs all of its values in parallel, and each value is called with the initial input to the RunnableParallel. The final return value is an object with the results of each value under its appropriate key.
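As a minimal sketch of that behavior (using plain functions instead of model calls, with made-up key names), both values below receive the same input and their results come back under their keys:
import { RunnableMap } from "@langchain/core/runnables";

// Plain functions are coerced into runnables when used as values.
const map = RunnableMap.from({
  original: (text: string) => text,
  shouted: (text: string) => text.toUpperCase(),
});

// Both values are called with the same input and run in parallel.
const result = await map.invoke("hello");
console.log(result);
/*
{ original: "hello", shouted: "HELLO" }
*/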
Formatting with RunnableParallels
RunnableParallels are useful for parallelizing operations, but can also be useful for manipulating the output of one Runnable to match the input format of the next Runnable in a sequence. You can use them to split or fork the chain so that multiple components can process the input in parallel. Later, other components can join or merge the results to synthesize a final response. This type of chain creates a computation graph that looks like the following:
     Input
      / \
     /   \
 Branch1 Branch2
     \   /
      \ /
    Combine
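In code, that fork/join shape can be sketched as follows (the branch names and functions here are placeholders, not part of the examples below):
import { RunnableSequence } from "@langchain/core/runnables";

// Fork: both branches receive the same input and run in parallel.
// Join: the final step receives both results and merges them.
const forkJoin = RunnableSequence.from([
  {
    branch1: (input: { topic: string }) => `facts about ${input.topic}`,
    branch2: (input: { topic: string }) => `a joke about ${input.topic}`,
  },
  (results: { branch1: string; branch2: string }) =>
    `${results.branch1}\n${results.branch2}`,
]);

const combined = await forkJoin.invoke({ topic: "bears" });
console.log(combined);
/*
facts about bears
a joke about bears
*/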
Below, the input to each chain in the RunnableParallel is expected to be an object with a key for "topic". We can satisfy that requirement by invoking our chain with an object matching that structure.
- npm: npm install @langchain/anthropic @langchain/cohere
- Yarn: yarn add @langchain/anthropic @langchain/cohere
- pnpm: pnpm add @langchain/anthropic @langchain/cohere
import { PromptTemplate } from "@langchain/core/prompts";
import { RunnableMap } from "@langchain/core/runnables";
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({});

// Two independent chains that each expect an object with a "topic" key.
const jokeChain = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
).pipe(model);
const poemChain = PromptTemplate.fromTemplate(
  "write a 2-line poem about {topic}"
).pipe(model);

// Run both chains in parallel on the same input,
// returning the results under their respective keys.
const mapChain = RunnableMap.from({
  joke: jokeChain,
  poem: poemChain,
});

const result = await mapChain.invoke({ topic: "bear" });
console.log(result);
/*
{
joke: AIMessage {
content: " Here's a silly joke about a bear:\n" +
'\n' +
'What do you call a bear with no teeth?\n' +
'A gummy bear!',
additional_kwargs: {}
},
poem: AIMessage {
content: ' Here is a 2-line poem about a bear:\n' +
'\n' +
'Furry and wild, the bear roams free \n' +
'Foraging the forest, strong as can be',
additional_kwargs: {}
}
}
*/
API Reference:
- PromptTemplate from @langchain/core/prompts
- RunnableMap from @langchain/core/runnables
- ChatAnthropic from @langchain/anthropic
Manipulating outputs/inputs
Maps can be useful for manipulating the output of one Runnable to match the input format of the next Runnable in a sequence.
Note below that the object within the RunnableSequence.from() call is automatically coerced into a runnable map. All keys of the object must have values that are runnables or can themselves be coerced to runnables (functions to RunnableLambdas, or objects to RunnableMaps). This coercion will also occur when composing chains via the .pipe() method.
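As a small, standalone sketch of that coercion (the functions and keys here are illustrative only, separate from the retrieval example below), piping into an object literal produces a runnable map whose plain-function values become RunnableLambdas:
import { RunnableLambda } from "@langchain/core/runnables";

// The object passed to .pipe() is coerced into a runnable map,
// and its plain-function values are coerced into RunnableLambdas.
const chain = RunnableLambda.from((topic: string) => topic.trim()).pipe({
  upper: (topic: string) => topic.toUpperCase(),
  length: (topic: string) => topic.length,
});

const coerced = await chain.invoke("  bears  ");
console.log(coerced);
/*
{ upper: "BEARS", length: 5 }
*/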
import { CohereEmbeddings } from "@langchain/cohere";
import { PromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import {
RunnablePassthrough,
RunnableSequence,
} from "@langchain/core/runnables";
import { Document } from "@langchain/core/documents";
import { ChatAnthropic } from "@langchain/anthropic";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
const model = new ChatAnthropic();

// Index a single document in an in-memory vector store.
const vectorstore = await MemoryVectorStore.fromDocuments(
  [{ pageContent: "mitochondria is the powerhouse of the cell", metadata: {} }],
  new CohereEmbeddings()
);
const retriever = vectorstore.asRetriever();

const template = `Answer the question based only on the following context:
{context}
Question: {question}`;

const prompt = PromptTemplate.fromTemplate(template);

// Convert retrieved documents into plain strings for the prompt.
const formatDocs = (docs: Document[]) => docs.map((doc) => doc.pageContent);

// The object in the first position is coerced into a runnable map:
// "context" is filled by the retriever, and the raw user input is
// passed through under "question".
const retrievalChain = RunnableSequence.from([
  { context: retriever.pipe(formatDocs), question: new RunnablePassthrough() },
  prompt,
  model,
  new StringOutputParser(),
]);

const result = await retrievalChain.invoke(
  "what is the powerhouse of the cell?"
);
console.log(result);
/*
Based on the given context, the powerhouse of the cell is mitochondria.
*/
API Reference:
- CohereEmbeddings from @langchain/cohere
- PromptTemplate from @langchain/core/prompts
- StringOutputParser from @langchain/core/output_parsers
- RunnablePassthrough from @langchain/core/runnables
- RunnableSequence from @langchain/core/runnables
- Document from @langchain/core/documents
- ChatAnthropic from @langchain/anthropic
- MemoryVectorStore from langchain/vectorstores/memory
Here, the input to the prompt is expected to be a map with keys "context" and "question". The user input is just the question, so we need to fetch the context using our retriever and pass the user input through under the "question" key.
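To make that data flow concrete, here is a sketch of just that first map step in isolation, reusing the retriever and formatDocs defined above; the exact output shape shown is illustrative:
import {
  RunnableMap,
  RunnablePassthrough,
} from "@langchain/core/runnables";

// The retriever fills "context", while the raw question string is
// passed through unchanged under "question".
const inputFormatter = RunnableMap.from({
  context: retriever.pipe(formatDocs),
  question: new RunnablePassthrough<string>(),
});

const formatted = await inputFormatter.invoke(
  "what is the powerhouse of the cell?"
);
console.log(formatted);
/*
{
  context: [ "mitochondria is the powerhouse of the cell" ],
  question: "what is the powerhouse of the cell?"
}
*/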
Next steps
You now know some ways to format and parallelize chain steps with RunnableParallel.
Next, you might be interested in using custom logic in your chains.