(Unnecessary if you're just using langmodel.pro locally.)
First, make a JSON that's valid according to these standards: LangChain Interpreter.
Next, upload it to this website at the following link, and give it a name: LangChain Templates.
Send an HTTP POST request to
http://langmodel.pro/lc_templates/api
Use the following headers:
openai-api-key: (your openai api key)
Content-Type: application/json
The JSON body should contain the input data, in the following format:
{
  "langchain_inputs": (langchain inputs here),
  "history": (the context for the chain, such as chat history),
  "schema": (the JSON template, as a string)
}
The response is the output of the LangChain pipeline.
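The request above can be sketched with Python's standard library. The endpoint, headers, and body fields come from the description; the helper name `build_interpreter_request` and the example inputs are illustrative, not part of the API:

```python
import json
import urllib.request

API_URL = "http://langmodel.pro/lc_templates/api"

def build_interpreter_request(openai_api_key, langchain_inputs, history, schema):
    """Build (but do not send) the POST request described above.

    schema is the JSON template, passed as a string.
    """
    body = json.dumps({
        "langchain_inputs": langchain_inputs,
        "history": history,
        "schema": schema,
    }).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "openai-api-key": openai_api_key,
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending it requires network access and a valid OpenAI key:
# with urllib.request.urlopen(build_interpreter_request(...)) as resp:
#     result = json.loads(resp.read())
```

Building the request separately from sending it makes the headers and body easy to inspect before spending an API call.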
Send an HTTP POST request to
http://langmodel.pro/lc_templates/YOUR_TEMPLATE/api
Use the following headers:
userid: (found on the front page while logged in)
openai-api-key: (your openai api key)
Content-Type: application/json
The JSON body should contain the input data, in the following format:
{
  "langchain_inputs": (langchain inputs here),
  "history": (the context for the chain, such as chat history)
}
The response is the output of the LangChain pipeline.
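The per-template request can be sketched the same way. `YOUR_TEMPLATE` in the URL is replaced by the name you gave the template when uploading it; the helper name and example values below are illustrative:

```python
import json
import urllib.request

def build_template_request(template, userid, openai_api_key,
                           langchain_inputs, history):
    """Build (but do not send) a POST to the named-template endpoint.

    template: the name given to the template when it was uploaded
    userid:   found on the front page while logged in
    """
    body = json.dumps({
        "langchain_inputs": langchain_inputs,
        "history": history,
    }).encode("utf-8")
    return urllib.request.Request(
        f"http://langmodel.pro/lc_templates/{template}/api",
        data=body,
        headers={
            "userid": userid,
            "openai-api-key": openai_api_key,
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Note that, unlike the schema-based endpoint, no "schema" field is sent: the template stored on the server supplies it.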
Use the basic in-website template consumer:
Go to the link labeled "Test this langchain template" on the template page.
Enter your OpenAI key and the input JSON data, then click Submit.