Issue you'd like to raise.
I'm trying to chat with a PDF using ConversationalRetrievalChain.
```python
embeddings = OpenAIEmbeddings()
vectordb = Chroma(embedding_function=embeddings, persist_directory=directory)
qa_chain = ConversationalRetrievalChain.from_llm(
    ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.3),
    vectordb.as_retriever(),
    memory=memory,
)
answer = qa_chain({"question": query})
```
It works well and answers from the documents, but it can't alter the tone of the answer or translate it into another language when prompted.
How can we change the tone the way we do with OpenAI's ChatCompletion:
```python
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful and friendly chatbot who converts text to a very friendly tone."},
        {"role": "user", "content": f"{final_answer}"},
    ],
)
```
such that it answers from the document but also adjusts the answer according to a given prompt. Right now I have to pass the output received from ConversationalRetrievalChain into the code above in order to modify the tone, as sketched below. Is this kind of functionality missing in ConversationalRetrievalChain?
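For reference, my current workaround looks roughly like this: run the retrieval chain first, then post-process its answer with a second ChatCompletion call that applies the desired tone (the variable names here are just illustrative):

```python
# Step 1: get the grounded answer from the retrieval chain.
result = qa_chain({"question": query})
final_answer = result["answer"]

# Step 2: rewrite the answer in the desired tone with a separate API call.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful and friendly chatbot who converts text to a very friendly tone."},
        {"role": "user", "content": final_answer},
    ],
)
friendly_answer = response["choices"][0]["message"]["content"]
```

This works, but it costs an extra API call per question, which is why I'd prefer to control the tone inside the chain itself.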
Suggestion:
No response