
Issue: How can we combine ConversationalRetrievalChain with openai ChatCompletion #8864

@apurv101

Description


Issue you'd like to raise.

I'm trying to chat with a PDF using ConversationalRetrievalChain.

from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma

embeddings = OpenAIEmbeddings()
vectordb = Chroma(embedding_function=embeddings, persist_directory=directory)
qa_chain = ConversationalRetrievalChain.from_llm(ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.3), vectordb.as_retriever(), memory=memory)
answer = qa_chain({"question": query})

It works well and answers from the documents, but it can't alter the tone or even translate the answer into another language when prompted.
How can we change the tone the way we do with openai ChatCompletion:

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful and friendly chatbot who converts text to a very friendly tone."},
            {"role": "user", "content": f"{final_answer}"},
        ],
    )

such that it answers from the documents but also adapts the answer according to a given prompt. Right now I have to pass the output received from ConversationalRetrievalChain into the code above in order to modify the tone, as sketched below. Is this kind of functionality missing in ConversationalRetrievalChain?
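
For reference, here is a minimal sketch of the two-step workaround described above: the chain's answer is passed back through openai.ChatCompletion with a system prompt that rewrites the tone. The helper name answer_with_tone and the system_prompt parameter are illustrative, not part of any LangChain API.

    import openai

    def answer_with_tone(qa_chain, query, system_prompt):
        # Illustrative helper (not a LangChain API): run the retrieval chain,
        # then rewrite its answer in the requested tone.

        # Step 1: get the document-grounded answer from the chain built above.
        final_answer = qa_chain({"question": query})["answer"]

        # Step 2: pass that answer back through ChatCompletion to adjust the tone.
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": final_answer},
            ],
        )
        return response["choices"][0]["message"]["content"]

With this pattern every answer costs a second ChatCompletion call, which is exactly the overhead the question is asking whether ConversationalRetrievalChain can avoid.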

Suggestion:

No response
