
LangChain chat chain invoke does not return an object?

Thread starter: barteloma (Guest)
I have a simple example of LangChain runnables, from https://python.langchain.com/v0.1/docs/expression_language/interface/:

Code:
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model
print(chain.invoke({"topic": "chickens"}))

According to the website, it should return something like the following:

Code:
AIMessage(content="Why don't bears wear shoes? \n\nBecause they have bear feet!")

But it returns an unstructured response:

Code:
content="Why don't bears wear shoes? \n\nBecause they have bear feet!" response_metadata={'token_usage': {'completion_tokens': 19, 'prompt_tokens': 13, 'total_tokens': 32}, 'model_name': 'gpt-4-0613', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None} id='run-bd7cda7e-dee2-4107-af3f-97282faa9fa4-0' usage_metadata={'input_tokens': 13, 'output_tokens': 19, 'total_tokens': 32}

How can I fix this issue?
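
(For reference, a minimal sketch, assuming the same prompt | model chain as above: the value returned by invoke() is in fact an AIMessage object; print() renders its str() form, which is the field-by-field dump shown in the question, while the docs show its repr(). The StrOutputParser step at the end is an optional addition, not part of the original example.)

Code:
from langchain_core.messages import AIMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
chain = prompt | model

result = chain.invoke({"topic": "chickens"})

# The chain returns a structured object, not a bare string.
print(type(result))                    # <class 'langchain_core.messages.ai.AIMessage'>
print(isinstance(result, AIMessage))   # True

# repr() gives the AIMessage(...) form shown in the docs;
# print()/str() gives the field-by-field dump from the question.
print(repr(result))
print(result.content)                  # just the joke text

# Optional: append an output parser to get a plain string from the chain itself.
str_chain = prompt | model | StrOutputParser()
print(str_chain.invoke({"topic": "chickens"}))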