ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location) - installing dependencies does not solve the problem

Thread starter: mazix (Guest)
I want to learn LLMs. I run Ollama with the following Docker Compose file, and the container is running:

Code:
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - ollama_data:/root/.ollama
    healthcheck:
      test: ollama list || exit 1
      interval: 10s
      timeout: 30s
      retries: 5
      start_period: 10s
  ollama-models-pull:
    image: curlimages/curl:8.6.0
    command: >-
      http://ollama:11434/api/pull -d '{"name": "mistral"}'
    depends_on:
      ollama:
        condition: service_healthy
volumes:
  ollama_data:
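As a quick sanity check of the Compose setup above, you could confirm that the `mistral` pull actually finished before wiring up any Python code. This is a minimal sketch using only the standard library; `/api/tags` is the Ollama endpoint that lists locally available models:

```python
import json
import urllib.error
import urllib.request

def list_ollama_models(base_url: str = "http://127.0.0.1:11434"):
    """Return the model names the Ollama server reports, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError, json.JSONDecodeError):
        # Server not up yet, wrong port, or an unexpected response body.
        return None

print(list_ollama_models())
```

If the pull succeeded, the list should contain an entry like `mistral:latest`; `None` means the server was not reachable at that address.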

I would like to write a Python app that uses Ollama, and I found this piece of code:

Code:
from llama_index.llms import Ollama, ChatMessage

llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")

messages = [
    ChatMessage(
        role="system", content="You are a multilingual assistant used for translation, and your job is to translate, nothing more than that."
    ),
    ChatMessage(
        role="user", content="Please translate the message in triple backticks to French: ``` What is standard deviation?```"
    )
]
resp = llm.chat(messages=messages)
print(resp)

I installed all the dependencies:

Code:
python3 -m venv venv
source venv/bin/activate
pip install llama-index
pip install llama-index-llms-ollama
pip install ollama-python

However, when I run the app, I get:

Code:
Traceback (most recent call last):
  File "/home/user/test.py", line 1, in <module>
    from llama_index.llms import Ollama, ChatMessage
ImportError: cannot import name 'Ollama' from 'llama_index.llms' (unknown location)

Where could the problem be?
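Edit, a note on what I suspect: this is an assumption based on llama-index 0.10 and later, where the project was split into a core package plus per-integration packages, so `Ollama` moved to `llama_index.llms.ollama` and `ChatMessage` to `llama_index.core.llms`, and the flat `llama_index.llms` path stopped exposing them. A guarded sketch of the newer-style imports:

```python
# Assumption: llama-index >= 0.10, where the integrations were split out.
# Ollama comes from the llama-index-llms-ollama package; ChatMessage
# comes from llama-index core. The old `from llama_index.llms import
# Ollama` path no longer works there, matching the ImportError above.
try:
    from llama_index.llms.ollama import Ollama
    from llama_index.core.llms import ChatMessage
except ImportError as exc:
    # Lands here if the modular packages are missing or too old.
    Ollama = ChatMessage = None
    print(f"modular llama-index packages not importable: {exc}")

if Ollama is not None:
    llm = Ollama(model="mistral", base_url="http://127.0.0.1:11434")
    try:
        # This call requires the Dockerized Ollama server to be running.
        resp = llm.chat(messages=[ChatMessage(role="user", content="Hello")])
        print(resp)
    except Exception as exc:
        print(f"could not reach the Ollama server: {exc}")
```

With an older llama-index (pre-0.10), the original flat import would presumably still work, so pinning versions is another option.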