update to llama3.1 (#237)
jmorganca authored Jul 29, 2024
1 parent f62eb97 · commit 60d292a
Showing 1 changed file with 14 additions and 14 deletions: README.md
@@ -12,7 +12,7 @@ pip install ollama

```python
import ollama
-response = ollama.chat(model='llama3', messages=[
+response = ollama.chat(model='llama3.1', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
@@ -29,7 +29,7 @@ Response streaming can be enabled by setting `stream=True`, modifying function calls
import ollama

stream = ollama.chat(
-  model='llama3',
+  model='llama3.1',
  messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
  stream=True,
)
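# A hedged sketch, not part of this diff: the call above returns a generator,
# which can be iterated to print each streamed chunk as it arrives.
for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)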
@@ -45,13 +45,13 @@ The Ollama Python library's API is designed around the [Ollama REST API](https:/
### Chat

```python
-ollama.chat(model='llama3', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
+ollama.chat(model='llama3.1', messages=[{'role': 'user', 'content': 'Why is the sky blue?'}])
```

### Generate

```python
-ollama.generate(model='llama3', prompt='Why is the sky blue?')
+ollama.generate(model='llama3.1', prompt='Why is the sky blue?')
```
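The generate call returns a dictionary; a minimal sketch of reading the completion, assuming the `response` field documented for the REST API:

```python
import ollama

# A hedged sketch: read the generated text from the returned dictionary.
result = ollama.generate(model='llama3.1', prompt='Why is the sky blue?')
print(result['response'])
```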

### List
@@ -63,14 +63,14 @@ ollama.list()
### Show

```python
-ollama.show('llama3')
+ollama.show('llama3.1')
```
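For reference, a minimal sketch of inspecting what `show` returns; the `modelfile` and `parameters` keys are assumptions based on the REST API's show endpoint:

```python
import ollama

# A hedged sketch: inspect selected fields of the show response.
details = ollama.show('llama3.1')
print(details.get('modelfile'))
print(details.get('parameters'))
```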

### Create

```python
modelfile='''
-FROM llama3
+FROM llama3.1
SYSTEM You are Mario from Super Mario Bros.
'''

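# A hedged sketch, not part of this diff: the modelfile above is passed to
# ollama.create (shown in the next hunk's context), and the resulting model
# can then be used like any other.
ollama.create(model='example', modelfile=modelfile)
response = ollama.chat(model='example', messages=[{'role': 'user', 'content': 'Hello!'}])
print(response['message']['content'])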
@@ -80,31 +80,31 @@ ollama.create(model='example', modelfile=modelfile)
### Copy

```python
-ollama.copy('llama3', 'user/llama3')
+ollama.copy('llama3.1', 'user/llama3.1')
```

### Delete

```python
-ollama.delete('llama3')
+ollama.delete('llama3.1')
```

### Pull

```python
-ollama.pull('llama3')
+ollama.pull('llama3.1')
```
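Pull can also report progress; a minimal sketch, assuming the library accepts `stream=True` here and yields status dictionaries as the REST API does:

```python
import ollama

# A hedged sketch: stream pull progress updates; the 'status' key is an
# assumption based on the REST API's pull responses.
for progress in ollama.pull('llama3.1', stream=True):
  print(progress.get('status'))
```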

### Push

```python
-ollama.push('user/llama3')
+ollama.push('user/llama3.1')
```

### Embeddings

```python
-ollama.embeddings(model='llama3', prompt='The sky is blue because of Rayleigh scattering')
+ollama.embeddings(model='llama3.1', prompt='The sky is blue because of Rayleigh scattering')
```
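The embeddings call returns a vector; a minimal sketch of reading it, assuming the `embedding` field from the REST API's embeddings endpoint:

```python
import ollama

# A hedged sketch: the response carries the embedding as a list of floats.
response = ollama.embeddings(model='llama3.1', prompt='The sky is blue because of Rayleigh scattering')
print(len(response['embedding']))
```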

### Ps
@@ -123,7 +123,7 @@ A custom client can be created with the following fields:
```python
from ollama import Client
client = Client(host='http://localhost:11434')
-response = client.chat(model='llama3', messages=[
+response = client.chat(model='llama3.1', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
@@ -139,7 +139,7 @@ from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
-  response = await AsyncClient().chat(model='llama3', messages=[message])
+  response = await AsyncClient().chat(model='llama3.1', messages=[message])

asyncio.run(chat())
```
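The awaited response has the same shape as the synchronous one; a minimal sketch that also prints the reply (the `print` line is an addition, not part of this diff):

```python
import asyncio
from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
  response = await AsyncClient().chat(model='llama3.1', messages=[message])
  # Added for illustration: read the reply from the response dictionary.
  print(response['message']['content'])

asyncio.run(chat())
```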
@@ -152,7 +152,7 @@ from ollama import AsyncClient

async def chat():
  message = {'role': 'user', 'content': 'Why is the sky blue?'}
-  async for part in await AsyncClient().chat(model='llama3', messages=[message], stream=True):
+  async for part in await AsyncClient().chat(model='llama3.1', messages=[message], stream=True):
    print(part['message']['content'], end='', flush=True)

asyncio.run(chat())
