Update docs and examples to llama3.1 #121

Merged (1 commit) on Jul 29, 2024
README.md: 8 changes (4 additions, 4 deletions)
@@ -14,7 +14,7 @@ npm i ollama
import ollama from 'ollama'

const response = await ollama.chat({
-  model: 'llama2',
+  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
@@ -34,7 +34,7 @@ Response streaming can be enabled by setting `stream: true`, modifying function
import ollama from 'ollama'

const message = { role: 'user', content: 'Why is the sky blue?' }
-const response = await ollama.chat({ model: 'llama2', messages: [message], stream: true })
+const response = await ollama.chat({ model: 'llama3.1', messages: [message], stream: true })
for await (const part of response) {
  process.stdout.write(part.message.content)
}
}
@@ -46,7 +46,7 @@ for await (const part of response) {
import ollama from 'ollama'

const modelfile = `
-FROM llama2
+FROM llama3.1
SYSTEM "You are mario from super mario bros."
`
await ollama.create({ model: 'example', modelfile: modelfile })
@@ -209,7 +209,7 @@ import { Ollama } from 'ollama'

const ollama = new Ollama({ host: 'http://127.0.0.1:11434' })
const response = await ollama.chat({
-  model: 'llama2',
+  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
```
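All README snippets above now reference llama3.1, so the model has to be available locally before any of them will run. A minimal sketch of fetching the model and then issuing the updated chat call, assuming only the ollama.pull and ollama.chat call shapes shown in this diff:

```typescript
import ollama from 'ollama'

// Ensure llama3.1 is present locally, then run the updated chat example.
// The pull/chat call shapes are taken from the snippets in this diff.
const model = 'llama3.1'
await ollama.pull({ model: model })

const response = await ollama.chat({
  model: model,
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
```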
examples/abort/any-request.ts: 2 changes (1 addition, 1 deletion)
@@ -8,7 +8,7 @@ setTimeout(() => {

try {
  ollama.generate({
-    model: 'llama2',
+    model: 'llama3.1',
    prompt: 'Write a long story',
    stream: true,
  }).then(
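The hunk above only touches the model name inside the generate call. For context, a rough sketch of how the surrounding example cancels the streamed request, assuming ollama.abort() stops in-flight streamed generations and rewriting the truncated then-chain with await (neither is shown in this diff):

```typescript
import ollama from 'ollama'

// Abort every in-flight streamed request after one second.
// Assumption: ollama.abort() cancels running streamed generations.
setTimeout(() => {
  console.log('\nAborting request...\n')
  ollama.abort()
}, 1000)

try {
  const stream = await ollama.generate({
    model: 'llama3.1',
    prompt: 'Write a long story',
    stream: true,
  })
  for await (const part of stream) {
    // Generate responses stream a `response` text field per Ollama's generate API.
    process.stdout.write(part.response)
  }
} catch (error) {
  console.log('\nThe request was aborted:', error)
}
```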
examples/abort/specific-request.ts: 2 changes (1 addition, 1 deletion)
@@ -11,7 +11,7 @@ setTimeout(() => {

try {
  ollama.generate({
-    model: 'llama2',
+    model: 'llama3.1',
    prompt: 'Write a long story',
    stream: true,
  }).then(
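The specific-request variant differs from the example above only in what gets aborted. A heavily hedged sketch, assuming the object returned by a streamed generate() call exposes an abort() method for cancelling just that one request (the return type is not shown in this diff):

```typescript
import ollama from 'ollama'

// Cancel one particular streamed request rather than all of them.
// Assumption: a streamed generate() returns an iterator exposing abort().
const stream = await ollama.generate({
  model: 'llama3.1',
  prompt: 'Write a long story',
  stream: true,
})

// Abort only this stream after one second.
setTimeout(() => stream.abort(), 1000)

try {
  for await (const part of stream) {
    process.stdout.write(part.response)
  }
} catch (error) {
  console.log('\nThis request was aborted:', error)
}
```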
examples/pull-progress/pull.ts: 2 changes (1 addition, 1 deletion)
@@ -1,6 +1,6 @@
import ollama from 'ollama'

-const model = 'llama2'
+const model = 'llama3.1'
console.log(`downloading ${model}...`)
let currentDigestDone = false
const stream = await ollama.pull({ model: model, stream: true })
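The rest of pull.ts is not shown in this hunk; it consumes the pull stream to report download progress. A sketch of that loop, assuming the status, digest, completed, and total fields of Ollama's pull progress responses:

```typescript
import ollama from 'ollama'

// Pull llama3.1 and print per-layer download progress.
// Assumption: progress parts carry status, digest, completed and total fields.
const model = 'llama3.1'
console.log(`downloading ${model}...`)
const stream = await ollama.pull({ model: model, stream: true })

for await (const part of stream) {
  if (part.digest) {
    const total = part.total ?? 0
    const completed = part.completed ?? 0
    const percent = total > 0 ? Math.round((completed / total) * 100) : 0
    process.stdout.write(`\r${part.status} ${percent}%`)
  } else {
    console.log(part.status)
  }
}
```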