
Ollama
Ollama API Integration
The Jspreadsheet Ollama extension is a frontend plugin that integrates the local, free Ollama API into Jspreadsheet Pro data grids. It enables automated content generation, response handling, and advanced data analysis. The Ollama service must be running locally so that queries from the frontend can reach the API and return real-time data to users.
Endless Possibilities
Generate Marketing Copy
Quickly generate marketing copy for products based on their attributes. Suppose the worksheet contains:
A1: "Product Name"
B1: "Product Description"
C1: "Marketing Copy"
A2: "Smartwatch"
B2: "Fitness tracking, sleep monitoring, 48-hour battery life"
In C2, you would enter:
=OLLAMA("Write an engaging marketing copy for a product named ", A2, " with these features: ", B2)
Documentation
Methods
| Method | Description |
|---|---|
| =OLLAMA(...ARGUMENTS) | Creates dynamic prompts by combining different values or text fragments, allowing interaction with AI models using customized prompts. |
Installation
This extension requires Jspreadsheet Server and a running Ollama instance.
Using NPM
$ npm install @jspreadsheet/ollama
Using a CDN
<script src="https://cdn.jsdelivr.net/npm/@jspreadsheet/ollama/dist/index.min.js"></script>
Configuration
The Ollama extension requires a reference to a running Ollama instance.
Running an Ollama Model locally with Docker
Add the following service to your docker-compose.yml:
ollama:
  image: ollama/ollama
  ports:
    - "11436:11434"
  volumes:
    - ollama_volume:/root/.ollama
  deploy:
    resources:
      reservations:
        devices:
          - driver: nvidia
            count: all
            capabilities: [gpu]
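Because the service mounts the named volume ollama_volume, Compose also expects a matching top-level volumes entry. A minimal sketch, with the volume name simply mirroring the one used above:
volumes:
  ollama_volume: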
After the container is running, open a bash shell in the Ollama service and start the desired model. In the example below, we run llama3, which is the default model for the extension:
$ docker compose exec ollama bash
$ ollama run llama3
The llama3 model is now reachable on host port 11436, as mapped in your docker-compose.yml file (the container itself listens on 11434).
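Before wiring up the extension, you can verify that the API is reachable from the host by querying Ollama's generate endpoint directly. A quick check against the port mapped above:
$ curl http://localhost:11436/api/generate -d '{"model": "llama3", "prompt": "Say hello", "stream": false}'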
Client Configuration
| Option | Description |
|---|---|
| url | The URL of the running Ollama instance the extension should send requests to. |
| onbeforesend | Modifies the default request options before sending the API request. onbeforesend(instance, requestOptions) => void |
| onsuccess | Performs an action after the API request has successfully completed. onsuccess(instance, cell, x, y) => void |
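As an illustration, the two callbacks can be combined in a single configuration. A sketch, with the logging purely illustrative:
ollama({
    url: 'http://localhost:11436',
    onbeforesend: function(instance, requestOptions) {
        // Adjust the request payload before it is sent to the Ollama API
        requestOptions.stream = false;
    },
    onsuccess: function(instance, cell, x, y) {
        // Runs after the cell has received its generated value
        console.log('OLLAMA result ready at column ' + x + ', row ' + y);
    },
});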
Default requestOptions
The default request options can be customized through the onbeforesend callback. The defaults are:
let requestOptions = {
    prompt: "{prompt passed from frontend spreadsheet}",
    model: 'llama3',
    stream: false,
};
Connect to your server and open your spreadsheet:
import jspreadsheet from 'jspreadsheet';
import formula from '@jspreadsheet/formula-pro';
import ollama from '@jspreadsheet/ollama';
// Connect to the Ollama API
ollama({
    url: 'http://localhost:11436',
    onbeforesend: function(instance, options) {
        // Update the default request options
        options.model = 'mistral';
        options.options = {
            temperature: 0
        };
    },
});

// Load the extensions
jspreadsheet.setExtensions({ formula, ollama });

// Connect to a spreadsheet
jspreadsheet(HTMLElement, {
    worksheets: [{
        minDimensions: [5, 5]
    }]
});
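With the extensions loaded, any cell in the grid can call the formula directly, for example:
=OLLAMA("Summarize the following text in one sentence: ", A1)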
More Examples
Translating Column to French
This example demonstrates using OLLAMA to translate an entire column.
jspreadsheet(HTMLElement, {
    worksheets: [
        {
            data: [
                ['Hello', '=OLLAMA("Translate ",A1," to French")'],
                ['Bye', '=OLLAMA("Translate ",A2," to French")'],
                ['Thanks', '=OLLAMA("Translate ",A3," to French")'],
                ['Sorry', '=OLLAMA("Translate ",A4," to French")'],
            ]
        }
    ]
});
Combining Word Semantics
This example demonstrates using OLLAMA to combine word semantics.
jspreadsheet(HTMLElement, {
    worksheets: [
        {
            data: [
                ['Flower', 'Bee', '=OLLAMA(A1," and ",B1," combined result in this word:")'],
            ]
        }
    ]
});