Plugin: Jarvis (AI assistant) [v0.9.1, 2024-11-02]

As far as I know, Google extended the period until billing starts to May 14.


Thank you so much for pulling this together! I'm just exploring some of the basics right now to get a feel for how it works. I have a lot of notes (including thousands that I didn't import to Joplin from Evernote), and having something that can help me search them & use them for context should prove very useful.

Is there a limit on the number of notes it can process? The "related notes" tab shows this message, and it seems to be stuck at 460. I'm using a paid Google Gemini account's API key. Maybe I need to compile from source & look into the logs?

Hi @erikprzekop, I suspect that there are notes that can break Jarvis (maybe some HTML notes, maybe ones imported from Evernote; I'm just guessing here, because it has never happened to me personally). It has happened to a user once, and it seems that we were able to isolate and skip the notes that caused the issue. I never had an example that I could debug, though. In any case, the process outlined in the link may help you identify and exclude the notes that get stuck. We can continue the discussion there.
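If it helps with narrowing things down, here is a minimal sketch (not part of Jarvis) that lists candidate problem notes through the Joplin Data API. It assumes the Web Clipper service is enabled on the default port, and that oversized or HTML notes are what trips the indexer, which is only a guess:

```python
# List notes that might stall the database build, via the Joplin Data API.
# TOKEN comes from Tools -> Options -> Web Clipper; 41184 is the default port.
import requests

TOKEN = "your-joplin-token"
BASE = "http://localhost:41184"

def iter_notes():
    page = 1
    while True:
        r = requests.get(f"{BASE}/notes", params={
            "token": TOKEN,
            "fields": "id,title,body,markup_language",
            "limit": 100,
            "page": page,
        })
        r.raise_for_status()
        data = r.json()
        yield from data["items"]
        if not data.get("has_more"):
            break
        page += 1

for note in iter_notes():
    is_html = note.get("markup_language") == 2       # 2 = HTML note (e.g. Evernote import)
    too_big = len(note.get("body") or "") > 200_000  # arbitrary size threshold
    if is_html or too_big:
        print(note["id"], note["title"])
```

Any note it flags could then be tagged and excluded before rebuilding the database.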


v0.8.3-v0.8.4

  • improve: exclude notes in trash from db
  • improve: OpenAI model updates
    • added gpt-4o (latest model)
    • deprecated legacy gpt-3.5-turbo-16k
      • gpt-3.5-turbo points to a newer version of this model
    • deprecated legacy gpt-4
    • all legacy models are still accessible via the openai-custom model setting
    • improved model descriptions with tokens / price category

Thank you for the excellent plugin, I love it! I followed the guide, and there is a more straightforward connection to Ollama that doesn't require LiteLLM: Ollama already serves an OpenAI-compatible API. Tested and working.

Offline chat model with Ollama
Install Ollama
Pick an LLM model to use from the Ollama library and run ollama run MODELNAME (e.g., ollama run llama3:latest) in a terminal
Configure the following settings:

Setting | Advanced | Value
Model: OpenAI API Key | No | Something, anything
Chat: Model | No | (online) OpenAI or compatible: custom model
Chat: Timeout (sec) | Yes | 600
Chat: OpenAI (or compatible) custom model ID | Yes | MODELNAME
Chat: Custom model is a conversation model | Yes | Yes
Chat: Custom model API endpoint | Yes | http://ollama-IP:11434/chat/completions
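If Jarvis can't reach the model with these settings, it may be worth confirming first that Ollama itself answers OpenAI-style chat requests. A minimal sketch, assuming Ollama is on localhost:11434 with llama3:latest pulled (note that when you query Ollama directly, its OpenAI-compatible routes live under /v1):

```python
# Sanity check outside Joplin: ask the Ollama server for a chat completion
# over its OpenAI-compatible API.
import requests

resp = requests.post(
    "http://localhost:11434/v1/chat/completions",  # replace localhost with ollama-IP if remote
    json={
        "model": "llama3:latest",  # whatever model you started with `ollama run`
        "messages": [{"role": "user", "content": "Say hello in one word."}],
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If this prints a reply but Jarvis still fails, the problem is more likely in the plugin settings than in the server.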


Thanks for the reminder @Midnight! I was vaguely aware that I had to update the guide and forgot to.

Updated it just now. I also improved the Xinference instructions, which now include a setup for both chat and notes / embedding models.

v0.8.5

  • new: separate settings sections for chat, related notes, annotations and research
  • fix: set default values for API keys
    • this is a workaround that ensures that keys are saved securely to your keychain (where available)
  • changed default settings
    • context tokens: increased to 2048
    • annotation: tags method changed to existing tags

v0.9.0

  • models
  • chat with notes / related notes
    • new: settings to customise what extra information goes into each block / chunk (default: all selected; see the sketch after this list)
      • title
      • full headings path
      • last heading
      • note tags
    • improve: chat with notes prompt and model compatibility
  • annotations
    • new: setting Annotate: Preferred language
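Purely as an illustration of what those chunk settings control, the extra context attached to each chunk might be composed roughly like this. This is not Jarvis's actual code; the names and formatting are made up for the example:

```python
# Illustration only: assembling a chunk's extra context from the four options
# (title, full headings path, last heading, note tags).
def chunk_header(note_title, heading_path, tags, *,
                 include_title=True, include_path=True,
                 include_last_heading=True, include_tags=True):
    parts = []
    if include_title:
        parts.append(f"Note: {note_title}")
    if include_path and heading_path:
        parts.append("Path: " + " > ".join(heading_path))
    if include_last_heading and heading_path:
        parts.append("Heading: " + heading_path[-1])
    if include_tags and tags:
        parts.append("Tags: " + ", ".join(tags))
    return "\n".join(parts)

# e.g. a chunk under "Trip planning" > "Summer" > "Packing list", tagged "travel"
print(chunk_header("Trip planning", ["Summer", "Packing list"], ["travel"]))
```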

Thanks for the update!

I get a "Model could not be loaded" panel error after restarting Joplin when Related Notes uses the new (offline) Ollama option. Here's what I did:

  • Set the related notes feature to use my functioning model ID and API endpoint from the chat settings page.
  • Restarted the application, clicked the OK button on the Jarvis intro popup, and saw the error.

Additionally:

  • Chat with your notes also didn't work, but Chat with Jarvis did, despite both using the same settings.
  • Every model I tested caused the same issue, even though they all work fine in Ollama and other LLM UIs.

Does anybody have an idea why this might happen or how I can fix it?

Also, what input is expected for the Notes: User CSS for notes panel field? I can't figure out how to target HTML tags inside the document.

I had to update the guide with instructions for the new Ollama API; here they are. Hopefully related notes and chat with your notes will work now.
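If the error persists after following the updated guide, it can help to rule out the server side. A minimal check, independent of Jarvis, assuming Ollama runs on localhost:11434 and using nomic-embed-text only as a placeholder for whichever embedding model you configured:

```python
# Ask the Ollama server for an embedding directly (native /api/embeddings route),
# to separate "the model cannot be loaded" from a plugin configuration problem.
import requests

resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "hello world"},
    timeout=60,
)
resp.raise_for_status()
vec = resp.json()["embedding"]
print(f"Got an embedding of dimension {len(vec)}")
```

If this works but the panel still shows the error, the issue is more likely in the related notes settings than in Ollama itself.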

To set the style of the text in the related notes panel, maybe try this:

.jarvis-semantic-section a, .jarvis-semantic-note a {color: red; background-color: yellow;}

I need to simplify the CSS a bit at some point, so that it's more easily customisable.

I love the plugin! Thanks for your hard work. I am manually copying and pasting prompts from fabric/patterns at main · danielmiessler/fabric · GitHub. Is there a better way to incorporate these?

Thanks for sharing the link @GhostReader!

As for storing prompt templates, you could use the following settings in Jarvis Chat -> Advanced Settings. You may then select these templates when you open the Ask Jarvis dialog (Cmd/Ctrl + Shift + J).
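Separately, if copying the fabric patterns by hand gets tedious, a small sketch like this could pull a pattern's system prompt straight from GitHub for pasting into those settings (the patterns/<name>/system.md layout and the summarize pattern name are assumptions about that repo):

```python
# Fetch a fabric pattern's system prompt from the GitHub repo so it can be
# pasted into the Jarvis prompt-template settings.
import requests

def fetch_pattern(name: str) -> str:
    url = ("https://raw.githubusercontent.com/danielmiessler/fabric/main/"
           f"patterns/{name}/system.md")
    r = requests.get(url, timeout=30)
    r.raise_for_status()
    return r.text

print(fetch_pattern("summarize"))
```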
