Need advice for integration with my local LLM

I'll be interested to read the results of this comparison.

I'm not familiar with how PrivateGPT works or how it handles queries like the ones you're interested in. In any case, what Jarvis does is calculate the embedding of your query and then search for the most similar text sections in your Joplin database. There are a few additional tricks in the guide and on the forum for steering the response in the right direction (it's worth reading these), but that's the basic mechanism (similar to this). This means that Jarvis will not use the LLM to iteratively construct text search queries until the results match your request, or run other iterative processes. (Could be an interesting project, though.)
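To make the mechanism concrete, here is a minimal sketch of that retrieval step. This is not Jarvis's actual code: the toy bag-of-words `embed` function stands in for a real embedding model (such as USE or an OpenAI embedding endpoint), but the ranking logic is the same idea: embed the query, then sort note sections by cosine similarity.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-count vector.
    # A real setup would call an embedding model here instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_sections(query: str, sections: list[str], k: int = 2) -> list[str]:
    # Rank all stored sections by similarity to the query embedding.
    q = embed(query)
    ranked = sorted(sections, key=lambda s: cosine(q, embed(s)), reverse=True)
    return ranked[:k]

sections = [
    "How to configure Jarvis embeddings in Joplin",
    "Shopping list for the weekend",
    "Notes on local LLM model settings",
]
print(top_sections("local LLM settings", sections, k=1))
```

The top-ranked sections are then passed to the chat model as context for generating the response, which is why a single retrieval pass (rather than an iterative search loop) is all that happens per query.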

Note that Jarvis works with external models, and you can define which model to use for generating the response (default: OpenAI, which privacy-conscious users may prefer to change) and which model to use for generating note embeddings (default: the offline USE). Also note that if you wish to process code sections, you will need to switch this on in the settings.
