GSoC 2026 – Idea 1 (AI Search): Architecture Questions Before Final Submission

I'm applying for Idea 1 (AI-Supported Search) and have a working proof-of-concept plugin and an open PR (#14865). While finalising my proposal, I have a few architecture questions I'd love mentor input on before the deadline.

**1. Plugin vs core implementation preference**

The idea description says "this can be developed as an external application or possibly as part of the core application." Given the shared infrastructure discussion, do mentors prefer the search feature as a standalone plugin (simpler, independent) or integrated into Joplin's core search engine? This affects the entire architecture of my proposal.

**2. Embedding model loading in the plugin sandbox**

During PoC development I confirmed that native modules like onnxruntime-node and hnswlib-node cannot be loaded inside Joplin's plugin sandbox due to webpack bundling constraints. My current solution bundles the model weights directly into the plugin package and loads them from local files via the panel webview. Is there a recommended approach for this, or should I assume the shared infrastructure project will handle model loading in the future, and design my plugin to consume that?
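Since hnswlib-node cannot be loaded in the sandbox, my current fallback is a brute-force cosine-similarity scan over embeddings stored as plain arrays. A minimal sketch of that fallback (the `NoteEmbedding` shape and `topK` helper are my own illustrative names, not part of any Joplin API):

```typescript
// Illustrative fallback: without an ANN library in the sandbox, scan
// all stored note embeddings with plain cosine similarity.
interface NoteEmbedding {
  noteId: string;
  vector: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  // Guard against zero vectors to avoid dividing by zero.
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

function topK(query: number[], index: NoteEmbedding[], k: number): NoteEmbedding[] {
  return index
    .map(e => ({ e, score: cosineSimilarity(query, e.vector) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map(s => s.e);
}
```

This is O(n) per query, which seems acceptable for typical notebook sizes until the shared infrastructure provides a proper ANN index.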

**3. Hybrid search — BM25 vs Joplin's existing FTS**

For the keyword component of hybrid ranking, should I use Joplin's existing FTS4 search engine via the Data API, or implement BM25 separately? Using the existing engine is simpler but means I depend on the Data API's search capabilities.
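Either way, the two ranked lists still need to be merged. My current plan is reciprocal rank fusion (RRF), which only needs rank positions, not comparable scores, so it works with FTS4's opaque ranking. A sketch under that assumption (function name is mine; k = 60 is the conventional RRF constant):

```typescript
// Reciprocal rank fusion: combine two ranked lists of note IDs
// (keyword results and semantic results) using only rank positions.
// Each id scores 1 / (k + rank) per list it appears in; ids present
// in both lists accumulate both contributions and rise to the top.
function reciprocalRankFusion(
  keywordIds: string[],
  semanticIds: string[],
  k = 60,
): string[] {
  const scores = new Map<string, number>();
  const add = (ids: string[]) => {
    ids.forEach((id, rank) => {
      scores.set(id, (scores.get(id) ?? 0) + 1 / (k + rank + 1));
    });
  };
  add(keywordIds);
  add(semanticIds);
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([id]) => id);
}
```

For example, fusing `["a", "b"]` with `["b", "c"]` ranks `b` first, since it appears in both lists.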

Thank you — happy to share the PoC repo or proposal draft for feedback.

I’d love to hear opinions from @laurent and @shikuz on this.

Hello, please update your draft proposal according to the template.

I’ve made the required changes; here’s the draft proposal: GSoC 2026 Proposal Draft – Idea 1: AI-Supported Search for Notes – Amirtha Yazhini M