Express.js server in a Plugin

Hi, everyone! I am currently developing a plugin that summarizes notes and notebooks. I've had trouble running ML and related libraries (Transformers.js, WebLLM, Pyodide, etc.) in Joplin.

A couple of weeks ago, I suggested running a local Express.js server as a microservice: the plugin would send note contents to it, and the server would return the predictions.

Last weekend, I discovered that I can access the unpacked plugin .jpl contents via joplin.plugins.installationDir(). So I considered using Webpack's CopyPlugin to ship a Node.js project containing an Express.js server alongside the plugin, and then starting that server with a Node.js child process. That way, we could easily use ML tools and Python scientific packages.
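For concreteness, here is a minimal sketch of what I have in mind for the build step. The server/ directory name and its contents are assumptions on my part, just for illustration:

// webpack.config.js (sketch): copy a hypothetical server/ folder, which contains
// the Express app and its own node_modules, next to the plugin bundle in dist/.
const CopyPlugin = require('copy-webpack-plugin');
const path = require('path');

module.exports = {
	// ...existing plugin config
	plugins: [
		new CopyPlugin({
			patterns: [
				{ from: path.resolve(__dirname, 'server'), to: 'server' },
			],
		}),
	],
};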

I am not aware of any problems with this approach besides the following:

  1. When to shut down the Express.js server
    • destroy all child processes before Joplin quits
    • use timeouts
  2. Users might already be using the same port for their own development
    • pick a port number that is unlikely to be in use
  3. joplin.plugins.installationDir() is a temporary directory
    • store the Node.js project in joplin.plugins.dataDir() instead, with versioning so it can be refreshed when the plugin is updated (see the sketch after this list)
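To make points 1 and 3 concrete, this is roughly how I imagine the plugin side. Everything that is not part of the plugin API is a placeholder for illustration: the server/ folder, the port, the version file, the fs-extra dependency, and the assumption that a node binary is on the user's PATH. As far as I know there is no dedicated shutdown hook in the plugin API, so the process 'exit' handler is just my current best guess.

// src/index.ts (sketch)
import joplin from 'api';
import { spawn, ChildProcess } from 'child_process';
import * as path from 'path';
import * as fs from 'fs-extra';

const SERVER_VERSION = '0.1.0'; // bump on every plugin release so the copy is refreshed
const PORT = 43917;             // arbitrary high port, unlikely to collide with local dev

let server: ChildProcess | null = null;

async function startServer() {
	const installDir = await joplin.plugins.installationDir(); // temporary location
	const dataDir = await joplin.plugins.dataDir();            // persistent plugin storage
	const target = path.join(dataDir, 'server');
	const versionFile = path.join(target, '.version');

	// Point 3: keep the server in dataDir() and re-copy it only when the version changes.
	const installed = (await fs.pathExists(versionFile)) ? await fs.readFile(versionFile, 'utf8') : '';
	if (installed !== SERVER_VERSION) {
		await fs.copy(path.join(installDir, 'server'), target);
		await fs.writeFile(versionFile, SERVER_VERSION, 'utf8');
	}

	// Start the Express app with whatever Node binary is available on the user's PATH.
	server = spawn('node', [path.join(target, 'index.js'), String(PORT)], { stdio: 'ignore' });
}

function stopServer() {
	// Point 1: make sure the child process does not outlive Joplin.
	if (server && !server.killed) server.kill();
}

process.on('exit', stopServer);

joplin.plugins.register({
	onStart: async () => {
		await startServer();
	},
});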

I would like to hear from the community and the Joplin team whether this solution could work.

Hmm, I feel like rather than explaining the solution, it would be better to explain the problem. In general we should aim to reduce dependencies as much as possible, and a server running Python is far from ideal for a plugin. Why is it necessary? Is there really no option to run the code as WASM, for instance? Perhaps we can create a plugin API to make that easier?


Alright, I will explain the problem. I might have figured out where the problem is!

Problems with Transformers.js

It seems that someone in the community already tried to run the library in a plugin, in this topic: Using native node modules in plugins. I ran into exactly the same issues and added my input there.

First problem

When I installed Transformers.js and ran the plugin, I got this:

You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
(Source code omitted for this binary file)

So then I installed node-loader and got this:

Uncaught Error: node-loader:
Error: ENOENT, services/plugins/node_modules/@xenova/transformers/node_modules/onnxruntime-node/bin/napi-v3/darwin/arm64/onnxruntime_binding.node not found in /Applications/Joplin.app/Contents/Resources/app.asar

I managed to solve it with the configuration below; it is the same workaround that works for Webpack in Next.js.

const path = require('path');

const pluginConfig = {
	// ...base config
	entry: './src/index.ts',
	resolve: {
		alias: {
			api: path.resolve(__dirname, 'api'),
			// Aliasing to false makes Webpack ignore these modules instead of
			// trying to bundle their native .node binaries.
			'sharp$': false,
			'onnxruntime-node$': false,
		},
	},
	// ...
};

However, that introduced another problem.

Second problem

Something went wrong during model construction (most likely a missing operation). Using `wasm` as a fallback. 
Kr @ plugin_com.example.JoplinSummarizeAILocal.js:2
2plugin_com.example.JoplinSummarizeAILocal.js:2 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'create')
   at Kr (plugin_com.example.JoplinSummarizeAILocal.js:2:681968)
   at async Promise.all (/Users/billtonhoang/Documents/GitHub/joplin-hahabill/joplin-hahabill-resolved/packages/app-desktop/services/plugins/index 1)
   at async zo.from_pretrained (plugin_com.example.JoplinSummarizeAILocal.js:2:688061)
   at async Mc.from_pretrained (plugin_com.example.JoplinSummarizeAILocal.js:2:721345)
   at async Promise.all (/Users/billtonhoang/Documents/GitHub/joplin-hahabill/joplin-hahabill-resolved/packages/app-desktop/services/plugins/index 1)
   at async plugin_com.example.JoplinSummarizeAILocal.js:2:788312
   at async Zh (plugin_com.example.JoplinSummarizeAILocal.js:2:787993)
   at async a.predict (plugin_com.example.JoplinSummarizeAILocal.js:2:561693)

Because the build tool removes onnxruntime-web from the final bundle, the library cannot find the InferenceSession object to call create on.
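For context, the trace above is triggered by the plain Transformers.js entry point; there is nothing unusual on the calling side. Roughly like this (the model name is just an example):

import { pipeline } from '@xenova/transformers';

async function summarize(noteBody: string) {
	// pipeline() goes through from_pretrained(), which calls InferenceSession.create()
	// on whichever ONNX backend survived the build, hence the undefined 'create' above.
	const summarizer = await pipeline('summarization', 'Xenova/distilbart-cnn-6-6');
	return summarizer(noteBody, { max_new_tokens: 128 });
}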

I documented a more detailed investigation here: Bonding Period Update - Week 1-2 - #4 by allanmax

Final Thoughts

It seems that node-loader cannot find the .node files. It might be related to a problem I had with word2vec, which used child_process to execute .sh files: it could not locate the .sh files, and I had to copy them into the dist folder, which lives in ~/.config/joplin-desktop/cache/<PLUGINID>.

Could it be that the plugin cannot find the .node files in the dist folder at runtime, and that is why the node-loader does not work? If so, how can we tell the node-loader to load .node files, which are located in the dist folder?
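For reference, a typical node-loader setup is essentially what its documentation shows, sketched below. The name option is the part I suspect matters here, since it controls where the .node file is emitted relative to the bundle. Treat this as a sketch, not a confirmed fix:

module.exports = {
	// ...
	module: {
		rules: [
			{
				test: /\.node$/,
				loader: 'node-loader',
				// Emit the binding next to the plugin bundle so the relative
				// require() path can resolve inside dist/ at runtime (assumption).
				options: { name: '[name].[ext]' },
			},
		],
	},
};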

I haven't tried converting Python code to WASM myself; so far I have only tried relying on other open-source libraries, without success. But I can investigate and experiment further!

How would creating a new plugin API work? Do you mean extending this: joplin | Joplin Plugin API Documentation?