This feature request is not my own but was brought up by @udaypb, one of our GSoC applicants, in their introduction. I'm not sure of the full details of the request, but I'd love to see it fleshed out here.
Thanks, @bedwardly-down, for posting this feature.
The idea behind this feature is to provide improved accessibility options for any type of note on Joplin, whether collaborative or individual. The feature would allow students with learning disabilities to access note content more easily, thereby making the application more productive for them.
This feature overlaps with the OCR (optical character recognition) feature in that it tries to convert text-based or handwritten content into audio or sign language.
For instance, if I want to listen to handwritten content on Joplin, I would use this feature to do it. On the development side, the handwritten content would have to be converted to audio either by semi-conversion (directly to audio) or by full conversion (handwritten to text, then text to audio).
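The two conversion paths could be sketched roughly like this. This is only an illustration of the control flow, not Joplin code: the function names, the `ConversionMode` type, and the stub bodies are all hypothetical stand-ins for real OCR and text-to-speech engines.

```typescript
type ConversionMode = 'semi' | 'full';

// Hypothetical OCR step: a real build would call an OCR engine here
// (e.g. Tesseract); this stub just stands in for that call.
function handwritingToText(imagePath: string): string {
  return `recognized text from ${imagePath}`;
}

// Hypothetical text-to-speech step: would hand the text to a TTS
// engine and return the generated audio.
function textToAudio(text: string): string {
  return `audio for: ${text}`;
}

// Hypothetical direct handwriting-to-audio step (semi-conversion).
function handwritingToAudio(imagePath: string): string {
  return `audio for: ${imagePath}`;
}

// Choose between the two paths described above.
function convertNote(imagePath: string, mode: ConversionMode): string {
  if (mode === 'semi') {
    return handwritingToAudio(imagePath);     // image -> audio directly
  }
  const text = handwritingToText(imagePath);  // image -> text
  return textToAudio(text);                   // text -> audio
}
```

The full-conversion path has the advantage that the intermediate text could also feed other accessibility features, such as the jargon highlighting below.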
This is the most basic example I could think of. There could be one more possible addition to this feature - highlighting and hyperlinking technical terms or jargon in notes. For example, if content tagged as math has been uploaded, then any jargon in the text that most people may be unfamiliar with would be highlighted. On the development side, a dataset would need to be collected to help determine whether a word is jargon or not.
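As a minimal sketch of the highlighting idea: once a jargon dataset exists, it could be applied to a note as a simple lookup pass. The word set here is a tiny placeholder for that dataset, and wrapping matches in `==...==` assumes Joplin's optional `==mark==` Markdown extension is enabled.

```typescript
// Placeholder for the collected jargon dataset the post mentions.
const mathJargon = new Set(['integral', 'derivative', 'manifold']);

// Wrap every jargon term in ==...== so the Markdown renderer
// highlights it; non-jargon words pass through unchanged.
function highlightJargon(text: string, jargon: Set<string>): string {
  return text.replace(/[A-Za-z]+/g, (word) =>
    jargon.has(word.toLowerCase()) ? `==${word}==` : word,
  );
}
```

The same lookup could return a glossary URL instead, producing a `[term](url)` hyperlink rather than a highlight.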
I will try to share the development steps that I am planning to take to build the feature in its simplest form and integrate it into Joplin's mobile application.