Hey everyone! My first Logseq plugin, logseq-plugin-gpt3-openai, just landed in the marketplace. It lets you run GPT-3 text summarization and generation commands within Logseq via the OpenAI API. Check out the repo for GIF demos. I would really appreciate any feedback. Let me know if you find it helpful or have any suggestions!
I was able to install the plugin from the marketplace and enter the OpenAI API key under plugin settings. However, I get an error (OpenAI Plugin Error) when I enter /gpt3 after a question. Any pointers to resolve this? Thank you!
Will there be a chat completion interface for this? I’d love to use it with gpt-3.5-turbo or gpt-4. Happy to help work on this if there is interest.
Sorry, just saw 3.5-turbo is supported. Yay!
To really be in line with the privacy ideals of Logseq, one would need a local AI.
Is this even possible? I can imagine that even the most “basic” functions require huge amounts of data and computing power. Of course, this would only work for stand-alone functions like the student-teacher dialogue etc., not for the “asking questions” functions.
Thank you very much for providing this plugin.
I downloaded the latest version and can get into the GPT table. But when I use any function, it prompts “Unknown error”. I’m not sure if it’s a network problem or something wrong with my API key.
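In case it helps anyone hitting this: the same key can be tested outside Logseq with a direct API call, which narrows down whether the key, the network, or the plugin is at fault. A minimal sketch using the official `openai` Python package (v1+); the model name is just an example:

```python
# Quick check of the API key outside Logseq (assumes `pip install openai`, v1.x).
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # paste the same key used in the plugin settings

try:
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # example model; any model your key can access works
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(resp.choices[0].message.content)  # success: key and network are fine
except Exception as err:
    print("Request failed:", err)  # the error usually distinguishes auth vs. connectivity problems
```

If this succeeds, the key and network are fine and the issue is more likely in the plugin settings; if it fails, the error message usually says whether it’s an invalid key, quota, or connectivity.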
There are drop-in replacements, AFAIK, that expose the same API as OpenAI, so services that are meant to talk to OpenAI can be redirected to local ones. If I am not mistaken, GPT4All provides such an interface, and so does the LocalAI platform (see the sketch below).
Of course, those run the AI on (and are thus limited by) your own hardware, though they can also be self-hosted and accessed over a network.
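As a rough sketch of what such a redirect looks like with an OpenAI-style client (the endpoint URL and model name below are placeholders; GPT4All and LocalAI each document their own):

```python
# Pointing an OpenAI-style client at a local, OpenAI-compatible server instead of api.openai.com.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical local endpoint (e.g. a LocalAI/GPT4All server)
    api_key="not-needed-locally",         # many local servers ignore the key, but the client requires one
)

resp = client.chat.completions.create(
    model="local-model",  # placeholder; use whatever model the local server has loaded
    messages=[{"role": "user", "content": "Summarize this note for me."}],
)
print(resp.choices[0].message.content)
```

Anything that lets you override the API base URL could, in principle, be redirected to such a local server the same way.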