I wrote MinimalGPT in about a weekend as a minimal chat client where everything is stored client side (the chat messages you send to the API aside, obviously).

Entire conversations are stored locally in your browser instead of in a database, etc.
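
To give a rough idea of how that works, here's a simplified sketch (the names here are illustrative, not the project's exact code): conversations are just serialized into the browser's localStorage.

```typescript
// Simplified sketch of client-side conversation storage (illustrative names,
// not MinimalGPT's actual code): conversations live in the browser's
// localStorage, so nothing is persisted on a server.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

interface Conversation {
  id: string;
  title: string;
  messages: Message[];
}

const STORAGE_KEY = "conversations"; // assumed key name

function loadConversations(): Conversation[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Conversation[]) : [];
}

function saveConversation(conversation: Conversation): void {
  // Replace any existing copy of this conversation, then persist the list.
  const all = loadConversations().filter((c) => c.id !== conversation.id);
  all.push(conversation);
  localStorage.setItem(STORAGE_KEY, JSON.stringify(all));
}
```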

It supports both GPT-3.5 and GPT-4, as well as basic DALL-E image generation. Bard integration is possible in the future if anyone actually wants it.

The GitHub repo is available here

It’s nothing crazy, but as a simple chat client without any BS, it’s nice.

You have to provide your own API key, but they hand them out like candy, so have a blast!

Edit - Pushed out a small update that adds a toggle for auto-saving new conversations. If disabled, new conversations are only saved (locally) when you press the save icon.

After a conversation has been saved, it is automatically updated and re-saved every time you send a message from then on.
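
Continuing the illustrative sketch above, that save behavior might look roughly like this (again, hypothetical helper names, not the actual implementation):

```typescript
// Sketch of the auto-save behavior described above, reusing the illustrative
// helpers from the earlier sketch: unsaved conversations are only written to
// localStorage when the user explicitly saves (or immediately if auto-save is
// on); once a conversation exists in storage, every new message updates it.
function onMessageSent(conversation: Conversation, autoSave: boolean): void {
  const alreadySaved = loadConversations().some((c) => c.id === conversation.id);
  if (alreadySaved || autoSave) {
    saveConversation(conversation); // update the stored copy
  }
  // Otherwise the conversation stays in memory until the save icon is pressed.
}
```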

  • SimpleDev@lemmy.world (OP) · 1 year ago

    If anyone runs across this in the future and doesn’t want to revive a dead thread, feel free to message me!

    • BadRS@lemmy.world · 1 year ago

      So, is there any way to get it working as a front-end for a local LLaMA installation?

      • SimpleDev@lemmy.world (OP) · 1 year ago

        Not currently, unfortunately. I just started work on adding PaLM API support in some form.

        I’ll have to look into LLaMA after that is complete; I hadn’t even thought about that one yet, haha.

  • EraNet@lemmy.world · 1 year ago

    I’ve added an API key and I’m still getting: “An error occurred while fetching a response.”

    • SimpleDev@lemmy.world (OP) · 1 year ago

      You might double-check that the API key is correct.

      Also make sure you have access to the model you’ve selected; you may not have GPT-4 access by default if that’s what you’re using.

      I believe it’s GPT-3.5 by default, and you can request GPT-4 access.
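
      For context, the client presumably just calls OpenAI’s chat completions endpoint directly with your key, so a bad key or a model your account can’t use comes back as an API error and surfaces as that generic message. Here’s a rough sketch of that kind of request (not MinimalGPT’s exact code):

      ```typescript
      // Rough sketch of a direct OpenAI chat completions call. An invalid key
      // typically returns 401; requesting a model you don't have access to
      // also returns an error, which would surface as the generic
      // "error occurred while fetching a response" message.
      type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

      async function chat(apiKey: string, model: string, messages: ChatMessage[]) {
        const res = await fetch("https://api.openai.com/v1/chat/completions", {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${apiKey}`,
          },
          body: JSON.stringify({ model, messages }),
        });
        if (!res.ok) {
          const err = await res.json();
          throw new Error(err.error?.message ?? `HTTP ${res.status}`);
        }
        return res.json();
      }
      ```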

      • EraNet@lemmy.world · 1 year ago

        It gives me GPT-3.5 Turbo and GPT-4 options. Not sure what “Turbo” is.

        I am just a regular 3.5 user.

        • SimpleDev@lemmy.world (OP) · 1 year ago

          Turbo is just what they call their 3.5 model, for some reason.

          Odd that it’s not working for you; it seems fine for others at the moment. The only other thing I can think of is something blocking JavaScript or the browser’s localStorage.
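
          If you want to rule out the localStorage angle, a quick generic check you can run in the browser console (not something from the app itself) is:

          ```typescript
          // Quick check for whether the browser allows localStorage (some
          // privacy settings or shields block it); generic snippet, not from
          // MinimalGPT.
          function localStorageAvailable(): boolean {
            try {
              const probe = "__storage_probe__";
              localStorage.setItem(probe, probe);
              localStorage.removeItem(probe);
              return true;
            } catch {
              return false;
            }
          }
          ```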

          • EraNet@lemmy.world · 1 year ago

            Should this work on an Android device (I’m using the Brave browser), or just on desktop?

            • SimpleDev@lemmy.world (OP) · 1 year ago

              It should work on any platform; Brave might be blocking some scripts.

              If it works on desktop or mobile Chrome, etc., it’s probably just Brave being overly aggressive.

    • SimpleDev@lemmy.world (OP) · 1 year ago

      It is a progressive web app, so you can save it to your home screen and it behaves like a native app on both iOS and Android.
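
      For reference, “progressive web app” here roughly means the page ships a web app manifest and registers a service worker, which is what lets the browser install it like an app. A minimal sketch of the registration side (the file name is an assumption, not the project’s actual path):

      ```typescript
      // Minimal sketch of service worker registration, the piece that
      // (together with a web app manifest) typically makes a page installable
      // as a PWA. "/service-worker.js" is an assumed path, not MinimalGPT's.
      if ("serviceWorker" in navigator) {
        navigator.serviceWorker
          .register("/service-worker.js")
          .catch((err) => console.warn("Service worker registration failed:", err));
      }
      ```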

  • johntash@eviltoast.org · 1 year ago

    Nice, this looks pretty good. Any chance of adding an export option for the chats, or maybe an option to store them server side?

    • SimpleDev@lemmy.world (OP) · 1 year ago

      Thanks!

      Good question. I hadn’t really thought about it before, but I think an export is possible. Conversations are stored in your browser’s local storage, so adding the ability to export/import seems entirely doable.

      It might even be possible to generate a web link on export that you could use to load conversations on another device.
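
      As a rough sketch of how that export/import could look given the localStorage approach (purely illustrative; none of this exists in the app yet):

      ```typescript
      // Hypothetical export/import of conversations stored under a
      // "conversations" localStorage key (the key name is an assumption).
      function exportConversations(): void {
        const data = localStorage.getItem("conversations") ?? "[]";
        const blob = new Blob([data], { type: "application/json" });
        const a = document.createElement("a");
        a.href = URL.createObjectURL(blob);
        a.download = "minimalgpt-conversations.json";
        a.click(); // trigger a download of the JSON dump
        URL.revokeObjectURL(a.href);
      }

      async function importConversations(file: File): Promise<void> {
        const text = await file.text();
        JSON.parse(text); // validate before storing
        localStorage.setItem("conversations", text);
      }
      ```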

      As far as server-side storage goes, I’d personally like to handle as little of other people’s data as possible, even if it’s just AI chat conversations. It’s not out of the question, though; I just need to look into it further.

      Edit - As mentioned here, I will add the ability to export. I’ve also added a GitHub issue for the feature request.

      Now I just have to not be lazy and do it eventually.

  • player2@lemmy.world · 1 year ago

    This looks cool, but I’m also having trouble using it on Android. I installed the app as shown and entered the API key I generated for this, but I get the error fetching a response. I tried both the 3.5 and 4 models; I’m not sure how to check which one I have access to.

    • SimpleDev@lemmy.world (OP) · 1 year ago

      If you’ve created a new account and generated an API key, I think you just need to go to Billing → Overview and set up your payment method.

      I think you get a $20 credit for signing up; after that, your key is active.

      Edit - The default model will be 3.5 as well. I think you still have to request GPT-4 access, but they’re giving it out pretty quickly these days, I think.

      Edit 2 - Actually, it looks like GPT-4 should be available from the start now as well. Nice!

      • player2@lemmy.world · 1 year ago

        Thanks for the reply. I haven’t given them my payment info yet, so maybe that’s the issue; I may try that later. Their website made it seem like I was currently on the free plan, which included the $20 credit, but that makes sense: maybe it’s just not active yet.