(des)mosthenes@lemmy.world to AI@lemmy.ml · 1 year ago · 63 points · 16 comments

GPT4All - A free-to-use, locally running, privacy-aware chatbot. No GPU or internet required.

  • Einar@lemm.ee · 12 points · 1 year ago
    Here is the link: https://gpt4all.io/
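
A minimal sketch of driving GPT4All from its official Python bindings (pip install gpt4all), for anyone who wants to script the linked project instead of using the desktop app. This assumes the current bindings' API; the model filename is only an example from the GPT4All catalog and may have changed since.

```python
# Sketch: run GPT4All locally via the Python bindings.
# The model filename is an example from the catalog and may differ today.
from gpt4all import GPT4All

# Downloads the model on first use, then runs fully offline on the CPU.
model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf")

# chat_session() keeps conversational context across generate() calls.
with model.chat_session():
    reply = model.generate("Explain in one sentence what a locally run LLM is.",
                           max_tokens=128)
    print(reply)
```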

  • (des)mosthenes@lemmy.world (OP) · 8 points · 1 year ago
    It works surprisingly well on Apple ARM architecture; responses are typically near-instant.

  • PlutoniumAcid@lemmy.world · 7 points · 1 year ago
    There’s a long list of models to choose from. How do I pick one? What are the differences and the benefits/drawbacks?

    • (des)mosthenes@lemmy.world (OP) · 2 points · 1 year ago
      It’s going to take some experimentation; there’s a list of all the models in the app and on the site, plus maybe a little googling. You can still use OpenAI too. Mistral is solid overall, though, and is good for programming.
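
As a rough illustration of browsing that model list programmatically rather than in the app: a sketch assuming the Python bindings still expose GPT4All.list_models() and that the catalog field names used below ("name", "filesize", "ramrequired") are unchanged; check the docs if they have moved.

```python
# Sketch: print the downloadable-model catalog that the GPT4All app shows.
# list_models() and the field names are assumptions about the current bindings.
from gpt4all import GPT4All

for entry in GPT4All.list_models():
    name = entry.get("name", "?")
    size_gb = int(entry.get("filesize", 0)) / 1e9   # catalog stores bytes as a string
    ram_gb = entry.get("ramrequired", "?")
    print(f"{name}: ~{size_gb:.1f} GB download, ~{ram_gb} GB RAM suggested")
```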

  • gandalf_der_12te@feddit.de · 7 points · 1 year ago (edited)
    Where did this come from? Did OpenAI finally release the source code? And where does the training data come from? Is the training data public domain/appropriately licensed?

    • (des)mosthenes@lemmy.world (OP) · 1 point · 1 year ago
      That’s all on the site. OpenAI offers an API for access to OpenAI models.

  • Empricorn@feddit.nl · +7/−1 · 1 year ago
    Is running it on a GPU a bad thing? I’m new to this…

    • Fisch@lemmy.ml · 13 points · 1 year ago
      No, a GPU would be ideal, but not everyone has one, especially one with enough VRAM. I have an AMD card with 12 GB of VRAM and can run 7B–13B models, but even the 7B models (which seem to be the smallest that are still good) use a little more than 8 GB of VRAM, and most people probably have an Nvidia card with 8 GB or less. 13B models get very close to using the full 12 GB.

    • DoYouNot@lemmy.world · 9 points · 1 year ago
      Not everyone has a dedicated GPU, I would guess. GPUs are good at doing tensor calculations, but they’re not the only way.

    • Sabata11792@kbin.social · 2 points · 1 year ago
      It’s better if you have a good GPU, but it will still run without a card from the last few years. It can run on the CPU, but it’s much slower.
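
To put rough numbers behind the VRAM figures in this sub-thread: a model's weights take roughly (parameter count) times (bytes per weight), before the context/KV-cache and runtime overhead that push real usage higher. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope weight-memory estimate; real usage is higher once the
# KV cache and runtime overhead are added.
def approx_gib(params_billion: float, bits_per_weight: int) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

for params in (7, 13):
    for bits in (4, 8, 16):
        print(f"{params}B @ {bits}-bit: ~{approx_gib(params, bits):.1f} GiB")
```

By this estimate a 4-bit 7B model is about 3.3 GiB of weights, which is why it fits on 8 GB cards or in ordinary system RAM for CPU inference, while an 8-bit 13B model is already around 12 GiB, in line with the experience described above.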

  • DoucheBagMcSwag@lemmy.dbzer0.com · +4/−5 · 1 year ago
    Uh…how are they going to pay for the server load?

    • sane@feddit.de · 10 points · 1 year ago
      locally running

      • DoucheBagMcSwag@lemmy.dbzer0.com · 2 points · 1 year ago
        Oh cool!

    • null@slrpnk.net · 4 points · 1 year ago
      What server load?

    • hswolf@lemmy.world · 2 points · 1 year ago
      You just need to pay your energy bill on time, that’s all.

      • DoucheBagMcSwag@lemmy.dbzer0.com · 1 point · 1 year ago
        I just read that it’s local.

AI@lemmy.ml

artificial_intel@lemmy.ml

Artificial intelligence (AI) is intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality. The distinction between the former and the latter categories is often revealed by the acronym chosen.

Visibility: Public. This community can be federated to other instances and be posted/commented in by their users.

  • 1 user / day
  • 16 users / week
  • 91 users / month
  • 1.1K users / 6 months
  • 47 local subscribers
  • 4.83K subscribers
  • 528 Posts
  • 1.61K Comments