☆ Yσɠƚԋσʂ ☆@lemmygrad.ml to Technology@lemmygrad.ml · English · 6 months ago

LocalAI is the free, open source locally run drop-in replacement REST API for OpenAI

github.com

21 points · 15 comments
  • cross-posted to: opensource@lemmy.ml
GitHub - mudler/LocalAI: :robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference

github.com
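Because LocalAI exposes an OpenAI-compatible REST API, existing OpenAI client code should only need a different base URL. A minimal stdlib sketch, assuming a LocalAI instance on `localhost:8080` (its common default port) serving a model aliased to the name `gpt-4`; the host, port, and model name are assumptions, not values from the post:

```python
import json
from urllib import request

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style /v1/chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(base_url: str, model: str, prompt: str) -> dict:
    """POST the payload to an OpenAI-compatible endpoint, return parsed JSON."""
    data = json.dumps(build_chat_request(model, prompt)).encode()
    req = request.Request(
        f"{base_url}/v1/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Assumes a running LocalAI instance; no request is made on import.
    reply = chat("http://localhost:8080", "gpt-4", "Hello!")
    print(reply["choices"][0]["message"]["content"])
```

The same request body works with the official `openai` client libraries by pointing their `base_url` at the local server instead of api.openai.com.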
  • ☆ Yσɠƚԋσʂ ☆@lemmygrad.ml (OP) · 1 point · 6 months ago

    This one is multimodal and can generate images.
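Image generation mentioned above goes through the OpenAI-style images endpoint. A sketch of the request body for `/v1/images/generations`, again assuming a LocalAI instance on `localhost:8080` with an image backend configured (endpoint path follows the OpenAI API shape; host, port, and prompt are illustrative assumptions):

```python
import json
from urllib import request

def build_image_request(prompt: str, size: str = "512x512") -> dict:
    """Build an OpenAI-style /v1/images/generations request body."""
    return {"prompt": prompt, "size": size}

def generate_image(base_url: str, prompt: str, size: str = "512x512") -> dict:
    """POST an image-generation request and return the parsed JSON response."""
    data = json.dumps(build_image_request(prompt, size)).encode()
    req = request.Request(
        f"{base_url}/v1/images/generations",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Assumes a running LocalAI instance with an image model (e.g. Stable
    # Diffusion) configured; no request is made on import.
    result = generate_image("http://localhost:8080", "a red bicycle")
    print(result["data"][0]["url"])
```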

    • Comprehensive49@lemmygrad.ml · 1 point · edited · 6 months ago

      You can run multimodal models like LLaVA and LLaMA on Ollama as well.

      The AI models are coded and used in a way that makes them basically platform-agnostic, so the specific platform (Ollama, LocalAI, vLLM, llama.cpp, etc.) you run them with ends up being irrelevant.

      Because of that, the only reasons to pick one platform over another are fit for your specific use case (it depends), quality of support (Ollama, by far), or raw performance (vLLM seems to win right now).
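The platform-agnostic point above can be made concrete: all three servers speak the same OpenAI-style wire format, so only the base URL changes. A sketch using each project's common default local port (8080 for LocalAI, 11434 for Ollama's `/v1` compatibility layer, 8000 for vLLM); these ports are assumptions about a default install, so check your own setup:

```python
# Common default local ports (assumptions; verify against your install):
#   LocalAI -> 8080, Ollama (OpenAI-compatible layer) -> 11434, vLLM -> 8000
BACKENDS = {
    "localai": "http://localhost:8080/v1",
    "ollama": "http://localhost:11434/v1",
    "vllm": "http://localhost:8000/v1",
}

def chat_endpoint(backend: str) -> str:
    """Return the chat-completions URL for a named local backend.

    The same OpenAI-style request body can be POSTed to any of these;
    swapping platforms is just swapping this URL.
    """
    return f"{BACKENDS[backend]}/chat/completions"
```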

      • ☆ Yσɠƚԋσʂ ☆@lemmygrad.ml (OP) · 1 point · 6 months ago

        Ah gotcha, I thought Ollama was text-only.

Technology@lemmygrad.ml

A tech news sub for communists

Visibility: Public. This community can be federated to other instances and be posted/commented in by their users.

  • 118 users / day
  • 214 users / week
  • 409 users / month
  • 1.04K users / 6 months
  • 19 local subscribers
  • 1.12K subscribers
  • 997 Posts
  • 3.83K Comments
  • mods: Muad'Dibber@lemmygrad.ml