just_another_person@lemmy.world to Linux@lemmy.world · English · 4 months ago

AMD Announces "Instella" Fully Open-Source 3B Language Models

www.phoronix.com

10 comments
  • cross-posted to: linux@lemmy.ml

  • 𝕽𝖚𝖆𝖎𝖉𝖍𝖗𝖎𝖌𝖍@midwest.social · 4 months ago

    I need to catch up on training. I need an LLM that I can train on all my ebooks and digitized music, and that can answer questions like “what’s that book where the girl goes to the thing and does that deed?”

    • swelter_spark@reddthat.com · 3 months ago

      You could probably use RAG (retrieval-augmented generation) for this instead of actually training a model.
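
A minimal sketch of what that RAG approach could look like over an ebook library, assuming the books have already been converted to plain text. The directory name, chunk sizes, and embedding model are placeholder choices, and the retrieved chunks would still be passed to a local LLM to produce the final answer.

# Sketch: embed ebook chunks once, then retrieve the best matches for a question.
# Assumes plain-text copies of the books live in ./ebooks_txt (hypothetical path).
from pathlib import Path
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model

chunks, sources = [], []
for book in Path("ebooks_txt").glob("*.txt"):
    text = book.read_text(errors="ignore")
    # Overlapping 2000-character chunks so passages are less likely to be cut mid-thought.
    for start in range(0, len(text), 1500):
        chunks.append(text[start:start + 2000])
        sources.append(book.name)

embeddings = model.encode(chunks, normalize_embeddings=True)

def retrieve(question, k=5):
    """Return the k chunks most similar to the question."""
    q = model.encode([question], normalize_embeddings=True)[0]
    scores = embeddings @ q  # dot product of unit vectors = cosine similarity
    best = np.argsort(scores)[::-1][:k]
    return [(sources[i], chunks[i]) for i in best]

for source, chunk in retrieve("the girl goes to the thing and does that deed"):
    print(source, chunk[:80])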

  • GaMEChld@lemmy.world · 4 months ago · edited

    Smart people, I beg of thee, explain! What can it do?

    Edit: looks to be another text-based one, not image generation, right?

    • just_another_person@lemmy.world (OP) · 4 months ago

      It’s language only; hence, LM.

      • GaMEChld@lemmy.world · 4 months ago

        To be fair, I didn’t know whether “language” included programming languages, and so whether image-based AI might still count as an LLM. Is there a different designation for the type of AI that does image generation?

        • just_another_person@lemmy.world (OP) · 4 months ago

          Yes: https://www.hachi-x.com/en/single-post/differences-between-llm-vlm-lvm-lmm-mllm-generative-ai-and-foundation-models

  • HappyFrog@lemmy.blahaj.zone · 4 months ago

    I see all these graphs about how much better this LLM is than the others, but do those graphs actually translate to real-world usefulness?

    • oldfart@lemm.ee · 4 months ago

      I have yet to see a 3B model that’s not dumb.

  • brokenlcd@feddit.it · 4 months ago

    The problem is… how do we run it if ROCm is still a mess for most of their GPUs? CPU time?

    • swelter_spark@reddthat.com · 3 months ago

      There are ROCm versions of llama.cpp, ollama, and kobold.cpp that work well, although they’ll have to add support for this model before they can run it.
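
For context, once one of those backends adds support, running the model from Python stays simple. A minimal sketch, assuming llama-cpp-python was installed against a ROCm/HIP-enabled build of llama.cpp and that a GGUF conversion of the weights exists (the file name below is a placeholder):

# Sketch: load a GGUF model with llama-cpp-python and offload layers to an AMD GPU.
# Assumes a ROCm-enabled build; "instella-3b.gguf" is a hypothetical file name.
from llama_cpp import Llama

llm = Llama(
    model_path="instella-3b.gguf",  # placeholder path to a local GGUF file
    n_gpu_layers=-1,                # offload every layer to the GPU
    n_ctx=4096,                     # context window
)

result = llm("Q: What is a 3B-parameter language model good for?\nA:", max_tokens=128)
print(result["choices"][0]["text"])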

Linux@lemmy.world

Welcome to c/linux!

Welcome to our thriving Linux community! Whether you’re a seasoned Linux enthusiast or just starting your journey, we’re excited to have you here. Explore, learn, and collaborate with like-minded individuals who share a passion for open-source software and the endless possibilities it offers. Together, let’s dive into the world of Linux and embrace the power of freedom, customization, and innovation. Enjoy your stay and feel free to join the vibrant discussions that await you!

Rules:

  1. Stay on topic: Posts and discussions should be related to Linux, open source software, and related technologies.

  2. Be respectful: Treat fellow community members with respect and courtesy.

  3. Quality over quantity: Share informative and thought-provoking content.

  4. No spam or self-promotion: Avoid excessive self-promotion or spamming.

  5. No NSFW adult content.

  6. Follow general Lemmy guidelines.
