• thickertoofan@lemm.ee
    23 hours ago

    Ayy, that’s nice. LLMs are truly overkill just for semantic search though; I didn’t know there were other ways to achieve this. But we do need some intelligence too, right? (somewhat)

    • meme_historian@lemmy.dbzer0.comOP
      20 hours ago

      Don’t get me wrong though… throwing an LLM at it would be a lot easier and faster. It’s just a mind-boggling use of resources for a task that could probably be done more efficiently :D

      Setting this up with Apache Solr and a suitable search frontend runs a high risk of becoming an abandoned side project itself^^
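      For the curious, here’s a minimal sketch of the kind of request a search frontend would send to Solr. The core name `repos` and the field names (`name`, `description`, `url`) are hypothetical; adjust them to whatever schema you index.

```python
from urllib.parse import urlencode

# Hypothetical Solr core; the default Solr port is 8983.
SOLR_URL = "http://localhost:8983/solr/repos/select"

params = {
    "q": "self-hosted photo gallery",
    "defType": "edismax",          # edismax parser handles free-text user queries
    "qf": "name^3 description",    # boost matches in the project name
    "fl": "name,description,url",  # fields to return in each hit
    "rows": 10,
}
query_url = f"{SOLR_URL}?{urlencode(params)}"
print(query_url)
```

      A frontend just issues an HTTP GET against that URL and renders the JSON response; no inference involved.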

      • thickertoofan@lemm.ee
        18 hours ago

        Yeah, an LLM seems like the go-to solution, and the best one. And speaking of resources, we can use barely-smart models that can still generate coherent sentences, e.g. 0.5B–3B models offloaded to CPU-only inference.
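        As a rough sketch, running such a model CPU-only with llama.cpp looks something like this; the binary name and the quantized model file are examples, not a specific recommendation:

```shell
# CPU-only inference with llama.cpp; a ~0.5B model at Q4 quantization
# fits in well under 1 GB of RAM.
# -t: CPU threads, -n: max tokens to generate, -p: prompt.
./llama-cli -m qwen2.5-0.5b-instruct-q4_k_m.gguf -t 4 -n 128 \
  -p "Does this project description sound like an actively maintained tool?"
```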

    • 0xD@infosec.pub
      22 hours ago

      Yes, your own intelligence that you integrate into the structure of your database and queries ;)
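      That “intelligence in the database” can be as simple as a full-text index with relevance ranking. A self-contained sketch with SQLite’s FTS5 (ships with most modern SQLite builds; table and data are made up for illustration):

```python
import sqlite3

# In-memory database with an FTS5 virtual table: the "structure" does the work.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
con.executemany(
    "INSERT INTO docs (title, body) VALUES (?, ?)",
    [
        ("awesome-selfhosted", "curated list of self-hosted services"),
        ("dead-repo", "an abandoned side project, last commit years ago"),
        ("solr-frontend", "search frontend talking to Apache Solr"),
    ],
)
# MATCH gives tokenized full-text search, and bm25() ranks by relevance,
# no model inference required.
rows = con.execute(
    "SELECT title FROM docs WHERE docs MATCH ? ORDER BY bm25(docs)",
    ("abandoned OR search",),
).fetchall()
print([r[0] for r in rows])
```

      Swap in synonym lists or a custom tokenizer and the queries get surprisingly “smart” for zero watts of GPU.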