That Google AI is so shit. It just paraphrases the top result.
Isn’t paraphrasing/summarizing the top result a pretty good use case for LLMs? If I search “what temperature should I bake cupcakes at?” I really just want a simple answer, not dozens of links to life story style recipe blogs.
DDG didn’t provide a summary, but Google did (and it was very long). I assumed the answer was 350F, but the summary suggested 325-375. Lower for flatter cupcakes, higher for more domed. Interesting.
This type of summary wouldn’t be nearly as helpful for a technical programming question, but I doubt that describes the bulk of search queries.
I wasn’t arguing about its accuracy, I was attacking its need to exist. Fuck AI, I’m tired of hearing that acronym. Can’t wait for this shit to go away like every tech fad in the last decade.