dustbunnies [she/her, comrade/them]

42f late-dx AuDHD cis bi married parent of trans kid

used to be a sea bass, now I’m a trail of dust bunnies

  • 2 Posts
  • 150 Comments
Joined 12 days ago
Cake day: October 18th, 2024

  • okay, so let’s run through the possible scenarios here, so you can figure out how you feel about each one and how you’d deal with it

    everybody, help; I won’t think of everything

    • he’s exactly what he claims to be
    • he’s a serial killer and you’re in danger
    • he is mostly what he seems but also very lonely and maybe a little weird and has a hard time making friends except in this way
    • he is looking to trap a slave
    • he is running a cult
    • he is part of a cult

    as I’m typing them, the third one seems the most likely to me – and I bet this strategy tends to backfire a little, because it can make for an awkward beginning to a friendship when there’s so much inequity from the jump. it can be hard to recover from that, but finding a way to make the aid mutual really helps, so if that is the situation, just seeing the act of friendship as a gift you’re giving in return might be helpful.

    if I had more money, I would be trying to take people in constantly, which is probably why I’ve been more trusting of these kinds of situations than I ought to be and have gotten into trouble. but I am that kind of person, and I know there are others, and some of them probably have money. so maybe it’s legit.

  • as much as the speech-to-text gets wrong on my phone, I can only imagine what it does with doctors’ notes.

    one of my million previous jobs was in medical transcription, and it is so easy to misunderstand things even when you have a good grasp of specialty-specific terminology and basic anatomy.

    they enunciate the shit they’re recording about your case about as well as they write legibly. you really have to get a feel for a doctor’s speaking style and common phrases to avoid turning in a bunch of errors.

    But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

    Edit: oh yeah, ✨ innovation ✨

    While most developers assume that transcription tools misspell words or make other errors, engineers and researchers said they had never seen another AI-powered transcription tool hallucinate as much as Whisper.

    Edit 2: it gets better and better

    In an example they uncovered, a speaker said, “He, the boy, was going to, I’m not sure exactly, take the umbrella.”

    But the transcription software added: “He took a big piece of a cross, a teeny, small piece … I’m sure he didn’t have a terror knife so he killed a number of people.”

    A speaker in another recording described “two other girls and one lady.” Whisper invented extra commentary on race, adding “two other girls and one lady, um, which were Black.”

    In a third transcription, Whisper invented a non-existent medication called “hyperactivated antibiotics.”

    Edit 3: wonder if the Organ Procurement Organizations are going to try to use this to deflect blame for the extremely fucked up shit that’s been happening