I work in a subindustry of healthcare whose end users are all healthcare workers. To be clear, I’m not a healthcare worker. But I’m in a monthly all-hands call and they’re currently laying out the number of AI features they have planned for our platform, and it’s monumental. They want AI touching everything.

There’s lots of talk about doctors learning about new innovations via “microcontent”. In other words, there’s a vision of doctors scrolling an AI-filled infinite feed to learn about medical innovations instead of reading full articles. They want to addict doctors to a nerdy Twitter clone so they can show them ads.

  • Infamousblt [any]@hexbear.net · 3 months ago

    I’m in tech and I keep seeing the results of executive dreams rubbing up against the reality of what LLMs can actually accomplish, and it’s amazing. They ask for the world and the LLM can’t even deliver a speck of dirt half the time. Our code does the thing correctly the same way every time, and if it doesn’t, it’s a critical bug that must be fixed immediately; but when an LLM can’t accomplish anywhere near what the established code can, it’s “well, we can find ways to improve it” and “people will understand because it’s just where LLMs are” and all manner of excuses.

    Without some serious breakthrough on the order of actual AGI this shit will not last for anywhere near the number of applications execs think it will. It just can’t. It’s going to crash and burn in a giant fireball. I encourage everyone to develop skills that will let you continue to work when it does.

    • carpoftruth [any, any]@hexbear.net · 3 months ago

      it doesn’t have to be good or actually replace the value that doctors provide, it just has to be good enough to provide plausible deniability of its shittiness when it is applied to the bottom 40, 50, 80, 95% of the populace.

      • Infamousblt [any]@hexbear.net · 3 months ago

        The problem isn’t really consumer-grade stuff, it’s enterprise-grade stuff. Things most people don’t even know about. If that stuff isn’t perfect, and there’s another option, entire companies spend millions and move mountains to shift to the slightly more perfect option. Consumer tech is not what powers the industry. They just get the scraps. When enterprises realize AI can’t and won’t solve their needs, the entire mountain will collapse faster than you can imagine.

  • A friend of mine is healthcare adjacent too, something like institutional research, and she says every confab she goes to is crammed with dumbass suits completely in the tank for AI. I’m in a text group where she just livestreams stupid shit coming from the MBAs holding us all hostage to their fabulist imaginaries.

  • SchillMenaker [he/him]@hexbear.net · 3 months ago

    The whole AI trend might literally be God testing us. Are we worthy of existence?

    So you’re telling me that the managers are excited about the prospect of magic software that would essentially do the things a competent manager would do, things they are currently not doing? I genuinely don’t think I could go on living if that was me.

  • dose@ttrpg.network · 3 months ago

    I am a front-line healthcare worker and people are loving AI. Specifically AI Scribes. Thin edge of the wedge.

    It is the first serious attempt at solving a real problem that has been piling up for years: the volume of documentation required. Even before COVID it was near a crisis. Every day at work there is conversation about how fed up everyone, especially doctors, is with increasing administrative demands. The software we were given to use is barely adequate.

    AI is the first nice thing to have come down from the powers that be. Adoption is at about 30%.

    AI Scribe is speech-to-text, but it doesn’t only transcribe; it interprets the information into a standard medical note-taking format like a consultation note or SOAP. Like magic. Then the doctor only has to proofread, which they are mostly doing, for now.
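
    For anyone outside healthcare wondering what “interprets into SOAP” involves, it’s roughly a two-step pipeline: transcribe the visit, then ask a model to restructure the transcript into labelled Subjective/Objective/Assessment/Plan sections for the clinician to proofread. A minimal sketch of that shape, with hypothetical transcribe_audio() and call_llm() stand-ins rather than any vendor’s actual API:

    ```python
    # Rough sketch of an "AI scribe" pipeline. transcribe_audio() and call_llm()
    # are hypothetical placeholders, not a real product's API.
    from dataclasses import dataclass

    SOAP_PROMPT = """Rewrite the following visit transcript as a SOAP note.
    Return four sections labelled Subjective, Objective, Assessment, Plan.

    Transcript:
    {transcript}
    """

    @dataclass
    class SOAPNote:
        subjective: str
        objective: str
        assessment: str
        plan: str
        needs_review: bool = True  # the clinician still proofreads and signs

    def transcribe_audio(audio_path: str) -> str:
        """Hypothetical speech-to-text step (e.g. a Whisper-style model)."""
        raise NotImplementedError

    def call_llm(prompt: str) -> str:
        """Hypothetical LLM call returning the four labelled sections as text."""
        raise NotImplementedError

    def parse_soap(text: str) -> SOAPNote:
        """Naive parser: split the model's output on the requested section labels."""
        sections: dict[str, list[str]] = {}
        current = None
        for line in text.splitlines():
            header = line.strip().rstrip(":")
            if header in ("Subjective", "Objective", "Assessment", "Plan"):
                current = header.lower()
                sections[current] = []
            elif current is not None:
                sections[current].append(line)
        return SOAPNote(*(
            "\n".join(sections.get(key, [])).strip()
            for key in ("subjective", "objective", "assessment", "plan")
        ))

    def scribe_visit(audio_path: str) -> SOAPNote:
        transcript = transcribe_audio(audio_path)                     # 1. speech to text
        draft = call_llm(SOAP_PROMPT.format(transcript=transcript))   # 2. restructure into SOAP
        return parse_soap(draft)                                      # 3. draft note, pending review
    ```

    The key design point is the last step: the output is only ever a draft until a human proofreads and signs it.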

    Universal opinions regarding AI Scribe: “I’m never going back” “If they take AI away I’m quitting.”

    Lots of the critiques of AI are about how badly it makes memes or music, but this truly is a case of automating tedious work that was dreaded by those who performed it. And of scraping the content of every patient encounter for some proprietary use we have no idea about.

    Published today: https://www.nature.com/articles/s41746-025-01895-6

    EHR vendors are moving faster and spending more than has ever been seen to get AI in there. Doctors have found it a completely pleasant experience so far. Still recovering from COVID, they will take any break they can get. They are a friendly audience for whatever comes next.