Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.

  • AllonzeeLV@lemmy.world · 1 year ago

    Too bad it gets the emotion and not the context.

    I’d love to be fired because “I hate making money for these greedy ass capitalist douchebags” pops up on a screen whenever I come in.

    The idea that employers should even be allowed to ask what their employees are feeling, much less scan them to discern it, is a new low for our modern Orwellian dystopia.

    • thanevim@kbin.social · 1 year ago

      The thing is though, I don’t see how something like this could even work out.

      Like, you hire Employee 1, they get frustrated at something overnight, and you fire them for being upset. Now you have to fill the seat. Employee 2 is brought on. They get told what happened to the person they replaced. They leave, or are fired for having emotions and being human. This repeats ad nauseam.

      • Radioactive Radio@lemm.ee · 1 year ago

        Let’s be real, most of us would get weeded out at the interview when they start spilling all the “we’re like a family” bullshit.

        • randon31415@lemmy.world · 1 year ago

          What type of family? Found family? The kind of family that requires restraining orders for abuse? The kind that only sees each other on Christmas?

      • AllonzeeLV@lemmy.world · 1 year ago

        I’m guessing it’s going to be implemented to flag “persistent negative attitudes,” then used as justification to fire people in places that aren’t fire-at-will.

        It could also be used as a bullshit reason to deny raises and promotions if your “gratitude” or “motivation” indexes weren’t high enough.

        • FringeTheory999@lemmy.world · 1 year ago

          so, basically a tool to suss out which employees have undisclosed mental health issues that the employer can’t legally ask about. cool. cool.

  • kool_newt@lemm.ee · 1 year ago

    It’s about time we started holding the engineers building these technologies directly responsible.

    I’m not talking about scientists expanding knowledge, I’m talking specifically about the engineers building these technologies.

    Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?

    • const_void@lemmy.ml · 1 year ago

      Seriously. Why are people choosing to work for these companies? There are other ways to make a buck. Have some fucking morals.

    • PeterPoopshit@lemmy.world · 1 year ago

      At a certain point, it’s not just the companies doing this that are to blame but the people working for them as well. Who tf can support this kind of thing? People need to have some self fucking respect.

      For example, we could probably have a cure for cancer by now if half the effort that went into making unbeatable THC drug tests had gone into that instead. It’s clear where society’s priorities are. Improving lives does not generate profit.

    • SokathHisEyesOpen@lemmy.ml · 1 year ago

      There are uses for it. They can track the average mood of an entire room over a period of time. If you use that somewhere like a restaurant, or a banquet venue, then that information can be useful for tweaking the policies, environment, prices, etc. Of course an actual human could do this too, just by being there. I think it’ll get the most use at places like casinos where they’re always using psychological tricks to make people want to gamble. Ironically I don’t think that “happy” is the mood they’ll be aiming for.
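
      The aggregation part is the easy piece once some vision model hands you per-face emotion scores each frame. A rough sketch of the “average mood of a room over time” idea (all names and the score format here are my own hypotheticals, not any particular vendor’s API):

      ```python
      from collections import deque
      import time

      # Hypothetical sketch: each frame, some face-analysis model yields a list
      # of per-face emotion scores, e.g. {"happy": 0.8, "neutral": 0.1, "angry": 0.1}.
      # We fold those into a rolling "room mood" average over a time window.

      WINDOW_SECONDS = 15 * 60   # average over the last 15 minutes
      samples = deque()          # (timestamp, mean happiness of that frame)

      def record_frame(faces):
          """faces: list of per-face emotion score dicts for one camera frame."""
          if not faces:
              return
          now = time.time()
          samples.append((now, sum(f["happy"] for f in faces) / len(faces)))
          # drop samples that have aged out of the window
          while samples and now - samples[0][0] > WINDOW_SECONDS:
              samples.popleft()

      def room_mood():
          """Mean happiness across all frames in the window, or None if no data."""
          if not samples:
              return None
          return sum(score for _, score in samples) / len(samples)
      ```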

      • kool_newt@lemm.ee · 1 year ago

        Ya, I guess I can see some uses for it, but nothing that makes the risks of its existence worth it.

        It seems like every tool/tech will be used by good people to do good things and by bad people to do bad things. Some things, like a spoon, are handy for getting good things done but not very useful to bad people for doing bad things. Other tools, like mood recognition, might be quite handy for bad people looking to control others, but only moderately useful to good people.

        I think we should be wary of letting tools in that second group exist at all. Just because something can be done doesn’t mean it should be done, or that it can be called “progress.”

        • SokathHisEyesOpen@lemmy.ml · 1 year ago

          It has already existed for a decade or so. I’m surprised it hasn’t made headlines before. I saw a working demo of it at the Microsoft Visitor Center about 8 years ago. In addition to estimating your mood, it also assigns you a persistent ID and estimates your height, weight, eye color, hair color, ethnicity, and age. It is scarily accurate at all of those things. That ID can be shared across all linked systems at any number of locations. I completely agree with you that there are a lot of concerning, if not downright terrifying, implications of this system. It’s a privacy nightmare.
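
          To make the “persistent ID” part concrete, here’s a rough sketch of the kind of profile record such a system presumably keeps. The field names are my guesses, not Microsoft’s actual schema:

          ```python
          from dataclasses import dataclass, field

          # Hypothetical profile record for a tracked visitor; every linked
          # location reads and appends to the same record, which is what
          # makes the ID "persistent" across sites.

          @dataclass
          class VisitorProfile:
              visitor_id: str      # persistent ID shared across linked systems
              height_cm: float
              weight_kg: float
              eye_color: str
              hair_color: str
              ethnicity: str
              age_estimate: int
              mood_history: list = field(default_factory=list)  # (timestamp, site, mood)

              def record_sighting(self, timestamp: float, site: str, mood: str) -> None:
                  self.mood_history.append((timestamp, site, mood))
          ```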

    • Neato@kbin.social · 1 year ago

      You can to an extent, but that’s a losing venture. If public opinion goes against this tech hard enough, it’ll keep some people from working in those industries. BUT if those products are profitable enough, companies will simply pay more and that’ll be moot.

      Attacking the people who are earning a living isn’t the answer. Most people take the job with the best combo of pay and work/life balance they can find in their area, or wherever they can afford to move. Not that many have the luxury to pick and choose based on their morality. And if compensation is high enough, holding out on moral grounds is even less likely.

      It’s far easier to try to prevent this tech from being used at all. I know political action is hard as hell but it’s a lot easier than trying to ostracize an entire industry’s worth of workers. It may feel easier to denigrate faceless individuals but that won’t accomplish anything. Plenty of people work for weapons manufacturers and such.

      • kool_newt@lemm.ee · 1 year ago

        Plenty of people work for weapons manufacturers and such.

        And those are bad people. If you work to build technology used to maintain power over others when you have the option not to, what else can that be called? These people are not desperate for a job.

        I’m an engineer. I quit Intel (after the startup I worked for was acquired) because Intel powers much of the MI complex. I quit Illumina when it became clear I was directly assisting with state level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.

        Take McDonald’s, for example. There’s a difference between someone who needs a job working in a restaurant and an engineer at McDonald’s figuring out how to slaughter animals more efficiently, paid to care only about their employer’s profit. That engineer could go work on baking cookies more efficiently instead.

        • Neato@kbin.social · 1 year ago

          These people are not desperate for a job.

          You’re painting with a firehose. Some people are.

          I’m an engineer, I quit Intel (after the startup I worked for was acquired) because Intel powers much of the MI complex. I quit Illumina when it became clear I was directly assisting with state level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.

          You are what we call privileged. Maybe you should… check it?

          • PiecePractical@midwest.social · 1 year ago

            Yeah, I was a field service tech at a machine tool distributor for 15 years. One day, about 7 years ago, I realized that more of our customers than not were involved in some kind of arms manufacturing: everything from components for military armaments to parts for AR-15s. It didn’t start that way, but the business drifted into that market over time.

            I decided to move on, and it took me all of 5 years to find a position that: a) I was qualified for, b) paid enough that I wouldn’t lose my house, and c) was relatively safe from drifting into the same customer base as the last company.

            I don’t even have kids and this whole process was absolutely terrifying. I can easily see how someone with a family to support or less stability in their life wouldn’t feel like leaving was a possibility.

    • Buelldozer@lemmy.today · 1 year ago

      Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?

      If you ever want a real General AI, then it will need the ability to recognize the mood of the person it’s interacting with. ESPECIALLY if you want to use it for things like Mental Health Counseling.

      • kool_newt@lemm.ee · 1 year ago

        Mental Health Counseling.

        Thanks, that’s the kind of valid answer I was looking for. Though we don’t have actual AI, and probably won’t have actual AGI for at least a good decade (what we currently have is machine learning and complex decision trees, which appear kinda intelligent to us in 2023).

    • SokathHisEyesOpen@lemmy.ml · 1 year ago

      What’s crazy is that this was already fully functional and in use at least 8 years ago. Idk how this has stayed out of the headlines until now. Microsoft had a working demo of this in their visitor center in 2015 and was already using it in multiple places. As soon as you enter the room, it assigns you a persistent ID and estimates your height, weight, eye color, hair color, and age. Then it tracks your mood and the overall mood of the room continuously. The ID can be persistent across any number of linked locations. They don’t ask for anyone’s permission before using it.

    • Lyrl@lemm.ee · 1 year ago

      Sounds like you are fighting on behalf of the whole world. I hope you get some positive time with yourself or a smaller circle, and a break from the dumpster fires of modern civilization.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 1 year ago

    If they could do that, they would probably see how God damn miserable most people are. If they used that to change things and make people less miserable, I wouldn’t see it as dangerous. But more than likely it will be more like “your sadness doesn’t vibe with us. You’re fired.”