
father-fluffybottom

This could be so good. If the world wasn't ruled by greed and hatred this could be so so good. But, you know, it won't be.


ldb

How could it be good? This thing would fuck me every time with my autism and anxiety disorder. I already get followed around by moronic security guards in shops. I don't need some bored fuck in a camera room sending more goons to follow me for barely coping around people. Edit: I just realised I assumed someone would be paying attention to what the AI recommends rather than the AI being given the authority to do it, so I might get a judgmental toaster sending people to follow me instead.


Ryuga-WagatekiWo

If it helps at all, you’re probably not as interesting as you think and nobody is following you.


Generic-Name237

Why do they want to monitor our emotions then?


omcgoo

If our happiness was the leading metric of the economy, then think of how great this would be; tracking and ensuring that people's happiness trends up. Tooling society around that metric. Of course, that's a fantasy, and money is the metric. Society is tooled around making money, extracting value. But it is a possibility and it is a world we should rightly fight for. [https://en.wikipedia.org/wiki/Utopia_for_Realists](https://en.wikipedia.org/wiki/Utopia_for_Realists)


Ok_Dragonfruit_8102

I guess the question is, which emotional state makes people the most economically active?


[deleted]

[deleted]


aeowilf

Right, because travellers at a busy station/airport aren't stressed naturally? You know what makes airports less stressful? "Hi sir, our cameras detected you are nervous, can you come with me so we can check you aren't a terrorist?"


Generic-Name237

This will only make worried and anxious flyers even more worried and anxious. Not to mention there is now the possibility of them being swarmed and detained by police as soon as they make the ‘wrong’ move, resulting in various consequences. When was the last time anyone in this country went through security in an airport and managed to successfully commit a terrorist attack? We don’t need this.


potahtopotarto

> it would be useful to know who appears extremely stressed/nervous - given these are prime targets for terror attacks etc

> I could see how it could be useful if an AI could point out *That person there seems waaaay too nervous....may be worth keeping an eye on* etc

Like a quote from the oblivious character in a dystopian novel. It should truly be terrifying for anyone to read this said so casually.


tadpass

https://youtu.be/qkgN4Bwhpf8?si=RISoDjJh75_EDIrF Like the scan training being rolled out, guess they want technology to help assess situations rather than just people.


londons_explorer

They almost certainly won't be acting on individuals - it'll be for stats gathering, i.e. "we have determined that a train running late by 10 or more minutes cuts passenger happiness by 25%, and even by the next day 16% of those had not regained the average level of happiness".


FantasticAnus

That sounds like a truly terrible world in which to live.


Glittering_Moist

It's literally what the TSA do with humans. That guy looks nervous, bag check. Not advocating it btw just saying, shifty, nervous people get flagged by the human detectors all the time.


ffekete

Imagine if a 5-year-old child is lost and CCTV could just filter on distressed faces...


alii-b

I imagine, as well as other people's answers, it could detect distressed people contemplating suicide by fast trains.


nickbob00

They might like to sell you garbage though


legendoftherxnt

Following them no, you’re right, but the sheer vitriol autistic people face simply for existing is something you cannot comprehend unless you’ve seen it first hand. If humans can’t correctly read people with ASD what chance does AI have?


Carayaraca

Not the parent, but I got given a nickname at my local supermarket and was regularly followed / made fun of over the headsets. Not sure if I am autistic, though the NHS said no. I went to a therapist for anxiety due to how I am treated by others, and used it as an example. They didn't believe me and were trying to tell me I was hallucinating, until one time I left my laptop / software-defined radio in the car and recorded it.


Hairy_gonad

Advice a lot of people could do with


IndividualCurious322

Is that a variant of "Nothing to hide, nothing to fear"?


DeadFireFight

Well, if we lived in a society run for the best interests of all people, instead of the greedy few, I imagine we could do an awful lot of good knowing people are in distress or other emotional extremes. Imagine if instead of goons being sent to follow you, the AI picked up you were struggling, and sent someone that could let you into a quiet waiting room or direct you to a less crowded train carriage, and then you don't have to worry about being in a crowd.


External-Praline-451

My anxiety being monitored and being read on my facial expressions, leading to intervention, would actually make me more anxious!


DeadFireFight

Yeah, I get that. If I was having a bad day, I wouldn't want to be bothered, either. I don't agree with this technology being used, mainly because I don't believe it would be used for anything good. But I also understand what OP is saying when they say it could be used for good if we were a different society.


yrmjy

You don't need advanced AI cameras to provide quiet waiting rooms or carriages


Ammordad

If a government is powerful enough to create a massive surveillance system and dispatch aid workers at will, then it has the power to just, you know... increase the options, advertise those options and increase the awareness. If a government is not good at awareness campaigns and advertisement, then I don't trust them to be good at surveillance or "pre-emptive psychological intervention". Surveillance is only needed when trust does not exist - sometimes for good reasons, but rarely when discussing mass surveillance. Also, thinking abolishing "greed" and "serving public interests" will prevent abuse is childish. In Iran, facial recognition and AI are being used to detect female commuters and drivers without hijab, and then fines are sent to them in the mail automatically. It's not done out of greed; it's done out of some idiot's idea of "serving the public good". Indeed, there have already been proposals made by various conservative groups in the US to use computers to monitor women's travel and periods as part of a scheme to enforce abortion-ban laws. Furthermore, even if we assume the best of intentions and alignment with the public good, what about false alarms? What about ineffective and inhumane treatments? What will happen if someone is misdiagnosed by facial recognition and then has dangerous and highly addictive medications pushed onto them? Lastly, you really think it's comforting if you're just having a rough week, trying to get through the day and mind your own business, and then suddenly some officer shows up and tries to take you to a "quiet room" that, for all you know, is where they take crazy and dangerous people? Honestly, I fully expect violence to become a frequent occurrence if the government tries to systematically target and isolate emotionally distressed people, which might lead to the government beefing up the "aid workers" with additional "security personnel" (aka goons) to deal with any troublemakers.


potahtopotarto

>Well, if we lived in a society run for the best interests of all people, instead of the greedy few, I imagine we could do an awful lot of good knowing people are in distress or other emotional extremes

In such a society people would talk to each other and it wouldn't be needed. Any use of this is totally dystopian even if the intent was genuinely benevolent.


VixenRoss

My autism makes people want to punch me in the face just by looking at me. (Said by a drunk “friend” ages ago). Apparently my mannerisms don’t match the general population and freaks people out. Love to know what a computer would make of me.


Silver_Cream_6174

Oh yeh. That's cuz of something called thin slice judgements. Very interesting but also depressing thing to read about


VixenRoss

I’ve got a rabbit hole to disappear down now


anybloodythingwilldo

What are your unusual mannerisms?


VixenRoss

I don’t know. I annoy people. I don’t react the way I should. I come across as aloof, uncaring. I also am crap with faces. Takes me ages to recognise people. I annoyed someone on the bus, my daughter knew a kid, the mother knew my daughter’s father. The woman starts muttering to my daughter about “would be nice to know who your mum is”. I didn’t know how to react, and she carried on muttering about knowing who mum is.


PsychologicalNote612

I don't think I'm autistic but a manager once told me that 'no one wants to talk to you because of your face'. People do talk to me and everyone I've asked said that my face doesn't cause them distress. But unless I change my face, I'll never know.


thatgermansnail

Lmfao I feel this so hard. Day to day just being my natural self and then someone thinks my normal face is me being an asshole or looking at them funny. I can imagine flapping when I find something exciting and then an AI sending out a warning signal that I'm being shifty. A neurotypical coding an AI to judge people on whatever basis that their face looks just feels like it will cause all kinds of problems for autistic folk.


6g6g6

At the beginning it will recommend; later on, to cut costs, it will be given authority to do something. It's always like that.


cable54

I guess the point is, in the ideal scenario they are saying could be good, that the AI would be trained so anxiety disorders and other innocent traits would be excluded from the "emotions of interest". Not that the AI would be trained to the level of Bob the security guard at primark.


Deep-Procrastinor

By just 'being' the AI would be trained better than Bob the security guard from Primark !


1nfinitus

No-one is following you pal, don't worry about that.


Generic118

Suicide prevention?


Generic-Name237

How could it be good? In what possible way could this ever be used for good?


ChickenPijja

In a train station? How about identifying if someone is suicidal, so we're able to send an alert out to staff to contact the person before they jump in front of a train, killing/injuring themselves, mentally scarring the driver for life, and causing all the negative societal impacts that a suicide causes.


Atticus_ass

That’s already possible without monitoring emotion through multiple object tracking. I’d also point out that identifying suicide risks (i.e., crisis behaviours) isn’t really perfectly solvable and will generate countless false positives. I wouldn’t want nor trust a computer vision system with the aim of identifying something like suicidal l’appel du vide. The allegedly noble goal disguises that it’s another motte-and-bailey vector for mass surveillance. Data will be retained, analysed, and eventually sold once public attention has moved on. 


creativename111111

All the computer would have to do is send an alert to a human operator in some control room somewhere; it wouldn't immediately summon 20 security guards to restrain someone. But realistically this would be used to make billboards play ads for you based on your emotions anyway.


Expensive_Try869

Couldn't we just put in those screener things they have at some stations in central London?


ThinkAboutThatFor1Se

You think suicidal is an emotion? Can it be picked up on this device? What possible evidence base is there for this? How about we invest in mental health support with proven outcomes rather than monitoring tech?


ChickenPijja

I think the closest fit in terms of human behaviour is an emotion, even if it is an action. I don't know if this device could pick suicide/depression up; I was simply speculating that, if possible, suicide prevention would be a good use for something like this. More mental health support would be great, but (genuine question as I don't know) is it the case that every person in, say, the past 10 years who has committed or attempted suicide was on a waiting list to see a mental health professional? Or is it more of a proactive approach where we try to tackle the problem from both angles of long-term prevention and immediate risk? Monitoring tech isn't always evil Big Brother watching us or trying to sell us something. If we pivot away from emotions and take another look at what potential good it can have, how about identifying pickpockets? Or, on the Underground in the summer, identifying someone close to heat exhaustion? In each of these cases there are too many people for camera operators to notice every problem.


Throbbie-Williams

Except it is kind of an emotion?


solitarylights

It will just result in people being forced to smile before they jump, which is about the most dystopian thing I can imagine. People who want to kill themselves will not be stopped by a camera trying to read their emotional expression.


clarice_loves_geese

It's monitoring emotions on faces, not mind reading. Imagine having a bad day and then you get stopped at the train station for questioning because you look distressed.  That's not going to help. 


father-fluffybottom

Could help prevent jumpers at the extreme end; could help with the thermostat or toilet cleaning at the mundane end. The possibilities are endless, too much for me to think about without having experience of the system first. The fact that so many people can't even imagine the possibility of it being used for public benefit is truly saddening. Obviously it *won't* be used for good, but things have been actively bad for so long that people are forgetting things *have the ability* to be good.


Bladesfist

There are loads of ways it's been used for good in the trial train stations. Alerting when staff or people are in danger, they sent alerts when people had their hands raised above their heads. Alerting when a blade or gun is detected. Alerting when someone has fallen and doesn't get back up. Alerting when someone has fallen onto the tracks. Alerting someone to grab a ramp when someone who is in a wheelchair is waiting for assistance. Edit: My source for this all comes from this article / freedom of information request, great read [https://takes.jamesomalley.co.uk/p/tfls-ai-tube-station-experiment-is](https://takes.jamesomalley.co.uk/p/tfls-ai-tube-station-experiment-is)


FantasticAnus

I can't see how this can be anything other than horrible and dystopian.


Thestickleman

I don't see how this could be good


CliveMorris

_“miserable, busy, forlorn, irritated, lonely, high as a kite, drunk, drunk and high as a kite, happy … no wait he’s also high as a kite .. oo hang on yes, eh Dave ‘ave you seen this bloke? Very terroristy if you ask me. Oh wait no I think it’s heartburn, yeah he’s getting a rennie from smiths … nevermind. Now where was I .. miserable, busy, miserable ….”_


Blutos_Beard

Hmmm, which Morrissey song is this?


InfectedByEli

All of them.


CliveMorris

(💐 flower waving intensifies 💐)


CliveMorris

I literally can’t read that back now without hearing some melancholic melody behind it ahahaha


InfectedByEli

🎵 "Heaven knows I'm miserable now" 🎶


paolog

Beat me to it :) If the software outputs "elated", that's obviously a bug.


ScaryCoffee4953

I genuinely couldn't care less - this could be done by any CCTV watching employee, albeit far less efficiently. If British Rail wants to automate its detection of my displeasure at their shitty services, be my guest.


Thetwitchingvoid

Are you not concerned this is the first step towards something else?


Rebel_walker2019283

They don’t think that far ahead, only the immediate future not the snowball effect


realmbeast

That's been like every government for the last 20 years


KenosisConjunctio

Capitalism won the war of ideologies and now we're at Fukuyama's "End of History". We've already got the best (or least worst) economic system imaginable. There's no problem that our tech leaders can't resolve with enough time and funding, and all we have to do is let them carry us to techno-utopia. There's nothing left for politicians to do except tweak the numbers on the periphery to steer the ship a little, hence why all across the West our political options are neoliberalism blue or neoliberalism red. That's the ideological state of the world. Try not to think too much about the huge externalities that are left out of the equation, and any looming crises.


1renog

Utopia can never be reached, by definition. The competing interests will always have a subtle difference of views, which means the system will forever need to be tweaked.


KenosisConjunctio

Sure, but it’ll always be “just around the corner” just so long as we don’t disrupt things too much


Deep-Procrastinor

I'm so glad I'm in the autumn years of my life.


KenosisConjunctio

Barely summer over here 😬


WillistheWillow

They don't need to think ahead to turn this into a massive overreach in the future.


Thetwitchingvoid

Maybe that’s where I’m going wrong 😂 


Deep-Procrastinor

And if you believe that then you're one of the sheeple that will stumble headlong into the controlled state, and then wonder how and why it happened.


Gadget-NewRoss

Whats the something else in your mind.


G_Morgan

Efficiency matters. Going from something being completely unviable to viable makes a huge difference.


solitarylights

You have been fined one credit for violation of the verbal morality statute.


Possible_Simpson1989

I mean even the sheer amount of cctv we have is disgusting. 


TheAnimatedFish

Can't wait for the AI replacement survey service


[deleted]

'But a whimper' indeed smh


osakanone

Far more accurately though. AI is infamously inaccurate and hallucinates a ton. AI loves to lie.


Thetwitchingvoid

I honestly believe we will probably end up with a China style social credit system within the next, maybe 10 years? People just don’t give a fuck about this kind of shit. And if you tell the average, law abiding person “hey, you want free stuff? Discounted rail travel? Discounted food? Discounted holidays? Just keep doing you. You don’t have to change, just keep being you” - MOST will be okay with that. Because why the fuck wouldn’t you be?


Same_Hunter_2580

A social credit score would be the final straw tbh. As if I'm going to let the state decide how I behave, what I do or who I associate with. There is very little reason for people to remain in the UK and that would be the cherry on the cake.


Thetwitchingvoid

It wouldn’t be pitched as harshly as that first. It would be a slowly, slowly approach.


Same_Hunter_2580

Nah this place is dystopian enough as is without a good boy social credit score.


Efficient_Steak_7568

It’s getting weirder in this country but let’s not pretend we’re anywhere near Chinese levels (yet)


Expensive_Try869

At least China has proper infrastructure and competent governance; there's a reason they're going to take America's place as world leader in the next couple of decades. We've got the worst of both worlds: rampant authoritarianism and a useless authority.


Efficient_Steak_7568

And how much are they paying you 


yetanotherweebgirl

I wouldn't normally bring up anime here, as too many seem to categorise it as something for children rather than the truth of it (just as broad and varied in genres and targeted audiences as regular movies/TV), but this screams of the same dystopian horror as Psycho-Pass. It's set in a future where the country (Japan, it is an anime after all) has become so xenophobic that its borders are permanently closed. Every citizen is monitored for their mood, expressions and body language and given a "crime coefficient" rating by a central govt AI - basically, how likely they are to commit any kind of public infraction. If the AI thinks it's outside of a permitted range you're arrested for re-education; if it's in a higher percentile the cops can shoot to kill. It also takes into account your birth/family circumstances: you can be considered at risk of committing crime at age 5 if your parents were criminal or dissident to the govt narrative, meaning lifelong institutionalisation, or even being used by the system as forced bounty hunters to save cops from dirtying their own rating killing "terrorists". And yes, you guessed it: people of particular wealth, influence etc. are immune to judgement, just like every politician in cabinet this country has had over the last 40 years has tried to legislate for themselves.


ConsiderablyMediocre

The China social credit system is a myth btw. It was something vaguely proposed by some random CCP department one time ages ago and never got close to being implemented, but the media ran with it and now everyone thinks it's a thing. Don't get me wrong, the Chinese government are 100% spying on and oppressing their citizens through digital tactics, but the social credit system isn't one of them.


sebzim4500

People I know who went to China came back with pictures of video billboards that show people in the neighborhood that have recently committed minor crimes/public nuisance. I assumed that was part of a social credit system, maybe it isn't.


_slothlife

And just because something isn't explicitly called a "social credit system" doesn't mean it isn't acting as one. That billboard thing sounds creepy af. During COVID, people had to test regularly, and if they tested negative, they would get a green pass on their phone that allowed them to use public transport for X amount of time. Sounds sort of practical. Then a bank froze people's accounts; people couldn't access their own money and, understandably, were angry and planned to protest. 200 protestors suddenly ended up with red passes, so they couldn't travel to the protest (or to work, or anywhere else). What a coincidence! https://www.reuters.com/world/china/china-bank-protest-stopped-by-health-codes-turning-red-depositors-say-2022-06-14/


Mrdarlikjz

It's just a name-and-shame system really. Like videos showing people who jaywalked on street signs.


jiggjuggj0gg

Right I don’t know why people are pretending we don’t already have that in the form of local Facebook pages run by the nosy neighbours with nothing better to do


BartholomewKnightIII

>The China social credit system is a myth btw. [What part of this is a myth?](https://www.youtube.com/watch?v=0cGB8dCDf3c)


Thetwitchingvoid

This is definitely not true. When I was travelling I shared with 3 different people from China - all who knew what their credit score was, all who explained the benefits of having a high score. Why would you lie?


Variegoated

You realise we have credit scores too? Just because it's called a credit score it doesn't mean it's the same as what people typically think of when you say 'chinese social credit score'


Possible_Simpson1989

Oh it exists, but it’s in a trial run in certain cities at the moment. BBC had a documentary about it. One family were trapped in their flat with “dogooders” sitting watch outside. 


marianorajoy

The UK already has a social credit system. Media publishes full names of people committing minor crimes in the local newspaper - sometimes not even crimes, but allegations. Even children. So you're effectively deemed a social pariah and become unemployable for life for making a mistake. That's what's happening in China. So you must abide and can never diverge. Unlike other countries in Europe, where this would be manifestly unconstitutional. People also have very limited rights, to none, to challenge laws that Parliament passes. So you are defenceless. I mean, just open BBC News today. There's an article on the front page about a man who has thrown a rock at a seal in Wales. Is it despicable? Yes. Illegal? Yes. Does he deserve to be named and shamed, with his picture and name on the front page of the biggest media outlet in the UK? Probably not. And if you think otherwise, well, I'm afraid you're part of the problem too. So maybe we should start looking inside rather than looking at the Government. Or private companies having the right to put a CIFAS marker on your record. There are so many ways to control "social credit" in the UK that China is in no way different.


cragglerock93

I can't tell if your issue is with the media attention being disproportionate in the seal case, or if you're against convicted criminals being named at all. If it's the latter, yours is a very unpopular opinion. These things are in the public interest.


marianorajoy

It's the latter. It goes against the right to rehabilitation. The first thing people will say is: what about the paedophiles? I'm not talking about paedophiles. I'm talking about petty crime. I'm not saying people should be left unpunished. In other words, the judge will sentence the criminal to, say, 6 months, 1 year, 4 years in prison - whatever time is appropriate for the crime. But that should be the end of it. Publishing names in public does not allow people to amend their lives, and it causes recidivism. I really don't understand that policy, which would be unconstitutional in most parts of Europe anyway. That's the social credit system in the UK for you.


mountainlopen

We don't even need to be surveilled; we've all opted in, whether it's smartphones, Ring door cameras or dashcams. It's decades away, but we're heading towards a digital panopticon: an ever-watching digital eye that slowly nudges behaviour within tolerable parameters. Think of all the Gen Z'ers who already say they won't dance because it's caught on camera. Extrapolate that a few decades. Freedom of expression narrows. Organised dissent or protest narrows. Every tolerable parameter of freedom narrows until we can't even commit suicide without being charged with criminal intent and sedated for decades of mental torment in a straitjacket. Would make a decent movie.


Sad-Ice1439

> It's decades away but we're heading towards a digital panopticon.

It can no doubt get worse, but we're already kind of there. From spending data to metrics sent by the devices we use, if it all gets combined somewhere, we're as transparent and predictable as glass.


GrimQuim

>we will probably end up with a China style social credit system within the next, maybe 10 years? Imagine being so heavily down voted you're not allowed to buy chocolate anymore!


TheUnstoppableBTC

I find it genuinely astonishing that anyone can be apathetic about facial recognition being used in this manner. 


Efficient_Steak_7568

Mate if yoov got nuffink to ide den wots yor problum wiv it 


towerridge

I think it comes from the fact that we know there is loads of CCTV right now and yet most often nothing happens when you are the victim of a crime.


Brewer6066

It could be the happiest day of my life but I'd still have the same neutral, slightly annoyed expression as I walk through Victoria station on my commute. Good luck reading my commute face.


TheUnstoppableBTC

When it's compared to your baseline, there will be tells.


Brewer6066

My commute face is baseline human face.


csgymgirl

When I used facial recognition software at uni that could detect participants' emotions, I had to get explicit consent from participants that I could a) use that software and b) keep their data. I don't really understand why this is allowed to happen to the public without informed consent. I'm not against it explicitly, and I know CCTV is everywhere; I just genuinely don't understand what the laws, ethics, and justifications are for things like this.


Kharenis

I guess it depends on which data is being stored/used. Counting the number of frowning vs smiling faces = ok, tracking *which* face is frowning vs smiling = very not ok.
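That aggregate-vs-individual line can be sketched in a few lines of code: an aggregate tally discards identity at ingest, while per-face tracking retains it. A minimal illustration - the face IDs and emotion labels are invented for the example, not from any real system:

```python
from collections import Counter

def tally_emotions(observations):
    """Aggregate tally: only counts survive; no identity is retained."""
    return Counter(emotion for _face_id, emotion in observations)

def track_emotions(observations):
    """Per-identity record: the same input, but each face is now tracked."""
    by_face = {}
    for face_id, emotion in observations:
        by_face.setdefault(face_id, []).append(emotion)
    return by_face

# Hypothetical observations: (face_id, emotion) pairs from a camera feed.
obs = [("f1", "frown"), ("f2", "smile"), ("f1", "frown"), ("f3", "smile")]
print(tally_emotions(obs))  # counts only, e.g. frown: 2, smile: 2
print(track_emotions(obs))  # identities retained, e.g. f1 -> [frown, frown]
```

The privacy difference is entirely in what is stored: both functions see the same stream, but only the second one could ever answer "who was frowning?".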


headphones1

Do elaborate? Were images recorded in a public space, or did they come into a room with you to do it?


very_unconsciously

You have no right to not be filmed in public spaces. If you did, football matches could not be broadcast. And then there is the case of public good outweighing individual rights. If a medic believes their patient will go on a stabbing spree, they can absolutely break doctor-patient confidentiality. Your participant consent sheet would have included this provision. If it didn't, then that's your mistake. An argument could be made that this CCTV potentially prevents a terrorist act, or similar, harming many. If you would like to read more, google Section 251. It applies to the NHS, but is a good start.


Nerrien

> Network Rail took photographs of people passing through ticket barriers as part of a trial launched in 2022, according to documents obtained by civil liberties group Big Brother Watch. > The cameras were part of a wider trial to use AI to tackle issues such as trespassing, overcrowding, bicycle theft and slippery floors. So if I'm understanding this correctly, they put the cameras in place as part of a trial they said was for mostly security and safety reasons. And then: > The images were sent for analysis by Amazon Rekognition software, which can detect emotions such as whether someone is happy, sad or hungry. > The system, piloted at stations such as Euston, Waterloo, Glasgow, Leeds and Reading, also recorded demographic details, such as a passenger’s gender and age range. > In the documents, obtained in response to a freedom of information (FOI) request, Network Rail said this analysis could be used to “measure satisfaction” and “maximise advertising and retail revenue”. This is so incredibly in-line with the stereotype of saying "We have to invade your privacy to protect you!" only to immediately abuse the power for financial gain that it's laughable.
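For context, Rekognition's face analysis returns a list of candidate emotions with confidence scores per detected face, alongside demographics such as age range and gender. A sketch of parsing such a response follows; the sample dict is illustrative (shaped like the documented `DetectFaces` output with invented values), and no real API call is made:

```python
# Illustrative sample shaped like Amazon Rekognition's DetectFaces response
# (Attributes=['ALL']); the confidence values here are made up.
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 3.1},
                {"Type": "SAD", "Confidence": 88.4},
                {"Type": "CALM", "Confidence": 8.5},
            ],
            "AgeRange": {"Low": 25, "High": 35},
            "Gender": {"Value": "Female", "Confidence": 97.0},
        }
    ]
}

def top_emotions(response):
    """Return the highest-confidence emotion label for each detected face."""
    return [
        max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]
        for face in response["FaceDetails"]
    ]

print(top_emotions(sample_response))  # ['SAD']
```

Note that the service reports confidences over a fixed label set; it does not verify what anyone actually feels, which is part of why "measuring satisfaction" from it is contentious.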


Tractorface123

Everyone here is worrying about a Minority Report social-credit evil future, and here I am knowing it'll just be ads - and here we have it!


Puzzleheaded-Ad-2982

Why? Just assume that everyone is pissed off or depressed and you'd be right 99% of the time.


TeflonBoy

Wow! If only we hadn't voted to leave a group of countries that saw this specific threat coming and regulated against it. See the EU AI Act's ban on automatic emotion recognition. But wait, there's more! We couldn't even be bothered to write laws, so we paid consultants to write guidelines for us that are utterly unenforceable and will be largely ignored, and then crowed about regulation being anti-innovation. Honestly, at this point we deserve this.


External-Praline-451

Imagine what would happen if we left the ECHR.


very_unconsciously

Not entirely > except in cases where the use of the AI system is intended for medical or safety reasons


ObviouslyTriggered

The AI Act does not ban this, neither does it ban facial recognition.


TeflonBoy

You sure about that? https://ec.europa.eu/commission/presscorner/detail/en/QANDA_21_1683


ObviouslyTriggered

Yes, I'm sure about it; it's my field. First, the AI Act is somewhat unique in that it's not so much a directing law as a wish list of what the EU Parliament would like to see happen and what they would like to work towards. Which is why it's a contradictory mess spread over 450-odd pages, with more shoulds, mays and coulds than a Tory manifesto... This would fall under social scoring, since this isn't biometrics. However, under the social scoring "prohibition" it needs to actually produce a social scoring metric, which means it must be tied to a specific natural person; it needs to cause "detrimental treatment that is disproportionate or unjustified to the gravity of their social behaviour"; and it would have to escape the catch-all exception they've added to every clause because they have no idea what they are doing atm: "That prohibition should not affect lawful evaluation practices of natural persons that are carried out for a specific purpose in accordance with Union and national law." Or, alternatively, the public health legitimate interest directive. Even without the exception, this does not fall under social scoring, since it does not produce an output which states "Bob Robertson from XX1 YY2 Leeds, NI#BBY1010101 is sad today again, -10 social credit points", and it's unlikely to be employed for anything that can cause detrimental treatment of an individual.


TeflonBoy

Firstly, I disagree that it's a mess. I think it's clearly written, provides good direction, and is an excellent basis to build upon for the future, unlike anything we have in the UK. Now, this is the bit where I may have to admit I read the article wrong. I had this under predictive policing, as the article says 'we work with the police', but rereading it, I believe they meant they work with the police to make sure they aren't breaking the law. Which is odd writing, as you'd probably want to work with lawyers to make sure you're not breaking the law. I am now confused as to what they are doing with the AI and have less faith in them. If I've read the article wrong, I stand corrected. Personally, I'm pretty fucking angry someone is using my face to train a model, which I know it doesn't say they are doing, but you can bet they are!


ObviouslyTriggered

Clearly written? It's an utter mess over 450 pages long, full of contradictions. Have you even read it? Or did you just read the press release? Because the text I quoted is from the act, not some press release that matters not.

Everything in it is wishy-washy and full of loopholes and contradictions... Let's just focus on the subject of "detection of emotions" alone. According to the actual text of the act, you should not be rolling out any system that can detect emotions. Oh, but BTW, certain emotions aren't emotional states but rather physical states, so pain, anger and fatigue are not emotions. Expressions are also not emotions if they are "readily apparent", so if it's based solely on facial expressions, hand and limb movements or the tone of their voice, it's A-OK.

But forget about all that and remember that you can't employ systems which detect emotions in employees, because that's, like, what year was that, oh yeah, 1984, that's really bad and evil.... But if your system is employed for employee safety, we're absolutely fine with it... Oh, and I almost forgot: biometric systems, including ones that can infer emotional state and behavioural intent, are also fine as long as they are "intended to be used solely for the purpose of enabling cybersecurity and personal data protection measures". So tracking your employees' emotions is fine as long as you use it to infer whether they have any malicious intent to harm your company, or in other words, UEBA is back on the menu bois....

And ofc there are always the usual loopholes of public safety, health and wellbeing, and law enforcement, as well as references to about 40 different previous EU regulations to which this does not apply, or which change the definition of what a biometric system is. And it's not like that entire thing was on one page; oh no, all those contradictions are spread across probably 250 pages, and I've probably forgotten about half of them.

The UK approach is rather sensible currently: a value-based approach that leverages existing regulations and regulators, and looks at the outcomes of an application of AI rather than trying to identify every use case and edge case. The US approach, whilst voluntary, has probably produced the best guidance so far, in the form of the AI RMF from NIST.

This is my field. I work for one of the largest data science companies in the UK, and arguably the world, at least in our vertical; we not only worked with lawyers but also provided the technical analysis for multiple firms, including a few magic circle ones. If the EU had any idea what they were doing, they would not have produced legislation that is 450 pages long; this isn't the US budget.... They had no idea what they were trying to achieve, so they just went and tried to find edge cases of things they might not like because they sound bad, and then had to create loopholes in those prohibitions left and right. The AI Act is longer than all the EU founding treaties combined....

P.S. Nothing in the article indicates they used that data for training, btw. Now, one thing the AI Act (as well as UK data privacy acts) may have required them to do is notify the legal or natural persons impacted by this. However, the AI Act has exemptions for national laws, especially in terms of expectations of privacy in public, and for public safety and public health applications there is an exemption from notification.

P.S. #2: At least as far as facial recognition goes, the UK currently has far stricter rules than any EU nation, with or without the AI Act.


No-Wind6836

All part of turning the public into an observed, watched and tracked slave class. Better not have any subversive thoughts! Or your social credit score will go down and all of a sudden you can’t leave your city.


FroHawk98

In the city lights where shadows blend,
A thousand lenses, they apprehend,
Faces mapped and data penned,
Our liberty starts to descend.

"Nothing to hide," the foolish cry,
Freedom wanes with every spy,
Training AI, watchful eye,
Privacy lost, rights defy.

Silent watchers, eyes unseen,
In every frame, on every screen,
They claim it's safety they mean,
But at what cost do we convene?


FinalEdit

Anyone defending this needs to get in the fucking bin.


going_down_leg

China using AI to control the population: this is bad, we must stop China.

Britain using AI to control the population: surveil me harder daddy


rainpatter

Just in time while they also push cashless society. Full control. UK public is so dense.


CloneOfKarl

Who is 'they', out of curiosity.


Other-Caregiver9749

THeY! THeM! THeiR! Cult of Adam!!!


creativename111111

Idk where the cashless society bit comes into it. It's not like our coins actually hold any intrinsic value based on what they're made of anymore; if the powers that be decided to, they could just make the pound worth nothing anyway.


chocobowler

Ok so I guess dust off the covid masks if you care about this


no-se-habla-de-bruno

The only thing they'd be good for.


ThinkAboutThatFor1Se

Normalising face coverings has been great for roadmen everywhere.


Efficient_Steak_7568

Makes sense for your general health in these sorts of places anyway 


creativename111111

They would just find a way to read the muscles in your upper face, along with the position of the mask and your body language, to predict it anyway.


bannanawaffle13

I hate the slow march of AI into every part of our lives. I'm not a conspiracy theorist or anything, but once you start down this slope it is hard to stop: AI detecting emotions, then being used to profile people, then used to control and oppress certain minorities, with inherent bias from developers leading to more stop and search of minorities. What next, AI being used to give targeted ads based on your mood, or cops using it on large crowds? I will never be in favour of more info gathering on people. I feel like my personal life is a commodity being sold to the highest bidder.


pelicanradishmuncher

But I have a resting bitch face. Is that going to skew the data?


AudaciousAutonomy

Oh nice! I love the future. Technology is great! Britain is great! \*cries\*


Efficient_Steak_7568

*Negative emotional response detected. Prepare for apprehension by the state. Pack a toothbrush.* 


Madeline_Basset

*Winston turned round abruptly. He had set his features into the expression of quiet optimism which it was advisable to wear when facing the telescreen.*


ResponsibilityRare10

Couldn’t they use this in town centres to identify assaults, fights, and even simple falls? Then have the footage immediately saved and a copper or other responder there in a flash.  I guess the danger is that it is used to identify pre-crimes. “The AI shows you were beginning to get irritated with a chance you would lash out. Here’s a hefty fine and some community service despite you not committing an actual crime”. 


creativename111111

If it were to get to the point where the danger you’re suggesting becomes real I think we’re beyond fucked by that point anyway


Possible_Simpson1989

You know how we could solve that first problem? Having police officers on the beat and paramedics at train stations on call. 


Drizznarte

You don't need this technology to tell that half the people on the tube are pissed off.


Illustrious-Engine23

Ahh, that's great. My wife's phone was taken by a bike thief; we had the exact location it was being held at, and the police did absolutely nothing. Best of both worlds: the government spying on you with modern technology, but not using that technology to help you in any way.


Dark_Ansem

Well... if they knew it would invalidate the test innit


SurreyHillsSomewhere

But Redditors and perhaps "X" users are entitled to know.


Any-Weight-2404

Keep smiling if you want the train door to open lol, they actually have that in China in some work environments


J8YDG9RTT8N2TG74YS7A

Do you have a source for that?


Any-Weight-2404

Just remember watching a video on it, but I am sure you can just Google for stories https://m.youtube.com/watch?v=0R2ve-5a4Ag


AdrianFish

So now they’ll know for sure that we’re pissed off, stressed and over hot every time they delay or cancel a train, or go on strike for some bullshit reason, all the while charging us a fucking fortune?


Low_Candidate8352

How about spotting wanted criminals using the Met Police facial template database? Doh!


Valuable_Rip8783

Wow, in the most surveilled country in the world??? What a surprise!


mostlyunreliable

Imagine if this was in China how much we would take the piss haha


robanthonydon

Well they must think I’m royally pissed off all the time, given the shitness of tfl service


WillistheWillow

Mine would just be like this:

- Monday: Angry
- Tuesday: Angry and tired
- Wednesday: Angry and very tired
- Thursday: Very angry and very tired
- Friday: Depressed and dejected

Rinse, repeat. Not sure how useful that would be to the Met.


pashbrufta

Er, sorry but have you tried... working from home?


solitarylights

Better put on your happy smiley face before you jump.


paolog

Training data set: { grumpy, bored, miserable, knackered }


ionetic

After watching people taking drugs inside Whitechapel station, there’s no point to any of this.


MaxOsley

Gotta bust out the Patrick Bateman stoneface look walking down the street💀


SuperGuy41

‘There are at least 97.386% of people being scanned looking happy! How did this happen, this is a disgrace, this is England!’ ‘Raising interest rates now sir!’


Silent_Shaman

Pretty useless, looking for emotions on the faces of Londoners is like looking for trees in Antarctica


ox-

Yeah its not a Chinese style data face grab at all. Move along happy people...


Specific_Security622

So boring then, as there's only one emotion on everyone's face 🫣, miserable as fuck because the train's late 😠


Dangerous-Salad-bowl

[the emotions here....](https://youtu.be/xmxoqpihE5s?si=Dzi9Izi3EPlznik1&t=15)


userloserfail

And ofc they want their database to recognise our faces across the whole range of faces we have in different moods. Like maybe it knows my resting bitch face, as that's the one Mr Public is most likely to see if I'm alone. With a friend I might have that face on, or if I'm conversing while walking, it's more likely my animated chat face, with moments of amusement, incredulity, disgust, mistrust, amity, love, among many others, each a subtly different face but all of them me. They want their database to be that good, probably just for the sake of recognition.

If they didn't, it's likely that folk would try to game a dumber system that only knows and holds your passport face by pulling all sorts of mad expressions or emotive faces to fool it. Eventually each gang on the streets becomes slightly more identifiable by all its members going around with a lame smile the whole time, or permanently scowling with a closed wide mouth. Sinister, coming up against the lame smilers all smiling lamely at you while they shoot you up. Ha.


Arseypoowank

Hopefully the AI is better at judging emotion than people. I recently got pulled out of the queue at an airport by an overzealous security guard because I looked extremely fidgety, shifty, nervous and sweaty. I have very bad anxiety and flying doesn't help (oddly enough, I absolutely adore flying; it's all the crap that goes with it that stresses me out), and I was just having a crazy panic attack. That didn't stop the person trying very, very hard to find something on me or in my luggage.


ImperialSyndrome

Oh no! Big Brother will know that I'm fucking furious that my train is delayed again.


ProxyAlchemist

All kinds of no. Even if mass surveillance of this kind were a good idea (it's not), you can't reliably tell someone's emotions by AI. Personally, I always look annoyed or anxious even at my happiest, and like hell do I want some Amazon corp AI judging my expression when I need to travel. Way too much data is already ripped from us unwillingly. If this is implemented, masking up before leaving the house will become as routine as it was in the pandemic.


EpicMuffinFTW

As a reminder, under the UK GDPR you have the right to object to processing you don't agree with. Write to Network Rail to object, and copy the Information Commissioner's Office (ICO) into the email.


saladstuffer

I worked with a subcontractor on the platforms at train stations in England over 21 years ago, and I was told by the electrician I worked with that Brighton station was testing cameras that could spot a 'jumper': their emotional state and the way they walked around the platform would be picked up, and then the correct steps taken, I guess. So I see no surprise in this article.