by gorbachev
6 subcomments
- Meta cancels its contract with the outsourcing company it hired to classify smart glasses content, after employees at that company blow the whistle about serious privacy issues with the content they were paid to classify.
- > "We see everything - from living rooms to naked bodies," one worker reportedly said.
> Meta said this was for the purpose of improving the customer experience, and was a common practice among other companies.
Am I reading this correctly?! This is probably the weirdest statement I've read on the internet in twenty years.
by HarHarVeryFunny
5 subcomments
- Not sure which is worse here - that Meta are recording video from customers' smart glasses, or that they are firing people who talk about it.
- I believe the tricky privacy and security issues around smart glasses (and other "personal" tech) can be navigated successfully enough by a thoughtful, diligent, responsive company.
Which is why I'd never touch a personal tech device from Meta.
Their entire DNA is written to exploit their users for profit. In my judgement, they literally cannot and will never consider those issues as anything other than something to obscure to keep people unaware of the depth of the exploitation.
by reliablereason
1 subcomment
- I wonder under what circumstances footage from the glasses is uploaded for classification.
Probably this is people asking the glasses something about what they see and the glasses uploading video for classification to generate an answer.
People think it is "just AI" so are not very concerned about privacy.
- If you want to read more about how unsavory aspects of AI-training are off-loaded onto poor workers in third-world countries, would recommend Karen Hao's "Empire of AI". These workers are paid pennies an hour for unstable jobs that expose them to some horrific material.
by KaiserPro
2 subcomments
- Ex Meta employee here (yes you are right to boo):
The thing that really gets me is that internally there are 4 levels of data: 1 being public domain shit (the sky is blue), up to 4, which is private user data, or anything that is sensitive if leaked or shared.
I was told that by default all user data is level 4, as in if you do anything without decent approval, you're insta fired. There are many stories about at least one person a month during boot camp accessing user data and getting escorted out of the building within hours.
The part where I worked, in visual research, we had to jump through a year's worth of legal hoops to get permission to record videos in public. We had to build an anonymisation pipeline and a bulletproof audit trail, delete as much data as possible, with auto-delete if something went wrong.
We had rigid rules about where that data could be stored and _who_ could access it. We were not allowed to share "wild" footage (ie data that might have a hint of anyone who hadn't signed a contract) for annotation, because it would be given to a third party. The public datasets we released all had traceable people and locations, all with legal waivers signed.
Then I hear they just started fucking hosing private data to annotators to _train_ on? Without any fucking basic controls at all? Just shows that whenever Zuck or monetisation wants something, the rules don't apply.
I look forward to that entire industry collapsing in on itself.
by swiftcoder
3 subcomments
- One of the bigger commercial niches for smart glasses is filming POV porn, so it is hardly surprising that sort of content ended up in the moderation queue. The project should have planned to account for that use case.
- This headline reminds me that “row” is one of those words I’ve been mispronouncing almost my whole life (I just learned the correct pronunciation this year). In this context row rhymes with cow,¹ not dough.
⸻
1. The first rhyme that came to mind was bow, but I realized there was a problem with that example.
- https://archive.ph/ubWba
- Big tech and the race to the bottom of the ethical pit. We can still go lowerrrr!
- So I've never had a smart speaker in my house (Alexa, Apple, Google). I've just never been comfortable with the idea of having an always-on, cloud-connected microphone in my house. Not because I thought these companies would deliberately start listening and recording in my house, but because they will likely be careless with that data, and it'll open the door for law enforcement to request it. Consider the Google Wi-Fi scraping case from Street View.
Or they might start scanning for "problematic" behavior, a bit like the Apple CSAM fingerprinting initiative.
So not one part of me would ever buy Meta glasses (or the Snap glasses before that). You simply don't have sufficient control over the recordings and big tech companies can't be trusted, as we've witnessed from outsourced workers sharing explicit images. And I bet that's just the tip of the iceberg.
I honestly don't understand why anyone would get these and trust Meta to manage the risks.
- Meta ended its contract with Sama
At this scale, this sounds like some insider-joke contract, made up only to make some hustle on the side, capitalizing with stock options on the chance of ad-hoc news-trading bots glitching out on the keyword, here the "x.com/sama" signal.
- Absolutely no way I'd buy anything from Meta that has a camera built-in.
- What does "in row" mean? For us non-native English speakers.
- Why do they even need workers to classify naked content? They could filter some content prior to passing it to workers. They already have models to moderate explicit content.
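The comment above suggests running an automated explicit-content classifier first, so that only ambiguous material ever reaches human reviewers. A minimal sketch of that triage idea, in Python; `route_frame` and its thresholds are hypothetical illustrations, not any real moderation pipeline:

```python
# Triage sketch: an automated NSFW classifier scores each frame first,
# and only the uncertain middle band is routed to human annotators.
# The score values and thresholds here are made up for illustration.

def route_frame(nsfw_score: float,
                auto_reject: float = 0.95,
                auto_accept: float = 0.05) -> str:
    """Decide what happens to a frame given a model's NSFW confidence."""
    if nsfw_score >= auto_reject:
        return "auto-rejected"   # model is confident it's explicit: no human sees it
    if nsfw_score <= auto_accept:
        return "auto-accepted"   # clearly benign: skip human review entirely
    return "human-review"        # only the ambiguous band goes to workers

# Example: of five frames, only the two ambiguous ones reach annotators.
scores = [0.99, 0.02, 0.50, 0.97, 0.10]
decisions = [route_frame(s) for s in scores]
# → ['auto-rejected', 'auto-accepted', 'human-review', 'auto-rejected', 'human-review']
```

Even with such a filter some explicit content would still land in the human queue (models miss things, and the uncertain band is exactly where they fail), but it would shrink the volume dramatically compared with sending raw footage straight to workers.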
- I think Meta, like all companies, doesn’t want its subcontractors creating bad press for them.
So it doesn’t surprise me that Meta didn’t renew/cancelled a contract that is a net negative for them. Arguing over the reason seems fruitless, as no reason is needed per the terms of the contract (I assume, since breach of contract wasn’t brought up by the sub).
by letmetweakit
0 subcomments
- Unfortunately this news will have no impact: not on customer behavior, not on policy, not on Meta's behavior.
- > and was a common practice among other companies.
Meta isn’t lying; you should assume other companies are doing it too. Tesla did it with their cameras, and assume that any company with access to your camera does the same, even CCTV operators. It’s why, for anything sensitive, you should try to use open source stacks. You might lose some of the features, but it’s a needed compromise.
by I_am_tiberius
3 subcomments
- Not a fan of regulation in general, but would love to see a ban of cameras on glasses used in public spaces.
by shevy-java
0 subcomments
- Facebook may have to rename itself into NaughtyBook or SpyBook or Pr0nBook. They really want people to help them spy on other people here, including their sex life. Expect new sexy videos in 3 ... 2 ...
- > Meta's glasses have a light in the corner of the frames that is turned on when the built-in camera is recording.
Because nobody knows how to put a dot of nail polish on an led they don't want seen, right?
by theowsmnsn
1 subcomment
- Meta is so evil
- It seems the issue is not the glasses users, but the people that the glasses users were having sex with. Did meta get their consent before redistributing this content?
- A question for the HN folks who work for Meta - Is the pay so good that it makes it worth working for such a morally bankrupt organization?
by jimmyjazz14
2 subcomments
- It still blows my mind that anyone would volunteer to don these smart glasses; it's almost like some alien mindset to me.
by rufasterisco
0 subcomment
- It would be refreshing for once to see the top comment on such articles be:
“Yes, we all know it, and we keep those apps installed regardless.”
by talkingtab
6 subcomments
- Meta said the contractor "did not meet (Meta's) standards". I am sure that is true. Meta's "standard" is not to reveal the illegal, immoral, unethical things Meta does. No matter what the harm.
Maybe a company with those standards should not get our business. Oops, no wait, maybe they mean the Friedman Doctrine standards? In that case they are entitled to do any and every thing to make a profit. No matter what the harm.
[edit: add last two sentences]
- Why would anyone trust Meta with their personal data? After a while it's just natural selection.
- Good. Anyone who works for such a company is immoral in my opinion.
by MarchApril
0 subcomments
- I don't even trust the electronic webcam shutter of my laptop, to the point that I put tape over it. Why do people trust tech giants with their private data so much? What do they think of when they hear "private data"? Trivial things such as their names, their mothers' maiden names, what they ate this morning, posts they saw on Facebook? Because that has to be the case when I see someone fiddling with an app on their phone and yelling "Siri, play a seductive song to get me and my wife Julie in the mood. Turn it off after 10 minutes. Also don't record us during our hot lovemaking!".
- i don't think smart glasses are a good idea in themselves
by fortran77
2 subcomments
- People have sex with their glasses on?
- this may be the greatest title i've seen on hacker news in a decade
- who on earth is buying these things and why
- This is what happens when you buy a camera from the "they trust me, dumb fucks" guy and put it on your face.
- I got a paywall, first time I've seen that on BBC.
- Oops! Oh, too late. And another nail in the heart of smart glasses…
by game_the0ry
0 subcomments
- Can we boycott meta yet? I am sick of this company.
- I bet the victims had their socks on too
by ai-network-lab
0 subcomments
- [flagged]
by 3748595995
1 subcomment
- [flagged]
- About the "they asked us to view it and then fired us for it" part: having worked in their RL division (I don't work at Meta anymore), this story is quite weird for two reasons:
1. Meta AFAIR paid/compensated people (contractors, or people recruited via ads) to have them submit their data. There are strict privacy protocols and reviews in place to distinguish data use in these cases vs the general public. This is not to say the process is perfect, but if these users are the general public, I would be very shocked.
2. Hiring contractors to submit data is a more controlled environment than recruiting the general public via ads, and the former has better-understood privacy disclosures than the latter. In practice, asking contractors to wear glasses and "move around their surroundings naturally and do things" fits the basic privacy practice of "the data you are submitting, we can view and use all of it for purpose X and nothing but X". But that framing is much, much harder with ad-recruited people, who are general users willingly submitting data. My suspicion is they are running ad-based recruiting among the general public, and while those users may have signed a privacy statement, it is very surprising that they did not tighten the privacy practices around the use of the data and who has access.