Hacker News

Meta’s AI smart glasses and data privacy concerns

1362 points · by sandbach · yesterday at 10:32 PM · 760 comments

Comments

zmmmmm · today at 12:32 AM

I do think it's completely unacceptable if Meta makes the glasses unable to be used for routine functions without (a) other humans reviewing your private content and (b) AI training on your content. There needs to be total transparency to people when this is happening - these are absolutes.

But I'm a bit confused by the article, because it describes things that seem really unlikely given how the glasses work. They shine a bright light whenever recording. Are people really going into bathrooms, having sex, or sharing rooms with people undressed while this light is on? Or is it deliberate tampering, a malfunction, or Meta capturing footage without activating the light? (Hard to believe even Meta would do that intentionally.)

show 16 replies
chwahoo · today at 1:53 AM

I'll confess that I like my Meta Ray-Ban glasses: I love using them to listen to podcasts at the pool/beach or while riding my bike, and it's cool to snap a quick picture of my kids without pulling out my phone.

I wish this article (or Meta) were a bit clearer about the specific connection between the device settings, how the device is used, and when humans get access to the images.

My settings are:

- [OFF] "Share additional data" - Share data about your Meta devices to help improve Meta products.

- [OFF] "Cloud media" - Allow your photos and videos to be sent to Meta's cloud for processing and temporary storage.

I'm not sure whether my settings would prevent my media from being used as described in the article.

Also, it's not clear which data is being used for training:

- random photos / videos taken

- only use of "Meta AI" (e.g., "Hey Meta, can you translate this sign")

As much as I've liked my Meta Ray-Bans, I'm going to need clarity here before I continue using them.

TBH, if it were only use of Meta AI, I'd "get it" but probably turn that feature off (I barely use it as-is).

show 14 replies
jspdown · today at 7:04 AM

Don't you need to obtain consent before filming random people in the street? I already feel uncomfortable when someone takes a photo in public and I happen to be in it, but this type of device takes things to an entirely different level. With smart glasses, there's no visible cue that you're being recorded. No phone held up, no camera in sight. I'm questioning the legality of this in Europe, where privacy laws tend to be stricter. In the meantime, should I just assume that anyone wearing these glasses is always filming? And would I be within my rights to ask them to stop the moment I notice them?

show 10 replies
notyetmachine · today at 8:00 AM

Ghanaian authorities are seeking the arrest of a Russian national who was using the glasses to record himself picking up, and sleeping with, women in Ghana and Kenya. He uploaded the videos to social media and Telegram. It was quite the story on African tech Twitter last month.

https://www.bbc.com/news/articles/c9wn5p299eko

jameson · today at 4:29 PM

Everyone should assume that _anything_ connected to the internet will get uploaded to the internet, and that someone within the company will have permission to review the contents, regardless of what the policy says.

1. Debugging for troubleshooting.

2. Analytics for making the product better.

3. Bugs that collect your info when they shouldn't.

4. Bugs from 3rd-party vendors, if the company uses those.

5. Insecure processes. Getting access to private content within the company is trivial due to the coarse permission model.

Source: I worked at two well-known social media companies, on Trust & Safety and data infra teams.
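Point 5 above is about granularity: one blanket flag guards every record. A toy sketch of the difference between a coarse and a scoped permission check (all names here are hypothetical, not any company's actual model):

```python
def can_view_coarse(employee_perms):
    """Coarse model: a single blanket flag unlocks every user's content."""
    return "content.read" in employee_perms

def can_view_scoped(grants, purpose, record_id):
    """Scoped model: access requires a grant tied to a purpose and one record."""
    return (purpose, record_id) in grants

# With the coarse model, anyone holding "content.read" can browse anything;
# with the scoped model, a debugging grant for one clip opens only that clip.
```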

blakesterz · yesterday at 11:00 PM

  Meta aims to introduce facial recognition to its smart glasses while its biggest critics are distracted, according to a report from The New York Times. In an internal document reviewed by The Times, Meta says it will launch the feature “during a dynamic political environment where many civil society groups that we would expect to attack us would have their resources focused on other concerns.”


https://www.theverge.com/tech/878725/meta-facial-recognition...
show 9 replies
MerrimanInd · yesterday at 10:55 PM

I was in engineering school back in ~2012 when Google Glass came out. One of my classmates got hold of a pair when they were still quite uncommon and wore them to an extracurricular club meeting. Within minutes someone made a comment about him wearing the "creeper" glasses and asked if he was filming. He never wore them to the club again.

I just don't see a world where that doesn't happen with Meta glasses.

show 23 replies
mayowaxcvi · yesterday at 11:53 PM

My concern was whether the glasses might record or transmit data while switched off or in standby mode. From what I can tell, they don’t do this intentionally. So the risk is broadly similar to other modern electronic devices.

The creepiness concern is real, but I think people misplace where the actual surveillance happens. The most consequential stores of personal data aren't ad networks; they're things like banks, hospitals, insurers, and telecoms. These institutions hold information about your health, finances, movements, and relationships, indexed and searchable by employees you've never met, governed by policies you've never read.

Realistically, there’s very little an individual can do to completely opt out.

My take is: if the main outcomes are that I get shown ads for things I don’t need and my facecomputer knows the difference between a fork and a spoon… I… I can live with that.

show 2 replies
bhekanik · today at 2:00 PM

As a dev, I think the core issue isn’t whether one indicator LED can be bypassed — motivated people can bypass almost any client-side control. The trust boundary is policy + defaults. If enabling “AI features” implicitly authorizes broad retention/review, users won’t understand the tradeoff until after the fact.

A better pattern would be tiered modes with explicit UX: local-only capture, cloud processing without retention, and opt-in retention/training with visible status. If the product can’t technically support that separation today, that limitation should be stated plainly in setup, not buried in policy docs.
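The tiered modes described above could be modeled as a small state machine on the device. A minimal sketch, assuming three hypothetical tiers (these names and rules are my illustration, not Meta's actual settings model):

```python
from enum import Enum, auto

class CaptureMode(Enum):
    LOCAL_ONLY = auto()       # media never leaves the device
    CLOUD_EPHEMERAL = auto()  # cloud processing, deleted after the response
    CLOUD_RETAINED = auto()   # explicit opt-in: retention, review, training

def may_upload(mode):
    """Only LOCAL_ONLY guarantees media never touches the cloud."""
    return mode is not CaptureMode.LOCAL_ONLY

def may_retain_for_training(mode):
    """Retention and human review require the explicitly opted-in tier."""
    return mode is CaptureMode.CLOUD_RETAINED
```

The point of separating the predicates is that "my photo was processed" and "my photo was kept for review" become distinct, user-visible states instead of one implicit bundle.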

_ZeD_ · today at 5:43 AM

Sooo... I really should just keep running this[1] all the time...

[1] https://github.com/yjeanrenaud/yj_nearbyglasses/
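Tools like the one linked scan nearby Bluetooth advertisements and flag device names that look like smart glasses. The detection step boils down to string matching on advertised names; a rough stdlib-only sketch (the marker substrings are my guesses, not taken from that repo):

```python
# Heuristic name filter -- the marker list is a guess, not the repo's actual list.
GLASSES_MARKERS = ("ray-ban", "rayban", "meta glasses", "stories")

def looks_like_smart_glasses(advertised_name):
    """Return True if a Bluetooth device name resembles Meta smart glasses."""
    name = advertised_name.lower()
    return any(marker in name for marker in GLASSES_MARKERS)

# A real scanner would feed each discovered device's advertised name
# (e.g. from a BLE scanning library) through this filter and alert on hits.
```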

thomassmith65 · today at 1:58 AM

I do not care about the privacy of people who buy these glasses nor their families.

I care about the innocent people whose privacy is invaded by people who buy these glasses.

show 1 reply
bogzz · yesterday at 11:01 PM

I am so far removed from the type of person who might consider buying something like that. You'd have to be exceptionally impervious to social cues to even think of wearing that in public.

If you're blind, it's of course understandable but that's pretty much it in terms of cases in which I would consider the glasses acceptable to wear in public.

show 3 replies
NalNezumi · yesterday at 11:46 PM

I sincerely hope someone in Japan or Korea gets caught using these to peek under trousers on the train, so they get the forced camera-shutter-sound treatment that smartphones get over there.

So the world can label them Hentai glasses and move on.

show 2 replies
ccccrrriis · today at 2:16 AM

I got a pair as a gift and didn't look much into them but I have to be honest, I assumed any data I captured - voice, video, etc. - would be sent to their servers (to use their models) and they'd be using it for training with humans in the loop.

Tbh the only thing I really use the glasses for are listening to music or talking on the phone - so basically how you'd use airpods. I don't use airpods because I had an ear injury that prevents me from using them on my left ear, so these glasses were kinda nice for that. I really wish they didn't have a camera though because I do always feel compelled to remove them if I interact with people.

I also have to add that the quality is mediocre. They're a month old, the case sometimes has problems charging, and one of the screws at a hinge keeps coming loose no matter how often I retighten it.

majestik · yesterday at 10:51 PM

Is anyone here actually surprised Meta is recording and reviewing their content?

Vote with your dollars, people.

show 6 replies
gverrilla · today at 10:59 AM

Usage of creep-ware won't be tolerated in the social groups I take part in.

We will shame hard anyone who uses this sh1t.

show 1 reply
dhab · today at 1:28 PM

Conditioning the crowd gradually toward being monetised in some of the most egregious ways: first pay for the glasses, then pay with revelations of your private life sold to government (ICE?), business (private insurance), and so on. Super evil.

And despite this, there is no strong will to detach from what they produce - neither in the beginning, nor later when it is considered part of the cultural fabric. That's how good their tactics are.

And for the pay one gets working for them - screw the world! I won't use it anywhere near my loved ones, but I'll build it.

andy_ppp · today at 11:53 AM

We need safe spaces where you aren't constantly living inside the panopticon...

shevy-java · today at 11:00 AM

Well - don't wear their spyglasses. It's really not that hard.

You can still record stuff without spyglasses. People do that on YouTube too, e.g. first-amendment audits. It's not that different from the spyglasses, except that you can cut Meta out of the process. (Admittedly, YouTube creates another problem, which is called Google. It would be nice if we could have platforms without a corporate overlord, but the financial aspect may still be an issue that requires solving. I don't have a good way to solve it, as I also have a 100% zero-ads policy, i.e. uBlock Origin is mandatory for me. And Google has declared total war against uBlock Origin, as we all know.)

binarynate · today at 1:32 AM

At a friend's party recently, I met someone who told me that they had worked in data for Meta's glasses division and warned me never to get Meta glasses for this very reason—that the workers can see everything. They told me of a comical case where a guy pulled down his pants to look at his penis, asked "Meta, what is this?", and the AI responded that it was a thumb. XD

show 4 replies
greatgib · today at 1:58 AM

Privacy policies and usage terms are the magic wand of the industry. Whatever terrible thing they want to do, and however they want to abuse you and your data, they just have to add a few unreadable lines to a 40-page document and that's it.

No one will read it, but even if you do, most of the time FOMO or the sunk-cost fallacy will make you go on anyway. And then it's a free pass for them.

halapro · today at 5:28 AM

To a technical person, this is obvious. The AI doesn't happen on the glasses, and it doesn't happen on your crappy phone; it happens online. Live streaming, which is also a feature, by definition sends everything it captures to someone else's computer (ahem, the cloud).

Yesterday I saw an Instagram reel of a guy asking "what am I looking at" while between his girlfriend's legs. Congrats, some Indian guy saw her too.

The core piece of information that is missing or unclear is whether this collection happens also when not actively and knowingly sending data to the cloud.

The glasses let me record videos locally; can Facebook see any frames of those? That's the question that needs to be answered. Everything else is noise like "omg, Amazon hears what I tell Alexa".

robotburrito · today at 5:54 PM

Often I hear, "These are so cool! It's a shame Meta makes them." All of their pivots will fail because of their track record, IMO.

KaiserPro · today at 9:27 AM

What I don't understand is where this data is coming from. Is it actually Meta's Ray-Bans, or is it Project Aria (https://www.projectaria.com/)?

Because I didn't think the data was uploaded to Meta by default when you take a video with the Ray-Bans.

Moreover, I didn't think those glasses could record more than 2.5 minutes.

The point still remains: the devil is in the detail of the "privacy" policy.

nothrowaways · yesterday at 11:02 PM

The whole project is a creepy privacy nightmare.

zouhair · today at 2:58 PM

I am confused that people act so offended. What do people expect? Meta and all tech companies do not believe in privacy for anyone but themselves. Isn't our info how they make their money?

Mulhimfy · today at 6:58 PM

The privacy implications here go beyond just the recording light. The real concern is how Meta trains AI models on captured data without explicit consent — and most users have no idea this is happening.

xmx98 · yesterday at 10:51 PM

Of course! Glasses with cameras are a classic secret spy gadget :)

arian_ · today at 2:32 AM

"Workers can see everything" means this isn't an AI privacy problem. It's a surveillance-as-a-service problem with extra steps.

impossiblefork · yesterday at 11:44 PM

While it may be legal for an individual to film something, it is certainly not permissible to process video data of this sort at scale.

I don't agree that the responsibility to comply with Swedish law is on the wearer. This should motivate prosecutors to immediately order raids to secure any data relating to this processing.

I think the Swedish camera-surveillance law is also applicable, and there's a deceptive element, since the cameras are disguised as glasses.

showerst · today at 12:22 AM

How does this not run afoul of states with two-party consent laws around recording conversations? Particularly since California is one of the strictest states.

show 1 reply
de6u99er · today at 7:55 AM

It's the same issue as Tesla collecting camera feeds through their cars to use for machine learning.

Those videos can also be used to track people. IMHO, each Tesla owner sending video data to Tesla's data centers is violating privacy laws!

show 1 reply
FireSquid2006 · yesterday at 11:40 PM

I'm not sure if there is any use case that could convince me to mount an internet connected device to my head at all times.

show 1 reply
bryanrasmussen · today at 8:42 AM

Smart glasses are potentially a great boon for mankind; it's just that both of the iterations we've had have come from companies that are arguably detrimental to humanity.

show 1 reply
smbullet · yesterday at 11:04 PM

Hopefully this causes Meta to be more transparent about what data is sent to their annotators. It seems like even the annotators didn't know whether the person explicitly hit record (accidentally or not) or whether the footage was sampled from a constant stream. This kind of makes it impossible for anyone to consent to the purchase agreements.

roughly · yesterday at 11:53 PM

Everything else in this article is horrific, but this stuck out to me:

> “The algorithms sometimes miss. Especially in difficult lighting conditions, certain faces and bodies become visible”.

Right, “difficult lighting conditions,” not sure when we’d run into those in situations where we might be concerned with privacy. A 97% success rate looks good on paper.

show 1 reply
nomilk · yesterday at 11:43 PM

Is it paranoid to assume every device with a camera/mic can see/hear everything?

That's my default assumption.

umpalumpaaa · today at 6:46 AM

The title is now “She Came Out of the Bathroom Naked, Employee Says”

yalogin · yesterday at 11:57 PM

Of course they can; why would one expect anything else? However, if you look through their processes, I am sure they are covered by some legal jargon while doing the bare minimum in terms of security. They will have every knob available to debug at the lowest level possible and view everything.

dlev_pika · today at 1:17 AM

Crazy to have $1 trillion invested in data centers, underpinned by dollar-a-day human Turk ops.

show 1 reply
aucisson_masque · yesterday at 10:54 PM

Besides the privacy part, I fail to see what value these glasses bring that a smartphone with a camera can't already provide.

And you're still forced to carry a smartphone anyway, since these glasses require an internet connection.

Is this fashion, or something I'm not aware of? They look horrendous to me.

show 4 replies
stavros · today at 1:48 AM

What the hell? I thought the videos went to the phone directly, they're all getting uploaded to Meta? I don't know why I let my guard down against that company for one second.

EDIT: Wait, is this when you use the "ask Meta" feature? I do expect that to send all the clips to a server for an LLM to process, it's not done on-device. It's not clear to me whether it's that or just all videos/photos you record with the glasses.

breve · today at 10:02 AM

Meta's business model is premised on intensive and pervasive user surveillance.

When you use Meta's products and services you are tagged, tracked, and commodified like an animal. You are cattle.

The question isn't whether or not Meta's AI smart glasses raise data privacy concerns.

The question is why use anything from Meta in the first place?

show 2 replies
sidcool · today at 4:03 AM

Despite the historical misadventures of Meta, if people still use their products with an expectation of privacy, it's on the people.

Murfalo · today at 2:05 AM

Surely this is already happening with our other devices? Not that it isn't a problem but that the game is already lost...?

mjbonanno · today at 1:35 PM

The privacy angle here is fascinating. Curious if anyone has tried running the on-device model locally yet?

stevefan1999 · today at 4:58 AM

I would really love to use smart glasses for DevOps, especially Grafana dashboards

arcadianalpaca · today at 7:20 AM

The recording-light argument keeps coming up, but I don't buy it. I can't tell if someone's glasses have a tiny LED on from across a room, and neither can anyone else. Under GDPR it's on Meta to handle consent, not on me to squint at someone's face to figure out if I'm being filmed.

show 1 reply
thunderfork · today at 3:23 AM

Fun fact: all advertiser chat-support agents at Meta used to (and still might) have full super-read on FB. When you read "workers" in this headline, don't think "devs"; think "legions of contracted-out T1 support staff".

show 1 reply
