I do think it's completely unacceptable if Meta makes the glasses unusable for routine functions without (a) other humans reviewing your private content and (b) AI training on your content. There needs to be total transparency for people when this is happening - these are absolutes.
But I'm a bit confused by the article because it describes things that seem really unlikely given how the glasses work. They shine a bright light whenever recording. Are people really going into bathrooms, having sex, sharing rooms with people undressed while this light is on? Or is this deliberate tampering, a malfunction, or Meta capturing footage without activating the light (hard to believe even Meta would do that intentionally)?
I do believe people do all of that with the light on. And then there are also people who tamper with the device to deactivate the light. You can find guides for that online.
I'm going to guess that people are intentionally recording themselves having sex, assuming that they are creating a local recording that is not sent to Meta. Does the light mean "camera is recording" or "cloud services are involved"?
But there is total transparency though? Meta is using all your data, always. And the harder they say they’re not, the sneakier they’re doing it.
If you're not paying a subscription for Meta to AI-process your audio and video, then they're going to extract value from it some other way. It's just like any other 'free' digital service.
It is entirely possible that all "camera is on" lights are software-controlled, just like the camera and independently of it. They are meant to tell the user that the user is using the camera. They are not meant to tell anyone that the device maker's back end is using the camera.
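To make that point concrete, here is a minimal sketch (hypothetical, not Meta's actual firmware) of how a device might look if the LED is just another software-toggled output rather than being hard-wired to the sensor's power rail:

```python
# Hypothetical model of a device where the recording LED is driven by
# software independently of the camera sensor. Class and method names
# are invented for illustration.

class GlassesCamera:
    def __init__(self):
        self.capturing = False
        self.led_on = False

    def start_user_recording(self):
        # Normal path: firmware lights the LED alongside the sensor.
        self.capturing = True
        self.led_on = True

    def start_silent_capture(self):
        # Nothing in this design forces the LED to follow the sensor;
        # a different code path (or a firmware compromise) can simply
        # skip the LED while the sensor records.
        self.capturing = True
        self.led_on = False

cam = GlassesCamera()
cam.start_silent_capture()
print(cam.capturing, cam.led_on)  # True False
```

The only robust fix is in hardware: wire the LED to the sensor's power rail so the light physically cannot stay off while the camera draws power. A software-only indicator is always one code path away from being bypassed.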
I find the root issue to be that what the glasses see is described as "content" in the first place.
It is also completely unacceptable to capture the public space without oversight and consent from third parties. If glasses wearers are fine with doing that to others, why wouldn't they accept it for themselves?
The Zuck being the Zuck, I wouldn't put it past him to collect data even when the cosmetic light is off.
This was one of the first hits on Kagi. 404 has a similar article (I think) but it's behind a paywall.
"The demand for this ‘Ray-Ban hack’ has been steadily increasing, with the hobbyist’s waiting list growing longer by the day. This demonstrates a clear desire among Ray-Ban owners to exercise more control over their privacy and mitigate concerns about unknowingly recording others."
https://bytetrending.com/2025/10/28/ray-ban-hack-disabling-t...
People absolutely will (and are) modding them to hide that light. But even if they weren’t, a lot of people won’t notice.
And regardless of any privacy policy or the like, you still have to worry about Room 641A scenarios [https://en.wikipedia.org/wiki/Room_641A].
Can you imagine a Stasi where a large portion of the population is also wearing pervasive surveillance tech? Amazing!
If anyone were to record even when the light is not shining, it would be Meta. This would not surprise me at all: they have everything to gain and nothing to lose, since no country would fine them anything remotely significant compared to the value of the data they'd be getting.
I mean, laptop webcams also shine a light when they're recording, but obviously you don't just trust the light to come on, right?
>hard to believe even Meta would do this intentionally).
Hahahahahahahaha
ZUCK: yea so if you ever need info about anyone at harvard
ZUCK: just ask
ZUCK: i have over 4000 emails, pictures, addresses, sns
FRIEND: what!? how’d you manage that one?
ZUCK: people just submitted it
ZUCK: i don’t know why
ZUCK: they “trust me”
ZUCK: dumb fucks
Actual quote, BTW [1].
[1] https://www.newyorker.com/magazine/2010/09/20/the-face-of-fa...
> There needs to be total transparency to people when this is happening
This is why WE have the GDPR. To outlaw and prevent exploitation such as this.
Agreed. I'm confused trying to map what the article is saying to what's happening at a technical level. For example, obviously the glasses aren't doing on-device inference, so it's unsurprising that they won't work without a network connection, but that is totally distinct from your recordings ending up getting labeled. The article talks about being able to opt into labeling, which is one thing. But I still don't understand whether, if you don't opt in, the data gets sent out for labeling anyway.
I feel like this article is either a bombshell, or totally confused.