> The IR cameras can detect environmental image changes, facilitating a broader range of gestures to improve user interaction. For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."
I wonder if the signal could be integrated into AR glasses or a headset to provide a wider FOV to the wearer.
> For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."
Geez, if only the Apple Vision had some kind of gyroscope and accelerometer so it could detect head motion without relying on external hardware...
> It looks like these cameras are infrared and intended to see gestures from the wearer.
I had an AVP for a while; controlling it with just gestures was sweet. If they're looking to bring those kinds of gestures to other Apple devices via AirPods (i.e. beyond just bringing more gestures to the AirPods), I'm intrigued.
Remember the fuss people made about the Google Glass?
Turns out that people become okay with cameras every few inches observing every action as long as you say "It's for gestures.", even though the data stream will inevitably make it back to corpo for training (and voyeurism[0] by the other parties that'll be in the room).
Same excuse for the VR headsets. "Oh, it has a red LED that fires when recording!", meanwhile the thing has 30 other IR cams in non-stop loops consuming the environment from power-on till death.
[0]: https://www.reuters.com/article/world/uk/nsa-staff-used-spy-...
My experience working at Apple was that private information does not leave the device. All our training data came from contractors who were hired to perform data collection and not from the general public.
If Apple really walked the walk on its privacy marketing, it would sunset all the APIs used for tracking, making it impractically hard for advertisers to track you. Not a prompt asking "do you still want to be tracked?", but remaking the whole stack to make tracking unreasonably hard.
Currently I see Apple as safer than, say Google or Microsoft, but not as the privacy bastion it claims to be.
It's opt in, and the bolded option is "ask app not to track", so I'm really not sure what the issue is here.
It's clear it doesn't bother you, but I'll try to explain my position.
Years ago, Apple's Weather app sourced its data from The Weather Channel. That meant these three tracking options regarding your location:
- Always share - You get real-time weather alerts, very useful in some seasons
- Share while using - You get current weather, but lose real-time alerts
- Do not share - Might as well uninstall the app
Then Apple made Apple Weather, which collects weather data from multiple sources, and is supposedly safer to share real-time location with since Apple won't share it with anyone. Before this, The Weather Channel had the real-time location of millions worldwide, and all Apple had for privacy was that prompt.
This is the kind of stack reengineering I'm talking about, the kind that makes privacy a real proposition, but applied deeper so it really makes a difference.
Unless you're some sort of globetrotter going to a new city every week, the app is quite usable just by adding your city.
>Before this, The Weather Channel had the real-time location of millions worldwide
Are you sure Apple wasn't proxying the traffic through its servers?
edit: for instance, the Stocks app very prominently shows the data is from Yahoo, but if you check "most contacted domains" in App Privacy Report, they're all Apple domains. It doesn't contact Yahoo at all.
In fact, it could have been the case already and you would not have known it.
>You are one PRISM type request and one gag order from a silent update changing that.
Wouldn't the bigger issue be that they can abuse the same thing to grab the camera and/or microphone from your phone? Probably more useful than AirPods too, given that a phone's always on, unlike AirPods.
People commonly continue to wear their AirPods in changing rooms, because why not? Keep the tunes/podcast/etc. going while you get ready or even shower.
If they add cameras to them, regardless of the implementation, I'm pretty sure that's not only against every gym policy, but may be an actual criminal offense in certain states.
So I guess what I am saying is: This could be an anti-feature for certain people, or get people into trouble who continue to do a preexisting habit.
>If they add cameras to them, regardless of the implementation, I'm pretty sure that's not only against every gym policy, but may be an actual criminal offense in certain states.
Is having the camera on illegal in and of itself, or only when you're actively recording?
If you open the camera app but don't hit record, it's still "recording" in the sense that the image sensor is receiving light and sending over data, but it's not "recording" in the sense most people understand the word.
I think if people take out their phone and start pointing the camera at others in the changing room, most people would interpret that to not be ok and that's what we're discussing here.
You're essentially setting the scene so that people cannot know whether they're being recorded while a camera is always pointed at them. That's a problem, not least because law enforcement will need to investigate whether the law was broken on each complaint.
"But internal processing!" isn't quite the slam-dunk defense you seem to think it is. It won't work like that in the real world; people being recorded while changing won't care about that distinction.
>I think if people take out their phone and start pointing the camera at others in the changing room, most people would interpret that to not be ok and that's what we're discussing here.
Right, because there are very few plausible justifications for aiming a phone at someone in a changing room. The same doesn't apply to AirPods.
>You're essentially setting the scene so that people cannot know whether they're being recorded while a camera is always pointed at them. That's a problem, not least because law enforcement will need to investigate whether the law was broken on each complaint.
Is there any indication that you'll be able to get a video feed from AirPods? If not, and you basically have to jailbreak them to do surreptitious recordings, what makes them more or less notable than all the other spy cameras you can get today that are more or less tolerated by the authorities?
>"But internal processing!" isn't quite the slam-dunk defense you seem to think it is. It won't work like that in the real world; people being recorded while changing won't care about that distinction.
Do you think it's some outrageous breach of privacy that Face ID (and similar face-scanning technologies) is constantly on the lookout for faces, including in bathrooms?
It doesn't today because they don't have cameras on them, and it won't tomorrow if they do. People will definitely need to justify (maybe legally) why they're pointing the camera on their AirPods at people changing.
> Is there any indication that you'll be able to get a video feed from AirPods?
You're just repeating the same "internal processing" point you've made, and that I've already pointed out isn't a legal or practical difference in the real world.
Have you never used an IR camera? IR can see things visible-light cameras can't, including through layers of clothing, and can certainly capture a detailed image of a naked human.
So it's more of an issue in that case; and the laws I am talking about don't have an "IR camera" exception.
Mine are older and support Find My, but only when they’re out of the case. If I can’t find my case when they’re in it, I’m stuck. Does pro do anything for that?
The AirPods Pro 2, AirPods Pro 3, and AirPods 4 with ANC all have cases with speakers and separate Find My integration, which I believe is what you are looking for.
It’s the only reason I upgraded to ANC on the AirPods 4: I don’t really like ANC, but I want the smart case.
Also, the new model can recognize when you fall asleep and stop the media. I think it works, but I'm not sure how quickly it detects sleep.
(I don't really want to be wearing them while asleep, but my body sometimes has other plans.)
I wonder what this would do to battery life -- continuously-on IR cameras are going to be a significant power draw. And then there's the question of whether the video processing is done on the earbuds, or how much Bluetooth bandwidth is used sending the video stream to your phone for processing.
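A rough back-of-envelope sketch suggests why streaming raw frames to the phone seems unlikely. All the figures here (resolution, bit depth, frame rate) are assumptions for illustration, not known specs:

```python
# Back-of-envelope: raw bitrate of a hypothetical low-res IR gesture sensor.
# Every number below is an assumption for illustration, not an Apple spec.

width, height = 100, 100   # assumed tiny gesture-sensing resolution
bits_per_pixel = 8         # assumed 8-bit grayscale IR
fps = 30                   # assumed frame rate

raw_bps = width * height * bits_per_pixel * fps
print(f"raw stream: {raw_bps / 1e6:.1f} Mbit/s")  # 2.4 Mbit/s per earbud
```

Classic Bluetooth tops out around 2-3 Mbit/s in practice (BLE far less), and the link also has to carry audio, so on-earbud processing that sends only gesture events over the air seems the more plausible design.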
Using this to detect gestures does seem very cool, however. Seems like a fascinating engineering challenge.
Don't know, sounds totally useless, like most in-air gesture interfaces.
I still own a Google device with that tech on it (Home Display), and, yeah, it isn't useful. They just hide certain UI elements until your hand gets close, which is obnoxious and feels like they invented something and then invented a use for it to justify it.
UI should be consistent, it allows users to learn a muscle memory, this "hide stuff until you're 20cm away" stuff is the antithesis of that (and all good design in general).
This was a solved problem in the 1st and 2nd generation of AirPods with tap controls[1]. I'm still surprised that they removed that feature in favor of pressure, although now that I'm reflecting more on it, I wonder if it's part of Apple using their manufacturing and engineering as a moat[2]. i.e. Tap controls are relatively easy, so once wireless earbuds became commodities, they had to figure out some way to differentiate themselves.
That said, as someone who does pottery (messy hands), wears gloves/hats (stuff in the way), and has relatively poor fine motor control, I guess I welcome any solution that doesn't mean getting clay or cold air in my hair/ear.
The battery consumption and latency of the IR cameras will be interesting though. Too sensitive, and you'll eat up your battery. Not sensitive enough, and UX suffers.
1: https://support.apple.com/en-us/102628 2: https://news.ycombinator.com/item?id=45186975
https://japandaily.jp/why-you-cant-turn-off-the-camera-shutt...
> Japan’s requirement for an audible camera shutter sound isn’t just a quirky design decision — it’s a deliberate policy meant to prevent secret photography.
>I suspect it could be due to Japan requiring that cameras play a sound when taking a photo or video.
That depends on what "taking a photo or video" means. If it only covers making a recording, then it won't apply to AirPods. The same applies to Face ID, for instance; I doubt Japanese iPhones make a shutter sound every time you pull out your phone, even though it's obviously using the camera.
That law regulates captured images. It doesn't require continuous shutter sounds while the phone is processing and even displaying the camera input - only once an image is captured. It seems unlikely that the AirPods will let users capture the IR images used for gestural control and environmental awareness for system functions.
It is interesting to see camera-in-airpods as a rumor instead of wireless-radio-as-camera [0] to detect similar. Maybe it is less power/volume intensive to add very limited cameras instead of upping the processing power to run inference on the radio signals?
I suppose that the "camera" could be as simple as an optical flow sensor [1] commonly used on mice and quad-copters and placed behind the black plastic so there would not be a visible lens [2].
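As a toy illustration of what an optical-flow sensor computes (a naive sum-of-absolute-differences search, not how any shipping sensor is actually implemented):

```python
# Toy version of what an optical-flow sensor does in silicon: find the
# integer (dx, dy) shift that best aligns two consecutive low-res frames,
# here by minimizing the sum of absolute differences over the overlap.
# Real sensors search larger ranges and correct for border effects.

def estimate_shift(prev, curr, max_shift=1):
    """Return the (dx, dy) shift that best maps `prev` onto `curr`."""
    h, w = len(prev), len(prev[0])
    best_sad, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            sad = 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        sad += abs(prev[y][x] - curr[sy][sx])
            if best_sad is None or sad < best_sad:
                best_sad, best_shift = sad, (dx, dy)
    return best_shift

# A bright blob moving one pixel to the right between frames:
frame_a = [[0, 9, 0, 0],
           [0, 9, 0, 0],
           [0, 0, 0, 0]]
frame_b = [[0, 0, 9, 0],
           [0, 0, 9, 0],
           [0, 0, 0, 0]]
print(estimate_shift(frame_a, frame_b))  # (1, 0)
```

A sensor that only reports a displacement vector like this, with no readable image, would sidestep much of the recording debate above.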
Can we get a bloody better sound driver, protection from ANC feedback shriek, one or two physical buttons (or at least a raised haptic button, not just a touch-sensitive stem area), and a battery level indicator on the case?
We don't want no fucking infra cameras for "better hand gestures and enhanced spatial audio experience".
There was an old video about the new app for your BlackBerry that would show you what's on the other side of your BlackBerry. At the time we believed this to be a joke, rather than a product development brief.
I suspect most devices will have cameras and mics on them and will mesh connect as a collective system. OpenAI is most likely working on a suite of devices that would fit this "regalia" of sorts.
https://www.macrumors.com/2024/06/30/new-airpods-to-feature-...
Meta showed that it was possible to do that kind of hand tracking from cameras mounted on glasses.
However, the power required to do that is quite high (30-60 mW to detect, more to do the pose extraction).
So I suspect it's just hand recognition.
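Taking the 30-60 mW figure at face value, and assuming roughly 160 mWh of usable earbud battery (a rough guess; Apple doesn't publish this number), the arithmetic looks grim:

```python
# Hypothetical battery-life impact of always-on gesture detection.
# Both figures are assumptions for illustration, not measured AirPods specs.

battery_mwh = 160       # assumed earbud battery: ~43 mAh at 3.7 V
camera_draw_mw = 45     # midpoint of the 30-60 mW detection figure above

hours = battery_mwh / camera_draw_mw
print(f"~{hours:.1f} h of battery spent on detection alone")  # ~3.6 h
```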
Seems like a negative tradeoff.
0. https://web.ece.ucsb.edu/~ymostofi/WiFiReadingThroughWall
1. https://ieeexplore.ieee.org/document/10164626
2. https://www.schneier.com/blog/archives/2012/07/camera-transp...