Apple is working on AirPods with a camera, and on a camera-equipped Apple Watch as well. However, the cameras on these devices aren’t designed for video calls or photos. They could arrive sometime in 2027, according to Bloomberg. Below, we’ll tell you everything you need to know about the topic.
The “Nevis” project and the “Glennie” project
Apple has developed a chip internally codenamed “Nevis,” which is designed to power a camera in the Apple Watch. Meanwhile, its “Glennie” project covers a chip meant to bring a camera to the AirPods.
Both chips are expected to be ready around 2027, and if they’re finished early that year, we could see an official launch by the end of 2027.
Well-known analyst Ming-Chi Kuo believes Apple wants to incorporate infrared cameras into its AirPods to provide an improved spatial audio experience when paired with the Apple Vision Pro.
The AirPods’ camera sensors could also enable new health features, such as body temperature measurement.
Additionally, the cameras on these devices could offer air gesture controls, with motion sensors capable of identifying and interpreting different hand positions.
For his part, Mark Gurman believes Apple is considering cameras in the AirPods and Apple Watch as a way of feeding data to Apple Intelligence’s AI features.
Another option, at least on the Apple Watch, would be a camera near the screen or the Digital Crown that activates Apple’s Visual Intelligence feature. The service is already available on the iPhone 15 Pro models and the iPhone 16 line, and it lets you run internet searches based on what the camera sees, such as finding a book or an item of clothing you want to buy.
So, it seems Apple wants to include cameras in its wearables, but not for what we were all hoping for. We won’t be able to FaceTime from these devices, but we’ll be able to use them for much more.