This is probably a moment very similar to the one we lived through with the arrival of the first iPhone. I am referring to the launch of the Apple Vision Pro (for now, only in the United States), which is driving the appearance of new applications that take advantage of the hardware capabilities of Apple's new device.
One of the most curious, and the one generating the most buzz, is Magic Room: a native visionOS app that makes visible the so-called "Mesh Dynamics" of the Apple Vision Pro's LiDAR sensor. Essentially, it is a "mapping" of reality as a polygonal mesh composed of thousands of points sampled by this type of sensor.
This polygonal mesh can capture the depth, position and size of each element in real time (since hundreds of thousands of samples are produced per second), allowing visionOS to know where each element is relative to the user.
LiDAR technology in the Apple Vision Pro
LiDAR technology has been in iPhones since the iPhone 12 Pro, and it allows the device to "know" what reality looks like in order to make decisions about depth of field, photography, mixed reality applications and other tasks that require this kind of distance calculation. On iPhones it is mainly used to improve photographs in low-light conditions, since this type of sensor does not need ambient light to work.
LiDAR is an active remote sensing system: it emits infrared laser pulses and measures the return time to the sensor for each point it maps. It does this in real time, at around 120,000 samples per second, so we can move through a real environment wearing the Apple Vision Pro and the system will detect our surroundings and their volume.
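The time-of-flight principle described above is simple to express in code. This is a minimal sketch (not Apple's actual implementation, and the function name is my own): the distance to a point is the speed of light multiplied by the pulse's round-trip time, divided by two, since the pulse travels to the surface and back.

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # speed of light in a vacuum, metres per second

def distance_from_return_time(round_trip_s: float) -> float:
    """Distance to a mapped point, in metres, given the pulse's round-trip time.

    The pulse covers the distance twice (out and back), hence the division by 2.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

# A surface 3 metres away reflects the pulse back in roughly 20 nanoseconds:
round_trip = 2 * 3.0 / SPEED_OF_LIGHT_M_S
print(f"{distance_from_return_time(round_trip):.2f} m")  # → 3.00 m
```

At those nanosecond timescales, and at 120,000 samples per second, it becomes clear why dedicated hardware is needed: the timing precision required is far beyond what ordinary application code can measure.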
What Magic Room does is precisely "reveal this magic," as its creators put it. When the application starts, this mesh of points connected into polygons, just as visionOS is recognizing it through the LiDAR, suddenly appears in front of us.
I tried the example you see in the image above on the terrace at home. The reference photo was taken during the day, but I ran the test with the app at night to show that even without light, the LiDAR works perfectly and is capable of "tracing" reality with astonishing detail and speed.
It is not only curious, and spectacular when you try it with Apple's headset, it is also an interesting way to check the device's power to interpret reality. In the application, you can even adjust the rendering speed and a few other special tricks.
Something that surprises you the first time you try it is the impression that you can "see through things." Not really: Magic Room is constantly mapping reality, even when the mesh visualization is not active. If we have moved through the environment previously, the spatial point data is stored.
At home, for example, it happens to me with the room where the washing machine is: the app understands that it is behind a wall thanks to the information it already has from previous sampling, and renders it on the model from our point of view. This interpretation of the polygonal mesh almost fools us into believing we can "see through the wall."
Matrix in your living room
One of the most curious and spectacular uses is activating the "Digital Rain" visualization, in which our house magically becomes an interpretation of the Matrix code, moving and flowing around us. As we move around the room, new areas are revealed in real time, without any latency or delay.
You can also change the font size of the flowing code, and the changes are applied instantly in the experience. We can also blur the real environment or blend it with the interactive experience to give it an even more cinematic feel.
The images or videos cannot convey the sensation of having it not only in front of you but enveloping you. It is a very curious immersive sensation, one that also shows off the technical capabilities of the Apple Vision Pro, capabilities that games like the great Super Fruit Ninja already use to "map" the room and know where the furniture in our living room is, so that fruit hits, stains or slides (for example) down the back of our sofa. The future of these apps, more polished, immersive and consistent with the reality surrounding the user, is just around the corner.
In Applesfera | Apple Vision Pro, first impressions: after trying it for 24 hours, I can say it is the most amazing thing Apple has done in recent years