Apple Glass can help you "see" in the dark - here's how

We've seen some unusual patents on Apple Glass ideas over the past few months, from navigating through subtle changes in music to bone-conduction audio technology, but this latest one may be the strangest yet. Apparently, Apple is thinking of an innovative way to make vision clearer in low-light conditions by wearing Apple Glass.

No, this is not some kind of night vision plug-in. Rather, the new patent explores how Apple Glass could use a variety of sensors to measure the world around its owner and give him or her a better understanding of what is nearby.

The patent, titled "Head-Mounted Display With Low Light Operation," begins by outlining the problem. Photopic vision, the state in which the eye functions best, occurs "in high levels of ambient light, such as daylight." In dimmer conditions, mesopic or scotopic vision "can cause loss of color vision, altered sensitivity to different wavelengths of light, reduced visual acuity, and motion blur."

This is a rather roundabout way of stating what most people already know: humans have trouble seeing in the dark. Here is the interesting part: sensors in head-mounted displays (HMDs) can detect the wearer's surroundings and provide feedback in the form of graphic content.

"Depth sensors detect the environment and, in particular, the depth (e.g., distance) from it to objects in the environment," the patent explains.

"The depth sensor typically includes an illuminator and a detector. The illuminator emits electromagnetic radiation (e.g., infrared light) into the environment. The detector observes the electromagnetic radiation reflected from objects in the environment."

There is no set type of depth sensor, and Apple cites several examples. One uses time-of-flight (ToF), where the time it takes for emitted light to reach an object and reflect back to the device provides an approximation of depth.
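The time-of-flight principle is simple enough to sketch. This is an illustrative example only, not anything from Apple's patent: the function name and the sample timing are assumptions.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_seconds: float) -> float:
    """Estimate distance from the round-trip time of an emitted pulse.

    The pulse travels out to the object and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection that returns after roughly 13.34 nanoseconds implies an
# object about 2 metres away.
print(round(tof_depth(13.34e-9), 2))  # → 2.0
```

The tiny timescales involved are why ToF hardware needs very fast detectors: a centimetre of depth corresponds to tens of picoseconds of delay.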

Apple also states that RADAR and LiDAR, which just debuted on the iPhone 12 Pro, could be used. The company writes, "It should be noted that one or more types of depth sensors may be utilized, for example, incorporating one or more of the structured light sensor, time-of-flight camera, RADAR sensor, and/or LIDAR sensor."
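The patent says several sensor types may be combined, but does not spell out how. One common approach, sketched here purely as an assumption, is an inverse-variance weighted average, which lets a precise sensor (say, LiDAR) dominate a noisier one (say, RADAR) without discarding it.

```python
def fuse_depths(estimates):
    """Combine depth readings from multiple sensors.

    estimates: list of (depth_metres, variance) pairs, one per sensor.
    Each reading is weighted by the inverse of its variance, so more
    precise sensors contribute more to the fused estimate.
    """
    weights = [1.0 / var for _, var in estimates]
    weighted_sum = sum(w * d for w, (d, _) in zip(weights, estimates))
    return weighted_sum / sum(weights)

# Hypothetical readings of the same object: a precise LiDAR reading
# (variance 0.01) and a noisier RADAR reading (variance 0.25).
readings = [(2.00, 0.01), (2.30, 0.25)]
print(round(fuse_depths(readings), 2))  # → 2.01
```

Note how the fused value sits much closer to the low-variance LiDAR reading, which is the behaviour you would want from a multi-sensor HMD.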

However it is done, the ultimate goal is to give the wearer details of the environment that cannot be detected by the eye: "The [HMD's] controller determines the graphic content in response to the sensing of the environment by one or more of the infrared, depth, and ultrasonic sensors, and manipulates the display to provide the graphic content simultaneously with the sensing of the environment."
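The patent leaves open what that graphic content would look like. As a hedged sketch of the controller behaviour quoted above, one could imagine mapping each sensed distance to a simple on-display cue; the function name and thresholds below are invented for illustration, not Apple's implementation.

```python
def proximity_cue(depth_metres: float) -> str:
    """Map a sensed distance to a hypothetical on-display cue."""
    if depth_metres < 0.5:
        return "warning: obstacle"   # object dangerously close
    if depth_metres < 2.0:
        return "outline object"      # highlight nearby geometry
    return "no overlay"              # far enough to ignore

for d in (0.3, 1.2, 5.0):
    print(d, proximity_cue(d))
```

In practice the display would render outlines or highlights rather than strings, but the shape of the logic — sensed depth in, graphic decision out, continuously as the sensors run — matches the patent's description.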

Sounds a bit sci-fi, but certainly intriguing. Not all of Apple's patents ever see the light of day, but applications like this one provide interesting insight into the thinking in Cupertino and the problems the company believes it must overcome.

Even if this patent were to be commercialized, it would not happen for some time. Apple Glass is not expected until next spring at the earliest, and more likely not until 2023. There is plenty of time for this technology to mature a bit more.
