Macs may be able to turn a wall or any surface into a touch input device in the future, with Apple researching ways to add touch-based interactions to a surface that doesn’t have any capability for touch sensing at all.
Many electronic devices, such as the iPhone and iPad, offer touchscreens as a way to interact with the computer system. While touch interfaces are common, there are areas where such interactions could be useful but are currently impractical.
For example, a teacher in a classroom may use a projector on a wall to show content to students. At present, the teacher can tap virtual buttons or interact with the projected content only by working at the host computer directly, or by employing a potentially expensive smartboard system that can detect touch.
In a patent granted to Apple on Tuesday by the US Patent and Trademark Office, titled “Self-mixing based 2D/3D user input detection and scanning laser system,” Apple suggests such interactions could be performed on a wall or any other projection surface, without requiring any form of touch-sensing hardware in the surface itself. In theory, it would allow interactions with a projection on a plain brick wall.
In simple terms, the patent revolves around transmitting light and detecting any light reflected back to the device, whether from the projection surface itself or from an object obscuring the path to it. For the purposes of the patent, this obscuring object, such as a stylus or the user’s hand, is what the system needs to track.
To accomplish this, the light used for detection could come from a laser diode, which Apple suggests may be a vertical-cavity surface-emitting laser (VCSEL), the same type of emitter used for Face ID. Using a laser to project the image makes sense, as laser light is more uniform than that of typical bulb-based projectors and stands the best chance of being reflected directly back to the device.
The reflection is where the meat of the patent lies, in that the reflected light is collected by a “self-mixing interferometry sensor.” Interferometry refers to techniques for measuring distance and other quantities by mixing together multiple sources of light to create an interference pattern, which can then be analyzed for changes in displacement. In self-mixing interferometry specifically, light reflected back into the laser’s own cavity interferes with the light being emitted.
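To make the measurement concrete, here is a minimal illustrative sketch, assuming a frequency-modulated laser and FMCW-style ranging rather than the patent’s exact signal processing: with a triangular sweep of the laser’s optical frequency, the beat frequency of the self-mixing interference signal is proportional to the target’s distance. All names and parameter values here are assumptions for illustration.

```python
# Illustrative sketch (not from the patent): estimating target distance from
# the beat frequency of a self-mixing interference signal under triangular
# frequency modulation, as in FMCW-style laser ranging.

C = 299_792_458.0  # speed of light, m/s

def smi_distance(f_beat_hz: float, f_mod_hz: float, delta_f_hz: float) -> float:
    """Distance to the reflecting target.

    With triangular modulation at rate f_mod and peak-to-peak optical
    frequency excursion delta_f, the sweep rate is 2 * delta_f * f_mod.
    The round-trip delay 2*d/c frequency-shifts the fed-back light,
    producing a beat at f_beat = sweep_rate * 2*d/c, so:
        d = c * f_beat / (4 * delta_f * f_mod)
    """
    return C * f_beat_hz / (4.0 * delta_f_hz * f_mod_hz)

# Example: a 1 kHz triangular sweep over a 100 GHz excursion; a beat of
# about 2.67 MHz corresponds to a target roughly 2 m away.
print(smi_distance(f_beat_hz=2.668e6, f_mod_hz=1e3, delta_f_hz=100e9))
```

The same readout can also recover velocity from the Doppler shift between the up- and down-sweep halves, which is one way such a sensor could distinguish a moving finger from a static surface.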
By using interferometry, the system would be able to determine not only the distance traveled by light to reach a projection surface, but also how far away any obscuring object is from that same surface. By tracking the position and the distance from the projection surface, the system can try to work out which objects are moving with the intention of interacting with the projected display.
This can include movements of an object that could be interpreted as a gesture, such as a user’s finger moving as if to touch the surface, or a non-touch gesture performed in the air.
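The logic described above can be sketched as a simple comparison between the calibrated distance to the surface and the measured distance to the obscuring object at a given scan position. The thresholds and labels here are hypothetical, not values from the patent:

```python
# Hypothetical sketch of inferring touches once the system knows both the
# calibrated distance to the projection surface and the measured distance
# to an obscuring object (finger or stylus) at one scanned position.

TOUCH_THRESHOLD_M = 0.005   # object within 5 mm of the surface counts as a touch
HOVER_THRESHOLD_M = 0.05    # within 5 cm counts as a hover/air gesture

def classify(surface_dist_m: float, object_dist_m: float) -> str:
    """Classify a single distance measurement at one scan position."""
    gap = surface_dist_m - object_dist_m  # how far the object sits in front of the surface
    if gap < 0:
        return "no-object"       # measured distance reaches the surface itself
    if gap <= TOUCH_THRESHOLD_M:
        return "touch"
    if gap <= HOVER_THRESHOLD_M:
        return "hover"
    return "no-interaction"

print(classify(2.000, 1.998))  # finger 2 mm off the wall -> "touch"
print(classify(2.000, 1.970))  # 3 cm away -> "hover"
```

Tracking how these per-position labels change over time is what would let a system turn raw distances into taps, drags, or mid-air gestures.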
In theory, such a system would be easily usable on practically any surface, given a brief period for the system to calibrate the projected screen to the new environment. It may not necessarily even require a flat surface to work, and could feasibly work on curved or uneven surfaces if sufficient distance data points are collected.
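One way the calibration step might work, purely as an assumption for illustration: the scanner sweeps the empty surface and records a depth value per scan position, so later measurements can be compared against the map even when the surface is curved or uneven.

```python
# Illustrative calibration sketch (an assumption, not the patent's method):
# reduce repeated distance samples at each (row, col) scan cell to a single
# calibrated depth, using the median to resist outlier readings.

from statistics import median

def build_depth_map(
    samples_per_cell: dict[tuple[int, int], list[float]]
) -> dict[tuple[int, int], float]:
    """Collapse raw distance samples into one calibrated depth per cell."""
    return {cell: median(vals) for cell, vals in samples_per_cell.items()}

# Example: a tiny 1x2 scan grid over an uneven surface.
raw = {(0, 0): [2.01, 2.00, 2.02], (0, 1): [1.85, 1.86, 1.84]}
depth_map = build_depth_map(raw)
print(depth_map[(0, 0)])  # -> 2.01
```

A per-cell map like this is what would make the approach workable on non-flat surfaces: each position is compared against its own baseline rather than a single global distance.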
The patent lists its inventors as Mehmet Mutlu and Ahmet Fatih Cihan, and was originally filed on May 9, 2019.
Apple files numerous patent applications on a weekly basis, but while the existence of a filing suggests areas of interest for Apple’s research and development efforts, it doesn’t guarantee the ideas will appear in a future product or service.
The concept of projecting a display onto a surface the user can interact with, without relying on touch-sensitive components, has been around for a while, and is possibly best known from projection keyboards.
Typically projecting red light onto a flat surface, these accessories can detect key “presses” when a user taps onto a surface that the projector is shining onto. Such devices have been a curiosity for around two decades, but largely haven’t been adopted by users for reasons including a lack of accuracy and speed compared to a physical keyboard.
These keyboards also typically rely on an invisible infrared light pattern emitted across the same surface, which is broken when the user taps a projected key and detected by an image sensor in the device. While superficially similar, this isn’t the same technique as the patent proposes: in Apple’s system, the projected light itself is used for distance measurement, rather than a secondary invisible light pattern.
Apple has previously applied for patents relating to projections, including one from October where it proposed projecting onto objects, such as a car or a building, to create a form of real-world AR.
Meanwhile, one patent from 2013 had Apple envisioning a desk setup that used a projector instead of an LCD display. The “desk-free computer” did away with a notebook or typical desktop computer, and even eliminated wires by using inductive charging.