Google has decided to expand testing of its innovative Project Starline video-calling system beyond the company's own staff to partners including Salesforce, WeWork, T-Mobile and Hackensack Meridian Health.
Project Starline was first shown to the public at Google I/O last year, but against the backdrop of higher-profile announcements about Wear OS and Android, the initiative did not receive much attention. Google has not abandoned it, however, and has now decided to expand testing by bringing in outside participants. The company also noted that it is working to make the system more accessible, which makes sense given the amount of expensive equipment it requires.
Project Starline is a video-calling system that aims to transmit image and sound as realistically as possible. The engineers set themselves the task of creating the feeling that the conversation partner is sitting right across from the user. Achieving this required a complex design housed in a booth larger than 2 x 2 m. Google stresses that it remains a research project.
On the screen side of the booth there are 14 cameras and 16 infrared projectors, which together capture and track a three-dimensional image of the user in real time. Audio is handled by four microphones and two speakers: they do not simply reproduce speech but apply spatial audio algorithms, creating the impression that the voice comes directly from the mouth of the person on the screen. There is a curious nuance here: in an ordinary video call, the camera above the display prevents genuine eye contact, because the person looks either at the camera or at the screen. Project Starline corrects the position of the partner's three-dimensional image, so eye contact looks quite convincing.
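The article does not describe Google's actual audio pipeline, but the general idea of anchoring a voice to a point on the screen can be illustrated with two standard spatialization cues: constant-power stereo panning and interaural time difference. The function name and geometry constants below are illustrative assumptions, not Starline's algorithm.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, in air at room temperature
EAR_SPACING = 0.21       # approximate distance between a listener's ears, m

def pan_for_mouth(x_norm):
    """Return (left_gain, right_gain, itd_seconds) for a voice whose
    on-screen mouth position is x_norm in [-1, 1] (left to right).
    Uses constant-power panning plus a simple interaural time
    difference; a textbook sketch, not Google's implementation."""
    # Map screen position to a panning angle between -45 and +45 degrees.
    theta = x_norm * math.pi / 4
    left = math.cos(theta + math.pi / 4)    # falls as the source moves right
    right = math.sin(theta + math.pi / 4)   # rises as the source moves right
    # Sound reaches the nearer ear first; delay the farther channel by itd.
    itd = EAR_SPACING * math.sin(theta) / SPEED_OF_SOUND
    return left, right, itd
```

With constant-power panning the squared gains always sum to one, so perceived loudness stays constant as the speaker's head moves across the screen; the sub-millisecond time offset is what makes the voice seem to originate at the mouth rather than at the speakers.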
Data processing is performed by a powerful workstation with two Intel Xeon processors and four NVIDIA graphics cards: a pair of Quadro RTX 6000s and a pair of Titan RTXs. The image is shown on a 65-inch autostereoscopic display with 8K resolution at 60 Hz, which produces a life-size three-dimensional image of the conversation partner that can be viewed without glasses. In effect, it works like a scaled-up version of the Nintendo 3DS display, but without that screen's viewing restrictions, because the system tracks the position of the user's head.
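Head tracking matters because the display must re-render the scene from wherever the viewer's eyes actually are, a technique generally known as head-coupled perspective. A minimal sketch of the underlying geometry, with hypothetical names and a flat screen at z = 0 (an illustration, not Google's renderer):

```python
def project_to_screen(point, eye, screen_z=0.0):
    """Project a 3D point onto the screen plane (z = screen_z) along the
    ray from the tracked eye position to the point. The point is assumed
    to sit 'behind' the screen (z > screen_z), the eye in front of it.
    point, eye: (x, y, z) tuples; returns the 2D screen coordinates."""
    px, py, pz = point
    ex, ey, ez = eye
    # Parameter t where the eye->point ray crosses the screen plane.
    t = (screen_z - ez) / (pz - ez)
    return (ex + t * (px - ex), ey + t * (py - ey))
```

As the tracked head moves, each point's screen projection shifts by an amount that depends on its depth, which is exactly the motion parallax that makes the image read as a solid person rather than a flat picture.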
Google simplified its implementation somewhat by seating the user on a rigidly fixed bench and placing a small barrier in front of the screen, so that the lower part of the partner's body appears to continue under the display. The system also uses a dedicated set of lights, supplemented by infrared illumination, for optimal capture of 3D textures. The result is certainly impressive: people who have tried video calls on Project Starline say the sense of presence is genuine. The volume of positive feedback should soon grow: the company has agreed to partner with Salesforce, WeWork, T-Mobile and Hackensack Meridian Health, which represent different industries. Until now, only Google's own employees had used the system.