Can virtual reality fix the workplace?

Not long ago, I decided to try writing an article in a virtual world. It was not the first time I'd had this idea. In the spring of 2016, a student in the computer science department at Georgetown University set up an HTC Vive virtual reality platform in a conference room and offered to give demonstrations. I volunteered and was impressed with the experience. He took me into the laboratory of a mad scientist, cluttered with dazzling equipment and gadgets. I crouched down, looked under a desk, and inspected the pipes running from a sink to the wall. The next demo featured an underwater world. At one point, a whale swam overhead. I remember being surprised, when I looked up, to see it so close and so seemingly large – my first compelling moment of virtual presence.

The timing of this demo was fortuitous. Earlier that year, I had published a book, “Deep Work,” which was a cross between a manifesto and an instruction manual on the importance of distraction-free concentration. During this period, I was thinking a lot about ways to improve concentration, which explains why, shortly after my experience with the Vive, I wrote a speculative essay on how virtual reality could aid creativity: “Imagine, for example, that when it comes time to . . . tackle a new chapter of your sci-fi novel, you could stand in a quiet room of a space station with a rotating view of the glittering galaxy outside your window.” An engaging virtual environment, I argued, would help us resist “the addictive lure of inboxes and feeds” and potentially unlock “massive amounts of deep work-fueled productivity.” I even gave this concept a suitably techno-optimistic label: immersive monotasking.

My enthusiasm was high, but my options for taking action were limited. The system the student demonstrated was expensive and required the virtual reality headset to be tethered to a powerful computer. The setup was also complicated: the student had to place and calibrate infrared sensors around the room. As a young professor with young children at home, I lacked both time and discretionary income, and it didn’t seem practical to pursue experiments in virtual productivity.

Then the technology got better. Last May I wrote an article for The New Yorker on the power of new environments to improve concentration. I reported that Peter Benchley had escaped the distractions of his shed in Pennington, New Jersey, to work on “Jaws” instead in the back office of a nearby furnace store, and that Maya Angelou would retreat to hotel rooms, where she would have the artwork removed from the walls. Describing these examples of analog immersion made me reconsider the potential of digital tools to create the same kind of productive cocoon. A bit of Googling revealed that in the half-decade since I wrote on this topic, virtual reality systems have become significantly cheaper and more powerful. For less than three hundred dollars, you can now purchase an Oculus Quest 2, a fully self-contained headset that can be used right out of the box. Moreover, I was clearly not the only one thinking of applying virtual reality to the realm of work. The Oculus App Store now has an entire section dedicated to productivity. It was finally within my reach to test whether immersive monotasking had potential. So, a few weeks ago, I bought an Oculus, downloaded a popular productivity app called Immersed, put on the headset, and got to work.

When you launch the Immersed app, you are taken to one of many available virtual rooms. For my experiment, I chose a gable-roofed lodge with exposed beams and views of wooded hills on all sides. The space is furnished with a combination of sofas and wooden tables, which face rectangular fireplaces that crackle as you approach. The feeling of immersion provided by the headset is striking. The room is rendered in a wide stereoscopic 3-D field of view that gives a convincing impression that some objects are farther away than others. As you move your head, the view shifts seamlessly to follow. From a technological standpoint, these effects are hard-won. When I gaze at the hills beyond my virtual lodge, I’m actually looking at an LCD screen, roughly the size of a standard smartphone, placed a few inches from my eyes. A pair of hybrid Fresnel lenses bend the light rays coming from the screen into parallel angles, reducing eye strain and prompting my brain to perceive the light as coming from farther away. A collection of four outward-facing sensors on the outside of the headset continuously maps the room to help calculate exactly where my head is positioned in space. This information feeds a powerful Qualcomm chip, known as the Snapdragon XR2, which renders the scene seventy-two times per second, adjusting the view presented to each eye to create simulated stereo vision. All of this complexity has to fit together perfectly so that I forget, if only for a few minutes, that I’m sitting on a worn-out chair in my office, next to a potted plant that needs watering and a desk cluttered with papers.

The main feature of Immersed is the ability to replicate your personal computer’s screens in the virtual environment. Using handheld controllers, you can reach out and grab a screen, move it to another location, and stretch it to any size you want. For my experiment, I placed a screen mirroring the word processor open on my laptop above a virtual table and stretched it to the size of a large flat-screen TV. Now it was time to write. I bring my real laptop to my chair. In my headset, I see its screen hovering in front of me. A gentle rain begins to fall on the digital mountains. I take a moment to think of something appropriate to commemorate this first step into virtual productivity, eventually typing a single sentence: “As I type the first draft of this article, I am sitting in a room with a high ceiling.” The key word here is “eventually,” because the literal first letters I type are: “Vzzs. K]].”

With the headset on, I cannot see my keyboard, and my fingers are not aligned correctly. The Immersed app anticipates this problem and offers a clever solution: a mode in which you can teach the outward-facing sensors of your headset to recognize your hands and your real keyboard, rendering both inside the virtual world. As a newbie to this technology, however, I struggled with the steps required by the calibration routine and eventually gave up. The virtual keyboard wasn’t the only advanced feature I couldn’t master. Immersed offers extreme flexibility in configuring screens. You can create several different virtual monitors, and with deft use of your controllers, you can push, pull, extend, tilt, and rotate each surface into an exact position. The subtle controller movements this manipulation required eluded me. I ended up just shoving my main screen around until it landed in a reasonable position.

As I learned from talking to Renji Bijoy, the founder and CEO of the company that created Immersed, at least a third of the app’s thousands of monthly active users are software developers, and many more work in similar information-technology fields – a demographic that appreciates the kind of advanced features I struggled to deploy. As he explained, these power users especially value the ability to add five different virtual displays to their immersed environment, more than you would find in even the most over-the-top real-world office. In a demo video I stumbled across on YouTube, an Immersed user positions three large monitors in a semicircle around his seat, then adds a fourth above, tilted down from the ceiling, where he can see it when he leans his head back. Immersed not only makes such environments possible but also makes them portable. “Software engineers love to sit on their couch, on their porch, or in their hotel, and have all of their screens with them,” Bijoy said.
