A Day In The Metaverse


Most of us start the day by opening our eyes - and looking immediately at our smartphones. Smartphones have become an extension of who we are. They hold our relationships and remind us of our cousins’ birthdays and grandparents’ anniversaries. They ping us with tasks to accomplish each day. They keep track of our fitness with prompts to get in those daily steps. 

What if interacting with our digital assistants and virtual life could be more streamlined, productive and even fun? For many tasks, smartphones are bulky. They don’t allow us to work with both hands. They’re awkward as mirrors to the augmented world. AR glasses offer a different solution, one where the digital and physical come together seamlessly.

These glasses would be our gateway into the virtual world, often called the metaverse. Also referred to as the AR cloud, spatial internet, spatial web, mirror world and the Magicverse (which we explored in a recent article), the metaverse is the new platform that comes after the smartphone. Just as when we use our mobile phones or search for something on the web, most people won’t think about all the processes happening behind the scenes. 

For instance, when we turn on the lights or open the refrigerator, we know there’s electricity powering them, but we don’t think about how it works. In the same way, artificial intelligence, sensors, and robot-to-device communication will be the “electricity” that runs the back end of the metaverse. The physical world we live in becomes machine-readable, clickable, and searchable. There will be new interfaces and new ways to navigate and create content. We will create new lexicons and even new architectures for our digital selves and the digital world around us. 
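To make that idea a little more concrete, here is a loose sketch of what a “searchable” physical world might look like to software: a spatial index of machine-readable anchors that can be queried like any other data. Everything below - the class, the sample anchors, and the query - is invented for illustration and is not a real metaverse or AR cloud API.

```python
# Hypothetical sketch: modeling a "searchable" physical world as a spatial index.
# The SpatialAnchor class, sample data, and query are illustrative assumptions,
# not an existing metaverse or AR-cloud API.
from dataclasses import dataclass


@dataclass
class SpatialAnchor:
    label: str   # machine-readable tag attached to a real-world object
    lat: float
    lon: float
    tags: set    # semantic metadata supplied by sensors and AI


# A tiny in-memory "AR cloud" of recognized objects.
ar_cloud = [
    SpatialAnchor("coffee shop door", 33.7490, -84.3880, {"entrance", "retail"}),
    SpatialAnchor("bus stop 14", 33.7495, -84.3872, {"transit"}),
    SpatialAnchor("office lobby", 33.7501, -84.3865, {"entrance", "workplace"}),
]


def search(index, required_tags):
    """Return anchors whose metadata contains every requested tag."""
    return [anchor for anchor in index if required_tags <= anchor.tags]


# "Clicking" on the world becomes a query over its metadata.
for anchor in search(ar_cloud, {"entrance"}):
    print(anchor.label)  # -> coffee shop door, office lobby
```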

This new convergence of our digital and physical realities will run on 5G and 6G, quantum computing, and graphene-based CPUs. And the way we first see this world will be through our glasses.

Even as jobs change and our concept of work evolves, a large part of that world will be devoted to work. The metaverse will free us from having to work in a single physical place; much of our work will happen through it instead. AR glasses and the metaverse will be built together over time, and in the near future AR glasses will have a plethora of uses in the workplace. 

They can be used as security badges, they will remember coworkers’ names, departments, and past projects worked on together, and they can act as a remote window into global offices, manufacturing facilities, and packaging plants. Glasses have the potential to replace cell phones and laptops via augmented reality. Digital images, haptic feedback, biometric sensors, and audio cues are all possible features. Computer vision will be a critical component. 

AR glasses can help people new to a job start working on the first day by showing them the processes and flows. They can assist those who might not be able to work without technical assistance. AR glasses are for more than social face filters. In a world where “remote” is quickly becoming the norm, AR glasses can turn our homes into the workplace, our workplaces into playgrounds, and our lives into creative expressions of who we are. 

Here’s an exploration of what a day in the metaverse could be like if we enter it wearing AR glasses that are sleek and useful. Some call this the Ray-Ban moment. 

Imagine A Day With AR Glasses

Meet Katie Wu. The year is 2028 and she lives in Atlanta, Georgia.

A gentle alarm from Katie’s AR glasses wakes her up in the morning and she puts them on for the day. The glasses look perfectly normal by today’s standards, but the frames and temples contain powerful invisible technology. Realizing Katie is awake, the glasses connect to her cloud services via a now mature 6G network, gathering all the information she will need for the upcoming day.

As Katie gazes out the window, sipping her morning coffee, the neighboring buildings fade away, replaced by unicorns grazing in a green field. The sounds of nature echo through the glasses’ built-in spatial audio, which realistically responds to her head movements. The unicorns are part of a brain-training routine and guide Katie through a meditation that centers her and starts her day on a positive note.

Katie starts off her day by looking for the perfect spot for her next family vacation. She wants a house on a cliff that overlooks the ocean, with a playground, a dog park, and a tennis court nearby. Since the world is now searchable, she’s able to find every place in the world that meets her requirements. She finds the perfect house and her glasses go into full immersive mode. For a few minutes her room becomes the outside of the house, and she can see how it overlooks the ocean. It’s exactly what she was looking for. Katie books it for a whole week, then asks her virtual assistant to add all the rental information to her family’s shared calendar. 
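Behind the scenes, a request like Katie’s could be little more than a multi-criteria filter over an index of real-world places. The snippet below is a hypothetical illustration - the listing fields, sample data, and criteria are invented for this article and are not drawn from any real search or booking service.

```python
# Hypothetical sketch of Katie's vacation search as a filter over indexed listings.
# The listing structure and sample data are invented for illustration only.
listings = [
    {"name": "Cliffside Cottage", "on_cliff": True, "ocean_view": True,
     "nearby": {"playground", "dog park", "tennis court"}},
    {"name": "Forest Cabin", "on_cliff": False, "ocean_view": False,
     "nearby": {"hiking trail"}},
]

required_nearby = {"playground", "dog park", "tennis court"}

# Keep only homes that sit on a cliff, overlook the ocean, and have every
# required amenity nearby.
matches = [
    home for home in listings
    if home["on_cliff"] and home["ocean_view"] and required_nearby <= home["nearby"]
]

for home in matches:
    print(home["name"])  # -> Cliffside Cottage
```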

The unicorns fade, replaced by the work calendar for the day. Katie has a few minutes before her first meeting. While she brushes her teeth, Katie’s toothbrush measures her plaque levels, takes her temperature, and sends the information to her glasses. The glasses automatically schedule a cleaning with her dentist, and her temperature is sent to her employer so they know she’s in the clear to come into the office later that day. 

Katie sees her prescription bottle highlighted (it only lights up after breakfast, since the medication is supposed to be taken on a full stomach). She takes her medicine, then heads to her closet, which is outfitted with mirrors. Looking inside, Katie swipes through various outfit combinations her glasses show her, some of which include clothing from her fashion subscriptions like Rent the Runway. Katie calls in her AI stylist, Dana, to help her select an outfit. Dana, a virtual human who works for several sustainable designers, shows Katie what she thinks would be a good fit for the day. Thanks to Dana’s help, Katie picks an outfit, then takes out her clothes as the glasses highlight where they are hanging in the closet. 

The glasses, using behavior technology, start playing coffee shop background audio. It’s time to get to work. 

Katie is a manufacturing technologist. She uses her AR glasses to design training modules for operators on the line. Her kitchen transforms into her client’s assembly line. She spends the morning transforming the client’s mix of paper and digital training and repair manuals into AR assembly instructions.

Katie drags and drops arrows, buttons, and navigation across her field of view, which will become real-time instructions for operators. At one point Katie gets stuck. One of the machines she needs to create work instructions for isn’t running. “Hey, glasses, see who is available to help with this.” 

Using the context of what Katie is working on and the status of her colleagues, the glasses decide who to contact. The hologram of her associate, Mike, is added to her augmented field of view inside the plant. Mike says, “Good morning Katie, how can I help?” Katie responds, “Hi Mike, can you run this machine for me so I can create the work instructions?” Mike says, “Sure thing.” 

Katie watches him move his arms, which remotely run the workstation. Katie flips up the arrows and navigation for how to operate the workstation. “Thanks, Mike!” Katie and Mike’s augmented holograms high-five, sending a vibration to the smartwatch on Katie’s wrist. 

At lunchtime, Katie decides to try a new restaurant. 

“Take me to lunch,” Katie says. 

“There are three restaurants within walking distance that you haven’t visited,” the AR glasses respond. Bubbles display in Katie’s field of vision where the restaurants are located, with dotted lines marking the walking path and distance. Katie decides on one, and the menu appears before her. “Number 19 Shawarma Wrap.” The order is placed.

At the restaurant, Katie picks up her sandwich and confirms payment through her AR glasses. A digital receipt displays and folds itself into a digital expense tracker. Before leaving the restaurant, her glasses remind her that her niece is having a birthday soon and that she should order her a gift. 

The glasses recommend a brand, but Katie knows her niece is really interested in a new Roblox game and wants new skins for her avatar and some game-related swag. Katie asks the glasses if there are any special discount codes she can use to save money on the gift. The glasses tell her there are discounts if she uses the new digital currency launched for a new version of the game. With a literal blink of her eye, her glasses read her intent and she purchases the skin, which will be sent to her niece digitally on her birthday, along with a box of swag that will arrive via drone delivery during the birthday party.

Katie takes her sandwich and heads to the park across the street to visit her friend’s recently uploaded art exhibit. The park bench Katie sits on turns into a bridge. Under Katie’s feet a river rushes by; digital stone fish, like those from a fountain, jump upstream, digitally splashing Katie’s lap. Katie gets a call from her boss, who digitally appears on the bench next to her. “We have a new client. Can you come into the office this afternoon?” Katie says yes. The hologram disappears. 

When Katie gets to the office, her glasses connect with the building’s cloud services to securely verify her identity and open the door. A map of the office building appears, showing her the conference room where her boss is located, and visual cues navigate her there. When she arrives, a digital layout of the new client’s plant displays on the conference room table. Tiny holographic operators and engineers move within the plant.

After the virtual tour, Katie gets an audio reminder that the building where she works is holding an emergency drill in case the city gets flooded again, as flooding has become a growing concern for residents. Sure enough, an alarm goes off and she gets a visual prompt to follow the escape route that her glasses outline. Katie follows the navigation to the staircase that leads her up and out of the building onto the flying carport.

Katie and her coworkers gather, and the building’s AI facility manager, James, another virtual human, lets everyone know that in an emergency like the one they are practicing for, they should all take the stairs to the top level, where flying autonomous rescue vehicles will airlift everyone to a safety zone. James also shows them a 1:1 virtual map of all the areas of the building that were flooded the last time it happened. After the drill is done, Katie realizes her workday is over. 

Katie heads back downstairs. She stops in the bathroom and uses her AR glasses to change her digital look from business professional to “happy hour hip”. She blinks through a few outfits before she’s satisfied with her digital avatar’s look. Katie leaves the office building and jumps into an autonomous taxi that takes her home. 

On the way home, she gets an audio message from her sister. Katie’s sister, Manuela, is the CMO of a new metaverse social marketing firm. Manuela just won a big contract to help one of the top virtual fashion brands with their latest digital human campaign. Her sister has been at the forefront of brand activations in the metaverse for several years now and is a leading voice in the metamarketing world. 

She was one of the first people to realize that direct-to-consumer was morphing into direct-to-avatar. Katie can hear the excitement in Manuela’s voice, but decides she’ll call her tomorrow, as she knows Drew is proposing to Manuela tonight when they meet in virtual Venice for a gondola ride and a virtual performance from Andrea Bocelli’s AI. 

When Katie gets home she relaxes for a bit by taking a stroll through her augmented feed. Clips from her friends’ day play on either side. A LIVE sign hangs above one group of friends, and Katie virtually walks over to join the party. She’s so glad she changed her digital outfit before leaving work. Ever since Katie and her friends graduated college, they have moved all over the country, but that doesn’t stop them from having get-togethers in the metaverse. Her friend’s playlist appears as a DJ spinning tracks in the corner, and the room she’s in turns into a virtual club where Katie can see her friends’ avatars and holograms. They dance and chat. It’s nice to enjoy time together even when they are far away. It feels just like old times, when they were all in the same physical place. 

During the party, Katie gets a volumetric call from her boyfriend Marco, an up-and-coming virtual artist. Marco is in Costa Rica creating the first part of his next big installation: a persistent space composed of a digital river that connects different parts of the world through portals. 

For the first part, Marco is working deep inside the rainforest. He called to give Katie a sneak peek at the way his installation will connect the rainforest portal to the undersea portal in Belize, and finally to the space portal on the moon base, which he will work on in a few months. Marco will use neural interfaces so people who visit the virtual installation can expand their limbs into wings, creating an experience where both the interface and the interaction with the art respond to what each person imagines, making it unique and personalized. 

After virtually walking through the rainforest with Marco, they give each other a kiss goodnight. The nanobots in Katie’s chapstick tingle, making her feel for a moment as if she were with Marco under the Costa Rican stars. 

As Katie winds down for the night, her glasses monitor her brain wave patterns and provide a custom guided meditation to set her up for a restful deep sleep. When she takes off her glasses for the night, they connect with her smart home system to turn on night security mode, slowly dimming the lights and providing tranquil background sounds. As her AR glasses charge wirelessly, Katie falls asleep, subconsciously preparing her mind for the day ahead. 

Do You See The Possibilities?

Former Apple CEO John Sculley once said, “The future belongs to those who see the possibilities before they become obvious.” Do you see the possibilities?

Augmented reality glasses have the potential to do more than overlay directions and show us digital calendars. They have the potential to revolutionize the way we work, socialize, and take care of ourselves. AR glasses are a portal to the next wave of computing - spatial computing - where digital avatars, digital representations of ourselves, and the physical world intermingle to form a new reality. It’s already happening with mixed reality and spatial computing headsets, social media AR filters and lenses, and Google Maps Live View. One thing the pandemic has shown us is that our lives are becoming more digitized and virtualized.

Being able to access the virtual world via glasses will also allow companies to reimagine their products in new ways, and even to reimagine who they are as a brand. It will let them accelerate business processes as well, trying out new approaches or even new product lines on a virtual twin to see what efficiencies can be created before deploying them. It will also allow researchers and medical professionals to use exact virtual replicas powered by AI to explore new treatments without risking the patient.
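One way to picture the virtual twin idea is to simulate a proposed change against a simple model of a production line before touching real equipment. The sketch below is deliberately simplified and hypothetical - the bottleneck throughput model and the numbers are invented for illustration, not taken from any real plant or digital twin platform.

```python
# Hypothetical digital-twin sketch: compare the throughput of the current line
# layout against a proposed change before deploying it. The model and numbers
# are invented for illustration.
def units_per_shift(station_times_sec, shift_hours=8):
    """Throughput is limited by the slowest station (the bottleneck)."""
    bottleneck = max(station_times_sec)
    return int(shift_hours * 3600 / bottleneck)


current_layout = [42, 55, 48]    # seconds per unit at each station today
proposed_layout = [42, 44, 48]   # proposed tooling change speeds up station 2

print("Current units per shift: ", units_per_shift(current_layout))   # 523
print("Proposed units per shift:", units_per_shift(proposed_layout))  # 600
```

Running the what-if on the twin first makes the trade-off visible in numbers before anyone touches the physical line.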

Many wonder when we will reach that Ray-Ban moment, or what its impact on humanity will be. In a recent video interview, Nicolas Berggruen of the Berggruen Institute shared his philosophical perspective on being human in the digital/virtual age and the convergence that is upon us.

AR glasses might not solve all our problems or turn out to be exactly how people work and conduct business. But imagining the possibilities is the first step to innovating the future.

As British science fiction writer Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.”

Written with insight from Fifer Garbesi, Lily Snyder, Rob Crasco, Marcus Endicott, Taron Lizagub, Ryan Gill, and Linda Ricci. 


