I Want My Daughter to Live in a Better Metaverse

It’s a cool summer evening in 2030, and my 16-year-old daughter and I are walking down our street, stargazing with augmented reality glasses. Above us the majestic night sky is clear, superimposed with information about distant stars that are configured into constellations like Pegasus, whose legend I use to teach lessons about life. It’s a beautiful moment.

Further along, we pass a wooden fence tagged with a litany of swear words and slurs. I see the graffiti through my glasses, but my daughter, whose glasses are set to filter out inappropriate content, does not. Nor does she understand the cause of agitation written on the faces of those nearby.

I’m excited about the first of these possibilities, but I worry about the second. While I appreciate being able to shield my teenager from unseemly content, I also understand the importance of having meaningful conversations about why certain words and actions can harm others. Those conversations can’t happen if children never encounter them in the first place.

We continue walking and see a young homeless man panhandling in front of a store. Here, the role of parental controls is murkier. Inadvertently or by design, an algorithm classifies his prone posture on the sidewalk, his ragged attire, and a note asking for money as inappropriate for children, and renders his appearance and surroundings as something more benign. Algorithms that alter and architect our experience of the physical world may seem far-fetched, but for years we have been learning how similar algorithmic biases already redact what we are shown online.

What would prompt my daughter to ask questions about important societal concerns like homelessness and empathize with those experiencing it if, in her world, she never sees it? What if others, who prefer an “idealized” world, also choose these settings in their AR glasses? How can we have meaningful conversations on how to address these challenges if large portions of the population are oblivious to the true conditions of our community?

We’re closer to dealing with these kinds of moral issues than you might think. Facebook now plans to pursue Mark Zuckerberg’s vision of transforming from a social media company into a “metaverse company,” and we’ve already seen a glimpse of what that could look like in Workrooms, which brings a sense of presence and a handful of gestures to virtual meetings. Media fragmentation and echo chambers have already shattered our common reality. If left unchecked, the metaverse may only make things worse. It won’t be long until each of us can live in an entire world tailored to our own personality, interests, and tastes, which may further erode our shared experiences and make it harder for us to meaningfully connect.

Collective experiences are essential to our ability to bond and cooperate. Much of the division we see today is a product of our splintering digital realities. When we’re not encountering the same problems, it’s hard to come together to develop solutions and empathize with others. Filter bubbles are ultimately an empathy problem.

The reality is, we’re already living in a near-infinite number of realities online. Moments after we begin browsing, our web experiences diverge. We each see wildly different things based on who we are, where we live, and what content we consume. The things we like get resurfaced again and again in different forms, each new iteration more enticing than the last. Eventually, our life online is entirely our own, which can lead to selective, self-reinforcing worldviews, and thereby alternate realities.

Not only do many (if not most) of us still struggle to differentiate between what’s real and what’s fake, we often don’t realize that these experiences are heavily influenced by outside actors with an agenda, whether it’s as banal as selling a new product or as sinister as shaping political beliefs and sowing discord. The metaverse will apply that dynamic to real-world interactions.

Time and again, companies develop new technologies without considering the possibility of adversaries. We’ve seen this with baby monitors, AI, and of course social media platforms. The metaverse isn’t immune. It’s not difficult to imagine nefarious actors injecting extremist or toxic content directly into metaverse experiences.

You can get a sense of what these attacks will look like from the digital worlds that already exist. We’ve seen bad actors on Roblox create fascist digital experiences that attack people of a certain race or belief. The metaverse equivalent could involve prejudiced actors architecting users’ experiences to exclude, bully, humiliate, or corrupt: staging subtle microaggressions, deploying synthetic people who repeatedly ignore a target, or introducing reminders of embarrassing experiences, or even trauma like a vicious dog attack, over and over.

Due to the nature of the metaverse, it’s possible that the people who experience these attacks will do so completely alone, and they’ll now be “living” the experience rather than passively consuming it. Distinguishing between what’s real and what’s fake will become incredibly challenging. What is fake in a world where most things are synthetic? How can people make informed decisions when the lines between real and fake are so heavily blurred? Once in-person interactions move entirely online, it becomes ever more difficult to break out of the kinds of radicalization and disinformation ecosystems that have become so common in recent years.

I remain excited to see the dream of the metaverse become real. Doing so the right way will require building the right tools and security frameworks to ensure the metaverse is both safe and equitable. While the research here is still nascent, there are a few immediate approaches we can take.

One is to borrow something my research team has done in the privacy space. To help people see the alternate online realities experienced by others, we developed a tool that spins up thousands of alternate browsers, creating a different persona in each instance. By aggregating the differences across these personas, we help users see how the web is constantly being morphed around them. Simply calling out the narrowness of their experience was enough to change their cognitive filtering.

In the metaverse, we could accomplish something similar by introducing a cue that actively notifies people when they’re experiencing something radically different from what others are experiencing. For example, if my daughter sees a fashion billboard that’s been edited to change the body types of the models, she and I could both see a label that informs us we’re seeing different things and explains why it was changed for us. This empowers people to share in the experiences of others. Instead of giving people rose-colored glasses that show only a curated world, we could give them x-ray vision when they need to see together, which in turn helps build empathy and connectedness.
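
As a rough sketch of how such a cue might work, and purely under assumed names and field values (nothing here describes a real platform API), a client could diff its personalized scene description against a shared baseline and surface a label for every field that was altered:

```python
# Hypothetical divergence cue: compare a user's personalized scene
# description against a shared baseline and label every altered field.
# BASELINE and the field names are invented for illustration.

BASELINE = {"billboard_models": "original", "graffiti": "visible"}

def divergence_cue(user_view: dict) -> list[str]:
    """Return one human-readable label per field that the user's
    personalized rendering changed from the shared baseline."""
    labels = []
    for field, shared_value in BASELINE.items():
        personal_value = user_view.get(field, shared_value)
        if personal_value != shared_value:
            labels.append(
                f"You are seeing '{field}' as '{personal_value}'; "
                f"others see '{shared_value}'."
            )
    return labels
```

A view that matches the baseline produces no labels, while an edited billboard surfaces a single notice saying what was changed, which is the transparency the cue is meant to provide.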

Another approach would require more research to execute: creating a true metaverse “public square.” Here, rather than divide everyone based on individual tastes, we could create a single shared space that is provably the same for everyone. Within this space, when two people take a picture of something, the pictures would capture the same moment: no filters. This concept could take many forms, but ultimately there would be reality trust anchors that bring us together. Trust anchors, which can include the underlying platform itself, are needed to align people’s realities and verify that certain information is correct or consistent, much as app store platforms are trusted to accurately display app permissions today. You could use trust anchors to make sure the person you’re talking to is over 18, that your friend is seeing the same vista as you, and that it really is your friend and not an AI. This would allow everyone to witness the same reality.
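
One minimal way to picture a reality trust anchor, assuming a platform-held signing key and a canonical scene snapshot (both invented here for illustration, not a description of any real system), is a keyed hash that two clients can compare:

```python
# Hypothetical reality trust anchor: the platform tags a canonical
# snapshot of the shared scene, and any two clients can check that
# they are witnessing the same moment. Key and scene are invented.
import hashlib
import hmac
import json

PLATFORM_KEY = b"demo-platform-key"  # assumption: held only by the trusted platform

def anchor_tag(scene: dict) -> str:
    """Produce a tamper-evident tag over a canonical encoding of the scene."""
    canonical = json.dumps(scene, sort_keys=True).encode("utf-8")
    return hmac.new(PLATFORM_KEY, canonical, hashlib.sha256).hexdigest()

def same_reality(tag_a: str, tag_b: str) -> bool:
    """Two clients share a reality only if their scene tags match."""
    return hmac.compare_digest(tag_a, tag_b)
```

If my friend’s vista and mine produce the same tag, no filter has rewritten either view; a mismatch is the cue that one of us is seeing something curated.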

With these new types of cybersafety frameworks, I’m cautiously optimistic about the metaverse and its promises to reshape the fabric of our society. I long for the day when I can skip over to the Olympic Games with my daughter and then run to an “in-person” meeting a few minutes later. However, we have all seen firsthand what happens when we don’t build protections into our technologies, platforms, and devices, and we don’t want to make that same mistake when the devices are us.


More Great WIRED Stories
