

Metaverse vs. data privacy: A clash of the titans?


It may well be another “clash of the titans” when the upcoming metaverse – as we understand it now – meets data privacy. The metaverse wants to harvest new, uncharted personal information, even to the point of noting and analyzing where your eyes go on a screen and how long you gaze at certain products. Data privacy, on the other hand, wants to protect consumers from this incessant cherry-picking.

That, friends, is the coming battle over harvesting new personal preference data, and companies are already planning how to monetize this potential bonanza. One can bet that in the new online economy of the future, plenty of new startups will be lining up on both sides.

“When you talk about going into a virtual or an augmented reality (AR), it’s all about information as power,” said David Nuti, senior VP for North America Channel at Nord Security, an international online security provider. “They don’t create these platforms to feel good about bringing people together. They’re mining information that is sold off to serve you content that is relevant to what you like to do. 

“For example, if I’m in an augmented reality environment, a company may want to serve me an advertisement for a couch because they can see in my augmented environment that my couch is kind of ratty in the background. Through artificial intelligence, they’ll serve me up a color of a new couch that matches the paint on the wall of my house. If I serve up an advertisement, it’s no longer just knowing that I’m serving the advertisement to the person, but how long my eyeballs are focused on that content.”

Charting your eye movements on screen

The value of those few seconds on the screen to advertisers can’t be overstated. The question is: should companies be able to use the analytics of that long stare at the new phone you’re thinking about buying, or the new Peloton you’re admiring? If you contend this is an over-the-top invasion of personal space, get with it: that plane left the runway a long time ago. Ultimately, it will all depend on whether a user even wants to enter the metaverse in the first place.
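
To make concrete what that kind of analytics could involve, here is a deliberately simplified, hypothetical sketch of gaze-dwell scoring. Every name and threshold in it is invented for illustration and does not describe any real advertising platform.

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical illustration: one gaze sample records which item the user was
// looking at and for how long. Nothing here reflects a real ad platform.
struct GazeSample {
    std::string itemId;   // e.g., a product visible in the virtual scene
    double dwellSeconds;  // how long the eyes rested on it
};

// Total dwell time per item; items above a threshold get flagged as
// "high interest" and could, in principle, be sold on to advertisers.
std::vector<std::string> HighInterestItems(const std::vector<GazeSample>& samples,
                                           double thresholdSeconds = 3.0) {
    std::map<std::string, double> totals;
    for (const auto& s : samples) totals[s.itemId] += s.dwellSeconds;

    std::vector<std::string> flagged;
    for (const auto& [item, seconds] : totals)
        if (seconds >= thresholdSeconds) flagged.push_back(item);
    return flagged;
}
```

The privacy question in this article is precisely about aggregates like the one above: raw eye movements become durable inferences about what a person wants.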

Today, Jan. 28, is International Data Privacy Day, which aims to highlight exactly this coming struggle, if only for a single, admittedly arbitrary, 24-hour period. A recent study from NordVPN revealed that a whopping 87% of Americans have major privacy concerns about what happens if Facebook succeeds in creating its proposed metaverse. In fact, half of Americans fear it will be too easy for hackers to impersonate others in this brave and bold new world, putting personal information at constant risk.

This is mostly fear of the unknown at present: in the same survey, 55% of Americans hadn’t even heard of the metaverse, let alone knew what it entails. In fact, only 14% of those polled said they could explain the metaverse to someone else.

Let’s back up a little and define these terms. Metaverse is the term Meta (formerly Facebook) CEO Mark Zuckerberg foisted on the world last October. At a high level, it means the commingling of the real and digital worlds, to the point where it becomes difficult to tell reality from unreality. In this new setting, personal avatars are expected to multiply quickly.

Zuckerberg introduced the metaverse and even produced a video explaining what it will look like – a stunt that famously received mixed reviews. He called the metaverse “an embodied internet where you’re in the experience, not just looking at it.” Imagine that you could meet your friends from all over the world in virtual reality, discuss business with partners without leaving your office, or access fantasy worlds you’ve always dreamed about. That’s what Zuckerberg has in mind.

Advertisers and online merchants – not to mention Meta itself – have other ideas, however.

Some other data points from the NordVPN study:

  • 47% don’t trust that their identity will be legally protected    
  • 45% fear that even more data can be collected and used against them
  • 43% are concerned about not being sure of the identity of others
  • 41% think it will be hard to safeguard their real identity from their metaverse identity
  • 37% fear that their transactions won’t be very secure
  • Once the metaverse was described to respondents, 66% said they think the metaverse can replace social media as we currently know and use it.

What is biometrically inferred data?

Kavya Pearlman, founder of the XR Safety Initiative, a nonprofit that advocates for the ethical development of immersive technologies, told VentureBeat that “privacy is all about the data collection. Because there is this enormous amount of data [that will be harvested], you can’t have the convergence of these environments. That’s what I am most concerned about.

“This is now all about biometrically inferred data,” Pearlman said. “Our data privacy laws need to be updated because they are inadequate. This enormous eye-tracking, gait-tracking – the way you move, the way you walk – all this analysis can infer a lot of information about you. And then there are the intersections of these other technologies, such as a brain-computer interface that will provide the alpha, beta, gamma – and even your thoughts – at some point. What happens to privacy when our thoughts are not even protected?”

All this information – stacked in cloud storage and constantly being analyzed by multiple buyers – could give companies a greater ability to understand individual traits, Pearlman said. An insurance company, for example, might see a behavioral clue inferring a customer’s health problem before the person notices anything herself. “Now, the data is in inferences,” Pearlman said.

One thing all of our sources agree upon is that this is only the beginning of a new phase of commerce and socialization on the internet. As time and technology move on, it will become apparent how well data privacy policies, software, and hardware hold up. The other point everybody agrees on is that national, international and local laws and regulations will lag far behind the advancement of technology, as they have for decades.

Some other varying perspectives on the coming battle between data privacy and the metaverse:

Peter Evans, CEO of Patriot One Technologies: We don’t expect the issues of data privacy or security to go away with the metaverse. As an industry, we see repeated examples where technology gets way ahead of security, data privacy, and good governance … and the world’s zeal to play with new and interesting things and leverage them for business benefit, competitive advantage, and profit. 

All the issues that we’ve recently seen in the press about Facebook’s use of data to drive marketing and revenue are examples of a marketing opportunity getting ahead of good governance, security, and protection.

This has been going on for 20+ years, going back to the first introduction of the internet, online banking and ecommerce, AI and facial recognition, etc.

We see these issues repeating themselves over and over again, with governments and data privacy often lagging. By the time the world opens its eyes to the data privacy and data management issues, it’s too late, because the horse has left the barn. With each new iteration of innovation, we see an order-of-magnitude jump in both business benefits and the complexity of data privacy issues. I expect that we will see the same with the metaverse.

Ben Brook, CEO of Transcend, a data privacy software provider: In the beginning, the metaverse can actually be good for privacy because people can adopt anonymous avatars. But over time, as we spend more time in the metaverse and our avatar becomes a bigger portion of our life (in a sense, we become our avatars and we shop as it, we consume content as it, and we form relationships as it), then all the same privacy principles will apply.

It’s still too early to say what specific protections it will require as usage evolves, but the reality is we’re not starting from the most solid foundation. In many jurisdictions, consumers don’t yet have the protections they need for today, let alone for the metaverse and the myriad new ways their data may be used (and abused) tomorrow.

More data means advertisers have a substantially richer cupboard to mine for far deeper targeting, often using the same platforms that are speaking most loudly about the metaverse’s potential.

David Blonder, senior director, legal counsel, regulatory and privacy and data protection officer at BlackBerry: With the metaverse and the creation of a hybrid reality, it’s important to remember one simple truism: people will trade security for convenience. The metaverse will see considerably more user interaction than a cellphone. Therefore, it is not unreasonable to assume it would collect much more information and attract many more attackers as well. For security to succeed in the metaverse, it will have to be implemented in a way that is robust without negatively impacting user convenience.



Why graphic novels are lucrative IP for Web3: From MEFaverse to metaverse


Marvel’s multi-billion dollar IP enterprise is eating up the film and streaming market — but the metaverse is offering new opportunities and creating a whole new market.

Marvel is valued at nearly $6 billion for films alone, $40 billion for streaming and about $3 billion for consumer products, according to a 2021 Forbes analysis. While the media giant dominates the lion’s share of graphic novel IP in film and streaming, the metaverse offers new opportunities for graphic novel IP. The ‘metaverse in entertainment’ market is expected to grow to $28.92 billion by 2026.

The entertainment market is expanding with the creation of the metaverse, presenting opportunities to replicate the lucrative success that Marvel has enjoyed. But what made Marvel so popular, and why is the multiverse primed for the metaverse?

Since the inception of the metaverse as a concept, some of the earliest explorations have included the creation — and adaptation — of graphic novels for this new virtual environment: from Method Man’s comic book MEFaverse, to the adaptation of Dan LuVisi’s iconic Last Man Standing: Killbook of a Bounty Hunter, to Killtopia, which caters to Japan’s ‘otaku’ community of manga and anime fans.


But why is graphic novel IP so attractive to directors writing for a digital medium with interactive audiences? And what opportunities are potentially being left on the table? To understand the attraction of graphic novel IP, we only need to look at the formula of success that Marvel and DC have built. 

An ever-expanding world

Marvel’s IP is not one story, but a universe that continues to expand. Recent additions to Marvel’s onscreen world include She-Hulk: Attorney at Law, Ms. Marvel and the upcoming Secret Invasion. The stories that come to life in film and TV are often based on specific heroes within that universe — or, more aptly, the multiverse.

In film, appearance-altering costumes, special FX make-up and visual FX (VFX) enable directors to cast different actors to play the same character in the franchise. The most popular and talented actors, with the strongest following in the target demographic for the box office, can have their turn playing the hero. In fact, actors no longer need to sign long-haul multi-movie contracts with Marvel.

The metaverse offers even more creative diversity. Graphic novel characters can be customizable according to the themes of different concept artists, and the same character can travel through a manga world into one that’s photorealistic. Perhaps a good interpretation is Dr. Strange’s journey through the multiverses, as we see him enter a variety of differently stylized worlds until he eventually finds himself surreally realized as a colorful gelatinous shape. 

One of the key differentiators between a virtual world and a game within the metaverse — or what will be the metaverse — is this interoperability, the way in which an avatar could be used in different virtual worlds. The way avatars are translated stylistically in those different worlds is a key focus for metaverse builders. And it’s something Marvel has been doing well for some time. People love the graphic novel style of Marvel films and how they not only pay homage to the original art form but also amplify the movie experience with state-of-the-art VFX. 
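
To make the idea of interoperability a little more concrete, here is a purely hypothetical sketch of how an avatar could be represented as data: one stable identity and canonical rig, plus per-world style variants that each virtual world resolves for itself. None of the names reflect an existing metaverse platform or standard.

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch: one canonical avatar identity plus per-world
// style variants. Field names are illustrative, not any real platform's API.
struct StyleVariant {
    std::string meshUrl;      // world-specific mesh or model reference
    std::string textureUrl;   // restyled textures (e.g., manga vs. photoreal)
    float scale = 1.0f;       // how the avatar is sized in that world
};

struct AvatarManifest {
    std::string id;                                 // stable identity across worlds
    std::string canonicalRigUrl;                    // shared skeleton for animation retargeting
    std::vector<std::string> traits;                // e.g., the "powers" a character carries
    std::map<std::string, StyleVariant> perWorld;   // keyed by world name or ID
};

// Resolve how a given world should render this avatar, falling back to
// the canonical representation if no world-specific style exists.
StyleVariant ResolveStyle(const AvatarManifest& avatar, const std::string& world) {
    auto it = avatar.perWorld.find(world);
    if (it != avatar.perWorld.end()) return it->second;
    return StyleVariant{avatar.canonicalRigUrl, "", 1.0f};
}
```

The point of the sketch is simply that identity and traits travel with the avatar, while each world supplies its own stylistic translation.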

For example, LMS: Killbook of a Bounty Hunter is being translated for the metaverse after amassing a core fanbase. LMS is simultaneously a scrapbook-style graphic novel, a character bible for the anti-hero Gabriel and an introduction to the colorful yet deadly world of ‘New Amerika’. Initially released as a series of artworks, LMS soon gathered a solid fanbase that demanded more of Dan LuVisi’s world. The rights to LMS were bought by Section 9, which approached metaverse-as-a-service company Sequin AR with the idea of creating an LMS metaverse. With a rich world and a pre-existing community, Sequin believed LMS was the perfect property for a metaverse environment. 

The attractiveness of graphic novel IP

Sequin AR’s CEO Rob DeFranco explains why the graphic novel IP was so attractive: “The world that Dan created is vivid, imaginative, and full of pop-culture references with a sharp satirical tone that makes it a model property for the metaverse. There is a big community already in place for LMS. For example, a Comic-Con special edition toy of Gabriel, created by the popular brand Funko, sold out on the first day of the convention. Since the book first launched 10 years ago, there has been a cultural shift in how we interact with the properties we love.” 

Graphic novels rely on captivating imagery, along with compelling stories. The community building the metaverse is a blend of creatives, technologists and storytellers, similar to the teams that produce the Marvel universe. For example, the team behind Method Man’s MEFaverse includes Method Man himself, and renowned graphics artist Jonathan Winbush of Winbush Immersive, with Xsens motion tracking technology helping them translate real-life movement into the digital world. It’s no coincidence that Winbush built his own brand as a creator from his time working at Marvel. 

“The trajectory of the NFT/Web3 space as a whole, in my opinion, only has one direction to go: up,” says Method Man. “I see no reason why it wouldn’t, as brands and individuals realize the unique opportunities and potential this space offers, as well as the utility it provides. That said, my hope is that it can continue to grow while remaining mindful of values such as inclusivity and positivity, which are both pillars of the MEFaverse community.”

The metaverse and the story of good vs. evil 

The metaverse has the potential to be many things, good or bad. Most metaverse evangelists also acknowledge how human influence tends to invade — and sometimes spoil — the utopian promise of future technology.

For example, Aragorn Meulendijks, Chief Metaverse Officer (CMO) from Your Open Metaverse, a distributed metaverse for streaming Web3 content, recently shared his candid thoughts on Elaine Pringle Schwitter’s HeadsTalk Podcast. According to Meulendijks, the mission for those building the metaverse needs to align with the reality of flawed human nature. This sentiment is omnipresent in Marvel; the premise of superhero films is that good and evil always exist in tandem, and even heroes are flawed. 

While there are inevitable flaws, the multiverse can also be employed altruistically. Representation and connection are frequent themes in graphic novels, often speaking to those who don’t feel part of mainstream pop culture. This links back to Winbush’s work on the MEFaverse.

“We wanted to create more ‘metamasks’ or PFPs with different traits to represent our community,” he explained. “Method Man’s motivation in creating the MEFaverse was to show his fans their powers, the unique traits that make them who they are but in the superhero realm. Method Man wanted everyone that was excited about the MEFaverse to have a mask that truly represents them. He wanted his community to be shown their unique powers in a superhero realm.”

The building blocks of film production are being used to build the metaverse

The technology that underpins movie production is driving metaverse creation. For example, motion capture is harnessing and translating movement to avatars, while Unreal Engine is being used to create the worlds themselves.

Charles Borland, founder of real-time studio Voltaku explained: “When I was an actor in a video game called Grand Theft Auto IV, I would spend a lot of time in a mocap suit, and I’d been on a lot of TV and film shoots and saw just how inefficient the Hollywood production process is. I remember thinking, holy cow, when this technology and the economics get to a certain point, all of this gaming technology and real-time technology is going to revolutionize filmmaking and how you make content.” 

Talking about the use of technology in Killtopia, Charles elaborated: “If we’re going to build this in a game engine, like Unreal Engine, then we [had] to do things like set up a camera inside of Unreal. We knew we were going to have an actress and we were going to try and do this in real-time, but one of the things we were looking at was real-time ray tracing, and to push the envelope on that. We couldn’t go into the studio and do full camera tracking, so we wanted to find something inertia-based. Using the Xsens suit, capturing the raw mocap data, enabled us to create the avatars.”

From an investment standpoint, how Marvel’s magic formula for success translates to the metaverse is clear. But IP in the metaverse goes far beyond a franchise of characters. Fans build on these worlds themselves, becoming creators in their own right. In order to create, they need to feel invested, and that is where the technology underpinning interoperability is key.

Blockchain blockbusters

Killtopia’s Charles Borland explains: “To invest in interoperability, stakeholders and project owners need to know that the assets for whom they’re building aren’t going anywhere. Of course, that’s if by ‘decentralized,’ you mean you’re applying blockchain. What’s great about that is it’s immutable and it’s public. So I know if I build around a project, even if it tanks, my pipeline will stay. Because the things I’ve been referencing and looking at are going to stay online in this decentralized file hosting system, which is great.”

This is an example of how the technology used in metaverse creation is improving the entire production pipeline. Accelerating the content production workflow, and safeguarding the assets for future use, is a challenge even Marvel faces. 

Cultural shift between content creators and consumers

Borland highlights the cultural shift in how we interact with the properties we love. COVID-19 drove the rapid acceleration in digital experiences, helping us to forge genuine connections when real-life interaction wasn’t possible. The convergence of these behavioral changes and technology advancements is now paving the way for the future metaverse, with mixed reality live performances — which became more prevalent during the recent pandemic — offering a hint of what we might expect. 

Brett Ineson, founder of Animatrik Film Design, which has hosted mixed reality performances for Justin Bieber, Pentakill with Wave XR and even virtual circuses with Shocap Entertainment, says: “Nailing the look and feel of a world will be paramount to delivering the illusion of reality, and that’s where capture technology will come into play. Motion capture will be essential for creating lifelike animation for characters and creatures in these virtual worlds so that players feel like they are interacting with real beings.”

Technologists and storytellers are helping to unleash the potential of new IP into the metaverse. Right now, the reality is that the metaverse does not exist, but it represents the next step in immersive and engaging entertainment. The more engaged a community is, the more invested it is in the story. Powered by motion tracking, performance capture, interoperable avatars, virtual worlds and hip-hop artists-turned-superheroes, the metaverse is prime real estate for the next Marvel enterprise.

Rob DeFranco is CEO of Sequin AR.

Brett Ineson is cofounder of Animatrik Film Studios.

Remco Sikkema is senior marketing communications manager at Movella and Xsens.



Fortnite Chapter 4 debuts with Unreal Engine 5.1


Fortnite Battle Royale Chapter 4 arrived today, and it makes use of Unreal Engine 5.1, Epic Games announced.

The debut shows how tightly Epic Games ties its overall strategy together. Fortnite is the prime revenue generator for the company, reaching tens of millions of players who buy in-game items. And Unreal Engine is the game developer tool that makes the advances in Chapter 4 available. To sell developers on the engine, Epic eats its own dog food by building Fortnite with Unreal to showcase what it can do.

Unreal Engine 5.1 provides new features that make the game look and run better. Unreal Engine 5 itself debuted earlier this year, ushering in a generational leap in visual fidelity and bringing a new level of detail to game worlds like the Battle Royale Island.

Shadows and lighting are better in Fortnite with Unreal Engine 5.1.

The game now uses next-gen Unreal Engine 5 features such as Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution — features that can make Fortnite Battle Royale shine on next-generation systems such as PlayStation 5, Xbox Series X|S, PC, and cloud gaming.

Epic Games said that over half of all announced next-gen games are being created with Unreal Engine. And it said developers can now take advantage of updates to the Lumen dynamic global illumination and reflections system. This is important stuff if you’re a game developer, or you’re expecting to build the metaverse.

Epic has also made updates to the Nanite virtualized micropolygon geometry system and to virtual shadow maps, laying the groundwork for games and experiences running at 60 frames per second (fps) on next-gen consoles and capable PCs. These improvements will enable fast-paced competition and detailed simulations without latency, Epic said.

Nanite has also added a programmable rasterizer to allow for material-driven animations and deformations via world position offset, as well as opacity masks. This development paves the way for artists to use Nanite to program specific objects’ behavior — for example, Nanite-based foliage with leaves blowing in the wind.
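
For a rough sense of what a material-driven world position offset does, here is a generic, standalone sketch of a wind-sway offset. It is not Unreal’s material system or Epic’s Nanite rasterizer, just the shape of the per-vertex math such an effect typically uses.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Generic illustration of a wind-style world position offset: each vertex is
// pushed along the wind direction by a time-varying sine wave whose amplitude
// grows with the vertex's "flexibility" (e.g., distance from the trunk,
// typically baked into vertex data). Parameter names are invented.
Vec3 WindPositionOffset(const Vec3& vertexPos, float time,
                        const Vec3& windDir, float flexibility,
                        float speed = 2.0f, float amplitude = 0.15f) {
    // Vary the phase across the mesh so leaves don't all move in lockstep.
    float phase = vertexPos.x * 0.7f + vertexPos.y * 1.3f + vertexPos.z * 0.5f;
    float sway  = std::sin(time * speed + phase) * amplitude * flexibility;
    return { windDir.x * sway, windDir.y * sway, windDir.z * sway };
}
```

In an engine like Unreal this sort of logic normally lives in the material graph rather than in C++; the snippet only illustrates the idea of displacing vertices procedurally instead of animating the underlying geometry.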

Nanite provides highly detailed architectural geometry. Buildings are rendered from millions of polygons in real time, and each brick, stone, wood plank, and wall trim is modeled. Natural landscapes are highly detailed too: individual trees have around 300,000 polygons, and each stone, flower, and blade of grass is modeled.

On top of that, Lumen reflections provide high-quality ray traced reflections on glossy materials and water.

Water and shadows look prettier in Fortnite Battle Royale Chapter 4.

Lumen also provides real-time global illumination at 60 fps. You’ll see beautiful interior spaces with bounce lighting, plus characters reacting to the lighting of their surroundings. (For example, red rugs may bounce red light onto your outfit.) Outfits that have emissive (that is, glowing) qualities will scatter light on nearby objects and surfaces.

Virtual Shadow Maps allow for highly detailed shadowing. Each brick, leaf, and modeled detail will cast a shadow, and character self-shadowing is extremely accurate. This means that things like hats and other small details on characters will also cast shadows.

Temporal Super Resolution is an upgrade over Temporal Anti-Aliasing in Fortnite, and allows for high-quality visuals at a high framerate.
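
As a simplified, hypothetical illustration of the general idea behind temporal upscaling, the sketch below blends each upsampled low-resolution frame into an accumulated full-resolution history buffer. Epic’s actual TSR is far more sophisticated, with motion-vector reprojection, history rejection and sub-pixel jitter, none of which appear here.

```cpp
#include <cstddef>
#include <vector>

// Grayscale image as a flat float buffer (illustrative only).
struct Image {
    int w = 0, h = 0;
    std::vector<float> px;
    float at(int x, int y) const { return px[static_cast<size_t>(y) * w + x]; }
};

// Nearest-neighbour upsample of a low-resolution frame to the output size.
Image Upsample(const Image& lowRes, int outW, int outH) {
    Image out{outW, outH, std::vector<float>(static_cast<size_t>(outW) * outH)};
    for (int y = 0; y < outH; ++y)
        for (int x = 0; x < outW; ++x)
            out.px[static_cast<size_t>(y) * outW + x] =
                lowRes.at(x * lowRes.w / outW, y * lowRes.h / outH);
    return out;
}

// Exponential blend of the upsampled current frame into accumulated history.
// A small alpha lets detail build up over many frames; a production upscaler
// would also reproject history with motion vectors and reject stale samples.
void TemporalAccumulate(Image& history, const Image& lowResFrame, float alpha = 0.1f) {
    Image current = Upsample(lowResFrame, history.w, history.h);
    for (size_t i = 0; i < history.px.size(); ++i)
        history.px[i] = (1.0f - alpha) * history.px[i] + alpha * current.px[i];
}
```

Because detail accumulates across frames, the output can approach native-resolution quality while each individual frame is rendered at a fraction of the cost, which is how this family of techniques buys back framerate.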

With the introduction of these UE5 features in Fortnite Battle Royale, Fortnite’s video settings have changed on PC.

To run Nanite, the minimum hardware requirement is an Nvidia Maxwell-generation card or newer, or an AMD GCN-generation card or newer.

For Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution to be available in Fortnite on your PlayStation 5 or Xbox Series X|S, make sure the “120 FPS Mode” setting (in the “graphics” section of the Video settings) is set to off.

Unreal’s reach has grown well beyond games. Unreal Engine has now been used on over 425 film and TV productions, and is integrated into over 300 virtual production stages worldwide. Unreal Engine usage in animation has grown exponentially, from 15 productions between 2015 and 2019 to over 160 productions from 2020 to 2022.



Is It Time To Talk About A More Sustainable Approach To Serving Our Customers?


At a recent event, I spoke to a Chief Technology Officer (CTO) who told me it was not unusual for him to have a day of 14 back-to-back half-hour meetings. He explained that this started during the early part of the pandemic, and that by 4 pm he was absolutely exhausted and struggled to stay focused and pay attention. He added, however, that over time he got used to such a heavy schedule and was able to manage his energy and concentration better.

On hearing this story, I commented that while I hear stories like this from all sorts of executives at different firms, I am often left wondering how folks end up doing any work if they are in back-to-back meetings all day.

I asked, slightly tongue-in-cheek, how we had gotten to this point, given that I’d never seen a job description that contained any objective requiring a person to attend as many meetings as physically possible.

This raised a few smiles and quite a few nods.

Whilst my comment was playful, it also contained a serious point, one that I have made to many executives: they should actively manage their time to create the space necessary to really think about and understand the challenges they are facing.

I was thinking about that conversation again the other day when I came across some research from Microsoft about the impact on our brains and emotional state when we have back-to-back meetings.

Using an electroencephalography (EEG) cap, the Microsoft research team was able to monitor the electrical activity in the brains of back-to-back meeting participants. Unsurprisingly, they found that back-to-back virtual meetings are stressful, and that a series of meetings can decrease your ability to focus and engage.

However, the research also found that introducing short breaks between meetings to allow people to move, stretch, gather their thoughts or grab a glass of water can help reduce the cumulative buildup of stress across a series of meetings.

That’s really useful insight, and I hope that more executives and their teams embrace the introduction of these short breaks between meetings to reduce stress, support well-being and maintain attention levels.

But I’ve also been thinking about whether these research findings have a broader application.

Specifically, I’ve been thinking about whether the calls taken by customer service agents could be analogous to a series of very short, back-to-back meetings. If they are, that has ramifications for the amount of stress customer service representatives have to deal with. This is brought into sharp focus when you consider that the average customer service representative is often expected to be constantly on calls for the duration of an 8-hour shift, apart from a 30-minute lunch break and two 15-minute breaks, one in the morning and one in the afternoon.

So, is it any wonder that the contact center industry faces perennial burnout and high levels of staff churn?

If we want to build a more sustainable approach to serving our customers, particularly over live channels like the phone or video, we need to think more clearly and empathetically about our agents and what they go through.

Now, I know that technology is evolving to help with this challenge and that’s great. But we shouldn’t stop there. Building a more attractive and sustainable contact center model will require us to rethink both contact center operations and their economics.

