
Startups

Metalenz unveils PolarEyes sensors with structures 1,000 times smaller than a human hair


Metalenz unveiled PolarEyes, a new kind of sensor that uses polarization technology to bring a new form of sensing to camera-equipped devices.

The sensors can detect air quality, or help manage your healthcare from a smartphone based on your vitals. The Harvard University-born meta-optics company also brings polarized lenses to consumer and mobile devices for improved privacy and security features.

The full-stack, system-level solution combines physics and optics, software and hardware, to power everything from next-generation smartphones and consumer electronics, to new healthcare and automotive applications.

Metalenz PolarEyes collects the polarized light information that traditional cameras discard and parses that information to better interpret the world around us.


“We’re just really excited now to see exactly how we can impact all of these industries,” said Robert Devlin, CEO of Metalenz, in an interview with VentureBeat. “The thing that we have set out to do is bring an entirely new form of sensing and put it into everyone’s pocket.”

We last wrote about them in February 2021, when they raised $10 million to create 3D sensors on a chip with structures that are 1,000 times smaller than a human hair.

That investment helped Metalenz scale production and accelerate the development of miniature optics on a chip technology and its new lens that could power the next generation of sensors for use in smartphones and other consumer, health care, and automotive applications.

For consumers, the higher-quality lenses mean more powerful phone capabilities that can help them snap more professional-looking pictures — even in the most challenging environments — while promoting longer battery life. Right now, smartphone makers such as Apple are including multiple cameras in smartphones to render 3D imagery.

Devlin said the company has generated its first revenue and its first product will be out in the second quarter of 2022.

“It’s a 3D-sensing product and it’ll be in consumer devices starting in the second quarter of this year,” Devlin said. “We have had a chance to see the manufacturing process proven. We can use it to bring entirely new forms of sensing to the consumer form factor and price point.”

The last big advance in facial sensing was when Apple came out with face recognition for its smartphones. Metalenz provides a big advance in optics.

“With polarization, you can actually understand something about what the materials you’re looking at are made up of, and so you can understand something about the underlying structure of these materials,” Devlin said. “You can understand whether something is man-made versus natural object versus transparent or opaque object. So it gives you a whole extra set of information that allows your cameras and your machine vision systems really to make a lot better decisions.”
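The article doesn’t spell out the math, but polarization imaging typically works by sampling intensity behind linear polarizers at four angles (0°, 45°, 90°, and 135°) and computing the Stokes parameters, from which the degree and angle of linear polarization follow. A minimal sketch of that standard computation (illustrative only, not Metalenz’s actual pipeline):

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Compute the linear Stokes parameters from four polarizer-angle intensities."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical component
    s2 = i45 - i135                      # diagonal components
    return s0, s1, s2

def dolp(i0, i45, i90, i135):
    """Degree of linear polarization in [0, 1]."""
    s0, s1, s2 = linear_stokes(i0, i45, i90, i135)
    return np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)

def aolp(i0, i45, i90, i135):
    """Angle of linear polarization in radians, related to surface orientation."""
    _, s1, s2 = linear_stokes(i0, i45, i90, i135)
    return 0.5 * np.arctan2(s2, s1)
```

Glossy man-made surfaces tend to reflect strongly polarized light (high DoLP) while diffuse natural materials depolarize it, which is one way per-pixel DoLP and AoLP maps can help a vision system distinguish materials.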

Origins

Metalenz aims to make camera lenses and sensors smaller and more capable. The sensors are based on meta-optic technology pioneered at Harvard University’s John A. Paulson School of Engineering and Applied Sciences (SEAS). Metalenz was cofounded in 2017 by Devlin and professor of applied physics Federico Capasso.

With a decade of research and 15 issued patents on the concept, the Boston company said it has an intellectual property head start.

Metalenz said it is already engaged with a number of the world’s largest manufacturers in the consumer electronics and automotive spaces, from original equipment manufacturers (OEMs) to subsystem makers.

“We’ve really been focusing on getting that first product into market,” said Devlin. “We have now hit a lot of our big milestones as a startup.”

Despite the improvements in cameras, lens technology has remained relatively unchanged for hundreds of years. The emerging field of meta-optics, or optical metasurfaces, centers on engineered materials with patterned structures that are 1,000 times smaller than a human hair and specifically designed to exploit properties that cannot be obtained from bulk natural materials.

Meta-optics use planar surfaces consisting of sub-wavelength structures with a uniform height of several hundred nanometers. These structures act as waveguides to manipulate light and provide a degree of control not possible with refractive lenses. A single thin meta-optic can outperform a stack of refractive lenses, offering both performance and cost advantages. Meta-optics are manufactured using standard semiconductor processes in the same factories that make image sensors.
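To make that degree of control concrete: a focusing metalens works by having each nanostructure at radial distance r impart a position-dependent phase delay so that all rays arrive at the focal point in phase. The sketch below uses the standard textbook hyperbolic phase profile; the 940 nm wavelength and 2 mm focal length are illustrative assumptions, not Metalenz specifications:

```python
import numpy as np

def metalens_phase(r, wavelength, focal_length):
    """Target phase (radians) at radial distance r for a focusing metalens:
    phi(r) = (2*pi/lambda) * (f - sqrt(r**2 + f**2)),
    so every ray accumulates the same total optical path to the focus."""
    return (2 * np.pi / wavelength) * (
        focal_length - np.sqrt(r**2 + focal_length**2)
    )

# Example: a lens for 940 nm light (a wavelength commonly used in 3D sensing),
# with an assumed 2 mm focal length.
wavelength = 940e-9                      # meters
f = 2e-3                                 # meters
r = np.linspace(0, 0.5e-3, 5)            # radial sample points across the aperture
phase = metalens_phase(r, wavelength, f) % (2 * np.pi)  # wrapped to one 2*pi cycle
```

The phase is zero at the center and grows increasingly negative toward the edge before wrapping, which is why metasurface layouts show the characteristic concentric-ring pattern.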

The company said unique meta-optic properties permit the combination of several lenses into a single thin and flat surface and unlock new possibilities. This includes improved 3D sensing and new sensors that can fit under the display of a cell phone.

Above: A wafer with Metalenz chips.

Image Credit: Metalenz

Metalenz holds an exclusive worldwide license, through Harvard University’s Office of Technology Development, to a portfolio of foundational intellectual property relating to metasurfaces developed in the Capasso Lab at Harvard SEAS. The team has 20 people and has raised $15 million.

The researchers had studied whether they could completely control light with just a single nanoscale structure. Then they built it using a simple semiconductor chip design in a foundry, or contract manufacturer. As a fabless semiconductor company, Metalenz focuses on design and engineering, and it can tap manufacturers to mass-produce its sensors.

Among the features: Metalenz can do “spoof-proof” facial authentication, as it can detect when someone presents a facial-spoofing mask or a photo in an attempt to defraud an authentication system.

Someone trying to trick a smartphone camera can hold a picture of a face in front of it, and a conventional camera cannot tell human skin from a piece of paper. Devlin showed a demo of how this works: the sensor detects the distinct polarization signatures of human skin and of paper.
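A toy version of that skin-versus-paper check might threshold the mean degree of linear polarization over the face region. Everything here is invented for illustration (the thresholds, the simulated signatures, and the function name); a real system would train a classifier on measured polarization data:

```python
import numpy as np

def looks_like_skin(dolp_map, low=0.05, high=0.35):
    """Return True if the mean degree of linear polarization (DoLP) of the
    face region falls inside a band assumed, for this sketch, to be typical
    of diffuse human skin. The thresholds are hypothetical."""
    mean_dolp = float(np.mean(dolp_map))
    return low < mean_dolp < high

# Simulated signatures (assumptions for illustration): a glossy photo print
# reflects strongly, uniformly polarized light, while skin depolarizes it.
photo_print = np.full((64, 64), 0.6)
skin = np.random.default_rng(0).uniform(0.08, 0.25, size=(64, 64))

print(looks_like_skin(skin))         # low, varied DoLP passes the check
print(looks_like_skin(photo_print))  # high uniform DoLP is flagged as a spoof
```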

A doctor could examine a skin lesion and better assess whether it is something dangerous.

It also enables enhanced 3D sensing, providing more detail to detect shapes and edges with increased contrast, which improves the quality of virtual backgrounds and the resolution of 3D object scanning in AR/VR environments.

And it can identify the molecular makeup of objects, giving automakers the ability to alert drivers to road hazards like black ice, and doctors the potential to diagnose skin cancer from a smartphone.

It also has anti-glare vision. It works around glare, the reflected light that often overpowers both human and machine vision, enabling robots to maneuver better and automobiles to monitor for distracted driving, a new safety measure required in all vehicles by 2025.

Metalenz is working with manufacturers around the world, including STMicroelectronics, as it tries to scale up its production and logistics.

Metalenz is backed by semiconductor leaders including 3M Ventures, Intel Capital, and TDK Ventures. The company was founded in 2016.



Holiday Gift Guide For Eco-Minded Travelers


Traveling offers a greater understanding of the world but can also damage it. Here are some gift ideas for those who prefer to tread lightly. Almost all are from small businesses.

1. Water filtration bottle. Drink filtered water on the go instead of buying plastic water bottles with the Hydros 20 oz Water Filter Bottle $20. If the local water quality is iffy, protect against bacteria, parasites, dirt, and sand with a Lifestraw bottle $40.

2. Shampoo/conditioner bars. Eschew plastic shampoo bottles and airline liquid restrictions with shampoo and conditioner bars from Green Ablutions $12, Ethique $15, HiBar $13.95 or good juju $19.95. All brands have discounts on your first purchase.

3. Solar powered phone charger/lantern. This flat solar-powered phone charger from Luminaid is also a folded-up lantern that pops up into a cube shape for night time. A red light mode ensures your star-gazing won’t be interrupted if you are using it for camping. $60

4. Laundry saver. Do less laundry on the road with Magic BrushOff, designed as a reusable spot lifter/sponge to get rid of deodorant, makeup marks, lint, salt marks, and more from clothing and cloth surfaces. $14.50.

5. Makeup remover. Use this specially-designed washcloth from Make Up Eraser instead of packing and throwing away disposable makeup wipes. $20.

6. Personal lubricant (for vehicles). Gear Hugger’s plant-based lubricant can help keep bike parts, strollers or wheelchairs moving so the recipient can “buy less, fix more and play longer.” $13

7. Cozy hat made of sustainable recycled wool and cotton for your winter travels. Designed in Denmark. Comes in recycled packaging. From Trendhim $39

8. Silk Travel Eye Mask to help get some shut eye on the road. Silk can biodegrade, and uses less water, chemicals, and energy than many other fibers. From Saatva. $60

9. Reusable lunch and snack bags. Eschew plastic bags and foil wrap on your day trips with colorful reusable, washable, dishwasher-safe zip-up bags $13.50 from Green City Living.

10. Natural Bug Repellent. Murphy’s Naturals DEET-free, plant-based Mosquito Repellent Balm $9.99 or Bite Relief Soothing Balm Stick $5.99 can be thrown in a purse or backpack.

11. See home in a new way. Give the gift of experience with a food, coffee, or street art tour guided by local experts from the Tours By Locals website.

12. Recipient’s Choice: Reduce returns and waste by giving the recipient their choice of travel item from the Snappy Gifts website. Items include a solar phone charger and a sustainably sourced toiletry organizer.

Note to readers: I received product samples for evaluation but I will pass them along if possible (not the used shampoo bars) and I do not/will not receive any payment from the companies listed.



Why graphic novels are lucrative IP for Web3: From MEFaverse to metaverse


Marvel’s multi-billion dollar IP enterprise is eating up the film and streaming market — but the metaverse is offering new opportunities and creating a whole new market.

Marvel is valued at nearly $6 billion for films alone, $40 billion for streaming and about $3 billion for consumer products, according to a 2021 Forbes analysis. While the media giant dominates the lion’s share of graphic novel IP in film and streaming entertainment, the metaverse offers new opportunities for graphic novel IP. The ‘metaverse in entertainment’ market is expected to grow to $28.92 billion by 2026.

The entertainment market is essentially expanding with the creation of the metaverse, therefore presenting opportunities to replicate the lucrative success that Marvel has enjoyed. But what made Marvel so popular, and why is the multiverse primed for the metaverse? 

Since the inception of the metaverse as a concept, some of the earliest explorations have involved creating, and adapting, graphic novels for this new virtual environment: from Method Man’s comic book MEFaverse, to the adaptation of Dan LuVisi’s iconic Last Man Standing: Killbook of a Bounty Hunter, to Killtopia, which caters to Japan’s ‘Otaku’ community of manga and anime fans.


But why is graphic novel IP so attractive to directors writing for a digital medium with interactive audiences? And what opportunities are potentially being left on the table? To understand the attraction of graphic novel IP, we only need to look at the formula of success that Marvel and DC have built. 

An ever-expanding world

Marvel’s IP is not one story, but a universe that continues to expand. Recent additions to Marvel’s onscreen world include She-Hulk: Attorney at Law, Ms. Marvel and the upcoming Secret Invasion. The stories that come to life in film and TV are often based on specific heroes within that universe — or, more aptly, the multiverse.

In film, appearance-altering costumes, special FX make-up and visual FX (VFX) enable directors to cast different actors to play the same character in the franchise. The most popular and talented actors, with the strongest following in the target demographic for the box office, can have their turn playing the hero. In fact, actors no longer need to sign long-haul multi-movie contracts with Marvel.

The metaverse offers even more creative diversity. Graphic novel characters can be customizable according to the themes of different concept artists, and the same character can travel through a manga world into one that’s photorealistic. Perhaps a good interpretation is Dr. Strange’s journey through the multiverses, as we see him enter a variety of differently stylized worlds until he eventually finds himself surreally realized as a colorful gelatinous shape. 

One of the key differentiators between a virtual world and a game within the metaverse — or what will be the metaverse — is this interoperability, the way in which an avatar could be used in different virtual worlds. The way avatars are translated stylistically in those different worlds is a key focus for metaverse builders. And it’s something Marvel has been doing well for some time. People love the graphic novel style of Marvel films and how they not only pay homage to the original art form but also amplify the movie experience with state-of-the-art VFX. 

For example, LMS: Killbook of a Bounty Hunter is being translated for the metaverse after amassing a core fanbase. LMS is simultaneously a scrapbook-style graphic novel, a character bible for the anti-hero Gabriel and an introduction to the colorful yet deadly world of ‘New Amerika’. Initially released as a series of artworks, LMS soon gathered a solid fanbase that demanded more of Dan LuVisi’s world. The rights to LMS were bought by Section 9, which approached metaverse-as-a-service company Sequin AR with the idea of creating an LMS metaverse. With a rich world and a pre-existing community, Sequin believed LMS was the perfect property for a metaverse environment. 

The attractiveness of graphic novel IP

Sequin AR’s CEO Rob DeFranco explains why the graphic novel IP was so attractive: “The world that Dan created is vivid, imaginative, and full of pop-culture references with a sharp satirical tone that makes it a model property for the metaverse. There is a big community already in place for LMS. For example, a Comic-Con special edition toy of Gabriel, created by the popular brand Funko, sold out on the first day of the convention. Since the book first launched 10 years ago, there has been a cultural shift in how we interact with the properties we love.” 

Graphic novels rely on captivating imagery, along with compelling stories. The community building the metaverse is a blend of creatives, technologists and storytellers, similar to the teams that produce the Marvel universe. For example, the team behind Method Man’s MEFaverse includes Method Man himself, and renowned graphics artist Jonathan Winbush of Winbush Immersive, with Xsens motion tracking technology helping them translate real-life movement into the digital world. It’s no coincidence that Winbush built his own brand as a creator from his time working at Marvel. 

“The trajectory of the NFT/Web3 space as a whole, in my opinion, only has one direction to go: up,” says Method Man. “I see no reason why it wouldn’t, as brands and individuals realize the unique opportunities and potential this space offers, as well as the utility it provides. That said, my hope is that it can continue to grow while remaining mindful of values such as inclusivity and positivity, which are both pillars of the MEFaverse community.”

The metaverse and the story of good vs. evil 

The metaverse has the potential to be many things, good or bad. Most metaverse evangelists also acknowledge how human influence tends to invade — and sometimes spoil — the utopian promise of future technology.

For example, Aragorn Meulendijks, Chief Metaverse Officer (CMO) from Your Open Metaverse, a distributed metaverse for streaming Web3 content, recently shared his candid thoughts on Elaine Pringle Schwitter’s HeadsTalk Podcast. According to Meulendijks, the mission for those building the metaverse needs to align with the reality of flawed human nature. This sentiment is omnipresent in Marvel; the premise of superhero films is that good and evil always exist in tandem, and even heroes are flawed. 

While there are inevitable flaws, the multiverse can also be employed altruistically. Representation and connection are frequent themes in graphic novels, often speaking to those who don’t feel part of mainstream pop culture. This links back to Winbush’s work on the MEFaverse.

“We wanted to create more ‘metamasks’ or PFPs with different traits to represent our community,” he explained. “Method Man’s motivation in creating the MEFaverse was to show his fans their powers, the unique traits that make them who they are but in the superhero realm. Method Man wanted everyone that was excited about the MEFaverse to have a mask that truly represents them. He wanted his community to be shown their unique powers in a superhero realm.”

The building blocks of film production are being used to build the metaverse

The technology that underpins movie production is driving metaverse creation. For example, motion capture is harnessing and translating movement to avatars, while Unreal Engine is being used to create the worlds themselves.

Charles Borland, founder of real-time studio Voltaku explained: “When I was an actor in a video game called Grand Theft Auto IV, I would spend a lot of time in a mocap suit, and I’d been on a lot of TV and film shoots and saw just how inefficient the Hollywood production process is. I remember thinking, holy cow, when this technology and the economics get to a certain point, all of this gaming technology and real-time technology is going to revolutionize filmmaking and how you make content.” 

Talking about the use of technology in Killtopia, Charles elaborated: “If we’re going to build this in a game engine, like Unreal Engine, then we [had] to do things like set up a camera inside of Unreal. We knew we were going to have an actress and we were going to try and do this in real-time, but one of the things we were looking at was real-time ray tracing, and to push the envelope on that. We couldn’t go into the studio and do full camera tracking, so we wanted to find something inertia-based. Using the Xsens suit, capturing the raw mocap data, enabled us to create the avatars.”

From an investment standpoint, how Marvel’s magic formula for success translates to the metaverse is clear. But IP in the metaverse goes far beyond a franchise of characters. Fans build on these worlds themselves, becoming creators in their own right. And in order to create, they need to feel invested. And that’s where the technology underpinning interoperability is key.

Blockchain blockbusters

Killtopia’s Charles Borland explains: “To invest in interoperability, stakeholders and project owners need to know that the assets for whom they’re building aren’t going anywhere. Of course, that’s if by ‘decentralized,’ you mean you’re applying blockchain. What’s great about that is it’s immutable and it’s public. So I know if I build around a project, even if it tanks, my pipeline will stay. Because the things I’ve been referencing and looking at are going to stay online in this decentralized file hosting system, which is great.”

This is an example of how the technology used in metaverse creation is improving the entire production pipeline. Accelerating the content production workflow, and safeguarding the assets for future use, is a challenge even Marvel faces. 

Cultural shift between content creators and consumers

Borland highlights the cultural shift in how we interact with the properties we love. COVID-19 drove the rapid acceleration in digital experiences, helping us to forge genuine connections when real-life interaction wasn’t possible. The convergence of these behavioral changes and technology advancements is now paving the way for the future metaverse, with mixed reality live performances — which became more prevalent during the recent pandemic — offering a hint of what we might expect. 

Brett Ineson, founder of Animatrik Film Design, which has hosted mixed reality performances for Justin Bieber, Pentakill with Wave XR and even virtual circuses with Shocap Entertainment, says: “Nailing the look and feel of a world will be paramount to delivering the illusion of reality, and that’s where capture technology will come into play. Motion capture will be essential for creating lifelike animation for characters and creatures in these virtual worlds so that players feel like they are interacting with real beings.”

Technologists and storytellers are helping to unleash the potential of new IP into the metaverse. Right now, the reality is that the metaverse does not exist, but it represents the next step in immersive and engaging entertainment. The more engaged a community is, the more invested it is in the story. Powered by motion tracking, performance capture, interoperable avatars, virtual worlds and hip hop artists turned superheroes, the metaverse is prime real estate for the next Marvel enterprise.

Rob DeFranco is CEO of Sequin AR.

Brett Ineson is cofounder of Animatrik Film Design.

Remco Sikkema is senior marketing communications manager at Movella and Xsens.




Fortnite Chapter 4 debuts with Unreal Engine 5.1


Fortnite Battle Royale Chapter 4 arrived today, and it makes use of Unreal Engine 5.1, Epic Games announced.

The debut shows how tightly Epic Games ties its overall strategy together. Fortnite is the prime revenue generator for the company, reaching tens of millions of players who buy in-game items. And Unreal Engine is the game developer tool that makes the advances in Chapter 4 available. To sell developers on the engine, Epic eats its own dog food by building Fortnite with Unreal to showcase what it can do.

Unreal Engine 5.1 provides new features that make the game look and run better. Unreal Engine 5 itself debuted earlier this year, ushering in a generational leap in visual fidelity and bringing a new level of detail to game worlds like the Battle Royale Island.

Shadows and lighting are better in Fortnite with Unreal Engine 5.1.

Fortnite Battle Royale now taps next-gen Unreal Engine 5 features such as Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution, all of which can make the game shine on next-generation systems such as PlayStation 5, Xbox Series X|S, PC, and cloud gaming.

Epic Games said that over half of all announced next-gen games are being created with Unreal Engine. And it said developers can now take advantage of updates to the Lumen dynamic global illumination and reflections system. This is important stuff if you’re a game developer, or you’re expecting to build the metaverse.

Epic has made updates to the Nanite virtualized micropolygon geometry system, and virtual shadow maps that lay the groundwork for games and experiences running at 60 frames per second (fps) on next-gen consoles and capable PCs. These improvements will enable fast-paced competition and detailed simulations without latency, Epic said.

Nanite has also added a programmable rasterizer to allow for material-driven animations and deformations via world position offset, as well as opacity masks. This development paves the way for artists to use Nanite to program specific objects’ behavior, for example Nanite-based foliage with leaves blowing in the wind.
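Conceptually, a world position offset displaces each vertex as a function of time and its world position, so leaves sway without moving in lockstep. The sine-based sway below is a sketch of that idea in Python (an illustration of the technique, not Epic’s shader code; all constants are invented):

```python
import math

def wind_offset(position, time, amplitude=0.05, frequency=1.5):
    """Per-vertex world position offset for a simple wind sway: displace
    along x by a sine wave whose phase varies with world position, so
    neighboring leaves move out of sync. `position` is (x, y, z) in world
    units; sway strength scales with height z so rooted bases stay still."""
    x, y, z = position
    phase = 0.7 * x + 1.3 * y                      # arbitrary spatial variation
    dx = amplitude * math.sin(frequency * time + phase) * max(z, 0.0)
    return (x + dx, y, z)
```

In a real engine this runs per vertex on the GPU each frame; the key property shown here is that the offset is purely a function of position and time, so no per-vertex state needs to be stored between frames.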

Nanite provides highly-detailed architectural geometry. Specifically, buildings are rendered from millions of polygons in real time, and each brick, stone, wood plank, and wall trim is modeled. Natural landscapes are highly-detailed too. Individual trees have around 300,000 polygons, and each stone, flower, and blade of grass is modeled.

On top of that, Lumen reflections provide high-quality ray traced reflections on glossy materials and water.

Water and shadows look prettier in Fortnite Battle Royale Chapter 4.

Lumen also provides real-time global illumination at 60 frames per second (fps). You’ll see beautiful interior spaces with bounce lighting, plus characters reacting to the lighting of their surroundings. (For example, red rugs may bounce red light onto your outfit.) Outfits that have emissive (a.k.a. glowing) qualities will scatter light on nearby objects and surfaces.

Virtual Shadow Maps allow for highly detailed shadowing. Each brick, leaf, and modeled detail will cast a shadow, and character self-shadowing is extremely accurate. This means that things like hats and other small details on characters will also cast shadows.

Temporal Super Resolution is an upgrade over Temporal Anti-Aliasing in Fortnite, and allows for high-quality visuals at a high framerate.

With the introduction of these UE5 features in Fortnite Battle Royale, Fortnite’s Video settings have changed on PC.

To run Nanite, the minimum hardware requirements are Nvidia Maxwell-generation cards or newer, or AMD GCN-generation cards or newer.

For Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution to be available in Fortnite on your PlayStation 5 or Xbox Series X|S, make sure the “120 FPS Mode” setting (in the “graphics” section of the Video settings) is set to off.

Unreal’s reach has grown well beyond games. Unreal Engine has now been used on over 425 film and TV productions, and is integrated into over 300 virtual production stages worldwide. Unreal Engine usage in animation has grown exponentially, from 15 productions between 2015 and 2019 to over 160 productions from 2020 to 2022.

