
AI-powered supply chain management platform 7bridges nabs $17M


Businesses tasked with managing supply chains face increasing challenges as the pandemic takes a toll on operations. Because modern supply chains involve many steps, including assembling parts into finished products and shipping those products to customers, there are more opportunities for issues to arise. According to the Institute for Supply Management, 42% of supply management organizations said the growing cost of supply management was a major concern in 2021, with 43% pegging the limited availability of raw materials or supply as equally troubling.

The hurdles, both old and new, are spurring enterprises to invest in technologies that promise to automate and streamline supply chain management processes. In a 2021 Statista survey, 45.1% of respondents in the supply chain management industry said they were investing in software, with automation as a key feature. One vendor benefiting from the boom is 7bridges, which aims to help retail, pharmaceutical, manufacturing and distribution brands use AI to execute workloads across their supply chains. 7bridges today announced that it raised $17 million in a Series A round led by Eight Roads with participation from Maersk Growth.

Supply chain automation

Pandemic woes have placed a spotlight on the global supply chain. But while the health crisis might have exposed its fragilities, businesses have been struggling for years to rejigger their supplier and distribution networks. A recent study by McKinsey found that 85% of executives struggle with technology inefficiencies in their supply chains, ranging from disconnected and siloed data to time-intensive, manual spreadsheet-based tracking.

Search for a shipment. Image: 7bridges

7bridges, which was founded by Philip Ashton and Matei Beremski in 2016, aims to apply AI to inform customers’ decisions about inbound and outbound logistics, inventory optimization, and other components of sprawling supply chains. Ashton was previously head of business intelligence at World Courier, a biopharmaceutical courier services company, while Beremski was a quantitative analyst at BNP Paribas before joining IBM as a senior analytics consultant.

“Enterprises struggle with complex supply chains now more than ever,” Ashton told VentureBeat via email. “This complexity arises from the explosion of new fulfillment strategies and methods, new carriers — for example digital freight forwarders and same-day shippers — and more stock storage locations moving closer to customers.”

7bridges brings together logistics data and processes and makes them accessible via a modular dashboard. AI technology adapts to changing conditions, balancing real-time variables including business constraints, operational capacity, available inventory by site, and carrier prices and performance. When a warehouse receives an order, the AI can choose the best dispatch site, route, and carrier for the shipment, recommend optimal packing materials and configurations, and create the necessary labeling and paperwork for the order, Ashton says.

“7bridges can build a digital twin of a customer’s logistics network: our AI uses this to simulate outcomes of numerous ‘what if’ scenarios. This is a way of stress-testing operations, and showing how costs might be impacted and performance might be affected in all sorts of scenarios that threaten their supply chain,” Ashton explained. “[Meanwhile,] 7bridges’ AI-powered carrier selection combines real-time data with existing knowledge of best and alternative routes to decide on the optimal routing and transport providers for orders. [And] AI-powered inventory optimization ensures organizations have the right stock and packaging in the right volume at the right place so they can fulfill every order on time, in full.”
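The carrier-selection logic Ashton describes can be thought of as scoring each feasible carrier against real-time variables like price and historical performance. The sketch below is a hypothetical illustration of that idea, not 7bridges’ actual implementation; the field names, weights, and sample figures are all assumptions:

```python
from dataclasses import dataclass

@dataclass
class CarrierOption:
    name: str
    cost: float          # quoted price for this shipment
    transit_days: float  # estimated door-to-door time
    on_time_rate: float  # historical on-time delivery rate, 0..1

def select_carrier(options, max_days, cost_weight=0.6, performance_weight=0.4):
    """Pick the best carrier that meets the delivery window."""
    feasible = [o for o in options if o.transit_days <= max_days]
    if not feasible:
        raise ValueError("no carrier meets the delivery window")
    max_cost = max(o.cost for o in feasible)
    def score(o):
        # Lower cost and higher on-time performance both raise the score.
        return cost_weight * (1 - o.cost / max_cost) + performance_weight * o.on_time_rate
    return max(feasible, key=score)

options = [
    CarrierOption("express", cost=42.0, transit_days=1, on_time_rate=0.98),
    CarrierOption("standard", cost=12.0, transit_days=3, on_time_rate=0.94),
    CarrierOption("economy", cost=7.0, transit_days=6, on_time_rate=0.90),
]
print(select_carrier(options, max_days=3).name)  # prints "standard"
```

In a real system the score would fold in many more of the variables mentioned above, such as operational capacity and available inventory by site, and the weights would be tuned rather than fixed.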

Send a shipment. Image: 7bridges

7bridges also connects to a network of logistics service providers, enabling customers to access their services including packing, storage, and same-day and sustainable shipping. According to Ashton, it takes most customers two weeks to integrate the platform with their existing systems.

Growing services

A recent study by cloud-based supply chain management company E2open suggests that the use of AI and real-time data during the pandemic cut supply chain forecast error by 32%. Of course, E2open has a horse in the race, and opposing research from Vanson Bourne shows that a lack of internal knowledge and lingering fears over risk and control threaten to stymie the adoption of AI for supply chain applications. But while there’s hesitance around AI — and justifiable skepticism of its potential — the number of startups offering AI-infused supply chain services continues to grow.

For example, Tealbook uses AI to update and maintain a database of supply chain data. Altana is creating a platform to unify global supply chain data. There’s also Optimal Dynamics, a New York-based startup applying AI to shipping logistics.

As Tokenist’s Tim Fries writes, venture capitalists are broadening their portfolios and are now significantly investing in industrial tech startups that are attempting to solve issues for supply chains. “As per a report by PitchBook, venture capitalists have invested a record $45.1 billion in industrial startups so far this year,” he noted in a piece last October. “In comparison, such firms raised a total of $34 billion for the entirety of 2020.”

“Our primary decision-makers tend to be on the commercial side of the business, for whom 7bridges offers a rapid step-change in their supply chain transformation and a rapid way to improve profit margins and logistics performance,” Ashton added. “For a CTO, 7bridges represents the last logistics integration their business needs to make in order to future-proof their systems and therefore saves their technical teams time and money … It’s a huge task for people to manually compile, normalize, and analyze vast datasets from numerous suppliers, logistics service providers, and internal systems. The 7bridges AI does it for them, normalizing and automatically analyzing data at a rate that is simply impossible for a human to achieve.”

Ashton says that the additional investment from Maersk and Eight Roads will be put toward expanding the London-based, 45-employee company’s workforce and R&D. The round brings 7bridges’ total capital raised to over $20 million.



Why graphic novels are lucrative IP for Web3: From MEFaverse to metaverse


Marvel’s multi-billion dollar IP enterprise is eating up the film and streaming market — but the metaverse is offering new opportunities and creating a whole new market.

Marvel is valued at nearly $6 billion for films alone, $40 billion for streaming and about $3 billion for consumer products, according to a 2021 Forbes analysis. While the media giant dominates the lion’s share of graphic novel IP in film and streaming entertainment, the metaverse offers new opportunities for graphic novel IP. The ‘metaverse in entertainment’ market is expected to grow to $28.92 billion by 2026.

The entertainment market is essentially expanding with the creation of the metaverse, therefore presenting opportunities to replicate the lucrative success that Marvel has enjoyed. But what made Marvel so popular, and why is the multiverse primed for the metaverse? 

Since the inception of the metaverse as a concept, some of its earliest explorations have included the creation — and adaptation — of graphic novels for this new virtual environment: from Method Man’s comic book MEFaverse, to the adaptation of Dan LuVisi’s iconic Last Man Standing: Killbook of a Bounty Hunter, to Killtopia, which caters to Japan’s ‘otaku’ community of manga and anime fans.


But why is graphic novel IP so attractive to directors writing for a digital medium with interactive audiences? And what opportunities are potentially being left on the table? To understand the attraction of graphic novel IP, we only need to look at the formula of success that Marvel and DC have built. 

An ever-expanding world

Marvel’s IP is not one story, but a universe that continues to expand. Recent additions to Marvel’s onscreen world include She-Hulk: Attorney at Law, Ms. Marvel and the upcoming Secret Invasion. The stories that come to life in film and TV are often based on specific heroes within that universe — or, more aptly, the multiverse.

In film, appearance-altering costumes, special FX make-up and visual FX (VFX) enable directors to cast different actors to play the same character in the franchise. The most popular and talented actors, with the strongest following in the target demographic for the box office, can have their turn playing the hero. In fact, actors no longer need to sign long-haul multi-movie contracts with Marvel.

The metaverse offers even more creative diversity. Graphic novel characters can be customizable according to the themes of different concept artists, and the same character can travel through a manga world into one that’s photorealistic. Perhaps a good interpretation is Dr. Strange’s journey through the multiverses, as we see him enter a variety of differently stylized worlds until he eventually finds himself surreally realized as a colorful gelatinous shape. 

One of the key differentiators between a virtual world and a game within the metaverse — or what will be the metaverse — is this interoperability, the way in which an avatar could be used in different virtual worlds. The way avatars are translated stylistically in those different worlds is a key focus for metaverse builders. And it’s something Marvel has been doing well for some time. People love the graphic novel style of Marvel films and how they not only pay homage to the original art form but also amplify the movie experience with state-of-the-art VFX. 

For example, LMS: Killbook of a Bounty Hunter is being translated for the metaverse after amassing a core fanbase. LMS is simultaneously a scrapbook-style graphic novel, a character bible for the anti-hero Gabriel and an introduction to the colorful yet deadly world of ‘New Amerika’. Initially released as a series of artworks, LMS soon gathered a solid fanbase that demanded more of Dan LuVisi’s world. The rights to LMS were bought by Section 9, which approached metaverse-as-a-service company Sequin AR with the idea of creating an LMS metaverse. With a rich world and a pre-existing community, Sequin believed LMS was the perfect property for a metaverse environment. 

The attractiveness of graphic novel IP

Sequin AR’s CEO Rob DeFranco explains why the graphic novel IP was so attractive: “The world that Dan created is vivid, imaginative, and full of pop-culture references with a sharp satirical tone that makes it a model property for the metaverse. There is a big community already in place for LMS. For example, a Comic-Con special edition toy of Gabriel, created by the popular brand Funko, sold out on the first day of the convention. Since the book first launched 10 years ago, there has been a cultural shift in how we interact with the properties we love.” 

Graphic novels rely on captivating imagery, along with compelling stories. The community building the metaverse is a blend of creatives, technologists and storytellers, similar to the teams that produce the Marvel universe. For example, the team behind Method Man’s MEFaverse includes Method Man himself, and renowned graphics artist Jonathan Winbush of Winbush Immersive, with Xsens motion tracking technology helping them translate real-life movement into the digital world. It’s no coincidence that Winbush built his own brand as a creator from his time working at Marvel. 

“The trajectory of the NFT/Web3 space as a whole, in my opinion, only has one direction to go: up,” says Method Man. “I see no reason why it wouldn’t, as brands and individuals realize the unique opportunities and potential this space offers, as well as the utility it provides. That said, my hope is that it can continue to grow while remaining mindful of values such as inclusivity and positivity, which are both pillars of the MEFaverse community.”

The metaverse and the story of good vs. evil 

The metaverse has the potential to be many things, good or bad. Most metaverse evangelists also acknowledge how human influence tends to invade — and sometimes spoil — the utopian promise of future technology.

For example, Aragorn Meulendijks, Chief Metaverse Officer (CMO) from Your Open Metaverse, a distributed metaverse for streaming Web3 content, recently shared his candid thoughts on Elaine Pringle Schwitter’s HeadsTalk Podcast. According to Meulendijks, the mission for those building the metaverse needs to align with the reality of flawed human nature. This sentiment is omnipresent in Marvel; the premise of superhero films is that good and evil always exist in tandem, and even heroes are flawed. 

While there are inevitable flaws, the multiverse can also be employed altruistically. Representation and connection are frequent themes in graphic novels, often speaking to those who don’t feel part of mainstream pop culture. This links back to Winbush’s work on the MEFaverse.

“We wanted to create more ‘metamasks’ or PFPs with different traits to represent our community,” he explained. “Method Man’s motivation in creating the MEFaverse was to show his fans their powers, the unique traits that make them who they are, but in the superhero realm. He wanted everyone who was excited about the MEFaverse to have a mask that truly represents them and shows their unique powers in a superhero realm.”

The building blocks of film production are being used to build the metaverse

The technology that underpins movie production is driving metaverse creation. For example, motion capture is harnessing and translating movement to avatars, while Unreal Engine is being used to create the worlds themselves.

Charles Borland, founder of real-time studio Voltaku, explained: “When I was an actor in a video game called Grand Theft Auto IV, I would spend a lot of time in a mocap suit, and I’d been on a lot of TV and film shoots and saw just how inefficient the Hollywood production process is. I remember thinking, holy cow, when this technology and the economics get to a certain point, all of this gaming technology and real-time technology is going to revolutionize filmmaking and how you make content.”

Talking about the use of technology in Killtopia, Charles elaborated: “If we’re going to build this in a game engine, like Unreal Engine, then we [had] to do things like set up a camera inside of Unreal. We knew we were going to have an actress and we were going to try and do this in real time, but one of the things we were looking at was real-time ray tracing, and to push the envelope on that. We couldn’t go into the studio and do full camera tracking, so we wanted to find something inertia-based. Using the Xsens suit, capturing the raw mocap data, enabled us to create the avatars.”

From an investment standpoint, how Marvel’s magic formula for success translates to the metaverse is clear. But IP in the metaverse goes far beyond a franchise of characters. Fans build on these worlds themselves, becoming creators in their own right. And in order to create, they need to feel invested. And that’s where the technology underpinning interoperability is key.

Blockchain blockbusters

Killtopia’s Charles Borland explains: “To invest in interoperability, stakeholders and project owners need to know that the assets they’re building around aren’t going anywhere. Of course, that’s if by ‘decentralized,’ you mean you’re applying blockchain. What’s great about that is it’s immutable and it’s public. So I know if I build around a project, even if it tanks, my pipeline will stay. Because the things I’ve been referencing and looking at are going to stay online in this decentralized file hosting system, which is great.”

This is an example of how the technology used in metaverse creation is improving the entire production pipeline. Accelerating the content production workflow, and safeguarding the assets for future use, is a challenge even Marvel faces. 

Cultural shift between content creators and consumers

Borland highlights the cultural shift in how we interact with the properties we love. COVID-19 drove the rapid acceleration in digital experiences, helping us to forge genuine connections when real-life interaction wasn’t possible. The convergence of these behavioral changes and technology advancements is now paving the way for the future metaverse, with mixed reality live performances — which became more prevalent during the recent pandemic — offering a hint of what we might expect. 

Brett Ineson, founder of Animatrik Film Design, which has hosted mixed reality performances for Justin Bieber, Pentakill with Wave XR and even virtual circuses with Shocap Entertainment, says: “Nailing the look and feel of a world will be paramount to delivering the illusion of reality, and that’s where capture technology will come into play. Motion capture will be essential for creating lifelike animation for characters and creatures in these virtual worlds so that players feel like they are interacting with real beings.”

Technologists and storytellers are helping to unleash the potential of new IP in the metaverse. Right now, the reality is that the metaverse does not yet exist, but it represents the next step in immersive and engaging entertainment. The more engaged a community is, the more invested it is in the story. Powered by motion tracking, performance capture, interoperable avatars, virtual worlds and hip-hop artists turned superheroes, the metaverse is prime real estate for the next Marvel enterprise.

Rob DeFranco is CEO of Sequin AR.

Brett Ineson is cofounder of Animatrik Film Design.

Remco Sikkema is senior marketing communications manager at Movella and Xsens.




Fortnite Chapter 4 debuts with Unreal Engine 5.1


Fortnite Battle Royale Chapter 4 arrived today, and it makes use of Unreal Engine 5.1, Epic Games announced.

The debut shows how tightly Epic Games ties its overall strategy together. Fortnite is the prime revenue generator for the company, reaching tens of millions of players who buy in-game items. And Unreal Engine is the game developer tool that makes the advances in Chapter 4 available. To sell developers on the engine, Epic eats its own dog food by building Fortnite with Unreal to showcase what it can do.

Unreal Engine 5.1 provides new features that make the game look and run better. Unreal Engine 5 itself debuted earlier this year, ushering in a generational leap in visual fidelity and bringing a new level of detail to game worlds like the Battle Royale Island.

Shadows and lighting are better in Fortnite with Unreal Engine 5.1.

Fortnite Battle Royale now uses next-gen Unreal Engine 5 features such as Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution — all features that can make the game shine on next-generation systems such as PlayStation 5, Xbox Series X|S, PC, and cloud gaming.

Epic Games said that over half of all announced next-gen games are being created with Unreal Engine. And it said developers can now take advantage of updates to the Lumen dynamic global illumination and reflections system. This is important stuff if you’re a game developer, or you’re expecting to build the metaverse.

Epic has made updates to the Nanite virtualized micropolygon geometry system, and virtual shadow maps that lay the groundwork for games and experiences running at 60 frames per second (fps) on next-gen consoles and capable PCs. These improvements will enable fast-paced competition and detailed simulations without latency, Epic said.

Additionally, Nanite has added a programmable rasterizer to allow for material-driven animations and deformations via world position offset, as well as opacity masks. This development paves the way for artists to use Nanite to program specific objects’ behavior — for example, Nanite-based foliage with leaves blowing in the wind.

Nanite provides highly detailed architectural geometry: buildings are rendered from millions of polygons in real time, and each brick, stone, wood plank, and wall trim is modeled. Natural landscapes are highly detailed too: individual trees have around 300,000 polygons, and each stone, flower, and blade of grass is modeled.

On top of that, Lumen reflections provide high-quality ray traced reflections on glossy materials and water.

Water and shadows look prettier in Fortnite Battle Royale Chapter 4.

Also, Lumen provides real-time global illumination at 60 fps. You’ll see beautiful interior spaces with bounce lighting, plus characters reacting to the lighting of their surroundings. (For example, red rugs may bounce red light onto your outfit.) Outfits that have emissive (a.k.a. glowing) qualities will also scatter light on nearby objects and surfaces.

Virtual Shadow Maps allow for highly detailed shadowing. Each brick, leaf, and modeled detail will cast a shadow, and character self-shadowing is extremely accurate. This means that things like hats and other small details on characters will also cast shadows.

Temporal Super Resolution is an upgrade over Temporal Anti-Aliasing in Fortnite, and allows for high-quality visuals at a high framerate.

With the introduction of these UE5 features in Fortnite Battle Royale, Fortnite’s Video settings have changed on PC.

To run Nanite, the minimum hardware requirement is an Nvidia Maxwell-generation card or newer, or an AMD GCN-generation card or newer.

For Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution to be available in Fortnite on your PlayStation 5 or Xbox Series X|S, make sure the “120 FPS Mode” setting (in the “graphics” section of the Video settings) is set to off.

Unreal’s reach has grown well beyond games. Unreal Engine has now been used on over 425 film and TV productions, and is integrated into over 300 virtual production stages worldwide. Unreal Engine usage in animation has grown exponentially, from 15 productions between 2015 and 2019 to over 160 productions from 2020 to 2022.




Is It Time To Talk About A More Sustainable Approach To Serving Our Customers?


At a recent event, I spoke to a Chief Technology Officer (CTO) about how it was not untypical for him to have a day of 14 back-to-back half-hour meetings. He explained that this started during the early part of the pandemic, and by 4 pm, he was absolutely exhausted and struggled to stay focused and pay attention. He added, however, that over time he got used to such a heavy schedule and was able to manage his energy and concentration better.

On hearing this story, I commented that while I often hear stories like this from all sorts of executives at different firms, I am often left wondering how folks end up doing any work if they are in back-to-back meetings all day.

I asked, slightly tongue-in-cheek, how we had gotten to this point, given that I’d never seen a job description that contained any objective requiring a person to attend as many meetings as physically possible.

This raised a few smiles and quite a few nods.

Whilst my comment was playful, it also contained a serious point and one that I have made to many executives about how they should actively manage their time to create the space necessary to really think about and understand the challenges they are facing.

I was thinking about that conversation again the other day when I came across some research from Microsoft about the impact on our brains and emotional state when we have back-to-back meetings.

Using an electroencephalography (EEG) cap, the Microsoft research team was able to monitor the electrical activity in the brains of back-to-back meeting participants. Unsurprisingly, they found that back-to-back virtual meetings are stressful, and that a series of meetings can decrease your ability to focus and engage.

However, the research also found that introducing short breaks between meetings to allow people to move, stretch, gather their thoughts or grab a glass of water can help reduce the cumulative buildup of stress across a series of meetings.

That’s really useful insight, and I hope that more executives and their teams embrace the introduction of these short breaks between meetings to reduce stress, support well-being and maintain attention levels.

But I’ve also been thinking about whether these research findings have a broader application.

Specifically, I’ve been thinking about whether the calls taken by customer service agents could be analogous to a series of very short, back-to-back meetings. If they are, that has ramifications for the amount of stress customer service representatives have to deal with. This is brought into sharp focus when you consider that the average customer service representative is often expected to be constantly on calls for the duration of an 8-hour shift, apart from a 30-minute lunch break and two 15-minute breaks, one in the morning and one in the afternoon.
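To put that schedule in numbers, here is a rough back-of-the-envelope sketch. The shift figures come from the paragraph above; the average call length and the one-minute micro-break are purely illustrative assumptions:

```python
def scheduled_talk_minutes(shift_hours=8, lunch_min=30, short_breaks=2, short_break_min=15):
    """Minutes an agent is expected to be live on calls under the schedule described above."""
    return shift_hours * 60 - lunch_min - short_breaks * short_break_min

def calls_with_micro_breaks(avg_call_min=6, micro_break_min=1, **kwargs):
    """How the day divides up if every call is followed by a short reset break.

    Returns (number of calls handled, total minutes actually spent on calls).
    """
    available = scheduled_talk_minutes(**kwargs)
    calls = available // (avg_call_min + micro_break_min)
    return calls, calls * avg_call_min

print(scheduled_talk_minutes())    # 420 minutes of back-to-back calls
print(calls_with_micro_breaks())   # (60, 360)
```

Under these assumed numbers, giving agents a one-minute breather after each call trades away an hour of talk time per shift — roughly the cost side of the well-being benefit the Microsoft research points toward.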

So, is it any wonder that the contact center industry faces perennial burnout and high levels of staff churn?

If we want to build a more sustainable approach to serving our customers, particularly over live channels like the phone or video, we need to think more clearly and empathetically about our agents and what they go through.

Now, I know that technology is evolving to help with this challenge, and that’s great. But we shouldn’t stop there. Building a more attractive and sustainable contact center model will require us to rethink both contact center operations and their economics.
