

Scratchpad, a productivity workspace for Salesforce, raises $33M


Scratchpad, a company that has built a modern productivity workspace on top of Salesforce, has raised $33 million in a series B round of funding.

While Salesforce remains one of the most widely used customer relationship management (CRM) tools, claiming an estimated one-fifth of the CRM market, that dominance doesn’t always translate into popularity: “clunkiness” and a “lack of user-friendliness” are among the most common complaints.

Founded in 2019, Scratchpad essentially builds on Salesforce’s utility, serving busy salespeople with the tools they need to get their job done more quickly — less switching between tabs and apps, and more time selling, is the name of the game.

A typical Salesforce workflow might involve keeping notes in separate Word or Excel documents, then copying and pasting sections into Salesforce as required. With Scratchpad, users get features spanning notes, spreadsheets, tasks, Kanban boards, search, deal collaboration and more, all consolidated under a single interface on top of Salesforce and sold as a SaaS subscription.

“Salespeople are overloaded with technology and tools like CRM software, call recording, email sequencing, note taking, and much more,” Scratchpad cofounder and CEO Pouyan Salehi told VentureBeat. “Yet, most [sales] reps still use general-purpose spreadsheets, docs, note apps, and task managers to do their jobs, only to then suffer from hours of manual data entry to update their CRM to appease their managers.”

Scratchpad: Kanban board

The San Francisco-based company has claimed a slew of big-name customers over the past few years, including Algolia, Ironclad, Quora, Twilio, and Udemy.

“Our mission is to make salespeople happy — happy salespeople drive more growth, foster better cultures and team dynamics, and create delightful customer experiences for their businesses,” Salehi added.

Up to scratch

Scratchpad had previously raised around $16 million, the bulk of which arrived via its $13 million series A round last year. With another $33 million in the bank from existing investors including Craft Ventures and Accel, Scratchpad is strongly positioned as it looks to capitalize on recent momentum that has seen it bring a string of new products to market. These include Scratchpad Command, which allows users to update Salesforce from anywhere on the web; a new unified workspace for calendar, sales notes, and Salesforce; and a workspace commenting system for sales teams.

“Scratchpad is becoming a must-have for the sales team tech stack,” Salehi said. “This means account executives, sales reps, revenue operations, sales leaders, sales enablement, and customer success teams finally have a modern unified workspace connected to Salesforce that makes salespeople happy and helps revenue teams produce more.”

As a $200 billion-plus company today, Salesforce’s success over the past two decades can be attributed to a number of factors — chief among them, perhaps, has been the ecosystem it has nourished, with support for third-party applications and integrations through AppExchange and countless companies building million-dollar businesses off the back of the Salesforce platform.

Scratchpad serves as a further example of the Salesforce network effect. But it also acknowledges several truths: Salesforce is the dominant CRM, but not everybody enjoys using it; its users typically rely on multiple tools in tandem with Salesforce; and they waste a considerable amount of time on repetitive admin.

“Revenue teams are literally paying their reps to complete administrative work instead of revenue-generating work,” Salehi said. “Worse yet, the data quality in most CRMs remains poor, and sales process adherence is mostly a dream, creating drag across the revenue team. This drag slows salespeople down, keeps revenue teams from operating at peak performance, cultivates unhappy salespeople who often sell less and churn, and can yield poor experiences for customers.”



Why graphic novels are lucrative IP for Web3: From MEFaverse to metaverse


Marvel’s multi-billion dollar IP enterprise is eating up the film and streaming market — but the metaverse is offering new opportunities and creating a whole new market.

Marvel is valued at nearly $6 billion for films alone, $40 billion for streaming and about $3 billion for consumer products, according to a 2021 Forbes analysis. While the media giant commands the lion’s share of graphic novel IP in film and streaming entertainment, the metaverse offers new opportunities for graphic novel IP. The metaverse-in-entertainment market is expected to grow to $28.92 billion by 2026.

The entertainment market is expanding with the creation of the metaverse, presenting opportunities to replicate the lucrative success that Marvel has enjoyed. But what made Marvel so popular, and why is the multiverse primed for the metaverse?

Since the inception of the metaverse as a concept, some of the earliest explorations have involved the creation and adaptation of graphic novels for this new virtual environment. Examples range from Method Man’s comic book MEFaverse, to the adaptation of Dan LuVisi’s iconic Last Man Standing: Killbook of a Bounty Hunter, to Killtopia, which caters to Japan’s ‘otaku’ community of manga and anime fans.


But why is graphic novel IP so attractive to directors writing for a digital medium with interactive audiences? And what opportunities are potentially being left on the table? To understand the attraction of graphic novel IP, we only need to look at the formula of success that Marvel and DC have built. 

An ever-expanding world

Marvel’s IP is not one story, but a universe that continues to expand. Recent additions to Marvel’s onscreen world include She-Hulk: Attorney at Law, Ms. Marvel and the upcoming Secret Invasion. The stories that come to life in film and TV are often based on specific heroes within that universe — or, more aptly, the multiverse.

In film, appearance-altering costumes, special FX make-up and visual FX (VFX) enable directors to cast different actors to play the same character in the franchise. The most popular and talented actors, with the strongest following in the target demographic for the box office, can have their turn playing the hero. In fact, actors no longer need to sign long-haul multi-movie contracts with Marvel.

The metaverse offers even more creative diversity. Graphic novel characters can be customized according to the themes of different concept artists, and the same character can travel from a manga world into one that’s photorealistic. Perhaps a good illustration is Doctor Strange’s journey through the multiverse, as we see him enter a variety of differently stylized worlds until he eventually finds himself surreally realized as a colorful gelatinous shape.

One of the key differentiators between a virtual world and a game within the metaverse — or what will be the metaverse — is this interoperability, the way in which an avatar could be used in different virtual worlds. The way avatars are translated stylistically in those different worlds is a key focus for metaverse builders. And it’s something Marvel has been doing well for some time. People love the graphic novel style of Marvel films and how they not only pay homage to the original art form but also amplify the movie experience with state-of-the-art VFX. 

For example, LMS: Killbook of a Bounty Hunter is being translated for the metaverse after amassing a core fanbase. LMS is simultaneously a scrapbook-style graphic novel, a character bible for the anti-hero Gabriel and an introduction to the colorful yet deadly world of ‘New Amerika’. Initially released as a series of artworks, LMS soon gathered a solid fanbase that demanded more of Dan LuVisi’s world. The rights to LMS were bought by Section 9, which approached metaverse-as-a-service company Sequin AR with the idea of creating an LMS metaverse. With a rich world and a pre-existing community, Sequin believed LMS was the perfect property for a metaverse environment. 

The attractiveness of graphic novel IP

Sequin AR’s CEO Rob DeFranco explains why the graphic novel IP was so attractive: “The world that Dan created is vivid, imaginative, and full of pop-culture references with a sharp satirical tone that makes it a model property for the metaverse. There is a big community already in place for LMS. For example, a Comic-Con special edition toy of Gabriel, created by the popular brand Funko, sold out on the first day of the convention. Since the book first launched 10 years ago, there has been a cultural shift in how we interact with the properties we love.” 

Graphic novels rely on captivating imagery, along with compelling stories. The community building the metaverse is a blend of creatives, technologists and storytellers, similar to the teams that produce the Marvel universe. For example, the team behind Method Man’s MEFaverse includes Method Man himself, and renowned graphics artist Jonathan Winbush of Winbush Immersive, with Xsens motion tracking technology helping them translate real-life movement into the digital world. It’s no coincidence that Winbush built his own brand as a creator from his time working at Marvel. 

“The trajectory of the NFT/Web3 space as a whole, in my opinion, only has one direction to go: up,” says Method Man. “I see no reason why it wouldn’t, as brands and individuals realize the unique opportunities and potential this space offers, as well as the utility it provides. That said, my hope is that it can continue to grow while remaining mindful of values such as inclusivity and positivity, which are both pillars of the MEFaverse community.”

The metaverse and the story of good vs. evil 

The metaverse has the potential to be many things, good or bad. Most metaverse evangelists also acknowledge how human influence tends to invade — and sometimes spoil — the utopian promise of future technology.

For example, Aragorn Meulendijks, Chief Metaverse Officer (CMO) from Your Open Metaverse, a distributed metaverse for streaming Web3 content, recently shared his candid thoughts on Elaine Pringle Schwitter’s HeadsTalk Podcast. According to Meulendijks, the mission for those building the metaverse needs to align with the reality of flawed human nature. This sentiment is omnipresent in Marvel; the premise of superhero films is that good and evil always exist in tandem, and even heroes are flawed. 

While there are inevitable flaws, the multiverse can also be employed altruistically. Representation and connection are frequent themes in graphic novels, often speaking to those who don’t feel part of mainstream pop culture. This links back to Winbush’s work on the MEFaverse.

“We wanted to create more ‘metamasks’ or PFPs with different traits to represent our community,” he explained. “Method Man’s motivation in creating the MEFaverse was to show his fans their powers, the unique traits that make them who they are but in the superhero realm. Method Man wanted everyone that was excited about the MEFaverse to have a mask that truly represents them. He wanted his community to be shown their unique powers in a superhero realm.”

The building blocks of film production are being used to build the metaverse

The technology that underpins movie production is driving metaverse creation. For example, motion capture is being used to record performers’ movement and translate it to avatars, while Unreal Engine is being used to create the worlds themselves.

Charles Borland, founder of real-time studio Voltaku explained: “When I was an actor in a video game called Grand Theft Auto IV, I would spend a lot of time in a mocap suit, and I’d been on a lot of TV and film shoots and saw just how inefficient the Hollywood production process is. I remember thinking, holy cow, when this technology and the economics get to a certain point, all of this gaming technology and real-time technology is going to revolutionize filmmaking and how you make content.” 

Talking about the use of technology in Killtopia, Charles elaborated: “If we’re going to build this in a game engine, like Unreal Engine, then we [had] to do things like set up a camera inside of Unreal. We knew we were going to have an actress and we were going to try and do this in real time, but one of the things we were looking at was real-time ray tracing, and to push the envelope on that. We couldn’t go into the studio and do full camera tracking, so we wanted to find something inertia-based. Using the Xsens suit, capturing the raw mocap data, enabled us to create the avatars.”

From an investment standpoint, how Marvel’s magic formula for success translates to the metaverse is clear. But IP in the metaverse goes far beyond a franchise of characters. Fans build on these worlds themselves, becoming creators in their own right. And in order to create, they need to feel invested. And that’s where the technology underpinning interoperability is key.

Blockchain blockbusters

Killtopia’s Charles Borland explains: “To invest in interoperability, stakeholders and project owners need to know that the assets for whom they’re building aren’t going anywhere. Of course, that’s if by ‘decentralized,’ you mean you’re applying blockchain. What’s great about that is it’s immutable and it’s public. So I know if I build around a project, even if it tanks, my pipeline will stay. Because the things I’ve been referencing and looking at are going to stay online in this decentralized file hosting system, which is great.”

This is an example of how the technology used in metaverse creation is improving the entire production pipeline. Accelerating the content production workflow, and safeguarding the assets for future use, is a challenge even Marvel faces. 

Cultural shift between content creators and consumers

Borland highlights the cultural shift in how we interact with the properties we love. COVID-19 drove the rapid acceleration in digital experiences, helping us to forge genuine connections when real-life interaction wasn’t possible. The convergence of these behavioral changes and technology advancements is now paving the way for the future metaverse, with mixed reality live performances — which became more prevalent during the recent pandemic — offering a hint of what we might expect. 

Brett Ineson, founder of Animatrik Film Design, which has hosted mixed reality performances for Justin Bieber, Pentakill with Wave XR and even virtual circuses with Shocap Entertainment, says: “Nailing the look and feel of a world will be paramount to delivering the illusion of reality, and that’s where capture technology will come into play. Motion capture will be essential for creating lifelike animation for characters and creatures in these virtual worlds so that players feel like they are interacting with real beings.”

Technologists and storytellers are helping to unleash the potential of new IP into the metaverse. Right now, the reality is that the metaverse does not exist, but it represents the next step in immersive and engaging entertainment. The more engaged a community is, the more invested it is in the story. Powered by motion tracking, performance capture, interoperable avatars, virtual worlds and hip-hop artists-turned-superheroes, the metaverse is prime real estate for the next Marvel enterprise.

Rob DeFranco is CEO of Sequin AR.

Brett Ineson is cofounder of Animatrik Film Studios.

Remco Sikkema is senior marketing communications manager at Movella and Xsens.



Fortnite Chapter 4 debuts with Unreal Engine 5.1


Fortnite Battle Royale Chapter 4 arrived today, and it makes use of Unreal Engine 5.1, Epic Games announced.

The debut shows how tightly Epic Games ties its overall strategy together. Fortnite is the prime revenue generator for the company, reaching tens of millions of players who buy in-game items. And Unreal Engine is the game developer tool that makes the advances in Chapter 4 available. To sell developers on the engine, Epic eats its own dog food by building Fortnite with Unreal to showcase what it can do.

Unreal Engine 5.1 provides new features that make the game look and run better. Unreal Engine 5 itself debuted earlier this year, ushering in a generational leap in visual fidelity and bringing a new level of detail to game worlds like the Battle Royale Island.

Shadows and lighting are better in Fortnite with Unreal Engine 5.1.

Fortnite Battle Royale now uses next-gen Unreal Engine 5 features such as Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution, all of which can make the game shine on next-generation systems such as PlayStation 5, Xbox Series X|S, PC, and cloud gaming.

Epic Games said that over half of all announced next-gen games are being created with Unreal Engine. And it said developers can now take advantage of updates to the Lumen dynamic global illumination and reflections system. This is important stuff if you’re a game developer, or you’re expecting to build the metaverse.

Epic has made updates to the Nanite virtualized micropolygon geometry system, and virtual shadow maps that lay the groundwork for games and experiences running at 60 frames per second (fps) on next-gen consoles and capable PCs. These improvements will enable fast-paced competition and detailed simulations without latency, Epic said.

Nanite has also gained a programmable rasterizer that allows for material-driven animations and deformations via world position offset, as well as opacity masks. This paves the way for artists to use Nanite to program the behavior of specific objects, for example, Nanite-based foliage with leaves blowing in the wind.

Nanite provides highly-detailed architectural geometry. Specifically, buildings are rendered from millions of polygons in real time, and each brick, stone, wood plank, and wall trim is modeled. Natural landscapes are highly-detailed too. Individual trees have around 300,000 polygons, and each stone, flower, and blade of grass is modeled.

On top of that, Lumen reflections provide high-quality ray traced reflections on glossy materials and water.

Water and shadows look prettier in Fortnite Battle Royale Chapter 4.

Lumen also provides real-time global illumination at 60 frames per second. You’ll see beautiful interior spaces with bounce lighting, plus characters reacting to the lighting of their surroundings (for example, red rugs may bounce red light onto your outfit). Outfits with emissive (that is, glowing) qualities will scatter light on nearby objects and surfaces.

Virtual Shadow Maps allow for highly detailed shadowing. Each brick, leaf, and modeled detail will cast a shadow, and character self-shadowing is extremely accurate. This means that things like hats and other small details on characters will also cast shadows.

Temporal Super Resolution is an upgrade over Temporal Anti-Aliasing in Fortnite, and allows for high-quality visuals at a high framerate.
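For readers on the development side, the features above map onto a handful of Unreal Engine 5 renderer settings. The sketch below is a minimal, hypothetical illustration of how a project might toggle them from C++ via console variables; the variable names (r.Nanite, r.DynamicGlobalIlluminationMethod, r.ReflectionMethod, r.Shadow.Virtual.Enable, r.AntiAliasingMethod) and values reflect common UE5 conventions as I understand them and may differ between engine versions. In practice, these are usually set once in a project's DefaultEngine.ini or Project Settings rather than in code.

```cpp
// Minimal sketch (not Epic's code): toggling UE5 rendering features via console variables.
// Assumes standard UE5 cvar names; normally configured in DefaultEngine.ini / Project Settings.
#include "HAL/IConsoleManager.h"

static void SetRenderCVar(const TCHAR* Name, int32 Value)
{
    // Look up the console variable by name and set it if it exists.
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value, ECVF_SetByCode);
    }
}

void EnableUE5RenderingFeatures()
{
    SetRenderCVar(TEXT("r.Nanite"), 1);                          // Nanite virtualized geometry
    SetRenderCVar(TEXT("r.DynamicGlobalIlluminationMethod"), 1); // 1 = Lumen global illumination
    SetRenderCVar(TEXT("r.ReflectionMethod"), 1);                // 1 = Lumen reflections
    SetRenderCVar(TEXT("r.Shadow.Virtual.Enable"), 1);           // Virtual Shadow Maps
    SetRenderCVar(TEXT("r.AntiAliasingMethod"), 4);              // 4 = Temporal Super Resolution
}
```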

With the introduction of these UE5 features in Fortnite Battle Royale, Fortnite’s Video settings have changed on PC.

To run Nanite, the minimum hardware requirement is an Nvidia Maxwell-generation graphics card or newer, or an AMD GCN-generation card or newer.

For Nanite, Lumen, Virtual Shadow Maps, and Temporal Super Resolution to be available in Fortnite on your PlayStation 5 or Xbox Series X|S, make sure the “120 FPS Mode” setting (in the “graphics” section of the Video settings) is set to off.

Unreal’s reach has grown well beyond games. Unreal Engine has now been used on over 425 film and TV productions, and is integrated into over 300 virtual production stages worldwide. Unreal Engine usage in animation has grown exponentially, from 15 productions between 2015 and 2019 to over 160 productions from 2020 to 2022.



Is It Time To Talk About A More Sustainable Approach To Serving Our Customers?


At a recent event, I spoke to a Chief Technology Officer (CTO) about how it was not unusual for him to have a day of 14 back-to-back half-hour meetings. He explained that this started during the early part of the pandemic, and that by 4 pm he was absolutely exhausted and struggling to stay focused and pay attention. He added, however, that over time he got used to such a heavy schedule and was able to manage his energy and concentration better.

On hearing this story, I commented that while I often hear stories like this from all sorts of executives at different firms, I am often left wondering how folks end up doing any work if they are in back-to-back meetings all day.

I asked, slightly tongue-in-cheek, how we had gotten to this point, given that I’d never seen a job description that contained any objective requiring a person to attend as many meetings as physically possible.

This raised a few smiles and quite a few nods.

Whilst my comment was playful, it also contained a serious point, one that I have made to many executives: they should actively manage their time to create the space necessary to really think about and understand the challenges they are facing.

I was thinking about that conversation again the other day when I came across some research from Microsoft about the impact on our brains and emotional state when we have back-to-back meetings.

Using an electroencephalography (EEG) cap, the Microsoft research team was able to monitor the electrical activity in the brains of back-to-back meeting participants. Unsurprisingly, they found that back-to-back virtual meetings are stressful, and that a series of meetings can decrease your ability to focus and engage.

However, the research also found that introducing short breaks between meetings to allow people to move, stretch, gather their thoughts or grab a glass of water can help reduce the cumulative buildup of stress across a series of meetings.

That’s really useful insight, and I hope that more executives and their teams embrace the introduction of these short breaks between meetings to reduce stress, support well-being and maintain attention levels.

But I’ve also been thinking about whether these research findings have a broader application.

Specifically, I’ve been thinking about whether the calls taken by customer service agents could be analogous to a series of very short, back-to-back meetings. If they are, that has ramifications for the amount of stress customer service representatives have to deal with. This is brought into sharp focus when you consider that the average customer service representative is often expected to be on calls constantly for the duration of an eight-hour shift, apart from a 30-minute lunch break and two 15-minute breaks, one in the morning and one in the afternoon.

So, is it any wonder that the contact center industry faces perennial burnout and high levels of staff churn?

If we want to build a more sustainable approach to serving our customers, particularly over live channels like phone or video, we need to think more clearly and empathetically about our agents and what they go through.

Now, I know that technology is evolving to help with this challenge and that’s great. But we shouldn’t stop there. Building a more attractive and sustainable contact center model will require us to rethink both contact center operations and their economics.
