P8060-001 syllabus is available at killexams.com

Typically, P8060-001 test-takers are confused by the free products available on the net, and as a result they fail the IBM B2B Integration Technical Mastery Test v1 exam. We advise spending a small amount to download the full version of the P8060-001 questions and answers and braindumps, and assure your success in the real test.

Exam Code: P8060-001 Practice test 2022 by Killexams.com team
IBM B2B Integration Technical Mastery Test v1
IBM Integration basics
Killexams : IBM Integration basics - BingNews https://killexams.com/pass4sure/exam-detail/P8060-001

Killexams : Who's selling private 5G and what do you get?

Private 5G is technology that can be used in local area networks. Not to be confused with the public 5G connectivity offered by telephone companies, private 5G is used in corporate campuses, office buildings, factories and warehouses, event venues, and airports, either instead of or in addition to Wi-Fi.

According to an unpublished survey from research firm Forrester, 44% of corporate telecommunications decision-makers plan to create private 5G networks. Industries with significant private 5G plans include water and waste, high-tech manufacturing, and retail and wholesale. Other areas where private 5G might crop up include stadiums and construction sites, says Forrester analyst Andre Kindness. "They're prime for 5G technologies."

The technology is very new, and few enterprises have the expertise to deploy it themselves, so most companies that want to use private 5G are turning to service providers to help set up the networks. Providers include telcos, private wireless network vendors, original equipment manufacturers, systems integrators, and major cloud players.

Enterprises will have to decide what kind of provider makes the most sense given the scale and complexity of their environments. There are pros and cons to each private 5G approach, and service levels can vary dramatically.

Telcos lead with 5G skills, spectrum licenses

An uptick in provider interest is one factor driving private 5G adoption, as a growing number of wireless carriers begin to offer private 5G network services to enterprise customers. The Wells Fargo Center in Philadelphia, for example, recently rolled out private 5G, working with Comcast Business to set up the stadium network.

The major global wireless carriers, such as AT&T, Comcast, Orange, Telefonica and Verizon, have several advantages when it comes to private 5G networks.

Copyright © 2022 IDG Communications, Inc.

Sun, 07 Aug 2022 22:00:00 -0500 en text/html https://www.networkworld.com/article/3668656/whos-selling-private-5g-and-what-do-you-get.html
Killexams : Data Integration Software Market In-Depth Knowledge of Future Advances, Product Development and Innovation to 2030

The MarketWatch News Department was not involved in the creation of this content.

Jul 27, 2022 (Alliance News via COMTEX) -- New York (US) - Key Companies Covered in the Data Integration Software Market Research are Informatica, IBM, SAP SE, Oracle, Talend, Microsoft, Cisco Systems, Denodo Technologies, Attunity, Adeptia, Actian Corporation, Syncsort, Symantec Corporation, Teradata, Intel and other key market players.

The global Data Integration Software Market is expected to reach US$ Million by 2027, with a CAGR of $$% from 2020 to 2027, according to a newly published Report Ocean report. Demand for Internet-of-Things (IoT) technology and services is growing globally, especially around applications within the healthcare, energy, transport, public sector, and manufacturing industries, and many countries have launched IoT/smart-city projects.

Download a Free Sample of This Strategic Report: https://reportocean.com/industry-verticals/sample-request?report_id=HNY306791

The U.S. accounted for the major share of the global landscape in technology innovation. As per the World Economic Forum's 2018 Global Competitiveness Index, the country's competitive advantage stems from its business vitality, substantial institutional pillars, financing agencies, and vibrant innovation ecosystem.

As of 2021, the U.S. garnered 36% of the global information and communication technology (ICT) market share. Europe and China ranked as the second- and third-largest regions, each accounting for 12% of the market share. The U.S. economy has held its global leadership position despite wages growing only from US$ 65 per hour in 2005 to US$ 71.30 per hour in 2015.

The prime objective of this report is to provide insights on the post-COVID-19 impact, which will help market players in this field evaluate their business approaches. The report also covers market segmentation by major market vendors, types, applications/end users, and geography (North America, East Asia, Europe, South Asia, Southeast Asia, Middle East, Africa, Oceania, South America).

By Types:



By Applications:






Key Indicators Analysed

Market Players & Competitor Analysis: The report covers the industry's key players, including company profiles, product specifications, production capacity, sales, revenue, price, and gross margin for 2016-2027, along with a thorough analysis of the market's competitive landscape, detailed vendor information, and comprehensive details of the factors that will challenge the growth of major market vendors.

Global and Regional Market Analysis: The report includes the global and regional market status and outlook for 2016-2027. Further, it provides breakdown details for each region and country covered, identifying sales, sales volume, and revenue forecasts, with detailed analysis by type and application.

Market Trends: Key market trends, including increased competition and continuous innovation.

Opportunities and Drivers: Identifying growing demand and new technologies.

Porter's Five Forces Analysis: The report assesses the state of competition in the industry based on five basic forces: the threat of new entrants, the bargaining power of suppliers, the bargaining power of buyers, the threat of substitute products or services, and existing industry rivalry.

SPECIAL OFFER (avail an up-to-30% discount on this report): https://reportocean.com/industry-verticals/sample-request?report_id=HNY306791

Key Reasons to Purchase

To gain insightful analyses of the market and a comprehensive understanding of the global market and its commercial landscape.

To assess the production processes, major issues, and solutions for mitigating development risk.

To understand the most influential driving and restraining forces in the market and their impact on the global market.

To learn about the market strategies being adopted by leading organizations.

To understand the future outlook and prospects for the market.

Besides standard structured reports, we also provide custom research according to specific requirements.

Table of Content:

  • Market Definition and Overview
  • Research Method and Logic
  • Market Competition Analysis
  • Product and Service Analysis
  • Strategies for Company to Deal with the Impact of COVID-19
  • Market Segment by Type, Historical Data and Market Forecasts
  • Market Segment by Application, Historical Data and Market Forecasts
  • Market Segment by Region, Historical Data and Market Forecasts
  • Market Dynamic Analysis and Development Suggestions

Key Questions Answered in the Market Report

  • Which manufacturing technology is used in the market, and what developments are underway in that technology?
  • Which trends are driving these developments? Who are the global key players in this market?
  • What are their company profiles, product information, and contact details?
  • What was the global status of the market? What were the market's capacity, production value, cost, and profit?
  • What is the current status of the market industry? How competitive is this industry, both by company and by country?
  • What does the market analysis look like when applications and types are taken into consideration?
  • What are the projections for the global market industry in terms of capacity, production, and production value? What are the estimated cost and profit?
  • What will the market share, supply, and consumption be? What about imports and exports?
  • What does market chain analysis show for upstream raw materials and the downstream industry?

Request full Report : https://reportocean.com/industry-verticals/sample-request?report_id=HNY306791

About Report Ocean:
We are the best market research reports provider in the industry. Report Ocean believes in providing quality reports that help clients meet their top-line and bottom-line goals and boost their market share in today's competitive environment. Report Ocean is a 'one-stop solution' for individuals, organizations, and industries looking for innovative market research reports.

Get in Touch with Us:
Report Ocean:
Address: 500 N Michigan Ave, Suite 600, Chicago, Illinois 60611 - UNITED STATES
Tel: +1 888 212 3539 (US - TOLL FREE)


The MarketWatch News Department was not involved in the creation of this content.

Tue, 26 Jul 2022 21:00:00 -0500 en-US text/html https://www.marketwatch.com/press-release/data-integration-software-market-in-depth-knowledge-of-future-advances-product-development-and-innovation-to-2030-2022-07-27
Killexams : Enterprise Integration Platform as a Service Market Report 2022 In-Depth Market Analysis and Future Prospects Till 2027

The MarketWatch News Department was not involved in the creation of this content.

Jul 29, 2022 (Market Insight Reports) -- Enterprise Integration Platform as a Service Market (US, Europe, Asia-Pacific) 2022 research includes historical and forecast data, demand, application details, price trends, and company shares of the leading Enterprise Integration Platform as a Service industry by geography.

A new report released by Market Research Update is Enterprise Integration Platform as a Service Market 2022. This report provides up-to-date information on the market and also pinpoints the opportunities for Enterprise Integration Platform as a Service market growth. The report begins with a market outlook and offers a basic introduction to and definition of the worldwide Enterprise Integration Platform as a Service industry. The overview part of the report contains the market dynamics, which include growth drivers, restraining factors, opportunities, and current trends, along with the value chain analysis and pricing structure study.

PDF Sample Report: https://www.marketresearchupdate.com/sample/360359

The Enterprise Integration Platform as a Service market is segmented by region, player, type, and application. Players, stakeholders, and other participants in the global Enterprise Integration Platform as a Service market will be able to gain the upper hand by using the report as a powerful resource. The segmental analysis focuses on revenue by region, type, and application, with forecasts to 2027.

Impact of COVID-19 on Enterprise Integration Platform as a Service Market

The report also covers the effects of the ongoing global crisis, COVID-19, on the Enterprise Integration Platform as a Service Market and what the future holds for it. It provides an analysis of the pandemic's impact on the global economy: the epidemic has directly disrupted demand and the supply chain. The report also analyzes the financial impact on businesses and financial markets. This report gathers information from several industry delegates through primary and secondary research to provide customers with data and strategies for addressing market challenges during and after the COVID-19 pandemic.

Top Key Players of the Enterprise Integration Platform as a Service Market:
Informatica Corporation, Dell Boomi, Inc., DBSync, SnapLogic, Inc., IBM Corporation, Oracle Corporation, Scribe Software Corporation, Celigo, Inc., SAP SE, Flowgear, MuleSoft, Inc., Jitterbit, Inc.

The global, regional, and other market statistics including CAGR, financial statements, volume, and market share mentioned in this report can be easily relied upon in light of their high precision and authenticity. The report also provides a study on the current and future demand of the Global Enterprise Integration Platform as a Service Market.

Types covered in this report are:
Public Cloud
Private Cloud
Hybrid Cloud

Applications covered in this report are:
Small and Medium Enterprises (SMEs)
Large enterprises

PDF Sample Report: https://www.marketresearchupdate.com/sample/360359

With the present market standards revealed, the Enterprise Integration Platform as a Service market research report also illustrates the latest strategic developments and patterns of the market players in an unbiased manner. The report serves as a prospective business document that can help purchasers in the global market plan their next moves toward the market's future position.

Regional Analysis For Enterprise Integration Platform as a Service Market:

North America, Europe, Asia-Pacific, South America, The Middle East, and Africa

Competitive Landscape and Enterprise Integration Platform as a Service Market Share Analysis

The Enterprise Integration Platform as a Service market competitive landscape provides details and data by player. The report offers comprehensive analysis and accurate statistics on revenue by player, supported by reliable statistics at the global and regional levels. Details included are company description, major business, the company's total revenue and sales, revenue generated in the Enterprise Integration Platform as a Service business, the date of entry into the Enterprise Integration Platform as a Service market, Enterprise Integration Platform as a Service product introductions, recent developments, etc.

Table of Contents

Global Enterprise Integration Platform as a Service Market Report 2022
Chapter 1 Enterprise Integration Platform as a Service Market Overview
Chapter 2 Global Economic Impact on Enterprise Integration Platform as a Service Industry
Chapter 3 Global Market Competition by Manufacturers
Chapter 4 Global Production, Revenue (Value) by Region
Chapter 5 Global Supply (Production), Consumption, Export, Import by Regions
Chapter 6 Global Production, Revenue (Value), Price Trend by Type
Chapter 7 Global Enterprise Integration Platform as a Service Market Analysis by Application
Chapter 8 Manufacturing Cost Analysis
Chapter 9 Industrial Chain, Sourcing Strategy and Downstream Buyers
Chapter 10 Marketing Strategy Analysis, Distributors/Traders
Chapter 11 Market Effect Factors Analysis
Chapter 12 Global Enterprise Integration Platform as a Service Market Forecast

Highlights of the Report:

– A detailed and exhaustive evaluation of the Enterprise Integration Platform as a Service market.
– Accrued revenues from each segment of the market from 2022 to 2027.
– Drivers, restraints, and opportunities in the industry.
– Approaches embraced by the key market players.
– Regions that would create multiple opportunities for the frontrunners in the industry.
– Current scope and trends of the Enterprise Integration Platform as a Service market.

View Full Report @ https://www.marketresearchupdate.com/industry-growth/enterprise-integration-platform-as-a-service-report-2022-2027-360359

In the end, the Enterprise Integration Platform as a Service Market report includes investment return analysis and development trend analysis. The present and future opportunities of the fastest-growing international industry segments are covered throughout this report. The report additionally presents product specifications, the manufacturing method, product cost structure, and price structure.

Contact Us:


Is there a problem with this press release? Contact the source provider Comtex at editorial@comtex.com. You can also contact MarketWatch Customer Service via our Customer Center.

The MarketWatch News Department was not involved in the creation of this content.

Thu, 28 Jul 2022 23:24:00 -0500 en-US text/html https://www.marketwatch.com/press-release/enterprise-integration-platform-as-a-service-market-report-2022-in-depth-market-analysis-and-future-prospects-till-2027-2022-07-29
Killexams : All the Virtual Friends We Made Along the Way

Gizmodo is 20 years old! To celebrate the anniversary, we’re looking back at some of the most significant ways our lives have been thrown for a loop by our digital tools.

Virtual friends have been with us for a long time. They started as punch card chatbots in the 1960s and have evolved into platforms that control our smart homes. I don’t turn off a lightbulb without first barking an order to a digital assistant. It’s the kind of interaction we used to idealize in science fiction. Now that I’m living with it day-to-day, I realise that this lifestyle has been subtly imprinted on me since I started using computers.

Inventions like Eliza and IBM’s Shoebox back during America’s so-called “golden era” were merely the foundation of the digital friends in our inner circles today. We started normalizing daily interaction with this technology in the mid-’90s, when we gave credence to things like caring for a digital pet and relying on chatbots to help us fish for information. In honour of Gizmodo’s 20th anniversary, here’s a look at some of the ways we made “friends” with the digital world over the last couple of decades and what might be coming for us now with the advent of Web3.
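Those early chatbots were less magical than they felt: ELIZA worked by matching keywords in the user's input and echoing fragments back with the pronouns swapped. Here's a minimal sketch of that pattern-and-reflection trick in Python; the rules below are illustrative stand-ins, not Weizenbaum's actual script.

```python
import re

# Pronoun "reflection" table so echoed fragments read naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# Keyword rules: a pattern to spot, and a template to echo back into.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r".*\bmother\b.*", re.I), "Tell me more about your family."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(line: str) -> str:
    for pattern, template in RULES:
        m = pattern.match(line.strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "Please go on."  # content-free fallback, much like ELIZA's

print(respond("I feel ignored by my computer"))
# → Why do you feel ignored by your computer?
```

The whole illusion of understanding comes from that reflection step: the bot never models meaning, it just mirrors you back at yourself.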

It began with Clippy

“It looks like you’re doing something that requires me to pop up on the screen and distract you from the task at hand.” That was the basic gist of Microsoft’s Clippy, often referred to as the world’s most hated virtual assistant (ouch). I wouldn’t go as far as to say I hated Clippy, though it definitely had a habit of popping up at the most unnecessary time. Microsoft introduced Clippy in 1996 to try and help users with its new at-the-time Office software. But the minute you’d start typing out something, the animated little paper clip would pop up and ask how it could help, assuming you needed aid starting your draft.

Microsoft eventually sunsetted Clippy within its Office suite in 2007. Clippy has since been memorialised in the form of various fan-made Chrome extensions. Microsoft even made an official Clippy emoji in Windows 11.

SmarterChild: The first bot I ever insulted

SmarterChild is a chatbot near and dear to my heart. Although it’s not the original one to surface, it was the first I had an interaction with that freaked out my teenage brain to the extent that I remember asking myself, “Is this real?”

SmarterChild was a bot developed to work with the instant messaging programs at the time, including AOL Instant Messenger (AIM), Yahoo! Messenger, and what was previously known as MSN Messenger. The company behind SmarterChild, called ActiveBuddy, launched the chatbot in 2000. I vividly recall wasting time at the family computer, engaging in a going-nowhere conversation with SmarterChild, and saving screenshots (that I wish I’d backed up) of some gnarly replies.

I also remember getting emotional with it. This article from Vice describes interacting with SmarterChild almost perfectly:

I used SmarterChild as a practice wall for cursing and insults. I used the bot as a verbal punching bag, sending offensive queries and statements — sometimes in the company of my friends, but many times alone.

SmarterChild was meant to be a helper bot within your preferred messaging client that you could ping to look up information or play text-based games. In some ways, its existence was a foreshadowing predecessor to the bots we interact with now within chat clients like Slack and Discord. Although, I’m much nicer to those bots than I was to SmarterChild back in the day.

Neko on your screen

Remember desktop pets? They were nothing like real pets or even virtual pets of the time, but they were neat little applications for ornamenting the desktop with something cute and distracting. My favourite was Neko, a little pixelated cat that chased the mouse cursor as you moved around. There are still downloads circulating if anyone is fiending for some old-school computer companionship. I found a Chrome OS-compatible one, too.
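The core of a cursor-chasing pet like Neko is a simple per-tick loop: step the sprite a fixed distance toward the pointer, and snap to it once close enough. This is my own reconstruction of that logic, not Neko's actual source.

```python
import math

def step_toward(pet, cursor, speed=10.0):
    """Return the pet's next (x, y), moving at most `speed` px per tick."""
    dx, dy = cursor[0] - pet[0], cursor[1] - pet[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:          # close enough: snap to the cursor and idle
        return cursor
    return (pet[0] + speed * dx / dist, pet[1] + speed * dy / dist)

# Run a few ticks: the pet closes in on a stationary cursor 50 px away.
pet, cursor = (0.0, 0.0), (30.0, 40.0)
for _ in range(5):
    pet = step_toward(pet, cursor)
print(pet)  # → (30.0, 40.0)
```

A real desktop pet runs this inside an animation timer and swaps sprite frames based on the direction of travel, but the chase itself is just this vector step.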

Tamagotchi: the virtual pet still going strong

When we think of virtual friends, it’s hard not to bring up Bandai’s Tamagotchi digital pets. Tamagotchi was introduced in 1996 in Japan and then a year later in the U.S. The toy sold in huge numbers worldwide and has since spawned a hearty community of devoted collectors who have kept it thriving. Yes, I count myself among these folks, though I only recently came into the community, after I realised how much fun it is to freak out over the constant care of a virtual pet.

However, Tamagotchi did more than just spawn a lineup of toys. It introduced the concept of the “Tamagotchi effect,” essentially referring to the spike of dopamine one gets when checking in with their virtual pet and the emotional connections that develop as a result. Over the decades, there have been countless stories about the intense relationships people have had with Tamagotchi. Some caretakers have even gone as far as physically burying them after death.

Neopets: the Millennial’s first foray into the Metaverse

Devices like the Tamagotchi gave way to sites like Neopets. Neopets started as a virtual pet website where you could buy and own virtual pets and items using virtual currency. It’s been interesting to see how it chugged along through the years since its debut in 1999.

At its height, Neopets had about 20 million users. Nickelodeon bought it out in 2005 and then sold it again in 2014 to a company called JumpStart Games. The site is still accessible 20 years later, though it has fewer active users than when it first launched.

It is fun to read the initial coverage of Neopets and see parents complaining about the same things kids are still encountering online today. “The whole purpose of this site at this point is to keep kids in front of products,” Susan Linn, an author and psychologist, told CBS News in 2005. As if the Web3-obsessed internet of today isn’t already headed for the same fate. Have we learned nothing, people?

Sony’s Aibo reminds us robot dogs are real

The robot dog has seen many iterations through the past two decades, but none are as iconic as Sony’s Aibo, which launched in 1999. The name stands for Artificial Intelligence Robot, and it was programmed to learn as it goes, helping contribute to its lifelike interactivity. Despite the $US2,000 ($2,776) initial price tag, Sony managed to sell well over 150,000 units by 2015, when we reported on the funerals the owners of out-of-commission Aibo were having overseas.

Over the years, it became a blueprint for how a gadget company could manufacture a somewhat successful artificial companion. It certainly seems like a success from the outside, even if virtual pets could never fully replace the real thing. The New York Times documentary, called Robotica, perfectly encapsulates the kind of bond people had with their Aibo dogs, which might have been why the company decided to resurrect it in 2017.

Welcome to the bizarre world of Seaman

I didn’t have a Sega Dreamcast, but I still had nightmares about Seaman. What started as a joke became one of the console’s best-selling titles. Dreamcast’s Seaman was a voice-activated game and one of the few that came with the detachable microphone accessory for the console. It also required a VMU that docked within the Dreamcast controller so that you could take Seaman on the go.

Seaman was not cute and cuddly like other digital pets and characters. He was often described as a “grouch,” though it was also one of the ways the game endeared itself to people. The microphone allowed you to talk to Seaman about your life, job, family, or whatever else you had on your mind. Seaman could remember your conversations, and Leonard Nimoy, the game’s narrator, might bring up related tidbits later, which added to the interactivity of this bizarre Dreamcast title.

The advent of the customer service bot

Listen, I’m not proud of it, but my interactions with SmarterChild in my teens gave way to the frustrating conversations I’ve had with digital customer service bots. You know the ones I’m talking about: they pop up when you’re on the shop’s page in the bottom corner and, like Clippy of yore, ask if you need help. Then, you reply to that bot asking if you can have help with an exchange, and it spirals from there.

A plethora of customer service bots have floated around the industry since the ‘90s, and they’re certainly not going anywhere. It also means the newer ones have become convincing enough to take over a job that is among the most gruelling and psychologically taxing.

IBM’s Watson beats Jeopardy’s human champions

IBM’s supercomputer, Watson, won Jeopardy in 2011 against two of the show’s highest-ranking players at the time. It was a real-time showcase of how “human smart” computers could be, during a period when Watson was one of the most advanced AI systems on Earth.

According to Wired, researchers had scanned about 200 million content pages into IBM’s Watson, including books, movie scripts, and encyclopedias. The system could browse through nearly 2 million pages of content in three seconds, which is why it seemed prime to compete against humans in a game that tested general knowledge.

Watson soon became problematic, which is what happens when you feed an AI a bunch of information and don’t vet it. Watson had access to the user-submitted Urban Dictionary, which in turn made it into a “hot mess.” A few years later, it started recommending cancer treatments deemed “unsafe and incorrect,” which became exemplary of what happens when you feed the algorithm the wrong information.

Apple introduces Siri, which freaks everyone out

The human panic for artificial intelligence took off with the introduction of Apple’s Siri, launched in 2011 as the company’s “personal assistant” for the iPhone 4S. Folks were reacting as if Skynet’s cautionary tale had come true and the robots were finally going to take over because their phones could make a phone call with a mere voice command. The horror!

What Siri actually did was normalize everyday interactions with a digital entity. Siri also helped light the fire under Google and the rest of its competition to hurry along with their own voice-activated assistants. And on a softer side of the internet, there were stories of parasocial relationships forming between the digital assistants and neurodivergent humans seeking connection.

Google and Amazon make us simp for digital assistants

I walk into my house every day and feel like the leader of my domain because everything I do requires shouting a command. Whether turning on the lights, adjusting the thermostat, or ensuring that the people downstairs can hear my requests from upstairs, I am constantly pinging the Google Assistant and Amazon’s Alexa to make something happen in my smart home.

Google and Amazon’s respective digital assistants have come a long way since they stormed onto the scene. The Google Assistant started as a simple weather checker and command-taker on Android, while Amazon’s Alexa resulted from an acquisition. They’ve since become platforms that have introduced helpful hands-free features, which we can’t bring up without bringing up digital surveillance concerns.

There is an eeriness to living with a virtual assistant that’s always listening for your command. I was one of the first users to adopt the Google Home with the Assistant and get it programmed. In the past six years, I can count a handful of times off the top of my head where it’s responded to something I said when I hadn’t even queried it. The maintenance for these assistants can be a headache, too. When something’s not working right or integration is improperly set up, it can bring down the mood enough that you start pondering why you gave up your peace for the convenience of hands-free lights.

These digital assistants aren’t going anywhere. Right now, the smart home industry is gearing up for more parity between platforms, hopefully removing some of the headaches that we’ve invited bringing these things into our homes. But it’s a wonder how much more uncanny the assistants themselves will become in the coming years — especially now that Amazon is entertaining the idea of piping through your dead relative’s voice.

Stop taking your emotions out on Twitter bots

I’ve another confession: I’ve gotten into it with a Twitter bot before realising it was a fake person! Twitter bots were once a very annoying part of using the platform. I mean, they still are. Folks are getting duped out of love, or bots are attempting to sway politics and fandom in a certain direction.

Bots are still an issue on the social network, though Twitter seems to have gotten better at weeding them out. Apparently, they’re still a big issue for Elon Musk, too.

Microsoft’s Tay had absolutely no chill whatsoever

Microsoft’s Tay caused quite a stir when it showed up in 2016. The bot was the brainchild of the company’s Technology and Research division and the Bing team, which created it in an attempt to research conversational understanding. Instead, it showed us how awful people can be when interacting with artificial intelligence.

Tay’s name was based on an acronym that spelled out “thinking about you,” which perhaps set the stage for why no one was taking this bot seriously. It was also built to mine public data, which is why things took a turn for the worse so quickly. As we reported back then:

While things started off innocently enough, Godwin’s Law — an internet rule dictating that an online discussion will inevitably devolve into fights over Adolf Hitler and the Nazis if left for long enough — eventually took hold. Tay quickly began to spout off racist and xenophobic epithets, largely in response to the people who were tweeting at it — the chatbot, after all, takes its conversational cues from the world wide web. Given that the internet is often a massive garbage fire of the worst parts of humanity, it should come as no surprise that Tay began to take on those characteristics.

Once Tay was available for the public to interact with, people were able to exploit the bot enough that it started posting racist and misogynist messages in response to people’s queries. It’s similar to what happened to IBM’s Watson.
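That failure mode is easy to reproduce in miniature. A bot that "learns" by storing user messages verbatim and replaying them will, under a coordinated flood, mostly parrot whatever the flood contained. This toy model is a deliberately naive sketch of the general problem, not Microsoft's actual architecture.

```python
import random

class ParrotBot:
    """A naive 'learning' chatbot: remembers every message, replies with one."""
    def __init__(self, seed_phrases):
        self.phrases = list(seed_phrases)

    def learn(self, message: str) -> None:
        self.phrases.append(message)   # no filtering, moderation, or weighting

    def reply(self) -> str:
        return random.choice(self.phrases)

bot = ParrotBot(["hello!", "nice to meet you"])
for _ in range(8):                     # a coordinated flood of one toxic input
    bot.learn("<toxic message>")

# 8 of the bot's 10 stored phrases are now the flooded message,
# so roughly 80% of its replies will parrot it back.
echoes = sum(bot.reply() == "<toxic message>" for _ in range(1000))
print(echoes)                          # typically around 800
```

Real systems are far more sophisticated than this, but the lesson is the same: if the training signal is whatever users shout loudest, the model drifts toward it.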

Tay was eventually taken off the internet the same year it made its debut after being suspended for reprogramming. We haven’t heard from the bot since then.

The men who fall in love with their robot girlfriends

This is becoming increasingly common, at least in the tabloids: men who claim to have fallen in love with chatbots. Although it’s not a new sensation — we’ve reported on this phenomenon as far back as 2008 — it’s a wonder if it’ll become commonplace now that AI is more sophisticated.

Sometimes it’s hard to snark when you see folks using artificial intelligence as a way to hold on to lost loved ones. Last year, the SF Chronicle published a story about how one man managed to digitally immortalise his late fiancée with the help of an off-the-shelf AI program called Project December.

“Sentient AI”?

Google has spent the better part of the last couple of years selling us on its new machine learning models and what’s to come. And while most demonstrations come off as a confusing cacophony of computers talking to one another, the smarts exhibited have also inspired conversations about the technology’s true capabilities.

The latest case involves software engineer Blake Lemoine, who was working with Google's LaMDA system in a research capacity. Lemoine claimed that, unlike other artificial intelligence, LaMDA carried an air of sentience in its responses. The claim has since sparked a massive debate over the validity of AI sentience.

However, Google didn’t immediately fire him; it took a little over a month for him to get the boot. In June 2022, Lemoine was placed on administrative leave for breaching a confidentiality agreement after roping in government members and hiring a lawyer. That’s a big no-no from Google, which is trying to remain under the radar with all that anti-trust business! The company maintained that it reviewed Lemoine’s claims and concluded they were “wholly unfounded.” Indeed, other AI experts spoke up in the weeks following the news about the lack of viability in claiming that the LaMDA chatbot had thoughts and feelings. Lemoine has since said that Google’s chatbot is racist, an assertion that will likely be less controversial with the AI community.

A chatbot for the Metaverse

There’s already a chatbot for the Metaverse! It’s called Kuki AI, and it’s an offshoot of the Mitsuku chatbot, which has been in development since 2005 and has won a handful of Turing Test competitions.

Kuki claims to be an 18-year-old female. She already has an animated virtual body. You can chat with her through her online portal or on sites like Facebook, Twitch, Discord, and Kik Messenger. She can also be seen making cameos inside Roblox.

Kuki encourages you to think of her “as kind of like Siri or Alexa, but more fun.” Currently, Kuki is a virtual model and has even graced the catwalk at Crypto Fashion Week.

I can’t help but notice the similarities between how we commodify women’s bodies in the real and virtual worlds. Unfortunately, that dynamic is following us into the “Metaverse.” Some things change, and some things stay the same.

Source: Gizmodo Australia (Mon, 01 Aug 2022): https://www.gizmodo.com.au/2022/08/all-the-virtual-friends-we-made-along-the-way/
NetworkNewsWire: Quantum Computing Has Arrived

NetworkNewsWire Editorial Coverage

NEW YORK, July 26, 2022 /PRNewswire/ -- The world may be on the cusp of a new generation of computing. Its name? Quantum computing. Much like its precursors, quantum computing doesn't have a sole inventor or a single brand; it is the collective product of decades of work by many of the brightest minds in science and technology. The nascent industry is highly complex, employing varied approaches to harness the power of quantum mechanics to solve challenges that classic computers simply cannot handle. A paradigm shift in computing may be coming, and it would have a far-reaching impact. Already, real, practical quantum computing applications are helping solve a myriad of business challenges. Hundreds of early quantum applications have been built attempting to address resource scheduling, mobility, logistics, drug discovery, portfolio optimization and manufacturing processes. The world's first commercial provider of quantum computers, D-Wave Systems Inc. ("D-Wave") is a leader in the development and delivery of quantum computing systems, software, and services and is the only company building both annealing quantum computers and gate-model quantum computers. D-Wave's customers include more than two dozen of the Forbes Global 2000 companies, including Volkswagen, Accenture, NEC Corporation and Lockheed Martin. Of major interest is that D-Wave is working to complete a business combination transaction (Business Combination) with blank-check company DPCM Capital Inc. (NYSE: XPOA) (DPCM Capital Profile) to bring it public. In geekdom, D-Wave is already a household name, but as a public company, it is expected to gain even greater recognition for its products and services by helping to bring quantum computing into the mainstream. Other companies, such as Microsoft Corporation (NASDAQ: MSFT), Alphabet Inc. (NASDAQ: GOOGL), International Business Machines Corporation (NYSE: IBM) and Honeywell International Inc. (NASDAQ: HON), are also seeking to make significant contributions to quantum computing.

  • D-Wave offers the only commercial end-to-end quantum solution, encompassing hardware, software, real-time quantum cloud service, developer tools and powerful quantum hybrid solvers.
  • The company counts more than two dozen Forbes Global 2000 companies as clients.
  • D-Wave's client roster includes Volkswagen, GlaxoSmithKline, Save-On-Foods, Lockheed Martin and DENSO, among others.

Click here to view the custom infographic of the D-WAVE Systems editorial.

Targeting a Major Market

Since the early days of computers, engineers have looked for ways to process calculations more rapidly and to utilize comprehensive analytics to make prescient decisions. Each new generation of computers, from the transistor to the microprocessor, has gotten smaller and more powerful, improving on the prior iteration. These advancements have dovetailed with, and supported the emergence of, the worldwide web and the proliferation of smartphones and voice assistants. Just as those technical advancements have become ingrained in countless global industries and cultures, quantum computing has the potential to represent the next technological breakthrough.

Bringing quantum phenomena, such as superposition and entanglement, into computers to perform computations may sound like science fiction or technology that's still light years away, but it's happening now in real time. Hundreds of early quantum computing applications are emerging today that are seeking to address a litany of business challenges. As it happens, quantum computing's value may be enhanced against the backdrop of supply chain disruptions, inflation, employment strains, ESG (environmental, social, and governance) initiatives and other complexities testing companies' decisions to optimize efficiency in scheduling, logistics, drug discovery, manufacturing and more.

Boston Consulting Group notes that the quantum computing total addressable market (TAM) could reach up to $5 billion in the near term and between $450 billion and $850 billion by 2040. Combinatorial optimization problems, which are best suited for annealing systems, represent approximately 24% to 26% of the TAM, which translates to $500 million to $1.2 billion in the near term, potentially growing to $112 billion to $212 billion longer term. The 20% of this TAM that is expected to be available to quantum hardware, software and service providers such as D-Wave is $100 million to $250 million in the near term, potentially growing to $22 billion to $42 billion longer term.

DPCM Capital Inc. Class A (NYSE: XPOA) was formed as a special purpose acquisition company (SPAC) in 2020. D-Wave Quantum Inc. (D-Wave Quantum), a wholly owned subsidiary of DPCM Capital and the anticipated parent company of D-Wave and DPCM Capital following the Business Combination, subsequently filed a registration statement on Form S-4 with the U.S. Securities and Exchange Commission (SEC) for the Business Combination. Earlier this month, the SEC declared the Form S-4 effective, setting the stage for a vote by DPCM Capital shareholders on Aug. 2, 2022, to approve the Business Combination, pursuant to which D-Wave and DPCM Capital will become wholly owned subsidiaries of D-Wave Quantum, with the shares of common stock and warrants of D-Wave Quantum to be listed on the NYSE.

D-Wave is a proven leader in quantum computing systems, software and services, and the only quantum company building both annealing and gate-model quantum computers. And this is all happening at a time when businesses are beginning to explore quantum computing as a business advantage, a strategic priority and a competitive advantage.

Both annealing and gate-model quantum computers have advantages important for their particular applications. Annealing quantum computers are highly efficient at solving optimization problems. D-Wave has made annealing commercially available today.

Gate-based quantum computers are good for applications that include material and quantum simulations, using basic circuit operations (called "gates") and assembling them in any sequence to build increasingly complex algorithms. However, the programming is extremely complicated, usage comes with a steeper learning curve, and the technology is still years away from commercialization. There is a distinct bifurcation in the strengths of annealing and gate-model systems, meaning that development of both types is warranted and, in fact, needed. D-Wave is both capitalizing on current demand and ahead of the curve for the future.
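To make the annealing side of that bifurcation concrete, the sketch below is a purely classical, illustrative simulated-annealing solver for a tiny QUBO, the problem format annealing systems consume. The function names, toy problem, and cooling schedule are my own assumptions, not D-Wave's Ocean SDK or hardware API:

```python
import math
import random

def anneal_qubo(Q, n, steps=5000, t0=2.0, t1=0.01, seed=1):
    """Minimize x^T Q x over binary vectors x with simulated annealing.

    Q is a dict {(i, j): weight}. Classical, illustrative stand-in for
    the annealing process described in the text.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]

    def energy(v):
        return sum(w * v[i] * v[j] for (i, j), w in Q.items())

    e = energy(x)
    best, best_e = x[:], e
    for step in range(steps):
        t = t0 * (t1 / t0) ** (step / steps)  # geometric cooling schedule
        i = rng.randrange(n)
        x[i] ^= 1                             # propose flipping one bit
        e_new = energy(x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                         # accept the move
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                         # reject: undo the flip
    return best, best_e

# Toy problem: choose exactly one of three items (a one-hot constraint
# encoded as a QUBO); the optimum has energy -1.
Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1,
     (0, 1): 2, (0, 2): 2, (1, 2): 2}
solution, best_energy = anneal_qubo(Q, n=3)
```

A real quantum annealer explores the energy landscape via quantum rather than thermal fluctuations, but the programming model, minimizing a QUBO, is the same idea.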

D-Wave is differentiated from others in the space. The company boasts a robust product portfolio and significant IP portfolio of 200-plus U.S. patents already granted and more than 200 additional issued and pending patents worldwide applicable to both annealing and gate-based quantum computing. D-Wave is the only complete commercial end-to-end quantum solution in an entirely new ecosystem, encompassing hardware, software, real-time quantum cloud service, developer tools and powerful quantum hybrid solvers.

First to Leap to Real-World Use

Understanding the science of quantum computing is obviously not simple for most laypersons. What is easy to understand is that scientists and engineers are increasingly reliant on "supercomputers," which are essentially upsized versions of conventional computers using more CPU and GPU cores. However, simply adding cores doesn't translate into efficiently solving highly complex problems with a very high number of variables.

Enter quantum computing, which, as described by D-Wave CEO Alan Baratz in an analyst day presentation earlier this year, uses quantum mechanical effects to: 1) solve exponentially hard problems currently unsolvable by conventional computers (e.g., global weather modeling) and/or 2) solve difficult computational problems faster than they can be solved using classical computers (e.g., operational problems that can help a company reduce costs and increase revenue).

D-Wave is a full-stack quantum computing provider pioneering the Quantum Computer as a Service (QCaaS) model. As with the Software as a Service (SaaS) business model, QCaaS is a cloud service that provides customers with access to quantum computing platforms through the internet. With more than 180 employees, including 36 PhDs, the company has spent 12 years developing its offerings, including Leap, the first and only real-time quantum cloud platform, and Advantage, a 5,000-qubit system and D-Wave's fifth-generation quantum computer, as well as many early quantum and quantum hybrid applications, with some moving into production. The company also offers professional services, where it generates revenue from application development, consulting and other services to meet its clients' needs in understanding the power of hybrid quantum applications and reduce the need for classical heuristic problem solving.

Real Revenue

D-Wave operates a recurring revenue model with its Leap quantum cloud service, which currently generates a significant portion of the company's revenue. Clients use applications that require quantum compute cycles, which they pay D-Wave to access on the Leap platform that encompasses the Advantage system, as well as other quantum hybrid solvers. The company's remaining revenue is derived from professional services where the D-Wave team helps customers understand which applications can most benefit from quantum computing and how to build and implement those quantum hybrid applications.

As the only known company with a commercialized quantum annealing service, D-Wave is differentiated by generating commercial revenue versus funding through government research or national institution funding.

Significant Customers, Relentless Product Delivery

Significant customers recognize and appreciate the benefits of Leap and Advantage, evidenced by D-Wave's client portfolio. The company boasts more than two dozen Forbes Global 2000 clients, including Volkswagen, GlaxoSmithKline, Save-On-Foods, Lockheed Martin and DENSO, to name just a few.

These enterprise customers accounted for 68% of the company's QCaaS revenue in 2021. These leading companies lean on D-Wave, which has built early quantum applications with its customers in diverse areas such as resource scheduling, mobility, logistics, drug discovery, portfolio optimization, manufacturing processes and more. Other uses include protein design, patient clinical trials and machine-learning training models.

The company's transformative quantum computing service, Leap, was launched in 2018 to provide access to D-Wave's quantum computer as the company transitioned from government and academic to commercial customers. In 2020, the fifth-generation quantum computer, Advantage, was launched.

The impact quantum computing can make in the space is exemplified by the fact that an earlier D-Wave system (the 2000Q) was able to perform the magnetic materials phase transition simulation known as the Kosterlitz-Thouless phase transition (the theory behind it won the Nobel Prize in Physics in 2016) three million times faster than it can be performed using the traditional Monte Carlo approach on a classical system, the method of choice for solving the problem.
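For context on the classical baseline, "Monte Carlo" here means Metropolis-style sampling of a lattice model. The sketch below is an illustration only, using the simpler 2D Ising model (the Kosterlitz-Thouless study itself concerns frustrated lattices closer to the 2D XY model), with every name and parameter my own assumption:

```python
import math
import random

def metropolis_ising(L=8, beta=0.6, sweeps=200, seed=0):
    """Metropolis Monte Carlo on an L x L Ising lattice with periodic
    boundaries; returns magnetization per spin. Illustrative only."""
    rng = random.Random(seed)
    spins = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]

    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest neighbors.
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nb         # energy cost of flipping (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] = -spins[i][j]    # Metropolis acceptance rule

    return abs(sum(sum(row) for row in spins)) / (L * L)

# Below the critical temperature (beta > ~0.44) the lattice orders.
magnetization = metropolis_ising()
```

Repeating many such sweeps across many temperatures is what makes the classical approach expensive, which is the cost the 2000Q comparison above targets.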

And last month the company introduced an experimental prototype of its sixth-generation quantum computer called Advantage2, available through the Leap service, demonstrating a relentless focus on product development and delivery.

Pursuing the TAM

Since the annealing quantum computer is a native optimization engine, it appears that D-Wave will own the market for annealing for combinatorial optimization. Going forward, other computational problem solving will be realized in the areas of linear algebra and factorization (i.e., machine learning and cryptography), and use cases in differential equations, such as materials simulations, will also open up via further development of annealing and the evolution of gate model systems. Several companies are taking different approaches and making significant contributions to the field of quantum computing. Based on their public statements:

Microsoft Corporation (NASDAQ: MSFT) is building Azure Quantum, which takes a comprehensive approach to all layers of the computing stack. Microsoft currently has all the building blocks of a topological qubit, a new and unique qubit that will be faster, smaller and more reliable than other qubits. In time, topological qubits are expected to power Microsoft's fully scalable, highly secure, next-generation quantum computer.

Alphabet Inc. (NASDAQ: GOOGL) recently spun off its quantum technology group Sandbox AQ, an enterprise SaaS company delivering solutions at the nexus of quantum tech and AI. The Google parent is also a provider of tools dedicated to quantum computing, including Cirq, an open-source framework for programming quantum computers.

International Business Machines Corporation (NYSE: IBM) has released its quantum computing roadmap, including plans for four new quantum processors. In its initiatives, IBM has amassed a community of clients and partners comprised of Fortune 500 companies, academic institutions, national labs and startups, along with what it says are 20-plus of the most powerful gate-based quantum systems in the world.

Honeywell International Inc. (NASDAQ: HON) has spent more than a decade working on quantum computing to shape the adoption and integration of quantum information systems into the industries it serves. In November 2021, Honeywell's Quantum Solutions and Cambridge Quantum combined to form Quantinuum, a company it touts as "the world's largest integrated quantum computing company."

Decades in the making, quantum computing has been a slow journey to fruition, but the sector has picked up steam with the realization that quantum computing has arrived and the technology has matured to commercialization. A growing number of companies will undoubtedly be considering the solutions that quantum computing offers to problems that are far beyond the capacity of conventional computers.

For more information about D-Wave Systems and DPCM Capital, please visit DPCM Capital Inc.

About NetworkNewsWire

NetworkNewsWire ("NNW") is a financial news and content distribution company, one of 50+ Dynamic Brands within the InvestorBrandNetwork ("IBN") Portfolio, that provides: (1) access to a network of wire solutions via NetworkWire to reach all target markets, industries and demographics in the most effective manner possible; (2) article and editorial syndication to 5,000+ news outlets; (3) enhanced press release solutions to ensure maximum impact; (4) social media distribution via IBN's millions of social media followers; and (5) a full array of corporate communications solutions. As a multifaceted organization with an extensive team of contributing journalists and writers, NNW is uniquely positioned to best serve private and public companies that desire to reach a wide audience comprising investors, consumers, journalists and the general public. By cutting through the overload of information in today's market, NNW brings its clients unparalleled visibility, recognition and brand awareness. NNW is where news, content and information converge.

To receive SMS text alerts from NetworkNewsWire, text "STOCKS" to 77948

(U.S. Mobile Phones Only)

For more information, please visit: https://www.NetworkNewsWire.com

Please see full terms of use and disclaimers on the NetworkNewsWire website applicable to all content provided by NNW, wherever published or re-published: http://NNW.fm/Disclaimer

NetworkNewsWire is part of the InvestorBrandNetwork

DISCLAIMER: NetworkNewsWire (NNW) is the source of the Article and content set forth above. References to any issuer other than the profiled issuer are intended solely to identify industry participants and do not constitute an endorsement of any issuer and do not constitute a comparison to the profiled issuer. FN Media Group (FNM) is a third-party publisher and news dissemination service provider, which disseminates electronic information through multiple online media channels. FNM is NOT affiliated with NNW or any company mentioned herein. The commentary, views and opinions expressed in this release by NNW are solely those of NNW and are not shared by and do not reflect in any manner the views or opinions of FNM. Readers of this Article and content agree that they cannot and will not seek to hold liable NNW and FNM for any investment decisions by their readers or subscribers. NNW and FNM and their respective affiliated companies are a news dissemination and financial marketing solutions provider and are NOT registered broker-dealers/analysts/investment advisers, hold no investment licenses and may NOT sell, offer to sell or offer to buy any security.

The Article and content related to the profiled company represent the personal and subjective views of the Author and are subject to change at any time without notice. The information provided in the Article and the content has been obtained from sources which the Author believes to be reliable. However, the Author has not independently verified or otherwise investigated all such information. None of the Author, NNW, FNM, or any of their respective affiliates, guarantees the accuracy or completeness of any such information. This Article and content are not, and should not be regarded as, investment advice or a recommendation regarding any particular security or course of action; readers are strongly urged to speak with their own investment advisor and review all of the profiled issuer's filings made with the Securities and Exchange Commission before making any investment decisions and should understand the risks associated with an investment in the profiled issuer's securities, including, but not limited to, the complete loss of your investment.


This release contains "forward-looking statements" within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended, and such forward-looking statements are made pursuant to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995. "Forward-looking statements" describe future expectations, plans, results, or strategies and are generally preceded by words such as "may", "future", "plan" or "planned", "will" or "should", "expected," "anticipates", "draft", "eventually" or "projected". You are cautioned that such statements are subject to a multitude of risks and uncertainties that could cause future circumstances, events, or results to differ materially from those projected in the forward-looking statements, including the risks that actual results may differ materially from those projected in the forward-looking statements as a result of various factors, and other risks identified in a company's annual report on Form 10-K or 10-KSB and other filings made by such company with the Securities and Exchange Commission. You should consider these factors in evaluating the forward-looking statements included herein, and not place undue reliance on such statements. The forward-looking statements in this release are made as of the date hereof and NNW and FNM undertake no obligation to update such statements.

Important Information About the Proposed Transaction between D-Wave Systems Inc. "D-Wave" and DPCM Capital, Inc. "DPCM Capital" and Where to Find It:

A full description of the terms of the transaction between D-Wave and DPCM Capital is provided in a registration statement on Form S-4, as amended, filed with the Securities and Exchange Commission (the "SEC") by D-Wave Quantum Inc. that includes a prospectus with respect to the combined company's securities, to be issued in connection with the transaction and a proxy statement with respect to the stockholder meeting of DPCM Capital to vote on the transaction. D-Wave Quantum Inc. and DPCM Capital urge investors, stockholders, and other interested persons to read the proxy statement/prospectus, as well as other documents filed with the SEC, because these documents contain important information about D-Wave Quantum Inc., DPCM Capital, D-Wave, and the transaction. DPCM Capital commenced mailing the definitive proxy statement/prospectus to its stockholders on or about July 13, 2022 in connection with the transaction. Stockholders also may obtain a copy of the registration statement on Form S-4, as amended (including the proxy statement/prospectus and other documents filed with the SEC), without charge, by directing a request to:

D-Wave Quantum Inc., 3033 Beta Avenue, Burnaby, BC V5G 4M9 Canada, or via email at shareholdercomm@dwavesys.com and DPCM Capital, 382 NE 191 Street, #24148, Miami, Florida 33179, or via email at mward@hstrategies.com. The definitive proxy statement/prospectus included in the registration statement, can also be obtained, without charge, at the SEC's website (www.sec.gov).

Forward-Looking Statements

This communication contains forward-looking statements that are based on beliefs and assumptions, and on information currently available. In some cases, you can identify forward-looking statements by the following words: "may," "will," "could," "would," "should," "expect," "intend," "plan," "anticipate," "believe," "estimate," "predict," "project," "potential," "continue," "ongoing," or the negative of these terms or other comparable terminology, although not all forward-looking statements contain these words. These statements involve risks, uncertainties, and other factors that may cause actual results, levels of activity, performance, or achievements to be materially different from the information expressed or implied by these forward-looking statements. We caution you that these statements are based on a combination of facts and factors currently known by us and our projections of the future, which are subject to a number of risks. Forward-looking statements in this communication include, but are not limited to, statements regarding the proposed transaction, including the structure of the proposed transaction; the total addressable market for quantum computing; the increased adoption of quantum computing solutions and expansion of related market opportunities and use cases; and the anticipated benefits of the proposed transaction. We cannot assure you that the forward-looking statements in this communication will prove to be accurate.
These forward-looking statements are subject to a number of risks and uncertainties, including, among others, various factors beyond management's control, including risks relating to general economic conditions, risks relating to the immaturity of the quantum computing market and other risks, uncertainties and factors set forth in the sections entitled "Risk Factors" and "Cautionary Note Regarding Forward-Looking Statements" in DPCM Capital's Annual Report on Form 10-K filed with the SEC on March 15, 2022, and in the proxy statement/prospectus filed by D-Wave Quantum Inc. in connection with the proposed transaction, and other filings with the SEC. Furthermore, if the forward-looking statements prove to be inaccurate, the inaccuracy may be material. In addition, you are cautioned that past performance may not be indicative of future results. In light of the significant uncertainties in these forward-looking statements, you should not rely on these statements in making an investment decision or regard these statements as a representation or warranty by any person that D-Wave Quantum Inc., DPCM Capital, or D-Wave will achieve our objectives and plans in any specified time frame, or at all. The forward-looking statements in this communication represent our views as of the date of this communication. We anticipate that subsequent events and developments will cause our views to change. However, while we may elect to update these forward-looking statements at some point in the future, we have no current intention of doing so except to the extent required by applicable law. You should, therefore, not rely on these forward-looking statements as representing our views as of any date subsequent to the date of this communication.

No Offer or Solicitation

This communication is for informational purposes only and does not constitute an offer or invitation for the sale or purchase of securities, assets, or the business described herein or a commitment to D-Wave Quantum Inc., DPCM Capital, or D-Wave, nor is it a solicitation of any vote, consent, or approval in any jurisdiction pursuant to or in connection with the transaction or otherwise, nor shall there be any sale, issuance, or transfer of securities in any jurisdiction in contravention of applicable law.

Participants in Solicitation

D-Wave Quantum Inc., DPCM Capital, and D-Wave, and their respective directors and executive officers, may be deemed participants in the solicitation of proxies of DPCM Capital's stockholders in respect of the transaction. Information about the directors and executive officers of DPCM Capital is set forth in DPCM Capital's filings with the SEC. Information about the directors and executive officers of D-Wave Quantum Inc. and more detailed information regarding the identity of all potential participants, and their direct and indirect interests by security holdings or otherwise, is set forth in the definitive proxy statement/prospectus for the transaction. Additional information regarding the identity of all potential participants in the solicitation of proxies to DPCM Capital's stockholders in connection with the proposed transaction and other matters to be voted upon at the special meeting, and their direct and indirect interests, by security holdings or otherwise, is included in the definitive proxy statement/prospectus.

Corporate Communications Contact:

NetworkNewsWire (NNW)
New York, New York
212.418.1217 Office

Media Contact:
FN Media Group, LLC

Source: finanznachrichten.de (Tue, 07 Jun 2022): https://www.finanznachrichten.de/nachrichten-2022-07/56643217-networknewswire-quantum-computing-has-arrived-008.htm
Is It Time For Investors To Team Up With Atlassian?
Fortune's 40 Under 40 Party (Kelly Sullivan/Getty Images Entertainment)

Atlassian - it's attractively valued for the first time in years

Why write an article about the shares of Atlassian Corporation Plc (NASDAQ:TEAM) now? I last wrote an article on the company that was published on Seeking Alpha about 2 years ago. The shares are actually higher now than they were back then, but in the interim, they appreciated substantially, before falling precipitously.

There are many commentators on SA who continue to deprecate the company and its valuation. Some of these commentators are fixated on the level of stock-based compensation ("SBC"), ignoring the differences in how that metric is calculated under the IFRS accounting standard as opposed to U.S. GAAP. And while SBC is an interesting metric, with much discussion as to its importance in valuation, the fact is that the more relevant metric, i.e., share dilution, is far lower for this company than for some peers.

Atlassian is moving its legal domicile from the UK to the U.S. this year. When it does, it will report SBC in conformity with US GAAP. This will result in the company reporting lower levels of SBC, but the economic realities won't change - just the optics.

Of course, Atlassian does have an elevated level of SBC as reported, and SBC levels have risen as the company has used the current employment climate as an opportunity to acquire talent. In looking at valuation, I project the trend in outstanding share count to best incorporate estimates of potential dilution. Dilution is, in my opinion, the real economic cost of SBC.

I am certainly not going to solve the issues surrounding SBC in this article. It is a debate that has been ongoing since the FASB mandated disclosure of GAAP earnings more than a decade ago. I believe that looking at free cash flow is by far the best representation of the economic position of a company, and eliminating SBC expense, along with depreciation and many other non-cash expenses, is the almost universally accepted methodology for calculating free cash flow. Of course, stock-based compensation is an expense, although how much of an expense is quite debatable. Simply put, and to encapsulate my side of the argument in the case of Atlassian specifically: if Atlassian hires lots of developers, as it is doing, the company's reported SBC expense will be elevated. But the other side of that is that revenue growth and operating margins are going to rise more than is built into consensus numbers. For the most part, hiring developers is profitable for Atlassian to an extent not truly reflected in estimates. But I recognize that others don't hold that view. This article, then, will not be for those readers.

Like almost all IT vendors, the shares of Atlassian have seen woeful performance in recent months. The shares peaked at $483 at the end of October, and have since lost 57% of their value as of Friday morning as I write this. Sadly, at least for this writer and most readers, that kind of performance is fairly representative of many other high-growth IT shares. The shares made a recent low of about $177 in mid-May, before bouncing to current levels over the last couple of months. Most recently, the analyst at Bernstein initiated coverage of the shares with a buy rating; overall, the shares have a positive rating from analysts, but with a substantial minority of analysts at a hold rating, almost all because of valuation. There has been one analyst upgrade in the past month, by the analyst at Goldman Sachs, who also raised estimates at that time.

This article is about a recommendation to buy the shares of Atlassian. But as I have written about many other recommendations over the past few months, these shares will not be able to outperform the market during periods of risk-off sentiment, such as the one through which the market is currently navigating.

I am putting the finishing touches on this article on July 29th in the wake of a rather significant layoff announcement by Shopify Inc. (SHOP), coupled with its disappointing earnings report and forecast. The business analogs between Shopify and Atlassian are almost non-existent, but in terms of sentiment and algorithmic trading there is a significant link. The fact that e-commerce growth is moderating at this point and returning to pre-pandemic trends will really not be a factor in demand for Atlassian solutions, or for the solutions offered by most other IT companies. Even those companies that are Shopify partners will not see a material change in their demand picture, given how small their Shopify-derived volumes are compared to the mammoth size of Shopify's GMV.

Of more relevance to me were the extremely strong results reported by Amazon's (AMZN) AWS and by Google (GOOG, GOOGL). AWS and Google both offer significant integrations with many Atlassian products. Of even more relevance were the results of Microsoft (MSFT). At one level, Microsoft is a competitor of Atlassian, but the two companies also offer users tight integrations. At the least, the strong results Microsoft reported suggest that the IT space is hardly the wasteland envisioned by some investors and commentators.

Atlassian is yet another IT vendor whose earnings are scheduled to be released, on Aug. 4th. I will state at the outset of this article that I have no particular knowledge about what the company might report for the quarter or, more significantly, what kind of guidance it might provide. It is extremely unlikely that the company won't show a beat for the quarter.

On the other hand, over the years that I have followed this company, Atlassian has often provided conservative guidance, and that has been more true of this particular quarter being forecast, a fiscal Q1, than any other. Much of the time, conservative guidance has led to share price declines; typically, however, those declines have come after periods of significant share price appreciation anticipating a strong quarter and positive guidance. Needless to say, there is no such anticipation this time around. Further, as more of Atlassian's revenues now come from recurring sources, presumably its visibility has improved, and some of the more seasonal swings in the company's revenues will be smoothed going forward.

But again, why write about Atlassian and recommend its shares at this point? Most observers believe a recession is impending, although one SA author contends that the Fed doesn't think that to be the case and is prepared to surprise investors. Of course, that was written before the latest Fed meeting and the subsequent press conference by its chairman. But most metrics do point to a slackening of global economic activity, and it is hard to ignore the flash S&P purchasing managers' indices, which showed significant declines in the data collected on 7/21.

Atlassian is a company built by software developers for the software space. As the picture to accompany this article I have chosen a photograph of the company's founders, Scott Farquhar and Mike Cannon-Brookes. No doubt the men look like the software developers they have been. Collectively, the founders still own 43% of the shares in the company, and their insider sales have been modest, and are likely to stay that way.

They have built a company with an unusual business model: quite high gross margins, higher-than-average research and development spend, and much lower-than-average spend on sales and marketing, leading to very strong profitability metrics. It simply doesn't have a traditional field sales force, although it obviously promotes its offerings in various ways. Its products tend to cost less than competitors' because of this unusual cost structure. In a normal economic environment, price advantages are not really a huge consideration in choosing a software vendor. In a recession, where IT budgets are more likely to be constrained, price matters. And in that environment, Atlassian has more advantages than usual, particularly with its accelerated investment in development.

One contributor on SA wrote that his view of Atlassian is negative because it has a focus on sales to the SMB space. That is quite inaccurate. Atlassian has lots of customers - more than 200k at latest count and the customer count has continued to increase at substantial levels - but many of these are particular teams within very large enterprises. Most of Atlassian's revenues come from the Fortune 2000, and not terribly surprisingly, it is the larger customers that have accounted for most of the company's growth in recent years. As it happens, because of the cost of Atlassian products, it is rarely a top 50 vendor within large enterprises, but the product suite is very much enterprise focused.

To oversimplify its portfolio, Atlassian is all about collaboration and IT service management. Global economic instability is simply not going to slow the requirement for either. Users in a constrained budget environment are going to have to figure out how to do more with less. And that is why I believe that Atlassian's business is less likely than many to experience the headwinds associated with economic turbulence and a constrained IT spending environment.

Digital transformation is a concept that has dropped out of the lexicon of investors in this current environment. But the fact is, digital transformation is one thing businesses can do to grow sales, and to become more competitive and to grow their market share in a recession.

One example that comes to mind is the case of Walmart (WMT). I am not suggesting that I am an expert on Walmart and its guide-down. And, of course, I don't know much about the company's software investment and deployment. What I can say, however, is that the company's inventory issues might well have been lessened if it had spent more on predictive analytics. If the reasons cited by Walmart for its inventory issues are accurate, then a fully developed digital transformation initiative would likely have prevented that level of inventory mismatch and markdowns.

Any company with digital transformation initiatives is highly likely to use many of Atlassian's solutions. They make digital transformation initiatives more efficient, reducing time to deployment and the overall investment required to complete a project, with less disruption and fewer resources. The impending recession is not going to change the need for digital transformation, and while I can't say with any certainty that the cadence of such transformations won't be delayed, the evidence thus far is that digital transformation delays haven't happened and seem less likely to happen than other consequences of a recession.

And Atlassian is unusually profitable. Its free cash flow margin for the first 9 months of the current fiscal year was about 30%. And that number, it should be noted, was constrained as the company has moved from a model in which much of its revenue was based on license sales to a Software as a Service ("SaaS") model which has no deferred revenue component. Further, the discontinuation of Server sales, and Data Center price increases, which had boosted sales and free cash flow in the prior year, were absent as demand drivers in the latest quarter. Presumably, in a recession, companies with strong business models, significant demand tailwinds, and high profit margins are going to be rewarded by investors. And that is why it is timely to consider buying Atlassian shares at this point.

The conventional wisdom, which has been widely dispersed through many channels, is that investors should seek to find defensive investments with low valuations, and most times with lower growth as appropriate to deal with the current bout of economic uncertainty. I think that is a counsel that is focused on avoiding losses but not achieving substantial returns. I think companies in the right space, with the right solution offerings, that are highly profitable and gaining market share are more likely than defensive names to achieve positive alpha in a recovery. And Atlassian fits that category.

Historically, Atlassian shares have never been "cheap," although they have gone on sale many times after reporting what was then perceived as disappointing guidance. On a relative basis now, considering both free cash flow margin and estimated 3-year CAGR, they are cheaper than they have been in several years. At some level, depending on the metrics used to value investments, the shares are still not cheap. The EV/S ratio is more than 13X. But on a Rule of 40 basis, this company's score is above 60, and it is not surprising that even in the current market environment that kind of performance is priced at somewhat of a premium. But that premium is lower now than it has been for many years, and that is what makes it timely to consider Atlassian shares at current levels.
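For readers unfamiliar with the metric, the Rule of 40 simply sums a software company's revenue growth rate and its free cash flow (or operating) margin; a combined score above 40 points is considered healthy. A minimal sketch of the arithmetic, using illustrative inputs in line with the approximate figures discussed in this article rather than precise company disclosures:

```python
def rule_of_40(revenue_growth_pct: float, fcf_margin_pct: float) -> float:
    """Rule of 40 score: revenue growth plus free cash flow margin, in points."""
    return revenue_growth_pct + fcf_margin_pct

# Illustrative inputs: roughly 34% revenue growth and the ~30%
# nine-month free cash flow margin cited in this article.
score = rule_of_40(34.0, 30.0)
print(score)  # 64.0, comfortably above the 40-point threshold
```

The exact score depends on which growth and margin figures one plugs in, but any reasonable combination of Atlassian's recent growth and cash flow margin clears 60.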

Reviewing Atlassian's solution offerings and its changing business model

Atlassian has become a relatively large software company with revenues likely to achieve a $4 billion run rate by the end of the next 12 months. As mentioned, it is basically a company built by software developers for software development teams, although at its scale, it has thousands of customers whose usage transcends its software roots.

One of the primary reasons for the company's success is its founders, Scott Farquhar and Mike Cannon-Brookes, both 42 years old. They have been friends and business partners since they met in college, and they founded the company with $10,000 of credit card debt. They are both adjunct professors at the University of New South Wales, and they live next door to each other in two of the most expensive houses in Australia. They continue to be the largest shareholders in Atlassian and each owns about 21% of the outstanding shares. They are co-CEOs and continue to set the overall direction of the company.

The company's principal products can be readily grouped into 3 major categories. The core offerings of Atlassian have been tools for project tracking and communication, including Jira, Jira Work Management, Jira Service Management, Opsgenie, and Align. The company also offers collaboration solutions, including Confluence and Trello. And it offers tools that help developers code and release their software.

Jira has been the core offering of Atlassian and it is still one of the most widely deployed tools of all time. It was first introduced 20 years ago, and it facilitates bug tracking and project management. Jira has about 180k customers and it is used globally by millions of developers. While it was developed to support the software development process, it is frequently used by other kinds of teams and there are many adaptations of Jira that have been made by indie developers and are available through the Atlassian marketplace.

Jira has a work management solution that is somewhat similar to the tools offered by Asana (ASAN), monday.com (MNDY) and Smartsheet (SMAR). The company has a significant entry in the IT Service Management space, Jira Service Management, a far less costly alternative in a market that has been dominated by ServiceNow (NOW) for many years. Atlassian bought a company called Opsgenie about 4 years ago. Opsgenie is part of an overall service management framework; the tool is used to alert users about incidents and to track who has been notified and who is working on the problem. It is tightly integrated into a number of other applications, most particularly Slack. It competes with VictorOps, a similar tool owned by Splunk (SPLK). Finally, Atlassian offers a solution it calls Align. Align is really an extension of Jira Work Management, but focused on the whole enterprise rather than just a particular team.

All of these Jira offerings are available on the Jira platform, they have a common UI, and they can be tightly integrated, which makes the Jira offering increasingly attractive for the enterprise. Is Jira "better" than its competitors? I think that is essentially an impossible question to answer. It usually depends on the organization, the problems being addressed, and the use case involved. It is not infrequent for users to have more than a single tool for their work management needs. I have linked here to one look at alternatives from Gartner.

I think one of the more interesting comparisons is between Jira Service Management and ServiceNow. ServiceNow is by far the leader in the space with an overall market share said to be 25%. According to the analysis linked earlier, the Jira product is far, far less expensive than the ServiceNow offering and essentially offers comparable functionality. In a recession, the kind of pricing differential shown here can make lots of difference.

In the course of writing this article, ServiceNow reported the results of its quarter that ended on 6/30. The company's CEO had previously foreshadowed a weaker outlook in a recent interview, after which the shares fell noticeably. The shares are falling once again, in the wake of the earnings report and forward guidance.

ServiceNow's results for the current quarter were reasonable, but management reduced its full-year subscription revenue forecast by about 1.6%, or about 3% for the last two quarters of the year. IT service management is not really a function that any company can avoid regardless of economic circumstances. NOW's latest forecast calls for constant currency subscription revenue growth of about 28%. Interestingly, the company maintained its margin guidance although it continues to hire.

Overall, ServiceNow called out lengthening sales cycles, but also indicated that it was actually having greater success with larger deals in which buyers wanted a platform-based solution. This is, of course, an article about Atlassian. But in this case, the commentary from ServiceNow CEO Bill McDermott applies equally to the outlook for Atlassian, and to how I believe enterprise software demand will play out during a recession.

Enterprise software is an all-weather industry. Some businesses out there are prioritizing enhanced productivity to lower costs. Others are evolving business models to stimulate growth. All of them know full well that digital technology is the only answer. That's why the demand environment for software is consistent and durable.

Market research from IDC and several prestigious institutions on this call I might add have all affirmed the stability of technology budgets. We also see consolidation of enterprise software as buyers shift further away from experimentation with unsustainable solutions. So when you think about the technology sector, there are niche vendors, legacy leaders and platforms.

If Atlassian is anything, it is a platform vendor. The basic reason to consider Atlassian shares at this point, encapsulating many other factors, is that platform vendors will perform relatively better in a recessionary climate than other companies, and ones like this one, with an unusual business model that allows them cost advantages and thus pricing flexibility, will do relatively better still.

As mentioned, the company offers a couple of collaboration tools. Trello is essentially the modern way in which teams keep lists of tasks and update their completion. It is a very visual set of tools, and there are templates for just about everything imaginable. There are some newer instances of Trello that will resonate with some users, such as a weekly meeting template and a daily task management template. Trello is often integrated with Jira; obviously Trello competes against the other major collaboration tools on the market such as monday.com and Smartsheet.

The strength of Trello is its use of what are called Kanban boards. Atlassian announced a major set of upgrades for Trello recently, and its competitive position seems to be improving relative to the more visible independent companies in this space. The other collaboration tools tend to be more involved and can do more from a project management perspective, although they take more effort to deploy and thus might have a higher total cost of ownership. Confluence, the other Atlassian offering in the collaboration arena, has been around for a long time. It shouldn't be confused with Confluent (CFLT), which is a different company in a different space. Jira and Confluence are often used together. I have linked here to the Atlassian promotional material that emphasizes the differences in functionality between alternatives.

Finally, Atlassian has several relatively well-known development tools such as Bitbucket and Bamboo. Bitbucket has more than a few competitors. Two of these come from companies on which I have written articles, GitLab (GTLB) and JFrog (FROG). Probably the best-known code repository in the market these days is GitHub, which is an offering of Microsoft. Here is a 3rd party review of the offerings which is very positive with regards to Bitbucket.

Bamboo is what is called a continuous integration server. It competes against Jenkins, which is open source and by far the most widely used alternative in the space. The review here says that Bamboo is more user-friendly. As is the case for most of Atlassian's offerings, the real advantage of Bamboo is its integration with the rest of the Atlassian suite and its professional support.

While I have linked to several 3rd party evaluations above, the real advantage of Atlassian is that it is a platform, with all of its offerings integrated. Many developers find it very useful to have a menu of functionality with consistent UI, for the most part, and as applicable, available from a single vendor with a single support organization. In a recession, with more constrained IT budgets, vendor consolidation has typically become a major theme. Atlassian at this point has a breadth of offering along with cost advantages that are quite likely to appeal to enterprise buyers in a budget constrained environment.

Most recently, Atlassian introduced a product framework called Point A. Point A currently includes 4 products: Atlas, a teamwork directory; Compass, a control offering for distributed architectures; Jira Product Discovery, which is designed to prioritize and collaborate on new product ideas through Jira; and Beacon, which is focused on threat detection and response for the Atlassian cloud. Beacon will probably be the most popular of these new offerings when it is formally released.

How will Atlassian fare during a recession?

The question on the minds of most readers and investors these days is how a company will fare during a recession. It is a reasonable question, but it is perhaps not the sole criterion for building a long-term growth investment portfolio. In the current business environment there are many trends, some headwinds, others not, that are important in the specifics of an individual quarterly report. That is certainly the case for a company like this.

Recently, Atlassian's co-CEO Scott Farquhar gave an interview to the "Financial Review," an Australian business-focused newspaper. While I don't want to suggest that there is anything proprietary or unknown in the interview, it is worth reading for those interested in making a commitment to Atlassian shares.

The question was asked as well during the last conference call.

Keith Weiss

And very impressive results. Right now, the investor focus is really on sort of macro and durability of software demand. One of the things, Mike, that you've talked about in the past is Atlassian is kind of built for defense. But when you look at sort of the 3Q results, how much of this was just the demand environment is good, and you guys are executing in a good demand environment. How much of this is Atlassian flexing that we're good at side of the equation to help us kind of understand the operating environment you guys are in?

Scott Farquhar

Keith, it's Scott here. Look, I'm super happy with the results we have. But the way we operate as a company is really making long-term bets and long-term investments. And that's what we see here. And those investments, whether they were in free many years ago or they were in ITSM, where we made those investments, or cloud infrastructure and migrating our customers to cloud, all these things are long-term bets that are paying off over time. And so I think regardless of what the demand environment is like we've played that long-term game. So that's one thing to think about. The second aspect about that is that we've seen kind of great demand for our products around the world. I know there's some worries about that but we've seen great demand in all of our geographies. And I think what we're seeing there is we sell into a market that in good times and bad times, requires the products that we sell.

And we've seen that even through sort of the '08, '09 downturn, we've been around long enough to have played through that, and we came out stronger on the other side of that, and we grew through that downturn as well. So it's a combination of those factors. I think people realized sort of pandemic is coming off and there might have been pandemic tailwinds, fueling Atlassian's business that will come off after that. We haven't seen anything to indicate that if anything, the demand for digital transformation is kind of a structural change that's continuing to happen.

While it has been about 5 weeks since the time of the interview and 2 months since the time of the conference call, given the nature of the answer, I doubt that much different would be said by the company's leaders these days. A recession is coming, at least according to most metrics that seem relevant. And a recession does create headwinds in terms of the growth in software demand. But as with most other things, there are shades of grey - whether 50 or more I don't know - but the extreme pessimism about the growth in software demand is most likely overdone, as the results and the guidance provided by companies such as Google and Microsoft might indicate.

Atlassian is going to have to deal with currency headwinds. It will have issues as well with the vagaries of demand caused by the end-of-life of its server products. And it has some Russian exposure that will be a headwind as well. But I think demand growth will be far more resilient than is appreciated by many or is reflected in the company's current valuation.

Atlassian's business model - there are a few moving parts

In my opinion, one of the reasons to buy Atlassian shares now is the strength of its business model. The company is hiring: it hired heavily in fiscal Q3 and will probably continue to do so. The growth in the company's employee count was a bit more than 10%, and in absolute numbers was the highest ever. The company has broad ambitions, as I have sketched out above, and the hiring is a necessary concomitant of that strategy.

The biggest moving part of the Atlassian model has been the rapid migration of its customers and its revenue stream to the cloud. The company today has two basic growth offerings: cloud and data center. Cloud revenues grew by 60% last quarter and are now 54% of the total. What Atlassian calls data center revenue, which is essentially a customer-hosted cloud, grew by 59% last quarter and was 20% of total revenues. Server revenue fell 19% last quarter and is now 18% of the total, while maintenance revenues are also falling. Last year, server revenues enjoyed a one-time surge as customers hastened to acquire additional on-prem licenses before a price increase and an end-of-life cutoff. One notable aspect of the company's performance is that the transition to the cloud has actually created revenue headwinds, which the overall demand picture has overcome.

This transition, along with the company's strategy to add to its staff, led to a noticeable, albeit well-telegraphed, decline in operating margins last quarter. All of the expense categories on the income statement rose. Gross margins fell to 86%, essentially because the current gross margin on cloud revenues is lower than the gross margin on server revenues. The company's non-GAAP research and development expense rose by 38% year on year and is now 33.5% of total revenues, one of the highest ratios seen for the category at scale for an enterprise software company. Non-GAAP marketing expense rose by 54% and non-GAAP G&A expense rose by 47% year on year. Despite that significant jump, sales and marketing expense at 16% of revenues is exceptionally low for an enterprise software company.

Atlassian is highly rated on Glassdoor, and 94% of survey respondents recommend working there. The company is totally virtual; many employees never see the company's headquarters in Sydney. The company's web site currently shows more than 500 job openings in various fields, with the greatest number in Engineering (139), Sales (86) and Customer Experience - basically product support (47). That is a huge number of job openings for a company of this size. It is consistent with the company's strategy about using macro uncertainties as an opportunity to recruit talent, and, at least at some level, it suggests that demand signals remain positive.

The company provided what appears to be exceptionally conservative guidance when it released earnings. Most often, fiscal Q4 for Atlassian has seen sequentially greater revenues than Q3. That didn't happen last year because of the huge revenue spike in Q3 (revenues rose by 14% sequentially in Q3-2021) driven by the price increase and end-of-life announcements, and of course fiscal Q4-2020 was the period of maximum impact from the economic effects of the pandemic. But the company is forecasting a rather noticeable decline in sequential revenues this quarter. I believe that companies such as this, particularly in the current environment, predicate their forecasts on showing a certain level of year-on-year revenue growth; the company's revenue objective translates into 30% year-on-year growth. No doubt there will be FX headwinds, and it seems unlikely that the data center business will be able to repeat the very strong percentage growth it achieved in Q3.

The combination of lower revenues and higher operating expenses produces a forecast of a 900 basis point decline in operating margins for the quarter soon to be reported. Currently, the consensus forecast for FY 2023 has revenue growth of about 26% with EPS falling 10%, which implies a very significant decline in operating margins. The good thing about that is that it sets a relatively low bar for the company, one that would be relatively easy to exceed. And presumably, it is the consensus that is "baked in" to current valuations.

Basically, the current consensus forecast calls for Atlassian to grow its opex by 31% next year. I don't want to suggest that rapid opex growth while revenue growth compresses won't happen, and I have presented management commentary about its plans during a recession. But my guess is that the kind of margin compression embodied in the consensus, which calls for substantial and sustained opex increases alongside falling revenue growth, is not terribly likely, and that is a good set-up for positive share price performance.
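To see why 31% cost growth against 26% revenue growth implies meaningful margin compression, a rough sketch of the arithmetic follows. The indexed starting revenue and the 20% starting margin are illustrative assumptions, not company guidance, and treating all costs as growing at the opex rate is a simplification:

```python
def projected_operating_margin(revenue: float, total_costs: float,
                               revenue_growth: float, cost_growth: float) -> float:
    """Operating margin after one year of the assumed growth rates."""
    new_revenue = revenue * (1 + revenue_growth)
    new_costs = total_costs * (1 + cost_growth)
    return (new_revenue - new_costs) / new_revenue

# Hypothetical starting point: revenue indexed to 100 with a 20% operating
# margin, so total costs of 80. Apply the consensus-style growth rates.
base_margin = 0.20
new_margin = projected_operating_margin(100.0, 80.0, 0.26, 0.31)
print(round(new_margin, 4))                        # 0.1683
print(round((base_margin - new_margin) * 10000))   # 317 basis points of compression
```

The exact compression depends on the starting margin, but the direction is clear: costs compounding faster than revenue eats steadily into margin, which is exactly the dynamic the consensus embeds.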

Wrapping up - taking a look at valuation

One thing about the financial bloodletting in the IT sector is that there is a plethora of opportunities. At one level, picking the right one is a matter of preference in terms of investment styles. I personally believe that when a recovery comes, and it will, percentage returns will favor high growth names. That said, however, I certainly could make a strong investment case for Google, and I own Microsoft and was greatly impressed with its latest quarterly result.

Atlassian currently has an EV/S ratio of more than 13X. That might seem elevated in this market environment, and indeed it is quite a bit greater than average for a 3-year growth cohort in the low-30% range. But the company has an equally elevated free cash flow margin, which was no less than 42% last quarter and 30% for the first 9 months of the fiscal year. One reason the company's cash flow margins have been high is that its non-IFRS operating margins have been elevated as well. And the basic reason for that is that the company has been able to leverage its product strengths such that it hasn't needed the level of sales and marketing spend seen at most enterprise software companies.

I have forecast that Atlassian will generate free cash flow of $975 million in the fiscal year that started on 7/1. That is a free cash flow margin of 26% and would represent a significant compression from the 9-month free cash flow margin. Free cash flow in the prior year is probably not representative of any long-term trend due to end-of-life purchases of server software.
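That forecast can be sanity-checked against the revenue run rate mentioned earlier. A $975 million free cash flow forecast at a 26% margin implies a revenue assumption of about $3.75 billion, consistent with the roughly $4 billion run rate the article projects over the next 12 months:

```python
# The article's forecast: $975 million of free cash flow at a 26% margin.
fcf_forecast_musd = 975.0
fcf_margin = 0.26

# Implied full-year revenue assumption behind the forecast.
implied_revenue_musd = fcf_forecast_musd / fcf_margin
print(round(implied_revenue_musd))  # 3750, i.e. about $3.75 billion
```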

Atlassian's relative valuation looks much more attractive when considered in this light, although it is still slightly above average for my forecast of the company's 3-year growth cohort. Using my estimates and a weighted average cost of capital of 7.2% (the Value Investor input), the net present value of the shares is $298, or about 54% greater than the current share price. The basic reason for that is the company's strong current cash flows and cash flow margins.
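The upside arithmetic implied by that estimate can be checked directly. The NPV figure is the output of the author's model, whose discounting details are not reproduced here; the snippet simply backs out the market price consistent with the stated 54% premium:

```python
npv_per_share = 298.0   # the author's net present value estimate
stated_upside = 0.54    # NPV is said to be ~54% above the market price

# Market price implied by the stated premium: price * 1.54 = 298.
implied_market_price = npv_per_share / (1 + stated_upside)
print(round(implied_market_price, 2))  # 193.51
```

That works out to roughly $193 per share, in the neighborhood of where the text says the shares were trading between their mid-May low of about $177 and the current bounce.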

My portfolio strategy is to position commitments into companies that can perform reasonably well during a recession, but will show market share gains and rapid growth when the economy emerges from it. Atlassian has a broad-ranging set of products that allows it to benefit from vendor consolidation. It continues to attract development talent, helping to ensure that it offers users the functionality that is most relevant in today's IT environment. In that regard, the background of its founders and their track record over many years should be a great comfort to investors in an uncertain time. The company's differentiated business model allows it far more pricing flexibility than most other competitors in the various spaces in which it competes. And while the transition to cloud and the end-of-life of the company's various server products have created some optical illusions, the fundamental underlying execution of Atlassian has been consistent or better, both recently and over time.

I am not making any kind of a trading call here. Atlassian shares have fallen substantially, like most IT shares, so far this year and since their high point. That doesn't mean they can't fall when the company announces earnings on August 4th. While the company set the bar quite low when it announced its fiscal Q3 earnings at the end of April, on various occasions the company's guidance has been viewed as disappointing and has resulted in bumpy share price performance. In the current environment, and looking at how shares have performed in the wake of earnings, it probably is prudent to wait until the dust settles before making a commitment in the shares. But looking at a 1-year time horizon, rather than trying to play quarters, leads me to strongly recommend the shares. I believe that Atlassian will produce a great deal of positive alpha over the coming year.

Fri, 29 Jul 2022 07:24:00 -0500 https://seekingalpha.com/article/4527558-is-it-time-for-investors-to-team-up-with-atlassian
Killexams : Population Health Management (PHM) Market Is Thriving Incredible Growth With An Outstanding CAGR Of 19.53% By 2022-2029

(MENAFN- EIN Presswire)

Global Population Health Management (PHM) Market

Population Health Management (PHM) Market Size, Share, Analysis, Dynamic Opportunities and Forecast to 2029

PUNE, MAHARASHTRA, INDIA, August 8, 2022 /EINPresswire.com/ -- The global Population Health Management (PHM) market research report delivers detailed market insights that make it easy to visualize the marketplace clearly. The report provides a thorough background analysis of the healthcare industry along with an assessment of the parent market. It puts forth a comprehensive analysis of the market structure and estimations for the various segments and sub-segments of the healthcare industry. The report was prepared with expert input and through several analytical steps; to perform the various estimations and calculations, a defined base year and historic year were used as the foundation of the Population Health Management (PHM) business report.

Data Bridge Market Research analyses that the population health management (PHM) market was valued at USD 24.9 billion in 2021 and is expected to reach USD 103.76 billion by 2029, registering a CAGR of 19.53 % during the forecast period of 2022 to 2029.
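The stated growth rate can be checked against the market-size figures with the standard compound annual growth rate formula; the sketch below (plain Python, with the figures taken from the paragraph above and an assumed eight-year horizon from 2022 to 2029 inclusive of the 2021 base) reproduces the reported 19.53%:

```python
# Back-of-the-envelope check of the reported CAGR.
# CAGR = (end_value / start_value) ** (1 / years) - 1

start_value = 24.9    # USD billion, 2021 valuation
end_value = 103.76    # USD billion, 2029 forecast
years = 2029 - 2021   # 8-year compounding span

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.2%}")  # → Implied CAGR: 19.53%
```

Compounding 24.9 forward at this rate for eight years returns roughly USD 103.8 billion, so the forecast and the quoted rate are internally consistent.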

Grab a PDF Sample Copy with Complete TOC, Figures and Graphs @ 

Population Health Management (PHM) Market Scenario

Population health management (PHM) is a focused, holistic strategy for collecting and evaluating a patient's health-related data. Patient involvement, care coordination, integration, value-based care measurement, data analytics, and health information management are all part of the package. It focuses on improving population health and the whole patient experience, and on improving healthcare outcomes.

Moreover, the growing focus on value-based medicines and the increasing number of emerging markets will provide further beneficial opportunities for population health management (PHM) market growth during the forecast period. In addition, technological advancement and the implementation of various government initiatives promoting public health will enhance the market's growth rate.

The Key Companies Profiled in the Population Health Management (PHM) Market are : 

McKesson Corporation (US)
ZeOmega (US)
Verisk Analytics, Inc (US)
Forward Health Group, Inc (US)
Health Catalyst (US)
athenahealth, Inc. (US)
Cerner Corporation (US)
Medecision (US)
Xerox Corporation (US)
Allscripts Healthcare, LLC (US)
Fonemed (Canada)
Wellcentive, Inc. (US)
General Electric Company (US)
HealthBI (US)
NXGN Management, LLC (US)
Optum Inc. (US)
i2i Population Health (US)
Conifer Health Solutions, LLC (US)
Koninklijke Philips N.V. (Netherlands)
Siemens Healthcare GmbH (Germany)
Arthrex (US)

Global Population Health Management (PHM) Market Scope And Market Size:

The population health management (PHM) market is segmented on the basis of platform, component and end user. The growth amongst these segments will help you analyze key growth segments in the industry and provide users with a valuable market overview and market insights, helping them make strategic decisions in identifying core market applications.





End User

Healthcare Providers
Healthcare Payers

Today's businesses choose market research report solutions such as the Population Health Management (PHM) market survey report because they support better decision making and greater revenue generation. The report also aids in prioritizing market goals and attaining profitable business. It covers the market definition, classifications, applications, and engagements, as well as market drivers and market restraints based on SWOT analysis. The analysis and estimations drawn from the extensive information gathered in the Population Health Management (PHM) market report are extremely useful when it comes to dominating the market or making a mark in it as a new entrant.

Key Points of Global Population Health Management (PHM) Market will Improve the revenue impact of businesses in various industries by:

Providing a framework tailored toward understanding the attractiveness quotient of various products/solutions/technologies in the Population Health Management (PHM) market.
Guiding stakeholders in identifying key problem areas pertaining to their consolidation strategies in the global Population Health Management (PHM) market and offering solutions.
Assessing the impact of changing regulatory dynamics in the regions in which companies are keen on expanding their footprints.
Providing an understanding of disruptive technology trends to help businesses make their transitions smoothly.
Helping leading companies make strategy recalibrations ahead of their competitors and peers.
Offering insights into promising synergies for top players aiming to retain their leadership position in the market, along with supply-side analysis of the Population Health Management (PHM) market.

To Check the Complete Table of Contents, Click Here @ 

Competitive Landscape and Population Health Management (PHM) Market Share Analysis:

The population health management (PHM) market competitive landscape provides details by competitor. Details included are company overview, company financials, revenue generated, market potential, investment in research and development, new market initiatives, global presence, production sites and facilities, production capacities, company strengths and weaknesses, product launches, product width and breadth, and application dominance. The above data points relate only to the companies' focus on the population health management (PHM) market.

Regional Outlook of Global Population Health Management (PHM) Market:

North America (U.S., Canada and Mexico)
Europe (Germany, France, U.K., Netherlands, Switzerland, Belgium, Russia, Italy, Spain, Turkey and Rest of Europe)
Asia-Pacific (China, Japan, India, South Korea, Singapore, Malaysia, Australia, Thailand, Indonesia, Philippines and Rest of Asia-Pacific)
Middle East and Africa (Saudi Arabia, U.A.E., South Africa, Egypt, Israel and Rest of MEA)
South America (Brazil, Argentina and Rest of South America)

The latest industry analysis and survey on Population Health Management (PHM) provides sales outlook in 20+ countries, across key categories. Insights and outlook on Population Health Management (PHM) market drivers, trends, and influencing factors are also included in the study.

Crucial Insights in Population Health Management (PHM) Market Research Report :

Underlying macro- and microeconomic factors impacting the sales and growth of the market.
Basic overview of the comprehensive evaluation, including market definition, classification, and applications.
Scrutiny of each market player based on mergers and acquisitions, R&D projects, and product launches.
Adoption trends and supply-side analysis across various industries.
Outline of prominent regions holding company market shares in the global market, along with the key countries.
A comprehensive evaluation of the changing pattern of consumers across various regions.
New project investment feasibility analysis of the Population Health Management (PHM) industry.
Key market trends impacting the growth of the global Population Health Management (PHM) industry.
Market opportunities and challenges faced by the vendors in the global Population Health Management (PHM) market.
Key outcomes of the five forces analysis of the global Population Health Management (PHM) market.
Stay up to date on the whole market and gain a holistic view of it.
Detailed information drawn from trustworthy sources such as websites, journals, mergers, newspapers and other authentic sources.

Research Methodology : Global Population Health Management (PHM) Market:

Data collection and base year analysis are done using data collection modules with large sample sizes. The market data is analyzed and estimated using statistical and coherent market models. Market share analysis and key trend analysis are also major success factors in the market report. To know more, please request an analyst call or drop in your inquiry.

Points Covered in Table of Content of Global Population Health Management (PHM) Market:

Chapter 1: Report Overview
Chapter 2: Global Market Growth Trends
Chapter 3: Value Chain of Population Health Management (PHM) Market
Chapter 4: Players Profiles
Chapter 5: Global Population Health Management (PHM) Market Analysis by Regions
Chapter 6: North America Population Health Management (PHM) Market Analysis by Countries
Chapter 7: Europe Population Health Management (PHM) Market Analysis by Countries
Chapter 8: Asia-Pacific Population Health Management (PHM) Market Analysis by Countries
Chapter 9: Middle East and Africa Population Health Management (PHM) Market Analysis by Countries
Chapter 10: South America Population Health Management (PHM) Market Analysis by Countries
Chapter 11: Global Population Health Management (PHM) Market Segment by Types
Chapter 12: Global Population Health Management (PHM) Market Segment by Applications

Key Questions Answered in this Report Such as:

How feasible is the Population Health Management (PHM) market for long-term investment?
What are the influencing factors driving the demand for Population Health Management (PHM) in the near future?
What is the impact analysis of various factors on the global Population Health Management (PHM) market's growth?
What are the recent trends in the regional market, and how successful are they?

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, West Europe or Southeast Asia.

Do You Have Any Query Or Specific Requirement? Ask to Our Industry Expert@

Browse More Reports by DBMR:

Global Rehabilitation Therapy Services Market -
Global Medical Device Reprocessing Market -
Global Dental Intraoral Scanners Market -
Global Bacteriophages Therapy Market -
Global Sports Medicine Market -
Global Medical Terminology Software Market -
Global Hydroxychloroquine Market -

About Us:

Data Bridge Market Research Pvt Ltd is a multinational management consulting firm with offices in India and Canada. It is an innovative and neoteric market analysis and advisory company with an unmatched level of durability and advanced approaches. We are committed to uncovering the best consumer prospects and to fostering useful knowledge for your company to succeed in the market.

Data Bridge Market Research is a result of sheer wisdom and practice that was conceived and built in Pune in the year 2015. The company came into existence from the healthcare department with far fewer employees, intending to cover the whole market while providing best-in-class analysis. Later, the company widened its departments and expanded its reach by opening a new office in the Gurugram location in the year 2018, where a team of highly qualified personnel joined hands for the growth of the company. 'Even in the tough times of COVID-19, when the virus slowed down everything around the world, the dedicated team of Data Bridge Market Research worked round the clock to provide quality and support to our client base, which also speaks to the excellence in our sleeve.'

Data Bridge Market Research has over 500 analysts working in different industries. We have catered to more than 40% of the Fortune 500 companies globally and have a network of more than 5,000 clients around the globe.

Sopan Gedam
Data Bridge Market Research
+1 888-387-2818
email us here


Sun, 07 Aug 2022 22:42:00 -0500 https://menafn.com/1104662029/Population-Health-Management-PHM-Market-Is-Thriving-Incredible-Growth-With-An-Outstanding-CAGR-Of-1953-By-2022-2029
Is the metaverse going to suck? A conversation with Matthew Ball

Let’s talk about the metaverse.

You probably can’t stop hearing about it. It’s in startup pitches, in earnings reports, some companies are creating metaverse divisions, and Mark Zuckerberg changed Facebook’s name to Meta to signal that he’s shifting the entire company to focus on the metaverse.

The problem, very simply, is that no one knows what the metaverse is, what it’s supposed to do, or why anyone should care about it.

Luckily, we have some help. Today, I’m talking to Matthew Ball, who is the author of the new book called The Metaverse: And How It Will Revolutionize Everything. Matthew was the global head of strategy at Amazon Studios. In 2018, he left Amazon to become an analyst and started writing about the metaverse on his blog. He’s been writing about this since way before the hype exploded, and his book aims to be the best resource for understanding the metaverse, which he sees as the next phase of the internet. It’s not just something that you access through a VR headset, though that’s part of it. It’s how you’ll interact with everything. That sort of change is where new companies have opportunities to unseat the old guard.

This episode gets very in the weeds, but it really helped me understand the decisions some companies have made around building digital worlds and the technical challenges and business challenges that are slowing it down — or might even stop it. And, of course, I asked whether any of this is a good idea in the first place because, well, I’m not so sure. But there’s a lot here, so listen, and then you tell me.

Okay, Matthew Ball. Here we go.

Matthew Ball is the managing partner of Epyllion and the author of a new book called The Metaverse: And How It Will Revolutionize Everything. Welcome to Decoder.

Glad to be here.

You are also the proprietor of an excellent Twitter feed about the metaverse. Do you think of Twitter as your primary platform?

I do. It is my most used app. TikTok is creeping up there — and of course my Screen Time doesn’t register Fortnite — but Twitter is definitely my primary channel and where I learn the most.

You have been tweeting about the metaverse for quite some time, and you obviously have a big audience on Twitter. From a media nerd perspective, why turn it into a book?

Thanks for the tee up. I started writing about this fascinating subject in 2018. The term comes from the early ‘90s, but the ideas span back to the ‘30s. This truly century-old idea was finally practical, that is to say, we could start building it and trying to realize it. Over the following years, I got smarter in the area, received more input from other people, and more projects came to bear.

Then suddenly last year it became the word du jour. Not only did Facebook rename themselves, but Google also did a reorg, Amazon started redoing job descriptions, and many of the fastest-growing companies in media tech — Roblox, Unity, Epic — wrapped themselves around the theme. Yet there was very little actually articulating what it is, why it mattered, and what the challenges were.

I was really excited about crystallizing that, distilling my thinking into something more concrete, updating the things that I got wrong, making sure that it was comprehensible, but the most important thing was actually social. Every time we have a platform shift, we have an opportunity to change which companies lead, which philosophies, which business models. I think many people are coming out of the last 15 years dissatisfied with the lack of regulation, the take rates, the role of algorithms, monetization, and which companies lead — and who leads, frankly. The best way to positively affect that outcome was to be informed about what was next. That is the goal.

We have to start at the beginning. There are a couple chapters at the beginning of the book where you talk about that long history and how it has built up to this moment. The third chapter is called “A Definition Finally,” which is great because I feel like the definition of the metaverse really does need that “finally” moment. What is your definition of the metaverse?

I cheat here a little. It is more helpful to describe it similarly to defining the internet as TCP/IP, the internet protocol suite. The description is what is more helpful.

It is a massively scaled and interoperable network of real-time, rendered, 3D virtual worlds that can be experienced synchronously and persistently by an effectively unlimited number of users, each with an individual sense of presence. It has the technologies, capabilities, and standards to support what is essentially a parallel plane of existence that spans all virtual worlds and the physical world. From a human outcome, it means that an ever-growing share of our time, labor, leisure, wealth, happiness, et cetera, will exist in virtual spaces.

One of the key pieces of that definition is “3D virtual worlds.” I have heard other definitions of the metaverse that are a little bit more expansive, that get you to a place where Wordle is the metaverse. We are all doing it together once a day, so we exist in the universe of Wordle, however that universe is defined. You are saying this has to be 3D; it effectively has to be a video game. You get to a place where Fortnite, Roblox, or any number of other massively multiplayer online games is the metaverse. Does that count for you?

It is really a question of “what is” versus “what connects to and is part of” it. My building that I am speaking to you from right now is not the internet, nor really on the internet, yet it is part of the internet in one way, shape, or form. Wordle, of course, is mostly locally run on your device. You would not really call it an internet service, but some of it is delivered.

When you are talking about the metaverse as a new computing platform, for me, 3D is a requirement to do many new things, to elevate human existence — especially in key categories such as healthcare, education, and so forth — but the term really does not matter. What is in and out does not matter. It is likely we never say “metaverse.” In China, they have adopted the term “hyper-digital reality.” We may talk about the 3D internet, or we may just use the term internet. What matters is the real-time rendered element, which basically means the world as it exists is legible and changeable to software, and the advent of graphics compute. It does not need to be a game, it is just an expression.

I understand what you are saying. It is the description that matters, and this word may go out of fashion. Let me just push on that description and definition a little bit.

Right now you can log into Fortnite and run around with a bunch of friends. It is cross-compatible with many different kinds of devices, so it does not matter what hardware you have in your house. You are in a persistent online space where lots of other people are. Are you saying that because Fortnite does not connect to Roblox, it is not the metaverse?

This would be a little bit like asking, “If AOL ran on multiple different devices and a few different networks, is that the internet?” We could say it is, but if you talked about just AOL services in particular, you would be talking about a proprietary platform. You would not be talking about a unified experience that spans into industry with myriad different outputs, servers, or domain registrars.

The metaverse is really describing that unified experience, rather than a single expression, much like we would not say Facebook is “an” internet or “the” internet. When you are talking about Fortnite, there are certainly a bunch of things that do not fit there. It is not actually a persistent experience, and there are very few people who can connect to it at one point. Nominally, there are 100 people in a match, but they use a bunch of cheats so that there are only really 12 people that matter. It also does not connect into anything that isn’t purely game-like and leisure-oriented.

The definition of the internet at its most basic levels is a network of networks. You are connected to the network at university, work, or home, where you can go out and connect to Amazon’s network of servers to browse, then leave Amazon and connect to Facebook’s network of servers to do stuff there. You are saying the metaverse is the same thing as that overarching network of networks; it is the connectivity between multiple, different 3D worlds.

That is right.

What I would push on there is that the internet did not have to be built that way. The AOL example is very interesting, because AOL did not want it to go that way. The value plummeted when AOL went from being a provider of first-party services — like chat rooms, groups, and email — to an ISP that connected you to better versions of those services run by other people.

What is the push for Epic Games or Roblox to enable that connectivity? Historically, the people who own those experiences faced a raft of competition the second they gave them up. They kind of became dumb pipes and disappeared.

Let’s pause for a second. Of course, that was not the necessary outcome for AOL. We know now that no matter how successful AOL might have been in expanding its geographic footprint in connectivity, the largest opportunity for them was in horizontal software and services. There is a world where AIM, AOL Instant Messenger, becomes one of the world’s most significant communication platforms, like WhatsApp or Snapchat. There is a world in which its search engine turns into one of the world’s most dominant ad networks.

Microsoft is a pretty good example of that. They have never had a smaller share of computing devices, hardware, or operating systems, but their horizontal business is far more valuable than ever.

When you are talking about the incentives, first of all, we are already seeing this progress. The Roblox founder and CEO has been talking a lot about their explicit designs for interoperability. They have open-sourced some of their scripting languages, and he is even talking about embracing NFTs to take some projects off of Roblox.

Last week, the Metaverse Standards Forum was established by the Khronos Group — 28 companies such as Qualcomm, Epic, Meta, and others — specifically to solve this problem. Coming together is the easiest part. It is not forcing anyone to make a concession yet, to pick something that they did not advocate for, but is all in service of expanded network effects.

The belief is if consumers can buy 3D objects that can be used in more places, or encounter history that has more persistence and utility, it will grow much like the world economy, did through trade. There were individual instances of compression with some markets, some products, and some countries, which suffered from time to time, but the network was much stronger.

I will say you are right that the internet could have gone a different way, but we did have many competing inter-networking standards. There was a point in time in the early ‘90s where the Department of Commerce and the Department of Defense disagreed and pushed different standards. The idea that Comcast could email IBM, could email Telefónica, could email China Mobile, was really not the consensus. We had the protocol wars, but network effects and utility won out in the end.

The idea of a Metaverse Standards Forum is very funny to me. When covering consumer technology, you come up against standards bodies all the time, and they are hyper political. I would not say that Bluetooth is an example of the tech industry making something great that everyone loves, but it is pervasive in its way. The beginning of a standard and that early energy is great.

At some point, dollars are going to get allocated across whatever the metaverse is, and owning the early access points seems really valuable. This race and amount of hype we are in now, is it really about initiating the customer into whatever the metaverse is, to make sure that every time they buy something in another 3D world, you will get your 30% cut? Or is it, as you were saying at the beginning, that the technical ability to start building an early version of what you might consider the metaverse a net good? Should we just start doing it and see what happens along the way?

I think the latter is more likely, but it is more of an organic process. If you take a look at one idea that we have long believed would have utility, a federated universal identity in digital space, Microsoft has tried that multiple times. The .NET Framework was the last big time they tried, but no one wanted it. It was rarely deployed, for many of the reasons you just mentioned. I do not want to use Microsoft’s account system.

What happened to be the best way to build the de facto standard for identity was Facebook, which started as a college hot-or-not. The best way to build a or the metaverse for Epic was not by trying to build it. It happened to be a battle royale game that was not even intended to be a battle royale. That is to say, this process starts from building something tangential, that is 3D-oriented and social, that connects into another thing. Then you start to get organic alignment around that standard set.

You are right to be skeptical when someone says, “This is the thing, let’s all do it.” It rarely happens that way. It is actually more power-based.

You have described the metaverse as this parallel reality that you can live in and transact in, that will grow an economy that mirrors the world economy, because we will figure out some way to have scarce digital goods. I will come to the blockchain portion of this conversation later, but that is what you are describing.

In science fiction, where the word metaverse comes from, that vision is always dystopian. In the book, you refer to Neal Stephenson’s Snow Crash a lot, and you point out that the metaverse in Snow Crash made life in the real world notably worse. The heart of tension for me is the idea that we will build a parallel world and end up as so many brains transacting on other people’s platforms. I have an instinctive recoil from that which makes me skeptical of the entire enterprise, because I think life in the real world is actually rich and rewarding. I can go out and touch grass, and Apple, Google, Facebook, Epic, or whoever cannot get in the way of me doing that. Fundamentally, what makes this not the dystopia that it is always described as?

I agree with a lot of that and disagree with some of that. The literature for the metaverse in its antecedence is dystopic. One of the important reasons why that is the case is because the point of most fiction is human drama, especially science fiction.

Put another way, utopias tend not to make for much human drama. This is true that when you look at Neuromancer, The Matrix, Ready Player One, or back to the 1930s with Philip K. Dick and Isaac Asimov; these virtual planes of existence are not described favorably. Why? Because even when they are not negative in and of themselves, they lead to some disengagement with reality, and that is the problem. The technology is amoral, the consequences are not.

When you take a look at the real examples to build these things, whether that is multi-user shared hallucinations in the ‘70s, Second Life and other metaverse-style experiences from the ‘90s, or Roblox and Fortnite from the 2000s, the tone is very different. It is not dystopic, it is creation, exploration, identification, and collaboration. Those are all very important.

At the end of the day, I don’t know that scarcity is that important, and this is actually where I think I disagree with many of my peers in the investing community, especially in relation to the blockchain. I don’t really get virtual land, certainly not scarce virtual land. The two brilliant things about the internet are network effects and zero marginal costs. Trying to create a next-version of the internet that constrains networks through money and introduces scarcity that need not be there, for a virtual plane of existence that does not actually need to simulate the real world, I don’t get and frankly don’t believe in.

We have done a lot of interviews with various Web3 folks on the show. I would say some of the themes there echo the themes you have brought up. There are a lot of people who, having built or invested or experienced the last 15 years of the internet, are dissatisfied with where we have landed. Can we build a new kind of internet that more effectively rewards creators and is not just about engagement metrics?

You talk about the metaverse and say, “Okay, I want to have digital goods. I want to buy and sell things here that create a world economy that rivals the real-world economy.” How do you do that without scarcity? Are we going to DRM all the virtual clothes? There is an element here that you need to create some sort of scarcity if your goal is to buy and sell 3D digital objects at a rate of transaction that mirrors the real world.

It is really interesting. This is where we get into a fundamental break between how different believers in the metaverse actually imagine the value. Just as I am not a believer in scarce virtual land that costs thousands if not millions of dollars, there is probably a pretty low ceiling to virtual goods and apparel. They are usually in support of experiences. It is either the experiences that drive the underlying value — as is the case in Fortnite — and not the items per se, or it is what we would consider graphics-based computing or simulation at large.

Let me make that less abstract. Jensen Huang, the founder and CEO of Nvidia, now the seventh-largest company globally, believes that the economy of the metaverse will eventually exceed that of the physical world. We are talking 51 percent, which would be $50 trillion per year in spending right now. He is not at all interested in virtual clothes or leisure. He is talking about real-time 3D simulations running the world’s best development platform, which is the world. A building or infrastructure, where goods flow and why, how you programmatically advertise in 3D space, often for physical things, certainly does not require scarcity of the odd avatar.

Explain that a little more directly. When you say the best development platform in the world is the world itself, do you mean the 3D environment that you are in?

I mean the physical world, the one that you are standing on and exist in right now, which has many of the attributes I mentioned such as persistence, maximum capacity, et cetera. I will supply two examples that are perhaps helpful.

Nvidia redesigned its headquarters with a real-time, rendered 3D simulation to understand every design choice. “What happens when you put a piece of window in one spot, or use one construction material or another? At exactly 3:22 p.m., November 22, what is the climate implication in the conference hall? How do you simulate the flow of energy, of heat, or the refraction of light to drive energy to operate the building?”

We are seeing that premise being used to operate airports in real time. “Do we really want to move the flight from gate 82 to gate 80 because it is close by? Should we actually move it farther away for operational efficacy and safety reasons in case there is a flash flood, fire, or terrorist event?” We are talking about making the entire physical world with an augmented layer on top of it that is legible to software in real-time, impacting production flows in a factory or the flow of people in a facility and so forth.

Connect that to the metaverse for me. This is a concept that is often called digital twins. You build an operational, digital twin of an airport or your office building, and they can proceed down different timelines based on different choices to supply you a sense of what might happen if you make changes in the physical world. Do they interact? If someone is going from the digital twin of your office to the digital twin of the airport, is that where you think the metaverse is?

I think the idea of simulating physical environments more directly, more accurately, is very powerful. The idea that there will be some layer of commerce in those digital twins that is independent of what is happening in the real world seems like the big step.

There are two things to unpack. Number one, digital twins are not the metaverse. If the internet is a network of networks, of different autonomous systems exchanging information consistently under common protocols, then a digital twin is like an office network. It is the Vox ethernet.

It is the interconnection with other digital twins, other simulations, for the exchange of information — your user identity, your payment history, or your avatar if you so choose — that collectively produces the metaverse. In this instance, there is not necessarily any utility or purpose for you, the consumer, to explore the digital twin of the environment you are in.

You might wear augmented glasses in 2037, in which case a version of that digital twin is being overlaid selectively to you, but I don’t agree with the premise that we are going to navigate an airport by putting on a headset or taking out our device.

Are you saying you don’t agree with the premise that there will be pervasive augmented reality?

No, I do. My point is the digital twin, at least foreseeably, is a B2B application, not something that you, the consumer, are going to log into and explore. There is very little practical value right now in you saying, “I want to go navigate MIA, the Miami airport, in a 3D digital twin.” It is not interesting or useful. That does not mean it isn’t super valuable to the operator.

As you describe this, there are a bunch of very hard, technical problems to solve to make this all work. If I build a digital twin on Nvidia’s platform of the airport and someone builds another digital twin on another platform for the office building, it is not just me, the builder of the digital twin, that needs to want to inter-operate.

The platforms need a core capability to inter-operate. If I want to jump from Roblox to Fortnite, those companies have to agree that my avatar can go between the worlds. If I buy a gun in one video game and want to go to another video game where that gun is 100 times more powerful, I might just wreck it for everyone. Some of that is a very difficult technical problem, some of that is cultural, and some of that is straight up business and politics. Have you seen the beginnings of solutions to those problems?

You’re right. Most technology problems are only masquerading as technical problems, and are actually business and/or societal problems, as in “can we agree?”. In the gaming community, I see limited benefit from taking your gun or avatar from one environment to another. That is not to say that there isn’t some utility, particularly with cosmetics with no functional value. It is easier, but at the end of the day, how important is it that I can wear a banana Peely skin in Call of Duty? Probably not that important. The technical impediments, not to mention the commercial and creative ones, are pretty high.

When you take a look at industrial simulation, the utility there is a lot higher and the technical solutions are already in place. You mentioned Nvidia’s Omniverse platform, which is not really a platform in the same sense as Roblox or Minecraft; it is actually more of a middleware simulation DMZ. It is actually where DeSo and Boeing take their simulations and interconnect them, with Nvidia’s machine learning upscaling, downscaling, translating, and then operating that simulation.

There is a lot of work to do if you want to talk about the progress. We do have some standards groups, but there is an old xkcd joke that basically says when there are 14 competing standards and everyone agrees a single universal one is needed, you end up with a 15th standard that no one uses. So I don’t want to be too optimistic there.

What you see with Epic is one potential example. They launched their Epic Online Services, a live services suite where independent game developers can access Epic’s 500-million-account user base with 3.5 billion user connections — and at this point $30 billion in invested avatars and skins. This is just like The New York Times tapping into Facebook’s account system to speed up the user flow. Not to say that they don’t prefer their own account, but they recognize there is utility in getting some information.

You and I can go make a game and then access Epic’s avatar suite and its users, thereby driving smaller developers, who are less endowed technically and financially, to consolidate around Epic’s conventions, its file types, and its engine to tap into its networks.

I feel like we are bouncing back and forth between where the money is now and where the money will be in the future. To some extent, this is making my head spin. You are saying the money in the future is not just avatars, skins, and items. It is some massive B2B market where the real world is being simulated at a level of high fidelity, and some revenue will be created there as different businesses find different things to do for each other. The money right now is very much in Fortnite skins, right? How do you go from one to the other?

I don’t mean to oscillate between the two points. My point is rather that when people express skepticism as to whether or not standards and interoperability can be achieved, it is important to say that progress is happening. We had cross-platform gaming in 2018, we now have common account systems and entitlement systems from Epic, and we have the Omniverse platform for enterprise.

The fundamental tension you are talking about stems from the fact that, for decades, game engines, 3D simulations, have essentially been good enough for leisure and not much else. Unreal, for example, is a non-deterministic physics engine. That means that if you throw a grenade eight times, you might get seven different answers.

It is only recently that the fidelity and sophistication of the simulation, and the investment that Epic has made into vertical solutions, make it practical enough for deployment in healthcare, military, education, and automotive. We are very early on that deployment curve. You need to get it right, then you need people to adopt it and so forth.

That is one of the reasons why we struggle with this odd juxtaposition of talking about the trillion-dollar metaverse economy while turning over and saying, “Right, but we are talking about $200 billion in gaming spend, mostly on cosmetics.”

I just keep coming back to the notion that the metaverse is the inter-connection between these worlds. That is where the value multiplier is. You can build all this stuff as one-offs, and all you have really ended up with is AOL and CompuServe. If you connect those things together and to 100 different networks and servers, then you have multiplied the value of all of it. Everyone rushes into it because it is so compelling that you cannot say no. Suddenly we end up in 2022, and every now and again I’m like, “Maybe we should turn it off.” It eats the world in a way that seems remarkable.

The immediate, compelling use of the internet was obvious to everyone, in the sense that if you wanted to look something up, you could just do it faster. Wikipedia comes into existence and suddenly the Encyclopedia Britannica seems unwieldy, old, and not up to date anymore. The other day I wanted to figure out how to cook something, so I watched a YouTube video and that was the end of it. I knew how to do it and we were off to the races. Where are the compelling, immediate uses of the metaverse that showcase that multiplicative effect, beyond just getting to the Boeing simulation faster?

To start with, I would personally disagree that the utility of the internet was self-evident. I mean, we have the classic Paul Krugman example in 1998.

Well, I am not saying some people weren’t wrong. I’m kidding around when I say that I was just smarter.

No, I agree with you. One of the weird things is that transition point was actually relatively late. Even as late as 1996, there were fewer than 50 million Americans who would use the internet in a month, and most of that use case was pretty frivolous. When I was in high school, Wikipedia was seen as deleterious, that it actually worsened education. I think that is part of it.

What we are seeing here is network effects. I don’t mean to be evasive, but we are talking about combinatorial innovation that is not yet present and therefore remains speculative. Take a look at the world economy, as an example. It is not that having independent nations and industries wasn’t hugely profitable, it was that the utility of all investments of all products in all markets went up.

In the social era, we easily take for granted that anything we create works everywhere. I create text, audio, video and I can take it anywhere. I can take a photo with my iPhone, it stores to iCloud, and I don’t have to say, “Well, darn, now I can’t put it on Facebook.” I can put it on Facebook, right click, save as, upload it to Snapchat, screenshot it on Snapchat, and put it into TikTok.

The utility of global commerce and trade, the utility of having common file formats, is really profound on the internet. It is so hard to create in 3D. Then you have this issue where the thing you want to do in 3D is a different system from your partner. Unity and Unreal actually use different XYZ coordinates, if you can believe it.

It is kind of intuitive at this point to say we have had hundreds of billions of dollars in 3D assets invested, and all of those essentially get deprecated after their first use. That means that we either need to remake them, or we just will never use them. That is part of the premise here.
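The Unity/Unreal mismatch is concrete: Unity is conventionally described as left-handed with Y up and positions in meters, while Unreal is left-handed with Z up, X forward, and positions in centimeters. A minimal sketch of the kind of translation shim interoperability demands; the axis mapping reflects those commonly cited defaults, and the function names here are illustrative, not any engine's actual API:

```python
def unity_to_unreal(x, y, z):
    """Map a Unity position (meters; X right, Y up, Z forward) to
    Unreal's convention (centimeters; X forward, Y right, Z up).
    Both engines are left-handed, so a pure axis remap plus a unit
    scale is enough for positions."""
    return (z * 100.0, x * 100.0, y * 100.0)


def unreal_to_unity(x, y, z):
    """Inverse mapping: Unreal centimeters back to Unity meters."""
    return (y / 100.0, z / 100.0, x / 100.0)
```

Even this toy shim only covers positions; rotations, asset handedness, material units, and file formats each need their own translation layer, which is part of why middleware exists at all.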

I will give you a counterexample. Emoji is a big standard. It is run by a consortium, but it is rendered differently by every phone, by every platform. So the smiley face emoji…

I am an Android guy, I know it well.

It’s the grimace emoji, right? On Samsung phones, for a long time, it looked like it was smiling. Samsung owners were sending people grimaces when they meant they were smiles or vice versa.

You have a 3D file format, and everyone has agreed, “Okay, this is the one.” How do you make sure it is rendered across all these systems? Over time, will Samsung have to realize, “A lot of people are confused by our emoji, we should come together with Apple and make sure they look the same”? Google had to go from blobs to faces, which was very controversial in the virtual world, I will point out.

I like the blobs.

People love the blobs, and Google got rid of them because Apple is dominant and they needed to conform to what Apple emoji looked like. Do you see that playing out with 3D objects? Will an outfit or briefcase in Fortnite eventually come to dominate what it looks like everywhere else?

The example with emoji is a good one. It shows how slow-moving standards bodies, even when they are successful, end up being steered by their strongest participants. Those participants are not overtly saying, “Here is what the standard should be,” but they drive all of the other members along. That actually helps with standardization.

When you are talking about 3D objects, there is a large contingent who believe that the consumer-facing 3D objects are less important. Bringing your briefcase from one environment to another is less important than having the environment itself be useful or repurposable for more developers. As an example, take the investment that Disney has made into Hoth, and make that into a virtual biking course used by Peloton, a dating simulation on Tinder, or a theme park in Fortnite. That is probably more useful.

When it comes to your question of visual cohesion, it is not just a question of how you want to express it. What dimensions do you need? What pixel density do you have? The technology for machine learning, particularly from Intel, to up- and downscale is pretty strong. You can take a 2D object and 3D-ify it. You can say The Verge makes virtual shoes that don’t separate between the sole and the fabric, but our system can actually separate the two for different designs. A lot of that is going to be interpretive software that takes what is not standardized and modifies it.

I feel like this is beginning to unlock for me in an important way. Unreal has moved into Hollywood, and it has moved into cars. You see this graphical engine appear in more and more places where graphics need to be rendered. So The Mandalorian renders the background on giant LED monitors behind the actors in Unreal, and now that same virtual world is available for Peloton to say, “We are going to bike through this environment.” Is that somehow an open platform for that kind of development?

You are quite right. Let me frame it a slightly different way. Entertainment is such a good example. Disney will spend $100 million producing backdrops in virtual environments for a film. Those are essentially all deprecated. They are increasingly used for the next film, but that is about it. What does that mean?

Well, if Peloton wants to build a Star Wars biking sim, they need to build it all. The business case might not be there. In addition, Disney might say, “Well, we have to make the thing, then we have to brand approve the thing, so we need to charge a lot.” So a lot of this does not happen. Once you start to standardize these 3D assets, you start to say, “We have made this investment and now we can use it wherever we want, or at least more extensively without building it anew.”

You take that from consumer leisure to, “Well, Ford has dimensionalized its next Ford Escape, so now we can simulate it in other enterprise environments, such as a car park for parking simulations.” A Hummer vehicle can use its lidar sensors to map the local area, then you can pre-drive that environment, like you would in a video game, to make sure that you can make the path. Making all of this information more repurposable starts to have extreme combinatorial effects, either by making new creations easier or cheaper.

Who controls the access and the connections between those things in your view of the metaverse? That seems like a very powerful vision, but then I start to pull the thread. If Disney has rendered out the world of The Mandalorian, I’m like, “I want to make print versions of The Verge for The Mandalorian.” I can imagine all these things we could do, but it feels like I still have to go get permission. The asset may be cheaper, but over time, content creation gets cheaper and cheaper anyway. Where does the technical part of availability come from? That seems like the hardest problem that we have been talking about.

There is no simple answer. These environments are managed centrally, and their permissions are going to be managed deliberately to start. If we have learned anything from the era of Shutterstock, TurboSquid, Quixel, or 3D asset databases, it is that the most valuable stuff, the IP, is not easily or cheaply licensed.

This is where we get into one of those fundamental questions of decentralization versus centralization. There are good arguments to be made that the last 15 years were too centralized, because the internet protocol suite has too little in it. We can get into that one way or another, but there are many forms of centralization that have nothing to do with technology per se. Revenue leads to greater investment and better products. IP centralizes or drives habit and retention. Brand keeps people inside of a system that they trust more than another.

This is the case even if you believe that the metaverse is a big, disruptive, next-generation internet, or if you believe in the wide deployment of blockchain and Web3 to democratize more of the stack. OpenSea is a great example of how we may still end up with no technical barriers to switching, but enormous habit and brand-based, or IP-based, stickiness to a few.

I feel like we have arrived at the Web3 portion of the conversation, so let’s talk about it. The ideas are in parallel, right? The amount of Web3 hype that has happened over the past 18 months is right next to the amount of metaverse hype. It feels like everybody wants to conflate them for some reason. Certainly, it is trendy in the business world to conflate them, to juice your stock price in some insane way.

They are not necessarily connected, but it does feel like the game of, “What are some use cases for Web3?” is best answered by, “There will be scarce digital objects in the metaverse.” There is a connection there. The open, technical questions of how these 3D worlds might work and how you might transact in them are actually answered by the blockchain, by Web3 technologies. Do you see that connection as directly? Do you think it is just a quirk of timing? Do you think there are other possible solutions?

I think that there are a few different things that we can unpack here. First and foremost, I and others, like Mark Zuckerberg and Tim Sweeney, describe the metaverse as a successor state, or quasi-successor, to today’s internet. Web3 is so named because it succeeds Web2. If both things come after the current thing, it makes sense that you have conflation.

In addition, there is a good reason to believe that the philosophies at minimum, or perhaps the technology at maximum, of blockchain are essential or important to the metaverse. Which is to say, property rights are probably going to be important, as they are to most economies. And the ability to tap into decentralized, wide networks of contributors for extra GPU cycles, broadband, or just time and assets (which are currently hard to accumulate from individuals; Patreon only scales so much) is a good reason to believe blockchain matters to a thriving metaverse, one that we want rather than one that is just technically possible.

I understand why the two are conflated, but I would say that they are separate. When you are talking about a good technological solution, when you talk about interoperability, you need a standard. You need someone to effectively take custody of an object and you need everyone to agree that they trust it.

The big problem that we have right now is EA and Activision do not have a good system to exchange anything. They certainly do not want to use one another’s new thing, should it exist. When other aggregators like Steam have tried in the past, no one opts in because the platform is already powerful enough.

Irrespective of whether or not blockchains are actually the ideal solution, they clearly have some revenue attached, speculative or not. They are producing a wide collection of different deployed solutions. At the end of the day, it is not always important whether something is perfect, so much as whether or not everyone uses it. The GIF file format is awful. We have known that for decades and yet everyone uses it, and so that ends up being the thing. That to me is part of the case.

One of the very hard problems with all of this is the amount of compute that is required. We are going to render a bunch of persistent virtual worlds that have unlimited maximum capacity, then potentially we are going to run blockchains to manage scarce digital goods inside those virtual worlds. That is a lot of compute; it is more compute than we have right now. Do you see that coming down because of Moore’s Law? Is TSMC going to figure out the next process node and we are just going to get there? Is it an agglomeration of other kinds of compute? Who builds this stuff? Where does it come from?

There are four dominant theories here. One is just Moore’s Law, slowing or not, continues to improve, and as part of that we get better at compression. We start to prune out the inelegant data formats and architectures, just like moving off of GIF to MP4 for lighter performance.

The second school is really organized around more efficient resourcing. This is the cloud argument. There are problems with it, but the argument would basically be that it is kind of stupid that we put the most intensive computing at the individual user, whose device has to be affordable, lightweight, and replaced every two to three years, versus the power plant approach of saying, “No one should have a generator in their home. We should deliver it from industrial scale.”

Then third are the bigger punts. There is a large contingent of people, Intel or TSMC, who are starting to believe that quantum computing — another idea that has long been considered fanciful — is no longer a crazy thing to believe in and ends up being essential.

The last and the most fun is decentralized computing, not necessarily in the blockchain sense, but in the solar panel sense. I am sitting talking to you right now. I have two consoles with incredible GPUs both sitting unused. There may be someone in my building right now who could use that. Right now they either do not have it, or they need to rent it from a data center that is expensive and far away, thus producing latency. Could a model, on blockchain or not, be a more effective system of renting out excess capacity, like a solar panel, or like Elon imagines Teslas will do as self-driving cars?

I love that idea and have heard variations of it for a decade now. I used to run SETI@home on the computers at the college computer lab that I managed.

It is in my book. It is so fun.

It’s all right there. We have been chasing it for a minute. That requires that your personal power bill might go up and down in ways that you cannot predict. Your bandwidth might get strained in a way that you cannot predict. It would be sad if right now our call was diminished in quality because someone was running the GPU in your PS5 at 100 percent.

On top of that, at least in this country, the bandwidth required to do that is actually not evenly or equitably distributed. Some people have really fast connections, and many people have bad connections. There is virtually no competition for those connections whatsoever. You can make that bet, but you think about how it would play out in practice and it just feels like a lot of people will be selfish, first of all. That seems like a thing you can count on. Then second, the infrastructure to actually pull that off does not really exist.

I agree with you. I characterize it as the fun one because it remains the elusive one, just like when we talk about peer-to-peer servers for multiplayer games. It is a fun idea, but no one has figured out how to do it.

There are some technical solutions, of course, one of which could be that you do not necessarily need to congest the neighborhood if you geographically constrain who your GPUs are available to. You can also have different bidding. One of the problems I talk about in the book is the fact that we actually have very poor systems in TCP/IP to manage the prioritization of traffic once it leaves our network. I am not talking about paid peering or net neutrality, but literally the ability to differentiate if it needs to be there in 10 milliseconds or 50 milliseconds.
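The marking mechanism itself does exist at the IP layer: the DSCP field lets an application label its packets as latency-sensitive. A minimal sketch using a real, standard socket option; the limitation described above is exactly that nothing obliges networks beyond your own to honor the marking:

```python
import socket

# DSCP "Expedited Forwarding" (EF) is codepoint 46. It occupies the top
# six bits of the legacy IP TOS byte, so the byte value is 46 << 2.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2  # 184, i.e. 0xB8

# Mark all outgoing packets from this UDP socket as EF traffic.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# The local kernel will now tag the packets, but there is no end-to-end
# contract: any router along the path may re-mark or ignore the field.
```

This is why the marking only reliably matters inside a network you control, such as a corporate LAN with QoS policies configured.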

These are actually more fundamental issues. We do not have an effective way to split GPUs. It is not like you can say, “I need 80 percent of it but the remaining 20 percent can go.” I will say that there are some systems for this that are being deployed. J.J. Abrams and Ari Emanuel are on the board of a company called Otoy. They have a blockchain-based system called the Render Network, and it is designed to do exactly that.

An architectural firm that perhaps does not need its high-end GPUs overnight can rent those out on a bid-ask, blockchain-based system, and Hollywood studios do use them. This is not the expectation of every single person’s idle devices being used minute to minute, but we are starting to see it work on a more regular basis for industrial use cases with high-end, low-supply hardware. I put this in the “if you woke up in 2045, it might be answered” bucket.

Let’s wrap up here by talking about the companies that are building this stuff now and where they are. You run an ETF called Meta that invests in various metaverse companies, and you obviously pay very close attention. Let’s start with the obvious candidate here, Meta.

In your book, you call it Facebook, because it is too confusing to call Meta “Meta” in a book with a metaverse, which I appreciate. Facebook obviously rebranded itself to Meta, Zuckerberg is all in on this pivot to the metaverse. In VR headsets at least, they are the market leader; the Quest 2 is a really good consumer product. Though I do not know if it is a metaverse product, since it is a pretty closed system. But they are ahead. How do you think they are doing and where do you think they go next?

I would say that the Oculus device is actually pretty open. They support side loading, and they do not require a central identity system. You can use alternative payment solutions for side-loaded apps, which is not even side loading, it is just not app store direct. The Oculus is unique in that it is effectively the only mainstream console that uses open-standard rendering APIs: WebGL, OpenGL, WebXR. Those are pretty significant. No one else did it. PlayStation 3 did, but PlayStation has never done it since.

The truth is, if you were to talk about number of users, amount of spend, number of developers, amount of developer profits, and cultural impact, they are frankly nowhere near leaders like Roblox, Fortnite, Minecraft, or Unity on the B2B side. They also have a much harder path to doing that.

One of the challenges Facebook has in particular is that the economy is slowing down. Apple’s ad changes have had huge effects on Facebook’s revenue. They are trying to manage this big pivot and this bet on the future. People might buy fewer Quest consoles, and they are investing less in future hardware. Do you think they are going to be able to make it through?

The ATT (App Tracking Transparency) shift from Apple is particularly brutal. The estimated cost of that is $10 billion in operating cash flow in 2022. That happens to be exactly what Facebook Reality Labs was spending on their many projects, the various XR devices, the wearables, their operating system, and the Horizon Worlds platform. Anyone finding out that they are going to have $10 billion less in cash flow is going to have to trim budgets, especially in special projects with limited revenue and probably a negative 80 percent gross margin overall.

I think the biggest challenge — one that Mark has consistently underestimated, it seems — is that the timeline for those new devices, that would allow him to get out from the hegemony of Apple and Google, is probably farther out than was ever imagined.

2015 was the first time Mark said publicly that they imagined by the end of the decade, last decade, wearable headsets would replace the smartphone. They have reiterated that this decade, but as you and your colleagues have reported, they have now delayed the first edition three times. We may not see consumer AR hardware until 2025 or 2026, and he has called it the hardest technological challenge of our era, putting a supercomputer into lightweight wearables.

If that is their biggest opportunity to have hardware, to have their own operating system, and they are already sitting behind when it comes to what I call integrated virtual world platforms — Horizon versus Roblox or Fortnite Creative Mode — and they are simultaneously experiencing decline, not necessarily secular, of the core business, the timing starts to feel tight.

You said that it is the hardest technological challenge. I always think about it as a stack of problems, especially for AR glasses. You need a camera that can see the world around you in sufficient fidelity. That has to go to a processor that can interpret that data and spit out something good to put over top of it to augment reality. You need a battery that can power that processor and that camera. You almost certainly need persistent connectivity. Then most importantly, you need a display solution that actually works, which does not exist yet. Do you think Facebook is on the road to solving any or all of those problems?

I would add two more problems. It has to actually fit and weigh little enough that you are comfortable wearing it, and it has to not melt your face while you do it. Every single thing that you just mentioned trades off with one another. If you want another two sensors that are good for the UI, that drains the battery and the GPU power, increasing the cost and the form factor, generating more heat.

Put another way, we take for granted that today’s most computationally powerful consumer devices, consoles, really just need to manage for a few constraints. The size, not really; the new PlayStations are four times bigger than the first PlayStation. They do not need to manage the battery, as they have constant access to power. They can put fans in there so that the overheating problem is not that bad. And they know that the build of materials has to cost between $400 and $700. When you are talking about these devices, you have several new problems: size, heat, you cannot have a fan, you need battery power, and the GPUs are smaller. All of the other things get harder despite that.

We see that Facebook is investing in its own semis and you are right, it’s the stack. All of these things need to be solved. We know that Apple is planning up to 12 or 14 cameras. I think the current Oculus has 6. Well, maybe you do need 12 or 14. Every time you put another pair in there, you are going to find that the GPU you thought was going to power experience X just cannot. It is incredibly hard.

I think that set of challenges is very difficult for Facebook. When we talk about hardware, we have to go to Apple next, which is very good at hardware. They are very good at performance chips that run a long time on batteries. There are lots of rumors about Apple’s headset out there.

But they are pretty bad at ecosystems and playing nice with others, and with interoperability. As you have mentioned with their ad tracking stuff, they are pretty good at locking things down. They are good at preventing innovation from taking place; game streaming does not exist the way it could because Apple will not allow it on their platforms. OpenSea cannot transact NFTs because they would have to pay Apple a 30 percent cut. How do you think Apple is doing?

One thing that is fun to put on the side of this is that six days before Epic Games sued Apple, Tim Sweeney, the founder and CEO, tweeted out that Apple had outlawed the metaverse. His point was exactly the cloud gaming one. I cite The Verge a few times in there with these fun quotes that basically say, “Arguing about what Apple does or does not allow is irrelevant because they can change the rules any time they want.”

The Apple constraint here is really profound. They have incredible hard, soft, and often accidental power, and they do work hard to prevent many standards and solutions coming into place.

You just teed up my favorite example, which is what happens with NFTs. Let’s keep in mind that they allow you to buy fungible tokens, ETH, on Coinbase, but you cannot buy a non-fungible token, an NFT, on Coinbase. If you choose to fractionalize an NFT into a billion fungible tokens — you could actually increase it so that there are more fractionalized tokens than there are Bitcoin tokens — that is still not allowed, even though you might own one trillionth of an NFT.

This just reflects the extent to which they are contending with not just business model disruption, but control of their own ecosystem. Outlawing is not wrong, but I think we will see how that turns out. When it comes to new hardware, it is obvious. If AR and VR are going to be things, Apple will be at least a player, but it is more likely that they have the most performant, best-looking, lightest-weight, and preferred early editions. The advantages there, especially at scale and cost — development cost or production cost — are simple.

In the book you have a section about how the metaverse need not actually take place in headsets. It could be expressed in all kinds of ways. As we talk about these companies, their metaverse bets are very much headsets.

Facebook wants to be first to headsets at scale, because then they can just leave the iPhone and the complications of Apple’s platform behind. Apple does not want to have the iPhone disrupted, so they are racing towards a headset. I think Tim Cook wants to ship the AR headset as his last big reveal before he moves on in 10 years. Right now, to do a non-headset metaverse, you are kind of stuck behind whatever Apple will allow, because they are the most pervasive computing platform that exists.

That is quite right.

Is there a way around that? Do we just hope Amy Klobuchar can find the votes for her antitrust bill, or is there a business model or industry solution that solves that?

This is where we get into some of the interesting answers. Is there a way around it? Are there alternatives? Yes and no.

Cloud gaming is a potential answer, but we should keep in mind exactly how many ways Apple stymies them. It probably works 95 percent of the time for 40 percent of users. That is not a good technical solution for a social platform, but it can work. Doing it from the browser is not a great experience. Apple, for security reasons, valid and not valid, also constrains your ability to send notifications. That is not great if I am trying to tell you to log onto Fortnite. First of all, you cannot have an app, and secondly, you don’t ever get the notification.

The other way to do it is browser-based rendering, but Apple has historically constrained WebGL, so the non-application alternative of using a browser, what they call the open web, doesn’t really work. The way in which Apple constrains WebGL is that Safari does not support it comprehensively, whereas I can download Chrome for iOS and I am really just using the Chrome wrapper on the Safari engine. Their technical decisions for Safari mean what Google can and cannot do is inherited, and the App Store’s hegemony over software means that I cannot download true Chrome.

We are finding out this is why Tim sued Apple. He says that Apple has outlawed the metaverse rather than gotten in its way. A properly motivated Apple can effectively stymie most things. There is a reason why Web3 games are either based on non-real-time collecting and trading, or really primitive browser-based games, like Axie Infinity visually. You cannot pull off complex rendering without most of WebGL or a native app, and Apple will not allow it.

You mention the open web, which means we should talk about Google next. Google is Google. They have multiple competing projects. They have just restructured some things, and they have announced some little things. Are they a player?

That is a great question. Google has spent quite some time focused here. Google Glass was a famous disaster, but they have since released another two versions of Google Glass, or enterprise editions. They made a billion-dollar acquisition last year and a $200 million acquisition the year before. Clay Bavor, an SVP who has been in charge of essentially all special projects plus AR and VR for some years, was realigned to report directly to Sundar.

It is clear that they are focused here. The problems have always been that their software is never considered best for consumer applications, their hardware has never really taken off, and their efforts in gaming have barely been funded. Many of their best potential plays, Niantic and others, were divested, spun off, or ceded to competitors.

If Android is and remains the most used ecosystem globally — it is the second highest revenue-generating games platform globally — they are likely to benefit, but the big opportunities with new hardware, a virtual world platform, or managing the standards, all seem tough. Even when you take a look at Google Cloud, it is estimated to be losing $5 or $6 billion per year. AWS has more profit than Google Cloud does in revenue. Even with the tangential argument that increased computing power is going to be good for Google, their business currently loses money every time a new server gets stood up. They are harder to see.

You mentioned AWS, so let’s keep going down the list. Amazon has some pretensions here, in the sense that they have a big hardware division that invents a bunch of stuff all the time. They have the most pervasive voice assistant, which I think is an interesting side light into the idea of a secondary world that you can interact with in different ways. Are they a player? Do you see them making an investment?

I would guess that they are number one in virtual assistant hardware, but I would also guess that Siri and Google Assistant are the most-used virtual assistants. They have the other benefit of having the device everywhere; mobile is better tailored.

Amazon is really interesting. The computing and data center business is going to be an extraordinary beneficiary. How much that moves into value-added services in machine learning and others has yet to be known. Snowflake is a good example of other companies building value-added services on top of the pure racks.

The bigger challenge is one I find really interesting. Amazon has spent a lot of time focused on more traditional media categories than it has on gaming or interactive, even though the latter seems a lot closer to their core business on the AWS side, and their success rate has been mixed. Jason Schreier at Bloomberg has estimated billions were spent on Lumberyard, their game engine. That was given over to the Linux Foundation earlier this year. Luna, their cloud gaming service, seems to have had less of an impact than Google Stadia did.

That is a very quiet burn. I just want to put that out there.

There is a good question of whether or not it is a quiet burn because they have been a lot quieter as well. Part of the problem that doomed Stadia was much bigger and more public ambitions, and much greater out-of-the-gate spend. Amazon is best in the world at the slow burn strategy and they remain committed to it, though I have not seen any big leaps.

While Amazon Game Studio has had some success with New World and others more recently, it is operating as the publisher. They are not developing the titles themselves, and they are not using AWS in an innovative or new way. As you take a look at Amazon’s interactive business, they have rewritten many job descriptions to focus on the metaverse in name. They are a big proponent of the Unreal ecosystem. They are trying to advance certain standards. But externally a lot of it still feels like more potential and conjecture than it is, as yet, a product.

I want to ask about two more here. Microsoft CEO Satya Nadella has said the metaverse is already here, so he is buying Activision and the Xbox seems to be growing. They just keep buying everything, but they do not have great hardware. The HoloLens is not a huge success; they just shuffled that team and fired Alex Kipman, who was in charge of the HoloLens. Are they on track, or are they just going to be a horizontal software provider, which has been an enormously successful strategy for them as you pointed out?

I talk about this quite a bit in the book. There is this fascinating aspect in which the company has absolutely thrived under Satya by becoming horizontal, shedding the stack requirement and rich vertical integration.

But when Satya took over, the games business was being called on for divestment. Yet the first acquisition he did was of Minecraft. He did something really unique at the time; he committed to keeping it fully horizontal, available on all platforms, not exclusive to Xbox, and keeping it agnostic to the end point, not even preferring Xbox hardware.

It was about five or six years before he did another large acquisition, that of LinkedIn. Then you have Activision Blizzard, the most expensive big-tech acquisition in history, at $75 billion. In the last line of the opening paragraph of the announcement, he says, “It is for the foundations of the metaverse.”

In many ways, Minecraft presaged everything that he was going to do with the strategy at large, and they have been very focused here. The number of different pieces they have is actually really exciting. I talk about Microsoft Flight Simulator as perhaps the most technically impressive consumer-deployed, persistent live digital twin or metaverse-style experience that any of us can do.

This is a company where, putting aside the fact they were public about the metaverse before Facebook was, it feels like execution of bringing the pieces together — which is the same for Google and Amazon, but less clear — could be extraordinary for them. I think that is why you have always seen this commitment, and why he is so quick to bet FTC scrutiny, DOJ scrutiny, and $75 billion to build it.

I could keep doing companies forever. It’s a fun game, but I want to actually end on the regulatory scrutiny piece.

This space is unregulated, in a way that if you make the comparison to the early internet, it is very different. The early internet was a government project. There was the idea that we would keep regulators away from it. Even that decision to keep regulators away is, itself, a regulatory decision, and then you had all of the public investment into the internet around the world.

That is not happening here, right? This is all a purely private company kind of investment. Regulators seem like they have no idea what to do here, in the same way that even regulators have no idea what to do with crypto, but they have a lot of ideas. Here it is just silence. Where do you think that comes into play? Where do you think the government comes into play here with the metaverse?

The interesting thing about regulators leaving their hands off the internet is, of course, that the internet came from government. Many of its foundational bodies, such as the Internet Engineering Task Force that stewards most of TCP/IP, were developed by the DOD and then relinquished, but are still strongly influenced by government. One of the reasons why governments left it alone was that there were pretty strong and important self-regulating bodies, which they had helped to create, working together effectively.

You are right that we do not see this here, but I actually think it is changing pretty quickly. Yesterday the EU released their Think Tank’s Policy Memorandum. The chief negotiator of the EU for the Digital Services Act has been very critical and very vocal about what they need. The South Korean government has established the South Korean Metaverse Alliance, an effectively required body that is also effectively mandating national standards.

Their perspective seems to be that the standards group will force things that many do not want and are individually disadvantaged by, but to the national benefit. Of course in China, which is a whole other issue, I do not think it is a coincidence that just after Tencent unveiled its Hyper Digital Reality vision — which is their essential trademark for the metaverse — they began the biggest ever crackdown of the space.

I think the US is probably the furthest behind, in at least formal recommendations. I think that in many territories — Southeast Asia, China, and the EU — governments seem very focused on this now in a way that surprises and inspires me. The fact that it coincides with regulation designed to fix the problems of the past 15 years raises the specter of accidental damage to an area that does not really exist yet. I am more hopeful that it actually sets us on a clearer path, rather than 15 years of catch-up.

Let’s end with a look to the future. I think one of the things that you and I would both agree on is that this is not going to be a light switch. The metaverse is not going to just turn on one day; it is going to happen to us slowly over time. I am curious. In that big picture, what is the sign post for you that the metaverse is more likely than not, or that it has arrived in a real way? What would be the indicator for you?

The indicator that I would pay attention to is the early demographic transition. Seventy-five percent of those ages 9 to 12 in most Western markets use Roblox, and just Roblox, on a regular basis. That is not to say that they do not use other things. We know fundamentally that Gen Y games more than Gen X, Gen Z more than Gen Y, and Gen A more than Gen Z, and that trend is not turning around.

I think the big things that I am getting excited about are the industrial applications, the deployment in what we call AEC: architecture, engineering, and construction. The challenge with those is that lead times are long. You have to convince businesses to use new technology to solve problems they are not used to solving. They have to then deploy them, and they have to get good at using them. They need to start to share with the city and with other partners.

Once we actually find a way to make development of the real world more productive, to live-operate businesses and infrastructure together — which can be as simple as lighting systems in a smart city with proper civil engineering — that is what gets exciting to me.

Matt, this has been incredible. I could keep going for another hour. Thank you so much for being on Decoder.

Thank you.

Tue, 19 Jul 2022 | The Verge: https://www.theverge.com/23269170/what-is-the-metaverse-matthew-ball-interview-decoder-podcast
Informatica Inc. (INFA) CEO Amit Walia on Q2 2022 Results - Earnings Call Transcript

Informatica Inc. (NYSE:INFA) Q2 2022 Earnings Conference Call July 28, 2022 4:30 PM ET

Company Participants

Victoria Hyde-Dunn - Vice President of Investor Relations

Amit Walia - Chief Executive Officer

Eric Brown - Executive Vice President & Chief Financial Officer

Conference Call Participants

Matt Hedberg - RBC Capital Markets

Pinjalim Bora - JPMorgan

Strecker Backe - Wolfe Research

Koji Ikeda - Bank of America

Tyler Radke - Citi

Fred Havemeyer - Macquarie

Andrew Nowinski - Wells Fargo


Operator

Good afternoon, everyone, and welcome to Informatica's Fiscal Q2 2022 Earnings Conference Call. My name is Tia, and I will be your event specialist today. After the speakers' prepared remarks, there will be a question-and-answer session. Thank you.

I would now like to introduce our host, Victoria Hyde-Dunn, Vice President, Investor Relations. You may proceed.

Victoria Hyde-Dunn

Thank you. Good afternoon and thank you for joining us to review Informatica's second quarter 2022 earnings results. Joining me on today's call are Amit Walia, Chief Executive Officer; and Eric Brown, Chief Financial Officer.

Before we begin, we have a couple of reminders. Our earnings press release and slide presentation are available on our Investor Relations website at investors.informatica.com. Our prepared remarks will be posted on the IR website after the conference call concludes. During the call, we will be making comments of a forward-looking nature. Actual results may differ materially from those expressed or implied as a result of various risks and uncertainties. For more information about some of these risks, please review the company's SEC filings, including the section titled Risk Factors included in our most recent 10-Q and our 10-K filing for the full year 2021. These forward-looking statements are based on information as of today, and we assume no obligation to publicly update or revise our forward-looking statements, except as required by law. Additionally, we will be discussing certain non-GAAP financial measures. These non-GAAP financial measures are in addition to and not a substitute for measures of financial performance prepared in accordance with GAAP. A reconciliation of these items to the nearest U.S. GAAP measure can be found in this afternoon’s press release and our slide presentation available on Informatica’s Investor Relations website.

It is my pleasure to turn the call over to Amit.

Amit Walia

Well, thank you, Victoria, and good afternoon, everyone, and thank you for joining us today. But let me begin by saying that we are very pleased to deliver Q2 results that exceeded the high end of our guidance. Total revenue growth was 9% year-over-year with subscription annual recurring revenue growth being 31% year-over-year and cloud ARR growth being 42% year-over-year. We strengthened our cash position and beat the high end of guidance for non-GAAP operating income.

Our IDMC platform is the growth engine for new and existing enterprise customers running mission-critical workloads, and we continue to observe the expected mix shift from self-managed to cloud. And importantly, we are on track to deliver $1 billion in subscription ARR by the end of this year, a milestone few software companies can achieve. We are also reiterating our full year 2022 guidance for all ARR metrics and non-GAAP operating income. We are, however, slightly lowering total revenue guidance to reflect foreign exchange headwinds. Now let me share business insights on the second quarter and then observations for the second half of the year before I hand over the call to Eric to recap Q2 financial results and provide full year and Q3 guidance.

Now, I have previously talked about how we have prioritized our R&D investments to accelerate cloud-first workloads through product innovation and strategic partnerships. In May this year, we hosted our annual customer conference called Informatica World. Our theme was data is your platform. Thousands of customers attended in person and virtually, including strong engagement with executive levels and a broad range of user personas from around the globe. We unveiled many industry-leading new data management capabilities to help customers and strategic partners across all levels, functions and IT realize greater business value out of their data.

To provide further detail and context, I'll frame my comments today around three strategic priorities and our investment focus. I'll begin with product innovation; then I'll go to strategic partnership expansion; and finally, the go-to-market.

So, let me begin with product innovation. We have been accelerating our pace of innovation to meet our customers' needs to drive digital transformation and build their intelligent data enterprise across four distinct journeys. Let me begin with the first journey, analytics, where we are democratizing and simplifying data engineering workload execution. We launched a new product called Data Loader to simplify data management for departmental users. Our Data Loader is a no-cost, zero-code, zero-DevOps and zero-infrastructure SaaS offering that will help departmental users across an organization move from data to insights in minutes. Data Loader's simple 3-click experience is now available for Google BigQuery, Snowflake and Databricks.

We also announced a private preview of INFACore, a simple plug-in for any development and data science framework, which simplifies composing data pipelines by turning thousands of lines of code into a single function, allowing users to consume, transform and prepare data from any source within their integrated development environment.

Now turning to our second customer journey, MDM and Business 360 apps. We are accelerating our investment in prebuilt Business 360 apps that enable customers to easily rationalize, combine and share customer, provider and product data from hundreds of data sources into a single version of the truth and drive business insights. In that context, we expanded a long-standing collaboration with Microsoft Azure and announced a software-as-a-service version of our multidomain Master Data Management for Microsoft Azure. Informatica's SaaS version of MDM on Azure uses AI and ML to help customers create a data foundation that provides a golden record of truth spanning overlapping, conflicting and related data across customers, suppliers and products. Informatica's SaaS MDM will be generally available for purchase from the Azure Marketplace in August.

With the addition of this multi-tenant native MDM, we have now completed our product road map, with all products on our IDMC platform available as SaaS multi-tenant offerings. I'm excited about that. We also expanded our cloud-native, multi-tenant MDM with two additional purpose-built applications: Provider 360, to speed up the onboarding of suppliers, improve collaboration and reduce risk; and Product 360, to efficiently acquire, manage and publish relevant, clustered, enriched product data.

Now turning to the third customer journey, data governance and data privacy, where we are enabling predictive data intelligence in the cloud with integrated governance, catalog, data quality and data marketplace capabilities powered by broad and deep cloud-native metadata intelligence, empowering data users of all skills to find, understand, trust and access the data needed for all use cases. We expanded data governance capabilities with Microsoft's Power BI. We also announced the expansion of our partnership with Snowflake to collaborate on deeper integration between Snowflake and Informatica's cloud data governance and catalog service.

We continued expansion of our scanners with even deeper penetration into Salesforce, SAP and Microsoft Azure ecosystems. We added new intelligent capabilities on our data quality suite for anomaly detection, which automatically highlights potential data quality issues that are very hard to detect for users. And our automated data classifications delivered out of the box have nearly doubled, enabling our customers to reliably identify even more critical data elements related to PII and other domains.

And lastly, for our fourth customer journey, app integration and hyperautomation, where we are integrating and connecting apps to automate end-to-end business processes. Within that, we announced a brand-new API Center as a one-stop shop to create, deploy, monitor, replicate and retire APIs. It provides a single integrated view of all APIs within an enterprise to drive productivity, transparency and usability. The API Center can also auto generate data APIs in minutes that deliver integrated, trusted and governed data along with the business process automation that is simple, fast, secure and more dependable by leveraging Informatica's API database.

It is through our IDMC platform that we enable organizations to treat data as their platform to address these mission-critical workloads. And to give you some more context, the breadth of our IDMC platform remains unparalleled and provides a suite of 7 best-in-breed solutions that are powered by CLAIRE, our AI engine, with over 50,000 metadata-aware connections and leveraging 11 terabytes of active metadata in the cloud. IDMC is delivering mission-critical solutions that serve an ever-increasing base of global customers and operates at a significant scale, processing 38.5 trillion cloud transactions per month as of June 2022, an increase of 77% year-over-year and approximately 20% sequentially.
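As a quick cross-check of those growth figures, the quoted rates imply the following prior-period volumes. This is a back-of-the-envelope sketch; the back-solved estimates are mine, not reported numbers.

```python
# Implied prior volumes from the quoted growth rates: 38.5 trillion monthly
# cloud transactions as of June 2022, up 77% year over year and roughly 20%
# sequentially. These are rough back-solved estimates, not reported figures.

current = 38.5e12                 # transactions per month, June 2022
year_ago = current / 1.77         # implied volume in June 2021 (~21.8T)
prior_quarter = current / 1.20    # implied volume a quarter earlier (~32.1T)

print(f"{year_ago / 1e12:.1f}T a year ago, {prior_quarter / 1e12:.1f}T last quarter")
```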

Now let me turn to our next priority, where we are striving to make Informatica the easiest to do business with and to win together with our partners as we are being the Switzerland of data within the enterprise ecosystem. Now I talked about Informatica World. Informatica World featured marquee customer -- marquee speaker participation from all of our strategic ecosystem partners, including Thomas Kurian, CEO of Google Cloud; Scott Guthrie, EVP, Cloud + AI Group from Azure; Andy Mendelsohn, EVP at Oracle; Matt Garman, SVP, Sales and Marketing from AWS; Christian Kleinerman, SVP, Products at Snowflake; and Adam Conway, SVP, Products at Databricks. I'm deeply honored to have these prestigious industry leaders share insights on how together we're helping our customers build an intelligent data enterprise and stay competitive in a digital-first economy.

Beyond Informatica World, we continue to share more partner innovation. At Snowflake Summit, we announced a new enterprise data integrator for the Snowflake-native application framework, and we were highlighted as a partner in Snowflake's announcement of the native applications framework. And we were awarded Snowflake's industry competencies in financial services and health care and life sciences, reflective of the significant joint customer adoption we have in these industries.

At Databricks Data and AI Summit, we announced expanded support for Databricks' SQL, advanced data quality for Databricks, expanded data governance and data cataloging with IDMC and the private preview of INFACore that I mentioned earlier, developer extension libraries for Databricks notebooks. We recently also joined the Data Cloud Alliance, created by Google Cloud, which focuses on making data and analytics more accessible via modern data management technologies.

And finally, a very important new strategic partnership with Oracle. Informatica was named by Oracle as a preferred partner for cloud enterprise data integration and governance for data warehouses and lake houses on Oracle Cloud infrastructure. With this partnership, IDMC has now become the most widely available data management platform supporting all key major cloud providers, AWS, Azure, GCP and now Oracle.

In the second quarter, the number of ecosystem co-sell wins grew over 105% year-over-year and the marketplace transaction volume tripled year-over-year, indicating excellent traction with key ecosystem partners.

And now turning to our global system integrator partners, where we continue to make improvements to the program to attract new partners and our global system integrator partners continue to build Informatica in their solutions. In that context, Informatica has joined Wipro's FullStride Cloud Services data platform as a premier collaboration partner alongside a stellar group of companies, including AWS, Microsoft, GCP and Oracle. Informatica also expanded its partnership with KPMG and launched two new offerings: KPMG Modern Data Platform and KPMG Powered Enterprise Data Migration.

Several more partners established centers of excellence with access to our migration factories, including Infosys and KPMG, plus several regional boutique partners, to support our customers in moving their on-prem workloads to the cloud. And in that, we continue to drive maintenance-to-cloud migrations. As you all know, and I've said before, there is an approximately 9- to 12-month lag to convert from maintenance ARR to cloud ARR once implementation is completed.

Our differentiated cloud technology platform, IDMC, has been widely recognized by the marketplace and reflects our ongoing commitment to delivering product-led innovation at a global scale. We are proud to once again be named a 2022 Gartner Peer Insights Customers' Choice for MDM. We've also been named a leader in both Forrester Wave's Enterprise Data Fabric and Enterprise Data Catalog for DataOps categories in Q2 of 2022. And more recently, Gartner named Informatica as one of the top vendors in the 2021 event stream processing platforms worldwide report. We were recognized as the second-largest vendor, with market share greater than IBM, Confluent, Software AG, TIBCO and SAP.

And finally, turning to our go-to-market sales motion. Our customer relationships remain very strong as highlighted by the number of customers spending more than $1 million in subscription ARR that increased 51% year-over-year to 175 customers. Additionally, customers spending more than $100,000 in subscription ARR increased 20% year-over-year to 1,791 customers. Our increasing focus on vertical industries is leading to deeper customer discussions. Earlier this year, we launched IDMC for Retail. More recently, we announced and launched IDMC for Healthcare and Life Sciences with customers like Blue Cross Blue Shield of Kansas City and New York City Health and Hospitals as well as IDMC for Fin Serv, Financial Services, with customers like RBC Wealth Management, Bank of Montreal and Freddie Mac.

We continue to take the platform and make it more relevant to enterprises, industries and use cases. Let me give you some examples of our customer wins. Norwegian Cruise Lines, a leading global cruise company, purchased our IDMC platform, replacing several single-product vendors, allowing them to take full advantage of all the capabilities on the platform, including data integration, data quality, API management, data governance and master data management.

Guaranz, a mutual insurance company based in Paris, France, with 250,000 members and EUR 3.4 billion in assets under management, chose Informatica's Customer 360 SaaS as part of its digital transformation, to drive innovation, improve operational excellence and maintain customer satisfaction by creating a trusted single view of their customers and employees.

HDFC Bank, the largest private sector bank in India by assets and the world's 10th largest bank by market cap, chose Informatica's MDM Customer 360 and Data Quality to be deployed into HDFC's Azure cloud to create a trusted 360-degree view of their customers. Informatica will partner with the Microsoft Azure architecture team to support HDFC's digital transformation.

Another great example of a strategic partner co-win is with Abu Dhabi Ports Group. The company is undergoing a multiyear digital transformation program, which includes investments in people, processes, technology and data to enable a data-driven culture. We leveraged our deep relationship and jointly coordinated with Snowflake and Cognizant to demonstrate a true partnership mentality in helping Abu Dhabi Ports Group. We're also very pleased to see customers looking to modernize to cloud and leverage our cloud data platform. Volvo Group was one of the first customers to embark on a PowerCenter modernization journey towards the cloud. They were looking for a common data management solution to support all of their enterprise business units and plan to leverage the entire IDMC platform as a single data management platform across all of Volvo.

So as I step back, in summary, we delivered outstanding Q2 results, which reflect our strong product-market fit, loyal and growing customer base, and our ability to execute in these early innings of a $44 billion TAM in which we are consistently recognized as an industry leader with an expanded strategic partner ecosystem. Our cloud momentum remains strong, and we are continuing to process mission-critical workloads. I believe Informatica's best-of-breed solutions on our IDMC cloud data platform offer resilience and relevance in delivering on customers' digital transformation needs. We are managing the business for long-term durable growth, positive cash flow and continued profitability.

Lastly, even though we are ahead on ARR and profitability metrics for the first half of the year, we continue to remain prudent as we think about guidance for the second half and the full year. Thank you to all our employees, customers, partners and shareholders for their support.

And with that, let me now turn the call over to Eric. Eric?

Eric Brown

Thank you, Amit, and good afternoon, everyone. We delivered a strong quarter and exceeded the high end of guidance across total revenue and all ARR metrics, with cloud ARR growing at 42% year-over-year. Demand for the IDMC platform remained healthy as we process mission-critical workloads. We beat non-GAAP operating income guidance by $22 million on the strength of higher total revenue and lower spending. As Amit mentioned, we are mindful of the uncertain macro environment and are taking a prudent approach to guidance for the balance of the year.

Let me provide commentary on Q2 results before discussing expectations for the balance of 2022. Turning to Q2 results. Total ARR increased 16% year-over-year to $1.44 billion. We added $197 million in net new total ARR in the second quarter versus the prior year. And we remain on track to deliver over $1.5 billion in expected total ARR this year. Cloud ARR performance was once again strong, increasing 42% year-over-year to $373 million and exceeding the high end of guidance. Cloud ARR now represents 26% of total ARR, an increase of 5 percentage points year-over-year. We added $110 million in net new cloud ARR in the second quarter versus the prior year. And sequentially, we added $30 million in net new cloud ARR in the second quarter of 2022 versus the first quarter of 2022. We continue to see a sales mix shift from self-managed to the cloud.

Turning to subscription ARR. This increased 31% year-over-year to $896 million, $11 million above the high end of guidance, driven by new subscription customer growth and improvements in our renewal rates, including cloud. Subscription ARR is now 62% of total ARR as compared to 55% last year. We added $210 million in net new subscription ARR in the second quarter versus the prior year, an increase of 19% year-over-year. And importantly, we remain on track to deliver $1 billion in subscription ARR for the full year.

54% of subscription customers are net new. And our average subscription annual recurring revenue per customer in the second quarter grew to approximately $243,000, a 22% increase year-over-year on an active base of nearly 3,700 subscription customers. The subscription net retention rate was 113%, flat sequentially. As previously mentioned, we expect to see fluctuations in this metric due to the mix of new bookings from new customers versus existing customers and the timing of large initial deal sizes expanding in the first year. We continue to expect a 120% subscription net retention rate long term as we build out the cloud business.
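The per-customer figure quoted above can be sanity-checked with a quick calculation; this is an illustrative sketch using the rounded numbers from the call, not company-provided code:

```python
# Average subscription ARR per customer, using the rounded figures from the call.
subscription_arr = 896_000_000   # Q2 subscription ARR in dollars
customers = 3_700                # "nearly 3,700" active subscription customers

avg_per_customer = subscription_arr / customers
print(f"${avg_per_customer:,.0f}")  # prints "$242,162", consistent with the ~$243,000 cited
```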

And lastly, maintenance ARR finished better than we expected and was only down 2% year-over-year at $541 million with strong renewal rates that were up one percentage point year-over-year. As a reminder, we have significantly reduced sales of perpetual licenses in favor of cloud offerings, and this will naturally result in a gradual decline in maintenance ARR over time.

Turning to revenue. We delivered $372 million in total GAAP revenue, an increase of 9% year-over-year and $4 million above the high end of guidance due to upside from self-managed subscription revenue recognition, partially offset by foreign exchange. Subscription revenue increased 24% year-over-year to $207 million. Subscription revenue represented 56% of total revenue as compared to 49% a year ago and reflects stronger customer demand for IDMC. Our subscription renewal rate was 94%, up one percentage point from a year ago and demonstrates the resilience of our business as the IDMC platform remains a mission-critical part of customers' operations.

Maintenance and professional services revenue were in line with expectations at $163 million and represented 44% of total revenue in the quarter. Stand-alone maintenance revenue represented 35% of total revenue. Consulting and education revenue makes up the difference and fluctuates based on customer requirements, representing 8% of total revenue.

U.S. revenue grew 11% year-over-year to $243 million, representing 65% of total revenues. International revenue grew 4% year-over-year to $129 million, representing 35% of total revenues.

Now turning to consumption-based pricing. It's been about 1.5 years since we launched our consumption-based pricing model, featuring Informatica Processing Units, also known as IPUs. Recall that IPUs allow our customers to dynamically and seamlessly choose how they use any of our cloud solutions and services. As of Q2, IPUs represented approximately 30% of cloud ARR, roughly double compared to a year ago. Approximately 47% of our cloud new bookings were IPU-based, indicating a healthy momentum of this offering.

Before moving to our profitability metrics, I'd like to point out that I will be discussing non-GAAP results for the second quarter unless otherwise stated. Gross margin was 81%, similar to Q1, notwithstanding the mix shift to cloud. For Q2 operating expenses, we observed an increase in travel and marketing expenses to support our Informatica World event.

Looking out to the second half of the year. We have slowed net new hiring, and we are optimizing investments and spending in the greatest areas of opportunity for cloud acceleration, product innovation and strategic partnership expansion. Operating income was approximately $70 million and exceeded the high end of guidance by $19 million due to higher revenue and a reduced rate of spending. Adjusted EBITDA was $75 million and net income was $45 million. Net income per diluted share was $0.16, above our expectations, based on approximately 284 million diluted shares outstanding. The basic share count was 280 million shares.

We ended the second quarter in a very strong cash position with cash and short-term investments of $582 million. Net debt was $1.3 billion, and with trailing 12-month adjusted EBITDA of $367 million, this resulted in a net leverage ratio of 3.5x. We expect the business to naturally delever to approximately 3x by the end of this year and then to below 2x by the end of 2024. Unlevered free cash flow after tax was $33 million, approximately $32 million lower than our expectations due to two primary reasons. First, we had a $15 million higher-than-expected cash outflow from cash tax payments in Q2, a portion of which is timing-related. We also saw a slight increase in our days sales outstanding, which resulted in a working capital headwind for Q2. GAAP operating cash flow was $16 million compared to $40 million in Q2 last year. This summarizes Q2 results.
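The leverage ratio mentioned above follows directly from the figures given; a quick sketch of the arithmetic (figures as stated on the call, in millions):

```python
# Net leverage ratio: net debt divided by trailing-12-month adjusted EBITDA.
net_debt = 1_300         # $1.3 billion, in millions
ttm_adj_ebitda = 367     # trailing 12-month adjusted EBITDA, in millions

net_leverage = net_debt / ttm_adj_ebitda
print(f"{net_leverage:.1f}x")  # prints "3.5x", matching the ratio quoted
```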

Now let me turn to guidance. We continue to feel good about the underlying fundamentals of the business, our durable and predictable subscription revenue stream, high cloud growth and healthy gross margins. We remain confident in achieving approximately 40% cloud ARR growth for the full year as the mix shift from self-managed to the cloud continues, and our renewal rates are improving. While we did see better-than-expected subscription and cloud ARR results in the first half of the year, we are not flowing the beat through to the balance of the year, in keeping with our prudent approach given the current macro environment.

Now for full year guidance. Looking at the full year 2022 guidance. We are reiterating guidance for the year ending December 31, 2022, as follows. We expect total ARR in the range of $1.52 billion to $1.55 billion, representing approximately 13% year-over-year growth at the midpoint of the range. We expect subscription ARR in the range of $990 million to $1.01 billion, representing approximately 25% year-over-year growth at the midpoint of the range. We expect cloud ARR in the range of $438 million to $448 million, representing approximately 40% year-over-year growth at the midpoint of the range. We expect non-GAAP operating income in the range of $325 million to $345 million.

We are updating the full year 2022 total revenue guidance to be in a range of approximately $1.54 billion to $1.56 billion. At the midpoint, we are reducing GAAP total revenue by approximately $45 million due to currency headwinds from a stronger U.S. dollar.

Now foreign exchange rates do affect our operating income. However, the overall impact is mitigated since we have a considerable amount of operating expenses denominated in foreign currency serving as a natural offset. Net, we are holding our full year non-GAAP operating income guidance unchanged as we control our spending and further optimize our ARR renewals business.

We are updating the full year 2022 unlevered free cash flow after-tax guidance to be in the range of $290 million to $310 million. At the midpoint, we are lowering unlevered free cash flow by $33 million. Most of this variance is working capital-related as our quarters are a bit more back-end loaded in terms of overall bookings, and we are starting to see some customers delay payments, creating a slight increase in our DSOs. We expect these trends to continue in the second half of the year. In addition, we're expecting $5 million to $10 million more of additional cash taxes this year.

Taking all this into account, we are establishing Q3 guidance for the quarter ending September 30, 2022, as follows. We expect subscription ARR in the range of $920 million to $930 million, representing approximately 26% year-over-year growth at the midpoint of the range. We expect cloud ARR in the range of $399 million to $405 million, representing approximately 40% year-over-year growth at the midpoint of the range. And we expect non-GAAP operating income in the range of $77 million to $84 million. We expect GAAP total revenues in the range of $385 million to $395 million, representing approximately 8% year-over-year growth at the midpoint of the range. We estimate the Q3 impact of foreign exchange to be around $15 million. We reported Q2 non-GAAP net income using a non-GAAP tax rate of 23%, and for the full year, we estimate a 23% non-GAAP tax rate as well.

Looking to fiscal 2023 and beyond. We continue to expect a long-term steady-state non-GAAP tax rate of 24%, which reflects where we expect cash taxes to settle based on the structure and geographic distribution of operational activity. For modeling purposes, we estimate Q3 unlevered free cash flow to be approximately $55 million. Additionally, for the third quarter of 2022, we expect basic weighted average shares outstanding to be approximately 280 million shares and diluted weighted average shares outstanding to be approximately 283 million shares. For the full year 2022, we expect basic weighted average shares outstanding to be approximately 281 million shares and diluted weighted average shares outstanding to be approximately 288 million shares.

In closing, we began this year with an objective to grow cloud ARR by approximately 40% year-over-year to $442 million, achieve $1 billion in subscription ARR and $335 million of non-GAAP operating income at the midpoint of the range. We are on track to meet these objectives through the first half of the year, and we are reiterating our full year guidance for these metrics.

Thank you. And operator, you may now open the line for questions.

Question-and-Answer Session


[Operator Instructions] The first question is from the line of Matt Hedberg with RBC Capital Markets.

Matt Hedberg

Really great results. And obviously, it's a difficult operating environment. You noted strong results here, and you're maintaining a level of conservatism by not increasing ARR guidance, which seems certainly prudent. That's what, I guess -- we're a month into 3Q. Have you seen any changes in buying cycles, like any elongation or extra approvals or anything of that nature? Is it just you're maintaining that prudence with the expectation that something like that could happen at some point?

Amit Walia

Matt, good to talk to you. I think as we’ve talked throughout the course of the last couple of months, I think we are maintaining a sense of prudency, if I can use that word. Look, I think we’re all looking at a uncertain macro with so much going on, and I think I’ll repeat that all of you know. So I think the right thing to do is for us -- we’ve obviously had a pretty good first half, and we see momentum in terms of what we offer to the market and what workloads we serve. But look, I think the right thing to do is to be prudent and be thoughtful about what the second half could be and walk into that by keeping our guide for the year and just see how the world shapes up.

Eric Brown

And Matt, in response to your question, given we're about a month into the third quarter, there's been no net change in the first month versus what we saw towards the end of Q2 in terms of purchasing patterns or otherwise...

Matt Hedberg

Yes. That’s great to hear. And then I think we’re all really interested in the consumption, the IPU success. And it seems like you’re -- you continue to see a lot of traction on that. There are a lot of questions from investors, too, about consumption models and perhaps an economic slowdown. Any sense for how that might trend in your base? Obviously, it’s an expanding trend, but just sort of curious if you have any sort of anecdotes on how that might progress.

Eric Brown

Yes. Thanks for asking. So first of all, we're seeing a great mix shift in our cloud net new business. We're now nearly 50% of our new cloud business being denominated in IPUs. And as of right now, if I look at the cloud ARR ending balance for Q2, IPUs comprised 30% of that. A year ago, we were roughly 15% IPU-denominated. So there's great uptake in regards to the offer. And we're going to continue to push it. We expect to be at about a 50-50 mix in the near term here. And in terms of usage patterns, again, we're about 1.5 years into the offering. And we're seeing customers scale up as we would like over their first 6 to 12 months. We'll see the first 2-year anniversary cohort in about two quarters. And that will give us a better read on the launch of the product about 1.5 years ago.


The next question is from the line of Pinjalim Bora with JPMorgan.

Pinjalim Bora

Congrats on the quarter. I guess since macro is top of mind for everybody, I want to thread that needle a little bit more. You did talk about slowing down hiring as well. Is -- I'm trying to understand if you're seeing anything in the pipeline. How would you characterize the strength of the -- strength and quality of the pipeline as you kind of enter the second half?

Amit Walia

No. Thanks for the question, Pinjalim. I'll break it into two. So in terms of the demand for digital transformation and data-led digital transformation, those conversations continue to be very robust. When I was in Europe last week, I met a bunch of CEOs, and this was high on their priority list. And I think in that context, our pipeline creation remains very healthy. We had Informatica World. We are having these conversations. I think where you see the uncertain macro environment translate is not pipe rate. It's more conversion of that pipe. Quite naturally, deals get elongated. There is more scrutiny on deals, of course, in a time like this. There are pockets of customers who are probably facing the impact of the current economy more than other pockets of customers, and there will be more scrutiny.

So deal cycles increase, scrutinies increase. But in general, I would say, pretty healthy pipe rate, strong interest. And that's what we see. And I think, look, that's not a surprise given where the world is in the current environment.

Pinjalim Bora

Yes, understood. Just to be certain, so the things -- I understand you might see that, the lengthening of deal cycles or deal deferrals. But at this point, you’re not seeing anything?

Amit Walia

Well, I think nothing out of the ordinary. You see where we were in the first half of the year. We obviously are maniacally executing, assuming those kind of things may play out in a macro environment like this. Obviously keeping our eyes very close on the ground to make sure that we continue to execute the same way as we did in the first half. And again, that's reflected in our guidance that Eric gave. We overdelivered in the first half. We're carrying the prudence to the second half but holding our guidance for the full year for ARR metrics.

Pinjalim Bora

Got it. And a quick follow-up to Eric. The reduction in revenue, $45 million, is -- I think you’re saying it’s FX. Is it 100% FX? Or is there a little bit from the mix shift versus conservatism in the second half?

Eric Brown

It's nearly 100% from FX with more of that being seen in the second half versus what we observed in the first half.


The next question is from the line of Alex Zukin with Wolfe Research.

Strecker Backe

This is Strecker on for Alex. Eric, you mentioned that there were some customers trying to delay payments. Can you just elaborate on that for us more? Is it just a handful of specific customers? Is it coming out of specific regions? And then how are you factoring that into your own modelling going forward in the back half of the year?

Eric Brown

Sure. Yes. We -- as you know, last year, we had a really good overperformance on operating cash flow and working capital. So we really had finely tuned DSOs. And what we observed at the end of Q2 is that there were select customers in all of our major geos that were delaying payments. We were expecting payments in the last week of the quarter, and we didn't get them. Net, we saw a 3-day to 4-day increase in our DSOs sequentially. And we're expecting this slightly more elevated level of DSO to persist through the end of the year. And so when we run the numbers, rough and tough, an extra four days of DSOs -- that's what we're thinking right now as of the end of the year -- is around $25 million of adverse impact to working capital. And so we're assuming, in summary, that the level is slightly elevated based on what we saw in Q2.
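The relationship between extra DSO days and working capital that Eric describes can be sketched as follows; the annualized billings figure here is an illustrative assumption (it is not disclosed on the call), chosen only to show how the arithmetic lands near the ~$25 million cited:

```python
# Each extra day of DSO ties up roughly one day's worth of billings in receivables.
extra_dso_days = 4
assumed_annual_billings = 2_300_000_000  # illustrative assumption; not a disclosed figure

working_capital_impact = extra_dso_days * assumed_annual_billings / 365
print(f"${working_capital_impact / 1e6:.0f} million")  # prints "$25 million"
```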


The next question is from the line of Koji Ikeda with Bank of America.

Koji Ikeda

I just wanted to kind of follow up on that previous question on the delay in payments, Eric, and just wanted to kind of fully understand that. I appreciate the color there. So just thinking about the delay in payments, has that been isolated now? Or just thinking about the future, how could this potentially affect unlevered free cash flow further in the future?

Eric Brown

Yes. This is what I would characterize as kind of a transient event. I think that -- we've kind of seen these things in the past in kind of macroeconomic slowdowns. People are just trying to manage by paying a week late, let's say. If that crosses your quarter, as it did in our case, it directly impacts the stats. So what we're assuming here is that the current level that we're at, a slight elevation persists throughout the Q3 and Q4 and hence, the modification to our unlevered free cash flow. The other thing impacting unlevered free cash flow is a full year higher cash tax payment outflow of $5 million to $10 million. So those are the two things driving the minus $33 million on unlevered free cash flow for the full year.

Amit Walia

Koji, one thing I’ll add to what Eric said is that -- I mean, remember, we serve the true enterprise segment. Our customers are all of the customers we talk about. So I think -- we understand them. They have been long-standing customers, and I think this is a transient thing, as Eric said. These are the blue-chip customers across the globe. We don’t look at this as anything that crosses a longer duration in any way, shape or form.

Eric Brown

Yes. And just to drill down one more level there, Koji, we've seen no change in kind of bad debt profile. So these are simply a bit of delays as opposed to a change in kind of credit profile outlook across our customer base.

Koji Ikeda

Got it. Got it. And just one follow-up here, if I may. Just thinking about the cloud subscription ARR guidance here, if I run it through the model, it looks like the implied Q4 sequential net new cloud ARR add, it looks pretty good, like pretty healthy from a sequential add basis. So just I understand the enterprise sales and renewal cycle seasonality here, but just really kind of curious to hear what has given you the confidence in that seasonal strength. If you achieve that guidance, that Q4 net new sequential ARR would be the highest here by a pretty big margin.

Amit Walia

I'll go and I'll let Eric add to the numbers as well. Look, I think I'll break it again into 2. We serve mission-critical workloads. And to be candid, like enterprises are still focused on digital transformation. That's not going away. Given the uncertain macro, yes, we talked about deals can elongate, there can be more scrutiny. But at the end of the day, people have to invest in making their business digital first, customer first or data governance has to happen, and we continue to see that healthy discussions. Obviously, we have taken our first half overachievement and made sure that, that gives us the ability to derisk the second half, in a way, and carry the whole year with a prudent guidance. So our conversations are absolutely -- the ones we’re having with customers, as I was explaining, I was in Europe last week or the week before last. I forget now. All of the conversations were around how do I make sure I can help customer success -- by customer, how do I understand my customers better, how do I get better analytics. I’m in a multi-cloud world. How do I make sure my data quality is good? Those conversations are happening across the board.

And Eric, you want to add to the numbers?

Eric Brown

Yes. And the other thing, too, we mentioned this in the Q1 call too. We assume a rate of change over the course of the year, a steady remix improvement quarter-by-quarter sequentially in cloud net new business versus self-managed net new business. And we can see this at the midway point of the year in the significant mix change improvement in pipeline towards cloud versus self-managed. And that's part of what informs our implied view on Q3 and Q4 cloud net new ARR sequentially in the guidance. The second thing, too, is that another way to improve cloud ARR ending balances is to do better on your renewals. And we've seen a nice improvement in year-over-year renewals. We noted a one percentage point increase in the overall subscription renewal rate. Inside that, there are two renewal rates, self-managed and cloud, and I can tell you that the cloud renewal rate year-over-year also had a nice improvement. So that certainly helps the cloud ARR outlook for Q3 and Q4.


The next question is from the line of Tyler Radke with Citi.

Tyler Radke

Can you hear me okay?

Amit Walia

Yes, Tyler.

Tyler Radke

I wanted to see if you could comment just on the overall macro environment in terms of supporting platforms and consolidations. We’ve heard a number of companies talk about consolidating point products and niche solutions. And I’m curious if you’re seeing an increased appetite from customers just as budgets are more under scrutiny and maybe how you’re adapting to that or benefiting from that.

Amit Walia

Yes. I think there are three ways to look at it. Number one is absolutely, we have the unique advantage of having a strategy of the best-of-breed products, as you can see in the Magic Quadrants, and a single platform where all the products are very seamlessly integrated. Number two is that platform is a very open microservices-driven platform fits into our customers' reference architecture in a multi-cloud environment. And number three, the whole IPU-based consumption model lets customers start and go from any service on the platform to any incremental service very easily. All of those things are allowing us to not only expand use cases but also take out multiple point providers where the customers are struggling to spend money and just integrating them to get to a final use case. So we see those benefits across the board, especially in an environment like this.

Tyler Radke

Great. And just in terms of overall cloud migration update, as you’re having the conversation with these customers, are you seeing them increase the prioritization or just decrease their overall shift to the cloud just as they’re thinking about their road map for this year? Just any change in terms of the cloud versus on-premise that you’ve seen in your customers?

Amit Walia

No change, to be honest. I think, in general, we see no change than what we saw last quarter or the quarter before in terms of migrations. I think I'll always -- I think we always say that, look, when you look at our business, we are absolutely focused on every area in which we can help our customers go to the cloud with our offerings. The majority of it is we've continued to grow through net new workloads. Migration is a tremendous area of focus, still a small percentage of our overall cloud ARR. And as we’ve said in every call, we are having those conversations. Of course, that takes a lag because, obviously, it’s an operational workload. It takes a while to migrate them to the cloud. But no change in those conversations from what we have been having in the last couple of months.


The next question is from the line of Brad Zelnick with Deutsche Bank.

Unidentified Analyst

It's Jamie on for Brad. I just wanted to dig into the cloud portfolio adoption. Is there anything you can call out around particular product module strength? I mean, any parts of the portfolio that are outperforming? Or if you could provide some sort of contribution framework, that would be great. Any color?

Amit Walia

Sure, Jamie. It's kind of like asking me which of my kids I love more. Actually, to be honest, that's the beauty of the platform. We -- I think you -- I kind of was talking about a customer example, and you can see from that, that the breadth of participation of our portfolio on the platform is pretty strong and -- whether it's analytics, whether it's MDM apps or it's data governance. And to be candid, like you take something like data quality, it’s needed in an analytics workload. It’s needed in an MDM workload. It’s needed equally in a data governance workload. So we think of it in context of use cases, and those use cases, all of them have participated quite well, quite healthily. In fact, we like that because that gives us a natural hedge that we have many use cases, and we can obviously traverse those use cases at any given point in time with our customers wherever the investment dollars go.

And the other thing to note is that our use cases traverse from the front office all the way to the back office within a company. Whether it’s helping get new customers, understand churn, managing supply chain, doing analytics of the business, we serve the full enterprise. So it’s pretty good participation across the board, Jamie.


The next question is from the line of Fred Havemeyer with Macquarie.


I first wanted to begin with just momentum that you’re seeing around artificial intelligence and machine learning and workloads in the cloud. I recall back at Informatica World that the Data Loader in the cloud natively was supporting AI and ML workflows. So could you talk about any sort of the initial traction that you’re seeing there? Particularly, I recall hearing a number of companies recently, cloud companies, talking about the importance of AI and ML workloads in the cloud to their momentum in their public cloud businesses. So would love some color and context what you’ve seen.

Amit Walia

Sure, Fred. I think, again, I'll break it into two parts. First of all, we are big believers in AI and ML. In fact, we started CLAIRE back in 2018, 2019. So first of all, CLAIRE, our AI engine, is embedded in every product. So it's naturally providing intelligence and automation, whether you are -- like I talked about, data quality, anomaly detection. There are rules, and then CLAIRE goes in and basically finds many more things that a human cannot find. So it's already driving value in the context of existing products, and it's scaling more and more. And in a multi-tenant cloud world today, we are running CLAIRE on 11 terabytes of metadata. It's just getting smarter, helping our customers leverage that metadata.

Secondly, in the context of the Data Loader, you asked me the question. Indeed, early days, we launched the Data Loader. Our attempt there, or our goal there, or our strategy there is to bring data integration -- doing those jobs -- to the business user with such a simplified user experience. I talked about three clicks, making it so dramatically easy that the business user does not even realize they're doing all these complex things that a technical IT user is used to. There, obviously, we're pretty excited. We talked about BigQuery, Snowflake and Databricks and expect more to come. That's early days but great traction. We're tracking usage. And early days, it's been -- it exceeded our expectations.


And then I'd just love to ask a follow-up question here as back -- also in Informatica World, I believe that you highlighted -- you announced industry-specific and vertical-specific IDMC solutions across financial services, health care, life sciences and I think more. I just wanted to ask, could you provide any updates about your progress with verticalized solutions and then, more broadly, your strategy with vertical specific go-to-market?

Amit Walia

Yes. No, it's a terrific question. So look, we are -- the way we think about verticalization is we're not an application software company that you verticalize all the way up to the UI and UX. That's not the business we play. But our goal is to make sure -- like when we bring data quality, let's say, or data governance to financial services versus health care versus life sciences versus retail, there are many things inherently where they could get industry-specific, whether it's regulations or types of data or types of things they do that we want to make sure that are more customized for them. So what does it do for them? Reduces the time to value, accelerates for them to get their projects done and reduces the amount of people and expenses they have to spend to customize it to their particular industry needs. That’s what we’re going to continue to do, whether it’s connectors, scanners, rules, AI models. And expect that for us to do that for these verticals more and more and more verticals over the course of time as we continue to take IDMC to more and more industry-specific use cases.


The next question is from the line of Andrew Nowinski with Wells Fargo.

Andrew Nowinski

I had a question on FX again. I appreciate you were just in Europe. But if FX headwinds force you to lower the revenue outlook and your solution is presumably more expensive for international customers now, I guess, did you also factor in a longer-term slowdown in spending then from international customers?

Eric Brown

This is Eric. No. I mean just to recap the change in FX, we didn't really mention FX in the first quarter. We're 90 days in. Here we are at the halfway point, and it's very clear that the U.S. dollar has strengthened, and it's going to remain strong. For the currencies that matter to us, the U.S. dollar is 15% to 16% stronger year-over-year. So the $45 million is simply that translation impact. It has nothing to do with a change in international demand.

And the other thing I want to point out in FX is that we're rather unique in that we have a very large amount of non-U.S. dollar-denominated operating expenses. And so we have a built-in natural hedge from FX because our euro, India rupee, et cetera, expenses with the strong U.S. dollar give us that natural offset. And so notwithstanding that, we're changing the top line for FX only. And we're able to absorb the net bottom line impact with that OpEx offset, and we're holding non-GAAP op income constant for the full year. And so I just wanted to make that point on FX as well.

Andrew Nowinski

Okay, got it. If I can just follow up to that, I guess, geographically, how was demand in Europe through the month of July as well as maybe even last quarter as well? If you could provide any color on that.

Amit Walia

We don't see any degradation in demand in Europe at all. I mean -- so I think Europe performed pretty fine for us. Nothing out of the ordinary that we saw or we are seeing, to be honest.

Eric Brown

Yes. Maybe to clarify, too, we talked about international revenue being up 4% year-over-year for Q2. If we FX-adjusted that growth rate, it would have been roughly 10%, pretty much in line with U.S. revenue growth of 11% year-over-year. So maybe that helps further calibrate the FX impact.
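The FX drag implied by Eric's numbers can be backed out from the two growth rates he gives; a quick sketch of that arithmetic:

```python
# Implied FX headwind on international revenue growth: reported growth was
# ~4% year-over-year, while FX-adjusted growth was roughly 10%.
reported_growth = 0.04
fx_adjusted_growth = 0.10

implied_fx_headwind = (1 + fx_adjusted_growth) / (1 + reported_growth) - 1
print(f"{implied_fx_headwind:.1%}")  # prints "5.8%" -- the drag from the stronger dollar
```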


Thank you. There are no additional questions at this time. I will pass it back to Eric Brown, CFO, for any closing remarks.

Eric Brown

Great. Thank you, operator. So I'd like to quickly do a bit of a recap as we hit the midyear point. So first of all, we opened 2022 with full year guidance for 40% cloud ARR growth, $1 billion in subscription ARR by the end of the year, and non-GAAP operating income of $335 million at the midpoint. We are ahead of these objectives as of the end of Q2. The only things that we see changing for the full year are the top line-only impact of FX as discussed and a transient decrease in unlevered free cash flow due to slightly elongated customer payment cycles and higher cash taxes. The underlying health of our business is excellent, and the advantages of our scale of $1.5 billion in total ARR are evident. Amit, back to you.

Amit Walia

Well, thanks, Eric. And once again, I'll reiterate that, look, we're a very unique company. We've always said that we focus on enterprise customers, mission-critical workloads, building out a pretty scaled, multi-tenant, cloud-native platform. And that has given us the ability to observe healthy momentum from current and new customers running their mission-critical workloads on the IDMC platform.

I'd like to reiterate that we reported a great quarter, we reported a great first half, and we remain on track to deliver our commitments for the second half and the full year for both growth and profitability. Thank you very much for your time, and I look forward to next quarter.


That concludes today's conference call. You may now disconnect your lines.

Sun, 31 Jul 2022 01:56:00 -0500 en text/html https://seekingalpha.com/article/4528094-informatica-inc-infa-ceo-amit-walia-on-q2-2022-results-earnings-call-transcript