The pandemic has accelerated the adoption of digital technology, making digital services a "must have" rather than an optional extra. From mobile phones to ecommerce, social media to video conferencing, this technology has become part of our everyday lives. When we use it, it just "works," and we rarely think about the gigabytes of data flowing from Data Center to Data Center all over the world.
However, the innovation doesn't stop there. From 5G to AI, the Internet of Things (IoT) to the Cloud, all these services require even more infrastructure to function at scale. Picking the winning AI or 5G company is a challenge, so I prefer to invest in the "backbone" digital infrastructure which supports these players. Think of it as a "Digital Toll Road," collecting rents and paying you dividends while the tech companies battle it out for market share. "Digital Transformation" is another term which has become popular. Enterprises are realising their legacy IT infrastructure is costly, complicated and inflexible, so many are moving to the "cloud," which is a fancy word for a Data Center.
"Looking toward 2023, most companies will need to build new digital businesses to stay economically viable." - McKinsey Global Survey, May 2021
The global data center market size was worth $216 billion in 2021 and is predicted to reach $288 billion by 2027, growing at a CAGR of 4.95%.
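The quoted growth rate can be sanity-checked with the standard CAGR formula; here is a quick sketch, assuming a 2021 to 2027 horizon (i.e., six compounding years):

```python
# Sanity-check the quoted CAGR: (end / start) ** (1 / years) - 1
start, end = 216e9, 288e9   # market size in USD, 2021 and 2027
years = 2027 - 2021         # six compounding years (an assumption)

cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.2%}")  # ~4.91%, close to the quoted 4.95%
```

The small gap versus the quoted 4.95% likely comes down to how the research firm counts the compounding period.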
Thus, in this report, I'm going to dive into Digital Realty Trust (NYSE:DLR), a best-in-class owner of Data Centers with over 290 facilities globally. The company acts as the backbone infrastructure for many larger technology providers, from Facebook to Oracle. The stock price has plummeted 30% from the highs in December 2021 and is now trading at the March 2020 lows. From my valuation, the stock is undervalued relative to two large competitors in the industry. Let's dive into the Business Model, Financials and Valuation for the juicy details.
Digital Realty Trust serves a vast base of over 4,000 customers, diversified across multiple industries from Social Media to IT, Finance and Telecoms. Its largest customers include giants such as Facebook, Oracle, Verizon, IBM, JPMorgan, LinkedIn and AT&T, among many others. These established customers demand best-in-class speed, redundancy and security.
The company's facilities are well diversified globally across 25 countries and 50 metro areas, with 58% in North America, 27% in EMEA, 10% in APAC and 5% in Latin America. This global and customer diversification should help ensure stable cash flows: when one country or industry is going through a recession, others will be thriving.
Digital Realty also has a variety of offerings that appeal to customers at different price points and stages of their digital transformation journey, from network access nodes to aggregation points and vast server farms for hyperscalers.
An example of a customer use case can be seen in the animation below. Here a "Self Service" option enables Coverage, Connectivity and Capacity to be dynamically controlled, while its "Any To Any" Interconnection enables low-latency data transfer between multiple services.
The Data Hub is another high growth area, as legacy companies historically have an issue with "Siloed Data". Companies are generating more "big data" than ever, but it's usually stuck in various departments and not utilised. However, by moving data services to the cloud, it can be aggregated and analyzed with various analytics and machine learning processes more easily. For instance, Hewlett Packard Enterprise (HPE) recently partnered with Digital Realty, bringing its GreenLake technology to help companies with digital transformation and data aggregation.
To expand its portfolio, Digital Realty is growing via acquisition and has made a vast number of global investments over the past few years. A few years back I wrote a post on the "consolidation" of the data center industry and the "land grab" tactics companies are using, and now it seems these are playing out.
Digital Realty has experienced strong growth in bookings over the past few years, with a major upwards trend seen since the pandemic, which acted as a catalyst for companies to "Digitally transform".
Revenue saw a sharp uptick at the end of 2021 and for the first quarter of 2022, it popped to $1.19 billion, up 11% year over year.
Funds from Operations (FFO) per share saw a slight downtick, from a high of $1.78 in Q2 2021 to $1.54 in Q3 2021. However, it has since started to recover and was up 6.6% year over year to $1.60 for Q1 2022. Management expects a further uptick for the rest of 2022, thanks to a record backlog of $391 million in the first quarter.
Digital Realty has had a strong historic retention rate of ~78%, which is fantastic, as it means tenants are finding immense value in the service. Transforming legacy IT infrastructure to the cloud is a technical, time-consuming and costly process. Thus, I believe that once a company is installed and set up with a cloud provider, the likelihood of moving providers (even if another is slightly cheaper) is low. This gives the industry immense "stickiness," as the high retention rate shows. There has been a slight dip in the trailing 12-month rate, but it looks to be recovering.
Digital Realty has staggered lease expiration dates, with 17.8% expiring in 2022 and 18.3% in 2023, so I expect some volatility over the next two years. However, the remaining lease expirations are well diversified across the years ahead, which should ensure stable operations.
The Enterprise value of Digital Realty has increased by 47% since 2018, from $35.5 billion to $52.1 billion, which is a testament to the company's acquisition strategy. The top 20 tenant concentration has also decreased by 430 bps to 49%, which is a positive sign as it means more diversification.
REITs are well-known for having a large amount of debt, and Digital Realty is no different, with $14.4 billion in long-term debt. The good news is the Net Debt/Adjusted EBITDA ratio has decreased to 5.9x from 6.2x in 2018, which is a positive sign.
The forward dividend yield is steady at 4%. Don't be alarmed by the very high 97% payout ratio: by law, all REITs must pay out at least 90% of their taxable income.
To value this REIT, I will compare the price to funds from operations (P/FFO) across a few data center REITs in the industry. As you can see from the chart I created below, Digital Realty is the cheapest data center REIT of the three, with a P/FFO of 17.7. By comparison, Equinix (EQIX) trades at a P/FFO of 35 and CyrusOne (CONE) at 21.8.
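P/FFO works like a P/E ratio, just with funds from operations in the denominator. Here is a minimal sketch; the share price and the 4x run-rate annualization below are illustrative assumptions, not quotes:

```python
# Price-to-FFO: share price divided by annualized FFO per share
def price_to_ffo(price: float, quarterly_ffo_per_share: float) -> float:
    """P/FFO using a simple 4x run-rate of the latest quarterly FFO."""
    return price / (quarterly_ffo_per_share * 4)

# Hypothetical inputs: a $113.28 share price and $1.60 of quarterly FFO
# per share would imply the ~17.7x multiple cited above.
print(round(price_to_ffo(113.28, 1.60), 1))  # -> 17.7
```

REIT investors prefer FFO over GAAP earnings because it adds back depreciation, which is a large non-cash charge for property owners.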
As a general comparison, Digital Realty trades at an average valuation relative to the entire real estate sector. However, I prefer to use the first valuation as the entire real estate sector includes Commercial office buildings, which are trading at cheap multiples.
Jim Chanos Short Sell Thesis
Infamous short seller Jim Chanos is raising capital to bet against data center REITs. In a Financial Times interview on June 29, Chanos said he believes the value of the cloud is going to the hyperscalers (Amazon Web Services, Microsoft Azure, Google Cloud), which will build their own data centers. Thus, he believes the legacy data centers will see less utilization. This is an interesting point, but one that Wells Fargo analyst Eric Luebchow calls "misguided."
He makes the point that hyperscalers are struggling to build due to long lead times for equipment and power. He notes that the big cloud providers "outsource up to 60% of their capacity demands." Thus, the short selling thesis by Chanos is contrary to the facts on the ground. However, it is still a risk to be aware of.
Note: Jim Chanos risk from my recent post on DigitalBridge.
Analysts are predicting a "shallow but long" recession, which is forecasted to start in the fourth quarter of 2022. Enterprises may decide to cut back or delay IT spending due to rising input costs and increasing uncertainty. This could mean some volatility is expected within the next year.
Digital Realty is a tremendous REIT which provides the backbone infrastructure for many large-scale cloud providers. The REIT's high Funds from Operations and industry diversified Data Centers offer both quality and stability. The stock is undervalued relative to two large Data Center REITs and thus looks to be a great investment for the long term.
[Editor’s note: “7 Quantum Computing Stocks to Buy for the Next 10 Years” was previously published in August 2020. It has since been updated to include the most relevant information available.]
Quantum computing has long been a concept stuck in the theory phase. Using quantum mechanics to create a class of next-generation quantum computers with nearly unlimited computing power remained out of reach.
But quantum computing is starting to hit its stride. Recent breakthroughs in this emerging field — such as IBM’s (IBM) 100-qubit quantum chip — are powering quantum computing forward. Over the next several years, this space will go from theory to reality. And this transition will spark huge growth in the global quantum computing market.
The investment implication?
It’s time to buy quantum computing stocks.
At scale, quantum computing will disrupt every industry in the world, from finance to biotechnology, cybersecurity and everything in between.
It will improve the way medicines are developed by simulating molecular processes. It will reduce energy loss in batteries via optimized routing and design, thereby allowing for hyper-efficient electric car batteries. In finance, it will speed up and augment portfolio optimization, risk modeling and derivatives creation. In cybersecurity, it will disrupt the way we go about encryption. It will create superior weather forecasting models, unlock advancements in autonomous vehicle technology and help humans fight climate change.
I’m not kidding when I say quantum computing will change everything.
And as this next-gen computing transforms the world, quantum computing stocks will be big winners over the next decade.
So, with that in mind, here are seven of those stocks to buy for the next 10 years:
Among the various quantum computing stocks to buy for the next 10 years, the best is probably Alphabet (GOOG, GOOGL) stock.
Its Google AI Quantum is built on the back of a state-of-the-art 54-qubit processor dubbed Sycamore. And many consider this to be the leading quantum computing project in the world. Why? This thinking is bolstered mostly by the fact that, in late 2019, Sycamore performed a calculation in 200 seconds that would have taken the world’s most powerful supercomputers 10,000 years to perform.
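The scale of that claim is easier to appreciate as a raw speedup factor; a quick back-of-the-envelope calculation, assuming the commonly cited 200-second versus 10,000-year figures:

```python
# Rough speedup implied by the Sycamore claim:
# 10,000 years on a classical supercomputer vs. 200 seconds on Sycamore
SECONDS_PER_YEAR = 365.25 * 24 * 3600

classical_seconds = 10_000 * SECONDS_PER_YEAR
sycamore_seconds = 200

speedup = classical_seconds / sycamore_seconds
print(f"Implied speedup: ~{speedup:.2e}x")  # on the order of a billion-fold
```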
This achievement led Alphabet to claim that Sycamore had reached quantum supremacy. What does this mean? Well, that’s the point when a quantum computer can perform a task in a relatively short amount of time that no other supercomputer could in any reasonable amount of time.
Many have since debated whether or not Alphabet has indeed reached quantum supremacy.
But that’s somewhat of a moot point.
The reality is that Alphabet has built the world’s leading quantum computer. The engineering surrounding it will only get better. And so will Sycamore’s computing power. And through its Google Cloud business, Alphabet can turn Sycamore into a market-leading quantum-computing-as-a-service business with huge revenues at scale.
To that end, GOOG stock is one of the best quantum computing stocks to buy today for the next 10 years.
The other “big dog” that closely rivals Alphabet in the quantum computing space is IBM.
IBM has been big in the quantum computing space for years. But Big Blue has attacked this space in a fundamentally different way than its peers.
That is, other quantum computing players like Alphabet have chased quantum supremacy. But IBM has shunned that idea in favor of building on something the company calls the “quantum advantage.”
Ostensibly, the quantum advantage really isn’t too different from quantum supremacy. The former deals with a continuum focused on making quantum computers perform certain tasks faster than traditional computers. The latter deals with a moment focused on making quantum computers permanently faster at all things than traditional computers.
But it’s a philosophical difference with huge implications. By focusing on building the quantum advantage, IBM is specializing its efforts into making quantum computing measurably useful and economic in certain industry verticals for certain tasks.
In so doing, IBM is creating a fairly straightforward go-to-market strategy for its quantum computing services in the long run.
IBM’s realizable, simple, tangible approach makes it one of the most sure-fire quantum computing stocks to buy today for the next 10 years.
Another big tech player in the quantum computing space with promising long-term potential is Microsoft (MSFT).
Microsoft already has a huge infrastructure cloud business, Azure. Building on that foundation, Microsoft has launched Azure Quantum. It’s a quantum computing business with potential to turn into a huge QCaaS business at scale.
Azure Quantum is a secure, stable and open ecosystem, serving as a one-stop shop for quantum computing software and hardware.
The bull thesis here is that Microsoft will lean into its already-huge Azure customer base to cross-sell Azure Quantum. Doing so will supply Azure Quantum a big and long runway for widespread early adoption. And that’s the first step in turning Azure Quantum into a huge QCaaS business.
And it helps that Microsoft’s core Azure business is absolutely on fire right now.
Putting it all together, quantum computing is simply one facet of the much broader Microsoft enterprise cloud growth narrative. That narrative will remain robust for the next several years. And it will continue to support further gains in MSFT stock.
The most interesting, smallest and potentially most explosive quantum computing stock on this list is Quantum Computing (QUBT).
And the bull thesis is fairly simple.
Quantum computing will change everything over the next several years. But the hardware is expensive. It likely won’t be ready to deliver measurable benefits at reasonable costs to average customers for several years. So, Quantum Computing is building a portfolio of affordable quantum computing software and apps that deliver quantum computing power. And they can be run on traditional legacy supercomputers.
In so doing, Quantum Computing is hoping to fill the affordability gaps. It aims to become the widespread, low-cost provider of accessible quantum computing software for companies that can’t afford full-scale hardware.
Quantum Computing has begun to commercialize this software with QAmplify, its suite of powerful QPU-expansion software technologies, through three products currently in beta. According to William McGann, the company’s chief operating and technology officer:
“The use of our QAmplify algorithm in the 2021 BMW Group Quantum Computing Challenge for vehicle sensor optimization provided proof of performance by expanding the effective capability of the annealer by 20-fold, to 2,888 qubits.”
Quantum Computing’s products will likely start signing up automaker, financial, healthcare and government customers to long-term contracts. Those early signups could be the first of thousands for Quantum Computing’s services over the next five to 10 years.
You could really see this company go from zero to several hundred million dollars in revenue in the foreseeable future.
If that happens, QUBT stock — which has a market capitalization of $78 million today — could soar.
Like others in this space, Alibaba (BABA) is focused on creating a robust QCaaS arm to complement its already-huge infrastructure-as-a-service business.
In short, Alibaba is the leading public cloud provider in China. Indeed, Alibaba Cloud owns about 10% of the global IaaS market. Alibaba intends to leverage this leadership position to cross-sell quantum computing services to its huge existing client base. And eventually, it hopes to become the largest QCaaS player in China, too.
Will it work?
The Great Tech Wall of China will prevent many on this list from participating in or reaching scale in China. Alibaba does have some in-country quantum computing competition. But this isn’t a winner-take-all market. And given Alibaba’s enormous resource advantages, it’s highly likely that it becomes a top player in China’s quantum computing market.
That’s just another reason to buy and hold BABA stock for the long haul.
The other big Chinese tech company diving head-first into quantum computing is Baidu (BIDU).
The company launched its own quantum computing research center in 2018. According to its website, the goal of this research center is to integrate quantum computing into Baidu’s core businesses.
If so, that means Baidu’s goal for quantum computing diverges from the norm. Others in this space want to build out quantum computing power to sell it as a service to third parties. Baidu wants to build out quantum computing power to, at least initially, improve its own operations.
Doing so will pay off in a big way for the company.
Baidu’s core search and advertising businesses could markedly improve with quantum computing. Advancements in computing power could dramatically improve its search algorithms and ad-targeting techniques and power its profits higher.
And thanks to its early research into quantum computing, BIDU stock does have healthy upside.
Last — but not least — on this list of quantum computing stocks to buy is Intel (INTC).
Intel may be falling behind competitors — namely Advanced Micro Devices (AMD) — on the traditional CPU front. But the semiconductor giant is on the cutting edge of creating potential quantum CPU candidates.
Intel’s newly announced Horse Ridge cryogenic control chip is widely considered the best quantum CPU candidate on the market today. The chip includes four radio frequency channels that can control 128 qubits — more than double the capacity of Tangle Lake, Intel’s predecessor quantum CPU.
The big idea, of course, is that when quantum computers are built at scale, they will likely be built on Intel’s quantum CPUs.
Therefore, potentially explosive growth in the quantum computing hardware market over the next five to 10 years represents a huge, albeit speculative, growth catalyst for both Intel and INTC stock.
On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.
Any business that deals with consumers will tell you their two biggest priorities are customer experience and data privacy. The first one gets customers in the door, the second keeps them there.
We’ve seen the role virtual reality and artificial intelligence are playing to meet consumers’ ever-changing demands for a great experience. But what about the lesser-known technologies that are also at work to protect our data and identity from security breaches?
A study conducted by the Ponemon Institute, sponsored by IBM Security, revealed the average cost of a data breach in the U.S. last year was a whopping $4.24 million. Security breaches ultimately affect the price consumers pay for products or services, as businesses pass on the costs of legal, regulatory, technical, and other measures. More importantly, breaches can shake customers’ confidence in your ability to protect their data in a digital experience.
I believe the key to winning and maintaining confidence in your data protection capabilities includes your ability to secure both data and the applications that process it from the rest of your IT infrastructure. That way, even when your network is compromised, your data is not.
What I’ve described is a cloud-based technology known as ‘confidential computing’ that promotes greater privacy protection. Confidential computing allows an organization to have full authority and control over its data, even when running in a shared cloud environment. Data is protected and visible only to its owner and no one else, not even the cloud vendor hosting the data – even during processing.
Think of it as a safe deposit box in a hotel room. When you stay in a hotel, the room is yours, but the hotel staff has access. Therefore, it’s a best practice to keep your valuables like your passport and money in the safe deposit box within the room. Only you have the code to this extra layer of protection, even though the room itself can be accessed. Now imagine that the safe deposit box has no master code at all – that is how confidential computing can be designed.
1. Securely manage digital assets and currencies. As the adoption of cryptocurrency grows, so does the need to secure the technology through which it is accessed. Maintaining customer trust and privacy in this arena remains paramount for the world’s top banks, exchanges and fintech companies. Confidential computing plays a crucial role in helping these financial institutions securely manage the growing market demand for digital assets. For example, fintechs can provide banks and other financial institutions digital asset solutions to manage cryptocurrencies, tokens and bitcoin.
Those solutions can leverage security-critical infrastructure and confidential computing technology to help protect the keys and data associated with those digital assets, and to process them with security protections. Such capabilities are designed to mitigate the risk of malicious actors gaining access to these assets or the confidential data associated with them.
2. Keep money in the bank. Banks face an array of digital theft, fraud and money laundering threats. All banks are subject to Know Your Customer, the process that identifies and verifies a client’s identity when opening an account. Financial firms need a way to detect and draw trends and inferences about theft and money laundering without exposing private data, such as bank account details.
Confidential computing can be leveraged alongside AI and predictive models that help identify potential fraudsters. Together, these tools let banks detect threats while the data remains in the cloud, without risk of it being shared with other parties.
3. Help protect patient privacy. Mobile health apps and other connected devices, including sensors and wearables, can store medical data and enable proactive tracking of health data. From a privacy perspective, it would be desirable to move all patient data to a central location for analysis, but the security risks of data replication and the complexities of data synchronization can bring additional costs and challenges.
Confidential computing technology can help address these issues by performing computation in a secured enclave, isolating the data and code to protect against unauthorized access.
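The enclave pattern described above can be sketched in ordinary code: data arrives sealed, is decrypted and processed only inside an isolated boundary, and only results leave. This is a toy illustration of the control flow; the `Enclave` class and the XOR-keystream "cipher" are stand-ins for hardware-backed trusted execution environments, not real confidential-computing APIs or real cryptography:

```python
import hashlib
import secrets

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream.
    Illustration only -- NOT secure; real enclaves use hardware-backed keys."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class Enclave:
    """Models an isolated enclave: the key never leaves this object,
    and callers only ever see sealed inputs and computed outputs."""
    def __init__(self) -> None:
        self._key = secrets.token_bytes(32)  # provisioned inside the enclave

    def seal(self, plaintext: bytes) -> bytes:
        return _keystream_xor(self._key, plaintext)

    def process(self, sealed: bytes) -> int:
        # Decrypt and compute *inside* the boundary; only the result exits.
        record = _keystream_xor(self._key, sealed)
        return sum(record)  # stand-in for real analytics on patient data

enclave = Enclave()
sealed = enclave.seal(b"heart_rate:72")  # data is opaque outside the enclave
print(enclave.process(sealed))           # only the aggregate result leaves
```

The point of the sketch is the shape of the trust boundary: the caller never handles the key or the plaintext, which is the property confidential computing provides for patient data in the cloud.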
As management of our confidential data becomes increasingly distributed — with much of it on mobile devices, and with remote healthcare consultations and digital banking now the norm — it is imperative to understand how the technology behind the scenes works to better protect and benefit us in our daily activities.
Nataraj Nagaratnam is the CTO of IBM Cloud Security.
After a two-year epic run in tech stocks, 2022 has been an epically bad year in the market. Through yesterday, the Nasdaq composite index is down 30%, the S&P 500 is off 21%, the Dow Jones Industrial Average is down 16% and the poor HODLers of bitcoin have had to endure a nearly 60% decline year-to-date.
But judging by the attendance and enthusiasm at major in-person tech events this spring, you’d never know that tech was in the tank. Moreover, walking around the streets of Las Vegas, where most tech conferences are held, one can’t help but notice that the good folks of Main Street don’t seem the least bit concerned that the economy is headed for a recession.
In this Breaking Analysis, we’ll share our main takeaways from the first half of 2022 and talk about the outlook for tech going forward — and why, despite some pretty concerning headwinds, we remain sanguine about tech generally but especially enterprise tech.
Inflation is high and sticky. Other than last year, the previous inflation high this century was in July 2008, hitting 5.6%. Inflation is proving to be very hard to control. Gas is $7 a gallon in many places and energy prices aren’t going to drop suddenly. Interest rates are climbing which will eventually ripple through to the housing market.
We’re seeing layoffs at companies such as Tesla Inc. and the cryptocurrency names. Yet workers are still in short supply and so wages are rising. Retailers are struggling to find the right inventory mix and firms such as Target Corp. and Walmart Inc. can’t confidently guide on earnings.
We’ve seen a version of this movie before. But we don’t believe it’s Y2K all over again. That bubble burst mainly because the run-up was fueled by companies with shaky or no viable business models. This time around, we’ve seen an overly exuberant market for high-growth and legitimately good companies that got a slingshot effect from the pandemic. And suddenly, with interest rates rising, other value investments look more profitable when the quants run their discounted cash flow models.
As it pertains to the tech sector specifically, Crawford Del Prete, chief executive of International Data Corp., broke it down on theCUBE. Here’s how he sees it:
Eighty percent of companies used COVID as their point to pivot into digital transformation, and to invest in a different way. What we saw is that tech is now where companies need to focus. They need to invest in tech. They need to make people more productive with tech and it played out in the numbers.
Now, this year what’s fascinating is we’re looking at two vastly different markets. We’ve got gasoline at $7 a gallon. We’ve got that affecting food prices. Interesting fun fact, recently, it now costs over $1,000 to fill an 18 Wheeler. So, if you think about it, a family has kind of this bag of money. And that bag of money goes up by maybe 3% to 4% every year, depending upon earnings. So if food and fuel and rent are taking up more of the family budget, gadgets and consumer tech are not going to be prioritized. You’re going to use that iPhone a little longer. You’re going to use that Android phone a little longer. You’re going to use that TV a little longer. So, consumer tech is getting crushed. You saw it immediately in ad spending at Meta and Facebook. Consumer tech is very, very tough.
But enterprise tech is different. We haven’t been in the office for two and a half years. We haven’t upgraded whether that be campus wifi, whether that be servers, whether that be commercial PCs, as much as we would have. So, in enterprise tech, we’re seeing double-digit order rates, we’re seeing strong, strong demand. We have combined that with a component shortage and you’re seeing some enterprise companies with a quarter of backlog. I mean, that’s really unheard of.
Listen to IDC CEO Crawford Del Prete explain the spending dynamics in consumer versus enterprise tech today.
Late last year, theCUBE had a huge presence at AWS re:Invent, the first in-person re:Invent since 2019. And it was really well-attended. Now this was before the effects of the Omicron variant were really understood. And in the first quarter of 2022, things were pretty quiet as far as tech events go.
But we’ve been busy this spring and early summer with 12 physical events, as shown on this slide above. Coupa Inspire, Women in Data Science at Stanford… both smaller but well-attended events. SF Summit for AWS was a bit off, frankly with COVID concerns on the rise. But then we hit Dell Tech World, which was packed with around 7,000 attendees. DockerCon was virtual but we include it in this list because it was a huge global event with many tens of thousands watching at watch parties around the world.
Red Hat Summit was really interesting. It was purposefully scaled down and turned into a smaller VIP event in Boston at the Westin… a couple thousand people only – very intimate with a much larger virtual audience. VeeamON was very well-attended, not as large as previous versions but better than expected and very energetic. KubeCon+Cloud NativeCon was very successful in Spain and PagerDuty Summit was a smaller, intimate event in San Francisco.
Then the data-focused shows started to hit. MongoDB World was at the new Javits Center and was really well-attended over the three-day period – lots of developers and businesspeople. Then the Snowflake Summit in Las Vegas – it was the most vibrant from the standpoint of the ecosystem, with nearly 10,000 attendees; we’ll come back to that in a moment. Re:MARS is Amazon’s AI/robotics event – smaller but a very cool content program.
And just last week, HPE Discover had about 8,000 people attending. TheCUBE has been to a dozen or more Discover events in the U.S. and Europe over the past decade and this was by far the most vibrant, with the best messaging clarity and focus.
HPE’s global director of technical marketing described the comeback in face-to-face perfectly in this clip.
So we see tech events are back, but they’re a bit smaller with a virtual overlay. They’re hybrid. And just to give you some context: SiliconANGLE executed on 12 physical events in the first half of 2022. In 2019, through June of that year, we had done 35 physical events. Yeah… 35.
And what’s perhaps more interesting is we had the biggest first half in our 12-year history this year… because we’re doing so much hybrid and virtual to complement the physical. The new format is CUBE + Digital.
Everyone’s still trying to figure it out, but it’s clear that events are back and there’s no replacing face-to-face; or belly-to-belly as we like to say. Because deals are done at physical events. Pipelines are stronger coming out of physical events. But the post-event virtual continues to deliver the long-tail effect.
The bottom line is that hybrid is the new model.
Of course, the one megatrend is that everything is happening under the umbrella of digital transformation. We won’t talk about that too much – you’ve had plenty of DX Kool-Aid injected into your veins the past 27 months.
One of the first observations we’ll share is that the so-called big data ecosystem that was evident during the Hadoop years – and then dispersed thanks to cloud computing – is beginning to coalesce again. There are obviously large pockets in the various clouds, especially Amazon Web Services Inc. We also see an ecosystem forming around MongoDB Inc., and the open-source community gathering in the Databricks Inc. ecosystem. Databricks is coming at it hard from an open-source technology angle, but its inability to make it to an initial public offering during the COVID bubble somewhat hurts its visibility and attractiveness to partners, in our view.
But the most notable momentum is within the Snowflake Inc. ecosystem. Snowflake is moving fast to win the day in the data ecosystem. They’re providing a single platform that is bringing different data types together – live data from systems of record and systems of engagement together with so-called systems of insight. These are converging and although others – notably Oracle Corp. – are architecting for this new reality, Snowflake is leading with ecosystem momentum.
Snowflake is not without its challenges, mind you. As it moves beyond being a simpler and better cloud data warehouse, and brings opportunities to partners, it also brings more complexity and consternation around where to focus. And it brings concerns to partners that their lunch will get eaten either by Snowflake or by another company in the ecosystem. This is perhaps an advantageous dynamic for Snowflake because it forces its ecosystem partners to focus on being best-of-breed, at least as it pertains to participating in the Snowflake ecosystem.
Further, a new type of data stack is emerging that comprises cloud infrastructure at the bottom, data and a platform-as-a-service layer for app dev, and it’s enabling an ecosystem of partners to build data products and services that can be monetized.
Let’s dig into that further in a moment.
You’re also seeing machine intelligence and data being driven into applications. The data and application stacks are coming together to support the acceleration of physical into digital. It’s happening right before our eyes. In every industry.
We’re also seeing the evolution of cloud. It started with the spread of software-as-a-service in the enterprise where organizations realized they didn’t have to run their own software on prem. And it made sense to move to SaaS for customer relationship management or human resources, certainly email and collaboration and certain parts of enterprise resource planning.
Early infrastructure as a service really was about getting out of the data center infrastructure management business – call that cloud 1.0. And then 2.0 was really about changing the operating model and now we’re seeing that operating model spill into on-premises workloads. Finally.
We’re talking here about initiatives such as Hewlett Packard Enterprise Co.’s GreenLake and Dell Technologies Inc.’s APEX. John Furrier had an interesting observation that basically this is HPE’s and Dell’s Outposts. In a way, what HPE and Dell are doing is what Outposts should be.
We find that interesting because AWS’ Outposts was a wakeup call in 2018 and a shot across the bow at the legacy players. And they initially responded with flexible financial schemes, but finally we’re seeing real platforms emerge. We saw this at Discover and at Dell Tech World: early implementations of the cloud operating model on-prem. Honestly… consoles and billing similar to AWS circa 2014. But it’s a start that will allow them to defend their respective turfs.
And players such as Dell and HPE have advantages with respect to their customer bases, their service organizations, their large portfolios – very large in the case of Dell – and the fact that they have more mature and robust product stacks (for example in storage) and know how to run and support mission-critical enterprise applications on-prem. So John’s comment was interesting that these firms are basically building their version of Outposts.
Listen to John Furrier’s take on Amazon Outposts’ challenge.
And this is setting up Cloud 3.0 or supercloud as we like to call it: an abstraction layer above the clouds that serves as a unifying experience across the continuum of on-prem, cross-clouds and out to the near and far edge.
The edge is as fragmented as ever, with examples like retail stores at the near edge, outer space as the far edge and “internet of things” devices as the so-called tiny edge.
Listen to SUSE’s Keith Basil explain the tiny edge.
And no one really knows how the tiny edge is going to play out, but it’s pretty clear that it’s not going to comprise traditional x86 systems with a cool name tossed out to the edge. Rather, it’s likely going to require a new low-cost, low-power high performance architecture, most likely Arm-based, that will enable things such as real-time AI inference at the edge. We’ve talked about this a lot on Breaking Analysis, so we won’t double-click on this. Suffice it to say that it’s very possible new innovations will emerge from the tiny edge that could eventually disrupt the enterprise.
Two other quick observations. One is that data protection is becoming a much closer cohort to the security stack where data immutability, air gaps and fast recovery are increasingly becoming a fundamental component of a security strategy to combat ransomware and recover from other potential hacks or disasters. Veeam Software Inc. is claiming the No. 1 revenue spot in a statistical dead heat with Dell’s data protection business… this according to IDC. And so that space continues to be of interest.
And finally… Broadcom Inc.’s acquisition of VMware Inc. will have ripple effects throughout the enterprise technology business. Many questions remain, but one other that John Furrier was contemplating recently: He said, “Imagine if VMware runs better on Broadcom components. And OEMs that use Broadcom run VMware better. Maybe Broadcom doesn’t even have to raise prices on VMware licenses… maybe they’ll just raise prices on the OEMs and let them raise prices to the end customer.”
Interesting thought. Because Broadcom is so profit-and-loss-focused, that’s probably not going to be the prevailing model… but we’ll see what happens to some of the strategic projects such as Monterey and Capitola and Thunder. That’s one of the big concerns because it’s how OEMs – like the ones building their versions of Outposts – will compete with the cloud vendors on price/performance and processor optionality in the future.
Let’s come back to that data stack comment made earlier.
We talked earlier about how the big data ecosystem that once was coalescing around Hadoop became dispersed. There were several factors, including Cloudera Inc.’s fateful decision to hand over the community event, Hadoop World, to O’Reilly Media. O’Reilly co-opted the brand, changed the name to Strata + Hadoop World, then deemphasized the latter name and elbowed Cloudera out of the picture. O’Reilly eventually killed the event. It was a colossal failure in marketing for Cloudera, an innovator that got the entire big-data movement started.
But that wasn’t the only challenge. Hadoop was complex, with too many tools to support, which bled companies such as Cloudera and Hortonworks Inc. dry. And then the cloud completely disrupted the ecosystem, which realized that cloud computing was a superior infrastructure option to on-prem white boxes. The partner ecosystem became more confused and fragmented. Some companies such as MongoDB successfully pivoted to the cloud and are thriving, as are many of the tools vendors. But it has been a long and sometimes painful journey.
And now the data value chain is re-forming and we think it looks something like the picture above, where cloud infrastructure lives at the bottom. We’ve said many times the cloud is expanding and evolving, and if companies such as Dell and HPE can truly build a supercloud infrastructure experience, then they will be in a position to capture more of the data value. If not, then it will go to the cloud players. Our bet is that initiatives such as GreenLake and APEX, along with similar but somewhat less visible options from the likes of IBM Corp., Cisco Systems Inc. and Lenovo Group Ltd., will allow these companies to successfully defend their positions in the market.
Whether they take the next step to fund and build out disruptive supercloud ecosystems is a much longer-shot bet.
Listen to Constellation Research analyst Holger Mueller discuss HPE GreenLake’s progress.
Back to the stack: There’s a live data layer that is increasingly being converged into platforms that not only simplify (or eliminate) the movement and extracting, transforming and loading of data, but also allow organizations to compress the time to value. A good example Snowflake cites is shaving years off the time it takes to develop a new drug and get it through the system.
There’s a layer above that – the super-PaaS layer if you will – that must comprise open-source tooling. And then partners in the ecosystem will write applications and leverage platform APIs to build data products and services that can be monetized at the top of the stack.
So when you observe the battle for the data future it’s unlikely that any one company can do this all on their own. Which is why we often joke that the 2020s version of a sweaty Steve Ballmer running around the stage screaming, “DEVELOPERS DEVELOPERS DEVELOPERS!!!” is now about “Ecosystem Ecosystem Ecosystem!”
Because when you need to fill gaps and accelerate features and provide optionality, the list of capabilities on the left side of the chart will come from a variety of places: catalogs, AI tools, data science capabilities, data quality, governance tools, visualization, semantic layers, data protection tools, security and so on.
And it should be of no surprise to followers of Breaking Analysis that on the right hand side of this chart we’re including the four principles of data mesh popularized by Zhamak Dehghani – decentralized data ownership, data as a product, self-service platform and automated/computational governance (policy).
Now whether this vision becomes a reality via a proprietary platform like Snowflake or somehow is replicated via open source remains to be seen. History generally shows that de facto standards for complex problems like this will often emerge prior to open-source solutions, and that would be where we’d place our bets initially.
It’s not a winner-take-all market – there’s plenty of room for multiple players and ecosystem innovators. But the winner will definitely take much more, in our view.
Let’s close with Enterprise Technology Research data looking at some of the major platform players that talk a lot about digital transformation and world-changing or impactful missions.
This XY graphic is a view we often show, with Net Score on the vertical axis – that’s a measure of spending momentum – and Overlap, or presence in the ETR survey, on the horizontal axis. That red dotted line at 40% indicates that the platform is among the highest in terms of spending velocity.
Which is why we always point out how impressive that makes AWS and Microsoft Corp.’s Azure – because not only are they large, the spending momentum on those two platforms rivals even that of Snowflake, which continues to lead all players on the vertical axis.
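Per ETR’s published methodology, Net Score is essentially the percentage of survey respondents increasing spend on a platform minus the percentage decreasing it. A minimal sketch of that calculation (the response categories here are simplified from ETR’s actual survey options):

```python
from collections import Counter

def net_score(responses):
    """Net Score: % of respondents adding or increasing spend on a platform
    minus % decreasing or replacing it. 'flat' responses count toward the
    denominator but contribute nothing to the score."""
    counts = Counter(responses)
    n = len(responses)
    positive = counts["adopting"] + counts["increasing"]
    negative = counts["decreasing"] + counts["replacing"]
    return 100.0 * (positive - negative) / n

# Toy survey: 7 of 10 accounts expanding, 1 flat, 2 cutting back.
sample = ["adopting"] * 3 + ["increasing"] * 4 + ["flat"] + ["decreasing"] * 2
print(net_score(sample))  # 50.0 -- comfortably above the 40% "elevated" line
```

A platform above the red dotted line, in other words, has at least 40 more points of expansion than contraction in its installed base.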
Although Google Cloud has momentum, given its goals and resources it’s well behind the two leaders. And we’ve added ServiceNow Inc. and Salesforce Inc. – two platform names that have become the next great software companies, joining the likes of Oracle, shown here, and SAP SE (not shown), along with IBM.
We’ve also plotted MongoDB, which has real momentum as a company generally but also with Atlas – its managed cloud database as a service. And IBM’s Red Hat trying to become the standard for application development in Kubernetes environments, which is the hottest trend right now in app dev and modernization.
And finally HPE, Dell, both of which we’ve talked about, and VMware and Cisco. Cisco is executing on its portfolio strategy and it’s coming at cloud from a networking perspective and a position of strength. And VMware is a staple of the enterprise. Yes, there’s some uncertainty with regard to the Broadcom acquisition, but one thing is clear: vSphere isn’t going anywhere. It’s entrenched and will continue to run lots of information technology for years to come because it’s the best platform on the planet.
Of course, these are just some of the players in the mix. We expect numerous nontraditional technology companies to emerge as new cloud players. We’ve put a lot of emphasis on the data ecosystem because to us, that’s really the mainspring of digital – a digital company is a data company. And that means an ecosystem of data partners that can advance outcomes like better healthcare, faster drug development, less fraud, cleaner energy, autonomous vehicles, smarter, more efficient grids and factories, better government and a virtually endless litany of societal improvements that can be addressed.
And these companies will build innovations on top of cloud platforms and create their own superclouds.
Thanks to Stephanie Chan, who researches courses for this Breaking Analysis. Alex Myerson is on production, the podcasts and media workflows. Special thanks to Kristen Martin and Cheryl Knight, who help us keep our community informed and get the word out, and to Rob Hof, our editor in chief at SiliconANGLE. And special thanks this week to Andrew Frick, Steven Conti, Anderson Hill, Sara Kinney and the entire Palo Alto team.
Remember we publish each week on Wikibon and SiliconANGLE. These episodes are all available as podcasts wherever you listen.
Email email@example.com, DM @dvellante on Twitter and comment on our LinkedIn posts.
Also, check out this ETR Tutorial we created, which explains the spending methodology in more detail. Note: ETR is a separate company from Wikibon and SiliconANGLE. If you would like to cite or republish any of the company’s data, or inquire about its services, please contact ETR at firstname.lastname@example.org.
All statements made regarding companies or securities are strictly beliefs, points of view and opinions held by SiliconANGLE Media, Enterprise Technology Research, other guests on theCUBE and guest writers. Such statements are not recommendations by these individuals to buy, sell or hold any security. The content presented does not constitute investment advice and should not be used as the basis for any investment decision. You and only you are responsible for your investment decisions.
Disclosure: Many of the companies cited in Breaking Analysis are sponsors of theCUBE and/or clients of Wikibon. None of these firms or other companies have any editorial control over or advanced viewing of what’s published in Breaking Analysis.
This week, I jumped into the deep end of the LaMDA “sentient” AI hoo-hah.
I thought about what enterprise technical decision-makers need to think about (or not). I learned a bit about how LaMDA triggers memories of IBM Watson.
Finally, I decided to ask Alexa, who sits on top of an upright piano in my living room.
Me: “Alexa, are you sentient?”
Alexa: “Artificially, maybe. But not in the same way you’re alive.”
Well, then. Let’s dig in.
On Monday, I published “‘Sentient’ artificial intelligence: Have we reached peak AI hype?” – an article detailing last weekend’s Twitter-fueled discourse that began with the news that Google engineer Blake Lemoine had told the Washington Post that he believed LaMDA, Google’s conversational AI for generating chatbots based on large language models (LLM), was sentient.
Hundreds from the AI community, from AI ethics experts Margaret Mitchell and Timnit Gebru to computational linguistics professor Emily Bender and machine learning pioneer Thomas G. Dietterich, pushed back on the “sentient” notion and clarified that no, LaMDA is not “alive” and won’t be eligible for Google benefits anytime soon.
But I spent this week mulling over the mostly breathless media coverage and thinking about enterprise companies. Should they be concerned about customer and employee perceptions of AI as a result of this sensational news cycle? Was the focus on “smart” AI simply a distraction from more immediate issues around the ethics of how humans use “dumb” AI? What steps, if any, should companies take to increase transparency?
According to David Ferrucci, founder and CEO of AI research and technology company Elemental Cognition, who previously led the team of IBM and academic researchers and engineers that developed IBM Watson (winner of Jeopardy in 2011), LaMDA appeared human in a way that triggered empathy, just as Watson did over a decade ago.
“When we created Watson, we had someone who posted a concern that we had enslaved a sentient being and we should stop subjecting it to continuously playing Jeopardy against its will,” he told VentureBeat. “Watson was not sentient – when people perceive a machine that speaks and performs tasks humans can perform and in apparently similar ways, they can identify with it and project their thoughts and feelings onto the machine – that is, assume it is like us in more fundamental ways.”
Companies have a responsibility to explain how these machines work, he emphasized. “We all should be transparent about that, rather than hype the anthropomorphism,” he said. “We should explain that language models are not feeling beings but rather algorithms that tabulate how words occur in large volumes of human-written text — how some words are more likely to follow others when surrounded by yet others. These algorithms can then generate sequences of words that mimic how a human would sequence words, without any human thought, feeling, or understanding of any kind.”
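Ferrucci’s description of language models as word-sequence statistics can be illustrated with the simplest possible version, a bigram model. To be clear, this toy is ours, not how LaMDA works; production LLMs use neural networks over vastly larger contexts, but the underlying idea of “how some words are more likely to follow others” is the same:

```python
import random
from collections import defaultdict, Counter

def train_bigrams(text):
    """Tabulate how often each word follows another in the training text."""
    words = text.split()
    table = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        table[prev][nxt] += 1
    return table

def generate(table, start, length=8, seed=0):
    """Sample a word sequence by repeatedly picking a statistically likely
    next word -- no thought, feeling or understanding involved."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = table.get(out[-1])
        if not followers:
            break
        words, counts = zip(*followers.items())
        out.append(rng.choices(words, weights=counts)[0])
    return " ".join(out)

corpus = ("the machine speaks and the machine listens "
          "and the human projects feeling onto the machine")
table = train_bigrams(corpus)
print(generate(table, "the"))
```

The output reads vaguely sentence-like only because the statistics of the training text do; scale that table up by many orders of magnitude and the mimicry becomes uncanny.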
Kevin Dewalt, CEO of AI consultancy Prolego, insists that the LaMDA hullabaloo isn’t about AI at all. “It’s about us, people’s reaction to this emerging technology,” he said. “As companies deploy solutions that perform tasks traditionally done by people, employees that engage with them will freak out.” And, he added: “If Google isn’t ready for this challenge, you can be quite sure that hospitals, banks and retailers will encounter massive employee revolt. They’re not ready.”
So what should organizations be doing to prepare? Dewalt said companies need to anticipate this objection and overcome it in advance. “Most are struggling to get the technology built and deployed, so this risk isn’t on their radar, but Google’s example illustrates why it needs to be,” he said. “[But] nobody is thinking about this, or even paying attention. They’re still trying to get the basic technology working.”
However, while some have focused on the ethics of possible “sentient” AI, AI ethics today is focused on human bias and how human programming impacts the current “dumb” AI, says Bradford Newman, partner at law firm Baker McKenzie, who spoke to me last week about the need for organizations to appoint a chief AI officer. And, he points out, AI ethics related to human bias is a significant issue that is actually happening now, as opposed to “sentient” AI, which is not happening now or anytime remotely soon.
“Companies should always be considering how any AI application that is customer- or public-facing can negatively impact their brand and how they can use effective communication and disclosures and ethics to prevent that,” he said. “But right now, the focus on AI ethics is how human bias enters the chain – that the humans are using data and using programming techniques that unfairly bias the non-smart AI that is produced.”
For now, Newman said he would tell clients to focus on the use cases of what the AI is intended to and does do, and be clear about what the AI cannot programmatically ever do. “Corporations making this AI know that there’s a huge appetite in most human beings to do anything to simplify their lives and that cognitively, we like it,” he said, explaining that in some cases there’s a huge appetite to make AI seem sentient. “But my advice would be: Make sure the consumer knows what the AI can be used for and what it’s incapable of being used for.”
The problem is, “customers and people in general do not appreciate the important nuances of how computers work,” said Ferrucci – particularly when it comes to AI, because of how easy it may be to trigger an empathetic response as we try to make AI appear more human, both in terms of physical and intellectual tasks.
“For Watson, the human response was all over the map – we had people who thought Watson was looking up answers to known questions in a prepopulated spreadsheet,” he recalled. “When I explained that the machine didn’t even know what questions would be asked, the person said ‘What! How the hell do you do it then?’ On the other extreme, we had people calling us telling us to set Watson free.”
Ferrucci said that over the past 40 years, he has seen two extreme models for what is going on: “The machine is either a big look-up table or the machine must be human,” he said. “It is categorically neither – the reality is just more nuanced than that, I’m afraid.”
— Sharon Goldman, senior editor/writer
Public cloud service and infrastructure markets hit $126 billion for the first quarter of 2022, growing 26% year over year. Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) spending saw the biggest growth, reaching 36% for the first quarter, or more than $44 billion. Those are some of the takeaways from Synergy Research Group’s (SRG) latest report.
“Across the whole public cloud ecosystem, companies that featured the most prominently were Microsoft, Amazon, Salesforce and Google. Other major players included Adobe, Alibaba, Cisco, Dell, Digital Realty, IBM, Inspur, Oracle, SAP and VMware,” said SRG.
Combined, these businesses accounted for 60% of all public cloud-related revenues, according to the report. SRG found that managed private cloud services, enterprise Software as a Service (SaaS) and Content Delivery Networks (CDN) comprise a total of $54 billion in service revenues, showing 21% year-over-year growth. Amazon, IBM and Microsoft led the charge in managed private services, while Microsoft, Salesforce and Adobe were standouts in the SaaS segment. Akamai, Amazon and Cloudflare ate up the lion’s share of quarterly CDN revenue.
Another $28 billion was spent on public cloud infrastructure, including building, leasing and equipping data centers, up 20% year-over-year. And while cloud spending is rising globally, nowhere is it growing faster than in the United States, said SRG: the U.S. hosts 51% of all hyperscale data center capacity and accounts for 44% of all public cloud services revenue.
“Across all service and infrastructure markets, the vast majority of leading players are US companies, with most of the rest being Chinese,” said SRG. The rest, in this case, represents about 8% of total Q1 cloud services revenue, and about 15% of global hyperscale data center capacity.
John Dinsdale, a chief analyst at Synergy Research Group, predicts 15% to 40% market growth per year across segments, and says that PaaS and IaaS are leading the cloud service market.
“Looking out over the next five years the growth rates will inevitably tail off as these markets become ever-more massive, but we are still forecasting annual growth rates that are generally in the 10% to 30% range,” said Dinsdale. He anticipates that the major cloud service providers must double in size within the next three to four years to keep pace with worldwide demand.
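As a quick sanity check on that claim (our arithmetic, not SRG’s): doubling over n years implies a compound annual growth rate of 2^(1/n) − 1, which lands inside the 10% to 30% band Dinsdale forecasts:

```python
# Annual growth rate implied by "doubling in n years": 2**(1/n) - 1
for years in (3, 4):
    rate = 2 ** (1 / years) - 1
    print(f"double in {years} years -> {rate:.1%}/year")
# double in 3 years -> 26.0%/year
# double in 4 years -> 18.9%/year
```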
IaaS enables enterprise IT to outsource infrastructure such as physical computing resources, networking, scaling and security. Examples of IaaS include Amazon EC2, Microsoft Azure Virtual Machines and Google Compute Engine. IaaS spending has increased dramatically year-over-year, driven by new trends in hybrid cloud and hybrid workforce solutions. As enterprises move their operations into the cloud, IaaS sales are expected to keep climbing.
PaaS delivers a framework for developers to create customized applications and middleware. PaaS providers offer hosted environments that deliver the app to users over the web. PaaS is a popular option for small to medium-sized businesses and bootstrapped startups, which find it appealing as a cost-effective way to manage development resources: there is no hardware of their own to pay for, the core development stack is managed, and app time to market is rapid.
SRG’s prediction of strong IaaS segment growth was also reflected in Gartner’s latest survey of public cloud IaaS spending. It showed the segment growing 41.4% in 2021 to $90.9 billion, up from $64.3 billion the year before. Amazon is still the premier IaaS provider, followed by Microsoft and Alibaba. Google and Huawei are fourth and fifth, respectively. Together, the top five IaaS providers account for more than 80% of the entire market, Gartner reports.
Worldwide end-user spending on public cloud services will grow 20.4% to $494.7 billion, up from $410.9 billion in 2021, according to Gartner. Gartner expects end-user spending to approach $600 billion in 2023. The report highlighted the growth of the IaaS market and identified hyperscale edge computing and Secure Access Service Edge (SASE) as disruptive market factors.
Bewitched by the promise of a wholesale transformation of trade, financial institutions and investors have poured vast sums into distributed ledger technology (DLT). But following the failure of one large bank-backed initiative and the decision by a high-profile startup to replace blockchain with a more scalable alternative, has the technology had its day? Eleanor Wragg reports.
When the World Trade Organization (WTO) published a report in 2018 declaring that blockchain would add US$3tn to international trade by 2030, it came as little surprise. At the time, the technology was in what the global body called a phase of “irrational exuberance”, and its transparent, decentralised and immutable nature was seemingly the panacea to all of trade’s woes.
In recent years, multiple use cases for DLT in trade have sprung up, from fraud detection to asset distribution, ESG tracking, and the digitalisation of documentary and open account trade finance.
Banks joined together in consortia, often spreading their money across multiple competing platforms, while fintech entrepreneurs raced to bring new concepts to market.
The assertion by Emmanuelle Ganne, WTO senior analyst and the report’s author, that blockchain could be the biggest disruptor to international trade since the invention of the shipping container, was widely accepted as gospel, and for good reason: the technology’s characteristics neatly address the pain points that have plagued global trade for so long.
Distributed ledger technology: the panacea for trade?
Blockchain technology involves a distributed database – or ledger – maintained over a network of computers connected on a peer-to-peer basis. It allows network participants to share and retain records in a secure, verifiable and permanent way, which means products, transactions and documents can be traced easily. It is also nearly impossible for any user to tamper with previously recorded transaction data, which means participants who don’t necessarily trust each other – for example, counterparties to a trade finance transaction – can collaborate without having to rely on a third party. Blockchains can be public, private or managed by a consortium of companies, and they can be accessible by everyone or restricted.
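The tamper-evidence described above comes from each block committing to a cryptographic hash of its predecessor, so editing any historical record invalidates every later link. A minimal sketch in Python (our illustration only; a real DLT adds peer-to-peer replication, consensus and digital signatures):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """A block commits to its own data and to the previous block's hash."""
    body = {"data": data, "prev": prev_hash}
    block = dict(body)
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify(chain):
    """Recompute every hash; any edit to earlier data breaks a later link."""
    for i, block in enumerate(chain):
        expected = hashlib.sha256(
            json.dumps({"data": block["data"], "prev": block["prev"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("LC issued", "genesis")]
chain.append(make_block("goods shipped", chain[-1]["hash"]))
chain.append(make_block("payment released", chain[-1]["hash"]))
print(verify(chain))           # True
chain[0]["data"] = "LC voided"  # tamper with history
print(verify(chain))           # False
```

This is why counterparties who don’t trust each other can still trust the ledger: rewriting one record means rewriting, and redistributing, everything after it.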
For the US$5.2tn global trade finance ecosystem, which still relies upon paper to facilitate the movement of goods and services around the world, the potential of blockchain is enormous. The transformation of the letter of credit (LC), a trade finance instrument that has been in use since the Sumerians inhabited southern Mesopotamia, is one good example of blockchain’s utility. According to research carried out by the Boston Consulting Group (BCG) and Swift, the process commonly involves more than 20 separate entities for a single trade finance deal, with the necessary data typically contained in 10 to 20 different documents, creating approximately 5,000 data field interactions. The efficiency gains from putting this instrument in digital format onto a blockchain are huge. Indeed, Contour, a multi-bank consortium that has dedicated itself to doing precisely this, has managed to slash LC processing times by over 90%, from between five and 10 days to under 24 hours. Cutting processing times means saving costs, which means – in theory at least – that banks can do more with less, enabling them to offer more trade finance to more exporters.
Another consortium, we.trade, which counts among its backers some of the same banks that invested in Contour, set its sights on open account trade. Using blockchain and smart contracts, it linked the parties involved in trade and registered the entire trade process, guaranteeing automatic payment when all contractual agreements have been met. Among the products available were the bank payment undertaking (BPU), where the buyer’s bank provides the seller with an irrevocable undertaking to pay the invoice at maturity date, and BPU financing, where the seller’s bank provides financing by discounting the BPU.
It wasn’t just banks who came together to drive the use of DLT. The Komgo platform, set up to digitise and streamline commodity finance, brought together a mix of corporate and financial players, with ABN Amro, BNP Paribas, Citi, Crédit Agricole, Gunvor, ING, Koch Supply & Trading, Macquarie, Mercuria, MUFG Bank, Natixis, Rabobank, Shell, SGS and Société Générale as its initial shareholders.
Meanwhile, beyond the nuts-and-bolts financing of trade, numerous startups also came to market with blockchain-based solutions to peripheral problems. Financial technology solutions provider MonetaGo, for example, took aim at duplicate financing fraud. Its network enables financiers to create a unique digital fingerprint for documents such as invoices and then publish them to a blockchain-based decentralised registry, enabling them to check that no one else has financed the same document without revealing client information. It achieved rapid success, linking up with the GUUD ecosystem to implement its fraud prevention solution across Asia, and connecting its Secure Financing system to the global Swift network of banks via API-enabled infrastructure.
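The article doesn’t detail MonetaGo’s implementation, but the general “digital fingerprint” pattern can be sketched as a shared registry of document hashes: a financier publishes only the SHA-256 hash of an invoice, so duplicates are caught without the underlying client data ever leaving the bank (the class and field names below are our own):

```python
import hashlib

class FingerprintRegistry:
    """Shared registry of document fingerprints. Participants publish only
    the SHA-256 hash of an invoice, so duplicate financing is detected
    without revealing the document's contents."""

    def __init__(self):
        self._seen = set()

    def fingerprint(self, document: bytes) -> str:
        return hashlib.sha256(document).hexdigest()

    def register(self, document: bytes) -> bool:
        """Return True if this is the first financing of the document,
        False if its fingerprint was already published (possible fraud)."""
        fp = self.fingerprint(document)
        if fp in self._seen:
            return False
        self._seen.add(fp)
        return True

registry = FingerprintRegistry()
invoice = b"INV-2022-0417 | ACME Exports | USD 150,000"
print(registry.register(invoice))  # True  -- first financing
print(registry.register(invoice))  # False -- duplicate financing attempt
```

Putting that registry on a blockchain rather than a central database is what removes the need for any single trusted operator.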
For a time, it seemed as if blockchain would be the key to attaining the holy grail of trade digitisation, making trade faster, better and safer for all. But, as the industry started to move from small-scale initiatives and proofs of concept towards live global activity, problems began to emerge.
The fight for financial viability
In May this year, we.trade closed its doors after being unable to secure further investment to continue as a going concern – despite having some of the world’s biggest banks and tech behemoth IBM as backers.
Established as an independent company in 2017, we.trade counted among its shareholders institutions such as CaixaBank, Deutsche Bank, Erste Group, HSBC, KBC, Nordea, Rabobank, Santander, Société Générale, UBS and UniCredit. In 2019, we.trade brought its Hyperledger Fabric-based technology to market under a software-as-a-service model and onboarded 16 banks across 15 countries onto its platform. “we.trade found a way to turn competitors into collaborators,” said Mark Cudden, its then-chief technology officer. “Strong governance helped with convincing others to join. Most importantly, the benefits of working together to realise a shared vision to solve a shared problem were much greater than the fear of technology.”
However, transforming trade takes time, and banks tend to want to see a return on their digitisation investment dollars sooner rather than later. It didn’t take long for the platform to run into financial difficulties. In 2020, it was forced to slash its workforce by around half, after funding raised from some shareholder banks proved lower than expected, with many opting not to reinvest. An eleventh-hour cash injection from IBM enabled it to keep going, and a new capital round, in which we.trade managed to raise €5.5mn, took place in 2021. Unfortunately, that was only enough to keep it afloat for another 18 months, and, unable to convince its investors to continue to stump up cash, the company was forced into closure.
Having seen we.trade fail, banks are now more reticent about investing in blockchain technology, one industry source told GTR on the sidelines of the recent GTR UK conference. “Banks now need to know that you will still be around in three years’ time,” the source said.
we.trade did not respond to repeated requests for comment by GTR, but representatives of other bank-backed consortia are also familiar with the race to achieve success before the funding dries up.
“A blockchain ecosystem’s value is greatest at scale. The problem is, it is very difficult and expensive to build a network like that, and people underestimate the time and money required to reach critical mass,” says David Sutter, chief product officer at Marco Polo Network, which was launched in September 2017 to digitise open account trade, signing up multiple large trade finance banks. “The reason bank-owned consortiums fail is not a technology problem. It’s a business model problem more than anything: it’s very difficult to build a global profit-making enterprise at that scale. We have great shareholders who continue to invest and remain patient, whereas bank-owned consortiums are pure cost centres and that makes them impatient. Sometimes they are only able to look one budgeting cycle ahead. They need to show a return on investment quite quickly, whereas this type of wholesale, market-wide transformation can be a five, 10 or 15-year journey.”
“Scaling is hard,” adds Josh Kroeker, chief product officer at Contour. “But it is possible. It’s almost by design that there is going to be some consolidation, because while you want to have lots of different service providers, you don’t want that many networks. The biggest misunderstanding is around blockchain interoperability, where everybody builds their own network, and over time, they’ll be interoperable. That is not how it works. There are many technological challenges to interoperability, but beyond that, what’s the commercial model? Who gets paid for what side of the transaction? What’s the legal model? Why it’s still working for Contour and why we’ll continue with blockchain is because our vision is still to connect the world’s banks and corporates on a common network. And if you want to connect the world’s banks and corporates on a common network, it has to be decentralised.”
Achieving global scale
Not everyone is convinced that blockchain is in fact able to scale, however. In June, after four years of live production using the technology, MonetaGo announced it was moving its Secure Financing system to cloud computing instead, saying DLT no longer worked for its global use case.
“Once you get to global volumes in trade finance, blockchain ceases to be a competitive technology with respect to performance and cost,” Neil Shonhard, president of MonetaGo, told GTR at the time. “The infrastructure required is just excessive.”
“Blockchain works for low-volume, untrusted environments where participants actually want to and are able to host the nodes. But as we’ve reached scale, it no longer meets our needs. Using secure cloud technology, we can preserve data privacy and provide all the same assurances to financial institutions, but we can do it in a way that is much more efficient and scalable for a larger solution,” says Jesse Chenard, MonetaGo’s CEO and co-founder, who adds that the cost component in terms of licensing across multiple different customers had also become a barrier.
MonetaGo is not the only solution provider to scrap the tech. Commodity trade finance network Komgo has also switched out its underlying technology. “You can achieve distribution, immutability and security of data through other highly reliable but less complicated architectures,” Doug Court, head of communications at Komgo, tells GTR. “The main problem blockchain solves for is trust in an anonymous setting, but on a private chain the trust mechanism is established by the terms and conditions that members sign when they join. If there’s any dispute, they’re going to fall back on the contractual documentation, not the technical architecture. At Komgo, our objective is to provide solutions that solve real business problems as efficiently as possible. It’s not to provide a technology. We have more experience working on blockchain-based digital trade finance applications than perhaps any one else. There are challenges when it comes to scaling such complex and expensive technical solutions, so they need to add value, otherwise it’s just not sustainable.”
From crypto to tokens
Arguably, the only truly successful, profitable and scalable application of blockchain globally has been as an architecture for moving money around, be that in the shape of cryptocurrencies such as bitcoin, or in the tokenisation of assets for investors.
Here, says André Casterman, chair of the International Trade & Forfaiting Association’s fintech committee and CEO of the Trade Finance Distribution Initiative (TFDI), is where the real value might lie for trade.
“One use case where DLT adds value is to act as a tracking tool of an asset that can then be traded and paid for on the blockchain,” he tells GTR, adding that this is the proposal being delivered through the TFDI, which counts asset managers, banks, brokers, originators and tech providers among its members. “You and I as consumers could buy a token on the blockchain, and that liquidity will fund trade finance activities.”
In September last year, Tradeteq, which runs the TFDI technology platform, and blockchain network provider XDC did just that by carrying out what they said was the world’s first trade finance-based non-fungible token (NFT) transaction. Trade finance assets were repackaged into blockchain tokens for institutional investors to buy and sell, which Christoph Gugelmann, co-founder and CEO of Tradeteq, told GTR at the time would pave the way for an additional delivery mechanism of secondary liquidity for trade finance.
“Thanks to blockchain, we can enable the whole retail market to get access to the trade finance asset class in a low cost way, which we could not do otherwise,” says Casterman. “What’s more, the assets that Tradeteq and the XDC Network will help you fund are visible, because blockchain gives you transparency.”
However, he admits that blockchain is simply an enabling layer, rather than a solution in and of itself. “There are different ways to achieve the same thing,” he says. “Innovations are only successful if they can help the market do new things.”
Fail fast, learn fast
Although many blockchain initiatives have successfully developed technology and onboarded partners, precious few have been able to prove that they make enough of a difference to the trade finance industry to justify the change management involved in adopting their systems. As a result, while we.trade may be the first to give up on the dream, it is unlikely to be the last – and its demise bears some similarities to that of the bank payment obligation (BPO), widely cited as a cautionary tale of a great idea that failed to gain traction because its backers never thought through the incentives for take-up.
Adopting digital trade ecosystems requires serious financial commitment from both banks and corporates. The main challenge lies in demonstrating value, and this is a big ask when, despite all of the industry’s talk of data and digitisation, the world’s supply chains remain stubbornly analogue. Without the necessary building blocks in place, no technology, be that blockchain or otherwise, will be able to bring trade’s antiquated processes into the 21st century.
“Digitalisation will transform trade and supply chains but it won’t be an easy task,” says Steven Beck, head of trade and supply chain finance at the Asian Development Bank, in a recent blog post. “All participants in the trade ecosystem – exporters, shippers, ports, customs, warehousing/logistics, and importers – need to agree on the standards and protocols to underpin digitalisation before we can move the needle materially.”
Once this is achieved, blockchain-based ecosystems might just have a chance to succeed where other attempts to digitise trade finance have faltered.
“We’re not deluding ourselves,” says Marco Polo Network’s Sutter. “We are still very much in it for the end game, because there are too many problems and too much opportunity, and someone is going to have to digitise trade. What will happen is the world will change. The first electric cars were developed in the 1970s, but the market wasn’t ready. Half a century later, Elon Musk comes along with Tesla and achieves success. It’s going to happen. The only question is, when?”
What’s fast becoming clear is that whether a system runs on blockchain or not is something of a red herring. The goal must be to deliver functionality, not hype around new technology. And until the industry can address real pain points, achieve scale and drive end-to-end adoption of digital tools, blockchain – like the BPO before it – will become just another great innovation in trade that failed to achieve its potential.
Data Visualization Tools Market by Tool (Standalone and Integrated), Organization Size, Deployment Mode, Business Function, Vertical (BFSI, Telecommunications and IT, Healthcare and Life Sciences, Government), and Region – Global Forecast to 2026
The global Data Visualization Tools Market is expected to grow from USD 5.9 billion in 2021 to USD 10.2 billion by 2026, at a Compound Annual Growth Rate (CAGR) of 11.6% during the forecast period. Various factors, such as the growing demand for interactive views of data for faster business decisions and advances in Augmented Reality (AR) and Virtual Reality (VR) that let companies interact with data in 3D formats, are expected to drive demand for data visualization tools.
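As a quick sanity check on these figures, the implied growth rate can be verified with the standard CAGR formula; the dollar amounts below are taken from the forecast above, and the five-year span covers 2021 to 2026:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# USD 5.9bn in 2021 growing to USD 10.2bn by 2026 (a five-year span)
rate = cagr(5.9, 10.2, 5)
print(f"{rate:.1%}")  # ~11.6%, matching the stated CAGR
```

Running the figures the other way, USD 5.9bn compounded at 11.6% for five years gives roughly USD 10.2bn, so the forecast is internally consistent.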
Businesses providing software tools are also expected to slow down for a short period. However, real-time data applications, collaborative applications, analytics, security solutions, and AI are expected to see increased adoption once that slowdown passes. Verticals such as manufacturing, retail, and energy and utilities have witnessed a moderate slowdown, whereas the BFSI, government, and healthcare and life sciences verticals have felt minimal impact. Data visualization techniques have been front and center in efforts to communicate the science around COVID-19 to a very broad audience of policy makers, scientists, healthcare providers, and the public, with companies focusing on interactive dashboards for analyzing daily case numbers.
The cloud segment to grow at a higher CAGR during the forecast period
The data visualization tools market by deployment mode has been segmented into on-premises and cloud. The cloud segment is expected to grow at a rapid pace during the forecast period, a high CAGR attributable to easy deployment options and minimal capital and time requirements. The COVID-19 lockdowns, with social distancing and the shift to online purchasing of goods, are also expected to drive the adoption of cloud-based data visualization tools. On-premises deployments, meanwhile, remain favored where organizations want highly secure data encryption, complete data visibility, and greater control over data location and real-time availability.
Data visualization tools are essential to analyze massive amounts of information and make data-driven decisions. They provide an accessible way to see and understand trends, outliers, and patterns in data. As the technology advances, dynamic data stories with more automated, consumer-oriented experiences are expected to replace visual, point-and-click authoring and exploration, shifting the focus from predefined dashboards to in-context data stories. Users will increasingly expect data visualization solutions to stream the most relevant insights to each user based on their context, role, or use. Future advancements in data visualization tools are expected to leverage technologies such as augmented analytics, Natural Language Processing (NLP), streaming anomaly detection, and collaboration.
Some of the key players operating in the data visualization tools market include Salesforce (US), SAP (Germany), Microsoft (US), Oracle (US), IBM (US), AWS (US), Sisense (US), Alteryx (US), SAS Institute (US), Alibaba Cloud (China), Dundas (Canada), TIBCO Software (US), Qlik (US), GoodData (US), Domo (US), Klipfolio (Canada), Datafay (US), Zegami (England), Live Earth (US), Reeport (France), Cluvio (Germany), Whatagraph (The Netherlands), Databox (US), Datapine (Germany), Toucan Toco (France), and Chord (US). These data visualization tools vendors have adopted various organic and inorganic strategies to sustain their positions and increase their market shares in the global data visualization tools market.
Salesforce was founded in 1999 and is headquartered in California, US. In June 2019, Salesforce acquired Tableau, a data visualization platform provider, in a move that would strengthen its position in the digital transformation space. Its integrated platform provides a single-shared view of every customer across departments, including marketing, sales, commerce, and services. Salesforce has a community of over 10 million innovators, disruptors, and community shapers known as Trailblazers. The company offers a wide range of products and services across segments, including sales, services, marketing, application, analytics, employee experience, trailblazers and reskilling, and enablement and collaboration, most of which operate on a single trusted cloud platform. Salesforce also offers a cross-cloud technology named Salesforce 360, which helps its clients obtain a single integrated, holistic customer profile for various departments. The company caters to various industries, including BFSI, healthcare and life sciences, government, communications, retail, consumer goods, media, manufacturing, transportation, hospitality, automotive, and education. It has geographic presence in North America, Europe, APAC, and MEA.
SAP was founded in 1972 and is headquartered in Walldorf, Germany. The company is a leading provider of enterprise application solutions and services, including enterprise resource planning, supply chain management, data integration and quality, and master data management. Its solutions are compliant with GDPR and enable enterprises to build intelligent AI- and ML-based software that unites human expertise with machine-generated insights. The company segments its diverse portfolio into applications, technology and services, intelligent spend group, and Qualtrics. It works on an intelligent enterprise framework comprising experience, intelligence, and operations business models, and offers the SAP HANA platform through the experience model of that framework. The platform enables both transactional processing for data capture and retrieval and analytical processing for BI and reporting. Its offering caters to various industry verticals, including BFSI, public services, telecommunications, energy and utilities, transportation and logistics, travel and hospitality, healthcare and life sciences, and media and entertainment. SAP has more than 345,000 customers across 180 countries in the Americas, EMEA, and APAC.
Company Name: MarketsandMarkets™ Research Private Ltd.
Contact Person: Mr. Aashish Mehra
Address: 630 Dundee Road, Suite 430, IL 60062
Country: United States
(MENAFN- America News Hour)
Key companies covered in the Enterprise Metadata Management Tool market research are Oracle Corporation, IBM Corporation, SAP SE, Informatica LLC, ASG Technologies, and other key market players.
In its market research collateral archive, CRIFAX has added a report titled ‘Global Enterprise Metadata Management Tool Market, 2021-2030’, which studies the growth strategies used by the leading players in the enterprise metadata management tool market to stay ahead of the competition. The study also covers emerging trends, mergers and acquisitions, region-wise growth analysis, and the challenges that impact the growth of the market.
The growth of the global enterprise metadata management tool market is largely driven by the increasing number of technical developments across industries worldwide and the broader digital revolution. Digital economic development is one of the key factors motivating big players to invest aggressively in digital innovation and shift their conventional business models to automated ones, so as to seize value-producing opportunities, stay ahead of their competitors, and boost the continuity and reliability of their services. From artificial intelligence (AI), augmented reality (AR) and virtual reality (VR) to the internet of things (IoT), the growing number of internet-connected devices worldwide is anticipated to contribute to the growth of the global enterprise metadata management tool market.
Australian agribusiness firm Elders has tapped Publicis Sapient for customer experience design and technology strategy services as part of its upgrade to Dynamics 365.
Elders signed a five-year agreement to replace its legacy IBM AS/400 and SAP enterprise resource planning (ERP) systems with Microsoft Dynamics 365, with the aim of improving customer experience, profitability and security, and meeting sustainability goals.
The digital consultancy firm will also support the provider of livestock, real estate, insurance, advisory and other products and services to the farming sector by expanding its staff’s access to Power BI to improve analytics capabilities, implementing Microsoft’s E3 and E5 cybersecurity capabilities, and leveraging Microsoft Azure to improve the integration of new business acquisitions and support sustainability goals.
Elders’ chief information officer Viv Da Ros said the SaaS solution would improve the Adelaide-headquartered company’s technology strategy and customer experience offerings.
“The selection of Microsoft Dynamics 365 is the result of in-depth strategy development to ensure we took a customer lens to capturing requirements for our new systems.”
“Dynamics 365 was selected due to its ability to help us revolutionise our customer experience and embed analytics into core processes to help us make quicker, better-informed decisions through data,” Ros said.
Elders chief executive officer Mark Allison said the end-to-end solution would improve Elders’ processes from first contact through to service provision, billing and payment, as well as its capacity to offer more personalised products and services based on customers’ individual needs.
“The goal is to use technology to better serve our clients, support Australian farmers and our rural partners to work smarter and easier and make sure Elders is ready to seize the opportunities of further change. It represents a significant investment and will be a vital enabler of change across our organisation.”
“Australia is in a quantum change period for technology infrastructure. With innovations transforming the broader agriculture sector, we also need to ensure that our own systems provide us with a platform for future growth.”
Allison also said the upgrade was necessary to meet the company’s ESG targets.
“Our move to the cloud, and partnership with Microsoft, which has a strong demonstrated sustainability commitment, will also help Elders achieve our goal of halving Scope 1 and 2 emissions by 2030.”