A month after the National Institute of Standards and Technology (NIST) revealed the first quantum-safe algorithms, Amazon Web Services (AWS) and IBM have swiftly moved forward. Google was also quick to outline an aggressive implementation plan for its cloud service, building on post-quantum work it started a decade ago.
It helps that IBM researchers contributed to three of the four algorithms, while AWS had a hand in two. Google contributed to one of the submitted algorithms, SPHINCS+.
A long process that started in 2016 with 69 original candidates ends with the selection of four algorithms that will become NIST standards, which will play a critical role in protecting encrypted data from the vast power of quantum computers.
NIST's four choices include CRYSTALS-Kyber, a public-private key-encapsulation mechanism (KEM) for general asymmetric encryption, such as when connecting to websites. For digital signatures, NIST selected CRYSTALS-Dilithium, FALCON, and SPHINCS+. NIST will add a few more algorithms to the mix in two years.
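The interface a KEM like Kyber exposes can be sketched with a toy example. The code below is not Kyber: it builds the encapsulate/decapsulate interface on classical Diffie-Hellman with a deliberately tiny modulus, purely to show how one side derives a shared key and a ciphertext, and the other recovers the same key.

```python
import hashlib
import secrets

# Toy KEM to illustrate the encapsulate/decapsulate interface that Kyber
# also exposes. This is classical Diffie-Hellman with a toy modulus --
# NOT Kyber, and offering no post-quantum (or real-world) security.
P = 2**127 - 1   # a small Mersenne prime; far too small for real use
G = 3

def keygen():
    """Receiver: generate a long-term key pair."""
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return pk, sk

def encapsulate(pk):
    """Sender: derive a fresh shared key plus a ciphertext to transmit."""
    eph = secrets.randbelow(P - 2) + 1
    ciphertext = pow(G, eph, P)            # ephemeral public value
    shared = pow(pk, eph, P)
    key = hashlib.sha256(shared.to_bytes(16, "big")).digest()
    return ciphertext, key

def decapsulate(ciphertext, sk):
    """Receiver: recover the same shared key from the ciphertext."""
    shared = pow(ciphertext, sk, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

pk, sk = keygen()
ct, key_sender = encapsulate(pk)
key_receiver = decapsulate(ct, sk)
assert key_sender == key_receiver   # both sides hold the same session key
```

The point of the interface is that only the ciphertext crosses the wire; the shared key never does, which is why a KEM suits TLS-style connections.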
Vadim Lyubashevsky, a cryptographer who works in IBM's Zurich Research Laboratories, contributed to the development of CRYSTALS-Kyber, CRYSTALS-Dilithium, and FALCON. Lyubashevsky was predictably pleased by the algorithms selected, but he had only anticipated NIST would pick two digital signature candidates rather than three.
Ideally, NIST would have chosen a second key establishment algorithm, according to Lyubashevsky. "They could have chosen one more right away just to be safe," he told Dark Reading. "I think some people expected McEliece to be chosen, but maybe NIST decided to hold off for two years to see what the backup should be to Kyber."
After NIST identified the algorithms, IBM moved forward by implementing them in its recently launched z16 mainframe. IBM introduced the z16 in April, calling it the "first quantum-safe system," enabled by its new Crypto Express 8S card and APIs that provide access to the NIST-selected algorithms.
IBM had championed three of the algorithms that NIST selected, so it had already built them into the z16, even though the system was unveiled before the NIST decision. Last week, IBM made it official that the z16 supports the algorithms.
Anne Dames, an IBM distinguished engineer who works on the company's z Systems team, explained that the Crypto Express 8S card could implement various cryptographic algorithms. Nevertheless, IBM was betting on CRYSTALS-Kyber and CRYSTALS-Dilithium, according to Dames.
"We are very fortunate in that it went in the direction we hoped it would go," she told Dark Reading. "And because we chose to implement CRYSTALS-Kyber and CRYSTALS-Dilithium in the hardware security module, which allows clients to get access to it, the firmware in that hardware security module can be updated. So, if other algorithms were selected, then we would add them to our roadmap for inclusion of those algorithms for the future."
A software library on the system allows application and infrastructure developers to incorporate APIs so that clients can generate quantum-safe digital signatures for both classical computing systems and quantum computers.
"We also have a CRYSTALS-Kyber interface in place so that we can generate a key and provide it wrapped by a Kyber key so that could be used in a potential key exchange scheme," Dames said. "And we've also incorporated some APIs that allow clients to have a key exchange scheme between two parties."
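The wrapping Dames describes, using a Kyber-established secret as a key-encryption key for another key, can be sketched as below. This is an illustrative stand-in, not IBM's API: a real HSM would use an authenticated key-wrap algorithm such as AES-KW rather than the raw XOR shown here, and the `kek` here is just random bytes standing in for a Kyber shared secret.

```python
import hashlib
import hmac
import secrets

def wrap_key(kek: bytes, data_key: bytes) -> bytes:
    """XOR the data key with a keystream derived from the key-encryption
    key (kek). Illustrative only: real systems use an authenticated wrap
    such as AES-KW, not raw XOR."""
    stream = hmac.new(kek, b"wrap", hashlib.sha256).digest()[: len(data_key)]
    return bytes(a ^ b for a, b in zip(data_key, stream))

unwrap_key = wrap_key   # XOR with the same keystream inverts the wrap

kek = secrets.token_bytes(32)       # stand-in for a Kyber shared secret
data_key = secrets.token_bytes(16)  # the key to be transported
wrapped = wrap_key(kek, data_key)
assert unwrap_key(kek, wrapped) == data_key
```

The recipient, holding the same Kyber-derived secret, unwraps the data key without that key ever crossing the wire in the clear.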
Dames noted that clients might use Dilithium to generate digital signatures on documents. "Think about code signing servers, things like that, or documents signing services, where people would like to actually use the digital signature capability to ensure the authenticity of the document or of the code that's being used," she said.
During Amazon's AWS re:Inforce security conference last week in Boston, the cloud provider emphasized its post-quantum cryptography (PQC) efforts. According to Margaret Salter, director of applied cryptography at AWS, Amazon is already engineering the NIST standards into its services.
During a breakout session on AWS' cryptography efforts at the conference, Salter said AWS had implemented an open source, hybrid post-quantum key exchange in s2n-tls, its implementation of the Transport Layer Security (TLS) protocol used across different AWS services. AWS has contributed the hybrid key-exchange design as a draft standard to the Internet Engineering Task Force (IETF).
Salter explained that the hybrid key exchange brings together its traditional key exchanges while enabling post-quantum security. "We have regular key exchanges that we've been using for years and years to protect data," she said. "We don't want to get rid of those; we're just going to enhance them by adding a public key exchange on top of it. And using both of those, you have traditional security, plus post quantum security."
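The hybrid idea Salter describes, deriving one session key from both a classical and a post-quantum shared secret so the connection stays secure as long as either exchange remains unbroken, can be sketched with an HKDF-style derivation. This is an illustrative sketch, not the s2n-tls wire format; the two input secrets below are random stand-ins for, say, an ECDH secret and a Kyber secret.

```python
import hashlib
import hmac
import secrets

def hybrid_secret(classical_ss: bytes, pq_ss: bytes, transcript: bytes) -> bytes:
    """Derive one session key from both shared secrets (HKDF-style).
    An attacker must break BOTH exchanges to recover the session key."""
    ikm = classical_ss + pq_ss                                  # concatenate both secrets
    prk = hmac.new(transcript, ikm, hashlib.sha256).digest()    # HKDF-Extract
    return hmac.new(prk, b"hybrid key" + b"\x01", hashlib.sha256).digest()  # HKDF-Expand (1 block)

# Stand-ins for the two negotiated secrets (e.g. ECDH + Kyber).
ecdh_ss = secrets.token_bytes(32)
kyber_ss = secrets.token_bytes(32)
key_client = hybrid_secret(ecdh_ss, kyber_ss, b"handshake-transcript")
key_server = hybrid_secret(ecdh_ss, kyber_ss, b"handshake-transcript")
assert key_client == key_server
```

Because both secrets feed the derivation, a future quantum attack on the classical exchange alone still leaves the session key out of reach.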
Last week, Amazon announced that it had deployed the hybrid post-quantum TLS with CRYSTALS-Kyber in s2n-tls for connections to the AWS Key Management Service (AWS KMS) and AWS Certificate Manager (ACM). In an update this week, Amazon documented support for AWS Secrets Manager, a service for managing, rotating, and retrieving database credentials and API keys.
While Google didn't make implementation announcements like AWS in the immediate aftermath of NIST's selection, VP and CISO Phil Venables said Google has been focused on PQC algorithms "beyond theoretical implementations" for over a decade. Venables was among several prominent researchers who co-authored a technical paper outlining the urgency of adopting PQC strategies. The peer-reviewed paper was published in May by Nature, a respected journal for the science and technology communities.
"At Google, we're well into a multi-year effort to migrate to post-quantum cryptography that is designed to address both immediate and long-term risks to protect sensitive information," Venables wrote in a blog post published following the NIST announcement. "We have one goal: ensure that Google is PQC ready."
Venables recalled an experiment in 2016 with Chrome where a minimal number of connections from the Web browser to Google servers used a post-quantum key-exchange algorithm alongside the existing elliptic-curve key-exchange algorithm. "By adding a post-quantum algorithm in a hybrid mode with the existing key exchange, we were able to test its implementation without affecting user security," Venables noted.
Google and Cloudflare announced a "wide-scale post-quantum experiment" in 2019 implementing two post-quantum key exchanges, "integrated into Cloudflare's TLS stack, and deployed the implementation on edge servers and in Chrome Canary clients." The experiment helped Google understand the implications of deploying two post-quantum key agreements with TLS.
Venables noted that last year Google tested post-quantum confidentiality in TLS and found that various network products were not compatible with post-quantum TLS. "We were able to work with the vendor so that the issue was fixed in future firmware updates," he said. "By experimenting early, we resolved this issue for future deployments."
The four algorithms NIST announced are an important milestone in advancing PQC, but there's other work to be done besides quantum-safe encryption. The AWS TLS submission to the IETF is one example; others include such efforts as Hybrid PQ VPN.
"What you will see happening is those organizations that work on TLS protocols, or SSH, or VPN type protocols, will now come together and put together proposals which they will evaluate in their communities to determine what's best and which protocols should be updated, how the certificates should be defined, and things like that," IBM's Dames said.
Dustin Moody, a mathematician at NIST who leads its PQC project, shared a similar view during a panel discussion at the RSA Conference in June. "There's been a lot of global cooperation with our NIST process, rather than fracturing of the effort and coming up with a lot of different algorithms," Moody said. "We've seen most countries and standards organizations waiting to see what comes out of our nice progress on this process, as well as participating in that. And we see that as a very good sign."
According to the 2022 IBM Institute for Business Value study on AI Ethics in Action, building trustworthy Artificial Intelligence (AI) is perceived as a strategic differentiator and organizations are beginning to implement AI ethics mechanisms.
Seventy-five percent of respondents believe that ethics is a source of competitive differentiation. More than 67% of respondents who view AI and AI ethics as important indicate that their organizations outperform their peers in sustainability, social responsibility, and diversity and inclusion.
The survey showed that 79% of CEOs are prepared to embed AI ethics into their AI practices, up from 20% in 2018, but less than a quarter of responding organizations have operationalized AI ethics. Less than 20% of respondents strongly agreed that their organization's practices and actions match (or exceed) their stated principles and values.
Peter Bernard, CEO of Datagration, says that understanding AI gives companies an advantage, but Bernard adds that explainable AI allows businesses to optimize their data.
"Not only are they able to explain and understand the AI/ML behind predictions, but when errors arise, they can understand where to go back and make improvements," said Bernard. "A deeper understanding of AI/ML allows businesses to know whether their AI/ML is making valuable predictions or whether they should be improved."
Bernard believes this can ensure incorrect data is spotted early on and stopped before decisions are made.
Avivah Litan, vice president and distinguished analyst at Gartner, says that explainable AI also furthers scientific discovery as scientists and other business users can explore what the AI model does in various circumstances.
"They can work with the models directly instead of relying only on what predictions are generated given a certain set of inputs," said Litan.
But John Thomas, vice president and distinguished engineer at IBM Expert Labs, says that at its most basic level, explainable AI is the set of methods and processes that help us understand a model's output. "In other words, it's the effort to build AI that can explain to designers and users why it made the decision it did based on the data that was put into it," said Thomas.
Thomas says there are many reasons why explainable AI is urgently needed.
"One reason is model drift. Over time as more and more data is fed into a given model, this new data can influence the model in ways you may not have intended," said Thomas. "If we can understand why an AI is making certain decisions, we can do much more to keep its outputs consistent and trustworthy over its lifecycle."
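A minimal sketch of the drift monitoring Thomas alludes to: compare a live window of model scores against a reference window and flag when the mean has moved too far. This is an illustrative z-test, not IBM's method; production drift monitors use richer statistics (population stability index, KS tests, and so on).

```python
import statistics

def detect_drift(reference: list[float], live: list[float], threshold: float = 3.0) -> bool:
    """Flag drift when the live scores' mean sits more than `threshold`
    standard errors from the reference mean (a simple z-test)."""
    ref_mean = statistics.mean(reference)
    ref_sd = statistics.stdev(reference)
    se = ref_sd / (len(live) ** 0.5)
    z = abs(statistics.mean(live) - ref_mean) / se
    return z > threshold

baseline = [0.50 + 0.01 * (i % 5) for i in range(100)]   # stable model scores
shifted  = [0.70 + 0.01 * (i % 5) for i in range(100)]   # the model's output has moved
detect_drift(baseline, baseline[:50])   # → False: no drift
detect_drift(baseline, shifted)         # → True: outputs have shifted
```

Catching the shift is the easy half; explainability is what then tells you *which* inputs are pulling the model away from its intended behavior.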
Thomas adds that at a practical level, we can use explainable AI to make models more accurate and refined in the first place. "As AI becomes more embedded in our lives in more impactful ways, [..] we're going to need not only governance and regulatory tools to protect consumers from adverse effects, we're going to need technical solutions as well," said Thomas.
"AI is becoming more pervasive, yet most organizations cannot interpret or explain what their models are doing," said Litan. "And the increasing dependence on AI escalates the impact of mis-performing AI models with severely negative consequences."
Bernard takes it back to a practical level, saying that explainable AI [..] creates proof of what senior engineers and experts "know" intuitively while explaining the reasoning behind it. "Explainable AI can also take commonly held beliefs and prove that the data does not back it up," said Bernard.
"Explainable AI lets us troubleshoot how an AI is making decisions and interpreting data is an extremely important tool in helping us ensure AI is helping everyone, not just a narrow few," said Thomas.
Hiring is an example of where explainable AI can help everyone.
Thomas says hiring managers deal with all kinds of hiring and talent shortages and usually get more applications than they can read thoroughly. This means there is a strong demand to be able to evaluate and screen applicants algorithmically.
"Of course, we know this can introduce bias into hiring decisions, as well as overlook a lot of people who might be compelling candidates with unconventional backgrounds," said Thomas. "Explainable AI is an ideal solution for these sorts of problems because it would allow you to understand why a model rejected a certain applicant and accepted another. It helps you make your model better."
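The simplest form of the explanation Thomas describes is per-feature attribution: score a candidate and report what each feature contributed. The linear model, weights, and features below are entirely hypothetical, chosen only to show the idea behind attribution methods applied to more complex models.

```python
def explain(weights: dict[str, float], features: dict[str, float], bias: float = 0.0):
    """Score a candidate with a linear model and report each feature's
    signed contribution, ranked by how strongly it drove the decision."""
    contributions = {name: weights[name] * features[name] for name in weights}
    score = bias + sum(contributions.values())
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return score, ranked

# Hypothetical screening model and candidate (illustrative values only).
weights = {"years_experience": 0.6, "skills_match": 1.2, "gap_in_resume": -0.9}
candidate = {"years_experience": 3.0, "skills_match": 0.8, "gap_in_resume": 1.0}
score, why = explain(weights, candidate)
# `why` lists the drivers of the decision -- e.g. the resume gap penalised
# this candidate, exactly the kind of factor a reviewer may judge unfair.
```

Seeing that a rejection hinged on, say, a resume gap is what lets a hiring team decide whether that factor is legitimate signal or baked-in bias.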
IBM's AI Ethics survey showed that 85% of IT professionals agree that consumers are more likely to choose a company that's transparent about how its AI models are built, managed and used.
Thomas says explainable AI is absolutely a response to concerns about understanding and being able to trust AI's results.
"There's a broad consensus among people using AI that you need to take steps to explain how you're using it to customers and consumers," said Thomas. "At the same time, the field of AI Ethics as a practice is relatively new, so most companies, even large ones, don't have a Head of AI ethics, and they don't have the skills they need to build an ethics panel in-house."
Thomas believes it's essential that companies begin thinking about building those governance structures. "But there is also a need for technical solutions that can help companies manage their use of AI responsibly," said Thomas.
Bernard points to the oil and gas industry as an example of why explainable AI is necessary.
"Oil and gas have [..] a level of engineering complexity, and very few industries apply engineering and data at such a deep and constant level like this industry," said Bernard. "From the reservoir to the surface, every aspect is an engineering challenge with millions of data points and different approaches."
Bernard says in this industry, operators and companies still utilize spreadsheets and other home-grown systems built decades ago. "Utilizing ML enables them to take siloed knowledge, improve it and create something transferrable across the organization, allowing consistency in decision making and process."
"When oil and gas companies can perform more efficiently, it is a win for everyone," said Bernard. "The companies see the impact in their bottom line by producing more from their existing assets, lowering environmental impact, and doing more with less manpower."
Bernard says this leads to more supply to help ease the burden on demand. "Even modest increases like 10% improvement in production can have a massive impact in supply, the more production we have [..] consumers will see relief at the pump."
But Litan says the trend toward explainable AI is mainly driven by regulatory compliance.
A 2021 Gartner survey, AI in Organizations, reported that regulatory compliance is the top reason privacy, security and risk are seen as barriers to AI implementation.
"Regulators are demanding AI model transparency and proof that models are not generating biased decisions and unfair 'irresponsible' policies," said Litan. "AI privacy, security and/or risk management starts with AI explainability, which is a required baseline."
Litan says Gartner sees the biggest uptake of explainable AI in regulated industries like healthcare and financial services. "But we also see it increasingly with technology service providers that use AI models, notably in security or other scenarios," said Litan.
Litan adds that another reason explainable AI is trending is that organizations are unprepared to manage AI risks and often cut corners around model governance. "Organizations that adopt AI trust, risk and security management – which starts with inventorying AI models and explaining them – get better business results," adds Litan.
But IBM's Thomas doesn't think you can parse the uptake of explainable AI by industry.
"What makes a company interested in explainable AI isn't necessarily the industry they're in; it's whether they're invested in AI in the first place," said Thomas. "IT professionals at businesses deploying AI are 17% more likely to report that their business values AI explainability. Once you get beyond exploration and into the deployment phase, explaining what your models are doing and why quickly becomes very important to you."
Thomas says that IBM sees some compelling use cases in specific industries starting with medical research.
"There is a lot of excitement about the potential for AI to accelerate the pace of discovery by making medical research easier," said Thomas. "But, even if AI can do a lot of heavy lifting, there is still skepticism among doctors and researchers about the results."
Thomas says explainable AI has been a powerful solution to that particular problem, allowing researchers to embrace AI modeling to help them solve healthcare-related challenges because they can refine their models, control for bias and monitor the results.
"That trust makes it much easier for them to build models more quickly and feel comfortable using them to inform their care for patients," said Thomas.
IBM worked with Highmark Health to build a model using claims data to model sepsis and COVID-19 risk. But again, Thomas adds that because it's a tool for refining and monitoring how your AI models perform, explainable AI shouldn't be restricted to any particular industry or use case.
"We have airlines who use explainable AI to ensure their AI is doing a good job predicting plane departure times. In financial services and insurance, companies are using explainable AI to make sure they are making fair decisions about loan rates and premiums," said Thomas. "This is a technical component that will be critical for anyone getting serious about using AI at scale, regardless of what industry they are in."
What does the future look like with AI ethics and explainable AI?
Thomas says the hope is that explainable AI will see widespread adoption, because that would be a sign that companies take trustworthy AI, both the governance and the technical components, very seriously.
He also sees explainable AI as essential guardrails for AI Ethics down the road.
"When we started putting seatbelts in cars, a lot more people started driving, but we also saw fewer and less severe accidents," said Thomas. "That's the obvious hope - that we can make the benefits of this new technology much more widely available while also taking the needed steps to ensure we are not introducing unanticipated consequences or harms."
One of the most significant factors working against the adoption of AI and its productivity gains is the genuine need to address concerns about how AI is used, what types of data are being collected about people, and whether AI will put them out of a job.
But Thomas says that worry is contrary to what’s happening today. "AI is augmenting what humans can accomplish, from helping researchers conduct studies faster to assisting bankers in designing fairer and more efficient loans to helping technicians inspect and fix equipment more quickly," said Thomas. "Explainable AI is one of the most important ways we are helping consumers understand that, so a user can say with a much greater degree of certainty that no, this AI isn't introducing bias, and here's exactly why and what this model is really doing."
One tangible example IBM uses is AI Factsheets in its IBM Cloud Pak for Data. IBM describes the factsheets as "nutrition labels" for AI, which allow it to list the types of data and algorithms that make up a particular model, in the same way a food item lists its ingredients.
"To achieve trustworthy AI at scale, it takes more than one company or organization to lead the charge,” said Thomas. “AI should come from a diversity of datasets, diversity in practitioners, and a diverse partner ecosystem so that we have continuous feedback and improvement.”
Kevin Markarian is the cofounder of Roopler, an AI-driven lead generation platform built for the real estate industry.
Artificial intelligence is rapidly upending how people do business across industries, and yet skeptics still abound. But is there really a reason to fear AI?
AI will change how we work and do business, and its impact is already being felt. Still, that doesn’t mean it is something to fear. On the contrary, business managers and leaders who embrace AI and harness its potential now have everything to gain.
Making Sense Of AI
According to IBM, at its most basic, AI is anything that “leverages computers and machines to mimic the problem-solving and decision-making capabilities of the human mind.” But not all AI is built alike. There are two types of AI: narrow AI and strong AI.
Narrow AI is trained to perform specific tasks. A bot that can carry out a conversation with a potential customer is an example of narrow AI. Strong AI, which is what we're moving toward, is AI that can perform all the complex tasks and decision-making processes of a human (e.g., an emotionally intelligent machine that can make tough decisions, reflect on their impact, and recalibrate accordingly). Whether strong AI is just another flying car remains to be seen.
From Flying Cars To AI
As history has repeatedly shown, some visions of the future simply never come to pass. The first patent for a flying car was issued in 1918, and over a century later, we’re still not battling aerial car crashes. This hasn’t prevented people from dreaming and worrying about a future where skyways replace roadways. As a cofounder of a business powered by AI, my best guess is that AI is today’s flying car.
Since 2010, concerns about AI’s pending impact on the economy and the future of work have been on the rise. Unless you’ve been living off the grid, you’ve probably read dozens of articles on the subject by now, such as the 2020 article in Time that reported, “AI job automation has already replaced around 400,000 factory jobs in the U.S. from 1990 to 2007, with another 2 million on the way.”
The Time article isn’t factually incorrect. Some industries have experienced job losses, and I think more job losses might be coming. The article is also right to note that AI enables companies to do more with less. But this doesn’t mean that our jobs are threatened.
For example, consider the real estate industry. Today, AI is beginning to take over some aspects of lead generation and cultivation. While this may appear to be a threat, as someone who runs two successful brokerages and a tech company, I can assure you that the need for human agents isn’t disappearing. AI will help agents serve more consumers, but I don’t foresee anyone closing a deal on a home with a bot. Why? Because AI lacks the emotional intelligence and complexity required to help people make significant decisions, including buying and selling homes.
How Business Leaders Can Leverage AI
While AI may seem out of reach, even small- to medium-sized business owners can embrace AI.
• Leverage AI for lead generation. No one loves bad bots, but with a small investment and the right talent, you can already build AI-backed platforms that actually work. If you’re in a fast-paced, customer-focused industry like real estate or any other high-stakes sales industry, investing in AI can help you quickly respond to incoming client inquiries and close more deals over time.
• Use AI to find and recruit talent. Hiring great talent and building outstanding teams takes time and energy. With the capacity to sort through thousands of applications at a rate much faster than any human, AI is changing how we recruit talent. Better yet, it can help us discover candidates we may have overlooked in the past due to our own biases and assumptions. While AI isn't perfect (biases can be built into algorithms), it still holds the potential to help business owners cast their net wider, review more candidate applications and use increasingly nuanced criteria to recruit and build the very best talent pool.
• Let AI show you the way forward. Used to its full potential, AI can also point you and your business in entirely new directions, and for a simple reason. When you embrace AI, you have access to massive amounts of data about your customer base. You could use this information to keep doing what you're already doing, but the smartest business leaders let their AI point them in new directions.
Common Mistakes Made By New Adopters
We’ve all heard the saying, “If you can’t beat them, join them.” This also holds true for AI. AI isn’t going away. Business owners who embrace AI now and start exploring how it can help them do more will be the biggest winners. Still, it is also important to avoid these three common mistakes.
• Not recruiting the right talent to your team. If you're not already using AI, you likely don't have the right talent on your team. While outsourcing an AI initiative is always an option, your return on investment will ultimately be higher if you build your AI project in-house. This likely means recruiting new talent.
• Not appreciating the potential risks of investing in AI. AI also poses unique risks. If you invest in a new factory and the gamble doesn't pay off, you can still sell the factory and the equipment in it to recuperate part of your lost investment. If you invest in AI and the gamble doesn't pay off, it is a different story since you likely can't sell your algorithms, which were developed specifically for your business. In this respect, AI, for all its benefits, also poses unique risks to business owners.
• Assuming AI can do it all. Finally, it is important to keep AI in perspective. It can transform your business, but this doesn't mean you can put your business on autopilot. Even as AI transforms businesses, business leaders are still calling the shots.
Over the coming decades, AI will profoundly change how we live, learn and do business. But it won't do any of this without our vision, insights and permission. As business leaders, the most strategic thing we can do is embrace AI as an opportunity to serve our broader business mandates.
Forbes Business Council is the foremost growth and networking organization for business owners and leaders.
After successfully rolling out and bedding in its cloud-based human resource (HR) and finance system, the Financial Ombudsman Service (FOS) is now digging into its functionality.
In November 2021, alongside service partner IBM, the FOS completed the roll-out of the system from Workday and moved into a new phase of using the software, which set out to replace about 14 separate systems as a bedrock of its digital transformation.
Nicola Wadham, CIO at the Financial Ombudsman Service, told Computer Weekly that the service was already seeing the benefits of replacing multiple systems with a single cloud service, which she described as “the power of one”.
“We are up and running – we are stable, paying people and we have got through the key moments,” she said.
The organisation has also set a timeframe for pulling out of its datacentre, with all systems in the cloud. “The legacy systems are awaiting the executioner now. We have archived what we have to do and have a datacentre exit project,” said Wadham.
Set up in 2001 by MPs, the Financial Ombudsman Service settles disputes between consumers and financial services providers. It is contacted by more than one million people each year.
It is planning to digitally transform its entire back office, with Workday’s cloud-based software a foundation stone. The roll-out of Workday at the FOS was far from standard, coming as it did during the Covid-19 lockdown when people were forced to work remotely.
“In usual circumstances you would have in-person drop-in centres, floor walkers and the swivel chair to allow you to ask people how to do things,” said Wadham. “Instead, we had virtual drop-in clinics and good support information.” The FOS now has a hybrid working pattern where staff are in four days every fortnight.
But implementation is only ever half the task, according to Wadham. “It can go horribly wrong, as we all know, but it is the usage of the system and its exploitation where the gold is,” she said.
Three groups of users at the FOS will benefit from the new HR and finance system. Professional users who work in the finance and HR departments will be required to do less double-checking, less manual work, and processes that would normally require two people will now require one. This will reduce processing times significantly.
Meanwhile, managers can use the system for onboarding workers, ending employment contracts or doing appraisals. They will be able to see more information about their workforce than they used to, and do more self-service, according to Wadham.
The organisation is educating managers about the functionality available to them so they can get more out of the system. “It’s about helping them understand how to do the action and making sure they have got the right guide. Now we are back in the office, we are able to promote and advertise clinics that help people get on their way,” said Wadham.
It is also focusing on getting staff across the board to use the new systems, which is a challenge when they have used the same systems for decades. In the past, all these users were interacting with different systems, accessing different pockets of data.
“This is an important part of the wider digitisation strategy because you have to get your staff interacting with the new technology,” said Wadham.
Once staff begin to use the system, they can discover the less obvious, but valuable, functionality, or “the gold”, according to Wadham.
“We are now transitioning to the continuous improvement phase. There is a lot of good [functionality] out of the box, but you have to discover it, which is something we have to work on.”
The FOS is also using Workday Innovation Services, of which there are 24 available. The FOS has already adopted eight of them.
For example, it has implemented Workday Everywhere, which offers integration with Microsoft Teams so users don’t have to leave Workday to use Teams.
It is also using Workday Assistant, a chatbot that uses natural language to help staff complete tasks through voice requests, such as booking holidays or finding their pay slips. “I am hoping users will adopt that because it is going to make things quicker,” said Wadham.
Wadham and her team are currently looking at an interview scheduling function, which would allow candidates to schedule interview slots without telephoning. “This is niche, but powerful,” she said.
In the longer term, the FOS plans to use the system for workforce planning. “This is big for us, because we are demand-led and it helps us understand the make-up of our workforce, provide case work to the right people and ensure that our people have the right skills,” said Wadham.
The Workday-based transformation of the back office follows the FOS’s project to put its core front-end systems, such as its case management system, in the cloud.
It has used Microsoft Dynamics 365 for its case work since 2019. It is currently transitioning its case work system support arrangements to IT services provider Tata Consultancy Services and is building a consumer-focused portal with the supplier.
Until a few years ago, Moshe Yanai was considered a serial entrepreneur with a golden touch, a Midas of Israeli high-tech. Indeed, Yanai has had a long career as one of the world's largest data storage experts, having been part of IBM's mainframe success and competitor EMC’s revolutionary Symmetrix product. In 2008, Yanai demonstrated his magic touch once again when he sold two start-ups he had founded - XIV and Diligent Technologies - to IBM, one after the other, for a total sum of about $500 million.
Yanai thus became one of the richest high-tech entrepreneurs of the early 21st century. In addition, he owns a helicopter pilot training school, as well as an executive helicopter that once belonged to Senator Ted Kennedy, which Yanai uses to pilot VIPs around the country.
So when Yanai chose to found Infinidat more than a decade ago, with the promise that it was not intended for sale to a technology giant, it became one of the most intriguing companies in Israel. Yanai, who throughout his career had developed ground-breaking storage solutions and served them on a silver platter to US corporations, now wanted to do things differently. He aimed to establish a revolutionary storage system, one that would significantly improve the information storage capabilities of large enterprises, and would compete directly with the tech giants, all the way to an IPO.
No one imagined that within less than a decade the company would oust Yanai from its management, lose dozens of employees, and wrangle in the labor courts over lawsuits filed by those former employees. Nor could anyone have imagined how, a year after that wave of departures, Infinidat would turn into a profitable company, with strong investor backing, and a new management that sees the potential Yanai envisioned from the outset.
Average deal $700,000 a year
Infinidat had an ambitious vision that perhaps was also its Achilles’ heel: a smart storage system capable of hopping between different types of storage using principles of artificial intelligence, algorithms, and mathematics. The aim was to reduce costs and speed up application workloads for the enterprise. Underlying that vision is the same technology Yanai and his partners thought up a decade ago, currently protected by more than 100 patents. Infinidat is one of Israel’s biggest startups. It has raised $370 million in total, and employs about 500 people in Israel and the US.
Today, enterprises must choose between different types of storage: slow magnetic drives, flash-based solid-state drives (SSDs), and arrays of digital memory cells based on random access, (dynamic random-access memory or DRAM). The latter are fast, but their use, unfortunately, is much more expensive. Infinidat's algorithm learns the organization’s data flow - types of information and usage patterns - and knows to store it in the right place so that it can be accessed as needed, faster and more cheaply than the competition.
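Infinidat's actual Neural Cache is proprietary, but the tiering principle the article describes can be sketched in a few lines of Python. Everything below (the tier names, the thresholds, the frequency-only policy) is an illustrative assumption for this sketch, not Infinidat's design: blocks that are accessed often get promoted to faster, more expensive media.

```python
from collections import Counter

def assign_tiers(access_log, hot=100, warm=10):
    """Map each block ID to a storage tier based on observed access counts.

    Hypothetical policy: frequently read blocks go to DRAM (fast, costly),
    warm blocks to SSD, and cold blocks stay on HDD (slow, cheap).
    """
    counts = Counter(access_log)
    placement = {}
    for block, n in counts.items():
        if n >= hot:
            placement[block] = "DRAM"
        elif n >= warm:
            placement[block] = "SSD"
        else:
            placement[block] = "HDD"
    return placement

# Example: block "a" is accessed 150 times, "b" 20 times, "c" twice.
log = ["a"] * 150 + ["b"] * 20 + ["c"] * 2
print(assign_tiers(log))  # {'a': 'DRAM', 'b': 'SSD', 'c': 'HDD'}
```

A production system would of course learn thresholds and predict future access patterns rather than use fixed cutoffs, which is where the machine-learning claims come in.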
But Infinidat sells more than algorithms - it sells a complete system: flash drives, hard disks, its "Neural Cache" software that is the product’s smart core, and full-service company support - the "white glove" model of continuous performance monitoring and immediate troubleshooting. Today, the price of an average deal is about $700,000 per year, and can easily rocket into the millions of dollars.
"A premium product sold at high profit"
In September 2020, Shahar Bar-Or took up the post of GM of Infinidat Israel and Chief Product Officer. Since then, two complementary product lines have been developed: a flash drive system, for those wishing to further enhance their storage activity, and a new backup system that adds disaster recovery capabilities and cyber-attack resilience. The company declines to comment, but according to market estimates, its annual revenue rate now tops $300 million and it makes an operating profit.
In January of this year, the company announced to investors and employees that orders grew by 40% last year, with a 68% increase in the fourth quarter, compared with orders in the corresponding quarter in 2020.
Despite dreams of an IPO, the company is realistic. Just as things were going well for it, the market did an about-face, and the New York IPO window has been all but shut for almost a year.
Infinidat’s cost structure is beyond belief: the company has about 500 employees in Israel, as well as staff wherever in the world it has a customer. It produces its systems in-house and maintains production lines at its Kfar Saba facility, not to mention a half-million-shekel monthly electricity bill for the large server farm it leases from GDC of Herzliya, located down the street from its headquarters on Hamanofim Street.
"This is a premium product aimed at the largest customers in the world, and it’s sold at a high margin - it’s not an off-the-shelf product sold at a low profit. The company’s position as a privately-held, growing, and profitable company that has been working for several years with hundreds of large and important customers, allows us the flexibility to stay balanced," says Bar-Or.
The employees: Cancelled acquisitions, diluted shares
Unfortunately, many of the partners to this success no longer work for Infinidat. 2020 marked a turning point for the company, but this was preceded by a long period of turmoil accompanied by resignations, lawsuits, and changes in management. According to a lawsuit filed with Tel Aviv District Court and the Bat Yam Regional Labor Court by more than 30 of the company's original employees, it appears that already in early 2020, the company began reneging on its commitment to buy shares back from its employees - a plan first implemented in 2018 that was supposed to happen in 2020 as well. The employees claim the plan had kept them waiting for many years: the framework was supposed to run for five years, with the company committed to purchasing each year 2% of the special share capital issued to each employee, priced at approximately $1,300 per share.
According to the employees, the management refrained from providing proper information on the matter. It was then discovered that the company also stopped reporting regularly to the Registrar of Companies about other changes made to its share structure, statutes, and board of directors. Upon investigation, the employees discovered that the purchase plans actually diluted the remaining special shares in their possession. Concurrent with the repurchase plan, special employee shares were issued to the Claridge fund (the Bronfman family) and the ION fund, which were protected against dilution. This ultimately diluted the employees to half the shares they had originally been promised.
An examination conducted afterwards by a few veteran company employees made clear that their situation was even worse. The share series issued especially for them - which was supposed to provide them with protection from dilution so that, in case of a sale, IPO, or liquidation, they would be the first to receive 20% of the proceeds - had been changed continually without their knowledge since 2015, as new investors had come into the company. This paralleled the company's decision in 2020 to dilute its former employees to one-thousandth of their holdings.
According to the claimants, the original commitments to the employees were initiated by and made in the presence of super-entrepreneur Yanai, to persuade them to continue working at the company for years. In one example cited in the statement of claim, Yanai is even quoted as raising the notion that, should the company be sold for $1 billion, its employees would receive $200 million.
"In addition to the fact that harm was done to the employees, it was done covertly and only came to light later, after many years and thanks to the resourcefulness of the employees," attorney Yaron Alon of Horovitz, Even, Uzan & Co., who represents a large group of employees, told "Globes". A similar lawsuit has been filed by Gad Ticho and Alon Kanety of Caspi & Co. A significant number of the claimants were senior executives, some of whom had been with the company for years. These include the person who was Yanai’s manager at Elbit and then left with him for EMC; many of the company’s first product architects, and vice-presidents of marketing, sales, development and product throughout the life of the company. Dr. Alex Winokur, who managed development at both XIV and Axxana, (a startup Winokur founded that was eventually acquired by Infinidat), is now in the process of negotiating with the company on the terms of payment due to him. All these proceedings are at different stages in the courts.
"I’m happy the company is doing well, but that success must be attributed to the veteran employees who contributed to its establishment, to the invention of its products, and to their development," says Adv. Alon. "Those employees worked solely in the light of the explicit promises they received about the shares that were to be allocated to them. We are confident that the Economic Affairs Court and the Labor Court will compel the company to meet its obligations."
Infinidat responded: "We believe that the claims are baseless, and in any case will be determined by the appropriate courts."
Upheaval, promotions, and growth engines
The upheaval happened in 2020, after years of the company hemorrhaging operating losses, estimated at tens of millions of dollars a year. The board decided to remove Yanai from the position of chairman (he remains an active company director to this day), naming Boaz Chalamish, founder and president of Clarizen, in his place, with Kariel Sandler as co-CEO and CPO and Nir Simon as co-CEO and CFO. As part of the long recovery process, the company raised tens of millions of dollars from TPG Capital, the Bronfman family’s Claridge fund, ION, and Goldman Sachs. The process also included a plan to dilute the holdings of former employees which, although put into effect only a few months later, triggered employee resignations; other employees were furloughed due to the Covid-19 epidemic, and still others were fired. In all, the company shed 70% of its workforce.
"In September 2020, we identified those core employees who could be given greater responsibility and we promoted them to more senior positions," said Bar-Or. "I looked for the team leaders who, despite the turmoil, had the courage and strength - some of them even approached me and said, ‘I'm not going anywhere'. The absolute majority of directors you’ll see in the company today are team leaders who have taken responsibility and advanced. Similarly, many of our team leaders today were engineers who took on additional responsibilities. Although we had a high turnover of managers, and the average seniority of management is one year, the company is anchored by product, technology, sales and support that have continued to support customers throughout this time. We hired experienced managers from outside, mainly from large companies, built plans for launching two product lines, and focused on new growth engines, like flash products and backup.
"During the first half of the year, I was losing sleep from the weight of care and responsibility resting on me, but after this period we could say that the company was stabilizing and that the existential threat had lifted."
How did you transition from loss to profit?
"We cancelled unnecessary projects, and had to think better about adjusting the workforce to our revenue level. Up until a half year ago, the term ‘profit’ wasn’t much used in the Israeli high-tech lexicon, but already in 2020, we had committed to ourselves to not spending more than we could afford. That’s considered old school. During the first half of 2021, it was hard, because our teams needed more people, and we wouldn’t allow that until we felt we were meeting our sales targets. Now, in mid-2022, as we go into a global economic crisis, we’re 'privileged', because we’re already used to operating profitably. We have great conditions here, including fully stocked kitchens, every type of coffee machine, generous meal vouchers, events and activities - but we’re not the type to host extravagant performances or staff trips to exotic islands. We’ll invest in growth and our workers."
"I was excited by the challenge"
What was the moment when you said to yourself, "We've made it"?
"Towards the end of 2020, I saw that we’d succeeded in filling most of the critical positions through internal promotions and external recruitment. I also saw that the acceptance rate for our job offers had crossed the 90% threshold, which means that most of the candidates we interviewed, each considering several different job offers, decided to go with us. In addition, we saw the number of deals increasing rapidly. Up to that point, our competitors were doing unbelievable things, including going to one of our customers and telling them we were about to go under. We had to persuade the customer that our competitor was mistaken - and that customer decided to believe us, and has stayed with us to this day. The investors were behind us all the way. Gilad Shany of ION told me: ‘I’m not in your position but I can guess what you’re going through. Even if you miss, know that I’ve got your back.’"
You came from a very stable job. What persuaded you to stick your head into "the hornet's nest" at that time?
"The more difficult the situation described to me, the more attracted I was. They told me about the technology, the people, and the revenue, which was impressive even then, but also about the lawsuits, the loss of trust and the departures, and how desperate the situation seemed for people. That excited me even more, because this was a bigger challenge than coming to a company where everything was okay. Even though, almost every day I was asked at home ‘What were you thinking?’ or ‘What have you done?’, I saw the opportunity in a company with both technological and managerial challenges. After 15 years in corporate America, the combination of a large Israeli-American company with a great opportunity to bring value to Israeli high-tech attracted me. In retrospect, I’m grateful because this is a lesson you won’t learn if you don't roll up your sleeves and get to work."
Does the current economic crisis affect you?
"Since 2020, we’ve avoided unnecessary expenses. We’ve grown in a responsible manner, and we have no need or intention of cutting back. On the contrary: we have many open positions in Israel and in the world, and we’re hiring on the basis of our performance and increased sales. It’s true that in a difficult economic environment, companies are cutting back on many product purchases, but it is less likely that a senior executive at a major enterprise will cut back on storage at a time when the volume of information it collects doubles every period."
Published by Globes, Israel business news - en.globes.co.il - on July 31, 2022.
© Copyright of Globes Publisher Itonut (1983) Ltd., 2022.
Cognitive computing powerhouse IBM Watson Health is adding novel offerings and entering new agreements in an array of healthcare arenas.
This week, IBM Watson Health announced a slew of solutions and partnerships aimed at improving healthcare decision making and delivery. The announcement, which was released during a major gathering for the health IT community--the annual Healthcare Information and Management Systems Society (HIMSS) conference--covers offerings focused on value-based care, medical imaging, and population health.
"Healthcare organizations are operating in a complex and fluctuating business environment, one in which the insights they need to succeed can be hidden amidst an avalanche of disparate and siloed data," Deborah DiSanzo, general manager of IBM Watson Health, said in a press release. She added, "Watson Health's extensive industry expertise informs how we deploy data, cloud, and cognitive computing to help clients make more informed decisions today and understand precisely what their organization should address to achieve their quality care goals and outcomes in a value-based care system."
Among the new products are the IBM Watson Health Value-Based Care solutions. Applications set to be released later in 2017 include solutions that help track and forecast value-based care performance indicators, monitor patient engagement, and customize analytics, as well as tools that can help pinpoint areas of high cost.
IBM also unveiled IBM Watson Imaging Clinical Review, choosing to focus first on aortic stenosis. The offering is designed to alert clinicians to patients who may have aortic stenosis but haven't been identified as candidates for cardiovascular follow-up care, according to a press release. The platform is expected to eventually be expanded to nine more cardiovascular conditions, including cardiomyopathy, deep vein thrombosis, and heart attacks.
"Out of the gate, this type of cognitive tool may provide big benefits to hospitals and doctors, providing insights we don't currently have and doing so in a way that fits how we work," Ricardo Cury, MD, director of cardiac imaging at Baptist Health of South Florida and chairman and CEO of Radiology Associates of South Florida, said in the release.
Cury's institutions are among the new members of the Watson Health medical imaging collaborative, focused on optimizing the applications of medical imaging. IBM announced in the imaging release that there are now 24 organizations in the collaborative.
Another agreement announced the same day will bring the more than 2000 healthcare providers of the Central New York Care Collaborative (CNYCC) onto a population health platform run on the Watson Health Cloud. The effort aims to cut Medicaid costs and preventable emergency room visits, as well as cut hospital readmissions by 25%, according to a news release.
"Central New York is leading the way for a national movement toward an effective, scalable patient-centric approach to population health management and value-based care," Anil Jain, MD, FACP, vice president and chief health informatics officer, Value-Based Care at IBM Watson Health, said in the release.
IBM Watson Health also signed an agreement with a healthcare organization, Atrius Health. The collaboration will center on information that can be used to facilitate shared decision making and improve delivery of patient care in the eastern Massachusetts region covered by the healthcare organization.
"Atrius Health is committed to increasing the joy in the practice of medicine for our clinicians and staff," Steve Strongwater, MD, president and CEO of Atrius Health, said in a news release. "Working with IBM Watson Health offers a unique opportunity to help our Atrius Health clinicians make greater use of the mountains of digitalized information generated daily through our care of patients."
Of course, there's more to these new collaborations than the exciting potential of the technology. Forbes reported recently that a partnership between IBM Watson and MD Anderson has been paused, highlighting the important role that project management and finances--in addition to the technology--play in the success of such a joint effort.
IBM Watson Health already has an impressive list of collaborations under its belt, including with Quest Diagnostics, Medtronic, Johnson & Johnson, and Memorial Sloan Kettering. More partnerships seem likely, as IBM also debuted its Watson Health Consulting Services unit and new features for its Watson Platform for Health Cloud this week. These offerings echo IBM Watson Health's other priorities, with a continued focus on quicker, better insights, improved patient care, and value-based care.
"The launch of the new Watson Health Consulting Services unit is about helping our clients transform healthcare, in quality, improved access, patient satisfaction and lower cost in the cognitive healthcare era," Matt Porta, vice president and partner for IBM Watson Health Consulting Group, said in the release.
Given the million possibilities, Jitesh Lalwani, founder and CEO of Artificial Brain Tech Inc., believes quantum computing can provide the answers. His company, based in the US and Pune, has developed an algorithm that leverages the power of quantum computing accessed via cloud.
“The algorithm solves this problem by considering the point of interest and already existing electric charging locations and then optimally placing new charging points to cover as many people as possible," says Lalwani.
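Artificial Brain has not published its algorithm's internals, but the problem Lalwani describes is, in classical terms, a maximum-coverage facility-location problem, for which a greedy heuristic is the standard classical baseline. The sketch below uses made-up coordinates and a simple coverage radius; it illustrates the objective being optimized, not the company's quantum method.

```python
def greedy_charger_placement(candidates, demand_points, existing, radius, k):
    """Pick up to k new charger sites that cover the most demand points.

    candidates, demand_points, existing: lists of (x, y) coordinates.
    A demand point counts as covered if any charger lies within `radius`.
    """
    def covers(site, point):
        return (site[0] - point[0]) ** 2 + (site[1] - point[1]) ** 2 <= radius ** 2

    # Demand points already served by existing chargers are excluded.
    uncovered = {p for p in demand_points
                 if not any(covers(s, p) for s in existing)}
    chosen = []
    for _ in range(k):
        # Greedy step: take the candidate that newly covers the most points.
        best = max(candidates, key=lambda s: sum(covers(s, p) for p in uncovered))
        gained = {p for p in uncovered if covers(best, p)}
        if not gained:
            break
        chosen.append(best)
        uncovered -= gained
    return chosen

# Toy example: two demand clusters, one existing charger near the first.
demand = [(0, 0), (1, 0), (10, 10), (11, 10)]
print(greedy_charger_placement(
    candidates=[(0, 0), (10, 10)], demand_points=demand,
    existing=[(0, 1)], radius=2, k=1))  # [(10, 10)]
```

The combinatorial explosion of candidate subsets is exactly what makes exhaustive classical search infeasible at scale, and what motivates quantum formulations of the same objective.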
But why not use high-performance computers (HPCs) for the task? HPCs process data and perform complex calculations at high speeds. “These are very complex problems that classical computers (conventional ones we use in our everyday lives such as laptops, desktops, and even HPCs) are not capable of solving. And even if they do, it will take them millions of years," he explains.
Lalwani claims his company’s quantum algorithm returns results from an actual quantum computer in less than three seconds for 8543811434435330 (8.5 * 10^15) combinations.
Artificial Brain now plans to modify this quantum algorithm to find optimal locations for wind and solar farms, too.
Chennai-based Quantica Computacao is another company that’s betting on quantum computing. It aims to develop quantum cryptographic tools to help protect banking transactions. It is also working on quantum machine learning (ML) and artificial intelligence (AI) tools to help researchers accelerate their algorithms.
These two startups underline that quantum computing is no longer an esoteric science confined to research labs—it is beginning to find business applications. More of this later. First, let’s demystify the quantum computer.
The computers we see and use in our homes and offices today process information with bits (ones and zeroes); they are referred to as classical or conventional computers. Quantum computers, on the other hand, use quantum bits, or qubits, which can hold ones and zeroes simultaneously thanks to a property known as superposition, allowing them to process far more information than traditional computers. In October 2019, Google said it had performed a calculation on a quantum processor in 300 seconds that would have been practically impossible to achieve with the algorithms available at the time; the result was published in the journal Nature.
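Superposition can be illustrated without quantum hardware by simulating a single qubit's state vector in plain Python. The sketch below applies a Hadamard gate to the |0⟩ state, producing an equal superposition whose measurement probabilities split 50/50. This is a toy classical simulation for intuition, not a program for a real quantum computer.

```python
import math

# A qubit's state is a pair of complex amplitudes [amp_0, amp_1].
ket0 = [1 + 0j, 0 + 0j]  # the |0> state

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to (|0> + |1>) / sqrt(2)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Born rule: the probability of each outcome is |amplitude|^2."""
    return [abs(amp) ** 2 for amp in state]

plus = hadamard(ket0)
print(probabilities(plus))  # ~[0.5, 0.5]: equal chance of measuring 0 or 1
```

The power of real quantum hardware comes from doing this with many entangled qubits at once, where the state vector grows exponentially and can no longer be simulated this way.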
Quantum computing also rides on ‘quantum entanglement’, a property that Albert Einstein referred to as “spooky action at a distance" since it allows quantum particles to connect regardless of their location in the universe.
On 27 January, scientists from two premier Ahmedabad-based laboratories of the Department of Space—the Space Applications Centre and Physical Research Laboratory—jointly demonstrated quantum entanglement with real-time Quantum Key Distribution (QKD) between two buildings separated by a distance of 300 metres. QKD allows any two parties to generate random secret keys that can be shared exclusively to encrypt and decrypt messages. This makes the communication highly secure, which is especially vital for defence and strategic agencies across the globe.
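The key-distribution idea can be illustrated with BB84, the classic protocol underlying many QKD systems. The toy simulation below (random numbers only; no photons, channels, or eavesdropper) shows the sifting step: Alice and Bob keep just the bits where their randomly chosen measurement bases happen to match, which on average is about half of them.

```python
import random

def bb84_sift(n=1000, seed=42):
    """Toy BB84 sifting: keep bits where Alice's and Bob's bases agree.

    '+' and 'x' stand for the two measurement bases. With matching bases
    and no eavesdropper, Bob reads Alice's bit exactly, so the surviving
    bits form a shared secret key of roughly n/2 bits.
    """
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]
    bob_bases = [rng.choice("+x") for _ in range(n)]
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift()
print(len(key))  # roughly 500 of the 1000 raw bits survive sifting
```

In a real deployment, an eavesdropper measuring the photons disturbs them, so Alice and Bob detect the intrusion by comparing a sample of the sifted key; that physical guarantee is what the demonstration between the two buildings exercised.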
Race for pace
Companies like International Business Machines Corp. (IBM), D-Wave Systems, Google, Microsoft, Amazon, Nvidia, and Intel have begun providing cloud-based quantum services. IBM has built over 30 quantum computers in the last four to five years, of which over 20 are active right now, with IBM providing access to them through the IBM Cloud. However, while most tech companies use the quantum gate approach (similar to logic gates in conventional computers), D-Wave uses both ‘quantum annealing’ and quantum gates.
Also referred to as adiabatic quantum computing, D-Wave’s quantum annealing approach can determine the lowest energy state of a system using the superposition property of qubits. The University of Southern California Information Sciences Institute (USC-ISI), for instance, has used D-Wave’s quantum computer to advance data analysis for the discovery of Higgs bosons, and methods for doing error correction with quantum annealing. Informally known as the ‘God Particle’, the Higgs boson is the fundamental particle associated with the Higgs field, responsible for giving other fundamental particles their mass.
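D-Wave's annealer minimizes an Ising energy function using physical qubits; the same objective can be attacked classically with simulated annealing, which is where the name comes from. The toy below finds a low-energy configuration of a small ferromagnetic spin chain. It is a classical analogue for intuition only, not what the quantum processor actually does.

```python
import math
import random

def ising_energy(spins, J=1.0):
    """Energy of a 1-D ferromagnetic Ising chain: aligned neighbours lower it."""
    return -J * sum(s1 * s2 for s1, s2 in zip(spins, spins[1:]))

def simulated_anneal(n=8, steps=5000, seed=0):
    """Classical simulated annealing: always accept flips that lower the
    energy, and accept uphill flips with probability exp(-delta / temp)."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    for step in range(steps):
        temp = max(0.01, 2.0 * (1 - step / steps))  # simple cooling schedule
        i = rng.randrange(n)
        before = ising_energy(spins)
        spins[i] *= -1
        delta = ising_energy(spins) - before
        if delta > 0 and rng.random() >= math.exp(-delta / temp):
            spins[i] *= -1  # reject the uphill flip
    return spins, ising_energy(spins)

spins, energy = simulated_anneal()
print(energy)  # the chain's ground-state energy would be -(n - 1) = -7
```

A quantum annealer explores the same energy landscape via superposition and tunneling rather than random thermal flips, which is the hoped-for source of speedup on hard instances.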
And all companies are focused on adding more qubits. While IBM aims to launch a 4,000-qubit quantum computer by 2025, D-Wave’s Advantage2 system is expected to feature 7,000 qubits with a new qubit design. However, given that the two companies are building these computers with different technologies (IBM’s gate-based approach vs D-Wave’s annealing), a direct comparison would not be fair. “It’s better to have options for different objects and implementations," says Federico Spedalieri, research assistant professor at USC-ISI.
Google’s parent Alphabet, meanwhile, is building advanced quantum computing hardware with a focus on developing quantum AI. Alphabet also has a dedicated “secretive" team working on quantum tech, which it has now spun off into a separate company called SandboxAQ with AQ standing for ‘AI and Quantum’. Google plans to have a one million qubit quantum computer ready by 2030.
Microsoft, on its part, offers access to quantum computers from numerous makers on its Azure cloud platform, while Intel is developing its own silicon-based quantum computing hardware.
How is India placed?
The quantum ecosystem in India is growing at an accelerated pace with support from government agencies and participation from academia, service providers, and the startup community, points out a report titled The Quantum Revolution in India, which was jointly published by Nasscom and management consulting firm Avasant this February.
India, for instance, plans to develop a quantum computer with about 50 qubits by 2026. India also has a Quantum Simulator platform built by the Indian Institute of Science (IISc), Bengaluru, Indian Institute of Technology (IIT)-Roorkee, and Pune-based Centre for Development of Advanced Computing (C-DAC), which allows users to do quantum simulation using computing resources from C-DAC’s high-performance computers like PARAM Shavak and PARAM Siddhi.
In a recent interview with Mint, Murray Thom, vice president of product management at D-Wave, underscored that, with over 25,000 users, India has had the third-highest number of quantum cloud service sign-ups since the company began providing access to users like Lalwani in 2020.
Dario Gil, senior VP and director of IBM Research, too, pointed out in an interview to Mint that IBM is having “many conversations right now with some IITs and leading centres for training to develop a curriculum and certification. The Qiskit (a software development kit in Python used to programme quantum computers) textbook is now also available in Tamil, Bengali and Hindi".
Indian IT services providers, too, are getting their feet wet in this space. Last September, Infosys said it is partnering with Amazon Web Services to develop quantum computing capabilities. This April, Tech Mahindra’s research and development arm, Makers Lab, announced it has set up a quantum centre of excellence called QNxT in Finland to leverage the country’s expertise in quantum computing. It also plans to set up quantum centres in Pune and Hyderabad to explore applications in sectors like telecom, 5G, energy, and healthcare.
Tata Consultancy Services (TCS) is working on quantum algorithms for applications in optimization, ML, image processing, molecular simulations and use cases such as portfolio and risk, transportation, logistics, and communication. HCL, meanwhile, is developing use cases for transport and logistics, finance, and security. Zensar is focusing on areas like drug discovery, genomic analysis, fraud detection, advanced materials, credit risk optimization, and supply chain optimization.
The adoption of quantum technologies across industries could thus potentially add $280–310 billion of value to the Indian economy by 2030, with the manufacturing, high-tech, banking, and defence sectors at the forefront of quantum-led innovation, according to the Nasscom-Avasant report.
Globally, IBM is working with the likes of JP Morgan Chase, Goldman Sachs, Wells Fargo, Mizuho Bank, Daimler, energy companies, and some materials companies, according to Gil. Sectors such as pharmaceuticals, chemicals, automotive, and finance could see short-term benefits of anywhere between $300 billion and $700 billion in value from the technology, according to a December 2021 report by McKinsey and Co., titled Quantum computing: An emerging ecosystem and industry use cases.
Before quantum computers can solve business problems better than classical computers (known as ‘quantum advantage’), or solve problems that classical computers cannot solve at all (called ‘quantum supremacy’), many hurdles have to be overcome.
For one, quantum computers are highly prone to interference, which leads to errors in the quantum algorithms running on them. According to the McKinsey report cited above, while multiple quantum-computing hardware platforms are being developed, it will be important to achieve “fully error-corrected, fault-tolerant quantum computing, without which a quantum computer cannot provide exact, mathematically accurate results". That said, some companies, including Google, have announced plans to have fault-tolerant quantum-computing hardware by 2030.
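Quantum error correction is far richer than its classical counterpart, but the underlying redundancy idea can be shown with the classical three-bit repetition code: encode one logical bit into three physical bits and recover it by majority vote after a single flip. (Real quantum codes must work around the no-cloning theorem and correct phase errors too, so this is an analogy, not a quantum code.)

```python
def encode(bit):
    """Repetition code: one logical bit becomes three physical copies."""
    return [bit, bit, bit]

def flip(codeword, i):
    """Simulate a single noise event flipping physical bit i."""
    corrupted = list(codeword)
    corrupted[i] ^= 1
    return corrupted

def decode(codeword):
    """Majority vote corrects any single bit flip."""
    return 1 if sum(codeword) >= 2 else 0

word = encode(1)
noisy = flip(word, 0)   # noise flips the first copy: [0, 1, 1]
print(decode(noisy))    # 1 -- the logical bit survives
```

Fault-tolerant quantum schemes spend many physical qubits per logical qubit in the same spirit, which is one reason qubit counts must grow so dramatically before error-corrected machines are practical.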
Second, most quantum computers cannot function without being super-cooled to a little above absolute zero since heat generates error or noise in qubits. But this June, the Pawsey Supercomputing Research Centre in Perth said it is working with German-Australian startup Quantum Brilliance to test the latter’s two-qubit diamond quantum ‘accelerator’ that uses synthetic diamonds and runs at room temperature in any environment. The Centre is currently testing it by pairing it with their new state-of-the-art supercomputer, Setonix. Running quantum computers at room temperature may prove a game changer.
Finding the right talent is another big hurdle. The National Mission on Quantum Technologies and Applications (NM-QTA), a government of India programme, aims to create a workforce of over 25,000 in India over the next five to seven years, but there is an acute shortage of candidates with doctorates in quantum physics, engineering, and statistics.
That said, the Defence Institute of Advanced Technology (DIAT) in Pune launched an MTech in quantum computing in 2020. IISc Bangalore followed by offering the same course a year later. IBM has partnered with top-tier academic institutions in India to provide access to IBM quantum systems, while Microsoft Garage India has joined hands with IIT Roorkee to conduct lectures on quantum computing for an entire semester.
Startups are pitching in, too. QpiAI, a startup that leverages quantum computing and AI to offer solutions to different industries, offers a module-based study for AI and quantum certification, while Qulabs Software, another quantum solutions firm, offers six-month internship projects at the company.
Others like Sumant Parimal, founding partner of research and advisory firm Innogress, are focusing on building the ecosystem. Innogress plans to set up the Greater Karnavati Quantum Computing Technology Park in Gujarat to enable everything from “R&D to design and engineering, simulation to testing and manufacturing to packaging and skilling".
“We are talking to potential technology partners and investors. It will require an initial capex of around $300 million. It’s a 5–10-year road map," says Parimal.
He has a point. China and the European Union rank first and second on public funding for quantum computing with investments of $15 billion and $7.2 billion, respectively, according to a McKinsey report. The US, the UK, and India follow at a distant third, fourth and fifth place, respectively, with a little over $1 billion each.
To be sure, the Indian government did announce NM-QTA in 2020, with a total budget outlay of ₹8,000 crore over five years to be implemented by the Department of Science and Technology, but the mission is yet to get Cabinet clearance.
Despite these hurdles, quantum computing is set to grow, though it is likely to do so through a hybrid operating model that combines conventional computing with emerging quantum computing until the latter comes of age. As the McKinsey report puts it: “Change may come as early as 2030, as several companies predict they will launch usable quantum systems by that time.”
The report also advises companies to start formulating “their quantum computing strategies, especially in industries such as pharmaceuticals that may reap the early benefits of commercial quantum computing”.
The Enterprise Streaming and Webcasting Market 2022 Global Industry research report presents an analysis of market size, share, growth, trends, and cost structure, along with statistical and comprehensive data on the global market. Top key players are: STREAMSHARK, Kollective, Wistia, IBM, Arkena, VIMEO, Qumu, Agile Content, Viocorp, Vbrick, DACAST, Haivision, Ooyala, Vidizmo.
The global “Enterprise Streaming and Webcasting Market” report provides a meticulous analysis of market dynamics, size, share, current developments, and trending business strategies. The report gives a complete analysis of different segments on the basis of type, application, and region. The study combines primary and secondary information with inputs from the major players in the global Enterprise Streaming and Webcasting industry. Experts use precise market research techniques and tools to assemble comprehensive and accurate research reports.
Get a sample PDF of the report at: https://www.researchreportsworld.com/enquiry/request-sample/21166626
The global Enterprise Streaming and Webcasting market is anticipated to rise at a considerable rate during the forecast period, between 2022 and 2029. The report examines the key trends and market drivers in the current and forecast scenarios and offers on-the-ground insights. The report also includes an appraisal of the effect of COVID-19 on this industry.
The Enterprise Streaming and Webcasting market analysis provides a detailed analysis of global market size, regional and country-level market size, segmentation market growth, market share, competitive landscape, sales analysis, impact of domestic and global market players, value chain optimization, trade regulations, recent developments, opportunities analysis, strategic market growth analysis, product launches, area marketplace expansion, and technological innovations.
The Enterprise Streaming and Webcasting market is split by Type and by Application. For the period 2017-2029, the growth among segments provides accurate calculations and forecasts for sales by Type and by Application in terms of volume and value. This analysis can help you expand your business by targeting qualified niche markets.
Scope of the Report
Get a sample copy of the Enterprise Streaming and Webcasting Market Report 2022
List of TOP KEY PLAYERS in the Enterprise Streaming and Webcasting Market Report:
On the basis of product, this report displays the production, revenue, price, market share and growth rate of each type, primarily split into
On the basis of the end users/applications, this report focuses on the status and outlook for major applications/end users, consumption (sales), market share and growth rate for each application, including
Enquire before purchasing this report: https://www.researchreportsworld.com/enquiry/pre-order-enquiry/21166626
Covid-19 Impact On Market –
The Enterprise Streaming and Webcasting Market Forecast report gives required information regarding the total valuation this industry holds presently, and it also lists the segmentation of the market along with the growth opportunities present across this business vertical. The global Enterprise Streaming and Webcasting market growth report 2022 provides a basic overview of the industry, including definitions, classifications, applications, and industry chain structure. Moreover, consumer behaviour analysis and market dynamics such as drivers, restraints, and opportunities provide crucial information for understanding the Enterprise Streaming and Webcasting market. The impact of the COVID-19 pandemic on Enterprise Streaming and Webcasting market share, global consumer prices, and annual growth rate was also examined in the study.
To know how the COVID-19 pandemic will impact this market/industry, request a sample copy of the report: https://www.researchreportsworld.com/enquiry/request-covid19/21166626
Why Buy This Report:
Regions and Countries Level Analysis
Regional analysis is another highly comprehensive part of the research and analysis study of the global Enterprise Streaming and Webcasting market presented in the report. This section sheds light on the sales growth of different regional and country-level Enterprise Streaming and Webcasting markets. For the historical and forecast period 2017 to 2029, it provides detailed and accurate country-wise volume analysis and region-wise market size analysis of the global Enterprise Streaming and Webcasting market.
Geographically, the report includes the research on production, consumption, revenue, market share and growth rate, and forecast (2017-2029) of the following regions:
Years considered for this report:
Historical Years: 2017-2021
Base Year: 2021
Estimated Year: 2022
Forecast Period: 2022-2029
Purchase this report (Price 3450 USD for single user license)
Major Points from Table of Contents:
1 Enterprise Streaming and Webcasting Market Overview
1.1 Product Overview and Scope of Enterprise Streaming and Webcasting
1.2 Enterprise Streaming and Webcasting Segment by Type
1.2.1 Global Enterprise Streaming and Webcasting Sales and CAGR Comparison by Type (2017-2029)
1.3 Global Enterprise Streaming and Webcasting Segment by Application
1.3.1 Enterprise Streaming and Webcasting Consumption (Sales) Comparison by Application (2017-2029)
2 Global Enterprise Streaming and Webcasting Market Landscape by Player
2.1 Global Enterprise Streaming and Webcasting Sales and Share by Player (2017-2022)
2.2 Global Enterprise Streaming and Webcasting Revenue and Market Share by Player (2017-2022)
2.3 Global Enterprise Streaming and Webcasting Average Price by Player (2017-2022)
2.4 Global Enterprise Streaming and Webcasting Gross Margin by Player (2017-2022)
2.5 Enterprise Streaming and Webcasting Manufacturing Base Distribution, Sales Area and Product Type by Player
2.6 Enterprise Streaming and Webcasting Market Competitive Situation and Trends
2.6.1 Enterprise Streaming and Webcasting Market Concentration Rate
2.6.2 Enterprise Streaming and Webcasting Market Share of Top 3 and Top 6 Players
2.6.3 Mergers and Acquisitions, Expansion
3 Enterprise Streaming and Webcasting Upstream and Downstream Analysis
3.1 Enterprise Streaming and Webcasting Industrial Chain Analysis
3.2 Key Raw Materials Suppliers and Price Analysis
3.3 Key Raw Materials Supply and Demand Analysis
3.4 Manufacturing Process Analysis
3.5 Market Concentration Rate of Raw Materials
3.6 Downstream Buyers
3.7 Value Chain Status Under COVID-19
4 Enterprise Streaming and Webcasting Manufacturing Cost Analysis
4.1 Manufacturing Cost Structure Analysis
4.2 Enterprise Streaming and Webcasting Key Raw Materials Cost Analysis
4.2.1 Key Raw Materials Introduction
4.2.2 Price Trend of Key Raw Materials
4.3 Labor Cost Analysis
4.3.1 Labor Cost of Enterprise Streaming and Webcasting Under COVID-19
4.4 Energy Costs Analysis
4.5 R&D Costs Analysis
5 Market Dynamics
5.2 Restraints and Challenges
6 Players Profiles
6.1.1 Basic Information, Manufacturing Base, Sales Area and Competitors
6.1.2 Enterprise Streaming and Webcasting Product Profiles, Application and Specification
6.1.3 Enterprise Streaming and Webcasting Market Performance (2017-2022)
6.1.4 Business Overview
7 Global Enterprise Streaming and Webcasting Sales and Revenue Region Wise (2017-2022)
7.1 Global Enterprise Streaming and Webcasting Sales and Market Share, Region Wise (2017-2022)
7.2 Global Enterprise Streaming and Webcasting Revenue and Market Share, Region Wise (2017-2022)
7.3 Global Enterprise Streaming and Webcasting Sales, Revenue, Price and Gross Margin (2017-2022)
7.4 United States Enterprise Streaming and Webcasting Sales, Revenue, Price and Gross Margin (2017-2022)
7.4.1 United States Enterprise Streaming and Webcasting Market Under COVID-19
7.5 Europe Enterprise Streaming and Webcasting Sales, Revenue, Price and Gross Margin (2017-2022)
7.5.1 Europe Enterprise Streaming and Webcasting Market Under COVID-19
7.6 China Enterprise Streaming and Webcasting Sales, Revenue, Price and Gross Margin (2017-2022)
8 Global Enterprise Streaming and Webcasting Sales, Revenue, and Price Trend by Type
8.1 Global Enterprise Streaming and Webcasting Sales and Market Share by Type (2017-2022)
8.2 Global Enterprise Streaming and Webcasting Revenue and Market Share by Type (2017-2022)
8.3 Global Enterprise Streaming and Webcasting Price by Type (2017-2022)
8.4 Global Enterprise Streaming and Webcasting Sales Growth Rate by Type (2017-2022)
9 Global Enterprise Streaming and Webcasting Market Analysis by Application
9.1 Global Enterprise Streaming and Webcasting Consumption and Market Share by Application (2017-2022)
9.2 Global Enterprise Streaming and Webcasting Consumption Growth Rate by Application (2017-2022)
10 Global Enterprise Streaming and Webcasting Market Forecast (2022-2029)
10.1 Global Enterprise Streaming and Webcasting Sales, Revenue Forecast (2022-2029)
10.1.1 Global Enterprise Streaming and Webcasting Sales and Growth Rate Forecast (2022-2029)
10.1.2 Global Enterprise Streaming and Webcasting Revenue and Growth Rate Forecast (2022-2029)
10.1.3 Global Enterprise Streaming and Webcasting Price and Trend Forecast (2022-2029)
10.2 Global Enterprise Streaming and Webcasting Sales and Revenue Forecast, Region Wise (2022-2029)
11 Research Findings and Conclusion
12.2 Research Data Source
Browse the complete table of contents at: https://www.researchreportsworld.com/TOC/21166626#TOC
Research Reports World is a credible source for market reports that will provide you with the lead your business needs. At Research Reports World, our objective is to provide a platform for many top-notch market research firms worldwide to publish their research reports, as well as to help decision makers find the most suitable market research solutions under one roof. Our aim is to provide the best solution that matches the exact customer requirements. This drives us to provide you with custom or syndicated research reports.
Research Reports World
US (+1) 424 253 0807
UK (+44) 203 239 8187
Press Release Distributed by The Express Wire
To view the original version on The Express Wire visit Enterprise Streaming and Webcasting Market Size 2022, Trends, Growth Insight, Share, Competitive, Regional, And Global Industry Forecast to 2029
New Power10 scale-out and midrange models extend IBM's capabilities to deliver flexible and secured infrastructure for hybrid cloud environments
ARMONK, N.Y., July 12, 2022 /PRNewswire/ -- IBM (NYSE: IBM) today announced a significant expansion of its Power10 server line with the introduction of mid-range and scale-out systems to modernize, protect and automate business applications and IT operations. The new Power10 servers combine performance, scalability, and flexibility with new pay-as-you-go consumption offerings for clients looking to deploy new services quickly across multiple environments.
Digital transformation is driving organizations to modernize both their applications and IT infrastructures. IBM Power systems are purpose-built for today's demanding and dynamic business environments, and these new systems are optimized to run essential workloads such as databases and core business applications, as well as maximize the efficiency of containerized applications. An ecosystem of solutions with Red Hat OpenShift also enables IBM to collaborate with clients, connecting critical workloads to new, cloud-native services designed to maximize the value of their existing infrastructure investments.
The new servers join the popular Power10 E1080 server introduced in September 2021 to deliver a secured, resilient hybrid cloud experience that can be managed alongside x86 systems with multi-cloud management software across clients' IT infrastructure. This expansion of the IBM Power10 family with the new midrange and scale-out servers brings high-end server capabilities throughout the product line. Not only do the new systems support critical security features such as transparent memory encryption and advanced processor/system isolation, but they also leverage the OpenBMC project from the Linux Foundation to deliver high levels of security for the new scale-out servers.
Highlights of the announcements include:
"Today's highly dynamic environment has created volatility, from materials to people and skills, all of which impact short-term operations and long-term sustainability of the business," said Steve Sibley, Vice President, IBM Power Product Management. "The right IT investments are critical to business and operational resilience. Our new Power10 models offer clients a variety of flexible hybrid cloud choices with the agility and automation to best fit their needs, without sacrificing performance, security or resilience."
The expansion of the IBM Power10 family has been engineered to deliver one of the industry's broadest and most flexible ranges of servers for data-intensive workloads such as SAP S/4HANA – from on-premises workloads to hybrid cloud. IBM now offers more ways to implement dynamic capacity – with metering across all operating environments including IBM i, AIX, Linux and OpenShift supporting modern and traditional applications on the same platforms – as well as integrated infrastructure automation software for improved visibility and management.
The new systems with IBM Power Virtual Server also help clients operate a secured hybrid cloud experience that delivers high performance and architectural consistency across their IT infrastructure. The systems are designed to protect sensitive data from core to cloud, and they enable virtual machines and containerized workloads to run simultaneously on the same systems. Critical business workloads that have traditionally needed to reside on-premises can now be moved into the cloud as workloads and needs demand. This flexibility can help clients mitigate the risk and time associated with rewriting applications for a different platform.
"As organizations around the world continue to adapt to unpredictable changes in consumer behaviors and needs, they need a platform that can deliver their applications and insights securely where and when they need them," said Peter Rutten, IDC Worldwide Infrastructure Research Vice President. "IBM Power continues its laser focus on helping clients respond faster to dynamically changing environments and business demands, while protecting information security and distilling new insights from data, all with high reliability and availability."
Ecosystem of ISVs and Channel Partners Enhance Capabilities for IBM Power10
Critical in the launch of the expanded Power10 family is a robust ecosystem of ISVs, Business Partners, and lifecycle services. Ecosystem partners such as SVA and Solutions II provide examples of how the IBM Ecosystem collaborates with clients to build hybrid environments, connecting essential workloads to the cloud to maximize the value of their existing infrastructure investments:
"SVA customers have appreciated the enormous flexibility of IBM Power systems through Capacity Upgrade On-Demand in the high-end systems for many years," said Udo Sachs, Head of Competence Center Power Systems at SVA. "The flexible consumption models using prepaid capacity credits have been well-received by SVA customers, and now the monthly pay-as-you-go option for the scale-out models makes the platform even more attractive. When it comes to automation, IBM helps us to roll out complex workloads such as entire SAP landscapes at the push of a button by supporting Ansible on all OS derivatives, including AIX, IBM i and Linux, as well as ready-to-use modules for deploying the complete Power infrastructure."
"Solutions II provides technology design, deployment, and managed services to hospitality organizations that leverage mission critical IT infrastructure to execute their mission, often requiring 24/7 operation," said Dan Goggiano, Director of Gaming, Solutions II. "System availability is essential to maintaining our clients' revenue streams, and in our experience, they rely on the stability and resilience of IBM Power systems to help solidify their uptime. Our clients are excited that the expansion of the Power10 family further extends these capabilities and bolsters their ability to run applications securely, rapidly, and efficiently."
For more information on IBM Power and the new servers and consumption models announced today, visit: https://www.ibm.com/it-infrastructure/power
IBM is a leading global hybrid cloud and AI, and business services provider, helping clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Nearly 3,800 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently, and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity, and service. For more information, visit www.ibm.com.
SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE in Germany and other countries. Please see https://www.sap.com/copyright for additional trademark information and notices.
1Comparison based on best performing 4-socket systems (IBM Power E1050 3.15-3.9 GHz, 96 core and Inspur NF8480M6 2.90 GHz, Intel Xeon Platinum 8380H) using published results at https://www.spec.org/cpu2017/results/rint2017.html as of 22 June 2022. For more information about SPEC CPU 2017, see https://www.spec.org/cpu2017/.
2Comparison based on best performing 4-socket systems (IBM Power E1050 3.15-3.9 GHz, 96 core; and Inspur NF8480M6 2.90 GHz, Intel Xeon Platinum 8380H) using published results at https://www.spec.org/cpu2017/results/rint2017.html as of 22 June 2022. For more information about SPEC CPU 2017, see https://www.spec.org/cpu2017/.
3Comparison based on best performing 4-socket systems (1) IBM Power E1050; two-tier SAP SD standard application benchmark running SAP ERP 6.0 EHP5; Power10 2.95 GHz processor, 4,096 GB memory, 4p/96c/768t, 134,016 SD benchmark users, 736,420 SAPS, AIX 7.3, DB2 11.5, Certification # 2022018 and (2) Dell EMC PowerEdge 840; two-tier SAP SD standard application benchmark running SAP ERP 6.0 EHP5; Intel Xeon Platinum 8280 2.7 GHz, 4p/112c/224t, 69,500 SD benchmark users (380,280 SAPS), SUSE Linux Enterprise Server 12 and SAP ASE 16, Certification # 2019045. All results can be found at sap.com/benchmark. Valid as of 7 July 2022.
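The gap implied by the SAP SD figures in footnote 3 can be checked directly from the published numbers (a quick sanity check on the cited results, not an official IBM ratio):

```python
# Ratio of the two published SAP SD benchmark results cited in footnote 3.
power_users, power_saps = 134_016, 736_420   # IBM Power E1050 result
dell_users, dell_saps = 69_500, 380_280      # Dell EMC PowerEdge 840 result
print(f"SD benchmark users ratio: {power_users / dell_users:.2f}x")  # ~1.93x
print(f"SAPS ratio:               {power_saps / dell_saps:.2f}x")    # ~1.94x
```

Both metrics put the cited Power E1050 result at roughly 1.9x the cited PowerEdge result.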
View original content to download multimedia: https://www.prnewswire.com/news-releases/ibm-expands-power10-server-family-to-help-clients-respond-faster-to-rapidly-changing-business-demands-301584186.html