I recently met with Dr. Nick Fuller, Vice President, Distributed Cloud, at IBM Research for a discussion about IBM’s long-range plans and strategy for artificial intelligence and machine learning at the edge.
Dr. Fuller is responsible for providing AI- and platform-based innovation for enterprise digital transformation spanning edge computing and distributed cloud management. He is an IBM Master Inventor with over 75 patents and co-author of 75 technical publications. Dr. Fuller obtained his Bachelor of Science in Physics and Math from Morehouse College and his PhD in Applied Physics from Columbia University.
Edge In, not Cloud Out
In general, Dr. Fuller told me that IBM is focused on developing an "edge in" position versus a "cloud out" position with data, AI, and Kubernetes-based platform technologies to scale hub and spoke deployments of edge applications.
A hub plays the role of a central control plane used for orchestrating the deployment and management of edge applications in a number of connected spoke locations such as a factory floor or a retail branch, where data is generated or locally aggregated for processing.
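The hub-and-spoke pattern can be sketched in a few lines. All class and application names below are hypothetical illustrations of the pattern, not IBM's actual API:

```python
# A rough sketch of the hub-and-spoke pattern. Class and application names
# here are hypothetical illustrations, not IBM's actual API.
class Spoke:
    """An edge location (factory floor, retail branch) that runs apps."""
    def __init__(self, name):
        self.name = name
        self.apps = {}

    def install(self, app_name, spec):
        # In a real system this would schedule onto the spoke's local cluster.
        self.apps[app_name] = spec

class Hub:
    """Central control plane that orchestrates deployments to spokes."""
    def __init__(self):
        self.spokes = []

    def register(self, spoke):
        self.spokes.append(spoke)

    def deploy(self, app_name, spec):
        # One command at the hub fans out to every connected spoke.
        for spoke in self.spokes:
            spoke.install(app_name, spec)
        return [s.name for s in self.spokes]

hub = Hub()
hub.register(Spoke("factory-floor-1"))
hub.register(Spoke("retail-branch-7"))
deployed = hub.deploy("defect-detector", {"image": "models/defect:v2"})
print(deployed)  # ['factory-floor-1', 'retail-branch-7']
```

The point of the pattern is that the operator acts once at the hub and the change propagates to every spoke, which is what makes it scale.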
“Cloud out” refers to the paradigm where cloud service providers are extending their cloud architecture out to edge locations. In contrast, “edge in” refers to a provider-agnostic architecture that is cloud-independent and treats the data-plane as a first-class citizen.
IBM's overall architectural principles are scalability, repeatability, and full-stack solution management, allowing everything to be managed through a single unified control plane.
IBM’s Red Hat platform and infrastructure strategy anchors the application stack with a unified, scalable, and managed OpenShift-based control plane equipped with a high-performance storage appliance and self-healing system capabilities (inclusive of semi-autonomous operations).
IBM’s strategy also includes several in-progress platform-level technologies for scalable data, AI/ML runtimes, accelerator libraries for Day-2 AI operations, and scalability for the enterprise.
It is important to mention that IBM is designing its edge platforms with labor costs and the technical workforce in mind. Data scientists with PhDs are in high demand, making them difficult to find and expensive to hire. IBM is designing its edge system capabilities and processes so that domain experts, rather than PhDs, can deploy new AI models and manage Day-2 operations.
Why edge is important
Advances in computing and storage have made it possible for AI to process mountains of accumulated data to provide solutions. By bringing AI closer to the source of data, edge computing is faster and more efficient than the cloud. While cloud data accounts for 60% of the world's data today, vast amounts of new data are being created at the edge, including by industrial applications, traffic cameras, and order management systems, all of which can be processed at the edge in a fast and timely manner.
Public cloud and edge computing differ in capacity, technology, and management. An advantage of edge is that data is processed and analyzed at or near its collection point. In the case of cloud, data must be transferred from a local device to the cloud for analytics and then transferred back to the edge again. Moving data through the network consumes capacity and adds latency to the process. It's easy to see why executing a transaction at the edge reduces latency and eliminates unnecessary load on the network.
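A back-of-envelope calculation makes the latency point concrete. All numbers below (payload size, uplink bandwidth, round-trip time, inference time) are illustrative assumptions, not measurements:

```python
# Back-of-envelope comparison of the two paths. All numbers (payload size,
# uplink bandwidth, round-trip time, inference time) are illustrative
# assumptions, not measurements.
payload_mb = 5.0        # data that would have to be shipped to the cloud
uplink_mbps = 50.0      # available uplink bandwidth
network_rtt_s = 0.04    # 40 ms network round-trip time
infer_s = 0.02          # model inference time (assumed equal in both cases)

# Cloud path: upload the data, wait for the network, run inference remotely.
cloud_latency = payload_mb * 8 / uplink_mbps + network_rtt_s + infer_s
# Edge path: the data never leaves the site; only inference time remains.
edge_latency = infer_s

print(f"cloud: {cloud_latency:.2f}s, edge: {edge_latency:.2f}s")  # cloud: 0.86s, edge: 0.02s
```

Even with generous bandwidth, the transfer term dominates, which is why moving the computation to the data wins.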
Increased privacy is another benefit of processing data at the edge. Analyzing data where it originates limits the risk of a security breach. Most of the communication between the edge and the cloud is then confined to such things as reporting, data summaries, and AI models, without ever exposing the raw data.
IBM at the Edge
In our discussion, Dr. Fuller provided a few examples to illustrate how IBM plans to provide new and seamless edge solutions for existing enterprise problems.
Example #1 – McDonald’s drive-thru
Dr. Fuller's first example centered on the quick-service restaurant (QSR) problem of drive-thru order taking. Last year, IBM acquired an automated order-taking system from McDonald's. As part of the acquisition, IBM and McDonald's established a partnership to perfect voice-ordering methods using AI. Drive-thru orders are a significant percentage of total orders for McDonald's and other QSR chains.
McDonald's and other QSR restaurants would like every order to be processed as quickly and accurately as possible. For that reason, McDonald's conducted trials at ten Chicago restaurants using an edge-based AI ordering system with NLP (Natural Language Processing) to convert spoken orders into a digital format. It was found that AI had the potential to reduce ordering errors and processing time significantly. Since McDonald's sells almost 7 million hamburgers daily, shaving a minute or two off each order represents a significant opportunity to address labor shortages and increase customer satisfaction.
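To illustrate what "converting spoken orders into a digital format" means, here is a toy sketch. A production system would use a trained NLP model; this simple keyword matcher, with its invented mini-menu and number words, only illustrates the input/output shape:

```python
import re

# Toy sketch of turning a spoken drive-thru order into a structured digital
# order. A production system would use a trained NLP model; this keyword
# matcher (with an invented mini-menu) only illustrates the input/output.
NUMBERS = {"a": 1, "one": 1, "two": 2, "three": 3}

def parse_order(utterance):
    items = {}
    for qty_word, item in re.findall(
            r"(\w+)\s+(hamburgers?|fries|shakes?)", utterance.lower()):
        if item != "fries":
            item = item.rstrip("s")          # normalize plurals
        items[item] = items.get(item, 0) + NUMBERS.get(qty_word, 1)
    return items

order = parse_order("I'd like two hamburgers and one fries please")
print(order)  # {'hamburger': 2, 'fries': 1}
```

The structured dict is what the kitchen and point-of-sale systems can act on; the hard part the AI solves is getting from noisy drive-thru audio to reliable text in the first place.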
Example #2 – Boston Dynamics and Spot the agile mobile robot
According to an earlier IBM survey, many manufacturers have already implemented AI-driven robotics with autonomous decision-making capability. The study also indicated that over 80 percent of companies believe AI can help improve future business operations. However, some companies expressed concern about the limited mobility of edge devices and sensors.
To develop a mobile edge solution, IBM teamed up with Boston Dynamics. The partnership created an agile mobile robot using IBM Research and IBM Sustainability Software AI technology. The device can analyze visual sensor readings in hazardous and challenging industrial environments such as manufacturing plants, warehouses, electrical grids, and waste treatment plants. The value proposition that Boston Dynamics brought to the partnership was Spot the agile mobile robot, a walking, sensing, and actuation platform. Like all edge applications, the robot's wireless mobility uses self-contained AI/ML that doesn't require access to cloud data. It uses cameras to read analog devices, visually monitor fire extinguishers, and conduct visual inspections of human workers to determine whether required safety equipment is being worn.
IBM was able to show up to a 10X speedup by automating some manual tasks, such as converting the detection of a problem into an immediate work order in IBM Maximo to correct it. A fast automated response was not only more efficient, but it also improved the safety posture and risk management for these facilities. Similarly, some factories need to thermally monitor equipment to identify any unexpected hot spots that may show up over time, indicative of a potential failure.
IBM is working with National Grid, an energy company, to develop a mobile solution using Spot, the agile mobile robot, for image analysis of transformers and thermal connectors. As shown in the above graphic, Spot also monitored connectors on both flat surfaces and 3D surfaces. IBM was able to show that Spot could detect excessive heat build-up in small connectors, potentially avoiding unsafe conditions or costly outages. This AI/ML edge application can produce faster response times when an issue is detected, which is why IBM believes significant gains are possible by automating the entire process.
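A minimal sketch of the thermal check described above might look like the following. The baseline, tolerance, and readings are invented numbers for illustration, not National Grid's actual thresholds:

```python
# Illustrative sketch of flagging hot spots in a thermal inspection pass:
# compare each connector's reading against a baseline plus a tolerance.
# Baseline, tolerance, and readings are invented numbers for the example.
BASELINE_C = 45.0     # expected operating temperature
TOLERANCE_C = 15.0    # allowed excursion before an alert

def find_hot_spots(readings):
    """readings: dict of connector id -> temperature in Celsius."""
    return sorted(c for c, t in readings.items()
                  if t > BASELINE_C + TOLERANCE_C)

readings = {"T-101": 47.2, "T-102": 71.5, "T-103": 44.0, "T-104": 63.8}
alerts = find_hot_spots(readings)
print(alerts)  # ['T-102', 'T-104']
```

In the scenario the article describes, each alert would then be converted automatically into a work order (e.g., in IBM Maximo) rather than waiting on a manual review.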
IBM market opportunities
Drive-thru orders and mobile robots are just a few examples of the millions of potential AI applications that exist at the edge and are driven by several billion connected devices.
Edge computing is an essential part of enterprise digital transformation. Enterprises seek ways to demonstrate the feasibility of solving business problems using AI/ML and analytics at the edge. However, once a proof of concept has been successfully demonstrated, it is a common problem for a company to struggle with scalability, data governance, and full-stack solution management.
Challenges with scaling
“Determining entry points for AI at the edge is not the difficult part,” Dr. Fuller said. “Scale is the real issue.”
Scaling edge models is complicated because there are so many edge locations with large amounts of diverse content and a high device density. Because large amounts of data are required for training, data gravity is a potential problem. Further, in many scenarios, vast amounts of data are generated quickly, leading to potential data storage and orchestration challenges. AI models are also rarely "finished." Monitoring and retraining of models are necessary to keep up with changes in the environment.
Through IBM Research, IBM is addressing the many challenges of building an all-encompassing edge architecture and horizontally scalable data and AI technologies. IBM has a wealth of edge capabilities and an architecture to create the appropriate platform for each application.
IBM AI entry points at the edge
IBM sees Edge Computing as a $200 billion market by 2025. Dr. Fuller and his organization have identified four key market entry points for developing and expanding IBM’s edge compute strategy. In order of size, IBM believes its priority edge markets to be intelligent factories (Industry 4.0), telcos, retail automation, and connected vehicles.
IBM and its Red Hat portfolio already have an established presence in each market segment, particularly in intelligent operations and telco. Red Hat is also active in the connected vehicles space.
There have been three prior industrial revolutions, beginning in the 1700s; we are now in the in-progress fourth revolution, Industry 4.0, which centers on digital transformation.
Manufacturing is the fastest growing and the largest of IBM's four entry markets. In this segment, AI at the edge can improve quality control, production optimization, asset management, and supply chain logistics. IBM believes there are opportunities to achieve a 4x speedup in implementing edge-based AI solutions for manufacturing operations.
For its Industry 4.0 use case development, IBM, through product, development, research and consulting teams, is working with a major automotive OEM. The partnership has established the following joint objectives:
Maximo Application Suite
IBM's Maximo Application Suite plays an important part in implementing large manufacturers' current and future IBM edge solutions. Maximo is an integrated public or private cloud platform that uses AI, IoT, and analytics to optimize performance, extend asset lifecycles, and reduce operational downtime and costs. IBM is working with several large manufacturing clients currently using Maximo to develop edge use cases, and even uses it within its own manufacturing operations.
IBM has research underway to develop a more efficient method of handling lifecycle management of large models that require immense amounts of data. Day-2 AI operations tasks can sometimes be more complex than initial model training, deployment, and scaling. Retraining at the edge is difficult because resources are typically limited.
Once a model is trained and deployed, it is important to monitor it for drift caused by changes in data distributions or anything that might cause a model to deviate from original requirements. Inaccuracies can adversely affect model ROI.
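One common way to monitor for drift in data distributions is the population stability index (PSI). The sketch below uses invented bin frequencies, and the 0.2 alert threshold is a widely used rule of thumb rather than anything IBM-specific:

```python
import math

# Sketch of drift monitoring via the population stability index (PSI),
# a standard way to compare a model's training-time feature distribution
# with what the deployed model is seeing. The binned frequencies below are
# invented; PSI > 0.2 is a common rule of thumb for significant drift.
def psi(expected, actual):
    """PSI between two binned distributions (lists of bin frequencies)."""
    total = 0.0
    for e, a in zip(expected, actual):
        e, a = max(e, 1e-6), max(a, 1e-6)   # guard against empty bins
        total += (a - e) * math.log(a / e)
    return total

train_dist = [0.25, 0.50, 0.25]   # feature histogram at training time
live_dist = [0.10, 0.45, 0.45]    # same bins observed at the edge today

score = psi(train_dist, live_dist)
drifted = score > 0.2             # flag the model for retraining
```

A check like this can run cheaply at each spoke, so the hub only hears about models that actually need attention.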
Day-2 AI Operations (retraining and scaling)
Day-2 AI operations consist of continual updates to AI models and applications to keep up with changes in data distributions, changes in the environment, a drop in model performance, availability of new data, and/or new regulations.
IBM recognizes the advantages of performing Day-2 AI operations, which include scaling and retraining at the edge. It appears that IBM is the only company with an architecture equipped to effectively handle Day-2 AI operations. That is a significant competitive advantage for IBM.
A company using an architecture that requires data to be moved from the edge back into the cloud for Day-2 related work will be unable to support many factory AI/ML applications because of the sheer number of AI/ML models to support (100s to 1000s).
“There is a huge proliferation of data at the edge that exists in multiple spokes,” Dr. Fuller said. "However, all that data isn’t needed to retrain a model. It is possible to cluster data into groups and then use sampling techniques to retrain the model. There is much value in federated learning from our point of view.”
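Dr. Fuller's cluster-then-sample idea can be sketched as follows. The bucketing "clustering" and the data here are purely illustrative stand-ins for real clustering algorithms and real edge data:

```python
import random

# Sketch of the cluster-then-sample idea: instead of shipping all edge data
# back for retraining, group it and sample from each group. The trivial
# value-range bucketing below is an illustrative stand-in for real clustering.
def cluster_and_sample(points, n_clusters=3, per_cluster=2, seed=0):
    lo, hi = min(points), max(points)
    width = (hi - lo) / n_clusters or 1.0
    clusters = {}
    for p in points:
        idx = min(int((p - lo) / width), n_clusters - 1)
        clusters.setdefault(idx, []).append(p)
    rng = random.Random(seed)
    sample = []
    for members in clusters.values():
        # Keep a few representatives per cluster instead of everything.
        sample.extend(rng.sample(members, min(per_cluster, len(members))))
    return sample

data = [0.1, 0.2, 0.15, 5.0, 5.2, 9.8, 9.9, 10.0]
subset = cluster_and_sample(data)
print(len(subset), "of", len(data), "points kept for retraining")
```

Because each cluster is represented, the retraining set stays diverse even though most raw points never leave the spoke.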
Federated learning is a promising training solution being researched by IBM and others. It preserves privacy by using a collaboration of edge devices to train models without sharing the data with other entities. It is a good framework to use when resources are limited.
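Federated averaging (FedAvg) is the best-known algorithm in this family, and a minimal sketch shows the privacy property directly: each device computes an update on its own data, and only weights, never raw data, reach the hub. The toy two-parameter linear model, data, learning rate, and round count below are invented for illustration:

```python
# Minimal sketch of federated averaging (FedAvg). Each device takes one
# gradient step on its own private data; only the resulting weights (never
# the raw data) are sent to the hub, which averages them. The toy linear
# model, data, and hyperparameters are invented for illustration.
def local_update(weights, local_data, lr=0.1):
    """One gradient step of y = w0 + w1*x on a device's private data."""
    w0, w1 = weights
    g0 = g1 = 0.0
    for x, y in local_data:
        err = (w0 + w1 * x) - y
        g0 += err
        g1 += err * x
    n = len(local_data)
    return (w0 - lr * g0 / n, w1 - lr * g1 / n)

def fed_avg(global_weights, devices):
    """Hub step: average the locally computed weights across devices."""
    updates = [local_update(global_weights, d) for d in devices]
    return tuple(sum(ws) / len(updates) for ws in zip(*updates))

def mse(weights, data):
    w0, w1 = weights
    return sum(((w0 + w1 * x) - y) ** 2 for x, y in data) / len(data)

devices = [[(1.0, 3.0), (2.0, 5.0)],   # device A's private data (y = 1 + 2x)
           [(3.0, 7.0), (4.0, 9.0)]]   # device B's private data (same law)
weights = (0.0, 0.0)
for _ in range(200):                   # 200 federated rounds
    weights = fed_avg(weights, devices)
final_error = mse(weights, devices[0] + devices[1])
```

After enough rounds the averaged model fits both devices' data well, even though neither device ever saw the other's points.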
Dealing with limited resources at the edge is a challenge. IBM’s edge architecture accommodates the need to ensure resource budgets for AI applications are met, especially when deploying multiple applications and multiple models across edge locations. For that reason, IBM developed a method to deploy data and AI applications to scale Day-2 AI operations utilizing hub and spokes.
The graphic above shows the current status quo methods of performing Day-2 operations using centralized applications and a centralized data plane compared to the more efficient managed hub and spoke method with distributed applications and a distributed data plane. The hub allows it all to be managed from a single pane of glass.
Data Fabric Extensions to Hub and Spokes
IBM uses hub and spoke as a model to extend its data fabric. The model should not be thought of in the context of a traditional hub and spoke. IBM's hub provides centralized capabilities to manage clusters and to create multiple hubs that can be aggregated to a higher level. This architecture has four important data management capabilities.
In addition to AI deployments, the hub and spoke architecture and the previously mentioned capabilities can be employed more generally to tackle challenges faced by many enterprises in consistently managing an abundance of devices within and across their enterprise locations. Management of the software delivery lifecycle or addressing security vulnerabilities across a vast estate are a case in point.
Multicloud and Edge platform
In the context of its strategy, IBM sees edge and distributed cloud as an extension of its hybrid cloud platform built around Red Hat OpenShift. One of the newer and more useful options created by the Red Hat development team is Single Node OpenShift (SNO), a compact version of OpenShift that fits on a single server. It is suitable for locations that still run servers but call for a single-node rather than a clustered deployment.
For smaller footprints such as industrial PCs or computer vision boards (for example, NVIDIA Jetson Xavier), Red Hat is working on a project that builds an even smaller version of OpenShift, called MicroShift, that provides full application deployment and Kubernetes management capabilities. It is packaged so that it can be used for edge-device-type deployments.
Overall, IBM and Red Hat have developed a full complement of options to address a large spectrum of deployments across different edge locations and footprints, ranging from containers to full-blown Kubernetes applications, spanning MicroShift, OpenShift, and IBM Edge Application Manager.
Much is still in the research stage. IBM's objective is to achieve greater consistency in how locations and application lifecycles are managed.
First, Red Hat plans to introduce hierarchical layers of management with Red Hat Advanced Cluster Management (RHACM), to scale by two to three orders of magnitude the number of edge locations managed by this product. Additionally, securing edge locations is a major focus. Red Hat is continuously expanding platform security features, for example by recently including Integrity Measurement Architecture in Red Hat Enterprise Linux, or by adding Integrity Shield to protect policies in Red Hat Advanced Cluster Management (RHACM).
Red Hat is partnering with IBM Research to advance technologies that will permit it to protect platform integrity and the integrity of client workloads through the entire software supply chains. In addition, IBM Research is working with Red Hat on analytic capabilities to identify and remediate vulnerabilities and other security risks in code and configurations.
Telco network intelligence and slice management with AI/ML
Communication service providers (CSPs) such as telcos are key enablers of 5G at the edge. 5G benefits for these providers include:
The end-to-end 5G network comprises the Radio Access Network (RAN), transport, and core domains. Network slicing in 5G is an architecture that enables multiple virtual and independent end-to-end logical networks with different characteristics, such as low latency or high bandwidth, to be supported on the same physical network. This is implemented using cloud-native technology enablers such as software-defined networking (SDN), virtualization, and multi-access edge computing. Slicing offers the necessary flexibility by allowing the creation of specific applications, unique services, and defined user groups or networks.
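The slicing idea can be illustrated with a toy data model. The slice names, QoS numbers, and selection rule below are invented for the example, not taken from any 5G standard or IBM product:

```python
from dataclasses import dataclass

# Toy illustration of network slices sharing one physical network. Slice
# names, QoS numbers, and the selection rule are invented for the example.
@dataclass
class Slice:
    name: str
    max_latency_ms: float       # latency the slice guarantees
    min_bandwidth_mbps: float   # bandwidth the slice guarantees

SLICES = [
    Slice("massive-iot", max_latency_ms=100.0, min_bandwidth_mbps=1.0),
    Slice("mobile-broadband", max_latency_ms=50.0, min_bandwidth_mbps=100.0),
    Slice("ultra-low-latency", max_latency_ms=5.0, min_bandwidth_mbps=10.0),
]

def pick_slice(required_latency_ms, required_bandwidth_mbps):
    """Choose the least over-provisioned slice meeting an app's QoS needs."""
    candidates = [s for s in SLICES
                  if s.max_latency_ms <= required_latency_ms
                  and s.min_bandwidth_mbps >= required_bandwidth_mbps]
    if not candidates:
        return None
    return min(candidates, key=lambda s: s.min_bandwidth_mbps).name

print(pick_slice(10.0, 5.0))  # ultra-low-latency
```

The management problem the article describes is doing this kind of matching, plus monitoring and re-optimization, continuously and at scale, which is where AI/ML comes in.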
An important aspect of enabling AI at the edge requires IBM to provide CSPs with the capability to deploy and manage applications across various enterprise locations, possibly spanning multiple end-to-end network slices, using a single pane of glass.
5G network slicing and slice management
Network slices are an essential part of IBM's edge infrastructure that must be automated, orchestrated and optimized according to 5G standards. IBM’s strategy is to leverage AI/ML to efficiently manage, scale, and optimize the slice quality of service, measured in terms of bandwidth, latency, or other metrics.
5G and AI/ML at the edge also represent a significant opportunity for CSPs to move beyond traditional cellular services and capture new sources of revenue with new services.
Communications service providers need management and control of 5G network slicing enabled with AI-powered automation.
Dr. Fuller sees a variety of opportunities in this area. "When it comes to applying AI and ML on the network, you can detect things like intrusion detection and malicious actors," he said. "You can also determine the best way to route traffic to an end user. Automating 5G functions that run on the network using IBM network automation software also serves as an entry point.”
In IBM’s current telecom trial, IBM Research is spearheading the development of a range of capabilities targeted for the IBM Cloud Pak for Network Automation product using AI and automation to orchestrate, operate and optimize multivendor network functions and services that include:
Future leverage of these capabilities by existing IBM Clients that use the Cloud Pak for Network Automation (e.g., DISH) can offer further differentiation for CSPs.
5G radio access
Open radio access networks (O-RANs) are expected to significantly impact telco 5G wireless edge applications by allowing a greater variety of units to access the system. The O-RAN concept separates the Distributed Unit (DU) and Centralized Unit (CU) from the 4G Baseband Unit and connects them with open interfaces.
The O-RAN system is more flexible. It uses AI over these open interfaces to optimize how a device is categorized and connected, based on information about its prior use. Like other edge models, the O-RAN architecture provides an opportunity for continuous monitoring, verification, analysis, and optimization of AI models.
The IBM-telco collaboration is expected to advance O-RAN interfaces and workflows. Areas currently under development are:
IBM Cloud and Infrastructure
The cornerstone for the delivery of IBM's edge solutions as a service is IBM Cloud Satellite. It presents a consistent cloud-ready, cloud-native operational view with OpenShift and IBM Cloud PaaS services at the edge. In addition, IBM's integrated hardware and software edge systems will provide RHACM-based management of the platform when clients or third parties have existing managed-as-a-service models. It is essential to note that in either case this is done within a single control plane for hubs and spokes, which helps optimize execution and management from any cloud to the edge in the hub-and-spoke model.
IBM's focus on "edge in" means it can provide the infrastructure through offerings like the example shown above: software-defined storage for a federated-namespace data lake that surrounds other hyperscaler clouds. Additionally, IBM is exploring integrated full-stack edge storage appliances based on hyperconverged infrastructure (HCI), such as Spectrum Fusion HCI, for enterprise edge deployments.
As mentioned earlier, data gravity is one of the main driving factors of edge deployments. IBM has designed its infrastructure to meet those data gravity requirements, not just for the existing hub and spoke topology but also for a future spoke-to-spoke topology where peer-to-peer data sharing becomes imperative (as illustrated with the wealth of examples provided in this article).
Edge is a distributed computing model. One of its main advantages is that computing, data storage, and processing happen close to where data is created. Without the need to move data to the cloud for processing, real-time application of analytics and AI capabilities provides immediate solutions and drives business value.
IBM's goal is not to move the entirety of its cloud infrastructure to the edge. That would add little value; the edge would simply operate as a spoke, acting on actions and configurations dictated by the hub.
IBM’s architecture will provide the edge with autonomy to determine where data should reside and from where the control plane should be exercised.
Equally important, IBM foresees this architecture evolving into a decentralized model capable of edge-to-edge interactions. IBM has no firm designs for this as yet. However, the plan is to make the edge infrastructure and platform a first-class citizen instead of relying on the cloud to drive what happens at the edge.
Developing a complete and comprehensive AI/ML edge architecture - and in fact, an entire ecosystem - is a massive undertaking. IBM faces many known and unknown challenges that must be solved before it can achieve success.
However, IBM is one of the few companies with the necessary partners and the technical and financial resources to undertake and successfully implement a project of this magnitude and complexity.
It is reassuring that IBM has a plan and that its plan is sound.
Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence and space at Moor Insights and Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 8×8, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung 
Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign,TE Connectivity, TensTorrent, Tobii Technology, Teradata,T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movandi.
While enterprises are setting records in cybersecurity spending, the cost and severity of breaches continue to soar. IBM’s latest data breach report provides insights into why there’s a growing disconnect between enterprise spending on cybersecurity and record costs for data breaches.
This year, 2022, is on pace to be a record-breaking year for enterprise breaches globally, with the average cost of a data breach reaching $4.35 million. That’s 12.7% higher than the average cost of a data breach in 2020, which was $3.86 million. It also found a record 83% of enterprises reporting more than one breach and that the average time to identify a breach is 277 days. As a result, enterprises need to look at their cybersecurity tech stacks to see where the gaps are and what can be improved.
Enhanced security around privileged access credentials and identity management is an excellent first place to start. More enterprises need to define identities as their new security perimeter. IBM’s study found that 19% of all breaches begin with compromised privileged credentials. Breaches caused by compromised credentials lasted an average of 327 days. Privileged access credentials are also bestsellers on the Dark Web, with high demand for access to financial services’ IT infrastructure.
The study also shows how dependent enterprises remain on implicit trust across their security and broader IT infrastructure tech stacks. The gaps in cloud security, identity and access management (IAM) and privileged access management (PAM) allow expensive breaches to happen. Seventy-nine percent of critical infrastructure organizations didn’t deploy a zero-trust architecture, when zero trust can reduce average breach losses by nearly $1 million.
Enterprises need to treat implicit trust as the unlocked back door that allows cybercriminals access to their systems, credentials and most valuable confidential data to reduce the incidence of breaches.
The report quantifies how wide healthcare’s cybersecurity gap is growing. IBM’s report estimates the average cost of a healthcare data breach is now $10.1 million, a record and nearly $1 million over last year’s $9.23 million. Healthcare has had the highest average breach cost for twelve consecutive years, increasing 41.6% since 2020.
The findings suggest that the skyrocketing cost of breaches adds inflationary fuel to the fire, as runaway prices are financially squeezing global consumers and companies. Sixty percent of organizations participating in IBM's study say they raised their product and service prices due to the breach, as supply chain disruptions, the war in Ukraine and tepid demand for products continue. Consumers are already struggling to meet healthcare costs, which will likely increase by 6.5% next year.
The study also found that nearly 30% of breach costs are incurred 12 to 24 months after, translating into permanent price increases for consumers.
“It is clear that cyberattacks are evolving into market stressors that are triggering chain reactions, [and] we see that these breaches are contributing to those inflationary pressures,” says John Hendley, head of strategy for IBM Security’s X-Force research team.
For healthcare providers with limited cybersecurity budgets, prioritizing these three areas can reduce the cost of a breach while making progress toward zero-trust initiatives. Getting identity and access management (IAM) right is core to a practical zero-trust framework: one that can quickly adapt to protect both human and machine identities. IBM's study found that of the zero-trust components measured, IAM is the most effective in reducing breach costs. Leading IAM providers include Akamai, Fortinet, Ericom, Ivanti, Palo Alto Networks and others. Ericom's ZTEdge platform is noteworthy for combining ML-enabled identity and access management, zero-trust network access (ZTNA), microsegmentation and secure web gateway (SWG) with remote browser isolation (RBI) and Web Application Isolation.
Research from the tech giant has found that the cost of data breaches rose by 13 per cent over the past two years
The average cost of a data breach in an enterprise or large organisation hit $4.35 million (€4.3 million), according to IBM's Cost of a Data Breach Report 2022. The IT giant investigated 537 breaches across large organisations for the study and found the average cost of a breach has risen by 13 per cent over the past two years.
The average cost was $1.07 million higher where remote work was a factor in causing the ...
Aug 01, 2022 (Heraldkeepers) -- Pune, India - The Global SaaS Enterprise Applications Market 2022 report examines the market's significant components with an in-depth methodology, enabling readers to assess long-term demand and anticipate specific outcomes. The report provides qualitative analysis, explaining product scope and elaborating on industry insights and the outlook to 2027. It is a key reference on the essential and notable players in the current market. The data presented offers a thorough evaluation of the major elements of the Global SaaS Enterprise Applications Market, such as opportunities, market trends, constraints, and business strategies. The report also covers current key industry events along with their impact on the market, and profiles the top strategic participants in the Global SaaS Enterprise Applications Market.
To get a sample copy of the report, click here: https://www.reportsweb.com/inquiry&RW00014721340/sample
Key Players: Microsoft, Sage Software, Ramco Systems, Oracle, SAP, Epicor Software, Google, IBM, Acumatica, Plex Systems, Box, Infor, Salesforce
(Our sample copy of the report offers a quick introduction to the study's outlook, table of contents, list of tables and figures, competitor analysis covering key regions, and a COVID-19 impact analysis.)
Our research analyzes some key points for your study: market research findings, industry business outlook, revenue and trends, comprehensive analysis, development strategy, future plans and industry growth, potential growth and attractive valuation, increasing demand among industry professionals, innovations, the CAPEX cycle and dynamic structure, gross profits, operating income, COGS, EBITDA, sales volume, product offerings, company landscape analysis, key strategic moves, key recent developments, and the technological roadmap.
Requesting a free sample PDF of the report before purchase lets you review the study’s objectives, which include:
To break down the data by regions, type, manufacturers, and applications.
To analyze and research the global SaaS Enterprise Applications Market status and future forecast, covering production, revenue, consumption, historical data, and projections.
To identify significant trends, drivers, and influencing factors globally and regionally.
To present the key SaaS Enterprise Applications Market manufacturers, with their production, revenue, market share, and recent developments.
To analyze competitive developments, such as expansions, agreements, new product launches, and acquisitions in the market.
To analyze market potential and advantages, opportunities and challenges, restraints, and risks for the world and key regions.
Key Industry Insights:
This market report is a wide-ranging review that incorporates a detailed overview of the Global SaaS Enterprise Applications Market business. The report explains the types of SaaS enterprise applications and their uses across different market verticals, countries, and key regions. The study has identified and assessed all the central players in the global market and compared them on various metrics, such as annual sales shipment volume, historical growth rates, market revenue, and marketing strategies. Based on all of these findings, the report proposes key plans for existing market participants to improve their market positions.
This study report offers the following objectives:
There are 15 Key Chapters Covered in the Global SaaS Enterprise Applications Market:
Chapter 1, Industry Overview of Global SaaS Enterprise Applications Market;
Chapter 2, Classification, Specifications and Definition of Global SaaS Enterprise Applications Market Segment by Regions;
Chapter 3, Industry Suppliers, Manufacturing Process and Cost Structure, Chain Structure, Raw Material;
Chapter 4, Research Findings/Conclusion, Global SaaS Enterprise Applications Market deals channel, traders, distributors, dealers analysis;
Chapter 5, Complete Market Research, Capacity, Sales and Sales Price Analysis with Company Segment;
Chapter 6, Analysis of Regional Market that contains the United States, Europe, India, China, Japan, Korea & Taiwan;
Chapter 7 & 8, Global SaaS Enterprise Applications Market Analysis by Major Manufacturers, The Global SaaS Enterprise Applications Market Segment Market Analysis (by Type) and (by Application);
Chapter 9, Regional Market Trend Analysis, Market Trend by Product Type and by Application:
Chapter 10 & 11, Supply Chain Analysis, Regional Marketing Type Analysis, Global Trade Type Analysis;
Chapter 12, The Global SaaS Enterprise Applications Market industry consumers Analysis;
Chapter 13, Appendix and data source of Global SaaS Enterprise Applications Market.
ReportsWeb is a one-stop shop for market research reports and solutions for companies across the globe. We support our clients’ decision-making by helping them choose the most relevant and cost-effective research reports and solutions from various publishers.
The market research industry has changed over the last decade. As corporate focus has shifted to niche markets and emerging countries, a number of publishers have stepped in to fulfil these information needs. Our experienced and trained staff help you navigate the different options and choose the best research solution at the most effective cost.
Riyadh, Saudi Arabia: IBM, the leading global technology company, has published a study highlighting the importance of cybersecurity in an increasingly digital age. According to IBM Security’s annual Cost of a Data Breach Report, the average cost of a data breach in the Middle East reached SAR 28 million in 2022, a figure that already exceeds the losses recorded in each of the last eight years.
The latest edition of the Cost of a Data Breach Report — now in its 17th year — reveals costlier and higher-impact data breaches than ever before. As outlined by the study, the global average cost of a data breach has reached an all-time high of $4.35 million for surveyed organizations. With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services. In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.
Notably, the report ranks the Middle East among the top five countries and regions for the highest average cost of a data breach. As per the study, the average total cost of a data breach in the Middle East amounted to SAR 28 million in 2022, the region being second only to the United States on the list. The report also spotlights the industries across the Middle East that have suffered the highest per-record costs; the financial (SAR 1,039), health (SAR 991) and energy (SAR 950) sectors take first, second and third spot, respectively.
Fahad Alanazi, IBM Saudi General Manager, said: “Today, more so than ever, in an increasingly connected and digital age, cybersecurity is of the utmost importance. It is essential to safeguard businesses and privacy. As the digital economy continues to evolve, enhanced security will be the marker of a modern, world class digital ecosystem.”
He continued: “At IBM, we take great pride in enabling the people, businesses and communities we serve to fulfil their potential by empowering them with state-of-the-art services and support. Our findings reiterate just how important it is for us, as a technology leader, to continue pioneering solutions that will help the Kingdom distinguish itself as the tech capital of the region.”
The persistence of cyberattacks is also shedding light on the “haunting effect” data breaches are having on businesses, with the IBM report finding 83% of studied organizations have experienced more than one data breach in their lifetime. The after-effects of a breach also linger long after it occurs: nearly 50% of breach costs are incurred more than a year after the breach.
The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.
Some of the key global findings in the 2022 IBM report include:
“Businesses need to put their security defenses on the offense and beat attackers to the punch. It’s time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases.” said Charles Henderson, Global Head of IBM Security X-Force. “This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked.”
Over-trusting Critical Infrastructure Organizations
Concerns over critical infrastructure targeting appear to be increasing globally over the past year, with many governments’ cybersecurity agencies urging vigilance against disruptive attacks. In fact, IBM’s report reveals that ransomware and destructive attacks represented 28% of breaches amongst critical infrastructure organizations studied, highlighting how threat actors are seeking to fracture the global supply chains that rely on these organizations. This includes financial services, industrial, transportation and healthcare companies amongst others.
Despite the call for caution, and a year after the Biden Administration issued a cybersecurity executive order centred on adopting a zero trust approach to strengthen the nation’s cybersecurity, only 21% of critical infrastructure organizations studied have adopted a zero trust security model, according to the report. Added to that, 17% of breaches at critical infrastructure organizations were caused by a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.
Businesses that Pay the Ransom Aren’t Getting a “Bargain”
According to the 2022 IBM report, businesses that paid threat actors’ ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay, not including the ransom amount paid. However, when accounting for the average ransom payment, which according to Sophos reached $812,000 in 2021, businesses that opt to pay the ransom could net higher total costs, all while inadvertently funding future ransomware attacks with capital that could otherwise go to remediation and recovery efforts, and potentially exposing themselves to federal offenses.
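The net effect described above is simple arithmetic; a back-of-the-envelope sketch using only the two figures quoted (the report’s $610,000 average saving and Sophos’ $812,000 average ransom):

```python
# Back-of-the-envelope check on the economics of paying a ransom,
# using the two figures quoted in the report.
breach_cost_saving = 610_000    # average breach-cost saving for payers, per the report
average_ransom_2021 = 812_000   # average ransom payment in 2021, per Sophos

# Once the ransom itself is counted, paying nets a HIGHER total cost:
net_extra_cost = average_ransom_2021 - breach_cost_saving
print(f"Net extra cost of paying: ${net_extra_cost:,}")
```

On these averages, a paying organization ends up roughly $202,000 worse off before any legal exposure is even considered.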
The persistence of ransomware, despite significant global efforts to impede it, is fueled by the industrialization of cybercrime. IBM Security X-Force found that the duration of studied enterprise ransomware attacks has dropped 94% over the past three years, from over two months to just under four days. These dramatically shorter attack lifecycles can prompt higher-impact attacks, as cybersecurity incident responders are left with very short windows of opportunity to detect and contain them. With “time to ransom” dropping to a matter of hours, it’s essential that businesses prioritize rigorous testing of incident response (IR) playbooks ahead of time. Yet the report states that as many as 37% of organizations studied that have incident response plans don’t test them regularly.
Hybrid Cloud Advantage
The report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organizations studied. Averaging $3.8 million in breach costs, businesses that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model, which experienced $5.02 million and $4.24 million on average respectively. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.
The report highlights that 45% of studied breaches occurred in the cloud, emphasizing the importance of cloud security. However, a significant 43% of reporting organizations stated they are just in the early stages or have not started implementing security practices to protect their cloud environments, observing higher breach costs. Businesses studied that did not implement security practices across their cloud environments required an average of 108 more days to identify and contain a data breach than those consistently applying security practices across all their domains.
Additional findings in the 2022 IBM report include:
About IBM Security
IBM Security offers one of the most advanced and integrated portfolios of enterprise security products and services. The portfolio, supported by world-renowned IBM Security X-Force® research, enables organizations to effectively manage risk and defend against emerging threats. IBM operates one of the world's broadest security research, development, and delivery organizations, monitors 150 billion+ security events per day in more than 130 countries, and has been granted more than 10,000 security patents worldwide. For more information, please check www.ibm.com/security, follow @IBMSecurity on Twitter or visit the IBM Security Intelligence blog.
Cybersecurity has always been a concern for every type of organization. Even in normal times, a major breach is more than just the data economy’s equivalent of a ram-raid on Fort Knox; it has knock-on effects on trust, reputation, confidence, and the viability of some technologies. This is what IBM calls the “haunting effect”.
A successful attack breeds more, of course, both on the same organization again, and on others in similar businesses, or in those that use the same compromised systems. The unspoken effect of this is rising costs for everyone, as all enterprises are forced to spend money and time on checking if they have been affected too.
But in our new world of COVID-19, disrupted economies, climate change, remote working, soaring inflation, and looming recession, all such effects are amplified. Throw in a war that’s hammering on Europe’s door (with political echoes across the Middle East and Asia) and it’s a wonder any of us can get out of bed in the morning.
So, what are the real costs of a successful cyberattack – not just hacks, viruses, and Trojans, but also phishing, ransomware, and concerted campaigns against supply chains and code repositories?
According to IBM’s latest annual survey, breach costs have risen by an unlucky 13% over the past two years, as attackers, which include hostile states, have probed the systemic and operational weaknesses exposed by the pandemic.
The global average cost of a data breach has reached an all-time high of $4.35 million – at least, among the 550 organizations surveyed by the Ponemon Institute for IBM Security (over the year from March 2021 to March 2022). Indeed, IBM goes so far as to claim that breaches may be contributing to the rising costs of goods and services. The survey states:
Sixty percent of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.
Incidents are also “haunting” organizations, says the company, with 83% having experienced more than one data breach, and with 50% of costs occurring more than a year after the successful attack.
Cloud maturity is a key factor, adds the report:
Forty-three percent of studied organizations are in the early stages [of cloud adoption] or have not started applying security practices across their cloud environments, observing over $660,000 in higher breach costs, on average, than studied organizations with mature security across their cloud environments.
Forty-five percent of respondents run a hybrid cloud infrastructure. This leads to lower average breach costs than among those operating a public- or private-cloud model: $3.8 million versus $5.02 million (public) and $4.24 million (private).
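A minimal sketch lining up the three deployment-model averages quoted above (US$ millions, from the report):

```python
# Average breach cost by infrastructure model, US$ millions (report figures).
avg_breach_cost_musd = {"hybrid": 3.80, "public": 5.02, "private": 4.24}

cheapest = min(avg_breach_cost_musd, key=avg_breach_cost_musd.get)
for model, cost in sorted(avg_breach_cost_musd.items(), key=lambda kv: kv[1]):
    premium = cost - avg_breach_cost_musd[cheapest]  # extra cost vs cheapest model
    print(f"{model:>7}: ${cost:.2f}M (+${premium:.2f}M vs {cheapest})")
```

On these figures, the hybrid model comes in roughly $1.22M cheaper per breach than a purely public cloud.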
That said, those are still significant costs, and may suggest that complexity is what deters attackers, rather than having a single target to hit. Nonetheless, hybrid cloud adopters are able to identify and contain data breaches 15 days faster on average, says the report.
However, with 277 days being the average time lag – an extraordinary figure – the real lesson may be that today’s enterprise systems are adept at hiding security breaches, which may appear as normal network traffic. Forty-five percent of breaches occurred in the cloud, says the report, so it is clearly imperative to get on top of security in that domain.
IBM then makes the following bold claim:
Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.
Whether this finding will stand for long as attackers explore new ways to breach automated and/or AI-based systems – and perhaps automate attacks of their own invisibly – remains to be seen. Compromised digital employee, anyone?
But perhaps the most telling finding is that cybersecurity has a political dimension – beyond the obvious one of Russian, Chinese, North Korean, or Iranian state incursions, of course.
Concerns over critical infrastructure and global supply chains are rising, with threat actors seeking to disrupt global systems that include financial services, industrial, transportation, and healthcare companies, among others.
A year ago in the US, the Biden administration issued an Executive Order on cybersecurity that focused on the urgent need for zero-trust systems. Despite this, only 21% of critical infrastructure organizations have so far adopted a zero-trust security model, according to the report. It states:
Almost 80% of the critical infrastructure organizations studied don’t adopt zero-trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. All while 28% of breaches among these organizations were ransomware or destructive attacks.
Added to that, 17% of breaches at critical infrastructure organizations were caused by a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.
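The adopters’ average is implied rather than stated; a quick sketch deriving it from the two zero-trust figures quoted above:

```python
# Implied average breach cost for zero-trust adopters (US$ millions),
# derived from the report's figures for non-adopters.
non_adopter_cost = 5.40       # average breach cost without zero trust
premium_over_adopters = 1.17  # how much more non-adopters pay

adopter_cost = non_adopter_cost - premium_over_adopters
print(f"Zero-trust adopters average about ${adopter_cost:.2f}M per breach")
```

That puts zero-trust adopters at roughly $4.23M per breach, versus $5.40M for everyone else.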
That aside, one of the big stories over the past couple of years has been the rise of ransomware: malicious code that locks up data, enterprise systems, or individual computers, forcing users to pay a ransom to (they hope) retrieve their systems or data.
But according to IBM, there are no obvious winners or losers in this insidious practice. The report adds:
Businesses that paid threat actors’ ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid.
However, when accounting for the average ransom payment – which according to Sophos reached $812,000 in 2021 – businesses that opt to pay the ransom could net higher total costs, all while inadvertently funding future ransomware attacks.
The persistence of ransomware is fuelled by what IBM calls the “industrialization of cybercrime”.
The risk profile is also changing. Ransomware attack times show a massive drop of 94% over the past three years, from over two months to just under four days. Good news? Not at all, says the report, as the attacks may be higher impact, with more immediate consequences (such as destroyed data, or private data being made public on hacker forums).
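Those endpoints are broadly consistent with the 94% headline figure. A rough check, assuming (my assumption, not the report’s wording) that “over two months” means roughly 60 days:

```python
# Sanity check on the reported ~94% drop in ransomware attack duration.
# Assumption: "over two months" is roughly 60+ days; "just under four days" is ~4 days.
days_before = 60.0
days_after = 4.0

drop = 1 - days_after / days_before
print(f"Duration fell by roughly {drop:.0%}")
```

The quoted endpoints give about a 93% drop; a starting duration slightly over 60 days, or an endpoint slightly under four, lands on the report’s 94%.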
The key lesson in cybersecurity today is that all of us are both upstream and downstream from partners, suppliers, and customers in today’s extended enterprises. We are also at the mercy of reused but compromised code from trusted repositories, and even sometimes from hardware that has been compromised at source.
So, what is the answer? Businesses should ensure that their incident responses are tested rigorously and frequently in advance – along with using red-, blue-, or purple-team approaches (thinking like a hacker, a defender, or both).
Regrettably, IBM says that 37% of organizations that have IR plans in place fail to test them regularly. To paraphrase Spinal Tap, you can’t code for stupid.
New Jersey, N.J., July 18, 2022 – The AI in Fraud Management Market research report provides a full picture of the industry. It gives an outlook on the market, supplying clients with reliable data for essential decisions, and includes the market’s definition, applications, developments, and manufacturing technology. The report tracks recent developments and innovations in the market, covers the obstacles to establishing a business in it, and offers guidance for overcoming upcoming challenges.
Payment fraud detection is the most common type of fraud addressed by artificial intelligence (AI). Its variations are as diverse as the imagination of fraudsters. However, some of the most common types of payment fraud are: lost cards, stolen cards, counterfeit cards, card identity theft, and card non-receipt.
The rapid development of technology indicates that more business processes can now be automated. This means that machines and software can relieve workers of boring, routine tasks. Big data, machine learning, and artificial intelligence are also helping to automate more complex tasks, but many of these projects fail or fall short of expectations.
This AI in Fraud Management research report throws light on the major market players thriving in the market; it tracks their business strategies, financial status, and upcoming products.
Some of the Top companies Influencing this Market include:IBM Corporation, Hewlett Packard Enterprise, Subex Limited, Temenos AG, Cognizant, Splunk, Inc., BAE Systems, Pelican, DataVisor, Inc., Matellio Inc., MaxMind, Inc., SAS Institute Inc., Capgemini SE, JuicyScore, ACTICO GmbH
Firstly, this AI in Fraud Management research report introduces the market with an overview covering its definition, applications, product launches, developments, challenges, and regions. The market is forecast to show strong development, driven by consumption in various end markets. The report also provides an analysis of current market structures and other basic characteristics.
The region-wise coverage of the market is mentioned in the report, mainly focusing on the regions:
Segmentation Analysis of the market
The market is segmented on the basis of type, product, end users, raw materials, and so on; this segmentation helps deliver a precise explanation of the market.
Market Segmentation: By Type
Small and Medium Enterprises (SMEs)
Market Segmentation: By Application
An assessment of the market attractiveness with regard to the competition that new players and products are likely to present to older ones has been provided in the publication. The research report also mentions the innovations, new developments, marketing strategies, branding techniques, and products of the key participants present in the global AI in Fraud Management market. To present a clear vision of the market the competitive landscape has been thoroughly analyzed utilizing the value chain analysis. The opportunities and threats present in the future for the key market players have also been emphasized in the publication.
This report aims to provide:
Table of Contents
Global AI in Fraud Management Market Research Report 2022 – 2029
Chapter 1 AI in Fraud Management Market Overview
Chapter 2 Global Economic Impact on Industry
Chapter 3 Global Market Competition by Manufacturers
Chapter 4 Global Production, Revenue (Value) by Region
Chapter 5 Global Supply (Production), Consumption, Export, Import by Regions
Chapter 6 Global Production, Revenue (Value), Price Trend by Type
Chapter 7 Global Market Analysis by Application
Chapter 8 Manufacturing Cost Analysis
Chapter 9 Industrial Chain, Sourcing Strategy and Downstream Buyers
Chapter 10 Marketing Strategy Analysis, Distributors/Traders
Chapter 11 Market Effect Factors Analysis
Chapter 12 Global AI in Fraud Management Market Forecast
The average cost of a data breach is still going up, according to a new survey.
The 25 Canadian companies that suffered a breach in the 12-month period ending in March paid an average of $7 million in recovery costs per incident. By comparison, the average of 550 companies studied around the world was $5.5 million (all amounts in Canadian dollars).
The numbers are contained in the latest annual cost of data breach survey by IBM and the Ponemon Institute, which was released today. (Registration required)
“It’s scandalous,” Evan O’Regan, associate partner in IBM Canada’s cybersecurity and digital trust practice, said in an interview. Those costs, he said, are spread across a victim organization’s supply chain to become a “hidden cyber tax” paid by customers.
“Costs are going up for organizations that aren’t prepared for responding to these incidents,” he said, “and the costs are going down dramatically for those that are prepared.”
Among the report’s highlights:
–Canada recorded the third highest average cost of a data breach worldwide once again – after the United States and the Middle East region.
–Companies in the financial sector are paying the highest cost for data breaches in Canada, at CA$520 per record. Canadian technology companies paid $433 per record, the second most expensive by industry in the country, followed by the services industry at $362 per record. The national average cost across all sectors was $298 per record;
–Stolen or compromised user credentials were again the most common method used as an entry point by attackers targeting Canadian organizations; those breaches were also the costliest. The average cost of a data breach by stolen and compromised credentials was as high as $8.86 million;
–Canadian companies with a mature zero trust adoption observed $3.79 million lower breach costs than organizations with only early adoption of zero trust. Basically, says the report, the zero trust mindset of ‘we have to put our defense on the offense and assume the adversary is already in our environment’ is a money-saver.
There is slightly good news: Canadian firms in the study reported a drop in the average number of days it took to detect an attack: 160, compared to 164 days in the previous year’s study. Still, O’Regan called that number “disappointing.”
The average time Canadian firms took to contain a data breach dropped to 48 days from the 60 days in the previous year’s study.
The combination of defences — particularly identity and access management, participating in threat information sharing networks, and using security products with artificial intelligence — are big factors in cutting the costs of a data breach, O’Regan said.
The biggest cost factors in a data breach are detection and escalation costs (including finding what systems are affected), remediation, and loss of business.
The increase in staff working from home since the pandemic began is a factor in the recent rise in data breaches, says O’Regan. But “blaming the end user, the home worker, for security breaches is abdicating responsibility … Companies are not evolving fast enough to the threat environment and the work environment. By most indications the increase in home working is here to stay. And remote working is neither complicated nor particularly expensive to secure. A great example is zero trust.”
The proof, he added, is that companies that have mature zero trust adoption have data breach costs half the size of other firms.
“The success of a company [against cyber attacks] depends on their approach,” O’Regan said. “If they’re approaching cybersecurity as a cost, then you’re going to do the minimum amount. What we definitely see is that companies that see cybersecurity as an enabler –taking a zero trust approach … and having a mature identity and access management program realize this isn’t a cost centre.”
The post Average cost of a data breach to Canadian firms studied hit $7 million, says IBM first appeared on IT World Canada.
This section is powered by IT World Canada. ITWC covers the enterprise IT spectrum, providing news and information for IT professionals aiming to succeed in the Canadian market.
United States, Rockville MD, July 28, 2022 (GLOBE NEWSWIRE via COMTEX) -- The global enterprise video market is anticipated to secure a market value worth US$ 48.8 Billion while expanding at a CAGR of 10.93% during the forecast period from 2022-2032.
Increasing focus on cloud deployment by various tech players such as Microsoft and Amazon Web Services to procure security and compliance standards is likely to play a salient role in driving the market. Also, ongoing technical development in video streaming, like the introduction of Application Programming Interfaces (APIs) is expected to burgeon the market size in the forecast period.
Several venture capital firms are offering funds to start-ups developing enterprise video solutions. In March 2022, 100ms, Inc., an Indian start-up, took the initiative to develop a live video conferencing infrastructure and raised US$ 20 Million in a Series A funding round. Renowned players in the market are introducing innovative live streaming applications to maintain their supremacy in the market. For instance, IBM Corporation offers an IBM Video Streaming platform that helps users broadcast various live-streamed content. The platform gives organizations absolute control over the content.
Innovative initiatives to develop platforms/apps are likely to further develop the market during the forecast period. For instance, in October 2020, Microsoft added various features in its Teams applications, which are beneficial for hospitals and other healthcare organizations, including virtual visits and Electronic Healthcare Record (EHR), Teams templates, Teams policy packages, and care coordination and collaboration. Such factors are expected to make a significant contribution to developing the market during the forecast period.
However, developing nations and underdeveloped nations do not have a basic communication infrastructure to support high-quality video communication. Thus, people in such areas depend more on audio-based communication to avoid low-quality video and recurring disconnections. The lack of strong communication resources and infrastructure is a significant limitation to the expansion of the enterprise video market. On the contrary, the growing adoption of BYOD is anticipated to act as a significant opportunity for the global enterprise video market.
What are the Growth Prospects in Europe for the Global Enterprise Video Market?
Increasing Number of SMEs in the Region to Grow the European Market
Europe is anticipated to be the second-largest market for enterprise video. The growth of the market can be attributed to heavy investment in advanced technologies. Growing economies such as Italy, France, and Spain are expected to make a significant contribution to developing the regional market.
Various organizations in the region are transitioning from large conference rooms to smaller huddle rooms, with a motive to utilize maximum office space for employee engagement. This transition is influenced by the advent of audiovisual (AV) technologies, which allow employees to interact virtually with individuals working remotely.
Key players in the global enterprise video market include Adobe, Avaya Inc., Brightcove Inc., Cisco Systems, Inc., IBM Corporation, Kaltura, Inc., Microsoft, Polycom, Inc. (Plantronics, Inc.), VBrick, and Vidyo, Inc.
About the Technology Division at Fact.MR
The expert analysis, actionable insights, and strategic recommendations of Fact.MR's highly seasoned technology team help clients from across the globe with their unique business intelligence needs. With a repertoire of over a thousand reports and more than a million data points, the team has analysed the technology industry across 50+ countries for over a decade, providing unmatched end-to-end research and consulting services.
Explore Fact.MR's Coverage on the Technology Domain-
Physical Access Control System (PACS) Market- Physical access control system (PACS) market analysis shows that global demand enjoyed year-on-year (YoY) growth of over 10% in 2021, to total 2.2 Million units. Demand for biometric PACS is poised to grow 9% to total 850,000 units, while demand for card-based PACS will be up 12.5% to 960,000 units in 2021.
High Power RF Amplifier Market- The global high power RF amplifier market enjoys a valuation of around US$ 4.6 Billion at present. Sales of high power RF amplifiers are slated to accelerate at a high CAGR of 12.3% to reach US$ 14.7 Billion by 2031. Demand for smart energy in end-use sectors is likely to increase at a CAGR of 10.2% over the assessment period of 2021 to 2031.
Airport Kiosk Market- The global airport kiosk market was valued at around US$ 1.7 Billion in 2020. Sales of airport kiosks are projected to accelerate at a healthy CAGR of 9% to top US$ 4 Billion by 2031.
Public Safety Software Market- The global public safety software market reached a valuation of around US$ 7 Billion in 2020, and is slated to rise at a CAGR of 11% to top US$ 20 Billion by 2031. Demand for computer-aided dispatch solutions is set to increase at a CAGR of 9% across the assessment period of 2021 to 2031.
AI Virtual Visor Market- As per a new report published by Fact.MR, a market research and competitive intelligence provider, the number of vehicle displays is steadily rising, having increased by almost 65% between 2016 and 2021. Fact.MR estimates that the automotive display market is likely to reach almost US$ 22 Billion by 2022.
Bicycle Subscription Market- As per industry analysis by Fact.MR, a market research and competitive intelligence provider, the global market for bicycles stood at US$ 58 Billion in 2020, with around 140 Million bicycles produced annually across the globe. The market is set to reach a valuation of a little over US$ 127 Billion by 2030, with projected growth above an 8% CAGR.
Satellite Internet Market- The satellite internet market is expected to surpass a valuation of US$ 6 Billion and expand at a CAGR of more than 8% during the forecast period, 2021-2031. The internet has moved from a good, to an amenity, to a must-have over the past two decades, mainly as a result of the smartphone revolution.
Big Data Analytics in Healthcare Market- The global big data analytics in healthcare market is estimated at US$ 39.7 billion in 2022. Data as a technology has been rapidly adopted by healthcare industry stakeholders and is being monetized, which is slated to push the global big data analytics in healthcare market to grow at a CAGR of over 19% and register a total market value of US$ 194.7 billion by the end of 2032.
Digital Door Lock System Market- The global digital door lock system market is likely to reach a valuation of around US$ 9 Billion in 2022. Sales of digital door lock systems are slated to accelerate at a steady CAGR of 18% to top US$ 47 Billion by 2032.
SiC & GaN Power Semiconductor Market- The global SiC & GaN power semiconductor market is estimated at US$ 884 Million in 2022. Demand for SiC & GaN power semiconductor devices is forecast to surpass a market value of US$ 6,954 Million by 2032, with a stupendous growth rate of 22.9% projected for these electronic discrete components during the forecast period of 2022-32.
A market research and consulting agency with a difference! That's why 80% of Fortune 1,000 companies trust us with their most critical decisions. While our experienced consultants employ the latest technologies to extract hard-to-find insights, we believe our USP is the trust clients place in our expertise. Spanning a wide range of domains, from automotive and Industry 4.0 to healthcare and retail, our coverage is expansive, and we ensure that even the most niche categories are analyzed. We have sales offices in the United States and in Dublin, Ireland, and are headquartered in Dubai, UAE. Reach out to us with your goals, and we'll be an able research partner.
(C) Copyright 2022 GlobeNewswire, Inc. All rights reserved.
Key companies covered in the Enterprise Metadata Management Tool market research are Oracle Corporation, IBM Corporation, SAP SE, Informatica LLC, ASG Technologies, and other key market players.
In its market research collateral archive, CRIFAX added a report titled 'Global Enterprise Metadata Management Tool Market, 2021-2030', which consists of a study of the growth strategies used by leading players in the Enterprise Metadata Management Tool market to keep themselves ahead of the competition. In addition, the study covers emerging trends, mergers and acquisitions, region-wise growth analysis, and the challenges that impact the growth of the market.
The growth of the global Enterprise Metadata Management Tool market is largely driven by the increasing number of technical developments across industries worldwide and the overall digital revolution. Digital economic development is one of the key factors motivating large enterprises to invest aggressively in digital innovation and shift their conventional business models to automated ones, so as to seize value-producing opportunities, stay ahead of their competitors, and boost the continuity and reliability of their services. From artificial intelligence (AI), augmented reality (AR), and virtual reality (VR) to the Internet of Things (IoT), the growing number of internet-connected devices around the world is anticipated to contribute to the growth of the global Enterprise Metadata Management Tool market.