Experts suggest killexams C2090-552 examcollection for exam success

killexams.com provides the latest, 2022-updated C2090-552 brain dumps with real test questions and guarantees a 100 percent passing result. Practice our C2090-552 Real Exam Questions and Answers to improve your knowledge and pass your IBM InfoSphere Optim for Distributed Systems - V7.3.1 test with high marks. We ensure your success in the real C2090-552 test, covering every topic of the exam and building your knowledge of it. Pass with our C2090-552 examcollection.

Exam Code: C2090-552 Practice exam 2022 by Killexams.com team
C2090-552 IBM InfoSphere Optim for Distributed Systems - V7.3.1

Exam Title : IBM Certified Specialist - InfoSphere Optim for Distributed Systems Fundamentals
Exam ID : C2090-552
Exam Duration : 90 mins
Questions in exam : 65
Passing Score : 75%
Official Training : Optim Solution Library
Exam Center : Pearson VUE
Real Questions : IBM InfoSphere Optim for Distributed Systems Fundamentals Real Questions
VCE practice questions : IBM C2090-552 Certification VCE Practice Test

Installation and Configuration (25%)
- Optim Architecture
Operating Systems/Hardware support
Database Support
- Establishing Environments
Optim Directories (location, numbers, owners)
Confirming installation pre-reqs
Physical installations
- Security
Initializing
Enabling
Configuring
- Archive / Extract Maintenance
File locations/sub-systems
File naming / sizing
Optim Core Functionality (35%)
- Data Growth
Archive
Delete
Restore
- Decommissioning
- Test Data Management
Extract / Insert
Edit
Compare
- Data Privacy Functions (when to use which ones)
Convert
Privacy Functions
- Access Definitions
- Relationships
Advanced Topics (34%)
- Enterprise Integration
Security
Storage
Automation (i.e. Command Line)
- Import/Export OPTIM DDL
- Archive Files (i.e., location, collections, moving, registering, creating, managing index files)
Optim Connect
Archive Collections
Archive Indexes
Archive File Maintenance
- Complex Data Model Traversals
- Extended Data Sources
InfoSphere Federation Server
Optim Connect
- Non-Enforced Referential Integrity
- Column Maps and Exits
Performance and Troubleshooting (6%)
- Log Files, Trace Files and Statistical Reports
- Relationships (i.e., delete strategy, OPTIM relationships, Forced Key/Scan Lookups, Scans, Threads)
- Relationship Index Analysis

IBM InfoSphere Optim for Distributed Systems - V7.3.1
Killexams : IBM Research Rolls Out A Comprehensive AI And Platform-Based Edge Research Strategy Anchored By Enterprise Use Cases And Partnerships

I recently met with Dr. Nick Fuller, Vice President, Distributed Cloud, at IBM Research for a discussion about IBM’s long-range plans and strategy for artificial intelligence and machine learning at the edge.

Dr. Fuller is responsible for providing AI and platform–based innovation for enterprise digital transformation spanning edge computing and distributed cloud management. He is an IBM Master Inventor with over 75 patents and co-author of 75 technical publications. Dr. Fuller obtained his Bachelor of Science in Physics and Math from Morehouse College and his PhD in Applied Physics from Columbia University.

Edge In, not Cloud Out

In general, Dr. Fuller told me that IBM is focused on developing an "edge in" position versus a "cloud out" position with data, AI, and Kubernetes-based platform technologies to scale hub and spoke deployments of edge applications.

A hub plays the role of a central control plane used for orchestrating the deployment and management of edge applications in a number of connected spoke locations such as a factory floor or a retail branch, where data is generated or locally aggregated for processing.
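
To make this concrete, here is a minimal Python sketch of a hub acting as the central control plane that reconciles application versions across registered spokes. Everything in it (the class names, the application name, the reconcile logic) is a hypothetical illustration of the pattern, not IBM's implementation.

```python
# Minimal sketch (not IBM's implementation): a hub records desired
# application state and reconciles each registered spoke toward it.
from dataclasses import dataclass, field

@dataclass
class Spoke:
    name: str                                     # e.g., "factory-floor-7"
    deployed: dict = field(default_factory=dict)  # app name -> version

@dataclass
class Hub:
    spokes: list = field(default_factory=list)
    desired: dict = field(default_factory=dict)   # app name -> version

    def register(self, spoke: Spoke):
        self.spokes.append(spoke)

    def set_desired(self, app: str, version: str):
        self.desired[app] = version

    def reconcile(self):
        """Push any out-of-date app to every spoke (the single control plane)."""
        for spoke in self.spokes:
            for app, version in self.desired.items():
                if spoke.deployed.get(app) != version:
                    spoke.deployed[app] = version  # stand-in for a real rollout
                    print(f"deployed {app}:{version} to {spoke.name}")

hub = Hub()
hub.register(Spoke("factory-floor-7"))
hub.register(Spoke("retail-branch-42"))
hub.set_desired("defect-detector", "1.3.0")
hub.reconcile()
```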

“Cloud out” refers to the paradigm where cloud service providers are extending their cloud architecture out to edge locations. In contrast, “edge in” refers to a provider-agnostic architecture that is cloud-independent and treats the data-plane as a first-class citizen.

IBM's overall architectural principles are scalability, repeatability, and full-stack solution management, with everything managed through a single unified control plane.

IBM’s Red Hat platform and infrastructure strategy anchors the application stack with a unified, scalable, and managed OpenShift-based control plane equipped with a high-performance storage appliance and self-healing system capabilities (inclusive of semi-autonomous operations).

IBM’s strategy also includes several in-progress platform-level technologies for scalable data, AI/ML runtimes, accelerator libraries for Day-2 AI operations, and scalability for the enterprise.

It is important to mention that IBM is designing its edge platforms with labor cost and technical workforce in mind. Data scientists with PhDs are in high demand, making them difficult to find and expensive to hire. IBM is designing its edge system capabilities and processes so that domain experts, rather than PhDs, can deploy new AI models and manage Day-2 operations.

Why edge is important

Advances in computing and storage have made it possible for AI to process mountains of accumulated data to provide solutions. By bringing AI closer to the source of data, edge computing is faster and more efficient than cloud. While cloud data accounts for 60% of the world’s data today, vast amounts of new data are being created at the edge, including industrial applications, traffic cameras, and order management systems, all of which can be processed at the edge in a fast and timely manner.

Public cloud and edge computing differ in capacity, technology, and management. An advantage of edge is that data is processed and analyzed at or near its collection point. In the case of cloud, data must be transferred from a local device into the cloud for analytics and then transferred back to the edge again. Moving data through the network consumes capacity and adds latency to the process. It’s easy to see why executing a transaction at the edge reduces latency and eliminates unnecessary load on the network.
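
The latency argument can be made concrete with back-of-the-envelope arithmetic. All of the figures below are illustrative assumptions, not measurements from the article:

```python
# Compare processing a batch of sensor data in place at the edge versus
# round-tripping it to a cloud region. All numbers are assumptions.
payload_mb      = 5.0    # assumed size of one batch of sensor data
uplink_mbps     = 50.0   # assumed WAN uplink
network_rtt_s   = 0.08   # assumed round-trip network latency to the cloud
cloud_compute_s = 0.05   # assumed cloud inference time (faster hardware)
edge_compute_s  = 0.20   # assumed edge inference time (modest hardware)

transfer_s = payload_mb * 8 / uplink_mbps        # seconds to move the data
cloud_total = transfer_s + network_rtt_s + cloud_compute_s
edge_total = edge_compute_s                      # data never leaves the site

print(f"cloud round trip: {cloud_total:.2f} s")  # ~0.93 s
print(f"edge, in place:   {edge_total:.2f} s")   # 0.20 s
```

Even with slower compute at the edge, avoiding the transfer and the round trip wins, and the gap widens as payloads grow.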

Increased privacy is another benefit of processing data at the edge. Analyzing data where it originates limits the risk of a security breach. Most of the communication between the edge and the cloud is then confined to things such as reporting, data summaries, and AI models, without ever exposing the raw data.

IBM at the Edge

In our discussion, Dr. Fuller provided a few examples to illustrate how IBM plans to provide new and seamless edge solutions for existing enterprise problems.

Example #1 – McDonald’s drive-thru

Dr. Fuller’s first example centered around Quick Service Restaurant’s (QSR) problem of drive-thru order taking. Last year, IBM acquired an automated order-taking system from McDonald's. As part of the acquisition, IBM and McDonald's established a partnership to perfect voice ordering methods using AI. Drive-thru orders are a significant percentage of total QSR orders for McDonald's and other QSR chains.

McDonald's and other QSR restaurants would like every order to be processed as quickly and accurately as possible. For that reason, McDonald's conducted trials at ten Chicago restaurants using an edge-based AI ordering system with NLP (Natural Language Processing) to convert spoken orders into a digital format. It was found that AI had the potential to reduce ordering errors and processing time significantly. Since McDonald's sells almost 7 million hamburgers daily, shaving a minute or two off each order represents a significant opportunity to address labor shortages and increase customer satisfaction.

Example #2 – Boston Dynamics and Spot the agile mobile robot

According to an earlier IBM survey, many manufacturers have already implemented AI-driven robotics with autonomous decision-making capability. The study also indicated that over 80 percent of companies believe AI can help improve future business operations. However, some companies expressed concern about the limited mobility of edge devices and sensors.

To develop a mobile edge solution, IBM teamed up with Boston Dynamics. The partnership created an agile mobile robot using IBM Research and IBM Sustainability Software AI technology. The device can analyze visual sensor readings in hazardous and challenging industrial environments such as manufacturing plants, warehouses, electrical grids, and waste treatment plants. The value proposition that Boston Dynamics brought to the partnership was Spot the agile mobile robot, a walking, sensing, and actuation platform. Like all edge applications, the robot’s wireless mobility uses self-contained AI/ML that doesn’t require access to cloud data. It uses cameras to read analog devices, visually monitor fire extinguishers, and conduct visual inspections of human workers to determine if required safety equipment is being worn.

IBM was able to show up to a 10X speedup by automating some manual tasks, such as converting the detection of a problem into an immediate work order in IBM Maximo to correct it. A fast automated response was not only more efficient, but it also improved the safety posture and risk management for these facilities. Similarly, some factories need to thermally monitor equipment to identify any unexpected hot spots that may show up over time, indicative of a potential failure.

IBM is working with National Grid, an energy company, to develop a mobile solution using Spot, the agile mobile robot, for image analysis of transformers and thermal connectors. Spot also monitored connectors on both flat and 3D surfaces. IBM was able to show that Spot could detect excessive heat build-up in small connectors, potentially avoiding unsafe conditions or costly outages. This AI/ML edge application can produce faster response times when an issue is detected, which is why IBM believes significant gains are possible by automating the entire process.

IBM market opportunities

Drive-thru orders and mobile robots are just a few examples of the millions of potential AI applications that exist at the edge and are driven by several billion connected devices.

Edge computing is an essential part of enterprise digital transformation. Enterprises seek ways to demonstrate the feasibility of solving business problems using AI/ML and analytics at the edge. However, once a proof of concept has been successfully demonstrated, it is a common problem for a company to struggle with scalability, data governance, and full-stack solution management.

Challenges with scaling

“Determining entry points for AI at the edge is not the difficult part,” Dr. Fuller said. “Scale is the real issue.”

Scaling edge models is complicated because there are so many edge locations with large amounts of diverse content and a high device density. Because large amounts of data are required for training, data gravity is a potential problem. Further, in many scenarios, vast amounts of data are generated quickly, leading to potential data storage and orchestration challenges. AI models are also rarely "finished." Monitoring and retraining of models are necessary to keep up with changes in the environment.

Through IBM Research, IBM is addressing the many challenges of building an all-encompassing edge architecture and horizontally scalable data and AI technologies. IBM has a wealth of edge capabilities and an architecture to create the appropriate platform for each application.

IBM AI entry points at the edge

IBM sees Edge Computing as a $200 billion market by 2025. Dr. Fuller and his organization have identified four key market entry points for developing and expanding IBM’s edge compute strategy. In order of size, IBM believes its priority edge markets to be intelligent factories (Industry 4.0), telcos, retail automation, and connected vehicles.

IBM and its Red Hat portfolio already have an established presence in each market segment, particularly in intelligent operations and telco. Red Hat is also active in the connected vehicles space.

Industry 4.0

There have been three prior industrial revolutions, beginning in the 1700s; the current, in-progress fourth revolution, Industry 4.0, promotes a digital transformation.

Manufacturing is the fastest growing and the largest of IBM’s four entry markets. In this segment, AI at the edge can improve quality control, production optimization, asset management, and supply chain logistics. IBM believes there are opportunities to achieve a 4x speed up in implementing edge-based AI solutions for manufacturing operations.

For its Industry 4.0 use case development, IBM, through product, development, research and consulting teams, is working with a major automotive OEM. The partnership has established the following joint objectives:

  • Increase automation and scalability across dozens of plants using 100s of AI / ML models. This client has already seen value in applying AI/ML models for manufacturing applications. IBM Research is helping with re-training models and implementing new ones in an edge environment to help scale even more efficiently. Edge offers faster inference and low latency, allowing AI to be deployed in a wider variety of manufacturing operations requiring instant solutions.
  • Dramatically reduce the time required to onboard new models. This will allow training and inference to be done faster and allow large models to be deployed much more quickly. The quicker an AI model can be deployed in production; the quicker the time-to-value and the return-on-investment (ROI).
  • Accelerate deployment of new inspections by reducing the labeling effort and iterations needed to produce a production-ready model via data summarization. Selecting small data sets for annotation means manually examining thousands of images, a time-consuming process that can result in labeling redundant data. Using ML-based automation for data summarization will accelerate the process and produce better model performance.
  • Enable Day-2 AI operations to help with data lifecycle automation and governance, model creation, reduce production errors, and provide detection of out-of-distribution data to help determine if a model’s inference is accurate. IBM believes this will allow models to be created faster without data scientists.

Maximo Application Suite

IBM’s Maximo Application Suite plays an important part in implementing large manufacturers' current and future IBM edge solutions. Maximo is an integrated public or private cloud platform that uses AI, IoT, and analytics to optimize performance, extend asset lifecycles and reduce operational downtime and costs. IBM is working with several large manufacturing clients currently using Maximo to develop edge use cases, and even uses it within its own manufacturing operations.

IBM has research underway to develop a more efficient method of handling life cycle management of large models that require immense amounts of data. Day 2 AI operations tasks can sometimes be more complex than initial model training, deployment, and scaling. Retraining at the edge is difficult because resources are typically limited.

Once a model is trained and deployed, it is important to monitor it for drift caused by changes in data distributions or anything that might cause a model to deviate from original requirements. Inaccuracies can adversely affect model ROI.
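
One common way to detect such drift (an illustrative approach, not necessarily the method IBM uses) is to compare the distribution of recent production inputs against the training baseline, for example with a two-sample Kolmogorov-Smirnov test:

```python
# Illustrative drift check: compare a feature's training-time distribution
# against recent production inputs with a two-sample KS test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
training_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)    # baseline
production_feature = rng.normal(loc=0.4, scale=1.0, size=5_000)  # shifted data

stat, p_value = ks_2samp(training_feature, production_feature)
if p_value < 0.01:
    print(f"drift detected (KS={stat:.3f}, p={p_value:.2e}); flag for retraining")
else:
    print("no significant drift")
```

When the check flags a shift, the affected model can be queued for retraining before inaccuracies erode its ROI.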

Day-2 AI Operations (retraining and scaling)

Day-2 AI operations consist of continual updates to AI models and applications to keep up with changes in data distributions, changes in the environment, a drop in model performance, availability of new data, and/or new regulations.

IBM recognizes the advantages of performing Day-2 AI Operations, which includes scaling and retraining at the edge. It appears that IBM is the only company with an architecture equipped to effectively handle Day-2 AI operations. That is a significant competitive advantage for IBM.

A company using an architecture that requires data to be moved from the edge back into the cloud for Day-2 related work will be unable to support many factory AI/ML applications because of the sheer number of AI/ML models to support (100s to 1000s).

“There is a huge proliferation of data at the edge that exists in multiple spokes,” Dr. Fuller said. "However, all that data isn’t needed to retrain a model. It is possible to cluster data into groups and then use sampling techniques to retrain the model. There is much value in federated learning from our point of view.”

Federated learning is a promising training solution being researched by IBM and others. It preserves privacy by using a collaboration of edge devices to train models without sharing the data with other entities. It is a good framework to use when resources are limited.
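
A minimal sketch of the idea, in the spirit of federated averaging (FedAvg): each spoke fits a model on data that never leaves the site, and the hub only averages the resulting weights. The toy linear model and data below are illustrative, not IBM Federated Learning's actual API:

```python
# Federated averaging sketch: only model weights (never raw data)
# travel from the spokes to the hub.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """One spoke's training pass on its private data (linear regression)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # MSE gradient
        w -= lr * grad
    return w

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
spokes = []
for _ in range(3):                               # three edge sites
    X = rng.normal(size=(200, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=200)
    spokes.append((X, y))

global_w = np.zeros(2)
for round_ in range(5):                          # federation rounds
    local_ws = [local_update(global_w, X, y) for X, y in spokes]
    global_w = np.mean(local_ws, axis=0)         # hub averages the weights

print("learned weights:", np.round(global_w, 2))  # approaches [2.0, -1.0]
```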

Dealing with limited resources at the edge is a challenge. IBM’s edge architecture accommodates the need to ensure resource budgets for AI applications are met, especially when deploying multiple applications and multiple models across edge locations. For that reason, IBM developed a method to deploy data and AI applications to scale Day-2 AI operations utilizing hub and spokes.

Today's status quo is to perform Day-2 operations using centralized applications and a centralized data plane; the managed hub and spoke method, with distributed applications and a distributed data plane, is more efficient. The hub allows it all to be managed from a single pane of glass.

Data Fabric Extensions to Hub and Spokes

IBM uses hub and spoke as a model to extend its data fabric. The model should not be thought of in the context of a traditional hub and spoke. IBM’s hub provides centralized capabilities to manage clusters and create multiple hubs that can be aggregated to a higher level. This architecture has four important data management capabilities.

  1. First, models running in unattended environments must be monitored. From an operational standpoint, detecting when a model’s effectiveness has significantly degraded and if corrective action is needed is critical.
  2. Secondly, in a hub and spoke model, data is being generated and collected in many locations creating a need for data life cycle management. Working with large enterprise clients, IBM is building unique capabilities to manage the data plane across the hub and spoke estate - optimized to meet data lifecycle, regulatory & compliance as well as local resource requirements. Automation determines which input data should be selected and labeled for retraining purposes and used to further improve the model. Identification is also made for atypical data that is judged worthy of human attention.
  3. The third issue relates to AI pipeline compression and adaptation. As mentioned earlier, edge resources are limited and highly heterogeneous. While a cloud-based model might have a few hundred million parameters or more, edge models can’t afford such extravagance. To reduce the edge compute footprint, model compression can reduce the number of parameters, for example from several hundred million down to a few million (a minimal sketch follows this list).
  4. Lastly, suppose a scenario exists where data is produced at multiple spokes but cannot leave those spokes for compliance reasons. In that case, IBM Federated Learning allows learning across heterogeneous data in multiple spokes. Users can discover, curate, categorize and share data assets, data sets, analytical models, and their relationships with other organization members.
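
As a concrete illustration of point 3, magnitude pruning is one common compression technique (the article does not specify which methods IBM uses): the smallest-magnitude weights are zeroed so the model's footprint shrinks.

```python
# Illustrative magnitude pruning: zero out the smallest-|w| weights
# to shrink a model toward an edge-friendly size.
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Keep only the largest-magnitude fraction (1 - sparsity) of weights."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(2)
layer = rng.normal(scale=0.05, size=(1024, 1024))    # toy dense layer
pruned = prune_by_magnitude(layer, sparsity=0.95)    # drop 95% of parameters

kept = np.count_nonzero(pruned)
print(f"parameters kept: {kept:,} of {layer.size:,}")  # ~5% survive
```

Stored in a sparse format, a layer pruned this way needs only a small fraction of the original memory, which is what makes reducing a model from hundreds of millions of parameters to a few million plausible.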

In addition to AI deployments, the hub and spoke architecture and the previously mentioned capabilities can be employed more generally to tackle challenges faced by many enterprises in consistently managing an abundance of devices within and across their enterprise locations. Management of the software delivery lifecycle or addressing security vulnerabilities across a vast estate are a case in point.

Multicloud and Edge platform

In the context of its strategy, IBM sees edge and distributed cloud as an extension of its hybrid cloud platform built around Red Hat OpenShift. One of the newer and more useful options created by the Red Hat development team is Single Node OpenShift (SNO), a compact version of OpenShift that fits on a single server. It is suited to locations that still run servers but call for a single-node deployment rather than a cluster.

For smaller footprints such as industrial PCs or computer vision boards (for example, NVIDIA Jetson Xavier), Red Hat is working on a project that builds an even smaller version of OpenShift, called MicroShift, that provides full application deployment and Kubernetes management capabilities. It is packaged so that it can be used for edge device type deployments.

Overall, IBM and Red Hat have developed a full complement of options to address a large spectrum of deployments across different edge locations and footprints, ranging from containers to management of full-blown Kubernetes applications from MicroShift to OpenShift and IBM Edge Application Manager.

Much is still in the research stage. IBM's objective is to achieve greater consistency in how locations and application lifecycles are managed.

First, Red Hat plans to introduce hierarchical layers of management with Red Hat Advanced Cluster Management (RHACM), to scale by two to three orders of magnitude the number of edge locations managed by this product. Additionally, securing edge locations is a major focus. Red Hat is continuously expanding platform security features, for example by recently including Integrity Measurement Architecture in Red Hat Enterprise Linux, or by adding Integrity Shield to protect policies in Red Hat Advanced Cluster Management (RHACM).

Red Hat is partnering with IBM Research to advance technologies that will permit it to protect platform integrity and the integrity of client workloads through the entire software supply chains. In addition, IBM Research is working with Red Hat on analytic capabilities to identify and remediate vulnerabilities and other security risks in code and configurations.

Telco network intelligence and slice management with AI/ML

Communication service providers (CSPs) such as telcos are key enablers of 5G at the edge. 5G benefits for these providers include:

  • Reduced operating costs
  • Improved efficiency
  • Increased distribution and density
  • Lower latency

The end-to-end 5G network comprises the Radio Access Network (RAN), transport, and core domains. Network slicing in 5G is an architecture that enables multiple virtual and independent end-to-end logical networks with different characteristics such as low latency or high bandwidth, to be supported on the same physical network. This is implemented using cloud-native technology enablers such as software defined networking (SDN), virtualization, and multi-access edge computing. Slicing offers necessary flexibility by allowing the creation of specific applications, unique services, and defined user groups or networks.

An important aspect of enabling AI at the edge requires IBM to provide CSPs with the capability to deploy and manage applications across various enterprise locations, possibly spanning multiple end-to-end network slices, using a single pane of glass.

5G network slicing and slice management

Network slices are an essential part of IBM's edge infrastructure that must be automated, orchestrated and optimized according to 5G standards. IBM’s strategy is to leverage AI/ML to efficiently manage, scale, and optimize the slice quality of service, measured in terms of bandwidth, latency, or other metrics.
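
As an illustration of the minimum that slice QoS management must do, the sketch below checks observed per-slice metrics against service-level targets; the slice names, metrics, and thresholds are all hypothetical:

```python
# Illustrative per-slice QoS check (names and thresholds are invented;
# the article does not specify IBM's implementation).
slices = {
    "urllc-robotics": {"target_latency_ms": 5,  "target_mbps": 50},
    "embb-video":     {"target_latency_ms": 50, "target_mbps": 400},
}

observed = {
    "urllc-robotics": {"latency_ms": 7.2,  "mbps": 55},   # latency SLO missed
    "embb-video":     {"latency_ms": 31.0, "mbps": 410},  # within targets
}

for name, slo in slices.items():
    m = observed[name]
    violations = []
    if m["latency_ms"] > slo["target_latency_ms"]:
        violations.append(f"latency {m['latency_ms']}ms > {slo['target_latency_ms']}ms")
    if m["mbps"] < slo["target_mbps"]:
        violations.append(f"throughput {m['mbps']}Mbps < {slo['target_mbps']}Mbps")
    if violations:
        # In a real system this would trigger automated re-orchestration
        # (e.g., scaling the slice's network functions), not just a log line.
        print(f"[{name}] SLO violation: {'; '.join(violations)}")
    else:
        print(f"[{name}] within targets")
```

The AI/ML layer IBM describes would sit above checks like these, predicting violations before they occur and driving the corrective orchestration automatically.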

5G and AI/ML at the edge also represent a significant opportunity for CSPs to move beyond traditional cellular services and capture new sources of revenue with new services.

Communications service providers need management and control of 5G network slicing enabled with AI-powered automation.

Dr. Fuller sees a variety of opportunities in this area. "When it comes to applying AI and ML on the network, you can detect things like intrusion detection and malicious actors," he said. "You can also determine the best way to route traffic to an end user. Automating 5G functions that run on the network using IBM network automation software also serves as an entry point.”

In IBM’s current telecom trial, IBM Research is spearheading the development of a range of capabilities targeted for the IBM Cloud Pak for Network Automation product using AI and automation to orchestrate, operate and optimize multivendor network functions and services that include:

  • End-to-end 5G network slice management with planning & design, automation & orchestration, and operations & assurance
  • Network Data and AI Function (NWDAF) that collects data for slice monitoring from 5G Core network functions, performs network analytics, and provides insights to authorized data consumers.
  • Improved operational efficiency and reduced cost

Future leverage of these capabilities by existing IBM Clients that use the Cloud Pak for Network Automation (e.g., DISH) can offer further differentiation for CSPs.

5G radio access

Open radio access networks (O-RANs) are expected to significantly impact telco 5G wireless edge applications by allowing a greater variety of units to access the system. The O-RAN concept separates the DU (Distributed Unit) and CU (Centralized Unit) from the Baseband Unit used in 4G and connects them with open interfaces.

The O-RAN system is more flexible. It uses AI to establish connections made via open interfaces that optimize the category of a device by analyzing information about its prior use. Like other edge models, the O-RAN architecture provides an opportunity for continuous monitoring, verification, analysis, and optimization of AI models.

The IBM-telco collaboration is expected to advance O-RAN interfaces and workflows. Areas currently under development are:

  • Multi-modal (RF level + network-level) analytics (AI/ML) for wireless communication with high-speed ingest of 5G data
  • Capability to learn patterns of metric and log data across CUs and DUs in RF analytics
  • Utilization of the antenna control plane to optimize throughput
  • Primitives for forecasting, anomaly detection and root cause analysis using ML
  • Opportunity of value-added functions for O-RAN

IBM Cloud and Infrastructure

The cornerstone for the delivery of IBM's edge solutions as a service is IBM Cloud Satellite. It presents a consistent cloud-ready, cloud-native operational view with OpenShift and IBM Cloud PaaS services at the edge. In addition, IBM integrated hardware and software edge systems will provide RHACM-based management of the platform when clients or third parties have existing managed-as-a-service models. It is essential to note that in either case this is done within a single control plane for hubs and spokes that helps optimize execution and management from any cloud to the edge in the hub and spoke model.

IBM's focus on “edge in” means it can provide infrastructure such as software-defined storage for a federated-namespace data lake that surrounds other hyperscaler clouds. Additionally, IBM is exploring integrated full stack edge storage appliances based on hyperconverged infrastructure (HCI), such as the Spectrum Fusion HCI, for enterprise edge deployments.

As mentioned earlier, data gravity is one of the main driving factors of edge deployments. IBM has designed its infrastructure to meet those data gravity requirements, not just for the existing hub and spoke topology but also for a future spoke-to-spoke topology where peer-to-peer data sharing becomes imperative (as illustrated with the wealth of examples provided in this article).

Wrap up

Edge is a distributed computing model. One of its main advantages is that computing, data storage, and processing are close to where data is created. Without the need to move data to the cloud for processing, real-time application of analytics and AI capabilities provides immediate solutions and drives business value.

IBM’s goal is not to move the entirety of its cloud infrastructure to the edge. That has little value and would simply function as a hub to spoke model operating on actions and configurations dictated by the hub.

IBM’s architecture will provide the edge with autonomy to determine where data should reside and from where the control plane should be exercised.

Equally important, IBM foresees this architecture evolving into a decentralized model capable of edge-to-edge interactions. IBM has no firm designs for this as yet. However, the plan is to make the edge infrastructure and platform a first-class citizen instead of relying on the cloud to drive what happens at the edge.

Developing a complete and comprehensive AI/ML edge architecture - and in fact, an entire ecosystem - is a massive undertaking. IBM faces many known and unknown challenges that must be solved before it can achieve success.

However, IBM is one of the few companies with the necessary partners and the technical and financial resources to undertake and successfully implement a project of this magnitude and complexity.

It is reassuring that IBM has a plan and that its plan is sound.

Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence and space at Moor Insights and Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 8×8, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign,TE Connectivity, TensTorrent, Tobii Technology, Teradata,T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movandi.

Mon, 08 Aug 2022 03:51:00 -0500 · Paul Smith-Goodson · https://www.forbes.com/sites/moorinsights/2022/08/08/ibm-research-rolls-out-a-comprehensive-ai-and-ml-edge-research-strategy-anchored-by-enterprise-partnerships-and-use-cases/
Killexams : IBM report shows healthcare has a growing cybersecurity gap



While enterprises are setting records in cybersecurity spending, the cost and severity of breaches continue to soar. IBM’s latest data breach report provides insights into why there’s a growing disconnect between enterprise spending on cybersecurity and record costs for data breaches. 

This year, 2022, is on pace to be a record-breaking year for enterprise breaches globally, with the average cost of a data breach reaching $4.35 million. That’s 12.7% higher than the average cost of a data breach in 2020, which was $3.86 million. It also found a record 83% of enterprises reporting more than one breach and that the average time to identify a breach is 277 days. As a result, enterprises need to look at their cybersecurity tech stacks to see where the gaps are and what can be improved.  

Enhanced security around privileged access credentials and identity management is an excellent first place to start. More enterprises need to define identities as their new security perimeter. IBM’s study found that 19% of all breaches begin with compromised privileged credentials. Breaches caused by compromised credentials lasted an average of 327 days. Privileged access credentials are also bestsellers on the Dark Web, with high demand for access to financial services’ IT infrastructure.  

The study also shows how dependent enterprises remain on implicit trust across their security and broader IT infrastructure tech stacks. The gaps in cloud security, identity and access management (IAM) and privileged access management (PAM) allow expensive breaches to happen. Seventy-nine percent of critical infrastructure organizations didn’t deploy a zero-trust architecture, even though zero trust can reduce average breach losses by nearly $1 million. 

Enterprises need to treat implicit trust as the unlocked back door that allows cybercriminals access to their systems, credentials and most valuable confidential data to reduce the incidence of breaches. 

What enterprises can learn from IBM’s data on healthcare breaches 

The report quantifies how wide healthcare’s cybersecurity gap is growing. IBM’s report estimates the average cost of a healthcare data breach is now $10.1 million, a record and nearly $1 million over last year’s $9.23 million. Healthcare has had the highest average breach cost for twelve consecutive years, increasing 41.6% since 2020. 

The findings suggest that the skyrocketing cost of breaches adds inflationary fuel to the fire, as runaway prices are financially squeezing global consumers and companies. Sixty percent of organizations participating in IBM’s study say they raised their product and service prices due to the breach, as supply chain disruptions, the war in Ukraine and tepid demand for products continue. Consumers are already struggling to meet healthcare costs, which will likely increase by 6.5% next year. 

The study also found that nearly 30% of breach costs are incurred 12 to 24 months after, translating into permanent price increases for consumers. 

“It is clear that cyberattacks are evolving into market stressors that are triggering chain reactions, [and] we see that these breaches are contributing to those inflationary pressures,” says John Hendley, head of strategy for IBM Security’s X-Force research team.  

Getting quick wins in encryption

For healthcare providers with limited cybersecurity budgets, prioritizing these areas can reduce the cost of a breach while making progress toward zero-trust initiatives. Getting identity and access management (IAM) right is core to a practical zero-trust framework; a framework that can quickly adapt to protect both human and machine identities is essential. IBM’s study found that of the zero-trust components measured, IAM is the most effective in reducing breach costs. Leading IAM providers include Akamai, Fortinet, Ericom, Ivanti, Palo Alto Networks and others. Ericom’s ZTEdge platform is noteworthy for combining ML-enabled identity and access management, zero-trust network access (ZTNA), microsegmentation and secure web gateway (SWG) with remote browser isolation (RBI) and Web Application Isolation.


Mon, 01 Aug 2022 16:20:00 -0500 · Louis Columbus · https://venturebeat.com/security/ibm-report-shows-healthcare-has-a-growing-cybersecurity-gap/
Killexams : IBM Security Report Reveals Data Breach Costs are Soaring

IBM Security released the annual Cost of a Data Breach Report, finding costlier and higher-impact data breaches than ever before, with the global average cost of a data breach reaching an all-time high of $4.35 million for studied organizations.

With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services.

In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.

The perpetuality of cyberattacks is also shedding light on the "haunting effect" data breaches are having on businesses, with the IBM report finding 83% of studied organizations have experienced more than one data breach in their lifetime.

Another factor rising over time is the after-effects of breaches on these organizations, which linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.

The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.

Some of the key findings in the 2022 IBM report include:

  • It Doesn't Pay to Pay – Ransomware victims in the study that opted to pay threat actors' ransom demands saw only $610,000 less in average breach costs compared to those that chose not to pay—not including the cost of the ransom. Factoring in the high cost of ransom payments, the financial toll may rise even higher, suggesting that simply paying the ransom may not be an effective strategy.
  • Security Immaturity in Clouds – Forty-three percent of studied organizations are in the early stages or have not started applying security practices across their cloud environments, observing over $660,000 on average in higher breach costs than studied organizations with mature security across their cloud environments. 
  • Security AI and Automation Leads as Multi-Million Dollar Cost Saver – Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology—the biggest cost saver observed in the study.

 

"Businesses need to put their security defenses on the offense and beat attackers to the punch. It's time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases." said Charles Henderson, global head of IBM Security X-Force. "This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked."

For more information about this report, visit https://www.ibm.com/security/data-breach.


Mon, 08 Aug 2022 01:03:00 -0500 · https://www.dbta.com/Editorial/News-Flashes/IBM-Security-Report-Reveals-Data-Breach-Costs-are-Soaring-154257.aspx
Killexams : IBM Report: Data Breach Costs Reach All-Time High

For the twelfth year in a row, healthcare saw the costliest breaches among all industries with the average cost reaching $10.1 million per breach.

CAMBRIDGE, Mass. — IBM (NYSE: IBM) Security released the annual Cost of a Data Breach Report, revealing costlier and higher-impact data breaches than ever before, with the global average cost of a data breach reaching an all-time high of $4.35 million for studied organizations. With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services. In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.

The perpetuality of cyberattacks is also shedding light on the “haunting effect” data breaches are having on businesses, with the IBM report finding 83% of studied organizations have experienced more than one data breach in their lifetime. Another factor rising over time is the after-effects of breaches on these organizations, which linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.

The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.

Some of the key findings in the 2022 IBM report include:

  • Critical Infrastructure Lags in Zero Trust – Almost 80% of critical infrastructure organizations studied don’t adopt zero trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. All while 28% of breaches amongst these organizations were ransomware or destructive attacks.
  • It Doesn’t Pay to Pay – Ransomware victims in the study that opted to pay threat actors’ ransom demands saw only $610,000 less in average breach costs compared to those that chose not to pay – not including the cost of the ransom. Factoring in the high cost of ransom payments, the financial toll may rise even higher, suggesting that simply paying the ransom may not be an effective strategy.
  • Security Immaturity in Clouds – Forty-three percent of studied organizations are in the early stages or have not started applying security practices across their cloud environments, observing over $660,000 on average in higher breach costs than studied organizations with mature security across their cloud environments.
  • Security AI and Automation Leads as Multi-Million Dollar Cost Saver – Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.

“Businesses need to put their security defenses on the offense and beat attackers to the punch. It’s time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases.” said Charles Henderson, Global Head of IBM Security X-Force. “This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked.”

Over-trusting Critical Infrastructure Organizations

Concerns over critical infrastructure targeting appear to be increasing globally over the past year, with many governments’ cybersecurity agencies urging vigilance against disruptive attacks. In fact, IBM’s report reveals that ransomware and destructive attacks represented 28% of breaches amongst critical infrastructure organizations studied, highlighting how threat actors are seeking to fracture the global supply chains that rely on these organizations. This includes financial services, industrial, transportation and healthcare companies amongst others.

Despite the call for caution, and a year after the Biden Administration issued a cybersecurity executive order that centers around the importance of adopting a zero trust approach to strengthen the nation’s cybersecurity, only 21% of critical infrastructure organizations studied adopt a zero trust security model, according to the report. Add to that, 17% of breaches at critical infrastructure organizations were caused due to a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.

Businesses that Pay the Ransom Aren’t Getting a “Bargain”

According to the 2022 IBM report, businesses that paid threat actors’ ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid. However, when accounting for the average ransom payment, which according to Sophos reached $812,000 in 2021, businesses that opt to pay the ransom could net higher total costs: a $610,000 saving set against an $812,000 payment leaves the payer roughly $200,000 worse off on average. All the while, they may be inadvertently funding future ransomware attacks with capital that could be allocated to remediation and recovery efforts, and exposing themselves to potential federal offenses.

The persistence of ransomware, despite significant global efforts to impede it, is fueled by the industrialization of cybercrime. IBM Security X-Force discovered the duration of studied enterprise ransomware attacks shows a drop of 94% over the past three years – from over two months to just under four days. These exponentially shorter attack lifecycles can prompt higher impact attacks, as cybersecurity incident responders are left with very short windows of opportunity to detect and contain attacks. With “time to ransom” dropping to a matter of hours, it’s essential that businesses prioritize rigorous testing of incident response (IR) playbooks ahead of time. But the report states that as many as 37% of organizations studied that have incident response plans don’t test them regularly.

Hybrid Cloud Advantage

The report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organizations studied. Averaging $3.8 million in breach costs, businesses that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model, which experienced $5.02 million and $4.24 million on average respectively. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.

The report highlights that 45% of studied breaches occurred in the cloud, emphasizing the importance of cloud security. However, a significant 43% of reporting organizations stated they are just in the early stages or have not started implementing security practices to protect their cloud environments, observing higher breach costs. Businesses studied that did not implement security practices across their cloud environments required an average 108 more days to identify and contain a data breach than those consistently applying security practices across all their domains.

Additional findings in the 2022 IBM report include:

  • Phishing Becomes Costliest Breach Cause – While compromised credentials continued to reign as the most common cause of a breach (19%), phishing was the second (16%) and the costliest cause, leading to $4.91 million in average breach costs for responding organizations.
  • Healthcare Breach Costs Hit Double Digits for First Time Ever – For the 12th year in a row, healthcare participants saw the costliest breaches amongst industries with average breach costs in healthcare increasing by nearly $1 million to reach a record high of $10.1 million.
  • Insufficient Security Staffing – Sixty-two percent of studied organizations stated they are not sufficiently staffed to meet their security needs, averaging $550,000 more in breach costs than those that state they are sufficiently staffed.

To get a copy of the 2022 Cost of a Data Breach Report, visit https://www.ibm.com/security/data-breach.

Fri, 29 Jul 2022 02:15:00 -0500 · CS Staff · https://www.campussafetymagazine.com/research/ibm-report-data-breach-costs-reach-all-time-high/
Killexams : IBM report shows cyberattacks growing fast in number, scale

A new report out of IBM shows that when it comes to the rising threat of data breaches, it’s the consumer – not the company – fronting the price tag.

Fri, 29 Jul 2022 22:30:00 -0500 · https://www.bizjournals.com/triad/news/2022/07/30/ibm-data-cyberattacks-growing-in-number-scale.html

Killexams : One in five data breaches due to software supply chain compromise, IBM report warns
Emma Woollacott 27 July 2022 at 15:04 UTC
Updated: 28 July 2022 at 07:55 UTC

Attack vector cost businesses 2.5% more in one year

Supply chain attacks are on the rise, costing businesses more year on year as organizations fail to implement zero trust strategies.

This is according to IBM’s new Cost of a Data Breach report, which found that one in five breaches occurred because of a compromise at a business partner, with a supply chain breach taking on average 26 days longer to identify and contain than the global average.

The total cost of a supply chain compromise was $4.46 million – 2.5% higher than average.

The report also found that the global average cost of a data breach has hit an all-time high of $4.35 million – up nearly 13% over the last two years.

“Seventeen per cent of breaches in critical infrastructure organizations occurred due to a business partner being initially compromised – this shows us that organizations need to put more focus on the security controls that govern third party access,” John Hendley, head of strategy at IBM Security X-Force told The Daily Swig.

Zero trust, zero problems?

Critical infrastructure organizations such as financial services, industrial, transportation, and healthcare companies are a growing target for these attacks, says IBM, and zero trust is the best way to guard against attack.

“Organizations need to be more vigilant than ever and closely scrutinize these external points of access into their environment, whether that’s through direct network access, applications, or even physical access,” says Hendley.

“Supply chain attacks are of great concern, both because of how insidious they are and how extreme their impacts can be. We saw this play out with SolarWinds, and we’ll surely see more of these attacks in the future.”


Those organizations that had implemented a zero trust security approach saw breaches cost them less, with an average cost saving of $1.5 million.

However, critical infrastructure organizations in particular are failing to do this, with only one in five having adopted a zero trust model, compared with an overall global average of 41%.

Javvad Malik, lead security awareness advocate at KnowBe4, says that greater transparency is needed across the supply chain, along with greater technical assurance that all components are adequately secured.

“We’ve seen many organizations breached, not for the organization itself, but because it will provide a way into another. Popular examples of these include Target, RSA, and more recently SolarWinds,” he told The Daily Swig.

“While many organisations try to mitigate risks by sending out lengthy questionnaires to third parties it deals with to determine the level of security they employ, it is often not sufficient to cover the entire supply chain, and even if it was, it doesn’t provide technical assurance.”


Wed, 27 Jul 2022 19:55:00 -0500 · https://portswigger.net/daily-swig/one-in-five-data-breaches-due-to-software-supply-chain-compromise-ibm-report-warns
Killexams : IBM Security report finds data breaches are costlier than ever before

A new report from IBM Security today reveals that data breaches are costlier and more impactful than ever before.

IBM Security’s 2022 Cost of a Data Breach Report, based on analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022, found that the average cost of a data breach has hit an all-time high of $4.35 million.

Figures relating to large companies and the cost involved in dealing with data breaches may seem academic to many, but interestingly the report suggests that the increasing cost of these incidents — up 13% over the last two years — is contributing to rising costs of goods and services. Sixty percent of studied organizations raised their product or service prices after experiencing a data breach. Those increases come at a time the cost of goods is already increasing from inflation and supply chain issues.

Data breaches were also found not to be one-offs, with 83% of studied organizations having experienced more than one data breach in their lifetime. Another factor rising over time is the after-effects of breaches on these organizations, which linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.

Whether companies are exclusively to blame for lax cybersecurity is arguable, but many were found lacking in adopting cutting-edge and more modern security practices. Eighty percent of critical infrastructure organizations studied were found to have not adopted zero-trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared with those that do.

Companies either in the early stages or who have not started applying security practices across their cloud environments were found to have $660,000 higher average breach costs than studied organizations with mature security across their cloud environments.

Conversely, organizations that have fully deployed security artificial intelligence and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.

Ransomware victims in the study that opted to pay threat actors’ ransom demands saw only $610,000 less in average breach costs compared with those that chose not to pay – not including the cost of the ransom. Given the high price of ransom payments, the report notes, the financial toll may rise even higher, suggesting that paying the ransom may not be an effective strategy.

The most costly form of data breach among the companies studied was found to be phishing. Compromised credentials were the most common cause of a breach at 19% but phishing, accounting for 16% of breaches, led to average breach costs of $4.91 million.

Other highlights in the report included healthcare breach costs hitting double digits for the first time, with an average breach in the sector resulting in a cost of $10.1 million. Insufficient security staffing was noted to be a serious issue, with 62% of organizations saying they are not sufficiently staffed to meet their security needs, averaging $550,000 more in breach costs than those that state they are sufficiently staffed.

“Businesses need to put their security defenses on the offense and beat attackers to the punch,” Charles Henderson, global head of IBM Security X-Force, said in a statement. “It’s time to stop the adversary from achieving their objectives and start to minimize the impact of attacks.”

Image: IBM Security

Tue, 26 Jul 2022 16:01:00 -0500 en-US text/html https://siliconangle.com/2022/07/27/ibm-security-report-finds-data-breaches-costlier-ever/
Killexams : The Quantum Decade: Canada’s foresight is paying off, says IBM Canada

“Quantum computing is no longer a futuristic concept," says Frank Attaie, IBM Canada’s general manager, technology

For a country with a legacy of dominance in natural resource industries — forestry, mining, oil and gas and marine — it seems counterintuitive for it also to rank as a global leader in one of the most dynamic and complex computing technologies ever known, yet here we are.

Canada recognized early on that quantum computing presents a staggering amount of potential across all facets of society, including the economy, medicine, climate change, academics, research, manufacturing and so much more. In fact, Canada is ranked fifth in total quantum science expenditures and first among G7 nations in per-capita spending on quantum research. The country has invested more than $1 billion in quantum science over the past decade and is investing $360 million more to launch a National Quantum Strategy to bolster quantum research, advance quantum-ready technologies and attract business and talent.

Frank Attaie, IBM Canada’s general manager, technology, says that Canada’s foresight is paying off: “Quantum computing is no longer a futuristic concept. The world has entered into the Quantum Decade, an era when Canadian enterprises are beginning to see quantum computing’s business value.” Over the last year, there have been unprecedented advances in hardware, software development and services that validate the technology’s momentum, creating an ecosystem that paves the way for further breakthroughs and helps prepare the market for the adoption of this revolutionary technology.

In February, IBM announced — in partnership with the Government of Quebec — it will accelerate discovery in artificial intelligence, semiconductors, high-performance computing and quantum computing, including installation of the first IBM Quantum System One in Canada. What this could mean for science and research in the country is unprecedented, says Attaie.

“What we do know,” continued Attaie, “is that the quantum challenge is too big for any one entity. As quantum moves from the lab to the real world, ecosystems are forming to support collaborative innovation and open-source development.” Potential ecosystems likely include a quantum computing technology partner, quantum computing developers and academic partners.

A successful model of this is the membership of the Université de Sherbrooke in the IBM Quantum Network, a community of Fortune 500 companies, academic institutions, startups and national research labs working with IBM to advance quantum computing. Since the university joined the IBM Quantum Network as a Hub in 2020, other industry leaders such as Lockheed Martin Canada and CMC Microsystems have joined as Hub members. This is in addition to another 12 IBM Quantum Network startups and academic partners across the country actively exploring quantum with IBM’s technology and scientists.

Frank Attaie, IBM Canada’s general manager, technology. SUPPLIED

The Quantum Decade needs a quantum workforce

“Innovation alone cannot unlock the full potential of quantum computing,” said Attaie. “We need the people and skills to do so.” Estimates indicate there are only about 3,000 skilled quantum workers in the market today, a base that needs to dramatically increase to exploit the full potential of quantum this decade and beyond. And while Canada’s foresight into the future of quantum has helped prime the sector with extraordinary levels of expertise, the growth of the industry will need much, much more, says Attaie.

“For Canada to be quantum ready, industry needs to engage critical constituencies, from domain experts to students, to enable quantum computing skills to blossom.” These include quantum developer certification, as well as bootcamp-like educational programs and investments in educator training that empower a diverse workforce.

The decade ahead — enterprises will see the payoff of quantum

While the power and capacity of quantum are clear, how the Quantum Decade will unfold is not. How will governments implement quantum computing as part of their economic growth strategies? What novel use cases will researchers uncover and put into play?

Attaie agrees there are many unknowns as to how the Quantum Decade will evolve but is confident its end will look nothing like the beginning. “We will be working with quantum processors with thousands of qubits, we will have a whole workforce with years of experience, and enterprises will have seen the payoff of quantum,” said Attaie. “Make no mistake, quantum computing is a whole new paradigm. Every technology leader should be actively building quantum into their plans so that they’re ready for its rapid evolution over the next decade.”

This story was provided by IBM Canada for commercial purposes.

Fri, 05 Aug 2022 05:14:00 -0500 en-CA text/html https://ottawacitizen.com/sponsored/news-sponsored/the-quantum-decade-canadas-foresight-is-paying-off-says-ibm-canada
Killexams : IBM’s Cost of a Data Breach Report finds invisible ‘cyber tax’


When it comes to operational challenges, few mistakes are as costly as data breaches. A single exploited vulnerability can lead to millions in damages, not just from the immediate disruption but from lost consumer trust and potential compliance liabilities.

Unfortunately, the cost of a data breach is only going up. Today, IBM Security released its annual “Cost of a Data Breach” report, conducted by Ponemon Institute, which found that the average cost of a data breach in 2022 reached $4.35 million, an increase of 2.6% over last year’s $4.24 million.

The research also found that organizations that fell victim to cyberattacks were prime targets for follow-up attacks, a “haunting effect” reflected in the 83% of organizations studied that had experienced more than one data breach.

For enterprises, the report highlights that new approaches are required to mitigate the impact of data breaches, particularly in the face of a growing number of sophisticated attacks, which can’t always be prevented. 

The hostile reality of the threat landscape 

As the cost of a data breach continues to rise amid rampant double- and triple-extortion ransomware attacks and identity-related breaches, it’s becoming increasingly clear that traditional approaches to enterprise security need to be reevaluated.

In the last week alone, T-Mobile and Twitter learned the cost of a data breach firsthand: the former agreed to pay customers $350 million as part of a post-breach settlement, while the latter dealt with the fallout after a hacker claimed to have accessed data on 5.4 million users.

With the impact of such breaches causing millions in damage, many organizations decide to pass costs onto consumers, as part of an invisible cyber tax. In fact, IBM found that for 60% of organizations, breaches led to price increases passed on to customers. 

“What stands out most in this year’s findings is that the financial impact of breaches is now extending well beyond the breached organizations themselves,” said John Hendley, head of strategy at IBM Security X-Force.

“The cost is trickling down to consumers. In fact, if you consider that two or three companies within a supply chain may have suffered a breach and increased their prices, there’s this multiplier effect that’s ultimately hitting the consumer’s wallet. Essentially, we’re now beginning to see a hidden “cyber tax” that individuals are paying as a result of the growing number of breaches occurring today compounded with the more obvious disruptive effects of cyberattacks,” Hendley said. 
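Hendley’s multiplier effect is simple compounding: when each firm in a supply chain passes on a breach-driven price increase, the increases multiply rather than add as goods move downstream. A minimal sketch, with hypothetical percentages:

    # Hypothetical breach-driven price increases at three tiers of a
    # supply chain; the percentages are illustrative assumptions.
    increases = [0.03, 0.02, 0.04]  # supplier, manufacturer, retailer

    multiplier = 1.0
    for r in increases:
        multiplier *= 1 + r  # each tier passes its increase downstream

    print(f"Cumulative consumer price increase: {multiplier - 1:.2%}")
    # ~9.26%, slightly more than the 9% a naive sum would suggest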

When asked why the cost of data breaches continued to grow, Hendley explained that there’s a high volume of attacks occurring, but only a limited number of skilled security professionals available to respond to them.

This is highlighted in the research with 62% of organizations saying they weren’t sufficiently staffed to meet their security needs.

What are the implications for CISOs and security leaders?

Although the report paints a bleak picture of the current threat landscape, it also points to promising technologies and methodologies that enterprises can use to reduce the cost of data breaches.

For instance, one of the most promising findings was that organizations with fully deployed security AI and automation can expect to pay $3.05 million less during a data breach and, on average, cut the time to identify and contain a breach by 74 days.

At the same time, organizations that implement zero trust can expect to pay $1 million less in breach costs than those that don’t.

Finally, organizations that maintain an incident response team and regularly test their IR plans can expect to cut breach costs by $2.66 million.

Tue, 26 Jul 2022 23:01:00 -0500 Tim Keary en-US text/html https://venturebeat.com/security/cost-of-a-data-breach/
Killexams : CIOReview Names Cobalt Iron Among 10 Most Promising IBM Solution Providers 2022

LAWRENCE, Kan.--(BUSINESS WIRE)--Jul 28, 2022--

Cobalt Iron Inc., a leading provider of SaaS-based enterprise data protection, today announced that the company has been deemed one of the 10 Most Promising IBM Solution Providers 2022 by CIOReview Magazine. The annual list of companies is selected by a panel of experts and members of CIOReview Magazine’s editorial board to recognize and promote innovation and entrepreneurship. A technology partner for IBM, Cobalt Iron earned the distinction based on its Compass ® enterprise SaaS backup platform for monitoring, managing, provisioning, and securing the entire enterprise backup landscape.

This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20220728005043/en/

Cobalt Iron Compass® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection. (Graphic: Business Wire)

According to CIOReview, “Cobalt Iron has built a patented cyber-resilience technology in a SaaS model to alleviate the complexities of managing large, multivendor setups, providing an effectual humanless backup experience. This SaaS-based data protection platform, called Compass, leverages strong IBM technologies. For example, IBM Spectrum Protect is embedded into the platform from a data backup and recovery perspective. ... By combining IBM’s technologies and the intellectual property built by Cobalt Iron, the company delivers a secure, modernized approach to data protection, providing a ‘true’ software as a service.”

Through proprietary technology, the Compass data protection platform integrates with, automates, and optimizes best-of-breed technologies, including IBM Spectrum Protect, IBM FlashSystem, IBM Red Hat Linux, IBM Cloud, and IBM Cloud Object Storage. Compass enhances and extends IBM technologies by automating more than 80% of backup infrastructure operations, optimizing the backup landscape through analytics, and securing backup data, making it a valuable addition to IBM’s data protection offerings.

CIOReview also praised Compass for its simple, intuitive interface, which displays a consolidated view of data backups across an entire organization without requiring logins to every backup product instance to extract data. The machine learning-enabled platform also automates backup processes and infrastructure, and it uses open APIs to connect with ticket management systems to generate tickets automatically for any backups that need immediate attention.
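The release does not document the actual Compass API, but the integration pattern it describes, raising a ticket when a backup needs attention, is typically a small REST call into the ticketing system. A generic sketch in Python, with the endpoint, payload fields, and response shape entirely hypothetical:

    # Generic sketch of a backup-alert-to-ticket integration. The endpoint,
    # payload fields, and response shape are hypothetical; Compass's real
    # API is not documented in this release and may differ entirely.
    import requests

    TICKET_API = "https://ticketing.example.com/api/v2/tickets"  # hypothetical

    def open_ticket_for_backup(job: dict, token: str) -> str:
        """Raise a ticket for a backup job that needs immediate attention."""
        payload = {
            "summary": f"Backup job {job['id']} needs attention",
            "description": f"Status: {job['status']}, host: {job['host']}",
            "priority": "high" if job["status"] == "failed" else "medium",
        }
        resp = requests.post(
            TICKET_API,
            json=payload,
            headers={"Authorization": f"Bearer {token}"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["ticket_id"]  # hypothetical response field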

To ensure the security of data backups, Cobalt Iron has developed an architecture and security feature set called Cyber Shield for 24/7 threat protection, detection, and analysis that improves ransomware responsiveness. Compass is also being enhanced to use several patented techniques that are specific to analytics and ransomware. For example, analytics-based cloud brokering of data protection operations helps enterprises make secure, efficient, and cost-effective use of their cloud infrastructures. Another patented technique — dynamic IT infrastructure optimization in response to cyberthreats — offers unique ransomware analytics and automated optimization that will enable Compass to reconfigure IT infrastructure automatically when it detects cyberthreats, such as a ransomware attack, and dynamically adjust access to backup infrastructure and data to reduce exposure.
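The patented mechanics are not disclosed in the release, but the behavior described, automatically restricting access to backup infrastructure when a threat is detected, amounts to a policy loop that maps threat level to access settings. A hedged sketch of that general idea, with all names hypothetical:

    # Hypothetical illustration of threat-triggered hardening of backup
    # access; names and settings do not reflect Cobalt Iron's implementation.
    from enum import Enum

    class ThreatLevel(Enum):
        NORMAL = 0
        SUSPICIOUS = 1
        ACTIVE_ATTACK = 2

    def backup_access_policy(level: ThreatLevel) -> dict:
        """Map a detected threat level to backup-infrastructure settings."""
        if level is ThreatLevel.ACTIVE_ATTACK:
            # Isolate: block destructive operations and lock retention
            return {"writes": False, "deletes": False, "retention_lock": True}
        if level is ThreatLevel.SUSPICIOUS:
            # Restrict: keep backups flowing but forbid deletes
            return {"writes": True, "deletes": False, "retention_lock": True}
        return {"writes": True, "deletes": True, "retention_lock": False}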

Compass is part of IBM’s product portfolio through the IBM Passport Advantage program. Through Passport Advantage, IBM sellers, partners, and distributors around the world can sell Compass under IBM part numbers to any organizations, particularly complex enterprises, that greatly benefit from the automated data protection and anti-ransomware solutions Compass delivers.

CIOReview’s report concludes, “With such innovations, all eyes will be on Cobalt Iron for further advancements in humanless, secure data backup solutions. Cobalt Iron currently focuses on IP protection and continuous R&D to bring about additional cybersecurity-related innovations, promising a more secure future for an enterprise’s data.”

About Cobalt Iron

Cobalt Iron was founded in 2013 to bring about fundamental changes in the world’s approach to secure data protection, and today the company’s Compass® is the world’s leading SaaS-based enterprise data protection system. Through analytics and automation, Compass enables enterprises to transform and optimize legacy backup solutions into a simple cloud-based architecture with built-in cybersecurity. Processing more than 8 million jobs a month for customers in 44 countries, Compass delivers modern data protection for enterprise customers around the world. www.cobaltiron.com

Product or service names mentioned herein are the trademarks of their respective owners.

Link to Word Doc: www.wallstcom.com/CobaltIron/220728-Cobalt_Iron-CIOReview_Top_IBM_Provider_2022.docx

Photo Link: www.wallstcom.com/CobaltIron/Cobalt_Iron_CIO_Review_Top_IBM_Solution_Provider_Award_Logo.pdf

Photo Caption: Cobalt Iron Compass® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection.

Follow Cobalt Iron

https://twitter.com/cobaltiron

https://www.linkedin.com/company/cobalt-iron/

https://www.youtube.com/user/CobaltIronLLC

View source version on businesswire.com: https://www.businesswire.com/news/home/20220728005043/en/

CONTACT: Agency Contact:

Sunny Branson

Wall Street Communications

Tel: +1 801 326 9946

Email: sunny@wallstcom.com

Web: www.wallstcom.com

Cobalt Iron Contact:

Mary Spurlock

VP of Marketing

Tel: +1 785 979 9461

Email: maspurlock@cobaltiron.com

Web: www.cobaltiron.com

KEYWORD: EUROPE UNITED STATES NORTH AMERICA KANSAS

INDUSTRY KEYWORD: DATA MANAGEMENT SECURITY TECHNOLOGY SOFTWARE NETWORKS INTERNET

SOURCE: Cobalt Iron

Copyright Business Wire 2022.

Thu, 28 Jul 2022 01:03:00 -0500 en text/html https://www.eagletribune.com/region/cioreview-names-cobalt-iron-among-10-most-promising-ibm-solution-providers-2022/article_56f7dda7-cbd5-586a-9d5f-f882022100da.html