These M2020-645 PDF braindumps are kept well updated

killexams.com always suggests that you download the M2020-645 exam real questions for trial and go through the questions and answers before you apply for the full version. Killexams.com allows you 3 months of free updates of M2020-645 IBM Cognos Business Intelligence Sales Mastery Test v2 exam questions. Our certification group is consistently working at the back end and updates the M2020-645 real questions as and when needed.

Exam Code: M2020-645 Practice exam 2022 by Killexams.com team
IBM Cognos Business Intelligence Sales Mastery Test v2
IBM Intelligence Study Guide
Killexams : IBM Intelligence Study Guide - BingNews https://killexams.com/pass4sure/exam-detail/M2020-645
Killexams : Explainable AI Is Trending And Here’s Why

According to the 2022 IBM Institute for Business Value study on AI Ethics in Action, building trustworthy Artificial Intelligence (AI) is perceived as a strategic differentiator and organizations are beginning to implement AI ethics mechanisms.

Seventy-five percent of respondents believe that ethics is a source of competitive differentiation. More than 67% of respondents who view AI and AI ethics as important indicate that their organizations outperform their peers in sustainability, social responsibility, and diversity and inclusion.

The survey showed that 79% of CEOs are prepared to embed AI ethics into their AI practices, up from 20% in 2018, but less than a quarter of responding organizations have operationalized AI ethics. Less than 20% of respondents strongly agreed that their organization's practices and actions match (or exceed) their stated principles and values.

Peter Bernard, CEO of Datagration, says that understanding AI gives companies an advantage, but Bernard adds that explainable AI allows businesses to optimize their data.

"Not only are they able to explain and understand the AI/ML behind predictions, but when errors arise, they can understand where to go back and make improvements," said Bernard. "A deeper understanding of AI/ML allows businesses to know whether their AI/ML is making valuable predictions or whether they should be improved."

Bernard believes this can ensure incorrect data is spotted early on and stopped before decisions are made.

Avivah Litan, vice president and distinguished analyst at Gartner, says that explainable AI also furthers scientific discovery as scientists and other business users can explore what the AI model does in various circumstances.

"They can work with the models directly instead of relying only on what predictions are generated given a certain set of inputs," said Litan.

But John Thomas, Vice President and Distinguished Engineer at IBM Expert Labs, says that at its most basic level, explainable AI refers to the methods and processes for helping us understand a model's output. "In other words, it's the effort to build AI that can explain to designers and users why it made the decision it did based on the data that was put into it," said Thomas.

Thomas says there are many reasons why explainable AI is urgently needed.

"One reason is model drift. Over time as more and more data is fed into a given model, this new data can influence the model in ways you may not have intended," said Thomas. "If we can understand why an AI is making certain decisions, we can do much more to keep its outputs consistent and trustworthy over its lifecycle."

Thomas adds that at a practical level, we can use explainable AI to make models more accurate and refined in the first place. "As AI becomes more embedded in our lives in more impactful ways, [..] we're going to need not only governance and regulatory tools to protect consumers from adverse effects, we're going to need technical solutions as well," said Thomas.
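To make "explaining a model's output" concrete, here is a minimal, self-contained sketch of permutation feature importance, one common explainability technique: shuffle a single input feature and measure how much the model's error grows. The dataset and "model" below are invented for illustration and are not from the article. Features the model truly relies on show large error increases; ignored features show none.

```python
import random

random.seed(0)

# Tiny synthetic dataset: the target depends strongly on feature 0,
# weakly on feature 1, and not at all on feature 2.
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]
y = [3.0 * row[0] + 0.5 * row[1] for row in X]

def model(row):
    # Stand-in for a trained model (here it happens to match the data exactly).
    return 3.0 * row[0] + 0.5 * row[1]

def mse(pred, truth):
    return sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)

baseline = mse([model(r) for r in X], y)

def permutation_importance(feature):
    """Shuffle one feature column and report how much the error grows."""
    col = [row[feature] for row in X]
    random.shuffle(col)
    X_perm = [row[:feature] + [v] + row[feature + 1:] for row, v in zip(X, col)]
    return mse([model(r) for r in X_perm], y) - baseline

for f in range(3):
    print(f"feature {f}: importance {permutation_importance(f):.2f}")
```

Feature 0 shows a large importance score, feature 1 a small one, and feature 2 exactly zero, which is precisely the kind of "why did the model decide that" evidence Thomas describes.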

"AI is becoming more pervasive, yet most organizations cannot interpret or explain what their models are doing," said Litan. "And the increasing dependence on AI escalates the impact of mis-performing AI models with severely negative consequences," said Litan.

Bernard takes it back to a practical level, saying that explainable AI [..] creates proof of what senior engineers and experts "know" intuitively while explaining the reasoning behind it. "Explainable AI can also take commonly held beliefs and prove that the data does not back it up," said Bernard.

"Explainable AI lets us troubleshoot how an AI is making decisions and interpreting data, an extremely important tool in helping us ensure AI is helping everyone, not just a narrow few," said Thomas.

Hiring is an example of where explainable AI can help everyone.

Thomas says hiring managers deal with all kinds of hiring and talent shortages and usually get more applications than they can read thoroughly. This means there is a strong demand to be able to evaluate and screen applicants algorithmically.

"Of course, we know this can introduce bias into hiring decisions, as well as overlook a lot of people who might be compelling candidates with unconventional backgrounds," said Thomas. "Explainable AI is an ideal solution for these sorts of problems because it would allow you to understand why a model rejected a certain applicant and accepted another. It helps you make your model better."

Making AI trustworthy

IBM's AI Ethics survey showed that 85% of IT professionals agree that consumers are more likely to choose a company that's transparent about how its AI models are built, managed and used.

Thomas says explainable AI is absolutely a response to concerns about understanding and being able to trust AI's results.

"There's a broad consensus among people using AI that you need to take steps to explain how you're using it to customers and consumers," said Thomas. "At the same time, the field of AI Ethics as a practice is relatively new, so most companies, even large ones, don't have a Head of AI ethics, and they don't have the skills they need to build an ethics panel in-house."

Thomas believes it's essential that companies begin thinking about building those governance structures. "But there is also a need for technical solutions that can help companies manage their use of AI responsibly," said Thomas.

Driven by industry, compliance or everything?

Bernard points to the oil and gas industry as why explainable AI is necessary.

"Oil and gas have [..] a level of engineering complexity, and very few industries apply engineering and data at such a deep and constant level like this industry," said Bernard. "From the reservoir to the surface, every aspect is an engineering challenge with millions of data points and different approaches."

Bernard says in this industry, operators and companies still utilize spreadsheets and other home-grown systems built decades ago. "Utilizing ML enables them to take siloed knowledge, improve it and create something transferrable across the organization, allowing consistency in decision making and process."

"When oil and gas companies can perform more efficiently, it is a win for everyone," said Bernard. "The companies see the impact in their bottom line by producing more from their existing assets, lowering environmental impact, and doing more with less manpower."

Bernard says this leads to more supply to help ease the burden on demand. "Even modest increases like a 10% improvement in production can have a massive impact on supply; the more production we have, [..] consumers will see relief at the pump."

But Litan says the trend toward explainable AI is mainly driven by regulatory compliance.

A 2021 Gartner survey, AI in Organizations, reported that regulatory compliance is the top reason privacy, security and risk are seen as barriers to AI implementation.

"Regulators are demanding AI model transparency and proof that models are not generating biased decisions and unfair 'irresponsible' policies," said Litan. "AI privacy, security and/or risk management starts with AI explainability, which is a required baseline."

Litan says Gartner sees the biggest uptake of explainable AI in regulated industries like healthcare and financial services. "But we also see it increasingly with technology service providers that use AI models, notably in security or other scenarios," said Litan.

Litan adds that another reason explainable AI is trending is that organizations are unprepared to manage AI risks and often cut corners around model governance. "Organizations that adopt AI trust, risk and security management – which starts with inventorying AI models and explaining them – get better business results," adds Litan.

But IBM's Thomas doesn't think you can parse the uptake of explainable AI by industry.

"What makes a company interested in explainable AI isn't necessarily the industry they're in; it's that they're invested in AI in the first place," said Thomas. "IT professionals at businesses deploying AI are 17% more likely to report that their business values AI explainability. Once you get beyond exploration and into the deployment phase, explaining what your models are doing and why quickly becomes very important to you."

Thomas says that IBM sees some compelling use cases in specific industries starting with medical research.

"There is a lot of excitement about the potential for AI to accelerate the pace of discovery by making medical research easier," said Thomas. "But, even if AI can do a lot of heavy lifting, there is still skepticism among doctors and researchers about the results."

Thomas says explainable AI has been a powerful solution to that particular problem, allowing researchers to embrace AI modeling to help them solve healthcare-related challenges because they can refine their models, control for bias and monitor the results.

"That trust makes it much easier for them to build models more quickly and feel comfortable using them to inform their care for patients," said Thomas.

IBM worked with Highmark Health to build a model using claims data to model sepsis and COVID-19 risk. But again, Thomas adds that because it's a tool for refining and monitoring how your AI models perform, explainable AI shouldn't be restricted to any particular industry or use case.

"We have airlines who use explainable AI to ensure their AI is doing a good job predicting plane departure times. In financial services and insurance, companies are using explainable AI to make sure they are making fair decisions about loan rates and premiums," said Thomas. "This is a technical component that will be critical for anyone getting serious about using AI at scale, regardless of what industry they are in."

Guard rails for AI ethics

What does the future look like with AI ethics and explainable AI?

Thomas says the hope is that explainable AI will spread and see adoption because that will be a sign companies take trustworthy AI, both the governance and the technical components, very seriously.

He also sees explainable AI as essential guardrails for AI Ethics down the road.

"When we started putting seatbelts in cars, a lot more people started driving, but we also saw fewer and less severe accidents," said Thomas. "That's the obvious hope - that we can make the benefits of this new technology much more widely available while also taking the needed steps to ensure we are not introducing unanticipated consequences or harms."

One of the most significant factors working against the adoption of AI and its productivity gains is the genuine need to address concerns about how AI is used, what types of data are being collected about people, and whether AI will put them out of a job.

But Thomas says that worry is contrary to what’s happening today. "AI is augmenting what humans can accomplish, from helping researchers conduct studies faster to assisting bankers in designing fairer and more efficient loans to helping technicians inspect and fix equipment more quickly," said Thomas. "Explainable AI is one of the most important ways we are helping consumers understand that, so a user can say with a much greater degree of certainty that no, this AI isn't introducing bias, and here's exactly why and what this model is really doing."

One tangible example IBM uses is AI Factsheets in their IBM Cloud Pak for Data. IBM describes the factsheets as 'nutrition labels' for AI, which allow them to list the types of data and algorithms that make up a particular model, in the same way a food item lists its ingredients.

"To achieve trustworthy AI at scale, it takes more than one company or organization to lead the charge,” said Thomas. “AI should come from a diversity of datasets, diversity in practitioners, and a diverse partner ecosystem so that we have continuous feedback and improvement.”

Wed, 27 Jul 2022 12:00:00 -0500 Jennifer Kite-Powell en text/html https://www.forbes.com/sites/jenniferhicks/2022/07/28/explainable-ai-is--trending-and-heres-why/
Killexams : IBM's Watson Could Diagnose Cancer Better Than Doctors
Several years ago, IBM's Watson supercomputer gained fame after beating some of the world's top Jeopardy! players. To accomplish that feat, researchers fed thousands of points of information into Watson's database, allowing it to retrieve information presented through natural language. While winning Jeopardy! might be an exciting challenge for researchers, Watson's next goal could revolutionize oncology. IBM is currently working on the third generation of the Watson platform, which has the power to debate and reason, according to IBM CEO Ginni Rometty.

The latest version of Watson can absorb and analyze vast amounts of data, allowing it to make diagnoses that are more accurate than human doctors. If a Watson-style computer were deployed through a cloud interface, healthcare facilities may be able to improve diagnosis accuracy, reduce costs and minimize patient wait times.

In combination with the Memorial Sloan-Kettering Cancer Center and Wellpoint, a private healthcare company, researchers hope to see Watson available for rent to any clinic or healthcare facility that wants to get its opinion on an oncology diagnosis. On top of this, researchers state that the system will be able to suggest the most affordable way of paying for treatment.

Approximately two years ago, IBM researchers announced that Watson had the same level of knowledge as a second-year med-school student. Over the past year, Wellpoint, Sloan-Kettering and IBM have been teaching the system how to comprehend and analyze peer-reviewed medical data. While researchers have started off with just breast, prostate and lung cancers, IBM hopes to expand to other forms of cancer in the near future.

As of now, Watson has assimilated over 600,000 unique types of medical evidence. In addition, Watson's database includes two million pages sourced from a variety of different medical journals. To improve the link between symptoms and a diagnosis, Watson also has the ability to search through 1.5 million patient records to learn from previous diagnoses. This amount of information is more than any human physician can learn in a lifetime.

According to a study by Sloan-Kettering, only one-fifth of knowledge used by physicians when diagnosing a patient is based on trial-based information. To stay on top of new medical knowledge as it is published, physicians would have to read for at least 160 hours every week. Since Watson can absorb this information faster than a human, it could potentially revolutionize the current model of healthcare.

According to Samuel Nessbaum of Wellpoint, Watson's diagnostic accuracy rate for lung cancer is 90%. In comparison, the average diagnostic accuracy rate for lung cancer for human physicians is only 50%.

Wellpoint believes that Watson will be able to reduce waste. According to the company, approximately one-third of the money spent on U.S. healthcare is wasted every year. With Watson, utilization management can be improved, state researchers.

"What Watson is going to enable us to do is take that wisdom and put it in a way that people who don't have that much experience in any individual disease can have a wise counselor at their side at all times and use the intelligence and wisdom of the most experienced people to help guide decisions," notes Dr. Larry Norton, a researcher at Sloan-Kettering.

Sun, 17 Jul 2022 11:59:00 -0500 en text/html https://www.mddionline.com/software/ibms-watson-could-diagnose-cancer-better-doctors
Killexams : IBM report: Data breach costs up, contributing to inflation

The “2022 Cost of a Data Breach Report” found 60 percent of studied organizations raised their product or services prices because of a breach. The report analyzed 550 organizations that suffered a data breach between March 2021 and March 2022, with research conducted by the Ponemon Institute.

IBM has studied data breaches in the United States for the last 17 years. In 2021, the average cost of a breach was $4.24 million.

New to this year’s report was a look at the effects of supply chain compromises and the security skills gap. While organizations that were breached because of a supply chain compromise were relatively low (19 percent), the average total cost of such a breach was $4.46 million.

The average time to identify and contain a supply chain compromise was 303 days, opposed to the global average of 277 days.

The study found the average data breach cost savings of a sufficiently staffed organization was $550,000, but only 38 percent of studied organizations said their security team was sufficiently staffed.

Of note, the “Cost of Compliance Report 2022” published by Thomson Reuters Regulatory Intelligence earlier this month found staff shortages have been driven by rising salaries, tightening budgets, and personal liability increases.

The IBM study included 13 companies that experienced data breaches involving the loss or theft of 1 million to 60 million records. The average total cost for breaches of 50-60 million records was $387 million, a slight decline from $401 million in 2021.

For a second year, the study examined how deploying a “zero trust” security framework has a net positive impact on data breach costs, with savings of approximately $1 million for organizations that implemented one. However, only 41 percent of organizations surveyed deployed a zero trust security architecture.

Organizations with mature deployment of zero trust applied consistently across all domains saved more than $1.5 million on average, according to the survey.

Almost 80 percent of critical infrastructure organizations that did not adopt a zero trust strategy saw average breach costs rise to $5.4 million.

The study also found it doesn't pay to pay hackers: victims that paid saw only $610,000 less in average breach costs than businesses that chose not to pay ransomware threat actors.

Organizations that fully deployed security artificial intelligence and automation incurred $3.05 million less on average in breach costs compared to those that did not, the biggest saving observed in the study.

“Businesses need to put their security defenses on the offense and beat attackers to the punch,” said Charles Henderson, global head of IBM Security X-Force, in a press release announcing the study. “It’s time to stop the adversary from achieving their objectives and start to minimize the impact of attacks.”

Thu, 28 Jul 2022 08:48:00 -0500 en text/html https://www.complianceweek.com/cybersecurity/ibm-report-data-breach-costs-up-contributing-to-inflation/31909.article
Killexams : How IBM’s Watson Can Help Cut Your Tax Bill

Computers have helped people wade through their tax returns for decades. Preparers at H&R Block this season will get some help, too. But a computer won’t just crunch the numbers. Rather, it will probe your return and ask questions along the way--trying to make sure you don’t pay a penny more than necessary to Uncle Sam. H&R Block has “hired” IBM’s Watson--a powerful artificial intelligence system--to act as a cyberguide in preparing taxes.

To study for the job, Watson digested the federal tax code (more than 74,000 pages) and absorbed thousands of conversations between H&R Block’s tax preparers and clients. Its objective is to analyze conversational patterns to determine whether taxpayers may be missing opportunities for savings. Now, when a tax preparer and client go through the paperwork, Watson can follow along, and the system will issue prompts on computer screens if it detects a potential deduction or credit they may be missing. “Watson will ask questions that we might not think about on our own,” says Ed Harbour, a vice president at IBM.

With H&R Block expecting Watson to assist with processing 11 million returns this year, the system should get smarter as it absorbs more data and conversational patterns, making it a more useful tool, says Harbour. For example, IBM says the system should eventually provide taxpayers with “increasingly personalized tips” to help lower their tax bills in the future.

Watson isn’t the only AI technology playing a greater role in our lives. If you use Facebook, Google or Amazon.com, for instance, AI is behind the scenes, guiding you through the site. Meanwhile, Watson is expanding into scientific research, robotics and other fields, and IBM hopes it will push to the forefront of several major technology trends, such as the Internet of Things, cloud computing and personalized medicine. (For more companies cashing in on these trends, see 13 Stocks for the Tech Revolution.)

For now, Watson doesn’t appear to be a big moneymaker for IBM. But investors seem impressed with its potential, seeing it as a way for IBM to reverse a long period of falling sales and profits. IBM’s stock (symbol IBM) returned 25% in 2016, after three straight years of declines. One shareholder who should be pleased: Warren Buffett, whose Berkshire Hathaway owns an 8.5% stake in Big Blue, worth $14.2 billion.

Sun, 17 Jul 2022 11:59:00 -0500 en text/html https://www.kiplinger.com/article/investing/t057-c000-s002-how-ibm-watson-can-help-cut-your-tax-bill.html
Killexams : Managed Network Services Market Size, Share Growing Rapidly with Recent Trends, Development, Revenue, Demand and Forecast to 2030

The MarketWatch News Department was not involved in the creation of this content.

Jul 27, 2022 (AmericaNewsHour) -- Key Companies Covered in the Managed Network Services Market Research are Fujitsu Ltd., Cisco Systems, Inc., Dell EMC (EMC Corporation), IBM Corporation, Alcatel-Lucent, and other key market players.

In its market research collateral archive, CRIFAX added a report titled 'Global Managed Network Services Market, 2021-2030', which consists of the study of the growth strategies used by the leading players in the Managed Network Services market to keep themselves ahead of the competition. In addition, the study also covers emerging trends, mergers and acquisitions, region-wise growth analysis, as well as the challenges that impact the growth of the market.

For more information about this report visit: https://www.crifax.com/sample-request-1001808

The growth of the global Managed Network Services market is largely driven by the increasing number of technical developments in different industries around the world and the overall digital revolution. Digital economic development is one of the key factors motivating big giants to invest aggressively in digital innovation and shift their conventional business models to automated ones, so as to seize value-producing opportunities, stay ahead of their competitors, and boost the continuity and reliability of their services. From artificial intelligence (AI), augmented reality (AR) and virtual reality (VR) to the internet of things (IoT), the growing number of internet-connected devices around the world on account of these technologies is anticipated to contribute to the growth of the global Managed Network Services market.

For more information about this report visit: https://www.crifax.com/sample-request-1001808

This Report covers about :

  • Historical data
  • Revenue forecasts, CAGR and growth rates up to 2030
  • Industry Analysis
  • Competitive Analysis
  • Key geographic growth data
  • In-depth profiling of Key Player's Companies

The increasing number of internet users and data-transmission devices and networks, along with innovations in the ICT industry, is expected to generate substantial opportunities in the global Managed Network Services market over the forecast period, i.e., 2021-2030. As per statistics provided by the International Telecommunication Union (ITU), the number of individuals using the internet per 100 inhabitants in developed regions roughly tripled to 86.6 over the period 2001-2019.

Some of the prominent factors that are expected to fuel the demand for Managed Network Services in the coming years are the growing number of internet users and the overall rise in research and development activities in the field of information and communication technology.

However, with technologies continuously changing, firms need to keep up with these innovations to maintain a strategic advantage over their competitors in the industry. In order to achieve this, it is vital for them to train their practitioners on a timely basis. Not only will this encourage marketers to stay ahead in their business, but it will also help them discover new applications.

To provide better understanding of internal and external marketing factors, the multi-dimensional analytical tools such as SWOT and PESTEL analysis have been implemented in the global Managed Network Services market report. In addition, the research covers market segmentation, CAGR (Compound Annual Growth Rate), BPS analysis, Y-o-Y growth (%), Porter's five-force model, absolute $ opportunity, and the market's anticipated cost structure.

About CRIFAX

CRIFAX is driven by integrity and commitment to its clients and provides a step-by-step guide on achieving their business prospects with cutting-edge marketing research and consulting solutions. We make sure that our industry enthusiasts understand all the business aspects related to their projects with the help of our industry experts with hands-on experience in their respective fields, which further enhances the consumer base and the size of their organization. From customized and syndicated research reports to consulting services, we offer a wide range of unique marketing research solutions, out of which, we update our syndicated research reports annually to ensure that they are modified according to the latest and ever-changing technology and industry insights. This has enabled us to build a niche in offering 'distinctive market services' that strengthened the confidence in our insights of our global clients and helped us outperform our competitors as well.

Contact Us:

CRIFAX

Email: sales@crifax.com

U.K. Phone: +44 161 394 2021

U.S. Phone: +1 917 924 8284

The post Managed Network Services Market Size, Share Growing Rapidly with Recent Trends, Development, Revenue, Demand and Forecast to 2030 appeared first on America News Hour.


Wed, 27 Jul 2022 01:11:00 -0500 en-US text/html https://www.marketwatch.com/press-release/managed-network-services-market-size-share-growing-rapidly-with-recent-trends-development-revenue-demand-and-forecast-to-2030-2022-07-27
Killexams : IBM Report: Consumers Pay the Price as Data Breach Costs Reach All-Time High

60% of breached businesses raised product prices post-breach; vast majority of critical infrastructure lagging in zero trust adoption; $550,000 in extra costs for insufficiently staffed businesses

CAMBRIDGE, Mass., July 27, 2022 /PRNewswire/ -- IBM (NYSE: IBM) Security today released the annual Cost of a Data Breach Report, revealing costlier and higher-impact data breaches than ever before, with the global average cost of a data breach reaching an all-time high of $4.35 million for studied organizations. With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services. In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.

60% of breached businesses studied stated they increased the price of their products or services due to the data breach

The perpetuality of cyberattacks is also shedding light on the "haunting effect" data breaches are having on businesses, with the IBM report finding 83% of studied organizations have experienced more than one data breach in their lifetime. Another factor rising over time is the after-effects of breaches on these organizations, which linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.

The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.

Some of the key findings in the 2022 IBM report include:

  • Critical Infrastructure Lags in Zero Trust – Almost 80% of critical infrastructure organizations studied don't adopt zero trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. All while 28% of breaches amongst these organizations were ransomware or destructive attacks.
  • It Doesn't Pay to Pay – Ransomware victims in the study that opted to pay threat actors' ransom demands saw only $610,000 less in average breach costs compared to those that chose not to pay – not including the cost of the ransom. Factoring in the high cost of ransom payments, the financial toll may rise even higher, suggesting that simply paying the ransom may not be an effective strategy.
  • Security Immaturity in Clouds – Forty-three percent of studied organizations are in the early stages or have not started applying security practices across their cloud environments, observing over $660,000 on average in higher breach costs than studied organizations with mature security across their cloud environments.
  • Security AI and Automation Leads as Multi-Million Dollar Cost Saver – Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.
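The zero-trust figure in the first bullet can be spelled out with simple arithmetic on the numbers quoted above (a back-of-the-envelope check, using no figures beyond those in the report):

```python
# Figures quoted in the report (USD).
cost_without_zero_trust = 5_400_000   # avg breach cost, critical infrastructure, no zero trust
increase_over_adopters = 1_170_000    # stated gap vs. organizations that do adopt zero trust

# Implied average breach cost for zero-trust adopters.
cost_with_zero_trust = cost_without_zero_trust - increase_over_adopters
print(f"${cost_with_zero_trust:,}")  # $4,230,000
```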

"Businesses need to put their security defenses on the offense and beat attackers to the punch. It's time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases." said Charles Henderson, Global Head of IBM Security X-Force. "This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked."

Over-trusting Critical Infrastructure Organizations
Concerns over critical infrastructure targeting appear to be increasing globally over the past year, with many governments' cybersecurity agencies urging vigilance against disruptive attacks. In fact, IBM's report reveals that ransomware and destructive attacks represented 28% of breaches amongst critical infrastructure organizations studied, highlighting how threat actors are seeking to fracture the global supply chains that rely on these organizations. This includes financial services, industrial, transportation and healthcare companies amongst others.

Despite the call for caution, and a year after the Biden Administration issued a cybersecurity executive order centered on the importance of adopting a zero trust approach to strengthen the nation's cybersecurity, only 21% of critical infrastructure organizations studied have adopted a zero trust security model, according to the report. Add to that, 17% of breaches at critical infrastructure organizations were caused by a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.

Businesses that Pay the Ransom Aren't Getting a "Bargain"
According to the 2022 IBM report, businesses that paid threat actors' ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid. However, when accounting for the average ransom payment, which according to Sophos reached $812,000 in 2021, businesses that opt to pay the ransom could net higher total costs - all while inadvertently funding future ransomware attacks with capital that could be allocated to remediation and recovery efforts and looking at potential federal offenses.

The persistence of ransomware, despite significant global efforts to impede it, is fueled by the industrialization of cybercrime. IBM Security X-Force found that the duration of studied enterprise ransomware attacks has dropped 94% over the past three years – from over two months to just under four days. These exponentially shorter attack lifecycles can prompt higher-impact attacks, as cybersecurity incident responders are left with very short windows of opportunity to detect and contain attacks. With "time to ransom" dropping to a matter of hours, it's essential that businesses prioritize rigorous testing of incident response (IR) playbooks ahead of time. But the report states that as many as 37% of organizations studied that have incident response plans don't test them regularly.

Hybrid Cloud Advantage
The report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organizations studied. Averaging $3.8 million in breach costs, businesses that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model, which experienced $5.02 million and $4.24 million on average respectively. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.

The report highlights that 45% of studied breaches occurred in the cloud, emphasizing the importance of cloud security. However, a significant 43% of reporting organizations stated they are just in the early stages or have not started implementing security practices to protect their cloud environments, incurring higher breach costs [2]. Businesses studied that did not implement security practices across their cloud environments required an average of 108 more days to identify and contain a data breach than those consistently applying security practices across all their domains.

Additional findings in the 2022 IBM report include:

  • Phishing Becomes Costliest Breach Cause – While compromised credentials continued to reign as the most common cause of a breach (19%), phishing was the second (16%) and the costliest cause, leading to $4.91 million in average breach costs for responding organizations.
  • Healthcare Breach Costs Hit Double Digits for First Time Ever – For the 12th year in a row, healthcare participants saw the costliest breaches amongst industries, with average breach costs in healthcare increasing by nearly $1 million to reach a record high of $10.1 million.
  • Insufficient Security Staffing – Sixty-two percent of studied organizations stated they are not sufficiently staffed to meet their security needs, averaging $550,000 more in breach costs than those that state they are sufficiently staffed.

Additional Sources

  • To download a copy of the 2022 Cost of a Data Breach Report, please visit: https://www.ibm.com/security/data-breach.
  • Read more about the report's top findings in this IBM Security Intelligence blog.
  • Sign up for the 2022 IBM Security Cost of a Data Breach webinar on Wednesday, August 3, 2022, at 11:00 a.m. ET here.
  • Connect with the IBM Security X-Force team for a personalized review of the findings: https://ibm.biz/book-a-consult.

About IBM Security
IBM Security offers one of the most advanced and integrated portfolios of enterprise security products and services. The portfolio, supported by world-renowned IBM Security X-Force® research, enables organizations to effectively manage risk and defend against emerging threats. IBM operates one of the world's broadest security research, development, and delivery organizations, monitors 150 billion+ security events per day in more than 130 countries, and has been granted more than 10,000 security patents worldwide. For more information, please visit www.ibm.com/security, follow @IBMSecurity on Twitter or visit the IBM Security Intelligence blog.

Press Contact:

IBM Security Communications
Georgia Prassinos
gprassinos@ibm.com

[1] Cost of a Data Breach Report 2022, conducted by Ponemon Institute, sponsored and analyzed by IBM
[2] Average cost of $4.53 million, compared to an average cost of $3.87 million at participating organizations with mature-stage cloud security practices

IBM Corporation logo. (PRNewsfoto/IBM)

Cision View original content to download multimedia:https://www.prnewswire.com/news-releases/ibm-report-consumers-pay-the-price-as-data-breach-costs-reach-all-time-high-301592749.html

SOURCE IBM

Tue, 26 Jul 2022 17:00:00 -0500 en-US text/html https://fox8.com/business/press-releases/cision/20220727NY26218/ibm-report-consumers-pay-the-price-as-data-breach-costs-reach-all-time-high/
Killexams : IBM Touts AI, Hybrid Cloud: ‘Demand For Our Solutions Remains Strong’

Cloud News

Joseph F. Kovar

‘Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth,’ says IBM CEO Arvind Krishna.


A strengthening IT environment that is playing into IBM AI and hybrid cloud capabilities means a rosy future for IBM and its B2B business, CEO Arvind Krishna told investors Monday.

Krishna, in his prepared remarks for IBM’s second fiscal quarter 2022 financial analyst conference call, said that technology serves as a fundamental source of competitive advantage for businesses.

“It serves as both a deflationary force and a force multiplier, and is especially critical as clients face challenges on multiple fronts from supply chain bottlenecks to demographic shifts,” he said. “Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth.”


That plays well with IBM’s hybrid cloud and AI strategy where the company is investing in its offerings, technical talent, ecosystem, and go-to-market model, Krishna said.

“Demand for our solutions remains strong,” he said. “We continued to have double-digit performance in IBM Consulting, broad-based strength in software, and with the z16 [mainframe] platform launch, our infrastructure business had a good quarter. By integrating technology and expertise from IBM and our partners, our clients will continue to see our hybrid cloud and AI solutions as a crucial source of business opportunity and growth.”

Krishna said hybrid clouds are about offering clients a platform to straddle multiple public clouds, private clouds, on-premises infrastructures, and the edge, which is where Red Hat, which IBM acquired in 2019, comes into play, Krishna said.

“Our software has been optimized to run on that platform, and includes advanced data and AI, automation, and the security capabilities our clients need,” he said. “Our global team of consultants offers deep business expertise and co-creates with clients to accelerate their digital transformation journeys. Our infrastructure allows clients to take full advantage of an extended hybrid cloud environment.”

As a result, IBM now has over 4,000 hybrid cloud platform clients, with over 250 new clients added during the second fiscal quarter, Krishna said.

“Those who adopt our platform tend to consume more of our solutions across software, consulting, and infrastructure, [and] expanding our footprint within those clients,” he said.

IBM is also benefitting from the steady adoption by businesses of artificial intelligence technologies as those businesses try to process the enormous amount of data generated from hybrid cloud environments all the way to the edge, Krishna said. An IBM study released during the second fiscal quarter found that 35 percent of companies are now using some form of AI with automation in their business to address demographic shifts and move their employees to higher value work, he said.

“This is one of the many reasons we are investing heavily in both AI and automation,” he said. “These investments are paying off.”

IBM is also moving to develop leadership in quantum computing, Krishna said. The company currently has a 127-qubit quantum computer in its cloud, and is committed to demonstrating the first 400-plus-qubit system before year-end as part of its path to deliver a 1,000-plus-qubit system next year and a 4,000-plus-qubit system in 2025, he said.

“One of the implications of quantum computing will be the need to change how information is encrypted,” he said. “We are proud that technology developed by IBM and our collaborators has been selected by NIST (National Institute of Standards and Technology) as the basis of the next generation of quantum-safe encryption protocols.”

IBM during the quarter also moved forward with its mainframe technology via the release of its new z16 mainframe, Krishna said.

“The z16 is designed for cloud-native development, cybersecurity resilience, [and] quantum-safe encryption, and includes an on-chip AI accelerator, which allows clients to reduce fraud within real-time transactions,” he said.

IBM also made two acquisitions during the quarter related to cybersecurity, Krishna said. The first was Randori, an attack surface management and offensive cybersecurity provider. That acquisition built on IBM’s November acquisition of ReaQta, an endpoint security firm, he said.

While analysts during the question and answer part of Monday's financial analyst conference call did not ask about the news that IBM has brought in Matt Hicks as the new CEO of Red Hat, they did seem concerned about how the 17-percent growth in Red Hat revenue over last year missed expectations.

When asked about Red Hat revenue, Krishna said IBM feels very good about the Red Hat business and expect continued strong demand.

“That said, we had said late last year that we expect growth in Red Hat to be in the upper teens,” he said. “That expectation is what we are going to continue with. … Deferred revenue accounts for the bulk of what has been the difference in the growth rates coming down from last year to this year.”

IBM CFO James Kavanaugh followed by saying that while IBM saw 17 percent growth overall for Red Hat, the company took market share with its core RHEL (Red Hat Enterprise Linux) and with its Red Hat OpenShift hybrid cloud platform foundation. Red Hat OpenShift revenue is now four-and-a-half times the revenue before IBM acquired Red Hat, and Red Hat OpenShift bookings were up over 50 percent, Kavanaugh said.

“So we feel pretty good about our Red Hat portfolio overall. … Remember, we‘re three years into this acquisition right now,” he said. “And we couldn’t be more pleased as we move forward.”

When asked about the potential impact from an economic downturn, Krishna said IBM’s pipelines remain healthy and consistent with what the company saw in the first half of fiscal 2022, making him more optimistic than many of his peers.

“In an inflationary environment, when clients take our technology, deploy it, leverage our consulting, it acts as a counterbalance to all of the inflation and all of the labor demographics that people are facing all over the globe,” he said.

Krishna also said IBM's consulting business is less likely than most vendors' to be impacted by the economic cycle, as it involves a lot of work deploying the kinds of applications critical to clients' need to optimize their costs. Furthermore, because consulting is very labor-intensive, he said, it is easy to hire or let go tens of thousands of employees as needed.

The Numbers

For its second fiscal quarter 2022, which ended June 30, IBM reported total revenue of $15.5 billion, up about 9 percent from the $14.2 billion the company reported for its second fiscal quarter 2021.

This includes software revenue of $6.2 billion, up from $5.9 billion; consulting revenue of $4.8 billion, up from $4.4 billion; infrastructure revenue of $4.2 billion, up from $3.6 billion; financing revenue of $146 million, down from $209 million; and other revenue of $180 million, down from $277 million.

On the software side, IBM reported annual recurring revenue of $12.9 billion, which was up 8 percent over last year. Software revenue from its Red Hat business was up 17 percent over last year, while automation software was up 8 percent, data and AI software up 4 percent, and security software up 5 percent.

On the consulting side, technology consulting revenue was up 23 percent over last year, applications operations up 17 percent, and business transformation up 16 percent.

Infrastructure revenue growth was driven by hybrid infrastructure sales, which rose 7 percent over last year, and infrastructure support, which grew 5 percent. Hybrid infrastructure revenue saw a significant boost from zSystems mainframe sales, which rose 77 percent over last year.

IBM also reported revenue of $8.1 billion from sales to the Americas, up 15 percent over last year; sales to Europe, Middle East, and Africa of $4.5 billion, up 17 percent; and $2.9 billion to the Asia Pacific area, up 16 percent.

Sales to Kyndryl, which late last year was spun out of IBM, accounted for about 5 percent of revenue, including 3 percent of IBM’s Americas revenue.

IBM also reported net income for the quarter on a GAAP basis of $1.39 billion, or $1.53 per share, up from last year’s $1.33 billion, or $1.47 per share.

Joseph F. Kovar

Joseph F. Kovar is a senior editor and reporter for the storage and the non-tech-focused channel beats for CRN. He keeps readers abreast of the latest issues related to such areas as data life-cycle, business continuity and disaster recovery, and data centers, along with related services and software, while highlighting some of the key trends that impact the IT channel overall. He can be reached at jkovar@thechannelcompany.com.

Tue, 19 Jul 2022 13:17:00 -0500 en text/html https://www.crn.com/news/cloud/ibm-touts-ai-hybrid-cloud-demand-for-our-solutions-remains-strong-
Killexams : Price hike for consumers as data breach costs rocket to all-time high

Organisations are passing costs onto customers as the price of data breaches has hit an all-time high, the latest research from IBM Security has found.

Around 60% of studied businesses raised their product or services prices post-breach, as the global average cost of a data breach hit a record $4.35 million, according to the firm's 2022 Cost of Data Breach Report.


Sponsored by IBM Security and conducted by the Ponemon Institute, the report is based on an in-depth analysis of data breaches experienced by 550 organisations around the world between March 2021 and March 2022.

It found that 83% of them have experienced more than one data breach in their lifetime, while the after-effects of such attacks appear to linger over the long term – with almost 50% of costs incurred more than a year after the initial breach.

In terms of zero trust, critical infrastructure has been found wanting. The report discovered that 80% of critical infrastructure organisations studied do not adopt zero trust strategies, equating to an average breach cost of $5.4 million – a $1.17 million increase over those that do.

Ransomware and destructive attacks represented 28% of breaches amongst these critical infrastructure organisations, with threat actors seeking to fracture the global supply chains that rely on these organisations, IBM said.

Those that paid the ransom found little success. Victims that opted to give in to the attackers' demands saw only $610,000 less in average breach costs when compared to those that did not pay, not including the cost of the ransom.

When factoring in the high cost of ransom payments, IBM noted that the financial toll “may rise even higher”, rendering the strategy of paying up not very effective.

Muddying the waters further, 43% of organisations were found to be in the early stages of or had not yet started applying security practices across their cloud environments, adding an average of $660,000 in breach costs.

By contrast, those fully deploying security AI and automation incurred $3.05 million less in breach costs on average compared to organisations that did not utilise the technology, making it the biggest cost saver in the study, IBM revealed.

Additionally, the data showed phishing to be the costliest breach cause, leading to $4.91 million in average costs for responding organisations. It was also the second most common cause (16%), just behind compromised credentials (19%).

Charles Henderson, global head of IBM Security X-Force, said businesses need to “put their security defenses on the offense” to better protect against attackers.

“It’s time to stop the adversary from achieving their objectives and start to minimise the impact of attacks,” he said. “The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases.

“This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked.”

Sat, 30 Jul 2022 06:48:00 -0500 en text/html https://www.itpro.co.uk/security/data-breaches/368649/price-hike-for-consumers-as-data-breach-costs-rocket-to-all-time-high
Killexams : Intel’s ATX12VO Standard: A Study In Increasing Computer Power Supply Efficiency

The venerable ATX standard was developed in 1995 by Intel, as an attempt to standardize what had until then been a PC ecosystem formed around the IBM AT PC's legacy. The preceding AT form factor was not so much a standard as a copy of the IBM AT's approximate mainboard layout, complete with all of its flaws.

With the ATX standard also came the ATX power supply (PSU) standard, which defines the voltage rails and the function of each additional feature, such as soft power on (PS_ON). As with all electrical appliances and gadgets during the 1990s and beyond, ATX PSUs became subject to power efficiency regulations, which would also lead to the 80 Plus certification program in 2004.

Starting in 2019, Intel has been promoting the ATX12VO (12 V only) standard for new systems, but what is this new standard about, and will switching everything to 12 V really be worth any power savings?

What ATX12VO Is

As the name implies, the ATX12VO standard is essentially about removing the other voltage rails that currently exist in the ATX PSU standard. The idea is that by providing one single base voltage, any other voltages can be generated as needed using step-down (buck) converters. Since the Pentium 4 era this has already become standard practice for the processor and much of the circuitry on the mainboard anyway.
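Regenerating the deleted minor rails from the single 12 V input relies on the ideal buck relation V_out = D × V_in, where D is the switching duty cycle. A minimal sketch of that arithmetic follows; it assumes a lossless converter, so real duty cycles run slightly higher:

```python
# Ideal (lossless) buck converter: Vout = D * Vin, with duty cycle 0 < D <= 1.
# Illustration only -- a real mainboard VRM also has switching and conduction
# losses, feedback control, and usually multiple phases.

def buck_duty_cycle(v_in: float, v_out: float) -> float:
    """Duty cycle an ideal buck converter needs to step v_in down to v_out."""
    if not 0.0 < v_out <= v_in:
        raise ValueError("a buck converter can only step down")
    return v_out / v_in

# The legacy ATX minor rails, regenerated from the 12 V-only input:
for rail in (5.0, 3.3):
    d = buck_duty_cycle(12.0, rail)
    print(f"{rail:.1f} V rail: duty cycle = {d:.1%}")
```

Run as-is this prints roughly 41.7% for the 5 V rail and 27.5% for the 3.3 V rail; the same relation governs the CPU VRMs that already step 12 V down to around 1 V on every modern board.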

As the ATX PSU standard moved from the old 1.x revisions into the current 2.x revision range, the -5V rail was removed, and the -12V rail made optional. The ATX power connector with the mainboard was increased from 20 to 24 pins to allow for more 12 V capacity to be added. Along with the Pentium 4’s appetite for power came the new 4-pin mainboard connector, which is commonly called the “P4 connector”, but officially the “+12 V Power 4 Pin Connector” in the v2.53 standard. This adds another two 12 V lines.

Power input and output on the ASRock Z490 Phantom Gaming 4SR, an ATX12VO mainboard. (Credit: Anandtech)

In the ATX12VO standard, the -12 V, 5 V, 5 VSB (standby) and 3.3 V rails are deleted. The 24-pin connector is replaced with a 10-pin one that carries three 12 V lines (one more than ATX v2.x) in addition to the new 12 VSB standby voltage rail. The 4-pin 12 V connectors would still remain, and still require one to squeeze one or two of those through impossibly small gaps in the system’s case to get them to the top of the mainboard, near the CPU’s voltage regulator modules (VRMs).

While the PSU itself would be somewhat streamlined, the mainboard would gain these VRM sections for the 5 V and 3.3 V rails, as well as power outputs for SATA, Molex and similar. Essentially the mainboard would take over some of the PSU’s functions.

Why ATX12VO exists

A range of Dell computers and servers which will be subject to California's strict efficiency regulations.

The folk over at GamersNexus have covered their research and the industry’s thoughts on the syllabu of ATX12VO in an article and video that were published last year. To make a long story short, OEM system builders and systems integrators are subject to pretty strong power efficiency regulations, especially in California. Starting in July of 2021, new Tier 2 regulations will come into force that add more strict requirements for OEM and SI computer equipment: see 1605.3(v)(5) (specifically table V-7) for details.

In order to meet these ever more stringent efficiency requirements, OEMs have been creating their own proprietary 12 V-only solutions, as detailed in GamersNexus’ recent video review on the Dell G5 5000 pre-built desktop system. Intel’s ATX12VO standard therefore would seem to be more targeted at unifying these proprietary standards rather than replacing ATX v2.x PSUs in DIY systems. For the latter group, who build their own systems out of standard ATX, mini-ITX and similar components, these stringent efficiency regulations do not apply.

The primary question thus becomes whether ATX12VO makes sense for DIY system builders. While the ability to (theoretically) increase power efficiency especially at low loads seems beneficial, it’s not impossible to accomplish the same with ATX v2.x PSUs. As stated by an anonymous PSU manufacturer in the GamersNexus article, SIs are likely to end up simply using high-efficiency ATX v2.x PSUs to meet California’s Tier 2 regulations.

Evolution vs Revolution

Seasonic’s CONNECT DC-DC module connected to a 12V PSU. (Credit: Seasonic)

Ever since the original ATX PSU standard, the improvements have been gradual and never disruptive. Although some got caught out when trying to power old mainboards that relied on the -5 V and -12 V rails after those were dropped, in general these changes were minor enough to be absorbed into the natural upgrade cycle of computer systems. Not so with ATX12VO, as it absolutely requires an ATX12VO PSU and mainboard to accomplish the increased efficiency goals.

While the possibility of using an ATX v2.x to ATX12VO adapter exists that passively adapts the 12 V rails to the new 10-pin connector and boosts the 5 VSB line to 12 VSB levels, this actually lowers efficiency instead of increasing it. Essentially, the only way for ATX12VO to make a lot of sense is for the industry to switch over immediately and everyone to upgrade to it as well without reusing non-ATX12VO compatible mainboards and PSUs.

Another crucial point here is that OEMs and SIs are not required to adopt ATX12VO. Much like Intel’s ill-fated BTX alternative to the ATX standard, ATX12VO is a suggested standard that manufacturers and OEMs are free to adopt or ignore at their leisure.

Important here are probably the obvious negatives that ATX12VO introduces:

  • Adding another hot spot to the mainboard and taking up precious board space.
  • Turning mainboard manufacturers into PSU manufacturers.
  • Increasing the cost and complexity of mainboards.
  • Routing peripheral power (including case fans) from the mainboard.
  • Complicating troubleshooting of power issues.
Internals of Seasonic’s CONNECT modular power supply. (Credit: Tom’s Hardware)

Add to this potential alternatives like Seasonic’s CONNECT module. This does effectively the same as the ATX12VO standard, removing the 5 V and 3.3 V rails from the PSU and moving them to an external module, off of the mainboard. It can be fitted into the area behind the mainboard in many computer cases, making for very clean cable management. It also allows for increased efficiency.

As PSUs tend to survive at least a few system upgrades, it could be argued that from an environmental perspective, generating the minor rails on the mainboard is undesirable. Perhaps the least desirable aspect of ATX12VO is that it reduces the modular nature of ATX-style computers, making them more like notebook-style systems. Instead, a more reasonable approach might be a CONNECT-like module that offers both ATX 24-pin and ATX12VO-style 10-pin connectivity.

Thinking larger

In the larger scheme of power efficiency it can be beneficial to take a few steps back from details like the innards of a computer system and look at e.g. the mains alternating current (AC) that powers these systems. A well-known property of switching mode power supplies (SMPS) like those used in any modern computer is that they’re more efficient at higher AC input voltages.

Power supply efficiency at different input voltages. (Credit: HP)

This can be seen clearly in the rating levels for 80 Plus certification: at 230 VAC line voltage a PSU is significantly more efficient than at 120 VAC. To this one can add the resistive losses from carrying nearly double the current over the house wiring for the same power draw at 120 VAC compared to 230 VAC. This is the reason why data centers in North America generally run on 208 VAC, according to this APC white paper.

For crypto miners and similar, wiring up their computer room for 240 VAC (North American hot-neutral-hot) is also a popular topic, as it directly boosts their profits.
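The wiring-loss point is plain I²R arithmetic: for the same power draw, 120 VAC means nearly double the current of 230 VAC, and resistive loss grows with the square of the current. A rough sketch, where the 0.5 Ω round-trip wiring resistance is an assumed figure for illustration, not a measured value:

```python
# P = V * I, so I = P / V; the loss in the wiring itself is P_loss = I^2 * R.
# The wire resistance below is a made-up round number for illustration.

def line_loss_w(load_w: float, line_v: float, wire_ohms: float = 0.5) -> float:
    """Power dissipated in the supply wiring for a given load and line voltage."""
    current_a = load_w / line_v
    return current_a ** 2 * wire_ohms

load = 600.0  # watts drawn by the PC at the wall
for volts in (120.0, 230.0):
    amps = load / volts
    print(f"{volts:.0f} VAC: {amps:.2f} A, {line_loss_w(load, volts):.1f} W lost in wiring")
```

At these numbers the 120 VAC run dissipates about 12.5 W in the wiring against roughly 3.4 W at 230 VAC, a factor of (230/120)² ≈ 3.7 that holds regardless of the assumed resistance.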

Future Outlook

Whether ATX12VO will become the next big thing or fizzle out like BTX and so many other proposed standards is hard to tell. One thing which the ATX12VO standard has against it is definitely that it requires a lot of big changes to happen in parallel, and the creation of a lot of electronic waste through forced upgrades within a short timespan. If we consider that many ATX and SFX-style PSUs are offered with 7-10 year warranties compared to the much shorter lifespan of mainboards, this poses a significant obstacle.

Based on the sounds from the industry, it seems highly likely that much will remain ‘business as usual’. There are many efficient ATX v2.x PSUs out there, including 80 Plus Platinum and Titanium rated ones, and Seasonic’s CONNECT and similar solutions would appeal heavily to those who are into neat cable management. For those who buy pre-built systems, the use of ATX12VO is also not relevant, so long as the hardware is compliant to all (efficiency) regulations. The ATX v2.x standard and 80 Plus certification are also changing to set strict 2-10% load efficiency targets, which is the main target with ATX12VO.

What would be the point for you to switch to ATX12VO, and would you pick it over a solution like Seasonic CONNECT if both offered the same efficiency levels?

(Heading image: Asrock Z490 Phantom Gaming 4SR with SATA power connected, credit: c’t)

Wed, 03 Aug 2022 11:59:00 -0500 Maya Posch en-US text/html https://hackaday.com/2021/06/07/intels-atx12vo-standard-a-study-in-increasing-computer-power-supply-efficiency/