Valid and Latest killexams C9550-400 free pdf

Many candidates who fail the IBM WebSphere Operational Decision Management V8.0, Application Development exam never attempt it again. We recommend that you get our C9550-400 test questions and answers together with the VCE practice test and take the exam again; you will then obtain the highest marks in the C9550-400 exam. That is guaranteed. We supply updated, valid and latest C9550-400 exam prep.

Exam Code: C9550-400 Practice test 2022 by Killexams.com team
IBM WebSphere Operational Decision Management V8.0, Application Development
IBM Operational Study Guide
Taking The Road To Modernizing Today's Mainframe

Milan Shetti, President and CEO, Rocket Software.

With the rising popularity of cloud-based solutions over the last decade, a growing misconception in the professional world is that mainframe technology is becoming obsolete. This couldn’t be further from the truth. In fact, the results of a recent Rocket survey of over 500 U.S. IT professionals found businesses today still rely heavily on the mainframe over cloud-based or distributed technologies to power their IT infrastructures—including 67 of the Fortune 100.

Despite the allure surrounding digital solutions, a recent IBM study uncovered that 82% of executives agree their business case still supports mainframe-based applications. This is partly due to the increase in disruptive events taking place throughout the world—the Covid-19 pandemic, a weakened global supply chain, cybersecurity breaches and increased regulations across the board—leading companies to continue leveraging the reliability and security of the mainframe infrastructure.

However, the benefits are clear, and the need is apparent for organizations to consider modernizing their mainframe infrastructure and implementing modern cloud-based solutions into their IT environment to remain competitive in today’s digital world.

Overcoming Mainframe Obstacles

Businesses leveraging mainframe technology that hasn’t been modernized may struggle to attract new talent to their organization. With the new talent entering the professional market primarily trained on cloud-based software, traditional mainframe software and processes create a skills gap that could deter prospective hires and lead to companies missing out on top-tier talent.

Without modernization, many legacy mainframes lack connectivity with modern cloud-based solutions. Although the mainframe provides a steady, dependable operational environment, it’s well known that the efficiency, accuracy and accessibility modern cloud-based solutions create have helped simplify and improve many operational practices. Mainframe infrastructures that can’t integrate innovative tools—like automation—to streamline processes or provide web and mobile access to remote employees—which has become essential following the pandemic—have become impractical for most business operations.

Considering these impending hurdles, organizations are at a crossroads with their mainframe operations. Realistically, there are three roads a business can choose to journey down. The first is to continue “operating as-is,” which is cost-effective but more or less avoids the issue at hand and positions a company to get left in the dust by its competitors. A business can also “re-platform” or completely remove and replace its current mainframe infrastructure in favor of distributed or cloud models. However, this option can be disruptive, pricey and time-consuming and forces businesses to simply toss out most of their expensive technology investments.

The final option is to “modernize in place.” Modernizing in place allows businesses to continue leveraging their technology investments through mainframe modernization. It’s the preferred method of IT professionals—56% compared to 27% continuing to “operate as-is” and 17% opting to “re-platform”—because it’s typically cost-efficient, less disruptive to operations and improves the connectivity and flexibility of the IT infrastructure.

Most importantly, modernizing in place lets organizations integrate cloud solutions directly into their mainframe environment. In this way, teams can seamlessly transition into a more efficient and sustainable hybrid cloud model that helps alleviate the challenges of the traditional mainframe infrastructure.

Modernizing In Place With A Hybrid Cloud Strategy

With nearly three-quarters of executives from some of the largest and most successful businesses in agreement that mainframe-based applications are still central to business strategy, the mainframe isn’t going anywhere. And with many organizations still opting for mainframe-based solutions for data-critical operating systems—such as financial management, customer transaction systems of record, HR systems and supply chain data management systems—mainframe-based applications are actually expected to grow over the next two years. That’s why businesses must look to leverage their years of technology investments alongside the latest tools.

Modernizing in place with a hybrid cloud strategy is one of the best paths for an enterprise to meet the evolving needs of the market and its customers while simultaneously implementing an efficient and sustainable IT infrastructure. It lets companies leverage innovative cloud solutions in their tech stack that help bridge the skills gap to entice new talent while making operations accessible for remote employees.

The integration of automated tools and artificial intelligence capabilities in a hybrid model can help eliminate many manual processes to reduce workloads and Improve productivity. The flexibility of a modernized hybrid environment can also allow teams to implement cutting-edge processes like DevOps and CI/CD testing into their operations, helping ensure a continuously optimized operational environment.

With most IT professionals in agreement that hybrid is the answer moving forward, it’s clear that more and more businesses that work within mainframe environments will begin to migrate cloud solutions into their tech stack. Modernizing in place with a hybrid cloud strategy is one great way for businesses to meet market expectations while positioning themselves for future success.




Mon, 25 Jul 2022 | Milan Shetti | https://www.forbes.com/sites/forbestechcouncil/2022/07/25/taking-the-road-to-modernizing-todays-mainframe/
IBM report: Data breach costs up, contributing to inflation

The “2022 Cost of a Data Breach Report” found 60 percent of studied organizations raised their product or service prices because of a breach. The report analyzed 550 organizations that suffered a data breach between March 2021 and March 2022, with research conducted by the Ponemon Institute.

IBM has studied data breaches in the United States for the last 17 years. In 2021, the average cost of a breach was $4.24 million.

New to this year’s report was a look at the effects of supply chain compromises and the security skills gap. While organizations that were breached because of a supply chain compromise were relatively low (19 percent), the average total cost of such a breach was $4.46 million.

The average time to identify and contain a supply chain compromise was 303 days, as opposed to the global average of 277 days.

The study found the average data breach cost savings of a sufficiently staffed organization was $550,000, but only 38 percent of studied organizations said their security team was sufficiently staffed.

Of note, the “Cost of Compliance Report 2022” published by Thomson Reuters Regulatory Intelligence earlier this month found staff shortages have been driven by rising salaries, tightening budgets, and personal liability increases.

The IBM study included 13 companies that experienced data breaches involving the loss or theft of 1 million to 60 million records. The average total cost for breaches of 50-60 million records was $387 million, a slight decline from $401 million in 2021.

For a second year, the study examined how deploying a “zero trust” security framework has a net positive impact on data breach costs, with savings of approximately $1 million for organizations that implemented one. However, only 41 percent of organizations surveyed deployed a zero trust security architecture.

Organizations with mature deployment of zero trust applied consistently across all domains saved more than $1.5 million on average, according to the survey.

Almost 80 percent of critical infrastructure organizations that did not adopt a zero trust strategy saw average breach costs rise to $5.4 million.

The study also found it doesn’t pay to pay hackers, with only $610,000 less in average breach costs compared to businesses that chose not to pay ransomware threat actors.

Organizations that fully deployed security artificial intelligence and automation incurred $3.05 million less on average in breach costs compared to those that did not, the biggest cost saving observed in the study.

“Businesses need to put their security defenses on the offense and beat attackers to the punch,” said Charles Henderson, global head of IBM Security X-Force, in a press release announcing the study. “It’s time to stop the adversary from achieving their objectives and start to minimize the impact of attacks.”

Thu, 28 Jul 2022 | https://www.complianceweek.com/cybersecurity/ibm-report-data-breach-costs-up-contributing-to-inflation/31909.article
Warehouse Management System Global Market Report 2022

DUBLIN, July 25, 2022 /PRNewswire/ -- The "Warehouse Management System Global Market Report 2022" report has been added to ResearchAndMarkets.com's offering.

This report provides strategists, marketers and senior management with the critical information they need to assess the global warehouse management system market.

This report focuses on the warehouse management system market, which is experiencing strong growth. The report gives a guide to the warehouse management system market, which will be shaping and changing our lives over the next ten years and beyond, including the market's response to the challenge of the global pandemic.

Reasons to Purchase

  • Gain a truly global perspective with the most comprehensive report available on this market covering 12+ geographies.
  • Understand how the market is being affected by the coronavirus and how it is likely to emerge and grow as the impact of the virus abates.
  • Create regional and country strategies on the basis of local data and analysis.
  • Identify growth segments for investment.
  • Outperform competitors using forecast data and the drivers and trends shaping the market.
  • Understand customers based on the latest market research findings.
  • Benchmark performance against key competitors.
  • Utilize the relationships between key data sets for superior strategizing.
  • Suitable for supporting your internal and external presentations with reliable, high-quality data and analysis.

Major players in the warehouse management system market are Manhattan Associates, Oracle Corp., Infor, PTC, SAP SE, PSI Logistics GmbH, IBM Corp., Tecsys, Blue Yonder, Honeywell International Inc, Technology Solutions (UK) Ltd, HighJump Software, Synergy Ltd, Made4net and JDA Software Group Inc.

The global warehouse management system market is expected to grow from $2.39 billion in 2021 to $2.74 billion in 2022 at a compound annual growth rate (CAGR) of 14.77%. The growth is mainly due to the companies rearranging their operations and recovering from the COVID-19 impact, which had earlier led to restrictive containment measures involving social distancing, remote working, and the closure of commercial activities that resulted in operational challenges. The market is expected to reach $4.83 billion in 2026 at a CAGR of 15.15%.
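
As a rough, illustrative check of how those compound annual growth rate (CAGR) figures relate to the dollar values quoted above, the short Python sketch below recomputes them from the reported market sizes; the small differences from the published 14.77% and 15.15% come down to rounding in the reported figures, and the snippet is not part of the report itself.

def cagr(start_value: float, end_value: float, years: float) -> float:
    # Compound annual growth rate as a fraction (0.15 == 15%).
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Market sizes quoted in the report, in USD billions.
print(f"2021 -> 2022: {cagr(2.39, 2.74, 1):.2%}")   # roughly 14.6%, reported as 14.77%
print(f"2022 -> 2026: {cagr(2.74, 4.83, 4):.2%}")   # roughly 15.2%, reported as 15.15%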

The warehouse management system market consists of sales of warehouse management services by entities (organizations, sole traders and partnerships) which are used by companies to manage and control daily warehouse operations, from the moment goods and materials enter a distribution or fulfilment centre until the moment they leave. Warehouse management systems include inbound logistics and outbound logistics tools for picking and packing processes, resource utilization, analytics, and others.

The main warehouse management system offerings include software and services. Warehouse management system software is used to control and manage daily warehouse operations: it directs inventory management, the picking and shipping of orders, and automatically guides users on which items to pick and ship.

The different warehouse management system deployment modes include on premises and cloud. The warehouse management system functions include labor management system, analytics and optimization, billing and yard management and systems integration and maintenance, which are used for applications in transportation and logistics, healthcare, retail, manufacturing, food and beverage and other applications.

North America was the largest region in the warehouse management system market in 2021. Asia Pacific is expected to be the fastest-growing region in the forecast period. The regions covered in this report are Asia-Pacific, Western Europe, Eastern Europe, North America, South America, Middle East and Africa.

Increasing demand from e-commerce companies for larger warehouses with better tracking and forecasting is expected to drive the warehouse management system market. The growing e-commerce industry requires continuous tracking of all the equipment and inventory forecasting to keep up with demand and maintain larger cargo movement.

For instance, a study from research firm Knight Frank reported that annual warehousing transactions in India are expected to increase from 31.7 million square feet in 2021 to 76.2 million square feet in 2026. Therefore, increasing demand from e-commerce companies is expected to boost the market during the forecast period.

Technological advancement is a key trend gaining popularity in the warehouse management system market. Technological advancement is a discovery of knowledge that advances technology. For instance, in May 2020, a US-based provider of technology solutions for distribution centers launched the Manhattan Active Warehouse Management solution, which marks the world's first cloud-native enterprise-class warehouse management system (WMS). The new warehouse management system unifies every aspect of distribution and contains unified control, which allows management team members to quickly visualize, diagnose and take action anywhere in their supply chain.

The countries covered in the warehouse management system market report are Australia, Brazil, China, France, Germany, India, Indonesia, Japan, Russia, South Korea, UK, USA.

Key Topics Covered:

1. Executive Summary

2. Warehouse Management System Market Characteristics

3. Warehouse Management System Market Trends And Strategies

4. Impact Of COVID-19 On Warehouse Management System

5. Warehouse Management System Market Size And Growth
5.1. Global Warehouse Management System Historic Market, 2016-2021, $ Billion
5.1.1. Drivers Of The Market
5.1.2. Restraints On The Market
5.2. Global Warehouse Management System Forecast Market, 2021-2026F, 2031F, $ Billion
5.2.1. Drivers Of The Market
5.2.2. Restraints On the Market

6. Warehouse Management System Market Segmentation
6.1. Global Warehouse Management System Market, Segmentation By Offering, Historic and Forecast, 2016-2021, 2021-2026F, 2031F, $ Billion

6.2. Global Warehouse Management System Market, Segmentation By Deployment, Historic and Forecast, 2016-2021, 2021-2026F, 2031F, $ Billion

6.3. Global Warehouse Management System Market, Segmentation By Function, Historic and Forecast, 2016-2021, 2021-2026F, 2031F, $ Billion

  • Labor Management System
  • Analytics And Optimization
  • Billing And Yard Management
  • Systems Integration And Maintenance

6.4. Global Warehouse Management System Market, Segmentation By Application, Historic and Forecast, 2016-2021, 2021-2026F, 2031F, $ Billion

  • Transportation And Logistics
  • Healthcare
  • Retail
  • Manufacturing
  • Food And Beverage
  • Other Applications

7. Warehouse Management System Market Regional And Country Analysis
7.1. Global Warehouse Management System Market, Split By Region, Historic and Forecast, 2016-2021, 2021-2026F, 2031F, $ Billion
7.2. Global Warehouse Management System Market, Split By Country, Historic and Forecast, 2016-2021, 2021-2026F, 2031F, $ Billion

For more information about this report visit https://www.researchandmarkets.com/r/hy0wjz

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

Logo: https://mma.prnewswire.com/media/539438/Research_and_Markets_Logo.jpg

View original content:https://www.prnewswire.com/news-releases/warehouse-management-system-global-market-report-2022-301592330.html

SOURCE Research and Markets


Sun, 24 Jul 2022 | https://www.benzinga.com/pressreleases/22/07/n28189755/warehouse-management-system-global-market-report-2022
Average data breach costs reach an all-time high

In brief: The average cost of an enterprise data breach has reached an all-time high and more often than not, companies raise the price of products or services after a breach to make up for the loss.

In its annual Cost of a Data Breach Report, IBM Security said the global average cost of a data breach is $4.35 million. That's an increase of 2.6 percent from $4.24 million last year and is up 12.7 percent from $3.86 million in the 2020 report. Worse yet, 60 percent of organizations that participated in the study said decisions to raise prices were directly related to security breaches.

Note that this is only the average. Looking at the outliers, we see that those operating in healthcare experienced the costliest breaches for the 12th year in a row with a record average of $10.1 million per incident.

Few will probably be surprised to learn that 83 percent of organizations have experienced more than one data breach in their lifetime. This is no doubt due in part to the fact that 62 percent of those studied felt they are not sufficiently staffed to meet their security needs.

As for attack vectors, IBM noted that 19 percent of breaches resulted from stolen or compromised credentials. Phishing campaigns led to 16 percent of incidents and were the costliest, leading to an average breach cost of $4.91 million. Misconfigured cloud servers caused 15 percent of breaches.

Speaking of the cloud, the study further found that 45 percent of breaches occurred in the cloud. Hybrid cloud environments experienced the lowest average breach cost at $3.8 million compared to organizations using public or private models at $5.02 million and $4.24 million on average, respectively.

Another interesting metric involves ransomware. Businesses that paid ransom demands reported an average of $610,000 less in breach costs compared to those that decided not to pay, but that figure didn't include the ransom amount paid. When factoring in last year's average ransom of $812,360, the pendulum swings the other way and businesses that complied with ransom demands ended up paying more overall in breach costs.
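
To make the trade-off in that last point concrete, the back-of-envelope Python sketch below nets the two averages quoted above against each other; it is purely illustrative and assumes the $610,000 saving and the $812,360 average ransom can simply be combined, which the report itself only implies.

# Averages quoted above, in USD.
breach_cost_saving_if_paid = 610_000    # lower average breach cost for orgs that paid
average_ransom_2021        = 812_360    # last year's average ransom payment

net_effect = breach_cost_saving_if_paid - average_ransom_2021
print(f"Net effect of paying: {net_effect:,} USD")   # about -202,360 USD, i.e. worse off on average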

IBM commissioned Ponemon Institute to study 550 organizations across 17 countries and 17 industries between March 2021 and March 2022 to gather data for the report.

Image credit: Pixabay

Wed, 27 Jul 2022 | Shawn Knight | https://www.techspot.com/news/95446-average-data-breach-costs-reach-all-time-high.html
Nano Electronics Market To Witness Massive Revenue In Upcoming Years 2022-2029 | Everspin Technologies, IBM, IMEC

Nano Electronics

Nano Electronics Market 2022 research provides accurate economic, global, and country-level predictions and analyses. It provides a comprehensive perspective of the competitive market as well as an in-depth supply chain analysis to assist businesses in identifying major changes in industry practices. The market report also examines the current state of the Nano Electronics industry, as well as predicted future growth, technological advancements, investment prospects, market economics, and financial data. This study does a thorough examination of the market and offers insights based on an industry SWOT analysis. 

A sample report can be viewed (use a corporate email ID to get higher priority) at: https://www.worldwidemarketreports.com/sample/817916

The report on the Nano Electronics Market provides access to critical information such as market growth drivers, market growth restraints, current market trends, the market’s economic and financial structure, and other key market details. The research process is used to find, locate, access, and analyze the information available to estimate the overall size and general scenario of the Nano Electronics market. This is done based on a substantial amount of primary and secondary research.

Top Global Key Player Mentioned

❋ Everspin Technologies
❋ IBM
❋ IMEC
❋ HP and OD Vision

Type Segment Analysis

❋ Aluminum Oxide Nanoparticles
❋ Carbon Nanotubes
❋ Copper Oxide Nanoparticles
❋ Gold Nanoparticles
❋ Iron Oxide Nanoparticles
❋ Others

Application Segment Analysis: 

❋ Transistors
❋ Integrated Circuits
❋ Photonics
❋ IOT and wearable Devices
❋ Electronic textile
❋ Others

Get PDF Brochure Here by Clicking: https://www.worldwidemarketreports.com/sample/817916

Main points Covered in Nano Electronics Market Report:

❋ Overview
The worldwide Nano Electronics market study offers insight into the market’s current state and forecast period. The data in the study is useful for making marketing decisions, determining whether to enter a market and determining the financial standing of the major companies that have been active in it for a while.
❋ Drivers: 
An increasing number of new technological advancements is estimated to augment the growth of the global and Asia Nano Electronics market over the forecast period.
❋ Opportunities
With accuracy and dependability, the study projects the market shares of significant Nano Electronics Market segments. Participants in the industry may use this study to guide strategic investments in the Nano Electronics Market’s high-growth sectors. Additionally, it helps to identify the target audience and strategize marketing to seize the opportunities at the right time.
❋ Regional Analysis
North America (U.S.; Canada; Mexico), Europe (Germany; U.K.; France; Italy; Russia; Spain.), Asia-Pacific (China; India; Japan; Southeast Asia.), South America (Brazil; Argentina.), Middle East & Africa (Saudi Arabia; South Africa.)

Discount on various license types on immediate purchase (Use corporate email ID Get Higher Priority):  https://www.worldwidemarketreports.com/discount/817916

Key Questions Answered in Study on Global Nano Electronics Market:
❋ What growth will the global Nano Electronics market achieve over the forecast years through 2029?
❋ What are factors influencing the growth of the global Nano Electronics market positively and negatively?
❋ What are the opportunities that might help to overcome the growth-restraining factors?
❋ Which region is estimated to hold a substantial share in the next few years?
❋ Which factors would create threats to the thriving businesses in developing economies over the forecast period?
❋ Which are the leading companies operating in the global Nano Electronics market? What strategies have they adopted to maintain a stronghold on the market?

To purchase this premium report, click here:  https://www.worldwidemarketreports.com/buy/817916

Contact Us:
Worldwide Market Reports,
Tel: U.S. +1-415-871-0703
U.K. +44-203-289-4040
Japan +81-50-5539-1737
Email: [email protected] 
Website: https://www.worldwidemarketreports.com/ 

Wed, 27 Jul 2022 | Coherent Market Insights | https://www.digitaljournal.com/pr/nano-electronics-market-to-witness-massive-revenue-in-upcoming-years-2022-2029-everspin-technologies-ibm-imec
IBM Touts AI, Hybrid Cloud: ‘Demand For Our Solutions Remains Strong’

Joseph F. Kovar

‘Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth,’ says IBM CEO Arvind Krishna.

A strengthening IT environment that is playing into IBM AI and hybrid cloud capabilities means a rosy future for IBM and its B2B business, CEO Arvind Krishna told investors Monday.

Krishna, in his prepared remarks for IBM’s second fiscal quarter 2022 financial analyst conference call, said that technology serves as a fundamental source of competitive advantage for businesses.

“It serves as both a deflationary force and a force multiplier, and is especially critical as clients face challenges on multiple fronts from supply chain bottlenecks to demographic shifts,” he said. “Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth.”

That plays well with IBM’s hybrid cloud and AI strategy where the company is investing in its offerings, technical talent, ecosystem, and go-to-market model, Krishna said.

“Demand for our solutions remains strong,” he said. “We continued to have double-digit performance in IBM Consulting, broad-based strength in software, and with the z16 [mainframe] platform launch, our infrastructure business had a good quarter. By integrating technology and expertise from IBM and our partners, our clients will continue to see our hybrid cloud and AI solutions as a crucial source of business opportunity and growth.”

Krishna said hybrid clouds are about offering clients a platform to straddle multiple public clouds, private clouds, on-premises infrastructures, and the edge, which is where Red Hat, which IBM acquired in 2019, comes into play.

“Our software has been optimized to run on that platform, and includes advanced data and AI, automation, and the security capabilities our clients need,” he said. “Our global team of consultants offers deep business expertise and co-creates with clients to accelerate their digital transformation journeys. Our infrastructure allows clients to take full advantage of an extended hybrid cloud environment.”

As a result, IBM now has over 4,000 hybrid cloud platform clients, with over 250 new clients added during the second fiscal quarter, Krishna said.

“Those who adopt our platform tend to consume more of our solutions across software, consulting, and infrastructure, [and] expanding our footprint within those clients,” he said.

IBM is also benefitting from the steady adoption by businesses of artificial intelligence technologies as those businesses try to process the enormous amount of data generated from hybrid cloud environments all the way to the edge, Krishna said. An IBM study released during the second fiscal quarter found that 35 percent of companies are now using some form of AI with automation in their business to address demographic shifts and move their employees to higher value work, he said.

“This is one of the many reasons we are investing heavily in both AI and automation,” he said. “These investments are paying off.”

IBM is also moving to develop leadership in quantum computing, Krishna said. The company currently has a 127-qubit quantum computer in its cloud, and is committed to demonstrating the first 400-plus-qubit system before year-end as part of its path to deliver a 1,000-plus-qubit system next year and a 4,000-plus-qubit system in 2025, he said.

“One of the implications of quantum computing will be the need to change how information is encrypted,” he said. “We are proud that technology developed by IBM and our collaborators has been selected by NIST (National Institute of Standards and Technology) as the basis of the next generation of quantum-safe encryption protocols.”

IBM during the quarter also moved forward in its mainframe technology with the release of its new z16 mainframe, Krishna said.

“The z16 is designed for cloud-native development, cybersecurity resilience, [and] quantum-safe encryption, and includes an on-chip AI accelerator, which allows clients to reduce fraud within real-time transactions,” he said.

IBM also made two acquisitions during the quarter related to cybersecurity, Krishna said. The first was Randori, an attack surface management and offensive cybersecurity provider. That acquisition built on IBM’s November acquisition of ReaQta, an endpoint security firm, he said.

While analysts during the question and answer part of Monday’s financial analyst conference call did not ask about the news that IBM has brought in Matt Hicks as the new CEO of Red Hat, they did seem concerned about how the 17-percent growth in Red Hat revenue over last year missed expectations.

When asked about Red Hat revenue, Krishna said IBM feels very good about the Red Hat business and expect continued strong demand.

“That said, we had said late last year that we expect growth in Red Hat to be in the upper teens,” he said. “That expectation is what we are going to continue with. … Deferred revenue accounts for the bulk of what has been the difference in the growth rates coming down from last year to this year.”

IBM CFO James Kavanaugh followed by saying that while IBM saw 17 percent growth overall for Red Hat, the company took market share with its core RHEL (Red Hat Enterprise Linux) and in its Red Hat OpenShift hybrid cloud platform foundation. Red Hat OpenShift revenue is now four-and-a-half times the revenue before IBM acquired Red Hat, and Red Hat OpenShift bookings were up over 50 percent, Kavanaugh said.

“So we feel pretty good about our Red Hat portfolio overall. … Remember, we‘re three years into this acquisition right now,” he said. “And we couldn’t be more pleased as we move forward.”

When asked about the potential impact from an economic downturn, Krishna said IBM’s pipelines remain healthy and consistent with what the company saw in the first half of fiscal 2022, making him more optimistic than many of his peers.

“In an inflationary environment, when clients take our technology, deploy it, leverage our consulting, it acts as a counterbalance to all of the inflation and all of the labor demographics that people are facing all over the globe,” he said.

Krishna also said IBM’s consulting business is less likely than most vendors’ business to be impacted by the economic cycle, as it involves a lot of work around deploying the kinds of applications critical to clients’ need to optimize their costs. Furthermore, because consulting is very labor-intensive, it is easy to hire or let go of tens of thousands of employees as needed, he said.

The Numbers

For its second fiscal quarter 2022, which ended June 30, IBM reported total revenue of $15.5 billion, up about 9 percent from the $14.2 billion the company reported for its second fiscal quarter 2021.

This includes software revenue of $6.2 billion, up from $5.9 billion; consulting revenue of $4.8 billion, up from $4.4 billion; infrastructure revenue of $4.2 billion, up from $3.6 billion; financing revenue of $146 million, down from $209 million; and other revenue of $180 million, down from $277 million.

On the software side, IBM reported annual recurring revenue of $12.9 billion, which was up 8 percent over last year. Software revenue from its Red Hat business was up 17 percent over last year, while automation software was up 8 percent, data and AI software up 4 percent, and security software up 5 percent.

On the consulting side, technology consulting revenue was up 23 percent over last year, applications operations up 17 percent, and business transformation up 16 percent.

Infrastructure revenue growth was driven by hybrid infrastructure sales, which rose 7 percent over last year, and infrastructure support, which grew 5 percent. Hybrid infrastructure revenue saw a significant boost from zSystems mainframe sales, which rose 77 percent over last year.

IBM also reported revenue of $8.1 billion from sales to the Americas, up 15 percent over last year; sales to Europe, Middle East, and Africa of $4.5 billion, up 17 percent; and $2.9 billion to the Asia Pacific area, up 16 percent.

Sales to Kyndryl, which late last year was spun out of IBM, accounted for about 5 percent of revenue, including 3 percent of IBM’s Americas revenue.

IBM also reported net income for the quarter on a GAAP basis of $1.39 billion, or $1.53 per share, up from last year’s $1.33 billion, or $1.47 per share.


Tue, 19 Jul 2022 | https://www.crn.com/news/cloud/ibm-touts-ai-hybrid-cloud-demand-for-our-solutions-remains-strong-
IBM beats quarterly revenue estimates, warns of $3.5 billion forex hit

Reuters

July 19, 2022 / 07:09 AM IST

IT hardware and services company IBM Corp beat quarterly revenue expectations on Monday but warned that the hit from forex for the year could be about $3.5 billion due to a strong dollar.

A hawkish Federal Reserve and heightened geopolitical tensions have driven gains in the dollar against a basket of currencies over the last year, prompting companies with sizeable international operations, including Microsoft and Salesforce, to temper expectations.

IBM now expects a foreign exchange hit to revenue of about 6% this year, Chief Financial Officer James Kavanaugh told Reuters. It had previously forecast a 3% to 4% hit.

Second-quarter revenue was hurt by $900 million due to a stronger U.S. dollar, Kavanaugh said.

Typically, a stronger dollar eats into the profits of companies that have sprawling international operations and convert foreign currencies into dollars.

However, strong demand at its consulting and infrastructure businesses helped IBM post second-quarter revenue of $15.54 billion, beating analysts' average estimate of $15.18 billion, according to Refinitiv data.
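
As a rough consistency check of the currency figures quoted so far (these are back-of-envelope numbers, not anything IBM published), the expected full-year hit of about $3.5 billion at roughly 6% of revenue implies annual revenue in the high-$50-billion range, and the $900 million second-quarter hit works out to a similar share once it is measured against what revenue would have been without the currency drag:

# All figures in USD millions, taken from the article above.
full_year_fx_hit   = 3_500          # ~$3.5 billion expected hit for the year
full_year_fx_share = 0.06           # "about 6%" of revenue

implied_annual_revenue = full_year_fx_hit / full_year_fx_share
print(f"Implied full-year revenue: ~${implied_annual_revenue / 1_000:.1f} billion")   # ~$58.3 billion

q2_revenue_reported = 15_540        # reported Q2 revenue
q2_fx_hit           = 900           # Q2 revenue lost to the stronger dollar
q2_fx_share = q2_fx_hit / (q2_revenue_reported + q2_fx_hit)
print(f"Q2 hit as a share of constant-currency revenue: {q2_fx_share:.1%}")           # ~5.5%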

IBM sees revenue growth continuing, including in regions such as Europe and Asia Pacific, despite geopolitical turmoil and inflationary pressures, Kavanaugh said, echoing words of peer Accenture, which had last month said it does not foresee a pullback in client spending.

The 110-year-old company, whose revenue growth had hit near-stagnation for years, spun off its large and laggard IT-managed infrastructure business last year and placed its hopes on high-growth software and consulting businesses with a focus on the so-called "hybrid cloud". Cloud revenue rose 18% to $5.9 billion.

Reuters

Mon, 18 Jul 2022 | https://www.moneycontrol.com/news/world/ibm-beats-quarterly-revenue-estimates-warns-of-3-5-billion-forex-hit-8847261.html
Why AI is critical to meet rising ESG demands

Could artificial intelligence (AI) help companies meet growing expectations for environmental, social and governance (ESG) reporting? 

Certainly, over the past couple of years, ESG issues have soared in importance for corporate stakeholders, with increasing demands from investors, employees and customers. According to S&P Global, in 2022 corporate boards and government leaders “will face rising pressure to demonstrate that they are adequately equipped to understand and oversee ESG issues — from climate change to human rights to social unrest.”

ESG investing, in particular, has been a big part of this boom: Bloomberg Intelligence found that ESG assets are on track to exceed $50 trillion by 2025, representing more than a third of the projected $140.5 trillion in total global assets under management. Meanwhile, ESG reporting has become a top priority that goes beyond ticking off regulatory boxes. It’s used as a tool to attract investors and financing, as well as to meet expectations of today’s consumers and employees.  

But according to a recent Oracle ESG global study, 91% of business leaders are currently facing major challenges in making progress on sustainability and ESG initiatives. These include finding the right data to track progress, and time-consuming manual processes to report on ESG metrics.

“A lot of the data that needs to be collected either doesn’t exist yet or needs to come from many systems,” said Sem J. de Spa, senior manager of digital risk solutions at Deloitte. “It’s also way more complex than just your company, because it’s your suppliers, but also the suppliers of your suppliers.” 

ESG data challenges driving use of AI

That is where AI has increasingly become part of the ESG equation. AI can help manage data, glean data insights, operationalize data and report against it, said Christina Shim, VP of strategy and sustainability, AI applications software at IBM. 

“We need to make sure that we’re gathering the mass amounts of data when they’re in completely different silos, that we’re leveraging that data to Improve operations within the business, that we’re reporting that data to a variety of stakeholders and against a very confusing landscape of ESG frameworks,” she said. 

According to Deloitte, although a BlackRock survey found that 92% of S&P companies were reporting ESG metrics by the end of 2020, 53% of global respondents cited “poor quality or availability of ESG data and analytics” and another 33% cited “poor quality of sustainability investment reporting” as the two biggest barriers to adopting sustainable investing. 

Making progress is a must, experts say. “Increasingly, these ESG and sustainability commitments are no longer simply nice to have,” said Shim. “It’s really becoming kind of like a basis of what organizations need to be focused on and there are increasingly higher standards that have to be integrated into the operations of all businesses,” she explained. 

“The challenge is huge, especially as new regulations and standards emerge and ESG requirements are under more scrutiny,” said De Spa. This has led to hundreds of technology vendors flooding the market that use AI to help tackle these issues. “We need all of them, at least a lot of them, to solve these challenges,” he said.

The human-AI ESG connection

On top of the operational challenges around ESG, the Oracle study found 96% of business leaders admit human bias and emotion often distract from the end ESG goals. In fact, 93% of business leaders say they would trust a bot over a human to make sustainability and social decisions. 

“We have people who are coming up now who are hardwired for ESG,” said Pamela Rucker, CIO advisor and instructor for Harvard Professional Development, who helped put together the Oracle study. “The idea that they would trust a computer isn’t different for them. They already trust a computer to guide them to work, to give them directions, to tell them where the best prices are.” 

But, she added, humans can work with technology to create more meaningful change and the survey also found that business leaders believe there is still a place for humans in ESG efforts, including managing making changes (48%), educating others (46%), and making strategic decisions (42%). 

“Having a machine that might be able to sift through some of that data will allow the humans to come in and look at places where they can add some context around places where we might have some ambiguity, or we might have places where there’s an opportunity,” said Rucker. “AI gives you a chance to see more of that data, and you can spend more time trying to come up with the insights.” 

How companies can get started with AI and ESG

Seth Dobrin, chief AI officer at IBM, told VentureBeat that companies should get started now on using AI to harness ESG data. “Don’t wait for additional regulations to come,” he said. 

Getting a handle on data is essential as companies begin their journey towards bringing AI technologies into the mix. “You need a baseline to understand where you are, because you can make all the goals and imperatives, you can commit to whatever you want, but until you know where you are, you’re never gonna figure out how to get to where you need to get to,” he said. 

Dobrin said he also sees organizations moving from a defensive, risk management posture around ESG to a proactive approach that is open to AI and other technologies to help. 

“It’s still somewhat of a compliance exercise, but it’s shifting,” he said. “Companies know they need to get on board and think proactively so that they are considered a thought leader in the space and not just a laggard doing the bare minimum.” 

One of the key areas IBM is focusing on, he added, is helping clients connect their ESG data and the data monitoring with the real operations of the business. 

“If we’re thinking about business facilities and assets, infrastructure and supply chain as something that’s relevant across industries, all the data that’s being sourced needs to be rolled up and integrated with data and process flows within the ESG reporting and management piece,” he said. “You’re sourcing the data from the business.” 

Deloitte works with Signal AI on ESG efforts

Deloitte recently partnered with Signal AI, which offers AI-powered media intelligence, to help the consulting firm’s clients spot and address supplier risks related to ESG issues. 

“With the rise of ESG and as businesses are navigating a more complex environment than ever before, the world has become awash in unstructured data,” said David Benigson, CEO of Signal AI. “Businesses may find themselves constantly on the back foot, responding to these issues reactively rather than having the sort of data and insights at their fingertips to be at the forefront.” 

The emergence of machine learning and AI, he said, can fundamentally address those challenges. “We can transform data into structured insights that help business leaders and organizations better understand their environment and get ahead of those risks, those threats faster, but also spot those opportunities more efficiently too – providing more of an outside-in perspective on issues such as ESG.” 

He pointed to recent backlash around “greenwashing,” including by Elon Musk (who called ESG a “scam” because Tesla was removed from S&P 500’s ESG Index). “There are accusations that organizations are essentially marking their own homework when it comes to sorting their performance and alignment against these sorts of ESG commitments,” he said. “At Signal, we provide the counter to that – we don’t necessarily analyze what the company says they’re going to do, but what the world thinks about what that company is doing and what that company is actually doing in the wild.” 

Deloitte’s de Spa said the firm uses Signal AI for what it calls a “responsible value chain” – basically, supplier risk management. 

“For example, a sustainable organization that cleans oceans and rivers from all kinds of waste asked us to help them get more insight into their own value chain,” he said. “They have a small number of often small suppliers they are dependent on and you cannot easily keep track of what they’re doing.” With Signal AI, he explained, Deloitte can follow what is happening with those companies to identify if there are any risks – if they are no longer able to deliver, for example, if there is a scandal that puts them out of business, or if the company is causing issues related to sustainability.” 

In one case, Deloitte discovered a company that was not treating their workers fairly. “You can definitely fight greenwashing because you can see what is going on,” he said. “You can leverage millions of sources to identify what is really happening.” 

ESG will need AI and humans going forward

As sustainability and other ESG-related regulations begin to proliferate around the world, AI and smart technology will continue to play a crucial role, said Deloitte’s de Spa. “It’s not just about carbon, or even having a responsible value chain that has a net zero footprint,” he said. “But it’s also about modern slavery and farmers and other social types of things that companies will need to report on in the next few years.” 

Going forward, a key factor will be how to connect and integrate data together using AI, said IBM’s Dobrin. “Many offer a carbon piece or sell AI just for energy efficiency or supply chain transparency,” he said. “But you need to connect all of it together in a one-stop-shop, that will be a total game-changer in this space.” 

No matter what, said Rucker, there is certainly going to be more for AI-driven tools to measure when it comes to ESG. “One of the reasons I get excited about this is because it’s not just about a carbon footprint anymore, and those massive amounts of data mean you’re going to have to have heavy lifting done by a machine,” she said. “I see an ESG future where the human needs the machine and the machine needs the human. I don’t think that they can exist without each other.” 

Tue, 12 Jul 2022 | Sharon Goldman | https://venturebeat.com/2022/07/13/why-ai-is-critical-to-meet-rising-esg-demands/
IBM-owned SXiQ delivers migration for Bega Cheese following acquisition of rival dairy giant Lion Dairy and Drinks

IBM-owned, Melbourne-based cloud integrator SXiQ has completed migration services for Bega Cheese as part of its $560 million acquisition of rival dairy giant Lion Dairy & Drinks.

Bega bought Lion Dairy & Drinks in late 2020, which owns brands such as Dairy Farmers, Yoplait, Big M, Dare, Masters, Juice Brothers, Daily Juice and Farmers Union iced coffee.

Bega Cheese chief information officer Zack Chisholm said in a statement that the Vegemite owner required Lion Dairy & Drinks’ applications, data and processes to be transitioned into its existing and expanded infrastructure.

Chisholm said that infrastructure migration and the transition of 31 physical sites performing production, distribution and administration duties had to be completed within a 12-month window with minimal disruption to the operations of both businesses. 

Bega Cheese’s IT team partnered with SXiQ on end-state design, migration planning and execution for several parts of the project, such as migration of applications, databases and associated backups, and implementation of prod and non-prod AWS accounts and a landing zone to house all LD&D workloads. The work also included implementation of a continuous integration and continuous deployment toolset and workflow built on CloudFormation, Ansible, Jenkins and GitHub.

SXiQ said it also assisted Bega Cheese’s IT team with cloud cost optimisation strategies for efficient consumption of cloud resources to support the newly acquired business, and uplifting the cloud ops team to ensure Bega Cheese IT incorporated true DevSecOps into its core capability to support the new platform.

SXiQ chief executive officer John Hanna said, “Our experts executed deep analysis, strategic thinking, and detailed planning to ensure the successful migration of Lion to Bega Cheese’s existing infrastructure.”

“By uplifting infrastructure, cloud management tooling and practices, SXiQ has enhanced management of Bega Cheese’s cloud assets, improving consistency, security and reducing time to deploy cloud infrastructure in the future.” 

Global tech giant IBM acquired SXiQ late last year for an undisclosed sum to bolster its Consulting’s Hybrid Cloud Services business’ Amazon Web Services and Microsoft Azure consulting capabilities.

Wed, 03 Aug 2022 | https://www.crn.com.au/news/ibm-owned-sxiq-delivers-migration-for-bega-cheese-following-acquisition-of-rival-dairy-giant-lion-dairy-and-drinks-583545
Intel’s ATX12VO Standard: A Study In Increasing Computer Power Supply Efficiency

The venerable ATX standard was developed in 1995 by Intel, as an attempt to standardize what had until then been a PC ecosystem formed around the IBM AT PC’s legacy. The preceding AT form factor was not so much a standard as it was the copying of the IBM AT’s approximate mainboard and with it all of its flaws.

With the ATX standard also came the ATX power supply (PSU), the standard for which defines the standard voltage rails and the function of each additional feature, such as soft power on (PS_ON).  As with all electrical appliances and gadgets during the 1990s and beyond, the ATX PSUs became the subject of power efficiency regulations, which would also lead to the 80+ certification program in 2004.

Starting in 2019, Intel has been promoting the ATX12VO (12 V only) standard for new systems, but what is this new standard about, and will switching everything to 12 V really be worth any power savings?

What ATX12VO Is

As the name implies, the ATX12VO standard is essentially about removing the other voltage rails that currently exist in the ATX PSU standard. The idea is that by providing one single base voltage, any other voltages can be generated as needed using step-down (buck) converters. Since the Pentium 4 era this has already become standard practice for the processor and much of the circuitry on the mainboard anyway.
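
For a concrete sense of what those step-down (buck) converters do, the sketch below uses the ideal continuous-conduction relation D = Vout / Vin, where D is the switching duty cycle; real converters run a slightly higher duty cycle to cover losses, and the 1.2 V rail is only a hypothetical CPU core voltage for illustration, not something specified by ATX12VO.

def ideal_buck_duty_cycle(v_in: float, v_out: float) -> float:
    # Ideal (lossless, continuous-conduction) buck converter: D = Vout / Vin.
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step the voltage down")
    return v_out / v_in

for rail in (5.0, 3.3, 1.2):
    d = ideal_buck_duty_cycle(12.0, rail)
    print(f"12 V -> {rail:>4} V rail: duty cycle of roughly {d:.1%}")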

As the ATX PSU standard moved from the old 1.x revisions into the current 2.x revision range, the -5V rail was removed, and the -12V rail made optional. The ATX power connector with the mainboard was increased from 20 to 24 pins to allow for more 12 V capacity to be added. Along with the Pentium 4’s appetite for power came the new 4-pin mainboard connector, which is commonly called the “P4 connector”, but officially the “+12 V Power 4 Pin Connector” in the v2.53 standard. This adds another two 12 V lines.

Power input and output on the ASRock Z490 Phantom Gaming 4SR, an ATX12VO mainboard. (Credit: Anandtech)

In the ATX12VO standard, the -12 V, 5 V, 5 VSB (standby) and 3.3 V rails are deleted. The 24-pin connector is replaced with a 10-pin one that carries three 12 V lines (one more than ATX v2.x) in addition to the new 12 VSB standby voltage rail. The 4-pin 12 V connectors would still remain, and still require one to squeeze one or two of those through impossibly small gaps in the system’s case to get them to the top of the mainboard, near the CPU’s voltage regulator modules (VRMs).

While the PSU itself would be somewhat streamlined, the mainboard would gain these VRM sections for the 5 V and 3.3 V rails, as well as power outputs for SATA, Molex and similar. Essentially the mainboard would take over some of the PSU’s functions.

Why ATX12VO exists

A range of Dell computers and servers which will be subject to California’s strict efficiency regulations.

The folks over at GamersNexus have covered their research and the industry’s thoughts on the subject of ATX12VO in an article and video that were published last year. To make a long story short, OEM system builders and systems integrators are subject to pretty strong power efficiency regulations, especially in California. Starting in July of 2021, new Tier 2 regulations will come into force that add more strict requirements for OEM and SI computer equipment: see 1605.3(v)(5) (specifically table V-7) for details.

In order to meet these ever more stringent efficiency requirements, OEMs have been creating their own proprietary 12 V-only solutions, as detailed in GamersNexus’ recent video review on the Dell G5 5000 pre-built desktop system. Intel’s ATX12VO standard therefore would seem to be more targeted at unifying these proprietary standards rather than replacing ATX v2.x PSUs in DIY systems. For the latter group, who build their own systems out of standard ATX, mini-ITX and similar components, these stringent efficiency regulations do not apply.

The primary question thus becomes whether ATX12VO makes sense for DIY system builders. While the ability to (theoretically) increase power efficiency especially at low loads seems beneficial, it’s not impossible to accomplish the same with ATX v2.x PSUs. As stated by an anonymous PSU manufacturer in the GamersNexus article, SIs are likely to end up simply using high-efficiency ATX v2.x PSUs to meet California’s Tier 2 regulations.

Evolution vs Revolution

Seasonic’s CONNECT DC-DC module connected to a 12V PSU. (Credit: Seasonic)

Ever since the original ATX PSU standard, the improvements have been gradual and never disruptive. Although some got caught out by the negative voltage rails being left out when trying to power old mainboards that relied on -5 V and -12 V rails being present, in general these changes were minor enough to incorporate these into the natural upgrade cycle of computer systems. Not so with ATX12VO, as it absolutely requires an ATX12VO PSU and mainboard to accomplish the increased efficiency goals.

While the possibility of using an ATX v2.x to ATX12VO adapter exists that passively adapts the 12 V rails to the new 10-pin connector and boosts the 5 VSB line to 12 VSB levels, this actually lowers efficiency instead of increasing it. Essentially, the only way for ATX12VO to make a lot of sense is for the industry to switch over immediately and everyone to upgrade to it as well without reusing non-ATX12VO compatible mainboards and PSUs.

Another crucial point here is that OEMs and SIs are not required to adopt ATX12VO. Much like Intel’s ill-fated BTX alternative to the ATX standard, ATX12VO is a suggested standard that manufacturers and OEMs are free to adopt or ignore at their leisure.

Important here are probably the obvious negatives that ATX12VO introduces:

  • Adding another hot spot to the mainboard and taking up precious board space.
  • Turning mainboard manufacturers into PSU manufacturers.
  • Increasing the cost and complexity of mainboards.
  • Routing peripheral power (including case fans) from the mainboard.
  • Complicating troubleshooting of power issues.

Internals of Seasonic’s CONNECT modular power supply. (Credit: Tom’s Hardware)

Add to this potential alternatives like Seasonic’s CONNECT module. This does effectively the same as the ATX12VO standard, removing the 5 V and 3.3 V rails from the PSU and moving them to an external module, off of the mainboard. It can be fitted into the area behind the mainboard in many computer cases, making for very clean cable management. It also allows for increased efficiency.

As PSUs tend to survive at least a few system upgrades, it could be argued that from an environmental perspective, having the minor rails generated on the mainboard is undesirable. Perhaps the least desirable aspect of ATX12VO is that it reduces the modular nature of ATX-style computers, making them more like notebook-style systems. Instead, a more reasonable solution here might be that of a CONNECT-like solution which offers both an ATX 24-pin and ATX12VO-style 10-pin connectivity option.

Thinking larger

In the larger scheme of power efficiency it can be beneficial to take a few steps back from details like the innards of a computer system and look at e.g. the mains alternating current (AC) that powers these systems. A well-known property of switching mode power supplies (SMPS) like those used in any modern computer is that they’re more efficient at higher AC input voltages.

Power supply efficiency at different input voltages. (Credit: HP)

This can be seen clearly when looking for example at the rating levels for 80 Plus certification. Between 120 VAC and 230 VAC line voltage, the latter is significantly more efficient. To this one can also add the resistive losses from carrying double the amps over the house wiring for the same power draw at 120 V compared to 230 VAC. This is the reason why data centers in North America generally run on 208 VAC according to this APC white paper.
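
To put a number on the resistive-loss part of that argument, the sketch below evaluates the I²R loss in the supply wiring for the same load power at two line voltages; the 600 W load and the 0.1 Ω total wire resistance are hypothetical values chosen only to show the scaling, since halving the current cuts the wiring loss by a factor of four.

def wiring_loss(power_w: float, line_voltage_v: float, wire_resistance_ohm: float) -> float:
    # I^2 * R loss in the supply wiring for a given load power and line voltage.
    current = power_w / line_voltage_v
    return current ** 2 * wire_resistance_ohm

# Hypothetical example: a 600 W load fed through wiring with 0.1 ohm total resistance.
for volts in (120.0, 240.0):
    print(f"{volts:>5.0f} VAC: {wiring_loss(600.0, volts, 0.1):.2f} W lost in the wiring")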

For crypto miners and similar, wiring up their computer room for 240 VAC (North American hot-neutral-hot) is also a popular topic, as it directly boosts their profits.

Future Outlook

Whether ATX12VO will become the next big thing or fizzle out like BTX and so many other proposed standards is hard to tell. One thing which the ATX12VO standard has against it is definitely that it requires a lot of big changes to happen in parallel, and the creation of a lot of electronic waste through forced upgrades within a short timespan. If we consider that many ATX and SFX-style PSUs are offered with 7-10 year warranties compared to the much shorter lifespan of mainboards, this poses a significant obstacle.

Based on the sounds from the industry, it seems highly likely that much will remain ‘business as usual’. There are many efficient ATX v2.x PSUs out there, including 80 Plus Platinum and Titanium rated ones, and Seasonic’s CONNECT and similar solutions would appeal heavily to those who are into neat cable management. For those who buy pre-built systems, the use of ATX12VO is also not relevant, so long as the hardware is compliant to all (efficiency) regulations. The ATX v2.x standard and 80 Plus certification are also changing to set strict 2-10% load efficiency targets, which is the main target with ATX12VO.

What would be the point for you to switch to ATX12VO, and would you pick it over a solution like Seasonic CONNECT if both offered the same efficiency levels?

(Heading image: Asrock Z490 Phantom Gaming 4SR with SATA power connected, credit: c’t)

Wed, 03 Aug 2022 | Maya Posch | https://hackaday.com/2021/06/07/intels-atx12vo-standard-a-study-in-increasing-computer-power-supply-efficiency/