Precisely the same A2090-463 questions as in the real test. Amazing!

Download free A2090-463 dumps to confirm that you understand the A2090-463 practice questions well. Then order the full set of A2090-463 questions and answers with the VCE exam simulator. Memorize the A2090-463 PDF questions, practice with the VCE exam simulator, and feel confident that you will earn a high score on the actual A2090-463 exam.

Exam Code: A2090-463 Practice exam 2022 by Killexams.com team
Assessment- IBM InfoSphere Guardium - Assessment
IBM Assessment- exam format
Killexams : IBM Annual Cost of Data Breach Report 2022: Record Costs Usually Passed On to Consumers, “Long Breach” Expenses Make Up Half of Total Damage

IBM’s annual Cost of Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid,” expenses that are realized more than a year after the attack.

Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.

Security AI and automation greatly reduce expected damage

The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.

Though the average cost of a data breach is up, it is only by about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cyber crime seen during the pandemic years.

Organizations are also increasingly not opting to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.

Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”

Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.

Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”

Rising cost of data breach not necessarily prompting dramatic security action

In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is lagging as well, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.

Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent figure at $812,000 globally.

The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.

Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.

Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”

Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.

Cutting the cost of data breach

Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.

Source: Scott Ikeda, CPO Magazine, 1 August 2022 (https://www.cpomagazine.com/cyber-security/ibm-annual-cost-of-data-breach-report-2022-record-costs-usually-passed-on-to-consumers-long-breach-expenses-make-up-half-of-total-damage/)
Killexams : Three Common Mistakes That May Sabotage Your Security Training

Phishing incidents are on the rise. A report from IBM shows that phishing was the most popular attack vector in 2021, with one in five employees falling victim to phishing techniques.

The Need for Security Awareness Training

Although technical solutions protect against phishing threats, no solution is 100% effective. Consequently, companies have no choice but to involve their employees in the fight against hackers. This is where security awareness training comes into play.

Security awareness training gives companies the confidence that their employees will execute the right response when they discover a phishing message in their inbox.

As the saying goes, "knowledge is power," but the effectiveness of knowledge depends heavily on how it is delivered. When it comes to phishing attacks, simulations are among the most effective forms of training because the events in training simulations directly mimic how an employee would react in the event of an actual attack. Since employees do not know whether a suspicious email in their inbox is a simulation or a real threat, the training becomes even more valuable.

Phishing Simulations: What does the training include?

It is critical to plan, implement and evaluate a cyber awareness training program to ensure it truly changes employee behavior. However, for this effort to be successful, it should involve much more than just emailing employees. Key practices to consider include:

  • Real-life phishing simulations.
  • Adaptive learning - live response and protection from actual cyberattacks.
  • Personalized training based on factors such as department, tenure, and cyber experience level.
  • Empowering and equipping employees with an always-on cybersecurity mindset.
  • Data-driven campaigns.

Because employees cannot tell the difference between phishing simulations and real cyberattacks, it's important to remember that simulations evoke different emotions and reactions, and awareness training should therefore be conducted thoughtfully. As organizations need to engage their employees to combat ever-increasing attacks and protect their assets, it is important to keep morale high and create a positive culture of cyber hygiene.

Three common phishing simulation mistakes

Based on years of experience, cybersecurity firm CybeReady has seen companies fall into these common mistakes.

Mistake #1: Testing instead of educating

The approach of running a phishing simulation as a test to catch and punish "repeat offenders" can do more harm than good.

An educational experience that involves stress is counterproductive and even traumatic. As a result, employees will not go through the training but look for ways to circumvent the system. Overall, the fear-based "audit approach" is not beneficial to the organization in the long run because it cannot provide the necessary training over an extended period.

Solution #1: Be sensitive

Because maintaining positive employee morale is critical to the organization's well-being, provide positive just-in-time training.

Just-in-time training means that once employees have clicked on a link within the simulated attack, they are directed to a short and concise training session. The idea is to quickly educate the employee on their mistake and give them essential tips on spotting malicious emails in the future.

This is also an opportunity for positive reinforcement, so be sure to keep the training short, concise, and positive.
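
To make the just-in-time idea concrete, here is a minimal sketch of the click-handling step: log who clicked, then send them straight to a short lesson. The route, token format, and training URL are illustrative placeholders, not CybeReady's implementation.

```python
# Minimal sketch of a just-in-time training redirect.
# The route, token format, and training URL are illustrative assumptions,
# not any vendor's real API.
from datetime import datetime, timezone

from flask import Flask, redirect, request

app = Flask(__name__)
TRAINING_URL = "https://training.example.com/spotting-phishing-basics"  # hypothetical

@app.route("/t/<campaign_id>/<user_token>")
def simulated_link_clicked(campaign_id: str, user_token: str):
    # Record who clicked which simulation, and when, for later risk reporting.
    app.logger.info(
        "phishing-sim click campaign=%s user=%s at=%s ip=%s",
        campaign_id, user_token,
        datetime.now(timezone.utc).isoformat(), request.remote_addr,
    )
    # Send the employee straight to a short, positive lesson instead of an error page.
    return redirect(TRAINING_URL, code=302)

if __name__ == "__main__":
    app.run(port=8080)
```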

Solution #2: Inform relevant departments.

Communicate with relevant stakeholders to ensure they are aware of ongoing phishing simulation training. Many organizations forget to inform relevant stakeholders, such as HR or other employees, that the simulations are being conducted. Learning has the best effect when participants have the opportunity to feel supported, make mistakes, and correct them.

Mistake #2: Use the same simulation for all employees

It is important to vary the simulations. Sending the same simulation to all employees, especially at the same time, is not only uninstructive but also yields no valid metrics on organizational risk.

The "warning effect" - the first employee to discover or fall for the simulation warns the others. This prepares your employees to respond to the "threat" by anticipating the simulation, thus bypassing the simulation and the training opportunity.

Another negative impact is social desirability bias, which causes employees to over-report incidents to IT, even ones they never actually spotted, in order to be viewed more favorably. This leads to an overloaded system and an overloaded IT department.

This form of simulation also leads to inaccurate results, such as unrealistically low click-through rates and over-reporting rates. Thus, the metrics do not show the real risks of the company or the problems that need to be addressed.

Solution: Drip mode

Drip mode allows sending multiple simulations to different employees at different times. Certain software solutions can even do this automatically by sending a variety of simulations to different groups of employees. It's also important to implement a continuous cycle to ensure that all new employees are properly onboarded and to reinforce that security is important 24/7 - not just checking a box for minimum compliance.
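
A rough sketch of what drip-mode scheduling amounts to is shown below; the template names and the schedule_drip helper are hypothetical, and a real platform would automate the selection and delivery.

```python
# Illustrative drip-mode scheduler: different lures, different people, different
# times. Template names and the scheduling helper are hypothetical, not any
# vendor's real API.
import random
from datetime import datetime, timedelta

TEMPLATES = ["invoice-overdue", "password-reset", "shared-document", "delivery-notice"]

def schedule_drip(employees, start, days=10, seed=7):
    """Return (send_at, employee, template) tuples spread across `days` days."""
    rng = random.Random(seed)
    plan = []
    for employee in employees:
        template = rng.choice(TEMPLATES)             # vary the lure per person
        send_at = start + timedelta(
            days=rng.randrange(days),                # random day in the window
            hours=rng.randint(9, 17),                # business-hours send time
            minutes=rng.randrange(60),
        )
        plan.append((send_at, employee, template))
    return sorted(plan)                              # dispatch in time order

if __name__ == "__main__":
    staff = ["alice@example.com", "bob@example.com", "carol@example.com"]
    for send_at, who, template in schedule_drip(staff, datetime(2022, 8, 1)):
        print(f"{send_at:%Y-%m-%d %H:%M}  {who:<22} {template}")
```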

Mistake #3: Relying on data from a single campaign

With over 3.4 billion phishing attacks per day, it's safe to assume that at least a million of them differ in complexity, language, approach, or even tactics.

Unfortunately, no single phishing simulation can accurately reflect an organization's risk. Relying on a single phishing simulation result is unlikely to provide reliable results or comprehensive training.

Another important consideration is that different groups of employees respond differently to threats, not only because of their vigilance, training, position, tenure, or even education level but because the response to phishing attacks is also contextual.

Solution: Implement a variety of training programs

Behavior change is an evolutionary process and should therefore be measured over time. Each training session contributes to the progress of the training. Training effectiveness, or in other words, an accurate reflection of genuine organizational behavior change, can be determined after multiple training sessions and over time.

The most effective solution is to continuously conduct various training programs (at least once a month) with multiple simulations.

It is highly recommended to train employees according to their risk level. A diverse and comprehensive simulation program also provides reliable measurement data based on systematic behavior over time. To validate their efforts at effective training, organizations should be able to obtain a valid indication of their risk at any given point in time while monitoring progress in risk reduction.
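
As an illustration of what reliable measurement data over time can look like, the sketch below aggregates a made-up log of simulation results into monthly click and report rates per department; the column names are assumptions about what such a log might contain.

```python
# Sketch of turning raw simulation results into a risk trend over time.
# Column names and data are illustrative assumptions.
import pandas as pd

results = pd.DataFrame({
    "sent_at":    pd.to_datetime(["2022-05-03", "2022-05-17", "2022-06-07", "2022-06-21", "2022-07-05"]),
    "department": ["finance", "finance", "finance", "engineering", "engineering"],
    "clicked":    [1, 1, 0, 0, 1],   # 1 = employee clicked the simulated lure
    "reported":   [0, 0, 1, 1, 0],   # 1 = employee reported the email
})

monthly = (
    results
    .assign(month=results["sent_at"].dt.to_period("M"))
    .groupby(["department", "month"])
    .agg(click_rate=("clicked", "mean"), report_rate=("reported", "mean"))
)
print(monthly)  # a falling click_rate and rising report_rate indicate reduced risk
```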

Implement an effective phishing simulation program.

Creating such a program may seem overwhelming and time-consuming. That's why we have created a playbook of the 10 key practices you can use to create a simple and effective phishing simulation. Simply download the CybeReady Playbook or meet with one of our experts for a product demo and learn how CybeReady's fully automated security awareness training platform can help your organization achieve the fastest results with virtually zero IT effort.


Source: The Hacker News, 3 August 2022 (https://thehackernews.com/2022/08/three-common-mistakes-that-may-sabotage.html)
Killexams : IBM's Watson Set To Revolutionize Marketing

When a computer can figure out whether a movie trailer is going to positively affect an audience or not, it makes you wonder how close we are to computer-generated predictions on everything else in life. The short answer, according to Michael Karasick, IBM's VP and Research Director at Almaden Labs, is that IBM's Watson is already making them. Since conquering “Jeopardy” and chess, Watson has been focused on predictive healthcare, customer service, investment advice and culinary pursuits. But they are not stopping there: IBM is allowing select customers to use “Watson as a service” and may soon open it up to developers to build Watson apps.

Yes, the Watson technology is still maturing, but I am convinced that within five years the Watson platform will learn faster and make better predictions with each new field it understands. That's because, as Karasick told me, “If you train a system like Watson on domain A and domain B, then it knows how to make the equivalence between terminologies in different domains.” That means that as Watson solves problems in chemistry, it can generate probable solutions in physics and metallurgy too.

Imagine how this might be applied to marketing. By using Watson as a service, a business could train Watson to understand its customers, then use predictive models to recognize new products or services that their customers will buy.

Here’s how Watson can revolutionize marketing

Predict new trends and shifting tastes

Watson is a voracious consumer of data, and it doesn’t forget anything. You can feed it data from credit cards, sales databases, social networks, location data, web pages and it can compile and categorize that information to make high probability predictions.

And most shockingly, Watson is well ahead of its competitors in sentiment analysis. According to Karasick, Watson can recognize irony and sarcasm - and properly apprehend the intended meaning. That means Watson can quickly analyze large sample sizes to determine whether a movie trailer, product offering or clothing line is going to work with consumers.

Analyze social conversations – generate leads

Most social listening solutions on the market today do an adequate job of giving the marketer signals and reports about their industry, competitors, partners and current customers. But it’s up to the marketer to analyze the information and take action.

As Watson has demonstrated in other domains, it can predict what information is most important and make recommendations on how to act on it. For example, if it finds a cluster of people discussing problems that the marketer's solution solves, Watson can automatically notify the sales team or take action on its own to educate the prospective customers.

Determine whether a new innovation will sell or not

Because Watson can learn from one domain of knowledge and make high probability predictions in another, it’s reasonable to assume that if a company wanted to understand whether a new innovation will sell or not, Watson could analyze a company’s current market and customer base to provide success probabilities.

We’re a long way off from a Watson with the taste of a Steve Jobs, but if it has enough understanding of the situation, it can produce insights that can give companies a clearer picture of the opportunities and threats.

Computer calculated and automated growth hacking

If you’re a marketer and not familiar with growth hacking, please study up fast. Growth hackers focus on innovative A/B testing techniques to maximize conversions on emails, websites, social media, online content or just about any digital media available to them. It’s a low cost but more effective alternative to traditional media.

I can see how Watson could proactively and intelligently test, measure and optimize digital content, ads, website pages, even a company's product, to efficiently maximize customer growth. Andy Johns of Greylock, formerly a growth hacker for Facebook, Twitter and Quora, told me that Facebook conducted 6 hacks a day to maximize growth opportunities. I suspect Watson could easily handle 10 times that amount.
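
For readers new to the mechanics, a conversion A/B test ultimately comes down to comparing two proportions. Here is a minimal sketch with made-up numbers; this is generic statistics, not Facebook's or Watson's tooling.

```python
# Two-proportion z-test for an A/B conversion experiment. Numbers are made up.
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (relative_lift, two_sided_p_value) for variant B vs. variant A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return (p_b - p_a) / p_a, p_value

lift, p = ab_test(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"relative lift: {lift:+.1%}, p-value: {p:.3f}")  # ship B only if the lift holds up
```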

This clearly is the digital march of progress. Watson has the potential to eliminate ineffective marketing, improve good marketing into great marketing, and to predict how to better spend marketing dollars in the future.

Put it all together and you’ve revolutionized marketing.

Source: Mark Fidelman, Forbes (https://www.forbes.com/sites/markfidelman/2013/09/04/ibms-watson-set-to-revolutionize-marketing/)
Killexams : IBM Watson Gets a Factory Job

IBM has launched an Internet of Things system as part of Watson. The tool is called Cognitive Visual Inspection, and the idea is to provide manufacturers with a “cognitive assistant” on the factory floor to minimize defects and increase product quality. According to IBM, in early production-cycle testing, Watson was able to reduce inspection time by 80% while reducing manufacturing defects by 7-10%.

The system uses an ultra-high definition camera and adds cognitive capabilities from Watson to create a tool that captures images of products as they move through production and assembly. Together with human inspectors, Watson recognizes defects in products, including scratches or pinhole-size punctures.

“Watson brings its cognitive capabilities to image recognition,” Bret Greenstein, VP of IoT at IBM, told Design News. “We’re applying this to a wide range of industries, including electronics and automotive.”

The Inspection Eye That Never Tires

The system continuously learns based on human assessment of the defect classifications in the images. The tool was designed to help manufacturers achieve specialization levels that were not possible with previous human or machine inspection. “We created a system and workflow to feed images of good and bad products into Watson and train it with algorithms,” said Greenstein. “This is a system that you can be trained in advance to see what acceptable products look like.”

According to IBM, more than half of product quality checks involve some form of visual confirmation. Visual checking helps ensure that all parts are in the correct location, have the right shape or color or texture, and are free from scratches, holes or foreign particles. Automating these visual checks is difficult due to volume and product variety. Add to that the challenge from defects that can be any size, from a tiny puncture to a cracked windshield on a vehicle.

Some of the inspection training precedes Watson's appearance on the manufacturing line. “There are several components. You define the range of images, and feed the images into Watson. When it produces the confidence level you need, you push it to the operator stations,” said Greenstein. “Watson concludes whether the product is good or defective. You let the system make the decision.”
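
The train-then-decide workflow described here can be pictured with a generic binary classifier. The sketch below uses scikit-learn on synthetic pixel data purely for illustration; it is not IBM's Watson pipeline, and the 0.9/0.1 confidence thresholds are arbitrary assumptions.

```python
# Generic visual-inspection sketch: train on labeled good/defect images, then
# score new images and only act on confident calls. Synthetic data and
# thresholds are illustrative assumptions, not IBM's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in for real photos: 200 flattened 64x64 grayscale images, 0 = good, 1 = defect.
X_train = rng.random((200, 64 * 64))
y_train = rng.integers(0, 2, size=200)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# "Operator station" step: score a freshly captured image.
new_image = rng.random((1, 64 * 64))
p_defect = model.predict_proba(new_image)[0, 1]
if p_defect > 0.9:
    print(f"flag as defective ({p_defect:.2f})")
elif p_defect < 0.1:
    print(f"pass as good ({p_defect:.2f})")
else:
    print(f"route to a human inspector ({p_defect:.2f})")
```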

The ultimate goal is to keep Watson on a continuous learning curve. “We can push this system out to different manufacturing lines, and we can train it based on operators in the field and suggest changes to make the system smarter, creating an evolving inspection process,” said Greenstein.

The ABB Partnership

As part of its move into the factory, IBM has formed a strategic collaboration with ABB. The goal is to combine ABB’s domain knowledge and digital solutions with IBM’s artificial intelligence and machine-learning capabilities. The first two joint industry solutions powered by ABB Ability and Watson were designed to bring real-time cognitive insights to the factory floor and smart grids.

The suite of solutions developed by ABB and IBM is intended to help companies improve quality control, reduce downtime, and increase speed and yield. The goal is to improve on current connected systems that simply gather data. Instead, Watson is designed to use data to understand, sense, reason, and take actions to help industrial workers reduce inefficient processes and redundant tasks.

According to Greenstein, Watson is just getting its industry sea legs. In time, the thinking machine will take on more and more industrial tasks. “We found a wide range of uses. We're working with drones to look at traffic flows in retail situations to analyze things that are hard to see from a human point of view,” said Greenstein. “We're also applying Watson's capabilities to predictive maintenance.”

Rob Spiegel has covered automation and control for 17 years, 15 of them for Design News. Other subjects he has covered include supply chain technology, alternative energy, and cyber security. For 10 years, he was owner and publisher of the food magazine Chile Pepper.

Image courtesy of IBM

Source: Design News, 28 July 2022 (https://www.designnews.com/automation-motion-control/ibm-watson-gets-factory-job)
Killexams : Bare Metal Cloud Market Business overview 2022, and Forecast to 2030 | By -IBM Corporation, Oracle Corporation, Scaleway

Jul 26, 2022 (Heraldkeepers) -- New Jersey, United States- The Bare Metal Cloud Market 2022-2030 report was prepared with the help of industry experts and a well-rounded market study. The investigation reviews the market's current status and future growth potential. A discussion of the top players active in this market is included in the report. A team of analysts and industry experts collected the exclusive data in this report. The assessment provides information on key market drivers, restraints, challenges, and potential opportunities to help stakeholders understand the pulse of the business.

The assessment report includes a Bare Metal Cloud market size analysis, regional and country-level market size analysis, a CAGR estimate of market growth over the forecast period, revenue, key drivers, the competitive landscape, and a sales analysis of the competitors. The study also examines the significant challenges and risks that will be faced during the projection period. The Bare Metal Cloud market is segmented into two categories: type and application. Players, stakeholders, and other market participants will be able to gain an advantage by using the report as a powerful resource.

Receive the sample Report of Bare Metal Cloud Market 2022 to 2030:

The worldwide Bare Metal Cloud market is expected to grow at a booming CAGR over 2022-2030, rising from USD billion in 2021 to USD billion in 2030. The report also shows the importance of the Bare Metal Cloud market's main players, including their business overviews, financial summaries, and SWOT assessments.

Bare Metal Cloud Market Segmentation & Coverage:

Bare Metal Cloud Market segment by Type: 
Hardware, Software

Bare Metal Cloud Market segment by Application: 
Retail, Healthcare, Manufacturing, IT and Telecom, Media and Entertainment

The following years are examined in this study to estimate the Bare Metal Cloud market size:

History Year: 2015-2019
Base Year: 2021
Estimated Year: 2022
Forecast Year: 2022 to 2030

Cumulative Impact of COVID-19 on Market:

The COVID-19 pandemic heavily affected the Bare Metal Cloud market in 2020, which was hampered by application industries' sluggish adoption of current control and collaboration techniques. The widespread reach of the virus brought about lockdowns around the nation, limited work in several existing sectors, temporary halts to end-user activities, and disruptions to supply chains.

Get a sample copy of the Bare Metal Cloud Market Report: https://www.infinitybusinessinsights.com/request_sample.php?id=875538

Regional Analysis:

The report provides a detailed assessment of consumption, revenue, market share, and growth rate in the following regions: North America (the United States, Mexico, and Canada), South America (Brazil, Venezuela, Argentina, Ecuador, Peru, Colombia, etc.), and the Middle East and Africa (South Africa, Saudi Arabia, the United Arab Emirates, Israel, Egypt, and so forth).

The Key companies profiled in the Bare Metal Cloud Market:

The study examines the Bare Metal Cloud market's competitive landscape and includes data on important suppliers, including IBM Corporation, Oracle Corporation, Scaleway, Liquid Web, Joyent, RACKSPACE, Internap Corporation, CenturyLink, BIGSTEP, Packet, Alibaba, Huawei, and others.

Table of Contents:

List of Data Sources:
Chapter 2. Executive Summary
Chapter 3. Industry Outlook
3.1. Bare Metal Cloud Global Market segmentation
3.2. Bare Metal Cloud Global Market size and growth prospects, 2015 – 2026
3.3. Bare Metal Cloud Global Market Value Chain Analysis
3.3.1. Vendor landscape
3.4. Regulatory Framework
3.5. Market Dynamics
3.5.1. Market Driver Analysis
3.5.2. Market Restraint Analysis
3.6. Porter’s Analysis
3.6.1. Threat of New Entrants
3.6.2. Bargaining Power of Buyers
3.6.3. Bargaining Power of Buyers
3.6.4. Threat of Substitutes
3.6.5. Internal Rivalry
3.7. PESTEL Analysis
Chapter 4. Bare Metal Cloud Global Market Product Outlook
Chapter 5. Bare Metal Cloud Global Market Application Outlook
Chapter 6. Bare Metal Cloud Global Market Geography Outlook
6.1. Bare Metal Cloud Industry Share, by Geography, 2022 & 2030
6.2. North America
6.2.1. Market 2022 -2030 estimates and forecast, by product
6.2.2. Market 2022 -2030, estimates and forecast, by application
6.2.3. The U.S.
6.2.3.1. Market 2022 -2030 estimates and forecast, by product
6.2.3.2. Market 2022 -2030, estimates and forecast, by application
6.2.4. Canada
6.2.4.1. Market 2022 -2030 estimates and forecast, by product
6.2.4.2. Market 2022 -2030, estimates and forecast, by application
6.3. Europe
6.3.1. Market 2022 -2030 estimates and forecast, by product
6.3.2. Market 2022 -2030, estimates and forecast, by application
6.3.3. Germany
6.3.3.1. Market 2022 -2030 estimates and forecast, by product
6.3.3.2. Market 2022 -2030, estimates and forecast, by application
6.3.4. the UK
6.3.4.1. Market 2022 -2030 estimates and forecast, by product
6.3.4.2. Market 2022 -2030, estimates and forecast, by application
6.3.5. France
6.3.5.1. Market 2022 -2030 estimates and forecast, by product
6.3.5.2. Market 2022 -2030, estimates and forecast, by application
Chapter 7. Competitive Landscape
Chapter 8. Appendix

Download here the full INDEX of Bare Metal Cloud Market Research Report @

FAQs:
Is the Bare Metal Cloud market suitable for long-term investment?
Which factors are likely to attract attention in the near future?
What is the effect of various factors on the growth of the worldwide Bare Metal Cloud market?
What are the latest regional Bare Metal Cloud market trends, and how successful are they considered to be?

Contact Us:
Amit Jain
Sales Co-Ordinator
International: +1 518 300 3575
Email: inquiry@infinitybusinessinsights.com
Website: https://www.infinitybusinessinsights.com

Source: MarketWatch, 25 July 2022 (https://www.marketwatch.com/press-release/bare-metal-cloud-market-business-overview-2022-and-forecast-to-2030-by--ibm-corporation-oracle-corporation-scaleway-2022-07-26)
Killexams : How England aced their four spectacular Test chases this summer

This article is about the four Tests that were played earlier this English summer. A lot has been written about these amazing matches and how England took a sledgehammer to the conventional Test framework in them. This article is an analytical overview of these games, using measures that I have built over the years.

Let me first provide an overview of the four Tests in a tabular form. People will not have forgotten the numbers, but it is good to have a recap, to jog the memory.

At Lord's, New Zealand won the toss, batted first, and regretted that decision 30 minutes later. They slid to 7 for 3 and 45 for 7, and then recovered somewhat to 132. Not that England did any better, starting well to get to 59 without loss, but losing their way and finishing up with a lead of barely nine runs. Two innings were completed before the first drinks break on the second day. New Zealand recovered after an initial wobble in their second innings to post an impressive 285 and a tough target. England stumbled a few times but won by five wickets, with Joe Root anchoring the chase. England secured a TPP (Team Performance Points, out of 100) margin of 57.2 to New Zealand's 42.8 in the match. The scoring rate was, surprisingly, well above three for the Test.

In the second Test, England, determined to bat last, asked New Zealand to bat, and after 11 hours of hard grind, were staring at an imposing total of over 550. However, they took this as a challenge to be met, and posted a total of 539 themselves. The scoring rate of around four meant that nearly two days' play was available. Batting consistently well, New Zealand set a tough target of nearly 300 in five hours. England switched modes, imagined that there was a white ball being bowled, and got to it in exactly 50 overs. This time, the TPP margin was 58.1 vs 41.9.

At Headingley, New Zealand won the toss, batted, and made the par score of 329, batting circumspectly. It was the first time in the series that the scoring rate of three per over had not been breached. Despite falling to 21 for 4 and 55 for 6, England, through Jonny Bairstow and Jamie Overton, eventually took a lead of 31. New Zealand posted a competitive second-innings total and set yet another tough target. England then switched to white-ball-mode again and made light of the target, winning by seven wickets. England's scoring rate in the Test was an amazing 5.4. The easy win made the TPP comparison a more emphatic 63.9 to New Zealand's 36.1.

Then came a change of opposition, at Edgbaston, but it was business as usual. India, asked to bat, put up an above-average 400-plus score. For the first time in the summer, there was a substantial first-innings lead, as England trailed by 132. No one really pushed on for a big score in India's second innings, though, and they finished on 245. However, that still meant England were set a huge target of 378. Buoyed by three successful chases of near-300 targets, England got to the mark with hours and wickets to spare, scoring at nearly five runs per over. It was a virtual replica of the previous Test, and England won by a TPP margin of 63.5 vs 36.5.

England scored at above five per over in three innings, went past four in six of the innings, and had an overall scoring rate of an impressive 4.6 in these four matches. Their tactics were clear: Let the other team bat and score whatever they can; we will try and match their first-innings score, and if we end up in deficit, it does not matter. We have the bowlers to dismiss them for a reasonable score. And, somewhere on the fourth or fifth day, we have, what Vithushan Ehantharajah called beautifully, the Number. And we will chase. It does not matter if we lose early wickets. We will motor on.

The amazing thing is that this strategy has worked, and how. It can be said that England have thrown down the gauntlet to the other teams, with their tactics and batting, daring them to counter them. And the two teams who came visiting earlier this summer failed.

Now we move on to the details. I will look at the first three innings of each match overall, and at the fourth innings in depth. I will be using a measure that I have developed, called WIP (Win-Percentage). This is the chance of a win for the batting team expressed as a percentage. I determine this at the beginning of each of the four innings. In addition, I determine the value at the fall of each wicket in the third and fourth innings. The methodology is explained below.

First Innings: This is calculated at the beginning of the match, and is based on the relative team strengths. For these four matches, since the three teams were matched very closely, I have pegged the WIP at 50%. If, say, Bangladesh had been the visiting team, this would have been different.

Second Innings: This depends on the first-innings score. The par score is the average first innings for the current period (2011-2022), which is 361. A first-innings score of 361 will have the WIP value at 50%. A higher score will make this below 50 and a lower score will move this to above 50. All values are subject to limits between 5% and 95%.

Third Innings: This depends on whether the team batting third has a lead or is behind, and the margin of the deficit. In general, the greater the lead, the higher the WIP value for the team leading, and vice versa. In addition, a team following on will have their WIP pegged at 5%.

Fourth Innings: This depends on the target that the team has been set. I determine a Base-RpW (Runs per Wicket) value using the formula "0.2*RpW1 + 0.3*RpW2 + 0.5*RpW3" for a normal no-follow-on sequence. A brief explanation: 20% of the other team's first-innings RpW, 30% of own team's first-innings RpW (because this reflects how this team batted first) and 50% of the most recent RpW, that of the third innings (since this will be a clear indicator of how the pitch is behaving). The importance of the last-mentioned RpW will be obvious in matches like the first and second Tests in this article: 132 and 141 improving to 285, and 553 and 539 dropping to 284; the two scores in the 280s take on different hues in different contexts.

Then I determine how many wickets will be needed to reach the fourth-innings target. A requirement of below one wicket gets a WIP of 95%, around nine wickets gets a WIP of 50%, and 20-plus wickets gets a WIP of 5%. The rest are extrapolated between 5% and 95%.

WIPs during third and fourth innings at fall of wickets: A similar method is used. At the fall of, say, the first wicket, the runs required to reach the target are evaluated with the Base RpW and the fact that only nine wickets are available. At the fall of the second wicket, eight wickets, and so on.
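
A minimal sketch of the fourth-innings calculation follows. The straight-line interpolation between the 5%, 50% and 95% anchor points is an assumption made for illustration; the exact mapping behind the published figures is not spelled out above, which is why the Lord's example below lands near, but not exactly on, the 35% quoted later.

```python
# Sketch of the fourth-innings WIP described above. The straight-line
# interpolation between the 5% / 50% / 95% anchors is an assumption.

def base_rpw(rpw1: float, rpw2: float, rpw3: float) -> float:
    """20% of the other team's first-innings RpW, 30% of own first-innings RpW,
    50% of the third-innings RpW."""
    return 0.2 * rpw1 + 0.3 * rpw2 + 0.5 * rpw3

def fourth_innings_wip(target: float, rpw1: float, rpw2: float, rpw3: float) -> float:
    """Win percentage for the chasing side at the start of the fourth innings."""
    wickets_needed = target / base_rpw(rpw1, rpw2, rpw3)
    if wickets_needed <= 1:
        return 95.0
    if wickets_needed >= 20:
        return 5.0
    if wickets_needed <= 9:
        return 95.0 - (wickets_needed - 1) * 45.0 / 8    # 1 wkt -> 95%, 9 wkts -> 50%
    return 50.0 - (wickets_needed - 9) * 45.0 / 11       # 9 wkts -> 50%, 20 wkts -> 5%

# Lord's: NZ 132 and 285 all out, England 141 all out, target 277.
print(round(base_rpw(13.2, 14.1, 28.5), 1))              # 21.1, as in the text
print(round(fourth_innings_wip(277, 13.2, 14.1, 28.5)))  # ~33, close to the 35% quoted
```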

With this introduction, let us move on to the snapshots of each Test, based on WIP values.

When England dismissed New Zealand for 132, their winning chances hit 81%. Then their own poor batting show got them down to 52%. New Zealand's good second-innings showing and the substantial target they set meant that England's chances stood at 35% at the start of the fourth innings. This was based on a Base-RpW of 21.1; the very low RpWs for the two first innings were partly compensated for by the good third-innings value. Over 13 wickets were needed to reach the target. The fall of the first wicket at 31 did not do much damage and the WIP stayed stable. The fall of the second wicket at 32 knocked the WIP down to 28%. After the third wicket it went down to 23% and at 69 for 4, to 19% - the lowest in the chase. The Root-Stokes stand took the score to 159 for 5 and the WIP improved to 44%, still below 50 - which makes sense since only the late-order batters were left. The stand between Root and Ben Foakes stand took them to the win. The high scoring rate meant that there were still 76 overs left to be played.

The second Test ended similarly but the trajectory of the WIP was strikingly different. The imposing New Zealand total of 553 set England's WIP at 24%. England's brave response got it back up to 48%, almost restoring parity. New Zealand's par response in the third innings led to an above average Base-RpW of 41.4, indicating that the chase was on. The relatively low target (in the context of the scores in the match) meant that England started the fourth innings at a rather comfortable 63%. This did not drop much as a few wickets fell mainly because the pitch was still very good. At 93 for 4, the WIP reached its lowest value in that innings, 58%. Then it went up to 91% and a rather comfortable win ensued. There were 22 overs still left in the game despite the high match aggregate of 1675 runs.

The Headingley Test has scores that were almost in the middle of those in the first two Tests. New Zealand's slightly below-par first-innings total of 329 gave England the edge at 54%. That was only slightly improved when England secured a small lead. The third-innings score in the vicinity of the two first-innings scores kept the England WIP around the 55% mark. The Base-RpW was at a par value of 33.7. One could say that this Test was dominated by par values. The loss of two England wickets at 17 and 51 in the chase only dampened their chances a little, and the loss of the third wicket was only a blip. There were 74 overs left in this match at the end, and it was the most comfortable win England had the whole summer.

India's first-innings total at Edgbaston was well above par and put England on the back foot at 43%. The substantial deficit of 132 pushed England further down to 29% at the halfway stage. England recovered somewhat thanks to their very good bowling show, dismissing India for 245. The Base-RpW was just below 30 and this meant that England started the fourth innings way below the midpoint: a WIP of 35% was a fair reflection of England's chances. The hundred partnership for the first wicket in the chase moved them up to 48%, but it was still anybody's game. The loss of two quick wickets then pushed England down to 34%. Then came the 250-plus stand that took England to the win. Again, like with two of the other three games, there were at least 70 overs left.

Now for a look at the key England partnerships in their chasing innings.

At Lord's, Joe Root and Ben Stokes effected a sedate stabilising partnership of 90, at a run rate of only three. But significantly, this moved England's win percent from 19 to 44. Then Root and Foakes, in a much faster partnership of 120 runs, scored at 4.13 and took England to a win. Root was the dominant batter in this partnership.

At Trent Bridge there was only one partnership of note - of 179 runs in 20 overs between Bairstow and Stokes, as good as any that a top T20 team can offer.

At Headingley, Ollie Pope and Root added 134 in quick time at nearly five runs per over, moving the win percent from 54% to 75%. Then Bairstow walked in and, in the company of Root, added 111 runs in less than 15 overs - an RpO of 7.65, slow only by the standards set in Nottingham.

Finally, at Edgbaston, in that huge chase, Alex Lees and Zak Crawley added 107 for the first wicket at nearly five runs per over. After the fall of a few wickets, Root and Bairstow took only 42 overs to hammer the Indian attack for 269 runs. When they came in, England were tottering at 34%.

There were seven important partnerships in these four innings. Most of these were put on at well above 4.5 runs per over. Root was part of five of these match-winning stands, while Bairstow was involved in three. In the first three innings of the season, when Bairstow did not click, it was Root who held firm. Stokes was involved in two. It is relevant that three of these successful chases had two partnerships each, indicating that these were team efforts.

Now let us move on to the numbers of the England players. I have considered the four Tests together as a super series.

Root and Bairstow were the two leading England batters - by a mile. Root scored over 550 runs at an average exceeding 110, while Bairstow scored over 600 runs at 102. It is not often that two batters have dominated a series like this. In addition, Bairstow scored at a strike rate of just over 100. This combination of 100-plus in both measures is like Halley's Comet - the rarest of rare events. The other batters scored below 300 runs at sub-50 averages. Stokes scored at a good clip. Pope had two good days. But it is clear that these were only supporting actors. Of the eight hundreds scored by England in these four Tests, Root and Bairstow made seven.

For New Zealand, Daryl Mitchell scored 538 runs at an average of 107.6, and Tom Blundell 383 runs at 76.7. Two noteworthy performances in losing causes. Rishabh Pant scored over 200 runs in the only Test played by India.

Matthew Potts took the most wickets in his first season in Tests - 18 at 26.7. James Anderson, the wily aging fox, took 17 wickets in three Tests at an excellent 18.3. Stuart Broad was expensive, as were Stokes and Jack Leach. Anderson was incisive, taking a wicket every 40 balls. The others finished close on either side of 60. Leach's competent performance was a surprise, although ten of his 14 wickets came in one Test. Broad had, overall, a not-so-great time. But it was clear that this was a series for the English batters, not bowlers. The bowlers performed competently, nothing more.

The England-South Africa series
It is great that South Africa will be visiting England for a three-Test series. But for what happened in the first half of the English summer, this would have been a series of no interest to the English fans, since their WTC qualification hopes are virtually zero. South Africa still have a fighting chance of qualifying. However, the overwhelming success of England in the four Tests has made the forthcoming series one of the most eagerly awaited in recent times. There are many questions to be answered.

- Will England keep chasing the "Number"?
- At some point, will the Stokes-McCullum brand of cricket become the norm?
- What can South Africa do that New Zealand and India could not?
- What will England's reaction be if the blueprint is changed and they need to set targets rather than chase them? How inventive will they be?

The last question is probably the most important one. Everything fell England's way in June and July. They won the toss twice, inserted the other team, saw 500-plus and 400-plus being scored, but still won. They lost the toss twice, saw the other team bat poorly once and competently once, matched the scores, and still won.

Let us look into a crystal ball a little. Let us say that Dean Elgar wins the toss at Lord's on August 17. When all the world is expecting that South Africa will bat, Elgar tells Ben Stokes that he will bowl. England, bolstered by yet another Root hundred, make 400. South Africa huff and puff their way to 380. England start their second innings on the fourth day.

- How do England tackle this in their new adventurous mode?
- How do they bat in the third innings?
- What target do their team go for? Do they offer something for South Africa?
- How many overs does Stokes leave his bowlers?
- Will England think "second new ball plus 20" or do they think different?
- How do England's bowlers, unused recently to defending a target, manage that challenge?
- If the target is 310, and South Africa are 200 for 3, do England try and shut shop?

Fascinating questions indeed. Interesting times ahead. Most serious cricket enthusiasts will be waiting with bated breath.

Talking Cricket Group
Any reader who wishes to join the general-purpose cricket ideas-exchange group of this name that I started last year can email me a request for inclusion, providing their name, place of residence, and what they do.

Email me your comments and I will respond. This email id is to be used only for sending in comments. Please note that readers whose emails are derogatory to the author or any player will be permanently blocked from sending in any feedback in future.

Source: Anantha Narayanan, ESPNcricinfo, 6 August 2022 (https://www.espn.co.uk/cricket/story/_/id/34360566/anantha-narayanan-how-england-aced-their-four-spectacular-test-chases-summer)
Killexams : Emulating The IBM PC On An ESP32

The IBM PC spawned the basic architecture that grew into the dominant Wintel platform we know today. Once heavy, cumbersome and power-hungry, it's a machine that you can now emulate on a single board with a cheap commodity microcontroller. That's thanks to work from [Fabrizio Di Vittorio], who has shared a how-to on YouTube.

The full playlist is quite something to watch, showing off a huge number of old-school PC applications and games running on the platform. There’s QBASIC, FreeDOS, Windows 3.0, and yes, of course, Flight Simulator. The latter game was actually considered somewhat of a de facto standard for PC compatibility in the 1980s, so the fact that the ESP32 can run it with [Fabrizio’s] code suggests he’s done well.

It’s amazingly complete, with the ESP32 handling everything from audio and video to sound output and keyboard and mouse inputs. It’s a testament to the capability of modern microcontrollers that this is such a simple feat in 2021.

We’ve seen the ESP32 emulate 8-bit gaming systems before, too. If you remember [Fabrizio’s] name, it’s probably from his excellent FabGL library. Videos after the break.

Source: Lewin Day, Hackaday (https://hackaday.com/2021/07/28/emulating-the-ibm-pc-on-an-esp32/)
Killexams : IBM Research Open-Sources Deep Search Tools

(Laborant/Shutterstock)

IBM Research’s Deep Search product uses natural language processing (NLP) to “ingest and analyze massive amounts of data—structured and unstructured.” Over the years, Deep Search has seen a wide range of scientific uses, from Covid-19 research to molecular synthesis. Now, IBM Research is streamlining the scientific applications of Deep Search by open-sourcing part of the product through the release of Deep Search for Scientific Discovery (DS4SD).

DS4SD includes specific segments of Deep Search aimed at document conversion and processing. First is the Deep Search Experience, a document conversion service that includes a drag-and-drop interface and interactive conversion to allow for quality checks. The second element of DS4SD is the Deep Search Toolkit, a Python package that allows users to “programmatically upload and convert documents in bulk” by pointing the toolkit to a folder whose contents will then be uploaded and converted from PDFs into “easily decipherable” JSON files. The toolkit integrates with existing services, and IBM Research is welcoming contributions to the open-source toolkit from the developer community.
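
For orientation, a bulk conversion with the Toolkit follows roughly the pattern below. The call names (CpsApi.from_env, convert_documents, download_all) reflect the toolkit's documented usage at the time, but treat them as assumptions to be checked against the installed version; the project key and folder paths are placeholders.

```python
# Rough sketch of bulk PDF-to-JSON conversion with the Deep Search Toolkit.
# Call names follow the documented pattern but should be verified against the
# installed toolkit version; proj_key and paths are placeholders.
import deepsearch as ds

api = ds.CpsApi.from_env()              # credentials read from environment/config

documents = ds.convert_documents(
    api=api,
    proj_key="<your-project-key>",      # placeholder
    source_path="./papers",             # folder of PDFs to upload and convert in bulk
    progress_bar=True,
)
documents.download_all(result_dir="./converted")   # one JSON result per input document
```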

IBM Research paints DS4SD as a boon for handling unstructured data (data not contained in a structured database). This data, IBM Research said, holds a “lot of value” for scientific research; by way of example, they cited IBM’s own Project Photoresist, which in 2020 used Deep Search to comb through more than 6,000 patents, documents, and material data sheets in the hunt for a new molecule. IBM Research says that Deep Search offers up to a 1,000× data ingestion speedup and up to a 100× data screening speedup compared to manual alternatives.

The launch of DS4SD follows the launch of GT4SD—IBM Research’s Generative Toolkit for Scientific Discovery—in March of this year. GT4SD is an open-source library to accelerate hypothesis generation for scientific discovery. Together, DS4SD and GT4SD constitute the first steps in what IBM Research is calling its Open Science Hub for Accelerated Discovery. IBM Research says more is yet to come, with “new capabilities, such as AI models and high quality data sources” to be made available through DS4SD in the future. Deep Search has also added “over 364 million” public documents (like patents and research papers) for users to leverage in their research—a big change from the previous “bring your own data” nature of the tool.

The Deep Search Toolkit is accessible here.

Source: Datanami, 18 July 2022 (https://www.datanami.com/2022/07/18/ibm-research-open-sources-deep-search-tools/)
Killexams : Prescriptive and Predictive Analytics Market Business overview 2022, and Forecast to 2030 | By -Accenture, Oracle, IBM, Microsoft

Aug 01, 2022 (Market Insight Reports) -- New Jersey, United States- IBI's most recent assessment of the Prescriptive and Predictive Analytics Market evaluates market size, trends, and projections to 2030. The market study integrates essential assessment data and findings, making it a significant resource for executives, analysts, industry professionals, and other key people who need a ready-made study to better understand market patterns, growth drivers, opportunities, and looming challenges, as well as the competitors.

The Prescriptive and Predictive Analytics Market Report’s Objectives

  • To study and determine the Prescriptive and Predictive Analytics market's size in terms of both value and volume.
  • To measure the market share of major Prescriptive and Predictive Analytics market segments.
  • To show how the Prescriptive and Predictive Analytics market is developing in different regions of the world.
  • To analyze and examine micro-markets in terms of their contributions to the Prescriptive and Predictive Analytics market, their prospects, and individual growth trends.
  • To provide a detailed assessment of key business strategies used by the top firms covered in the report, such as research and development, collaborations, agreements, partnerships, acquisitions, mergers, new developments, and product launches.

Receive the sample Report of Prescriptive and Predictive Analytics Market 2022 to 2030:

The worldwide Prescriptive and Predictive Analytics market is expected to grow at a booming CAGR over 2022-2030, rising from USD billion in 2021 to USD billion in 2030. The report also shows the importance of the Prescriptive and Predictive Analytics market's main players, including their business overviews, financial summaries, and SWOT assessments.

Prescriptive and Predictive Analytics Market Segmentation & Coverage:

Prescriptive and Predictive Analytics Market segment by Type:
Collection Analytics, Marketing Analytics, Supply-Chain Analytics, Behavioral Analytics, Talent Analytics

Prescriptive and Predictive Analytics Market segment by Application:
Finance & Credit, Banking & Investment, Retail, Healthcare & Pharmaceutical, Insurance, Others

The following years are examined in this study to estimate the Prescriptive and Predictive Analytics market size:

History Year: 2015-2019
Base Year: 2021
Estimated Year: 2022
Forecast Year: 2022 to 2030

Cumulative Impact of COVID-19 on Market:

Various enterprises have faced issues due to COVID-19, and this industry is no exception. In light of the COVID-19 pandemic, several countries' defense budgets have been cut. Most research projects are therefore expected to be temporarily put on hold. Exports of onboard computer platforms to various Middle Eastern, African, and Latin American countries have also declined. These outcomes affect the computer platform's progress.

Get a sample copy of the Prescriptive and Predictive Analytics Market Report: https://www.infinitybusinessinsights.com/request_sample.php?id=878605

Regional Analysis:

Sales strategies, investment, and cost structures are among the key topics covered here, along with a focus on the Prescriptive and Predictive Analytics market in key regions such as Asia Pacific, North America, Latin America, Europe, and the Middle East and Africa. The analysis, which combines figures and facts, also covers the financial aspects of the profiled companies.

The Key companies profiled in the Prescriptive and Predictive Analytics Market:

The study examines the Prescriptive and Predictive Analytics market's competitive landscape and includes data on important suppliers, including Accenture, Oracle, IBM, Microsoft, QlikTech, SAP, SAS Institute, Alteryx, Angoss, Ayata, FICO, Information Builders, Inkiru, KXEN, Megaputer, Revolution Analytics, StatSoft, Splunk Analytics, Tableau, Teradata, TIBCO, Versium, Pegasystems, Pitney Bowes, Zemantis, and others.

Table of Contents:

List of Data Sources:
Chapter 2. Executive Summary
Chapter 3. Industry Outlook
3.1. Prescriptive and Predictive Analytics Global Market segmentation
3.2. Prescriptive and Predictive Analytics Global Market size and growth prospects, 2015 – 2026
3.3. Prescriptive and Predictive Analytics Global Market Value Chain Analysis
3.3.1. Vendor landscape
3.4. Regulatory Framework
3.5. Market Dynamics
3.5.1. Market Driver Analysis
3.5.2. Market Restraint Analysis
3.6. Porter’s Analysis
3.6.1. Threat of New Entrants
3.6.2. Bargaining Power of Buyers
3.6.3. Bargaining Power of Suppliers
3.6.4. Threat of Substitutes
3.6.5. Internal Rivalry
3.7. PESTEL Analysis
Chapter 4. Prescriptive and Predictive Analytics Global Market Product Outlook
Chapter 5. Prescriptive and Predictive Analytics Global Market Application Outlook
Chapter 6. Prescriptive and Predictive Analytics Global Market Geography Outlook
6.1. Prescriptive and Predictive Analytics Industry Share, by Geography, 2022 & 2030
6.2. North America
6.2.1. Market estimates and forecast, 2022-2030, by product
6.2.2. Market estimates and forecast, 2022-2030, by application
6.2.3. The U.S.
6.2.3.1. Market estimates and forecast, 2022-2030, by product
6.2.3.2. Market estimates and forecast, 2022-2030, by application
6.2.4. Canada
6.2.4.1. Market estimates and forecast, 2022-2030, by product
6.2.4.2. Market estimates and forecast, 2022-2030, by application
6.3. Europe
6.3.1. Market estimates and forecast, 2022-2030, by product
6.3.2. Market estimates and forecast, 2022-2030, by application
6.3.3. Germany
6.3.3.1. Market estimates and forecast, 2022-2030, by product
6.3.3.2. Market estimates and forecast, 2022-2030, by application
6.3.4. The UK
6.3.4.1. Market estimates and forecast, 2022-2030, by product
6.3.4.2. Market estimates and forecast, 2022-2030, by application
6.3.5. France
6.3.5.1. Market estimates and forecast, 2022-2030, by product
6.3.5.2. Market estimates and forecast, 2022-2030, by application
Chapter 7. Competitive Landscape
Chapter 8. Appendix

Download the full index of the Prescriptive and Predictive Analytics Market research report @

FAQs

What will the Prescriptive and Predictive Analytics market's growth rate be through 2030?
What are the principal drivers of the market?
Who are the leading vendors in the Prescriptive and Predictive Analytics market?
What are the Prescriptive and Predictive Analytics market's opportunities, risks, and overall outlook?

Contact Us:
Amit Jain
Sales Co-Ordinator
International: +1 518 300 3575
Email: inquiry@infinitybusinessinsights.com
Website: https://www.infinitybusinessinsights.com


Sun, 31 Jul 2022 16:36:00 -0500 en-US text/html https://www.marketwatch.com/press-release/prescriptive-and-predictive-analytics-market-business-overview-2022-and-forecast-to-2030-by--accenture-oracle-ibm-microsoft-2022-08-01
Killexams : From Floppies to Solid State: The Evolution of PC Storage Media

Since the dawn of computing, we've struggled with how to store all this digital stuff. International Business Machines helped launch the PC revolution in the 1980s, but computers were dealing with storage issues long before that. In fact, that same company had the first hard disk drive running back in 1956—a 2,000-pound unit that cost $35,000 per year to operate.

It also held only 5 megabytes (MB). But just look at how streamlined that thing is.

Other ways to store data existed in those early days, from punch cards to giant, reel-to-reel magnetic tape machines. Thankfully, by the time PCs first made it to our offices and living rooms, storage devices were substantially smaller, if not yet as small as what we carry in our pockets today.

Let's look back at what it took to store data on a PC from the early days through today. It should give you a whole new appreciation for the size, speed, and capacity of today’s latest storage methods.


1. 5.25-Inch Floppy Disk

A 5.25-inch floppy drive from an original IBM PC (Credit: René Ramos/Molly Flores)

IBM created the floppy drive as a means of read-only magnetic storage in 1972. Floppy disks originally came in a 203.2mm format, which works out to exactly 8 inches, hence the moniker. The round disk inside was in a permanent flexible (floppy) jacket to keep fingers off.

The eight-inch size didn't stick around for very long. Steve Wozniak designed the first external Apple II disk drive in 1978; it used a 5.25-inch floppy disk. Soon, Commodore, Tandy, and Atari adopted the same format.

The original IBM PC 5150 that debuted in August 1981 offered the option of one or two internal 5.25-inch floppy drives. Each floppy diskette could hold 160 kilobytes on one side, or 320KB if you could use both (not all disks were double-sided). The drives required a controller card on the motherboard and were connected with ribbon cables. Back then, having two floppy drives made a huge difference because one of them could hold the operating system while the other drive loaded a program, such as Lotus 1-2-3. You wouldn't have to swap disks.
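
Those capacities fall straight out of the disk geometry. As a rough sanity check, here is a minimal Python sketch, assuming the standard early PC DOS layout of 40 tracks per side, 8 sectors per track, and 512 bytes per sector (the double-sided figure simply doubles the single-sided one):

# Back-of-the-envelope floppy capacity from assumed disk geometry.
TRACKS_PER_SIDE = 40
SECTORS_PER_TRACK = 8      # early PC DOS formats used 8 sectors per track
BYTES_PER_SECTOR = 512

single_sided = TRACKS_PER_SIDE * SECTORS_PER_TRACK * BYTES_PER_SECTOR
double_sided = 2 * single_sided

print(single_sided // 1024, "KB single-sided")   # 160 KB
print(double_sided // 1024, "KB double-sided")   # 320 KB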

Hard drives soon became the permanent, long-term data storage standard, and next-generation floppy disks would soon take over for portability, both of which we'll get to below. The 5.25-inch floppy was fully ejected by 1994.


2. Cassette Tape

Iomega Ditto (Credit: René Ramos/Dual Freq via Wikimedia Commons)

Magnetic tape isn't that far different from a floppy disk, although it's a lot slower when accessing stored data. In the 1980s, computer software was often sold on cassette tape, just like music albums. Cassette recorders were available for home computers such as the Apple II and Commodore 64.

The original IBM PC also had a port for one. A 90-minute cassette could hold about a megabyte of data. But few developers sold PC software on tape because the computer almost always came with at least one floppy drive. IBM soon dropped the 5-pin DIN cassette port on its later systems, but it continued to sell the original 5150 right up through 1987 without a floppy drive if a customer preferred tape.
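
Working backward from those numbers gives a feel for just how slow tape was. A rough estimate in Python, loosely treating "about a megabyte" as 1,000,000 bytes spread over a 90-minute tape:

# Implied cassette data rate from ~1MB per 90-minute tape (rough estimate).
capacity_bytes = 1_000_000        # "about a megabyte" -- an approximation
duration_seconds = 90 * 60

rate = capacity_bytes / duration_seconds
print(round(rate), "bytes per second")        # ~185 bytes/s
print(round(rate * 8), "bits per second")     # roughly 1.5 kilobits/s

That works out to well under a kilobyte per second, which is part of why few developers bothered with tape once a floppy drive was on board.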

Why include a port for tape at all? Some people wanted to run a version of BASIC called Cassette BASIC that only worked off of tape, and DOS had no cassette tape support (DOS stood for Disk Operating System, after all). And because tape was the cheapest storage available.

Third parties made proprietary tape-based drives for backup, such as Iomega and its Ditto drive of the 1990s. Iomega gave it up and sold off the tape drive biz before the end of the decade.

Unlike the floppy drive, however, tape has never gone away. You can still buy uber-expensive cartridge drives using the Linear Tape-Open (LTO) spec for massive backup use—usually they're found in enterprises, backing up servers full of important data.


3. 3.5-Inch Floppy Disk

3.5-inch floppy disks (Credit: René Ramos/Javier Zayas Photography/Getty Images)

The 3.5-inch floppy disk is the universal iconic symbol for saving your work for a reason. The smaller disk wasn't as floppy as 8-inch and 5.25-inch diskettes because the 3.5-inch version came inside a hard plastic shell. It did, however, become the top-selling removable storage medium by 1988. This despite a limited capacity: first 720KB, then in a high-density 1.44MB version. IBM made a 2.88MB double-sided extended-density diskette for the IBM PS/2, but that standard went nowhere.

3.5-inch floppies were a mainstay of PC software well into the 90s; five billion 3.5-inch floppies were in use by 1996.

But the small diskettes couldn't keep up with the demands of bloated software. At one point, for example, Microsoft shipped a version of Windows 98 that required sequentially inserting 21 different floppy disks to install it on a hard drive. Microsoft Office required almost twice that many. You could build up your arm muscles by replacing disks while installing software to a hard drive. Sony, one of the biggest manufacturers, stopped making 3.5-inch floppies in 2011.
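
To put that 21-disk install in perspective, a quick calculation (assuming the nominal 1.44MB capacity per high-density disk, and ignoring the special compressed-disk formats Microsoft sometimes used) shows how small those installs were by today's standards:

# Approximate footprint of a 21-floppy Windows 98 install (nominal capacities).
DISKS = 21
MB_PER_DISK = 1.44                 # nominal "1.44MB" marketing capacity

total_mb = DISKS * MB_PER_DISK
print(f"~{total_mb:.0f} MB across {DISKS} disks")            # ~30 MB
print(round(4700 / total_mb), "such installs fit on one 4.7GB DVD-R")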


4. Hard Disk Drive

A Seagate ST-412 hard disk drive from an original IBM PC (Credit: Molly Flores)

Hard disk drives (HDDs) were nothing new in 1982, but a hard drive didn't make it into the first IBM PC. Instead, the world (and PC Magazine) awaited the second-generation eXTension (XT) model. The PC XT included a standard 10MB HDD, which we called "certainly significant" in our Feb-Apr 1983 issue. The drive required a new power supply and a BIOS update, all of which contributed to the XT's much higher price of $4,995 (that's $14,380 with 2022 inflation).

The IBM PC's first HDD was the Seagate Technology Model ST-412. The interface between it and the motherboard became the de facto disk drive standard for several years.

Entire books have been written about HDDs (though one book entitled Hard Drive was about the hard-driving influence of Microsoft). The impact of spacious, local, re-writable storage on a platter changed everything. Hard drives continued to dominate system storage decades later due to their overall reliability and ever-increasing speed and capacity.

Today, you can find 20-terabyte (TB) internal hard drives on the market, such as the Seagate Exos X20 for $389. That company alone has shipped a full 3 zettabytes of hard drive storage capacity as of 2021—the equivalent of 150,000,000 hard drives with 20TB each.
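
That 150,000,000-drive equivalence is easy to verify with unit arithmetic, assuming the decimal (SI) units drive makers use:

# Sanity check: 3 zettabytes expressed as 20TB drives (decimal units assumed).
ZETTABYTE = 10**21
TERABYTE = 10**12

total_bytes = 3 * ZETTABYTE
per_drive = 20 * TERABYTE
print(total_bytes // per_drive, "drives")    # 150,000,000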


5. Zip Disk

Zip disks (Credit: Young Swee Ming/Shutterstock)

The Zip Drive and its high-capacity floppy disks never really replaced the standard floppy, but of the many “superfloppy” products that tried, only Iomega’s came close. The company had limited success with its Bernoulli Box removable floppies in the 1980s. But the 1994 debut of the very affordable Zip Drive put Bernoulli on a whole other level.

Zip disks were the first to hold 100MB of data each; subsequent releases went to 250MB and even 750MB in 2002. Iomega also survived the famous Click of Death lawsuit in 1998. By 2003, Iomega had shipped some 50 million Zip Drives.

But timing is everything. Zip Drives were caught between the era of the floppy and the onslaught of writable CDs that could seek data much faster, plus local networks that made file transfers much easier. EMC bought Iomega, and soon partnered with Lenovo before killing off the Zip drive line.


6. Jaz Disk

A Jaz disk (Credit: René Ramos/Science & Society Picture Library/Getty Images)

Following the debut of the popular Zip disk, Iomega tried to build on that success in 1995 with the Jaz. The thicker Jaz format boosted capacity to 1GB per disk, and then to 2GB by 1998—perfect for creatives who needed copious amounts of storage.

Iomega marketed the Jaz mainly as a $500 external drive, although an internal version was available, which the Zip also had as an option. The Jaz drive connected via a SCSI interface, which was big on the Macintosh, though some later models connected to parallel ports. A SCSI adapter worked with USB and even FireWire.

The Jaz had some of the same issues as the Zip, however, including the Click of Death problem and overheating. Like the Zip, the Jaz also pushed up against the coming of the CD and CD-R, and couldn't compete on price.


7. USB Flash Drive

A USB drive in a Swiss army knife (Credit: René Ramos/Victorinox via Amazon)

2000 saw the first ever Universal Serial Bus (USB)-based flash-memory drive, the ThumbDrive from Trek Technology. A holy matrimony between the easy-to-use and now mainstream USB port and (finally inexpensive) non-volatile NAND flash memory, the ThumbDrive stored its data on chips that don't require power to retain it. IBM's first flash drive that same year, the DiskOnKey, held 8MB for $50.

Soon, the floodgates opened. Tons of companies made small, fast, somewhat-high-capacity solid-state drives as big as your thumb. Many sued each other. It took years for Trek to win the US trademark to the name "Thumbdrive" in 2010, by which time the term was genericized—but that win is also why PCMag and others now call them "flash drives" instead.

That initial 1.0 USB specification gave way to the 30x faster speeds of 2.0, which only helped flash memory drives. By 2004 the first 1GB flash drive shipped.

Today, USB flash drives typically use USB 3.0 with a read speed of 120 MB per second. In our Best USB Flash Drives for 2022, we tested devices with the old-school USB-A connector (Samsung Bar Plus 128GB USB 3.1 Flash Drive for $21) and even some that use the faster USB-C (SanDisk Ultra Dual Drive 128GB USB Type-C Flash Drive for $18). We’ve seen them in all shapes (including as part of a Swiss army knife), some with incredibly high capacity, some with security locks integrated, and other crazy things. For all-around ease of use coupled with secure, mobile storage, USB flash drives remain hard to beat.
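
That 120MB-per-second figure is easier to appreciate as a transfer time. A rough estimate for filling one of the 128GB drives mentioned above, assuming a sustained 120MB/s and decimal units (real-world write speeds are usually lower than the quoted read speed):

# Rough time to fill a 128GB flash drive at a sustained 120 MB/s.
capacity_mb = 128_000          # 128GB expressed in decimal megabytes
speed_mb_per_s = 120           # quoted read speed; writes are typically slower

seconds = capacity_mb / speed_mb_per_s
print(f"~{seconds / 60:.0f} minutes")      # about 18 minutes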


8. Memory Card

SD cards (Credit: René Ramos/Nattawut Lakjit/EyeEm/Getty Images)

It's not unfair to think of memory cards as USB flash drives without the USB. The cards can be much, much smaller. While they work as media storage for PCs, they’re more likely to be found in even smaller devices, requiring you to have an adapter for your PC to read them.

The first "cards" were the large PCMCIA devices, which were sized like a credit card, albeit substantially thicker. This gave way in the mid-1990s to Compact Flash, a format that you can still find in devices today—then to Toshiba's SmartMedia Card (SMC), a NAND-based flash memory that held as much as 128MB on a card only 0.76mm thick.

Memory cards have had many subsequent names and sizes and shapes in the last two decades: Multimedia Cards (MMC), Secure Digital (SD), SmartMedia, Memory Stick, XD-Card, and Universal Flash Storage (UFS), among others. Eventually, the most popular, SD, got smaller via miniSD and microSD; they remain the most prevalent today.

Originally, memory cards were meant to replace floppy disks or even the high-capacity ones like the Zip. But the tiny size made them ideal to become the digital replacement for film in cameras. The memory card propelled the age of digital photography. Today, support for memory cards in Android-based smartphones ebbs and flows (usually, it ebbs). Some memory cards are even specific to various brands and generations of game consoles.


9. CD-ROM

Using a CD-ROM with a laptop (Credit: silverjohn/Getty Images)

The read-only memory that changed the world. The fully-optical-and-digital compact disc full of data held up to 650MB on 1.2 mm of polycarbonate plastic with a reflective aluminum surface. They could be read only by a laser. CD-ROM became the standard for software and video game distribution in the late 1980s and persisted through the 90s. (Music CDs are similar, but they use a different format, although computer CD drives could eventually read those, too.)

The CD-ROM's only downside is that it is read-only memory (it's right there in the name). Users couldn't write data to it. This did, however, make it ideal for software and game distributors, who liked that it was easy to copy-protect.

The faster the drive could spin the CD-ROM, the faster the data could be accessed. The base standard of 1x was about 150KB per second; eventually, drives were hitting 52x or even 72x, though with some physical-world caveats.
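
Those "x" ratings are just multiples of the 150KB/s base rate, so peak throughput is a one-line calculation (real drives rarely sustained the maximum across the whole disc, which is one of those physical-world caveats):

# CD-ROM throughput from the 1x base rate of roughly 150 KB/s.
BASE_KB_PER_S = 150

for rating in (1, 8, 52, 72):
    kb_per_s = rating * BASE_KB_PER_S
    print(f"{rating}x = {kb_per_s} KB/s (~{kb_per_s / 1024:.1f} MB/s)")
# 52x works out to roughly 7.6 MB/s at the fastest part of the disc.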


10. CD-R and CD-RW

A CD-R and a CD-RW with a printable surface (Credit: René Ramos/Verbatim via Amazon)

The compact disc-recordable (CD-R) was originally called the CD-Write-Once and uses some of the same technology as the earlier magneto-optical drive—the ability to write your data to a disc one time only for backup or distribution. You could write to CD-Rs in the audio format ("Red Book") holding up to 80 minutes of music or data format ("Yellow Book") with 700MB of info, and they'd work in regular CD players or CD-ROM drives most of the time. The CD-R format was part of the "Orange Book" standard, and writing to CD-Rs became known as "burning" a CD.
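
The "80 minutes or 700MB" pairing comes from the same physical sectors being packed two different ways: audio uses the full 2,352 bytes of each sector, while the data format sets aside part of each sector for extra error correction, leaving 2,048 usable bytes. A sketch of that arithmetic, assuming the standard 75 sectors per second:

# Why an "80-minute" disc holds about 700MB of data: sector packing.
SECONDS = 80 * 60
SECTORS_PER_SECOND = 75
AUDIO_BYTES_PER_SECTOR = 2352    # Red Book audio uses the whole sector
DATA_BYTES_PER_SECTOR = 2048     # Yellow Book data adds error correction

sectors = SECONDS * SECTORS_PER_SECOND
print(f"{sectors * AUDIO_BYTES_PER_SECTOR / 10**6:.0f} MB as audio")   # ~847 MB
print(f"{sectors * DATA_BYTES_PER_SECTOR / 2**20:.0f} MiB as data")    # ~703 MiB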

You can still easily buy CD-R media online. Verbatim sells a 100 disc pack for $19.22 on Amazon.

Another Orange Book-based product introduced in 1997, the CD-RW became the first truly mainstream optical media option that let you write to the disc, erase it, and write to it again. You couldn't do it forever, maybe 1,000 times, but that’s still a lot. They’re almost the same as CD-Rs but with different reflective layers to facilitate erasure and re-writing.

The biggest drawback of CD-RWs is that not all older CD and CD-R drives will read them. They also don't necessarily last as long as the original CD-ROMs. A spindle of 50 CD-RW discs from Verbatim currently sells for $31. Plus, they're printable—you can run them through select printers to label them.


11. DVD and DVD±RW

Stacks of DVD-R disks (Credit: René Ramos/Olga Sapegina/Shutterstock)

The Digital Video Disc, or Digital Versatile Disc depending on who you ask, also came in the late 1990s and became the primo way to distribute high-end video of films quickly. It was better than LaserDisc because it was much smaller, it included sharper digital video, and it also didn't need to be flipped halfway through a movie. The DVD was enough to replace VHS and also get Netflix off the ground in 1998 as a mail-order movie rental biz. Remember those red envelopes?

There are two types of re-writeable DVD: the standard "dash" format (DVD-R/RW) from 1997 and the plus (DVD+R/RW) from 2002. Different industry consortiums back each standard. With an R, you can write once; with RWs you can re-write, just like with the CD version. The big upside of a DVD-R for computer storage, of course, is it holds a lot more than a CD-R. A regular DVD-R on a single side using a single layer can store 4.7GB of data. A 30-pack of Verbatim-brand DVD+RW discs now goes for $23.25.


12. Sony Blu-ray

A Blu-ray BD-RE disc (Credit: René Ramos/bkindler/GettyImages)

You probably know Blu-ray as the format for buying high-definition movies (with lots of extras) on a disc. It's the format that won the war against Toshiba's HD-DVD in the aughts, finally giving Sony some justice over what happened with Betamax. It wasn't originally created for the purpose, but Blu-ray became king of the movie-watching hill...at least until streaming went, uh, mainstream. (But some of us still prefer physical media.)

Recordable (BD-R) or Recordable Erasable (BD-RE) Blu-ray discs have been available since at least 2005, assuming you have the right kind of drive that can handle 45Mbps write speed. The standard disk capacity is 25GB or 50GB depending on whether it's single- or dual-layer.
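
At that quoted 45Mbps write speed, filling a single-layer disc is not a quick job. A rough estimate, assuming decimal units and ignoring lead-in, lead-out, and verification passes:

# Rough time to fill a 25GB single-layer Blu-ray disc at 45 Mbps.
capacity_bits = 25 * 10**9 * 8        # 25GB in bits (decimal units assumed)
write_speed_bps = 45 * 10**6          # 45 megabits per second

seconds = capacity_bits / write_speed_bps
print(f"~{seconds / 60:.0f} minutes")      # roughly 74 minutes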


13. Solid-State Drive (SSD)

Samsung Solid State Drive (Credit: René Ramos/Zlata Ivleva)

The first SSD appeared in 1991, but it took a few decades for the tech to go mainstream. It's essentially like flash drive memory, on a grander scale of capacity, and using semiconductor cells for non-volatile storage. SSDs work in a PC like HDDs, but without any of the moving parts that spell eventual doom. And SSDs are a lot faster, making them perfect for booting up an operating system.

SSDs often accompany HDDs in lower-cost PCs, and increasingly the SSD is the only drive on board. Plus, there are many external SSD options. SSDs also make great upgrades for PCs that need a new drive, even laptops, thanks to the small "gumstick" M.2 format. You can read more about them in SSD vs. HDD: What's the Difference, and our sister site ExtremeTech has a deep dive on how SSDs work.


Want to see some stranger storage? Check out 10 Bizarre PC Storage Formats that Didn’t Quite Cut It.

Thu, 04 Aug 2022 02:21:00 -0500 en text/html https://www.pcmag.com/news/the-evolution-of-pc-storage-media