IBM’s annual Cost of Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid,” expenses that are realized more than a year after the attack.
Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.
The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.
Though the average cost of a data breach is up, it is only by about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cyber crime seen during the pandemic years.
Organizations are also increasingly declining to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices, separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.
Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”
Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.
Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”
In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is lagging as well, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.
Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of a data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent figure at $812,000 globally, meaning the $610,000 reduction in breach costs that ransom payers see does not even cover the typical payment.
The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.
Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.
Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”
Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.
Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.
Red Hat, the open source software company that IBM acquired for a record $34 billion in 2019, is gearing up to capture more of the hybrid cloud market with a new leader at its helm.
Matt Hicks, a 16-year Red Hat veteran who most recently led its products and technologies division, was appointed CEO last week. He takes over for Paul Cormier, who had been in the CEO seat for two years and will become chairman of Red Hat.
Red Hat has grown at a steady clip since it was bought by IBM as part of its ambition to become a leader in hybrid cloud, where customers use a combination of on-premises physical data centers and cloud providers like Amazon Web Services for their IT infrastructure.
It reported 17% growth in the second quarter earlier this week, and revenue from its flagship OpenShift software, which lets developers run code across multiple clouds and data centers, grew four and a half times from pre-acquisition, IBM said. Its software unit, which includes Red Hat, reported revenue of $6.2 billion.
The plan for Hicks and Cormier, who both now report to IBM CEO Arvind Krishna, is to continue what they consider a "close working relationship" with Krishna that hinges on maintaining Red Hat's independence and neutrality as a vendor that works openly with all the major cloud providers.
Some of that relationship has already paid off — IBM's sales force is selling Red Hat products, Cormier previously told Insider, boosting its reach.
Hicks, who has worked alongside Cormier for the past decade, said his plan for Red Hat's future isn't too different from his predecessor's, but will focus on three key areas.
"I'm in a pretty lucky position where I'm inheriting a phenomenal strategy, a core product strategy and team," Hicks told Insider. "And so my focus is on executing the opportunity that we have ahead of us."
First, the company is firmly targeting the hybrid cloud market. Echoing what parent company IBM has repeatedly called a trillion-dollar opportunity, Hicks said there's plenty more of that market up for grabs.
"Customers are starting to understand that the public clouds will specialize in innovation and they'll probably be using multiple, and their on-premise environments aren't going away," he said. "We think only a quarter of workloads have been moved so far."
Second, Hicks said Red Hat is investing more in running its OpenShift software for customers directly. The company last year unveiled managed cloud services, an offering where it can fully manage customers' IT operations in cloud providers like AWS, Microsoft Azure, and Google Cloud.
"It's a new area for us to execute and prove that we can be relied on for running mission-critical workloads, not just supporting them when customers run them," Hicks said. "I think it has equal opportunity."
Lastly, Red Hat is pushing heavily into edge computing, the model where processing power is placed closer to where data is created in the physical world, such as devices. For Red Hat, that means helping customers use their hybrid cloud infrastructure in edge environments like manufacturing floors.
Hicks says the open source company is focusing on the telecom and automotive industries, because "we do have to pick and make sure we're focused where we can deliver the most value, realistically." That includes investing in 5G, the next-generation telecommunications network that will support edge computing.
And Cormier, who has been with Red Hat for over two decades, says he will advise Red Hat's largest customers on their cloud deployments. He will run a strategic advisory board dedicated to those clients, he told Insider, and keep an office at Red Hat headquarters.
Critically, Cormier says he'll continue having a "big voice" when it comes to the company's acquisition strategy. "The only difference in the M&A strategy is, Matt will be the one making the final decision," he said.
Both IBM at large and Red Hat, in particular, appear to be gearing up for acquisition sprees amid the market downturn: Krishna recently said Big Blue is looking at firms specializing in hybrid cloud and automation, and Hicks told Insider Red Hat is focused on edge computing and hybrid cloud opportunities.
"There are a lot of options there," Hicks said. "I think it will be an exciting space to watch."
IBM has published details on a collection of techniques it hopes will usher in quantum advantage, the inflection point at which the utility of quantum computers exceeds that of traditional machines.
The focus is on a process known as error mitigation, which is designed to improve the consistency and reliability of circuits running on quantum processors by counteracting sources of noise.
IBM says that advances in error mitigation will allow quantum computers to scale steadily in performance, in a pattern similar to the one classical computing has exhibited over the years.
Although plenty has been said about the potential of quantum computers, which exploit a phenomenon known as superposition to perform calculations extremely quickly, the reality is that current systems are incapable of outstripping traditional supercomputers on a consistent basis.
A lot of work is going into improving performance by increasing the number of qubits on a quantum processor, but researchers are also investigating opportunities related to qubit design, the pairing of quantum and classical computers, new refrigeration techniques and more.
IBM, for its part, has now said it believes an investment in error mitigation will bear the most fruit at this stage in the development of quantum computing.
“Indeed, it is widely accepted that one must first build a large fault-tolerant quantum processor before any of the quantum algorithms with proven super-polynomial speed-up can be implemented. Building such a processor therefore is the central goal for our development,” explained IBM in a blog post.
“However, recent advances in techniques we refer to broadly as quantum error mitigation allow us to lay out a smoother path towards this goal. Along this path, advances in qubit coherence, gate fidelities, and speed immediately translate to measurable advantage in computation, akin to the steady progress historically observed with classical computers.”
The post is geared towards a highly technical audience and goes into great detail, but the main takeaway is this: the ability to quiet certain sources of error will allow for increasingly complex quantum workloads to be executed with reliable results.
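One concrete example of the general idea, though not taken from IBM's post, is zero-noise extrapolation: run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back toward the zero-noise limit. The sketch below is a minimal illustration with invented measurement values and assumes nothing about IBM's own tooling.

```python
import numpy as np

# Zero-noise extrapolation (ZNE): execute the same circuit with the noise
# artificially amplified by known factors, then extrapolate the measured
# expectation value back to the (physically unreachable) zero-noise limit.

# Noise amplification factors at which the circuit was run.
noise_factors = np.array([1.0, 2.0, 3.0])

# Hypothetical expectation values measured at each noise level
# (placeholder numbers; a real experiment would supply these).
measured = np.array([0.78, 0.61, 0.47])

# Fit a low-order polynomial to the measurements and evaluate it at
# noise factor 0 to estimate the noiseless expectation value.
coeffs = np.polyfit(noise_factors, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)

print(f"Mitigated (zero-noise) estimate: {zero_noise_estimate:.3f}")
```

The quadratic fit here is just one choice; practical implementations pick an extrapolation model that matches how the noise actually scales.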
According to IBM, the latest error mitigation techniques go “beyond just theory”, with the advantage of these methods having already been demonstrated on some of the most powerful quantum hardware currently available.
“At IBM Quantum, we plan to continue developing our hardware and software with this path in mind,” the company added.
“At the same time, together with our partners and the growing quantum community, we will continue expanding the list of problems that we can map to quantum circuits and develop better ways of comparing quantum circuit approaches to traditional classical methods to determine if a problem can demonstrate quantum advantage. We fully expect that this continuous path that we have outlined will bring us practical quantum computing.”
A month after the National Institute of Standards and Technology (NIST) revealed the first quantum-safe algorithms, Amazon Web Services (AWS) and IBM have swiftly moved forward. Google was also quick to outline an aggressive implementation plan for its cloud service, building on post-quantum work it started a decade ago.
It helps that IBM researchers contributed to three of the four algorithms, while AWS had a hand in two. Google contributed to one of the submitted algorithms, SPHINCS+.
A long process that started in 2016 with 69 original candidates has ended with the selection of four algorithms that will become NIST standards and play a critical role in protecting encrypted data from the vast power of quantum computers.
NIST's four choices include CRYSTALS-Kyber, a public-private key-encapsulation mechanism (KEM) for general asymmetric encryption, such as when connecting to websites. For digital signatures, NIST selected CRYSTALS-Dilithium, FALCON, and SPHINCS+. NIST will add a few more algorithms to the mix in two years.
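To make the KEM role concrete, the sketch below walks through a basic Kyber key-establishment flow. It assumes the open-source liboqs-python bindings (the oqs module) purely for illustration; it is not drawn from NIST's standard text or from any vendor's implementation.

```python
# Minimal CRYSTALS-Kyber key-establishment sketch, assuming the
# liboqs-python bindings ("oqs" module) are installed.
import oqs

kem_alg = "Kyber768"

# The receiving party generates a Kyber key pair and publishes the public key.
with oqs.KeyEncapsulation(kem_alg) as receiver:
    public_key = receiver.generate_keypair()

    # The sending party encapsulates: it derives a shared secret plus a
    # ciphertext that only the holder of the matching private key can open.
    with oqs.KeyEncapsulation(kem_alg) as sender:
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # The receiver decapsulates the ciphertext with its private key and
    # recovers the same shared secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver
```

The shared secret on both sides would then key a conventional symmetric cipher such as AES for the bulk of the traffic.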
Vadim Lyubashevsky, a cryptographer who works in IBM's Zurich Research Laboratories, contributed to the development of CRYSTALS-Kyber, CRYSTALS-Dilithium, and Falcon. Lyubashevsky was predictably pleased by the algorithms selected, but he had only anticipated NIST would pick two digital signature candidates rather than three.
Ideally, NIST would have chosen a second key establishment algorithm, according to Lyubashevsky. "They could have chosen one more right away just to be safe," he told Dark Reading. "I think some people expected McEliece to be chosen, but maybe NIST decided to hold off for two years to see what the backup should be to Kyber."
After NIST identified the algorithms, IBM moved forward by building them into its recently launched z16 mainframe. IBM introduced the z16 in April, calling it the "first quantum-safe system," enabled by its new Crypto Express 8S card and APIs that provide access to the NIST-selected algorithms.
Because IBM had championed three of the selected algorithms, it had already built them into the z16, which was unveiled before the NIST decision. Last week the company made it official that the z16 supports the algorithms.
Anne Dames, an IBM distinguished engineer who works on the company's z Systems team, explained that the Crypto Express 8S card can implement various cryptographic algorithms, but that IBM was betting on CRYSTALS-Kyber and Dilithium.
"We are very fortunate in that it went in the direction we hoped it would go," she told Dark Reading. "And because we chose to implement CRYSTALS-Kyber and CRYSTALS-Dilithium in the hardware security module, which allows clients to get access to it, the firmware in that hardware security module can be updated. So, if other algorithms were selected, then we would add them to our roadmap for inclusion of those algorithms for the future."
A software library on the system allows application and infrastructure developers to incorporate APIs so that clients can generate quantum-safe digital signatures for both classic computing systems and quantum computers.
"We also have a CRYSTALS-Kyber interface in place so that we can generate a key and provide it wrapped by a Kyber key so that could be used in a potential key exchange scheme," Dames said. "And we've also incorporated some APIs that allow clients to have a key exchange scheme between two parties."
Dames noted that clients might use Dilithium to generate digital signatures on documents. "Think about code signing servers, things like that, or documents signing services, where people would like to actually use the digital signature capability to ensure the authenticity of the document or of the code that's being used," she said.
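As a rough sketch of that code- and document-signing use case, the example below signs and verifies a message with CRYSTALS-Dilithium, the lattice-based signature scheme among the selections. It again assumes the liboqs-python bindings purely for illustration; IBM's Crypto Express 8S and z16 APIs are not shown here.

```python
# Minimal CRYSTALS-Dilithium signing sketch, assuming the liboqs-python
# bindings ("oqs" module); IBM's z16 / Crypto Express 8S APIs are not shown.
import oqs

sig_alg = "Dilithium3"
message = b"firmware-image-or-document-to-sign"  # placeholder artifact

with oqs.Signature(sig_alg) as signer:
    # The signing service generates a key pair and signs the artifact.
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

    # Anyone holding the public key can verify authenticity and integrity.
    with oqs.Signature(sig_alg) as verifier:
        assert verifier.verify(message, signature, public_key)
```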
During Amazon's AWS re:Inforce security conference last week in Boston, the cloud provider emphasized its post-quantum cryptography (PQC) efforts. According to Margaret Salter, director of applied cryptography at AWS, Amazon is already engineering the NIST standards into its services.
During a breakout session on AWS' cryptography efforts at the conference, Salter said AWS had implemented an open source, hybrid post-quantum key exchange in s2n-tls, its implementation of the Transport Layer Security (TLS) protocol used across different AWS services, and has contributed the scheme to the Internet Engineering Task Force (IETF) as a draft standard.
Salter explained that the hybrid key exchange brings together its traditional key exchanges while enabling post-quantum security. "We have regular key exchanges that we've been using for years and years to protect data," she said. "We don't want to get rid of those; we're just going to enhance them by adding a public key exchange on top of it. And using both of those, you have traditional security, plus post quantum security."
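The combination step Salter describes can be sketched independently of s2n-tls: derive the session key from both a classical and a post-quantum shared secret, so the connection stays protected as long as either exchange holds. The example below uses X25519 from the Python cryptography package for the classical half and a placeholder byte string standing in for a Kyber-derived secret; it illustrates the hybrid idea only and is not AWS's implementation.

```python
# Hybrid key-exchange combination step: feed a classical ECDH secret and a
# post-quantum (Kyber) secret into one KDF, so the derived key stays safe
# if either underlying exchange is broken. Not AWS's s2n-tls code; the
# post-quantum secret below is a placeholder for a real Kyber decapsulation.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical half: ordinary X25519 Diffie-Hellman between two parties.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half: stand-in for the shared secret a Kyber KEM would yield.
post_quantum_secret = os.urandom(32)

# Combine both secrets in a single HKDF call to derive the session key.
session_key = HKDF(
    algorithm=hashes.SHA384(),
    length=32,
    salt=None,
    info=b"hybrid-pq-tls-demo",
).derive(classical_secret + post_quantum_secret)

print(f"Derived {len(session_key)}-byte hybrid session key")
```

Because both secrets feed the same key derivation, an attacker would have to break both X25519 and Kyber to recover the session key.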
Last week, Amazon announced that it had deployed hybrid post-quantum TLS with CRYSTALS-Kyber in s2n-tls for connections to the AWS Key Management Service (AWS KMS) and AWS Certificate Manager (ACM). In an update this week, Amazon documented extending that support to AWS Secrets Manager, a service for managing, rotating, and retrieving database credentials and API keys.
While Google didn't make implementation announcements like AWS in the immediate aftermath of NIST's selection, VP and CISO Phil Venables said Google has been focused on PQC algorithms "beyond theoretical implementations" for over a decade. Venables was among several prominent researchers who co-authored a technical paper outlining the urgency of adopting PQC strategies. The peer-reviewed paper was published in May by Nature, a respected journal for the science and technology communities.
"At Google, we're well into a multi-year effort to migrate to post-quantum cryptography that is designed to address both immediate and long-term risks to protect sensitive information," Venables wrote in a blog post published following the NIST announcement. "We have one goal: ensure that Google is PQC ready."
Venables recalled an experiment in 2016 with Chrome where a minimal number of connections from the Web browser to Google servers used a post-quantum key-exchange algorithm alongside the existing elliptic-curve key-exchange algorithm. "By adding a post-quantum algorithm in a hybrid mode with the existing key exchange, we were able to test its implementation without affecting user security," Venables noted.
Google and Cloudflare announced a "wide-scale post-quantum experiment" in 2019 implementing two post-quantum key exchanges, "integrated into Cloudflare's TLS stack, and deployed the implementation on edge servers and in Chrome Canary clients." The experiment helped Google understand the implications of deploying two post-quantum key agreements with TLS.
Venables noted that last year Google tested post-quantum confidentiality in TLS and found that various network products were not compatible with post-quantum TLS. "We were able to work with the vendor so that the issue was fixed in future firmware updates," he said. "By experimenting early, we resolved this issue for future deployments."
The four algorithms NIST announced are an important milestone in advancing PQC, but there's other work to be done besides quantum-safe encryption. The AWS TLS submission to the IETF is one example; others include such efforts as Hybrid PQ VPN.
"What you will see happening is those organizations that work on TLS protocols, or SSH, or VPN type protocols, will now come together and put together proposals which they will evaluate in their communities to determine what's best and which protocols should be updated, how the certificates should be defined, and things like things like that," IBM's Dames said.
Dustin Moody, a mathematician at NIST who leads its PQC project, shared a similar view during a panel discussion at the RSA Conference in June. "There's been a lot of global cooperation with our NIST process, rather than fracturing of the effort and coming up with a lot of different algorithms," Moody said. "We've seen most countries and standards organizations waiting to see what comes out of our nice progress on this process, as well as participating in that. And we see that as a very good sign."
IBM Security has released the annual Cost of a Data Breach Report, revealing costlier and higher-impact data breaches than ever before, with the average cost of a data breach in South Africa reaching an all-time high of R49,25-million for surveyed organisations.
With breach costs increasing nearly 20% over the last two years of the report, the findings suggest that security incidents became more costly and harder to contain compared to the year prior.
The 2022 report revealed that the average time to detect and contain a data breach was at its highest in seven years for organisations in South Africa – taking 247 days (187 to detect, 60 to contain). Companies that contained a breach in under 200 days were revealed to save almost R12-million – while breaches cost organisations R2 650 per lost or stolen record on average.
The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organisations globally between March 2021 and March 2022. The research, which was sponsored and analysed by IBM Security, was conducted by the Ponemon Institute.
“As this year’s report reveals – organisations must adopt the right strategies coupled with the right technologies that can help make all the difference when they are attacked. Businesses today need to continuously look into solutions that reduce complexity and speed up response to cyber threats across the hybrid cloud environment – minimising the impact of attacks,” says Ria Pinto, GM and technology leader at IBM South Africa.
Some of the key findings in the 2022 IBM report include:
* Security Immaturity in Clouds – Organisations studied that had mature security across their cloud environments saw breach costs that were R4-million lower than those that were in the midstage of applying such practices across their organisation.
* Incident Response Testing is a Multi-Million Rand Cost Saver – Organisations with an Incident Response (IR) team saved over R3,4-million, while those that extensively tested their IR plan lowered the cost of a breach by over R2,6-million, the study revealed. The study also found that organisations which deployed security AI or analytics incurred over R2-million less on average in breach costs compared to studied organisations that had not deployed either technology, making them the top mitigating factors shown to reduce the cost of a breach.
* Cloud Misconfiguration, Malicious Insider Attacks and Stolen Credentials are Costliest Breach Causes – Cloud misconfiguration reigned as the costliest cause of a breach (R58,6-million), malicious insider attacks came in second (R55-million) and stolen credentials came in third, leading to R53-million in average breach costs for responding organisations.
* Financial Services organisations experienced the Highest Breach Costs – Financial participants saw the costliest breaches amongst industries with average breach costs reaching a high of R4,9-million per record. This was followed by the industrial sector with losses per record reaching R4,7-million.
Hybrid Cloud Advantage
Globally, the report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organisations studied. Global findings revealed that organisations that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.
The report highlights that 45% of studied breaches globally occurred in the cloud, emphasising the importance of cloud security.
South African businesses studied that had not started to deploy zero trust security practices across their cloud environments suffered losses averaging R56-million. Those in the mature stages of deployment decreased this cost significantly – recording R20-million savings as their total cost of a data breach was found to be R36-million.
The study revealed that more businesses are implementing security practices to protect their cloud environments, lowering breach costs, with 44% of reporting organisations stating their zero trust deployment is in the mature stage and another 42% revealing they are in the midstage.
US – IBM’s Watson Health business has completed the sale of its healthcare data and analytics assets to global investment firm Francisco Partners.
The assets will be included in a new standalone company under the ownership of Francisco called Merative, which will be headquartered in Ann Arbor, Michigan, US.
Merative will work in areas such as life sciences, provider, imaging, health plan, employer and government health and human services.
Gerry McCarthy has been hired to lead Merative, having most recently been chief executive of eSolutions, a Francisco Partners company that was sold to Waystar in October 2020.
Before eSolutions, McCarthy was president of TransUnion Healthcare and an executive at McKesson.
Paul Roma, general manager of the Watson Health business, will become senior adviser to Francisco Partners.
Merative’s products will be organised into six ‘product families’, including Health Insights; MarketScan; Clinical Development; Social Programme Management and Phytel; Micromedex; and Merge Imaging solutions.
Francisco Partners said its investment would provide Merative with significant resources and opportunities for new investment, acquisitions, partnerships, and growth.
Ezra Perlman and Justin Chen of Francisco Partners said: “Francisco Partners is excited about the opportunity to partner with the team and employees at Merative to help them achieve their mission, bringing technology and expertise to clients across healthcare through industry-leading data and analytics solutions.
“We appreciate IBM’s work in developing this business, and our ownership will help Merative drive crucial focus in executing on organic and inorganic growth strategies.”
BOSTON--(BUSINESS WIRE)--Aug 3, 2022--
Astadia is pleased to announce the release of a new series of Mainframe-to-Cloud reference architecture guides. The documents cover how to refactor IBM mainframes applications to Microsoft Azure, Amazon Web Services (AWS), Google Cloud, and Oracle Cloud Infrastructure (OCI). The documents offer a deep dive into the migration process to all major target cloud platforms using Astadia’s FastTrack software platform and methodology.
As enterprises and government agencies are under pressure to modernize their IT environments and make them more agile, scalable and cost-efficient, refactoring mainframe applications in the cloud is recognized as one of the most efficient and fastest modernization solutions. By making the guides available, Astadia equips business and IT professionals with a step-by-step approach on how to refactor mission-critical business systems and benefit from highly automated code transformation, data conversion and testing to reduce costs, risks and timeframes in mainframe migration projects.
“Understanding all aspects of legacy application modernization and having access to the most performant solutions is crucial to accelerating digital transformation,” said Scott G. Silk, Chairman and CEO. “More and more organizations are choosing to refactor mainframe applications to the cloud. These guides are meant to assist their teams in transitioning fast and safely by benefiting from Astadia’s expertise, software tools, partnerships, and technology coverage in mainframe-to-cloud migrations,” said Mr. Silk.
The new guides are part of Astadia’s free Mainframe-to-Cloud Modernization series, an ample collection of guides covering various mainframe migration options, technologies, and cloud platforms. The series covers IBM (NYSE:IBM) Mainframes.
In addition to the reference architecture diagrams, these comprehensive guides include various techniques and methodologies that may be used in forming a complete and effective Legacy Modernization plan. The documents analyze the important role of the mainframe platform, and how to preserve previous investments in information systems when transitioning to the cloud.
In each of the IBM Mainframe Reference Architecture white papers, readers will explore:
The guides are available for download here:
To access more mainframe modernization resources, visit the Astadia learning center on www.astadia.com.
About Astadia
Astadia is the market-leading software-enabled mainframe migration company, specializing in moving IBM and Unisys mainframe applications and databases to distributed and cloud platforms in unprecedented timeframes. With more than 30 years of experience, and over 300 mainframe migrations completed, enterprises and government organizations choose Astadia for its deep expertise, range of technologies, and the ability to automate complex migrations, as well as testing at scale. Learn more on www.astadia.com.
CONTACT: Wilson Rains, Chief Revenue Officer
Wilson.Rains@astadia.com
+1.877.727.8234
As with many goods and services, healthcare has not been immune to inflationary pressures. U.S. health systems faced a combination of rising supply and labor expenses in recent months, according to Healthcare Dive, even as patient volumes have increased. As a result, providers are likely to pass on increased costs to commercial insurers during upcoming contract negotiations, Fierce Healthcare reported last week.
But from the employer perspective, employee benefits programs — including health benefits — remain a key component of talent management in a difficult hiring market, according to a McKinsey & Co. report from May. The consultancy also found that many employers have turned to high-deductible health plans (HDHPs), among other strategies, as a way to address rising costs.
Yet increasing employee deductibles creates a “fundamental tension” between employers’ dual goals of securing workers’ well-being and controlling costs, according to EBRI.
“On the one hand, employers are more frequently implementing financial wellness programs as a means to improve their employees’ financial wellbeing,” Jake Spiegel, research associate at EBRI, said in the statement. “On the other hand, in an effort to wrangle health care cost increases, employers often turn to raising their health plan’s deductible, potentially offsetting the positive impact of any financial wellness initiatives.”
In their report, EBRI’s researchers also noted the role that health savings accounts, which may be offered in conjunction with an HDHP, may play in balancing increased out-of-pocket costs. Those enrolled in an HSA-eligible HDHP may be able to cover those costs using HSA contributions made by themselves and their employers. A previous EBRI report highlighted the role that pre-deductible coverage of chronic condition medications may play in HSA-eligible plans.
Aside from increasing patient deductibles, there are a variety of other cost-saving healthcare measures employers may seek. An executive for insurer NFP previously told HR Dive that these options can include care navigation services, virtual care options and value-based care arrangements, among others.