Cybersecurity - what’s the real cost? Ask IBM

Cybersecurity has always been a concern for every type of organization. Even in normal times, a major breach is more than just the data economy’s equivalent of a ram-raid on Fort Knox; it has knock-on effects on trust, reputation, confidence, and the viability of some technologies. This is what IBM calls the “haunting effect”.

A successful attack breeds more, of course, both on the same organization again, and on others in similar businesses, or in those that use the same compromised systems. The unspoken effect of this is rising costs for everyone, as all enterprises are forced to spend money and time on checking if they have been affected too.

But in our new world of COVID-19, disrupted economies, climate change, remote working, soaring inflation, and looming recession, all such effects are amplified. Throw in a war that’s hammering on Europe’s door (with political echoes across the Middle East and Asia) and it’s a wonder any of us can get out of bed in the morning.

So, what are the real costs of a successful cyberattack – not just hacks, viruses, and Trojans, but also phishing, ransomware, and concerted campaigns against supply chains and code repositories?

According to IBM’s latest annual survey, breach costs have risen by an unlucky 13% over the past two years, as attackers, which include hostile states, have probed the systemic and operational weaknesses exposed by the pandemic.

The global average cost of a data breach has reached an all-time high of $4.35 million – at least, among the 550 organizations surveyed by the Ponemon Institute for IBM Security over the year from March 2021 to March 2022. Indeed, IBM goes so far as to claim that breaches may be contributing to the rising costs of goods and services. The survey states:

Sixty percent of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.

Incidents are also “haunting” organizations, says the company, with 83% having experienced more than one data breach, and with 50% of costs occurring more than a year after the successful attack.

Cloud maturity is a key factor, adds the report:

Forty-three percent of studied organizations are in the early stages [of cloud adoption] or have not started applying security practices across their cloud environments, observing over $660,000 in higher breach costs, on average, than studied organizations with mature security across their cloud environments.

Forty-five percent of respondents run a hybrid cloud infrastructure. This leads to lower average breach costs than among those operating a public- or private-cloud model: $3.8 million versus $5.02 million (public) and $4.24 million (private).

That said, those are still significant costs, and may suggest that complexity is what deters attackers, rather than having a single target to hit. Nonetheless, hybrid cloud adopters are able to identify and contain data breaches 15 days faster on average, says the report.

However, with 277 days being the average time lag – an extraordinary figure – the real lesson may be that today’s enterprise systems are adept at hiding security breaches, which may appear as normal network traffic. Forty-five percent of breaches occurred in the cloud, says the report, so it is clearly imperative to get on top of security in that domain.

IBM then makes the following bold claim:

Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.

Whether this finding will stand for long as attackers explore new ways to breach automated and/or AI-based systems – and perhaps automate attacks of their own invisibly – remains to be seen. Compromised digital employee, anyone?

Global systems at risk

But perhaps the most telling finding is that cybersecurity has a political dimension – beyond the obvious one of Russian, Chinese, North Korean, or Iranian state incursions, of course.

Concerns over critical infrastructure and global supply chains are rising, with threat actors seeking to disrupt global systems that include financial services, industrial, transportation, and healthcare companies, among others.

A year ago in the US, the Biden administration issued an Executive Order on cybersecurity that focused on the urgent need for zero-trust systems. Despite this, only 21% of critical infrastructure organizations have so far adopted a zero-trust security model, according to the report. It states:

Almost 80% of the critical infrastructure organizations studied don’t adopt zero-trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. All while 28% of breaches among these organizations were ransomware or destructive attacks.

Add to that, 17% of breaches at critical infrastructure organizations were caused by a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.

That aside, one of the big stories over the past couple of years has been the rise of ransomware: malicious code that locks up data, enterprise systems, or individual computers, forcing users to pay a ransom to (they hope) retrieve their systems or data.

But according to IBM, there are no obvious winners or losers in this insidious practice. The report adds:

Businesses that paid threat actors’ ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid.

However, when accounting for the average ransom payment – which according to Sophos reached $812,000 in 2021 – businesses that opt to pay the ransom could net higher total costs, all while inadvertently funding future ransomware attacks.
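
Running the raw numbers from those two reports together makes the trade-off concrete. This is a back-of-the-envelope sketch only: the variable names are ours, and the figures are averages, not a prediction for any single incident.

```python
# Back-of-the-envelope comparison using the averages quoted above:
# IBM's $610,000 average saving for organizations that paid the ransom,
# versus Sophos's $812,000 average ransom payment in 2021.
avg_saving_if_paid = 610_000    # lower average breach cost observed for payers
avg_ransom_payment = 812_000    # average ransom actually handed over

net_extra_cost_of_paying = avg_ransom_payment - avg_saving_if_paid
print(f"Average net extra cost of paying: ${net_extra_cost_of_paying:,}")
# Average net extra cost of paying: $202,000
```

On those averages, paying leaves a business roughly $200,000 worse off, before counting the knock-on effect of funding the next wave of attacks.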

The persistence of ransomware is fuelled by what IBM calls the “industrialization of cybercrime”.

The risk profile is also changing. Ransomware attack times show a massive drop of 94% over the past three years, from over two months to just under four days. Good news? Not at all, says the report, as the attacks may be higher impact, with more immediate consequences (such as destroyed data, or private data being made public on hacker forums).

My take

The key lesson in cybersecurity today is that all of us are both upstream and downstream from partners, suppliers, and customers in today’s extended enterprises. We are also at the mercy of reused but compromised code from trusted repositories, and even sometimes from hardware that has been compromised at source.

So, what is the answer? Businesses should ensure that their incident responses are tested rigorously and frequently in advance – along with using red-, blue-, or purple-team approaches (thinking like a hacker, a defender, or both).

Regrettably, IBM says that 37% of organizations that have IR plans in place fail to test them regularly. To paraphrase Spinal Tap, you can’t code for stupid.

Wed, 27 Jul 2022 12:00:00 -0500 BRAINSUM en text/html https://diginomica.com/cybersecurity-whats-real-cost-ask-ibm
Beacon Leadership Council

Vincent Caprio founded the Water Innovations Alliance Foundation (WIAF) in October 2008. In this role he created the Water 2.0 Conference series of which he is currently the Chairman Emeritus. As an early advocate for nanotechnology, Mr. Caprio is the Founder and Chairman Emeritus of the NanoBusiness Commercialization Association (NanoBCA). In 2002, he launched the highly successful NanoBusiness Conference series, now in its 19th year. 

A pioneer at the intersection of business and technology, Vincent Caprio possesses a unique ability to spot emerging and societally significant technologies in their early stages. He successfully creates brands and business organizations focused on specific technology markets, and launches events that not only educate, but also connect and empower stakeholders that include investors, technologists, CEOs and politicians. 

It is Mr. Caprio’s avid interest in history and background in finance that enabled him to be among the first to recognize the impact that specific technologies will have on business and society. By building community networks centered around his conferences, he has facilitated the growth of important new technologies, including nanotechnology, clean water technology and most recently, engineering software. 

Mr. Caprio is also one of the foremost advocates for government funding of emerging technology at both the State and Federal levels. He has testified before Congress, EPA, Office of Science and Technology Policy (OSTP), as well as the state legislatures of New York and Connecticut, and has been an invited speaker at over 100 events. Mr. Caprio has also organized public policy tours in Washington, DC, educating politicians about emerging tech through meetings with high-level technology executives. 

In the events sector, Mr. Caprio served as the Event Director who launched The Emerging Technologies Conference in association with MIT’s Technology Review Magazine. He also acted as a consultant to the leading emerging technology research and advisory firm Lux Research, for its Lux Executive Summit in 2005 & 2006. In 2002, Mr. Caprio served as the Event Director and Program Director of the Forbes/IBM Executive Summit. 

Prior to founding the NanoBCA, Mr. Caprio was Event Director for Red Herring Conferences, producing the company’s Venture Market conferences and Annual Summit reporting to Red Herring Magazine Founder and Publisher Tony Perkins, and Editor, Jason Pontin. His industry peers have formally recognized Mr. Caprio on several occasions for his talents in both tradeshow and conference management. 

Mr. Caprio was named Sales Executive of the Year in 1994 while employed with Reed Exhibitions, and was further honored with three Pathfinder Awards in 1995 for launching The New York Restaurant Show, Buildings Chicago and Buildings LA. 

Prior to joining Reed Elsevier’s office of the Controller in 1989, Mr. Caprio was employed at Henry Charles Wainwright investment group as a Senior Tax Accountant. In the 1980s, he specialized in the preparation of 1120, 1065 and 1040 tax forms, and was also employed with the Internal Revenue Service from 1979 to 1981. 

During the past 10 years, Mr. Caprio has been involved in numerous nonprofit philanthropic activities including: Fabricators & Manufacturers Association (FMA), Easton Learning Foundation, Easton Community Center, Easton Racquet Club, First Presbyterian Church of Fairfield, Omni Nano, FBI Citizen’s Academy, Villanova Alumni Recruitment Network and Easton Exchange Club. 

Mr. Caprio graduated from Villanova University with a Bachelor of Science in Accounting/MIS from the Villanova School of Business. He received an MBA/MPA from Fairleigh Dickinson University. 

In the spring of 2015, Mr. Caprio was appointed to Wichita State University's Applied Technology Acceleration Institute (ATAI) as a water and energy expert. In 2017 he was named Program Director of the Center for Digital Transformation at Pfeiffer University. Mr. Caprio was elected in November 2016 and serves as the Easton, Connecticut Registrar of Voters. 

IBM Annual Cost of Data Breach Report 2022: Record Costs Usually Passed On to Consumers, “Long Breach” Expenses Make Up Half of Total Damage

IBM’s annual Cost of Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid,” expenses that are realized more than a year after the attack.

Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.

Security AI and automation greatly reduces expected damage

The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.

Though the average cost of a data breach is up, it is only by about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cyber crime seen during the pandemic years.

Organizations are also increasingly not opting to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.

Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”

Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.

Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”

Rising cost of data breach not necessarily prompting dramatic security action

In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is lagging as well, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.

Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent number at $812,000 globally.

The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.

Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.

Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”

Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.


Cutting the cost of data breach

Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.

Making the DevOps Pipeline Transparent and Governable


Transcript

Shane Hastie: Good day folks. This is Shane Hastie for the InfoQ Engineering Culture podcast. Today, I'm sitting down with David Williams from Quali. David, welcome. Thanks for taking the time to talk to us today.

David Williams: Thanks, Shane. It's great to be here.

Shane Hastie: Probably my first starting point for most of these conversations is who's David?

Introductions [00:23]

David Williams: David, he's a pretty boring character, really. He's been in the IT industry all his life, so there's only so many parties you can go and entertain people with that subject now. But I've been working since I first left school. My first jobs were working in IT operations in a number of financial companies. I started at the back end. For those of you who want to know how old I was, I remember a time when printing was a thing. And so decollating was my job, carrying tapes, separating printout, doing those sort of things. So really I got a very grassroots level of understanding about what technology was all about, and it was nowhere near as glamorous as I'd been led to believe. So I started off, I'd say, working operations. I've worked my way through computer operations, systems administration, network operations. So I used to be part of a NOC team, customer support.

David Williams: I did that sort of path, from as low as you can get on the ladder to arguably about a rung above. And then what happened over that period of time was I worked a lot with distributed systems, lights-out computing scenarios, et cetera, and it enabled me to get more involved in some of the development work that was being done, specifically to manage these new environments, specifically mesh computing, clusters, et cetera. How do you move workloads around dynamically and how does the operating system become much more aware of what it's doing and why? Because obviously, it just sees them as workloads but needed to be smarter. So I got into development that way, really. I worked for Digital Equipment in its heyday, working on clusters and part of the team that was doing the operating system work. And so that, combined with my knowledge of how people were using the tech, being one of the people that was once an operations person, it enabled me as a developer to have a little bit of a different view on what needed to be done.

And that's what really motivated me to excel in that area, because I wanted to make sure that a lot of the things that were being built could be built in support of making operations simpler, making what was going on more accountable to the business, and enabling the services to be a little more transparent in how IT was using them. Throughout my career, luckily for me, the tech industry reinvents itself in a very similar way every seven years. So I just have to wait seven years to look like one of the smart guys again. So that's how I really got into it from the get go.

Shane Hastie: So that developer experience is what today we'd call thinking about making things better for developers. What are the key elements of this developer experience for us?

The complexity in the developer role today [02:54]

David Williams: When I was in development, the main criteria that I was really responsible for was time. It was around time and production rates. I really had no clue why I was developing the software. Obviously, I knew what application I was working on and I knew what it was, but I never really saw the results. So over the years, I wasn't doing it for a great amount of time, to be honest with you. Because when I started looking at what needed to be done, I moved quite quickly from being a developer into being a product manager, which by the way, if you go from development to product management, it's not exactly a smooth path. But I think it was something that enabled me to be a better product manager at the time, because then I understood the operations aspects, I was a developer and I understood what it was that made the developer tick because that's why I did it.

It was a great job to create something and work on it and actually show the results. And I think over the years, it enabled me to look at the product differently. And I think that as a developer today, what developers do today is radically more advanced than what I was expected to do. I did not have continuous delivery. I did not really have continuous feedback. I did not have the responsibility for testing whilst developing. So there was no combined thing. It was very segmented and siloed. And I think over the years, I've seen what I used to do as an art form become extremely sophisticated with a lot more requirements of it than were there. And I think for my career, I was a VP of Products at IBM Tivoli, I was a CTO at BMC Software, and I worked for CA Technologies prior to its acquisition by Broadcom, where I was the Senior Vice President of Product Strategy.

But in all those jobs, it enabled me to really understand the value of the development practices and how these practices can be really honed in support of both the products and the IT operations world, as well as, really more than anything else, the connection between the developer and the consumer. That was never part of my role. I had no clue who was using my product. And as an operations person, I only knew the people that were unhappy. So I think today's developers tend to be highly skilled in a way that I was not, because coding is part of their role. Communication, collaboration, the integration, the cloud computing aspects, everything that you have to now include from an infrastructure standpoint is of significantly greater complexity. And I'll summarize by saying that I was also an analyst for Gartner for many years and I covered the DevOps toolchains.

And the one thing I found out there was there isn't a thing called DevOps that you can put into a box. It's very much based upon a culture and a type of company that you're with. So everybody had their interpretation of their box. But one thing was very common, the complexity in all cases was significantly high and growing to the point where the way that you provision and deliver the infrastructure in support of the code you're building, became much more of a frontline job than something that you could accept as being a piece of your role. It became a big part of your role. And that's what really drove me towards joining Quali, because this company is dealing with something that I found as being an inhibitor to my productivity, both as a developer, but also when I was also looking up at the products, I found that trying to work out what the infrastructure was doing in support of what the code was doing was a real nightmare.

Shane Hastie: Let's explore that. To step back a little bit, you made the point about DevOps as a culture. What are the key cultural elements that need to be in place for DevOps to be effective in an organization?

The elements of DevOps culture [06:28]

David Williams: Yeah, this is a good one. When DevOps was an egg, it really was an approach that was radically different from the norm. And what I mean, obviously for people that remember it back then, it was the continuous... Had nothing to do with Agile. It was really about continuous delivery of software into the environment in small chunks, microservices coming up. It was delivering very specific pieces of code into the infrastructure, continuously, evaluating the impact of that release and then making adjustments and change in respect to the feedback that gave you. So the fail forward thing was very much an accepted behavior, what it didn't do at the time, and it sort of glossed over it a bit, was it did remove a lot of the compliance and regulatory type of mandatory things that people would use in the more traditional ways of developing and delivering code, but it was a fledging practice.

And from that base form, it became a much, much bigger one. So really what that culturally meant was initially it was many, many small teams working in combination towards a bigger outcome, whether it was stories in support of epics or whatever the response was. But I find today, it has a much bigger play because now it does have Agile as an inherent construct within the DevOps procedures, so you've got the ability to do teamwork and collaboration and all the things that Agile defines, but you've also got the continuous delivery part of that added on top, which means that at any moment in time, you're continually putting out updates and changes and then measuring the impact. And I think today's challenge is really that the feedback loop isn't as clear as it used to be because people are starting to use it for serious application delivery now.

The consumer, which used to be the primary recipient, the lamp stacks that used to be built out there have now moved into the back end type of tech. And at that point, it gets very complex. So I think that the complexity of the pipeline is something that the DevOps team needs to work on, which means that even though collaboration and people working closely together, it's a no brainer in no matter what you're doing, to be honest. But I think that the ability to understand and have a focused understanding of the outcome objective, no matter who you are in the DevOps pipeline, that you understand what you're doing and why it is, and everybody that's in that team understands their contribution, irrespective of whether they talk to each other, I think is really important, which means that technology supporting that needs to have context.

I need to understand what the people around me have done to the code. I need to know what stage it's in. I need to understand where it came from and who I pass it to. So all that needs to be not just the cultural thing; the technology itself also needs to adhere to that type of practice.

Shane Hastie: One of the challenges or one of the pushbacks we often hear about is the lack of governance or the lack of transparency for governance in the DevOps space. How do we overcome that?

Governance in DevOps [09:29]

David Williams: The whole approach of DevOps, initially, was to think about things in small increments, the bigger objective, obviously, being the clarity. But the increments were to provide lots and lots of enhancements and advances. When you fragment in that way and give the developer the ability to make choices on how they both code and provision infrastructure, it doesn't necessarily lead to things being insecure or not governed, but it means that there's different security and different governance within a pipeline. So where the teams are working quite closely together, that may not automatically move if you've still got your different testing team. So if your testing is not part of your development code, which in some cases it is, some cases it isn't, and you move from one set of infrastructure, for example, that supports the code to another one, they might be using a completely different set of tooling.

They might have different ways with which to measure the governance. They might have different guardrails, obviously, and everything needs to be accountable to change, because financial organizations, in fact, most organizations today, have compliance regulations that say any change to any production or non-production environment, in most cases, requires accountability. And so if you're not reporting in a, say, consistent way, it makes the job of understanding what's going on in support of compliance and governance really difficult. So it really requires governance to be a much more abstract, but end to end thing, as opposed to each individual stage having its own practices. So governance today is starting to move to a point where one person needs to see the end to end pipeline and understand what exactly is going on. Who is doing what, where and how? Who has permissions and access? What are the configurations that are changing?

Shane Hastie: Sounds easy, but I suspect there's a whole lot of... Again, coming back to the culture, we're constraining things that for a long time, we were deliberately releasing.

Providing freedom within governance constraints [11:27]

David Williams: This is a challenge. When I was a developer I had my choice of tools, and it's applicable today. When I heard the word abstract, it put the fear of God into me, to be honest with you. I hated the word abstract. I didn't want anything that made my life worse. I mean, being accountable was fine. It was the same when I used to hear the word frameworks; I remember even balking at the idea of a technology that brought all my coding environment into one specific view. So today, nothing's changed. A developer has got to be able to use the tools that they want to use, and I think that the reason for that is that with the amount of skills that people have, we're going to have to, as an industry, get used to the fact that people have different skills and different focuses and different preferences of technology.

And so to actually mandate a specific way of doing something, or implement a governance engine that inhibits my ability to innovate, is counterproductive. It needs to have that balance. You need to be able to have innovation, freedom of choice, and the ability to use the technology in the way that you need to build the code. But you also need to be able to provide the accountability to the overall objective, so you need to have that end to end view on what you're doing. So as you are part of a team, each team member should have responsibility for it, and you need to be able to provide the business with the things that it needs to make sure that nothing goes awry and that nothing has been breached. So no security issues occurring, no configurations going untracked. So how do you do that?

Transparency through tooling [12:54]

David Williams: And as I said, that's what drove me towards Quali, because as a company, the philosophy was very much on the infrastructure. But when I spoke to the CEO of the company, we had a conversation prior to my employment here, based upon my prior employer, which was a company that was developing toolchain products to help developers and to help people release into production. And the biggest challenge that we had there was really understanding what the infrastructure was doing and the governance that was being put upon those pieces. So think about it as you being a train, but having no clue about what gauge the track is at any moment in time. And you had to put an awful lot of effort into working out what is being done underneath the hood. So what I'm saying is that there needed to be something that did that magic thing.

It enabled you with a freedom of choice, captured your freedom of choice, translated it into a way that adhered it to a set of common governance engines without inhibiting your ability to work, but also provided visibility to the business to do governance and cost control and things that you can do when you take disparate complexity, translate it and model it, and then actually provide that consistency view to the higher level organizations that enable you to prove that you are meeting all the compliance and governance rules.

Shane Hastie: Really important stuff there, but what are the challenges? How do we address this?

The challenges of complexity [14:21]

David Williams: See, the ability to address it starts with really understanding why the problems are occurring. Because if you talk to a lot of developers today and say, “How difficult is your life and what are the issues?”, the conversation you'll have with a developer is completely different than the conversation you'll have with a DevOps team lead or a business unit manager, in regards to how they see applications being delivered and coded. So at the developer level, think about the tools that are being developed today, the infrastructure providers, for example: the application dictates what it needs. It's no longer, I will build an infrastructure and then you will layer the applications on like you used to be able to do. Now what happens is applications and the way that they behave is actually defining where you need to put the app, and the tools that are used to both create it and manage it from the Dev and the Op side.

So really, the understanding is, okay, that's the complexity. So you've got infrastructure providers, the clouds, so you've got different clouds. And no matter what you say, they're all different. In fact, serverless, classic adoption of serverless, is very proprietary in nature. You can't just move one serverless environment from one to another. I'm sure there'll be a time when you might be able to do that, but today it's extremely proprietary. So you've got the infrastructure providers. Then you've got the people that are at the top layer. So you've got the infrastructure technology layer. And that means that on top of that, you're going to have VMs or containers or serverless, something that sits on your cloud. And that again is defined by what the application needs, in respect to portability, where it lives, whether it lives in the cloud or partly at the edge, wherever you want to put it.

And then of course on top of that, you've got all the things that you can use that enable you to instrument and code to those things. So you've got things like Helm charts for containers, and you've got Terraform for developing the infrastructure-as-code pieces, or you might be using Puppet or Chef or Ansible. So you've got lots of tools out there, including all the other tools from the service providers themselves. So you've got a lot of the instrumentation. And so you've got that stack. So the skills you've got, so you've got the application defining what you want to do, and the developer chooses how they use it in support of the application outcome. So really what you want to be able to do is have something that has a control plane view that says, okay, you can do whatever you want.

Visibility into the pipeline [16:36]

David Williams: These are the skills that you need. But if people leave, what do you do? Do you go and get all the other developers to try and debug and translate what the coding did? Wouldn't it be cool instead to have a set of tech that lets you understand what the different platform configuration tools did and how they applied, so you can look at it in a much more consistent form? It doesn't stop them using what they want, but the layer basically says, "I know, I've discovered what you're using. I've translated how it's used, and I'm now enabling you to model it in a way that enables everybody to use it." So the skills thing is always going to exist. The turnover of people is, I would say, even more damaging than the skills gap, because people come and go quite freely today. It's the way that the market is.

And then there's the accountability. What do the tools do and why do they do it? So you really want to also deal with the governance piece that we mentioned earlier on, and you also want to provide context. And I think that the thing that's missing when you build infrastructure as code and you do all these other things is, even though you know why you're building it and you know what it does, when you're going to have a conversation with the DevOps lead and the business unit manager, wouldn't it be cool if they could actually work out that what you did is in support of what they need? So it has the application ownership pieces, for example, a business owner. These are the things that provide context. So as each piece of infrastructure is developed through the toolchain, it adds context, and the context is consistent.

So as the environments are moved in a consistent way, you actually have context that says this was planned, this was developed, and this is what it was done for. This is how it was tested. I'm now going to leverage everything that the developer did, but now add my testing tools on top. And I'm going to move that on with the context. I'm now going to take the technology on until I deploy it, releasing it into either further testing or production. But the point is that as things get provisioned, whether you are using different tools at different stages, or whether you are using different platforms with which to develop and then test and then release, you should have some view that says all these things are the same thing in support of the business outcome, and that is all to do with context. So again, why I joined Quali was because it provides models that provide that context, and I think context is very important and it's not always mentioned.
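
To make that travelling context concrete, here is a minimal sketch in Python. It is illustrative only: the class, fields, and values are invented for this example, not Quali's actual API.

```python
# Minimal sketch: context that travels with an environment through the pipeline.
# All names here are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class EnvironmentContext:
    application: str                 # the business outcome this environment supports
    owner: str                       # the accountable business/application owner
    stage: str                       # e.g. "plan", "dev", "test", "release"
    history: list = field(default_factory=list)

    def promote(self, new_stage: str) -> None:
        """Move to the next stage while keeping an auditable trail."""
        self.history.append(self.stage)
        self.stage = new_stage

env = EnvironmentContext(application="payments-api", owner="retail-banking", stage="dev")
env.promote("test")
env.promote("release")
print(env.stage, env.history)        # release ['dev', 'test']
```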

As a coder, I used to write lots and lots of things in the code that gave people a clue on what I was doing. I used to have revision numbers. But outside of that and what I did to modify the code within a set of files, I really didn't have anything about what business it was supporting. And I think today, with the fragmentation that exists, you've got to give people clues on why infrastructure is being deployed, used, and retired, and it needs to be done in a life cycle, because you don't want dormant infrastructure sitting out there. So you've got to have it accountable, and that's where the governance comes in. So the one thing I didn't mention earlier on was you've got to have the ability to work out what you're using, why it's being used, and why it is out there absorbing capacity and compute, costing me money, and yet no one seems to be using it.

Accountability and consistency without constraining creativity and innovation [19:39]

David Williams: So you want to have that accountability, and with context in it, that at least gives you information that you can relay back to the business to say, "This is what it cost to actually develop the full life cycle of our app, in that particular stage of the development cycle." So it sounds very complex because it is, but the way to simplify it is really to not abstract it, but consume it. So you discover it, you work out what's going on, and you create a layer of technology that can actually provide consistent costing, through consistent tagging, which you can do with the governance, consistent governance, so you're actually measuring things in the same way, and you're providing consistency through the applications layer. So you're saying all these things happen in support of these applications, et cetera. So if issues occur, bugs occur, when it reports itself integrated with the service management tools, suddenly what you have there is a problem that's reported in response to an application, to a release specific to an application, which then associates itself with a service level, which enables you to actually do reporting and remediation that much more efficiently.
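
A similarly minimal sketch of what consistent tagging buys you: once every resource carries the same application tag, per-application cost reporting becomes a one-pass roll-up. The resources and figures below are invented for illustration.

```python
# With consistent tags, per-application cost reporting is a simple roll-up.
from collections import defaultdict

resources = [
    {"id": "vm-1", "application": "payments-api", "stage": "test",    "monthly_cost": 420.0},
    {"id": "db-1", "application": "payments-api", "stage": "release", "monthly_cost": 910.0},
    {"id": "vm-7", "application": "loyalty-app",  "stage": "dev",     "monthly_cost": 180.0},
]

totals = defaultdict(float)
for r in resources:
    totals[r["application"]] += r["monthly_cost"]

for app, total in sorted(totals.items()):
    print(f"{app}: ${total:,.2f}/month")
# loyalty-app: $180.00/month
# payments-api: $1,330.00/month
```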

So that's where I think we're really going: the skills are always going to be fragmented, and you shouldn't inhibit people doing what they need. And I think the last thing I'd mention is you should have the infrastructure delivered in the way you want it. So you've got CLIs, if that's a preferred way, APIs to call it if you want to. But for those who don't have the skills, it's not a developer-only world. If I'm at an abstraction layer and I'm more of an operations person, or someone that doesn't have the deep coding skills, I should be able to see a catalog of available environments built by code, built by the people that actually have that skill. And I should be able to, in a single click, provision an environment in support of an application requirement without being a coder.

So that means that you can actually share things. So coders can code, and that captures the environment. If that environment is needed by someone that doesn't have the skills, then because it's consistent and has all that information in it, I can hit a click, it goes and provisions that infrastructure, and I haven't touched code at all. So that's how you see the skills being leveraged. And you've just got to accept the fact that people will be transient going forward. They will work from company to company, project to project, and their skills will be diverse, but you've got to provide a layer with which that doesn't matter.
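
A toy sketch of that publish-and-consume split, again with invented names rather than any real product API:

```python
# Toy sketch: coders publish environment blueprints; everyone else launches
# them without touching the underlying infrastructure-as-code.
catalog = {}

def publish(name, provision_fn):
    """A developer registers an environment definition in the shared catalog."""
    catalog[name] = provision_fn

def launch(name):
    """A non-coder provisions an environment with a single call ("one click")."""
    return catalog[name]()

# The lambda stands in for real provisioning logic (Terraform, Helm, etc.).
publish("payments-api-test-env", lambda: "environment-id-0042")
print(launch("payments-api-test-env"))   # environment-id-0042
```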

Shane Hastie: Thank you very much. If people want to continue the conversation, where do they find you?

David Williams: They can find me in a number of places. I think the best place is I'm at Quali. It is David.W@Quali.com. I'm the only David W., which is a good thing, so you'll find me very easily. Unlike on a plane I got on the other day, where I was the third David Williams on the plane, the only one not to get an upgrade. So that's where you can find me. I'm also on LinkedIn; Dave Williams on LinkedIn can be found under Quali and all the companies that I've spoken to you about. So as I say, I'm pretty easy to find. And I would encourage, by the way, anybody to reach out to me if they have any questions about what I've said. It'd be a great conversation.

Shane Hastie: Thanks, David. We really appreciate it.

David Williams: Thank you, Shane.


Unleashing Breakthrough Innovation in Government


The innovators who shake up industries the most do so by reimagining how things should look from the ground up. Apple co-founder Steve Jobs imagined a world where everyone owned a computer, not just the corporations that could afford an IBM mainframe. Twitter cofounder Evan Williams imagined a world where everyone could publish content on the Internet, not just the media companies who could afford expensive Web publishing programs.

Incremental innovations occur everywhere, but breakthrough innovations—the kind that leverage new technologies and business models to drive down costs, increase accessibility, and improve services—have tended to remain the province of the private sector. Return-seeking investors and entrepreneurs reap the financial rewards of changing the world by tearing down the structures of old industries.

Fortunately, that type of innovation is beginning to trickle into government as well. Leaders inside the public sector are slowly learning to pursue these major breakthroughs without the benefit of the profit motives that drive entrepreneurs elsewhere. Take, for instance, an innovation pursued in the US capital by the District of Columbia Department of Transportation (DDOT). The agency envisioned a time when the city would no longer need traditional, coin-operated parking meters and the expensive employees required to collect the coins. In its place, DDOT would create a system in which people simply hit a “pay-my-meter” button on their Internet-connected phones. The system would be easier for drivers to use (no more carrying around bags of change) and less expensive for the city to operate. Despite these obvious benefits, the improvement would also require the city to migrate away from an established system that employed many workers and relied on existing infrastructure—the type of situation that has long made it difficult to implement innovations in the public sector. To the surprise of many, DDOT’s two-year endeavor was successful. In a sector known for special interests, unions, and a lack of competition, the agency successfully pioneered a model that embraced new technology to improve convenience for citizens and drive down costs for the city.

All too often, this kind of success has not been the outcome. Many citizens believe that the public sector is incapable of such innovation because of the absence of competitive forces, lack of incentives for employees, and excessive red tape. And ordinary citizens are not alone in their concern. Government leaders and employees are quick to point toward systemic problems such as outmoded human resources systems, a budgeting process that rewards extraordinary performance by reducing future resources, and burdensome request for proposal (RFP) systems as explanations for their lack of change.

For many reasons, this sorry state of public sector innovation cannot stand. The US economy has stagnated for nearly four years. In 2012, gross domestic product (GDP) was $15.7 trillion, having grown only 0.6 percent in inflation-adjusted terms since 2007. Similar economic conditions exist throughout much of the developed world. At the same time that public leaders struggle to find a means to spur growth, municipal and state governments hurtle toward fiscal crises of unparalleled proportions, carrying billions in unfunded debt obligations. During this time of adversity, government, a sector that accounts for 24 percent of US GDP and one-sixth of employment, needs to be a solution to our problems—not one of the sources.1

Over the past year, our research group at Harvard Business School led an effort to discover how to empower public leaders to drive out unnecessary costs where possible, freeing up capital to help spur economic growth. Our group—supported by contributions from research groups at Harvard Kennedy School, various municipalities, and the Office of Science and Technology Policy of the White House—surveyed hundreds of government initiatives, interviewed public sector innovators, collaborated with academics across the country, and convened some of the brightest minds in the field for a conference at Harvard Business School. We grounded our research in theories of causality from both the studies of microeconomics and the management sciences, such as the theory of disruption.

What we found confirmed our hypothesis: Breakthrough innovation in government is possible.

As we studied instances of successful and unsuccessful innovation in government, we identified scenarios in which leaders were able to drive out costs through the implementation of novel technologies and service models that got the job done better for constituents. As the causal theories suggested, the difference between success and failure was the ability to create or preserve most if not all of these five conditions for breakthrough innovation:

  • Ability to experiment
  • Ability to sunset outdated infrastructure
  • Existence of feedback loops
  • Existence of incentives for product or service improvement
  • Existence of budget constraints for end users

For instance, by developing robust feedback loops along with the other conditions into their traditional budgeting process, the government of Hampton, Virginia, was able to survive an 8.4 percent budget gap—including program reductions ranging from 18 to 23 percent for economic vitality and neighborhoods, infrastructure, and leisure services—without experiencing a decrease in citizen satisfaction. Instead of cutting across the board, the feedback loop helped the city of Hampton cut only those programs in which the government was providing taxpayers with luxury services when they were happy to settle for more economical services. In Philadelphia the addition of experimental infrastructure for waste collection empowered the city to identify a new service model that reduced departmental operating budgets by almost 70 percent.

In this article, we illustrate how these five conditions enable breakthrough innovation in the public sector. Though our research focused on municipal service innovation, we suspect that the same principles are true at all levels of government. We will also address some of the practical barriers to creating innovative organizations—knowing what to do is only part of the answer; understanding how to create change is an integral part of the solution. To that end, we will offer recommendations on how public leaders, social entrepreneurs, and non-government organization (NGO) managers can encourage innovation in ways that will not be rejected by the system.

By documenting what empowers successful innovation, we hope to make the process repeatable and scalable. Government progress should not have to rest on the herculean efforts of lone innovators; it must be based on sound theory if it can help us to solve the pressing problems facing our society.

The Five Conditions for Innovation

In the book Seeing What’s Next,2 members of our research group introduced a framework to evaluate innovation systems. The authors suggested that two primary factors set the stage for innovation: ability and motivation. These broad categories simplify underlying economic conditions of market structure and information flow within well-functioning free markets. When both ability and motivation are present in a market, a hotbed of innovation forms—in much the same way as the Internet has led to a deluge of entrepreneurship and innovation. When ability and motivation are not present, innovation stalls.

This framework had one principal limitation, however: The scope of the analysis was limited to the private sector, where access to markets (ability) and the profit motive (motivation) are intrinsically present. In the public sector, by contrast, we cannot assume that entry and exit are as simple as incorporating and declaring bankruptcy, or that profit will serve as the primary motivation.

Our challenge was to discover what underlying conditions inherent to private sector innovation needed to be replicated in the public sector. We found that the ability to innovate is derived from the first two conditions—the ability to experiment and the ability to sunset outdated infrastructure. Fundamentally, innovation requires something new to replace the old. Often, it is difficult for incumbents with a vested interest in the status quo to participate in pushing their own obsolescence. In the public sector where startups do not naturally attack incumbents for market share, leaders must find other methods to preserve these two conditions.

The remaining three conditions—the existence of feedback loops, the existence of incentives for product or service improvement, and the existence of budget constraints for end users—all can motivate government innovators in the right direction. Whereas profit and price work together to drive private sector innovators toward optimal solutions, motivating government innovators toward socially optimal outcomes requires more thoughtful direction.

Together, the five conditions allow public sector innovators to try, test, adopt, and reject new technologies and service models. The conditions ensure that the dramatic transitions to less expensive products, which generally perform worse when compared to incumbents, occur only in situations where customers are over-served by existing solutions. These same conditions ensure that public managers do not pursue unnecessary incremental innovation when constituents do not value it. By thoughtfully creating and preserving the five conditions, public innovators can harness much of the power previously relegated to the private sector.

To illustrate how these conditions affect the innovation process, we will examine each of the five conditions and their influence on the implementation of the mobile-payment parking system in Washington, D.C. (In 2010, the municipal government contracted with the private firm Parkmobile to provide a remotely monitored parking system alongside traditional parking meters. The technology was developed and managed by Parkmobile, and the deployment parameters and budgeting decisions remained in the hands of local government.)3

Ability to experiment | Any organization that wishes to adapt to its changing environment needs a system for experimenting with new technologies and delivery models. Without the ability to develop experimental infrastructure, fundamentally new and different approaches rarely emerge. In the private sector, we see this mechanism arise in both the form of corporate innovation and new entrants in existing industries. Unfortunately, public managers often encounter structural barriers when they attempt to experiment. Instead of eliciting exuberance from voters, deployment of capital for experimental projects draws scrutiny from watchdog groups and regulators alike. Without data to validate an initiative's existence, the public attacks experimental efforts. Yet the inherent paradox is that in order to generate data, experiments are required. To overcome this dilemma, public leaders must behave like venture capitalists by placing small bets based on a theory about the future and using those bets to guide subsequent action.

In the case of Washington's parking innovation, the DDOT created a thoughtful methodology for experimentation. In 2010, a year before a full rollout was planned and approved, the city started a pilot program for Parkmobile in a single area of the city. This experiment allowed the municipality to gauge how citizens would react to the new service. The positive reception and uptake of the Parkmobile system indicated that the service had promise and would likely be successful across the city. An important feature of the initial system was that it did not require removing the legacy infrastructure. Instead, the Parkmobile system was overlaid as an alternative experimental system, minimizing the disruption to citizens' lives.

Ability to sunset outdated infrastructure | If an experiment is successful, a new challenge is revealed—namely, phasing out the old product or service. In the private sector, when businesses fail to adopt the appropriate technologies or service models, competitors steal their customers and market forces push laggards out of the market. Most government agencies do not experience this process—just look at the difficulty the US Postal Service is having in cutting back its delivery schedule. In fact, many agencies actually lack the ability to freely remove outdated technology and business models.

Though Parkmobile has been successful as an additional layer of Washington's transportation infrastructure, the full value of the innovation will be realized only after the old infrastructure and collection system is phased out. It was not possible to phase out the old system until the new one was in place. Thus, during the rollout, traditional and mobile-payment technologies were duplicated for all of the more than 17,000 parking spots. Now the city can phase out the old parking meters and begin to realize the benefits of the new system.

Existence of feedback loops | Once the experimental infrastructure is in place, it should be no surprise that strong feedback loops between the citizens and public servants are required to motivate investment into and adoption of the right innovations. In the private sector, when products and services fail to meet customer expectations, firms have a natural incentive to improve their offerings: the allure of increased market share and the pursuit of premium prices. The feedback loops offered through free market transactions also help private sector innovators identify when their offerings have exceeded customer desire: At some point customers stop paying for incremental improvements. In government, this sort of signal is often lost. Citizens can express dissatisfaction through votes, but these votes are rarely effective at critiquing the performance of specific programs. Unfortunately, without explicit feedback, it is difficult for managers running these programs to judge when to focus on improving service versus reducing cost.

For the mobile-payment rollout in Washington, D.C., a feedback loop was embedded into the experimental system itself. Municipal leaders captured and analyzed a great deal of data from Parkmobile's online system. The behavior of people using the system allowed leaders to see directly the value the system created. After one year, transactions through the mobile-payment system increased by more than 430 percent. This aggressive adoption rate and widespread usage indicated that parkers preferred the new system, providing justification to sunset the old one. The city also learned that 74 percent of all transactions were occurring through the cell phone application, a fact that allowed the government to extrapolate which geographic areas would be more likely to embrace the system upon full implementation.

Existence of incentives for product or service improvement | Armed with the knowledge of what customers want, suppliers can improve their offerings. They must also, however, have the motivation to make improvements. In the private sector, this motivation often stems from the ability to charge higher prices or reach more customers, thereby increasing profits. Though the profit motive does not exist in the public sector, motivation can still be created. For example, senior managers can be incentivized to innovate by easing their budget difficulties through increased revenue and reduced costs. Similarly, individual government employees can be motivated by the mission of the work or by recognition for doing it. The difficulty in public management is not creating motivation—it is ensuring that motivation is appropriately aligned with the goals of the organization.

In Washington, D.C., the motivation to improve performance was twofold. First, municipal leaders saw the mobile payments system as a way to capture savings and increase revenue—thereby decreasing budget burdens on the city. Municipal innovators also had another meaningful motivator: being considered forward-thinking. Adrian Fenty, the mayor of Washington, D.C., at the time of the effort, was known to promote this trait in his managers. Innovators inside the government knew that they would be recognized for their innovative solutions, a public reward that provided a powerful, non-financial incentive.

Existence of budget constraints for end users | In any transaction, customer behavior is affected by budget constraints. Budgets force prioritization. For example, when a person has a limited amount of money, she will probably pay the rent on her apartment before she goes on a vacation. Not only do limited financial resources force people to prioritize, they also create incentives to cut costs. If the same person can find a less expensive apartment, she can use the savings to go on vacation. For breakthrough innovation to take hold, government leaders should ensure that budget constraints exist for end users in order to motivate the appropriate prioritization. In some situations, such as in the case of individually distributed services like postal delivery, those constraints should be placed on the customers themselves. In other situations, such as in the case of defense procurement, the constraint should be placed on the person responsible for acquisition. Regardless of where the constraint falls, it is vital that budget incentives be used to force prioritization.

In the case of Parkmobile, customer time and cash constraints naturally force prioritization. Setting up Parkmobile can be a hassle: the user needs to download an app, create an account, and register her car before she can pay for parking. Fortunately, after the initial set-up, the enhanced functionality compared to parking meters is realized: the ability to pay without quarters; receiving text messages warning that time is about to run out; and the ease of paying for more time when the driver is miles away from the car. The mobile payment story in Washington, D.C., is still unfolding, but it is undeniable that following the five conditions has allowed the city to pursue a breakthrough in a core service. And the US capital is not an outlier. There are many other examples of municipalities successfully embedding the five conditions for successful innovation. (See "Breakthrough Innovations in Government Across the United States" below.)

 

The five conditions for innovation make continuous change possible. Though many of our examples highlight cities that have embraced new service models and technologies and driven unnecessary costs out of their systems, continuous change also allows for improvements in other areas of government such as transparency, performance-based funding, civic engagement, and measuring social outcomes, each of which provides an even stronger argument for enabling breakthrough innovation in the public sector.

Planning for Breakthrough Innovation

Of course, ensuring that the five conditions are properly embedded in a public service or product does not by itself ensure successful innovation. Innovation is always an uncertain endeavor—no innovator ever enjoys a 100 percent hit rate. Therefore, ensuring that the system facilitates experimentation even in the wake of failures, identifying what is working through small-scale data gathering efforts, and then scaling up new solutions become even more important. Successful public sector innovators actively shield themselves from the scrutiny and interference that can derail their efforts. Through our study, we identified four best practices to help public leaders succeed.

Identify white space for innovation | Academics often point private sector managers toward innovation in areas lacking competition. Our colleague Mark Johnson codified this sort of thinking in his 2011 book Seizing the White Space.4 By delivering differentiated products and services in underdeveloped segments of the market, innovators can avoid profit-inhibiting competition. Though public sector innovation does not suffer the same competitive threat, the threats of special interests and existing regulation create equally compelling support for innovating in new ways. For instance, Web and mobile application development, bike sharing, and pop-up retail represent burgeoning areas of opportunity for municipal innovators. As each of these areas is relatively novel, little policy has been created that dictates how public leaders can leverage them to effect change. This white space empowers government innovators to test novel solutions to problems on top of existing structures, in some situations generating compelling evidence for how products or services can be further developed.

Minimize expenditure, embed in an existing budget | Watchful public interest groups are always on the lookout for new, unnecessary, or redundant programs that might be evidence of pork barrel spending or waste. Although transparency is generally a good thing, it can make it more difficult for government innovators to launch new programs, especially ones that might seem to replicate existing services (as the D.C. parking program did). One way to avoid such scrutiny is to stay lean, spending the least amount of money to learn the most in any experimental process. The Office of Science and Technology of the White House, for example, created a program called RFP-EZ to solicit solutions from non-traditional sources. Because RFP-EZ is restricted to projects costing below $125,000, a small amount for federal procurement, the executive branch has been able to minimize scrutiny and increase efficiency in the procurement process. Another way to protect programs is to embed them inside existing offices. Boston's Office of New Urban Mechanics and New York City's Innovation Delivery Fellows have both positioned themselves inside mayoral offices in order to provide risk-tolerant spaces for innovators. By minimizing attention in these ways, innovators can ensure that they are not seen as harming so-called sacred cows before they have collected valuable data in support of their hypotheses.

Invest in constituent alignment | Nothing breaks down barriers better than making sure that the people affected by an innovation are aware of and in agreement with the change. Many of the programs and services that we have identified as shining examples of public sector innovation were led by managers who were very conscious of bringing along their various constituencies. In Philadelphia, for example, when a new trash collection system was implemented, municipal employees could have resisted efficiency improvements out of fear for their jobs. The mayor's office, however, made it clear that reductions in workforce would be achieved through natural attrition (retirement), not through layoffs. Employees no longer required by the new system would not be fired, but instead would be redirected toward services that needed additional support. By taking into account employee interests, Philadelphia could roll out its program without resistance.

Validate with data | The best case against the status quo is one grounded in scientific research. When the benefits of new services are speculative—even if supported by pundits and academics—it is easy for stakeholders to resist change. Innovators should know what they are testing for and experiment in such a way that makes their achievement irrefutable. Boston’s Office of New Urban Mechanics, for example, has done so with the Citizens Connect mobile application—what some might consider a quantum leap in 3-1-1 services. By demonstrating quantitatively how much additional geographic coverage is achieved, the group has made it difficult for stakeholders in the legacy call center to resist the city’s investment in the program.

The five conditions we have identified lie at the core of breakthrough innovation, enabling a repeatable process that can overcome the absence of competitive forces, lack of incentives for employees, and proliferation of red tape. We no longer need to think of public sector innovation as an exception to the time-tested rule; in fact, we believe the pursuit of breakthrough innovation in government can turn into a more scientific practice than the art form it resembles today.

Throughout the United States and much of the developed world, governments are on the brink of crisis. They need answers to a paradoxical challenge—how to spur economic growth while simultaneously reducing spending. This can be done only when we find novel solutions to the real problems that we have relied on government to solve. By embedding the five conditions for innovation inside new services and products, public innovators can best position their organizations for success in these trying conditions. Though there is no silver bullet for our problems, ensuring that the ability and motivation to innovate effectively exists throughout the public sector is a vital piece of any solution we develop.

Wed, 15 May 2013 08:09:00 -0500 en-us text/html https://ssir.org/articles/entry/unleashing_breakthrough_innovation_in_government
Killexams : IBM Report: Consumers Pay the Price as Data Breach Costs Reach All-Time High

60% of breached businesses raised product prices post-breach; vast majority of critical infrastructure lagging in zero trust adoption; $550,000 in extra costs for insufficiently staffed businesses

CAMBRIDGE, Mass., July 27, 2022 /CNW/ -- IBM Security today released the annual Cost of a Data Breach Report,1 revealing costlier and higher-impact data breaches than ever before, with the global average cost of a data breach reaching an all-time high of $4.35 million for studied organizations. With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services. In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.

The perpetuality of cyberattacks is also shedding light on the "haunting effect" data breaches are having on businesses, with the IBM report finding 83% of studied organizations have experienced more than one data breach in their lifetime. Another factor rising over time is the after-effects of breaches on these organizations, which linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.

The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.

Some of the key findings in the 2022 IBM report include:

  • Critical Infrastructure Lags in Zero Trust – Almost 80% of critical infrastructure organizations studied don't adopt zero trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. All while 28% of breaches amongst these organizations were ransomware or destructive attacks.
  • It Doesn't Pay to Pay – Ransomware victims in the study that opted to pay threat actors' ransom demands saw only $610,000 less in average breach costs compared to those that chose not to pay – not including the cost of the ransom. Factoring in the high cost of ransom payments, the financial toll may rise even higher, suggesting that simply paying the ransom may not be an effective strategy.
  • Security Immaturity in Clouds – Forty-three percent of studied organizations are in the early stages or have not started applying security practices across their cloud environments, observing over $660,000 on average in higher breach costs than studied organizations with mature security across their cloud environments.
  • Security AI and Automation Leads as Multi-Million Dollar Cost Saver – Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.

"Businesses need to put their security defenses on the offense and beat attackers to the punch. It's time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases." said Charles Henderson, Global Head of IBM Security X-Force. "This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked."

Over-trusting Critical Infrastructure Organizations
Concerns over critical infrastructure targeting appear to be increasing globally over the past year, with many governments' cybersecurity agencies urging vigilance against disruptive attacks. In fact, IBM's report reveals that ransomware and destructive attacks represented 28% of breaches amongst critical infrastructure organizations studied, highlighting how threat actors are seeking to fracture the global supply chains that rely on these organizations. This includes financial services, industrial, transportation and healthcare companies amongst others.

Despite the call for caution, and a year after the Biden Administration issued a cybersecurity executive order that centers around the importance of adopting a zero trust approach to strengthen the nation's cybersecurity, only 21% of critical infrastructure organizations studied adopt a zero trust security model, according to the report. Add to that, 17% of breaches at critical infrastructure organizations were caused due to a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.

Businesses that Pay the Ransom Aren't Getting a "Bargain"
According to the 2022 IBM report, businesses that paid threat actors' ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid. However, when accounting for the average ransom payment, which according to Sophos reached $812,000 in 2021, businesses that opt to pay the ransom could net higher total costs – all while inadvertently funding future ransomware attacks with capital that could be allocated to remediation and recovery efforts and looking at potential federal offenses.
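
To make the trade-off concrete, here is a quick back-of-the-envelope check using only the figures cited above (a rough illustration of the arithmetic, not part of IBM's analysis):

```python
# Rough net effect of paying a ransom, using the report's average figures:
# $610,000 saved in breach costs vs. an $812,000 average ransom (Sophos, 2021).
breach_cost_savings = 610_000
average_ransom = 812_000
net = breach_cost_savings - average_ransom
print(f"Net change from paying: ${net:,}")  # -> Net change from paying: $-202,000
```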

The persistence of ransomware, despite significant global efforts to impede it, is fueled by the industrialization of cybercrime. IBM Security X-Force discovered the duration of studied enterprise ransomware attacks shows a drop of 94% over the past three years – from over two months to just under four days. These exponentially shorter attack lifecycles can prompt higher impact attacks, as cybersecurity incident responders are left with very short windows of opportunity to detect and contain attacks. With "time to ransom" dropping to a matter of hours, it's essential that businesses prioritize rigorous testing of incident response (IR) playbooks ahead of time. But the report states that as many as 37% of organizations studied that have incident response plans don't test them regularly.

Hybrid Cloud Advantage
The report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organizations studied. Averaging $3.8 million in breach costs, businesses that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model, which experienced $5.02 million and $4.24 million on average respectively. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.

The report highlights that 45% of studied breaches occurred in the cloud, emphasizing the importance of cloud security. However, a significant 43% of reporting organizations stated they are just in the early stages or have not started implementing security practices to protect their cloud environments, observing higher breach costs2. Businesses studied that did not implement security practices across their cloud environments required an average 108 more days to identify and contain a data breach than those consistently applying security practices across all their domains.

Additional findings in the 2022 IBM report include:

  • Phishing Becomes Costliest Breach Cause – While compromised credentials continued to reign as the most common cause of a breach (19%), phishing was the second (16%) and the costliest cause, leading to $4.91 million in average breach costs for responding organizations.
  • Healthcare Breach Costs Hit Double Digits for First Time Ever – For the 12th year in a row, healthcare participants saw the costliest breaches amongst industries with average breach costs in healthcare increasing by nearly $1 million to reach a record high of $10.1 million.
  • Insufficient Security Staffing – Sixty-two percent of studied organizations stated they are not sufficiently staffed to meet their security needs, averaging $550,000 more in breach costs than those that state they are sufficiently staffed.

Additional Sources

  • To download a copy of the 2022 Cost of a Data Breach Report, please visit: https://www.ibm.com/security/data-breach.
  • Read more about the report's top findings in this IBM Security Intelligence blog.
  • Sign up for the 2022 IBM Security Cost of a Data Breach webinar on Wednesday, August 3, 2022, at 11:00 a.m. ET here.
  • Connect with the IBM Security X-Force team for a personalized review of the findings: https://ibm.biz/book-a-consult.

About IBM Security
IBM Security offers one of the most advanced and integrated portfolios of enterprise security products and services. The portfolio, supported by world-renowned IBM Security X-Force® research, enables organizations to effectively manage risk and defend against emerging threats. IBM operates one of the world's broadest security research, development, and delivery organizations, monitors 150 billion+ security events per day in more than 130 countries, and has been granted more than 10,000 security patents worldwide. For more information, please visit www.ibm.com/security, follow @IBMSecurity on Twitter or visit the IBM Security Intelligence blog.

Press Contact:

IBM Security Communications
Georgia Prassinos
gprassinos@ibm.com

1 Cost of a Data Breach Report 2022, conducted by Ponemon Institute, sponsored, and analyzed by IBM
2 Average cost of $4.53 million, compared to an average cost of $3.87 million at participating organizations with mature-stage cloud security practices

View original content to download multimedia: https://www.prnewswire.com/news-releases/ibm-report-consumers-pay-the-price-as-data-breach-costs-reach-all-time-high-301592749.html

SOURCE IBM

View original content to download multimedia: http://www.newswire.ca/en/releases/archive/July2022/27/c2517.html

Tue, 26 Jul 2022 16:01:00 -0500 text/html https://www.benzinga.com/pressreleases/22/07/n28217609/ibm-report-consumers-pay-the-price-as-data-breach-costs-reach-all-time-high
Killexams : Oncology Training Needs Assessment Among Health Care Professionals in Nigeria

Cancers are leading causes of death globally with an increasing morbidity and mortality burden in sub-Saharan African countries (SSA), including Nigeria.1 In the absence of effective responses, the burden will continue to increase because of several factors. These include an increase in the prevalence of cancer risk factors that are linked to globalization, epidemiologic transitions, aging, and exposures to environmental carcinogens.2,3 Country-specific cancer research is crucial for the conceptualization and implementation of innovative evidence-based interventions in low- and middle-income countries (LMICs), as research findings from high-income countries may not be sufficiently robust to meet the needs of patients with cancer in their own communities.4,5

CONTEXT

  • Key Objective

  • Our study investigated the status of training and preparedness for oncology practice and research and degree of interprofessional collaboration among health care professionals in Nigeria.

  • Knowledge Generated

  • We identified significant training gaps in cancer care and research, which include poor preparedness in data analysis and bioinformatics, writing clinical trial protocols, and writing grant/research proposals. In addition, the educational opportunities for structured oncology sub-specialist training in the country are inadequate and this has implications for practice. Lack of interprofessional practice and research collaborations was identified as the factor contributing to the dearth of highly trained oncology clinicians and researchers in Nigeria.

  • Relevance

  • Our findings provide data to inform the adoption and development of a structured curriculum in clinical oncology, oncology nursing, and oncology pharmacy aimed at building the clinical and research capacities of oncologists in Nigeria and Africa at large. Invariably, this will contribute to an improvement in the quality of cancer care and accelerate progress toward the reduction of the global disparities in cancer outcomes.

Studies in SSA have documented suboptimal training opportunities in cancer research,6,7 care, and practice.7-9 This is a persistent challenge at every level of higher education in Nigeria.8-11 Suboptimal cancer research capacity in sub-Saharan Africa has deep roots in the long history of colonialism followed by years of political instability and disinvestments by donor countries in the robust science and technology needed to power sustainable development in SSA. The resulting dearth of human resources, the weak clinical and research competencies of oncology scientists/clinicians coupled with poor engagement in research, and the inability of countries to attract and retain very skilled and experienced cancer researchers and clinicians demand urgent attention.6,12-15

Of greater significance is the lack of adequate curriculum for advanced training in health professions and structured career pathways in academic medical centers to facilitate the development of independent cancer researchers.13,14 Standard training in oncology in most developed countries consists of a graduate program or a board certification fellowship guided by a curriculum that is in line with the recommendations of the European Society for Medical Oncology and ASCO.16,17 Without leadership and a structured curriculum, there is weak interprofessional collaboration to improve quality of cancer care in Nigeria and other LMICs. Such collaboration is essential for providing holistic care that takes into account the physical, psychological, and social needs of each patient with cancer in Nigeria.18 Studies indicate that interprofessional collaboration in all aspects of cancer care, including administration of chemotherapy,19 radiation therapy,20 nursing care,21 and cancer screening,22 is vital for improving cancer outcomes.

These challenges are not unique to Nigeria. A recent global survey that assessed the training of oncologists revealed that most of the respondents from LMICs expressed interest in cancer research. Identified impediments to pursuing this goal included lack of mentorship and guidance in the conduct of research and lack of structured teaching in clinical settings.14 To build on the findings from this global survey and explore the needs for training in cancer research and clinical practice in Nigeria, we developed a survey to investigate the status of training in oncology practice and research for clinicians in Nigeria.

Study Design and Scope

The study used a convergent parallel design.23 A semistructured online questionnaire was administered, and key informant interviews (KIIs) and in-depth interviews (IDIs) were conducted with clinicians and researchers in academic tertiary institutions, cancer centers, oncology units, clinics, and wards where oncology services are available, in the six geopolitical zones of Nigeria.

Study Population

The quantitative component of the study involved the administration of questionnaires to pharmacists; nurses; doctors; postdoctoral fellows; early-, mid-, and late-career researchers; and practitioners involved in the prevention and treatment of cancers. Other stakeholders such as the head of departments of nursing, pharmacy, and medicine (oncology, radiation oncology, pediatric oncology, obstetrics, and gynecology); members of the Association of Nigerian Nurses and Midwives and the Pharmacists Council of Nigeria; and resident doctors participated in the qualitative interviews.

The study participants were identified primarily in two ways. First, the members of the research team appointed six coordinators to recruit mobilizers from each geopolitical zone who sent invitations to individuals from their zones. Second, trainees were identified through existing national databases of oncology researchers obtained from institutions such as the Association of Resident Doctors; Association of Oncology Pharmacists, pediatricians, gynecologists, and oncology nurses; and the Association of Nigeria Nurses and Midwives and the Pharmacists Council of Nigeria.

Sample Size Determination and Sampling Technique

The sample size was determined using the sample size calculator for proportions,24 with the following assumptions: α = .05, β = .8, a design effect of 1.0, 95% CI, a population size of 315,325 clinicians,25 and a hypothesized proportion of 50%. This resulted in a minimum required sample size of 384. The total sample size was stratified by profession. The ratio was determined by the statistics documented in the Second National Strategic Health Development Plan, which states that Nigeria has 21,892 pharmacists, 65,759 doctors, and 249,566 nurses and midwives.24,25 The survey used a purposive sampling technique.
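
As a worked illustration, the snippet below reproduces this calculation with the standard formula for estimating a proportion, including a finite population correction (the approach implemented in calculators such as OpenEpi); the function name and defaults are our own, hypothetical choices, not the study's.

```python
import math

def sample_size_proportion(population, p=0.5, margin=0.05, z=1.96, deff=1.0):
    # Infinite-population sample size for estimating a proportion,
    # then Cochran's finite population correction.
    n0 = deff * (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# 95% CI (z = 1.96), 5% margin of error, p = 0.5, N = 315,325 clinicians
print(sample_size_proportion(315_325))  # -> 384, the study's minimum sample size
```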

Quantitative Data Collection

Quantitative data were collected using a semistructured, online questionnaire developed through a review of the literature.14,26,27 The instrument comprised six sections: sociodemographic characteristics of respondents, opportunities for oncology-focused training, preparedness and competency in oncology practice and research, interprofessional practice, and specific cancer-focused training needs. Competency in oncology research was assessed with 26 items in the instrument, each requiring a response on a one- to five-point self-rating scale (maximum score of 130). Competency in oncology practice was assessed with 16 items (maximum score of 80), whereas interprofessional collaboration was assessed with nine items using the same scale (maximum score of 45). The online questionnaire was distributed via Survey Monkey using purposive and snowball sampling techniques.
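
To make the scoring concrete, the sketch below shows how such sums behave; the ratings and helper function are invented for illustration, but the arithmetic mirrors the instrument (26 research items on a one- to five-point scale give a maximum of 26 × 5 = 130).

```python
def competency_score(ratings, scale_max=5):
    # Sum of Likert-item ratings; every rating must lie on the 1..scale_max scale.
    assert all(1 <= r <= scale_max for r in ratings)
    return sum(ratings)

research_ratings = [4, 3, 5] + [3] * 23                 # hypothetical answers to 26 items
print(competency_score(research_ratings), "of", 26 * 5)  # -> 81 of 130
```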

Qualitative Data Collection

Interview guides were used to conduct six KIIs with stakeholders and leaders in nursing, pharmacy, and medicine across the six geopolitical zones of Nigeria. Twenty-four IDIs were also conducted with selected pharmacists, nurses, resident doctors, postdoctoral fellows, and early- or mid-career researchers involved in cancer prevention and treatment. The interview sessions were conducted by trained interviewers either by telephone or face-to-face and were audio-recorded.

Data Analysis

Quantitative data.

IBM SPSS Statistics, version 25, was used to analyze the data. Descriptive statistics (means and standard deviations or proportions, as appropriate) were calculated. One-way analysis of variance and pairwise t-tests were used to assess the statistical significance of differences among group means.
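
For readers who want to reproduce this style of analysis outside SPSS, here is a minimal sketch in Python with SciPy showing the same two steps (one-way ANOVA across the three professions, then pairwise t-tests); the scores are fabricated for illustration and are not the study's data.

```python
from scipy import stats

# Illustrative competency scores only, not data from the study
scores = {
    "doctors":     [62, 58, 71, 65, 60],
    "nurses":      [59, 64, 57, 66, 61],
    "pharmacists": [48, 52, 50, 55, 47],
}

f_stat, p_anova = stats.f_oneway(*scores.values())   # one-way ANOVA
print(f"ANOVA: F = {f_stat:.3f}, p = {p_anova:.4f}")

groups = list(scores)
for i in range(len(groups)):                          # pairwise t-tests
    for j in range(i + 1, len(groups)):
        t, p = stats.ttest_ind(scores[groups[i]], scores[groups[j]])
        print(f"{groups[i]} vs {groups[j]}: t = {t:.3f}, p = {p:.4f}")
```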

Qualitative data.

The audio recordings were transcribed verbatim, and the transcripts were coded using NVIVO 12.0 software. The data were analyzed using a deductive thematic analysis approach. The set of codes was predefined in relation to the research questions and reviewed by the research team. The final approved themes were used to summarize and report the study findings.

Ethical Considerations

The University of Ibadan/University College Hospital Ethics Review Committee reviewed and approved the protocol for the study before the commencement of data collection (UI/EC/21/0114). An electronic signature on an informed consent form was obtained from each participant after information was provided on the nature of the study.

Status and Opportunities of Oncology-Focused Training in Nigeria

With respect to their undergraduate and graduate education, 32.8% and 34.4% of respondents, respectively, stated that there was a standardized set of cancer learning activities, courses, or competencies. Only 44 (13.9%) respondents felt that the oncology education provided during their training program was adequate. Only 77 (24.3%) felt that the oncology training they had prepared them to manage patients with cancer very well, whereas 169 (53.3%) felt that their training did not prepare them for a career in cancer research. A majority (20 of 30) of interview participants also decried the state of oncology training and education as being below standard or of low standard. Very few described it as above average or good/very good. To quote one interview participant: “It's poorer than it needs to be. I think that right now, we just have… We have a center in Abuja that is affiliated with, International Atomic Energy Agency. They are the only ones that do a certification in oncology nursing. I am aware that University College Hospital used to have a program in oncology nursing, but I don't know if that's still ongoing now. The status is rather poor. We don't have so many oncology nurses that are professionally trained. We might have a few who have worked in oncology for years but professional training is still something that is very poor in my opinion” <KII_Male Doct_SS>.

Sixty-seven percent of the respondents reported that the oncology content in identified training programs/opportunities was sparse. The majority (267, 84.2%) reported that opportunities for continuing education in oncology care were inadequate. This statement by a respondent buttresses this claim: “It's very poor. We don't really have a standard, a standardized training for oncology nursing practice in Nigeria. The few [trained nurses] that we have, are learning on the job” <KII_Male Pharm_SW>.

About a third (117, 36.9%) reported that refresher trainings are not conducted in their specialty or that where they occur, the frequency is variable (1-5 times a year, with the most common (29.3%) being once a year). In the qualitative study, more than half of respondents rated the refresher training for nurses as average. Only a few rated it as good. The status of oncology training and education for pharmacists was rated as being poor across all stages of training. The majority of respondents rated the training of medical doctors to be good because of the numbers of trained, mentored, or certified oncology practitioners in the profession. However, the majority of practitioners in the field opined that the status is not structured; many claimed that there are no established patterns, curricula, or modes of training. This quote underscores this view: "my main concern about oncology training is that it is not structured. People are just there based on the fact that there is a team/unit called oncology unit and you are in the team. Of course you will learn some things but there is no structure…people are oncologist by just practice not by certification" <IDI_Female Doct_NC>.

Regarding opportunities for continuing education, nurses, constituting 59.0% of the respondents, had access to Mandatory Continuing Professional Development Programs. Other opportunities available to the respondents included oncology conferences (29.3%), continuing medical education (24.6%), short courses on oncology (25.9%), and clinical observerships (22.7%). Similar findings emerged from the qualitative interviews, where more than a quarter of the 30 participants (seven) had no cancer-related training aside from the institutional trainings, and only about one sixth (five) had cancer-related trainings through workshops, seminars, and online/virtual trainings. According to the respondents, optimal ways to teach oncology to professionals are face-to-face lectures (35.3%), hands-on sessions in an advanced cancer center (29.7%), a blended approach (16.1%), and online forums and materials (8.8%; Table 2). Our qualitative study participants were in alignment with this finding, with the majority desiring practical or clinical exposure. This was considered the best way to teach oncology practice, training, and mentoring.

TABLE 2 Opportunities for Oncology-Focused Training (N = 317)

Preparedness for Oncology Practice

About a quarter (24.6%) of the respondents felt that they were slightly well prepared for initial consultation/assessment of a patient with a new cancer diagnosis, and 20.5% felt that they were moderately well prepared. An assessment of all competencies in oncology practice showed that fewer than 40% to 50% of the respondents felt moderately well/quite well prepared (Table 3).

TABLE 3 Preparedness for Oncology Practice (N = 317)

Preparedness for Oncology Research

Regarding preparedness for oncology research, about a third (31.9%) reported being slightly well prepared to identify research topics, 27.4% to conduct systematic literature searches, and 28.1% to access publications online. Key perceived deficiencies reported were poor preparedness in big data analytics and bioinformatics (138, 43.5%), writing institutional review board protocols (119, 37.5%), writing grant or research proposals (105, 33.1%), and development of manuscripts (96, 30.3%; Table 4). The majority of interview respondents stated that they would like to have structured trainings on cancer research and statistical analysis and other topics such as molecular diagnosis, patient care and wound management, hysterectomy, patient navigation, genetic counseling, and health education.

TABLE 4 Preparedness for Oncology Research (N = 317)

Interprofessional Collaboration Oncology Research/Practice

Almost two fifths either strongly disagreed (11.0%) or disagreed (31.5%) that clinicians (doctors, nurses, and pharmacists) always met to discuss oncology care as a group before interacting with patients with cancer. Similarly, regarding collaboration for cancer research, the majority of interview respondents lamented a lack of, or inadequate, collaboration among clinicians. According to one respondent, “I will tell you the truth is that everybody has been in silos; meaning the doctors, the clinicians have been doing their own [things] on their own, I do not know what the nurses are even doing, the pharmacist are coming up, agreeing on their own …” <KII_Female Pharm_NW>.

Another respondent characterized the poor interprofessional collaboration among health care professional by saying “there is very little collaboration between professionals in terms of cancer research. For example, doctors doing research may take the nurses under them in oncology as their data collectors and that is not appropriate as that does not qualify them to be researchers. Research should be carried out such that various experts bring their contributions from the medicine, pharmacy, nursing and psychological perspectives… this would enhance self-confidence and interpersonal relationship and reduce conflicts between experts” <KII_Female Academic/Nurse_SE> (Table 5).

TABLE 5 Interprofessional Collaboration in Oncology Research/Practice

Association of Competency for Practice, Research, and Interprofessional Collaborations by Professional Group

There are statistically significant differences in the competency mean scores for oncology practice across the professional groups (P = .009), specifically between doctors and pharmacists (P = .0076) and between nurses and pharmacists (P = .0032). However, the difference in the mean score for competency in oncology research across the professions is not significant (F = 1.256, P = .286). Similarly, there is no statistically significant difference in the mean score for interprofessional collaboration in oncology research/practice among the professional groups (Table 6).

TABLE 6 Association Between Competency for Oncology Practice, Research, and Interprofessional Collaborations by Clinician Groups

This was a mixed methods study designed to investigate the status of training in oncology practice and research for clinicians in Nigeria and their training needs. The use of a mixed methods approach provided a rich, valid description of the gaps in training for oncology practice and research in Nigeria. Despite the fact that Nigeria's six geopolitical regions were not equally represented in our sample, efforts exerted to include as many participants from all regions as possible ensured more informative results than would have been achieved if some regions were left out. The goal of this study was to provide data to inform development of a tailored capacity-building program.

The majority of participants in our study described the state of oncology training and education in Nigeria as being below the standard. Similar findings have been reported in studies in different regions of the world.14,28-31 Findings from a study in Europe revealed that a majority of medical students rated clinical exposure to the management of patients with cancer as unsatisfactory.28 Similarly, in Turkey, more than half of the nurses in one study did not receive any training regarding oncology palliative care. They stated that the education was not sufficient and knowledge and skills were mostly acquired during in-service education.29 The status of oncology training and education for pharmacists in our study was rated grossly inadequate across all stages of training. This finding aligns with that of a study in Ghana, where a majority of pharmacists had never attended oncology continuing education and only a few had received university education training in oncology.30

The findings established that more than half of the respondents had no standard oncology training in either pre- or postprofessional training. Similarly, the majority had no further cancer-related training aside from that received during institutional trainings. Only a few had cancer-related training through workshops, seminars, and online/virtual trainings. These were mostly self-funded. This finding was supported by the study by Jalan et al on oncology training in LMICs, where a higher proportion of respondents reported having to self-fund core oncology training. Ability versus inability to self-fund has implications for workforce selection and may increase disparities within and between countries, which may adversely affect the most disadvantaged populations and diminish the social accountability of training programs.14 The majority of respondents in our study would like to have structured trainings on cancer research and statistical analysis. The demand for capacity building in cancer research that we identified aligns with findings in East Africa.6

The majority of respondents acknowledged their lack of capacity to conduct oncology research. A quarter advocated for strong collaboration/partnerships within and among countries, with others emphasizing the need for adequate hands-on experience in oncology research. Mentoring and training were identified as good approaches to build their capacities. This corresponded with the views of study participants in East Africa.6 Straus et al31 also noted that structured and longitudinal mentorship from a competent and dedicated expert is critical during the foundational educational experience. The majority of our study participants expressed a need for more practical or clinical exposure. This was considered the best way to teach oncology practice. This was consonant with findings from Calgary, Canada, where interactive methods of teaching, such as case study activities and class discussions, have been advocated for.3,32,33 Almost all our study participants attributed inadequate research capacity and poor clinical practice to lack of funding. Other inadequacies were identified, including in leadership and governance, patients/participants' attitudes toward research, and poor collaborations. These challenges have also been reported in other studies.13,34-36

In the opinion of a majority of respondents, there is optimal collaboration among clinicians in terms of clinical care. However, with regard to cancer research, the majority reported poor collaboration among clinicians. This corresponds with findings from previous studies in Nigeria, which revealed low collaboration, interprofessional practice, and teamwork among health care professionals and researchers.12,18

Despite the use of multiple strategies and much effort on the part of the investigators for the recruitment of the respondents, the study attained only 82% of the target accrual. Participants from the north-east and north-central geopolitical zones were fewer in number than those from other regions, because of the difficulty of reaching and engaging them. This is attributed to insurgency and unrest in those regions of Nigeria at the time of data collection.

In conclusion, this study identified significant training gaps in cancer care and research among clinicians and researchers in Nigeria. Lack of interprofessional practice and research collaborations and lack of mentorship were identified as factors responsible for the dearth of highly trained oncologists in Nigeria. Specifically, this study identified an urgent need for the development of structured curriculum in the primary disciplines of clinical oncology, oncology nursing, and oncology pharmacy that will facilitate interdisciplinary and interprofessional training in oncology clinical trials and patient-oriented translational cancer research. By building the clinical and research capacities of the next generation of highly qualified subspecialists in oncology-associated research in Nigeria, the country can begin to address the growing burden of cancer and thereby improve interdisciplinary and interprofessional collaboration among frontline health care workers. With investment in much needed training infrastructure and adoption of a structured postgraduate curriculum leading to board certification, Nigeria has the potential to lead the transformation of cancer care across sub-Saharan Africa.

© 2022 by American Society of Clinical Oncology
PRIOR PRESENTATION

Presented at AORTIC 2021 virtual cancer conference.

SUPPORT

Supported by NCI P20CA233307, Susan G. Komen for the Cure (OIO) and Breast Cancer Research Foundation (OIO) and a Kiphart Global Health Equity Scholar award from the University of Chicago Center for Global Health (P.O.A. and M.M.O.).

The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated unless otherwise noted. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/go/authors/author-center.

Open Payments is a public database containing information reported by companies about payments made to US-licensed physicians (Open Payments).

Atara Ntekim

Consulting or Advisory Role: Roche Pharma AG

Research Funding: Roche/Genentech

Olufunmilayo I. Olopade

Employment: CancerIQ (I)

Leadership: CancerIQ

Stock and Other Ownership Interests: CancerIQ, Tempus, 54gene, HealthWell Solutions

Research Funding: Novartis (Inst), Roche/Genentech (Inst), Cepheid (Inst), Color Genomics (Inst), Ayala Pharmaceuticals (Inst)

Other Relationship: Tempus, Color Genomics, Roche/Genentech

Uncompensated Relationships: Healthy Life for All Foundation

Open Payments Link: https://openpaymentsdata.cms.gov/physician/olopade

No other potential conflicts of interest were reported.

1. International Agency for Research on Cancer and World Health Organization: Nigeria, Source: GLOBOCAN, the Global Cancer Observatory. https://gco.iarc.fr/today/data/factsheets/populations/566-nigeria-fact-sheets.pdf
2. Adewole I, Martin DN, Williams MJ, et al: Building capacity for sustainable research programmes for cancer in Africa. Nat Rev Clin Oncol 11:251-259, 2014
3. Jemal A, Bray F, Forman D, et al: Cancer burden in Africa and opportunities for prevention. Cancer 118:4372-4384, 2012
4. Oluwasanu M, Olopade OI: Global disparities in breast cancer outcomes: New perspectives, widening inequities, unanswered questions. Lancet Glob Health 8:e978-e979, 2020
5. Vogel AL, Freeman JA, Duncan K, et al: Advancing cancer research in Africa through early-career awards: The BIG Cat Initiative. J Glob Oncol 5:1-8, 2019
6. Ezeani A, Odedina F, Rivers D, et al: SWOT analysis of oncology clinical trials in Africa: A town hall report from the Global Congress on Oncology Clinical Trials in Blacks. JCO Glob Oncol 6:966-972, 2020
7. Rubagumya F, Nyagabona SK, Msami KH, et al: Attitudes and barriers to research among oncology trainees in east Africa. Oncologist 24:e864-e869, 2019
8. Oluwatosin OA, Fadodun O, Brown V, et al: Educational preparation and perceived proficiency of nurses working in oncology units at the University College Hospital, Ibadan, Nigeria. JCO Glob Oncol 4, 2016 (suppl 2; abstr 85s)
9. Nwozichi CU, Ojewole F, Oluwatosin AO: Understanding the challenges of providing holistic oncology nursing care in Nigeria. Asia Pac J Oncol Nurs 4:18-22, 2017
10. Leng J, Ibraheem AF, Ntekim AI, et al: Survey of medical student knowledge in oncology and survivorship care in Nigeria: A pilot study. JCO Glob Oncol 37, 2019 (suppl 15; abstr e18106)
11. Galassi A, Morgan C, Muha C: Making the invisible visible: Oncology nursing efforts of NCI-designated cancer centers in LMICs. J Cancer Policy 1:34-37, 2018
12. Oluwasanu MM, Atara N, Balogun W, et al: Causes and remedies for low research productivity among postgraduate scholars and early career researchers on non-communicable diseases in Nigeria. BMC Res Notes 12:1-6, 2019
13. Kumwenda S, Niang EHA, Orondo PW, et al: Challenges facing young African scientists in their research careers: A qualitative exploratory study. Malawi Med J 29:1-4, 2017
14. Jalan D, Rubagumya F, Hopman WM, et al: Training of oncologists: Results of a global survey. Ecancermedicalscience 14:1-12, 2020
15. Umar SS, Babandi ZS, Suleiman AG, et al: Assessing research engagement of resident doctors in training in Northwestern Nigeria. Niger J Med 30:91, 2021
16. Hanna G: So you want to be an oncologist? Ulster Med J 82:65, 2013
17. Dittrich C, Kosty M, Jezdic S, et al: ESMO/ASCO recommendations for a global curriculum in medical oncology edition 2016. ESMO Open 1:e000097, 2016
18. Ibraheem AF, Giurcanu M, Sowunmi AC, et al: Formal assessment of teamwork among cancer health care professionals in three large tertiary centers in Nigeria. JCO Glob Oncol 6:560-568, 2020
19. Aebersold ML, Kraft S, Farris KB, et al: Evaluation of an interprofessional training program to improve cancer drug therapy safety. JCO Oncol Pract 17:e1551-e1558, 2021
20. James TA, Page JS, Sprague J: Promoting interprofessional collaboration in oncology through a teamwork skills simulation programme. J Interprof Care 30:539-541, 2016
21. Mansour T, Tustison K, Debay M, et al: An interprofessional approach to training in cervical cancer screening in a low resource setting of rural Haiti [12H]. Obstet Gynecol 133:87S, 2019
22. Arfons LM, Mazanec P, Smith J, et al: Training health care professionals in interprofessional collaborative cancer care. Health Interprof Pract Educ 2:eP1073, 2015
23. Fetters MD, Curry LA, Creswell JW: Achieving integration in mixed methods designs—Principles and practices. Health Serv Res 48:2134-2156, 2013
24. Dean A, Sullivan K, Soe M: OpenEpi: Open-Source Epidemiologic Statistics for Public Health, Version. 2013. www.OpenEpi.com
25. Federal Ministry of Health: Second National Strategic Health Development Plan, 2018-2022. Abuja, Nigeria, Federal Ministry of Health, 2018
26. Nwachukwu CR, Mudasiru O, Million L, et al: Evaluating barriers and opportunities in delivering high-quality oncology care in a resource-limited setting using a comprehensive needs assessment tool. J Glob Oncol 4:1-9, 2018
27. Abdul-Sater Z, Kobeissi E, Menassa M, et al: Research capacity and training needs for cancer in conflict-affected MENA countries. Ann Glob Health 86:142, 2020
28. Pavlidis N, Vermorken JB, Stahel R, et al: Undergraduate training in oncology: An ESO continuing challenge for medical students. Surg Oncol 21:15-21, 2012
29. Uslu-Sahan F, Terzioglu F: Nurses' knowledge and practice toward gynecologic oncology palliative care. J Palliat Care Med 7:1-5, 2017
30. Mensah KB, Oosthuizen F, Bangalee V: Community pharmacy practice towards cancer health and the need for continuous cancer education: Ghana situation. J Basic Clin Pharma 10:32-37, 2019
31. Straus SE, Johnson MO, Marquez C, et al: Characteristics of successful and failed mentoring relationships: A qualitative study across two academic health centers. Acad Med 88:82-89, 2013
32. Mitchell C, Laing CM: Revision of an undergraduate nursing oncology course using the Taylor Curriculum Review Process. Can Oncol Nurs J 29:47-51, 2019
33. Kantar LD, Massouh A: Case-based learning: What traditional curricula fail to teach. Nurse Educ Today 35:e8-e14, 2015
34. Nelson AM, Milner DA, Rebbeck TR, et al: Oncologic care and pathology resources in Africa: Survey and recommendations. J Clin Oncol 34:20-26, 2016
35. Vanderpuye V, Hammad N, Martei Y, et al: Cancer care workforce in Africa: Perspectives from a global survey. Infect Agents Cancer 14:1-8, 2019
36. Kmietowicz Z: Tackle cancer in Africa now to prevent catastrophe, say health activists. BMJ 334:1022-1023, 2007
Fri, 20 May 2022 12:00:00 -0500 en text/html https://ascopubs.org/doi/10.1200/GO.22.00017
Comparison of Annotation Services for Next-Generation Sequencing in a Large-Scale Precision Oncology Program

The framework to implement genomics-driven therapeutic oncology has been rapidly established in leading cancer centers worldwide. An exponential growth in characterized cancer genomes has ensued.1 The results of next-generation sequencing (NGS) assays are intricate and provide a multitude of somatic mutations in hundreds of genes. Mutation rates vary significantly between different cancers.2 Many somatic genetic variants are characterized as variants of unknown significance, and the implication of multiple mutations within a single tumor is poorly defined.3 Moreover, tumors possess intratumoral genetic heterogeneity, adding another layer of complexity to NGS results.4

CONTEXT

  • Key Objective: Algorithms to interpret the pathogenicity and clinical actionability of next-generation sequencing (NGS)–detected sequence variants have been implemented; however, little is known about their relative performance in clinical practice. The clinical interpretation of genomic alterations remains a major bottleneck for realizing precision medicine. We sought to examine the concordance for pathogenicity and clinical actionability of three annotation services available to the VA Precision Oncology Program (N-of-One, OncoKB, and Watson for Genomics).

  • Knowledge Generated: We demonstrate that NGS annotation services show wide-ranging agreement in pathogenicity (30%-76%) and moderate agreement (96.9%) in level 1 drug actionability.

  • Relevance: We anticipate these findings will encourage improvement in the precision of NGS multigene panel annotation. We provide detailed information regarding NGS panels, gene variant pathogenicity, annotation ontology, and level 1 drug actionability by annotation service. This study has significant implications for precision oncology clinical trials and molecular tumor boards.

At many institutions, including the Veterans Affairs (VA) healthcare system, molecular tumor boards (MTBs) have been instituted to interpret the results of NGS and develop treatment recommendations. In some instances the whole genome is sequenced, and in other cases selected subsets of genes are sequenced with smaller panels. Identified alterations that are of particular clinical interest include driver mutations or druggable mutations.1 For use in clinical decision making, identified gene variants must be annotated or interpreted in terms of the likelihood of their tumorigenicity and drug actionability.5 There are many resources to assist with data curation, including OncoKB (Memorial Sloan Kettering Cancer Center, New York, NY),6 My Cancer Genome (Vanderbilt-Ingram Cancer Center, Nashville, TN),7 Precision Medicine Knowledge Base (Weill Cornell Medicine Englander Institute for Precision Medicine, New York, NY),8 Personalized Cancer Therapy (MD Anderson Cancer Center, Houston, TX),9 CIViC (Washington University in St Louis School of Medicine, St Louis, MO),10 Jackson Laboratory Clinical Knowledgebase (Jackson Laboratory, Bar Harbor, ME),11 Cancer Genome Interpreter (Barcelona, Spain),5 Cancer Driver Log (omicX, Le-Petit-Quevilly, France),12 and N-of-One (NoO; Concord, MA).13 The identification of mutations that are pathogenic and actionable is generally performed by multidisciplinary teams who reference genomic databases, published literature, and clinical trials. The development of cognitive computing, such as Watson for Genomics (WfG), has also garnered interest in precision oncology. WfG14 is a cloud-based service that uses computerized models to simulate human thought to analyze large volumes of genome data and generate evidence-based guidelines.

The ultimate objective of gene sequencing is to improve patient outcomes by matching patients to specific treatments that target the mutations driving the growth of their individual tumors. The landscape of genetic tests has evolved from single-mutation assays to small hotspot mutation panels, large gene panels, and whole-exome and whole-genome platforms. The interpretation of the clinical significance of these genomic mutations has become a formidable task because of the number of genes tested and a limited understanding of both normal genetic variation and pathogenic gene-to-gene interactions. Although professional societies have recently published consensus guidelines15 for the use and interpretation of NGS, they have not been extensively used in practice. Moreover, traditional clinical trial design approaches that enroll molecularly unselected tumor types may no longer be appropriate because of the molecular heterogeneity of tumors from each primary site (eg, breast). This has led to the development of basket and umbrella trial designs as well as precision genomic oncology trials at the point of care. An urgency exists to examine the various annotation services and clinical support tools available to ensure the quality of NGS in oncology.

The VA National Precision Oncology Program (NPOP)16 initially used two commercial vendors for NGS testing, Personal Genome Diagnostics (PGDx, Baltimore, MD) and Personalis (Menlo Park, CA), that used the NoO commercial annotation service. In addition, WfG was available to the VA through a gift from IBM. Until NPOP accrues substantial outcomes data, treatment recommendations are largely based on biomarker panels, annotations, and relevant patient characteristics.17 NPOP also references open data sources, including OncoKB (an evidence source, not an annotation provider), when interpreting pathogenicity and actionability of gene variants. For some variants and cases, annotation is assisted by a molecular pathologist. The primary objective of this study was to assess the concordance for pathogenicity determination and clinical actionability for the annotation services available to NPOP.

Sequencing data from all patients who underwent NGS testing under NPOP since its genesis in 2015 through 2017 were analyzed. The work described here was conducted to assess the quality of annotation services available to the NPOP, a clinical operational program that is not research and does not require institutional review board review; permission to publish was obtained from the appropriate VA authority. Details of the NPOP have been published previously.16 The goal is to use NGS testing to facilitate patient access to US Food and Drug Administration (FDA)–approved targeted therapies and immune checkpoint inhibitors and increase participation in clinical trials (Data Supplement). Clinical trials have been the recommended option in > 50% of cases for which there was no FDA-approved therapy. NGS results were generated through two contracted vendors: PGDx18 (CancerSELECT 125, PlasmaSELECT 64, CancerSELECT 88, and CancerSELECT 203) and Personalis (ACE Cancer Plus 181)19 (Data Supplement).

We matched gene variants for the two contracted vendors, PGDx and Personalis, to compare vendor recommendations as well as commercial annotation services for the same unique gene variant. NGS-detected variants were annotated by the sequencing vendor using the commercial annotation service NoO. We examined NoO as implemented by the vendors. NPOP staff re-annotated DNA sequence results using WfG and OncoKB. IBM donated use of WfG to the VA; OncoKB is created and maintained by Memorial Sloan Kettering Cancer Center (MSKCC) and available online.6

After completion of sequencing, results were uploaded by an NPOP data scientist to WfG (including tumor type, a list of sequence variants as a variant call file, presence of gene fusions, and gene copy number variation). Results were routinely uploaded into WfG as part of routine test interpretation workflow. Batch analysis was performed approximately weekly using a custom-built informatics workflow and a packaging tool provided by IBM. WfG assigned a pathogenicity label to each variant or fusion. Drug matching for pathogenic variants is also part of the WfG analysis pipeline. Variants that are pathogenic or likely pathogenic are matched with FDA-approved drugs and actively recruiting clinical trials using levels of evidence (level 1, 2A, 2B, 3A, 3B, 4, or R1; Tables 1 and 2).3 For actionability, WfG uses a subset of National Cancer Institute (NCI) thesaurus terms for diagnosis coding.20 WfG was performed on all samples as part of NPOP workflow, and OncoKB analysis was performed on an ad hoc basis in response to NPOP consults and/or MTB cases.
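To make the batch step concrete, the sketch below shows in Python how a weekly annotation run of this general shape could be organized. It is a minimal illustration under stated assumptions, not the NPOP pipeline or the WfG service itself: the Variant record, the toy lookup tables, and run_batch are hypothetical stand-ins for the vendor packaging tool and the annotation calls (the EGFR drug matches shown are taken from the level 1 example discussed later in this article).

from dataclasses import dataclass

@dataclass(frozen=True)
class Variant:
    gene: str        # HUGO symbol, eg, "EGFR"
    alteration: str  # protein change, eg, "L747_S752del"
    tumor_type: str  # diagnosis term, eg, "Lung Adenocarcinoma"

# Toy knowledge base standing in for an annotation service (hypothetical).
PATHOGENICITY = {("EGFR", "L747_S752del"): "Pathogenic"}
DRUGS = {("EGFR", "L747_S752del", "Lung Adenocarcinoma"):
         [("erlotinib", "1"), ("afatinib", "1"), ("gefitinib", "1")]}

def run_batch(variants):
    report = []
    for v in variants:
        # Variants absent from the toy table are labeled VUS here by default.
        label = PATHOGENICITY.get((v.gene, v.alteration), "VUS")
        drugs = []
        # Only pathogenic or likely pathogenic variants proceed to drug matching,
        # which considers the diagnosis as well as the variant.
        if label in ("Pathogenic", "Likely Pathogenic"):
            drugs = DRUGS.get((v.gene, v.alteration, v.tumor_type), [])
        report.append({"variant": v, "pathogenicity": label, "drugs": drugs})
    return report

if __name__ == "__main__":
    batch = [Variant("EGFR", "L747_S752del", "Lung Adenocarcinoma"),
             Variant("KRAS", "G12D", "Colon Adenocarcinoma")]
    for row in run_batch(batch):
        print(row)

A real run would replace the lookup tables with calls to the annotation service and write the annotated report back into the patient record.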

TABLE 1. Levels of Evidence Used by Watson for Genomics

TABLE 2. Levels of Evidence Used by OncoKB

Using the same unique gene variants annotated by NoO and WfG, we queried all variants using the OncoKB curated database for the current study (Data Supplement). The public database21 includes information on the clinical actionability of each somatic gene variant, organized by indication and four-tier levels of evidence (Tables 1 and 2), which is ultimately incorporated into cBioPortal22 for Cancer Genomics to aid physicians and cancer researchers.6 Many genes involved in tumorigenesis are not targetable with currently available drugs. More than 90% of alterations in OncoKB have biologic effects and are classified as oncogenic but not actionable. For actionability, tumor ontology is considered. OncoKB contains 43 tumor types with biomarker-drug associations and uses an open-source ontology, OncoTree,23 which was developed at MSKCC (699 tumor types). The Clinical Genomics Annotation Committee (CGAC) reviews OncoKB alterations across 22 disease management teams. Curation reviews occur every 3 months, and CGAC recommendations and feedback are updated in real time.6
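As an illustration of how such queries can be scripted, the sketch below calls OncoKB's public web API for a single variant. The endpoint, parameters, and response fields reflect our reading of the API documented at oncokb.org/api and should be verified against current documentation; a registered API token is assumed, and the token value here is a placeholder.

import requests

# Endpoint and field names follow the public OncoKB v1 API as we understand it;
# verify against current docs. A registered token is assumed (placeholder below).
URL = "https://www.oncokb.org/api/v1/annotate/mutations/byProteinChange"
TOKEN = "YOUR-ONCOKB-TOKEN"

resp = requests.get(
    URL,
    params={"hugoSymbol": "EGFR",
            "alteration": "L747_S752del",
            "tumorType": "Lung Adenocarcinoma"},  # OncoTree tumor type
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
annotation = resp.json()
print(annotation.get("oncogenic"))              # pathogenicity call
print(annotation.get("highestSensitiveLevel"))  # eg, "LEVEL_1"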

For this study, levels of evidence 2A and 2B were consolidated to level 2, and levels 3A and 3B to level 3, for both WfG and OncoKB. For actionability annotation comparisons, we mapped ontology from WfG to OncoKB to ensure that gene variants were properly annotated with their respective tumor type. In addition, we excluded microsatellite instability (MSI) from the actionability comparisons, because MSI status was not entered into WfG and no WfG MSI annotations were therefore available.
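Both harmonization steps reduce to simple lookups, as in the minimal sketch below; the ontology pairs shown are illustrative placeholders rather than the study's full WfG-to-OncoTree mapping, which is provided in the Data Supplement.

# Collapse evidence sub-levels for the comparison described above.
CONSOLIDATE = {"2A": "2", "2B": "2", "3A": "3", "3B": "3"}

# Map WfG (NCI thesaurus) diagnosis terms to OncoKB's OncoTree terms.
# Illustrative pairs only; the study's full mapping is in its Data Supplement.
ONTOLOGY_MAP = {
    "Lung Adenocarcinoma": "LUAD",
    "Colon Adenocarcinoma": "COAD",
}

def harmonize(level, wfg_diagnosis):
    return (CONSOLIDATE.get(level, level),
            ONTOLOGY_MAP.get(wfg_diagnosis, wfg_diagnosis))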

Among 1,227 NGS results, 1,388 unique variants were observed in 117 genes. The entire set of unique gene variants is shown in the Data Supplement. The genes with the largest numbers of variants were TP53 (270), STK11 (92), CDKN2A (81), ATM (67), PTEN (52), NF1 (46), and BRCA2 (45). The most common cancer was lung adenocarcinoma, at 35.86% (440). Other cancer types included colon adenocarcinoma (9.21%; 113) and lung squamous cell carcinoma (9.05%; 111). The complete list of cancer types is shown in Figure 1.

Of the 1,388 unique gene variants, 1,082 were identified by PGDx (a larger proportion of the samples were sequenced by PGDx) and 480 by Personalis, with 174 gene variants generated by both vendors in different specimens (1,082 + 480 − 174 = 1,388). NoO classification should be similar for both vendors. The unique gene variants generated by each vendor are shown in the Data Supplement. The genes included in the gene panels are listed in the Data Supplement. We also list the well-characterized genes of each panel as well as the genes for which copy analysis is performed. For panel distribution by NGS samples, see the Data Supplement. All unique gene variants were annotated by NoO, WfG, and OncoKB. Of the 1,388 unique gene variants, 337 (24.2%) were variants of unknown significance per OncoKB, and 270 (19.4%) were variants of unknown significance per WfG.

For pathogenicity annotation (ie, pathogenic and likely pathogenic variants v all other variants), there was fair agreement between WfG and OncoKB (76%; kappa, 0.22) and no agreement between WfG and NoO (30%; kappa, −0.26) or between NoO and OncoKB (42%; kappa, −0.07; Table 3; Fig 2; Appendix Fig A1).
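Here, agreement is raw percent concordance, and kappa is Cohen's kappa, which discounts the agreement expected by chance: kappa = (po − pe)/(1 − pe), where po is observed agreement and pe is chance agreement. The computation below uses a hypothetical 2 × 2 table constructed to reproduce the WfG-versus-OncoKB figures; the counts are illustrative, not the study's data, and show how 76% raw agreement can correspond to a kappa of only 0.22 when one class dominates.

# Cohen's kappa for two services' pathogenic / not-pathogenic calls.
# Hypothetical counts chosen to reproduce 76% agreement and kappa = 0.22.
a, b, c, d = 69, 13, 11, 7   # a: both call pathogenic; d: both call not;
n = a + b + c + d            # b, c: the two discordant cells
p_o = (a + d) / n                          # observed agreement
p_e = ((a + b) / n) * ((a + c) / n) \
    + ((c + d) / n) * ((b + d) / n)        # agreement expected by chance
kappa = (p_o - p_e) / (1 - p_e)
print(f"agreement={p_o:.2f}, kappa={kappa:.2f}")  # agreement=0.76, kappa=0.22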

TABLE 3. Unique Gene Variants and Their Annotation as Pathogenic or Likely Pathogenic Using Three Annotation Services: N-of-One, WfG, and OncoKB

There were 91 unique gene variant–diagnosis combinations with level 1 drug actionability recommendations from WfG, OncoKB, or both (level 1 actionability was not available from NoO). As part of actionability annotation, we mapped diagnosis ontology from WfG to OncoKB for each observed diagnosis (Data Supplement).

There was moderate agreement between WfG and OncoKB (96.9%; kappa, 0.445), with 58 variants identified only by WfG as level 1 and 6 variants identified only by OncoKB as level 1 (Table 4). The complete set of level 1 drug recommendations for WfG and OncoKB is shown in the Data Supplement. When both annotation services had level 1 drug actionability, the recommended drugs were overwhelmingly identical. An example of a unique gene variant with drug actionability concordance is the exon 19 deletion in lung adenocarcinoma, EGFR-L747_S752del, with level 1 evidence for use of erlotinib, afatinib, or gefitinib in both the WfG and OncoKB annotation services. Response rates of tumors with EGFR exon 19 deletions at L747 have been reported to be as high as 83.3%.24

TABLE 4. Level 1 Drug Actionability Comparing IBM WfG and Open Academic Somatic Variant Database OncoKB

Among the 33 unique gene variant diagnoses identified as level 1 by OncoKB, WfG classification was level 1 in 27; the 6 cases with discordant classification in WfG were 5 lung adenocarcinoma EGFR mutations and a melanoma BRAF-V600R mutation. For the BRAF-V600R mutation in melanoma, WfG had no level 1 recommendations, whereas OncoKB listed 5 drugs or drug combinations as level 1. BRAF-V600R mutations constitute approximately 3%-7% of all BRAF mutations and were not included in the original BRAF/MEK inhibitor clinical trials.25 Although V600R is not listed on the BRAF/MEK FDA label, National Comprehensive Cancer Network panel guidelines from as early as 2016 consider single-agent BRAF inhibitor monotherapy and BRAF/MEK inhibitor combination therapy appropriate treatment for all activating BRAF mutations, including V600R, V600D, and others.26 For EGFR mutations in lung adenocarcinoma (EGFR-A750P, EGFR-E746_T751delinsI, EGFR-G719C, EGFR-L861Q, and EGFR-S768I), WfG had no recommendations for 3 variants and had level 2A for EGFR-G719C and level 3B for both EGFR-A750P and EGFR-L861Q for the drugs afatinib, erlotinib, and gefitinib. For EGFR-L861Q, an uncommon EGFR mutation, the afatinib FDA label was expanded to include EGFR-L861Q in January of 2018, which was after our annotation analysis. These approvals were based on findings from the phase 2 LUX-Lung 2 trial as well as the phase 3 LUX-Lung 3 and LUX-Lung 6 trials, which showed an objective response rate of 66% (95% CI, 47% to 81%).27

Among the 85 unique gene variant diagnoses identified as level 1 by WfG, OncoKB classification was level 1 in 27, and 58 had discordant classification. Most of the gene variant diagnoses with discordant recommendations (40/58 [68.9%]) involved mismatch repair (MMR) genes, such as MLH1, MSH2, MSH6, and PMS2. The remaining gene variant diagnoses with discordant recommendations (18/58; 31.0%) included the genes CDK4, EGFR, FGFR1, FLT4, KIT, PDGFRA, PIK3CA, POLE, PTEN, TSC1, TSC2, VEGFA, and VHL. For a lung adenocarcinoma case with MLH1-A42, which affects a DNA MMR gene associated with increased MSI, WfG provided a level 1 recommendation for atezolizumab, nivolumab, and pembrolizumab. Although MMR status predicts the clinical benefit of immune checkpoint blockade in multiple tumor types,28,29 none of these drugs are FDA approved for MLH1 mutants. Similarly, for other MMR gene variants, WfG provided level 1 recommendations in the absence of an FDA on-label indication.

Precision medicine has promoted the development of biomarker-driven treatment strategies. There has been a surge in the genetic testing market, with a 10% annual increase in new genetic tests and a 20% annual increase in gene-based diagnostic tests.15,30 NGS generates massive amounts of data, and different biologic questions require the development of specific bioinformatics pipelines, which are frequently platform specific.30,31 The processing of raw sequence data has a profound effect on patient care and outcomes, and NGS test validation is necessary.32 Until recently, there were no established uniform technical standards for reporting tumor-derived NGS gene panel sequencing results.33 In this analysis, we examine a large number of clinically observed unique gene variants and compare annotations for pathogenicity and actionability through two commercial services, NoO and WfG, as well as an open database, OncoKB.

In efforts to establish guidelines for NGS gene panel interpretation of somatic variants, the Association for Molecular Pathology (AMP) and College of American Pathologists (CAP) issued a joint consensus classification (Table 5).32,33 The four-tier system has different nomenclature and classification compared with WfG and OncoKB (Tables 1 and 2). The WfG and OncoKB levels of evidence are similar and compatible, but not identical, to the AMP/CAP/ASCO evidence levels. The OncoKB evidence levels were mapped to the evidence levels for AMP/CAP/ASCO tier I and II variants (Table 5).34 We used the WfG and OncoKB classification systems to compare their levels to each other, and we found that variants classified as pathogenic had a wide range of concordance, from 30% to 76% (Fig 2; Table 3; Appendix Fig A1). In routine clinical practice, only gene variants that are pathogenic or likely pathogenic are considered for actionability, so misclassification of pathogenicity could readily affect provider prescribing. For drug actionability, there was moderate-strength agreement of 96.9% (kappa, 0.445) for level 1. In practice, levels of evidence 1 and 2A both frequently result in use of a targeted drug, so concordance in level 2A may be similar to level 1, and their combination may improve concordance among annotation services for actionability. However, we expect greater discordance among levels of evidence 2B to 4.

TABLE 5. Guidelines for Evidence-Based Variant Categories by the AMP, CAP, and ASCO Compared With Levels of Evidence for OncoKB and WfG

Although no direct comparisons between NoO, WfG, and OncoKB have been previously performed, WfG has been compared with MTBs.3 The utility of WfG was compared with MTBs by examining 1,018 cases analyzed by the University of North Carolina’s Human-MTB. WfG identified an additional 8 actionable genes (7 of which passed actionability criteria by Human-MTB). Mutations in these 8 genes were identified in 231 and 96 patients out of 703 and 315 patients with already identified actionable and no actionable mutations, respectively.3 Of the 7 actionable genes identified by WfG, 3 had no clinical trial available, and 4 were made potentially eligible for a biomarker-selected clinical trial. The reason for the added genes by WfG was believed to be the opening of several clinical trials within weeks of the WfG analysis. Most genes identified and reclassified as actionable, however, have yet to demonstrate their utility as predictive biomarkers of response to the recommended therapy. These actionable mutations identified by WfG were found retrospectively in 323 patients, of whom only 47 (4.6%) had active disease requiring further therapy. Moreover, none of the patients had treatment altered based on WfG’s recommendation. Cognitive computing may supplement MTBs, and multiple studies have shown concordance of > 90%.35,36 WfG may provide additional decision support in the current era of rapid generation of information from clinical trials.

Currently, existing evidence does not support population-based universal tumor sequencing.37 The SHIVA trial (ClinicalTrials.gov identifier: NCT01771458) examined patients with any metastatic solid tumor refractory to standard treatment and randomly assigned 195 to receive treatment on the basis of molecular profile (n = 99) versus standard treatment (n = 96). Median progression-free survival was 2.3 months (95% CI, 1.7 to 3.8 months) in the experimental group versus 2.0 months (95% CI, 1.8 to 2.1 months) in the control group (hazard ratio, 0.88; 95% CI, 0.65 to 1.19; P = .41). The study enrolled a variety of tumor types and histologies and used single molecular alterations as the predictor of efficacy of targeted agents in heavily pretreated patients, possibly reducing their effectiveness. Nevertheless, there were no differences in progression-free survival or overall survival, and off-label use of targeted agents was discouraged.38 The benefit of off-label use of molecularly targeted agents is largely debated in oncology; however, most agree that clinical trial enrollment should be encouraged to identify predictive druggable biomarkers.

Other prospective studies, including EXACT,39 MOSCATO-01 (ClinicalTrials.gov identifier: NCT01566019),40,41 and MD Anderson's Phase I Clinic42,43 have shown promising results for targeted therapy in previously treated solid tumors. Additional studies examining the efficacy of off-label use of targeted therapy on the basis of NGS test results are underway: ASCO's Targeted Agent and Profiling Utilization Registry (TAPUR) study (ClinicalTrials.gov identifier: NCT02693535),44 the NCI-MATCH (Molecular Analysis for Therapy Choice) trial (ClinicalTrials.gov identifier: NCT02465060),45 and Pediatric MATCH (ClinicalTrials.gov identifier: NCT03155620). Results from NCI-MATCH for patients with PIK3CA mutations treated with taselisib show a 0% objective response rate,46 whereas patients with FGFR pathway mutations treated with AZD4547 showed a 5% objective response rate.47 Currently, there are no randomized controlled trial results supporting a universal NGS-based treatment paradigm. Despite this, NGS testing continues to be incorporated into the clinic.48 In a recent survey of 1,281 US oncologists, 75.6% reported using NGS tests to guide treatment decisions. Of these, 34% used them for patients with advanced refractory disease, and 17.5% used them for decisions on off-label use of FDA-approved drugs.49 More than 50% reported that NGS tests were difficult to interpret either often or sometimes, which is a separate but related challenge facing personalized medicine.49 The use of NGS will likely continue to grow, with providers facing uncertainty as to how to integrate its use into the clinic. Studies like ours are critical and further highlight the current shortcomings in precisely interpreting results of NGS gene panels for use in clinical management.

© 2020 by American Society of Clinical Oncology

This work was conducted as a clinical operational analysis of the National Oncology Program Office, Office of Specialty Care Services, Department of Veterans Affairs. The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States government. Presented at the ASCO Annual Meeting, Chicago, IL, June 1-5, 2018.

Supported by the Department of Veterans Affairs, Office of Specialty Care Services.

The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated unless otherwise noted. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/po/author-center.

Open Payments is a public database containing information reported by companies about payments made to US-licensed physicians (Open Payments).

Evangelia Katsoulakis

Stock and Other Ownership Interests: AbbVie, Alexion Pharmaceuticals, Biogen

Research Funding: Advantagene (Inst)

Neil L. Spector

Stock and Other Ownership Interests: Eydis Bio, Bessor Pharma

Research Funding: Immunolight

Patents, Royalties, Other Intellectual Property: I am on a patent related to my work with the company Immunolight; Patent through Eydis Bio

Michael J. Kelley

Consulting or Advisory Role: AstraZeneca, Eisai, IBM Japan

Research Funding: Bavarian Nordic, Novartis, AstraZeneca, Bristol-Myers Squibb

Other Relationship: IBM

No other potential conflicts of interest were reported.

Veterans Affairs National Precision Oncology Program.

National Precision Oncology Program (NPOP) services are available to all Veterans Affairs (VA) facilities and as of June 2018, over 72 different VA facilities submitted tumor samples. Following sequencing, a formal report of identified genomic aberrations is collated, annotated and included in patient records. The turnaround time is 14 days. NPOP provides a molecular oncology consultation service to assist VA clinicians with treatment decisions, and a case-based education-focused molecular tumor board. The users of NPOP are VA general oncologists and pathologists. Specialized oncologists and molecular pathologists oversee the program.

Panel information.

Panels were constructed to identify base/missense substitutions, insertions/deletions in protein-encoding regions, copy number variations, selected gene fusions/rearrangements and microsatellite instability (only on CancerSELECT 125 panel). There was overlap between the panels and a minimum of 500X DNA sequence coverage was required for all assays.

OncoKB pathogenicity and actionability.

Annotation of variant pathogenicity was executed using OncoKB support and through the API (http://oncokb.org/). Annotation of actionability as defined by the OncoKB Levels of Evidence was executed using OncoKB support and is available through the OncoKB annotator (https://github.com/). Data regarding all variants annotated as actionable per OncoKB at the time of the study are publicly available at https://github.com/.

Panel breakdown of the NGS samples.

The panel breakdown of the 1,227 NGS samples was: Personalis ACE CancerPlus (n = 404); Personal Genome Diagnostics (PGDx) CancerSELECT 125 (n = 286); PGDx CancerSELECT 203 (n = 75); PGDx CancerSELECT 88 (n = 176); PGDx CancerSELECT 64 (n = 1); PGDx unknown panel (n = 285).

1. Hayes DN, Kim WY: The next steps in next-gen sequencing of cancer genomes. J Clin Invest 125:462-468, 2015
2. Kandoth C, McLellan MD, Vandin F, et al: Mutational landscape and significance across 12 major cancer types. Nature 502:333-339, 2013
3. Patel NM, Michelini VV, Snell JM, et al: Enhancing next-generation sequencing-guided cancer care through cognitive computing. Oncologist 23:179-185, 2018
4. Morris LG, Riaz N, Desrichard A, et al: Pan-cancer analysis of intratumor heterogeneity as a prognostic determinant of survival. Oncotarget 7:10051-10063, 2016
5. Tamborero D, Rubio-Perez C, Deu-Pons J, et al: Cancer Genome Interpreter annotates the biological and clinical relevance of tumor alterations. Genome Med 10:25, 2018
6. Chakravarty D, Gao J, Phillips SM, et al: OncoKB: A precision oncology knowledge base. JCO Precis Oncol 10.1200/PO.17.00011
7. Taylor AD, Micheel CM, Anderson IA, et al: The path(way) less traveled: A pathway-oriented approach to providing information about precision cancer medicine on My Cancer Genome. Transl Oncol 9:163-165, 2016
8. Huang L, Fernandes H, Zia H, et al: The cancer precision medicine knowledge base for structured clinical-grade mutations and interpretations. J Am Med Inform Assoc 24:513-519, 2017
9. Dumbrava EI, Meric-Bernstam F: Personalized cancer therapy—leveraging a knowledge base for clinical decision-making. Cold Spring Harb Mol Case Stud 4:a001578, 2018
10. Griffith M, Spies NC, Krysiak K, et al: CIViC is a community knowledgebase for expert crowdsourcing the clinical interpretation of variants in cancer. Nat Genet 49:170-174, 2017
11. Patterson SE, Liu R, Statz CM, et al: The clinical trial landscape in oncology and connectivity of somatic mutational profiles to targeted therapies. Hum Genomics 10:4, 2016
12. Damodaran S, Miya J, Kautto E, et al: Cancer Driver Log (CanDL): Catalog of potentially actionable cancer mutations. J Mol Diagn 17:554-559, 2015
13. N-of-One: A Qiagen Company: Clinical interpretation solutions. https://n-of-one.com/
14. Rhrissorrakrai K, Koyama T, Parida L: Watson for Genomics: Moving personalized medicine forward. Trends Cancer 2:392-395, 2016
15. Robson ME, Bradbury AR, Arun B, et al: American Society of Clinical Oncology policy statement update: Genetic and genomic testing for cancer susceptibility. J Clin Oncol 33:3660-3667, 2015
16. Poonnen PJ, Duffy JE, Hintze B, et al: Genomic analysis of metastatic solid tumors in veterans: Findings from the VHA National Precision Oncology Program. JCO Precis Oncol 10.1200/PO.19.00075
17. Fiore LD, Brophy MT, Turek S, et al: The VA point-of-care precision oncology program: Balancing access with rapid learning in molecular cancer medicine. Biomark Cancer 8:9-16, 2016
18. Personal Genome Diagnostics (PGDx): CLIA certified. https://www.personalgenome.com/
19. Personalis, Inc: A leading precision medicine company focused on advanced NGS based services for cancer and clinical trials and translational research. https://www.personalis.com/
20. Fragoso G, de Coronado S, Haber M, et al: Overview and utilization of the NCI thesaurus. Comp Funct Genomics 5:648-654, 2004
21. OncoKB: Precision oncology knowledge base. https://www.oncokb.org/
22. Cerami E, Gao J, Dogrusoz U, et al: The cBio cancer genomics portal: An open platform for exploring multidimensional cancer genomics data. Cancer Discov 2:401-404, 2012
23. OncoTree: Displaying: oncotree_latest_stable. http://oncotree.mskcc.org/#/home
24. Su J, Zhong W, Zhang X, et al: Molecular characteristics and clinical outcomes of EGFR exon 19 indel subtypes to EGFR TKIs in NSCLC patients. Oncotarget 8:111246-111257, 2017
25. Klein O, Clements A, Menzies AM, et al: BRAF inhibitor activity in V600R metastatic melanoma. Eur J Cancer 49:1073-1079, 2013
26. Coit DG, Thompson JA, Algazi A, et al: NCCN guidelines insights: Melanoma, version 3.2016. J Natl Compr Canc Netw 14:945-958, 2016
27. Wu TH, Hsiue EH, Lee JH, et al: New data on clinical decisions in NSCLC patients with uncommon EGFR mutations. Expert Rev Respir Med 11:51-55, 2017
28. Le DT, Uram JN, Wang H, et al: PD-1 blockade in tumors with mismatch-repair deficiency. N Engl J Med 372:2509-2520, 2015
29. Le DT, Durham JN, Smith KN, et al: Mismatch repair deficiency predicts response of solid tumors to PD-1 blockade. Science 357:409-413, 2017
30. Gargis AS, Kalman L, Bick DP, et al: Good laboratory practice for clinical next-generation sequencing informatics pipelines. Nat Biotechnol 33:689-693, 2015
31. Roy S, Coldren C, Karunamurthy A, et al: Standards and guidelines for validating next-generation sequencing bioinformatics pipelines: A joint recommendation of the Association for Molecular Pathology and the College of American Pathologists. J Mol Diagn 20:4-27, 2018
32. Jennings LJ, Arcila ME, Corless C, et al: Guidelines for validation of next-generation sequencing-based oncology panels: A joint consensus recommendation of the Association for Molecular Pathology and College of American Pathologists. J Mol Diagn 19:341-365, 2017
33. Li MM, Datto M, Duncavage EJ, et al: Standards and guidelines for the interpretation and reporting of sequence variants in cancer: A joint consensus recommendation of the Association for Molecular Pathology, American Society of Clinical Oncology, and College of American Pathologists. J Mol Diagn 19:4-23, 2017
34. Wagner AH, Walsh B, Mayfield G, et al: A harmonized meta-knowledgebase of clinical interpretations of cancer genomic variants. Nat Genet (in press)
35. Somashekhar SP, Sepúlveda MJ, Puglielli S, et al: Watson for Oncology and breast cancer treatment recommendations: Agreement with an expert multidisciplinary tumor board. Ann Oncol 29:418-423, 2018
36. Kim YY, Oh SJ, Chun YS, et al: Gene expression assay and Watson for Oncology for optimization of treatment in ER-positive, HER2-negative breast cancer. PLoS One 13:e0200100, 2018
37. West HJ: No solid evidence, only hollow argument for universal tumor sequencing: Show me the data. JAMA Oncol 2:717-718, 2016
38. Le Tourneau C, Delord JP, Gonçalves A, et al: Molecularly targeted therapy based on tumour molecular profiling versus conventional therapy for advanced cancer (SHIVA): A multicentre, open-label, proof-of-concept, randomised, controlled phase 2 trial. Lancet Oncol 16:1324-1334, 2015
39. Prager GW, Unseld M, Waneck F, et al: Results of the extended analysis for cancer treatment (EXACT) trial: A prospective translational study evaluating individualized treatment regimens in oncology. Oncotarget 10:942-952, 2019
40. Verlingue L, Malka D, Allorant A, et al: Precision medicine for patients with advanced biliary tract cancers: An effective strategy within the prospective MOSCATO-01 trial. Eur J Cancer 87:122-130, 2017 [Erratum: Eur J Cancer 93:156-157, 2018]
41. Massard C, Michiels S, Ferté C, et al: High-throughput genomics and clinical outcome in hard-to-treat advanced cancers: Results of the MOSCATO 01 trial. Cancer Discov 7:586-595, 2017
42. Tsimberidou AM, Iskander NG, Hong DS, et al: Personalized medicine in a phase I clinical trials program: The MD Anderson Cancer Center initiative. Clin Cancer Res 18:6373-6383, 2012
43. Tsimberidou AM, Wen S, Hong DS, et al: Personalized medicine for patients with advanced cancer in the phase I program at MD Anderson: Validation and landmark analyses. Clin Cancer Res 20:4827-4836, 2014
44. Mangat PK, Halabi S, Bruinooge SS, et al: Rationale and Design of the Targeted Agent and Profiling Utilization Registry (TAPUR) Study. JCO Precis Oncol 10.1200/PO.18.00122
45. Moore KN, Mannel RS: Is the NCI MATCH trial a match for gynecologic oncology? Gynecol Oncol 140:161-166, 2016
46. Krop I, Jegede O, Grilley-Olson J, et al: Results from Molecular Analysis for Therapy Choice (MATCH) arm I: Taselisib for PIK3CA-mutated tumors. J Clin Oncol 36:101, 2018 (15_suppl; abstr 101)
47. Chae YK, Vaklavas C, Cheng HH, et al: Molecular analysis for therapy choice (MATCH) arm W: Phase II study of AZD4547 in patients with tumors with aberrations in the FGFR. J Clin Oncol 36:2503, 2018 (15_suppl; abstr 2503)
48. Morash M, Mitchell H, Beltran H, et al: The role of next-generation sequencing in precision medicine: A review of outcomes in oncology. J Pers Med 10.3390/jpm8030030
49. Freedman AN, Klabunde CN, Wiant K, et al: Use of next-generation sequencing tests to guide cancer treatment: Results from a nationally representative survey of oncologists in the United States. JCO Precis Oncol 10.1200/PO.18.00169

IBM Report: Consumers Pay the Price as Data Breach Costs Reach All-Time High

60% of breached businesses raised product prices post-breach; vast majority of critical infrastructure lagging in zero trust adoption; $550,000 in extra costs for insufficiently staffed businesses

CAMBRIDGE, Mass., July 27, 2022 /PRNewswire/ -- IBM (NYSE: IBM) Security today released the annual Cost of a Data Breach Report,1  revealing costlier and higher-impact data breaches than ever before, with the global average cost of a data breach reaching an all-time high of $4.35 million for studied organizations. With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services. In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.

The perpetuality of cyberattacks is also shedding light on the "haunting effect" data breaches are having on businesses, with the IBM report finding 83% of studied organizations have experienced more than one data breach in their lifetime. Another factor rising over time is the after-effects of breaches on these organizations, which linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.

The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.

Some of the key findings in the 2022 IBM report include:

  • Critical Infrastructure Lags in Zero Trust – Almost 80% of critical infrastructure organizations studied don't adopt zero trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. All while 28% of breaches amongst these organizations were ransomware or destructive attacks.
  • It Doesn't Pay to Pay – Ransomware victims in the study that opted to pay threat actors' ransom demands saw only $610,000 less in average breach costs compared to those that chose not to pay – not including the cost of the ransom. Factoring in the high cost of ransom payments, the financial toll may rise even higher, suggesting that simply paying the ransom may not be an effective strategy.
  • Security Immaturity in Clouds – Forty-three percent of studied organizations are in the early stages or have not started applying security practices across their cloud environments, observing over $660,000 on average in higher breach costs than studied organizations with mature security across their cloud environments.
  • Security AI and Automation Leads as Multi-Million Dollar Cost Saver – Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.

"Businesses need to put their security defenses on the offense and beat attackers to the punch. It's time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases." said Charles Henderson, Global Head of IBM Security X-Force. "This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked."

Over-trusting Critical Infrastructure Organizations
Concerns over critical infrastructure targeting appear to be increasing globally over the past year, with many governments' cybersecurity agencies  urging vigilance against disruptive attacks. In fact, IBM's report reveals that ransomware  and destructive attacks represented 28% of breaches amongst critical infrastructure organizations studied, highlighting how threat actors are seeking to fracture the global supply chains that rely on these organizations. This includes financial services, industrial, transportation and healthcare companies amongst others.

Despite the call for caution, and a year after the Biden Administration issued a cybersecurity executive order that centers around the importance of adopting a zero trust approach to strengthen the nation's cybersecurity, only 21% of critical infrastructure organizations studied adopt a zero trust security model, according to the report. Add to that, 17% of breaches at critical infrastructure organizations were caused due to a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.

Businesses that Pay the Ransom Aren't Getting a "Bargain"
According to the 2022 IBM report, businesses that paid threat actors' ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid. However, when accounting for the average ransom payment, which according to Sophos reached $812,000 in 2021, businesses that opt to pay the ransom could net higher total costs – all while inadvertently funding future ransomware attacks with capital that could otherwise be allocated to remediation and recovery efforts, and potentially exposing themselves to federal offenses.
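Put in explicit terms: saving $610,000 in breach costs by paying an average ransom of $812,000 leaves an organization roughly $812,000 − $610,000 = $202,000 worse off on average – before counting any downstream cost of having funded further attacks.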

The persistence of ransomware, despite significant global efforts to impede it, is fueled by the industrialization of cybercrime. IBM Security X-Force discovered the duration of studied enterprise ransomware attacks shows a drop of 94% over the past three years – from over two months to just under four days. These exponentially shorter attack lifecycles can prompt higher impact attacks, as cybersecurity incident responders are left with very short windows of opportunity to detect and contain attacks. With "time to ransom" dropping to a matter of hours, it's essential that businesses prioritize rigorous testing of incident response (IR) playbooks ahead of time. But the report states that as many as 37% of organizations studied that have incident response plans don't test them regularly.
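In concrete terms, a 94% reduction applied to a two-month (roughly 60-day) attack lifecycle leaves about 60 × 0.06 ≈ 3.6 days – ie, just under four days.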

Hybrid Cloud Advantage
The report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organizations studied. Averaging $3.8 million in breach costs, businesses that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model, which experienced $5.02 million and $4.24 million on average respectively. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.

The report highlights that 45% of studied breaches occurred in the cloud, emphasizing the importance of cloud security. However, a significant 43% of reporting organizations stated they are just in the early stages or have not started implementing security practices to protect their cloud environments, observing higher breach costs2. Businesses studied that did not implement security practices across their cloud environments required an average 108 more days to identify and contain a data breach than those consistently applying security practices across all their domains.

Additional findings in the 2022 IBM report include:

  • Phishing Becomes Costliest Breach Cause – While compromised credentials continued to reign as the most common cause of a breach (19%), phishing was the second (16%) and the costliest cause, leading to $4.91 million in average breach costs for responding organizations.
  • Healthcare Breach Costs Hit Double Digits for First Time Ever – For the 12th year in a row, healthcare participants saw the costliest breaches amongst industries, with average breach costs in healthcare increasing by nearly $1 million to reach a record high of $10.1 million.
  • Insufficient Security Staffing – Sixty-two percent of studied organizations stated they are not sufficiently staffed to meet their security needs, averaging $550,000 more in breach costs than those that state they are sufficiently staffed.

Additional Sources

  • To obtain a copy of the 2022 Cost of a Data Breach Report, please visit: https://www.ibm.com/security/data-breach.
  • Read more about the report's top findings in this IBM Security Intelligence blog.
  • Sign up for the 2022 IBM Security Cost of a Data Breach webinar on Wednesday, August 3, 2022, at 11:00 a.m. ET here.
  • Connect with the IBM Security X-Force team for a personalized review of the findings: https://ibm.biz/book-a-consult.

About IBM Security
IBM Security offers one of the most advanced and integrated portfolios of enterprise security products and services. The portfolio, supported by world-renowned IBM Security X-Force® research, enables organizations to effectively manage risk and defend against emerging threats. IBM operates one of the world's broadest security research, development, and delivery organizations, monitors 150 billion+ security events per day in more than 130 countries, and has been granted more than 10,000 security patents worldwide. For more information, please visit www.ibm.com/security, follow @IBMSecurity on Twitter or visit the IBM Security Intelligence blog.

Press Contact:

IBM Security Communications
Georgia Prassinos
gprassinos@ibm.com

1 Cost of a Data Breach Report 2022, conducted by Ponemon Institute, sponsored, and analyzed by IBM
2 Average cost of $4.53 million, compared to an average cost of $3.87 million at participating organizations with mature-stage cloud security practices


Gibson Wins Another Definitive Ruling On Its Iconic Shapes and Trademarks

PARAMOUNT APPOINTS PAMELA KAUFMAN TO LEAD INTERNATIONAL BUSINESS AS PRESIDENT AND CEO, INTERNATIONAL MARKETS, GLOBAL CONSUMER PRODUCTS & EXPERIENCES

The fifth CIIE benefits countries along the Belt and Road

Heritage Foods announces Q1FY23 results reporting

LIMES Randomised Controlled Trial about Sirolimus Coated Balloon in BTK takes the first Stride: patient enrolment started

Chandigarh University Student invents AI-based Smart Stick 'Netra' to make visually-impaired people self-reliant

CarTrade Tech reports a 47% Y-O-Y Growth in Revenue & PAT of Rs. 3.3 crore for Q1 FY23

TransFunnel Consulting is now a HubSpot elite partner

SafeTree: Technology-led innovation is enhancing the customer's digital journey

Empowering Better World with Technology, Hisense and FIFA Create a Perfect Future through Long-term Collaborations

DRIVING TO EUROPE: FOTON iBlue EV Truck Obtained EU WVTA Certificate

LEXUS INDIA ANNOUNCES CALL FOR ENTRIES FOR LEXUS DESIGN AWARD INDIA 2023

Cashfree Payments launches GSTIN Verification to help businesses verify GSTIN of vendors and partners

Rainshine Entertainment brings home their second National Film Award, wins the 'Best Editing' award for documentary 'Borderlands' at the 68th National Film Awards

Icertis Names Rajat Bahri as Chief Financial Officer

IPICOL & FICCI sign MoU for the upcoming Make in Odisha Conclave 2022

Tue, 02 Aug 2022 06:35:00 -0500 text/html https://news.webindia123.com/news/press_showdetailsPR.asp?id=1268114&cat=PR%20News%20Wire
C2040-421 exam dump and training guide direct download
Training Exams List