Cybersecurity professionals tasked with responding to attacks experience stress, burnout, and mental health issues that are exacerbated by a lack of breach preparedness and sufficient incident response practice in their organizations.
A new IBM Security-sponsored survey published this week, conducted by Morning Consult, found that two-thirds (67%) of incident responders suffer stress and anxiety during at least some of their engagements, while 44% have sacrificed the well-being of their relationships and 42% have suffered burnout. In addition, 68% of incident responders often have to work on two or more incidents at the same time, increasing their stress, according to the survey's results.
Companies that plan and practice responding to a variety of incidents can lower the stress levels of their incident responders, employees, and executives, says John Dwyer, head of research for IBM Security's X-Force response team.
"Organizations are not effectively establishing their response strategies with the responders in mind — it does not need to be as stressful as it is," he says. "There is a lot of time when the responders are managing organizations during an incident because those organizations were not prepared for the crisis, and these attacks happen every day."
The IBM Security-funded study underscores why the cybersecurity community has focused increasingly on the mental health of its members. About half (51%) of cybersecurity defenders have suffered burnout or extreme stress in the past year, according to a VMware survey released in August 2021. Cybersecurity executives have also spotlighted the issue as one that affects the community and companies' ability to retain skilled workers.
The IBM survey found that 62% of US-based incident responders sought mental health support as a result of their job, while 82% said their US companies had adequate programs and services in place to help workers.
"I've worked some really big incidents in the past with some clients that were very prepared, and I found that was really fulfilling work to do," Dwyer says. "I have had other incidents, where the company's incident response process was not ready, and that was very stressful."
Incident response professionals have three main reasons for pursuing the profession, the survey found. Thirty-six percent cited a sense of duty to protect others and the business as their top reason, 19% pointed to their interest in problem solving, and another 19% cited the continuous opportunities to learn.
However, some of those reasons are also the causes of stress for incident response professionals. Half of those surveyed cited managing expectations from multiple stakeholders as a top-three stressor, while 48% cited their sense of responsibility toward their client or business as a top-three stressor. Incident responders are very dedicated to their work, with a third (34%) working 13 or more hours a day during the most stressful periods of the incident response process, the survey found.
"The general public is probably not aware of how much these men and women are working long hours to make sure that people's lives and businesses are not impacted," Dwyer says.
The survey looked at incident responders in 10 different countries: Australia, Brazil, Canada, France, Germany, India, Japan, Spain, the United Kingdom, and the United States. Spain had the highest rate of burnout (69%), India saw the most significant impact on relationships, and Brazil had the most cases of insomnia, according to the survey data.
The largest group (39%) found the most stressful period of responding to a cybersecurity incident to be the first three days; 29% found the first 24 hours to be the most stressful; and some (20%) considered the entire first week to be the most demanding.
Companies need not only to be prepared to respond to an incident, but also to have practiced the response and to have playbooks that make response-focused activity second nature and take the stress off incident responders, says IBM Security's Dwyer.
"If I went to an organization and asked them to run a script on every system within 24 hours — how many could do that?" he says. "Organizations need to practice, practice, practice. Not just tabletop, but practice with purpose. Ask, 'What would happen if my business went offline for 24 hours and how do we deal with that?'"
Incident response is a firehose of experience that professionals have to be able to handle, and companies need to support the team as much as possible, Dwyer says. Mental health support is a good start, he says, but having a process in place to handle the early hours and days of an incident is better.
"Will every incident we respond to be a walk in the park? Probably not," he says. "However, we can make this life manageable. There is nothing like being a responder, but you grow as a person in ways like no other discipline."
It is critical for small business owners to protect confidential data proactively, because the average cost of a data breach is enormous. In a worst-case scenario, a data breach can even put a business out of operation entirely. In this post, you will learn what a data breach costs and what you can do to reduce that cost.
Some companies prepare data breach reports each year to help understand various aspects of data breach incidents.
Here are the top three reputed resources for data breach reports:
Sponsored by IBM Security, the Cost of a Data Breach Report features research conducted independently by the Ponemon Institute each year. For this report, the institute studies companies around the world that have been impacted by data breaches.
Each year, the Verizon Data Breach Investigations Report offers data-driven, real-world views of what commonly befalls companies with regard to cybercrime. This data breach study offers many actionable insights for beefing up your cybersecurity.
If you want to look at how data breaches affect consumers across various regions and industries, the ForgeRock Consumer Identity Breach Report is the right resource. The report also offers insights into how one can strengthen cybersecurity to stay protected from attacks.
Data breaches often have long-term impacts on businesses. In addition to the cost of loss of data and regulatory fines, data breaches also have indirect costs, including employee turnover, lost revenue, customer turnover, negative search results, etc.
The following are key findings pertinent to data breach cost:
The average global cost of a data breach reached $4.35 million in 2022, according to the data breach report from IBM. This is an all-time high, up 2.6% from the previous year.
Curious about the average total cost of a breach in the US? The answer is $9.44 million, the highest of any country.
Do you want to know the average cost of a data breach by industry? Look at the following data from the IBM report:
The healthcare industry has the highest data breach costs, and the public sector has the lowest costs.
Smishing, a type of phishing that relies on text messages to induce users to reveal sensitive data, is emerging as one of the most significant data breach threats.
A Proofpoint report states that smishing attacks doubled among US users in 2021. And data from the Federal Trade Commission (FTC) revealed that 378,119 SMS-related fraud reports were filed in 2021.
Double-checking messages that create a sense of urgency or fear, avoiding clicking suspicious links, and contacting banks and other authorities directly for account-related issues are some effective ways to protect yourself from smishing attacks.
Here are industries that are the main data breach victims, according to the Verizon Data Breach Investigations Report:
After threat actors have encrypted data in a ransomware attack, business owners often consider paying the ransom. In fact, 53% of companies opt to pay the ransom. However, paying isn’t always a good option: it encourages threat actors to target more businesses, and there is no certainty that you will regain full access to your data after the payment.
Uber paid hackers $100,000 to delete the compromised data but eventually spent $148 million in the final settlement.
The longer a breach goes undetected, the more time threat actors have to exfiltrate or encrypt data. So it is no surprise that a shorter data breach lifecycle (the time between the first detection of a breach and its containment) correlates with lower data breach costs.
A data breach lifecycle of less than 200 days was associated with a global average cost of $3.74 million in 2022, while a lifecycle of more than 200 days was associated with an average cost of $4.86 million.
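Put concretely, the gap between those two averages works out as follows (figures from the IBM 2022 report cited above):

```python
# Average breach cost by lifecycle length, per the IBM 2022 report.
fast = 3.74  # $M, lifecycle under 200 days
slow = 4.86  # $M, lifecycle over 200 days

premium = slow - fast
print(f"${premium:.2f}M extra on average, about {premium / fast:.0%} more")
```

In other words, a slow-to-contain breach costs roughly 30% more on average, which is why detection and containment speed figure so heavily in the cost findings below.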
The average cost of a data breach per record is $164 globally, according to the IBM report, up 1.2% from 2021.
Remote workforces also increase the cost of data breaches. According to the IBM report, companies where more than 80% of employees work remotely pay an average of $5.10 million in data breach costs.
According to the IBM report, the average data breach lifecycle duration is 277 days in 2022. In 2021, it took an average of 212 days to identify a breach and 75 days to contain it, making an average data breach lifecycle duration of 287 days. The longer the data breach lifecycle is, the more data breach costs will be.
When it comes to data breaches, small businesses are heavily impacted. In fact, 28% of data breaches in 2020 involved small businesses, according to the 2020 DBIR. A breach can also drive up the cost of products and services: the IBM report states that 60% of breached organizations raised their prices as a result.
Implementing security AI and automation, having an incident response team, focusing on risks, and adopting a zero-trust model can all cut down on the cost of a data breach, according to findings from the IBM report.
Now that you know what a data breach costs, it is time to strengthen your data security to protect customer data and any other kind of sensitive data from a potential breach.
IBM shook up the digital health space Friday with the news that it is selling its healthcare data and analytics assets, currently part of the Watson Health business, to an investment firm. The sale price is reportedly more than $1 billion, although the companies are not officially disclosing the financial terms.
There are a lot of interesting factors to consider as we unpack this news, although some thought leaders say the divestiture did not come as a surprise.
“The Watson Health sale has been anticipated for quite some time. IBM was clearly not gaining much traction in the healthcare market while others such as Google and Microsoft have pulled ahead. Even Oracle has made a big splash in healthcare with its recent announcement to acquire Cerner," said Paddy Padmanabhan, founder and CEO of Damo Consulting, a growth strategy and digital transformation advisory firm that works with healthcare and technology companies.
IBM was one of the first big tech companies to dive into healthcare with its well-known Watson supercomputer, famous for defeating the greatest champions on “Jeopardy!” The platform created a lot of buzz back in 2011, and many people had high hopes for its potential applications in healthcare. In recent years, however, that buzz has significantly died down.
"In the current competitive landscape, IBM would not be considered a significant player in healthcare. Selling off the data assets essentially means an end to the Watson Health experiment; however, it may allow IBM as an organization to refocus and develop a new approach to healthcare,” Padmanabhan said.
Assuming there are no regulatory snags, the deal is expected to close in the second quarter of this year.
“Today’s agreement with Francisco Partners is a clear next step as IBM becomes even more focused on our platform-based hybrid cloud and AI strategy,” said Tom Rosamilia, senior vice president of IBM Software. “IBM remains committed to Watson, our broader AI business, and to the clients and partners we support in healthcare IT. Through this transaction, Francisco Partners acquires data and analytics assets that will benefit from the enhanced investment and expertise of a healthcare industry focused portfolio.”
The agreement calls for the current management team to continue in similar roles in the new standalone company, serving existing clients in life sciences, provider, imaging, payer and employer, and government health and human services sectors.
“We have followed IBM’s journey in healthcare data and analytics for a number of years and have a deep appreciation for its portfolio of innovative healthcare products,” said Ezra Perlman, co-president at Francisco Partners. “IBM built a market-leading team and provides its customers with mission critical products and outstanding service.”
In 2016 IBM doubled the size of its Watson Health business through the $2.6 billion acquisition of Truven Health Analytics. Truven offers healthcare data services targeted at employers, hospitals, and drug companies, and makes software that can parse through millions of patient records. Truven's main offices are in Ann Arbor, MI, Chicago, and Denver. At the time of the acquisition, Truven had around 2,500 employees.
The Truven deal followed other major healthcare acquisitions in the company, including Cleveland-based Explorys, Dallas-based Phytel, and Chicago-based Merge Healthcare. The company paid about $1 billion for Merge.
IBM said the assets acquired by Francisco Partners include extensive and diverse data sets and products, including Health Insights, MarketScan, Clinical Development, Social Program Management, Micromedex, and imaging software offerings.
Padmanabhan said it will be interesting to see how the new owners are able to leverage those data assets.
“IBM’s decision to sell its data assets is an indication that it’s not just enough to have the data. Applying advanced analytics on the data to generate insights that can make a difference in real-world applications is where the true value lies. IBM had several missteps early on, especially in cancer care applications, that created significant setbacks for the business that they could not recover from.”
In 2018, the Watson Health business went through a round of layoffs. The company declined to tell MD+DI at the time how many employees were let go, other than to say it was a "small percentage" of the global business, but online commenters on TheLayoff.com and Watching IBM, along with multiple news reports citing unnamed sources from within the organization, painted a different picture of the situation. One Dallas-based commenter on TheLayoff.com said that "we all knew it was coming but nobody expected it to be this fast and rampant," while another commenter estimated that 80% of that same Dallas-based office was let go.
While we have seen a trend in recent years with big tech firms showing an interest in healthcare, some of those companies are finding those efforts to be easier said than done.
“IBM’s decision to sell the Watson Health assets is another instance of a big tech firm acknowledging the challenges of the healthcare space. Last year, Google and Apple had significant setbacks, and Amazon has acknowledged challenges in scaling its Amazon Care business," Padmanabhan said. "In IBM’s case, they have missed out on the cloud opportunity and have lagged behind peers in emerging technology areas such as voice. While IBM’s challenges with Watson Health may have been unique to the organization, the fact is that big tech firms have multiple irons in the fire at any time, and for some healthcare may just be too hard.”
Padmanabhan does not think, however, that IBM's decision to sell the Watson Health assets is an indictment of the promise of AI in healthcare.
"Our research indicates AI was one of the top technology investments for health systems in 2021," he said. "Sure, there are challenges such as data quality and bias in the application of AI in the healthcare context, but by and large there has been progress with AI in healthcare. The emergence of other players, notably Google with its Mayo Partnership, or Microsoft with its partnership with healthcare industry consortium Truveta are strong indicators of progress."
Padmanabhan is co-author with Edward W. Marx, of Healthcare Digital Transformation: How Consumerism, Technology and Pandemic are Accelerating the Future (2020), and the host of The Big Unlock, a podcast focusing on healthcare digital transformation.
The world of cryptography moves at a slow but steady pace. New cryptography standards must be vetted over an extended period, so new threats to existing standards need to be judged on decades-long timelines: updating crypto standards is a multiyear journey. Quantum computing is an important threat looming on the horizon. Based on Shor’s algorithm, crypto experts estimate that sufficiently powerful quantum computers will be able to crack asymmetric encryption. In addition, Grover’s algorithm provides a quadratic speedup for brute-force key search against symmetric encryption. The question these same crypto experts try to answer is not if this will happen, but when.
Today’s crypto algorithms use mathematical problems such as factorization of large numbers to protect data. With fault-tolerant quantum computers, factorization can be solved in theory in just a few hours using Shor’s algorithm. This same capability also compromises cryptographic methods based on the difficulty of solving the discrete logarithm problems.
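To make the scale of the threat concrete, here is a minimal sketch of the conventional rule of thumb: Grover's search quadratically speeds up key search, effectively halving a symmetric key's security level, while Shor's algorithm breaks factoring- and discrete-log-based schemes outright. The function name and the simple halving rule are an illustrative simplification, not part of any standard.

```python
# Rule-of-thumb impact of quantum algorithms on today's crypto.
# Grover: quadratic speedup on key search -> an n-bit symmetric key
# offers roughly n/2 bits of security against a quantum attacker.
# Shor: solves factoring and discrete logs outright -> RSA and ECC fall.
def grover_effective_bits(key_bits: int) -> int:
    """Approximate post-Grover security level of a symmetric key."""
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{grover_effective_bits(key_bits)} bits vs a quantum attacker")
```

This is why guidance on quantum readiness often recommends AES-256 for long-lived symmetric keys, while asymmetric schemes need to be replaced rather than resized.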
The term used to describe these new, sturdier crypto standards is “quantum safe.” The challenge is we don’t know exactly when fault-tolerant quantum computers will have the power to consistently break existing encryption standards, which are now in wide use. There’s also a concern that some parties could get and store encrypted data for decryption later, when suitably capable quantum computers are available. Even if the data is over ten years old, there still could be relevant confidential information in the stored data. Think state secrets, financial and securities records and transactions, health records, or even private or classified communications between public and/or government figures.
The U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) believes it’s possible that RSA-2048 encryption will be crackable by 2035. Other U.S. government agencies and other security-minded entities have similar timelines. Rather than wait until the last minute to upgrade security, NIST launched a competition to develop quantum-safe encryption back in 2016. After several rounds of reviews, on July 5th of this year, NIST chose four algorithms for the final stages of review before setting the standard. IBM developed three of them, and two of those are supported in IBM’s Z16 mainframe today.
The new IBM crypto algorithms are based on a family of math problems called structured lattices, which are believed to remain hard even for quantum computers. A structured lattice problem requires solving for two unknowns, a multiplier array and an offset, at the same time. The shortest vector problem (SVP) and the closest vector problem (CVP), upon which lattice cryptography is built, are considered extremely difficult for a quantum computer to solve. Each candidate crypto algorithm is evaluated not just for data security but also for performance: the overhead cannot be too large for widespread use.
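As an illustration of the "multiplier array and offset" structure, here is a toy sketch of the Learning With Errors (LWE) problem that underlies structured-lattice schemes. The parameters are deliberately tiny and insecure; real schemes such as CRYSTALS-Kyber use much larger, carefully chosen values.

```python
import random

# Toy Learning With Errors (LWE) sample: illustrative only, not secure.
q = 97  # small prime modulus (toy size)
n = 4   # dimension of the secret vector

random.seed(1)
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]  # public multiplier array
s = [random.randrange(q) for _ in range(n)]                      # secret vector
e = [random.choice([-1, 0, 1]) for _ in range(n)]                # small "offset" noise

# Public sample: b = A*s + e (mod q). With e = 0, recovering s from
# (A, b) is easy linear algebra; the small noise turns it into a hard
# lattice problem, which is what the quantum-safe schemes rely on.
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]
print(b)
```

The attacker sees only A and b; the noise vector e is what makes inverting the linear system a lattice problem rather than simple Gaussian elimination.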
The final selections are expected in 2024, but there’s still a chance there will be changes before the final standards are released.
IBM Supports Quantum Safe in New Z-Series Mainframes
IBM made a strategic bet before the final NIST selections. The recently released IBM Z16 Series computers already support two of the final four quantum-safe crypto candidates: the CRYSTALS-Kyber public-key encryption algorithm and the CRYSTALS-Dilithium digital signature algorithm. IBM is set to work with the industry to validate these algorithms in production systems. Initially, IBM is using its tape drive storage systems as a test platform. Because tape is often used for cold storage, it’s an excellent medium for long-term data protection. IBM is working with its client base to find the appropriate way to roll out quantum-safe encryption to the market. This must be approached as a life-cycle transformation, not just a rip-and-replace process. In fact, IBM is working with its customers to create a crypto-agile solution, which allows the exact crypto algorithm to change at any point in time without disrupting the entire system. With crypto-agility, the algorithm is abstracted from the system software stack so new algorithms can be deployed seamlessly. IBM is also developing tools that make crypto status part of overall observability, with a dashboard for viewing crypto events and related data.
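A crypto-agile design can be sketched as an abstraction layer like the following. The class names and registry here are hypothetical illustrations, not IBM's actual API: the point is that application code depends only on the abstract interface, so a lattice-based signer could later be registered under a new name without touching the rest of the stack.

```python
from abc import ABC, abstractmethod
import hashlib
import hmac

class Signer(ABC):
    """Abstract signing interface the rest of the stack depends on."""
    name: str

    @abstractmethod
    def sign(self, message: bytes) -> bytes:
        ...

class HmacSha256Signer(Signer):
    """Stand-in for a classical algorithm (hypothetical example)."""
    name = "hmac-sha256"

    def __init__(self, key: bytes):
        self._key = key

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self._key, message, hashlib.sha256).digest()

# Deployments pick the algorithm by configuration alone; a quantum-safe
# signer (e.g., one wrapping CRYSTALS-Dilithium) would be registered here
# later without changing any calling code.
REGISTRY = {HmacSha256Signer.name: HmacSha256Signer}

def get_signer(algorithm: str, key: bytes) -> Signer:
    return REGISTRY[algorithm](key)

sig = get_signer("hmac-sha256", b"example-key").sign(b"payload")
print(len(sig))  # 32-byte HMAC tag
```

Swapping algorithms then becomes a configuration change plus a key rollover, which is the "no rip and replace" property the article describes.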
These new algorithms must be deployable to existing computing platforms, even at the edge. However, it’s not going to be feasible to upgrade every system; it will probably be an industry-by-industry effort, and industry consortia will be required. For example, IBM, the GSMA (Global System for Mobile Communication Association), and Vodafone recently announced they will work via a GSMA Task Force to identify a process for implementing quantum-safe technologies across critical telecommunications infrastructure, including the networks underpinning internet access and public utility management. The telecommunication network carries financial data, health information, public-sector infrastructure systems, and sensitive business data, all of which need to be protected as they traverse global networks.
What’s Next for Quantum Safe Algorithms
Fault-tolerant quantum computing is coming. When it will be available is still a guessing game, but the people who most care about data security are targeting 2035 to have quantum-safe cryptographic algorithms in place to meet the threat. But that’s not good enough. We need to start protecting critical data and infrastructure sooner than that, considering the length of time systems are deployed in the field and data is stored. Systems such as satellites and power stations are not easy to update in the field.
And there’s data that must be stored securely for future retrieval, including HIPAA-protected medical records, tax records, Toxic Substances Control Act data, clinical trial data, and more.
Even after the deployment of these new algorithms, this is not the end: there may still be developments that can break even the next generation of quantum-safe algorithms. The struggle between those who want to keep systems and data safe and those who want to crack them continues, which is why companies should look to build crypto agility into their security plans.
Tirias Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud. Members of the Tirias Research team have consulted for IBM and other companies throughout the Security, AI and Quantum ecosystems.
“We do business with … the top 100 insurance companies in the world,” said Mark McLaughlin (pictured), IBM’s general manager of insurance.
That reality appears to be not widely known.
“The number one complaint I hear from our customers, from our insurtech partners and from our salespeople is ‘I didn’t know IBM did that,’” McLaughlin said during the recent ITC 2022 conference in Las Vegas. “I don’t think we’ve gotten the word out as much as we need to.”
McLaughlin, speaking during the recent ITC 2022 conference in Las Vegas, explained that the company’s insurance industry business is quite substantive, helping insurer clients pull together software, data, AI, security, and services. IBM also offers process consulting, process automation, technical assistance support and hardware capabilities.
IBM’s insurance reach includes personal lines, commercial lines, life insurance, group benefits and reinsurance, McLaughlin noted.
McLaughlin, in his current position for about six months, handled R&D strategies for the insurance industry at IBM before that. Going back further, he’s a 15-year veteran of IBM’s insurance practice.
McLaughlin noted that many insurers still use mainframes and have huge investments in their core operational systems, while others are looking at cloud-based applications and trying to figure out how to customize that technology to their needs. Still others are trying to partner with insurtechs to take advantage of their distribution, digital capabilities, models and newer sources of data. IBM, he said, works in all scenarios.
“We catalyze [products and services] across that,” McLaughlin said. “IBM is really investing a great deal of money to provide the sort of connective glue that helps connect the legacy systems that you might still need, and also modernize the systems that you decide you don’t need.”
The goal, he said, is to connect those technology pieces to data in the insurtech “layer.”
IBM also helps insurance companies automate, he said.
“I’ve got automation tools, AI models, I’ve got different AI capabilities, but I have to deploy those across an average of 14 policy administration systems that insurance companies carry,” McLaughlin said.
IBM has long provided business consulting to insurance companies, via its database tools, to help them with challenges such as pricing more efficiently. The emergence of insurtechs has led IBM to expand its focus toward helping clients partner with startups. Insurers have plenty in this area with which they need help, McLaughlin explained.
“People think the insurtechs are trying to displace the insurers … but most of them, honestly, just want to partner with the insurer and build a better mousetrap and do claims faster or market better,” McLaughlin said. “We are about, how do you connect with [them] quickly. The number one thing that insurers complain to us about is speed to market. They know they have to roll out value-added services. They know they have to have a mobile experience that connects with insurers. Doing that with the sprawl of IT that insurers have today is really hard and takes a long time.”
In the insurtech age, some insurers are getting rid of their old legacy systems and replacing them with cloud-based platforms that are easier to adapt and modernize. Insurtechs, in turn, often pronounce those older networks as outdated and unnecessary in today’s market. According to McLaughlin, it’s not as simple as getting rid of an older system and replacing it with something new.
“You have to think about ‘what are the characteristics of your insurance workload’ and ‘where is the best place to run that workload?’ In that world, there’s going to be a mix of public cloud, private cloud, mainframe and specialized data appliances in … an existing data center,” McLaughlin said. “Most insurers report that as the future.”
At the same time, McLaughlin declined to advocate for a specific must-have technology for insurers in the insurtech age. Instead, he urges insurers to avoid “locking” themselves into a rigid technology option.
“If you are stuck on one core, if you are stuck with one cloud, you are limited in your ability to compete relative to other insurance companies,” McLaughlin said.
AI and in-depth data analytics are also particularly useful these days toward helping insurers strengthen their customer base, he added.
“We have to know our customers better [and] we have to know our risks better,” McLaughlin said. “We have to be able to deploy those insights in ways that help insurance and healthcare distributors serve insurance more effectively.”
Katya Laviolette, chief people officer at 1Password. Passionate about all things people.
If competition for talent is fierce, competition for cybersecurity talent has become ferocious. Job openings in this field have increased by 43% since last year, according to data from CyberSeek—more than double the growth rate of new job listings overall (18%).
It’s not hard to understand why organizations are desperately seeking experts in data protection. Cyberattacks are soaring, with 68% more data breaches in 2021 than in 2020. The implications are reputationally and financially devastating: The average breach costs organizations an estimated $4.35 million, according to the "Cost of a Data Breach Report 2022" from IBM and the Ponemon Institute.
This translates into a hyper-competitive talent market for cybersecurity professionals, as the demand-supply imbalance creates a plethora of opportunities. For employers having trouble filling these critical roles, the picture is far more challenging. So how can organizations develop and retain strong, committed teams of top-notch cybersecurity talent? Here are seven strategies that have worked for my organization.
1. Recruit for the soft skills and train for the hard skills.
Not everyone realizes it, but soft skills are critical to cybersecurity work. According to Verizon’s "2022 Data Breach Investigations Report," 82% of breaches are a result of human error, and many of those could be avoided with thoughtful human direction.
As companies continue to embrace agile ways of working, it’s critical that technical experts can effectively communicate critical security approaches with colleagues and customers who don’t have a technology background. Communication skills, leadership qualities, active listening, empathy and an ability to instill trust in others are critical assets that are difficult to teach.
2. Look within.
At my company, we have employees who began in our customer support department and then moved into security as the next step in their careers. While customer support and security may seem like very different divisions, there’s a lot of overlap in the qualities that make people successful at these jobs.
Look within your company for potential talent, especially among teams on the front lines who are engaging directly with both the product and the customer. They may have detailed familiarity with your products, know how users interact with them, have empathy for users and know what it takes to put concerns to rest.
3. Seek out high-potential junior candidates.
Cybersecurity is evolving quickly. Junior candidates who demonstrate a commitment to immersing themselves in the field may soon be ready for more senior positions. Those who participate in extracurricular activities, or have taken on side projects, build transferable skills and are likely to continue growing as the cybersecurity field evolves.
4. Look beyond education.
While some security roles may require certifications, experience comes in many forms. In tech, some of the most successful people are self-taught. We love to see boot camps and other forms of grassroots education listed on candidates’ résumés—it reflects a commitment to the field and can also increase the diversity of the candidate pool from several angles.
Another recruitment strategy is to consider applicants from adjacent disciplines. While it might be challenging to teach security professionals how to become good software developers, it might be easier to teach security concepts to software developers.
5. Bug bounty programs could be another avenue for companies to pursue talent.
More companies now run bug bounty programs that give “ethical hackers” the ability to test a company’s security by attempting to break into its systems, all in pursuit of fortifying the company’s defenses. These ethical hackers are as creative as the best artists; they have creativity, motivation and excellent technique. While technical knowledge is an important prerequisite—a sculptor must know how to sculpt—technical knowledge only brings one so far.
So, ask an ethical hacker for their portfolio. Consider asking them to share more information about the work they’ve produced or look into their GitHub repos or blogs.
6. Extinguish burnout to prevent 'quiet quitting.'
Our company’s "State of Access Report" found that security professionals are more prone to burnout than other members of the workforce. We found that because of burnout, they are “completely checked out” and “doing the bare minimum at work.”
Last year, burnout largely fueled the mass exodus of employees labeled the Great Resignation; this year, it’s translating into “quiet quitting,” in which employees resolve to do only what their job requires and nothing more.
Companies that invest in recruiting and hiring top security talent must protect their investments by supporting employees, connecting them to a sense of purpose, and ensuring they are challenged, engaged and energized by their work. This may include competitive compensation packages, monitoring workloads to ensure no one is overburdened, taking mental health challenges seriously, offering attractive benefits and, whenever possible, giving workers flexibility in when and where they work.
7. Develop a security- and human-focused culture.
It’s more effective to create a work environment that encourages productivity than it is to enforce it. Workplace processes should be regularly reviewed to enable productivity and remove blockers.
One great way to enable productivity and support security professionals is by making it as easy as possible for all employees to adhere to security protocols. Because we’re all human, this means making the secure thing to do the easy thing to do.
Even if we enter a recession, I believe the red-hot market for cybersecurity talent is unlikely to cool substantially. If companies are willing to get creative and embrace new strategies to identify, attract and support talent, they could take the lead in securing—and holding onto—security teams that are as valuable as the assets they’re charged with protecting.
Forbes Human Resources Council is an invitation-only organization for HR executives across all industries.
A four-year bachelor’s degree has long been the first rung to climbing America’s corporate ladder.
But the move to prioritize skills over a college education is sweeping through some of America’s largest companies, including Google, EY, Microsoft, and Apple. Strong proponents say the shift helps circumvent a needless barrier to workplace diversity.
“I really do believe an inclusive diverse workforce is better for your company, it’s good for the business,” Ginni Rometty, former IBM CEO, told Fortune Media CEO Alan Murray during a panel last month for Connect, Fortune’s executive education community. “That’s not just altruistic.”
Under Rometty’s leadership in 2016, tech giant IBM coined the term “new collar jobs” in reference to roles that require a specific set of skills rather than a four-year degree. It’s a personal commitment for Rometty, one that hits close to home for the 40-year IBM veteran.
When Rometty was 16, her father left the family, leaving her mother, who’d never worked outside the home, suddenly having to provide.
“She had four children and nothing past high school, and she had to get a job to…get us out of this downward spiral,” Rometty recalled to Murray. “What I saw in that was that my mother had aptitude; she wasn’t dumb, she just didn’t have access, and that forever stayed in my mind.”
When Rometty became CEO in 2012 following the Great Recession, the U.S. unemployment rate hovered around 8%. Despite the influx of applicants, she struggled to find employees who were trained in the particular cybersecurity area she was looking for.
“I realized I couldn’t hire them, so I had to start building them,” she said.
In 2011, IBM launched a corporate social responsibility effort called the Pathways in Technology Early College High School (P-TECH) in Brooklyn. It’s since expanded to 11 states in the U.S. and 28 countries.
Through P-TECH, Rometty visited “a very poor high school in a bad neighborhood” that received the company’s support, as well as a community college where IBM was offering help with a technology-based curriculum and internships.
“Voilà! These kids could do the work. I didn’t have [applicants with] college degrees, so I learned that propensity to learn is way more important than just having a degree,” Rometty said.
Realizing the students were fully capable of the tasks that IBM needed moved Rometty to return to the drawing board when it came to IBM’s own application process and whom it was reaching. She said that at the time, 95% of job openings at IBM required a four-year degree. As of January 2021, less than half do, and the company is continuously reevaluating its roles.
For the jobs that no longer require degrees and instead rely on skills and a willingness to learn, IBM had always hired Ph.D. holders from the very best Ivy League schools, Rometty told Murray. But the data shows that hires without degrees performed just as well in those same jobs. “They were more loyal, higher retention, and many went on to get college degrees,” she said.
Rometty has since become cochair of OneTen, a civic organization committed to hiring, promoting, and advancing 1 million Black individuals without four-year degrees within the next 10 years.
If college degrees are no longer compulsory for white-collar jobs, many other qualifications—skills that couldn’t be easily taught in a boot camp, apprenticeship program, or in the first month on the job—could die off, too, University of Virginia Darden School of Business professor Sean Martin told Fortune last year.
“The companies themselves miss out on people that research suggests…might be less entitled, more culturally savvy, more desirous of being there,” Martin said. Rather than pedigree, he added, hiring managers should look for motivation.
That’s certainly the case at IBM. Once the company widened its scope, Rometty said, the propensity to learn quickly became a more important hiring factor than a degree.
This story was originally featured on Fortune.com
IBM Corp. is making some big changes to its data storage services, announcing today that it will bring Red Hat Inc.’s storage products and associated teams under the “IBM Storage” umbrella.
The aim, IBM said, is to deliver a more consistent application and data storage experience across on-premises and cloud infrastructures. It’s a big move that will see IBM Spectrum Fusion data management software adopt the storage technologies of Red Hat’s OpenShift Data Foundation as its new base layer.
Even more interesting, perhaps, is that the open-source Red Hat Ceph Storage offering will be transformed into a new IBM Ceph storage offering. IBM said this will result in a unified, software-defined storage platform that’s better able to bridge the architectural divide between data centers and cloud computing providers.
The computing giant said the move is in line with its software-defined storage strategy of a “born in the cloud, for the cloud” approach that will unlock bidirectional application and data mobility based on a shared, secure and cloud-scale solution.
IBM Systems General Manager of Storage Denis Kennelly said the shift is designed to streamline the two companies’ portfolios. “By bringing together the teams and integrating our products under one roof, we are accelerating IBM’s hybrid cloud strategy while maintaining commitments to Red Hat’s customers and the open-source community,” he insisted.
The company presented the changes as a big win for customers, saying they will gain access to a more consistent set of storage services that preserve data resilience, security and governance across bare metal, virtualized and containerized environments. More specifically, IBM is promising that customers will have a more unified storage experience for container-based applications running on Red Hat OpenShift, with the ability to use IBM Spectrum Fusion, which is now based on Red Hat OpenShift Data Foundation. Doing so will provide higher performance, greater scale and more automation for OpenShift applications that require block, file and object access to data, the company said.
As for IBM Ceph, the company said this will deliver a more consistent hybrid cloud experience with enterprise-grade scale and resiliency.
Furthermore, by unifying IBM’s and Red Hat’s storage technologies, customers will be able to build a single data lakehouse on IBM Spectrum Scale to aggregate all of their unstructured data in one place. Benefits will include less time spent on maintenance, reduced data movement and redundancy, and more advanced schema management and data governance.
Industry watchers were united in their belief that the changes would be of benefit to customers. Steve McDowell of Moor Insights & Strategy told SiliconANGLE that today’s move makes a lot of sense because it enables IBM to leverage the storage strengths of both companies.
McDowell explained that although IBM Spectrum is considered to be one of the most comprehensive data management platforms around, its foundation predates the rise of cloud-native technologies. On the other hand, he said, Red Hat OpenShift was built from the ground up to support cloud-native workloads.
“IBM is evolving Spectrum Fusion to take the best of Red Hat’s efforts, and is using Red Hat’s storage software as the base for its IBM-branded products moving forward,” McDowell said. “It makes a lot of business sense for IBM to leverage R&D from Red Hat into its more traditionally proprietary systems. It also gives IBM an easy path to better serve the needs of containerized workloads.”
International Data Corp. analyst Ashish Nadkarni said the two companies are now “speaking with one voice on storage” and finally delivering on the synergies between them that were mentioned when IBM acquired Red Hat in 2019.
“The combining of the two storage teams is a win for IT organizations as it brings together the best that both offer: An industry-leading storage systems portfolio meets an industry-leading software-defined data services offering,” Nadkarni said. “This initiative enables IBM and Red Hat to streamline their family of offerings, passing the benefits to their customers.”
IBM also moved to reassure users of Red Hat’s open-source technologies that it will remain fully committed to them following today’s announcements. As part of the deal, IBM will take over Premier Sponsorship of the Ceph Foundation and, along with Red Hat’s teams, continue to drive innovation and development. Both IBM Ceph and Red Hat OpenShift will remain 100% open-source, the company added, and will continue to follow an upstream-first development model.
McDowell said today’s move would likely make some users nervous about the prospect of Red Hat’s technology becoming more proprietary over time. “IBM has been very careful since it acquired Red Hat in 2019 to keep Red Hat’s open-source business segregated from IBM’s branded offerings,” he said. “This is the first time we’re seeing IBM cross that line, and it’s natural to wonder how blurred those lines will become.”
Still, McDowell said, he’s inclined to believe IBM’s promises as it has been very deliberate about keeping Red Hat’s storage technologies open-source.
“Red Hat OpenShift Data Foundation and Ceph will still be available as they always have been, though their evolution will undoubtedly be more strongly guided by the needs of IBM’s storage business,” the analyst continued. “Overall this is a net positive for IBM and its customers. It makes good business sense and there should be minimal impact to Red Hat’s existing community.”
IBM said the first storage solutions to launch under the new IBM Ceph Storage and IBM Spectrum Fusion banners will arrive in the first half of 2023, so users will have plenty of time to digest the changes.
Photo by Desola Lanre-Ologun on Unsplash
Everywhere you turn, in nearly every industry, there are labour challenges.
In manufacturing, 82% of businesses are looking for help, and for the first time since 2015 there are more jobs available than there are people to fill them.
In Ontario, the Chamber of Commerce reports that 60% of its members are having trouble filling roles in health care, retail, construction, tourism, and financial services.
And despite recent layoffs at technology companies, a 2021 Information and Communications Technology Council report forecasts that 11% of all employment in Canada will be in the digital economy by 2025, requiring 250,000 more people to fill roles.
Labour shortages are being fuelled by a growing number of people retiring, combined with a decline in immigration during the pandemic. The problem is exacerbated by a mismatch between available jobs and skills. Digital transformation, especially the rush to digital-first initiatives during the COVID-19 pandemic, has meant that technology and related jobs are in high demand. A whopping 80% of businesses surveyed by KPMG say they need more workers with digital skills.
So where does one look to help fix the talent shortage and skills gap?
“Upskilling, reskilling, and teaching students digital and technical skills is going to be a critical step in meeting growing demand in Canada,” says Lila Adamec, Leader of Academic Integration and Innovation at IBM Canada Lab. “Young students have a huge opportunity to help lead a new digital generation, and many skilled people already in the workforce can look to have their skills upgraded to learn more about cloud, or AI, or analytics, for example.”
Adamec has been working on this problem for years.
In 2017 she developed a program at IBM to address this skills gap by delivering turnkey curriculum solutions to universities across Canada. Essentially a “complete curriculum solution,” as Adamec calls it, the program gives higher-ed institutions a complete toolkit, including curriculum material, access to IBM’s enterprise software, tests, and microcredentials. The curriculum is designed to address skills gaps commonly found in the tech sector and to provide learners with “skill security.”
“With the gig economy, skill security is important, so that you can move from one gig to the next,” Adamec says. “Or if you need to upskill in an area or level up your education, and quickly get the skills you need. So much of this is becoming an agile service offering.”
With this solution, everyone wins: students get upskilling potential, and colleges and universities meet academic criteria with a curriculum that’s approved by government ministries and professional associations.
Adamec rolled out the [email protected] program in 2018, and since then it’s been adopted by dozens of institutions including York University, Mohawk College, Bow Valley College, SAIT, NAIT, Holland College, and Vanier College, to name a few. Classes are often filled to capacity or over-subscribed, and students credit the courses for helping them land their dream jobs.
Since its rollout, Adamec says IBM Canada has delivered more than 3,000 microcredentials to students who have taken their courses.
The education space has undergone a radical transformation in the last five years. First because of evolving student needs and increased demand for newer tech-focused curriculum, then because the pandemic forced a rethink of how and where people attend post-secondary institutions.
While the hybrid classroom is now more commonplace, colleges and universities still face pressure to deliver curriculum that is in line with what students need in today’s workforce, and that is a real challenge.
This stems from the fact that a university or college course needs to be approved by a government ministry, which can take years. The process typically requires at least one full-time person to prepare a curriculum for ministry approval, and it can cost more than $1 million per course when all is said and done. As a result, some colleges and university programs opt for a standardized approach so students achieve a minimum set of skills that correspond to the industry or sector.
It’s no wonder that many professors will teach the same material, without extensive updates, for 5-7 years.
When Adamec designed the [email protected] program, she knew it had to meet ministry approval, it needed to provide students with real-world skills that are current, and it had to be done faster and cheaper than traditional means.
Adamec and her team positioned IBM to be more of a business partner than just a solutions provider. With on-staff personnel who develop course curriculum for higher ed, academic partners don’t need to bear the cost or burden of full-time staff to manage curriculum development. And because IBM has a team of technical experts, the enterprise-level software offering can be paired with curriculum to deliver a cutting-edge, up-to-date learning experience.
These academic solutions are updated frequently, and delivered to universities at a huge cost savings, functioning more like a Software as a Service (SaaS) offering. By flipping the standard, slower model of curriculum development on its head, Adamec and her team deliver subscription-based, digital solutions that’ll help schools and learners better access tech skills to fill a desperate workforce need.
With huge success across the Canadian academic landscape, Adamec will bring the [email protected] experience to the WeaveSphere technology conference happening in November.
Attracting industry leaders, academics and developers, WeaveSphere is an innovation event taking place Nov 15-17 in Toronto.
As part of the conference’s Education Day, developed by Adamec, high school, undergraduate, and graduate students will have the opportunity to participate in a full-day bootcamp on Design Thinking, a problem-solving methodology that begins by identifying the desired outcome and the stakeholders involved. Students will also earn an IBM MyLearning microcredential at the end of the course. The course will run twice, with 80 spots per session.
“We are excited about giving students and academics the opportunity to weave ideas and research with challenges of the business world,” says Adamec. “With Enterprise Design Thinking, everyone is welcome to the problem-solving table and we find these moments are incredible opportunities for young people to engage with, and learn from people in the workforce.”
The WeaveSphere Education Day and IBM’s Design Thinking course give students a valuable learning experience and a leg up in both their professional and personal lives.
“It’s the type of course that is open to all learning disciplines,” she elaborates. “Whether you’re in finance, liberal arts, tech, engineering, medicine — whatever you’re studying, you don’t have to be an IT guru to do Design Thinking.”
And it’s these opportunities that make WeaveSphere an impactful experience for students. The opportunity to ‘weave’ with and learn from experts and industry leaders is, simply put, unforgettable.
As student and passionate “weaver” Ali Hamdy explained, “it was cool having a voice among people who are way more experienced than me and way more educated than me. And it was cool for them to actually sit down and listen to me and respond.”
Interested in earning a Design Thinking microcredential? Visit WeaveSphere for more information and to register for Education Day.
Digital Journal is an official media partner for WeaveSphere. We will share updates leading up to the event, and we’ll be live on location from November 15-17, 2022. Join us and get your tickets at weavesphere.co.