But IBM has always carved its own path. For example, the Armonk, NY-based company doesn’t use the term “Great Resignation,” at least internally. Of course, that doesn’t mean the tech giant isn’t aware of the nationwide talent shortage and the highly competitive labor market that’s resulted.
“This is a time to ensure we re-engage our population,” Louissaint says. “By nature of my title, my goal is to continue to transform and pivot our company toward being more growth-minded, transforming it directly through leadership: leadership development, getting people in the right jobs and ensuring we have the right succession plans.”
Like many companies since the COVID-19 pandemic, IBM has relied upon its business resource groups – its label for employee resource groups (ERGs) – to maintain and even boost retention. Traditionally, ERGs consist of employees who volunteer their time and effort to foster an inclusive workplace. Due to their motivations, needs and the general nature of ERG work, employees who lead these groups are more likely to be Black, Indigenous and People of Color (BIPOC) and oftentimes women. ERGs are a way for underrepresented groups to band together to recruit more talent like them into their companies and make sure that talent feels supported and gets promoted.
“It’s a lot easier to leave a company where you’ve only interacted with colleagues through a screen,” Louissaint says. “Our diversity groups and communities have gotten a lot stronger, which builds commitment to the company and community to each other. We’ve found that through our communities, business resource groups, open conversations and by democratizing leadership by using virtual technologies like Slack, the company has become smaller and the interactions are a lot more personal.”
A major contributor to the Great Resignation has been the push for workers to return to the office. While Apple and Google ruffled feathers by asking employees back for at least a couple of days a week, Tesla went a step further, demanding employees report to the office five days a week, as if the COVID-19 pandemic had never happened.
Ahead of the game, IBM was one of the first major tech firms to embrace remote work, with as much as 40% of its workforce at home during the 2000s. A shift came in 2017, but since the pandemic, only 20% of the company’s U.S. employees are in the office for three days a week or more, according to IBM CEO Arvind Krishna. In June, Krishna added that he doesn’t think the balance will ever get back to more than 60% of workers in the office.
“We’ve always been defined by flexibility, even prior to the pandemic that’s what we were known for and what differentiated us,” Louissaint says. “Continuing to double down on flexibility has been a value to us and to our people.”
IBM has also been defined by its eye toward the future, particularly when it comes to workforce development. Over the past decade, the tech giant has partnered with educational institutions, non-governmental organizations and other companies to discover and nurture talent from untapped pools and alternative channels. Last year, the company vowed to train 30 million individuals on technical skills by 2030.
“Our people crave learning and are highly curious,” Louissaint says, adding that the average IBM employee consumes about 88 hours of learning through its platform each year. Nearly all (95%) employees are on the platform in any given quarter.
“We’ve been building a strong learning environment where employees can build new skills and drive toward new jobs and experiences,” he says. “We also find that the individuals who consume the most learning are more likely to get promoted. It’s 30% more likely for a super learner to be promoted or switch jobs, so the incentive is continued growth and opportunity for advancement.”
Interested, he talked to a counselor to learn more about P-TECH, an early college program where he could earn an associate’s degree along with his high school diploma. Liking the sound of the program, he enrolled in the inaugural P-TECH class as a freshman at Longmont’s Skyline High School.
“I really loved working on computers, even before P-TECH,” he said. “I was a hobbyist. P-TECH gave me a pathway.”
IBM hired him as a cybersecurity analyst once he completed the apprenticeship.
“P-TECH has given me a great advantage,” he said. “Without it, I would have been questioning whether to go into college. Having a college degree at 18 is great to put on a resume.”
Litow’s idea was to get more underrepresented young people into tech careers by giving them a direct path to college while in high school — and in turn create a pipeline of employees with the job skills businesses were starting to value over four-year college degrees.
The program, which includes mentors and internships provided by business partners, gives high school students up to six years to earn an associate's degree at no cost.
In Colorado, St. Vrain Valley was among the first school districts chosen by the state to offer a P-TECH program after the Legislature passed a bill to provide funding — and the school district has embraced the program.
Colorado’s first P-TECH programs started in the fall of 2016 at three high schools, including Skyline High. Over the last six years, 17 more Colorado high schools have adopted P-TECH, for a total of 20. Three of those are in St. Vrain Valley, with a fourth planned to open in the fall of 2023 at Longmont High School.
Each St. Vrain Valley high school offers a different focus supported by different industry partners.
Skyline partners with IBM, with students earning an associate’s degree in Computer Information Systems from Front Range. Along with being the first, Skyline’s program is the largest, enrolling up to 55 new freshmen each year.
Programs at the other schools are capped at 35 students per grade.
Frederick High’s program, which started in the fall of 2019, has a bioscience focus, partners with Aims Community College and works with industry partners Agilent Technologies, Tolmar, KBI Biopharma, AGC Biologics and Corden Pharma.
Silver Creek High’s program started a year ago with a cybersecurity focus. The Longmont school partners with Front Range and works with industry partners Seagate, Cisco, PEAK Resources and Comcast.
The new program coming to Longmont High will focus on business.
District leaders point to Skyline High’s graduation statistics to illustrate the program’s success. At Skyline, 100 percent of students in the first three P-TECH graduating classes earned a high school diploma in four years.
For the 2020 Skyline P-TECH graduates, 24 of the 33, or about 70 percent, also earned associate’s degrees. For the 2021 graduating class, 30 of the 47 have associate’s degrees — with one year left for those students to complete the college requirements.
For the most recent 2022 graduates, who have two years left to complete the college requirements, 19 of 59 have associate’s degrees and another six are on track to earn their degrees by the end of the summer.
Louise March, Skyline High’s P-TECH counselor, keeps in touch with the graduates, saying 27 are working part time or full time at IBM. About a third are continuing their education at a four-year college. Of the 19 who graduated in 2022 with an associate’s degree, 17 are enrolling at a four-year college, she said.
Two of those 2022 graduates are Anahi Sarmiento, who is headed to the University of Colorado Boulder’s Leeds School of Business, and Jose Ivarra, who will study computer science at Colorado State University.
“I’m the oldest out of three siblings,” Ivarra said. “When you hear that someone wants to deliver you free college in high school, you take it. I jumped at the opportunity.”
Sarmiento added that her parents, who are immigrants, are already working two jobs and don’t have extra money for college costs.
“P-TECH is pushing me forward,” she said. “I know my parents want me to have a better life, but I want them to have a better life, too. Going into high school, I kept that mentality that I would push myself to my full potential. It kept me motivated.”
While the program requires hard work, the two graduates said, they still enjoyed high school and had outside interests. Ivarra was a varsity football player who was named player of the year. Sarmiento took advantage of multiple opportunities, from helping elementary students learn robotics to working at the district’s Innovation Center.
Ivarra said he likes that P-TECH has the same high expectations for all students, no matter their backgrounds, and gives them support in any areas where they need help. Spanish is his first language and, while math came naturally, language arts was more challenging.
“It was tough for me to see all these classmates use all these big words, and I didn’t know them,” he said. “I just felt less. When I went into P-TECH, the teachers focus on you so much, checking on every single student.”
They said it’s OK to struggle or even fail. Ivarra said he failed a tough class during the pandemic, but was able to retake it and passed. Both credited March, their counselor, with providing unending support as they navigated high school and college classes.
“She’s always there for you,” Sarmiento said. “It’s hard to be on top of everything. You have someone to go to.”
Students also supported each other.
“You build bonds,” Ivarra said. “You’re all trying to figure out these classes. You grow together. It’s a bunch of people who want to succeed. The people that surround you in P-TECH, they push you to be better.”
P-TECH has no entrance requirements or prerequisite classes. You don’t need to be a top student, have taken advanced math or have a background in technology.
With students starting the rigorous program with a wide range of skills, teachers and counselors said they quickly realized the program needed stronger support systems.
March said freshmen in the first P-TECH class struggled that first semester, prompting the creation of a guided study class. The hour-and-a-half class, held every other day, includes both study time and instruction in workplace skills such as writing a resume and interviewing. Teachers also offer tutoring twice a week after school.
“The guided study has become crucial to the success of the program,” March said.
Another way P-TECH provides extra support is through summer orientation programs for incoming freshmen.
At Skyline, ninth graders take a three-week bridge class — worth half a credit — that includes learning good study habits. They also meet IBM mentors and take a field trip to Front Range Community College.
“They get their college ID before they get their high school ID,” March said.
During a session in June, 15 IBM mentors helped the students program a Sphero robot to travel along different track configurations. Kathleen Schuster, who has volunteered as an IBM mentor since the P-TECH program started here, said she wants to “return some of the favors I got when I was younger.”
“Even this play stuff with the Spheros, it’s teaching them teamwork and a little computing,” she said. “Hopefully, through P-TECH, they will learn what it takes to work in a tech job.”
Incoming Skyline freshman Blake Baker said he found a passion for programming at Trail Ridge Middle and saw P-TECH as a way to capitalize on that passion.
“I really love that they deliver you options and a path,” he said.
Trail Ridge classmate Itzel Pereyra, another programming enthusiast, heard about P-TECH from her older brother.
“It’s really good for my future,” she said. “It’s an exciting moment, starting the program. It will just help you with everything.”
While some of the incoming ninth graders shared dreams of technology careers, others see P-TECH as a good foundation to pursue other dreams.
Skyline incoming ninth grader Marisol Sanchez wants to become a traveling nurse, demonstrating technology and new skills to other nurses. She added that the summer orientation sessions are a good introduction, helping calm the nerves that accompany combining high school and college.
“There’s a lot of team building,” she said. “It’s getting us all stronger together as a group and introducing everyone.”
Silver Creek’s June camp for incoming ninth graders included field trips to visit Cisco, Seagate, PEAK Resources, Comcast and Front Range Community College.
During the Front Range Community College field trip, the students heard from Front Range staff members before going on a scavenger hunt. Groups took photos to prove they completed tasks, snapping pictures of ceramic pieces near the art rooms, the most expensive tech product for sale in the bookstore and administrative offices across the street from the main building.
Emma Horton, an incoming freshman, took a cybersecurity class as a Flagstaff Academy eighth grader that hooked her on the idea of technology as a career.
“I’m really excited about the experience I will be getting in P-TECH,” she said. “I’ve never been super motivated in school, but with something I’m really interested in, it becomes easier.”
Deb Craven, dean of instruction at Front Range’s Boulder County campus, promised the Silver Creek students that the college would support them. She also gave them some advice.
“You need to advocate and ask for help,” she said. “These two things are going to help you the most. Be present, be engaged, work together and lean on each other.”
Craven, who oversees Front Range’s P-TECH program partnership, said Front Range leaders toured the original P-TECH program in New York along with St. Vrain and IBM leaders in preparation for bringing P-TECH here.
“Having IBM as a partner as we started the program was really helpful,” she said.
When the program began, she said, freshmen took a more advanced technology class as their first college class. Now, she said, they start with a more fundamental class in the spring of their freshman year, learning how to build a computer.
“These guys have a chance to grow into the high school environment before we stick them in a college class,” she said.
Summer opportunities aren’t just for P-TECH’s freshmen. Along with summer internships, the schools and community colleges offer summer classes.
Silver Creek incoming 10th graders, for example, could take a personal financial literacy class at Silver Creek in the mornings and an introduction to cybersecurity class at the Innovation Center in the afternoons in June.
Over at Skyline, incoming 10th graders in P-TECH are getting paid to teach STEM lessons to elementary students while earning high school credit. Students in the fifth or sixth year of the program also had the option of taking computer science and algebra classes at Front Range.
And at Frederick, incoming juniors are taking an introduction to manufacturing class at the district's Career Elevation and Technology Center this month in preparation for an advanced manufacturing class they’re taking in the fall.
“This will deliver them a head start for the fall,” said instructor Chester Clark.
Incoming Frederick junior Destini Johnson said she’s not sure what she wants to do after high school, but believes the opportunities offered by P-TECH will prepare her for the future.
“I wanted to try something challenging, and getting a head start on college can only help,” she said. “It’s really incredible that I’m already halfway done with an associate’s degree and high school.”
IBM P-TECH program manager Tracy Knick, who has worked with the Skyline High program for three years, said it takes a strong commitment from all the partners — the school district, IBM and Front Range — to make the program work.
“It’s not an easy model,” she said. “When you say there are no entrance requirements, we all have to be OK with that and support the students to be successful.”
IBM hosted 60 St. Vrain interns this summer, while two Skyline students work as IBM “co-ops” — a national program — to assist with the P-TECH program.
The company hosts two to four formal events for the students each year to work on professional and technical skills, while IBM mentors provide tutoring in algebra. During the pandemic, IBM also paid for subscriptions to tutor.com so students could get immediate help while taking online classes.
“We want to get them truly workforce ready,” Knick said. “They’re not IBM-only skills we’re teaching. Even though they choose a pathway, they can really do anything.”
As the program continues to expand in the district, she said, her wish is for more businesses to recognize the value of P-TECH.
“These students have had intensive training on professional skills,” she said. “They have taken college classes enhanced with the same digital credentials that an IBM employee can learn. There should be a waiting list of employers for these really talented and skilled young professionals.”
©2022 the Daily Camera (Boulder, Colo.). Distributed by Tribune Content Agency, LLC.
For decades, scientists have been giddy and citizens have been fearful of the power of computers. In 1965 Herbert Simon, a Nobel laureate in economics and also a winner of the Turing Award (considered “The Nobel Prize of computing”), predicted that “machines will be capable, within 20 years, of doing any work a man can do.” His misplaced faith in computers is hardly unique. Fifty-seven years later, we are still waiting for computers to become our slaves and masters.
Businesses have spent hundreds of billions of dollars on AI moonshots that have crashed and burned. IBM’s “Dr. Watson” was supposed to revolutionize health care and “eradicate cancer.” Eight years later, after burning through $15 billion with no demonstrable successes, IBM fired Dr. Watson.
In 2016 Turing Award Winner Geoffrey Hinton advised that “We should stop training radiologists now. It’s just completely obvious that within five years, deep learning is going to do better than radiologists.” Six years later, the number of radiologists has gone up, not down. Researchers have spent billions of dollars working on thousands of radiology image-recognition algorithms that are not as good as human radiologists.
What about those self-driving vehicles, promised by many including Elon Musk in his 2016 boast that “I really consider autonomous driving a solved problem. I think we are probably less than two years away.” Six years later, the most advanced self-driving vehicles are arguably Waymos in San Francisco, which only operate between 10 p.m. and 6 a.m. on the least crowded roads and still have accidents and cause traffic tie-ups. They are a long way from successfully operating in downtown traffic during the middle of the day at a required 99.9999% level of proficiency.
The list goes on. Zillow’s house-flipping misadventure lost billions of dollars trying to revolutionize home-buying before the company shuttered it. Carvana’s car-flipping gambit is still losing billions.
We have argued for years that we should be developing AI that makes people more productive instead of trying to replace people. Computers have wondrous memories, make calculations that are lightning-fast and error-free, and are tireless, but humans have the real-world experience, common sense, wisdom and critical thinking skills that computers lack. Together, they can do more than either could do on their own.
Effective augmentation appears to be finally happening with medical images. A large-scale study just published in Lancet Digital Health is the first to directly compare AI cancer screening when used alone or to assist humans. The software comes from a German startup, Vara, whose AI is already used in more than 25% of Germany’s breast cancer screening centers.
Researchers from Vara, Essen University and the Memorial Sloan Kettering Cancer Center trained the algorithm on more than 367,000 mammograms, and then tested it on 82,851 mammograms that had been held back for that purpose.
In the first strategy, the algorithm was used alone to analyze the 82,851 mammograms. In the second strategy, the algorithm separated the mammograms into three groups: clearly cancer, clearly no cancer, and uncertain. The uncertain mammograms were then sent to board-certified radiologists who were given no information about the AI diagnosis.
Doctors and AI working together turned out to be better than either working alone. The AI pre-screening reduced the number of images the doctors examined by 37% while lowering the false-positive and false-negative rates by about a third compared to AI alone and by 14%-20% compared to doctors alone. Less work and better results!
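The triage logic behind the second strategy can be sketched in a few lines. This is a hypothetical illustration of the routing rule described above, not Vara's actual software: the thresholds, function name, and score scale are all assumptions.

```python
def triage(ai_score, low=0.05, high=0.95):
    """Route a mammogram based on the AI's estimated cancer probability.

    Confident calls at either extreme are decided by the AI alone;
    only the uncertain middle band is sent to a radiologist.
    """
    if ai_score >= high:
        return "flag_as_suspicious"   # confident positive: reported directly
    if ai_score <= low:
        return "clear"                # confident negative: no human review
    return "send_to_radiologist"      # uncertain: defer to a human reader

# Five illustrative AI scores for incoming mammograms
scores = [0.01, 0.50, 0.99, 0.70, 0.03]
decisions = [triage(s) for s in scores]
```

With thresholds like these, only the uncertain cases reach radiologists, which is how the strategy reduced the human workload while improving error rates.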
As machine learning improves, the AI analysis of X-rays will no doubt become more efficient and accurate. There will come a time when AI can be trusted to work alone. However, that time is likely to be decades in the future and attempts to jump directly to that point are dangerous.
We are optimistic that the productivity of many workers can be improved by similar augmentation strategies — not to mention the fact that many of the tasks that computers excel at are dreadful drudgery; e.g., legal research, inventory control and statistical calculations. But far too many attempts to replace humans entirely have not only been an enormous waste of resources but have also undermined the credibility of AI research. The last thing we need is another AI winter where funding dries up, resources are diverted and the tremendous potential of these technologies is put on hold. We are optimistic that the accumulating failures of moonshots and successes of augmentation strategies will change the way that we think about AI.
Funk is an independent technology consultant who previously taught at the National University of Singapore, Hitotsubashi and Kobe Universities in Japan, and Penn State, where he taught courses on the economics of new technologies. Smith is the author of “The AI Delusion” and co-author (with Jay Cordes) of “The 9 Pitfalls of Data Science” and “The Phantom Pattern Problem.”
Gurugram (Haryana) [India], July 30 (ANI/NewsVoir): Edology has announced a partnership with IBM, one of the world's leading technology corporations, to introduce its Post Graduate Certificate Program in Data Science for working professionals and anyone wanting to enter the field of Data Science. Developed by IBM inventors and experts who hold numerous patents in the field, this is the first such programme designed entirely by IBM and delivered by its faculty.
"The programme for the Edology x IBM Data Science course is a very special offering from IBM, and this is a one-of-a-kind initiative," according to Hari Ramasubramanian, Leader, Business Development and Academia Relationships, IBM Expert Labs, India/South Asia. He added, "There is strong demand for skilled, trained technology professionals across the industry. Data science is not confined to IT. It includes all the verticals one can imagine, from board meetings to sports; data science brings a lot of value to organizations worldwide. For students, as well as professionals with experience, if you want to fast-track your career to the next level, this is the course you should be doing."
"The IBM Data Science certificate program through the Edology platform will equip learners to adapt to the dynamics of the industry and drive technology innovation," said Vithal Madyalkar, Program Director, IBM Innovation Centre for Education, India/South Asia. "The Data Science course modules will provide deep practical knowledge, coupled with broad-based industry alignment, interaction and talent discoverability, as well as excellence in professional practice."
A global Ed-Tech company, Edology helps students and professionals all around the world advance their careers in a variety of subjects, including data science, artificial intelligence, machine learning, cyber security, and more.
Unique Offerings of the IBM x Edology PG Certificate Programme in Data Science:
- 100+ hours of Live classes by IBM experts
- Globally recognized IBM digital badge
- Job opportunities with 300+ corporate partners
- Edology-IBM Award for Top Performers
- 1 on 1 mentorship from industry experts
- 1 day networking session with IBM team
- Guaranteed interview with IBM for top performers in each cohort
- Dedicated career assistance team
Sumanth Palepu, the Business Head at Edology, states, "Statistical estimates suggest that the worldwide market for Data Science and analytics will reach a whopping $450 billion by 2025, which also means that competition at the employee level will be very fierce. This collaboration with IBM is therefore more essential than ever, so that together we can deliver advanced-level teaching to students and working professionals, who will get first-hand industry knowledge from our IBM experts."
Edology is a Global Ed-Tech Brand that provides industry-powered education and skills to students and professionals across the world, to help them achieve fast-track career growth. Launched in 2017, Edology connects professionals from across the globe with higher education programmes in the fields of law, finance, accounting, business, computing, marketing, fashion, criminology, psychology, and more.
It's a part of Global University Systems (GUS), an international network of higher-education institutions brought together by a shared passion for making industry-driven global education accessible and affordable. All of Edology's programs are built with the objective of providing learners career enhancement and strong CV credentials, along with a quality learning experience.
The courses offered by Edology include Data Science, Certification in AI and Machine Learning, Data Analytics, PGP in International Business, PGP in Renewable Energy Management, PGP in Oil and Gas Management among others. These offerings are done through hands-on industry projects, interactive live classes, global peer-to-peer learning and other facilities.
This story is provided by NewsVoir. ANI will not be responsible in any way for the content of this article. (ANI/NewsVoir)
Could analog artificial intelligence (AI) hardware – rather than digital – tap fast, low-energy processing to solve machine learning’s rising costs and carbon footprint?
Researchers say yes: Logan Wright and Tatsuhiro Onodera, research scientists at NTT Research and Cornell University, envision a future where machine learning (ML) will be performed with novel physical hardware, such as those based on photonics or nanomechanics. These unconventional devices, they say, could be applied in both edge and server settings.
Deep neural networks, which are at the heart of today’s AI efforts, hinge on the heavy use of digital processors like GPUs. But for years, there have been concerns about the monetary and environmental cost of machine learning, which increasingly limits the scalability of deep learning models.
A 2019 paper out of the University of Massachusetts, Amherst, for example, performed a life cycle assessment for training several common large AI models. It found that the process can emit more than 626,000 pounds of carbon dioxide equivalent — nearly five times the lifetime emissions of the average American car, including the manufacturing of the car itself.
At a session with NTT Research at VentureBeat Transform’s Executive Summit on July 19, CEO Kazu Gomi said machine learning doesn’t have to rely on digital circuits, but instead can run on a physical neural network. This is a type of artificial neural network in which physical analog hardware is used to emulate neurons as opposed to software-based approaches.
“One of the obvious benefits of using analog systems rather than digital is AI’s energy consumption,” he said. “The consumption issue is real, so the question is what are new ways to make machine learning faster and more energy-efficient?”
In the early history of AI, people weren’t trying to think about how to make digital computers, Wright pointed out.
“They were trying to think about how we could emulate the brain, which of course is not digital,” he explained. “What I have in my head is an analog system, and it’s actually much more efficient at performing the types of calculations that go on in deep neural networks than today’s digital logic circuits.”
The brain is one example of analog hardware for doing AI, but others include systems that use optics.
“My favorite example is waves, because a lot of things like optics are based on waves,” he said. “In a bathtub, for instance, you could formulate the problem to encode a set of numbers. At the front of the bathtub, you can set up a wave and the height of the wave gives you this vector X. You let the system evolve for some time and the wave propagates to the other end of the bathtub. After some time you can then measure the height of that, and that gives you another set of numbers.”
Essentially, nature itself can perform computations. “And you don’t need to plug it into anything,” he said.
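Wright's bathtub analogy can be made concrete with a toy simulation. The sketch below is purely illustrative (the smoothing kernel, vector sizes, and step count are made up): a wave-like physical evolution is a fixed linear transform of the input heights, the same kind of operation deep networks are built from, performed here "by nature" rather than by digital arithmetic.

```python
import numpy as np

def evolve(heights, steps=3):
    """Mimic wave/diffusion dynamics by repeatedly averaging neighbors.

    Each update is linear, so the whole evolution is equivalent to one
    matrix multiply y = M @ x applied to the initial wave heights.
    """
    h = np.asarray(heights, dtype=float)
    for _ in range(steps):
        # circular neighbor-averaging kernel (0.25, 0.5, 0.25)
        h = 0.25 * np.roll(h, 1) + 0.5 * h + 0.25 * np.roll(h, -1)
    return h

x = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # "set up a wave": the vector X
y = evolve(x)                            # "measure the height" afterward
```

Because the update is linear, doubling the input wave doubles the output, exactly the property that lets a physical system stand in for the matrix multiplies in a neural network layer.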
Researchers across the industry are using a variety of approaches to developing analog hardware. IBM Research, for example, has invested in analog electronics, in particular memristor technology, to perform machine learning calculations.
“It’s quite promising,” said Onodera. “These memristor circuits have the property of having information be naturally computed by nature as the electrons ‘flow’ through the circuit, allowing them to have potentially much lower energy consumption than digital electronics.”
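The physics Onodera describes can be sketched with a simple model. This is an illustrative simulation, not IBM's hardware or API: by Ohm's and Kirchhoff's laws, applying voltages to the rows of a crossbar of conductances yields column currents equal to a matrix-vector product, computed in one physical step as the electrons flow. All values below are invented for the example.

```python
import numpy as np

# Conductances (in siemens) programmed at each crosspoint of a
# hypothetical 3x2 memristor crossbar; these play the role of weights.
G = np.array([[1.0, 0.5],
              [0.2, 0.8],
              [0.3, 0.1]])

# Input voltages applied to the three row wires; these play the role
# of the input activations.
V = np.array([0.1, 0.2, 0.3])

# Each column wire sums the currents from its crosspoints (Kirchhoff),
# and each crosspoint current is conductance times voltage (Ohm), so
# the column currents are a matrix-vector multiply.
I = G.T @ V
```

A digital chip would compute this with many multiply-accumulate steps; the analog crossbar produces all column currents at once, which is where the potential energy savings come from.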
NTT Research, however, is focused on a more general framework that isn’t limited to memristor technology. “Our work is focused on also enabling other physical systems, for instance those based on light and mechanics (sound), to perform machine learning,” he said. “By doing so, we can make smart sensors in the native physical domain where the information is generated, such as in the case of a smart microphone or a smart camera.”
Startups including Mythic also focus on analog AI using electronics – which Wright says is a “great step, and it is probably the lowest risk way to get into analog neural networks.” But it’s also incremental and has a limited ceiling, he added: “There is only so much improvement in performance that is possible if the hardware is still based on electronics.”
Several startups, such as Lightmatter, Lightelligence and Luminous Computing, use light, rather than electronics, to do the computing – known as photonics. This is riskier, less-mature technology, said Wright.
“But the long-term potential is much more exciting,” he said. “Light-based neural networks could be much more energy-efficient.”
However, light and electrons aren’t the only thing you can make a computer out of, especially for AI, he added. “You could make it out of biological materials, electrochemistry (like our own brains), or out of fluids, acoustic waves (sound), or mechanical objects, modernizing the earliest mechanical computers.”
MIT researchers, for example, announced last week that they had created new protonic programmable resistors, a network of analog artificial neurons and synapses that can do calculations similarly to a digital neural network by repeating arrays of programmable resistors in intricate layers. They used a “practical inorganic material in the fabrication process,” they said, that enables their devices “to run 1 million times faster than previous versions, which is also about 1 million times faster than the synapses in the human brain.”
NTT Research says it’s taking a step further back from all these approaches and asking much bigger, much longer-term questions: What can we make a computer out of? And if we want to achieve the highest speed and energy efficiency AI systems, what should we physically make them out of?
“Our paper provides the first answer to these questions by telling us how we can make a neural network computer using any physical substrate,” said Wright. “And so far, our calculations suggest that making these weird computers will one day soon actually make a lot of sense, since they can be much more efficient than digital electronics, and even analog electronics. Light-based neural network computers seem like the best approach so far, but even that question isn’t completely answered.”
According to Sara Hooker, a former Google Brain researcher who currently runs the nonprofit research lab Cohere for AI, the AI industry is “in this really interesting hardware stage.”
Ten years ago, she explains, AI’s massive breakthrough was really a hardware breakthrough. “Deep neural networks did not work until GPUs, which were used for video games [and] were just repurposed for deep neural networks,” she said.
The change, she added, was almost instantaneous. "Overnight, what took 13,000 CPUs took two GPUs," she said. "That was how dramatic it was."
It’s very likely that there are other ways of representing the world that could be equally powerful as digital ones, she said. “If even one of these data directions starts to show progress, it can unlock a lot of both efficiency as well as different ways of learning representations,” she explained. “That’s what makes it worthwhile for labs to back them.”
Hooker, whose 2020 essay “The Hardware Lottery” explored the reasons why various hardware tools have succeeded and failed, says the success of GPUs for deep neural networks was “actually a bizarre, lucky coincidence – it was winning the lottery.”
GPUs, she explained, were never designed for machine learning — they were developed for video games. So much of the adoption of GPUs for AI use “depended upon the right moment of alignment between progress on the hardware side and progress on the modeling side,” she said. “Making more hardware options available is the most important ingredient because it allows for more unexpected moments where you see those breakthroughs.”
Analog AI, however, isn’t the only option researchers are exploring to reduce the costs and carbon emissions of AI. Researchers are placing bets on other areas, such as field-programmable gate arrays (FPGAs) serving as application-specific accelerators in data centers, which can reduce energy consumption and increase operating speed. There are also efforts to improve software, she explained.
Analog, she said, “is one of the riskier bets.”
Still, risks have to be taken, Hooker said. When asked whether she thought the big tech companies are supporting analog and other alternative, nondigital approaches to AI's future, she said, “One hundred percent. There is a clear motivation,” adding that what is lacking is sustained government investment in a long-term hardware landscape.
“It’s always been tricky when investment rests solely on companies, because it’s so risky,” she said. “It often has to be part of a nationalist strategy for it to be a compelling long-term bet.”
Hooker said she wouldn’t place her own bet on widespread analog AI hardware adoption, but insists the research efforts are good for the ecosystem as a whole.
“It’s kind of like the initial NASA flight to the moon,” she said. “There’s so many scientific breakthroughs that happen just by having an objective.”
And there is an expiration date on the industry’s current approach, she cautioned: “There’s an understanding among people in the field that there has to be some bet on riskier projects.”
The NTT researchers made clear that the earliest, narrowest applications of their analog AI work will take at least 5-10 years to come to fruition – and even then will likely be used first for specific applications such as at the edge.
“I think the most near-term applications will happen on the edge, where there are less resources, where you might not have as much power,” said Onodera. “I think that’s really where there’s the most potential.”
One of the things the team is thinking about is which types of physical systems will be the most scalable and offer the biggest advantage in terms of energy efficiency and speed. But in terms of entering the deep learning infrastructure, it will likely happen incrementally, Wright said.
“I think it would just slowly come into the market, with a multilayered network with maybe the front end happening on the analog domain,” he said. “I think that’s a much more sustainable approach.”
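The incremental, hybrid deployment Wright describes — an analog front end feeding a conventional digital back end — can be sketched in a few lines. This is a toy illustration under assumed dimensions, noise level, and nonlinearity, not code from NTT's work:

```python
import numpy as np

rng = np.random.default_rng(1)

def analog_front_end(x, W, noise=0.02):
    # First layer "computed by physics": a linear transform plus device noise,
    # passed through a saturating nonlinearity (many physical systems saturate).
    y = x @ W
    y = y + rng.normal(0.0, noise, size=y.shape)
    return np.tanh(y)

def digital_back_end(h, W_out):
    # Remaining layers run conventionally in the digital domain.
    logits = h @ W_out
    return logits.argmax(axis=-1)

x = rng.normal(size=(5, 16))         # batch of 5 inputs, 16 features (arbitrary)
W = 0.3 * rng.normal(size=(16, 8))   # analog-layer weights (arbitrary)
W_out = rng.normal(size=(8, 3))      # digital classifier weights (arbitrary)

preds = digital_back_end(analog_front_end(x, W), W_out)
print(preds)                         # five class predictions in {0, 1, 2}
```

In an edge deployment of this kind, the power-hungry first-layer multiplications would happen in the sensor's native physical domain, and only the much smaller intermediate representation would need digitizing.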
IBM’s annual Cost of Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid,” expenses that are realized more than a year after the attack.
Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.
The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.
Though the average cost of a data breach is up, it is only by about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cyber crime seen during the pandemic years.
Organizations are also increasingly not opting to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.
Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”
Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.
Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”
In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is also lagging, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.
Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent number at $812,000 globally.
The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.
Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.
Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”
Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.
Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.
LAWRENCE, Kan.--(BUSINESS WIRE)--Jul 28, 2022--
Cobalt Iron Inc., a leading provider of SaaS-based enterprise data protection, today announced that the company has been named one of the 10 Most Promising IBM Solution Providers 2022 by CIOReview Magazine. The annual list of companies is selected by a panel of experts and members of CIOReview Magazine’s editorial board to recognize and promote innovation and entrepreneurship. A technology partner for IBM, Cobalt Iron earned the distinction based on its Compass® enterprise SaaS backup platform for monitoring, managing, provisioning, and securing the entire enterprise backup landscape.
This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20220728005043/en/
Cobalt Iron Compass® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection. (Graphic: Business Wire)
According to CIOReview, “Cobalt Iron has built a patented cyber-resilience technology in a SaaS model to alleviate the complexities of managing large, multivendor setups, providing an effectual humanless backup experience. This SaaS-based data protection platform, called Compass, leverages strong IBM technologies. For example, IBM Spectrum Protect is embedded into the platform from a data backup and recovery perspective. ... By combining IBM’s technologies and the intellectual property built by Cobalt Iron, the company delivers a secure, modernized approach to data protection, providing a ‘true’ software as a service.”
Through proprietary technology, the Compass data protection platform integrates with, automates, and optimizes best-of-breed technologies, including IBM Spectrum Protect, IBM FlashSystem, IBM Red Hat Linux, IBM Cloud, and IBM Cloud Object Storage. Compass enhances and extends IBM technologies by automating more than 80% of backup infrastructure operations, optimizing the backup landscape through analytics, and securing backup data, making it a valuable addition to IBM’s data protection offerings.
CIOReview also praised Compass for its simple and intuitive interface to display a consolidated view of data backups across an entire organization without logging in to every backup product instance to extract data. The machine learning-enabled platform also automates backup processes and infrastructure, and it uses open APIs to connect with ticket management systems to generate tickets automatically about any backups that need immediate attention.
To ensure the security of data backups, Cobalt Iron has developed an architecture and security feature set called Cyber Shield for 24/7 threat protection, detection, and analysis that improves ransomware responsiveness. Compass is also being enhanced to use several patented techniques that are specific to analytics and ransomware. For example, analytics-based cloud brokering of data protection operations helps enterprises make secure, efficient, and cost-effective use of their cloud infrastructures. Another patented technique — dynamic IT infrastructure optimization in response to cyberthreats — offers unique ransomware analytics and automated optimization that will enable Compass to reconfigure IT infrastructure automatically when it detects cyberthreats, such as a ransomware attack, and dynamically adjust access to backup infrastructure and data to reduce exposure.
Compass is part of IBM’s product portfolio through the IBM Passport Advantage program. Through Passport Advantage, IBM sellers, partners, and distributors around the world can sell Compass under IBM part numbers to any organizations, particularly complex enterprises, that greatly benefit from the automated data protection and anti-ransomware solutions Compass delivers.
CIOReview’s report concludes, “With such innovations, all eyes will be on Cobalt Iron for further advancements in humanless, secure data backup solutions. Cobalt Iron currently focuses on IP protection and continuous R&D to bring about additional cybersecurity-related innovations, promising a more secure future for an enterprise’s data.”
About Cobalt Iron
Cobalt Iron was founded in 2013 to bring about fundamental changes in the world’s approach to secure data protection, and today the company’s Compass® is the world’s leading SaaS-based enterprise data protection system. Through analytics and automation, Compass enables enterprises to transform and optimize legacy backup solutions into a simple cloud-based architecture with built-in cybersecurity. Processing more than 8 million jobs a month for customers in 44 countries, Compass delivers modern data protection for enterprise customers around the world. www.cobaltiron.com
Product or service names mentioned herein are the trademarks of their respective owners.
Link to Word Doc: www.wallstcom.com/CobaltIron/220728-Cobalt_Iron-CIOReview_Top_IBM_Provider_2022.docx
Photo Caption: Cobalt Iron Compass ® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection.
Follow Cobalt Iron
View source version on businesswire.com: https://www.businesswire.com/news/home/20220728005043/en/
CONTACT: Agency Contact:
Wall Street Communications
Tel: +1 801 326 9946
Web: www.wallstcom.com

Cobalt Iron Contact:
VP of Marketing
Tel: +1 785 979 9461
KEYWORD: EUROPE UNITED STATES NORTH AMERICA KANSAS
INDUSTRY KEYWORD: DATA MANAGEMENT SECURITY TECHNOLOGY SOFTWARE NETWORKS INTERNET
SOURCE: Cobalt Iron
Copyright Business Wire 2022.
PUB: 07/28/2022 09:00 AM/DISC: 07/28/2022 09:03 AM
Matt Hicks, Red Hat's new CEO, doesn't have the background of your typical chief executive.
He studied computer hardware engineering in college.
He began his career as an IT consultant at IBM.
And instead of jumping into management at Red Hat, Hicks started at the open-source software business in 2006 as a developer on the IT team.
SEE: Red Hat names new CEO
His on-the-ground experience, however, is one of his core assets as the company's new leader, Hicks says.
"The markets are changing really quickly," he tells ZDNet. "And just having that intuition -- of where hardware is going, having spent time in the field with what enterprise IT shops struggle with and what they do well, and then having a lot of years in Red Hat engineering -- I know that's intuition that I'll lean on... Around that, there's a really good team at Red Hat, and I get to lean on their expertise of how to best deliver, but that I love having that core intuition."
Hicks believes his core knowledge helps him to guide the company's strategic bets.
While his experience is an asset, Hicks says it's not a given that a good developer will make a good leader. You also need to know how to communicate your ideas persuasively.
SEE: Hands on: Putting Ubuntu Linux on my Microsoft Surface Go
"You can't just be the best coder in the room," he says.
"Especially in STEM and engineering, the softer skills of learning how to present, learning how to influence a group and show up really well in a leadership presentation or at a conference -- they really start to define people's careers."
Hicks says that focus on influence is an important part of his role now that he didn't relish earlier in his career.
"I think a lot of people don't love that," he says.
"And yet, you can be the best engineer on the planet and work hard, but if you can't be heard, if you can't influence, it's harder to deliver on those opportunities."
Hicks embraced the art of persuasion to advance his career. And as an open-source developer, he learned to embrace enterprise products to advance Red Hat's mission.
He joined Red Hat just a few years after Paul Cormier -- then Red Hat's VP of engineering, and later Hicks' predecessor as CEO -- moved the company from its early distribution, Red Hat Linux, to Red Hat Enterprise Linux (RHEL). It was a move that not everyone liked.
"There was a significant amount of angst," Hicks says, with developers questioning whether it was the right business model for open source. "People are passionate about creating sustaining models. So I think at the time [the question] was, is Red Hat departing from that, or will this continue to make open source better?"
Hicks had an understanding of both sides of that debate.
"I really started my whole journey with technology with Linux itself," he says. "I was on the consumer side -- you know, I bought the boxes of Linux from Best Buy. But my first professional job was actually in consulting with IBM at the time. And as much as I knew about Linux, there's a difference when you're a consultant at an enterprise and you're deploying Linux next to [IBM's Unix-based operating system] AIX.
"I had my consumer view -- I loved this open source thing," he says. "But I also had that practitioner view of, I'm going into large enterprises, and I'm only going to be here on a consulting gig and this has to have credibility. And RHEL really delivered to that well."
Hicks says he was drawn to Red Hat because of the inherent tension between community and commerce.
"There's the pull to either side, of how do you enable a community where the software is accessible to anyone on the planet, your partners and your competitors on it?" he says, versus the question of, "How do you harness that innovation, to have a really successful commercial model that customers value?"
In all the years he's been at Red Hat, Hicks doesn't think much has changed around the challenge of balancing those two forces.
Of course, plenty of other stuff has changed, both in the profession of software development and at Red Hat. Hicks wants to ensure that the company is always ready to evolve -- that's why in a message to Red Hat's workforce, he wrote: "When we hire, look for culture add, not culture fit."
He believes the idea of searching for a culture fit among prospective employees has a very static feel to it.
"It's not you're not adding anything, you're not looking at potential," he says. "If you're always staying with what you know, the culture you have today, fitting your current constraints, I think you're going to lose out on a lot of that potential, both the potential for today and then as that talent evolves and changes tomorrow."
Red Hat was acquired by IBM in 2019 for $34 billion, but the company continues to operate as a standalone division. Meanwhile, RHEL is still the industry's leading enterprise Linux platform. As Steven Vaughan-Nichols noted for ZDNet, it's used by more than 90% of Fortune 500 organizations and was projected to touch $13 trillion in global business revenues in 2022.
"Pretty much any industry you look at is starting to define their innovation with software at this point, and we're in the software business," Hick says, stressing the opportunity in front of Red Hat.
SEE: Red Hat's next steps, according to its new CEO and chairman
The company is focused on supporting the "open hybrid cloud," enabling IT teams to work across public clouds, data centers and the edge.
"We're at the intersection of the potential of open source, the potential of open hybrid cloud and software innovation, and that's what gets me excited every day," Hicks says.
As he settles into his new role as CEO, the main challenge ahead of Hicks will be picking the right industries and partners to pursue at the edge. Red Hat is already working at the edge, in a range of different industries. It's working with General Motors on Ultifi, GM's end-to-end software platform, and it's partnering with ABB, one of the world's leading manufacturing automation companies. It's also working with Verizon on hybrid mobile edge computing.
Even so, the opportunity is vast. Red Hat expects to see around $250 billion in spending at the edge by 2025.
"There'll be a tremendous growth of applications that are written to be able to deliver to that," Hicks says. "And so our goals in the short term are to pick the industries and build impactful partnerships in those industries -- because it's newer, and it's evolving."
Press release content from Business Wire. The AP news staff was not involved in its creation.
BOSTON--(BUSINESS WIRE)--Aug 3, 2022--
Astadia is pleased to announce the release of a new series of Mainframe-to-Cloud reference architecture guides. The documents cover how to refactor IBM mainframe applications to Microsoft Azure, Amazon Web Services (AWS), Google Cloud, and Oracle Cloud Infrastructure (OCI). The documents offer a deep dive into the migration process to all major target cloud platforms using Astadia’s FastTrack software platform and methodology.
As enterprises and government agencies are under pressure to modernize their IT environments and make them more agile, scalable and cost-efficient, refactoring mainframe applications in the cloud is recognized as one of the most efficient and fastest modernization solutions. By making the guides available, Astadia equips business and IT professionals with a step-by-step approach on how to refactor mission-critical business systems and benefit from highly automated code transformation, data conversion and testing to reduce costs, risks and timeframes in mainframe migration projects.
“Understanding all aspects of legacy application modernization and having access to the most performant solutions is crucial to accelerating digital transformation,” said Scott G. Silk, Chairman and CEO. “More and more organizations are choosing to refactor mainframe applications to the cloud. These guides are meant to assist their teams in transitioning fast and safely by benefiting from Astadia’s expertise, software tools, partnerships, and technology coverage in mainframe-to-cloud migrations,” said Mr. Silk.
The new guides are part of Astadia’s free Mainframe-to-Cloud Modernization series, an ample collection of guides covering various mainframe migration options, technologies, and cloud platforms. The series covers IBM (NYSE:IBM) Mainframes.
In addition to the reference architecture diagrams, these comprehensive guides include various techniques and methodologies that may be used in forming a complete and effective Legacy Modernization plan. The documents analyze the important role of the mainframe platform, and how to preserve previous investments in information systems when transitioning to the cloud.
In each of the IBM Mainframe Reference Architecture white papers, readers will explore:
The guides are available for download here:
To access more mainframe modernization resources, visit the Astadia learning center on www.astadia.com.
Astadia is the market-leading software-enabled mainframe migration company, specializing in moving IBM and Unisys mainframe applications and databases to distributed and cloud platforms in unprecedented timeframes. With more than 30 years of experience, and over 300 mainframe migrations completed, enterprises and government organizations choose Astadia for its deep expertise, range of technologies, and the ability to automate complex migrations, as well as testing at scale. Learn more on www.astadia.com.
View source version on businesswire.com: https://www.businesswire.com/news/home/20220803005031/en/
CONTACT: Wilson Rains, Chief Revenue Officer
KEYWORD: UNITED STATES NORTH AMERICA MASSACHUSETTS
INDUSTRY KEYWORD: DATA MANAGEMENT TECHNOLOGY OTHER TECHNOLOGY SOFTWARE NETWORKS INTERNET
Copyright Business Wire 2022.
PUB: 08/03/2022 10:00 AM/DISC: 08/03/2022 10:02 AM