IBM Research Rolls Out A Comprehensive AI And Platform-Based Edge Research Strategy Anchored By Enterprise Use Cases And Partnerships
I recently met with Dr. Nick Fuller, Vice President, Distributed Cloud, at IBM Research for a discussion about IBM’s long-range plans and strategy for artificial intelligence and machine learning at the edge.
Dr. Fuller is responsible for providing AI and platform–based innovation for enterprise digital transformation spanning edge computing and distributed cloud management. He is an IBM Master Inventor with over 75 patents and co-author of 75 technical publications. Dr. Fuller obtained his Bachelor of Science in Physics and Math from Morehouse College and his PhD in Applied Physics from Columbia University.
Edge In, not Cloud Out
In general, Dr. Fuller told me that IBM is focused on developing an "edge in" position versus a "cloud out" position with data, AI, and Kubernetes-based platform technologies to scale hub and spoke deployments of edge applications.
A hub plays the role of a central control plane used for orchestrating the deployment and management of edge applications in a number of connected spoke locations such as a factory floor or a retail branch, where data is generated or locally aggregated for processing.
“Cloud out” refers to the paradigm where cloud service providers are extending their cloud architecture out to edge locations. In contrast, “edge in” refers to a provider-agnostic architecture that is cloud-independent and treats the data-plane as a first-class citizen.
IBM's overall architectural principle is scalability, repeatability, and full stack solution management that allows everything to be managed using a single unified control plane.
IBM’s Red Hat platform and infrastructure strategy anchors the application stack with a unified, scalable, and managed OpenShift-based control plane equipped with a high-performance storage appliance and self-healing system capabilities (inclusive of semi-autonomous operations).
IBM’s strategy also includes several in-progress platform-level technologies for scalable data, AI/ML runtimes, accelerator libraries for Day-2 AI operations, and scalability for the enterprise.
It is important to mention that IBM is designing its edge platforms with labor cost and technical workforce in mind. Data scientists with PhDs are in high demand, making them difficult to find and expensive to hire. IBM is designing its edge system capabilities and processes so that domain experts, rather than PhDs, can deploy new AI models and manage Day-2 operations.
Why edge is important
Advances in computing and storage have made it possible for AI to process mountains of accumulated data to provide solutions. By bringing AI closer to the source of data, edge computing is faster and more efficient than the cloud. While cloud data accounts for 60% of the world’s data today, vast amounts of new data are being created at the edge, including data from industrial applications, traffic cameras, and order management systems, all of which can be processed at the edge in a fast and timely manner.
Public cloud and edge computing differ in capacity, technology, and management. An advantage of edge is that data is processed and analyzed at / near its collection point at the edge. In the case of cloud, data must be transferred from a local device and into the cloud for analytics and then transferred back to the edge again. Moving data through the network consumes capacity and adds latency to the process. It’s easy to see why executing a transaction at the edge reduces latency and eliminates unnecessary load on the network.
Increased privacy is another benefit of processing data at the edge. Analyzing data where it originates limits the risk of a security breach. Most of the communications between the edge and the cloud is then confined to such things as reporting, data summaries, and AI models, without ever exposing the raw data.
IBM at the Edge
In our discussion, Dr. Fuller provided a few examples to illustrate how IBM plans to provide new and seamless edge solutions for existing enterprise problems.
Example #1 – McDonald’s drive-thru
Dr. Fuller’s first example centered around Quick Service Restaurant’s (QSR) problem of drive-thru order taking. Last year, IBM acquired an automated order-taking system from McDonald's. As part of the acquisition, IBM and McDonald's established a partnership to perfect voice ordering methods using AI. Drive-thru orders are a significant percentage of total QSR orders for McDonald's and other QSR chains.
McDonald's and other QSR restaurants would like every order to be processed as quickly and accurately as possible. For that reason, McDonald's conducted trials at ten Chicago restaurants using an edge-based AI ordering system with NLP (Natural Language Processing) to convert spoken orders into a digital format. It was found that AI had the potential to reduce ordering errors and processing time significantly. Since McDonald's sells almost 7 million hamburgers daily, shaving a minute or two off each order represents a significant opportunity to address labor shortages and increase customer satisfaction.
Example #2 – Boston Dynamics and Spot the agile mobile robot
According to an earlier IBM survey, many manufacturers have already implemented AI-driven robotics with autonomous decision-making capability. The study also indicated that over 80 percent of companies believe AI can help improve future business operations. However, some companies expressed concern about the limited mobility of edge devices and sensors.
To develop a mobile edge solution, IBM teamed up with Boston Dynamics. The partnership created an agile mobile robot using IBM Research and IBM Sustainability Software AI technology. The device can analyze visual sensor readings in hazardous and challenging industrial environments such as manufacturing plants, warehouses, electrical grids, and waste treatment plants. The value proposition that Boston Dynamics brought to the partnership was Spot the agile mobile robot, a walking, sensing, and actuation platform. Like all edge applications, the robot’s wireless mobility uses self-contained AI/ML that doesn’t require access to cloud data. It uses cameras to read analog devices, visually monitor fire extinguishers, and conduct visual inspections of human workers to determine whether required safety equipment is being worn.
IBM was able to show up to a 10X speedup by automating some manual tasks, such as converting the detection of a problem into an immediate work order in IBM Maximo to correct it. A fast automated response was not only more efficient, but it also improved the safety posture and risk management for these facilities. Similarly, some factories need to thermally monitor equipment to identify any unexpected hot spots that may show up over time, indicative of a potential failure.
IBM is working with National Grid, an energy company, to develop a mobile solution using Spot, the agile mobile robot, for image analysis of transformers and thermal connectors. As shown in the above graphic, Spot also monitored connectors on both flat surfaces and 3D surfaces. IBM was able to show that Spot could detect excessive heat build-up in small connectors, potentially avoiding unsafe conditions or costly outages. This AI/ML edge application can produce faster response times when an issue is detected, which is why IBM believes significant gains are possible by automating the entire process.
IBM market opportunities
Drive-thru orders and mobile robots are just a few examples of the millions of potential AI applications that exist at the edge and are driven by several billion connected devices.
Edge computing is an essential part of enterprise digital transformation. Enterprises seek ways to demonstrate the feasibility of solving business problems using AI/ML and analytics at the edge. However, once a proof of concept has been successfully demonstrated, it is a common problem for a company to struggle with scalability, data governance, and full-stack solution management.
Challenges with scaling
“Determining entry points for AI at the edge is not the difficult part,” Dr. Fuller said. “Scale is the real issue.”
Scaling edge models is complicated because there are so many edge locations with large amounts of diverse content and a high device density. Because large amounts of data are required for training, data gravity is a potential problem. Further, in many scenarios, vast amounts of data are generated quickly, leading to potential data storage and orchestration challenges. AI models are also rarely "finished." Monitoring and retraining of models are necessary to keep up with changes in the environment.
Through IBM Research, IBM is addressing the many challenges of building an all-encompassing edge architecture and horizontally scalable data and AI technologies. IBM has a wealth of edge capabilities and an architecture to create the appropriate platform for each application.
IBM AI entry points at the edge
IBM sees Edge Computing as a $200 billion market by 2025. Dr. Fuller and his organization have identified four key market entry points for developing and expanding IBM’s edge compute strategy. In order of size, IBM believes its priority edge markets to be intelligent factories (Industry 4.0), telcos, retail automation, and connected vehicles.
IBM and its Red Hat portfolio already have an established presence in each market segment, particularly in intelligent operations and telco. Red Hat is also active in the connected vehicles space.
There have been three prior industrial revolutions, beginning in the 1700s and leading up to our current, in-progress fourth revolution, Industry 4.0, which promotes digital transformation.
Manufacturing is the fastest growing and the largest of IBM’s four entry markets. In this segment, AI at the edge can improve quality control, production optimization, asset management, and supply chain logistics. IBM believes there are opportunities to achieve a 4x speedup in implementing edge-based AI solutions for manufacturing operations.
For its Industry 4.0 use case development, IBM, through product, development, research and consulting teams, is working with a major automotive OEM. The partnership has established the following joint objectives:
Increase automation and scalability across dozens of plants using hundreds of AI/ML models. This client has already seen value in applying AI/ML models for manufacturing applications. IBM Research is helping with retraining models and implementing new ones in an edge environment to help scale even more efficiently. Edge offers faster inference and low latency, allowing AI to be deployed in a wider variety of manufacturing operations requiring instant solutions.
Dramatically reduce the time required to onboard new models. This will allow training and inference to be done faster and allow large models to be deployed much more quickly. The quicker an AI model can be deployed in production, the quicker the time-to-value and the return on investment (ROI).
Accelerate deployment of new inspections by reducing the labeling effort and iterations needed to produce a production-ready model via data summarization. Selecting small data sets for annotation otherwise means manually examining thousands of images, a time-consuming process that results in the labeling of redundant data. Using ML-based automation for data summarization will accelerate the process and produce better model performance.
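The article does not describe IBM's actual summarization method, but the core idea, picking a small and diverse subset of examples for annotation instead of labeling near-duplicates, can be sketched with greedy farthest-point selection. The function name and the scalar "embeddings" below are illustrative assumptions, not IBM's implementation:

```python
def select_diverse_subset(points, budget, dist):
    """Greedy farthest-point selection: repeatedly add the example that is
    farthest from everything already chosen, yielding a small, diverse set
    to send for human labeling instead of thousands of near-duplicates."""
    chosen = [0]  # seed with the first example; any seed works for a sketch
    while len(chosen) < min(budget, len(points)):
        best_idx, best_d = None, -1.0
        for i in range(len(points)):
            if i in chosen:
                continue
            # distance to the nearest already-chosen example
            d = min(dist(points[i], points[j]) for j in chosen)
            if d > best_d:
                best_idx, best_d = i, d
        chosen.append(best_idx)
    return chosen

# Toy example: scalar "embeddings"; real use would be image feature vectors.
picked = select_diverse_subset([0.0, 0.1, 5.0, 10.0], budget=3,
                               dist=lambda a, b: abs(a - b))
```

Note how the near-duplicate (0.1) is skipped in favor of the two far-apart examples, which is exactly the redundancy reduction the passage describes.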
Enable Day-2 AI operations to help with data lifecycle automation and governance and model creation, reduce production errors, and provide detection of out-of-distribution data to help determine whether a model’s inference is accurate. IBM believes this will allow models to be created faster without data scientists.
Maximo Application Suite
IBM’s Maximo Application Suite plays an important part in implementing large manufacturers' current and future IBM edge solutions. Maximo is an integrated public or private cloud platform that uses AI, IoT, and analytics to optimize performance, extend asset lifecycles, and reduce operational downtime and costs. IBM is working with several large manufacturing clients currently using Maximo to develop edge use cases, and even uses it within its own manufacturing operations.
IBM has research underway to develop a more efficient method of handling life cycle management of large models that require immense amounts of data. Day 2 AI operations tasks can sometimes be more complex than initial model training, deployment, and scaling. Retraining at the edge is difficult because resources are typically limited.
Once a model is trained and deployed, it is important to monitor it for drift caused by changes in data distributions or anything that might cause a model to deviate from original requirements. Inaccuracies can adversely affect model ROI.
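One common, simple way to monitor a deployed model for the drift described above is to compare the live input distribution at the edge against the training baseline using a Population Stability Index (PSI); a PSI above roughly 0.2 is a widely used rule-of-thumb retraining trigger. This is a generic sketch of the technique, not IBM's monitoring code:

```python
import math

def population_stability_index(expected, actual, bins=10):
    """Compare a live feature distribution against the training baseline.
    PSI near 0 means no drift; above ~0.2 is a common retrain trigger."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def hist(values):
        counts = [0] * bins
        for v in values:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        # epsilon keeps empty bins from blowing up the log below
        return [max(c / len(values), 1e-6) for c in counts]

    e, a = hist(expected), hist(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

A hub could run this check per feature on data summaries reported by each spoke, flagging only the models whose inputs have shifted.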
Day-2 AI Operations (retraining and scaling)
Day-2 AI operations consist of continual updates to AI models and applications to keep up with changes in data distributions, changes in the environment, a drop in model performance, availability of new data, and/or new regulations.
IBM recognizes the advantages of performing Day-2 AI Operations, which includes scaling and retraining at the edge. It appears that IBM is the only company with an architecture equipped to effectively handle Day-2 AI operations. That is a significant competitive advantage for IBM.
A company using an architecture that requires data to be moved from the edge back into the cloud for Day-2 related work will be unable to support many factory AI/ML applications because of the sheer number of AI/ML models to support (100s to 1000s).
“There is a huge proliferation of data at the edge that exists in multiple spokes,” Dr. Fuller said. "However, all that data isn’t needed to retrain a model. It is possible to cluster data into groups and then use sampling techniques to retrain the model. There is much value in federated learning from our point of view.”
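Dr. Fuller's cluster-then-sample idea can be illustrated with a small stratified-sampling helper: group records by a cluster label and draw a fixed budget from each group, so retraining sees every mode of the data without ingesting the full spoke dataset. The record layout and group key are assumptions for illustration only:

```python
import random
from collections import defaultdict

def stratified_sample(records, group_key, per_group, seed=0):
    """Cluster-then-sample: group records by a cluster label and draw a fixed
    budget from each group, so a retraining set covers every mode of the data
    without shipping the full spoke dataset back to the hub."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for rec in records:
        groups[group_key(rec)].append(rec)
    sample = []
    for members in groups.values():
        sample.extend(rng.sample(members, min(per_group, len(members))))
    return sample

# Toy records: the "sensor" field stands in for a cluster assignment.
readings = [{"sensor": i % 3, "value": i} for i in range(30)]
subset = stratified_sample(readings, group_key=lambda r: r["sensor"], per_group=2)
```

Thirty readings shrink to six while every cluster stays represented, which is the value Dr. Fuller attributes to sampling over wholesale data movement.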
Federated learning is a promising training solution being researched by IBM and others. It preserves privacy by using a collaboration of edge devices to train models without sharing the data with other entities. It is a good framework to use when resources are limited.
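The canonical aggregation step in federated learning is federated averaging (FedAvg): each spoke sends only its locally trained weights and sample count, and the hub combines them as a weighted average, so raw data never leaves the spoke. A minimal sketch, with models represented as flat weight lists rather than any particular framework's model object:

```python
def federated_average(client_updates):
    """One FedAvg aggregation round: each (weights, n_samples) pair comes from
    a spoke; the hub averages weights in proportion to local data volume.
    Raw training data never leaves the spoke."""
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    merged = [0.0] * dim
    for weights, n in client_updates:
        for i, w in enumerate(weights):
            merged[i] += w * n / total
    return merged

# Two spokes with different data volumes; the larger spoke dominates.
global_weights = federated_average([([1.0, 2.0], 10), ([3.0, 4.0], 30)])
```

The hub then broadcasts the merged weights back to the spokes for the next local training round, which is also why the approach suits resource-limited edge locations.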
Dealing with limited resources at the edge is a challenge. IBM’s edge architecture accommodates the need to ensure resource budgets for AI applications are met, especially when deploying multiple applications and multiple models across edge locations. For that reason, IBM developed a method to deploy data and AI applications to scale Day-2 AI operations utilizing hub and spokes.
The graphic above compares the status quo of Day-2 operations, centralized applications with a centralized data plane, against the more efficient managed hub and spoke method with distributed applications and a distributed data plane. The hub allows everything to be managed from a single pane of glass.
Data Fabric Extensions to Hub and Spokes
IBM uses hub and spoke as a model to extend its data fabric. The model should not be thought of in the context of a traditional hub and spoke. IBM’s hub provides centralized capabilities to manage clusters and create multiple hubs that can be aggregated to a higher level. This architecture has four important data management capabilities.
First, models running in unattended environments must be monitored. From an operational standpoint, detecting when a model’s effectiveness has significantly degraded and if corrective action is needed is critical.
Second, in a hub and spoke model, data is generated and collected in many locations, creating a need for data lifecycle management. Working with large enterprise clients, IBM is building unique capabilities to manage the data plane across the hub and spoke estate, optimized to meet data lifecycle, regulatory, and compliance requirements as well as local resource requirements. Automation determines which input data should be selected and labeled for retraining purposes and used to further improve the model. Atypical data that is judged worthy of human attention is also identified.
The third issue relates to AI pipeline compression and adaptation. As mentioned earlier, edge resources are limited and highly heterogeneous. While a cloud-based model might have a few hundred million parameters or more, edge models can’t afford such resource extravagance because of resource limitations. To reduce the edge compute footprint, model compression can reduce the number of parameters. As an example, it could be reduced from several hundred million to a few million.
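Parameter pruning, one of the simplest compression techniques in the family alluded to here, zeroes out the smallest-magnitude weights and keeps only a top fraction. The sketch below is a generic illustration of magnitude pruning, not IBM's compression pipeline:

```python
def magnitude_prune(weights, keep_ratio):
    """Zero out the smallest-magnitude parameters, keeping roughly the top
    `keep_ratio` fraction -- a simple way to shrink a model's edge footprint.
    (Ties at the threshold may keep slightly more than the target count.)"""
    k = max(1, int(len(weights) * keep_ratio))
    threshold = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    return [w if abs(w) >= threshold else 0.0 for w in weights]

# Keep the largest half of a toy weight vector.
pruned = magnitude_prune([0.5, -0.01, 2.0, 0.003, -1.5, 0.02], keep_ratio=0.5)
```

In practice the zeroed weights are stored in a sparse format (or whole channels are removed), which is how a few hundred million parameters can shrink toward a few million for edge deployment.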
Lastly, suppose a scenario exists where data is produced at multiple spokes but cannot leave those spokes for compliance reasons. In that case, IBM Federated Learning allows learning across heterogeneous data in multiple spokes. Users can discover, curate, categorize and share data assets, data sets, analytical models, and their relationships with other organization members.
In addition to AI deployments, the hub and spoke architecture and the previously mentioned capabilities can be employed more generally to tackle challenges faced by many enterprises in consistently managing an abundance of devices within and across their enterprise locations. Management of the software delivery lifecycle or addressing security vulnerabilities across a vast estate are a case in point.
Multicloud and Edge platform
In the context of its strategy, IBM sees edge and distributed cloud as an extension of its hybrid cloud platform built around Red Hat OpenShift. One of the newer and more useful options created by the Red Hat development team is Single Node OpenShift (SNO), a compact version of OpenShift that fits on a single server. It is suitable for locations that still host servers but call for a single-node deployment rather than a cluster.
For smaller footprints such as industrial PCs or computer vision boards (for example, NVIDIA Jetson Xavier), Red Hat is working on a project that builds an even smaller version of OpenShift, called MicroShift, that provides full application deployment and Kubernetes management capabilities. It is packaged so that it can be used for edge-device deployments.
Overall, IBM and Red Hat have developed a full complement of options to address a large spectrum of deployments across different edge locations and footprints, ranging from single containers to full-blown Kubernetes applications, managed with MicroShift, OpenShift, and IBM Edge Application Manager.
Much is still in the research stage. IBM's objective is to achieve greater consistency in terms of how locations and application lifecycle is managed.
First, Red Hat plans to introduce hierarchical layers of management with Red Hat Advanced Cluster Management (RHACM) to scale the number of edge locations managed by the product by two to three orders of magnitude. Additionally, securing edge locations is a major focus. Red Hat is continuously expanding platform security features, for example by recently including Integrity Measurement Architecture in Red Hat Enterprise Linux, or by adding Integrity Shield to protect policies in RHACM.
Red Hat is partnering with IBM Research to advance technologies that will permit it to protect platform integrity and the integrity of client workloads through the entire software supply chains. In addition, IBM Research is working with Red Hat on analytic capabilities to identify and remediate vulnerabilities and other security risks in code and configurations.
Telco network intelligence and slice management with AI/ML
Communication service providers (CSPs) such as telcos are key enablers of 5G at the edge. 5G benefits for these providers include:
Reduced operating costs
Increased distribution and density
The end-to-end 5G network comprises the Radio Access Network (RAN), transport, and core domains. Network slicing in 5G is an architecture that enables multiple virtual and independent end-to-end logical networks with different characteristics, such as low latency or high bandwidth, to be supported on the same physical network. This is implemented using cloud-native technology enablers such as software-defined networking (SDN), virtualization, and multi-access edge computing. Slicing offers necessary flexibility by allowing the creation of specific applications, unique services, and defined user groups or networks.
An important aspect of enabling AI at the edge requires IBM to provide CSPs with the capability to deploy and manage applications across various enterprise locations, possibly spanning multiple end-to-end network slices, using a single pane of glass.
5G network slicing and slice management
Network slices are an essential part of IBM's edge infrastructure that must be automated, orchestrated and optimized according to 5G standards. IBM’s strategy is to leverage AI/ML to efficiently manage, scale, and optimize the slice quality of service, measured in terms of bandwidth, latency, or other metrics.
5G and AI/ML at the edge also represent a significant opportunity for CSPs to move beyond traditional cellular services and capture new sources of revenue with new services.
Communications service providers need management and control of 5G network slicing enabled with AI-powered automation.
Dr. Fuller sees a variety of opportunities in this area. "When it comes to applying AI and ML on the network, you can detect things like intrusion detection and malicious actors," he said. "You can also determine the best way to route traffic to an end user. Automating 5G functions that run on the network using IBM network automation software also serves as an entry point.”
In IBM’s current telecom trial, IBM Research is spearheading the development of a range of capabilities targeted for the IBM Cloud Pak for Network Automation product using AI and automation to orchestrate, operate and optimize multivendor network functions and services that include:
End-to-end 5G network slice management with planning & design, automation & orchestration, and operations & assurance
Network Data Analytics Function (NWDAF) that collects data for slice monitoring from 5G Core network functions, performs network analytics, and provides insights to authorized data consumers.
Improved operational efficiency and reduced cost
Future leverage of these capabilities by existing IBM Clients that use the Cloud Pak for Network Automation (e.g., DISH) can offer further differentiation for CSPs.
5G radio access
Open radio access networks (O-RANs) are expected to significantly impact telco 5G wireless edge applications by allowing a greater variety of units to access the system. The O-RAN concept separates the DU (Distributed Unit) and CU (Centralized Unit) from the Baseband Unit used in 4G and connects them with open interfaces.
The O-RAN system is more flexible. It uses AI to establish connections made via open interfaces that optimize the category of a device by analyzing information about its prior use. Like other edge models, the O-RAN architecture provides an opportunity for continuous monitoring, verification, analysis, and optimization of AI models.
The IBM-telco collaboration is expected to advance O-RAN interfaces and workflows. Areas currently under development are:
Multi-modal (RF level + network-level) analytics (AI/ML) for wireless communication with high-speed ingest of 5G data
Capability to learn patterns of metric and log data across CUs and DUs in RF analytics
Utilization of the antenna control plane to optimize throughput
Primitives for forecasting, anomaly detection and root cause analysis using ML
Opportunity of value-added functions for O-RAN
IBM Cloud and Infrastructure
The cornerstone for the delivery of IBM's edge solutions as a service is IBM Cloud Satellite. It presents a consistent cloud-ready, cloud-native operational view with OpenShift and IBM Cloud PaaS services at the edge. In addition, IBM integrated hardware and software edge systems will provide RHACM-based management of the platform when clients or third parties have existing managed-as-a-service models. It is essential to note that in either case this is done within a single control plane for hubs and spokes that helps optimize execution and management from any cloud to the edge in the hub and spoke model.
IBM's focus on “edge in” means it can provide the infrastructure through offerings like the example shown above: software-defined storage for a federated-namespace data lake that surrounds other hyperscaler clouds. Additionally, IBM is exploring integrated full-stack edge storage appliances based on hyperconverged infrastructure (HCI), such as Spectrum Fusion HCI, for enterprise edge deployments.
As mentioned earlier, data gravity is one of the main driving factors of edge deployments. IBM has designed its infrastructure to meet those data gravity requirements, not just for the existing hub and spoke topology but also for a future spoke-to-spoke topology where peer-to-peer data sharing becomes imperative (as illustrated with the wealth of examples provided in this article).
Edge is a distributed computing model. One of its main advantages is that computing, data storage, and processing are close to where data is created. Without the need to move data to the cloud for processing, real-time application of analytics and AI capabilities provides immediate solutions and drives business value.
IBM’s goal is not to move the entirety of its cloud infrastructure to the edge. That has little value and would simply function as a hub to spoke model operating on actions and configurations dictated by the hub.
IBM’s architecture will provide the edge with autonomy to determine where data should reside and from where the control plane should be exercised.
Equally important, IBM foresees this architecture evolving into a decentralized model capable of edge-to-edge interactions. IBM has no firm designs for this as yet. However, the plan is to make the edge infrastructure and platform a first-class citizen instead of relying on the cloud to drive what happens at the edge.
Developing a complete and comprehensive AI/ML edge architecture - and in fact, an entire ecosystem - is a massive undertaking. IBM faces many known and unknown challenges that must be solved before it can achieve success.
However, IBM is one of the few companies with the necessary partners and the technical and financial resources to undertake and successfully implement a project of this magnitude and complexity.
It is reassuring that IBM has a plan and that its plan is sound.
Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence, and space at Moor Insights and Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 8×8, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung 
Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign,TE Connectivity, TensTorrent, Tobii Technology, Teradata,T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movandi.
Mon, 08 Aug 2022 03:51:00 -0500, Paul Smith-Goodson, https://www.forbes.com/sites/moorinsights/2022/08/08/ibm-research-rolls-out-a-comprehensive-ai-and-ml-edge-research-strategy-anchored-by-enterprise-partnerships-and-use-cases/

Colorado’s P-TECH Students Graduate Ready for Tech Careers

(TNS) — Abraham Tinajero was an eighth grader when he saw a poster in his Longmont middle school’s library advertising a new program offering free college with a technology focus.
Interested, he talked to a counselor to learn more about P-TECH, an early college program where he could earn an associate’s degree along with his high school diploma. Liking the sound of the program, he enrolled in the inaugural P-TECH class as a freshman at Longmont’s Skyline High School.
“I really loved working on computers, even before P-TECH,” he said. “I was a hobbyist. P-TECH gave me a pathway.”
He worked with an IBM mentor and interned at the company for six weeks as a junior. After graduating in 2020 with his high school diploma and the promised associate’s degree in computer science from Front Range Community College, he was accepted to IBM’s yearlong, paid apprenticeship program.
IBM hired him as a cybersecurity analyst once he completed the apprenticeship.
“P-TECH has given me a great advantage,” he said. “Without it, I would have been questioning whether to go into college. Having a college degree at 18 is great to put on a resume.”
Stanley Litow, a former vice president of IBM, developed the P-TECH, or Pathways in Technology Early College High Schools, model. The first P-TECH school opened 11 years ago in Brooklyn, New York, in partnership with IBM.
Litow’s idea was to get more underrepresented young people into tech careers by giving them a direct path to college while in high school — and in turn create a pipeline of employees with the job skills businesses were starting to value over four-year college degrees.
The program, which includes mentors and internships provided by business partners, gives high school students up to six years to earn an associate's degree at no cost.
SKYLINE HIGH A PIONEER IN PROGRAM
In Colorado, St. Vrain Valley was among the first school districts chosen by the state to offer a P-TECH program after the Legislature passed a bill to provide funding — and the school district has embraced the program.
Colorado’s first P-TECH programs started in the fall of 2016 at three high schools, including Skyline High. Over the last six years, 17 more Colorado high schools have adopted P-TECH, for a total of 20. Three of those are in St. Vrain Valley, with a fourth planned to open in the fall of 2023 at Longmont High School.
Each St. Vrain Valley high school offers a different focus supported by different industry partners.
Skyline partners with IBM, with students earning an associate’s degree in Computer Information Systems from Front Range. Along with being the first, Skyline’s program is the largest, enrolling up to 55 new freshmen each year.
Programs at the other schools are capped at 35 students per grade.
Frederick High’s program, which started in the fall of 2019, has a bioscience focus, partners with Aims Community College and works with industry partners Agilent Technologies, Tolmar, KBI Biopharma, AGC Biologics and Corden Pharma.
Silver Creek High’s program started a year ago with a cybersecurity focus. The Longmont school partners with Front Range and works with industry partners Seagate, Cisco, PEAK Resources and Comcast.
The new program coming to Longmont High will focus on business.
District leaders point to Skyline High’s graduation statistics to illustrate the program’s success. At Skyline, 100 percent of students in the first three P-TECH graduating classes earned a high school diploma in four years.
For the 2020 Skyline P-TECH graduates, 24 of the 33, or about 70 percent, also earned associate’s degrees. For the 2021 graduating class, 30 of the 47 have associate’s degrees — with one year left for those students to complete the college requirements.
For the most recent 2022 graduates, who have two years left to complete the college requirements, 19 of 59 have associate’s degrees and another six are on track to earn their degrees by the end of the summer.
JUMPING AT AN OPPORTUNITY
Louise March, Skyline High’s P-TECH counselor, keeps in touch with the graduates, saying 27 are working part time or full time at IBM. About a third are continuing their education at a four-year college. Of the 19 who graduated in 2022 with an associate’s degree, 17 are enrolling at a four-year college, she said.
Two of those 2022 graduates are Anahi Sarmiento, who is headed to the University of Colorado Boulder’s Leeds School of Business, and Jose Ivarra, who will study computer science at Colorado State University.
“I’m the oldest out of three siblings,” Ivarra said. “When you hear that someone wants to give you free college in high school, you take it. I jumped at the opportunity.”
Sarmiento added that her parents, who are immigrants, are already working two jobs and don’t have extra money for college costs.
“P-TECH is pushing me forward,” she said. “I know my parents want me to have a better life, but I want them to have a better life, too. Going into high school, I kept that mentality that I would push myself to my full potential. It kept me motivated.”
While the program requires hard work, the two graduates said, they still enjoyed high school and had outside interests. Ivarra was a varsity football player who was named player of the year. Sarmiento took advantage of multiple opportunities, from helping elementary students learn robotics to working at the district’s Innovation Center.
Ivarra said he likes that P-TECH has the same high expectations for all students, no matter their backgrounds, and gives them support in any areas where they need help. Spanish is his first language and, while math came naturally, language arts was more challenging.
“It was tough for me to see all these classmates use all these big words, and I didn’t know them,” he said. “I just felt less. When I went into P-TECH, the teachers focus on you so much, checking on every single student.”
They said it’s OK to struggle or even fail. Ivarra said he failed a tough class during the pandemic, but was able to retake it and passed. Both credited March, their counselor, with providing unending support as they navigated high school and college classes.
“She’s always there for you,” Sarmiento said. “It’s hard to be on top of everything. You have someone to go to.”
Students also supported each other.
“You build bonds,” Ivarra said. “You’re all trying to figure out these classes. You grow together. It’s a bunch of people who want to succeed. The people that surround you in P-TECH, they push you to be better.”
SUPPORT SYSTEMS ARE KEY
P-TECH has no entrance requirements or prerequisite classes. You don’t need to be a top student, have taken advanced math or have a background in technology.
With students starting the rigorous program with a wide range of skills, teachers and counselors said, they quickly figured out the program needed stronger support systems.
March said freshmen in the first P-TECH class struggled that first semester, prompting the creation of a guided study class. The class, which meets every other day for an hour and a half, includes both study time and time to learn workplace skills, including writing a resume and interviewing. Teachers also offer tutoring twice a week after school.
“The guided study has become crucial to the success of the program,” March said.
Another way P-TECH provides extra support is through summer orientation programs for incoming freshmen.
At Skyline, ninth graders take a three-week bridge class — worth half a credit — that includes learning good study habits. They also meet IBM mentors and take a field trip to Front Range Community College.
“They get their college ID before they get their high school ID,” March said.
During a session in June, 15 IBM mentors helped the students program a Sphero robot to travel along different track configurations. Kathleen Schuster, who has volunteered as an IBM mentor since the P-TECH program started here, said she wants to “return some of the favors I got when I was younger.”
“Even this play stuff with the Spheros, it’s teaching them teamwork and a little computing,” she said. “Hopefully, through P-TECH, they will learn what it takes to work in a tech job.”
Incoming Skyline freshman Blake Baker said he found a passion for programming at Trail Ridge Middle and saw P-TECH as a way to capitalize on that passion.
“I really love that they give you options and a path,” he said.
Trail Ridge classmate Itzel Pereyra, another programming enthusiast, heard about P-TECH from her older brother.
“It’s really good for my future,” she said. “It’s an exciting moment, starting the program. It will just help you with everything.”
While some of the incoming ninth graders shared dreams of technology careers, others see P-TECH as a good foundation to pursue other dreams.
Skyline incoming ninth grader Marisol Sanchez wants to become a traveling nurse, demonstrating technology and new skills to other nurses. She added that the summer orientation sessions are a good introduction, helping calm the nerves that accompany combining high school and college.
“There’s a lot of team building,” she said. “It’s getting us all stronger together as a group and introducing everyone.”
THE SPARK OF MOTIVATION
Silver Creek’s June camp for incoming ninth graders included field trips to visit Cisco, Seagate, PEAK Resources, Comcast and Front Range Community College.
During the Front Range Community College field trip, the students heard from Front Range staff members before going on a scavenger hunt. Groups took photos to prove they completed tasks, snapping pictures of ceramic pieces near the art rooms, the most expensive tech product for sale in the bookstore and administrative offices across the street from the main building.
Emma Horton, an incoming freshman, took a cybersecurity class as a Flagstaff Academy eighth grader that hooked her on the idea of technology as a career.
“I’m really excited about the experience I will be getting in P-TECH,” she said. “I’ve never been super motivated in school, but with something I’m really interested in, it becomes easier.”
Deb Craven, dean of instruction at Front Range’s Boulder County campus, promised the Silver Creek students that the college would support them. She also gave them some advice.
“You need to advocate and ask for help,” she said. “These two things are going to help you the most. Be present, be engaged, work together and lean on each other.”
Craven, who oversees Front Range’s P-TECH program partnership, said Front Range leaders toured the original P-TECH program in New York along with St. Vrain and IBM leaders in preparation for bringing P-TECH here.
“Having IBM as a partner as we started the program was really helpful,” she said.
When the program began, she said, freshmen took a more advanced technology class as their first college class. Now, she said, they start with a more fundamental class in the spring of their freshman year, learning how to build a computer.
“These guys have a chance to grow into the high school environment before we stick them in a college class,” she said.
Summer opportunities aren’t just for P-TECH’s freshmen. Along with summer internships, the schools and community colleges offer summer classes.
Silver Creek incoming 10th graders, for example, could take a personal financial literacy class at Silver Creek in the mornings and an introduction to cybersecurity class at the Innovation Center in the afternoons in June.
Over at Skyline, incoming 10th graders in P-TECH are getting paid to teach STEM lessons to elementary students while earning high school credit. Students in the fifth or sixth year of the program also had the option of taking computer science and algebra classes at Front Range.
EMBRACING THE CHALLENGE
And at Frederick, incoming juniors are taking an introduction to manufacturing class at the district's Career Elevation and Technology Center this month in preparation for an advanced manufacturing class they’re taking in the fall.
“This will give them a head start for the fall,” said instructor Chester Clark.
Incoming Frederick junior Destini Johnson said she’s not sure what she wants to do after high school, but believes the opportunities offered by P-TECH will prepare her for the future.
“I wanted to try something challenging, and getting a head start on college can only help,” she said. “It’s really incredible that I’m already halfway done with an associate’s degree and high school.”
IBM P-TECH program manager Tracy Knick, who has worked with the Skyline High program for three years, said it takes a strong commitment from all the partners — the school district, IBM and Front Range — to make the program work.
“It’s not an easy model,” she said. “When you say there are no entrance requirements, we all have to be OK with that and support the students to be successful.”
IBM hosted 60 St. Vrain interns this summer, while two Skyline students work as IBM “co-ops” — a national program — to assist with the P-TECH program.
The company hosts two to four formal events for the students each year to work on professional and technical skills, while IBM mentors provide tutoring in algebra. During the pandemic, IBM also paid for subscriptions to tutor.com so students could get immediate help while taking online classes.
“We want to get them truly workforce ready,” Knick said. “They’re not IBM-only skills we’re teaching. Even though they choose a pathway, they can really do anything.”
As the program continues to expand in the district, she said, her wish is for more businesses to recognize the value of P-TECH.
“These students have had intensive training on professional skills,” she said. “They have taken college classes enhanced with the same digital credentials that an IBM employee can learn. There should be a waiting list of employers for these really talented and skilled young professionals.”
Thu, 04 Aug 2022 02:41:00 -0500
https://www.govtech.com/education/k-12/colorados-p-tech-students-graduate-ready-for-tech-careers

How Tech Sector is Significantly Disadvantaged by an AI Skills Shortage
Guest blog: Sreeram Visvanathan, Chief Executive of IBM UK and Ireland, exposes a worrying shortfall in skills required for a career in AI.
I believe that the last two decades in enterprise computing has been the prequel to the main act to follow. In this main act, the winners will be enterprises willing to change, to question everything, to leverage the latest in digital innovation to scale the impact of AI, Hybrid Cloud and automation on every aspect of their business.
The Covid pandemic disrupted business-as-usual for most companies, and many turned to digital technology, including AI, to sustain operations. Earlier this year, IBM released a study revealing the size of the AI skills gap across Europe, which found that the tech sector is struggling to find employees with adequate AI knowledge or experience. The research found nearly 7 in 10 tech job seekers and tech employees believe that potential recruits lack the skills necessary for a career in AI. The impact of this deficit has the potential to stifle digital innovation and hold back economic growth.
Mind the gap
The IBM report, ‘Addressing the AI Skills Gap in Europe’, exposed a worrying shortfall in skills required for a career in AI. Although technical capabilities are vital for a career in the sector, problem solving is considered the most critical soft skill needed for tech roles among all survey participants (up to 37%). However, around a quarter of tech recruiters (23%) have difficulty finding applicants with this aptitude along with shortfalls in critical and strategic thinking. Along with soft skills, 40% of tech job seekers and employees noted that software engineering and knowledge of programming languages are the most important technical capabilities for the AI/tech workforce to have.
How to address the issue
As AI moves into the mainstream, specialist tech staff are working more closely than ever with business managers. In order to secure the best possible outcomes, the soft skills of interpersonal communication, strategic problem solving, and critical thinking are required across all disciplines to help ensure the most beneficial personal interactions. Demonstrating these skills can greatly improve employability and career development in AI.
The report showed that offering education and skills training is seen as a top priority for many companies looking to improve AI recruitment in the future. As a result, IBM has already taken proactive steps to help applicants and employees enhance their AI skills.
IBM launched IBM SkillsBuild, which brings together two world-class, skills-based learning programs—"Open P-TECH" and "SkillsBuild"—under one umbrella. Through the program, students, educators, job seekers, and the organisations that support them have access to free digital learning, resources, and support focused on the core technology and workplace skills needed to succeed in jobs. SkillsBuild is a free programme which contains an AI skills module for secondary education students and adults seeking entry-level employment.
Further concerted effort
A great deal remains to be done to solve this skills gap. However, I believe we can agree that a solution is achievable. What’s required now is for industry, government and academia to work together to put existing ideas into practice and to think of new ways to solve the challenge. At the start of the year, the DCMS announced £23 million of government funding to create 2,000 scholarships in AI and data science in England. The new scholarships from this funding will ensure more people can build successful careers in AI, create and develop new and bigger businesses, and will improve the diversity of this growing and innovative sector. I hope to see further investment and programs such as ours with SkillsBuild as key drivers of change. Finding solutions and initiatives such as these will ensure we are providing a significant boost for the UK while providing a rewarding career for many.
This article was authored by Sreeram Visvanathan, Chief Executive of IBM UK and Ireland
NEW YORK, Aug. 9, 2022 /PRNewswire/ -- The Insight Partners published its latest research study, "Predictive Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component [Solution (Risk Analytics, Marketing Analytics, Sales Analytics, Customer Analytics, and Others) and Service], Deployment Mode (On-Premise and Cloud-Based), Organization Size [Small and Medium Enterprises (SMEs) and Large Enterprises], and Industry Vertical (IT & Telecom, BFSI, Energy & Utilities, Government and Defence, Retail and e-Commerce, Manufacturing, and Others)". According to the report, the global predictive analytics market size is projected to grow from $12.49 billion in 2022 to $38.03 billion by 2028, at a CAGR of 20.4%.
Predictive Analytics Market Report Scope & Strategic Insights:
Market Size Value in 2022: US$ 12.49 Billion
Market Size Value by 2028: US$ 38.03 Billion
Growth Rate: CAGR of 20.4% from 2022 to 2028
No. of Pages:
No. of Charts & Figures:
Historical data available:
Segments Covered: Component, Deployment Mode, Organization Size, and Industry Vertical
Regional Scope: North America; Europe; Asia Pacific; Latin America; MEA
Country Scope: US, UK, Canada, Germany, France, Italy, Australia, Russia, China, Japan, South Korea, Saudi Arabia, Brazil, Argentina
Report Coverage: Revenue forecast, company ranking, competitive landscape, growth factors, and trends
Predictive Analytics Market: Competitive Landscape and Key Developments
IBM Corporation; Microsoft Corporation; Oracle Corporation; SAP SE; Google LLC; SAS Institute Inc.; Salesforce.com, inc.; Amazon Web Services; Hewlett Packard Enterprise Development LP (HPE); and NTT DATA Corporation are among the leading players profiled in this predictive analytics market report. Several other essential players were analyzed for a holistic view of the market and its ecosystem. The report provides detailed market insights, which help the key players strategize their growth.
In 2022, Microsoft partnered with Teradata, a provider of a multi-cloud platform for enterprise analytics, for the integration of Teradata's Vantage data platform into Microsoft Azure.
In 2021, IBM and Black & Veatch collaborated to assist customers in keeping their assets and equipment working at peak performance and reliability by integrating AI with real-time data analytics.
In 2020, Microsoft partnered with SAS for the extension of their business solutions. As a part of this move, the companies will migrate SAS analytical products and solutions to Microsoft Azure as a preferred cloud provider for SAS cloud.
Increase in Uptake of Predictive Analytics Tools Propels Predictive Analytics Market Growth:
Predictive analytics tools use data to state the probabilities of the possible outcomes in the future. Knowing these probabilities can help users plan many aspects of their business. Predictive analytics is part of a larger set of data analytics; other aspects of data analytics include descriptive analytics, which helps users understand what their data represent; diagnostic analytics, which helps identify the causes of past events; and prescriptive analytics, which provides users with practical advice to make better decisions.
Prescriptive analytics is similar to predictive analytics. Predictive modeling is the most technical aspect of predictive analytics. Data analysts perform modeling with statistics and other historical data. The model then estimates the likelihood of different outcomes. In e-commerce, predictive modeling tools help analyze customer data. It can predict how many people are likely to buy a certain product. It can also predict the return on investment (ROI) of targeted marketing campaigns. Some software-as-a-service (SaaS) may collect data directly from online stores, such as Amazon Marketplace.
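To make the idea concrete, here is a minimal sketch of the kind of estimate such a tool produces: a frequency-based purchase probability from historical data, fed into an expected-ROI calculation for a targeted campaign. All function names and figures are invented for illustration and do not come from any vendor's product.

```python
# Minimal sketch of a predictive-analytics estimate with made-up numbers.

def purchase_probability(purchases: int, visits: int) -> float:
    """Naive frequency-based estimate of P(purchase | visit)."""
    if visits == 0:
        return 0.0
    return purchases / visits

def expected_campaign_roi(reach: int, prob: float,
                          margin_per_sale: float, cost: float) -> float:
    """Expected ROI = (expected profit - cost) / cost."""
    expected_profit = reach * prob * margin_per_sale
    return (expected_profit - cost) / cost

p = purchase_probability(120, 4000)            # 3% historical conversion
roi = expected_campaign_roi(50_000, p, 12.0, 9_000.0)
print(round(p, 3), round(roi, 2))              # 0.03 1.0
```

Real predictive models replace the raw frequency with statistical or machine-learning models fit to many historical variables, but the output is the same kind of probability that feeds planning decisions.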
Predictive analytics tools may benefit social media marketing by guiding users to plan the type of content to post; these tools also recommend the best time and day to post. Manufacturing industries need predictive analytics to manage inventory, supply chains, and staff hiring processes. Transport planning and execution are performed more efficiently with predictive analytics tools. For instance, SAP is a leading multinational software company. Its Predictive Analytics was one of the leading data analytics platforms across the world. Now, the software is gradually being integrated into SAP's larger Cloud Analytics platform, which does more business intelligence (BI) than SAP Predictive Analytics. SAP Analytics Cloud, which works on all devices, utilizes artificial intelligence (AI) to improve business planning and forecasting. This analytics platform can be easily extended to businesses of all sizes.
North America is one of the most vital regions for the uptake and growth of new technologies due to favorable government policies that boost innovation, the presence of a substantial industrial base, and high purchasing power, especially in developed countries such as the US and Canada. The industrial sector in the US is a prominent market for security analytics. The country consists of a large number of predictive analytics platform developers. The COVID-19 pandemic enforced companies to adopt the work-from-home culture, increasing the demand for big data and data analytics.
The pandemic created an enormous challenge for businesses in North America to continue operating despite massive shutdowns of offices and other facilities. Furthermore, the surge in digital traffic presented an opportunity for numerous online frauds, phishing attacks, denial of inventory, and ransomware attacks. Due to the increased risk of cybercrimes, enterprises began adopting advanced predictive analytics-based solutions to detect and manage any abnormal behavior in their networks. Thus, with the growing number of remote working facilities, the need for predictive analytics solutions also increased in North America during the COVID-19 pandemic.
Predictive Analytics Market: Industry Overview
The predictive analytics market is segmented on the basis of component, deployment mode, organization size, industry vertical, and geography. The predictive analytics market analysis, by component, is segmented into solutions and services. The predictive analytics market based on solution is segmented into risk analytics, marketing analytics, sales analytics, customer analytics, and others. The predictive analytics market analysis, by deployment mode, is bifurcated into cloud and on-premises. The predictive analytics market, by organization size, is segmented into large enterprises, and small and medium-sized enterprises (SMEs). The predictive analytics market, by vertical, is segmented into BFSI, manufacturing, retail and e-Commerce, IT and telecom, energy and utilities, government and defense, and others.
In terms of geography, the predictive analytics market is categorized into five regions—North America, Europe, Asia Pacific (APAC), the Middle East & Africa (MEA), and South America (SAM). The predictive analytics market in North America is further segmented into the US, Canada, and Mexico. Predictive analytics software is increasingly being adopted in multiple organizations, and cloud-based predictive analytics software solutions are gaining significance in SMEs in North America. The highly competitive retail sector in this region is harnessing the potential of this technique to efficiently transform store layouts and enhance the customer experience in various businesses. In a few North American countries, retailers use smart carts with locator beacons, pin-sized cameras installed near shelves, or the store's Wi-Fi network to determine the footfall in the store, provide directions to a specific product section, and check key areas visited by customers. This process can also provide basic demographic data for parameters such as gender and age.
Wal-Mart, Costco, Kroger, The Home Depot, and Target have their origin in North America. The amount of data generated by stores surges with the rise in sales. Without implementing analytics solutions, it becomes difficult to manage such vast data that include records, behaviors, etc., of all customers. Players such as Euclid Analytics offer spatial analytics platforms for retailers operating offline to help them track customer traffic, loyalty, and other indicators associated with customer visits. Euclid's solutions include preconfigured sensors connected to switches that are linked through a network. These sensors can detect customer calls from devices that have Wi-Fi turned on. Additionally, IBM's Sterling Store Engagement solution provides a real-time view of store inventory, and order data through an intuitive user interface that can be accessed by store owners from counters and mobile devices.
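As a toy illustration of the footfall-counting idea described above, the sketch below estimates per-zone foot traffic by counting distinct device identifiers reported by in-store Wi-Fi sensors. The data model and zone names are invented; production systems (and the privacy rules governing them) are considerably more involved.

```python
# Illustrative footfall estimate: count distinct Wi-Fi devices per store zone.
from collections import defaultdict

def footfall_by_zone(observations):
    """observations: (zone, device_id) pairs reported by in-store sensors."""
    seen = defaultdict(set)
    for zone, device in observations:
        seen[zone].add(device)           # a device counts once per zone
    return {zone: len(devices) for zone, devices in seen.items()}

pings = [
    ("entrance", "aa:01"), ("entrance", "aa:02"), ("entrance", "aa:01"),
    ("electronics", "aa:02"), ("electronics", "aa:03"),
]
print(footfall_by_zone(pings))  # {'entrance': 2, 'electronics': 2}
```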
Heavy investments in healthcare sectors, advancements in technologies to help manage a large number of medical records, and the use of Big Data analytics to efficiently predict at-risk patients and create effective treatment plans are further contributing to the growth of the predictive analytics market in North America. Predictive analytics helps assess patterns in a patient's medical records, thereby allowing healthcare professionals to develop effective treatment plans to improve outcomes. During the COVID-19 pandemic, healthcare predictive analytics solutions helped provide hospitals with insightful predictions of the number of hospitalizations for various treatments, which significantly helped them deal with the influx of a large number of patients. However, the high costs of installation and a shortage of skilled workers may limit the use of predictive analytics solutions in both the retail and healthcare sectors.
Browse Adjoining Reports:
Procurement Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Application (Supply Chain Analytics, Risk Analytics, Spend Analytics, Demand Forecasting, Contract Management, Vendor Management); Deployment (Cloud, On Premises); Industry Vertical (Retail and E Commerce, Manufacturing, Government and Defense, Healthcare and Life sciences, Telecom and IT, Energy and Utility, Banking Financial Services and Insurance) and Geography
Risk Analytics Market Forecast to 2028 - Covid-19 Impact and Global Analysis - by Component (Software, Services); Type (Strategic Risk, Financial Risk, Operational Risk, Others); Deployment Mode (Cloud, On-Premise); Industry Vertical (BFSI, IT and Telecom, Manufacturing, Retail and Consumer Goods, Transportation and Logistics, Government and Defense, Energy and Utilities, Healthcare and Life Sciences, Others) and Geography
Preventive Risk Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component (Solution, Services); Deployment Type (On-Premise, Cloud); Organization Size (SMEs, Large Enterprises); Type (Strategic Risks, Financial Risks, Operational Risks, Compliance Risks); Industry (BFSI, Energy and Utilities, Government and Defense, Healthcare, Manufacturing, IT and Telecom, Retail, Others) and Geography
Business Analytics Market Forecast to 2028 - Covid-19 Impact and Global Analysis - by Application (Supply Chain Analytics, Spatial Analytics, Workforce Analytics, Marketing Analytics, Behavioral Analytics, Risk And Credit Analytics, and Pricing Analytics); Deployment (On-Premise, Cloud, and Hybrid); End-user (BFSI, IT & Telecom, Manufacturing, Retail, Energy & Power, and Healthcare)
Big Data Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component (Software and Services), Analytics Tool (Dashboard and Data Visualization, Data Mining and Warehousing, Self-Service Tool, Reporting, and Others), Application (Customer Analytics, Supply Chain Analytics, Marketing Analytics, Pricing Analytics, Workforce Analytics, and Others), and End Use Industry (Pharmaceutical, Semiconductor, Battery Manufacturing, Electronics, and Others)
Data Analytics Outsourcing Market to 2027 - Global Analysis and Forecasts by Type (Descriptive Data Analytics, Predictive Data Analytics, and Prescriptive Data Analytics); Application (Sales Analytics, Marketing Analytics, Risk & Finance Analytics, and Supply Chain Analytics); and End-user (BFSI, Healthcare, Retail, Manufacturing, Telecom, and Media & Entertainment)
Sales Performance Management Market Forecast to 2028 - Covid-19 Impact and Global Analysis - by Solution (Incentive Compensation Management, Territory Management, Sales Monitoring and Planning, and Sales Analytics), Deployment Type (On-premise, Cloud), Services (Professional Services, Managed Services), End User (BFSI, Manufacturing, Energy and Utility, and Healthcare)
Customer Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component (Solution, Services); Deployment Type (On-premises, Cloud); Enterprise Size (Small and Medium-sized Enterprises, Large Enterprises); End-user (BFSI, IT and Telecom, Media and Entertainment, Consumer Goods and Retail, Travel and Hospitality, Others) and Geography
Life Science Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Type (Predictive Analytics, Prescriptive Analytics, Descriptive Analytics); Component (Services, Software); End User (Pharmaceutical & Biotechnology Companies, Research Centers, Medical Device Companies, Third-Party Administrators)
The Insight Partners is a one stop industry research provider of actionable intelligence. We help our clients in getting solutions to their research requirements through our syndicated and consulting research services. We specialize in industries such as Semiconductor and Electronics, Aerospace and Defense, Automotive and Transportation, Biotechnology, Healthcare IT, Manufacturing and Construction, Medical Device, Technology, Media and Telecommunications, Chemicals and Materials.
If you have any queries about this report or if you would like further information, please contact us:
Tue, 09 Aug 2022 00:56:00 -0500
https://www.tmcnet.com/usubmit/-predictive-analytics-market-worth-38-billion-2028-204-/2022/08/09/9652572.htm
Preventing and mitigating cyberattacks is a day-to-day — sometimes hour-to-hour — endeavor for enterprises. New, more advanced techniques are revealed constantly, especially with the rise in ransomware-as-a-service, crime syndicates and cybercrime commoditization. Likewise, statistics are seemingly endless, with a regular churn of new, updated reports and research studies revealing worsening conditions.
According to Fortune Business Insights, the worldwide information security market will reach around $376 billion by 2029. And IBM research revealed that the average cost of a data breach is $4.35 million.
The harsh truth is that many organizations are exposed due to common software, hardware or organizational process vulnerabilities — and 93% of all networks are open to breaches, according to another recent report.
Cybersecurity must therefore be a team effort, said Scott Sutherland, senior director at NetSPI, which specializes in enterprise penetration testing and attack-surface management.
The company today announced the release of two new open-source tools for the information security community: PowerHuntShares and PowerHunt. Sutherland is demoing both at Black Hat USA this week.
These new tools are aimed at helping defense, identity and access management (IAM) and security operations center (SOC) teams discover vulnerable network shares and improve detections, said Sutherland.
They have been developed — and released in an open-source capacity — to “help ensure our penetration testers and the IT community can more effectively identify and remediate excessive share permissions that are being abused by bad actors like ransomware groups,” said Sutherland.
He added, “They can be used as part of a regular quarterly cadence, but the hope is they’ll be a starting point for companies that lacked awareness around these issues before the tools were released.”
Vulnerabilities revealed (by the good guys)
The new PowerHuntShares capability inventories, analyzes and reports excessive privilege assigned to server message block (SMB) shares on Microsoft’s Active Directory (AD) domain-joined computers.
SMB allows applications on a computer to read and write to files and to request services from server programs in a computer network.
NetSPI’s new tool helps address risks of excessive share permissions in AD environments that can lead to data exposure, privilege escalation and ransomware attacks within enterprise environments, explained Sutherland.
“PowerHuntShares is focused on identifying shares configured with excessive permissions and providing data insight to understand how they are related to each other, when they were introduced into the environment, who owns them and how exploitable they are,” said Sutherland.
For instance, according to a recent study from cybersecurity company ExtraHop, SMB was the most prevalent exposed protocol in many industries: 34 out of 10,000 devices in financial services; seven out of 10,000 devices in healthcare; and five out of 10,000 devices in state, local and education (SLED).
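PowerHuntShares itself is a PowerShell tool, but the core triage idea it implements (flag shares where broad groups such as Everyone or Domain Users hold write or full-control rights) can be sketched in a few lines of Python. The field names, group names and risk rules below are illustrative assumptions, not NetSPI's actual schema:

```python
# Hypothetical sketch: flag SMB share ACL entries that grant broad groups
# write or full-control access. Field names and rules are illustrative,
# not the actual PowerHuntShares schema.

HIGH_RISK_GROUPS = {"Everyone", "BUILTIN\\Users", "Domain Users", "Authenticated Users"}
RISKY_RIGHTS = {"Write", "Modify", "FullControl"}

def flag_excessive_shares(acl_entries):
    """acl_entries: iterable of dicts with 'share', 'identity', 'rights' (a set)."""
    findings = []
    for entry in acl_entries:
        risky = entry["rights"] & RISKY_RIGHTS
        if entry["identity"] in HIGH_RISK_GROUPS and risky:
            findings.append((entry["share"], entry["identity"], sorted(risky)))
    return findings

acls = [
    {"share": r"\\FS01\public",  "identity": "Everyone",     "rights": {"Read", "Write"}},
    {"share": r"\\FS01\finance", "identity": "Finance-RW",   "rights": {"Read", "Write"}},
    {"share": r"\\FS02\backups", "identity": "Domain Users", "rights": {"FullControl"}},
]

for share, who, rights in flag_excessive_shares(acls):
    print(f"{share}: {who} has {', '.join(rights)}")
```

In practice a tool like this would feed on ACL exports collected from domain-joined hosts; the point of the sketch is only the triage rule: a broad identity plus a writable right equals a finding worth reviewing.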
Enhanced threat hunting
Meanwhile, PowerHunt is a modular threat-hunting framework that identifies signs of compromise based on artifacts from common MITRE ATT&CK techniques. It also detects anomalies and outliers specific to the target environment.
The new tool can be used to quickly collect artifacts commonly associated with malicious behavior, explained Sutherland. It automates artifact collection at scale using Microsoft PowerShell and performs initial analysis. It can also output .csv files that are easy to consume, allowing additional triage and analysis through other tools and processes.
“While [the PowerHunt tool] calls out suspicious artifacts and statistical anomalies, its greatest value is simply producing data that can be used by other tools during threat-hunting exercises,” said Sutherland.
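The anomaly-and-outlier pass described above is commonly built on a threat-hunting technique called stacking: collect the same artifact type (autorun entries, scheduled tasks, service binaries) from every host, count how many hosts each distinct value appears on, and surface the rarest values for triage. Here is a minimal, self-contained sketch of that idea in Python; the fleet data and threshold are made up for illustration and are not PowerHunt's actual logic:

```python
from collections import Counter

def stack_artifacts(host_artifacts, max_hosts=1):
    """Surface artifact values seen on at most `max_hosts` hosts.

    host_artifacts: dict mapping host name -> set of artifact values
    (e.g., scheduled-task command lines). Rare values are triage leads,
    not verdicts.
    """
    counts = Counter()
    for artifacts in host_artifacts.values():
        counts.update(set(artifacts))  # count each value once per host
    return sorted(v for v, n in counts.items() if n <= max_hosts)

fleet = {
    "WKS01": {"C:\\Windows\\System32\\svchost.exe", "C:\\Tools\\backup.exe"},
    "WKS02": {"C:\\Windows\\System32\\svchost.exe", "C:\\Tools\\backup.exe"},
    "WKS03": {"C:\\Windows\\System32\\svchost.exe", "C:\\Users\\Public\\x.exe"},
}

print(stack_artifacts(fleet))  # the one-off binary stands out
```

The design choice here mirrors Sutherland's point: the value is producing data for further analysis, so the function returns leads rather than making verdicts.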
NetSPI offers penetration testing as a service (PTaaS) through its Resolve™ penetration testing and vulnerability management platform. With this, its experts perform deep-dive manual penetration testing across application, network and cloud attack surfaces, said Sutherland. To date, they have tested more than 1 million assets and found 4 million unique vulnerabilities.
The company’s global penetration testing team has also developed several open-source tools, including PowerUpSQL and MicroBurst.
Sutherland underscored the importance of open-source tool development and said that NetSPI actively encourages innovation through collaboration.
Open source offers “the ability to use tools for free to better understand a concept or issue,” he said. And, while most open-source tools may not end up being an enterprise solution, they can bring awareness to specific issues and “encourage exploration of long-term solutions.”
The ability to customize code is another advantage — anyone can download an open-source project and customize it to their needs.
Ultimately, open source offers an “incredibly powerful” ability, said Sutherland. “It’s great to be able to learn from someone else’s code, build off that idea, collaborate with a complete stranger and produce something new that you can share with thousands of people instantly around the world.”
Specifically relating to PowerHuntShares and PowerHunt, he urged the security community to check them out and contribute to them.
“This will allow the community to better understand our SMB share attack surfaces and Boost strategies for remediation — together,” he said.
VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.
Tue, 09 Aug 2022 09:00:00 -0500 | Taryn Plumb | https://venturebeat.com/security/netspi-rolls-out-2-new-open-source-pen-testing-tools-at-black-hat/

Killexams : IBM report: Middle Eastern consumers pay the price as regional data breach costs reach all-time high
Riyadh, Saudi Arabia: IBM, the leading global technology company, has published a study highlighting the importance of cybersecurity in an increasingly digital age. According to IBM Security’s annual Cost of a Data Breach Report, the Middle East has incurred losses of SAR 28 million from data breaches in 2022 alone — this figure already exceeding the total amount of losses accrued in each of the last eight years.
The latest edition of the Cost of a Data Breach Report — now in its 17th year — reveals costlier and higher-impact data breaches than ever before. As outlined by the study, the global average cost of a data breach has reached an all-time high of $4.35 million for surveyed organizations. With breach costs increasing nearly 13% over the last two years of the report, the findings suggest these incidents may also be contributing to rising costs of goods and services. In fact, 60% of studied organizations raised their product or services prices due to the breach, when the cost of goods is already soaring worldwide amid inflation and supply chain issues.
Notably, the report ranks the Middle East among the top five countries and regions with the highest average cost of a data breach. As per the study, the average total cost of a data breach in the Middle East amounted to SAR 28 million in 2022, the region being second only to the United States on the list. The report also spotlights the industries across the Middle East that have suffered the highest per-record costs: the financial (SAR 1,039), health (SAR 991) and energy (SAR 950) sectors took first, second and third spot, respectively.
Fahad Alanazi, IBM Saudi General Manager, said: “Today, more so than ever, in an increasingly connected and digital age, cybersecurity is of the utmost importance. It is essential to safeguard businesses and privacy. As the digital economy continues to evolve, enhanced security will be the marker of a modern, world class digital ecosystem.”
He continued: “At IBM, we take great pride in enabling the people, businesses and communities we serve to fulfil their potential by empowering them with state-of-the-art services and support. Our findings reiterate just how important it is for us, as a technology leader, to continue pioneering solutions that will help the Kingdom distinguish itself as the tech capital of the region.”
The persistence of cyberattacks is also shedding light on the “haunting effect” data breaches are having on businesses, with the IBM report finding that 83% of studied organizations have experienced more than one data breach in their lifetime. The after-effects of breaches also linger long after they occur, as nearly 50% of breach costs are incurred more than a year after the breach.
The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organizations globally between March 2021 and March 2022. The research, which was sponsored and analyzed by IBM Security, was conducted by the Ponemon Institute.
Some of the key global findings in the 2022 IBM report include:
Critical Infrastructure Lags in Zero Trust – Almost 80% of critical infrastructure organizations studied don’t adopt zero trust strategies, seeing average breach costs rise to $5.4 million – a $1.17 million increase compared to those that do. Meanwhile, 28% of breaches among these organizations were ransomware or destructive attacks.
It Doesn’t Pay to Pay – Ransomware victims in the study that opted to pay threat actors’ ransom demands saw only $610,000 less in average breach costs compared to those that chose not to pay – not including the cost of the ransom. Factoring in the high cost of ransom payments, the financial toll may rise even higher, suggesting that simply paying the ransom may not be an effective strategy.
Security Immaturity in Clouds – Forty-three percent of studied organizations are in the early stages of applying security practices across their cloud environments, or have not started at all, incurring over $660,000 more on average in breach costs than studied organizations with mature security across their cloud environments.
Security AI and Automation Leads as Multi-Million Dollar Cost Saver – Participating organizations fully deploying security AI and automation incurred $3.05 million less on average in breach costs compared to studied organizations that have not deployed the technology – the biggest cost saver observed in the study.
“Businesses need to put their security defenses on the offense and beat attackers to the punch. It’s time to stop the adversary from achieving their objectives and start to minimize the impact of attacks. The more businesses try to perfect their perimeter instead of investing in detection and response, the more breaches can fuel cost of living increases.” said Charles Henderson, Global Head of IBM Security X-Force. “This report shows that the right strategies coupled with the right technologies can help make all the difference when businesses are attacked.”
Concerns over critical infrastructure targeting appear to be increasing globally over the past year, with many governments’ cybersecurity agencies urging vigilance against disruptive attacks. In fact, IBM’s report reveals that ransomware and destructive attacks represented 28% of breaches amongst critical infrastructure organizations studied, highlighting how threat actors are seeking to fracture the global supply chains that rely on these organizations. This includes financial services, industrial, transportation and healthcare companies amongst others.
Despite the call for caution, and a year after the Biden Administration issued a cybersecurity executive order that centers around the importance of adopting a zero trust approach to strengthen the nation’s cybersecurity, only 21% of critical infrastructure organizations studied adopt a zero trust security model, according to the report. Add to that, 17% of breaches at critical infrastructure organizations were caused due to a business partner being initially compromised, highlighting the security risks that over-trusting environments pose.
Businesses that Pay the Ransom Aren’t Getting a “Bargain”
According to the 2022 IBM report, businesses that paid threat actors’ ransom demands saw $610,000 less in average breach costs compared to those that chose not to pay – not including the ransom amount paid. However, when accounting for the average ransom payment, which according to Sophos reached $812,000 in 2021, businesses that opt to pay the ransom could net higher total costs – all while inadvertently funding future ransomware attacks with capital that could be allocated to remediation and recovery efforts, and risking potential federal offenses.
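The "no bargain" arithmetic is easy to check with the figures cited above: paying avoids roughly $610,000 in breach costs, but the average ransom payment was $812,000, so a paying organization still comes out about $202,000 behind on average, before any legal exposure or the cost of funding future attacks:

```python
# Rough net effect of paying a ransom, using the figures cited in the
# report: IBM's $610K average breach-cost reduction for payers, and
# Sophos's $812K average ransom payment for 2021.
avg_breach_cost_savings = 610_000  # breach costs avoided by paying
avg_ransom_payment = 812_000       # average ransom actually paid

net_extra_cost = avg_ransom_payment - avg_breach_cost_savings
print(f"Paying nets out ${net_extra_cost:,} worse on average")  # $202,000
```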
The persistence of ransomware, despite significant global efforts to impede it, is fueled by the industrialization of cybercrime. IBM Security X-Force found that the duration of studied enterprise ransomware attacks dropped 94% over the past three years – from over two months to just under four days. These exponentially shorter attack lifecycles can prompt higher-impact attacks, as cybersecurity incident responders are left with very short windows of opportunity to detect and contain attacks. With “time to ransom” dropping to a matter of hours, it’s essential that businesses prioritize rigorous testing of incident response (IR) playbooks ahead of time. Yet the report states that as many as 37% of organizations studied that have incident response plans don’t test them regularly.
Hybrid Cloud Advantage
The report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organizations studied. Averaging $3.8 million in breach costs, businesses that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model, which experienced $5.02 million and $4.24 million on average respectively. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.
The report highlights that 45% of studied breaches occurred in the cloud, emphasizing the importance of cloud security. However, a significant 43% of reporting organizations stated they are just in the early stages or have not started implementing security practices to protect their cloud environments, observing higher breach costs. Businesses studied that did not implement security practices across their cloud environments required an average of 108 more days to identify and contain a data breach than those consistently applying security practices across all their domains.
Additional findings in the 2022 IBM report include:
Phishing Becomes Costliest Breach Cause – While compromised credentials continued to reign as the most common cause of a breach (19%), phishing was the second (16%) and the costliest cause, leading to $4.91 million in average breach costs for responding organizations.
Healthcare Breach Costs Hit Double Digits for First Time Ever – For the 12th year in a row, healthcare participants saw the costliest breaches amongst industries, with average breach costs in healthcare increasing by nearly $1 million to reach a record high of $10.1 million.
Insufficient Security Staffing – Sixty-two percent of studied organizations stated they are not sufficiently staffed to meet their security needs, averaging $550,000 more in breach costs than those that state they are sufficiently staffed.
To download a copy of the 2022 Cost of a Data Breach Report, please visit: https://www.ibm.com/security/data-breach.
Read more about the report’s top findings in this IBM Security Intelligence blog.
Sign up for the 2022 IBM Security Cost of a Data Breach webinar on Wednesday, August 3, 2022, at 11:00 a.m. ET here.
Connect with the IBM Security X-Force team for a personalized review of the findings: https://ibm.biz/book-a-consult.
About IBM Security
IBM Security offers one of the most advanced and integrated portfolios of enterprise security products and services. The portfolio, supported by world-renowned IBM Security X-Force® research, enables organizations to effectively manage risk and defend against emerging threats. IBM operates one of the world's broadest security research, development, and delivery organizations, monitors 150 billion+ security events per day in more than 130 countries, and has been granted more than 10,000 security patents worldwide. For more information, please visit www.ibm.com/security, follow @IBMSecurity on Twitter or visit the IBM Security Intelligence blog.
Wed, 27 Jul 2022 22:20:00 -0500 | https://www.zawya.com/en/press-release/research-and-studies/ibm-report-middle-eastern-consumers-pay-the-price-as-regional-data-breach-costs-reach-all-time-high-q1wbuec0

Killexams : SVVSD embraces early college P-TECH program

In Colorado, St. Vrain Valley was among the first school districts chosen by the state to offer a P-TECH program after the Legislature passed a bill to provide funding — and the school ...

Sat, 30 Jul 2022 15:39:40 -0500 | https://www.msn.com/en-us/money/careersandeducation/svvsd-embraces-early-college-p-tech-program/ar-AA1098v3

Killexams : TIME Innovative Teacher Helps Students with Disabilities Tap Their Superpowers
When Joann Blumenfeld, an educator with the Wake County Public School System, looks at her students, she doesn’t see disabilities — she sees superpowers.
“We need to flip the mindset that these kids have disabilities,” says Blumenfeld, a special education teacher and founder of Catalyst, a STEM program for high school students with disabilities. “They have different superpowers.”
Blumenfeld was recognized as a TIME Innovative Teacher of the Year, presented by Verizon, for her work getting students with disabilities into high-paying STEM careers. Through Catalyst, which she started in 2014, Blumenfeld helps her students learn social and technical skills while changing the mindset around what they’re able to achieve.
Teaching Students with Disabilities STEM Career Skills
The lack of support in public education for students with disabilities inspired Blumenfeld to create Catalyst. The program operates within North Carolina State University and is built on multiple prongs of skill building.
“My favorite part of the program was actually getting introduced to different types of jobs and careers,” says Jenkins, a Catalyst student. He now has a scholarship to study computer science with the goal of working in cybersecurity.
Students also learn workforce-readiness skills, such as budgeting and resume writing, and meet with professionals to practice their professional interpersonal skills.
“We have companies like IBM come in and interview the kids in mock interviews, so they get practice in that,” says Blumenfeld.
Finally, students get STEM internships as part of the program. This gives them first-hand experience in the field and allows them to apply for college with two or three internships on their resumes.
“We can’t just supply kids accommodations. We need innovative programs like this, that connect all the prongs,” Blumenfeld says.
Changing the Minds of Students and Professionals Through Awareness
Beyond introducing students with disabilities to STEM careers, Blumenfeld’s other goal through the Catalyst program is to change people’s thinking.
“If we want an innovative STEM workforce, we need to have a place at the table for everybody.”
– Joann Blumenfeld, Educator, Wake County Public School System
She hopes to open students’ eyes, so they can see all that they’re capable of achieving, and she wants to show professionals in the field that these students have superpowers.
“Breaking those barriers is really important on many levels,” Blumenfeld says. “Kids with autism tend to have great visual acuity, so that’s great for a coder. People who are hearing impaired, NASA wants them because they don’t get motion sickness. People with ADHD are great scanners, and they’re really good at seeing the whole picture. They make great people in emergency rooms and great teachers.”
“If we want an innovative STEM workforce, we need to have a place at the table for everybody,” she adds.
Blumenfeld encourages other districts to follow her lead, and she recommends finding a local college or university with which to partner for a good base of professors. She also suggests writing grants.
“We started small. I got a $5,000 grant. You can build up your base, and the more you build up, the more kids you can help,” she says.
Tue, 09 Aug 2022 03:51:00 -0500 | Rebecca Torchia | https://edtechmagazine.com/k12/article/2022/08/time-innovative-teacher-helps-students-disabilities-tap-their-superpowers

Killexams : IBM Report: South African data breach costs reach all-time high
IBM Security today released the annual Cost of a Data Breach Report, revealing costlier and higher-impact data breaches than ever before, with the average cost of a data breach in South Africa reaching an all-time high of R49.25 million for surveyed organisations. With breach costs increasing nearly 20% over the last two years of the report, the findings suggest that security incidents became more costly and harder to contain compared to the year prior.
The 2022 report revealed that the average time to detect and contain a data breach was at its highest in seven years for organisations in South Africa – taking 247 days (187 to detect, 60 to contain). Companies that contained a breach in under 200 days saved almost R12 million – while breaches cost organisations R2,650 per lost or stolen record on average.
The 2022 Cost of a Data Breach Report is based on in-depth analysis of real-world data breaches experienced by 550 organisations globally between March 2021 and March 2022. The research, which was sponsored and analysed by IBM Security, was conducted by the Ponemon Institute.
“As this year’s report reveals, the right strategies coupled with the right technologies can make all the difference when organisations are attacked. Businesses today need to continuously look into solutions that reduce complexity and speed up response to cyber threats across the hybrid cloud environment, minimising the impact of attacks,” says Ria Pinto, General Manager and Technology Leader, IBM South Africa.
Some of the key findings in the 2022 IBM report include:
Security Immaturity in Clouds – Among organisations studied, those with mature security across their cloud environments saw breach costs R4 million lower than those in the midstage that had applied many practices across their organisation.
Incident Response Testing is a Multi-Million Rand Cost Saver – Organisations with an Incident Response (IR) team saved over R3.4 million, while those that extensively tested their IR plan lowered the cost of a breach by over R2.6 million, the study revealed. The study also found that organisations which deployed security AI or analytics incurred over R2 million less on average in breach costs compared to studied organisations that have not deployed either technology– making them the top mitigating factors shown to reduce the cost of a breach.
Cloud Misconfiguration, Malicious Insider Attacks and Stolen Credentials are Costliest Breach Causes – Cloud misconfiguration reigned as the costliest cause of a breach (R58.6 million), malicious insider attacks came in second (R55 million) and stolen credentials came in third, leading to R53 million in average breach costs for responding organisations.
Financial Services organisations experienced the Highest Breach Costs – Financial participants saw the costliest breaches amongst industries with average breach costs reaching a high of R4.9 million per record. This was followed by the industrial sector with losses per record reaching R4.7 million.
Hybrid Cloud Advantage
Globally, the report also showcased hybrid cloud environments as the most prevalent (45%) infrastructure amongst organisations studied. Global findings revealed that organisations that adopted a hybrid cloud model observed lower breach costs compared to businesses with a solely public or private cloud model. In fact, hybrid cloud adopters studied were able to identify and contain data breaches 15 days faster on average than the global average of 277 days for participants.
The report highlights that 45% of studied breaches globally occurred in the cloud, emphasising the importance of cloud security.
South African businesses studied that had not started to deploy zero trust security practices across their cloud environments suffered losses averaging R56 million. Those in the mature stages of deployment decreased this cost significantly – recording R20 million savings as their total cost of a data breach was found to be R36 million.
The study revealed that more businesses are implementing security practices to protect their cloud environments, lowering breach costs with 44% of reporting organisations stating their zero-trust deployment is in the mature stage and another 42% revealing they are in the midstage.
Thu, 28 Jul 2022 00:16:00 -0500 | https://www.biztechafrica.com/article/ibm-report-south-african-data-breach-costs-reach-a/17008/

Killexams : Girls in Tech Celebrates 15 Years of Success in Narrowing the Gender Gap
Press release content from PR Newswire. The AP news staff was not involved in its creation.
100,000+ members in 40+ countries mark great strides since 2007, and the immense challenges that lay ahead in the fight to end the gender gap
Girls in Tech to mark the milestone at its annual conference on September 7th with keynote speeches from senior executives at Accenture, Edward Jones, Gap, IBM, McKesson, Okta, TIAA, Trend Micro, and Verizon
NASHVILLE, Tenn., Aug. 3, 2022 /PRNewswire/ -- Girls in Tech, a global nonprofit working to erase the gender gap in tech, is celebrating its 15th anniversary at its annual conference on September 7th in Nashville – a forum for executives from across the globe to gather and discuss industry trends, tricks of the trade, setbacks, triumphs, and life experiences uniquely tailored to women in technology.
Founded in 2007 by CEO Adriana Gascoigne, Girls in Tech has grown into a global leader in the gender equality movement with 100,000+ women and allies in 56 cities, 42 countries and 6 continents. Among the organization’s biggest achievements:
15,000+ entrepreneurs funded, mentored and supported through the Startup Challenge, the organization’s signature entrepreneurship pitch competition;
100,000+ participants in the Girls in Tech Hackathon series, solving local and global problems;
35,000+ participants in coding, design and startup bootcamps;
the recently launched “Next Generation of Public Sector and Service Leaders,” a program to provide education and raise awareness of career opportunities in federal, state, and local governments.
The pandemic has disproportionately impacted women in the workplace, and many of the hard-fought gains in gender equality from the last 15 years are under threat. According to last year’s Girls in Tech study “The Tech Workplace for Women in the Pandemic,” 79% of women who have children in the household report feeling burned out, and more than one in four women report being sexually harassed in the workplace. The situation is even worse around the globe, with the World Economic Forum’s Global Gender Gap Report finding that the pandemic has set women back so significantly that the gender gap isn’t likely to be closed for more than 135 years.
The Girls in Tech community starts a new chapter this September 7th at its annual conference, featuring a dynamic selection of speakers with inspiring stories and practical insights to share. Keynote speakers include:
Jill Anderson, Principal, Technology Software Infrastructure at Edward Jones
Alvina Antar, Chief Information Officer at Okta
Debika Bhattacharya, Senior Vice President, 5G & Enterprise Solutions at Verizon
Latrise Brissett, Managing Director, Global IT, Business Operations at Accenture
Ruth Davis, Director of Call for Code, Worldwide Ecosystems at IBM
Wendy Harrington, Chief Data & Artificial Intelligence Officer at TIAA
Maria Lensing, Senior Vice President & Chief Technology Officer of Infrastructure Engineering & Operations at McKesson
Louise McEvoy, Vice President, US Channel Sales at Trend Micro
Heather Mickman, Chief Information Officer at Gap Inc.
“It’s amazing to look back on the progress we’ve made in 15 years and the 100,000+ women and allies who are united for change, but the fight to end the gender gap in tech and beyond isn’t going to get any easier,” said Adriana Gascoigne, Founder and CEO, Girls in Tech. “This year’s Girls in Tech Conference is going to feature some of the boldest and most successful women in technology delivering their unique visions for the road ahead.”
The Girls in Tech Conference is sponsored by AWS, Banyan Labs, CDW, Comcast, Gap Inc, Guideware, Infoblox, Marsh, McKesson, McKinsey & Company, Nike, Okta, Pega, Trend Micro, Unstoppable Domains, and Verizon.
A full agenda for the Girls in Tech Conference can be found here.
About Girls in Tech
Girls in Tech is a global non-profit that works to erase the gender gap in tech. Today, every industry is a tech industry, with a need for people of all skills and backgrounds. We offer education and experiences to help people discover their unique superpower and hone it. We aim to see every person accepted, confident, and valued in tech—just as they are.