IBM Research Rolls Out A Comprehensive AI And Platform-Based Edge Research Strategy Anchored By Enterprise Use Cases And Partnerships

I recently met with Dr. Nick Fuller, Vice President, Distributed Cloud, at IBM Research for a discussion about IBM’s long-range plans and strategy for artificial intelligence and machine learning at the edge.

Dr. Fuller is responsible for providing AI and platform–based innovation for enterprise digital transformation spanning edge computing and distributed cloud management. He is an IBM Master Inventor with over 75 patents and co-author of 75 technical publications. Dr. Fuller obtained his Bachelor of Science in Physics and Math from Morehouse College and his PhD in Applied Physics from Columbia University.

Edge In, not Cloud Out

In general, Dr. Fuller told me that IBM is focused on developing an "edge in" position versus a "cloud out" position with data, AI, and Kubernetes-based platform technologies to scale hub and spoke deployments of edge applications.

A hub plays the role of a central control plane used for orchestrating the deployment and management of edge applications in a number of connected spoke locations such as a factory floor or a retail branch, where data is generated or locally aggregated for processing.

“Cloud out” refers to the paradigm where cloud service providers are extending their cloud architecture out to edge locations. In contrast, “edge in” refers to a provider-agnostic architecture that is cloud-independent and treats the data-plane as a first-class citizen.

IBM's overall architectural principles are scalability, repeatability, and full-stack solution management, with everything managed through a single unified control plane.

IBM’s Red Hat platform and infrastructure strategy anchors the application stack with a unified, scalable, and managed OpenShift-based control plane equipped with a high-performance storage appliance and self-healing system capabilities (inclusive of semi-autonomous operations).

IBM’s strategy also includes several in-progress platform-level technologies for scalable data, AI/ML runtimes, accelerator libraries for Day-2 AI operations, and scalability for the enterprise.

It is important to mention that IBM is designing its edge platforms with labor costs and the technical workforce in mind. Data scientists with PhDs are in high demand, making them difficult to find and expensive to hire. IBM is designing its edge system capabilities and processes so that domain experts, rather than PhDs, can deploy new AI models and manage Day-2 operations.

Why edge is important

Advances in computing and storage have made it possible for AI to process mountains of accumulated data to provide solutions. By bringing AI closer to the source of data, edge computing is faster and more efficient than the cloud. While cloud data accounts for 60% of the world's data today, vast amounts of new data are being created at the edge, including from industrial applications, traffic cameras, and order management systems, all of which can be processed at the edge in a fast and timely manner.

Public cloud and edge computing differ in capacity, technology, and management. An advantage of edge is that data is processed and analyzed at or near its collection point. In the case of cloud, data must be transferred from a local device into the cloud for analytics and then transferred back to the edge again. Moving data through the network consumes capacity and adds latency to the process. It's easy to see why executing a transaction at the edge reduces latency and eliminates unnecessary load on the network.
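
To make that tradeoff concrete, here is a minimal back-of-envelope sketch; every number in it (payload size, uplink bandwidth, round-trip time, inference times) is an illustrative assumption, not a measured figure.

```python
# Back-of-envelope comparison of a cloud round trip vs. local edge processing.
# All numbers below are illustrative assumptions, not measured values.

PAYLOAD_MB = 5.0          # e.g., a batch of camera frames
UPLINK_MBPS = 50.0        # WAN uplink from the site to the cloud
WAN_RTT_S = 0.08          # network round-trip time to the cloud region
CLOUD_INFER_S = 0.02      # model inference time in the cloud
EDGE_INFER_S = 0.05       # slower local hardware, but no transfer

cloud_total = PAYLOAD_MB * 8 / UPLINK_MBPS + WAN_RTT_S + CLOUD_INFER_S
edge_total = EDGE_INFER_S

print(f"cloud round trip: {cloud_total:.2f}s, edge local: {edge_total:.2f}s")
# cloud round trip: 0.90s, edge local: 0.05s (transfer time dominates)
```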

Increased privacy is another benefit of processing data at the edge. Analyzing data where it originates limits the risk of a security breach. Most of the communication between the edge and the cloud is then confined to such things as reporting, data summaries, and AI models, without ever exposing the raw data.

IBM at the Edge

In our discussion, Dr. Fuller provided a few examples to illustrate how IBM plans to provide new and seamless edge solutions for existing enterprise problems.

Example #1 – McDonald’s drive-thru

Dr. Fuller's first example centered on the quick service restaurant (QSR) problem of drive-thru order taking. Last year, IBM acquired an automated order-taking system from McDonald's. As part of the acquisition, IBM and McDonald's established a partnership to perfect voice ordering methods using AI. Drive-thru orders are a significant percentage of total orders for McDonald's and other QSR chains.

McDonald's and other QSR restaurants would like every order to be processed as quickly and accurately as possible. For that reason, McDonald's conducted trials at ten Chicago restaurants using an edge-based AI ordering system with NLP (Natural Language Processing) to convert spoken orders into a digital format. It was found that AI had the potential to reduce ordering errors and processing time significantly. Since McDonald's sells almost 7 million hamburgers daily, shaving a minute or two off each order represents a significant opportunity to address labor shortages and increase customer satisfaction.
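
As a rough illustration of the "spoken order into a digital format" step (and emphatically not IBM's or McDonald's actual system), a toy keyword parser might look like this; a production system would use a trained NLP model rather than regular expressions.

```python
import re

# Toy intent/slot extraction for a spoken drive-thru order. The menu,
# number words, and grammar are illustrative assumptions.
MENU = {"hamburger": 2.49, "fries": 1.89, "shake": 2.99}
NUMBERS = {"a": 1, "one": 1, "two": 2, "three": 3}

def parse_order(utterance: str) -> list[tuple[str, int]]:
    order = []
    for qty, item in re.findall(r"(\w+)\s+(hamburgers?|fries|shakes?)",
                                utterance.lower()):
        name = "fries" if item == "fries" else item.rstrip("s")
        order.append((name, NUMBERS.get(qty, 1)))
    return order

print(parse_order("I'll take two hamburgers and a shake"))
# [('hamburger', 2), ('shake', 1)]
```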

Example #2 – Boston Dynamics and Spot the agile mobile robot

According to an earlier IBM survey, many manufacturers have already implemented AI-driven robotics with autonomous decision-making capability. The study also indicated that over 80 percent of companies believe AI can help improve future business operations. However, some companies expressed concern about the limited mobility of edge devices and sensors.

To develop a mobile edge solution, IBM teamed up with Boston Dynamics. The partnership created an agile mobile robot using IBM Research and IBM Sustainability Software AI technology. The device can analyze visual sensor readings in hazardous and challenging industrial environments such as manufacturing plants, warehouses, electrical grids, and waste treatment plants. The value proposition that Boston Dynamics brought to the partnership was Spot, the agile mobile robot, a walking, sensing, and actuation platform. Like all edge applications, the robot's wireless mobility uses self-contained AI/ML that doesn't require access to cloud data. It uses cameras to read analog devices, visually monitor fire extinguishers, and conduct visual inspections of human workers to determine whether required safety equipment is being worn.

IBM was able to show up to a 10X speedup by automating some manual tasks, such as converting the detection of a problem into an immediate work order in IBM Maximo to correct it. A fast automated response was not only more efficient, but it also improved the safety posture and risk management for these facilities. Similarly, some factories need to thermally monitor equipment to identify any unexpected hot spots that may show up over time, indicative of a potential failure.

IBM is working with National Grid, an energy company, to develop a mobile solution using Spot, the agile mobile robot, for image analysis of transformers and thermal connectors. In trials, Spot monitored connectors on both flat and 3D surfaces. IBM was able to show that Spot could detect excessive heat build-up in small connectors, potentially avoiding unsafe conditions or costly outages. This AI/ML edge application can produce faster response times when an issue is detected, which is why IBM believes significant gains are possible by automating the entire process.
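
Tying these threads together, here is a hedged sketch of how an edge detection could become a Maximo work order. The endpoint URL, authentication header, and field names below are placeholders; consult the IBM Maximo REST API documentation for the real schema.

```python
import requests

# Sketch: turn an edge detection event into a work order.
MAXIMO_URL = "https://maximo.example.com/api/os/mxwo"   # hypothetical endpoint
API_KEY = "REPLACE_ME"                                   # placeholder credential

def raise_work_order(asset_id: str, finding: str, severity: int) -> None:
    payload = {
        "assetnum": asset_id,                # illustrative field names
        "description": f"Edge AI detection: {finding}",
        "wopriority": severity,
    }
    resp = requests.post(MAXIMO_URL, json=payload,
                         headers={"apikey": API_KEY}, timeout=10)
    resp.raise_for_status()

# e.g., Spot's thermal camera flags a hot connector:
# raise_work_order("XFMR-104", "connector hot spot 96C", severity=1)
```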

IBM market opportunities

Drive-thru orders and mobile robots are just a few examples of the millions of potential AI applications that exist at the edge and are driven by several billion connected devices.

Edge computing is an essential part of enterprise digital transformation. Enterprises seek ways to demonstrate the feasibility of solving business problems using AI/ML and analytics at the edge. However, once a proof of concept has been successfully demonstrated, it is a common problem for a company to struggle with scalability, data governance, and full-stack solution management.

Challenges with scaling

“Determining entry points for AI at the edge is not the difficult part,” Dr. Fuller said. “Scale is the real issue.”

Scaling edge models is complicated because there are so many edge locations with large amounts of diverse content and a high device density. Because large amounts of data are required for training, data gravity is a potential problem. Further, in many scenarios, vast amounts of data are generated quickly, leading to potential data storage and orchestration challenges. AI models are also rarely "finished." Monitoring and retraining of models are necessary to keep up with changes in the environment.
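
A minimal sketch of the monitoring half of that loop, assuming a scalar feature and a Kolmogorov-Smirnov test as the drift signal; the window size and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ks_2samp

# Minimal drift monitor: compare a feature's recent distribution at the
# edge against its training-time distribution; flag divergence as a
# retraining trigger.

rng = np.random.default_rng(0)
train_sample = rng.normal(0.0, 1.0, 5_000)   # captured at training time
live_window = rng.normal(0.4, 1.1, 1_000)    # recent edge observations

stat, p_value = ks_2samp(train_sample, live_window)
if p_value < 0.01:
    print(f"drift detected (KS={stat:.3f}); schedule retraining")
```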

Through IBM Research, IBM is addressing the many challenges of building an all-encompassing edge architecture and horizontally scalable data and AI technologies. IBM has a wealth of edge capabilities and an architecture to create the appropriate platform for each application.

IBM AI entry points at the edge

IBM sees Edge Computing as a $200 billion market by 2025. Dr. Fuller and his organization have identified four key market entry points for developing and expanding IBM’s edge compute strategy. In order of size, IBM believes its priority edge markets to be intelligent factories (Industry 4.0), telcos, retail automation, and connected vehicles.

IBM and its Red Hat portfolio already have an established presence in each market segment, particularly in intelligent operations and telco. Red Hat is also active in the connected vehicles space.

Industry 4.0

There have been three prior industrial revolutions since the 1700s; the current, in-progress fourth revolution, Industry 4.0, centers on digital transformation.

Manufacturing is the fastest growing and the largest of IBM's four entry markets. In this segment, AI at the edge can improve quality control, production optimization, asset management, and supply chain logistics. IBM believes there are opportunities to achieve a 4x speedup in implementing edge-based AI solutions for manufacturing operations.

For its Industry 4.0 use case development, IBM, through product, development, research and consulting teams, is working with a major automotive OEM. The partnership has established the following joint objectives:

  • Increase automation and scalability across dozens of plants using hundreds of AI/ML models. This client has already seen value in applying AI/ML models for manufacturing applications. IBM Research is helping with retraining models and implementing new ones in an edge environment to help scale even more efficiently. Edge offers faster inference and low latency, allowing AI to be deployed in a wider variety of manufacturing operations requiring instant solutions.
  • Dramatically reduce the time required to onboard new models. This will allow training and inference to be done faster and allow large models to be deployed much more quickly. The quicker an AI model can be deployed in production, the quicker the time-to-value and the return on investment (ROI).
  • Accelerate deployment of new inspections by reducing the labeling effort and iterations needed to produce a production-ready model via data summarization. Selecting small data sets for annotation otherwise means manually examining thousands of images, a time-consuming process that results in redundant data being labeled. Using ML-based automation for data summarization will accelerate the process and produce better model performance (see the sketch after this list).
  • Enable Day-2 AI operations to help with data lifecycle automation, governance, and model creation; reduce production errors; and provide detection of out-of-distribution data to help determine whether a model's inference is accurate. IBM believes this will allow models to be created faster without data scientists.
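
Here is a minimal sketch of the data summarization idea referenced in the list above, assuming image embeddings are already available. Clustering with k-means and labeling only the most central example per cluster is one common approach, not necessarily the one IBM uses.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch: cluster image embeddings and send only the most central example
# of each cluster for human annotation, instead of labeling thousands of
# near-duplicate frames. Embeddings here are random stand-ins.

rng = np.random.default_rng(1)
embeddings = rng.normal(size=(10_000, 128))

k = 200                                   # labeling budget (assumed)
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(embeddings)

to_label = [
    int(np.argmin(np.linalg.norm(embeddings - center, axis=1)))
    for center in km.cluster_centers_
]
print(f"annotate {len(to_label)} of {len(embeddings)} images")
```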

Maximo Application Suite

IBM's Maximo Application Suite plays an important part in implementing large manufacturers' current and future IBM edge solutions. Maximo is an integrated public or private cloud platform that uses AI, IoT, and analytics to optimize performance, extend asset lifecycles, and reduce operational downtime and costs. IBM is working with several large manufacturing clients currently using Maximo to develop edge use cases, and even uses it within its own manufacturing operations.

IBM has research underway to develop a more efficient method of handling life cycle management of large models that require immense amounts of data. Day 2 AI operations tasks can sometimes be more complex than initial model training, deployment, and scaling. Retraining at the edge is difficult because resources are typically limited.

Once a model is trained and deployed, it is important to monitor it for drift caused by changes in data distributions or anything that might cause a model to deviate from original requirements. Inaccuracies can adversely affect model ROI.

Day-2 AI Operations (retraining and scaling)

Day-2 AI operations consist of continual updates to AI models and applications to keep up with changes in data distributions, changes in the environment, a drop in model performance, availability of new data, and/or new regulations.

IBM recognizes the advantages of performing Day-2 AI Operations, which includes scaling and retraining at the edge. It appears that IBM is the only company with an architecture equipped to effectively handle Day-2 AI operations. That is a significant competitive advantage for IBM.

A company using an architecture that requires data to be moved from the edge back into the cloud for Day-2 related work will be unable to support many factory AI/ML applications because of the sheer number of AI/ML models to support (hundreds to thousands).

“There is a huge proliferation of data at the edge that exists in multiple spokes,” Dr. Fuller said. "However, all that data isn’t needed to retrain a model. It is possible to cluster data into groups and then use sampling techniques to retrain the model. There is much value in federated learning from our point of view.”

Federated learning is a promising training solution being researched by IBM and others. It preserves privacy by using a collaboration of edge devices to train models without sharing the data with other entities. It is a good framework to use when resources are limited.
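
A minimal federated averaging (FedAvg) sketch follows; the linear model and synthetic data for three spokes are illustrative, and this shows the weights-move-data-stays idea, not IBM's federated learning implementation.

```python
import numpy as np

# Minimal FedAvg: each spoke trains on its own local data and shares only
# model weights with the hub; raw data never leaves the site.

rng = np.random.default_rng(2)
true_w = np.array([1.5, -2.0])

def make_spoke(n: int = 100):
    X = rng.normal(size=(n, 2))
    return X, X @ true_w + rng.normal(0, 0.1, n)

spokes = [make_spoke() for _ in range(3)]

def local_step(w: np.ndarray, X: np.ndarray, y: np.ndarray, lr: float = 0.1):
    grad = 2 * X.T @ (X @ w - y) / len(y)   # least-squares gradient
    return w - lr * grad

w = np.zeros(2)
for _ in range(50):                          # communication rounds
    w = np.mean([local_step(w, X, y) for X, y in spokes], axis=0)
print(w)                                     # converges toward true_w
```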

Dealing with limited resources at the edge is a challenge. IBM’s edge architecture accommodates the need to ensure resource budgets for AI applications are met, especially when deploying multiple applications and multiple models across edge locations. For that reason, IBM developed a method to deploy data and AI applications to scale Day-2 AI operations utilizing hub and spokes.

Contrast the status quo method of performing Day-2 operations, with centralized applications and a centralized data plane, against the more efficient managed hub and spoke method with distributed applications and a distributed data plane. The hub allows it all to be managed from a single pane of glass.

Data Fabric Extensions to Hub and Spokes

IBM uses hub and spoke as a model to extend its data fabric. The model should not be thought of in the context of a traditional hub and spoke. IBM's hub provides centralized capabilities to manage clusters and create multiple hubs that can be aggregated to a higher level. This architecture has four important data management capabilities.

  1. First, models running in unattended environments must be monitored. From an operational standpoint, it is critical to detect when a model's effectiveness has significantly degraded and whether corrective action is needed.
  2. Secondly, in a hub and spoke model, data is being generated and collected in many locations, creating a need for data lifecycle management. Working with large enterprise clients, IBM is building unique capabilities to manage the data plane across the hub and spoke estate, optimized to meet data lifecycle, regulatory, and compliance as well as local resource requirements. Automation determines which input data should be selected and labeled for retraining purposes and used to further improve the model. Atypical data judged worthy of human attention is also identified.
  3. The third issue relates to AI pipeline compression and adaptation. As mentioned earlier, edge resources are limited and highly heterogeneous. While a cloud-based model might have a few hundred million parameters or more, edge models can't afford such resource extravagance. To reduce the edge compute footprint, model compression can reduce the number of parameters, for example from several hundred million to a few million (a toy pruning sketch follows this list).
  4. Lastly, suppose a scenario exists where data is produced at multiple spokes but cannot leave those spokes for compliance reasons. In that case, IBM Federated Learning allows learning across heterogeneous data in multiple spokes. Users can discover, curate, categorize and share data assets, data sets, analytical models, and their relationships with other organization members.
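
The toy pruning sketch referenced in item 3 above: magnitude pruning zeroes out the smallest weights to shrink a model's edge footprint. Real compression pipelines also combine quantization and distillation; this only illustrates the idea.

```python
import numpy as np

# Keep only the largest-magnitude weights of a layer; zero the rest.

def prune(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
    k = int(weights.size * keep_ratio)
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(3)
layer = rng.normal(size=(512, 512))       # stand-in for a weight matrix
sparse = prune(layer, keep_ratio=0.1)     # keep 10% of parameters
print(f"nonzero: {np.count_nonzero(sparse)} of {layer.size}")
```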

In addition to AI deployments, the hub and spoke architecture and the previously mentioned capabilities can be employed more generally to tackle challenges faced by many enterprises in consistently managing an abundance of devices within and across their enterprise locations. Management of the software delivery lifecycle or addressing security vulnerabilities across a vast estate are a case in point.

Multicloud and Edge platform

In the context of its strategy, IBM sees edge and distributed cloud as an extension of its hybrid cloud platform built around Red Hat OpenShift. One of the newer and more useful options created by the Red Hat development team is Single Node OpenShift (SNO), a compact version of OpenShift that fits on a single server. It is suited to locations that still require servers but call for a single-node, non-clustered deployment.

For smaller footprints such as industrial PCs or computer vision boards (for example, NVIDIA Jetson Xavier), Red Hat is working on a project that builds an even smaller version of OpenShift, called MicroShift, that provides full application deployment and Kubernetes management capabilities. It is packaged so that it can be used for edge-device-type deployments.

Overall, IBM and Red Hat have developed a full complement of options to address a large spectrum of deployments across different edge locations and footprints, ranging from containers to management of full-blown Kubernetes applications from MicroShift to OpenShift and IBM Edge Application Manager.

Much is still in the research stage. IBM's objective is to achieve greater consistency in how locations and application lifecycles are managed.

First, Red Hat plans to introduce hierarchical layers of management with Red Hat Advanced Cluster Management (RHACM), to scale by two to three orders of magnitude the number of edge locations managed by this product. Additionally, securing edge locations is a major focus. Red Hat is continuously expanding platform security features, for example by recently including Integrity Measurement Architecture in Red Hat Enterprise Linux, or by adding Integrity Shield to protect policies in Red Hat Advanced Cluster Management (RHACM).

Red Hat is partnering with IBM Research to advance technologies that will permit it to protect platform integrity and the integrity of client workloads through the entire software supply chain. In addition, IBM Research is working with Red Hat on analytic capabilities to identify and remediate vulnerabilities and other security risks in code and configurations.

Telco network intelligence and slice management with AI/ML

Communication service providers (CSPs) such as telcos are key enablers of 5G at the edge. 5G benefits for these providers include:

  • Reduced operating costs
  • Improved efficiency
  • Increased distribution and density
  • Lower latency

The end-to-end 5G network comprises the Radio Access Network (RAN), transport, and core domains. Network slicing in 5G is an architecture that enables multiple virtual and independent end-to-end logical networks with different characteristics such as low latency or high bandwidth, to be supported on the same physical network. This is implemented using cloud-native technology enablers such as software defined networking (SDN), virtualization, and multi-access edge computing. Slicing offers necessary flexibility by allowing the creation of specific applications, unique services, and defined user groups or networks.

An important aspect of enabling AI at the edge requires IBM to provide CSPs with the capability to deploy and manage applications across various enterprise locations, possibly spanning multiple end-to-end network slices, using a single pane of glass.

5G network slicing and slice management

Network slices are an essential part of IBM's edge infrastructure that must be automated, orchestrated and optimized according to 5G standards. IBM’s strategy is to leverage AI/ML to efficiently manage, scale, and optimize the slice quality of service, measured in terms of bandwidth, latency, or other metrics.
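
As a hedged sketch of what slice-level QoS management could look like, the snippet below checks measured KPIs against per-slice targets; the slice names, targets, and suggested actions are all illustrative assumptions.

```python
# Toy slice QoS guardrail: compare measured KPIs against each slice's
# service targets and emit an optimization action.

SLICE_SLOS = {
    "urllc-robotics": {"max_latency_ms": 5.0, "min_mbps": 50.0},
    "embb-video":     {"max_latency_ms": 40.0, "min_mbps": 400.0},
}

def check_slice(name: str, latency_ms: float, throughput_mbps: float) -> str:
    slo = SLICE_SLOS[name]
    if latency_ms > slo["max_latency_ms"]:
        return f"{name}: latency {latency_ms}ms breaches SLO, add edge capacity"
    if throughput_mbps < slo["min_mbps"]:
        return f"{name}: throughput low, rebalance traffic"
    return f"{name}: within SLO"

print(check_slice("urllc-robotics", latency_ms=7.2, throughput_mbps=80.0))
```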

5G and AI/ML at the edge also represent a significant opportunity for CSPs to move beyond traditional cellular services and capture new sources of revenue with new services.

Communications service providers need management and control of 5G network slicing enabled with AI-powered automation.

Dr. Fuller sees a variety of opportunities in this area. "When it comes to applying AI and ML on the network, you can detect things like intrusion detection and malicious actors," he said. "You can also determine the best way to route traffic to an end user. Automating 5G functions that run on the network using IBM network automation software also serves as an entry point.”

In IBM’s current telecom trial, IBM Research is spearheading the development of a range of capabilities targeted for the IBM Cloud Pak for Network Automation product using AI and automation to orchestrate, operate and optimize multivendor network functions and services that include:

  • End-to-end 5G network slice management with planning & design, automation & orchestration, and operations & assurance
  • Network Data and AI Function (NWDAF) that collects data for slice monitoring from 5G Core network functions, performs network analytics, and provides insights to authorized data consumers.
  • Improved operational efficiency and reduced cost

Future leverage of these capabilities by existing IBM Clients that use the Cloud Pak for Network Automation (e.g., DISH) can offer further differentiation for CSPs.

5G radio access

Open radio access networks (O-RANs) are expected to significantly impact telco 5G wireless edge applications by allowing a greater variety of units to access the system. The O-RAN concept separates the DU (Distributed Units) and CU (Centralized Unit) from a Baseband Unit in 4G and connects them with open interfaces.

The O-RAN system is more flexible. It uses AI to establish connections made via open interfaces that optimize the category of a device by analyzing information about its prior use. Like other edge models, the O-RAN architecture provides an opportunity for continuous monitoring, verification, analysis, and optimization of AI models.

The IBM-telco collaboration is expected to advance O-RAN interfaces and workflows. Areas currently under development are:

  • Multi-modal (RF level + network-level) analytics (AI/ML) for wireless communication with high-speed ingest of 5G data
  • Capability to learn patterns of metric and log data across CUs and DUs in RF analytics
  • Utilization of the antenna control plane to optimize throughput
  • Primitives for forecasting, anomaly detection and root cause analysis using ML (a toy anomaly check follows this list)
  • Opportunity of value-added functions for O-RAN
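
A toy version of the anomaly-detection primitive mentioned in the list above, using a rolling z-score over a synthetic throughput stream; the window size and threshold are assumptions.

```python
import numpy as np

# Rolling z-score anomaly check over a RAN metric stream
# (e.g., per-DU throughput in Mbps).

def anomalies(series: np.ndarray, window: int = 60, z: float = 4.0):
    flagged = []
    for t in range(window, len(series)):
        hist = series[t - window:t]
        mu, sigma = hist.mean(), hist.std() + 1e-9
        if abs(series[t] - mu) / sigma > z:
            flagged.append(t)
    return flagged

rng = np.random.default_rng(4)
throughput = rng.normal(900, 20, 1_000)   # synthetic Mbps samples
throughput[500] = 300                     # injected dip
print(anomalies(throughput))              # typically -> [500]
```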

IBM Cloud and Infrastructure

The cornerstone for the delivery of IBM's edge solutions as a service is IBM Cloud Satellite. It presents a consistent cloud-ready, cloud-native operational view with OpenShift and IBM Cloud PaaS services at the edge. In addition, IBM integrated hardware and software edge systems will provide RHACM-based management of the platform when clients or third parties have existing managed-as-a-service models. It is essential to note that in either case this is done within a single control plane for hubs and spokes that helps optimize execution and management from any cloud to the edge in the hub and spoke model.

IBM's focus on "edge in" means it can provide infrastructure such as software-defined storage for a federated-namespace data lake that spans other hyperscaler clouds. Additionally, IBM is exploring integrated full stack edge storage appliances based on hyperconverged infrastructure (HCI), such as the Spectrum Fusion HCI, for enterprise edge deployments.

As mentioned earlier, data gravity is one of the main driving factors of edge deployments. IBM has designed its infrastructure to meet those data gravity requirements, not just for the existing hub and spoke topology but also for a future spoke-to-spoke topology where peer-to-peer data sharing becomes imperative (as illustrated with the wealth of examples provided in this article).

Wrap up

Edge is a distributed computing model. One of its main advantages is that computing, data storage, and processing are close to where data is created. Without the need to move data to the cloud for processing, real-time application of analytics and AI capabilities provides immediate solutions and drives business value.

IBM's goal is not to move the entirety of its cloud infrastructure to the edge. That would have little value; the edge would simply function as a spoke operating on actions and configurations dictated by the hub.

IBM’s architecture will provide the edge with autonomy to determine where data should reside and from where the control plane should be exercised.

Equally important, IBM foresees this architecture evolving into a decentralized model capable of edge-to-edge interactions. IBM has no firm designs for this as yet. However, the plan is to make the edge infrastructure and platform a first-class citizen instead of relying on the cloud to drive what happens at the edge.

Developing a complete and comprehensive AI/ML edge architecture - and in fact, an entire ecosystem - is a massive undertaking. IBM faces many known and unknown challenges that must be solved before it can achieve success.

However, IBM is one of the few companies with the necessary partners and the technical and financial resources to undertake and successfully implement a project of this magnitude and complexity.

It is reassuring that IBM has a plan and that its plan is sound.

Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence and space at Moor Insights and Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.

Published August 8, 2022, by Paul Smith-Goodson: https://www.forbes.com/sites/moorinsights/2022/08/08/ibm-research-rolls-out-a-comprehensive-ai-and-ml-edge-research-strategy-anchored-by-enterprise-partnerships-and-use-cases/
IBM aims for immediate quantum advantage with error mitigation technique

You don’t have to be a physicist to know that noise and quantum computing don’t mix. Any noise, movement or temperature swing causes qubits – the quantum computing equivalent to a binary bit in classical computing – to fail.

That's one of the main reasons quantum advantage (the point at which quantum surpasses classical computing) and quantum supremacy (when quantum computers solve a problem not feasible for classical computing) feel like longer-term goals for an emerging technology. The wait should be worth it, though, as quantum computers promise exponential increases over classical computing, which tops out at supercomputing. However, due to the intricacies of quantum physics (e.g., entanglement), quantum computers are also more prone to errors based on environmental factors than supercomputers or high-performance computers.

Quantum errors arise from what’s known as decoherence, a process that occurs when noise or nonoptimal temperatures interfere with qubits, changing their quantum states and causing information stored by the quantum computer to be lost.

The road(s) to quantum

Many enterprises view quantum computing as a zero-sum scenario: if you want value from a quantum computer, you need fault-tolerant quantum processors and a multitude of qubits. While we wait, we're stuck in the NISQ era — noisy intermediate-scale quantum — where quantum hasn't surpassed classical computers.

That’s an impression IBM hopes to change.

In a blog published today by IBM, its quantum team (Kristan Temme, Ewout van den Berg, Abhinav Kandala and Jay Gambetta) writes that the history of classical computing is one of incremental advances.

“Although quantum computers have seen tremendous improvements in their scale, quality and speed in recent years, such a gradual evolution seems to be missing from the narrative,” the team wrote.  “However, recent advances in techniques we refer to broadly as quantum error mitigation allow us to lay out a smoother path towards this goal. Along this path, advances in qubit coherence, gate fidelities and speed immediately translate to measurable advantage in computation, akin to the steady progress historically observed with classical computers.”

Finding value in noisy qubits

In a move to get a quantum advantage sooner – and in incremental steps – IBM claims to have created a technique that’s designed to tap more value from noisy qubits and move away from NISQ.

Instead of focusing solely on fault-tolerant computers, IBM's goal is continuous and incremental improvement, Jerry Chow, the director of hardware development for IBM Quantum, told VentureBeat.

To mitigate errors, Chow points to IBM's new probabilistic error cancellation, a technique designed to invert noisy quantum circuits to achieve error-free results, even though the circuits themselves are noisy. It does bring a runtime tradeoff, he said, because you trade running more circuits for insight into the noise causing the errors.

The goal of the new technique is to provide a step, rather than a leap, towards quantum supremacy. It's "a near-term solution," Chow said, and part of a suite of techniques that will help IBM learn about error correction through error mitigation. "As you increase the runtime, you learn more as you run more qubits," he explained.
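
The following toy illustrates the spirit of inverting a known noise channel, not IBM's actual probabilistic error cancellation: under depolarizing noise, a measured expectation value shrinks by a known factor, and dividing it back out recovers the ideal value at the cost of extra shots to control the amplified variance (the runtime tradeoff Chow describes). All numbers are assumptions.

```python
import numpy as np

# Toy noise-inversion illustration (not IBM's PEC implementation).
rng = np.random.default_rng(5)
ideal = 0.8                 # true <Z> of the ideal circuit (assumed)
p, depth = 0.02, 10         # per-layer depolarizing rate, circuit depth
shrink = (1 - p) ** depth   # noise attenuates the signal

def measure(shots: int) -> float:
    prob_one = (1 - shrink * ideal) / 2          # P(measuring |1>)
    counts = rng.binomial(shots, prob_one)
    return 1 - 2 * counts / shots                # estimated noisy <Z>

raw = measure(10_000)
mitigated = raw / shrink                         # invert the noise channel
print(f"noisy: {raw:.3f}, mitigated: {mitigated:.3f}, ideal: {ideal}")
```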

Chow said that while IBM continues to scale its quantum platform, this offers an incremental step. Last year, IBM unveiled the 127-qubit Eagle processor, which is capable of running quantum circuits that can't be replicated classically. Based on its quantum roadmap laid out in May, IBM is on track to reach 4,000-plus-qubit quantum devices in 2025.

Not an either-or scenario: Quantum starts now

Probabilistic error cancellation represents a shift for IBM and the quantum field overall. Rather than relying solely on experiments to achieve full error correction under certain circumstances, IBM has focused on a continuous push to address quantum errors today while still moving toward fault-tolerant machines, Chow said. "You need high-quality hardware to run billions of circuits. Speed is needed. The goal is not to do error mitigation long-term. It's not all or nothing."

IBM quantum computing bloggers add that its quantum error mitigation technique “is the continuous path that will take us from today’s quantum hardware to tomorrow’s fault-tolerant quantum computers. This path will let us run larger circuits needed for quantum advantage, one hardware improvement at a time.”

Published July 19, 2022, by Dan Muse: https://venturebeat.com/quantum-computing/ibm-aims-for-immediate-quantum-advantage-with-error-mitigation-technique/
IBM extends Power10 server lineup for enterprise use cases

IBM is looking to grow its enterprise server business with the expansion of its Power10 portfolio announced today.

IBM Power is a RISC (reduced instruction set computer)-based chip architecture that is competitive with other chip architectures, including x86 from Intel and AMD. IBM's Power hardware has been used for decades to run IBM's AIX Unix operating system, as well as the IBM i operating system that was once known as the AS/400. In more recent years, Power has increasingly been used for Linux, specifically in support of Red Hat and its OpenShift Kubernetes platform, which enables organizations to run containers and microservices.

The IBM Power10 processor was announced in August 2020, with the first server platform, the E1080 server, coming a year later in September 2021. Now IBM is expanding its Power10 lineup with four new systems, the Power S1014, S1024, S1022 and E1050, which are being positioned by IBM to help solve enterprise use cases, including the growing need for machine learning (ML) and artificial intelligence (AI).

What runs on IBM Power servers?

Usage of IBM’s Power servers could well be shifting into territory that Intel today still dominates.

Steve Sibley, VP of IBM Power product management, told VentureBeat that approximately 60% of Power workloads currently run AIX Unix. The IBM i operating system accounts for approximately 20% of workloads. Linux makes up the remaining 20% and is on a growth trajectory.

IBM owns Red Hat, which has its namesake Linux operating system supported on Power, alongside the OpenShift platform. Sibley noted that IBM has optimized its new Power10 system for Red Hat OpenShift.

“We’ve been able to demonstrate that you can deploy OpenShift on Power at less than half the cost of an Intel stack with OpenShift because of IBM’s container density and throughput that we have within the system,” Sibley said.

A look inside IBM’s four new Power servers

Across the new servers, the ability to access more memory at greater speed than previous generations of Power servers is a key feature. The improved memory is enabled by support for the Open Memory Interface (OMI) specification, which IBM helped develop as part of the OpenCAPI Consortium.

“We have Open Memory Interface technology that provides increased bandwidth but also reliability for memory,” Sibley said. “Memory is one of the common areas of failure in a system, particularly when you have lots of it.”

The new servers announced by IBM all use technology from the open-source OpenBMC project that IBM helps to lead. OpenBMC provides secure code for managing the baseboard of the server in an optimized approach for scalability and performance.

E1050

Among the new servers announced today by IBM is the E1050, a 4RU (four rack unit) server with four CPU sockets that can scale up to 16TB of memory, helping to serve large data- and memory-intensive workloads.

S1014 and S1024

The S1014 and the S1024 are also both 4RU systems, with the S1014 providing a single CPU socket and the S1024 integrating a dual-socket design. The S1014 can scale up to 2TB of memory, while the S1024 supports up to 8TB.

S1022

Rounding out the new servers is the S1022, a 1RU server that IBM is positioning as an ideal platform for OpenShift container-based workloads.

Bringing more Power to AI and ML

AI and ML workloads are a particularly good use case for all the Power10 systems, thanks to optimizations that IBM has built into the chip architecture.

Sibley explained that all Power10 chips benefit from IBM's Matrix Math Accelerator (MMA) capability. The enterprise use cases that Power10-based servers can help support include building out risk analytics, fraud detection, and supply chain forecasting AI models, among others.

IBM’s Power10 systems support and have been optimized for multiple popular open-source machine learning frameworks including PyTorch and TensorFlow.

“The way we see AI emerging is that a vast majority of AI in the future will be done on the CPU from an inference standpoint,” Sibley said.
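
For context, here is a minimal CPU-only inference sketch in PyTorch; on Power10, platform-optimized builds can route the underlying matrix multiplications to the MMA units. The model and shapes are illustrative.

```python
import torch

# Minimal CPU inference sketch with a small feed-forward model.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256), torch.nn.ReLU(), torch.nn.Linear(256, 10)
).eval()

batch = torch.randn(64, 128)          # stand-in for a batch of features
with torch.inference_mode():          # no autograd overhead for inference
    scores = model(batch)
print(scores.shape)                   # torch.Size([64, 10])
```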

Published July 11, 2022, by Sean Michael Kerner: https://venturebeat.com/programming-development/ibm-extends-power10-server-lineup-for-enterprise-use-cases/
Colorado's P-TECH Students Graduate Ready for Tech Careers

(TNS) — Abraham Tinajero was an eighth grader when he saw a poster in his Longmont middle school's library advertising a new program offering free college with a technology focus.

Interested, he talked to a counselor to learn more about P-TECH, an early college program where he could earn an associate’s degree along with his high school diploma. Liking the sound of the program, he enrolled in the inaugural P-TECH class as a freshman at Longmont’s Skyline High School.

“I really loved working on computers, even before P-TECH,” he said. “I was a hobbyist. P-TECH gave me a pathway.”


He worked with an IBM mentor and interned at the company for six weeks as a junior. After graduating in 2020 with his high school diploma and the promised associate’s degree in computer science from Front Range Community College, he was accepted to IBM’s yearlong, paid apprenticeship program.

IBM hired him as a cybersecurity analyst once he completed the apprenticeship.

“P-TECH has given me a great advantage,” he said. “Without it, I would have been questioning whether to go into college. Having a college degree at 18 is great to put on a resume.”


Stanley Litow, a former vice president of IBM, developed the P-TECH, or Pathways in Technology Early College High Schools, model. The first P-TECH school opened 11 years ago in Brooklyn, New York, in partnership with IBM.

Litow’s idea was to get more underrepresented young people into tech careers by giving them a direct path to college while in high school — and in turn create a pipeline of employees with the job skills businesses were starting to value over four-year college degrees.

The program, which includes mentors and internships provided by business partners, gives high school students up to six years to earn an associate's degree at no cost.

SKYLINE HIGH A PIONEER IN PROGRAM

In Colorado, St. Vrain Valley was among the first school districts chosen by the state to offer a P-TECH program after the Legislature passed a bill to provide funding — and the school district has embraced the program.

Colorado's first P-TECH programs started in the fall of 2016 at three high schools, including Skyline High. Over the last six years, 17 more Colorado high schools have adopted P-TECH, for a total of 20. Three of those are in St. Vrain Valley, with a fourth planned to open in the fall of 2023 at Longmont High School.

Each St. Vrain Valley high school offers a different focus supported by different industry partners.

Skyline partners with IBM, with students earning an associate’s degree in Computer Information Systems from Front Range. Along with being the first, Skyline’s program is the largest, enrolling up to 55 new freshmen each year.

Programs at the other schools are capped at 35 students per grade.

Frederick High’s program, which started in the fall of 2019, has a bioscience focus, partners with Aims Community College and works with industry partners Agilent Technologies, Tolmar, KBI Biopharma, AGC Biologics and Corden Pharma.

Silver Creek High’s program started a year ago with a cybersecurity focus. The Longmont school partners with Front Range and works with industry partners Seagate, Cisco, PEAK Resources and Comcast.

The new program coming to Longmont High will focus on business.

District leaders point to Skyline High’s graduation statistics to illustrate the program’s success. At Skyline, 100 percent of students in the first three P-TECH graduating classes earned a high school diploma in four years.

For the 2020 Skyline P-TECH graduates, 24 of the 33, or about 70 percent, also earned associate’s degrees. For the 2021 graduating class, 30 of the 47 have associate’s degrees — with one year left for those students to complete the college requirements.

For the most recent 2022 graduates, who have two years left to complete the college requirements, 19 of 59 have associate’s degrees and another six are on track to earn their degrees by the end of the summer.

JUMPING AT AN OPPORTUNITY

Louise March, Skyline High's P-TECH counselor, keeps in touch with the graduates, saying 27 are working part time or full time at IBM. About a third are continuing their education at a four-year college. Of the 19 who graduated in 2022 with an associate's degree, 17 are enrolling at a four-year college, she said.

Two of those 2022 graduates are Anahi Sarmiento, who is headed to the University of Colorado Boulder’s Leeds School of Business, and Jose Ivarra, who will study computer science at Colorado State University.

"I'm the oldest out of three siblings," Ivarra said. "When you hear that someone wants to give you free college in high school, you take it. I jumped at the opportunity."

Sarmiento added that her parents, who are immigrants, are already working two jobs and don’t have extra money for college costs.

“P-TECH is pushing me forward,” she said. “I know my parents want me to have a better life, but I want them to have a better life, too. Going into high school, I kept that mentality that I would push myself to my full potential. It kept me motivated.”

While the program requires hard work, the two graduates said, they still enjoyed high school and had outside interests. Ivarra was a varsity football player who was named player of the year. Sarmiento took advantage of multiple opportunities, from helping elementary students learn robotics to working at the district’s Innovation Center.

Ivarra said he likes that P-TECH has the same high expectations for all students, no matter their backgrounds, and gives them support in any areas where they need help. Spanish is his first language and, while math came naturally, language arts was more challenging.

“It was tough for me to see all these classmates use all these big words, and I didn’t know them,” he said. “I just felt less. When I went into P-TECH, the teachers focus on you so much, checking on every single student.”

They said it’s OK to struggle or even fail. Ivarra said he failed a tough class during the pandemic, but was able to retake it and passed. Both credited March, their counselor, with providing unending support as they navigated high school and college classes.

“She’s always there for you,” Sarmiento said. “It’s hard to be on top of everything. You have someone to go to.”

Students also supported each other.

“You build bonds,” Ivarra said. “You’re all trying to figure out these classes. You grow together. It’s a bunch of people who want to succeed. The people that surround you in P-TECH, they push you to be better.”

SUPPORT SYSTEMS ARE KEY

P-TECH has no entrance requirements or prerequisite classes. You don’t need to be a top student, have taken advanced math or have a background in technology.

With students starting the rigorous program with a wide range of skills, teachers and counselors said, they quickly figured out the program needed stronger support systems.

March said freshmen in the first P-TECH class struggled that first semester, prompting the creation of a guided study class. The class, which meets every other day for an hour and a half, includes both study time and time to learn workplace skills, including writing a resume and interviewing. Teachers also offer tutoring twice a week after school.

“The guided study has become crucial to the success of the program,” March said.

Another way P-TECH provides extra support is through summer orientation programs for incoming freshmen.

At Skyline, ninth graders take a three-week bridge class — worth half a credit — that includes learning good study habits. They also meet IBM mentors and take a field trip to Front Range Community College.

“They get their college ID before they get their high school ID,” March said.

During a session in June, 15 IBM mentors helped the students program a Sphero robot to travel along different track configurations. Kathleen Schuster, who has volunteered as an IBM mentor since the P-TECH program started here, said she wants to “return some of the favors I got when I was younger.”

“Even this play stuff with the Spheros, it’s teaching them teamwork and a little computing,” she said. “Hopefully, through P-TECH, they will learn what it takes to work in a tech job.”

Incoming Skyline freshman Blake Baker said he found a passion for programming at Trail Ridge Middle and saw P-TECH as a way to capitalize on that passion.

"I really love that they give you options and a path," he said.

Trail Ridge classmate Itzel Pereyra, another programming enthusiast, heard about P-TECH from her older brother.

“It’s really good for my future,” she said. “It’s an exciting moment, starting the program. It will just help you with everything.”

While some of the incoming ninth graders shared dreams of technology careers, others see P-TECH as a good foundation to pursue other dreams.

Skyline incoming ninth grader Marisol Sanchez wants to become a traveling nurse, demonstrating technology and new skills to other nurses. She added that the summer orientation sessions are a good introduction, helping calm the nerves that accompany combining high school and college.

“There’s a lot of team building,” she said. “It’s getting us all stronger together as a group and introducing everyone.”

THE SPARK OF MOTIVATION

Silver Creek’s June camp for incoming ninth graders included field trips to visit Cisco, Seagate, PEAK Resources, Comcast and Front Range Community College.

During the Front Range Community College field trip, the students heard from Front Range staff members before going on a scavenger hunt. Groups took photos to prove they completed tasks, snapping pictures of ceramic pieces near the art rooms, the most expensive tech product for sale in the bookstore and administrative offices across the street from the main building.

Emma Horton, an incoming freshman, took a cybersecurity class as a Flagstaff Academy eighth grader that hooked her on the idea of technology as a career.

"I'm really excited about the experience I will be getting in P-TECH," she said. "I've never been super motivated in school, but with something I'm really interested in, it becomes easier."

Deb Craven, dean of instruction at Front Range’s Boulder County campus, promised the Silver Creek students that the college would support them. She also gave them some advice.

“You need to advocate and ask for help,” she said. “These two things are going to help you the most. Be present, be engaged, work together and lean on each other.”

Craven, who oversees Front Range’s P-TECH program partnership, said Front Range leaders toured the original P-TECH program in New York along with St. Vrain and IBM leaders in preparation for bringing P-TECH here.

“Having IBM as a partner as we started the program was really helpful,” she said.

When the program began, she said, freshmen took a more advanced technology class as their first college class. Now, she said, they start with a more fundamental class in the spring of their freshman year, learning how to build a computer.

“These guys have a chance to grow into the high school environment before we stick them in a college class,” she said.

Summer opportunities aren’t just for P-TECH’s freshmen. Along with summer internships, the schools and community colleges offer summer classes.

Silver Creek incoming 10th graders, for example, could take a personal financial literacy class at Silver Creek in the mornings and an introduction to cybersecurity class at the Innovation Center in the afternoons in June.

Over at Skyline, incoming 10th graders in P-TECH are getting paid to teach STEM lessons to elementary students while earning high school credit. Students in the fifth or sixth year of the program also had the option of taking computer science and algebra classes at Front Range.

EMBRACING THE CHALLENGE

And at Frederick, incoming juniors are taking an introduction to manufacturing class at the district's Career Elevation and Technology Center this month in preparation for an advanced manufacturing class they’re taking in the fall.

"This will give them a head start for the fall," said instructor Chester Clark.

Incoming Frederick junior Destini Johnson said she’s not sure what she wants to do after high school, but believes the opportunities offered by P-TECH will prepare her for the future.

“I wanted to try something challenging, and getting a head start on college can only help,” she said. “It’s really incredible that I’m already halfway done with an associate’s degree and high school.”

IBM P-TECH program manager Tracy Knick, who has worked with the Skyline High program for three years, said it takes a strong commitment from all the partners — the school district, IBM and Front Range — to make the program work.

“It’s not an easy model,” she said. “When you say there are no entrance requirements, we all have to be OK with that and support the students to be successful.”

IBM hosted 60 St. Vrain interns this summer, while two Skyline students work as IBM “co-ops” — a national program — to assist with the P-TECH program.

The company hosts two to four formal events for the students each year to work on professional and technical skills, while IBM mentors provide tutoring in algebra. During the pandemic, IBM also paid for subscriptions to tutor.com so students could get immediate help while taking online classes.

“We want to get them truly workforce ready,” Knick said. “They’re not IBM-only skills we’re teaching. Even though they choose a pathway, they can really do anything.”

As the program continues to expand in the district, she said, her wish is for more businesses to recognize the value of P-TECH.

"These students have had intensive training on professional skills," she said. "They have taken college classes enhanced with the same digital credentials that an IBM employee can earn. There should be a waiting list of employers for these really talented and skilled young professionals."

©2022 the Daily Camera (Boulder, Colo.). Distributed by Tribune Content Agency, LLC.

Published August 4, 2022: https://www.govtech.com/education/k-12/colorados-p-tech-students-graduate-ready-for-tech-careers
Learning Management System Market to See Huge Growth by 2027: Xerox, IBM, SAP

The latest study released on the Global Learning Management System Market by AMA Research evaluates market size, trends, and forecast to 2027. The Learning Management System market study covers significant research data and proves to be a handy resource document for managers, analysts, industry experts, and other key people to have a ready-to-access, self-analyzed study that helps them understand market trends, growth drivers, opportunities, upcoming challenges, and competitors.

Key Players in This Report Include:

Cornerstone OnDemand, Inc. (United States), Xerox Corporation (United States), IBM Corporation (United States), NetDimensions Ltd. (United States), SAP SE (Germany), Blackboard, Inc. (United States), Saba Software, Inc. (United States), McGraw-Hill Companies (United States), Pearson Plc (United Kingdom), D2L Corporation (Canada)

Download Sample Report PDF (Including Full TOC, Table & Figures) @ https://www.advancemarketanalytics.com/sample-report/4918-global-learning-management-system-market

Definition:

A learning management system is, at its core, a software application for the administration, documentation, tracking, reporting, and delivery of educational courses, training programs, and more. The learning management system largely emerged from e-learning.

Market Trends:

Increasing Competition Among Market Players

High Adoption of Cloud-Based Solutions

Market Drivers:

Growing Awareness Towards the Adoption of Digital Learning

Rapid Inclination to BYOD Policy and Enterprise Mobility

Widespread of Government Initiatives for Growth Of LMS

Growing Implication Of E-Learning in Corporates

Market Opportunities:

Growing Demand for Gamification in LMS Delivers Strong Opportunities for LMS Providers

High Surge in Demand for Collaborative Learning in LMS to Provide High Potentials for Trainees

The Global Learning Management System market segments and market data breakdown are detailed below:

by Type (Academic, Corporate), Industry Verticals (Banking, Financial Services, and Insurance; Healthcare; Retail; Government; Manufacturing; Others), Delivery Mode (Distance Learning, Instructor-Led Training), Organization Size (Small and Medium Size Enterprises, Large Size Enterprises), Offerings (Solution, Services)

The Global Learning Management System market report highlights current and future industry trends and growth patterns, and offers business strategies to help stakeholders make sound decisions that can sustain their profit trajectory over the forecast years.

Have a query? Make an enquiry before purchase @ https://www.advancemarketanalytics.com/enquiry-before-buy/4918-global-learning-management-system-market

Geographically, the report provides a detailed analysis of consumption, revenue, market share, and growth rate for the following regions:

  • The Middle East and Africa (South Africa, Saudi Arabia, UAE, Israel, Egypt, etc.)
  • North America (United States, Mexico & Canada)
  • South America (Brazil, Venezuela, Argentina, Ecuador, Peru, Colombia, etc.)
  • Europe (Spain, Turkey, Netherlands, Denmark, Belgium, Switzerland, Germany, Russia, UK, Italy, France, etc.)
  • Asia-Pacific (Taiwan, Hong Kong, Singapore, Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia, and Australia).

Objectives of the Report

  • To carefully analyze and forecast the size of the Learning Management System market by value and volume.
  • To estimate the market shares of major segments of the Learning Management System market.
  • To showcase the development of the Learning Management System market in different parts of the world.
  • To analyze and study micro-markets in terms of their contributions to the Learning Management System market, their prospects, and individual growth trends.
  • To offer precise and useful details about factors affecting the growth of the Learning Management System market.
  • To provide a meticulous assessment of crucial business strategies used by leading companies operating in the Learning Management System market, including research and development, collaborations, agreements, partnerships, acquisitions, mergers, new developments, and product launches.

Buy Complete Assessment of Learning Management System market Now @ https://www.advancemarketanalytics.com/buy-now?format=1&report=4918

Major highlights from Table of Contents:

Learning Management System Market Study Coverage:

  • It includes major manufacturers, emerging players' growth stories, and major business segments of the Learning Management System market, years considered, and research objectives, along with segmentation on the basis of product type, application, and technology.
  • Learning Management System Market Executive Summary: a summary of the overall study, growth rate, available market, competitive landscape, market drivers, trends and issues, and macroscopic indicators.
  • Learning Management System Market Production by Region.
  • Learning Management System Market Profiles of Manufacturers: players are studied on the basis of SWOT, their products, production, value, financials, and other vital factors.

Key Points Covered in the Learning Management System Market Report:

  • Learning Management System Overview, Definition and Classification; Market Drivers and Barriers
  • Learning Management System Market Competition by Manufacturers
  • Impact Analysis of COVID-19 on the Learning Management System Market
  • Learning Management System Capacity, Production, Revenue (Value) by Region (2021-2027)
  • Learning Management System Supply (Production), Consumption, Export, Import by Region (2021-2027)
  • Learning Management System Production, Revenue (Value), Price Trend by Type {Academic, Corporate}
  • Learning Management System Manufacturers Profiles/Analysis; Manufacturing Cost Analysis; Industrial/Supply Chain Analysis; Sourcing Strategy and Downstream Buyers
  • Marketing Strategy by Key Manufacturers/Players; Connected Distributors/Traders; Standardization, Regulatory and Collaborative Initiatives; Industry Road Map and Value Chain; Market Effect Factors Analysis

Browse Complete Summary and Table of Contents @ https://www.advancemarketanalytics.com/reports/4918-global-learning-management-system-market

Key questions answered

  • How feasible is the Learning Management System market for long-term investment?
  • What factors are driving demand for Learning Management System solutions in the near future?
  • What is the impact of various factors on the growth of the Global Learning Management System market?
  • What are the recent trends in the regional market, and how successful are they?

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions like North America, Middle East, Africa, Europe or LATAM, Southeast Asia.

Contact US:

Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road, Edison, NJ 08837, USA
Phone: +1 (206) 317 1218
[email protected]

Connect with us at
https://www.linkedin.com/company/advance-market-analytics
https://www.facebook.com/AMA-Research-Media-LLP-344722399585916
https://twitter.com/amareport

Source: https://www.digitaljournal.com/pr/learning-management-system-market-to-see-huge-growth-by-2027-xerox-ibm-sap (Newsmantraa, July 17, 2022)
Edology partners with IBM to launch Post Graduate Certificate Program in Data Science

Gurugram (Haryana) [India], July 30 (ANI/NewsVoir): Edology has announced a partnership with IBM, one of the world's leading technology corporations, to introduce its Post Graduate Certificate Program in Data Science for working professionals and anyone wanting to enter the field. Developed by IBM inventors and experts who hold numerous patents in data science, it is the first such programme completely designed by IBM and delivered by its faculty.

“The programme for the Edology x IBM Data Science course is a very special offering from IBM, and this is a one-of-a-kind initiative,” according to Hari Ramasubramanian, Leader, Business Development and Academia Relationships, IBM Expert Labs, India/South Asia. He further added, “There is strong demand for skilled, trained technology professionals across the industry. Data science is not confined to IT; it brings value to every vertical one can imagine, from board meetings to sports, in organizations worldwide. For students, as well as professionals with experience, if you want to fast-track your career to the next level, this is the course you should be doing.”

“The IBM Data Science certificate program, offered through the Edology platform, will equip learners to adapt to industry dynamics and drive technology innovation,” said Vithal Madyalkar, Program Director, IBM Innovation Centre for Education, India/South Asia. “The course modules will provide deep practical knowledge, coupled with broad industry alignment, interaction, and talent discoverability, as well as excellence in professional practice.”

A global Ed-Tech company, Edology helps students and professionals all around the world advance their careers in a variety of subjects, including data science, artificial intelligence, machine learning, cyber security, and more.

Unique Offerings of the IBM x Edology PG Certificate Programme in Data Science:

- 100+ hours of Live classes by IBM experts

- Globally recognized IBM digital badge

- Job opportunities with 300+ corporate partners

- Edology-IBM Award for Top Performers

- 1 on 1 mentorship from industry experts

- 1 day networking session with IBM team

- Guaranteed interview with IBM for top performers in each cohort

- Dedicated career assistance team

Sumanth Palepu, the Business Head at Edology, states, “Statistical estimates reveal that the worldwide market size for Data Science and analytics is anticipated to reach a whopping $450 billion by 2025, which also means that competition at the employee level will be very fierce. This collaboration with IBM is thus more essential than ever, so that we can collectively deliver advanced-level teaching to students and working professionals, who get first-hand industry knowledge from our IBM experts.”

www.youtube.com/watch?v=rjWGU_k2Dhg

Edology is a Global Ed-Tech Brand that provides industry-powered education and skills to students and professionals across the world, to help them achieve fast-track career growth. Launched in 2017, Edology connects professionals from across the globe with higher education programmes in the fields of law, finance, accounting, business, computing, marketing, fashion, criminology, psychology, and more.

It's a part of Global University Systems (GUS), an international network of higher-education institutions brought together by a shared passion for making industry-driven global education accessible and affordable.

The courses offered by Edology include Data Science, Certification in AI and Machine Learning, Data Analytics, PGP in International Business, PGP in Renewable Energy Management, and PGP in Oil and Gas Management, among others. These offerings are delivered through hands-on industry projects, interactive live classes, global peer-to-peer learning, and other facilities.

This story is provided by NewsVoir. ANI will not be responsible in any way for the content of this article. (ANI/NewsVoir)

Source: https://www.bignewsnetwork.com/news/272637512/edology-partners-with-ibm-to-launch-post-graduate-certificate-program-in-data-science (July 29, 2022)
CIOReview Names Cobalt Iron Among 10 Most Promising IBM Solution Providers 2022

LAWRENCE, Kan.--(BUSINESS WIRE)--Jul 28, 2022--

Cobalt Iron Inc., a leading provider of SaaS-based enterprise data protection, today announced that the company has been deemed one of the 10 Most Promising IBM Solution Providers 2022 by CIOReview Magazine. The annual list of companies is selected by a panel of experts and members of CIOReview Magazine’s editorial board to recognize and promote innovation and entrepreneurship. A technology partner for IBM, Cobalt Iron earned the distinction based on its Compass ® enterprise SaaS backup platform for monitoring, managing, provisioning, and securing the entire enterprise backup landscape.

This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20220728005043/en/

Cobalt Iron Compass® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection. (Graphic: Business Wire)

According to CIOReview, “Cobalt Iron has built a patented cyber-resilience technology in a SaaS model to alleviate the complexities of managing large, multivendor setups, providing an effectual humanless backup experience. This SaaS-based data protection platform, called Compass, leverages strong IBM technologies. For example, IBM Spectrum Protect is embedded into the platform from a data backup and recovery perspective. ... By combining IBM’s technologies and the intellectual property built by Cobalt Iron, the company delivers a secure, modernized approach to data protection, providing a ‘true’ software as a service.”

Through proprietary technology, the Compass data protection platform integrates with, automates, and optimizes best-of-breed technologies, including IBM Spectrum Protect, IBM FlashSystem, IBM Red Hat Linux, IBM Cloud, and IBM Cloud Object Storage. Compass enhances and extends IBM technologies by automating more than 80% of backup infrastructure operations, optimizing the backup landscape through analytics, and securing backup data, making it a valuable addition to IBM’s data protection offerings.

CIOReview also praised Compass for its simple and intuitive interface to display a consolidated view of data backups across an entire organization without logging in to every backup product instance to extract data. The machine learning-enabled platform also automates backup processes and infrastructure, and it uses open APIs to connect with ticket management systems to generate tickets automatically about any backups that need immediate attention.
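The article names the pattern (open APIs raising tickets for backups that need attention) but not Compass's actual API, so the sketch below is a generic illustration of that integration: the endpoint URL, token, and payload fields are all hypothetical.

```python
import json
import urllib.request

# Hypothetical ticketing endpoint and credential -- not Compass's real API.
TICKET_API = "https://ticketing.example.com/api/tickets"
API_TOKEN = "replace-me"

def raise_ticket_if_unhealthy(backup_job: dict) -> None:
    """Open a ticket for any backup job that failed or missed its SLA window."""
    if backup_job["status"] not in ("failed", "missed_sla"):
        return  # healthy jobs generate no ticket
    payload = {
        "summary": f"Backup attention needed: {backup_job['name']}",
        "severity": "high" if backup_job["status"] == "failed" else "medium",
        "details": backup_job,
    }
    req = urllib.request.Request(
        TICKET_API,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {API_TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # fires the ticket-creation call

# Example (would POST to the hypothetical endpoint):
# raise_ticket_if_unhealthy({"name": "db-nightly", "status": "failed"})
```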

To ensure the security of data backups, Cobalt Iron has developed an architecture and security feature set called Cyber Shield for 24/7 threat protection, detection, and analysis that improves ransomware responsiveness. Compass is also being enhanced to use several patented techniques that are specific to analytics and ransomware. For example, analytics-based cloud brokering of data protection operations helps enterprises make secure, efficient, and cost-effective use of their cloud infrastructures. Another patented technique — dynamic IT infrastructure optimization in response to cyberthreats — offers unique ransomware analytics and automated optimization that will enable Compass to reconfigure IT infrastructure automatically when it detects cyberthreats, such as a ransomware attack, and dynamically adjust access to backup infrastructure and data to reduce exposure.
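The patented specifics aren't published in the article, but one widely used ransomware signal in backup analytics is a collapse in achieved compression or deduplication ratio, since encrypted data is effectively incompressible. A toy version of that trigger logic, with invented thresholds and a placeholder response hook:

```python
from statistics import mean, stdev

def looks_like_ransomware(history: list[float], today: float) -> bool:
    """Flag a backup whose compression ratio collapses versus its baseline.

    Encrypted files are effectively incompressible, so a ransomware-hit
    dataset typically shows a sharp drop in achieved compression.
    """
    baseline, spread = mean(history), stdev(history)
    return today < baseline - 3 * spread  # invented 3-sigma threshold

def lock_down_backups() -> None:
    # Placeholder response hook: e.g., flip backup storage to immutable/
    # read-only mode and restrict credentials until an operator reviews.
    print("ALERT: locking backup infrastructure to read-only")

ratios = [2.9, 3.1, 3.0, 2.8, 3.0]   # recent compression ratios (made up)
if looks_like_ransomware(ratios, today=1.1):
    lock_down_backups()
```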

Compass is part of IBM’s product portfolio through the IBM Passport Advantage program. Through Passport Advantage, IBM sellers, partners, and distributors around the world can sell Compass under IBM part numbers to any organizations, particularly complex enterprises, that greatly benefit from the automated data protection and anti-ransomware solutions Compass delivers.

CIOReview’s report concludes, “With such innovations, all eyes will be on Cobalt Iron for further advancements in humanless, secure data backup solutions. Cobalt Iron currently focuses on IP protection and continuous R&D to bring about additional cybersecurity-related innovations, promising a more secure future for an enterprise’s data.”

About Cobalt Iron

Cobalt Iron was founded in 2013 to bring about fundamental changes in the world’s approach to secure data protection, and today the company’s Compass ® is the world’s leading SaaS-based enterprise data protection system. Through analytics and automation, Compass enables enterprises to transform and optimize legacy backup solutions into a simple cloud-based architecture with built-in cybersecurity. Processing more than 8 million jobs a month for customers in 44 countries, Compass delivers modern data protection for enterprise customers around the world. www.cobaltiron.com

Product or service names mentioned herein are the trademarks of their respective owners.

Link to Word Doc: www.wallstcom.com/CobaltIron/220728-Cobalt_Iron-CIOReview_Top_IBM_Provider_2022.docx

Photo Link: www.wallstcom.com/CobaltIron/Cobalt_Iron_CIO_Review_Top_IBM_Solution_Provider_Award_Logo.pdf

Photo Caption: Cobalt Iron Compass ® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection.

Follow Cobalt Iron

https://twitter.com/cobaltiron

https://www.linkedin.com/company/cobalt-iron/

https://www.youtube.com/user/CobaltIronLLC

View source version on businesswire.com: https://www.businesswire.com/news/home/20220728005043/en/

CONTACT: Agency Contact:

Sunny Branson

Wall Street Communications

Tel: +1 801 326 9946

Email: sunny@wallstcom.com

Web: www.wallstcom.com

Cobalt Iron Contact:

Mary Spurlock

VP of Marketing

Tel: +1 785 979 9461

Email: maspurlock@cobaltiron.com

Web: www.cobaltiron.com

SOURCE: Cobalt Iron

Copyright Business Wire 2022.


Source: https://www.eagletribune.com/region/cioreview-names-cobalt-iron-among-10-most-promising-ibm-solution-providers-2022/article_56f7dda7-cbd5-586a-9d5f-f882022100da.html (July 28, 2022)
Astadia Publishes Mainframe to Cloud Reference Architecture Series

Press release content from Business Wire. The AP news staff was not involved in its creation.

BOSTON--(BUSINESS WIRE)--Aug 3, 2022--

Astadia is pleased to announce the release of a new series of Mainframe-to-Cloud reference architecture guides. The documents cover how to refactor IBM mainframe applications to Microsoft Azure, Amazon Web Services (AWS), Google Cloud, and Oracle Cloud Infrastructure (OCI), offering a deep dive into the migration process to all major target cloud platforms using Astadia's FastTrack software platform and methodology.

As enterprises and government agencies come under pressure to modernize their IT environments and make them more agile, scalable, and cost-efficient, refactoring mainframe applications in the cloud is recognized as one of the fastest and most efficient modernization paths. By making the guides available, Astadia equips business and IT professionals with a step-by-step approach to refactoring mission-critical business systems and benefiting from highly automated code transformation, data conversion, and testing to reduce costs, risks, and timeframes in mainframe migration projects.
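To make the "data conversion" step concrete: mainframe datasets are typically EBCDIC-encoded, fixed-width records that must be transcoded for cloud targets. A minimal sketch using Python's built-in cp037 (EBCDIC) codec; the record layout is invented for illustration and says nothing about Astadia's actual tooling:

```python
# Mainframe records are fixed-width and EBCDIC-encoded (code page 037 here).
EBCDIC = "cp037"

def convert_record(raw: bytes) -> dict:
    """Decode one fixed-width EBCDIC record into a cloud-friendly dict."""
    text = raw.decode(EBCDIC)
    return {
        "customer_id": text[0:8].strip(),   # invented layout: cols 1-8
        "name": text[8:28].strip(),         # cols 9-28
        "balance": int(text[28:36]),        # cols 29-36, zoned decimal simplified
    }

# Round-trip demo: encode a sample record to EBCDIC, then convert it back.
sample = "00001234John Q Public       00009950".encode(EBCDIC)
print(convert_record(sample))
# {'customer_id': '00001234', 'name': 'John Q Public', 'balance': 9950}
```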

“Understanding all aspects of legacy application modernization and having access to the most performant solutions is crucial to accelerating digital transformation,” said Scott G. Silk, Chairman and CEO. “More and more organizations are choosing to refactor mainframe applications to the cloud. These guides are meant to assist their teams in transitioning fast and safely by benefiting from Astadia’s expertise, software tools, partnerships, and technology coverage in mainframe-to-cloud migrations,” said Mr. Silk.

The new guides are part of Astadia's free Mainframe-to-Cloud Modernization series, an ample collection of guides covering various mainframe migration options, technologies, and cloud platforms. The series covers IBM (NYSE: IBM) mainframes.

In addition to the reference architecture diagrams, these comprehensive guides include various techniques and methodologies that may be used in forming a complete and effective Legacy Modernization plan. The documents analyze the important role of the mainframe platform, and how to preserve previous investments in information systems when transitioning to the cloud.

In each of the IBM Mainframe Reference Architecture white papers, readers will explore:

  • Benefits, approaches, and challenges of mainframe modernization
  • Understanding typical IBM Mainframe Architecture
  • An overview of Azure/AWS/Google Cloud/Oracle Cloud
  • Detailed diagrams of IBM mappings to Azure/AWS/Google Cloud/Oracle Cloud
  • How to ensure project success in mainframe modernization

The guides are available for download here:

To access more mainframe modernization resources, visit the Astadia learning center on www.astadia.com.

About Astadia

Astadia is the market-leading software-enabled mainframe migration company, specializing in moving IBM and Unisys mainframe applications and databases to distributed and cloud platforms in unprecedented timeframes. With more than 30 years of experience, and over 300 mainframe migrations completed, enterprises and government organizations choose Astadia for its deep expertise, range of technologies, and the ability to automate complex migrations, as well as testing at scale. Learn more on www.astadia.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20220803005031/en/

CONTACT: Wilson Rains, Chief Revenue Officer

Wilson.Rains@astadia.com

+1.877.727.8234

SOURCE: Astadia

Copyright Business Wire 2022.


Source: https://apnews.com/press-release/BusinessWire/technology-f50b643965d24115b2c526c8f96321a6 (Aug. 3, 2022)
IBM hopes a new error mitigation technique will help it get to quantum advantage

For a long time, the quantum computing industry avoided talking about "quantum advantage" or "quantum supremacy," the point where quantum computers can solve problems that would simply take too long to solve on classical computers. To some degree, that's because the industry wanted to avoid the hype that comes with those terms. But IBM today brought quantum advantage back into the conversation by detailing how it plans to use a novel error mitigation technique to chart a path toward running the increasingly large circuits it'll take to reach this goal, at least for a certain set of algorithms.

It's no secret that quantum computers hate nothing more than noise. Qubits are fickle things, after all, and the smallest change in temperature or vibration can make them decohere. There's a reason the current era of quantum computing is associated with "noisy intermediate-scale quantum (NISQ) technology."

The engineers at IBM and every other quantum computing company are making slow but steady strides toward reducing that noise on the hardware and software level, with IBM's 65-qubit systems from 2020 now showing twice the coherence time compared to when they first launched, for example. The coherence time of IBM's transmon superconducting qubits is now over 1 ms.

But IBM is also betting on new error mitigation techniques, dubbed probabilistic error cancellation and zero-noise extrapolation. At a very basic level, you can almost think of this as the quantum equivalent of the active noise cancellation in your headphones. The system regularly characterizes the noise affecting it and then essentially inverts those noisy circuits to produce virtually error-free results.
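Zero-noise extrapolation is the easier of the two to picture: run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. A minimal numpy sketch of that extrapolation step; the scale factors and measured values below are invented for illustration:

```python
import numpy as np

# Expectation values of some observable, measured at noise scale
# factors 1x, 1.5x, 2x (noise amplified by, e.g., stretching gates).
scale_factors = np.array([1.0, 1.5, 2.0])
measured = np.array([0.76, 0.67, 0.59])   # made-up measurements

# Fit a low-order polynomial and evaluate it at scale factor 0:
# the extrapolated "zero-noise" estimate of the ideal value.
coeffs = np.polyfit(scale_factors, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"zero-noise extrapolated value: {zero_noise_estimate:.3f}")  # 0.970
```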

IBM has now shown that this isn't just a theoretical possibility but actually works in its existing systems. One disadvantage is that there is quite a bit of overhead when you constantly sample these noisy circuits, and that overhead is exponential in the number of qubits and the circuit depth. But that's a trade-off worth making, argues Jerry Chow, the director of Hardware Development for IBM Quantum.

"Error mitigation is about finding ways to deal with the physical errors in certain ways, by learning about the errors and also just running quantum circuits in such a way that allows us to cancel them," explained Chow. "In some ways, error correction is like the ultimate error mitigation, but the point is that there are techniques that are more near term with a lot of the hardware that we're building that already provide this avenue. The one that we're really excited about is called probabilistic error cancellation. And that one really is a way of trading off runtime -- trading off running more circuits in order to learn about the noise that might be inherent to the system that is impacting your calculations."

The system essentially inserts additional gates into existing circuits to sample the noise inherent in the system. And while the overhead increases exponentially with the size of the system, the IBM team believes it's a weaker exponential than the best classical methods for estimating those same circuits.
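As a toy illustration of the sampling idea behind probabilistic error cancellation: the ideal operation is expressed as a signed, quasi-probability mixture of operations the noisy hardware can actually run; circuits are sampled from that mixture and their outcomes reweighted so the noise cancels in expectation, at the cost of a sampling overhead gamma. The quasi-probabilities and per-variant values below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quasi-probability decomposition of one ideal gate into two
# implementable noisy variants (coefficients invented for illustration).
quasi_probs = np.array([1.3, -0.3])        # signed weights, sum to 1
gamma = np.abs(quasi_probs).sum()          # sampling overhead, here 1.6
sample_probs = np.abs(quasi_probs) / gamma

def run_noisy_variant(i: int) -> float:
    """Stand-in for executing circuit variant i and measuring an observable."""
    true_values = [0.70, 0.40]             # invented per-variant expectations
    return true_values[i] + rng.normal(0, 0.05)  # shot noise

# Sample variants, reweight each outcome by sign * gamma, and average:
# the estimator is unbiased for the ideal expectation value.
shots = 20000
idx = rng.choice(len(quasi_probs), size=shots, p=sample_probs)
outcomes = np.array([run_noisy_variant(i) for i in idx])
estimate = np.mean(np.sign(quasi_probs[idx]) * gamma * outcomes)
print(f"error-mitigated estimate: {estimate:.3f}")
# Expected value: 1.3*0.70 - 0.3*0.40 = 0.79 (the "ideal" result in this toy)
```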

As IBM previously announced, it plans to introduce error mitigation and suppression techniques into its Qiskit Runtime by 2024 or 2025 so developers won't even have to think about these when writing their code.

Source: https://www.yahoo.com/entertainment/ibm-hopes-error-mitigation-technique-120008722.html (July 19, 2022)
IBM Annual Cost of Data Breach Report 2022: Record Costs Usually Passed On to Consumers, “Long Breach” Expenses Make Up Half of Total Damage

IBM’s annual Cost of Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid,” expenses that are realized more than a year after the attack.

Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.

Security AI and automation greatly reduce expected damage

The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.

Though the average cost of a data breach is up, the increase is only about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cybercrime seen during the pandemic years.
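The report's headline percentages are easy to sanity-check against its dollar figures; note that the 2020 base in this quick calculation is inferred from the stated 13% climb rather than quoted directly:

```python
avg_2021, avg_2022 = 4.24, 4.35   # global average breach cost, $M (from the report)

yoy = (avg_2022 - avg_2021) / avg_2021
print(f"2021 -> 2022 increase: {yoy:.1%}")            # ~2.6%, matching the report

implied_2020 = avg_2022 / 1.13                        # back out the 2020 base
print(f"implied 2020 average: ${implied_2020:.2f}M")  # ~$3.85M
```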

Organizations are also increasingly not opting to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.

Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”

Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.

Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”

Rising cost of data breach not necessarily prompting dramatic security action

In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is also lagging as well, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.

Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent number at $812,000 globally.

The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.

Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.

Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”

Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.


Cutting the cost of data breach

Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.

Source: https://www.cpomagazine.com/cyber-security/ibm-annual-cost-of-data-breach-report-2022-record-costs-usually-passed-on-to-consumers-long-breach-expenses-make-up-half-of-total-damage/ (Scott Ikeda, Aug. 1, 2022)