IBM Research Rolls Out A Comprehensive AI And Platform-Based Edge Research Strategy Anchored By Enterprise Partnerships & Use Cases
I recently met with Dr. Nick Fuller, Vice President, Distributed Cloud, at IBM Research for a discussion about IBM’s long-range plans and strategy for artificial intelligence and machine learning at the edge.
Dr. Fuller is responsible for providing AI and platform–based innovation for enterprise digital transformation spanning edge computing and distributed cloud management. He is an IBM Master Inventor with over 75 patents and co-author of 75 technical publications. Dr. Fuller obtained his Bachelor of Science in Physics and Math from Morehouse College and his PhD in Applied Physics from Columbia University.
Edge In, not Cloud Out
In general, Dr. Fuller told me that IBM is focused on developing an "edge in" position versus a "cloud out" position with data, AI, and Kubernetes-based platform technologies to scale hub and spoke deployments of edge applications.
A hub plays the role of a central control plane used for orchestrating the deployment and management of edge applications in a number of connected spoke locations such as a factory floor or a retail branch, where data is generated or locally aggregated for processing.
“Cloud out” refers to the paradigm where cloud service providers are extending their cloud architecture out to edge locations. In contrast, “edge in” refers to a provider-agnostic architecture that is cloud-independent and treats the data-plane as a first-class citizen.
IBM's overall architectural principles are scalability, repeatability, and full-stack solution management, allowing everything to be managed using a single unified control plane.
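As a toy illustration of this hub-and-spoke control-plane idea, the sketch below models a hub that rolls an application manifest out to registered spoke locations. All class and field names here are hypothetical, chosen only to mirror the article's description; a real deployment would use Kubernetes/OpenShift tooling rather than hand-rolled Python.

```python
class Spoke:
    """An edge location (e.g., a factory floor) that runs deployed apps."""
    def __init__(self, name):
        self.name = name
        self.apps = {}

    def deploy(self, app_name, manifest):
        self.apps[app_name] = manifest


class Hub:
    """Central control plane orchestrating deployments across spokes."""
    def __init__(self):
        self.spokes = {}

    def register(self, spoke):
        self.spokes[spoke.name] = spoke

    def rollout(self, app_name, manifest, locations=None):
        """Push one app manifest to selected (or all) spoke locations."""
        targets = locations or list(self.spokes)
        for loc in targets:
            self.spokes[loc].deploy(app_name, manifest)
        return targets


hub = Hub()
for site in ["factory-a", "factory-b", "retail-1"]:
    hub.register(Spoke(site))

# Roll a vision model out to the two factory spokes only.
deployed = hub.rollout("defect-detector", {"image": "models/defect:v2"},
                       locations=["factory-a", "factory-b"])
```

The single `Hub` object is the "single unified control plane": every spoke's state is visible and managed from one place.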
IBM’s Red Hat platform and infrastructure strategy anchors the application stack with a unified, scalable, and managed OpenShift-based control plane equipped with a high-performance storage appliance and self-healing system capabilities (inclusive of semi-autonomous operations).
IBM’s strategy also includes several in-progress platform-level technologies for scalable data, AI/ML runtimes, accelerator libraries for Day-2 AI operations, and scalability for the enterprise.
It is important to mention that IBM is designing its edge platforms with labor cost and technical workforce in mind. Data scientists with PhDs are in high demand, making them difficult to find and expensive to hire. IBM is designing its edge system capabilities and processes so that domain experts, rather than PhDs, can deploy new AI models and manage Day-2 operations.
Why edge is important
Advances in computing and storage have made it possible for AI to process mountains of accumulated data to provide solutions. By bringing AI closer to the source of data, edge computing is faster and more efficient than the cloud. While cloud data accounts for 60% of the world's data today, vast amounts of new data are being created at the edge, including industrial applications, traffic cameras, and order management systems, all of which can be processed at the edge in a fast and timely manner.
Public cloud and edge computing differ in capacity, technology, and management. An advantage of edge is that data is processed and analyzed at / near its collection point at the edge. In the case of cloud, data must be transferred from a local device and into the cloud for analytics and then transferred back to the edge again. Moving data through the network consumes capacity and adds latency to the process. It’s easy to see why executing a transaction at the edge reduces latency and eliminates unnecessary load on the network.
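A back-of-the-envelope model makes the latency argument concrete. The numbers below are illustrative assumptions, not measurements:

```python
def total_latency_ms(inference_ms, network_rtt_ms=0.0):
    """Inference time plus any network round trip the data must make."""
    return inference_ms + network_rtt_ms

# A modest local edge box vs. a faster cloud GPU behind a WAN round trip.
edge_ms = total_latency_ms(inference_ms=30.0)
cloud_ms = total_latency_ms(inference_ms=10.0, network_rtt_ms=80.0)
```

Even though the cloud processes the request faster, the network round trip dominates, which is the article's point about executing transactions at the edge.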
Increased privacy is another benefit of processing data at the edge. Analyzing data where it originates limits the risk of a security breach. Most of the communication between the edge and the cloud is then confined to such things as reporting, data summaries, and AI models, without ever exposing the raw data.
IBM at the Edge
In our discussion, Dr. Fuller provided a few examples to illustrate how IBM plans to provide new and seamless edge solutions for existing enterprise problems.
Example #1 – McDonald’s drive-thru
Dr. Fuller’s first example centered around Quick Service Restaurant’s (QSR) problem of drive-thru order taking. Last year, IBM acquired an automated order-taking system from McDonald's. As part of the acquisition, IBM and McDonald's established a partnership to perfect voice ordering methods using AI. Drive-thru orders are a significant percentage of total QSR orders for McDonald's and other QSR chains.
McDonald's and other QSR restaurants would like every order to be processed as quickly and accurately as possible. For that reason, McDonald's conducted trials at ten Chicago restaurants using an edge-based AI ordering system with NLP (Natural Language Processing) to convert spoken orders into a digital format. It was found that AI had the potential to reduce ordering errors and processing time significantly. Since McDonald's sells almost 7 million hamburgers daily, shaving a minute or two off each order represents a significant opportunity to address labor shortages and increase customer satisfaction.
Example #2 – Boston Dynamics and Spot the agile mobile robot
According to an earlier IBM survey, many manufacturers have already implemented AI-driven robotics with autonomous decision-making capability. The study also indicated that over 80 percent of companies believe AI can help improve future business operations. However, some companies expressed concern about the limited mobility of edge devices and sensors.
To develop a mobile edge solution, IBM teamed up with Boston Dynamics. The partnership created an agile mobile robot using IBM Research and IBM Sustainability Software AI technology. The device can analyze visual sensor readings in hazardous and challenging industrial environments such as manufacturing plants, warehouses, electrical grids, and waste treatment plants. The value proposition that Boston Dynamics brought to the partnership was Spot the agile mobile robot, a walking, sensing, and actuation platform. Like all edge applications, the robot's wireless mobility uses self-contained AI/ML that doesn't require access to cloud data. It uses cameras to read analog devices, visually monitor fire extinguishers, and conduct a visual inspection of human workers to determine if required safety equipment is being worn.
IBM was able to show up to a 10X speedup by automating some manual tasks, such as converting the detection of a problem into an immediate work order in IBM Maximo to correct it. A fast automated response was not only more efficient, but it also improved the safety posture and risk management for these facilities. Similarly, some factories need to thermally monitor equipment to identify any unexpected hot spots that may show up over time, indicative of a potential failure.
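The detection-to-work-order automation described above can be sketched as a simple event pipeline. The event fields and the `create_work_order` helper below are hypothetical; the production integration the article describes runs through IBM Maximo:

```python
def create_work_order(queue, event, priority_threshold_c=80.0):
    """File a high-priority work order when a thermal reading exceeds the threshold."""
    if event["temperature_c"] >= priority_threshold_c:
        order = {
            "asset": event["asset"],
            "issue": f"Hot spot at {event['temperature_c']} C",
            "priority": "high",
        }
        queue.append(order)
        return order
    return None  # reading within normal range, no action needed


orders = []
create_work_order(orders, {"asset": "transformer-17", "temperature_c": 92.5})
create_work_order(orders, {"asset": "connector-03", "temperature_c": 41.0})
```

The speedup the article cites comes from removing the human hand-off between detection and ticketing, which this pipeline collapses into a single step.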
IBM is working with National Grid, an energy company, to develop a mobile solution using Spot, the agile mobile robot, for image analysis of transformers and thermal connectors. As shown in the above graphic, Spot also monitored connectors on both flat surfaces and 3D surfaces. IBM was able to show that Spot could detect excessive heat build-up in small connectors, potentially avoiding unsafe conditions or costly outages. This AI/ML edge application can produce faster response times when an issue is detected, which is why IBM believes significant gains are possible by automating the entire process.
IBM market opportunities
Drive-thru orders and mobile robots are just a few examples of the millions of potential AI applications that exist at the edge and are driven by several billion connected devices.
Edge computing is an essential part of enterprise digital transformation. Enterprises seek ways to demonstrate the feasibility of solving business problems using AI/ML and analytics at the edge. However, once a proof of concept has been successfully demonstrated, it is a common problem for a company to struggle with scalability, data governance, and full-stack solution management.
Challenges with scaling
“Determining entry points for AI at the edge is not the difficult part,” Dr. Fuller said. “Scale is the real issue.”
Scaling edge models is complicated because there are so many edge locations with large amounts of diverse content and a high device density. Because large amounts of data are required for training, data gravity is a potential problem. Further, in many scenarios, vast amounts of data are generated quickly, leading to potential data storage and orchestration challenges. AI models are also rarely "finished." Monitoring and retraining of models are necessary to keep up with changes in the environment.
Through IBM Research, IBM is addressing the many challenges of building an all-encompassing edge architecture and horizontally scalable data and AI technologies. IBM has a wealth of edge capabilities and an architecture to create the appropriate platform for each application.
IBM AI entry points at the edge
IBM sees Edge Computing as a $200 billion market by 2025. Dr. Fuller and his organization have identified four key market entry points for developing and expanding IBM’s edge compute strategy. In order of size, IBM believes its priority edge markets to be intelligent factories (Industry 4.0), telcos, retail automation, and connected vehicles.
IBM and its Red Hat portfolio already have an established presence in each market segment, particularly in intelligent operations and telco. Red Hat is also active in the connected vehicles space.
There have been three prior industrial revolutions since the 1700s; the current, in-progress fourth revolution, Industry 4.0, promotes digital transformation.
Manufacturing is the fastest growing and the largest of IBM's four entry markets. In this segment, AI at the edge can improve quality control, production optimization, asset management, and supply chain logistics. IBM believes there are opportunities to achieve a 4x speedup in implementing edge-based AI solutions for manufacturing operations.
For its Industry 4.0 use case development, IBM, through product, development, research and consulting teams, is working with a major automotive OEM. The partnership has established the following joint objectives:
Increase automation and scalability across dozens of plants using 100s of AI / ML models. This client has already seen value in applying AI/ML models for manufacturing applications. IBM Research is helping with re-training models and implementing new ones in an edge environment to help scale even more efficiently. Edge offers faster inference and low latency, allowing AI to be deployed in a wider variety of manufacturing operations requiring instant solutions.
Dramatically reduce the time required to onboard new models. This will allow training and inference to be done faster and allow large models to be deployed much more quickly. The quicker an AI model can be deployed in production, the quicker the time-to-value and the return-on-investment (ROI).
Accelerate deployment of new inspections by reducing the labeling effort and iterations needed to produce a production-ready model via data summarization. Selecting small data sets for annotation means manually examining thousands of images, a time-consuming process that often results in labeling redundant data. Using ML-based automation for data summarization will accelerate the process and produce better model performance.
Enable Day-2 AI operations to help with data lifecycle automation and governance, model creation, and reduction of production errors, and to provide detection of out-of-distribution data to help determine whether a model's inference is accurate. IBM believes this will allow models to be created faster without data scientists.
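The data-summarization objective above, picking a small but diverse subset of images for annotation rather than labeling near-duplicates, can be illustrated with greedy farthest-point sampling over feature vectors. This is a stand-in technique; the article does not name the exact method IBM uses:

```python
import math

def farthest_point_sample(features, k):
    """Greedily select k mutually distant feature vectors for annotation."""
    selected = [0]  # seed with the first sample
    while len(selected) < k:
        # Pick the point whose nearest already-selected neighbor is farthest away.
        best = max(
            (i for i in range(len(features)) if i not in selected),
            key=lambda i: min(math.dist(features[i], features[j]) for j in selected),
        )
        selected.append(best)
    return selected

# Three tight clusters of near-duplicate frames (2D features for readability).
frames = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0), (9.0, 0.0), (9.1, 0.1)]
picks = farthest_point_sample(frames, k=3)
```

With one pick landing in each cluster, an annotator labels three images instead of six while still covering the data's variety.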
Maximo Application Suite
IBM's Maximo Application Suite plays an important part in implementing large manufacturers' current and future IBM edge solutions. Maximo is an integrated public or private cloud platform that uses AI, IoT, and analytics to optimize performance, extend asset lifecycles, and reduce operational downtime and costs. IBM is working with several large manufacturing clients currently using Maximo to develop edge use cases, and even uses it within its own manufacturing operations.
IBM has research underway to develop a more efficient method of handling life cycle management of large models that require immense amounts of data. Day 2 AI operations tasks can sometimes be more complex than initial model training, deployment, and scaling. Retraining at the edge is difficult because resources are typically limited.
Once a model is trained and deployed, it is important to monitor it for drift caused by changes in data distributions or anything that might cause a model to deviate from original requirements. Inaccuracies can adversely affect model ROI.
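A minimal sketch of such drift monitoring, assuming we compare a live window of one input feature against the training-time reference distribution via a z-score on the window mean. The threshold and feature are illustrative; real Day-2 tooling would track many features with proper statistical tests:

```python
import statistics

def drifted(reference, window, z_threshold=3.0):
    """Flag drift when the live window mean strays far from the reference mean."""
    mu = statistics.mean(reference)
    sigma = statistics.stdev(reference)
    z = abs(statistics.mean(window) - mu) / sigma
    return z > z_threshold

# e.g., ambient temperature readings seen at training time
reference = [20.0, 21.0, 19.5, 20.5, 20.0, 19.8, 20.2]
stable    = [20.1, 19.9, 20.3]   # live window, same regime
shifted   = [27.0, 28.5, 27.8]   # sensor recalibrated, distribution moved
```

A flagged window would trigger the retraining loop described above rather than silently degrading model ROI.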
Day-2 AI Operations (retraining and scaling)
Day-2 AI operations consist of continual updates to AI models and applications to keep up with changes in data distributions, changes in the environment, a drop in model performance, availability of new data, and/or new regulations.
IBM recognizes the advantages of performing Day-2 AI Operations, which includes scaling and retraining at the edge. It appears that IBM is the only company with an architecture equipped to effectively handle Day-2 AI operations. That is a significant competitive advantage for IBM.
A company using an architecture that requires data to be moved from the edge back into the cloud for Day-2 related work will be unable to support many factory AI/ML applications because of the sheer number of AI/ML models to support (100s to 1000s).
“There is a huge proliferation of data at the edge that exists in multiple spokes,” Dr. Fuller said. “However, all that data isn’t needed to retrain a model. It is possible to cluster data into groups and then use sampling techniques to retrain the model. There is much value in federated learning from our point of view.”
Federated learning is a promising training solution being researched by IBM and others. It preserves privacy by using a collaboration of edge devices to train models without sharing the data with other entities. It is a good framework to use when resources are limited.
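The idea can be sketched with one round of federated averaging over a single-weight linear model: each spoke takes a gradient step on its own data, and only the updated weights (never the raw data) travel to the hub, which aggregates them weighted by sample count. All names and numbers are illustrative:

```python
def local_update(weight, data, lr=0.1):
    """One gradient step on y = w*x with squared error, using local data only."""
    grad = sum(2 * (weight * x - y) * x for x, y in data) / len(data)
    return weight - lr * grad

def federated_average(updates):
    """Sample-count-weighted average of the spokes' updated weights."""
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

global_w = 0.0
spokes = [
    [(1.0, 2.0), (2.0, 4.0)],               # this data never leaves spoke 1
    [(1.0, 2.1), (3.0, 6.3), (2.0, 4.2)],   # this data never leaves spoke 2
]
updates = [(local_update(global_w, d), len(d)) for d in spokes]
global_w = federated_average(updates)
```

After one round the global weight has moved toward the true slope (about 2.1) even though the hub never saw a single raw sample.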
Dealing with limited resources at the edge is a challenge. IBM’s edge architecture accommodates the need to ensure resource budgets for AI applications are met, especially when deploying multiple applications and multiple models across edge locations. For that reason, IBM developed a method to deploy data and AI applications to scale Day-2 AI operations utilizing hub and spokes.
The graphic above shows the current status quo methods of performing Day-2 operations using centralized applications and a centralized data plane compared to the more efficient managed hub and spoke method with distributed applications and a distributed data plane. The hub allows it all to be managed from a single pane of glass.
Data Fabric Extensions to Hub and Spokes
IBM uses hub and spoke as a model to extend its data fabric. The model should not be thought of in the context of a traditional hub and spoke. IBM's hub provides centralized capabilities to manage clusters and create multiple hubs that can be aggregated to a higher level. This architecture has four important data management capabilities.
First, models running in unattended environments must be monitored. From an operational standpoint, detecting when a model’s effectiveness has significantly degraded and if corrective action is needed is critical.
Second, in a hub and spoke model, data is being generated and collected in many locations, creating a need for data lifecycle management. Working with large enterprise clients, IBM is building unique capabilities to manage the data plane across the hub and spoke estate, optimized to meet data lifecycle, regulatory, and compliance requirements as well as local resource constraints. Automation determines which input data should be selected and labeled for retraining purposes and used to further improve the model. Identification is also made of atypical data that is judged worthy of human attention.
The third issue relates to AI pipeline compression and adaptation. As mentioned earlier, edge resources are limited and highly heterogeneous. While a cloud-based model might have a few hundred million parameters or more, edge models can’t afford such resource extravagance because of resource limitations. To reduce the edge compute footprint, model compression can reduce the number of parameters. As an example, it could be reduced from several hundred million to a few million.
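Magnitude pruning is one simple way to realize the parameter reduction described above: zero out the smallest-magnitude weights and keep the rest. Real compression pipelines combine pruning with quantization and distillation; this sketch is illustrative only:

```python
def prune(weights, keep_fraction):
    """Keep only the largest-magnitude fraction of weights, zeroing the rest."""
    k = max(1, int(len(weights) * keep_fraction))
    cutoff = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    return [w if abs(w) >= cutoff else 0.0 for w in weights]

# A tiny "layer" standing in for hundreds of millions of parameters.
dense = [0.9, -0.05, 0.02, -0.8, 0.01, 0.7, -0.03, 0.6]
sparse = prune(dense, keep_fraction=0.5)
nonzero = sum(1 for w in sparse if w != 0.0)
```

With the zeroed weights stored in a sparse format, the edge footprint shrinks roughly in proportion to `keep_fraction`, at some cost in accuracy that must be validated.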
Lastly, suppose a scenario exists where data is produced at multiple spokes but cannot leave those spokes for compliance reasons. In that case, IBM Federated Learning allows learning across heterogeneous data in multiple spokes. Users can discover, curate, categorize and share data assets, data sets, analytical models, and their relationships with other organization members.
In addition to AI deployments, the hub and spoke architecture and the previously mentioned capabilities can be employed more generally to tackle challenges faced by many enterprises in consistently managing an abundance of devices within and across their enterprise locations. Managing the software delivery lifecycle or addressing security vulnerabilities across a vast estate are cases in point.
Multicloud and Edge platform
In the context of its strategy, IBM sees edge and distributed cloud as an extension of its hybrid cloud platform built around Red Hat OpenShift. One of the newer and more useful options created by the Red Hat development team is Single Node OpenShift (SNO), a compact version of OpenShift that fits on a single server. It is suitable for locations that still run servers but as a single-node, non-clustered deployment.
For smaller footprints such as industrial PCs or computer vision boards (for example NVidia Jetson Xavier), Red Hat is working on a project which builds an even smaller version of OpenShift, called MicroShift, that provides full application deployment and Kubernetes management capabilities. It is packaged so that it can be used for edge device type deployments.
Overall, IBM and Red Hat have developed a full complement of options to address a large spectrum of deployments across different edge locations and footprints, ranging from containers to full-blown Kubernetes application management, from MicroShift to OpenShift and IBM Edge Application Manager.
Much is still in the research stage. IBM's objective is to achieve greater consistency in how locations and application lifecycles are managed.
First, Red Hat plans to introduce hierarchical layers of management with Red Hat Advanced Cluster Management (RHACM), to scale by two to three orders of magnitude the number of edge locations managed by this product. Additionally, securing edge locations is a major focus. Red Hat is continuously expanding platform security features, for example by recently including Integrity Measurement Architecture in Red Hat Enterprise Linux, or by adding Integrity Shield to protect policies in Red Hat Advanced Cluster Management (RHACM).
Red Hat is partnering with IBM Research to advance technologies that will permit it to protect platform integrity and the integrity of client workloads through the entire software supply chains. In addition, IBM Research is working with Red Hat on analytic capabilities to identify and remediate vulnerabilities and other security risks in code and configurations.
Telco network intelligence and slice management with AI/ML
Communication service providers (CSPs) such as telcos are key enablers of 5G at the edge. 5G benefits for these providers include:
Reduced operating costs
Increased distribution and density
The end-to-end 5G network comprises the Radio Access Network (RAN), transport, and core domains. Network slicing in 5G is an architecture that enables multiple virtual and independent end-to-end logical networks with different characteristics such as low latency or high bandwidth, to be supported on the same physical network. This is implemented using cloud-native technology enablers such as software defined networking (SDN), virtualization, and multi-access edge computing. Slicing offers necessary flexibility by allowing the creation of specific applications, unique services, and defined user groups or networks.
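The slice concept can be modeled as a small data structure: each slice is an independent logical network with its own service characteristics, and an application is matched to the slices that satisfy its requirements. The slice names follow common 5G shorthand; the numbers are made up:

```python
slices = {
    "urllc": {"latency_ms": 5, "bandwidth_mbps": 50},     # ultra-reliable low latency
    "embb":  {"latency_ms": 40, "bandwidth_mbps": 1000},  # enhanced mobile broadband
    "miot":  {"latency_ms": 100, "bandwidth_mbps": 1},    # massive IoT

}

def pick_slice(max_latency_ms, min_bandwidth_mbps):
    """Return the slices whose characteristics satisfy the app's requirements."""
    return [
        name for name, qos in slices.items()
        if qos["latency_ms"] <= max_latency_ms
        and qos["bandwidth_mbps"] >= min_bandwidth_mbps
    ]

# A factory AR application needs low latency and moderate bandwidth.
candidates = pick_slice(max_latency_ms=10, min_bandwidth_mbps=20)
```

The AI/ML-driven slice management the article describes would continuously adjust these QoS parameters rather than treating them as static entries.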
An important aspect of enabling AI at the edge requires IBM to provide CSPs with the capability to deploy and manage applications across various enterprise locations, possibly spanning multiple end-to-end network slices, using a single pane of glass.
5G network slicing and slice management
Network slices are an essential part of IBM's edge infrastructure that must be automated, orchestrated and optimized according to 5G standards. IBM’s strategy is to leverage AI/ML to efficiently manage, scale, and optimize the slice quality of service, measured in terms of bandwidth, latency, or other metrics.
5G and AI/ML at the edge also represent a significant opportunity for CSPs to move beyond traditional cellular services and capture new sources of revenue with new services.
Communications service providers need management and control of 5G network slicing enabled with AI-powered automation.
Dr. Fuller sees a variety of opportunities in this area. "When it comes to applying AI and ML on the network, you can detect things like intrusion detection and malicious actors," he said. "You can also determine the best way to route traffic to an end user. Automating 5G functions that run on the network using IBM network automation software also serves as an entry point.”
In IBM’s current telecom trial, IBM Research is spearheading the development of a range of capabilities targeted for the IBM Cloud Pak for Network Automation product using AI and automation to orchestrate, operate and optimize multivendor network functions and services that include:
End-to-end 5G network slice management with planning & design, automation & orchestration, and operations & assurance
Network Data Analytics Function (NWDAF) that collects data for slice monitoring from 5G Core network functions, performs network analytics, and provides insights to authorized data consumers.
Improved operational efficiency and reduced cost
Future leverage of these capabilities by existing IBM Clients that use the Cloud Pak for Network Automation (e.g., DISH) can offer further differentiation for CSPs.
5G radio access
Open radio access networks (O-RANs) are expected to significantly impact telco 5G wireless edge applications by allowing a greater variety of units to access the system. The O-RAN concept separates the DU (Distributed Unit) and CU (Centralized Unit) from the 4G Baseband Unit and connects them with open interfaces.
The O-RAN system is more flexible. It uses AI to establish connections made via open interfaces that optimize the category of a device by analyzing information about its prior use. Like other edge models, the O-RAN architecture provides an opportunity for continuous monitoring, verification, analysis, and optimization of AI models.
The IBM-telco collaboration is expected to advance O-RAN interfaces and workflows. Areas currently under development are:
Multi-modal (RF level + network-level) analytics (AI/ML) for wireless communication with high-speed ingest of 5G data
Capability to learn patterns of metric and log data across CUs and DUs in RF analytics
Utilization of the antenna control plane to optimize throughput
Primitives for forecasting, anomaly detection and root cause analysis using ML
Opportunity of value-added functions for O-RAN
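As a hedged illustration of the forecasting and anomaly-detection primitives listed above, the sketch below scores each point of a throughput metric stream against a trailing window; the window size and threshold are arbitrary choices, not anything the article specifies:

```python
import statistics

def anomalies(stream, window=5, z_threshold=3.0):
    """Indices whose value deviates sharply from the preceding window."""
    flagged = []
    for i in range(window, len(stream)):
        past = stream[i - window:i]
        mu, sigma = statistics.mean(past), statistics.stdev(past)
        if sigma and abs(stream[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Cell throughput (Mbps) with a sudden dip at index 7.
throughput = [100, 102, 99, 101, 100, 98, 101, 40, 100, 99]
```

In an O-RAN setting, a flagged dip like this would feed the root-cause-analysis primitives rather than paging an operator directly.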
IBM Cloud and Infrastructure
The cornerstone for the delivery of IBM's edge solutions as a service is IBM Cloud Satellite. It presents a consistent cloud-ready, cloud-native operational view with OpenShift and IBM Cloud PaaS services at the edge. In addition, IBM integrated hardware and software Edge systems will provide RHACM-based management of the platform when clients or third parties have existing managed-as-a-service models. It is essential to note that in either case this is done within a single control plane for hubs and spokes that helps optimize execution and management from any cloud to the edge in the hub and spoke model.
IBM's focus on “edge in” means it can provide the infrastructure through offerings like the example shown above: software-defined storage for a federated-namespace data lake that spans other hyperscalers' clouds. Additionally, IBM is exploring integrated full-stack edge storage appliances based on hyperconverged infrastructure (HCI), such as the Spectrum Fusion HCI, for enterprise edge deployments.
As mentioned earlier, data gravity is one of the main driving factors of edge deployments. IBM has designed its infrastructure to meet those data gravity requirements, not just for the existing hub and spoke topology but also for a future spoke-to-spoke topology where peer-to-peer data sharing becomes imperative (as illustrated with the wealth of examples provided in this article).
Edge is a distributed computing model. One of its main advantages is that computing, data storage, and processing are close to where data is created. Without the need to move data to the cloud for processing, real-time application of analytics and AI capabilities provides immediate solutions and drives business value.
IBM’s goal is not to move the entirety of its cloud infrastructure to the edge. That has little value and would simply function as a hub to spoke model operating on actions and configurations dictated by the hub.
IBM’s architecture will provide the edge with autonomy to determine where data should reside and from where the control plane should be exercised.
Equally important, IBM foresees this architecture evolving into a decentralized model capable of edge-to-edge interactions. IBM has no firm designs for this as yet. However, the plan is to make the edge infrastructure and platform a first-class citizen instead of relying on the cloud to drive what happens at the edge.
Developing a complete and comprehensive AI/ML edge architecture - and in fact, an entire ecosystem - is a massive undertaking. IBM faces many known and unknown challenges that must be solved before it can achieve success.
However, IBM is one of the few companies with the necessary partners and the technical and financial resources to undertake and successfully implement a project of this magnitude and complexity.
It is reassuring that IBM has a plan and that its plan is sound.
Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence and space at Moor Insights and Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 8×8, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Inseego, Infosys, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung 
Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign,TE Connectivity, TensTorrent, Tobii Technology, Teradata,T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movandi.
Mon, 08 Aug 2022 03:51:00 -0500 Paul Smith-Goodson
https://www.forbes.com/sites/moorinsights/2022/08/08/ibm-research-rolls-out-a-comprehensive-ai-and-ml-edge-research-strategy-anchored-by-enterprise-partnerships-and-use-cases/

IBM report shows cyberattacks growing fast in number, scale

A new report out of IBM shows that when it comes to the rising threat of data breaches, it is the consumer, not the company, fronting the price tag.

Fri, 29 Jul 2022 22:30:00 -0500
https://www.bizjournals.com/triad/news/2022/07/30/ibm-data-cyberattacks-growing-in-number-scale.html

Amazon, IBM Move Swiftly on Post-Quantum Cryptographic Algorithms Selected by NIST
A month after the National Institute of Standards and Technology (NIST) revealed the first quantum-safe algorithms, Amazon Web Services (AWS) and IBM have swiftly moved forward. Google was also quick to outline an aggressive implementation plan for its cloud service that it started a decade ago.
It helps that IBM researchers contributed to three of the four algorithms, while AWS had a hand in two. Google contributed to one of the submitted algorithms, SPHINCS+.
A long process that started in 2016 with 69 original candidates ends with the selection of four algorithms that will become NIST standards, which will play a critical role in protecting encrypted data from the vast power of quantum computers.
NIST's four choices include CRYSTALS-Kyber, a public-private key-encapsulation mechanism (KEM) for general asymmetric encryption, such as when connecting to websites. For digital signatures, NIST selected CRYSTALS-Dilithium, FALCON, and SPHINCS+. NIST will add a few more algorithms to the mix in two years.
Vadim Lyubashevsky, a cryptographer who works in IBM's Zurich Research Laboratories, contributed to the development of CRYSTALS-Kyber, CRYSTALS-Dilithium, and Falcon. Lyubashevsky was predictably pleased by the algorithms selected, but he had only anticipated NIST would pick two digital signature candidates rather than three.
Ideally, NIST would have chosen a second key establishment algorithm, according to Lyubashevsky. "They could have chosen one more right away just to be safe," he told Dark Reading. "I think some people expected McEliece to be chosen, but maybe NIST decided to hold off for two years to see what the backup should be to Kyber."
IBM's New Mainframe Supports NIST-Selected Algorithms
After NIST identified the algorithms, IBM moved forward by specifying them into its recently launched z16 mainframe. IBM introduced the z16 in April, calling it the "first quantum-safe system," enabled by its new Crypto Express 8S card and APIs that provide access to the NIST APIs.
IBM had championed three of the algorithms NIST selected and had already built support for them into the z16, which it unveiled before the NIST decision. Last week, IBM made it official that the z16 supports the selected algorithms.
Anne Dames, an IBM distinguished engineer who works on the company's z Systems team, explained that the Crypto Express 8S card could implement various cryptographic algorithms. Nevertheless, IBM was betting on CRYSTALS-Kyber and CRYSTALS-Dilithium, according to Dames.
"We are very fortunate in that it went in the direction we hoped it would go," she told Dark Reading. "And because we chose to implement CRYSTALS-Kyber and CRYSTALS-Dilithium in the hardware security module, which allows clients to get access to it, the firmware in that hardware security module can be updated. So, if other algorithms were selected, then we would add them to our roadmap for inclusion of those algorithms for the future."
A software library on the system allows application and infrastructure developers to incorporate APIs so that clients can generate quantum-safe digital signatures for both classic computing systems and quantum computers.
"We also have a CRYSTALS-Kyber interface in place so that we can generate a key and provide it wrapped by a Kyber key so that could be used in a potential key exchange scheme," Dames said. "And we've also incorporated some APIs that allow clients to have a key exchange scheme between two parties."
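The flow Dames describes — encapsulate a shared secret under a Kyber public key, then use it to wrap a working key exchanged between two parties — can be sketched generically. The toy stand-in below uses stdlib hashing purely to show the shape of a KEM call flow; it has none of Kyber's lattice math or security guarantees, and all names are invented for illustration, not IBM's Crypto Express APIs:

```python
import hashlib
import secrets

# Toy stand-in for a KEM such as CRYSTALS-Kyber. The real algorithm is
# lattice-based; here the keys and secrets are just hash-derived bytes,
# purely to illustrate the keygen/encapsulate/decapsulate call flow.
class ToyKEM:
    def generate_keypair(self):
        sk = secrets.token_bytes(32)
        pk = hashlib.sha256(sk).digest()
        return pk, sk

    def encapsulate(self, pk):
        # Sender side: produce a ciphertext plus the shared secret it encodes.
        eph = secrets.token_bytes(32)
        shared = hashlib.sha256(pk + eph).digest()
        return eph, shared          # (ciphertext, shared secret)

    def decapsulate(self, sk, ciphertext):
        # Receiver side: re-derive the same shared secret from the ciphertext.
        pk = hashlib.sha256(sk).digest()
        return hashlib.sha256(pk + ciphertext).digest()

def wrap_key(kek: bytes, data_key: bytes) -> bytes:
    # Simplistic XOR "wrap" with a hash-derived stream (illustration only;
    # a real system would use an authenticated key-wrap mode).
    stream = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(data_key, stream))

kem = ToyKEM()
pk, sk = kem.generate_keypair()
ct, sender_secret = kem.encapsulate(pk)
receiver_secret = kem.decapsulate(sk, ct)
assert sender_secret == receiver_secret

data_key = secrets.token_bytes(32)               # e.g., a symmetric working key
wrapped = wrap_key(sender_secret, data_key)
unwrapped = wrap_key(receiver_secret, wrapped)   # XOR wrap is its own inverse
assert unwrapped == data_key
```

Both parties end up with the same working key without it ever crossing the wire in the clear — which is the role a Kyber-wrapped key plays in the exchange scheme Dames describes.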
Dames noted that clients might use Dilithium to generate digital signatures on documents. "Think about code-signing servers, things like that, or document-signing services, where people would like to use the digital signature capability to ensure the authenticity of the document or of the code that's being used," she said.
AWS Engineers Algorithms Into Services
During Amazon's AWS re:Inforce security conference last week in Boston, the cloud provider emphasized its post-quantum cryptography (PQC) efforts. According to Margaret Salter, director of applied cryptography at AWS, Amazon is already engineering the NIST standards into its services.
During a breakout session on AWS' cryptography efforts at the conference, Salter said AWS had implemented an open source, hybrid post-quantum key exchange in s2n-tls, its implementation of the Transport Layer Security (TLS) protocol used across different AWS services. AWS has contributed the hybrid design as a draft standard to the Internet Engineering Task Force (IETF).
Salter explained that the hybrid key exchange brings together its traditional key exchanges while enabling post-quantum security. "We have regular key exchanges that we've been using for years and years to protect data," she said. "We don't want to get rid of those; we're just going to enhance them by adding a public key exchange on top of it. And using both of those, you have traditional security, plus post quantum security."
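The "both of those" combination Salter describes can be sketched in a few lines: derive the session key from the concatenation of the classical and post-quantum shared secrets, so an attacker must break both exchanges to recover it. This mirrors the spirit of the hybrid design (the real s2n-tls construction feeds the combined secret into the TLS key schedule; the labels and stand-in inputs below are illustrative):

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract step (RFC 5869) over SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def combine_secrets(classical_secret: bytes, pq_secret: bytes) -> bytes:
    # Concatenate both shared secrets and extract one session key.
    # Breaking only the classical exchange (or only the PQ one)
    # leaves the derived key unrecoverable.
    return hkdf_extract(b"hybrid-kex", classical_secret + pq_secret)

ecdhe_secret = b"\x01" * 32   # stand-in for an ECDHE shared secret
kyber_secret = b"\x02" * 32   # stand-in for a Kyber shared secret
session_key = combine_secrets(ecdhe_secret, kyber_secret)
assert len(session_key) == 32
```

The design choice is deliberately conservative: traffic stays at least as secure as today's key exchange even if the newer post-quantum algorithm were later found weak.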
While Google didn't make implementation announcements like AWS in the immediate aftermath of NIST's selection, VP and CISO Phil Venables said Google has been focused on PQC algorithms "beyond theoretical implementations" for over a decade. Venables was among several prominent researchers who co-authored a technical paper outlining the urgency of adopting PQC strategies. The peer-reviewed paper was published in May by Nature, a respected journal for the science and technology communities.
"At Google, we're well into a multi-year effort to migrate to post-quantum cryptography that is designed to address both immediate and long-term risks to protect sensitive information," Venables wrote in a blog post published following the NIST announcement. "We have one goal: ensure that Google is PQC ready."
Venables recalled an experiment in 2016 with Chrome where a minimal number of connections from the Web browser to Google servers used a post-quantum key-exchange algorithm alongside the existing elliptic-curve key-exchange algorithm. "By adding a post-quantum algorithm in a hybrid mode with the existing key exchange, we were able to test its implementation without affecting user security," Venables noted.
Google and Cloudflare announced a "wide-scale post-quantum experiment" in 2019 implementing two post-quantum key exchanges, "integrated into Cloudflare's TLS stack, and deployed the implementation on edge servers and in Chrome Canary clients." The experiment helped Google understand the implications of deploying two post-quantum key agreements with TLS.
Venables noted that last year Google tested post-quantum confidentiality in TLS and found that various network products were not compatible with post-quantum TLS. "We were able to work with the vendor so that the issue was fixed in future firmware updates," he said. "By experimenting early, we resolved this issue for future deployments."
Other Standards Efforts
The four algorithms NIST announced are an important milestone in advancing PQC, but there's other work to be done besides quantum-safe encryption. The AWS TLS submission to the IETF is one example; others include such efforts as Hybrid PQ VPN.
"What you will see happening is those organizations that work on TLS protocols, or SSH, or VPN type protocols, will now come together and put together proposals which they will evaluate in their communities to determine what's best and which protocols should be updated, how the certificates should be defined, and things like that," IBM's Dames said.
Dustin Moody, a mathematician at NIST who leads its PQC project, shared a similar view during a panel discussion at the RSA Conference in June. "There's been a lot of global cooperation with our NIST process, rather than fracturing of the effort and coming up with a lot of different algorithms," Moody said. "We've seen most countries and standards organizations waiting to see what comes out of our NIST process, as well as participating in it. And we see that as a very good sign."
Thu, 04 Aug 2022 09:03:00 -0500 https://www.darkreading.com/dr-tech/amazon-ibm-move-swiftly-on-post-quantum-cryptographic-algorithms-selected-by-nist

Killexams : IBM aims for immediate quantum advantage with error mitigation technique
You don’t have to be a physicist to know that noise and quantum computing don’t mix. Any noise, movement or temperature swing causes qubits – the quantum computing equivalent to a binary bit in classical computing – to fail.
That’s one of the main reasons quantum advantage (the point at which quantum surpasses classical computing) and quantum supremacy (when quantum computers solve a problem not feasible for classical computing) still feel like longer-term goals. The wait should be worth it, though: quantum computers promise exponential speedups over classical computing, which tops out at supercomputing. However, due to the intricacies of quantum physics (e.g., entanglement), quantum computers are also more prone to errors from environmental factors than supercomputers or high-performance computers.
Quantum errors arise from what’s known as decoherence, a process that occurs when noise or nonoptimal temperatures interfere with qubits, changing their quantum states and causing information stored by the quantum computer to be lost.
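Decoherence can be put in rough numbers: a qubit's phase coherence decays approximately exponentially with a characteristic time T2, so the longer a circuit runs, the less of the encoded information survives. A back-of-the-envelope sketch (the T2 value is illustrative, not the spec of any particular IBM device):

```python
import math

def coherence_remaining(t_us: float, t2_us: float) -> float:
    """Fraction of phase coherence left after t microseconds,
    assuming simple exponential decay exp(-t / T2)."""
    return math.exp(-t_us / t2_us)

# Illustrative T2 of 100 microseconds
for t in (1, 10, 100, 300):
    print(f"after {t:>3} us: {coherence_remaining(t, 100.0):.3f}")
    # prints 0.990, 0.905, 0.368, 0.050 for t = 1, 10, 100, 300
```

By 3x the coherence time, only about 5 percent of the coherence remains — which is why longer coherence times, faster gates, and error handling all translate directly into deeper runnable circuits.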
The road(s) to quantum
Many enterprises view quantum computing as a zero-sum proposition: to get value from a quantum computer, you need fault-tolerant quantum processors and a multitude of qubits. While we wait, we’re stuck in the NISQ era — noisy intermediate-scale quantum — where quantum hasn’t yet surpassed classical computers.
That’s an impression IBM hopes to change.
In a blog published today by IBM, its quantum team (Kristan Temme, Ewout van den Berg, Abhinav Kandala and Jay Gambetta) writes that the history of classical computing is one of incremental advances.
“Although quantum computers have seen tremendous improvements in their scale, quality and speed in recent years, such a gradual evolution seems to be missing from the narrative,” the team wrote. “However, recent advances in techniques we refer to broadly as quantum error mitigation allow us to lay out a smoother path towards this goal. Along this path, advances in qubit coherence, gate fidelities and speed immediately translate to measurable advantage in computation, akin to the steady progress historically observed with classical computers.”
Finding value in noisy qubits
In a move to get a quantum advantage sooner – and in incremental steps – IBM claims to have created a technique that’s designed to tap more value from noisy qubits and move away from NISQ.
Rather than focusing solely on fault-tolerant computers, IBM is aiming for continuous, incremental improvement, Jerry Chow, the director of hardware development for IBM Quantum, told VentureBeat.
To mitigate errors, Chow points to IBM’s new probabilistic error cancellation, a technique that inverts a model of a circuit’s noise so that noisy hardware can still produce error-free estimates. It brings a runtime tradeoff, he said: you run more circuits in exchange for insight into the noise causing the errors.
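The arithmetic behind a simple noise inversion is compact. For a bit-flip channel with flip probability p, a ±1 observable's noisy expectation is (1 − 2p) times the ideal one, so dividing the measured value by (1 − 2p) recovers an unbiased estimate — at the price of extra shots, since the statistical noise is amplified too. The toy simulation below illustrates that invert-the-noise idea; it is not IBM's actual probabilistic error cancellation, which inverts learned circuit-level noise models:

```python
import random

def noisy_measurements(ideal_value, p_flip, shots, rng):
    """Sample +/-1 outcomes whose ideal expectation is ideal_value,
    then flip each one with probability p_flip (bit-flip readout noise)."""
    outcomes = []
    for _ in range(shots):
        bit = 1 if rng.random() < (1 + ideal_value) / 2 else -1
        if rng.random() < p_flip:
            bit = -bit
        outcomes.append(bit)
    return outcomes

rng = random.Random(7)
ideal, p = 0.8, 0.1
samples = noisy_measurements(ideal, p, 200_000, rng)
noisy_estimate = sum(samples) / len(samples)   # biased toward zero, ~ (1 - 2p) * ideal
mitigated = noisy_estimate / (1 - 2 * p)       # invert the known noise channel
```

Because the shot noise is divided by (1 − 2p) as well, hitting the same precision needs roughly 1/(1 − 2p)^2 more circuit executions — the runtime tradeoff Chow describes.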
The goal of the new technique is to provide a step, rather than a leap, towards quantum supremacy. It’s “a near-term solution,” Chow said, and a part of a suite of techniques that will help IBM learn about error correction through error migration. “As you increase the runtime, you learn more as you run more qubits,” he explained.
Chow said that while IBM continues to scale its quantum platform, this offers an incremental step. Last year, IBM unveiled its 127-qubit Eagle processor, which is capable of running quantum circuits that can’t be replicated classically. Based on the quantum roadmap laid out in May, IBM is on track to reach 4,000-plus-qubit devices in 2025.
Not an either-or scenario: Quantum starts now
Probabilistic error cancellation represents a shift for IBM and the quantum field overall. Rather than relying solely on experiments to achieve full error correction under certain circumstances, IBM has focused on a continuous push to address quantum errors today while still moving toward fault-tolerant machines, Chow said. “You need high-quality hardware to run billions of circuits. Speed is needed. The goal is not to do error mitigation long-term. It’s not all or nothing.”
IBM quantum computing bloggers add that its quantum error mitigation technique “is the continuous path that will take us from today’s quantum hardware to tomorrow’s fault-tolerant quantum computers. This path will let us run larger circuits needed for quantum advantage, one hardware improvement at a time.”
Tue, 19 Jul 2022 15:31:00 -0500 Dan Muse https://venturebeat.com/2022/07/19/ibm-aims-for-immediate-quantum-advantage-with-error-mitigation-technique/

Killexams : Colorado’s P-TECH Students Graduate Ready for Tech Careers

(TNS) — Abraham Tinajero was an eighth grader when he saw a poster in his Longmont middle school’s library advertising a new program offering free college with a technology focus.
Interested, he talked to a counselor to learn more about P-TECH, an early college program where he could earn an associate’s degree along with his high school diploma. Liking the sound of the program, he enrolled in the inaugural P-TECH class as a freshman at Longmont’s Skyline High School.
“I really loved working on computers, even before P-TECH,” he said. “I was a hobbyist. P-TECH gave me a pathway.”
He worked with an IBM mentor and interned at the company for six weeks as a junior. After graduating in 2020 with his high school diploma and the promised associate’s degree in computer science from Front Range Community College, he was accepted to IBM’s yearlong, paid apprenticeship program.
IBM hired him as a cybersecurity analyst once he completed the apprenticeship.
“P-TECH has given me a great advantage,” he said. “Without it, I would have been questioning whether to go into college. Having a college degree at 18 is great to put on a resume.”
Stanley Litow, a former vice president of IBM, developed the P-TECH, or Pathways in Technology Early College High Schools, model. The first P-TECH school opened 11 years ago in Brooklyn, New York, in partnership with IBM.
Litow’s idea was to get more underrepresented young people into tech careers by giving them a direct path to college while in high school — and in turn create a pipeline of employees with the job skills businesses were starting to value over four-year college degrees.
The program, which includes mentors and internships provided by business partners, gives high school students up to six years to earn an associate's degree at no cost.
SKYLINE HIGH A PIONEER IN PROGRAM
In Colorado, St. Vrain Valley was among the first school districts chosen by the state to offer a P-TECH program after the Legislature passed a bill to provide funding — and the school district has embraced the program.
Colorado’s first P-TECH programs started in the fall of 2016 at three high schools, including Skyline High. Over the last six years, 17 more Colorado high schools have adopted P-TECH, for a total of 20. Three of those are in St. Vrain Valley, with a fourth planned to open in the fall of 2023 at Longmont High School.
Each St. Vrain Valley high school offers a different focus supported by different industry partners.
Skyline partners with IBM, with students earning an associate’s degree in Computer Information Systems from Front Range. Along with being the first, Skyline’s program is the largest, enrolling up to 55 new freshmen each year.
Programs at the other schools are capped at 35 students per grade.
Frederick High’s program, which started in the fall of 2019, has a bioscience focus, partners with Aims Community College and works with industry partners Agilent Technologies, Tolmar, KBI Biopharma, AGC Biologics and Corden Pharma.
Silver Creek High’s program started a year ago with a cybersecurity focus. The Longmont school partners with Front Range and works with industry partners Seagate, Cisco, PEAK Resources and Comcast.
The new program coming to Longmont High will focus on business.
District leaders point to Skyline High’s graduation statistics to illustrate the program’s success. At Skyline, 100 percent of students in the first three P-TECH graduating classes earned a high school diploma in four years.
For the 2020 Skyline P-TECH graduates, 24 of the 33, or about 70 percent, also earned associate’s degrees. For the 2021 graduating class, 30 of the 47 have associate’s degrees — with one year left for those students to complete the college requirements.
For the most recent 2022 graduates, who have two years left to complete the college requirements, 19 of 59 have associate’s degrees and another six are on track to earn their degrees by the end of the summer.
JUMPING AT AN OPPORTUNITY
Louise March, Skyline High’s P-TECH counselor, keeps in touch with the graduates, saying 27 are working part time or full time at IBM. About a third are continuing their education at a four-year college. Of the 19 who graduated in 2022 with an associate’s degree, 17 are enrolling at a four-year college, she said.
Two of those 2022 graduates are Anahi Sarmiento, who is headed to the University of Colorado Boulder’s Leeds School of Business, and Jose Ivarra, who will study computer science at Colorado State University.
“I’m the oldest out of three siblings,” Ivarra said. “When you hear that someone wants to give you free college in high school, you take it. I jumped at the opportunity.”
Sarmiento added that her parents, who are immigrants, are already working two jobs and don’t have extra money for college costs.
“P-TECH is pushing me forward,” she said. “I know my parents want me to have a better life, but I want them to have a better life, too. Going into high school, I kept that mentality that I would push myself to my full potential. It kept me motivated.”
While the program requires hard work, the two graduates said, they still enjoyed high school and had outside interests. Ivarra was a varsity football player who was named player of the year. Sarmiento took advantage of multiple opportunities, from helping elementary students learn robotics to working at the district’s Innovation Center.
Ivarra said he likes that P-TECH has the same high expectations for all students, no matter their backgrounds, and gives them support in any areas where they need help. Spanish is his first language and, while math came naturally, language arts was more challenging.
“It was tough for me to see all these classmates use all these big words, and I didn’t know them,” he said. “I just felt less. When I went into P-TECH, the teachers focus on you so much, checking on every single student.”
They said it’s OK to struggle or even fail. Ivarra said he failed a tough class during the pandemic, but was able to retake it and passed. Both credited March, their counselor, with providing unending support as they navigated high school and college classes.
“She’s always there for you,” Sarmiento said. “It’s hard to be on top of everything. You have someone to go to.”
Students also supported each other.
“You build bonds,” Ivarra said. “You’re all trying to figure out these classes. You grow together. It’s a bunch of people who want to succeed. The people that surround you in P-TECH, they push you to be better.”
SUPPORT SYSTEMS ARE KEY
P-TECH has no entrance requirements or prerequisite classes. You don’t need to be a top student, have taken advanced math or have a background in technology.
With students starting the rigorous program with a wide range of skills, teachers and counselors said, they quickly figured out the program needed stronger support systems.
March said freshmen in the first P-TECH class struggled that first semester, prompting the creation of a guided study class. The every other day, hour-and-a-half class includes both study time and time to learn workplace skills, including writing a resume and interviewing. Teachers also offer tutoring twice a week after school.
“The guided study has become crucial to the success of the program,” March said.
Another way P-TECH provides extra support is through summer orientation programs for incoming freshmen.
At Skyline, ninth graders take a three-week bridge class — worth half a credit — that includes learning good study habits. They also meet IBM mentors and take a field trip to Front Range Community College.
“They get their college ID before they get their high school ID,” March said.
During a session in June, 15 IBM mentors helped the students program a Sphero robot to travel along different track configurations. Kathleen Schuster, who has volunteered as an IBM mentor since the P-TECH program started here, said she wants to “return some of the favors I got when I was younger.”
“Even this play stuff with the Spheros, it’s teaching them teamwork and a little computing,” she said. “Hopefully, through P-TECH, they will learn what it takes to work in a tech job.”
Incoming Skyline freshman Blake Baker said he found a passion for programming at Trail Ridge Middle and saw P-TECH as a way to capitalize on that passion.
“I really love that they give you options and a path,” he said.
Trail Ridge classmate Itzel Pereyra, another programming enthusiast, heard about P-TECH from her older brother.
“It’s really good for my future,” she said. “It’s an exciting moment, starting the program. It will just help you with everything.”
While some of the incoming ninth graders shared dreams of technology careers, others see P-TECH as a good foundation to pursue other dreams.
Skyline incoming ninth grader Marisol Sanchez wants to become a traveling nurse, demonstrating technology and new skills to other nurses. She added that the summer orientation sessions are a good introduction, helping calm the nerves that accompany combining high school and college.
“There’s a lot of team building,” she said. “It’s getting us all stronger together as a group and introducing everyone.”
THE SPARK OF MOTIVATION
Silver Creek’s June camp for incoming ninth graders included field trips to visit Cisco, Seagate, PEAK Resources, Comcast and Front Range Community College.
During the Front Range Community College field trip, the students heard from Front Range staff members before going on a scavenger hunt. Groups took photos to prove they completed tasks, snapping pictures of ceramic pieces near the art rooms, the most expensive tech product for sale in the bookstore and administrative offices across the street from the main building.
Emma Horton, an incoming freshman, took a cybersecurity class as a Flagstaff Academy eighth grader that hooked her on the idea of technology as a career.
“I’m really excited about the experience I will be getting in P-TECH,” she said. “I’ve never been super motivated in school, but with something I’m really interested in, it becomes easier.”
Deb Craven, dean of instruction at Front Range’s Boulder County campus, promised the Silver Creek students that the college would support them. She also gave them some advice.
“You need to advocate and ask for help,” she said. “These two things are going to help you the most. Be present, be engaged, work together and lean on each other.”
Craven, who oversees Front Range’s P-TECH program partnership, said Front Range leaders toured the original P-TECH program in New York along with St. Vrain and IBM leaders in preparation for bringing P-TECH here.
“Having IBM as a partner as we started the program was really helpful,” she said.
When the program began, she said, freshmen took a more advanced technology class as their first college class. Now, she said, they start with a more fundamental class in the spring of their freshman year, learning how to build a computer.
“These guys have a chance to grow into the high school environment before we stick them in a college class,” she said.
Summer opportunities aren’t just for P-TECH’s freshmen. Along with summer internships, the schools and community colleges offer summer classes.
Silver Creek incoming 10th graders, for example, could take a personal financial literacy class at Silver Creek in the mornings and an introduction to cybersecurity class at the Innovation Center in the afternoons in June.
Over at Skyline, incoming 10th graders in P-TECH are getting paid to teach STEM lessons to elementary students while earning high school credit. Students in the fifth or sixth year of the program also had the option of taking computer science and algebra classes at Front Range.
EMBRACING THE CHALLENGE
And at Frederick, incoming juniors are taking an introduction to manufacturing class at the district's Career Elevation and Technology Center this month in preparation for an advanced manufacturing class they’re taking in the fall.
“This will give them a head start for the fall,” said instructor Chester Clark.
Incoming Frederick junior Destini Johnson said she’s not sure what she wants to do after high school, but believes the opportunities offered by P-TECH will prepare her for the future.
“I wanted to try something challenging, and getting a head start on college can only help,” she said. “It’s really incredible that I’m already halfway done with an associate’s degree and high school.”
IBM P-TECH program manager Tracy Knick, who has worked with the Skyline High program for three years, said it takes a strong commitment from all the partners — the school district, IBM and Front Range — to make the program work.
“It’s not an easy model,” she said. “When you say there are no entrance requirements, we all have to be OK with that and support the students to be successful.”
IBM hosted 60 St. Vrain interns this summer, while two Skyline students work as IBM “co-ops” — a national program — to assist with the P-TECH program.
The company hosts two to four formal events for the students each year to work on professional and technical skills, while IBM mentors provide tutoring in algebra. During the pandemic, IBM also paid for subscriptions to tutor.com so students could get immediate help while taking online classes.
“We want to get them truly workforce ready,” Knick said. “They’re not IBM-only skills we’re teaching. Even though they choose a pathway, they can really do anything.”
As the program continues to expand in the district, she said, her wish is for more businesses to recognize the value of P-TECH.
“These students have had intensive training on professional skills,” she said. “They have taken college classes enhanced with the same digital credentials that an IBM employee can learn. There should be a waiting list of employers for these really talented and skilled young professionals.”
Thu, 04 Aug 2022 02:41:00 -0500 https://www.govtech.com/education/k-12/colorados-p-tech-students-graduate-ready-for-tech-careers

Killexams : TONE Scoops Up the CASI Suite of z/OS Output Transformation Products
Tone Software Corporation, a global provider of management and productivity solutions for IBM Z mainframes, is acquiring the JES2Mail, JES2FTP, Mail2ZOS, and CICS2PDF host output transformation and delivery products from CASI Software, Inc.
Effective June 1, 2022, the acquisition of the CASI JES2Mail suite will expand Tone’s OMC z/OS Output Management offerings for mainframe shops seeking to deliver the right information to the right users, in the most cost-effective format for the business.
Further, the combination of OMC with the CASI product suite enables legacy z/OS applications, including CICS transactional applications, to send host output directly to users’ email inboxes, in a familiar format that is highly portable and secure, according to the vendor.
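The core transformation such tools perform — packaging spool output as an email a business user can open anywhere — looks roughly like this in outline. This sketch uses Python's stdlib email APIs with invented report text and addresses; it is not CASI's or Tone's actual product interface:

```python
from email.message import EmailMessage

def report_to_email(report_text: str, sender: str, recipient: str,
                    subject: str) -> EmailMessage:
    """Package plain-text host output as an email with the report attached."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content("Attached: nightly batch report.")
    msg.add_attachment(report_text.encode("utf-8"),
                       maintype="text", subtype="plain",
                       filename="batch-report.txt")
    return msg

# Hypothetical spool output from a batch job
spool = "JOB 01234  INVOICE RUN  RC=0000\nTOTAL RECORDS: 10,542\n"
msg = report_to_email(spool, "batch@host.example", "ops@example.com",
                      "Nightly invoice run")
# A real system would hand msg to smtplib.SMTP(...).send_message(msg)
print(msg["Subject"])  # Nightly invoice run
```

A production implementation would additionally transform the spool into a user-friendly format such as PDF and select recipients from routing rules, which is where products in this category earn their keep.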
“Acquiring the JES2Mail products positions Tone to help mainframe shops deliver host output in the most convenient format and media for each user, on the most cost-efficient platform,” said Shirley Balarezo, president of Tone Software. “The results are optimized business processes that modernize the mainframe environment for the future.”
Both organizations entered the acquisition deal with identical priorities top of mind: ensure current CASI customers enjoy full technology benefits and zero disruption to their ongoing product support needs. As such, the CASI management and technical teams are working directly with Tone throughout the next year to provide a smooth transition of products and services for all CASI users.
“We are pleased to enter into this agreement with Tone and provide a logical home for the JES2Mail/FTP, CICS2PDF, and Mail2ZOS products and customers,” said Robert LaBayne, president and founder of CASI Software, Inc. “The acquisition will leverage our respective product capabilities, bring greater value to our z/OS customers, and provide a solid foundation for expanding CASI technologies going forward,” he added.
‘Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth,’ says IBM CEO Arvind Krishna.
A strengthening IT environment that is playing into IBM AI and hybrid cloud capabilities means a rosy future for IBM and its B2B business, CEO Arvind Krishna told investors Monday.
Krishna, in his prepared remarks for IBM’s second fiscal quarter 2022 financial analyst conference call, said that technology serves as a fundamental source of competitive advantage for businesses.
“It serves as both a deflationary force and a force multiplier, and is especially critical as clients face challenges on multiple fronts from supply chain bottlenecks to demographic shifts,” he said. “Given its ability to boost innovation, productivity, resilience, and help organizations scale, IT has become a high priority in a company’s budget. As such, there is every reason to believe technology spending in the B2B space will continue to surpass GDP growth.”
That plays well with IBM’s hybrid cloud and AI strategy where the company is investing in its offerings, technical talent, ecosystem, and go-to-market model, Krishna said.
“Demand for our solutions remains strong,” he said. “We continued to have double-digit performance in IBM Consulting, broad-based strength in software, and with the z16 [mainframe] platform launch, our infrastructure business had a good quarter. By integrating technology and expertise from IBM and our partners, our clients will continue to see our hybrid cloud and AI solutions as a crucial source of business opportunity and growth.”
Krishna said hybrid clouds are about offering clients a platform to straddle multiple public clouds, private clouds, on-premises infrastructures, and the edge, which is where Red Hat, which IBM acquired in 2019, comes into play, Krishna said.
“Our software has been optimized to run on that platform, and includes advanced data and AI, automation, and the security capabilities our clients need,” he said. “Our global team of consultants offers deep business expertise and co-creates with clients to accelerate their digital transformation journeys. Our infrastructure allows clients to take full advantage of an extended hybrid cloud environment.”
As a result, IBM now has over 4,000 hybrid cloud platform clients, with over 250 new clients added during the second fiscal quarter, Krishna said.
“Those who adopt our platform tend to consume more of our solutions across software, consulting, and infrastructure, [and] expanding our footprint within those clients,” he said.
IBM is also benefitting from the steady adoption by businesses of artificial intelligence technologies as those businesses try to process the enormous amount of data generated from hybrid cloud environments all the way to the edge, Krishna said. An IBM study released during the second fiscal quarter found that 35 percent of companies are now using some form of AI with automation in their business to address demographic shifts and move their employees to higher value work, he said.
“This is one of the many reasons we are investing heavily in both AI and automation,” he said. “These investments are paying off.”
IBM is also moving to develop leadership in quantum computing, Krishna said. The company currently has a 127-qubit quantum computer in its cloud, and is committed to demonstrating the first 400-plus-qubit system before year-end as part of its path to deliver a 1,000-plus-qubit system next year and a 4,000-plus-qubit system in 2025, he said.
“One of the implications of quantum computing will be the need to change how information is encrypted,” he said. “We are proud that technology developed by IBM and our collaborators has been selected by NIST (National Institute of Standards and Technology) as the basis of the next generation of quantum-safe encryption protocols.”
Krishna also pointed to IBM's new z16 mainframe. "The z16 is designed for cloud-native development, cybersecurity resilience, [and] quantum-safe encryption, and includes an on-chip AI accelerator, which allows clients to reduce fraud within real-time transactions," he said.
IBM also made two acquisitions during the quarter related to cybersecurity, Krishna said. The first was Randori, an attack surface management and offensive cybersecurity provider. That acquisition built on IBM’s November acquisition of ReaQta, an endpoint security firm, he said.
While analysts during the question and answer part of Monday's financial analyst conference call did not ask about the news that IBM has brought in Matt Hicks as the new CEO of Red Hat, they did seem concerned about how the 17-percent growth in Red Hat revenue over last year missed expectations.
When asked about Red Hat revenue, Krishna said IBM feels very good about the Red Hat business and expects continued strong demand.
“That said, we had said late last year that we expect growth in Red Hat to be in the upper teens,” he said. “That expectation is what we are going to continue with. … Deferred revenue accounts for the bulk of what has been the difference in the growth rates coming down from last year to this year.”
IBM CFO James Kavanaugh followed by saying that while IBM saw 17 percent growth overall for Red Hat, the company took market share with its core RHEL (Red Hat Enterprise Linux) business and with its Red Hat OpenShift hybrid cloud platform foundation. Red Hat OpenShift revenue is now four-and-a-half times the revenue before IBM acquired Red Hat, and Red Hat OpenShift bookings were up over 50 percent, Kavanaugh said.
“So we feel pretty good about our Red Hat portfolio overall. … Remember, we‘re three years into this acquisition right now,” he said. “And we couldn’t be more pleased as we move forward.”
When asked about the potential impact from an economic downturn, Krishna said IBM’s pipelines remain healthy and consistent with what the company saw in the first half of fiscal 2022, making him more optimistic than many of his peers.
“In an inflationary environment, when clients take our technology, deploy it, leverage our consulting, it acts as a counterbalance to all of the inflation and all of the labor demographics that people are facing all over the globe,” he said.
Krishna also said IBM's consulting business is less likely than most vendors' businesses to be impacted by the economic cycle, as much of its work involves deploying the kinds of applications clients need to optimize their costs. Furthermore, because consulting is very labor-intensive, it is easy to hire or let go tens of thousands of employees as needed, he said.
For its second fiscal quarter 2022, which ended June 30, IBM reported total revenue of $15.5 billion, up about 9 percent from the $14.2 billion the company reported for its second fiscal quarter 2021.
This includes software revenue of $6.2 billion, up from $5.9 billion; consulting revenue of $4.8 billion, up from $4.4 billion; infrastructure revenue of $4.2 billion, up from $3.6 billion; financing revenue of $146 million, down from $209 million; and other revenue of $180 million, down from $277 million.
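As a quick sanity check on the growth rates reported above, the year-over-year percentages can be recomputed from the segment figures. This is a minimal sketch using the article's numbers; the `yoy_growth` helper name is our own, not anything from IBM's reporting.

```python
def yoy_growth(current, prior):
    """Year-over-year growth as a percentage, rounded to one decimal place."""
    return round((current - prior) / prior * 100, 1)

# Q2 FY2022 vs. Q2 FY2021 revenue, in billions of dollars (from the article)
segments = {
    "total":          (15.5, 14.2),
    "software":       (6.2, 5.9),
    "consulting":     (4.8, 4.4),
    "infrastructure": (4.2, 3.6),
}

for name, (curr, prior) in segments.items():
    print(f"{name}: {yoy_growth(curr, prior)}%")
```

Running this confirms the headline figure: total revenue growth comes out to roughly 9 percent, matching the "up about 9 percent" in the article.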
On the software side, IBM reported annual recurring revenue of $12.9 billion, which was up 8 percent over last year. Software revenue from its Red Hat business was up 17 percent over last year, while automation software was up 8 percent, data and AI software up 4 percent, and security software up 5 percent.
On the consulting side, technology consulting revenue was up 23 percent over last year, applications operations up 17 percent, and business transformation up 16 percent.
Infrastructure revenue growth was driven by hybrid infrastructure sales, which rose 7 percent over last year, and infrastructure support, which grew 5 percent. Hybrid infrastructure revenue saw a significant boost from zSystems mainframe sales, which rose 77 percent over last year.
IBM also reported revenue of $8.1 billion from sales to the Americas, up 15 percent over last year; sales to Europe, Middle East, and Africa of $4.5 billion, up 17 percent; and $2.9 billion to the Asia Pacific area, up 16 percent.
Joseph F. Kovar is a senior editor and reporter for the storage and the non-tech-focused channel beats for CRN. He keeps readers abreast of the latest issues related to such areas as data life-cycle, business continuity and disaster recovery, and data centers, along with related services and software, while highlighting some of the key trends that impact the IT channel overall. He can be reached at email@example.com.
Killexams : Logicalis Acquires Q Associates to extend specialist Microsoft and data-centric IT services capabilities across their UK&I operation
London, 8th August, 2022 – Logicalis, an international IT solutions and managed services provider, today announced it has acquired Q Associates, one of the UK's leading providers of IT consultancy and advisory services around data management, data protection, compliance and information security.
The acquisition adds complementary capabilities to Logicalis UKI's core expertise in digital infrastructure, networking & cloud, enabling a broader portfolio of best-in-class solutions and services for customers operating in the digitally enabled world. Q Associates provides technology solutions to UK Universities and Research Councils, Government Security Services and Home Office departments, and commercial clients across major industry sectors, including finance, legal, transportation and energy.
Q Associates holds advanced technical accreditations with many of the world's leading technology vendors, including Microsoft, NetApp, Oracle, IBM and Rubrik. The company is headquartered in Newbury, Berkshire, with regional offices in London, Manchester and Newcastle, as well as a Microsoft technical delivery team in Zimbabwe.
"The acquisition of Q Associates is fantastic news for all our customers and further strengthens our partnership portfolio. This announcement shows our commitment to being at the top table in the UKI partner market and customer landscape, especially around the Higher Education and Government Secured Services sectors," says Alex Louth, CEO of Logicalis UKI. "In addition, extending the reach and skills of Logicalis UKI shows our hunger to grow and provide increased value to customers across all sectors."
Commenting on the announcement, Andrew Griffiths, Business Development Director, Q Associates, adds: "We are extremely proud of the achievements of Q Associates with strong values around technical excellence and customer satisfaction. This acquisition is a natural fit for both organisations and will provide clear benefits to our customers through the extended capability and reach of Logicalis. I am very excited by this next stage in our evolution."
------ ENDS ------
About Logicalis
Logicalis is an international solutions provider of digital services currently accelerating the digital transformation of its 10,000 customers around the world.
Through a globally connected network of specialist hubs, sector-leading experts (in education, financial services, government, healthcare, manufacturing, professional services, retail, and telecommunications) and strategic partnerships (including Cisco, Microsoft, HPE, IBM, NetApp, Oracle, ServiceNow, and VMware), Logicalis has more than 6,500 employees focused on understanding customer priorities and enhancing their experience.
As Architects of Change, Logicalis’ focus is to design, support, and execute customers’ digital transformation by bringing together their vision with its technological expertise and industry insights. The company, through its deep knowledge in key IT industry drivers such as Security, Cloud, Data Management and IoT, can address customer priorities such as revenue and business growth, operational efficiency, innovation, risk and compliance, data governance and sustainability.
The Logicalis Group has annualised revenues of $1.5 billion, from operations in Europe, North America, Latin America, Asia Pacific, and Africa. It is a division of Datatec Limited, listed on the Johannesburg Stock Exchange, with revenues of over $4.1 billion.
About Q Associates
Established in 1986, Q Associates is an award-winning IT solutions provider specialising in the design, deployment and support of IT infrastructure and data management platforms for more than 400 clients across the UK commercial and public sectors. The company is recognised as a leading provider of technology solutions to UK Universities and Research Councils and works with commercial clients across all major industry sectors, including finance, legal, retail, transportation and energy. Working closely with many of the world's leading technology providers, Q Associates has strategic partnerships with organisations that include NetApp, Oracle, Microsoft, VMware, AWS, Lenovo, and Dell. The company is headquartered in Newbury, Berkshire, with regional offices in London, Manchester, and Newcastle.
Killexams : SVVSD embraces early college P-TECH program
In Colorado, St. Vrain Valley was among the first school districts chosen by the state to offer a P-TECH program after the Legislature passed a bill to provide funding — and the school ...
Killexams : Ex IBM manager: Pay, political neglect and public indifference all to blame for cyber failings
Paul Farrell retired as IBM Ireland's General Manager last year but also was an officer in the Irish Defence Forces ...