IBM Research Rolls Out A Comprehensive AI And Platform-Based Edge Research Strategy Anchored By Enterprise Use Cases And Partnerships
I recently met with Dr. Nick Fuller, Vice President, Distributed Cloud, at IBM Research for a discussion about IBM’s long-range plans and strategy for artificial intelligence and machine learning at the edge.
Dr. Fuller is responsible for providing AI and platform–based innovation for enterprise digital transformation spanning edge computing and distributed cloud management. He is an IBM Master Inventor with over 75 patents and co-author of 75 technical publications. Dr. Fuller obtained his Bachelor of Science in Physics and Math from Morehouse College and his PhD in Applied Physics from Columbia University.
Edge In, not Cloud Out
In general, Dr. Fuller told me that IBM is focused on developing an "edge in" position versus a "cloud out" position with data, AI, and Kubernetes-based platform technologies to scale hub and spoke deployments of edge applications.
A hub plays the role of a central control plane used for orchestrating the deployment and management of edge applications in a number of connected spoke locations such as a factory floor or a retail branch, where data is generated or locally aggregated for processing.
“Cloud out” refers to the paradigm where cloud service providers are extending their cloud architecture out to edge locations. In contrast, “edge in” refers to a provider-agnostic architecture that is cloud-independent and treats the data-plane as a first-class citizen.
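A toy sketch may make the hub-and-spoke pattern concrete: the hub holds the desired state for every spoke, and each spoke reconciles toward it locally. Everything here (class names, the reconcile step) is invented for illustration; a real deployment would use OpenShift and Kubernetes controllers rather than Python objects.

```python
# Hypothetical sketch of a hub-and-spoke control plane: the hub declares
# desired application state; each spoke converges toward it on its own.

class Spoke:
    def __init__(self, name):
        self.name = name
        self.running = {}          # app name -> version actually deployed

    def reconcile(self, desired):
        """Pull desired state from the hub and converge locally."""
        for app, version in desired.items():
            if self.running.get(app) != version:
                self.running[app] = version   # stand-in for a real rollout
        for app in list(self.running):
            if app not in desired:
                del self.running[app]         # remove apps no longer desired

class Hub:
    """Central control plane: one place to declare what every spoke runs."""
    def __init__(self):
        self.desired = {}          # app name -> version
        self.spokes = []

    def set_desired(self, app, version):
        self.desired[app] = version

    def sync(self):
        for spoke in self.spokes:
            spoke.reconcile(self.desired)

hub = Hub()
hub.spokes = [Spoke("factory-floor-1"), Spoke("retail-branch-7")]
hub.set_desired("defect-detector", "v2")
hub.sync()
print([s.running for s in hub.spokes])
```

The point of the pattern is that every spoke, whether a factory floor or a retail branch, is managed from one declaration at the hub.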
IBM's overall architectural principles are scalability, repeatability, and full-stack solution management, allowing everything to be managed through a single unified control plane.
IBM’s Red Hat platform and infrastructure strategy anchors the application stack with a unified, scalable, and managed OpenShift-based control plane equipped with a high-performance storage appliance and self-healing system capabilities (inclusive of semi-autonomous operations).
IBM’s strategy also includes several in-progress platform-level technologies for scalable data, AI/ML runtimes, accelerator libraries for Day-2 AI operations, and scalability for the enterprise.
It is important to mention that IBM is designing its edge platforms with labor costs and the available technical workforce in mind. Data scientists with PhDs are in high demand, making them difficult to find and expensive to hire. IBM is designing its edge system capabilities and processes so that domain experts, rather than PhDs, can deploy new AI models and manage Day-2 operations.
Why edge is important
Advances in computing and storage have made it possible for AI to process mountains of accumulated data to provide solutions. By bringing AI closer to the source of data, edge computing is faster and more efficient than the cloud. While cloud data accounts for 60% of the world's data today, vast amounts of new data are being created at the edge, including data from industrial applications, traffic cameras, and order management systems, all of which can be processed at the edge in a fast and timely manner.
Public cloud and edge computing differ in capacity, technology, and management. An advantage of edge is that data is processed and analyzed at or near its collection point. In the case of cloud, data must be transferred from a local device into the cloud for analytics and then transferred back to the edge again. Moving data through the network consumes capacity and adds latency to the process. It's easy to see why executing a transaction at the edge reduces latency and eliminates unnecessary load on the network.
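The latency argument can be made concrete with back-of-the-envelope arithmetic. The numbers below (payload size, link speeds, round-trip time, compute times) are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope comparison of end-to-end time for processing one
# camera frame in the cloud versus at the edge. All numbers are invented.

def cloud_path_ms(payload_mb, uplink_mbps, downlink_mbps, rtt_ms, compute_ms):
    upload = payload_mb * 8 / uplink_mbps * 1000              # ms to ship the frame
    download = 0.01 * payload_mb * 8 / downlink_mbps * 1000   # small result payload
    return rtt_ms + upload + compute_ms + download

def edge_path_ms(compute_ms):
    return compute_ms                     # the data never leaves the site

cloud = cloud_path_ms(payload_mb=5, uplink_mbps=20, downlink_mbps=100,
                      rtt_ms=40, compute_ms=15)
edge = edge_path_ms(compute_ms=25)        # edge hardware assumed slower per frame
print(f"cloud: {cloud:.0f} ms, edge: {edge:.0f} ms")
```

Even granting the cloud faster compute, transfer time dominates for large payloads, which is the core of the edge latency case.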
Increased privacy is another benefit of processing data at the edge. Analyzing data where it originates limits the risk of a security breach. Most of the communication between the edge and the cloud is then confined to such things as reporting, data summaries, and AI models, without ever exposing the raw data.
IBM at the Edge
In our discussion, Dr. Fuller provided a few examples to illustrate how IBM plans to provide new and seamless edge solutions for existing enterprise problems.
Example #1 – McDonald’s drive-thru
Dr. Fuller’s first example centered on the quick-service restaurant (QSR) problem of drive-thru order taking. Last year, IBM acquired an automated order-taking system from McDonald's. As part of the acquisition, IBM and McDonald's established a partnership to perfect voice-ordering methods using AI. Drive-thru orders are a significant percentage of total orders for McDonald's and other QSR chains.
McDonald's and other QSR restaurants would like every order to be processed as quickly and accurately as possible. For that reason, McDonald's conducted trials at ten Chicago restaurants using an edge-based AI ordering system with NLP (Natural Language Processing) to convert spoken orders into a digital format. It was found that AI had the potential to reduce ordering errors and processing time significantly. Since McDonald's sells almost 7 million hamburgers daily, shaving a minute or two off each order represents a significant opportunity to address labor shortages and increase customer satisfaction.
Example #2 – Boston Dynamics and Spot the agile mobile robot
According to an earlier IBM survey, many manufacturers have already implemented AI-driven robotics with autonomous decision-making capability. The study also indicated that over 80 percent of companies believe AI can help improve future business operations. However, some companies expressed concern about the limited mobility of edge devices and sensors.
To develop a mobile edge solution, IBM teamed up with Boston Dynamics. The partnership created an agile mobile robot using IBM Research and IBM Sustainability Software AI technology. The device can analyze visual sensor readings in hazardous and challenging industrial environments such as manufacturing plants, warehouses, electrical grids, and waste treatment plants. The value proposition that Boston Dynamics brought to the partnership was Spot the agile mobile robot, a walking, sensing, and actuation platform. Like all edge applications, the robot’s wireless mobility uses self-contained AI/ML that doesn’t require access to cloud data. It uses cameras to read analog devices, visually monitor fire extinguishers, and conduct visual inspections of human workers to determine whether required safety equipment is being worn.
IBM was able to show up to a 10X speedup by automating some manual tasks, such as converting the detection of a problem into an immediate work order in IBM Maximo to correct it. A fast automated response was not only more efficient, but it also improved the safety posture and risk management for these facilities. Similarly, some factories need to thermally monitor equipment to identify any unexpected hot spots that may show up over time, indicative of a potential failure.
IBM is working with National Grid, an energy company, to develop a mobile solution using Spot, the agile mobile robot, for image analysis of transformers and thermal connectors. Spot monitored connectors on both flat and 3D surfaces. IBM was able to show that Spot could detect excessive heat build-up in small connectors, potentially avoiding unsafe conditions or costly outages. This AI/ML edge application can produce faster response times when an issue is detected, which is why IBM believes significant gains are possible by automating the entire process.
IBM market opportunities
Drive-thru orders and mobile robots are just a few examples of the millions of potential AI applications that exist at the edge and are driven by several billion connected devices.
Edge computing is an essential part of enterprise digital transformation. Enterprises seek ways to demonstrate the feasibility of solving business problems using AI/ML and analytics at the edge. However, once a proof of concept has been successfully demonstrated, it is a common problem for a company to struggle with scalability, data governance, and full-stack solution management.
Challenges with scaling
“Determining entry points for AI at the edge is not the difficult part,” Dr. Fuller said. “Scale is the real issue.”
Scaling edge models is complicated because there are so many edge locations with large amounts of diverse content and a high device density. Because large amounts of data are required for training, data gravity is a potential problem. Further, in many scenarios, vast amounts of data are generated quickly, leading to potential data storage and orchestration challenges. AI models are also rarely "finished." Monitoring and retraining of models are necessary to keep up with changes in the environment.
Through IBM Research, IBM is addressing the many challenges of building an all-encompassing edge architecture and horizontally scalable data and AI technologies. IBM has a wealth of edge capabilities and an architecture to create the appropriate platform for each application.
IBM AI entry points at the edge
IBM sees Edge Computing as a $200 billion market by 2025. Dr. Fuller and his organization have identified four key market entry points for developing and expanding IBM’s edge compute strategy. In order of size, IBM believes its priority edge markets to be intelligent factories (Industry 4.0), telcos, retail automation, and connected vehicles.
IBM and its Red Hat portfolio already have an established presence in each market segment, particularly in intelligent operations and telco. Red Hat is also active in the connected vehicles space.
There have been three prior industrial revolutions since the 1700s; the current, in-progress fourth revolution, Industry 4.0, promotes a digital transformation.
Manufacturing is the fastest-growing and the largest of IBM’s four entry markets. In this segment, AI at the edge can improve quality control, production optimization, asset management, and supply chain logistics. IBM believes there are opportunities to achieve a 4x speed-up in implementing edge-based AI solutions for manufacturing operations.
For its Industry 4.0 use case development, IBM, through product, development, research and consulting teams, is working with a major automotive OEM. The partnership has established the following joint objectives:
Increase automation and scalability across dozens of plants using hundreds of AI/ML models. This client has already seen value in applying AI/ML models for manufacturing applications. IBM Research is helping with re-training models and implementing new ones in an edge environment to help scale even more efficiently. Edge offers faster inference and low latency, allowing AI to be deployed in a wider variety of manufacturing operations requiring instant solutions.
Dramatically reduce the time required to onboard new models. This will allow training and inference to be done faster and allow large models to be deployed much more quickly. The quicker an AI model can be deployed in production, the quicker the time-to-value and the return on investment (ROI).
Accelerate deployment of new inspections by reducing the labeling effort and iterations needed to produce a production-ready model via data summarization. Selecting small data sets for annotation otherwise means manually examining thousands of images, a time-consuming process that results in the labeling of redundant data. Using ML-based automation for data summarization will accelerate the process and produce better model performance.
Enable Day-2 AI operations to help with data lifecycle automation and governance, model creation, and the reduction of production errors, and to provide detection of out-of-distribution data to help determine whether a model’s inference is accurate. IBM believes this will allow models to be created faster without data scientists.
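The data-summarization objective above can be sketched as diverse-subset selection: instead of annotating every image, pick a small, varied subset and treat the rest as redundant. The farthest-point heuristic and feature vectors below are illustrative assumptions; IBM's actual ML-based summarization method is not specified here.

```python
# Sketch of data summarization for labeling: greedily pick a small, diverse
# subset of samples (farthest-point sampling on feature vectors). Feature
# extraction itself is assumed to exist upstream of this step.

import math, random

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def summarize(features, budget):
    """Pick `budget` diverse points; everything else is treated as redundant."""
    chosen = [0]                                  # seed with the first sample
    while len(chosen) < budget:
        # next pick = the point farthest from everything already chosen
        nxt = max(range(len(features)),
                  key=lambda i: min(dist(features[i], features[j]) for j in chosen))
        chosen.append(nxt)
    return chosen

random.seed(0)
feats = [[random.random(), random.random()] for _ in range(200)]  # toy features
to_label = summarize(feats, budget=10)
print(f"annotate {len(to_label)} of {len(feats)} images")
```

Annotators then label only the selected samples, which is how the labeling effort and iteration count drop.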
Maximo Application Suite
IBM’s Maximo Application Suite plays an important part in implementing large manufacturers' current and future IBM edge solutions. Maximo is an integrated public or private cloud platform that uses AI, IoT, and analytics to optimize performance, extend asset lifecycles, and reduce operational downtime and costs. IBM is working with several large manufacturing clients currently using Maximo to develop edge use cases, and even uses it within its own manufacturing operations.
IBM has research underway to develop a more efficient method of handling life cycle management of large models that require immense amounts of data. Day 2 AI operations tasks can sometimes be more complex than initial model training, deployment, and scaling. Retraining at the edge is difficult because resources are typically limited.
Once a model is trained and deployed, it is important to monitor it for drift caused by changes in data distributions or anything that might cause a model to deviate from original requirements. Inaccuracies can adversely affect model ROI.
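Drift monitoring of the kind described above can be sketched with a simple distribution check: compare recent live inputs against the training-time baseline and flag the model for retraining when the live mean moves too far. The z-score threshold and sensor values below are illustrative choices, not an IBM-specified method.

```python
# Sketch of drift monitoring: flag a deployed model when live input data
# drifts away from the training-time baseline distribution.

import statistics

def drift_score(baseline, live):
    """How many baseline standard deviations the live mean has moved."""
    mu, sigma = statistics.mean(baseline), statistics.stdev(baseline)
    return abs(statistics.mean(live) - mu) / sigma

baseline = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 19.7]  # e.g. sensor temps
live_ok = [20.0, 20.2, 19.9, 20.1]
live_drifted = [23.5, 23.9, 24.1, 23.7]

for window in (live_ok, live_drifted):
    score = drift_score(baseline, window)
    print("retrain" if score > 3 else "ok", round(score, 1))
```

In production this check would run continuously at each spoke, and a breach would trigger the retraining workflow rather than a print statement.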
Day-2 AI Operations (retraining and scaling)
Day-2 AI operations consist of continual updates to AI models and applications to keep up with changes in data distributions, changes in the environment, a drop in model performance, availability of new data, and/or new regulations.
IBM recognizes the advantages of performing Day-2 AI Operations, which includes scaling and retraining at the edge. It appears that IBM is the only company with an architecture equipped to effectively handle Day-2 AI operations. That is a significant competitive advantage for IBM.
A company using an architecture that requires data to be moved from the edge back into the cloud for Day-2 related work will be unable to support many factory AI/ML applications because of the sheer number of AI/ML models to support (100s to 1000s).
“There is a huge proliferation of data at the edge that exists in multiple spokes,” Dr. Fuller said. "However, all that data isn’t needed to retrain a model. It is possible to cluster data into groups and then use sampling techniques to retrain the model. There is much value in federated learning from our point of view.”
Federated learning is a promising training solution being researched by IBM and others. It preserves privacy by using a collaboration of edge devices to train models without sharing the data with other entities. It is a good framework to use when resources are limited.
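A minimal sketch of federated averaging, the best-known federated-learning algorithm, shows the key property Dr. Fuller describes: each spoke trains on its own private data and ships only model weights to the hub, never the raw data. The one-parameter linear model and datasets below are purely illustrative.

```python
# Sketch of federated averaging: spokes share weights with the hub, not data.

def local_update(weights, data, lr=0.1):
    """One gradient step of a 1-parameter linear model y = w*x on one spoke."""
    w = weights
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_round(global_w, spoke_datasets):
    local = [local_update(global_w, d) for d in spoke_datasets]
    return sum(local) / len(local)            # the hub averages the weights

# Two spokes with private data drawn from y = 3x (never sent to the hub).
spokes = [[(1, 3), (2, 6)], [(3, 9), (4, 12)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, spokes)
print(round(w, 2))                            # converges toward w = 3
```

The hub ends up with a model fit to all the spokes' data without any spoke ever exposing that data, which is exactly the privacy property the text describes.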
Dealing with limited resources at the edge is a challenge. IBM’s edge architecture accommodates the need to ensure resource budgets for AI applications are met, especially when deploying multiple applications and multiple models across edge locations. For that reason, IBM developed a method to deploy data and AI applications to scale Day-2 AI operations utilizing hub and spokes.
The status quo method of performing Day-2 operations uses centralized applications and a centralized data plane; the more efficient managed hub-and-spoke method distributes both the applications and the data plane. The hub allows it all to be managed from a single pane of glass.
Data Fabric Extensions to Hub and Spokes
IBM uses hub and spoke as a model to extend its data fabric. The model should not be thought of in the context of a traditional hub and spoke. IBM’s hub provides centralized capabilities to manage clusters and create multiple hubs that can be aggregated to a higher level. This architecture has four important data management capabilities.
First, models running in unattended environments must be monitored. From an operational standpoint, detecting when a model’s effectiveness has significantly degraded and if corrective action is needed is critical.
Second, in a hub and spoke model, data is being generated and collected in many locations, creating a need for data lifecycle management. Working with large enterprise clients, IBM is building unique capabilities to manage the data plane across the hub and spoke estate, optimized to meet data lifecycle, regulatory, and compliance requirements as well as local resource constraints. Automation determines which input data should be selected and labeled for retraining purposes and used to further improve the model. Atypical data judged worthy of human attention is also identified.
The third issue relates to AI pipeline compression and adaptation. As mentioned earlier, edge resources are limited and highly heterogeneous. While a cloud-based model might have a few hundred million parameters or more, edge models can’t afford such resource extravagance because of resource limitations. To reduce the edge compute footprint, model compression can reduce the number of parameters. As an example, it could be reduced from several hundred million to a few million.
Lastly, suppose a scenario exists where data is produced at multiple spokes but cannot leave those spokes for compliance reasons. In that case, IBM Federated Learning allows learning across heterogeneous data in multiple spokes. Users can discover, curate, categorize and share data assets, data sets, analytical models, and their relationships with other organization members.
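The parameter reduction described in the third point can be sketched with magnitude pruning, one common compression technique: keep only the largest-magnitude weights and store the rest as implicit zeros. The random weights and keep fraction below are invented for illustration; real compression pipelines also use quantization, distillation, and retraining after pruning.

```python
# Sketch of model compression by magnitude pruning: drop the smallest
# weights so the edge copy stores far fewer parameters than the cloud copy.

import random

def prune(weights, keep_fraction):
    """Keep only the largest-magnitude fraction of weights, as index -> value."""
    k = max(1, int(len(weights) * keep_fraction))
    top = sorted(range(len(weights)),
                 key=lambda i: abs(weights[i]), reverse=True)[:k]
    return {i: weights[i] for i in top}       # sparse representation

random.seed(1)
cloud_model = [random.gauss(0, 1) for _ in range(100_000)]   # stand-in weights
edge_model = prune(cloud_model, keep_fraction=0.02)
print(f"{len(cloud_model):,} -> {len(edge_model):,} parameters")
```

A 2% keep fraction here is a stand-in for the hundreds-of-millions-to-a-few-million reduction the text describes.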
In addition to AI deployments, the hub and spoke architecture and the previously mentioned capabilities can be employed more generally to tackle challenges faced by many enterprises in consistently managing an abundance of devices within and across their enterprise locations. Management of the software delivery lifecycle or addressing security vulnerabilities across a vast estate are a case in point.
Multicloud and Edge platform
In the context of its strategy, IBM sees edge and distributed cloud as an extension of its hybrid cloud platform built around Red Hat OpenShift. One of the newer and more useful options created by the Red Hat development team is Single Node OpenShift (SNO), a compact version of OpenShift that fits on a single server. It is suitable for locations that still run servers but call for a single-node, non-clustered deployment.
For smaller footprints such as industrial PCs or computer vision boards (for example, NVIDIA Jetson Xavier), Red Hat is working on a project that builds an even smaller version of OpenShift, called MicroShift, that provides full application deployment and Kubernetes management capabilities. It is packaged so that it can be used for edge-device deployments.
Overall, IBM and Red Hat have developed a full complement of options to address a large spectrum of deployments across different edge locations and footprints, ranging from containers to management of full-blown Kubernetes applications from MicroShift to OpenShift and IBM Edge Application Manager.
Much is still in the research stage. IBM's objective is to achieve greater consistency in terms of how locations and application lifecycle is managed.
First, Red Hat plans to introduce hierarchical layers of management with Red Hat Advanced Cluster Management (RHACM), to scale by two to three orders of magnitude the number of edge locations managed by this product. Additionally, securing edge locations is a major focus. Red Hat is continuously expanding platform security features, for example by recently including Integrity Measurement Architecture in Red Hat Enterprise Linux, or by adding Integrity Shield to protect policies in Red Hat Advanced Cluster Management (RHACM).
Red Hat is partnering with IBM Research to advance technologies that will permit it to protect platform integrity and the integrity of client workloads through the entire software supply chains. In addition, IBM Research is working with Red Hat on analytic capabilities to identify and remediate vulnerabilities and other security risks in code and configurations.
Telco network intelligence and slice management with AI/ML
Communication service providers (CSPs) such as telcos are key enablers of 5G at the edge. 5G benefits for these providers include:
Reduced operating costs
Increased distribution and density
The end-to-end 5G network comprises the Radio Access Network (RAN), transport, and core domains. Network slicing in 5G is an architecture that enables multiple virtual and independent end-to-end logical networks, with different characteristics such as low latency or high bandwidth, to be supported on the same physical network. This is implemented using cloud-native technology enablers such as software-defined networking (SDN), virtualization, and multi-access edge computing. Slicing offers necessary flexibility by allowing the creation of specific applications, unique services, and defined user groups or networks.
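As a rough illustration of slicing, the sketch below defines a few slices with different service characteristics on one shared network and picks the right slice for an application's requirements. The slice names and numbers are invented, not 3GPP- or IBM-specified values.

```python
# Sketch of network slices: virtual networks with different characteristics
# sharing one physical network. All names and limits are illustrative.

SLICES = {
    "factory-control": {"max_latency_ms": 5,   "min_bandwidth_mbps": 50},
    "video-analytics": {"max_latency_ms": 50,  "min_bandwidth_mbps": 500},
    "massive-iot":     {"max_latency_ms": 500, "min_bandwidth_mbps": 1},
}

def pick_slice(app_latency_ms, app_bandwidth_mbps):
    """Choose a slice meeting the app's needs, preferring the loosest fit."""
    feasible = [(name, s) for name, s in SLICES.items()
                if s["max_latency_ms"] <= app_latency_ms
                and s["min_bandwidth_mbps"] >= app_bandwidth_mbps]
    if not feasible:
        return None
    # prefer the least-demanding latency bound that still qualifies
    return max(feasible, key=lambda ns: ns[1]["max_latency_ms"])[0]

print(pick_slice(app_latency_ms=20, app_bandwidth_mbps=10))
```

In a real CSP network, this kind of matching (plus per-slice monitoring and scaling) is what the AI/ML-driven slice management described below would automate.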
An important aspect of enabling AI at the edge requires IBM to provide CSPs with the capability to deploy and manage applications across various enterprise locations, possibly spanning multiple end-to-end network slices, using a single pane of glass.
5G network slicing and slice management
Network slices are an essential part of IBM's edge infrastructure that must be automated, orchestrated and optimized according to 5G standards. IBM’s strategy is to leverage AI/ML to efficiently manage, scale, and optimize the slice quality of service, measured in terms of bandwidth, latency, or other metrics.
5G and AI/ML at the edge also represent a significant opportunity for CSPs to move beyond traditional cellular services and capture new sources of revenue with new services.
Communications service providers need management and control of 5G network slicing enabled with AI-powered automation.
Dr. Fuller sees a variety of opportunities in this area. "When it comes to applying AI and ML on the network, you can detect things like intrusion detection and malicious actors," he said. "You can also determine the best way to route traffic to an end user. Automating 5G functions that run on the network using IBM network automation software also serves as an entry point.”
In IBM’s current telecom trial, IBM Research is spearheading the development of a range of capabilities targeted for the IBM Cloud Pak for Network Automation product using AI and automation to orchestrate, operate and optimize multivendor network functions and services that include:
End-to-end 5G network slice management with planning & design, automation & orchestration, and operations & assurance
Network Data and AI Function (NWDAF) that collects data for slice monitoring from 5G Core network functions, performs network analytics, and provides insights to authorized data consumers.
Improved operational efficiency and reduced cost
Future leverage of these capabilities by existing IBM Clients that use the Cloud Pak for Network Automation (e.g., DISH) can offer further differentiation for CSPs.
5G radio access
Open radio access networks (O-RANs) are expected to significantly impact telco 5G wireless edge applications by allowing a greater variety of units to access the system. The O-RAN concept separates the DU (Distributed Units) and CU (Centralized Unit) from a Baseband Unit in 4G and connects them with open interfaces.
The O-RAN system is more flexible. It uses AI to establish connections made via open interfaces that optimize the category of a device by analyzing information about its prior use. Like other edge models, the O-RAN architecture provides an opportunity for continuous monitoring, verification, analysis, and optimization of AI models.
The IBM-telco collaboration is expected to advance O-RAN interfaces and workflows. Areas currently under development are:
Multi-modal (RF level + network-level) analytics (AI/ML) for wireless communication with high-speed ingest of 5G data
Capability to learn patterns of metric and log data across CUs and DUs in RF analytics
Utilization of the antenna control plane to optimize throughput
Primitives for forecasting, anomaly detection and root cause analysis using ML
Opportunities for value-added O-RAN functions
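One of the listed primitives, anomaly detection on RAN metrics, can be sketched with a rolling z-score: flag samples that deviate sharply from recent behavior in a throughput series. The window size, threshold, and data are illustrative choices, not the actual IBM-telco implementation.

```python
# Sketch of an anomaly-detection primitive for network metrics: a rolling
# z-score over a throughput series flags samples far from recent behavior.

from collections import deque
import statistics

def anomalies(series, window=10, threshold=3.0):
    recent, flagged = deque(maxlen=window), []
    for t, value in enumerate(series):
        if len(recent) == window:
            mu, sigma = statistics.mean(recent), statistics.stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append(t)             # record the anomalous timestep
        recent.append(value)
    return flagged

throughput = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100,
              100, 101, 40, 100, 99]          # sample 12 is a dropout
print(anomalies(throughput))
```

The same primitive feeds forecasting and root-cause analysis: once an anomalous timestep is flagged, correlated CU/DU logs from that window become the candidates for diagnosis.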
IBM Cloud and Infrastructure
The cornerstone for the delivery of IBM's edge solutions as a service is IBM Cloud Satellite. It presents a consistent cloud-ready, cloud-native operational view with OpenShift and IBM Cloud PaaS services at the edge. In addition, IBM integrated hardware and software edge systems will provide RHACM-based management of the platform when clients or third parties have existing managed-as-a-service models. It is essential to note that in either case this is done within a single control plane for hubs and spokes that helps optimize execution and management from any cloud to the edge in the hub and spoke model.
IBM's focus on “edge in” means it can provide infrastructure such as software-defined storage for a federated-namespace data lake that spans other hyperscaler clouds. Additionally, IBM is exploring integrated full-stack edge storage appliances based on hyperconverged infrastructure (HCI), such as Spectrum Fusion HCI, for enterprise edge deployments.
As mentioned earlier, data gravity is one of the main driving factors of edge deployments. IBM has designed its infrastructure to meet those data gravity requirements, not just for the existing hub and spoke topology but also for a future spoke-to-spoke topology where peer-to-peer data sharing becomes imperative (as illustrated with the wealth of examples provided in this article).
Edge is a distributed computing model. One of its main advantages is that computing, data storage, and processing are close to where data is created. Without the need to move data to the cloud for processing, real-time application of analytics and AI capabilities provides immediate solutions and drives business value.
IBM’s goal is not to move the entirety of its cloud infrastructure to the edge. That has little value and would simply function as a hub to spoke model operating on actions and configurations dictated by the hub.
IBM’s architecture will provide the edge with autonomy to determine where data should reside and from where the control plane should be exercised.
Equally important, IBM foresees this architecture evolving into a decentralized model capable of edge-to-edge interactions. IBM has no firm designs for this as yet. However, the plan is to make the edge infrastructure and platform a first-class citizen instead of relying on the cloud to drive what happens at the edge.
Developing a complete and comprehensive AI/ML edge architecture - and in fact, an entire ecosystem - is a massive undertaking. IBM faces many known and unknown challenges that must be solved before it can achieve success.
However, IBM is one of the few companies with the necessary partners and the technical and financial resources to undertake and successfully implement a project of this magnitude and complexity.
It is reassuring that IBM has a plan and that its plan is sound.
Paul Smith-Goodson is Vice President and Principal Analyst for quantum computing, artificial intelligence, and space at Moor Insights and Strategy. You can follow him on Twitter for more current information on quantum, AI, and space.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships.
Published Mon, 08 Aug 2022 03:51:00 -0500 by Paul Smith-Goodson. Source: https://www.forbes.com/sites/moorinsights/2022/08/08/ibm-research-rolls-out-a-comprehensive-ai-and-ml-edge-research-strategy-anchored-by-enterprise-partnerships-and-use-cases/

Predictive Analytics Market Worth $38 Billion by 2028
NEW YORK, Aug. 9, 2022 /PRNewswire/ -- The Insight Partners published its latest research study, "Predictive Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component [Solution (Risk Analytics, Marketing Analytics, Sales Analytics, Customer Analytics, and Others) and Service], Deployment Mode (On-Premise and Cloud-Based), Organization Size [Small and Medium Enterprises (SMEs) and Large Enterprises], and Industry Vertical (IT & Telecom, BFSI, Energy & Utilities, Government and Defence, Retail and e-Commerce, Manufacturing, and Others)". The global predictive analytics market is projected to grow from $12.49 billion in 2022 to $38.03 billion by 2028, at a CAGR of 20.4% from 2022 to 2028.
Predictive Analytics Market Report Scope & Strategic Insights:
Market size value in 2022: US$ 12.49 billion
Market size value by 2028: US$ 38.03 billion
Growth rate: CAGR of 20.4% from 2022 to 2028
Segments covered: Component, Deployment Mode, Organization Size, and Industry Vertical
Regional scope: North America; Europe; Asia Pacific; Latin America; MEA
Country scope: US, UK, Canada, Germany, France, Italy, Australia, Russia, China, Japan, South Korea, Saudi Arabia, Brazil, Argentina
Report coverage: Revenue forecast, company ranking, competitive landscape, growth factors, and trends
Predictive Analytics Market: Competitive Landscape and Key Developments
IBM Corporation; Microsoft Corporation; Oracle Corporation; SAP SE; Google LLC; SAS Institute Inc.; Salesforce.com, inc.; Amazon Web Services; Hewlett Packard Enterprise Development LP (HPE); and NTT DATA Corporation are among the leading players profiled in this report of the predictive analytics market. Several other essential predictive analytics market players were analyzed for a holistic view of the predictive analytics market and its ecosystem. The report provides detailed predictive analytics market insights, which help the key players strategize their growth.
In 2022, Microsoft partnered with Teradata, a provider of a multi-cloud platform for enterprise analytics, for the integration of Teradata's Vantage data platform into Microsoft Azure.
In 2021, IBM and Black & Veatch collaborated to assist customers in keeping their assets and equipment working at peak performance and reliability by integrating AI with real-time data analytics.
In 2020, Microsoft partnered with SAS for the extension of their business solutions. As a part of this move, the companies will migrate SAS analytical products and solutions to Microsoft Azure as a preferred cloud provider for SAS cloud.
Increase in Uptake of Predictive Analytics Tools Propels Predictive Analytics Market Growth:
Predictive analytics tools use data to state the probabilities of the possible outcomes in the future. Knowing these probabilities can help users plan many aspects of their business. Predictive analytics is part of a larger set of data analytics; other aspects of data analytics include descriptive analytics, which helps users understand what their data represent; diagnostic analytics, which helps identify the causes of past events; and prescriptive analytics, which provides users with practical advice to make better decisions.
Prescriptive analytics is similar to predictive analytics. Predictive modeling is the most technical aspect of predictive analytics. Data analysts perform modeling with statistics and other historical data. The model then estimates the likelihood of different outcomes. In e-commerce, predictive modeling tools help analyze customer data. It can predict how many people are likely to buy a certain product. It can also predict the return on investment (ROI) of targeted marketing campaigns. Some software-as-a-service (SaaS) may collect data directly from online stores, such as Amazon Marketplace.
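As a minimal, hypothetical sketch of the predictive-modeling idea described above (the monthly sales figures are invented, and a simple linear trend stands in for the richer statistical models a real analyst would use), the following fits a least-squares line to historical data and projects the next period:

```python
def fit_trend(history):
    """Ordinary least-squares fit of y = intercept + slope * x over x = 0..n-1."""
    n = len(history)
    mean_x = (n - 1) / 2
    mean_y = sum(history) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(history))
    var = sum((x - mean_x) ** 2 for x in range(n))
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return intercept, slope

def forecast_next(history):
    """Extrapolate the fitted trend one step beyond the observed data."""
    intercept, slope = fit_trend(history)
    return intercept + slope * len(history)

# Invented monthly unit sales for a hypothetical e-commerce product.
monthly_sales = [120, 135, 150, 160, 178]
print(round(forecast_next(monthly_sales), 1))  # → 190.9
```

In practice, predictive analytics platforms layer far richer models (regression ensembles, time-series methods, classifiers for purchase likelihood) on top of this same fit-then-extrapolate pattern.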
Predictive analytics tools may benefit social media marketing by guiding users to plan the type of content to post; these tools also recommend the best time and day to post. Manufacturing industries need predictive analytics to manage inventory, supply chains, and staff hiring processes. Transport planning and execution are performed more efficiently with predictive analytics tools. For instance, SAP is a leading multinational software company. Its Predictive Analytics was one of the leading data analytics platforms across the world. Now, the software is gradually being integrated into SAP's larger Cloud Analytics platform, which does more business intelligence (BI) than SAP Predictive Analytics. SAP Analytics Cloud, which works on all devices, utilizes artificial intelligence (AI) to improve business planning and forecasting. This analytics platform can be easily extended to businesses of all sizes.
North America is one of the most vital regions for the uptake and growth of new technologies due to favorable government policies that boost innovation, the presence of a substantial industrial base, and high purchasing power, especially in developed countries such as the US and Canada. The industrial sector in the US is a prominent market for security analytics. The country consists of a large number of predictive analytics platform developers. The COVID-19 pandemic forced companies to adopt a work-from-home culture, increasing the demand for big data and data analytics.
The pandemic created an enormous challenge for businesses in North America to continue operating despite massive shutdowns of offices and other facilities. Furthermore, the surge in digital traffic presented an opportunity for numerous online frauds, phishing attacks, denial of inventory, and ransomware attacks. Due to the increased risk of cybercrimes, enterprises began adopting advanced predictive analytics-based solutions to detect and manage any abnormal behavior in their networks. Thus, with the growing number of remote working facilities, the need for predictive analytics solutions also increased in North America during the COVID-19 pandemic.
Predictive Analytics Market: Industry Overview
The predictive analytics market is segmented on the basis of component, deployment mode, organization size, industry vertical, and geography. The predictive analytics market analysis, by component, is segmented into solutions and services. The predictive analytics market based on solution is segmented into risk analytics, marketing analytics, sales analytics, customer analytics, and others. The predictive analytics market analysis, by deployment mode, is bifurcated into cloud and on-premises. The predictive analytics market, by organization size, is segmented into large enterprises, and small and medium-sized enterprises (SMEs). The predictive analytics market, by vertical, is segmented into BFSI, manufacturing, retail and e-Commerce, IT and telecom, energy and utilities, government and defense, and others.
In terms of geography, the predictive analytics market is categorized into five regions—North America, Europe, Asia Pacific (APAC), the Middle East & Africa (MEA), and South America (SAM). The predictive analytics market in North America is subsegmented into the US, Canada, and Mexico. Predictive analytics software is increasingly being adopted in multiple organizations, and cloud-based predictive analytics software solutions are gaining significance in SMEs in North America. The highly competitive retail sector in this region is harnessing the potential of this technique to efficiently transform store layouts and enhance the customer experience in various businesses. In a few North American countries, retailers use smart carts with locator beacons, pin-sized cameras installed near shelves, or the store's Wi-Fi network to determine the footfall in the store, provide directions to a specific product section, and check key areas visited by customers. This process can also provide basic demographic data for parameters such as gender and age.
Wal-Mart, Costco, Kroger, The Home Depot, and Target have their origin in North America. The amount of data generated by stores surges with the rise in sales. Without implementing analytics solutions, it becomes difficult to manage such vast data that include records, behaviors, etc., of all customers. Players such as Euclid Analytics offer spatial analytics platforms for retailers operating offline to help them track customer traffic, loyalty, and other indicators associated with customer visits. Euclid's solutions include preconfigured sensors connected to switches that are linked through a network. These sensors can detect customer calls from devices that have Wi-Fi turned on. Additionally, IBM's Sterling Store Engagement solution provides a real-time view of store inventory, and order data through an intuitive user interface that can be accessed by store owners from counters and mobile devices.
Heavy investments in healthcare sectors, advancements in technologies to help manage a large number of medical records, and the use of Big Data analytics to efficiently predict at-risk patients and create effective treatment plans are further contributing to the growth of the predictive analytics market in North America. Predictive analytics helps assess patterns in patients' medical records, thereby allowing healthcare professionals to develop effective treatment plans to improve outcomes. During the COVID-19 pandemic, healthcare predictive analytics solutions helped provide hospitals with insightful predictions of the number of hospitalizations for various treatments, which significantly helped them deal with the influx of a large number of patients. However, the high costs of installation and a shortage of skilled workers may limit the use of predictive analytics solutions in both the retail and healthcare sectors.
Browse Adjoining Reports:
Procurement Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Application (Supply Chain Analytics, Risk Analytics, Spend Analytics, Demand Forecasting, Contract Management, Vendor Management); Deployment (Cloud, On Premises); Industry Vertical (Retail and E Commerce, Manufacturing, Government and Defense, Healthcare and Life sciences, Telecom and IT, Energy and Utility, Banking Financial Services and Insurance) and Geography
Risk Analytics Market Forecast to 2028 - Covid-19 Impact and Global Analysis - by Component (Software, Services); Type (Strategic Risk, Financial Risk, Operational Risk, Others); Deployment Mode (Cloud, On-Premise); Industry Vertical (BFSI, IT and Telecom, Manufacturing, Retail and Consumer Goods, Transportation and Logistics, Government and Defense, Energy and Utilities, Healthcare and Life Sciences, Others) and Geography
Preventive Risk Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component (Solution, Services); Deployment Type (On-Premise, Cloud); Organization Size (SMEs, Large Enterprises); Type (Strategic Risks, Financial Risks, Operational Risks, Compliance Risks); Industry (BFSI, Energy and Utilities, Government and Defense, Healthcare, Manufacturing, IT and Telecom, Retail, Others) and Geography
Business Analytics Market Forecast to 2028 - Covid-19 Impact and Global Analysis - by Application (Supply Chain Analytics, Spatial Analytics, Workforce Analytics, Marketing Analytics, Behavioral Analytics, Risk And Credit Analytics, and Pricing Analytics); Deployment (On-Premise, Cloud, and Hybrid); End-user (BFSI, IT & Telecom, Manufacturing, Retail, Energy & Power, and Healthcare)
Big Data Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component (Software and Services), Analytics Tool (Dashboard and Data Visualization, Data Mining and Warehousing, Self-Service Tool, Reporting, and Others), Application (Customer Analytics, Supply Chain Analytics, Marketing Analytics, Pricing Analytics, Workforce Analytics, and Others), and End Use Industry (Pharmaceutical, Semiconductor, Battery Manufacturing, Electronics, and Others)
Data Analytics Outsourcing Market to 2027 - Global Analysis and Forecasts by Type (Descriptive Data Analytics, Predictive Data Analytics, and Prescriptive Data Analytics); Application (Sales Analytics, Marketing Analytics, Risk & Finance Analytics, and Supply Chain Analytics); and End-user (BFSI, Healthcare, Retail, Manufacturing, Telecom, and Media & Entertainment)
Sales Performance Management Market Forecast to 2028 - Covid-19 Impact and Global Analysis - by Solution (Incentive Compensation Management, Territory Management, Sales Monitoring and Planning, and Sales Analytics), Deployment Type (On-premise, Cloud), Services (Professional Services, Managed Services), End User (BFSI, Manufacturing, Energy and Utility, and Healthcare)
Customer Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Component (Solution, Services); Deployment Type (On-premises, Cloud); Enterprise Size (Small and Medium-sized Enterprises, Large Enterprises); End-user (BFSI, IT and Telecom, Media and Entertainment, Consumer Goods and Retail, Travel and Hospitality, Others) and Geography
Life Science Analytics Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Type (Predictive Analytics, Prescriptive Analytics, Descriptive Analytics); Component (Services, Software); End User (Pharmaceutical & Biotechnology Companies, Research Centers, Medical Device Companies, Third-Party Administrators)
The Insight Partners is a one-stop industry research provider of actionable intelligence. We help our clients meet their research requirements through our syndicated and consulting research services. We specialize in industries such as Semiconductor and Electronics, Aerospace and Defense, Automotive and Transportation, Biotechnology, Healthcare IT, Manufacturing and Construction, Medical Device, Technology, Media and Telecommunications, and Chemicals and Materials.
If you have any queries about this report or if you would like further information, please contact us:
Tue, 09 Aug 2022 00:56:00 -0500 | https://www.tmcnet.com/usubmit/-predictive-analytics-market-worth-38-billion-2028-204-/2022/08/09/9652572.htm

Killexams : Prescriptive and Predictive Analytics Market Business overview 2022, and Forecast to 2030 | By Accenture, Oracle, IBM, Microsoft
The MarketWatch News Department was not involved in the creation of this content.
Aug 01, 2022 (Market Insight Reports) -- New Jersey, United States - IBI's most recent assessment of the Prescriptive and Predictive Analytics Market evaluates market size, trends, and projections to 2030. The market study integrates essential market data and findings, making it a valuable resource for executives, analysts, industry specialists, and other key people who need a self-contained study to better understand market trends, growth drivers, opportunities, and upcoming challenges, as well as the competition.
The Prescriptive and Predictive Analytics Market Report’s Objectives
• To study and determine the size of the Prescriptive and Predictive Analytics market in terms of both value and volume.
• To measure the market shares of major Prescriptive and Predictive Analytics market segments.
• To show how the Prescriptive and Predictive Analytics market is developing in different regions of the world.
• To analyze and compare micro-markets in terms of their contributions to the Prescriptive and Predictive Analytics market, their prospects, and their individual growth trends.
• To provide a detailed assessment of the key business strategies used by the top companies covered in the report, such as research and development, collaborations, agreements, partnerships, acquisitions, mergers, new developments, and product launches.
The worldwide Prescriptive and Predictive Analytics market is expected to grow at a strong CAGR from 2022 to 2030, rising from USD billion in 2021 to USD billion in 2030. The report also shows the importance of the market's main players in the sector, including their business overviews, financial summaries, and SWOT assessments.
Prescriptive and Predictive Analytics Market Segmentation & Coverage:
Prescriptive and Predictive Analytics Market segment by Type: Collection Analytics, Marketing Analytics, Supply-Chain Analytics, Behavioral Analytics, Talent Analytics
Prescriptive and Predictive Analytics Market segment by Application: Finance & Credit, Banking & Investment, Retail, Healthcare & Pharmaceutical, Insurance, Others
The years examined in this study are the following to estimate the Prescriptive and Predictive Analytics market size:
History Year: 2015-2019 Base Year: 2021 Estimated Year: 2022 Forecast Year: 2022 to 2030
Cumulative Impact of COVID-19 on Market:
Many enterprises have faced difficulties due to COVID-19, and this industry is no exception. In the wake of the pandemic, several countries' defense budgets have been cut, and most research projects are consequently expected to be put on hold temporarily. Exports of onboard computer platforms to various Middle Eastern, African, and Latin American countries have also decreased. These outcomes affect the development of the computing platform market.
Sales strategies, investment, and cost structures are among the key topics covered here, along with a focus on the Prescriptive and Predictive Analytics market in key regions such as Asia Pacific, North America, Latin America, Europe, and the Middle East and Africa. This market analysis, which combines both forecasts and historical figures, also covers the financial aspects of the companies profiled.
The Key companies profiled in the Prescriptive and Predictive Analytics Market:
The study examines the Prescriptive and Predictive Analytics market’s competitive landscape and includes data on important suppliers, including Accenture, Oracle, IBM, Microsoft, QlikTech, SAP, SAS Institute, Alteryx, Angoss, Ayata, FICO, Information Builders, Inkiru, KXEN, Megaputer, Revolution Analytics, StatSoft, Splunk Analytics, Tableau, Teradata, TIBCO, Versium, Pegasystems, Pitney Bowes, Zemantis, and others.
Table of Contents:
List of Data Sources
Chapter 2. Executive Summary
Chapter 3. Industry Outlook
3.1. Prescriptive and Predictive Analytics Global Market segmentation
3.2. Prescriptive and Predictive Analytics Global Market size and growth prospects, 2015 - 2026
3.3. Prescriptive and Predictive Analytics Global Market Value Chain Analysis
3.3.1. Vendor landscape
3.4. Regulatory Framework
3.5. Market Dynamics
3.5.1. Market Driver Analysis
3.5.2. Market Restraint Analysis
3.6. Porter's Analysis
3.6.1. Threat of New Entrants
3.6.2. Bargaining Power of Buyers
3.6.3. Bargaining Power of Suppliers
3.6.4. Threat of Substitutes
3.6.5. Internal Rivalry
3.7. PESTEL Analysis
Chapter 4. Prescriptive and Predictive Analytics Global Market Product Outlook
Chapter 5. Prescriptive and Predictive Analytics Global Market Application Outlook
Chapter 6. Prescriptive and Predictive Analytics Global Market Geography Outlook
6.1. Prescriptive and Predictive Analytics Industry Share, by Geography, 2022 & 2030
6.2. North America
6.2.1. Market 2022-2030 estimates and forecast, by product
6.2.2. Market 2022-2030 estimates and forecast, by application
6.2.3. The U.S.
6.2.3.1. Market 2022-2030 estimates and forecast, by product
6.2.3.2. Market 2022-2030 estimates and forecast, by application
6.2.4. Canada
6.2.4.1. Market 2022-2030 estimates and forecast, by product
6.2.4.2. Market 2022-2030 estimates and forecast, by application
6.3. Europe
6.3.1. Market 2022-2030 estimates and forecast, by product
6.3.2. Market 2022-2030 estimates and forecast, by application
6.3.3. Germany
6.3.3.1. Market 2022-2030 estimates and forecast, by product
6.3.3.2. Market 2022-2030 estimates and forecast, by application
6.3.4. The UK
6.3.4.1. Market 2022-2030 estimates and forecast, by product
6.3.4.2. Market 2022-2030 estimates and forecast, by application
6.3.5. France
6.3.5.1. Market 2022-2030 estimates and forecast, by product
6.3.5.2. Market 2022-2030 estimates and forecast, by application
Chapter 7. Competitive Landscape
Chapter 8. Appendix
What will the Prescriptive and Predictive Analytics market's growth rate be through 2030? What are the principal drivers of the market? Who are the Prescriptive and Predictive Analytics market's leading manufacturers? What are the Prescriptive and Predictive Analytics market's prospects, risks, and overall outlook?
Sun, 31 Jul 2022 16:36:00 -0500 | https://www.marketwatch.com/press-release/prescriptive-and-predictive-analytics-market-business-overview-2022-and-forecast-to-2030-by--accenture-oracle-ibm-microsoft-2022-08-01

Killexams : Top 10 data lake solution vendors in 2022
As the world becomes increasingly data-driven, businesses must find suitable solutions to help them achieve their desired outcomes. Data lake storage has garnered the attention of many organizations that need to store large amounts of unstructured, raw information until it can be used in analytics applications.
The data lake solution market is expected to grow rapidly in the coming years and is driven by vendors that offer cost-effective, scalable solutions for their customers.
Learn more about data lake solutions, what key features they should have and some of the top vendors to consider this year.
What is a data lake solution?
A data lake is defined as a single, centralized repository that can store massive amounts of unstructured and semi-structured information in its native, raw form.
It’s common for an organization to store unstructured data in a data lake if it hasn’t decided how that information will be used. Some examples of unstructured data include images, documents, videos and audio. These data types are useful in today’s advanced machine learning (ML) and advanced analytics applications.
Data lakes differ from data warehouses, which store structured, filtered information for specific purposes in files or folders. Data lakes were created in response to some of the limitations of data warehouses. For example, data warehouses are expensive and proprietary, cannot handle certain business use cases an organization must address, and may lead to unwanted information homogeneity.
On-premise data lake solutions were commonly used before the widespread adoption of the cloud. Now, it’s understood that some of the best hosts for data lakes are cloud-based platforms on the edge because of their inherent scalability and considerably modular services.
A 2019 report from the Government Accountability Office (GAO) highlights several business benefits of using the cloud, including better customer service and the acquisition of cost-effective options for IT management services.
Cloud data lakes and on-premise data lakes have pros and cons. Businesses should consider cost, scale and available technical resources to decide which type is best.
It’s critical to understand what features a data lake offers. Most solutions come with the same core components, but each vendor may have specific offerings or unique selling points (USPs) that could influence a business’s decision.
Below are five key features every data lake should have:
1. Various interfaces, APIs and endpoints
Data lakes that offer diverse interfaces, APIs and endpoints can make it much easier to upload, access and move information. These capabilities are important for a data lake because it allows unstructured data for a wide range of use cases, depending on a business’s desired outcome.
2. Support for or connection to processing and analytics layers
ML engineers, data scientists, decision-makers and analysts benefit most from a centralized data lake solution that stores information for easy access and availability. This characteristic can help data professionals and IT managers work with data more seamlessly and efficiently, thus improving productivity and helping companies reach their goals.
3. Robust search and cataloging features
Imagine a data lake with large amounts of information but no sense of organization. A viable data lake solution must incorporate generic organizational methods and search capabilities, which provide the most value for its users. Other features might include key-value storage, tagging, metadata, or tools to classify and collect subsets of information.
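To make the cataloging idea concrete, here is a toy, vendor-neutral sketch (the object keys and tags are invented, and a real lake would persist this index in a metadata store rather than in memory) of key-value registration with tag-based search over a data lake's contents:

```python
class LakeCatalog:
    """Toy in-memory catalog: each object key maps to its tags and metadata."""

    def __init__(self):
        self._entries = {}

    def register(self, key, tags, **metadata):
        """Record an object's tags and arbitrary metadata under its key."""
        self._entries[key] = {"tags": set(tags), "metadata": metadata}

    def search(self, tag):
        """Return the sorted keys of all objects carrying the given tag."""
        return sorted(k for k, e in self._entries.items() if tag in e["tags"])

catalog = LakeCatalog()
catalog.register("raw/images/cam01/0001.jpg", ["image", "factory-a"], size_kb=412)
catalog.register("raw/audio/support/call-17.wav", ["audio", "support"], size_kb=9031)
catalog.register("raw/images/cam02/0001.jpg", ["image", "factory-b"], size_kb=398)
print(catalog.search("image"))
# → ['raw/images/cam01/0001.jpg', 'raw/images/cam02/0001.jpg']
```

The same tag index is what lets analysts collect subsets of unstructured data (all images from one site, all support-call audio) without scanning the entire lake.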
4. Security and access control
Security and access control are two must-have features with any digital tool. The current cybersecurity landscape is expanding, making it easier for threat actors to exploit a company’s data and cause irreparable damage. Only certain users should have access to a data lake, and the solution must have strong security to protect sensitive information.
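As a hedged, minimal sketch of the access-control point above (the roles and permissions are invented for illustration; real lake platforms enforce this through IAM policies, ACLs, and similar mechanisms), lake operations can be gated by a role-to-permission mapping:

```python
# Invented role → permission mapping, not taken from any product.
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "engineer": {"read", "write"},
    "admin": {"read", "write", "delete"},
}

def is_allowed(role, action):
    """True if the role's permission set covers the requested action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("analyst", "write"))   # → False
print(is_allowed("engineer", "write"))  # → True
```

The key design point is the default-deny behavior: an unknown role gets an empty permission set, so sensitive lake data is unreachable unless access is explicitly granted.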
5. Flexibility and scalability
More organizations are growing larger and operating at a much faster rate. Data lake solutions must be flexible and scalable to meet the ever-changing needs of modern businesses working with information.
Some data lake solutions are best suited for businesses in certain industries. In contrast, others may work well for a company of a particular size or with a specific number of employees or customers. This can make choosing a potential data lake solution vendor challenging.
Companies considering investing in a data lake solution this year should check out some of the vendors below.
The AWS Cloud provides many essential tools and services that allow companies to build a data lake that meets their needs. The AWS data lake solution is widely used, cost-effective and user-friendly. It leverages the security, durability, flexibility and scalability that Amazon S3 object storage offers to its users.
The data lake also features Amazon DynamoDB to handle and manage metadata. AWS data lake offers an intuitive, web-based console user interface (UI) to manage the data lake easily. It also creates data lake policies, removes or adds data packages, generates manifests of datasets for analytics purposes, and supports searching data packages.
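Among the console features listed above is generating manifests of datasets. As a rough, vendor-neutral sketch (the dataset name and object keys are invented, and AWS's actual manifest formats have their own schemas), a manifest can be as simple as a JSON document listing the objects that make up one dataset:

```python
import json

def build_manifest(dataset_name, object_keys):
    """Assemble a simple JSON manifest describing one dataset's objects."""
    return json.dumps(
        {
            "dataset": dataset_name,
            "entries": [{"key": k} for k in sorted(object_keys)],
            "count": len(object_keys),
        },
        indent=2,
    )

manifest = build_manifest(
    "clickstream-2022-q2",
    ["logs/2022/06/part-00.json", "logs/2022/05/part-00.json"],
)
print(manifest)
```

Analytics engines can then consume the manifest to know exactly which raw objects belong to a dataset, instead of listing the storage bucket at query time.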
Cloudera is another top data lake vendor that will create and maintain safe, secure storage for all data types. Some of Cloudera SDX’s Data Lake Service capabilities include:
Data schema/metadata information
Metadata management and governance
Compliance-ready access auditing
Data access authorization and authentication for improved security
Other benefits of Cloudera’s data lake include product support, downloads, community and documentation. GSK and Toyota leveraged Cloudera’s data lake to garner critical business intelligence (BI) insights and manage data analytics processes.
Databricks is another viable vendor, and it also offers a handful of data lake alternatives. The Databricks Lakehouse Platform combines the best elements of data lakes and warehouses to provide reliability, governance, security and performance.
Databricks’ platform helps break down silos that normally separate and complicate data, which frustrates data scientists, ML engineers and other IT professionals. Aside from the platform, Databricks also offers its Delta Lake solution, an open-format storage layer that can improve data lake management processes.
Domo is a cloud-based software company that can provide big data solutions to all companies. Users have the freedom to choose a cloud architecture that works for their business. Domo is an open platform that can augment existing data lakes, whether it’s in the cloud or on-premise. Users can use combined cloud options, including:
Choosing Domo’s cloud
Connecting to any cloud data
Selecting a cloud data platform
Domo offers advanced security features, such as BYOK (bring your own key) encryption, control data access and governance capabilities. Well-known corporations such as Nestle, DHL, Cisco and Comcast leverage the Domo Cloud to better manage their needs.
Google is another big tech player offering customers data lake solutions. Companies can use Google Cloud’s data lake to analyze any data securely and cost-effectively. It can handle large volumes of information and IT professionals’ various processing tasks. Companies that don’t want to rebuild their on-premise data lakes in the cloud can easily lift and shift their information to Google Cloud.
Some key features of Google’s data lakes include Apache Spark and Hadoop migration, which are fully managed services, integrated data science and analytics, and cost management tools. Major companies like Twitter, Vodafone, Pandora and Metro have benefited from Google Cloud’s data lakes.
Hewlett Packard Enterprise (HPE) is another data lake solution vendor that can help businesses harness the power of their big data. HPE’s solution is called GreenLake — it offers organizations a truly scalable, cloud-based solution that simplifies their Hadoop experiences.
HPE GreenLake is an end-to-end solution that includes software, hardware and HPE Pointnext Services. These services can help businesses overcome IT challenges and spend more time on meaningful tasks.
Business technology leader IBM also offers data lake solutions for companies. IBM is well-known for its cloud computing and data analytics solutions. It’s a great choice if an operation is looking for a suitable data lake solution. IBM’s cloud-based approach operates on three key principles: embedded governance, automated integration and virtualization.
These are some data lake solutions from IBM:
IBM Db2 BigSQL
IBM Watson Query
IBM Watson Knowledge Catalog
IBM Cloud Pak for Data
With so many data lakes available, there’s surely one to fit a company’s unique needs. Financial services, healthcare and communications businesses often use IBM data lakes for various purposes.
Microsoft offers its Azure Data Lake solution, which features easy storage methods, processing, and analytics using various languages and platforms. Azure Data Lake also works with a company’s existing IT investments and infrastructure to make IT management seamless.
The Azure Data Lake solution is affordable, comprehensive, secure and supported by Microsoft. Companies benefit from 24/7 support and expertise to help them overcome any big data challenges they may face. Microsoft is a leader in business analytics and tech solutions, making it a popular choice for many organizations.
Companies can use Oracle’s Big Data Service to build data lakes to manage the influx of information needed to power their business decisions. The Big Data Service is automated and will provide users with an affordable and comprehensive Hadoop data lake platform based on Cloudera Enterprise.
This solution can be used as a data lake or an ML platform. Another important feature is that Oracle's offering is one of the best open-source data lakes available. It also comes with Oracle-based tools to add even more value. Oracle's Big Data Service is scalable, flexible, secure and will meet data storage requirements at a low cost.
Snowflake’s data lake solution is secure, reliable and accessible and helps businesses break down silos to improve their strategies. The top features of Snowflake's data lake include a central platform for all information, fast querying and secure collaboration.
Siemens and Devon Energy are two companies that provide testimonials regarding Snowflake’s data lake solutions and offer positive feedback. Another benefit of Snowflake is its extensive partner ecosystem, including AWS, Microsoft Azure, Accenture, Deloitte and Google Cloud.
The importance of choosing the right data lake solution vendor
Companies that spend extra time researching which vendors will offer the best enterprise data lake solutions for them can manage their information better. Rather than choose any vendor, it’s best to consider all options available and determine which solutions will meet the specific needs of an organization.
Every business uses information, some more than others. However, the world is becoming highly data-driven — therefore, leveraging the right data solutions will only grow more important in the coming years. This list will help companies decide which data lake solution vendor is right for their operations.
Fri, 15 Jul 2022 09:40:00 -0500 | Shannon Flynn | https://venturebeat.com/data-infrastructure/top-10-data-lake-solution-vendors-in-2022/

Killexams : UTC And IBM Help Students Develop Job Skills For The Future
A new collaboration between The University of Tennessee at Chattanooga and IBM will give students an array of marketable skills in data analytics, one of the fastest growing sectors in the national and regional economy. Smart technologies, cloud computing, mobile platforms, social media and other new generation technologies are fueling the revolution of big data.
In the U.S. alone, IT jobs are expected to grow by 22 percent through 2020 according to the US Bureau of Labor Statistics.
As part of the UTC-IBM relationship, faculty have access to the resources in IBM’s Academic Initiative. This program provides access to technology, curriculum and learning materials at no cost for faculty members around the world. IBM has also provided advanced education for UTC faculty members, which has resulted in the development of curriculum, according to Dr. Joseph Kizza, Professor and Head of the Department of Computer Science and Engineering. Kizza is currently teaching the course Big Data Analytics using IBM software, including Cognos, BigInsights and InfoSphere Streams.
“The collaboration with IBM is already helping the program prepare graduating students with the most up-to-date and in-demand skills in the industry. This is going to help our students get better and higher-paying jobs. In the process, this will eventually help our recruitment,” explained Kizza. “Additionally, we hold two Computer Science and Engineering (CSE) Showcases a year where we invite high school students to spend a Saturday with us and do hands-on labs. This opportunity will help us demonstrate the latest technologies in Big Data analytics, for example, which may ignite students’ interest in attending UTC.”
BlueCross BlueShield of Tennessee, the city’s largest private employer, also collaborated by providing guidance on the curriculum and sharing insights on the real-world skills needed for success. The company believes the program will not only help meet its own need for technical talent, but provide a boost for the entire community.
“I see UTC becoming a leading academic institution for business analysis and data management—it will begin to produce the business intelligence experts Chattanooga companies need,” according to Brian Green, Manager of Business Intelligence and Performance Management at BlueCross. “Through this collaboration, local businesses will be able to tap into UTC’s academic offerings to make their companies more successful in critical new areas like Big Data analytics.”
He says students who take advantage of these opportunities can become “immediately marketable” in this fast-growing field. “They will graduate with hands-on experience in the IBM tools commonly used in the workplace.”
Mr. Green, who graduated from UTC in 1980, has been in the business of information management and system development in the insurance industry for more than 30 years.
“UTC has been involved in the IBM Academic Skills Cloud pilot program and we are proud to support the university’s efforts to provide students with data-driven education,” said Dan Hauenstein, director, IBM Academic Initiative. “Through the IBM Academic Initiative, students and faculty have access to industry leading technology and courseware to help develop the advanced big data analytics skills needed for jobs of the future.”
IBM’s Academic Initiative provides no charge access to curriculum, software and learning materials to more than 30,000 faculty members around the world.
Tue, 12 Jul 2022 12:00:00 -0500 | https://www.chattanoogan.com/2013/10/12/261192/UTC-And-IBM-Help-Students-Develop-Job.aspx

Killexams : Customer Analytics Applications Market to Witness Huge Growth by 2027: Woopra, Crazyegg, IBM
This press release was originally distributed by SBWire
New Jersey, USA — (SBWIRE) — 07/12/2022 — The latest study on the Global Customer Analytics Applications Market, released by AMA Research, evaluates market size, trends, and forecast to 2027. The study covers significant research data and proves to be a handy resource for managers, analysts, industry experts and other key people, offering a ready-to-access, self-analyzed study that helps them understand market trends, growth drivers, opportunities, upcoming challenges, and competitors.
Key Players in This Report Include: Mixpanel (California),Kissmetrics (United States),Woopra (United States),Zoho pagesense (India),Crazyegg (California),Adobe Inc. (United States),IBM Corp. (United States),Sprout Social (United States),Brightedge (California),Tableau Software (United States),RapidMiner (United States),Qlik (United States),Sisense (United States),SAS (United States),Domo Inc. (United States),Clicktale (Israel)
Definition: Customer analytics applications are specialized apps used to gain insight into the customer experience, understand customer behavior, and help tailor marketing campaigns to specific customer segments. This software provides businesses with crucial data about their marketing efforts and customer behavior.
Market Trends:
– Analyzing online behavior and big data to increase sales
– Increasing penetration of voice-enabled smart devices, home-automation systems, and wearable devices is a key market trend
Market Drivers:
– Increasing preference for a better understanding of customers' buying habits and lifestyles
– Increased customer response to promotions, which strengthens customer loyalty and boosts the market's sales revenue
The Global Customer Analytics Applications Market segments and Market Data Break Down are illuminated below: by Type (Descriptive Analytics, Predictive Analytics, Prescriptive Analytics), Software (Analytical Customer Analytics Software, Operational Customer Analytics Software, Collaborative Customer Analytics Software), Platform (Mobile adoption, Customer retention, User engagement, In-app purchases), End-User (Marketing & Sales, Customer Service, IT, Others)
The Global Customer Analytics Applications market report highlights information regarding current and future industry trends and growth patterns, and offers business strategies that help stakeholders make sound decisions to ensure a profitable trajectory over the forecast years.
Geographically, the report provides a detailed analysis of consumption, revenue, market share, and growth rate in the following regions:
– The Middle East and Africa (South Africa, Saudi Arabia, UAE, Israel, Egypt, etc.)
– North America (United States, Mexico, and Canada)
– South America (Brazil, Venezuela, Argentina, Ecuador, Peru, Colombia, etc.)
– Europe (Turkey, Spain, the Netherlands, Denmark, Belgium, Switzerland, Germany, Russia, the UK, Italy, France, etc.)
– Asia-Pacific (Taiwan, Hong Kong, Singapore, Vietnam, China, Malaysia, Japan, the Philippines, South Korea, Thailand, India, Indonesia, and Australia)
Objectives of the Report:
– To carefully analyze and forecast the size of the Customer Analytics Applications market by value and volume.
– To estimate the market shares of major segments of the Customer Analytics Applications market.
– To showcase the development of the Customer Analytics Applications market in different parts of the world.
– To analyze and study micro-markets in terms of their contributions to the Customer Analytics Applications market, their prospects, and individual growth trends.
– To offer precise and useful details about factors affecting the growth of the Customer Analytics Applications market.
– To provide a meticulous assessment of crucial business strategies used by leading companies operating in the Customer Analytics Applications market, including research and development, collaborations, agreements, partnerships, acquisitions, mergers, new developments, and product launches.
Major highlights from the Table of Contents:
– Customer Analytics Applications Market Study Coverage: includes major manufacturers, emerging players' growth stories, and major business segments of the market, years considered, and research objectives, along with segmentation on the basis of product type, application, and technology.
– Customer Analytics Applications Market Executive Summary: gives a summary of the overall study, growth rate, available market, competitive landscape, market drivers, trends and issues, and macroscopic indicators.
– Customer Analytics Applications Market Production by Region.
– Profiles of Manufacturers: players are studied on the basis of SWOT analysis, their products, production, value, financials, and other vital factors.
– Key points covered in the report: market overview, definition and classification, drivers and barriers; competition by manufacturers; impact analysis of COVID-19; capacity, production, and revenue (value) by region (2022-2027); supply (production), consumption, export, and import by region (2022-2027); manufacturing cost analysis; industrial/supply chain analysis; sourcing strategy and downstream buyers; marketing strategy by key manufacturers/players; connected distributors/traders; standardization, regulatory and collaborative initiatives; industry road map and value chain; and market effect factors analysis.
Key questions answered:
– How feasible is the Customer Analytics Applications market for long-term investment?
– What factors are driving the demand for Customer Analytics Applications in the near future?
– What is the impact analysis of various factors on the growth of the global Customer Analytics Applications market?
– What are the recent trends in the regional market, and how successful are they?
Tue, 12 Jul 2022 05:47:00 -0500 | ReleaseWire | https://www.digitaljournal.com/pr/customer-analytics-applications-market-to-witness-huge-growth-by-2027-woopra-crazyegg-ibm

Killexams : IoT Analytics Market is expected to Grow USD 92.46 Billion by 2030 | Sap, Oracle, IBM
The IoT analytics market was valued at USD 9.1 billion in 2018 and is expected to grow at a CAGR of 24.63%, reaching USD 92.46 billion by 2030.
The market is being driven by the growing number of inter-connected devices and the sharing of data across a variety of industries. The IoT analytics market is expanding rapidly due to the growing need to make data from numerous enterprises digitally accessible. Continuous monitoring and sharing of information are critical and should be prioritized, and recent technological advances have made data sharing easier. IoT analytics is used across industries: the healthcare sector uses it to improve the quality of treatment, while e-commerce, retail, and manufacturing use it to track emerging trends and customer behavior that can inform new products and services.
One silver lining of the COVID-19 crisis is the flexibility it has given IoT analytics vendors to adjust pricing tiers or offer more features at the same price. Most IoT analytics implementers remain optimistic about IoT technology spending plans during the pandemic; in fact, COVID-19 drove spending increases. Asked about changes in IoT analytics spending, half of respondents said COVID-19 increased the demand for digital initiatives, including IoT.
Based on the Type, the market has been segmented into Predictive Analytics, Descriptive Analytics, and Prescriptive Analytics.
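As a rough illustration of how these segment types differ in practice, here is a minimal sketch using hypothetical sensor readings (not tied to any vendor's product): descriptive analytics summarizes what has already happened, while predictive analytics projects what comes next.

```python
import statistics

# Hypothetical hourly readings from a connected device.
readings = [10.2, 10.8, 11.1, 11.9, 12.4, 13.0]

# Descriptive analytics: summarize what has already happened.
mean_reading = statistics.mean(readings)
spread = statistics.stdev(readings)

# Predictive analytics: fit a least-squares trend line to the series
# and project the next reading one step ahead.
n = len(readings)
mean_x = (n - 1) / 2  # mean of the time indices 0..n-1
slope = sum((x - mean_x) * (y - mean_reading) for x, y in zip(range(n), readings))
slope /= sum((x - mean_x) ** 2 for x in range(n))
forecast = mean_reading + slope * (n - mean_x)

print(f"mean={mean_reading:.2f} stdev={spread:.2f} next={forecast:.2f}")
```

Prescriptive analytics would go one step further, recommending an action — for example, scheduling maintenance if the forecast crosses a threshold.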
Based on the application, the market has been segmented into energy management, building automation, predictive maintenance, inventory management, sales and customer management, security, asset management, and emergency management. Organizations use advanced analytical tools to identify, filter, investigate, address, and quickly recover from major events.
North America continues to hold the largest share of the market, with revenue expected to reach approximately USD 50,000 million during the forecast period, and is expected to grow at the fastest rate in the global IoT analytics market. Europe is expected to account for 10% of the industry, ranking second in the global IoT analytics market by the end of the forecast period. The Middle East and Africa (MEA) region, by contrast, is expected to post a relatively low CAGR throughout the forecast period. Healthcare will remain the most important vertical for the global IoT analytics market, and healthcare alone is expected to account for more than 70% of the industry during the forecast period, while retail is expected to see the fastest growth. Transportation and logistics is expected to have the second-highest CAGR, and the energy and utilities vertical is expected to post a low CAGR over the forecast period.
The major key players in the market are Amazon Web Services, Inc., Google, Inc., Microsoft Corporation, SAP SE, Oracle Corporation, IBM Corporation, Dell Technologies, Inc., Cisco Systems, Inc., HP Enterprise Company, and PTC, Inc. The market is receiving a boost from executives' greater emphasis on cost and time savings, rising demand for real-time information, intensifying competition, the increasing use of automation in businesses, and the introduction of cutting-edge technologies.
Mon, 25 Jul 2022 20:33:00 -0500 | Market Research Future | https://www.digitaljournal.com/pr/iot-analytics-market-is-expected-to-grow-usd-92-46-billion-by-2030-sap-oracle-ibm

Killexams : Embedded Analytics Market Size, Share to Observe Exponential Growth By 2022-2030
Jul 13, 2022 (Alliance News via COMTEX) -- Key Companies Covered in the Embedded Analytics Market Research are Birst, Inc., IBM Corporation, Information Builders, Logi Analytics, Microsoft Corporation, Microstrategy Inc., Opentext Corporation, Qlik Technologies, Inc., SAP SE, and TIBCO (The Information Bus Company) Software Inc. and other key market players.
Embedded analytics integrates analytic capabilities and content within the business process applications including enterprise resource planning (ERP), customer relationship management (CRM), financial systems, and marketing automation. It offers analytics tools and relevant information for users to work effectively on particular task. Common analytical capabilities included in the software applications are dashboard and data visualization, self-service analytics, reporting, and benchmarking. As compared to traditional business intelligence, embedded analytics offers additional awareness and analytic or contextual capabilities to support decision-making related to exclusive tasks.
The growth of the global embedded analytics market is attributed to the emergence of big data and the Internet of Things (IoT) among organizations, increased reliance on mobile devices and cloud technology, and the rising need to integrate data analytics with business applications to achieve optimum performance. In addition, the growing adoption of bring-your-own-device (BYOD) policies, increased demand for real-time visualization tools in business applications, and the rise of enterprise mobility drive market growth. An upsurge in demand for real-time streaming analysis and high demand for standalone self-service analytics tools are expected to offer significant growth opportunities in the near future. However, high investment costs and a lack of analytical knowledge among enterprises hamper the growth of the global embedded analytics market.
The global embedded analytics industry is segmented based on deployment model, business application, analytics tool, industry vertical, and geography. Based on deployment model, it is bifurcated into on-premise and cloud-based. As per business application, it is classified into sales & marketing, finance, operation, and human resource. Based on analytics tool, the global embedded analytics market is categorized into dashboards and data visualization, self-service tools, benchmarking, and reporting. Based on vertical, the global embedded analytics industry is divided into banking, financial services, and insurance (BFSI); IT & telecom; public sector; manufacturing; retail; healthcare; energy & utilities; and others. Geographically, it is analyzed across North America, Europe, Asia-Pacific, and LAMEA.
KEY BENEFITS FOR STAKEHOLDERS
– In-depth analysis of the global embedded analytics market and its dynamics is provided to understand the market scenario.
– Quantitative analysis of current trends and future estimations from 2017 to 2023 is provided to assist strategists and stakeholders to capitalize on prevailing opportunities.
– Porter’s Five Forces analysis examines the competitive structure of the global embedded analytics market and provides a clear understanding of the factors that influence market entry and expansion.
– A detailed analysis of the geographical segments enables identification of profitable segments for market players.
– Comprehensive analyses of the trends, sub-segments, and key revenue pockets are provided.
– Detailed analyses of the key players operating in the global embedded analytics market and their business strategies are anticipated to assist stakeholders in making informed business decisions.
KEY MARKET SEGMENTS
By Deployment Model
On-premise
Cloud-based
By Business Application
Sales & Marketing
Finance
Operations
Human Resource
By Analytics Tool
Dashboard and Data Visualization
Self-service Tools
Benchmarking
Reporting
By Industry Vertical
BFSI
IT & Telecom
Public Sector
Manufacturing
Retail
Healthcare
Energy & Utilities
Others
Risk Analytics Market by Component (Software (ETL Tools, Risk Calculation Engines, GRC Software) and Services), Risk Type (Strategic Risk, Operational Risk, Financial Risk), Deployment Mode, Organization Size, Vertical and Region - Global Forecast to 2027
The global risk analytics market size is projected to grow from USD 39.3 billion in 2022 to USD 70.5 billion by 2027, at a Compound Annual Growth Rate (CAGR) of 12.4% during the forecast period. Various factors, such as rising government-mandated compliance with stringent industry regulations, growing incidences of data theft and security breaches, and increasing complexity across business processes, are expected to drive the adoption of risk analytics solutions and services.
The COVID-19 pandemic has made an adverse impact on credit portfolios. There has been an unprecedented rise in unemployment and disruption in economic activity, putting a strain on the solvency of customers and companies. Central banks have taken a proactive approach by injecting liquidity into the market by lowering interest rates and asset purchase programs. Managing and monitoring credit, market, liquidity, and operational risk across financial markets were hard enough with ongoing geopolitical tensions, international trade wars, and the occasional hurricanes and earthquakes. The current pandemic situation has forced chief risk officers and their teams to recalibrate old assumptions and models used to manage and monitor risk. COVID-19’s global impact has shown that interconnectedness plays an important role in international cooperation. As a result, many governments started rushing toward identifying, evaluating, and procuring reliable solutions powered by AI.
The software segment is expected to account for the largest market size during the forecast period
Based on components, the risk analytics market is segmented into software and services. The software segment has been further segmented into ETL tools, risk calculation engines, scorecard and visualization tools, dashboard analytics and risk reporting tools, GRC software, and others (operational risk management, human resource risk management, and project risk management). The software segment is expected to hold the maximum market share in the global risk analytics market. Among all software offered in the market, GRC software has shown the highest adoption across the globe. The services segment has been divided into professional and managed services. The rising adoption of risk analytics software is expected to boost the adoption of professional and managed services.
According to MarketsandMarkets, risk analytics is a technique that measures, quantifies, and predicts risks from organizational data. It provides a foundation for organizations to get a unified view of enterprise risks and thus, offers more visibility into operational, financial, strategic, and other related risks, enabling decision-makers to make informed decisions. The risk analytics market comprises risk analytics services and software solutions embedded with advanced technologies, such as Artificial Intelligence (AI) and Machine Learning (ML), and techniques that comprehensively measure, quantify, and predict risks through effective data utilization.
Some of the key players operating in the risk analytics market include IBM (US), SAP (Germany), SAS (US), Oracle (US), FIS (US), Moody’s Analytics (US), Verisk Analytics (US), Alteryx (US), AxiomSL (US), Gurucul (US), Provenir (US), BRIDGEi2i (India), Recorded Future (US), AcadiaSoft (US), Qlik (US), DataFactZ (US), CubeLogic Limited (UK), Risk Edge Solutions (India), Equarius Risk Analytics (US), Quantifi (US), Actify Data Labs (India), Amlgo Labs (India), Zesty.ai (US), Artivatic (India), RiskVille (Ireland), Quantexa (UK), Spin Analytics (UK), Kyvos Insights (US), Imply (US). These risk analytics vendors have adopted various organic and inorganic strategies to sustain their positions and increase their market shares in the global risk analytics market.
SAP is one of the leading providers of enterprise application solutions and services. It is also a leading experience management, analytics, and BI company. Its solutions are compliant with GDPR. They enable enterprises to build intelligent AI and ML-based software to combine human expertise with machine-generated insights. The company segments its diverse portfolio into applications, technology and support, services, Concur, and Qualtrics. It works on an intelligent enterprise framework, which includes experience, intelligence, and operations business models.
The company’s software, technologies, and services address the three core elements of the intelligent enterprise: intelligent suite, business technology platform, and experience management platform, for 25 industries and 12 lines of business. SAP’s core business consists of selling software licenses and providing services, such as consulting, training, and maintenance. It has regional offices in 180 countries and caters to various industries, including energy and natural resources, financial services, consumer, discrete, and public services. SAP has around 437,000 customers worldwide, and its vast product portfolio covers Enterprise Resource Planning (ERP) and Finance, Customer Relationship Management (CRM) and Customer Experience, Network and Spend Management, Digital Supply Chain, HR and People Engagement, and Business Technology Platform. It has a geographical presence in the Americas, Europe, MEA, and APAC
Alteryx is one of the leaders in data blending and advanced analytics software. The company caters to over 200,000 users, and its customers include Anheuser Busch, LLC, Barclays Capital Inc., Biogen Idec Inc., Chevron Corporation, Daimler AG, Federal Home Loan Mortgage Association, General Mills, Inc., LOreal USA, Inc., Netflix, Inc., Pfizer Inc., salesforce.com, inc., Société Générale S.A., Unilever PLC., and Visa Inc. It also caters to various growing organizations, including Vertix, Rosenblatt Securities, and ConsumerOrbit. Alteryx has entered into partnerships and collaborations with top companies, such as Impala, Amazon RedShift, Teradata, and Spark, expanding its presence in North America, Europe, APAC, MEA, and Latin America. Since 2010, the company has been expanding its statistical and analytics capabilities, increasing its big data and SaaS connectivity, and delivering a cloud platform for analytics applications called Alteryx Analytics Gallery. Alteryx caters to various verticals, including BFSI, healthcare, retail, transportation and logistics, oil and gas, real estate, communication, energy and utilities, education, manufacturing, media and entertainment, travel and hospitality, and the public sector.
Thu, 21 Jul 2022 12:11:00 -0500 | https://www.benzinga.com/pressreleases/22/07/ab28162285/risk-analytics-market-size-opportunities-business-scenario-share-scope-key-segments-and-forecast-

Killexams : How IBM's Watson Could Disrupt Medical Imaging
Expect some major changes now that IBM is spending $1 billion to acquire Merge Healthcare, says IHS analyst Stephen Holloway.
Chicago-based Merge Healthcare's medical imaging management platform is used at more than 7500 U.S. healthcare sites, as well as many of the world's leading clinical research institutes and pharmaceutical firms. Stephen Holloway, associate director for IHS Inc., notes that the deal will give Watson access to more than a half billion medical images stored in Merge's enterprise archive storage platform.
The goal at IBM (Armonk, NY) is to enable Merge's customers to use the Watson Health Cloud to analyze and cross-reference medical images against a deep trove of lab results, electronic health records, genomic tests, clinical studies, and other health-related data sources. IBM officials think there is a desire in the healthcare field for such imaging analytics. According to IBM, radiologists in some hospital emergency rooms are presented with as many as 100,000 images a day.
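Conceptually, that cross-referencing amounts to joining imaging results with other patient data by a shared key. A highly simplified sketch follows — the records, field names, and thresholds are all hypothetical, not IBM's actual pipeline:

```python
# Hypothetical imaging studies and lab results keyed by patient ID.
studies = [
    {"patient_id": "P1", "modality": "CT", "finding_score": 0.82},
    {"patient_id": "P2", "modality": "MR", "finding_score": 0.35},
]
labs = {
    "P1": {"wbc": 14.2},  # elevated white-cell count
    "P2": {"wbc": 6.1},
}

# Cross-reference the two sources so a decision-support tool can flag
# cases where both the image finding and the lab value look abnormal.
flagged = [
    s["patient_id"]
    for s in studies
    if s["finding_score"] > 0.5
    and labs.get(s["patient_id"], {}).get("wbc", 0) > 11.0
]
print(flagged)
```

At the scale IBM describes — half a billion images joined against records, genomics, and clinical studies — the value lies in automating this triage so radiologists see the highest-priority cases first.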
Holloway lists three ways IBM's Watson could spur what he describes as a new era of radiology:
1. It could disrupt the dominance of medical imaging 'big guns.'
IBM represents a deep-pocketed entrant into a market that has been dominated by six companies--GE Healthcare, Philips Healthcare, Siemens Healthcare, Toshiba Medical Systems, Hitachi Medical, and Samsung. Most of these vendors already have their own radiology IT platforms that they've bundled with the hardware, Holloway says.
The arrival of image storage and management software vendors such as Merge and Lexmark Healthcare has already eroded traditional imaging vendor share in recent years. "If IBM can make Watson AI products for image analytics clinically relevant and seamlessly integrate these tools into the EMR, control of the radiology IT market will increasingly shift away from traditional radiology IT vendors. It may even force a departure of industrial medical imaging suppliers away from IT software all-together, as most do not have the big data or analytics capability to compete," Holloway says.
2. A radiologist versus artificial intelligence turf war?
In the short term, Watson will likely provide decision-support tools, similar to the computer aided diagnosis software for breast imaging that has assisted radiologist reporting. But in the long term, look out for Watson joining the dots by drawing on a wealth of other medical diagnostic information gathered from the health and medical record data of a huge population.
"If this happens, radiologists may increasingly find themselves redefining their role in care provision," Holloway says.
3. New ethical and legal issues
Bringing artificial intelligence into healthcare could spark a whole host of ethical and legal issues, according to Holloway.
"Will AI decision-support tools remain just so, as decision support tool, or will over-time the judgement of physicians be called into question? With increasing electronic tracking of care management and metrics to ensure quality of care and drive efficiency, will reliance on such analytics override physician diagnosis?" Holloway says.
Watson's advice could even conceivably become evidence in a lawsuit against a physician over an incorrect diagnosis.
"What is certainly clear though, is that radiology will likely never be the same again," Holloway says.
Refresh your medical device industry knowledge at MEDevice San Diego, September 1-2, 2015.