Pass the exam in 24 hours with Killexams 000-552 real questions

Are you searching for IBM InfoSphere Optim for Distributed Systems - V7.3.1 exam questions taken from real exams to prepare for the IBM InfoSphere Optim for Distributed Systems - V7.3.1 exam? We offer recently updated 000-552 latest topics. We have put together a database of 000-552 sample questions from real exams that we can help you download and memorize so you can pass the 000-552 exam on the first attempt. Simply prepare with our 000-552 PDF questions and rest assured: you can pass the 000-552 exam.

Exam Code: 000-552 Practice exam 2022 by Killexams.com team
IBM InfoSphere Optim for Distributed Systems - V7.3.1
Is OTP a Viable Alternative to NIST's Post-Quantum Algorithms?

The quantum threat to RSA-based encryption is deemed to be so pressing that NIST is seeking a quantum safe alternative

The cracking of the SIKE encryption algorithm (deemed to be on its way to NIST standardization) on a single classical PC should make us re-evaluate our preconceptions about what is necessary for the post-quantum era. SecurityWeek has spoken to several cryptography experts about the implications of the SIKE crack.

The issue

NIST, through the vehicle of a competition, is in the process of developing new cryptographic algorithms for the post-quantum era. Shor's algorithm has already shown that the existing RSA encryption, which underlies modern internet communication, will probably be broken within the next decade.

IBM currently has quantum processors with 127 qubits. Mike Osborne, CTO of IBM Quantum Safe, added, “and a roadmap essentially, more or less, up to 4000 qubits [with] an idea how we get to a million qubits… the era of what we call cryptographically relevant quantum machines is getting closer all the time.”

The threat to RSA-based communication has become known as the ‘harvest now, decrypt later’ problem. Adversarial nations can steal and copy currently encrypted data now, knowing that in a relatively few years’ time, they will be able to decrypt it.

Many secrets have a lifetime of decades – at the personal level, for example, social security numbers and family secrets; while at the nation level this can include state secrets, international policies, and the truth behind covert activity. The quantum threat to RSA is deemed to be so pressing that NIST is seeking a quantum safe alternative.

But the SIKE crack should remind us that the threat to encryption already exists – encryption, even post quantum encryption – can be defeated by classical computing.

Some cryptographic theory

The new algorithms being considered by NIST are designed to be ‘quantum safe’. This is not the same as ‘quantum secure’. ‘Safe’ means there is no known way to decrypt the algorithm. ‘Secure’ means that it can be mathematically or otherwise proven that the algorithm cannot be decrypted. Existing algorithms, and those in the current NIST competition, are thought to be quantum safe, not quantum secure.

As the SIKE crack shows us, any quantum safe encryption will be safe only until it is cracked.

There is only one quantum secure possibility – a one-time pad (OTP). A one-time pad is an encryption method that cannot be cracked. It requires a single-use (one-time) pre-shared key that is not smaller than the message being sent. The result is information-theoretically secure – that is, it provides perfect secrecy that is provably secure against mathematical decryption, whether by classical or quantum computers.
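To make the mechanics concrete, here is a minimal sketch of a one-time pad in Python. It is an illustration only: it assumes the pad has already been shared out of band, and it uses secrets.token_bytes as a stand-in for the true-randomness source a real OTP requires (a CSPRNG is not truly random).

```python
import secrets

def otp_xor(message: bytes, pad: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    if len(pad) < len(message):
        raise ValueError("pad must be at least as long as the message")
    return bytes(m ^ p for m, p in zip(message, pad))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))  # single-use, pre-shared, never reused

ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message  # the same pad recovers the message
```

With a truly random, never-reused pad, every plaintext of the same length is equally consistent with a given ciphertext, which is why the scheme is information-theoretically secure.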

But there are difficulties – generating keys of that length with true randomness and delivering the key to the destination have so far proven impractical by electronic means. 

Scott Bledsoe, CEO at Theon Technology, summarized the current status: “The only encryption method guaranteeing survivorship even at the creation of the quantum computer is one-time pad encryption.” But he told SecurityWeek there is an issue with randomness and the uniformity of the distribution in the keys – any issue at this top level can allow you to predict all future keys.  

“Secondly,” he added, “the size of the key needs to be equal or larger than the message, and this requires more compute time and is slower than other classical algorithms.” The third problem is, “Key distribution and how the initial keys can be transmitted. This was handled in the past by person-to-person exchange, guaranteeing secrecy.”

This is the nub of the issue. NIST's algorithms can only be 'safe'. OTPs can be 'secure' but have been impractical to use. But the need for 'secure' rather than 'safe' is highlighted by the SIKE crack. Any algorithm can be considered safe until it is cracked, or until new methods of decryption suggest it is unsafe. For as long as it is in use before being broken, the traffic it protects remains susceptible to harvest now, decrypt later.

This can happen at any time to any mathematical algorithm. The original RSA had a key length of 128 bits with a projected lifetime of millions of years before it could be cracked. As computers got better, that lifetime was progressively reduced, requiring the key length to be increased. RSA now requires a key length in excess of 2,000 bits to be considered safe against classical computers, and it cannot be secure against Shor's quantum algorithm.

Since no mathematical encryption can be proven secure, any communication encrypted with a given algorithm can be decrypted if that algorithm is broken – and SIKE demonstrates that it doesn't always require quantum power to do so. At the very best, then, NIST's quantum safe algorithms provide no guarantee of long-lasting security.

“There are multiple research organizations and companies working on these problems,” says Bledsoe. “In the future we will see algorithms based on OTP concepts that have answers to the current shortcomings. They will leverage information theory and become viable options as an alternative to NIST-approved algorithms.”

The pros and cons of OTP

The NIST competition is solely focused on developing new encryption algorithms that should, theoretically, survive quantum decryption. In other words, it is an incremental advance on the status quo. This will produce quantum safe encryption. But quantum safe is not the same as quantum secure; that is, encrypted communications will only remain encrypted until the encryption is broken.

History and mathematical theory suggest this will inevitably, eventually, happen. When that does happen, we will be back to the same situation as today, and all data harvested during the use of the broken algorithm will be decrypted by the adversary. Since there is an alternative approach – the one-time pad – that is secure against quantum decryption, we should consider why this approach isn’t also being pursued.

SecurityWeek spoke to senior advocates on both sides: NIST’s computer security mathematician Dustin Moody, and Qrypt’s cofounder and CTO Denis Mandich.

Moody accepts that one-time pads provide theoretically perfect security, but suggests their use has several drawbacks that make them impractical. “The one-time pad,” he said, “must be generated by a source of true randomness, and not a pseudo-random process.  This is not as trivial as it sounds at first glance.”

Mandich agrees with this, but comments, “[This is] why Qrypt uses quantum random number generators (QRNGs) licensed from the Oak Ridge National Laboratory and the Los Alamos National Laboratory.” These are quantum entropy sources that are the only known source of genuine randomness in science. (See Mitigating Threats to Encryption From Quantum and Bad Random for more information on QRNGs.)

Moody also suggests that OTP size is a problem. “The one-time pad must be as long as the message which is to be encrypted,” he said. “If you wish to encrypt a long message, the size of the one-time pad will be much larger than key sizes of the algorithms we [NIST] selected.”

Again, Mandich agrees, saying the trade-off for higher security is longer keys. “This is true for 100% of all crypto systems,” he says: “the smaller the keys, the less security is a general statement.” But he adds, “One of the other [NIST] finalists is ‘Classic McEliece’ which also has enormous key sizes but will likely be standardized. In many common use cases, like messaging and small files, McEliece keys will be much larger than OTPs.”

Moody’s next concern is authentication. “There is no way to provide authentication using one-time pads,” he said.

Here, Mandich simply disagrees. “Authentication can be provided for any type of data or endpoint.” He thinks the idea may stem from the NSA’s objection to QKD. The NSA has said, “QKD does not provide a means to authenticate the QKD transmission source.”

But Mandich adds, “A simple counter example is that the OTP of an arbitrary length may be hashed and sent in the clear between parties to authenticate that they have the same OTP. This could be appended to the encrypted data.”
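A minimal sketch of that counterexample in Python, assuming both parties already hold the same pad: each side hashes its copy, and the digests can be compared in the clear without revealing the pad itself.

```python
import hashlib
import secrets

pad = secrets.token_bytes(64)  # stand-in for a shared one-time pad segment

# Each party fingerprints its own copy of the pad...
alice_digest = hashlib.sha256(pad).hexdigest()
bob_digest = hashlib.sha256(pad).hexdigest()

# ...and the fingerprints can be exchanged openly to confirm both sides
# hold the same pad; the digest reveals nothing practical about the pad.
assert alice_digest == bob_digest
```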

“As the name implies,” said Moody, “one-time pads can only be used once. This makes them very impractical.”

But Mandich responds, “This is the trade-off to achieve higher security. Re-use of encryption keys means that breaking or getting access to the key facilitates decryption of all the previously encrypted data. OTPs are only used once, so if someone gets access to one OTP, it does not help in any other decryption.”

For Moody, the biggest problem for OTPs is the exchange of ‘keys’. “Probably the most major drawback,” he told SecurityWeek, “is that to use a one-time pad with another party, you must have securely exchanged the secret one time pad itself with the other party.”

He believes this distribution at scale is impossible, and that it doesn't work when the requirement is to communicate with a party you haven't communicated with before. “You could send the one-time pad through the mail or via a courier, but not electronically,” he continued. “And if you could securely send the one-time pad, why didn’t you just send the message you wanted to share with the other party? Which makes the one-time pad not needed.”

Mandich points out that the difficulties of key transfer and distribution at scale apply equally to all the public-key encryption schemes currently being considered by NIST. “There is nothing unique about OTPs other than size,” he said. “OTPs can be generated continuously and consumed when the messages are created at a later date. There is no reason to do it simultaneously unless it is a realtime communications channel.” He adds that combining keys for decryption with the encrypted data makes it easy to attack. “Decoupling these two mechanisms [as with OTPs] makes it almost impossible.”

Finally, comments Moody, “Modern cryptosystems overcome these obstacles and are very efficient.”

Mandich concedes this point but refers to the distinction between NIST’s quantum safe approach, and the OTP’s ability to be quantum secure. “Modern systems are very efficient and a one-size-fits-all solution – but at the cost of less security. Obstacles to using OTPs have long been overcome by the cloud, high bandwidth networks, and distributed and decentralized data centers. The PQC evolution from RSA is just changing an algorithm based on a 1970s pre-internet architecture, when Alice and Bob were connected by a single copper wire channel and a few network switches.”

Current examples

Some companies are already using OTP concepts in their technology. Two examples include startups Rixon and Qrypt. The first borrows OTP ideas to secure data, while the second can enable genuine OTP communication.

Rixon

Rixon delivers a cloud-based vaultless tokenization system. Information received from a customer is immediately sent to the cloud and tokenized. What is returned to the client is data where each character has been randomly tokenized, and detokenization is under the control of the client’s customer; that is, the original end user.

No encryption algorithm or encryption key is directly used in the tokenization, just a large set of random steps. The purpose is not to provide secure communications or a one-time pad; it is to remove clear-text data from a customer's computers so that it cannot be stolen.

Nevertheless, the process borrows many of the concepts of the OTP. There is no algorithm that can be decrypted to provide widescale adversarial access to the data. Each character is independently tokenized, so that even if the tokenization process for that character is broken or discovered, it will only provide access to the single character.

The effect is that no two sets of customer data have the same ‘cryptographic’ process, making it similar to the OTP approach. 

“Everyone starts with a robust key management system, with key rotation, and key retirement being a keystone of every encryption management model,” Dave Johnson, CEO and cofounder of Rixon, told SecurityWeek. “After a time, all systems become looser in the sense that the processes and procedures become lax. Paperwork is easily adjusted to reflect compliance, but the reality is that key management systems become outdated and useless. Keys are stolen, compromised, and become known – organizations end up over time with an illusion of security.”

This will get worse in the quantum era. He continued, “With the advent of quantum processors – not that they’re really necessary to compromise encryption – with the implementation of these extremely fast processors the faults and the frailties of encryption will become blatantly apparent.”

Qrypt

Qrypt generates genuinely random numbers through a quantum process. This is the only known approach able to produce true randomness. The company has also developed a method able to provide the same random numbers simultaneously with both the sender and receiver. Both ends of the communication channel can use these numbers to generate the encryption keys without requiring the keys to be sent across the untrusted internet.

The initial purpose was primarily to provide true random numbers for any key generation, since poor or bad random numbers are the primary encryption attack vector. The second purpose was to eliminate the need to send keys across an untrusted network by having the same key independently built at both ends of the communications channel.

This process can be used to improve the safety of both current classical algorithms and NIST's PQC algorithms, or to facilitate a move toward the security of one-time pads – the same process can be harnessed as a one-time pad.

The future for encryption

There is no doubt that current encryption algorithms need to be replaced before the quantum era. NIST is focused on staying with the existing approach – using more complex algorithms to counter more powerful computers. If one-time pads remain impractical (as NIST believes), then this is the only valid way forward.

But startups are already demonstrating that the problems that prevented electronic OTPs in the past are being circumvented by new cloud technology. This throws into stark relief that there is now a genuine choice between NIST's quantum safe solutions and the OTP's quantum secure solution.

Related: Senators Introduce Bipartisan Quantum Computing Cybersecurity Bill

Related: NIST Announces Post Quantum Encryption Competition Winners

Related: CISA Urges Critical Infrastructure to Prepare for Post-Quantum Cryptography

Related: QuSecure Launches Quantum-Resilient Encryption Platform

Kevin Townsend is a Senior Contributor at SecurityWeek. He has been writing about high tech issues since before the birth of Microsoft. For the last 15 years he has specialized in information security; and has had many thousands of articles published in dozens of different magazines – from The Times and the Financial Times to current and long-gone computer magazines.
Data Load Tool (DLT) Market – Business Strategies, Industry Share, Size 2022 to 2030 | By Amazon Web Services, Pennant Technologies, IBM


Oct 13, 2022 (Heraldkeepers) -- New Jersey, United States - A new report on trends and opportunities in the global Data Load Tool (DLT) market for 2022, published by Infinity Business Insights, gives an organized picture of the market through an assessment of research and information gathered from various resources, to help market leaders take a significant role in shaping the overall economy. The study presents a compelling picture of the general situation in terms of market size, market insights, and the competitive landscape.

The Global Data Load Tool (DLT) Market investigation report covers types (On-Premise, Cloud Based), segmentation, and all the logical and factual briefs about the market: the 2022 overview, CAGR, production volume, sales, and revenue, with regional analysis covering North America, Europe, Asia-Pacific, South America, and the Middle East & Africa, plus the prime players and others.

Download a sample Data Load Tool (DLT) Market Report 2022 to 2030 here:

The worldwide Data Load Tool (DLT) market size is estimated to be worth USD million in 2022 and is forecast to reach a readjusted size of USD million by 2030, with a CAGR of % during the review period.

The Data Load Tool (DLT) market is currently present all over the world. The research study provides a comprehensive assessment of the market and includes data supported by the industry, as well as future trends, growth drivers, consumption, production volume, and CAGR values. This study assists individuals and market competitors in making basic judgments for company growth and anticipating future profits.

Data Load Tool (DLT) Market Segmentation & Coverage:

Data Load Tool (DLT) Market segment by Type: 
On-Premise, Cloud Based

Data Load Tool (DLT) Market segment by Application: 
Large Enterprises, Small and Medium-Sized Enterprises

The following years are examined in this study to estimate the Data Load Tool (DLT) market size:

History Year: 2015-2019
Base Year: 2021
Estimated Year: 2022
Forecast Year: 2022 to 2030

Cumulative Impact of COVID-19 on Market:

Surging COVID-19 cases, precautionary travel restrictions, a decline in social events, interruptions to retail operations, and disruptions in the supply of raw materials weighed on market expansion in 2020 and continue to do so. However the pandemic is resolved, continued innovation in the sector should sustain the market's growth trajectory in the future.

Access a sample report copy of the Data Load Tool (DLT) Market: https://www.infinitybusinessinsights.com/request_sample.php?id=836274

Regional Analysis:

The primary regions covered in the Data Load Tool (DLT) market are listed next: North America (United States, Canada, and Mexico), Europe (European Union), Asia-Pacific (China, Japan, Korea, India, Southeast Asia, and Australia), South America (Brazil, Argentina, Colombia, and the rest of South America), and the Middle East and Africa (Saudi Arabia, UAE, Egypt).

The Key companies profiled in the Data Load Tool (DLT) Market:

The study examines the Data Load Tool (DLT) market's competitive landscape and includes data on important suppliers, including Pennant Technologies, IBM, Amazon Web Services, Microsoft, Oracle, SAP, Skillsoft, XLM Solutions, and others.

Table of Contents:

List of Data Sources:

Chapter 2. Executive Summary
Chapter 3. Industry Outlook
3.1. Data Load Tool (DLT) Global Market segmentation
3.2. Data Load Tool (DLT) Global Market size and growth prospects, 2015 – 2026
3.3. Data Load Tool (DLT) Global Market Value Chain Analysis
3.3.1. Vendor landscape
3.4. Regulatory Framework
3.5. Market Dynamics
3.5.1. Market Driver Analysis
3.5.2. Market Restraint Analysis
3.6. Porter’s Analysis
3.6.1. Threat of New Entrants
3.6.2. Bargaining Power of Buyers
3.6.3. Bargaining Power of Buyers
3.6.4. Threat of Substitutes
3.6.5. Internal Rivalry
3.7. PESTEL Analysis
Chapter 4. Data Load Tool (DLT) Global Market Product Outlook
Chapter 5. Data Load Tool (DLT) Global Market Application Outlook
Chapter 6. Data Load Tool (DLT) Global Market Geography Outlook
6.1. Data Load Tool (DLT) Industry Share, by Geography, 2022 & 2030
6.2. North America
6.2.1. Data Load Tool (DLT) Market 2022 -2030 estimates and forecast, by product
6.2.2. Data Load Tool (DLT) Market 2022 -2030, estimates and forecast, by application
6.2.3. The U.S.
6.2.4. Canada
6.3. Europe
6.3.3. Germany
6.3.4. the UK
6.3.5. France
Chapter 7. Competitive Landscape
Chapter 8. Appendix

Get Full INDEX of Data Load Tool (DLT) Market Research Report. Stay tuned for more updates @

FAQs:
What are the Data Load Tool (DLT) market’s complexities?
What are the prospects for the Data Load Tool (DLT) market?
What financial effects will the Data Load Tool (DLT) market have?
Which significant events are most likely to have an important impact on the growth of the Data Load Tool (DLT) industry?

Contact Us:
Amit Jain
Sales Co-Ordinator
International: +1 518 300 3575
Email: inquiry@infinitybusinessinsights.com
Website: https://www.infinitybusinessinsights.com

IBM Streamlines Red Hat Storage Products Within the IBM Storage Business Unit

IBM announced it will add Red Hat storage product roadmaps and Red Hat associate teams to the IBM Storage business unit, bringing consistent application and data storage across on-premises infrastructure and cloud.

With the move, IBM will integrate the storage technologies from Red Hat OpenShift Data Foundation (ODF) as the foundation for IBM Spectrum Fusion. This combines IBM and Red Hat's container storage technologies for data services and helps accelerate IBM's capabilities in the burgeoning Kubernetes platform market.

In addition, IBM intends to offer new Ceph solutions delivering a unified and software defined storage platform that bridges the architectural divide between the data center and cloud providers. This further advances IBM's leadership in the software defined storage and Kubernetes platform markets, according to the vendor.

"Red Hat and IBM have been working closely for many years, and today's announcement enhances our partnership and streamlines our portfolios," said Denis Kennelly, general manager of IBM Storage, IBM Systems. "By bringing together the teams and integrating our products under one roof, we are accelerating the IBM's hybrid cloud storage strategy while maintaining commitments to Red Hat customers and the open-source community."

Benefits of the software defined portfolio available from IBM will include:

  • A unified storage experience for all containerized apps running on Red Hat OpenShift: Customers can use IBM Spectrum Fusion (now with Red Hat OpenShift Data Foundation) to achieve the highest levels of performance, scale, automation, data protection, and data security for production applications running on OpenShift that require block, file, and/or object access to data. This enables development teams to focus on the apps, not the ops, with infrastructure-as-code designed for simplified, automated managing and provisioning.
  • A consistent hybrid cloud experience at enterprise levels of scale and resiliency with IBM Ceph: Customers can deliver their private and hybrid cloud architectures on IBM's unified and software defined storage solution, providing capacity and management features. Capabilities include data protection, disaster recovery, high availability, security, auto-scaling, and self-healing portability, that are not tied to hardware, and travel with the data as it moves between on-premises and cloud environments.
  • A single data lakehouse to aggregate and derive intelligence from unstructured data on IBM Spectrum Scale: Customers can address the challenges that often come with quickly scaling a centralized data approach with a single platform to support data-intensive workloads such as AI/ML, high performance computing, and others. Benefits can include less time and effort to administer, reduced data movement and redundancy, direct access to data for analytics tools, advanced schema management and data governance, all supported by distributed file and object storage engineered to be cost effective.
  • Build in the cloud, deploy on-premises with automation: Customers can move developed applications from the cloud to on-premises services, automate the creation of staging environments to test deployment procedures, validate configuration changes, database schema and data updates, and ready package updates to overcome obstacles in production or correct errors before they become a problem that affects business operations.

"Red Hat and IBM have a shared belief in the mission of hybrid cloud-native storage and its potential to help customers transform their applications and data," said Joe Fernandes, vice president of hybrid platforms, Red Hat. "With IBM Storage taking stewardship of Red Hat Ceph Storage and OpenShift Data Foundation, IBM will help accelerate open-source storage innovation and expand the market opportunity beyond what each of us could deliver on our own. We believe this is a clear win for customers who can gain a more comprehensive platform with new hybrid cloud-native storage capabilities."

Under the agreement between IBM and Red Hat, IBM will assume Premier Sponsorship of the Ceph Foundation, whose members collaborate to drive innovation, development, marketing, and community events for the Ceph open-source project.

IBM Ceph and Red Hat OpenShift Data Foundation will remain 100% open source and will continue to follow an upstream-first model, reinforcing IBM's commitment to these vital communities, according to the company.

Red Hat and IBM intend to complete the transition by January 1, 2023, which will involve the transfer of storage roadmaps and Red Hat associates to the IBM Storage business unit.

Following this date, Red Hat OpenShift Platform Plus will continue to include OpenShift Data Foundation, sold by Red Hat and its partners.

Additionally, Red Hat OpenStack customers will still be able to buy Red Hat Ceph Storage from Red Hat and its partners. Red Hat OpenShift and Red Hat OpenStack customers with existing subscriptions will be able to maintain and grow their storage footprints as needed, with no change in their Red Hat relationship.

Forthcoming IBM Ceph and IBM Spectrum Fusion storage solutions based on Ceph are expected to ship beginning in the first half of 2023.

For more information about this news, visit www.ibm.com.


5 Best CEFs To Buy This Month (October 2022)

In this monthly article, we try to identify five closed-end funds ("CEFs") that have a solid past history, pay high-enough distribution and offer reasonable valuations at the current time.

Since last month, the market sentiment has taken a turn for the worse. In the meantime, the Fed has not only raised the short-term interest rates by 75 basis points for the third time, but another rate hike of 75 basis points is also fully baked in for the next Fed meeting in early November. The 10-year Treasury yield (US10Y) spiked from roughly 3.05% last month to over 3.90% as we write this. As of now, the market is all about the Fed policy and the likelihood of a recession as a result of the Fed's policy. The market, as represented by S&P500, has been testing new lows recently.

Obviously, CEFs have lost a lot of value in recent months, along with the broader markets. The current market downturn has not spared any asset class, so almost all funds are significantly down. Even so, it is of utmost importance that we make our CEF portfolio a diversified one in terms of underlying asset classes. For reference, as of 10/11/2022, our own benchmark "8%-CEF-Income" portfolio is down roughly 17% since the beginning of the year, compared to -24.15% for the S&P 500. Besides, income investors need to measure the attractiveness of a closed-end fund in terms of its income yield and that yield's reliability and sustainability.

All that said, market uncertainties will always remain with us, but that should not prevent us from acting on our long-term investing goals. It's best to keep the focus on our long-term goals and strategies that have proven to work in good times and bad. If you are a new investor and/or starting a brand new CEF portfolio, our recommendation would be to start small and build the positions over time. We believe, for most investors (but not all), a 20%-25% allocation to closed-end and high-income funds should be enough. In that spirit, we keep looking for good investment opportunities and try to separate the wheat from the chaff on a regular basis.

Why Invest In CEFs?

For income-focused investors, closed-end funds remain an attractive investment class that offers high income (generally in the range of 6%-10%, often 8% plus), broad diversification (in terms of variety of asset classes), and market-matching total returns in the long term, if selected carefully and acquired at reasonable price points. A $500K CEF portfolio can generate nearly $40,000 a year, compared to a paltry $7,500 from the S&P 500. Now, if you were a retiree and needed to use all of that income, the portfolio might not grow as much, but it may still grow enough to beat the rate of inflation. That certainly beats investment vehicles like annuities. However, if you are in a position to withdraw 5% (or under 6%), the rest of the yield can be reinvested in the original fund or a new fund to ensure reasonable growth of the capital. In our view, if managed with some due diligence and care, a CEF portfolio could deliver 10% (or better) long-term total returns.

All that said, it's important to be aware of the risks and challenges that come with investing in CEFs. We list various risk factors at the end of this article. They are not suitable for everyone, so please consider your goals, income needs, and risk tolerance carefully before you invest in CEFs.

With that in view, one should buy selectively and in small and multiple lots. No one can predict the future direction of the market with any degree of certainty. So, we continue to be on the lookout for good investment candidates that have a solid track record, offer good yields, and are offering great discounts.

Five Best CEFs To Consider Every Month

This series of articles attempts to separate the wheat from the chaff by applying a broad-based screening process to 500 CEF funds followed by an eight-criteria weighting system. In the end, we're presented with about 30-40 of the most attractive funds in order to select the best five. However, please note that we do not consider funds that have a history of fewer than five years. We use our multi-step filtering process to select just five CEFs from around 500 available funds. For readers who are looking for a wider selection and diversification, we also include a list of the top 10 funds.

This is our regular series on CEFs, where we highlight five CEFs that are relatively cheap, offer "excess" discounts to their NAVs, pay reasonably high distributions, and have a solid track record. We also write a monthly series to identify "5 Safe and Cheap DGI" stocks. You can read our most recent article here.

The selected five CEFs this month, as a group, are offering an average distribution rate of 9.30% (as of 10/07/2022). Besides, these five funds have a proven past record and collectively returned 7.34%, 6.69%, and 7.98% in the last three, five, and ten years. Please keep in mind that these returns are much lower than what they were just a couple of months ago and should be measured in comparison to the broader market. The average leverage for the group is very low at about 6%, with an average discount of -6.50%. Since this is a monthly series, there may be some selections that could overlap from month to month.

Please note that these are not recommendations to buy but should be considered as a starting point for further research.

Author's Note: This article is part of our monthly series that tries to discover the five best buys in the CEF arena at that point in time. Certain parts of the introduction, definitions, and sections describing selection criteria/process may have some commonality and repetitiveness with our other articles in this series. This is unavoidable as well as intentional to keep the entire series consistent and easy to follow for new readers. Regular readers who follow the series from month to month could skip the general introduction and sections describing the selection process. Further, a version of this article is made available a few days early to the subscribers of the HIDIY Marketplace service.

Goals For The Selection Process

Our goals are simple and are aligned with most conservative income investors, including retirees who wish to dabble in CEFs. We want to shortlist five closed-end funds that are relatively cheap, offer good discounts to their NAVs, pay relatively high distributions, and have a solid and substantial past track record in maintaining and growing their NAVs. Please note that we are not necessarily going for the cheapest funds (in terms of discounts or highest yields), but we also require our funds to stand out qualitatively. We adopt a systematic approach to filter down the 500-plus funds into a small subset.

Here's a summary of our primary goals:

  • High income/distributions.
  • Reasonable long-term performance in terms of total return on NAV: We also try to measure if there has been an excess NAV return over and above the distribution rate.
  • Cheaper valuation at the time of buying, determined by the absolute discount to NAV and the "excess" discount offered compared to their history.
  • Coverage ratio: We try to measure to what extent the income generated by the fund covers the distribution. Not all CEFs fully cover the distribution, especially the equity, and specialty funds, as they depend on capital gains to cover their distributions. We adjust this weight according to the type and nature of the fund.

We believe that a well-diversified CEF portfolio should consist of at least 10 CEFs, preferably from different asset classes. It's also advisable to build the portfolio over a period rather than invest in one lump sum. If you were to invest in one CEF every month for a year, you would have a well-diversified CEF portfolio by the year's end. What we provide here every month is a list of five probable candidates for further research. We think a CEF portfolio can be an important component in the overall portfolio strategy. One should preferably have a DGI portfolio as the foundation, and the CEF portfolio could be used to boost the income level to the desired level. How much should one allocate to CEFs? Each investor needs to answer this question himself/herself based on personal situation and factors like the size of the portfolio, income needs, risk appetite, or risk tolerance.

Selection Process

We have more than 500 CEF funds to choose from, which come from different asset classes like equity, preferred stocks, mortgage bonds, government and corporate bonds, energy MLPs, utilities, infrastructure, and municipal income. Just like in other life situations, even though the broader choice always is good, it does make it more difficult to make a final selection. The first thing we want to do is to shorten this list of 500 CEFs to a more manageable subset of around 75-100 funds. We can apply some criteria to shorten our list, but the criteria need to be broad and loose enough at this stage to keep all the potentially good candidates. Also, the criteria that we build should revolve around our original goals. We also demand at least a five-year history for the funds that we consider. However, we do take into account the 10-year history, if available.

Criteria to Shortlist:

| Criteria | Brings the number of funds down to... | Reason for the criteria |
| --- | --- | --- |
| Baseline expense < 2.5% and avg. daily volume > 10,000 | Approx. 435 funds | We do not want funds that charge excessive fees. Also, we want funds that have fair liquidity. |
| Market capitalization > $100 million | Approx. 400 funds | We do not want funds that are too small. |
| Track record/history longer than five years (inception date 2016 or earlier) | Approx. 375 funds | We want funds that have a reasonably long track record. |
| Discount/premium < +7% | Approx. 350 funds | We do not want to pay too high a premium; in fact, we want bigger discounts. |
| Distribution (dividend) rate > 5% | Approx. 260-290 funds | We want the current distribution (income) to be reasonably high. |
| 5-year annualized NAV return > 0% AND 3-year annualized NAV return > 0% | Approx. 220-250 funds | We want funds that have a reasonably good past record of maintaining their NAVs. |

After we applied the above criteria this month, we were left with roughly 240 funds on our list. But it's too long a list to present here or meaningfully select five funds.
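For readers who keep their own fund data, here is a sketch of how these screens might be applied with pandas. The file and column names are illustrative assumptions, not the author's actual workflow.

```python
import pandas as pd

funds = pd.read_csv("cef_universe.csv")  # hypothetical export of ~500 CEFs

screened = funds[
    (funds["baseline_expense_pct"] < 2.5)
    & (funds["avg_daily_volume"] > 10_000)
    & (funds["market_cap_mm"] > 100)        # in $ millions
    & (funds["inception_year"] <= 2016)     # at least five years of history
    & (funds["discount_premium_pct"] < 7)   # premiums capped at +7%
    & (funds["distribution_rate_pct"] > 5)
    & (funds["nav_return_5yr_pct"] > 0)
    & (funds["nav_return_3yr_pct"] > 0)
]
print(len(screened))  # roughly 240 funds this month, per the text
```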

Note: All tables in this article have been created by the author (unless explicitly specified). Most of the data in this article are sourced from Cefconnect.com, Cefa.com, and Morningstar.com.

Narrowing Down To 50-60 Funds

To bring down the number of funds to a more manageable count, we will shortlist ten funds based on each of the following criteria. After that, we will apply certain qualitative criteria to each fund and rank them to select the top five.

At this stage, we also eliminate certain funds that have had substantial negative NAV returns for both three-year and five-year periods.

Seven Broad Criteria:

  • Excess discount/premium (explained below).
  • Distribution rate.
  • Return on NAV, last three years (medium-term).
  • Return on NAV, last five years (long term).
  • Coverage ratio.
  • Excess return over distributions.
  • The total weight (calculated up to this point).

Excess Discount/Premium:

We certainly like funds that are offering large discounts (not premiums) to their NAVs. But sometimes, we may consider paying near zero or a small premium if the fund is otherwise great. So, what's important is to look at the "excess discount/premium" and not at the absolute value. We want to see the discount (or premium) relative to the fund's own history, say its 52-week average.

Subtracting the 52-week average discount/premium from the current discount/premium will provide us the excess discount/premium. For example, if the fund has the current discount of -5%, but the 52-week average was +1.5% (premium), the excess discount/premium would be -6.5%.

Excess Discount/Premium = Current Discount/Premium − 52-Week Average Discount/Premium
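As a quick sketch of the arithmetic (not part of any published screening tool), using the example figures from the paragraph above:

```python
def excess_discount(current_pct: float, avg_52wk_pct: float) -> float:
    # Negative values are discounts, positive values are premiums.
    return current_pct - avg_52wk_pct

# A fund at a -5% discount whose 52-week average was a +1.5% premium:
print(excess_discount(-5.0, 1.5))  # -6.5, i.e., an excess discount of -6.5%
```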

So, what's the difference between the 12-month Z-score and this measurement of excess discount/premium? The two measurements are quite similar, with a subtle difference. The 12-month Z-score indicates how expensive (or cheap) the CEF is in comparison to the previous 12 months, and it also takes into account the standard deviation of the discount/premium. Our measurement (excess discount/premium) simply compares the current valuation with the 12-month average.

We sort our list (of 242 funds) on the "excess discount/premium" in descending order. For this criterion, the lower the value, the better it is. So, we select the top 11 funds (most negative values) from this sorted list.

(All data as of 10/07/2022)

Table 1: (table image not reproduced)

High Current Distribution Rate:

After all, most investors invest in CEF funds for their juicy distributions. We sort our list on the current distribution rate (descending order, highest at the top) and select the top 12 funds from this sorted list.

Table 2: (table image not reproduced)

Medium-Term Return on NAV (last three years):

We then sort our list on a three-year return on NAV (in descending order, highest at the top) and select the top 11 funds.

Table 3: (table image not reproduced)

Five-Year Annualized Return on NAV:

We then sort our list on the five-year return on NAV (in descending order, highest at the top) and select the top 11 funds.

Table 4: (table image not reproduced)

Coverage Ratio (Distributions Vs. Earnings):

We then sort our list on the coverage ratio and select the top 10 funds. The coverage ratio is derived by dividing the earnings per share by the distribution amount for a specific period. Please note that in some cases the coverage ratio may not be very accurate, since the "earnings per share" figure may be three to six months old. But in most cases, it's fairly accurate.

Table 5: (table image not reproduced)

Excess Return Over Distribution:

This is the "excess return" provided by the fund over the distribution rate. It's calculated by subtracting the distribution rate from the three-year NAV return.

Table 6: (table image not reproduced)

Total Weight (Quality Score) Calculated Up to This Point:

Note: The Total Weight calculation is not fully completed at this point since we have not taken into account the 10-year NAV return. Also, we would adjust the weight for the coverage ratio at a later stage. However, we select the top 15 names on this basis.

Table 7: (table image not reproduced)

Now we have 80 funds in total from the above selections.

We will see if there are any duplicates. In our current list of 80 funds, there were 36 duplicates, meaning there are funds that appeared more than once. The following names appear twice (or more):

Appeared two times:

  • ACP, BST, ECC, GLQ, PFD, PSF, BANX, XFLT, ZTR (9 duplicates)

Appeared three times:

  • ACV, BCX, BME, DFP, THQ , STK (12 duplicates)

Appeared four times:

Appeared five times:

  • HFRO, MPV, OXLC (12 duplicates)

So, once we remove 36 duplicate rows, we are left with 44 (80 - 36) funds.

Note: It may be worthwhile to mention here that just because a fund has appeared multiple times does not necessarily make it an attractive candidate. Sometimes, a fund may appear multiple times simply for the wrong reasons, like a high current discount, high excess discount, or a very high distribution rate that may not be sustainable. But during the second stage of filtering, it may not score well on the overall quality score due to other factors like poor track record. That said, if a fund has appeared four times or more, it may be worth a second look.

Narrowing Down To Just 10-12 Funds

In our list of funds, we already may have some of the best probable candidates. However, so far, they have been selected based on one single criterion that each of them may be good at. That's not nearly enough. So, we will apply a combination of criteria by applying weights to eight factors to calculate the total quality score and filter out the best ones.

We will apply weights to each of the eight criteria:

  • Baseline expense (Max weight 5)
  • Current distribution rate (Max weight 7.5)
  • Excess discount/premium (Max weight 5)
  • 3-YR NAV return (Max weight 5)
  • 5-YR NAV return (Max weight 5)
  • 10-YR NAV return (Max weight 5, if less than ten years history, an average of three-year and five-year)
  • Excess NAV return over distribution rate (Max weight 5)
  • Adjusted Coverage Ratio (Max weight 7.5): This weight is adjusted based on the type of fund to provide fair treatment to certain types like equity and sector funds. We assign some bonus points to certain types of funds, which by their make-up, depend on capital gains to fund their distributions, to bring them at par with fixed-income funds. These fund types include Equity/ Sector equity (two bonus points), real estate (two points), covered call (two points), and MLP funds (variable). However, please note that this is just one of nine criteria that are being used to calculate the total quality score.

Once we have calculated the weights, we combine them to calculate the "Total Combined Weight," also called the "Quality Score."
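As a simplified sketch of how such a combination might work, the snippet below scales each criterion to a 0..1 score and multiplies by the maximum weights listed above. The normalization is our assumption; the article does not publish its exact scoring formula.

```python
# Maximum weights per criterion, as listed in the article.
MAX_WEIGHTS = {
    "baseline_expense": 5.0, "distribution_rate": 7.5, "excess_discount": 5.0,
    "nav_3yr": 5.0, "nav_5yr": 5.0, "nav_10yr": 5.0,
    "excess_return": 5.0, "coverage_ratio": 7.5,
}

def quality_score(normalized: dict) -> float:
    # 'normalized' maps each criterion to a 0..1 score; weight and sum them.
    return sum(normalized[k] * w for k, w in MAX_WEIGHTS.items())

example = {k: 0.8 for k in MAX_WEIGHTS}  # a fund scoring 80% on every criterion
print(quality_score(example))  # 36.0 out of a possible 45.0
```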

The sorted list (spreadsheet) of 44 funds on the "combined total weight" is attached here:

File-for-export_-_5_Best_CEFs_-_OCT_2022.xlsx

Sector-Wise (Asset-Class) Diversification:

In order to structure a CEF portfolio, it's highly recommended to diversify in funds that invest in different types of asset classes. We try to select roughly 30 top funds (out of the current list of 44) based on the types of asset class and quality scores. They are listed below. This list includes no more than three funds from any single asset class. Also, please note that the quality score only indicates the likelihood of a good candidate, but investors should do further research and due diligence on individual names. Also, an otherwise good fund may not make it to the top because it may have become expensive and may not offer value at the current pricing.

In our list of 44, if we look first at asset type (sector) and then at the total quality score/weight, below is the list of top funds. However, if we had too many similar funds from the same fund family, we would generally ignore some of them to avoid duplication. We selected 30 names as top funds this month. Please note that some asset classes may not show any names in a particular month because these ratings are dynamic and time-sensitive and change from month to month.

STK, EOS, GLQ, FDEU, ZTR, ASG, NIE, CSQ, RIV, ACV, NCV, NCZ, MPV, MCI, HYT, HFRO, OXLC, BGT, BTA, BLE, PSF, LDP, DFP, HQL, THQ, BME, BCX, BANX, BGR, BST.

Table 8: (table image not reproduced)

10-Positions Portfolio of The Month

If you were to select ten picks, we could simply pick the top one from each of the above categories. That said, due diligence on each name is still highly recommended. Please note that some of these funds may have cut their distributions recently, and for some folks, that may be a good enough reason not to consider them. Also, in our final selections, we tend to give priority to funds that pay regular and consistent distributions on a monthly or quarterly basis. Funds with inconsistent dividends (even if they are high) generally do not make it to our top list. Also, be aware that single-country funds often score high in our rankings. Many of them pay variable dividends. In addition, being single-country funds, they can be inherently riskier, since their future returns are tied to just one country, whether through economic, regulatory, or geopolitical factors.

Here's the list of the top 10 selections (from different asset classes):

(MPV), (HQL), (BCX), (STK), (ACV), (PSF), (OXLC), (ASG), (EOS), (ZTR).

Table 9: (table image not reproduced)

Final Selection: Our List Of Final Top 5

5-Positions Portfolio of the Month:

Now, if we had only five slots for investment and needed to select just five funds, we would need to make some subjective choices. We think our list of 10 selections above is quite compelling, and there are certainly more than five names that we like. While we narrow down this list, we should be careful to keep the list as diversified as possible in terms of asset classes. Since this step is mostly subjective, the choice would differ from person to person. Nonetheless, here are the selections for this month, based on our perspective:

Table: The Final 5 Funds:

Table 10: (table image not reproduced)

Some information about the selections:

  • We recommend that readers look at both the top 10 and top 5 lists. The top-10 list offers much more of a diversified lot (compared to the top 5) for the current market environment. The distribution amount per share has been going down with the NAV, but the yield is still over 9% at current prices.
  • HQL is our pick from the Healthcare sector (from Tekla group). The discount and yield are excellent currently. The healthcare sector is likely to do well in the medium to long term. To top it, we are getting nearly an 8% excess discount right now (compared to the 52-wk average).
  • EOS (from the Eaton fund family) is our choice for a fund using covered calls. The distribution yield is excellent at 10.5%, and the past performance record is pretty good. It is being offered at a slight discount to NAV right now.
  • BCX is our selection from the Resource and Commodities sector. It belongs to the Blackrock fund family and is likely to do well in an inflationary environment. The discount to NAV is in excess of -10%, though the yield is still a bit on the lower side at 6.75%. The fund has a decent long-term record as well, and the distributions are covered nearly 50% by the investment income.
  • ACV is mainly a convertible fund with a mix of equity. It invests roughly 70% in convertible and fixed-income securities and about 30% in equity stocks. It has an excellent track record of providing average annual returns of 9.21% and 9.66% over the last three and 5-years. The fund increased the distributions earlier this year by about 8%, and the yield is at 11.5% currently. If you believe that equity markets are going to go lower from here, then you should buy this in multiple lots (the second and maybe the third lot at further lower prices).
Finally, ASG is a pure equity fund. Because it is a growth-oriented fund, it has been hammered in recent months. The fund uses almost no leverage and is generally invested in large-cap growth-oriented companies like Microsoft, Visa, UnitedHealth, Amazon, etc. It has a variable dividend and currently yields over 8.5%.

CEF-Specific Investment Risks

It goes without saying that CEFs, in general, have some additional risks. This section is specifically relevant for investors who are new to CEF investing, but in general, all CEF investors should be aware of it.

They generally use some amount of leverage, which adds to the risk. The leverage can be hugely beneficial in good times but can be detrimental during tough times. The leverage also causes higher fees because of the interest expense in addition to the baseline expense. In the tables above, we have used the baseline expense only. If a fund is using significant leverage, we want to make sure that the leverage is used effectively by the management team - the best way to know this is to look at the long-term returns on the NAV. NAV is the "net asset value" of the fund after counting all expenses and after paying the distributions. So, if a fund is paying high distributions and maintaining or growing its NAV over time, it should bode well for its investors.

Due to leverage, the market prices of CEFs can be more volatile as they can go from premium pricing to discount pricing (and vice versa) in a relatively short period. Especially during corrections, the market prices can drop much faster than the NAV (the underlying assets). Investors who do not have an appetite for higher volatility should generally stay away from CEFs or at least avoid the leveraged CEFs.

CEFs have market prices that are different from their NAVs (net asset values). They can trade either at discounts or at premiums to their NAVs. Generally, we should stay away from paying any significant premiums over the NAV prices unless there are some very compelling reasons.

Another risk factor may come from asset concentration risk. Many funds may hold similar underlying assets. However, this is easy to mitigate by diversifying into different types of CEFs ranging from equity, equity-covered calls, preferred stocks, mortgage bonds, government and corporate bonds, energy MLPs, utilities, and municipal income.

Concluding Thoughts

We use our screening process to highlight the five likely best closed-end funds for investment each month. We also provide a larger list of ten CEFs, with some of the top candidates from each of the asset classes. As always, our filtering process demands that our selections have an excellent long-term record, maintain decent earnings to cover the distributions (in certain categories), offer an average of 7%-8% distributions, are cheaper on a relative basis, and offer a reasonable discount. Also, we ensure that the selected five funds are from a diverse group in terms of the types of assets. Please note that these selections are based on our rating system and are dynamic in nature, so they can change from month to month (or even week to week). At the same time, some funds can repeat from month to month if they remain attractive over an extended period. Also, note that not every good fund will make it to this list, because it may not be attractively priced or may be trading at a significant premium when we run our filtering process.

The selected five CEFs this month, as a group, are offering an average distribution rate of 9.30% (as of 10/07/2022). Besides, these five funds have a proven past record and collectively returned 7.34%, 6.69%, and 7.98% in the last three, five, and ten years. Please keep in mind that these returns are much lower than what they were just a couple of months ago and should be measured in comparison to the broader market. The average leverage for the group is very low at about 6%, with an average discount of -6.50%

When it comes to CEF investing, we always recommend that it's best to be a bit conservative and build our positions by adding in small and multiple lots to take advantage of dollar-cost averaging. We believe that the above group of CEFs makes a great watchlist for further research.

Professional Video Live Streaming Solution Market Overview 2022 to 2030, Future Trends and Forecast | By Brightcove, Haivision, IBM


Oct 11, 2022 (Heraldkeepers) -- New Jersey, United States - This Professional Video Live Streaming Solution market report examines the regional and global markets as well as overall growth opportunities in the industry. Additionally, it provides insight into the complete competitive landscape of the global Professional Video Live Streaming Solution industry. The research also includes a dashboard summary of the leading companies, outlining their successful marketing strategies, market contribution, and recent developments in both historical and contemporary contexts.

The Global Professional Video Live Streaming Solution Market investigation report covers types (Transcoding and Processing, Video Management, Video Delivery and Distribution, Video Analytics, Video Security, Others), segmentation, and all the logical and factual briefs about the market: the 2022 overview, CAGR, production volume, sales, and revenue, with regional analysis covering North America, Europe, Asia-Pacific, South America, and the Middle East & Africa, plus the prime players and others.

Download a sample Professional Video Live Streaming Solution Market Report 2022 to 2030 here:

The worldwide Professional Video Live Streaming Solution market size is estimated to be worth USD million in 2022 and is forecast to reach a readjusted size of USD million by 2030, with a CAGR of % during the review period.

The Professional Video Live Streaming Solution market research provides a detailed analysis of the industry, with information on a variety of angles, including drivers, constraints, opportunities, and risks. This information can guide stakeholders toward wise decisions before they invest. The primary research begins with validating the data gathered through the secondary research.

Professional Video Live Streaming Solution Market Segmentation & Coverage:

Professional Video Live Streaming Solution Market segment by Type: 
Transcoding and Processing, Video Management, Video Delivery and Distribution, Video Analytics, Video Security, Others

Professional Video Live Streaming Solution Market segment by Application: 
Broadcasters, Operators, and Media, BFSI, Education, Healthcare, Government, Others

The following years are examined in this study to estimate the Professional Video Live Streaming Solution market size:

History Year: 2015-2019
Base Year: 2021
Estimated Year: 2022
Forecast Year: 2022 to 2030

Cumulative Impact of COVID-19 on Market:

More than 20 million COVID-19 cases had been confirmed as of the study's start date, and the pandemic had not yet been effectively controlled. We predict that the global Professional Video Live Streaming Solution market will reach million USD by the end of 2021, at a CAGR of % between 2022 and 2030, and that the pandemic will have been largely contained by then.

Access a Sample Report Copy of the Professional Video Live Streaming Solution Market: https://www.infinitybusinessinsights.com/request_sample.php?id=1020035

Regional Analysis:

The Asia-Pacific (APAC) region is anticipated to experience the greatest rate of growth in the Professional Video Live Streaming Solution Market among all geographical areas. One explanation for this growth is the widespread development of countries like South Korea, China, Japan, and India. The rate of growth in China is stabilizing as the country weighs various balancing measures, stock prices, and current output.

The Key companies profiled in the Professional Video Live Streaming Solution Market:

The study examines the Professional Video Live Streaming Solution market’s competitive landscape and includes data on important suppliers, including Brightcove, Haivision, IBM, Ooyala, Vbrick, Qumu Corporation, Kaltura, Contus, Sonic Foundry, Panopto, Wowza Media Systems, Kollective Technology,& Others

Table of Contents:

List of Data Sources:

Chapter 2. Executive Summary
Chapter 3. Industry Outlook
3.1. Professional Video Live Streaming Solution Global Market segmentation
3.2. Professional Video Live Streaming Solution Global Market size and growth prospects, 2015 – 2026
3.3. Professional Video Live Streaming Solution Global Market Value Chain Analysis
3.3.1. Vendor landscape
3.4. Regulatory Framework
3.5. Market Dynamics
3.5.1. Market Driver Analysis
3.5.2. Market Restraint Analysis
3.6. Porter’s Analysis
3.6.1. Threat of New Entrants
3.6.2. Bargaining Power of Buyers
3.6.3. Bargaining Power of Suppliers
3.6.4. Threat of Substitutes
3.6.5. Internal Rivalry
3.7. PESTEL Analysis
Chapter 4. Professional Video Live Streaming Solution Global Market Product Outlook
Chapter 5. Professional Video Live Streaming Solution Global Market Application Outlook
Chapter 6. Professional Video Live Streaming Solution Global Market Geography Outlook
6.1. Professional Video Live Streaming Solution Industry Share, by Geography, 2022 & 2030
6.2. North America
6.2.1. Professional Video Live Streaming Solution Market 2022 -2030 estimates and forecast, by product
6.2.2. Professional Video Live Streaming Solution Market 2022 -2030, estimates and forecast, by application
6.2.3. The U.S.
6.2.4. Canada
6.3. Europe
6.3.3. Germany
6.3.4. The UK
6.3.5. France
Chapter 7. Competitive Landscape
Chapter 8. Appendix

Get Full INDEX of Professional Video Live Streaming Solution Market Research Report. Stay tuned for more updates @

FAQs:
Which regional markets offer the most promising opportunities for Professional Video Live Streaming Solution market participants?
What factors will affect demand in the Professional Video Live Streaming Solution market?
What will the changing Professional Video Live Streaming Solution market trends mean?
What effects will COVID-19 have on the Professional Video Live Streaming Solution market?

Contact Us:
Amit Jain
Sales Co-Ordinator
International: +1 518 300 3575
Email: inquiry@infinitybusinessinsights.com
Website: https://www.infinitybusinessinsights.com

Killexams : Third-party risk: What it is and how CISOs can address it

In today’s world where business processes are becoming more complex and dynamic, organizations have started to rely increasingly on third parties to bolster their capabilities for providing essential services. 

However, while onboarding third-party capabilities can optimize distribution and profits, third parties come with their own set of risks and dangers. For example, third-party vendors who share systems with an organization may pose security risks that can have significant financial, legal and business consequences. 

According to Gartner, organizations that hesitate to expand their ecosystem for fear of the risks it can create will likely be overtaken by organizations that boldly decide to seize the value of third-party relationships, confident in their ability to identify and manage the accompanying risks effectively. Therefore, it’s critical to handle third-party security risks efficiently and effectively.

Risk and compliance

Third parties can increase an organization’s exposure to several risks that include disrupted or failed operations, data security failures, compliance failures and an inconsistent view of goals for the organization. According to an Intel471 threat intelligence report, 51% of organizations experienced a data breach caused by a third party. 

“Organizations often grant third parties access to networks, applications, and resources for legitimate business reasons. However, when doing so with a legacy VPN, they often provide overly broad access to an entire network, rather than granular access to the specific apps and resources needed to do their job,” John Dasher, VP of product marketing, Banyan Security told VentureBeat.

Third-party risks have grown so much that compliance regulations have become essential to an organization's processes and policies. Despite evolving regulations and an increase in confidence in risk programs across the board, a report by Deloitte found that more than 40% of organizations do not perform enhanced due diligence on third parties.

The rising cybersecurity threat 

As the need for third-party risk management becomes more apparent to organizations, risk management teams have begun going to great lengths to ensure that vendors do not become liabilities when they become a crucial part of business operations. 

However, when organizations incorporate a third party into their business operations, they often unknowingly incorporate that third party's own partners as well, whether now or in the future. This can expose organizations to numerous forms of risk, especially in terms of cybersecurity.

“It’s a huge concern as companies can’t just stop working with third parties,” said Alla Valente, senior analyst at Forrester. According to her, as businesses shifted from “just-in-time” efficiency to “just-in-case” resilience after the pandemic, many doubled the number of third parties in their ecosystem to improve their business resilience.

“Third parties are critical for your business to achieve its goals, and each third party is a conduit for breach and an attack vector. Therefore, if your third parties cannot perform due to a cyberattack, incident, or operational disruption, it will impact your business,” explained Valente. 

Third parties that provide vital services to an organization often have some form of integration with its network. As a result, if a third party does not effectively manage or follow a cybersecurity program, any vulnerability within its cybersecurity framework can be exploited and used to access the original organization's data.

Again, this becomes a growing concern, especially when a complex web of various vendors is created through third-party relationships that are all connected throughout their network. 

Adam Bixler, global head of third-party cyber risk management at BlueVoyant, says that threat actors use the weakest touchpoint to gain access to their target and, often, it is the weakest link in a third-party supply chain that threat actors focus on to navigate upstream to the intended company.

“In general, we have seen that cyberthreat actors are opportunistic. This has been a highly successful technique, and until security practices are implemented systematically and equally throughout the entire third-party ecosystem, all involved are at risk of this type of attack,” said Bixler. 

Bixler told VentureBeat that when BlueVoyant surveyed executives with responsibility for cybersecurity across the globe, it was found that 97% of surveyed firms had been negatively impacted by a cybersecurity breach in their supply chain. 

A large majority (93%) admitted that they had suffered a direct cybersecurity breach because of weaknesses in their supply chain, and the average number of breaches experienced in the last 12 months grew from 2.7 in 2020 to 3.7 in 2021 — a 37% year-over-year increase.

Image source: Gartner.

It is not only cybersecurity that poses a severe risk, but any disruption to any business across the web of third parties can cause a chain reaction and thus greatly hinder essential business operations.

“The real danger lies in accepting third-party files from unauthorized or authorized vendors who don’t know they have been compromised. Over 80% of attacks originate from weaponized Office and PDF files that look legitimate. If those files are allowed inside your organization, they pose a threat if downloaded,” said Karen Crowley, director of product solutions at Deep Instinct.

Crowley said that multistage attacks are low and slow, with threat actors willing to wait for their moment to get to the crown jewels.

Hazards of a third-party data breach

Enhancing access and data sharing can provide social and economic benefits to organizations while showcasing good public governance. However, data access and sharing also come with several risks. These include the dangers of confidentiality or privacy breaches, and violation of other legitimate private interests, such as commercial interests. 

“The primary dangers of sharing information with undocumented third parties or third-party vendors is that you have no way of knowing what their security program consists of or how it is implemented, and therefore no way to know how your data will be maintained or secured once you share,” said Lorri Janssen-Anessi, director, external cyber assessments at BlueVoyant. 

According to Anessi, it's critical to safeguard your proprietary information and to demand the same level of security from the third parties and vendors you engage with. She recommends that, when sharing data with a third party, enterprises have a system for onboarding vendors that includes knowing the third party's cyber-risk posture and how those risks will be mitigated.

Organizations that do not take proper precautions to protect themselves against third-party risk expose their businesses to both security and non-compliance threats.

These data breaches may be incredibly disruptive to your organization and have profound implications, including the following:

  • Monetary losses: Data breaches are costly regardless of how they occur. According to the Ponemon Institute and IBM's Cost of a Data Breach report, the average cost of a data breach is $3.92 million, with each lost record costing $150. The cause of the breach is one factor that affects its cost, and a breach costs more when a third party is involved: based on the analysis, a third-party data breach often adds more than $370,000, for an adjusted average total cost of $4.29 million.
  • Exposure of sensitive information: Third-party data breaches can result in the loss of your intellectual property and consumer information. Several attack vectors can expose a company’s private information and inflict considerable damage, ranging from data-stealing malware to ransomware attacks that lock you out of your business data and threaten to sell it if the ransom is not paid.
  • Damaged reputation: Reputational harm is one of the most severe repercussions of a data breach. Even if the data breach was not your fault, the fact that your clients trusted you with their information and you let them down is all that matters. This might also have a significant financial impact on your company.
  • Potential for future attacks: When cybercriminals access your data through a third party, that breach may not be their endgame. It may simply be the beginning of a more extensive campaign of hacks, attacks and breaches, or the information stolen might be intended for use in phishing scams or other fraud. The collected data might be used in later attacks.

Best practices to mitigate third-party risk

Philip Harris, director, cybersecurity risk management services at IDC, says that to mitigate third-party risks more effectively, it is important to work with the appropriate teams within an organization that have the most knowledge about all the third parties the company deals with.

“Doing so can not only help create an inventory of these third parties, but also help classify them based upon the critical nature of the data they hold and/or if they’re part of a critical business process,” said Harris. 

Jad Boutros, cofounder and CEO of TerraTrue, says it is important for organizations to understand the security posture of all of their third parties by asking questions during due diligence and security certification reviews. 

According to Boutros, a few strategic guidance points that CISOs can follow to avoid third-party security hazards are:

  • Understand what data is shared between the organization and the third party. If it is possible to avoid sharing sensitive data, or to transform it (e.g., by bracketing, anonymizing, or minimizing it) to defend against certain misuses, such mitigations are worth considering; a minimal sketch follows this list. 
  • Some third parties may also expose particularly risky functionalities (e.g., transferring data over insecure channels, or exposing additional power-user functionality); if not needed, finding ways to disable them will make for a safer integration. 
  • Lastly, regularly reviewing who in the organization has access to the third party and/or elevated access helps reduce the blast radius of an internal account compromise.
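
To make the first point concrete, here is a minimal Python sketch of minimizing and anonymizing a record before it is shared with a third party. The field names, the allow-list, and the bracketing rule are illustrative assumptions, not a prescribed standard:

```python
import hashlib

# Only the fields the third party genuinely needs leave the organization;
# all names and rules here are illustrative assumptions.
FIELDS_THIRD_PARTY_NEEDS = {"customer_id", "age", "state", "plan"}

def minimize_and_anonymize(record: dict, salt: str) -> dict:
    shared = {k: v for k, v in record.items() if k in FIELDS_THIRD_PARTY_NEEDS}
    # Pseudonymize the direct identifier with a salted hash.
    shared["customer_id"] = hashlib.sha256(
        (salt + str(shared["customer_id"])).encode()
    ).hexdigest()[:16]
    # Bracket the age into a coarse band instead of sharing the exact value.
    decade = (shared["age"] // 10) * 10
    shared["age"] = f"{decade}-{decade + 9}"
    return shared

record = {"customer_id": 48211, "name": "A. Doe", "ssn": "xxx-xx-xxxx",
          "age": 37, "state": "OH", "plan": "gold"}
print(minimize_and_anonymize(record, salt="per-partner-secret"))
# name and ssn never leave the organization; age is shared as "30-39"
```
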
Image source: Gartner.

Other preventive solutions

A few other solutions that organizations can implement to prevent third-party risks are:

Third-party risk management (TPRM) program

With increased exposure due to cooperating with third parties, the necessity for an effective third-party risk management (TPRM) program has grown significantly for organizations of all sizes. TPRM programs can help analyze and control risks associated with outsourcing to third-party vendors or service providers. This is especially true for high-risk vendors who handle sensitive data, intellectual property or other sensitive information. In addition, TPRM programs enable organizations to ensure that they are robust and have 360-degree situational awareness of potential cyber-risks.

Cyberthreat intelligence (CTI) architectures

Another preventive security measure is implementing cyberthreat intelligence (CTI) architectures. CTI focuses on gathering and evaluating information concerning present and future threats to an organization's safety or assets. The advantage of threat intelligence is that it is a proactive solution: it can warn businesses about data breaches in advance, reducing the financial cost of cleaning up after an incident. Its goal is to give businesses a thorough awareness of the dangers that pose the greatest risk to their infrastructure and to advise them on how to defend their operations.

Security ratings

Security ratings, often known as cybersecurity ratings, are becoming a popular way to assess third-party security postures in real time. They enable third-party risk management teams to undertake due diligence on business partners, service providers, and third-party suppliers in minutes — rather than weeks — by analyzing their external security posture promptly and objectively. Security ratings cover a significant gap left by traditional risk assessment approaches like penetration testing and on-site visits. 

Traditional methods are time-consuming, point-in-time, costly, and frequently rely on subjective evaluations. Furthermore, validating suppliers’ assertions regarding their information security policies might be difficult. Third-party risk management teams can obtain objective, verifiable and always up-to-date information about a vendor’s security procedures by employing security ratings with existing risk management methodologies.

Future challenges and important considerations

Harris says that third parties have always been an area where the attack surface has grown, but this hasn't been taken too seriously; companies have turned a blind eye to it instead of seeing it as a real potential threat.

“Third parties need to be a board-level subject and part of the overall security metrics created to manage security holistically. There are various solutions, but these unfortunately require humans as part of the assessment process,” said Harris.

Gartner’s survey found that risk monitoring is a common gap in third-party risk management. In such cases, an enterprise risk management (ERM) function can provide valuable support for managing third-party risks. Organizations that monitor changes in the scope of third-party risk relationships yield the most positive risk outcomes, and ERM can support monitoring changes in third-party partnerships to manage the risk better.

According to Avishai Avivi, CISO at SafeBreach, most third-party risk solutions available today only provide an overview of cybersecurity, but the problem is much more profound. 

Avivi said third-party breaches through supply chains are another growing risk vector that CISOs need to consider. To prevent attacks through supply chain endpoints, he highly recommends that companies that work with a significant amount of customer-sensitive data consider developing a full privacy practice.

“Solutions still need to evolve to support third-party assessments of the vendor’s privacy posture. While there are plenty of third parties that get SOC 2 and ISO 27001 audits, they are still not enough to get their privacy practices audited. Most companies do not look for the “privacy” category of SOC 2 or the ISO 27701 certificate. The solutions available today still need to mature before they can match the need,” Avivi explained.

Killexams : John Deere And The Holy Grail: Quest For The Source Of Truth In Ag

Overview

Deere & Company's (NYSE:DE) announcement of an autonomous self-driving tractor at the Consumer Electronics Show (CES) earlier this year has turned heads, generated a lot of media publicity and excitement, and earned Deere the "Tesla of farming" nickname. As the self-driving feature will be available as an add-on attachment to existing GPS-equipped Deere tractors as well as on its new tractors, the adoption of this self-driving capability could be reasonably rapid.

I decided to examine the implications for the agriculture industry and Deere shareholders.

Thesis

My investment thesis for Deere is centered around the secular increase in demand for food that is driven by: (1) the growing global population, and (2) the rising middle class's consumption of more animal protein from livestock that feed on grain.

Deere is a leading agricultural equipment provider with a long history of driving productivity improvements; its recent attention-grabbing announcement of self-driving tractors will be followed by more technological innovations that leverage data, artificial intelligence, and machine learning to enable farmers to produce more output with fewer inputs and increase their profits.

While the impact on near-term profitability may be more muted, Deere will create significant value for society, farmers, and shareholders if it achieves the holy grail of creating the source of truth for the agricultural industry.

More food production is needed to feed more people around the world

The global population continues to increase and is expected to reach almost 10 billion by 2050, up from under 8 billion today (figure 1).

Figure 1: Projected global population (source: Pew Research Center)

Rise of the global middle class

Rise of per-capita income of the middle class around the world also increases demand for animal-based protein, which requires grain to feed the livestock that provide the protein (figure 2).

Figure 2: Per capita income; rise of per-capita GDP (source: The World Bank)

Furthermore, high population growth in Africa, which is expected to account for 25% of the global population by 2050 (up from 15% in 2010), will drive the need for additional agricultural equipment to farm arable land in the region (figure 3).

Figure 3: Re-distribution of global population towards Africa by 2050 (source: Pew Research Center)

Agricultural productivity improvements needed to feed to world

Deere Senior Principal Software Engineer Greg Finch noted in a presentation earlier this year that farming faces problems similar to manufacturing's: the industry needs to do more with less, eliminate waste, avoid excess inventory, have the right amounts of resources available when and where they are needed, and ensure that distribution logistics are available at harvest time.

There is ongoing pressure to get more output from each seed and every acre of land while reducing the amount of labor and inputs such as fertilizer and herbicides. However, unlike manufacturers which can control the production environment within their factories, farmers are at the mercy of uncontrollable exogenous factors such as weather, which determines the window for planting and harvesting. Unfavorable and extreme weather during the planting and harvesting seasons can greatly narrow the work window, making it difficult for farmers to procure the labor required to get the work done within the timeframe. Pests and crop diseases can also cause yields to plummet.

Agricultural equipment has gone a long way since the days farmers used human labor and horses to work the land. Companies like Deere have developed the modern tractor and implements that help farmers till, seed, irrigate, apply fertilizer & herbicide, harvest, and bale more efficiently with less labor. Over the last 20 years, precision agriculture has incorporated the use of computers and GPS navigation systems to help farmers make better decisions in crop planning, field mapping, soil sampling, tractor guidance, crop scouting, pesticide applications, and yield mapping. GPS guidance technology also allows farmers to work at night and in low visibility field conditions such as rain, dust, and fog.

Mechanization, together with advances in seed technology, has raised output per unit of labor 16-fold since World War II (figure 4).

Figure 4: Agriculture productivity has soared since World War II (source: McKinsey and Company)

Labor and wages in agriculture

I have been under the mistaken impression that population migration away from rural farming areas to urban cities has caused a downward trend in agriculture employment. This was certainly true from 1950 through 1980 when agricultural employment more than halved from 5.8 million to 2.4 million as mechanization contributed to rising agricultural productivity and reduced the need for labor. However, employment levels have stabilized since 1990 and began ticking up since 2015 (figure 5).

(I note the sharp one-time drop in agricultural employment around 2000; the one explanation I can find is that the USDA's Farm Labor Survey stopped estimating the number of family farmworkers around that time.)

Figure 5: Employment in agriculture (source: St. Louis Federal Reserve FRED, series LFEAAGTTUSM647S)

Labor costs have increased by about 20% over the last five years (figure 6), and labor is difficult to procure during peak planting and harvesting seasons, particularly when unfavorable weather narrows the work window.

Figure 6: Employment cost index (source: St. Louis Federal Reserve FRED, series CIS2020000405000I)

Deere, self-driving tractors, and digital twins

Deere is a leading heavy equipment manufacturer with a 53 percent share of the U.S. market for large tractors, 60 percent of the U.S. market for farm combines, and 18 percent share of the overall farming equipment market. It has the largest share in an oligopolistic market where the four largest manufacturers control 45 percent of all sales.

Earlier this year at the CES show, Deere announced a fully autonomous self-driving tractor, which will be available by the end of 2022 and earned the company a "Best of Innovation" honoree designation in the robotics category. The self-driving tractor is designed to be controlled with the farmer's smartphone, does not require a driver in the cab, and can operate around the clock in low-visibility conditions. This not only frees the farmer to focus on other tasks but lessens the farmer's dependence on available labor, which can be scarce when bad weather requires tilling, planting, or harvesting to be completed within a tight time window. At the show, Deere also shared that its existing tractors equipped with GPS guidance systems can be easily upgraded to self-driving tractors with the purchase of camera and sensor hardware and a software subscription.

Much of the technology is developed internally by the company's R&D team of over 5,000 employees. However, the company has also acquired external technology, including Bear Flag Robotics for its autonomous tractor capabilities, and Blue River Technology and GUSS Automation, which enable Deere's See & Spray technology to spray herbicides directly on weeds, cutting herbicide use by 77% compared to broadcast spraying.

John Deere is also deploying the Internet of Things in the field to boost the efficiency of prepping, planting, and harvesting, as well as to improve per-acre crop yields. For example, Deere has developed its ExactEmerge planting technology, which utilizes sensors that measure the softness of the soil and the pressure exerted on each seed as it is planted, and uses algorithms to determine the optimal depth and spacing to maximize output and yield.
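
To illustrate the flavor of such a sensor-driven decision rule, here is a toy Python sketch. The formula, coefficients, and ranges are invented for illustration and are not Deere's ExactEmerge algorithm:

```python
def planting_depth_inches(soil_softness: float, moisture: float) -> float:
    """Toy rule: plant deeper in soft, dry soil; shallower in firm, moist soil.

    Both inputs are normalized sensor readings in [0, 1]. The coefficients
    are illustrative assumptions, not Deere's actual algorithm.
    """
    base = 1.5                                  # baseline depth in inches
    depth = base + 0.75 * soil_softness - 0.5 * moisture
    return max(1.0, min(depth, 3.0))            # clamp to a sane agronomic range

for softness, moisture in [(0.2, 0.8), (0.9, 0.1)]:
    print(f"softness={softness}, moisture={moisture} -> "
          f"{planting_depth_inches(softness, moisture):.2f} in")
```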

Deere will also be incorporating land topography, soil moisture, fertilizer, weed detection, crop output, and other real-time sensors into its implements to help farmers optimize irrigation, fertilizer and herbicide application, and harvest timing. This data can be combined with weather forecasts, commodity market prices, and records of the farmer's operations to create a digital twin (a virtual model of the farming process) that enables the farmer to run in-silico simulations to minimize downtime, preempt problems, plan ahead, and maximize yield and profit. In November 2020, Deere completed the acquisition of Harvest Profit, a software provider that helps farmers forecast and measure profitability on a field-by-field basis, which I suspect can form the basis of digital-twin software.
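
As a rough sketch of what such an in-silico simulation could look like, the toy model below sweeps nitrogen application rates against an assumed diminishing-returns yield curve to find the profit-maximizing input level. Every number and the yield function are illustrative assumptions, not Harvest Profit's model:

```python
import math

# Toy digital-twin profit model; every parameter is an illustrative assumption.
CORN_PRICE = 6.50        # $ per bushel
NITROGEN_COST = 1.00     # $ per lb of applied nitrogen
ACRES = 500

def yield_bu_per_acre(n_lbs: float) -> float:
    # Diminishing-returns response curve approaching 220 bu/acre.
    return 220 * (1 - math.exp(-n_lbs / 90))

def profit(n_lbs: float) -> float:
    return ACRES * (CORN_PRICE * yield_bu_per_acre(n_lbs) - NITROGEN_COST * n_lbs)

best_rate = max(range(0, 301, 10), key=profit)
print(f"Profit-maximizing N rate: {best_rate} lb/acre -> "
      f"est. profit ${profit(best_rate):,.0f}")
```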

In an interview with CNet earlier this year, Deere CTO Jahmy Hindman noted that hardware pricing for the self-driving tractor has yet to be determined. The company also indicated that the subscription will likely be based on the farmer's acreage. While I do not expect the hardware and subscription pricing to supercharge Deere's earnings in the short term, the availability of these capabilities will almost certainly capture farmers' attention and boost demand for new Deere equipment and upgrades.

Quest for the holy grail: the source of truth for Ag

I believe the holy grail is a source of truth in which all relevant agriculture data is aggregated in the cloud and easily accessed by all subscribing industry participants. Agricultural productivity will be greatly enhanced when stakeholders use this data, in conjunction with seed genetic data, drone and satellite input, weather forecasts, and commodity market pricing, to optimize crop decisions, seed selection, day-to-day farming operations, supply chain logistics, crop marketing, equipment sales, capital allocation, commodity trading strategies, and regulatory decisions at the regional, national, and even international level.

I liken Deere's sensor-loaded tractors to modern-day Trojan horses that can collect large amounts of valuable real-time data on farming inputs, environmental factors, operations, and output yield. In addition to applying artificial intelligence and machine-learning algorithms to assist farmers in their key decisions and daily operations, Deere can use the data to train and enhance its AI models, which will further increase the value of the data and widen its competitive moat.

Even though Deere has chosen to focus on agriculture, management has said that the company is also working to apply some of these ideas to construction and forestry, which, if successful, will provide still more upside to investors.

Financial analysis

Revenue by segment

Agriculture accounts for more than two-thirds of Deere's total revenue (figure 7, blue line). It has also delivered higher growth than construction & forestry since 2019 (figure 8, blue line vs orange line). Financial services, which provides equipment financing for John Deere dealers, makes up a relatively de minimis percentage of revenue (green dashed line).

Figure 7: Revenue by segment (created by author using publicly available financial data)

Figure 8: Revenue growth by segment, indexed to 4Q 2018 (created by author using publicly available financial data)

Deere's operating margin has expanded over the last 7 years, but I am reluctant to underwrite further margin expansion into my forward projections until there is clear evidence of operating leverage from data subscription services.

Figure 9: Operating margins by segment (created by author using publicly available financial data)

Within the agriculture segment, production and precision ag has grown faster (figure 10, blue line) and delivered higher margins (figure 11, blue line) than small consumer ag (orange line).

Figure 10: Agriculture sub-segment revenue breakdown (created by author using publicly available financial data)

Figure 11: Agriculture sub-segment operating margins (created by author using publicly available financial data)

Revenue by geography

The US is the largest market and accounts for more than half of total revenue (figure 12, purple line). However, Latin America is the fastest-growing region and has more than doubled revenue since 2019 (figure 13, red line).

Figure 12: Revenue by geography (created by author using publicly available financial data)

Figure 13: Revenue growth by geography, indexed to 4Q 2018 (created by author using publicly available financial data)

The effect of inflation

Commodity prices have risen strongly since the onset of the COVID-19 pandemic (figure 14, solid lines), but the producer price index for agricultural equipment (red dashed line) has stayed ahead of manufacturing costs (purple dotted lines).

Figure 14: Crop and farm equipment price indices (source: St. Louis Federal Reserve FRED)

Deere's 3Q 2022 Production and Precision Ag segment earnings corroborate this data. For the quarter, a production cost increase of $535 million was offset by $646 million of price increases (figure 15, right chart). The remainder of the operating profit improvement came from increased volume and mix, though part of that increase may have resulted from sales delayed by the work stoppage from a labor strike in fiscal Q1 2022 that lasted almost one month.

Figure 15: 3Q 2022 earnings of Deere's Production and Precision Ag segment (source: Deere and Company earnings presentation, 3Q 2022)

Even though Deere's nominal revenue has grown over the last five years (figure 16, dotted lines), I note that its real, agricultural-equipment-PPI-adjusted revenue growth has been more muted (solid lines).

Figure 16: Deere segment revenue, nominal vs real, indexed to Jan 2016 (created by author using publicly available financial data and St. Louis Federal Reserve FRED)

Valuation

Current price-earnings ratio

Deere's premium price-earnings ratio of 17x, compared to Caterpillar's (CAT) PE ratio of 14x, seems reasonable given its innovative technological edge (figure 17, orange and blue lines). With market capitalizations of $15.9 billion and $7.7 billion, CNH Industrial (CNHI) and AGCO Corporation (AGCO) (purple and green lines) are substantially smaller than Deere and Caterpillar ($105 billion and $90 billion respectively) and have grown less than Deere (figure 18). As such, I believe their lower PE ratios are justified.

Figure 17: Price-earnings valuation, Deere vs comps (source: Seeking Alpha Charting)

Figure 18: Historical 10-year revenue growth, Deere vs comps (source: Seeking Alpha Charting)

Potential valuation re-rating

Deere's near-term earnings are unlikely to surge from the introduction of its self-driving tractor. However, if Deere succeeds in transforming itself into a data company, shareholders could well be rewarded with an upward revision in its valuation to a multiple closer to that of information service providers such as S&P Global (SPGI), Moody's (MCO), and FactSet (FDS) (figure 19, blue, orange, and green lines)

Figure 19: Deere's PE multiple compared to information service providers (source: Seeking Alpha Charting)

Stock price

Over the last 5 years, Deere's stock price has appreciated by 167% (figure 20, orange line), more than its heavy equipment peers and outperforming the S&P 500 index.

Figure 20: Deere stock price vs comps, 5 years (source: Seeking Alpha Charting)

Concerns

My 4 main concerns are:

1. Can Deere follow through on execution?

We have seen technology companies announce products with great fanfare but stumble with execution (e.g., IBM's Watson Health, which the company was forced to sell to private equity earlier this year). I will be closely watching the progress and customer reviews of Deere's autonomous and artificial intelligence-based products after they are delivered to customers.

2. Cybersecurity risk

Call me paranoid, but a vicious cybersecurity hack could create a real-life scene from a horror movie (think "The Attack of the Killer Deere X9 110 Zombie Harvesters"), potentially leading to serious destruction of crops, property, and life, resulting in severe reputational damage and massive liability lawsuits.

3. Can Deere's competitors roll out similar self-driving and big data-related products, compressing profits for all players?

Deere's two closest agriculture machinery competitors are CNH and AGCO, both of which are significantly smaller and may not have the same scale or deep pockets to invest in R&D.

Caterpillar is more focused on construction and mining and has remote operated tractors for mining; Hitachi and Komatsu also participate in heavy equipment markets, but none have announced products for agricultural applications that have the same level of technological innovation as Deere.

4. Data privacy and compliance

Farmers typically guard details of their operations carefully, and some may be reluctant to give Deere the ability to utilize their operational data. I have not seen Deere's confidentiality agreement, but if Salesforce.com's customer agreement is any indication of the negotiating power of SaaS (software-as-a-service) providers, it is quite unlikely that Deere will be restricted from using customer data to train its AI models or from providing customers with recommendations based on anonymized, aggregated data.

Summary

  • Deere & Company's announcement of its autonomous self-driving tractor at the Consumer Electronics Show (CES) earlier this year has turned heads, generated a lot of media publicity and excitement, and earned it the "Tesla of farming" nickname.
  • Deere is a leading agricultural equipment provider that is well positioned to help address the secular increase in demand for food from a growing global population and rising middle class demanding for more animal protein in their diets.
  • Deere's sensor-loaded self-driving tractor is an impressive innovation which I liken to a trojan horse that can acquire vast volumes of agricultural data, which will enable Deere to offer digital twin software to farmers and further the company's quest to create the source of truth in agriculture.
  • I do not expect a surge in near-term revenue or earnings. However, if Deere is successful in this quest, value creation should come from long-term growth and an upward revision in valuation from a heavy equipment manufacturer towards that of an information services company.
  • Deere's strong free cash flow provides some downside protection but there is still much uncertainty to this upside potential.
  • On balance, I believe the stock offers a decent risk-adjusted return.
Killexams : Robotic Process Automation in Insurance Market Next Big Thing : Major Giants WorkFusion, AutomationEdge, Aspire Systems, IBM, Salesforce, UiPath

New Jersey, USA -- (SBWIRE) -- 10/03/2022 -- Advance Market Analytics published a new research publication, "Robotic Process Automation in Insurance Market Insights, to 2027," spanning 232 pages and enriched with self-explanatory tables and charts in a presentable format. In the study you will find newly evolving trends, drivers, restraints, and opportunities relevant to market stakeholders. The growth of the Robotic Process Automation in Insurance market was mainly driven by increasing R&D spending across the world.

Get a Free Exclusive PDF Sample Copy of This Research @ https://www.advancemarketanalytics.com/sample-report/199028-global-robotic-process-automation-in-insurance-market#utm_source=SBWireLal

Some of the key players profiled in the study are:
Aspire Systems (United States), IBM (United States), Salesforce (United States), Microsoft (United States), Automation Anywhere, Inc., AutomationEdge (United States), Blue Prism (United Kingdom), Datamatics (United States), EdgeVerve (Infosys) (India), Kofax (United States), Nintex (United States), Paanini Inc. (United States), Pegasystems Inc. (United States), UiPath (United States), WorkFusion, Inc. (United States), Mindtree (India), Appian (United States) and Cognizant (United States).

Scope of the Report of Robotic Process Automation in Insurance
Robotic process automation is rule-based, low-code software robotics that allows businesses to automate their operations without human intervention. Growing demand for RPA in the insurance sector to accelerate digital transformation and automate claims and underwriting processes has boosted the market. Further, the outbreak of COVID-19 created significant opportunities for digital transformation across various business verticals. Rising IT spending by insurance companies and a focus on reducing human error during claims processing will further drive the global market. This growth is primarily driven by the increasing use of RPA in the insurance industry for claims processing and the sales & distribution of policies.
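
As a toy illustration of the kind of rule-based automation RPA applies to claims intake, the Python sketch below routes incoming claims with simple rules. The field names and thresholds are invented for illustration and do not represent any vendor's product:

```python
# Toy rule-based claim triage; field names and thresholds are illustrative
# assumptions and do not represent any vendor's product.
def route_claim(claim: dict) -> str:
    if claim.get("missing_documents"):
        return "request-documents"
    if claim["amount"] <= 1_000 and claim["policy_active"]:
        return "auto-approve"            # straight-through processing
    if claim["amount"] > 25_000 or claim.get("fraud_score", 0) > 0.8:
        return "special-investigations"
    return "adjuster-review"

claims = [
    {"id": 1, "amount": 450, "policy_active": True},
    {"id": 2, "amount": 30_000, "policy_active": True, "fraud_score": 0.3},
    {"id": 3, "amount": 5_000, "policy_active": True, "missing_documents": True},
]
for c in claims:
    print(c["id"], "->", route_claim(c))
```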

The titled segments and sub-section of the market are illuminated below:
by Application (Claim Registration & Processing, Underwriting & Pricing, Process & Business Analytics, Sales & Distribution, Others), Organization Size (Small & Medium Enterprises, Large Enterprises), End Users (Insurers, Agents & Brokers), Deployment (Cloud, On Premises) Players and Region - Global Market Outlook to 2027

Market Drivers
Increasing Use of RPA in the Insurance Industry for Claim Processing and Sales & Distribution of Policies
High Growth of RPA Bots to Quickly Address Service Requests and Provide Customer Support
Emerging trend of Hyperautomation to Accelerate Digital Transformation Across the Insurance Sector During and Post Pandemic

Market Trend
Shifting Towards the Cloud Deployment of RPA Software
The Integration of OCR and Natural Language Processing Technologies to Offer Enhanced Customer Service

Have any questions regarding the Global Robotic Process Automation in Insurance Market report? Ask our expert @ https://www.advancemarketanalytics.com/enquiry-before-buy/199028-global-robotic-process-automation-in-insurance-market#utm_source=SBWireLal

Key Developments in the Market:
In March 2022, SS&C Technologies Holdings Inc., a leading provider of software services, acquired Blue Prism Group. Blue Prism provides RPA and intelligent automation across various industry verticals, including financial services. The acquisition strengthens SS&C's position and allows it to combine its RPA technology into a full suite of intelligent automation offerings. Key vendors are focusing on product innovation, with better quality and technical characteristics, and on improving after-sales service for customers. The key players are anticipated to hold strong positions in the market over the forecast period and are pursuing strategic decisions, including mergers and acquisitions, to maintain their presence in the market.

Region Included are: North America, Europe, Asia Pacific, Oceania, South America, Middle East & Africa

Country Level Break-Up: United States, Canada, Mexico, Brazil, Argentina, Colombia, Chile, South Africa, Nigeria, Tunisia, Morocco, Germany, United Kingdom (UK), the Netherlands, Spain, Italy, Belgium, Austria, Turkey, Russia, France, Poland, Israel, United Arab Emirates, Qatar, Saudi Arabia, China, Japan, Taiwan, South Korea, Singapore, India, Australia and New Zealand etc.

Strategic Points Covered in Table of Content of Global Robotic Process Automation in Insurance Market:
Chapter 1: Introduction, market driving forces, objective of the study, and research scope of the Robotic Process Automation in Insurance market
Chapter 2: Exclusive Summary – the basic information of the Robotic Process Automation in Insurance Market.
Chapter 3: Displaying the Market Dynamics- Drivers, Trends and Challenges & Opportunities of the Robotic Process Automation in Insurance
Chapter 4: Presenting the Robotic Process Automation in Insurance Market Factor Analysis, Porters Five Forces, Supply/Value Chain, PESTEL analysis, Market Entropy, Patent/Trademark Analysis.
Chapter 5: Displaying the market by Type, End User, and Region/Country, 2015-2020
Chapter 6: Evaluating the leading manufacturers of the Robotic Process Automation in Insurance market which consists of its Competitive Landscape, Peer Group Analysis, BCG Matrix & Company Profile
Chapter 7: To evaluate the market by segments, by countries and by Manufacturers/Company with revenue share and sales by key countries in these various regions (2021-2027)
Chapter 8 & 9: Displaying the Appendix, Methodology and Data Source

Finally, the Robotic Process Automation in Insurance Market report is a valuable source of guidance for individuals and companies.

Read the detailed index of the full research study @ https://www.advancemarketanalytics.com/reports/199028-global-robotic-process-automation-in-insurance-market#utm_source=SBWireLal

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions like North America, Middle East, Africa, Europe, or LATAM, Southeast Asia.

Nidhi Bhawsar, PR & Marketing Manager
AMA Research & Media LLP
Telephone: 2063171218
Web: https://www.advancemarketanalytics.com/

Killexams : IBM Redefines Hybrid Cloud Application and Data Storage Adding Red Hat Storage to IBM Offerings

Newly expanded software-defined storage portfolio enables IBM to deliver a consistent experience from edge-to-core-to-cloud

ARMONK, N.Y., Oct. 4, 2022 /PRNewswire/ -- IBM (NYSE: IBM) announced today it will add Red Hat storage products and Red Hat associate teams to the IBM Storage business unit, bringing consistent application and data storage across on-premises infrastructure and cloud.

With the move, IBM will integrate the storage technologies from Red Hat OpenShift Data Foundation (ODF) as the foundation for IBM Spectrum Fusion. This combines IBM and Red Hat's container storage technologies for data services and helps accelerate IBM's capabilities in the burgeoning Kubernetes platform market.

In addition, IBM intends to offer new Ceph solutions delivering a unified and software defined storage platform that bridges the architectural divide between the data center and cloud providers. This further advances IBM's leadership in the software defined storage and Kubernetes platform markets.

According to Gartner, by 2025, 60% of infrastructure and operations (I&O) leaders will implement at least one of the hybrid cloud storage architectures, which is a significant increase from 20% in 2022.1 IBM's software defined storage strategy is to take a "born in the cloud, for the cloud" approach—unlocking bi-directional application and data mobility based on a shared, secure, and cloud-scale software defined storage foundation.

"Red Hat and IBM have been working closely for many years, and today's announcement enhances our partnership and streamlines our portfolios," said Denis Kennelly, general manager of IBM Storage, IBM Systems. "By bringing together the teams and integrating our products under one roof, we are accelerating the IBM's hybrid cloud storage strategy while maintaining commitments to Red Hat customers and the open-source community."

"Red Hat and IBM have a shared belief in the mission of hybrid cloud-native storage and its potential to help customers transform their applications and data," said Joe Fernandes, vice president of hybrid platforms, Red Hat. "With IBM Storage taking stewardship of Red Hat Ceph Storage and OpenShift Data Foundation, IBM will help accelerate open-source storage innovation and expand the market opportunity beyond what each of us could deliver on our own. We believe this is a clear win for customers who can gain a more comprehensive platform with new hybrid cloud-native storage capabilities."

As customers formulate their hybrid cloud strategies, infrastructure consistency, application agility, IT management, and flexible consumption are critical deciding factors for bridging on-premises and cloud deployments.

With these changes to the IBM portfolio, clients will have access to a consistent set of storage services while preserving data resilience, security, and governance across bare metal, virtualized and containerized environments.  Some of the many benefits of the software defined portfolio available from IBM will include:

  • A unified storage experience for all containerized apps running on Red Hat OpenShift: Customers can use IBM Spectrum Fusion (now with Red Hat OpenShift Data Foundation) to achieve the highest levels of performance, scale, automation, data protection, and data security for production applications running on OpenShift that require block, file, and/or object access to data. This enables development teams to focus on the apps, not the ops, with infrastructure-as-code designed for simplified, automated management and provisioning; a minimal provisioning sketch follows this list.

  • A consistent hybrid cloud experience at enterprise levels of scale and resiliency with IBM Ceph: Customers can deliver their private and hybrid cloud architectures on IBM's unified and software defined storage solution, providing capacity and management features. Capabilities include data protection, disaster recovery, high availability, security, auto-scaling, and self-healing portability, that are not tied to hardware, and travel with the data as it moves between on-premises and cloud environments.

  • A single data lakehouse to aggregate and derive intelligence from unstructured data on IBM Spectrum Scale: Customers can address the challenges that often come with quickly scaling a centralized data approach with a single platform to support data-intensive workloads such as AI/ML, high performance computing, and others. Benefits can include less time and effort to administer, reduced data movement and redundancy, direct access to data for analytics tools, advanced schema management and data governance, all supported by distributed file and object storage engineered to be cost effective.

  • Build in the cloud, deploy on-premises with automation: Customers can move developed applications from the cloud to on-premises services, automate the creation of staging environments to test deployment procedures, validate configuration changes, database schema and data updates, and ready package updates to overcome obstacles in production or correct errors before they become a problem that affects business operations.
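
As a minimal illustration of that infrastructure-as-code provisioning, the Python sketch below generates a Kubernetes PersistentVolumeClaim manifest requesting block storage from a Ceph-backed storage class. The storage class name is an assumption (ODF typically ships an RBD-backed class), and this is not an official IBM or Red Hat example:

```python
import yaml  # pip install pyyaml

# Hypothetical PersistentVolumeClaim requesting Ceph-backed block storage via
# OpenShift Data Foundation; the storageClassName is an assumption, not an
# official IBM or Red Hat example.
pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "orders-db-data", "namespace": "shop"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "storageClassName": "ocs-storagecluster-ceph-rbd",  # assumed ODF class
        "resources": {"requests": {"storage": "50Gi"}},
    },
}
print(yaml.safe_dump(pvc, sort_keys=False))
# Apply the printed manifest with `oc apply -f -` (OpenShift) or `kubectl apply -f -`
```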

"IBM and Red Hat speaking with one voice on storage is delivering the synergies derived from IBM's Red Hat acquisition," said Ashish Nadkarni, group vice president and general manager, Infrastructure Systems at IDC. "The combining of the two storage teams is a win for IT organizations as it brings together the best that both offer: An industry-leading storage systems portfolio meets an industry-leading software-defined data services offering. This initiative enables IBM and Red Hat to streamline their family of offerings, passing the benefits to their customers. It also helps accelerate innovation in storage to solve the data challenges for hybrid cloud, all while maintaining their commitment to open source."

Preserving commitment to Red Hat clients and the community

Under the agreement between IBM and Red Hat, IBM will assume Premier Sponsorship of the Ceph Foundation, whose members collaborate to drive innovation, development, marketing, and community events for the Ceph open-source project. IBM Ceph and Red Hat OpenShift Data Foundation will remain 100% open source and will continue to follow an upstream-first model, reinforcing IBM's commitment to these vital communities. Participation by the Ceph leadership team and other aspects of the open-source project is a key IBM priority to maintain and nurture ongoing Red Hat innovation.

Red Hat and IBM intend to complete the transition by January 1, 2023, which will involve the transfer of storage roadmaps and Red Hat associates to the IBM Storage business unit. Following this date, Red Hat OpenShift Platform Plus will continue to include OpenShift Data Foundation, sold by Red Hat and its partners. Additionally, Red Hat OpenStack customers will still be able to buy Red Hat Ceph Storage from Red Hat and its partners. Red Hat OpenShift and Red Hat OpenStack customers with existing subscriptions will be able to maintain and grow their storage footprints as needed, with no change in their Red Hat relationship.

Forthcoming IBM Ceph and IBM Spectrum Fusion storage solutions based on Ceph are expected to ship beginning in the first half of 2023.

Read more about today's news in this blog from Denis Kennelly, general manager of IBM Storage, IBM Systems: "IBM + Red Hat: Doubling Down on Hybrid Cloud Storage"

Statements regarding IBM's future direction and intent are subject to change or withdrawal without notice and represent goals and objectives only. Red Hat, Ceph, Gluster and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries.

About IBM 
IBM is a leading global hybrid cloud, AI, and business services provider, helping clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Nearly 3,800 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently, and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity, and service. For more information, visit www.ibm.com.

Media Contacts: 
Ben Stricker, IBM 
ben.stricker@ibm.com

1 Gartner, "Market Guide for Hybrid Cloud Storage," Julia Palmer, Kevin Ji, Chandra Mukhyala, 3 October 2022

SOURCE IBM

Killexams : The Global AI in Oil and Gas Market size is expected to reach $5.2 billion by 2028, rising at a market growth of 13.2% CAGR during the forecast period

New York, Oct. 03, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Global AI in Oil and Gas Market Size, Share & Industry Trends Analysis Report By Operation, By Component, By Regional Outlook and Forecast, 2022 - 2028" - https://www.reportlinker.com/p06321800/?utm_source=GNW

Companies in these sectors now generate value by utilizing AI solutions rather than depending on conventional, human-centered business processes. The value creation process is driven by sophisticated algorithms that have been trained on substantial and meaningful datasets and are constantly fed new data. However, businesses outside of those with a strong internet presence can also benefit from AI.

Companies in the mining, oil, gas, and construction industries were late adopters of digitization, but they now rely more and more on AI solutions. Although the oil and gas sector first investigated using AI in the 1970s, it only recently began to pursue more aggressive AI application prospects. This corresponds with the industry's shift toward the Oil and Gas 4.0 concept, whose main objective is to increase value through cutting-edge digital technology, and with the exponential rise of AI capabilities.

Oil and gas businesses' main goal with AI (and other digitalization efforts) is to increase efficiency, because they adopt new technologies much more readily than they experiment with and alter their business models. In practice, that usually means reducing risks and speeding up processes. The application of artificial intelligence (AI) and machine learning technologies in the oil and gas industry has attracted considerable attention over the last ten years.

This has caused the market for artificial intelligence in this sector to expand. Given the rising difficulties the oil and gas sector has faced in the discovery and production of hydrocarbons, a cross-disciplinary strategy is being adopted, necessitating the semi-automation and full automation of several crucial operations. Every step of the exploration process, including geology, geophysics, and reservoir engineering, is being automated with artificial intelligence.

COVID-19 Impact Analysis

Demand for oil and gas decreased as a result of the COVID-19 pandemic and lockdowns. For instance, the International Energy Agency estimates that oil demand had declined by millions of barrels per day by the second quarter of 2020. Under these exceptional conditions, however, AI use in the oil and gas sector expanded greatly. The crisis had numerous direct and indirect repercussions on many facets of society, and the digital and artificial intelligence industries proved a valuable resource for monitoring and containing the virus pandemic.

Market Growth Factors

The Analysis And Improvement Of Data, As Well As The Identification Of Faults

The oil and gas business encounters numerous difficulties in identifying improper threading in pipes and flaws in error-prone equipment. Flaws that go undetected at inspection are discovered later on the production line, which results in greater damage and losses and comes at a high cost to the business. With AI and a computer-vision-based inspection system, however, it becomes simple to assess the quality of output, and the system also offers thorough analytics on defects.

Utilize Analytics To Lower Production And Maintenance Costs And Improve Decision-Making

After extraction, oil and gas are kept in a central repository and distributed from there by pipeline. Varying temperatures and weather conditions cause oil and gas components to deteriorate and corrode, which can weaken the condition of the pipeline and degrade its threading. This is one of the main issues facing the sector. To prevent unfavorable outcomes, the oil and gas business must address these concerns proactively, and by integrating AI solutions the industry can help stop such incidents from happening.

Market Restraining Factors

Lack Of Competent Professionals In AI Technology

An extremely advanced solution, the AI processor demands a high level of education and skill to operate. Artificial intelligence has quickly gained popularity in recent years: people throughout the world use applications of artificial intelligence in their daily lives, such as self-driving automobiles and restaurant robots that serve food. Robotics research, for instance, is used in many different fields, such as security, healthcare, space exploration, and a plethora of other scientific areas.

Component Outlook

On the basis of component, the AI in Oil and Gas Market is segmented into Solutions and Services. The Services segment accounted for a significant revenue share of the AI in Oil and Gas Market in 2021. AI services are a group of offerings with ready-made machine learning that simplify the deployment of AI into software and business processes for developers.

Operation Outlook

Based on operation, the AI in Oil and Gas Market is divided into Upstream, Midstream, and Downstream. The Upstream segment garnered the largest revenue share of the AI in Oil and Gas Market in 2021. Upstream operations involve searching for potential underground or undersea reserves of raw natural gas and crude oil, drilling test wells, and then drilling and operating the wells that bring the raw natural gas or crude oil to the surface.

Regional Outlook

Region-wise, the AI in Oil and Gas Market is analyzed across North America, Europe, Asia Pacific, and LAMEA. North America procured the highest revenue share of the AI in Oil and Gas Market in 2021. Demand for AI in the region's oil and gas industry is anticipated to be driven by factors including the region's robust economy, the high adoption rate of AI technologies among oilfield operators and service providers, a strong presence of leading AI software and system providers, and combined investment by government and private organizations in the growth and development of R&D activities.

The major strategies followed by market participants are partnerships and acquisitions. Based on the analysis presented in the Cardinal matrix, Microsoft Corporation is the forerunner in the AI in Oil and Gas Market. Companies such as Intel Corporation, Cisco Systems, Inc., and NVIDIA Corporation are some of the key innovators in the AI in Oil and Gas Market.

The market research report covers the analysis of the market's key stakeholders. Key companies profiled in the report include Microsoft Corporation, Oracle Corporation, Intel Corporation, IBM Corporation, Cisco Systems, Inc., Accenture PLC, NVIDIA Corporation, Cloudera, Inc., C3.ai, Inc. and FuGenX Technologies (USM Business Systems, Inc.).

Recent Strategies Deployed in AI in Oil and Gas Market

Partnerships, Collaborations and Agreements:

Apr-2022: Microsoft entered into a partnership with Bharat Petroleum Corporation Limited (BPCL), a leading oil and gas company in India. Together, the companies aimed to unlock the possibilities that Microsoft's cloud delivers for managing the particular challenges of the oil and gas sector, enabling BPCL to accelerate the modernization of its technology architecture and to improve and redefine the customer experience.

Mar-2022: Cloudera partnered with Kyndryl, an American multinational information technology services company. Through this partnership, the companies aimed to help customers enable and accelerate their mission-critical multi-cloud, hybrid cloud, and edge computing data initiatives. They also established a joint innovation center to create combined industry solutions and delivery capabilities designed to help customers accelerate their move and migration to the cloud platform and environment of their choice.

Nov-2021: IBM joined hands with Amazon Web Services (AWS), a subsidiary of Amazon. Together, the companies aimed to combine the advantages of IBM Open Data for Industries for IBM Cloud Pak for Data with the AWS Cloud to benefit energy customers. This complete solution is built on Red Hat OpenShift and runs on the AWS Cloud, streamlining customers' ability to operate workloads both in the AWS cloud and on-premises.

Sep-2021: C3 AI entered into a partnership with Baker Hughes, an energy technology company. Through this partnership, the companies aimed to deploy the BHC3 Production Optimization enterprise AI application at MEG Energy, an Alberta, Canada-based energy firm, to enhance operational effectiveness and productivity and to better anticipate risks across the company's upstream production processes. BHC3's advanced enterprise AI solutions would further leverage differentiated, proprietary technology to support the safe, sustainable production of energy.

Jun-2021: C3 AI formed a partnership with Snowflake, the Data Cloud company. The partnership combines Snowflake's unique architecture, which permits customers to run their data platforms seamlessly across numerous clouds and regions at scale, with C3 AI's robust enterprise AI development offering and family of industry-specific enterprise AI applications, so that businesses can quickly accelerate and derive economic value from their data and enterprise AI initiatives.

Apr-2021: Accenture joined hands with Bharat Petroleum Corporation, one of the largest oil and gas companies in India. Through this collaboration, the companies aimed to transform India's second-largest oil and gas business by digitally reimagining its comprehensive sales and distribution network. Accenture would use its capabilities in artificial intelligence, data, and cloud technologies to design, build, and run a digital platform called IRIS.

Product Launches and Product Expansions:

Jun-2022: NVIDIA expanded its partnership with Siemens, a German multinational conglomerate. The expansion aimed to enable the industrial metaverse and advance the use of AI-driven digital twin technology to help take industrial automation to a new level. The companies intend to connect Siemens Xcelerator, the open digital business platform, with NVIDIA Omniverse, a platform for 3D design and collaboration, enabling an industrial metaverse with physics-based digital models from Siemens and real-time AI from NVIDIA, in which businesses make decisions faster and with greater confidence.

Mar-2022: NVIDIA introduced an update to its AI platform and unveiled its AI Accelerated program. NVIDIA's AI platform is a software offering for advancing workloads such as recommender systems, speech, and hyperscale inference. NVIDIA AI is the software toolbox of the world's AI community, from AI researchers and data scientists to data and machine learning operations teams.

Nov-2021: Oracle introduced Oracle Cloud Infrastructure (OCI) AI Services, a cluster of services. The new OCI AI services give developers the option of utilizing out-of-the-box models that have been pretrained on business-oriented data, or of custom-training the services on their own firm's data.

Jun-2021: IBM and Schlumberger unveiled the industry's first commercial hybrid cloud enterprise data management solution for the OSDU Data Platform. The new solution provides energy operators with complete interoperability, making their data accessible to any application within their exploration-to-production environments through the OSDU common data standard to allow easy sharing of information between teams. The solution is engineered to decrease the time for data transfers between applications, delivering lower costs along with enhanced decision-making.

Mar-2020: Accenture and SAP unveiled SAP S/4HANA Cloud for upstream oil and gas. The new solution helps customers further enhance transparency into processes and cash flow, bringing innovative technologies such as AI to provide greater visibility, real-time insights, and better decision-making.

Acquisitions and Mergers:

Aug-2022: Accenture completed the acquisition of Tenbu, a cloud data business that specializes in solutions for intelligent decision-making. The acquisition expands Accenture's capabilities to help businesses drive new services, growth, and resilience by utilizing data from the cloud continuum for intelligent decision-making.

Mar-2022: Microsoft took over Nuance Communications, a leader in conversational AI and ambient intelligence. The acquisition enables organizations across industries to accelerate their business goals with security-focused, cloud-based solutions infused with powerful, vertically optimized AI. Customers would benefit from enhanced clinician, patient, consumer, and employee experiences and, ultimately, improved productivity and financial performance.

Feb-2022: IBM completed the acquisition of Neudesic, a US-based cloud services consultancy. With this acquisition, the company aimed to extend IBM's portfolio of hybrid multi-cloud services and further advance the company's AI and hybrid cloud strategy.

Oct-2021: Cisco completed the acquisition of Epsagon, a privately held, modern observability company. The acquisition aligns Epsagon's technology with Cisco's vision of enabling businesses to deliver unmatched application experiences through industry-leading solutions with deep business context. By linking and contextualizing visibility and insights across the entire stack, teams can collaborate more effectively to understand their systems, resolve issues quickly, optimize and secure application experiences, and satisfy their customers.

Oct-2021: Accenture took over BRIDGEi2i, a Bengaluru-based AI and analytics company. Through this acquisition, Accenture aimed to further strengthen its AI skills and data science capabilities and reinforce how its global network delivers value for customers.

Mar-2021: Cisco took over Acacia Communications, an optical networking strategy and technology business. The acquisition strengthens Cisco's commitment to optics as a crucial building block, advancing Cisco's Internet for the Future strategy with best-in-class coherent optical solutions for customers and enabling them to manage the unprecedented scale of modern IT.

Geographical Expansions:

Jun-2022: Intel India expanded its geographical footprint with a new state-of-the-art design and engineering facility in Bengaluru. The new building accommodates 2,000 employees and will help promote cutting-edge innovation and engineering work in the client, artificial intelligence, data center, graphics, IoT, and automotive segments.

Scope of the Study

Market Segments covered in the Report:

By Operation

• Upstream

• Midstream

• Downstream

By Component

• Solutions

• Services

By Geography

• North America

o US

o Canada

o Mexico

o Rest of North America

• Europe

o Germany

o UK

o France

o Russia

o Spain

o Italy

o Rest of Europe

• Asia Pacific

o China

o Japan

o India

o South Korea

o Singapore

o Malaysia

o Rest of Asia Pacific

• LAMEA

o Brazil

o Argentina

o UAE

o Saudi Arabia

o South Africa

o Nigeria

o Rest of LAMEA

Companies Profiled

• Microsoft Corporation

• Oracle Corporation

• Intel Corporation

• IBM Corporation

• Cisco Systems, Inc.

• Accenture PLC

• NVIDIA Corporation

• Cloudera, Inc.

• C3.ai, Inc.

• FuGenX Technologies (USM Business Systems, Inc.)

Unique Offerings

• Exhaustive coverage

• Highest number of market tables and figures

• Subscription based model available

• Guaranteed best price

• Assured post-sales research support with 10% free customization
Read the full report: https://www.reportlinker.com/p06321800/?utm_source=GNW

About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

__________________________

Clare: clare@reportlinker.com
US: (339)-368-6001
Intl: +1 339-368-6001

© 2022 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.

Source: https://www.benzinga.com/pressreleases/22/10/g29113445/the-global-ai-in-oil-and-gas-market-size-is-expected-to-reach-5-2-billion-by-2028-rising-at-a-mark (2 October 2022)